Articles on DiGyt's Bloghttps://blogs.python-gsoc.orgUpdates on different articles published on DiGyt's BlogenMon, 26 Aug 2019 08:17:46 +0000Week 13: Weekly Check-In (#7) - Last Check-Inhttps://blogs.python-gsoc.org/en/digyts-blog/week-13-weekly-check-in-7-last-check-in/<p><strong>1. What did you do this week?</strong></p> <p>As described in my previous post, I spent last week doing some smaller corrections on my Pull Requests. Most of the work was dedicated to allowing fast and memory-saving computation in the tfr_stockwell function, the last function where I hadn't implemented this yet. In principle, most of this went similarly to the other functions, i.e. I had to create an alternative function which computes things separately for lists of SourceEstimate objects, and is capable of handling input from generator objects. However, the tricky and time-consuming part was again to make sure the data fields were completely equal.<br> Another step last week was to change the examples I created, from one example to three examples that cover the different TFR functions and SourceEstimate types which can be processed equally.<br> Finally, I made some smaller commits, correcting some things that my reviewers mentioned could be made better.</p> <p><strong>2. What is coming up next?</strong></p> <p>Well, as my project is finished now, and all of the important functional stuff has been implemented, I will only spend my time working on review corrections, in order to get everything merged into master.<br> Concerning the extended plotting that I mentioned over the last blog posts, I will probably do a bigger independent PR that can enhance plotting functionality and modularity.</p> <p><strong>3. Did you get stuck anywhere?</strong></p> <p>Yes.
As I already mentioned, I first had problems with making the data fields completely equivalent when implementing a time/memory-saving version of tfr_stockwell.<br> But probably even more annoying (because I didn't expect it) was trying to eliminate errors when submitting the freshly made examples. I first rewrote the examples to use an MNE testing dataset which contained real neurophysiological data. Then, when submitting them, I noticed that my version of the testing data was outdated (the respective dataset has been revised for another MNE-Python GSoC project running this summer). So I had to adapt the respective file paths again, which would have been no problem at all if one of the files that I needed for one of my examples (a trans.fif file) hadn't been removed from the dataset. This resulted in trying various solutions to make things work again, until I finally decided to change the example and make it run on a different dataset, where all the needed files were accessible.<br> So the next time I use testing data, I'll definitely make sure to update my testing data folder first.<br> <br> So this was the last regular report on my GSoC project, and I hope that you've found it interesting to read.
As you might have noticed from reading, I've definitely learned a lot of things during the project (probably a consequence of making a lot of mistakes during the project), but I'm glad that I could really noticeably enhance my coding skills during this summer.<br> From now on (and after having all the stuff from the project entirely merged), I will still try to stay involved in MNE, so I hope that this won't be the last thing that you'll hear from me.<br> <br> Finally, I want to say thanks to everyone who participated in this Google Summer of Code project with me - from my mentors to all reviewers to the people from the Salzburg Brain Dynamics lab to finally you, the reader of my blog.<br> Thanks to everyone and have a good time - hopefully profiting from my work on MNE-Python this Google Summer of Code ;).<br> <br> Cheers!</p> <p><br> Dirk</p>s1041134@stud.sbg.ac.at (DiGyt)Mon, 26 Aug 2019 08:17:46 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-13-weekly-check-in-7-last-check-in/Week 12: Blog Post (#6)https://blogs.python-gsoc.org/en/digyts-blog/week-12-blog-post-6/<p>Last week, I worked on several steps to finalize my project.<br> Some work was put into correcting and improving my pull requests, so they can be merged soon. I created an example file to demonstrate the usability of this project, and opened up a new pull request for it (next to another PR that introduces the plotting support I implemented the previous week).<br> As mentioned in my last blog post, I also thought about an alternative solution, i.e. to create a genuine plotting function for SourceTFR. But as I already suspected, this will not work without a lot of additional work. I originally intended to introduce a new type of plot derived from MNE's "TimeViewer", where you can not only skim along the time axis through the plotting GUI, but also along the frequency (and maybe the epochs) axis.
After trying out some things, my best bet for reaching this goal would be to simplify the "TimeViewer" class and make it some kind of "AxisViewer" class. This class would allow a developer to decide which axis (or even multiple axes) should be manipulable through the GUI. Yet, this would only be part of the work, as it does not include the actual plotting function used by SourceTFR, and it would also only work for plots made on surface-type source-level data (since volume-type source-level data employs an entirely different plotting GUI). In my opinion, this functionality should rather be added later, so I can now concentrate on the finishing touches of the rest of the project.<br> One such finishing touch is, for example, time- and memory-saving computation for tfr_stockwell, which is for now only available in tfr_morlet and tfr_multitaper. I have already made attempts to tackle this problem, and am currently trying to make it pass the equivalence tests. But this will definitely be one of the things to do this week, as it will have big implications for the practical usability of the function.</p> <p>Another thing I worked on last week (and will continue to work on this week) was the project page where I'll upload my final results next week. I decided to create a GitHub gist for this purpose, since it is simple and good-looking at the same time. Most of the gist is already finished; only the links to the PRs and some direct examples still need to be added there.
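Conceptually, the equivalence tests mentioned above check that a transform computed epoch by epoch from a generator averages to the same result as one computed on the full stacked array. Here is a toy sketch of that idea; the per-epoch power computation is a simple FFT stand-in, not MNE's actual TFR code, and all names are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
epochs = [rng.standard_normal((4, 100)) for _ in range(10)]  # 10 toy "epochs"

def power(epoch):
    # Stand-in per-epoch power spectrum (real TFRs use wavelets/multitapers).
    return np.abs(np.fft.rfft(epoch, axis=-1)) ** 2

# Memory-hungry reference: stack everything, then average across epochs.
full = power(np.stack(epochs)).mean(axis=0)

# Memory-saving version: consume a generator one epoch at a time.
def averaged_power(epoch_gen):
    total, count = 0, 0
    for ep in epoch_gen:
        total = total + power(ep)
        count += 1
    return total / count

chunked = averaged_power(ep for ep in epochs)
assert np.allclose(full, chunked)
```

Because averaging commutes with the (element-wise) power computation, both routes agree to numerical precision, while the generator route never holds more than one epoch in memory.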
So I hope everything will work out this week, and I'll be able to show you a nice conclusion of my Google Summer of Code project by next week.<br> <br> So, for the last time (this summer?!): Stay tuned!</p>s1041134@stud.sbg.ac.at (DiGyt)Mon, 19 Aug 2019 19:41:51 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-12-blog-post-6/Week 11: Weekly Check-In (#6)https://blogs.python-gsoc.org/en/digyts-blog/week-11-weekly-check-in-6/<div class="lead"> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-375 cms-render-model"><strong>1. What did you do this week?</strong></p> <p>After sending out my Pull Request to the MNE master branch, I spent part of my time correcting the PR according to the feedback I got. This included some fixes and refactoring to make the code more compact and readable (thanks to my tutor Joan Massich, who gave some great input).<br> Then, I implemented a fix for the problem I mentioned in last week's post, where processing of memory-saving data (in the form of two factor arrays of kernel and sensor data) wasn't supported anymore. With my current fix, the time-consuming time-frequency transform will be performed only on the smaller data, which is then expanded into the full data field within the function. This means the full data will be constructed earlier than originally intended. However, I believe that this is the best option, since the other alternative would have involved a lot of data hauling from the tfr functions to the SourceTFR objects, while ultimately achieving the same thing.<br> Finally, I made a first simple attempt to plot the SourceTFR data. This basically works by telling the plot functions which epoch and which frequencies should be plotted, and then using the plot functions of normal source time series for visualization. This might sound like a hack, but it's simple, intuitive, and easy to work with.</p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-375 cms-render-model"><strong>2.
What is coming up next?</strong></p> <p>After this first plotting option, I will think about a good method to make a genuine SourceTFR plot. This could for example involve manually switching over the frequencies (and maybe epochs) in the same manner as you can switch over normal SourceEstimate plots when using the Time Viewer. However, I realize that building support for this on all SourceEstimate types will result in a huge load of work (which might be a bad move, given that there's already a working alternative, and only very little time left for the project). So I'll discuss this with my mentors in order to figure out what's the best thing to do now.<br> Next to this, I will of course continue with the code corrections for the Pull Requests, and maybe start to write a nice tutorial in order to demonstrate the new functionality after the project.</p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-375 cms-render-model"><strong>3. Did you get stuck anywhere?</strong></p> <p>Not really. Of course there were some small obstacles (as there always are), but in general this week went exceptionally smoothly.</p> </div>s1041134@stud.sbg.ac.at (DiGyt)Mon, 12 Aug 2019 19:18:23 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-11-weekly-check-in-6/Week 10: Blog Post (#5)https://blogs.python-gsoc.org/en/digyts-blog/week-10-blog-post-5/<p>Last week, I finished the main functional goal of this GSoC project, by making the tfr functions compute all the things they should compute correctly. It was even possible to allow tfr_morlet and tfr_multitaper to process lists and generator objects sequentially, which is quite important in order to save memory. However, while doing so, I realized that this will basically scrap all the support I introduced for kernelized data (the other memory-saving trick I used, where the data is stored in two small arrays, which can be combined by taking the dot product).
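As an aside, the kernelized-data trick mentioned in that parenthesis can be sketched in a few lines of NumPy. The shapes and names here are made up for illustration; the point is that a linear operation along the time axis (here a plain FFT, standing in for the wavelet convolution inside a TFR) commutes with the kernel multiplication, so it can be run on the small sensor-space factor first and expanded afterwards:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 60 sensors projected onto 2000 source points.
n_sources, n_sensors, n_times = 2000, 60, 256
kernel = rng.standard_normal((n_sources, n_sensors))   # inverse operator factor
sens_data = rng.standard_normal((n_sensors, n_times))  # sensor-space factor

# The full source array is just the dot product of the two small factors.
full_data = kernel @ sens_data  # shape (2000, 256)

# A linear transform along the time axis commutes with the kernel
# multiplication, so it can be applied to the small factor first.
spec_small_then_expand = kernel @ np.fft.rfft(sens_data, axis=1)
spec_full = np.fft.rfft(full_data, axis=1)
assert np.allclose(spec_small_then_expand, spec_full)

# Memory: the two factors are much smaller than the expanded array.
assert kernel.size + sens_data.size < full_data.size
```

The non-linear steps of a TFR (taking power, averaging) are what eventually force the expansion, which is why fully keeping the factored form through the tfr functions is hard.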
So this will be a point to work on again; however, it will probably look more like a dirty fix, as I really don't see any elegant solution to this at the moment.<br> Another thing I can finally start to work on now will be plotting. To be honest, I'm kind of excited to work on graphical/user interface stuff, since I have often experienced that the implementation of these things is quite fun (i.e. if they work, they work ;) ). But we'll see how that turns out. The other thing to do will be to write some nice tutorial, showing off the newly gained functionality (which I think will also be fun).<br> So, it's time to start the final sprint of the project, and I think I'm prepared for it. Stay tuned!</p>s1041134@stud.sbg.ac.at (DiGyt)Tue, 06 Aug 2019 06:15:02 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-10-blog-post-5/Week 9: Weekly Check-In (#5)https://blogs.python-gsoc.org/en/digyts-blog/week-9-weekly-check-in-5/<p><strong>1. What did you do this week?</strong></p> <p>A lot of different things:</p> <p>- Introduced some small tests to make sure the multitaper and stockwell functions do what they should.<br> - Made tfr_stockwell catch up with tfr_morlet and tfr_multitaper.<br> - Pushed a smaller PR that gives the user an option to return SourceEstimate data as kernelized (i.e. memory-saving).<br> - Made tfr_multitaper and tfr_morlet take lists and generators as input, a crucial further step that allows the Inter-Trial Coherence (ITC) to be calculated (how well this works remains to be checked, however).</p> <p><strong>2. What is coming up next?</strong></p> <p>We'll need to see how well the ITC stuff works and finish all of it. When everything's done, the core part of this GSoC project will be finished. This will involve pushing quite a big chunk of code, reviewing it, and correcting it. Then I'll make sure the same things work for tfr_stockwell as well.
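For readers unfamiliar with it, inter-trial coherence is commonly defined as the magnitude of the across-epochs mean of the unit phase vectors of the complex TFR coefficients. A toy NumPy sketch (array shapes are made up; this is not MNE's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy complex TFR coefficients: (n_epochs, n_channels, n_freqs, n_times).
n_epochs = 50
shape = (n_epochs, 2, 3, 4)
tfr = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

# ITC: average the unit phase vectors over epochs, then take the magnitude.
itc = np.abs(np.mean(tfr / np.abs(tfr), axis=0))

# ITC lies in [0, 1]: ~0 for random phases, 1 for perfect phase locking.
assert itc.min() >= 0 and itc.max() <= 1

# Perfectly phase-locked trials (identical epochs) give ITC of exactly 1.
locked = np.repeat(tfr[:1], n_epochs, axis=0)
itc_locked = np.abs(np.mean(locked / np.abs(locked), axis=0))
assert np.allclose(itc_locked, 1.0)
```

This also shows why list/generator input matters for ITC: the measure needs the per-epoch complex coefficients, not just an already averaged power spectrum.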
Finally, I might start this week with the implementation of the last functional aspect, which is plotting the data.</p> <p><strong>3. Did you get stuck anywhere?</strong></p> <p>Yes. When introducing lists into the tfr functions, the perfect solution would have been to process each "epoch" or element of the list sequentially, in order to save memory. After trying to implement this (and getting it to work for tfr_morlet), I found out that doing this cleanly, in a way that works for all functions, would require a huge restructuring of the tfr functions, which might have grave consequences for other aspects, e.g. parallel computing of the functions. Therefore I had to start off with a bit of a poorer solution, where the data is simply concatenated together at the start of the function, and then treated very much like epochs data. This will increase memory usage. However, I hope that I can implement a memory-saving solution at some later point.</p>s1041134@stud.sbg.ac.at (DiGyt)Tue, 30 Jul 2019 07:29:59 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-9-weekly-check-in-5/Week 8: Blog Post (#4)https://blogs.python-gsoc.org/en/digyts-blog/week-8-blog-post-4/<p>Last week, I was working on parametrizing the tests for SourceEstimates with tfr_morlet, and making sure the tests also pass for all the different function arguments. Overall this worked quite smoothly for most parameters. In fact, I was already able to start implementing the same procedure for tfr_multitaper. Anyhow, for tfr_multitaper there is no reference function as there is for tfr_morlet, so the tests for tfr_multitaper are for now basically just running the function and checking that everything is in the right place.</p> <p>However, some of the parameters were still causing problems. By far the most time this week was spent working on tfr_morlet processing data from 'vector' representations (i.e.
instead of one virtual "sensor" in source space, there are 3 virtual "sensors" created, each one representing one directional activation axis). Besides adding a further dimension (which was no big problem to adapt to), the data fields consistently showed a slight difference (of about 2-4%).<br> After noticing that in this case the results are theoretically not expected to be the same for the two procedures taken in the functions, I'll have to find a case where a comparison between the two is possible.</p> <p>So this is one thing I'm going to do the following week. But mainly, I'm going to continue covering cases for the tfr_multitaper function, as well as starting with the last function, tfr_stockwell.<br> Another thing to possibly look at now is a further option of the functions, which is to additionally return the inter-trial coherence of the time-frequency transforms.</p>s1041134@stud.sbg.ac.at (DiGyt)Mon, 22 Jul 2019 20:20:47 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-8-blog-post-4/Week 7: Weekly Check-In (#4)https://blogs.python-gsoc.org/en/digyts-blog/week-7-weekly-check-in-4/<div class="lead"> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model cms-plugin-aldryn_newsblog-article-lead_in-173"><strong>1. What did you do this week?</strong></p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model cms-plugin-aldryn_newsblog-article-lead_in-173">This week I finally got my first case of equivalence between the new SourceTFR API and the old source_induced_power function. As it turned out, the unequal results were not actually a problem of my code, but instead resulted from a tiny bug in the source_induced_power function, which I use to test my project.
It resulted from a parameter not being passed properly, which caused different wavelet functions to return different time-frequency transforms...</p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model cms-plugin-aldryn_newsblog-article-lead_in-173"><strong>2. What is coming up next?</strong></p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model cms-plugin-aldryn_newsblog-article-lead_in-173">Now, the first thing I'll do is introduce fully parametrized tests to make sure that the wavelet function works for all cases of SourceEstimates and all cases of parameters. If I get this through fast, I might even start to do the same thing for multitapers.</p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model cms-plugin-aldryn_newsblog-article-lead_in-173"><strong>3. Did you get stuck anywhere?</strong></p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model cms-plugin-aldryn_newsblog-article-lead_in-173">Yeah, well: if you have to analyze and compare each line between two different (and heavily nested) functions until you notice that a rather unrelated part is causing the actual problem, I wouldn't exactly call that 'getting stuck'. Anyhow, it results in the same time loss ;)</p> </div>s1041134@stud.sbg.ac.at (DiGyt)Mon, 15 Jul 2019 18:52:22 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-7-weekly-check-in-4/Week 6: Blog Post (#3)https://blogs.python-gsoc.org/en/digyts-blog/week-6-blog-post-3/<p>This week I was doing several smaller things:</p> <p>- Added some more commits to the read_curry function (which in my opinion is definitely ready to be merged now).<br> - Enhanced tests for the SourceTFR class, to push coverage to 89%.<br> - Further tried to get tfr_morlet() and source_induced_power() to produce a 100% equal data field.<br> <br> For the last part I faced several problems. Often, the two named functions are not entirely equivalent.
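A quick toy check of why two such pipelines can disagree: taking a vector norm across dipole components does not commute with a spectral transform along time. Here a plain FFT magnitude stands in for the actual TFR, and the data is random, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy dipole with 3 directional components over time (shape: 3 x n_times).
xyz = rng.standard_normal((3, 128))

def norm(d):
    # Combine x, y, z components: sqrt(x**2 + y**2 + z**2).
    return np.sqrt((d ** 2).sum(axis=0))

def spec(d):
    # Stand-in spectral transform along the time axis.
    return np.abs(np.fft.rfft(d, axis=-1))

# spec(norm(data)) and norm(spec(data)) are generally NOT the same:
a = spec(norm(xyz))  # norm first, then transform
b = norm(spec(xyz))  # transform each component first, then combine
assert a.shape == b.shape
assert not np.allclose(a, b)
```

So whichever pipeline takes the norm first will systematically deviate from the one that transforms the components first, even if both are individually correct.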
For example, when SourceEstimates are created, the vector length is calculated from the x, y, and z components of each dipole. This is done with the plain old norm = sqrt(x**2 + y**2 + z**2). Since ready-made SourceEstimates are fed to the TFR functions, this step has already been performed when doing the time-frequency transform. In contrast, when using source_induced_power, the norm is calculated after the TFR function. Since tfr(norm(data)) and norm(tfr(data)) are not equivalent in this case, I had to use VectorSourceEstimates instead. For VectorSourceEstimates, the x, y, and z components are returned separately and allow separate data handling.<br> Although the data looks more similar when performing the TFR functions for each component of the VectorSourceEstimate and <em>then</em> calculating the norm, there are still various other steps that may cause slight differences in the data. After trying to handle some of them, I decided that my next step would be to go through both functions from start to end, and check where their calculations vary.<br> <br> So, that's going to be an intensive task, but I hope it will work out fine.</p> <p><br> Stay tuned and see you next week!</p>s1041134@stud.sbg.ac.at (DiGyt)Mon, 08 Jul 2019 16:25:41 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-6-blog-post-3/Week 5: Weekly Check-In (#3)https://blogs.python-gsoc.org/en/digyts-blog/week-5-weekly-check-in-3/<p><strong>1. What did you do this week?</strong></p> <p><span style="font-size: 14px;"><span style=""><span style="line-height: 100%;"><font face="Calibri, sans-serif">Unfortunately, the read_curry PR is still not done, so I had to focus on this problem once again. However, I still had a little bit of time to focus on the main project besides. There, I started to introduce more structured tests for both multitaper- and morlet-transformed source-level data. </font></span></span></span></p> <p><strong>2.
What is coming up next?</strong></p> <p><span style="font-size: 14px;"><span style=""><span style="line-height: 100%;"><font face="Calibri, sans-serif">I just REALLY hope that there are no more things in need of change with the currently running read_curry PR, and that I can finally focus entirely on the main task. </font></span></span></span></p> <p><strong>3. Did you get stuck anywhere?</strong></p> <p><span style="font-size: 14px;"><span style=""><span style="line-height: 100%;"><font face="Calibri, sans-serif">There were some wishes to refactor the testing suite for more clarity (which is of course a good thing), but also some requests for additional coverage. Additional coverage, however, in this case meant additional testing data. This testing data wouldn't be a problem if adding it weren't a relatively time-consuming process. Testing data needs to be added to the testing repository, and then, after adding it, the version and download hash have to be adapted in the main repository before one can commit the changes that rely on this data. Of course, it's important to maintain a stable repo, but as said, it can really stretch out the development process. </font></span></span></span></p>s1041134@stud.sbg.ac.at (DiGyt)Mon, 01 Jul 2019 20:06:04 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-5-weekly-check-in-3/Week 4: Blog Post (#2)https://blogs.python-gsoc.org/en/digyts-blog/week-4-blog-post-2/<p>This week, I was mainly focusing on another Pull Request, which allows MNE to read in data created by an external program, the Curry Neuroimaging Suite.<br> This pull request had been lingering on for quite some time (I already mentioned it in Blog Post #2); anyhow, there was little progress because of a tough problem concerning the testing data.<br> After contacting one of Curry's developers, we could eliminate the problem in question for our Curry reader.
Since my mentor asked me to focus on the Curry issue first before progressing,<br> I worked to get this issue done this week, so that I can have my mind entirely set on the source space TFR from now on. The PR is now about to be merged, and I hope I'll make some great progress on the SourceTFR stuff next week.</p>s1041134@stud.sbg.ac.at (DiGyt)Sun, 23 Jun 2019 23:23:54 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-4-blog-post-2/Week 3: Weekly Check-In (#2)https://blogs.python-gsoc.org/en/digyts-blog/week-3-weekly-check-in-2/<p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model"><strong>1. What did you do this week?</strong></p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model">This week I tried to adapt the computation of the `tfr_multitaper` function to produce the correct data field when fed with `SourceEstimate` objects.<br> Later, I switched to adapting the `tfr_morlet` function in the same way I did with `tfr_multitaper`, since I noticed that this would be a better place to start in terms of comparability (there is an already existing function for morlet wavelet transforms in source space to which I can compare my work, while for multitaper, there is none).</p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model"><strong>2. What is coming up next?</strong></p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model">I'll try to create exact results from the morlet data now. It already looks very good (let's say the data fields are about 80-90% similar; however, there are still some small differences I haven't figured out yet).</p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model"><strong>3. Did you get stuck anywhere?</strong></p> <p class="cms-plugin cms-plugin-aldryn_newsblog-article-lead_in-92 cms-render-model">Yes.
When getting `tfr_multitaper` to read `SourceEstimates`, I thought that testing on artificially created frequency data was a good idea. However, I noticed that this is not an optimal solution for testing, especially since it's better to use valid `SourceEstimate` data. This was also the reason I switched to working on `tfr_morlet` before getting `tfr_multitaper` done entirely.</p>s1041134@stud.sbg.ac.at (DiGyt)Wed, 19 Jun 2019 06:36:46 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-3-weekly-check-in-2/Week 2: Blog Post (#1)https://blogs.python-gsoc.org/en/digyts-blog/week-2-blog-post-1/<p>Last week, I got the first `SourceTFR` draft far enough to actually use it for the TFR functions, which was the other main task this week.<br> The process was more or less like force-feeding this alien object into a constantly complaining function, checking out every error it raised and trying to correct it. Yay!<br> The main task for now was pretty much juggling data fields around until everything fit, and introducing new if-cases for the `SourceEstimate` data types.<br> I don't really expect the function to produce correct time-frequency transforms at this point, but I think I'm getting closer and closer.</p> <p>Unfortunately, I also had to work on another MNE pull request last week, which cost me A LOT of time, yet I wasn't able to make much progress on it.<br> So this week, I'm going to focus more on the GSoC project, which is of course the more important thing to work on.<br> <br> So for this week, I'm going to continue working on the multitaper TFR function and hope to produce some first results of time-frequency transforms.</p>s1041134@stud.sbg.ac.at (DiGyt)Mon, 10 Jun 2019 14:22:17 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-2-blog-post-1/Week 1: Weekly Check-In (#1)https://blogs.python-gsoc.org/en/digyts-blog/week-1-weekly-check-in-1/<p><strong>1.
What did you do this week?</strong></p> <p>This week I started creating a `SourceTFR` class to work with, as well as making some first attempts to feed `SourceEstimates` into the multitaper TFR function.</p> <p><strong>2. What is coming up next?</strong></p> <p>I'll continue to work on the `SourceTFR` data, up to a point where I can use them for the TFR functions (that means plotting might not yet be involved). Then I'll try to make an entire multitaper function work for one case of `SourceEstimate`.</p> <p><strong>3. Did you get stuck anywhere?</strong></p> <p>As expected, there were several walls I ran into. One aspect is of course the varying dimensionality and type of data that `SourceEstimates` can contain, which is not easily adapted to time-frequency functions. Another type of problem certainly is making sure the function works correctly and works well for all kinds of tunable parameters (which is of course a big part of the goal of this project).</p>s1041134@stud.sbg.ac.at (DiGyt)Mon, 03 Jun 2019 18:57:20 +0000https://blogs.python-gsoc.org/en/digyts-blog/week-1-weekly-check-in-1/First Stepshttps://blogs.python-gsoc.org/en/digyts-blog/first-steps/<p>This week I started creating a `SourceTFR` class to work with, as well as making some first attempts to feed `SourceEstimates` into the multitaper TFR function.<br> It already looks like it's going to be a lot of work, not only because new types of data need to be fed into the functions, but also concerning the time-frequency transforms themselves.<br> The main motivation behind this project is to allow users to handle time-frequency transformed source space data with as few restrictions as possible, so I'll try to make sure all of these options are covered well for source data (i.e. all the options that are theoretically possible).
But that will surely require a lot of work.</p> <p>So, I'll keep on making small steps, working my way through the multitaper functions, and then later make sure it works nicely for every case.</p> <p>See you next week,<br> Dirk</p>s1041134@stud.sbg.ac.at (DiGyt)Sun, 02 Jun 2019 19:03:16 +0000https://blogs.python-gsoc.org/en/digyts-blog/first-steps/GSoC: Let's get this started.https://blogs.python-gsoc.org/en/digyts-blog/gsoc-let-s-get-this-started/<p>Hey there and welcome to my Google Summer of Code blog!</p> <p>For the next months, I will be working on improving Time-Frequency Analysis for source space projected data in MNE-Python.</p> <p>After having set up my development environment and worked on some unrelated commits, I'm currently making the first steps for the actual project. Over the last weeks, some of the MNE members and I were discussing a more precise outline for the kind of data container object in which we would like to store our newly created time-frequency transformed source data. The result of this discussion was a pretty cool type of container that would store source space data in a very memory-efficient way, which at the same time could drastically reduce the time needed for computing a time-frequency transform on it. However, such a class could be made even bigger, so as to include <em>any</em> kind of `SourceEstimate` data type, and might lead to a significant restructuring of the entire source space API.
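To make the memory-efficient container idea a bit more concrete, here is a minimal, purely hypothetical sketch (the class name and attributes are mine, not MNE's actual API): the kernel and the sensor-space data are kept as two small factors, and the full source-space array is only expanded on access.

```python
import numpy as np

class KernelizedEstimate:
    """Toy source-space container storing (kernel, sensor data) as factors."""

    def __init__(self, kernel, sens_data, times):
        self._kernel = kernel        # (n_sources, n_sensors)
        self._sens_data = sens_data  # (n_sensors, n_times)
        self.times = times

    @property
    def data(self):
        # The full source array is only materialized when actually requested.
        return self._kernel @ self._sens_data

rng = np.random.default_rng(4)
stc = KernelizedEstimate(rng.standard_normal((1000, 32)),
                         rng.standard_normal((32, 50)),
                         times=np.arange(50) / 100.0)
assert stc.data.shape == (1000, 50)
```

Until `.data` is touched, such an object holds roughly `n_sources * n_sensors + n_sensors * n_times` values instead of `n_sources * n_times`, which is where both the memory and the computation savings come from.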
While this task is probably too large to be handled within one single GSoC project, I'm currently starting out by creating a kind of 'draft' class, which for now will only be used for the `SourceTFR` objects that I'll create during my project.<br> <br> So there's a lot of interesting stuff coming up right now and I'm very excited to find out how it's going to turn out...<br> <br> Stay tuned,<br> Dirk</p>s1041134@stud.sbg.ac.at (DiGyt)Tue, 28 May 2019 20:01:53 +0000https://blogs.python-gsoc.org/en/digyts-blog/gsoc-let-s-get-this-started/