Articles on jadrew43's Blog (https://blogs.python-gsoc.org)
Updates on different articles published on jadrew43's Blog, by jadrew43 (jadrew43@uw.edu)


Week 12 (August 29) Blog Post
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-12-august-29-blog-post/
Mon, 29 Aug 2022 22:25:25 +0000

Last week I succeeded in including all the necessary steps of Principal Component Analysis (PCA). The first issue was that the covariance matrix needed to be scaled to account for the SNR-boosting step, in which 3 epochs were averaged into a single epoch; the covariance matrix therefore needed to be divided by 3. Then I had to ensure that the averaged epochs were processed in the same manner as my original epochs. Next, a projection operator had to be applied to the forward matrix according to the epochs.info structure. The API is working and reproduces the ground truth from my command-line model within a small tolerance. However, the data-scaling function that I had gotten to work earlier in the summer, when I was not yet using a ground truth to check my work, is not working now. I recall being able to reproduce that scaling using MNE functions, so this week will focus on replacing my function with MNE's. Then I will work to publish this script among MNE-Python's example scripts so that others can use the functional connectivity model.

Although I only completed 1 of 2 major contributions intended for this summer, I learned a ton about MNE-Python's functions, collaborating on GitHub, and building objects and classes. Most importantly, I learned to use a ground truth when converting code into a new environment, to ensure that each step is transformed correctly, and to make frequent use of sanity checks.

Week 11 (August 22) Check-in
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-11-august-22-check-in/
Mon, 22 Aug 2022 19:42:34 +0000

The first step of the API, simply loading the epoch data from a folder, was successful. I was also successful in implementing the SNR-boost step, which averages a small number of epochs in order to increase the SNR of each epoch input to the model. I confirmed that the bootstrapped epoch output from the in-progress API was equivalent to the bootstrapped epoch output from the command-line model. Then I got stuck on the PCA step. I first discovered that the epochs were being trimmed before the bootstrap step in my API, but after bootstrapping in the command-line model. I made these consistent, yet the PCA step is producing a different number of principal components in the API than on the command line. The remaining inputs seem to be the same, but I will investigate more deeply this week. After confirming that the PCA output matches, I will fit the model. (Rough sketches of the boosting step and a PCA sanity check are below.)
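A minimal sketch of the SNR-boost step described above, assuming groups of 3 epochs per condition are drawn at random and averaged (the group size of 3 comes from the Week 12 post; the file name, condition name, and random grouping are placeholder assumptions, not the project's actual code):

```python
import numpy as np
import mne

# Sketch of SNR boosting: average random groups of 3 epochs into single,
# less noisy epochs. "sample-epo.fif" and "Auditory/Left" are placeholders.
rng = np.random.default_rng(42)
epochs = mne.read_epochs("sample-epo.fif")["Auditory/Left"]
data = epochs.get_data()                      # (n_epochs, n_channels, n_times)

n_groups = len(epochs) // 3
idx = rng.permutation(len(epochs))[:n_groups * 3].reshape(n_groups, 3)
boosted = data[idx].mean(axis=1)              # (n_groups, n_channels, n_times)
boosted_epochs = mne.EpochsArray(boosted, epochs.info, tmin=epochs.tmin)

# Per the Week 12 post, a noise covariance estimated from the original epochs
# must be divided by the group size (3 here) to stay consistent with the
# averaged epochs.
```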
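For the mismatch in the number of principal components, one simple sanity check is to count how many components each pipeline needs to reach a fixed explained-variance threshold; the 99% threshold below is only an illustrative assumption:

```python
import numpy as np

def n_components_for_variance(data, threshold=0.99):
    """Count PCA components needed to reach `threshold` explained variance.

    `data` is a 2D array (n_channels, n_samples); run this on the array each
    pipeline feeds to PCA and compare the two counts to localize the mismatch.
    """
    centered = data - data.mean(axis=1, keepdims=True)
    _, s, _ = np.linalg.svd(centered, full_matrices=False)
    explained = np.cumsum(s ** 2) / np.sum(s ** 2)
    return int(np.searchsorted(explained, threshold) + 1)
```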
Week 10 (August 15) Blog Post
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-10-august-15-blog-post/
Mon, 15 Aug 2022 20:26:50 +0000

Last week I succeeded in altering the pre-processing steps so that the model processes data with respect to the 4 labels included with the sample dataset, instead of the 12 labels I had used for a previous dataset in earlier work with the model. The model also ran successfully in the conda environment used by MNE-Python, which includes several dependencies specific to MNE-Python; I had to add a single dependency used for calculating derivatives and performing backpropagation. Moving forward, I will begin to reconstruct the API I had working in week 8, one step at a time, ensuring that I can compute the same output after each processing step. The first step is an SNR-boosting step in which a small number of epochs (per experimental condition) are averaged together in order to de-noise the data. Once I check that the averaged epochs from my API match those from my original model (computed strictly from the command line), I can move on to computing PCA on the data. I will ensure that the same number of principal components is computed by my API and by the command-line pipeline and, again, that this step produces the same output as my command-line model.

Week 9 (August 8) Check-in
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-9-august-8-check-in/
Mon, 08 Aug 2022 20:46:13 +0000

Last week I was able to restructure the sample dataset so that it works with my original connectivity model. The model currently outputs connectivity measures for the 12 regions of interest used in my PhD connectivity project. This week I will work to get this model running in the conda environment that MNE-Python uses to develop their tools. The following step will be to get the model working with the 4 regions of interest used with the sample dataset from MNE. Then I can re-create the API I constructed over the last several weeks while ensuring that the output is as expected at each iterative step.

Week 8 (August 1) Blog Post
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-8-august-1-blog-post/
Mon, 01 Aug 2022 20:02:37 +0000

Last week was challenging. I was able to change the data-scaling function so that it uses MNE functions to accomplish the same goal. The challenge arose when processing the 16 subjects' data that was meant to act as a ground truth against which to compare the model's output; the output is currently pretty far from the expectation. I more or less need to start from the beginning and check, step by step, that each part of the model is producing something I would expect. The biggest challenge is that it is late in the summer and I am concerned I won't be able to get to the second part of my project. My plan for this week is to run the sample dataset, frequently used in examples on the MNE-Python website, through my already existing pipeline, which uses primarily the command line and configuration files to process data. Once that is working, I can do a step-by-step comparison of each step's output between the command-line pipeline and the API I have been working on for this project.

Week 7 (July 25) Check-in
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-7-july-25-check-in/
Mon, 25 Jul 2022 17:22:53 +0000

Last week I worked to create functions that allow multiple subjects' data to be input to the connectivity model. I got stuck on replacing the function that manually scaled the data according to sensor type with MNE's functions for scaling. I followed my mentor's suggestions for creating this function and will go over it with him this week; a rough sketch of what that replacement might look like is below.
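One candidate for that replacement, assuming the hand-written function applies fixed factors per sensor type, is mne.decoding.Scaler; this is only a sketch with a placeholder file name and MNE's plotting-default factors, not the solution agreed on with my mentor:

```python
import mne

# Sketch: scale each sensor type with MNE instead of a hand-written function.
# The dict applies fixed per-type factors (these values are MNE's plotting
# defaults, used here only as an example); scalings="mean" would instead
# standardize every channel by its own mean and standard deviation.
epochs = mne.read_epochs("sample-epo.fif")  # placeholder file name
scaler = mne.decoding.Scaler(epochs.info,
                             scalings=dict(mag=1e15, grad=1e13, eeg=1e6))
data_scaled = scaler.fit_transform(epochs.get_data())
```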
I also had to run the model for 16 subjects over the weekend. I will plot those results this week and, if they agree with what I expect from previous use of the model with this dataset, I will move on to building the statistical model that measures the statistical differences between functionally connected networks.

Week 6 (July 18) Blog Post
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-6-july-18-blog-post/
Tue, 19 Jul 2022 18:31:08 +0000

Last week I attended a research conference where I presented a poster showing an application of the autoregressive model I am working to publish for this project. I shared measurements for a single condition, compared between a neurotypical population and a population with autism spectrum disorder. I was also able to integrate a few stylistic/coding preferences requested by my mentor. This week I will continue working on the style edits, as well as ensuring that the interface for uploading data and using the model is minimal, requiring as little input from the user as possible and keeping all the calculations hidden from the user. One thing I'm having trouble with is deciding which functions and variables should be marked as private (indicated by a leading underscore) and which should remain part of the public interface. My goal for this week is to have the model itself publishable, so that next week I can move on to building the statistical model which will be used to calculate the statistical significance of the connectivity measures from the model.

Week 5 (July 11) Check-in
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-5-july-11-check-in/
Wed, 13 Jul 2022 15:19:17 +0000

Last week was successful. I was able to run PCA to decrease the dimensionality of the sensor data, covariance matrix, and forward matrix. The reduced-dimension data runs through the connectivity model smoothly. I was also able to take a list of cortical labels from the user, which tells the model between which cortical regions functional connectivity should be measured. I implemented a graphing tool to plot the connectivity values between the chosen cortical regions over time. This week I am attending a research conference. Once I return, I will ensure the code is aligned with MNE-Python's formatting guidelines and work to publish it.

Week 4 (July 5) Blog Post
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-4-july-5-blog-post/
Tue, 05 Jul 2022 19:18:30 +0000

My changes from last week had to be reverted: in an attempt to have the user interact with only the minimum number of functions, I hid some functions within other processes. However, the further I got in building my script for measuring functional connectivity, the more I needed variables that were hidden inside other functions. So I restored my program to a previous version where all the variables I needed were available within the script and did not require going into any other functions. My plan is to get the script working with all of the functions/variables needed to initialize the model available in this script, and once things are working smoothly, I can shift things (with some guidance from my mentor) so that the user interacts with as few functions/variables as possible; a sketch of the kind of public/private split I have in mind is below.
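A rough sketch of that kind of split, with a small public surface and leading-underscore helpers; every name here is invented for illustration and is not an existing MNE-Connectivity class:

```python
class ConnectivityModel:
    """Hypothetical user-facing wrapper: only __init__ and fit are public."""

    def __init__(self, labels, n_components=None):
        self.labels = labels              # cortical regions chosen by the user
        self.n_components = n_components  # PCA dimensionality, if any

    def fit(self, epochs, forward):
        # The user calls fit(); the intermediate steps stay hidden.
        data = self._scale(epochs)
        data = self._reduce(data)
        # ... fit the autoregressive connectivity model on `data` ...
        return self

    def _scale(self, epochs):
        """Private helper (leading underscore): per-sensor-type scaling."""
        return epochs.get_data()

    def _reduce(self, data):
        """Private helper (leading underscore): PCA dimensionality reduction."""
        return data
```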
I was able to process the sample data successfully so that the model for functional connectivity could be initialized and the algorithm run to make the connectivity measurements. However, the algorithm fitting the connectivity measurements is currently crashing Python. I plan to take two steps this week that should fix this problem: 1) perform PCA on the sensor data so that the algorithm operates on fewer dimensions, and 2) select regions of interest so that connectivity is only measured for user-selected cortical regions.

Week 3 (June 27) Check-in
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-3-june-27-check-in/
Mon, 27 Jun 2022 19:21:24 +0000

Over the past week I was able to fix the issue I was facing with the sample dataset by using MNE's functions to load a forward matrix from the sample data, instead of computing the matrix from the data myself; a sketch of that loading step is below. Then I was able to move some functions around so that the user only interfaces with the functions they absolutely need for working with their data. The next steps will be to test MNE functions to see whether some of my pre-processing steps can be replaced. I did not hit any severe obstacles last week.
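A sketch of that loading step, using the forward solution file that ships with the MNE sample dataset; the fixed-orientation conversion at the end is an assumption about what the connectivity model expects, not a documented requirement:

```python
import os.path as op
import mne

# Load the precomputed forward solution bundled with the MNE sample dataset
# instead of recomputing it (BEM, source space, coregistration) by hand.
data_path = mne.datasets.sample.data_path()
fwd_fname = op.join(str(data_path), "MEG", "sample",
                    "sample_audvis-meg-eeg-oct-6-fwd.fif")
fwd = mne.read_forward_solution(fwd_fname)

# Assumption: the model wants a fixed-orientation gain (leadfield) matrix.
fwd = mne.convert_forward_solution(fwd, surf_ori=True, force_fixed=True,
                                   use_cps=True)
gain = fwd["sol"]["data"]  # n_channels x n_sources
```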
Week 2 (June 20) Blog Post
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-2-june-20-blog-post/
Mon, 20 Jun 2022 19:39:21 +0000

During my first week of GSoC, I started to build a simple script that uses the tools I have used to measure functional connectivity in my research. The challenge in building this script is that its goal is to use the example data that MNE-Python relies on in its examples and tutorials. In my first attempts, it was clear that the structure of the sample data is different from the data I've used in my research, in terms of the number of sensors, how bad channels are handled, and the use of data from both the left and right hemispheres. I left off last week deciding that I need a different, more visual tool to understand the sample data so that I can reshape its structure to fit my connectivity model.

I've received lots of feedback on the first draft of this script, and I see that over the next week, in addition to changing the example data structure to fit my model, I will need to change variable and function names to align with MNE-Python's coding conventions. Beyond naming, I am learning that the scripts that will serve as tutorials for MNE-Connectivity need to give users only the minimal amount of information they need to navigate the tools, and that functions not important to the user should be hidden in the backend so as not to overwhelm the user.

Week 1 (June 13) Check-in
https://blogs.python-gsoc.org/en/jadrew43s-blog/week-1-june-13-check-in/
Mon, 13 Jun 2022 18:32:15 +0000

Over the last few weeks I have gotten acquainted with the MNE-Python subgroup by joining their Discord, following the contribution guide to make my first pull request, and starting an API proposal describing the methods I will implement over the next several weeks. This week I plan to begin packaging a toolbox for measuring functional connectivity, using a coding style aligned with the rest of MNE-Python. For my first pull request, I chose to add a feature to mne-qt-browser, which is used to display neuroimaging data over time. I got stuck figuring out which part of the code needed to be modified to add the new feature, so I chose to edit the same function in the matplotlib backend, which was more familiar and easier to navigate.