Week 10

For my tenth week, I presented my poster at the end-of-summer CS research poster presentation. From there, I started writing my final report for the DREU program and finishing up my website. This is my last week; overall, I loved my experience doing research at UIUC, and I will continue my current research during the school year.

Week 9

For my ninth week, I managed to fix the reservoir structure. I then had to redo the tests from the previous weeks using the new reservoir. However, since I had to finish my poster at the same time, I only tested using sine data and mainly focused on reservoir parameters such as reservoir size and spectral radius. During this week, I also finished my presentation and my abstract, and started planning next steps for reservoir testing. At the end of the week, I presented my poster within my lab group.

Week 8

For my eighth week, I learned about the MiV simulator, which is the simulator my group uses to run large biophysical neuron simulations. I then used Docker to set up the simulator on my device and read through the MiV simulator documentation. I learned that the MiV simulator only accepts spike train data, so I looked into how to encode inputs into spike trains. I then decided to compare my reservoir computing Nengo model to a baseline, which was simply performing linear regression on the input. During this process, I realized there were architectural mistakes in my current reservoir, so I spent the rest of the week fixing it.
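One simple way to encode an analog signal as spike trains is rate coding, where each sample's value sets its spiking probability per time step. This is a generic sketch of the idea, not necessarily the encoding the MiV simulator expects; the signal values and step count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def rate_encode(signal, n_steps, max_rate=0.5):
    """Encode each sample of a [0, 1] signal as a Bernoulli spike train:
    higher signal values produce proportionally more spikes per step."""
    p = np.clip(signal, 0.0, 1.0) * max_rate
    return (rng.random((len(signal), n_steps)) < p[:, None]).astype(int)

signal = np.array([0.0, 0.5, 1.0])
spikes = rate_encode(signal, n_steps=1000)
# Mean firing rates approximate signal * max_rate: ~0, ~0.25, ~0.5.
```

Each row of `spikes` is the spike train for one input sample; averaging a row recovers (approximately) the encoded value.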

Week 7

For my seventh week, I got started on the final presentation that I would be giving on July 28th and July 31st. I also started testing parameter tuning for the reservoir to see what the effects would be. I found that increasing the reservoir size and setting the spectral radius to 1 greatly improved the training MSE. However, it also caused overfitting, which showed up in the testing MSE. Because of this, I added a regularization parameter, which fixed the overfitting and yielded good results. I also tested preprocessing methods such as PCA and tried different training layers. Overall, of linear regression, SVM, and MLP, the MLP performed the best, and adding PCA decreased the MSE.
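The two knobs described above can be sketched in plain NumPy: rescaling the recurrent weight matrix to a target spectral radius, and adding a ridge penalty `lam` to the readout fit to curb overfitting. The matrix sizes and values here are illustrative, not the ones from the actual experiments:

```python
import numpy as np

rng = np.random.default_rng(1)

def scale_spectral_radius(W, target):
    """Rescale W so its largest eigenvalue magnitude equals target."""
    return W * (target / np.max(np.abs(np.linalg.eigvals(W))))

# A random recurrent matrix scaled to spectral radius 1.
W = scale_spectral_radius(rng.standard_normal((50, 50)), target=1.0)

def ridge_readout(X, y, lam):
    """Ridge regression: lam penalizes large readout weights,
    which counteracts overfitting on the training set."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Sanity check on synthetic data: with a tiny lam, ridge recovers
# the true linear readout almost exactly.
X = rng.standard_normal((200, 50))
w_true = rng.standard_normal(50)
y = X @ w_true
w_hat = ridge_readout(X, y, lam=1e-8)
```

Increasing `lam` shrinks the readout weights toward zero, trading a little training error for better generalization.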

Week 6

For my sixth week, I took my Nengo reservoir and tried to integrate it into my team’s reservoir_computing repository. I had to make my code modular, and I eventually got it set up for the airplane data and my Nengo reservoir. From there, I tested with just a sine input to make sure that the reservoir worked on simple inputs. Then, I looked through the literature to see which layers, architectures, and parameters would be good to test with my new Nengo model. I wrote up a list of things I wanted to test and started setting up modules for them in the reservoir_computing repository. By the end of the week, I had made SVM and MLP training layers, along with PCA and encoder preprocessing classes.
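The PCA preprocessing step can be sketched with an SVD of the centered state matrix; the state dimensions below are placeholders, not the actual reservoir size:

```python
import numpy as np

def pca_transform(X, n_components):
    """Project rows of X (e.g. reservoir states over time) onto their
    top principal components via an SVD of the centered matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(3)
states = rng.standard_normal((200, 50))   # stand-in for reservoir states
reduced = pca_transform(states, n_components=10)
```

The reduced states can then be fed to any training layer (linear regression, SVM, MLP) in place of the raw reservoir states.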

Week 5

For my fifth week, I started constructing a reservoir using Nengo and managed to set up a working one. Then, I set up the airplane data so that I could use it with the Nengo reservoir. Finally, I ran a few tests with the Nengo reservoir and the airplane data, comparing preprocessed airplane data against the raw data. Nengo also has built-in solvers, which I tested against applying ridge regression directly. Overall, preprocessing the data and using ridge regression directly yielded better results.

Week 4

For my fourth week, I continued reading articles and documentation for Nengo and reservoir computing. In addition, I started testing possible training methods for the readout of our future Nengo reservoir. Specifically, I wrote and tested linear regression and multilayer perceptron classes.

Week 3

For my third week, I learned in depth what my project entails. Essentially, the end goal is to implement reservoir computing using bio-physical hardware. To achieve this goal, the lab I am working in has already created a simulator known as the MiV simulator. Before using this simulator, however, my goal is to implement reservoir computing using Nengo, which is a neural simulator. Testing different reservoir computing models in Nengo will act as another layer of evaluation alongside the MiV simulator and will help the lab learn more about how reservoir computing would fare in a bio-physical setting. Thus, this week I mainly went through Nengo tutorials and documentation so that I could get started on this process. In addition, I continued to read articles about reservoir computing.

Week 2

For my second week, I tried to actually implement reservoir computing. Using an example dataset (the number of airplane passengers over the course of 12 years), I trained a model to forecast the number of passengers. I learned how to preprocess a temporal dataset, how to construct a reservoir, and how to train the resulting output of the reservoir using ridge regression. I also spent a large portion of the week reading articles on reservoir computing.
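One common way to preprocess a temporal dataset for this kind of forecasting is a sliding window, where each window of past values is used to predict the next one. The toy series and window length below are illustrative stand-ins, not the actual passenger data:

```python
import numpy as np

def make_windows(series, window):
    """Turn a 1-D time series into (past window -> next value) pairs."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

# Toy monthly series standing in for the airplane-passenger data.
series = np.arange(24, dtype=float)
X, y = make_windows(series, window=3)
# X[0] is [0, 1, 2] and y[0] is 3: each window predicts the next step.
```

Each row of `X` then becomes one input to the reservoir, with `y` as the forecasting target for the readout.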

Week 1

For my first week, I mainly worked through tutorials essential for understanding the project I am working on. These tutorials covered data visualization, introductory machine learning material, and temporal data handling. From there, I read introductory articles on reservoir computing, which is a key part of my project. I learned that reservoir computing is a type of recurrent neural network (RNN). Its components include an input, a reservoir (the random, recurrently connected nodes), a readout layer, and a final output layer. Unlike a typical RNN, in which all the weights in the network must be trained via backpropagation, reservoir computing only trains the weights between the readout and final output layer. The reservoir itself is left completely untrained; its weights are typically set to random values. This makes training much easier and more flexible.
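A minimal sketch of this architecture in plain NumPy, assuming a tanh reservoir and a ridge-regression readout (all sizes and the ridge penalty here are illustrative choices, not values from the project):

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: random, recurrently connected nodes that are never trained.
n_res, n_in = 100, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep the dynamics stable

def run_reservoir(u):
    """Collect the reservoir state after each input sample."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)

# Only the readout weights are trained, here with ridge regression --
# no backpropagation through the reservoir is needed.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)
mse = np.mean((X @ W_out - y) ** 2)
```

Because only `W_out` is fit, training reduces to a single linear solve over the collected reservoir states, which is what makes this approach so much cheaper than training a full RNN.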