Today I presented everything I did over the summer. I felt that all of the presentations went well, and I am very grateful to RIT as well as my mentors for this amazing experience. Here is my presentation.
I started today by debugging the code for my neural network. After a healthy amount of googling and some perseverance, I was able to roughly train and evaluate my model. The performance was subpar, but hopefully I will be able to improve it next week. After that, I went to the free pizza lunch provided for all of the REU summer research students and then attended a lecture on visual perception. Although this was not very related to my project, I still found it to be a valuable use of my time. It was mind-blowing to realize that our eyes can only resolve fine detail in a very small field of vision; our peripheral vision is a lot worse than I previously thought. One of the coolest visual demos was the two tables shown below. Believe it or not, the tabletops are exactly the same size and shape. Human visual perception is fascinating, and I plan to continue attending the lectures throughout the summer. Source: http://www.optical-illusionist.com/illusions/table-size-ill...
While I was waiting for my experiments to run on the server, I spent the day presenting in front of various members of kLab. I got some very good advice from Dr. Kanan in the morning, which I used to update my presentation. Later in the day, with the help of Tyler, Dr. Kemker, and Kushal, I arrived at a presentation I was happy with. I plan to update it with the results from the experiments I ran.
Today I continued what I had started yesterday. I ran into some trouble installing spacy, a natural language processing library. Natural language processing is a term used to describe computer analysis of human languages. After finally figuring out how to install all of the packages I needed, I was ready to start experimenting with some VQA code. Using some sample code from an online blog, I was able to explore how VQA uses recurrent neural networks (RNNs) to analyze sentences. The rest of my day was spent going line by line through the code and googling anything I did not understand. I gained a very basic understanding of how an RNN is able to retain information about the words it has already processed, and why this is useful in VQA.
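To make that idea concrete, here is a minimal sketch of the core of a recurrent layer. This is not the code from the blog I was studying; the vocabulary, weights, and names are all made up for illustration. The key point is the hidden state `h`: each word updates it, so by the end it summarizes the whole question.

```python
import numpy as np

# Toy setup: a tiny vocabulary and random weights (illustrative only).
rng = np.random.default_rng(0)

vocab = {"what": 0, "color": 1, "is": 2, "the": 3, "table": 4}
embed_dim, hidden_dim = 8, 16

E = rng.normal(scale=0.1, size=(len(vocab), embed_dim))     # word embeddings
W_xh = rng.normal(scale=0.1, size=(embed_dim, hidden_dim))  # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim)) # hidden -> hidden
b_h = np.zeros(hidden_dim)

def encode(question):
    """Process the question word by word. The hidden state h is
    carried forward at every step, which is how the network
    'remembers' earlier words when reading later ones."""
    h = np.zeros(hidden_dim)
    for word in question:
        x = E[vocab[word]]                       # look up the word vector
        h = np.tanh(x @ W_xh + h @ W_hh + b_h)   # mix new word into memory
    return h

# The final hidden state is a fixed-size summary of the question,
# which a VQA model can then combine with image features.
q = encode(["what", "color", "is", "the", "table"])
```

In a real VQA system this role is usually played by an LSTM or GRU rather than this plain recurrence, but the memory-carrying idea is the same.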