Day 5: Beginning to create my own convolutional neural network

At the start of today I watched two of Stanford's lectures: one on feed-forward neural networks and the other on convolutional neural networks. In a feed-forward network, information flows in one direction only, with no connections back to previous nodes; each layer's outputs simply provide the inputs for the next layer. The other type of neural network is a recurrent neural network. What makes a recurrent network different is that a layer's outputs can be fed back as inputs to the same layer (or an earlier one) at the next time step. I have not gone into depth on recurrent networks, so my knowledge there is limited.

A convolutional network is not a separate category of network so much as a feed-forward network that includes convolutional layers, which transform the data by sliding filters across it. A related operation that usually accompanies convolution is pooling. As demonstrated below, pooling takes a small window of values and collapses it to a single number, commonly the maximum or the average, producing a new, smaller array. Pooling is useful because it shrinks your matrix while retaining a representative value for each region, which makes later calculations faster. I will continue to build my own convolutional neural net over the next week.
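To make the pooling idea concrete, here is a minimal sketch in NumPy. The `pool2d` helper is my own illustrative function (not from the lecture), and it assumes a 2-D input and non-overlapping windows; real frameworks offer more general versions.

```python
import numpy as np

def pool2d(x, size=2, mode="max"):
    """Downsample a 2-D array by collapsing each non-overlapping
    size x size window into a single value (max or mean)."""
    h, w = x.shape
    x = x[: h - h % size, : w - w % size]  # trim so windows fit evenly
    h2, w2 = x.shape
    # Group the array into (rows, size, cols, size) blocks,
    # then reduce over each block's two window axes.
    blocks = x.reshape(h2 // size, size, w2 // size, size)
    if mode == "max":
        return blocks.max(axis=(1, 3))
    return blocks.mean(axis=(1, 3))

# A 4x4 input shrinks to 2x2 -- a quarter of the values to work with
x = np.arange(1, 17).reshape(4, 4)
print(pool2d(x, 2, "max"))   # [[ 6  8] [14 16]]
print(pool2d(x, 2, "mean"))  # [[ 3.5  5.5] [11.5 13.5]]
```

Each output cell summarizes one 2x2 region of the input, which is exactly the size reduction described above.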


[Animated GIF: a pooling operation sliding over a feature map]
Source: https://hackernoon.com/visualizing-parts-of-convolutional-neural-networks-using-keras-and-cats-5cc01b214e59
