#014 CNN Residual nets
Residual networks

Last time we saw how VGG networks work. However, very deep neural networks are difficult to train because of vanishing and exploding gradient problems. In this post we'll learn about skip connections, which allow us to take the activation from one layer and feed it to another layer much deeper in the neural network. Using them we will build ResNets, which enable us to train very deep networks, that…
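The skip-connection idea can be sketched in a few lines. This is a minimal NumPy illustration, not the post's actual implementation: `residual_block` is a hypothetical two-layer block with ReLU activations, and the key line simply adds the input `x` back to the block's output before the final activation.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    # Main path: two linear layers with a ReLU in between.
    out = relu(W1 @ x)
    out = W2 @ out
    # Skip connection: add the original input back before the activation.
    return relu(out + x)

# Sanity check: with zero weights the main path contributes nothing,
# so the block reduces to ReLU(x) — the identity is easy to learn.
x = np.array([1.0, -2.0, 3.0])
W_zero = np.zeros((3, 3))
print(residual_block(x, W_zero, W_zero))  # → [1. 0. 3.]
```

Because the skip path lets the block fall back to (near) identity, stacking many such blocks does not degrade the signal the way plain deep stacks do, which is what makes very deep ResNets trainable.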