Author: datahacker.rs

#017 CNN Inception Network

Motivation for the \(Inception\enspace network \): In the last post we talked about why a \(1\times 1 \) convolutional layer can be useful, and now we will use it to build the \(Inception\enspace network \). When designing a layer for a \(convnet \), we might have to choose between a \(1\times 1 \) filter, a \(3\times 3 \), a \(5\times 5 \), or maybe a pooling layer. An \(Inception\enspace network \) solves this…
Read more
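Rather than committing to one filter size, an Inception module runs several branches in parallel (with "same" padding so spatial dimensions match) and stacks their outputs along the channel axis. A minimal numpy sketch of that concatenation step, with illustrative channel counts not taken from the post:

```python
import numpy as np

# Pretend outputs of four parallel branches over a 28x28 feature map.
# With "same" padding every branch preserves height and width, so only
# the channel counts differ (the numbers below are just for illustration).
H, W = 28, 28
branch_1x1 = np.random.randn(H, W, 64)    # 1x1 conv branch
branch_3x3 = np.random.randn(H, W, 128)   # 3x3 conv branch
branch_5x5 = np.random.randn(H, W, 32)    # 5x5 conv branch
branch_pool = np.random.randn(H, W, 32)   # pooling branch

# The Inception module output: channel-wise concatenation of all branches.
module_out = np.concatenate(
    [branch_1x1, branch_3x3, branch_5x5, branch_pool], axis=-1)
print(module_out.shape)  # (28, 28, 256)
```

Because the network concatenates all the branches, it effectively lets training decide which filter sizes matter instead of forcing the designer to pick one.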

#016 CNN Network in Network – 1×1 Convolutions

Network in Network – 1×1 Convolutions In terms of designing \(ConvNet \) architectures, one of the ideas that really helps is using a \(1\times 1 \) convolution. You might be wondering: what does a \(1\times 1 \) convolution do? Isn’t that just multiplying by a number? It seems like a funny thing to do. However, it turns out that it’s not quite like that. Let’s take a look! What does a \(1\times 1 \) convolution do?…
Read more
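The reason a \(1\times 1 \) convolution is more than "multiplying by a number" is that it mixes *channels*: at every spatial position it applies the same fully connected map across the channel vector. A minimal numpy sketch (not from the post) of that computation:

```python
import numpy as np

def conv1x1(x, w):
    """1x1 convolution as a per-pixel matrix multiply.
    x: feature map of shape (H, W, C_in)
    w: filter weights of shape (C_in, C_out)
    Every spatial position is mapped independently across channels."""
    H, W, C_in = x.shape
    out = x.reshape(H * W, C_in) @ w      # mix channels at each pixel
    return out.reshape(H, W, w.shape[1])  # back to (H, W, C_out)

x = np.random.randn(28, 28, 192)   # e.g. 192 input channels
w = np.random.randn(192, 32)       # shrink to 32 channels
y = conv1x1(x, w)
print(y.shape)                     # (28, 28, 32)
```

This is also why \(1\times 1 \) convolutions are useful for cheaply shrinking (or growing) the number of channels: only the channel dimension changes, never the spatial one.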

#015 CNN Why ResNets work?

Why do \(ResNets \) work? Why do \(ResNets \) work so well? Let’s go through one example that illustrates why, at least in the sense of how we can make them deeper and deeper without really hurting our ability to get them to do well on the training set. Doing well on the training set is usually a prerequisite to doing well on the test set. So, being able to…
Read more

#014 CNN Residual nets

Residual networks Last time we saw how VGG networks work; however, very deep neural networks are difficult to train because of the vanishing and exploding gradient problems. In this post we’ll learn about skip connections, which allow us to take the activation from one layer and feed it to another layer much deeper in the neural network. Using that we will build \(ResNets \), which enable us to train very deep networks, that…
Read more
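The skip connection idea can be sketched in a few lines of numpy: the block’s input is added back to the output of two layers before the final nonlinearity. This is a hedged toy version with dense layers instead of convolutions; the shapes and names are illustrative, not from the post:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def residual_block(a, W1, b1, W2, b2):
    """Two linear+ReLU layers with a skip (shortcut) connection:
    the input activation a is added to z2 before the final ReLU."""
    z1 = W1 @ a + b1
    a1 = relu(z1)
    z2 = W2 @ a1 + b2
    return relu(z2 + a)   # shortcut: a skips ahead two layers

# If the second layer's weights shrink toward zero, the block
# simply passes its (non-negative) input through unchanged --
# which is why extra residual blocks don't hurt the training set.
a = relu(np.random.randn(4))                 # a non-negative activation
W1 = np.random.randn(4, 4); b1 = np.zeros(4)
out = residual_block(a, W1, b1, np.zeros((4, 4)), np.zeros(4))
print(np.allclose(out, a))                   # True: identity is easy to learn
```

The last lines show the key point of the post: learning the identity function is trivial for a residual block, so depth cannot make the training error worse.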

#013 CNN VGG 16 and VGG 19

\(VGG \) neural network In the previous posts we talked about \(LeNet-5 \) and AlexNet. Let’s now see two more examples of convolutional neural networks: the \(VGG-16 \) and \(VGG-19 \) networks. In these networks smaller filters are used, but they were built to be deeper than the convolutional neural networks we have seen in previous posts. Architecture of \(VGG-16 \) A remarkable thing about \(VGG-16 \) is that instead of having so many…
Read more
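One quick sanity check (not from the post) on the "smaller filters, deeper network" trade-off: two stacked \(3\times 3 \) convolutions cover the same \(5\times 5 \) receptive field as a single \(5\times 5 \) convolution, but with fewer parameters. Assuming \(C \) input and \(C \) output channels per layer:

```python
# Parameter counts (weights only, bias terms ignored) for C = 64 channels.
C = 64
params_5x5 = 5 * 5 * C * C            # one 5x5 conv layer
params_two_3x3 = 2 * (3 * 3 * C * C)  # two stacked 3x3 conv layers
print(params_5x5, params_two_3x3)     # 102400 73728
```

So stacking small filters both deepens the network (more nonlinearities) and saves parameters, which is the core of the VGG design.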