
Category: Machine Learning

#025 CNN Bounding Box Predictions

Bounding box predictions In the last post, we learned how to use a convolutional implementation of sliding windows. That approach is more computationally efficient, but it still does not output the most accurate bounding boxes.  In this post, we will see how to obtain more accurate bounding box predictions.  Output accurate bounding boxes With sliding windows, we take the set of windows that we move throughout the image and we obtain a set of sliding…
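The post above describes predicting bounding boxes directly instead of relying on sliding-window positions. As a minimal sketch (not the post's code), a YOLO-style predictor outputs, per grid cell, a box centre $$(b_x, b_y)$$ relative to that cell and a width/height $$(b_w, b_h)$$ as fractions of the image; the names and grid convention here are illustrative assumptions:

```python
def cell_to_image_box(row, col, bx, by, bw, bh, grid_size, img_w, img_h):
    """Map a per-cell prediction to absolute (x_min, y_min, x_max, y_max) pixels.

    (bx, by) are the box centre relative to cell (row, col) in [0, 1];
    (bw, bh) are the box width/height as fractions of the whole image.
    """
    cell_w = img_w / grid_size
    cell_h = img_h / grid_size
    # Absolute centre of the box in pixels
    cx = (col + bx) * cell_w
    cy = (row + by) * cell_h
    # Absolute width and height in pixels
    w = bw * img_w
    h = bh * img_h
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)


# Example: a 3x3 grid over a 300x300 image; a box centred in the middle
# cell, covering half the image in each dimension.
box = cell_to_image_box(1, 1, 0.5, 0.5, 0.5, 0.5, 3, 300, 300)
```

Because the network regresses box coordinates rather than classifying fixed windows, the output is not constrained to the sliding-window stride.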

#017 CNN Inception Network

$$Inception\enspace network$$  Motivation for the $$Inception\enspace network$$: In the last post we talked about why a $$1\times 1$$ convolutional layer can be useful, and now we will use it to build the $$Inception\enspace network$$. When designing a layer for a $$convnet$$ we might have to choose between a $$1 \times 1$$ filter, a $$3 \times 3$$, a $$5\times 5$$, or maybe a pooling layer. An $$Inception\enspace network$$ solves this…
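The main payoff of the $$1\times 1$$ layer inside an Inception module is cutting the multiplication count of the larger filters. A small sketch, using the shape numbers commonly quoted for this example (a $$28 \times 28 \times 192$$ input mapped to a $$28 \times 28 \times 32$$ output; the bottleneck width of 16 is an assumption):

```python
def conv_mults(h, w, c_out, f, c_in):
    """Multiplications for a same-padded f x f convolution:
    each of the h*w*c_out output values needs f*f*c_in multiplies."""
    return h * w * c_out * f * f * c_in


# Direct 5x5 convolution: 28x28x192 -> 28x28x32
direct = conv_mults(28, 28, 32, 5, 192)

# Same mapping via a 1x1 "bottleneck" down to 16 channels first
bottleneck = (conv_mults(28, 28, 16, 1, 192)   # 1x1 conv: 192 -> 16 channels
              + conv_mults(28, 28, 32, 5, 16))  # 5x5 conv: 16 -> 32 channels

# direct is roughly 120M multiplies, bottleneck roughly 12.4M --
# about a 10x reduction for the same output shape.
```

This is why the module can afford to run $$1\times 1$$, $$3\times 3$$, $$5\times 5$$, and pooling branches in parallel and concatenate their outputs.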

#010 CNN An Example of a Neural Network

Convolutional Neural Network – An Example In previous posts (CNN 004, CNN 005 and CNN 009) we defined all the building blocks of a full convolutional neural network. Let’s now look at an example of a convolutional neural network (CNN). Say we have a $$32 \times 32 \times 3$$ dimensional image as the input to the CNN. So it’s an RGB image, and suppose we want to try to do a handwritten…
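Tracing the example network is mostly a matter of applying the output-size formula $$\lfloor (n + 2p - f)/s \rfloor + 1$$ layer by layer. A minimal sketch, assuming the usual first layers for a $$32 \times 32 \times 3$$ input (a $$5 \times 5$$ convolution with stride 1 and no padding, then $$2 \times 2$$ max pooling with stride 2; these layer choices are illustrative):

```python
def conv_out_size(n, f, p=0, s=1):
    """Spatial output size of a convolution or pooling layer:
    n = input size, f = filter size, p = padding, s = stride."""
    return (n + 2 * p - f) // s + 1


# 32x32x3 input, 5x5 conv, stride 1, no padding -> 28x28 spatially
after_conv = conv_out_size(32, 5)

# 2x2 max pooling with stride 2 halves it -> 14x14
after_pool = conv_out_size(after_conv, 2, s=2)
```

The channel dimension is set by the number of filters in the conv layer, not by this formula, and pooling leaves it unchanged.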

Deep L-layer Neural Network In this post we will give a Neural Network overview. We will see the simplest representation of a Neural Network and what a deep representation of a Neural Network looks like. You may have heard that the perceptron is the simplest version of a Neural Network. The perceptron is a one-layer Neural Network with the $$step$$ activation function. In the previous posts we have defined a Logistic Regression…
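The perceptron mentioned above fits in a few lines: a weighted sum of the inputs plus a bias, pushed through the $$step$$ function. A minimal sketch (the AND-gate weights below are an illustrative choice, not from the post):

```python
def step(z):
    """Step activation: fires (1) when the weighted sum is non-negative."""
    return 1 if z >= 0 else 0


def perceptron(x, w, b):
    """One-layer neural network: step(w . x + b)."""
    return step(sum(wi * xi for wi, xi in zip(w, x)) + b)


# With weights [1, 1] and bias -1.5 the perceptron computes logical AND:
# the sum exceeds the threshold only when both inputs are 1.
w, b = [1, 1], -1.5
outputs = [perceptron(x, w, b) for x in ([0, 0], [0, 1], [1, 0], [1, 1])]
```

A deep L-layer network replaces the single step unit with stacked layers of such units, using smooth activations so the network can be trained by gradient descent.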