Category: Machine Learning


#017 CNN Inception Network

Motivation for the \(Inception\enspace network \): In the last post we talked about why a \(1\times 1 \) convolutional layer can be useful, and now we will use it to build the \(Inception\enspace network \). When designing a layer for a \(convnet \) we might have to choose between a \(1 \times 1 \) filter, a \(3 \times 3\), a \(5\times 5\), or maybe a pooling layer. An \(Inception\enspace network \) solves this…
Read more
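The core Inception idea named in the excerpt is to avoid committing to one filter size: several branches run in parallel and their outputs are concatenated along the channel axis, which works because "same" padding keeps every branch's spatial dimensions equal. A minimal sketch of that shape bookkeeping (the branch sizes and channel counts below are illustrative, not from the post):

```python
import math

def same_pad_output(n, s=1):
    """Spatial size after a 'same'-padded convolution with stride s."""
    return math.ceil(n / s)

def inception_output_channels(branches):
    """Total channel depth after concatenating parallel branches."""
    return sum(channels for _filter_size, channels in branches)

# Input: a 28x28 feature map; branches listed as (filter size, output channels).
# The last entry stands in for the pooling branch (illustrative numbers).
branches = [(1, 64), (3, 128), (5, 32), (3, 32)]

# With 'same' padding and stride 1, every branch keeps the 28x28 size,
# so channel-wise concatenation is valid.
spatial_sizes = {same_pad_output(28) for _f, _c in branches}
assert spatial_sizes == {28}

total_channels = inception_output_channels(branches)  # 64 + 128 + 32 + 32
```

This is why an Inception module can mix filter sizes freely: only the channel count grows, while height and width stay fixed.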

#010 CNN An Example of a Neural Network

Convolutional Neural Network – An Example In previous posts (CNN 004, CNN 005 and CNN 009) we have defined all the building blocks for a full convolutional neural network. Let’s now look at an example of a convolutional neural network (CNN). Let’s say that we have a \(32 \times 32 \times 3 \) dimensional image as an input to the CNN. So it’s an RGB image and suppose we want to try to do a handwritten…
Read more
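Walking a \(32 \times 32 \times 3 \) image through conv and pooling layers comes down to the standard output-size formula \(\lfloor (n + 2p - f)/s \rfloor + 1 \). A short sketch of that arithmetic (the \(5\times 5 \) filter and \(2\times 2 \) pool are illustrative choices, not taken from the post):

```python
def conv_output_size(n, f, p=0, s=1):
    """Spatial output size of a conv/pool layer: floor((n + 2p - f) / s) + 1."""
    return (n + 2 * p - f) // s + 1

# 32x32 input, 5x5 filters, no padding, stride 1 -> 28x28
after_conv = conv_output_size(32, f=5)

# 2x2 max-pooling with stride 2 halves the spatial size -> 14x14
after_pool = conv_output_size(after_conv, f=2, s=2)
```

The same formula is applied layer by layer to track the volume's shape all the way to the fully connected layers.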


#001 CNN Convolutional Neural Networks

Source: Stanford CS 231n Convolutional Neural Networks What is Computer Vision? Computer vision is an interdisciplinary field that deals with how computers can be made to gain high-level understanding from digital images or videos. From the perspective of engineering, it seeks to automate tasks that the human visual system can do. Computer Vision is one of the fields of artificial intelligence that is rapidly progressing thanks to Deep…
Read more


#011 Deep L-layer Neural Network

Deep L-layer Neural Network In this post we will give a Neural Network overview. We will see what the simplest representation of a Neural Network is and what a deep representation of a Neural Network looks like. You may have heard that the perceptron is the simplest version of a Neural Network. The perceptron is a one-layer Neural Network with the \(step \) activation function. In the previous posts we have defined a Logistic Regression…
Read more
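The perceptron mentioned in the excerpt computes a weighted sum of its inputs plus a bias and passes it through the step function. A minimal sketch, with illustrative AND-gate weights that are not from the post:

```python
def step(z):
    """Step activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def perceptron(x, w, b):
    """One-layer network: weighted sum plus bias, then step activation."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return step(z)

# Illustrative weights that make the perceptron act as an AND gate
w, b = [1.0, 1.0], -1.5
out_11 = perceptron([1, 1], w, b)  # both inputs on
out_01 = perceptron([0, 1], w, b)  # only one input on
```

A deep L-layer network stacks many such units, replacing the step function with smooth activations so that gradients can flow.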

Neural Network Representation

#007 Neural Networks Representation

A quick overview In previous posts we talked about Logistic Regression and saw how this model corresponds to the following computation graph: we have a feature vector \(x \), parameters \(w \) and \(b \) as the inputs to the computation graph. That allows us to compute \(z \), which is then used to compute \(a \), and we use \(a \) interchangeably with the output \(\hat{y} \). Finally, we can compute a…
Read more
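The computation graph described in the excerpt — \(x, w, b \rightarrow z \rightarrow a = \hat{y} \) — can be sketched directly, with \(z = w^{T}x + b \) and \(a = \sigma(z) \) the sigmoid (the sample inputs below are illustrative):

```python
import math

def logistic_forward(x, w, b):
    """Forward pass of the Logistic Regression computation graph."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b  # linear step: z = w.x + b
    a = 1.0 / (1.0 + math.exp(-z))                # sigmoid: a = y_hat
    return z, a

# Illustrative inputs: z = 0.5*1.0 + (-0.25)*2.0 + 0.0 = 0, so a = 0.5
z, a = logistic_forward([1.0, 2.0], [0.5, -0.25], 0.0)
```

The next step in the graph is computing the loss from \(a \) and the label \(y \), which is what the post truncates at.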