

#002 CNN Edge detection

Convolutional operation The operation of convolution is one of the foundations of convolutional neural networks. From the Latin convolvere, “to convolve” means to roll together. For mathematical purposes, a convolution is the integral measuring how much two functions overlap as one passes over the other. Think of a convolution as a way of mixing two functions by multiplying them. Using edge detection as a starting point, we will see how the convolution…
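To get a quick feel for the operation, here is a minimal NumPy sketch (not taken from the post itself): a small 6×6 image with a sharp vertical edge is convolved with a 3×3 vertical edge-detection filter. The conv2d helper and the example values are illustrative, and the kernel is not flipped, following the usual deep learning convention.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide the kernel over the image (no flipping, as deep learning frameworks do)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A 6x6 image with a sharp vertical edge and a 3x3 vertical edge-detection filter
image = np.array([[10, 10, 10, 0, 0, 0]] * 6, dtype=float)
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]], dtype=float)

print(conv2d(image, kernel))  # the 30s in the middle columns mark the detected edge
```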
Read more

#010 B How to train a shallow Neural Network with Gradient Descent?

In this post we will see how to build a shallow Neural Network in Python. A Shallow Neural Network First we will import all the libraries that we will use in this code. Then we will define our datasets. These are two linearly non-separable datasets. To create them we can use either the make_circles or the make_moons function from scikit-learn. We need to define the activation functions that we will use in our code. The following function initializes the parameters…
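A condensed sketch of the idea the post walks through, assuming a single tanh hidden layer, a sigmoid output and plain gradient descent on make_moons; the layer sizes, learning rate and iteration count are illustrative choices rather than the post's exact values.

```python
import numpy as np
from sklearn.datasets import make_moons

# Linearly non-separable dataset; X has shape (n_features, m), Y has shape (1, m)
X, y = make_moons(n_samples=200, noise=0.2, random_state=1)
X, Y = X.T, y.reshape(1, -1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def initialize_parameters(n_x, n_h, n_y):
    np.random.seed(2)
    return {"W1": np.random.randn(n_h, n_x) * 0.01, "b1": np.zeros((n_h, 1)),
            "W2": np.random.randn(n_y, n_h) * 0.01, "b2": np.zeros((n_y, 1))}

params = initialize_parameters(2, 4, 1)   # 2 inputs, 4 hidden units, 1 output
m = X.shape[1]
alpha = 1.0                               # learning rate (illustrative value)

for i in range(5000):
    # Forward pass: tanh in the hidden layer, sigmoid in the output layer
    Z1 = params["W1"] @ X + params["b1"]
    A1 = np.tanh(Z1)
    Z2 = params["W2"] @ A1 + params["b2"]
    A2 = sigmoid(Z2)

    # Backward pass (cross-entropy loss)
    dZ2 = A2 - Y
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = params["W2"].T @ dZ2 * (1 - A1 ** 2)
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m

    # Gradient descent update
    params["W1"] -= alpha * dW1
    params["b1"] -= alpha * db1
    params["W2"] -= alpha * dW2
    params["b2"] -= alpha * db2

accuracy = np.mean((A2 > 0.5) == Y)
print(f"training accuracy: {accuracy:.2f}")
```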
Read more

#012 B Building a Deep Neural Network from scratch in Python

In this post we will see how to implement a deep Neural Network in Python from scratch. It isn't something that we will do often in practice, but it is a good way to understand the inner workings of Deep Learning. First we will import the libraries we will use in the following code. In the following code we will define the activation functions \(sigmoid\), \(ReLU\) and \(tanh\); we will also save the values that we…
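One possible shape for such helpers, assuming the saved values are caches of \(Z\) that the backward pass reuses later; the function names and signatures here are illustrative, not the post's exact code.

```python
import numpy as np

# Each forward activation returns its output together with a cache of the input Z,
# which the matching backward helper needs later.
def sigmoid(Z):
    A = 1 / (1 + np.exp(-Z))
    return A, Z

def relu(Z):
    A = np.maximum(0, Z)
    return A, Z

def tanh(Z):
    A = np.tanh(Z)
    return A, Z

# Backward helpers: turn dA into dZ using the cached Z
def sigmoid_backward(dA, cache):
    s = 1 / (1 + np.exp(-cache))
    return dA * s * (1 - s)

def relu_backward(dA, cache):
    dZ = np.array(dA, copy=True)
    dZ[cache <= 0] = 0
    return dZ

def tanh_backward(dA, cache):
    return dA * (1 - np.tanh(cache) ** 2)
```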
Read more

#004B The Computation Graph – Example

The Computation graph – Example Let’s say that we’re trying to compute a function \(J\), which is a function of three variables \(a\), \(b\) and \(c\), and let’s say that the function \(J\) is \(3(a+bc)\). Computation of this function actually has three distinct steps: compute \(bc\) and store it in the variable \(u\), so \(u = bc\); compute \(v = a + u\); the output \(J\) is \(3v\). Let’s summarize: $$ J(a, b, c) =…
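The three steps can be traced with a few lines of Python; the numeric values \(a = 5\), \(b = 3\), \(c = 2\) are just example inputs.

```python
# Forward pass through the three steps of the computation graph,
# using the example values a = 5, b = 3, c = 2 (illustrative numbers).
a, b, c = 5, 3, 2

u = b * c          # step 1: u = bc = 6
v = a + u          # step 2: v = a + u = 11
J = 3 * v          # step 3: J = 3v = 33

print(u, v, J)     # 6 11 33
```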
Read more

#003C Gradient Descent in Python

Gradient Descent in Python We will first import libraries such as NumPy, Matplotlib’s pyplot and a derivative function. Then, with the NumPy function linspace(), we define the domain of our variable \(w\) between 1.0 and 5.0 with 100 points. We also define alpha, which will represent the learning rate. Next, we will define our \(y\) (in our case \(J(w)\)) and plot it to see a convex function; we will use \((w-3)^2\). So we can see that…
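A compact sketch along those lines; the starting point, the value of alpha and the number of iterations are illustrative, and the derivative is written out analytically here instead of calling a derivative helper.

```python
import numpy as np
import matplotlib.pyplot as plt

def J(w):
    return (w - 3) ** 2          # the convex cost function from the post

def dJ(w):
    return 2 * (w - 3)           # its derivative, written out analytically

# Domain of w: 100 points between 1.0 and 5.0
w_domain = np.linspace(1.0, 5.0, 100)
plt.plot(w_domain, J(w_domain))  # a convex bowl with its minimum at w = 3

# Plain gradient descent starting from an arbitrary point
alpha = 0.1                      # learning rate
w = 1.0
for _ in range(50):
    w = w - alpha * dJ(w)

print(w)                         # approaches 3.0, the minimizer of J
plt.show()
```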
Read more