
Category: Deep Learning

#002 RNN – Architecture, Mapping, and Propagation

Highlights: Recurrent Neural Networks (RNN) are sequence models that are a modern, more advanced alternative to traditional Neural Networks. From Speech Recognition and Natural Language Processing to Music Generation, RNNs have continued to play a transformative role in handling sequential datasets. In this blog post, we will learn how to build and map a Recurrent Neural Network with some interesting examples. In addition, we will represent basic RNN models using the mathematical notations of…
Read more
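
As a preview of the kind of notation the post builds up, here is a minimal sketch of the forward step of a basic RNN cell, written in one common convention (the symbols \(a^{\langle t \rangle} \) for the hidden state, \(x^{\langle t \rangle} \) for the input, and the weight matrices \(W_{aa} \), \(W_{ax} \), \(W_{ya} \) are assumptions here; the post defines its own notation): the hidden state is updated as \(a^{\langle t \rangle} = g_{1}\left( W_{aa}a^{\langle t-1 \rangle} + W_{ax}x^{\langle t \rangle} + b_{a}\right) \), and the prediction at time step \(t \) is computed as \(\hat{y}^{\langle t \rangle} = g_{2}\left( W_{ya}a^{\langle t \rangle} + b_{y}\right) \), where \(g_{1} \) and \(g_{2} \) are activation functions (e.g. \(\tanh \) and softmax).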

#001 RNN – Recurrent Neural Networks

Highlights: Recurrent Neural Networks (RNN) are sequence models that are a modern, more advanced alternative to traditional Neural Networks. From Speech Recognition and Natural Language Processing to Music Generation, RNNs have continued to play a transformative role in handling sequential datasets. In this blog post, we will learn the fundamental theory of Recurrent Neural Networks, the motivation for their use, and basic notation. So, let us start, shall we? Tutorial Overview: Fundamental Concept of…
Read more

# TF Implementing a VGG-19 network in TensorFlow 2.0

Highlights: In this post, we will show how to implement a fundamental Convolutional Neural Network like \(VGG-19\) in TensorFlow. The VGG-19 architecture was designed by the Visual Geometry Group, Department of Engineering Science, University of Oxford. It competed in the ImageNet Large Scale Visual Recognition Challenge in 2014. Tutorial Overview: Theory recapitulation Implementation in TensorFlow 1. Theory recapitulation With ConvNets becoming more popular in the computer vision field, a number of attempts have been made to improve the…
Read more
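
The full post walks through the implementation step by step; as a rough preview, a minimal sketch of how a VGG-19-style network could be assembled with the tf.keras Sequential API (the layer configuration below follows the original VGG-19 paper; the function name and defaults are illustrative and the post's code may differ):

```python
# Minimal VGG-19-style model sketch using the tf.keras API (TensorFlow 2.x).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_vgg19(input_shape=(224, 224, 3), num_classes=1000):
    # (number of 3x3 conv layers, filters) per block, as in the VGG-19 paper
    config = [(2, 64), (2, 128), (4, 256), (4, 512), (4, 512)]
    model = models.Sequential()
    model.add(tf.keras.Input(shape=input_shape))
    for num_convs, filters in config:
        for _ in range(num_convs):
            model.add(layers.Conv2D(filters, (3, 3), padding='same', activation='relu'))
        # Each block ends with 2x2 max pooling that halves the spatial size
        model.add(layers.MaxPooling2D((2, 2), strides=(2, 2)))
    # Classifier head: two fully connected layers followed by softmax
    model.add(layers.Flatten())
    model.add(layers.Dense(4096, activation='relu'))
    model.add(layers.Dense(4096, activation='relu'))
    model.add(layers.Dense(num_classes, activation='softmax'))
    return model

model = build_vgg19()
model.summary()
```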

#015 TF Implementing AlexNet in TensorFlow 2.0

Highlights: In this post, we will show how to implement a fundamental Convolutional Neural Network, \(AlexNet\), in TensorFlow 2.0. The AlexNet architecture was designed by Alex Krizhevsky and published together with Ilya Sutskever and Geoffrey Hinton. It competed in the ImageNet Large Scale Visual Recognition Challenge in 2012. Tutorial Overview: Review of the Theory Implementation in TensorFlow 2.0 1. Review of the Theory Real-life Computer Vision problems require a large amount of quality data for training. In the past, people…
Read more
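
Again as a preview only, a minimal sketch of an AlexNet-style network in the tf.keras Sequential API (layer sizes follow the original AlexNet paper with a \(227\times227\times3\) input; the helper name and exact details are assumptions, and the post's implementation may differ):

```python
# Minimal AlexNet-style model sketch using the tf.keras API (TensorFlow 2.x).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_alexnet(input_shape=(227, 227, 3), num_classes=1000):
    return models.Sequential([
        tf.keras.Input(shape=input_shape),
        # Five convolutional layers; the first, second, and fifth are followed by max pooling
        layers.Conv2D(96, (11, 11), strides=4, activation='relu'),
        layers.MaxPooling2D((3, 3), strides=2),
        layers.Conv2D(256, (5, 5), padding='same', activation='relu'),
        layers.MaxPooling2D((3, 3), strides=2),
        layers.Conv2D(384, (3, 3), padding='same', activation='relu'),
        layers.Conv2D(384, (3, 3), padding='same', activation='relu'),
        layers.Conv2D(256, (3, 3), padding='same', activation='relu'),
        layers.MaxPooling2D((3, 3), strides=2),
        # Classifier head with dropout, as in the original paper
        layers.Flatten(),
        layers.Dense(4096, activation='relu'),
        layers.Dropout(0.5),
        layers.Dense(4096, activation='relu'),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation='softmax'),
    ])

model = build_alexnet()
model.summary()
```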

#014 TF Implementing LeNet-5 in TensorFlow 2.0

Highlights: In this post, we will show how to implement a fundamental Convolutional Neural Network like \(LeNet-5\) in TensorFlow. The LeNet-5 architecture was invented by Yann LeCun in 1998 and was one of the first Convolutional Neural Networks. Tutorial Overview: Theory recapitulation Implementation in TensorFlow 1. Theory recapitulation The goal of \(LeNet-5\) was to recognize handwritten digits. So, it takes as input a \(32\times32\times1\) image. It is a grayscale image, thus the number of channels is \(1\).…
Read more
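
As a short preview of what the post covers, a minimal sketch of a LeNet-5-style network in the tf.keras Sequential API, starting from the \(32\times32\times1\) grayscale input mentioned above (the layer sizes follow the classic LeNet-5 description; the function name is illustrative and the post's code may differ):

```python
# Minimal LeNet-5-style model sketch using the tf.keras API (TensorFlow 2.x).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_lenet5(input_shape=(32, 32, 1), num_classes=10):
    return models.Sequential([
        tf.keras.Input(shape=input_shape),             # 32x32 grayscale input
        layers.Conv2D(6, (5, 5), activation='tanh'),   # -> 28x28x6
        layers.AveragePooling2D((2, 2)),               # -> 14x14x6
        layers.Conv2D(16, (5, 5), activation='tanh'),  # -> 10x10x16
        layers.AveragePooling2D((2, 2)),               # -> 5x5x16
        layers.Flatten(),
        layers.Dense(120, activation='tanh'),
        layers.Dense(84, activation='tanh'),
        layers.Dense(num_classes, activation='softmax'),
    ])

model = build_lenet5()
model.summary()
```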