# An implementation of a Shallow Neural Network in Keras – MNIST dataset

In this post, we will see how to classify handwritten digits using a shallow neural network implemented in Keras.

Our model will have 2 layers: an input layer with 64 neurons (one per pixel, since each digit image has height × width = 8 × 8 = 64 pixels) and an output layer with 10 neurons, one per digit class.
We will use the normal initializer, which generates the initial weight tensors from a normal distribution.
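A minimal sketch of one way to express this in Keras is shown below. The 64-feature input, the 10-neuron output, and the `'normal'` kernel initializer follow the text; the softmax activation is an assumption, chosen because it produces the class probabilities that categorical cross-entropy expects.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Two layers in the counting-the-input convention: a 64-feature input
# (the flattened 8x8 image) feeding a 10-unit output layer.
model = Sequential([
    Dense(10, input_shape=(64,),
          kernel_initializer='normal',   # weights drawn from a normal distribution
          activation='softmax'),         # assumption: softmax for class probabilities
])
model.summary()
```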

The optimizer we’ll use is Adam. It is an optimization algorithm that can be used instead of the classical stochastic gradient descent procedure to update network weights iteratively based on the training data. Adam is a popular algorithm in deep learning because it achieves good results quickly. Its default parameters follow those provided in the original paper.
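For reference, instantiating Adam explicitly with its defaults looks like this; the values shown (learning rate 0.001, β₁ = 0.9, β₂ = 0.999) match the ones recommended in the original paper, so `Adam()` with no arguments would behave the same.

```python
from tensorflow.keras.optimizers import Adam

# Defaults spelled out; Adam() alone is equivalent.
opt = Adam(learning_rate=0.001, beta_1=0.9, beta_2=0.999)
```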

To make this work in Keras, we need to compile the model. An important choice to make is the loss function. We use the categorical_crossentropy loss because it measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each sample belongs to exactly one class).
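Continuing from the snippets above, compiling and training might look as follows. Here `x_train` and `y_train` are hypothetical placeholders for the flattened digit images and their labels, and the epoch count and batch size are illustrative assumptions rather than values from the post.

```python
model.compile(optimizer=opt,
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# categorical_crossentropy expects one-hot targets,
# e.g. the digit 3 becomes [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
model.fit(x_train, y_train, epochs=20, batch_size=32)
```

Note that because the loss assumes mutually exclusive classes, the labels must be one-hot encoded (for integer labels, sparse_categorical_crossentropy is the usual alternative).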

