# Logistic Regression in TensorFlow
In this post we will see how to implement Logistic Regression in TensorFlow.
Let’s generate a dataset using random values.
It’s crucial to check the shapes of the resulting arrays before moving on.
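The post's original dataset code isn't shown, so here is a minimal NumPy sketch that generates two Gaussian blobs as a binary-classification dataset and prints the shapes; the class centers and sample counts are illustrative assumptions.

```python
import numpy as np

# Hypothetical dataset: two Gaussian blobs, one per class.
np.random.seed(0)

n_per_class = 100
# Class 0 ("red") centered at (-2, -2), class 1 ("blue") at (2, 2).
X0 = np.random.randn(n_per_class, 2) + np.array([-2.0, -2.0])
X1 = np.random.randn(n_per_class, 2) + np.array([2.0, 2.0])

X = np.vstack([X0, X1]).astype(np.float32)  # features, shape (200, 2)
y = np.concatenate([np.zeros(n_per_class),
                    np.ones(n_per_class)]).astype(np.int64)  # labels, shape (200,)

# Checking the shapes early catches broadcasting bugs later.
print(X.shape)  # (200, 2)
print(y.shape)  # (200,)
```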
Let’s split the dataset into training and test sets: 80% of the examples will be used for training, and the remaining 20% for testing.
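An 80/20 split can be sketched as below; since the original code isn't shown, small placeholder arrays stand in for the real `X` and `y`.

```python
import numpy as np

# Placeholder data standing in for the dataset from the previous step.
np.random.seed(1)
X = np.random.randn(200, 2).astype(np.float32)
y = (np.random.rand(200) > 0.5).astype(np.int64)

# Shuffle indices so both splits mix the two classes.
idx = np.random.permutation(len(X))
split = int(0.8 * len(X))  # 80% of the examples go to training

X_train, X_test = X[idx[:split]], X[idx[split:]]
y_train, y_test = y[idx[:split]], y[idx[split:]]

print(X_train.shape, X_test.shape)  # (160, 2) (40, 2)
```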
Let’s set the parameters that will be used to train our model.
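The post's exact hyperparameter values aren't shown; the ones below are illustrative placeholders.

```python
# Illustrative training hyperparameters (not the post's actual values).
learning_rate = 0.01   # step size for gradient descent
num_epochs = 1000      # number of full passes over the training set
num_features = 2       # input dimensionality
num_classes = 2        # one-hot labels for softmax cross-entropy
```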
Now, let’s compare the predictions from the logistic regression model against the true outputs to see which examples were correctly classified. The cross-entropy loss can be computed with the function tf.nn.softmax_cross_entropy_with_logits, and the cost is the average of that loss over the training examples. The cost is then fed through a gradient descent optimizer to be minimized.
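To make the computation concrete, here is a NumPy sketch of what those TensorFlow ops compute: softmax cross-entropy over one-hot labels, averaged into a cost, followed by one gradient-descent update. (In the post these steps are tf.nn.softmax_cross_entropy_with_logits, tf.reduce_mean, and a gradient descent optimizer; the data below is a placeholder.)

```python
import numpy as np

# Placeholder training data and one-hot labels.
np.random.seed(2)
X = np.random.randn(160, 2)
y = (np.random.rand(160) > 0.5).astype(int)
Y = np.eye(2)[y]                 # one-hot labels, shape (160, 2)

W = np.zeros((2, 2))             # weights, (features, classes)
b = np.zeros(2)                  # biases
lr = 0.01

logits = X @ W + b
# Numerically stable softmax.
z = logits - logits.max(axis=1, keepdims=True)
probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)

# Average cross-entropy cost, i.e. the mean of the per-example losses.
cost = -np.mean(np.sum(Y * np.log(probs), axis=1))

# One gradient-descent step on W and b.
grad_logits = (probs - Y) / len(X)
W -= lr * (X.T @ grad_logits)
b -= lr * grad_logits.sum(axis=0)

print(round(cost, 4))  # 0.6931 — ln(2), since zero weights give uniform probs
```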
To initialize the variables in TensorFlow, we use tf.global_variables_initializer().
After training the model, let’s take a visual look at the cost function. We can see that it decreases sharply over time.
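A sketch of that visual check: train the softmax model with plain gradient descent on placeholder data, record the cost each epoch, and plot the curve (the filename and data are assumptions).

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")            # render off-screen
import matplotlib.pyplot as plt

# Placeholder blobs standing in for the training set.
np.random.seed(3)
X = np.vstack([np.random.randn(80, 2) - 2, np.random.randn(80, 2) + 2])
Y = np.eye(2)[np.array([0] * 80 + [1] * 80)]

W, b, lr = np.zeros((2, 2)), np.zeros(2), 0.1
costs = []
for epoch in range(200):
    z = X @ W + b
    z -= z.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    costs.append(-np.mean(np.sum(Y * np.log(probs), axis=1)))
    grad = (probs - Y) / len(X)
    W -= lr * (X.T @ grad)
    b -= lr * grad.sum(axis=0)

plt.plot(costs)
plt.xlabel("epoch")
plt.ylabel("cost")
plt.savefig("cost.png")

print(costs[0] > costs[-1])  # True: the cost falls over training
```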
Now let’s draw the line that separates the red and blue classes, using the coefficients our model learned during training.
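The separating line can be recovered from the learned parameters: with two softmax classes, the boundary is where the two class logits are equal. The sketch below uses hypothetical learned values for W and b.

```python
import numpy as np

# Hypothetical learned parameters, shape (2 features, 2 classes).
W = np.array([[0.9, -0.9],
              [0.8, -0.8]])
b = np.array([0.1, -0.1])

# Logits are equal where (W[:,1] - W[:,0]) . x + (b[1] - b[0]) = 0,
# which rearranges to the line x2 = slope * x1 + intercept.
dw = W[:, 1] - W[:, 0]
db = b[1] - b[0]
slope = -dw[0] / dw[1]
intercept = -db / dw[1]

print(slope, intercept)  # -1.125 -0.125
```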
Click here to find the complete code for this post.
In the next post, we will see how to implement a shallow neural network in TensorFlow using the circle dataset.