#TF Logistic Regression in TensorFlow

In this post we will see how to implement Logistic Regression in TensorFlow. We will start by importing the libraries we will need in our code.
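A minimal sketch of the imports this walkthrough relies on (the exact list in the original code may differ):

```python
# Assumed imports: NumPy for data, matplotlib for plots, TensorFlow 1.x for the model.
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf  # TF 1.x API, matching tf.train.GradientDescentOptimizer used below
```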

We will work with an artificial dataset, which we generate with the following code.
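One possible way to generate such a dataset is two 2-D Gaussian blobs, one per class; the centers and sample counts below are assumptions, not the original values:

```python
# Assumed toy data: two Gaussian blobs for binary classification.
np.random.seed(0)

n_per_class = 500
X0 = np.random.randn(n_per_class, 2) + np.array([2.0, 2.0])    # class 0 centered at (2, 2)
X1 = np.random.randn(n_per_class, 2) + np.array([-2.0, -2.0])  # class 1 centered at (-2, -2)

X = np.vstack([X0, X1]).astype(np.float32)                     # features, shape (1000, 2)
y = np.hstack([np.zeros(n_per_class), np.ones(n_per_class)]).astype(np.int32)  # labels, shape (1000,)

# One-hot labels, as expected by tf.nn.softmax_cross_entropy_with_logits later on
Y = np.eye(2, dtype=np.float32)[y]                              # shape (1000, 2)
```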

It is always good to check the sizes of the matrices we use.
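A quick sanity check along these lines:

```python
# Print the shapes of the feature matrix and the one-hot label matrix.
print("X shape:", X.shape)  # (1000, 2)
print("Y shape:", Y.shape)  # (1000, 2)
```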

Next, we will plot our dataset.
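A simple scatter plot of the two classes, assuming matplotlib:

```python
# Scatter plot of the two classes, colored by label.
plt.figure(figsize=(6, 6))
plt.scatter(X[:, 0], X[:, 1], c=y, cmap="bwr", s=10)
plt.xlabel("x1")
plt.ylabel("x2")
plt.title("Artificial dataset")
plt.show()
```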

With the next piece of code we will split our dataset into a training set (80% of the whole dataset) and a test set (20% of the dataset).
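One way to do this split is a manual shuffle and slice (sklearn's train_test_split would work equally well):

```python
# 80/20 split into training and test sets via a random permutation of the indices.
indices = np.random.permutation(len(X))
split = int(0.8 * len(X))
train_idx, test_idx = indices[:split], indices[split:]

X_train, Y_train = X[train_idx], Y[train_idx]
X_test,  Y_test  = X[test_idx],  Y[test_idx]
```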

With the following code we will set hyperparameters.
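An illustrative set of hyperparameters; the exact values in the original post may differ:

```python
# Assumed hyperparameter values for the sketches below.
learning_rate = 0.01
num_epochs    = 1000
num_features  = 2   # x1 and x2
num_classes   = 2   # binary problem encoded as two softmax outputs
```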

We need to compare the predictions we get from Logistic Regression with the ground-truth labels to see how well the examples are classified. The tf.nn.softmax_cross_entropy_with_logits API from TensorFlow computes this measure for us as a cross-entropy between the logits and the labels.
The loss is obtained by averaging these cross-entropies, and it is then minimized with gradient descent using tf.train.GradientDescentOptimizer.
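A sketch of the graph definition under these assumptions: a linear model producing logits, the averaged softmax cross-entropy as the loss, and a gradient descent optimizer (TensorFlow 1.x API):

```python
# Placeholders for the features and the one-hot labels.
x = tf.placeholder(tf.float32, [None, num_features])
y_true = tf.placeholder(tf.float32, [None, num_classes])

# Linear model: logits = x W + b.
W = tf.Variable(tf.zeros([num_features, num_classes]))
b = tf.Variable(tf.zeros([num_classes]))
logits = tf.matmul(x, W) + b

# Cross-entropy per example, averaged into a scalar loss.
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(labels=y_true, logits=logits)
loss = tf.reduce_mean(cross_entropy)

# Gradient descent step on the loss.
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)

# Accuracy: fraction of examples whose predicted class matches the label.
correct = tf.equal(tf.argmax(logits, 1), tf.argmax(y_true, 1))
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))
```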

In TensorFlow, variables need to be initialized before they can be used. To do that we will use tf.global_variables_initializer().
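A training loop that runs the initializer and records the cost at every epoch might look like this:

```python
# Initialize the variables and run the training loop, recording the loss per epoch.
init = tf.global_variables_initializer()
cost_history = []

with tf.Session() as sess:
    sess.run(init)
    for epoch in range(num_epochs):
        _, c = sess.run([optimizer, loss], feed_dict={x: X_train, y_true: Y_train})
        cost_history.append(c)

    train_acc = sess.run(accuracy, feed_dict={x: X_train, y_true: Y_train})
    test_acc  = sess.run(accuracy, feed_dict={x: X_test,  y_true: Y_test})
    W_val, b_val = sess.run([W, b])   # keep the learned parameters for plotting

print("Train accuracy:", train_acc, "Test accuracy:", test_acc)
```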

We will plot the cost function to see whether our algorithm is learning. From the following graph we can see that the cost is steadily decreasing, so we can conclude that our model is indeed learning.
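Plotting the recorded cost values:

```python
# Plot the loss recorded at each training epoch.
plt.plot(cost_history)
plt.xlabel("Epoch")
plt.ylabel("Cost")
plt.title("Cost function during training")
plt.show()
```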

Next, we will plot the linear classifier with the parameters we obtained and see how it classifies the data from the training and the test set.
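For a two-class softmax model the decision boundary is the line where the two logits are equal, i.e. (W[:,0] − W[:,1])·x + (b[0] − b[1]) = 0. A sketch of how it could be drawn over both sets, with variable names following the sketches above:

```python
# Reduce the two-column parameters to a single separating line and solve for x2.
w = W_val[:, 0] - W_val[:, 1]
b0 = b_val[0] - b_val[1]

x1_line = np.linspace(X[:, 0].min(), X[:, 0].max(), 100)
x2_line = -(w[0] * x1_line + b0) / w[1]

# Draw the boundary over the training points and over the test points.
for X_set, Y_set, title in [(X_train, Y_train, "Training set"),
                            (X_test,  Y_test,  "Test set")]:
    plt.figure(figsize=(6, 6))
    plt.scatter(X_set[:, 0], X_set[:, 1], c=np.argmax(Y_set, axis=1), cmap="bwr", s=10)
    plt.plot(x1_line, x2_line, "k-", label="Decision boundary")
    plt.xlabel("x1")
    plt.ylabel("x2")
    plt.title(title)
    plt.legend()
    plt.show()
```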

Here you can see the complete code we saw in this post.
