
# Category: Machine Learning

### #005 Linear Algebra – Inverse matrices, Rank, Column space and Null space

Highlights: Hello and welcome back. In this post we will learn about several important topics in linear algebra: inverse matrices, rank, span, and the null space. We will look at these concepts through the lens of linear transformations, which helps us build intuition for how systems of linear equations are solved and for linear transformations in general. Usually, when you first hear about linear…
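The concepts previewed above can be sketched numerically with NumPy. The matrices here are hypothetical examples chosen for illustration, not taken from the post:

```python
import numpy as np

# A hypothetical invertible 2x2 matrix (illustrative only, not from the post)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# The inverse "undoes" the linear transformation: A @ A_inv = I
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True

# Rank: the number of linearly independent columns
# (the dimension of the column space)
print(np.linalg.matrix_rank(A))  # 2 -> full rank, so A is invertible

# A singular (non-invertible) matrix: the second column
# is a multiple of the first
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.matrix_rank(B))  # 1 -> B squashes the plane onto a line

# Null space of B: the vectors x with B @ x = 0
# (here, the line spanned by [-2, 1])
x = np.array([-2.0, 1.0])
print(B @ x)  # [0. 0.]
```

Geometrically, a transformation has an inverse exactly when it does not collapse space onto a lower dimension, which is why full rank and invertibility go together.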

### #004 Linear Algebra – The determinant

Highlight: In this post we will explain what the determinant is and why we use it in linear algebra. We will interpret the determinant in both 2-D and 3-D space, and show how to implement these calculations in Python: first the determinant in a 2-D coordinate system, then in a 3-D coordinate system. In the previous post we saw how…
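The geometric reading of the determinant previewed above can be sketched in NumPy. The matrices are hypothetical examples for illustration, not from the post itself:

```python
import numpy as np

# In 2-D, |det| is the factor by which a transformation scales areas.
# Hypothetical example: scale x by 3 and y by 2.
A = np.array([[3.0, 0.0],
              [0.0, 2.0]])
print(np.linalg.det(A))  # 6.0: the unit square maps to a 3x2 rectangle

# A negative determinant means orientation is flipped
F = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.linalg.det(F))  # -1.0: a reflection; areas preserved, orientation flipped

# In 3-D, |det| is the factor by which volumes are scaled
M = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 4.0]])
print(np.linalg.det(M))  # 24.0: the unit cube maps to a 2x3x4 box
```

A determinant of zero would mean the transformation collapses area (or volume) entirely, which ties the determinant to invertibility.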

### #003 Linear Algebra – Linear transformations and matrices

Highlight: Hello and welcome back! This post will be quite an interesting one. We will show how one 2D plane can be transformed into another. Understanding these concepts is a crucial step toward more advanced linear algebra and machine learning methods (e.g. SVD, PCA). So, let's proceed and learn how to connect matrix-vector multiplication with a linear transformation: we cover linear transformations, linear transformations and basis vectors, and a 2×2 matrix as a linear…
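The connection between a matrix and a linear transformation can be sketched as follows; the rotation chosen here is a hypothetical example, not necessarily the one used in the post:

```python
import numpy as np

# A linear transformation is fully determined by where it sends the basis
# vectors i-hat and j-hat: their images become the columns of the matrix.
# Hypothetical example: a 90-degree counter-clockwise rotation.
i_hat_image = np.array([0.0, 1.0])   # where [1, 0] lands
j_hat_image = np.array([-1.0, 0.0])  # where [0, 1] lands
R = np.column_stack([i_hat_image, j_hat_image])

# Matrix-vector multiplication then applies the transformation to any vector
v = np.array([2.0, 1.0])
print(R @ v)  # [-1.  2.]
```

Reading the columns of a matrix as the images of the basis vectors is exactly the intuition the post builds on.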

Highlights: In this post we are going to continue our story about vectors. We will talk more about basis vectors, linear combinations of vectors, and the span of a set of vectors. We provide code examples to demonstrate how to work with vectors in Python: the basis vectors $$\hat{i}$$ and $$\hat{j}$$, different basis vectors, linear combinations of vectors, the span of vectors, and linearly independent and dependent vectors. 1. Basis vectors $$\hat{i}$$ and…
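The ideas of linear combination, span, and (in)dependence previewed above can be sketched in NumPy. The coefficients and vectors here are hypothetical illustrations, not from the post:

```python
import numpy as np

# A linear combination of the basis vectors: v = a*i_hat + b*j_hat
i_hat = np.array([1.0, 0.0])
j_hat = np.array([0.0, 1.0])
v = 3.0 * i_hat + 2.0 * j_hat
print(v)  # [3. 2.]

# Two vectors are linearly independent iff the matrix with those vectors
# as columns has full rank; their span is then the whole 2-D plane.
u = np.array([1.0, 2.0])
w = np.array([3.0, 1.0])
print(np.linalg.matrix_rank(np.column_stack([u, w])))  # 2 -> independent

# Dependent vectors: w2 is a multiple of u, so the span is only a line
w2 = 2.0 * u
print(np.linalg.matrix_rank(np.column_stack([u, w2])))  # 1 -> dependent
```

The rank test used here is one standard way to check independence numerically; the post develops the same idea geometrically.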

Highlights: In this post we will show how to implement a fundamental convolutional neural network, AlexNet, in TensorFlow 2.0. The AlexNet architecture was designed by Alex Krizhevsky and published together with Ilya Sutskever and Geoffrey Hinton. It competed in the ImageNet Large Scale Visual Recognition Challenge in 2012. We first review the theory and then give an implementation in TensorFlow 2.0. Real-life computer vision problems require large amounts of quality data to train on. In the past, people…