Mastering Linear Algebra for Machine Learning: Vectors, Dot Products, Projection and Unit Vector

Sayali Kumbhar
4 min read · Jun 7, 2024


Hey folks👋,

let’s talk about Linear Algebra in machine learning. Understanding the fundamentals of linear algebra for machine learning is crucial for anyone looking to excel in the field of data science. This blog post will introduce you to the core concepts, such as vectors and matrices in machine learning, and demonstrate their practical applications.

Why Learn It?🤔

In machine learning, linear algebra shows up everywhere: data representation, understanding algorithms, optimization techniques, dimensionality reduction, algorithm efficiency, and advanced applications. Mastering these concepts enables a deeper understanding of how algorithms work and enhances your ability to develop and implement advanced machine learning models.

Now you can clearly see why we start with Linear Algebra. Fasten your seatbelt…😀 we’re about to take off into linear algebra for machine learning!

Introduction to Vectors (2-D, 3-D, n-D), Row Vectors, and Column Vectors

Vectors: Imagine you’re standing in a park and want to describe how to get to a nearby tree. You could say: “Walk 5 steps south.” This instruction has two key pieces of information:

  1. Magnitude: The number of steps (5 steps).
  2. Direction: The direction to walk (south).

From the above example, we can say that a vector is like an arrow pointing from one place to another: it has both a magnitude and a direction.
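To make this concrete, here is a minimal sketch (assuming NumPy and a made-up x/y coordinate system where “south” is the negative y direction) that recovers the magnitude and direction of the “5 steps south” vector:

```python
import numpy as np

# A hypothetical 2-D vector for "walk 5 steps south",
# using an (x, y) coordinate system where south is the negative y direction
v = np.array([0.0, -5.0])

# Magnitude: the length of the arrow (how many steps)
magnitude = np.linalg.norm(v)                        # -> 5.0

# Direction: the angle of the arrow, measured from the positive x-axis
direction_deg = np.degrees(np.arctan2(v[1], v[0]))   # -> -90.0, i.e. pointing "south"

print(magnitude, direction_deg)
```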

A vector can be represented in two ways. A row vector is written in row form: a horizontal arrangement of the numbers.

A column vector is written in column form: a vertical arrangement of the numbers.
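As a small illustration, here is a sketch assuming NumPy, where a shape of (1, n) plays the role of a row vector and (n, 1) a column vector:

```python
import numpy as np

x = np.array([1, 2, 3])        # a plain 1-D array with three components

row_vector = x.reshape(1, -1)  # shape (1, 3): horizontal arrangement
col_vector = x.reshape(-1, 1)  # shape (3, 1): vertical arrangement

print(row_vector)              # [[1 2 3]]
print(col_vector)              # [[1]
                               #  [2]
                               #  [3]]
```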

Dot product (scalar product) and the angle between 2 vectors

The dot product of two vectors is a way to multiply them to get a single number (a scalar). It combines the lengths of the vectors and the cosine of the angle between them. But wait, what is the cosine of the angle…🤔?

This angle can tell us a lot about the relationship between the vectors. One way to quantify this relationship is by using the cosine of the angle between them.


What is the Cosine of the Angle Between Two Vectors?

The cosine of the angle between two vectors A and B is a measure of how aligned the vectors are. It is calculated using the dot product of the vectors and their magnitudes (lengths).

Formula for Cosine of the Angle

The formula for the cosine of the angle theta (θ) between two vectors A and B is:

cos(θ) = (A · B) / (∥A∥ ∥B∥)

where A · B is the dot product of A and B, and ∥A∥ and ∥B∥ are the magnitudes (lengths) of vectors A and B.

Now that you understand the dot product and the angle between two vectors, you may wonder why this is so important. Well, in machine learning, the cosine of the angle is used to measure the similarity between data points (this is the idea behind cosine similarity).
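Here is a minimal sketch of the formula above (assuming NumPy and two made-up feature vectors):

```python
import numpy as np

# Two hypothetical feature vectors (e.g. two data points)
A = np.array([1.0, 2.0, 3.0])
B = np.array([2.0, 4.0, 6.0])

# cos(θ) = (A · B) / (∥A∥ ∥B∥)
cos_theta = np.dot(A, B) / (np.linalg.norm(A) * np.linalg.norm(B))
print(cos_theta)   # 1.0 -> the vectors point in exactly the same direction

# Recover the angle itself (clip guards against tiny floating-point overshoot)
theta_deg = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
print(theta_deg)   # 0.0
```

A cosine of 1 means the vectors point in the same direction, 0 means they are perpendicular, and -1 means they point in opposite directions.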

Projection and Unit Vector

A projection in the context of vectors is like casting a shadow of one vector onto another vector. Imagine shining a light from directly above onto a tilted stick; the shadow the stick casts on the ground is the projection.

Here’s how you can think about it with vectors:

  1. Vector A: The stick.
  2. Vector B: The ground (or another direction you want to project onto).

The projection of Vector A onto Vector B is essentially how much of Vector A lies in the direction of Vector B. It tells you how far along Vector B you would go if you were to follow the direction of Vector A.
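Here is a minimal sketch of that idea (assuming NumPy and two made-up 2-D vectors), computing both the scalar projection (the length of the shadow) and the vector projection (the shadow itself):

```python
import numpy as np

A = np.array([3.0, 4.0])   # the "stick"
B = np.array([1.0, 0.0])   # the "ground" direction we project onto

# Scalar projection: how far along B the shadow of A reaches
scalar_proj = np.dot(A, B) / np.linalg.norm(B)    # -> 3.0

# Vector projection: the shadow itself, lying along B
vector_proj = (np.dot(A, B) / np.dot(B, B)) * B   # -> [3. 0.]

print(scalar_proj, vector_proj)
```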

What is a Unit Vector?

A unit vector is simply a vector that has a length (or magnitude) of 1. It’s used to describe a direction without worrying about the magnitude.
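A quick sketch, again assuming NumPy: dividing any nonzero vector by its magnitude gives the unit vector pointing in the same direction.

```python
import numpy as np

v = np.array([3.0, 4.0])

unit_v = v / np.linalg.norm(v)   # -> [0.6 0.8]

print(np.linalg.norm(unit_v))    # 1.0: same direction as v, but length 1
```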

Why are Projections and Unit Vectors Important?

Projections are used in dimensionality reduction techniques like PCA (Principal Component Analysis), in feature selection, and in similarity measurements such as KNN (K-Nearest Neighbours) or clustering methods.

Unit vectors are used to normalize data, in gradient descent, and to represent the direction of features in high-dimensional space.

By leveraging projections and unit vectors, machine learning practitioners can build more efficient, interpretable, and high-performing models.

Conclusion

Understanding the core concepts of linear algebra, such as vectors, dot products, projections, and unit vectors, is essential for anyone looking to excel in machine learning. These concepts form the foundation of many machine learning algorithms and techniques, enabling better data representation, optimization, and efficiency. By mastering linear algebra, you gain the tools to understand the inner workings of machine learning models, improve their performance, and create more interpretable and advanced models. So, fasten your seatbelt and get ready to dive deep into the world of linear algebra for machine learning! See you in the next blog 👋 for another interesting concept of Linear Algebra.
