Machine Learning is all about Vector and Matrix computations

All things you need to know before implementing Machine Learning Algorithms

Photo by Antoine Dautry on Unsplash

Machine Learning has been one of the most talked-about topics of the past few years. Industries ranging from logistics, insurance, and automotive to medicine are trying to apply Machine Learning to make predictions for their own problems. In insurance, Machine Learning can predict which potential customers are likely to take out a policy, and insurers can improve their financial forecasts based on the policies created. The automotive sector can build self-driving cars that detect their surroundings. In medicine, Machine Learning can support diagnoses such as detecting pneumonia, lung cancer, and more. These are a few examples of how powerful Machine Learning is at solving real-world problems that only a few years ago were mere prototypes. Before going deeper into these complex problems, we have to master the fundamentals, and vector and matrix computation comes first. Vectors and matrices are the two fundamental building blocks that underlie the power of Machine Learning in solving the simple to complex problems mentioned above.

Data Representation

Data can be represented in many formats such as text, images, audio, etc. The computer on which we run our machine learning algorithms cannot understand these kinds of data directly. Hence, they are converted into vector representations so that the computer can run ML algorithms on them. So, the question comes to mind:

What is a vector and why is it important in Machine Learning?

Put simply, a vector is a one-dimensional array of numbers that has a direction and a length. The set of all vectors with n entries is called an n-dimensional vector space. If there are 3 entries in the vector, it lives in a 3-dimensional vector space; if there are 5 entries, it lives in a 5-dimensional vector space, and so on. So how does this terminology relate to Machine Learning? A feature vector is a vector whose entries represent the features of an object. For instance, we can describe a house (the object) with a vector consisting of its price (y-axis) and size (x-axis), which gives a 2-dimensional vector space. The red vector in the figure below represents a house with these size and price entries.

2-dimensional vector space
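As a tiny illustration, here is how such a house could be written as a feature vector with NumPy (the size and price values below are made up for the example):

```python
import numpy as np

# A hypothetical house described by two features: size (square meters) and price (USD).
# Each house becomes one point, i.e. one vector, in a 2-dimensional feature space.
house = np.array([120.0, 250_000.0])  # [size, price]

print(house.shape)  # (2,) -> a vector with 2 entries, so a 2-dimensional feature vector
```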

The vector space containing these feature vectors is called the feature space. Another example can be seen with images. An image consists of pixel values from 0–255; the lower the value, the darker the pixel. As you can see from the image below, the grayscale image of the digit 8 consists of many pixel values. You can turn this image into a vector representation by flattening it: concatenate the rows of the pixel grid, one after another, into a single long column vector.

[0 2 15 0 … 6 0 0]ᵀ — the flattened pixel vector

Image in Pixels
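A minimal sketch of this flattening step with NumPy, assuming a 28×28 grayscale image like the digit above (random pixel values stand in for the real image):

```python
import numpy as np

# A toy 28x28 grayscale "image" with pixel intensities between 0 and 255
# (random values here, standing in for the handwritten digit 8).
image = np.random.randint(0, 256, size=(28, 28))

# Flattening concatenates the rows into one long vector,
# which is the representation a machine learning model can consume.
vector = image.flatten()

print(image.shape)   # (28, 28)
print(vector.shape)  # (784,) -> a 784-dimensional feature vector
```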

Another type of data you may come across is text in documents. The task is to convert this kind of document into vectors. A standard way to do this is one-hot encoding, which assigns each word a vector with a 1 in that word's position and 0 everywhere else. For instance, four distinct words can be represented in one-hot encoding as shown in the following:

[0 0 0 1] as apple
[0 0 1 0] as mango
[0 1 0 0] as cat
[1 0 0 0] as tiger
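A minimal sketch of this encoding in plain Python; the four-word vocabulary comes from the example above, and the particular index assigned to each word is an arbitrary choice:

```python
# The vocabulary order is arbitrary; it only fixes which position holds the 1.
vocabulary = ["tiger", "cat", "mango", "apple"]

def one_hot(word, vocab):
    """Return a vector with a single 1 at the word's position in the vocabulary."""
    vector = [0] * len(vocab)
    vector[vocab.index(word)] = 1
    return vector

print(one_hot("apple", vocabulary))  # [0, 0, 0, 1]
print(one_hot("mango", vocabulary))  # [0, 0, 1, 0]
print(one_hot("cat", vocabulary))    # [0, 1, 0, 0]
print(one_hot("tiger", vocabulary))  # [1, 0, 0, 0]
```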

You can use as many words as you like and convert them all into vectors. However, with a large vocabulary this representation produces very sparse vectors containing mostly zeros. Another caveat is that it captures no similarity between words when you take the dot product between their vectors. Recall that the dot product yields a scalar value (check your linear algebra textbook if you want wider context beyond this formula). The dot product between the apple vector and the mango vector is 0, which means no similarity, even though these two words do have something in common (both are fruit). Yet another caveat is that this representation consumes a lot of computational resources; you will notice this when implementing machine learning algorithms that rely on matrix computations to solve specific problems.

apple vector · mango vector = 0, which means there is no similarity
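To see this concretely, here is the same dot product computed with NumPy, using the one-hot vectors from the list above:

```python
import numpy as np

apple = np.array([0, 0, 0, 1])
mango = np.array([0, 0, 1, 0])

# One-hot vectors of two different words never share a non-zero position,
# so their dot product is always 0 -- the encoding cannot express that
# apple and mango are related (both are fruit).
print(np.dot(apple, mango))  # 0
```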

A vector representation that solves the problems of one-hot encoding is the vector embedding. A vector embedding is obtained by reducing the representation from a higher-dimensional space to a lower-dimensional space. One way to find vector embeddings is matrix factorization: finding smaller matrices whose product reproduces the original matrix, so that each object is represented by a replacement vector in a smaller-dimensional space. The simple intuition behind matrix factorization is that, just as we can get 60 by multiplying three numbers (6, 2, and 5), we can imagine three matrices whose product yields the matrix M (60), as shown in the image below. In practice, we can carry out the matrix factorization using Singular Value Decomposition (SVD).

60 = 6*2*5
Matrix Factorization in the context of vector embedding/ SVD
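As a rough sketch of how this looks in code, the snippet below applies SVD to a small made-up matrix and keeps only the top k singular values to get lower-dimensional embeddings (the matrix values and the choice of k are arbitrary, for illustration only):

```python
import numpy as np

# A made-up 4x5 matrix, e.g. rows = words, columns = contexts they appear in.
M = np.random.rand(4, 5)

# Singular Value Decomposition: M = U @ diag(S) @ Vt
U, S, Vt = np.linalg.svd(M, full_matrices=False)

# Keep only the k largest singular values: each row of M is now
# represented by a k-dimensional embedding instead of a 5-dimensional one.
k = 2
embeddings = U[:, :k] * S[:k]

print(M.shape)           # (4, 5)
print(embeddings.shape)  # (4, 2)
```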

SVD itself is beyond the scope of this article. If you are interested, you can get the idea of SVD by going to the Introduction to SVD tutorial and explanations.

Thank you for reading!

I really appreciate it! 🤗 If you liked the post and would like to see more, consider following me. I post about topics related to machine learning and deep learning. I try to keep my posts simple but precise, always providing visualizations and simulations.

Josua Naiborhu is a business development analyst turned self-taught Machine Learning Engineer. His interests include statistical learning, predictive modeling, and interpretable machine learning. He loves running, and it has taught him not to give up on anything, even when implementing the Machine Learning lifecycle (MLOps). Apart from pursuing his passion for Machine Learning, he is keen on investing in the Indonesian Stock Exchange and cryptocurrency. He ran full marathons at the Jakarta Marathon in 2015 and the Osaka Marathon in 2019. His next dream is to run the Boston Marathon, the TCS New York City Marathon, and the Virgin Money London Marathon.

You can connect with him on LinkedIn, Twitter, Github, Kaggle or reach out to him directly on his personal website.
