Unlocking the Robot’s Mind: Exploring Linear Algebra in Mobile Robotics!

Abhishek Mishra
3 min read · Jul 30, 2023

--

x+y=2
x+2y=7

No, not that easy, of course :P.
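(If you do want a machine to do it: here's a minimal Python sketch, with a function name of my own invention, that solves this little 2x2 system by Cramer's rule. Determinants will come back later in this post, so consider this a teaser.)

```python
def solve_2x2(a1, b1, c1, a2, b2, c2):
    """Solve a1*x + b1*y = c1 and a2*x + b2*y = c2 via Cramer's rule."""
    det = a1 * b2 - a2 * b1          # determinant of the coefficient matrix
    if det == 0:
        raise ValueError("no unique solution")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# x + y = 2 and x + 2y = 7
print(solve_2x2(1, 1, 2, 1, 2, 7))   # (-3.0, 5.0)
```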

In the previous blog, we learned about the advent of mobile robotics and how it is currently rocking the entire automotive industry. But whatever happens inside the brain of a robot is definitely not magic!

No magic literally!

Many of us used to dread learning Mathematics. I remember my sister scolding me just before my exams for my poor mathematics :’(
and it used to scare the shit out of me.

Little did I know that the skills I acquired were way beyond just solving equations —

The big brain behind it all: René Descartes, the French philosopher whose coordinate geometry laid the groundwork for linear algebra.

Unlocking core-memories —

Vectors
- An array of numbers.
- Represents a point in an n-dimensional space.

Scalar product, sum, dot product. Ring any bells? Umm, maybe not, but hold that thought; one of them matters a lot in a moment.
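In case the bells stayed silent, here are the three of them as a minimal pure-Python sketch (vectors as plain lists; the function names are my own):

```python
def vec_sum(a, b):
    """Element-wise sum of two vectors."""
    return [x + y for x, y in zip(a, b)]

def scalar_mul(k, a):
    """Scale a vector by the scalar k."""
    return [k * x for x in a]

def dot(a, b):
    """Dot product: sum of the element-wise products."""
    return sum(x * y for x, y in zip(a, b))

print(vec_sum([1, 2], [3, 4]))   # [4, 6]
print(scalar_mul(2, [1, 2]))     # [2, 4]
print(dot([1, 2], [3, 4]))       # 11
```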

Secondly, LINEAR (IN)DEPENDENCE —
A vector is linearly dependent on a set of other vectors if it can be obtained by summing up scaled versions [kx] of those vectors.
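For just two 2-D vectors, this boils down to asking whether one is a scaled version [kx] of the other, which a one-line cross-product test can check (a toy sketch; the function name is mine):

```python
def dependent_2d(u, v):
    """True if 2-D vectors u and v are linearly dependent, i.e. one is a
    scaled version of the other (equivalently, u[0]*v[1] == u[1]*v[0])."""
    return u[0] * v[1] - u[1] * v[0] == 0

print(dependent_2d([1, 2], [2, 4]))  # True: [2, 4] is 2 * [1, 2]
print(dependent_2d([1, 2], [2, 1]))  # False
```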

BIG DEAL ?— DOT PRODUCT and LINEAR (IN)DEPENDENCE!
If there are two vectors A and B, the dot product of A and B is (up to scaling) the projection of one vector onto the other, and if their dot product is zero, the vectors are said to be orthogonal to each other.
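In code, that orthogonality test is a one-liner (a pure-Python sketch; the names are mine):

```python
def dot(a, b):
    """Dot product of two vectors given as plain lists."""
    return sum(x * y for x, y in zip(a, b))

def orthogonal(a, b):
    """Two vectors are orthogonal when their dot product is zero."""
    return dot(a, b) == 0

print(orthogonal([1, 0], [0, 1]))  # True: the x and y axes
print(orthogonal([1, 1], [1, 0]))  # False
```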

Yeah, thinking of a chilled beer this sunday. Right? ;)

This blog, like every blog I make, is not sponsored by Pilsen; it's just that there were no GIFs showing orthogonality and Sunday and beer and :3….

Anyway, let's start with the Matrix!

If you’re killed in the matrix, you die here?

The above GIF caption is from the movie The Matrix, but rightly said: the matrix is one of the key building blocks of what runs inside a robot's mind!

But heyy, you know, a vector and a matrix are the same?

Ahaha, only if the matrix has a single row or column :3.

Some important matrix operations include —
- Multiplication by a scalar.
- Sum.
- Multiplication by a vector.
- Product of two matrices.
- Inversion.
- Transposition.
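Two of these are worth sketching in plain Python (matrices as lists of rows; the helper names are my own, and in real robotics code NumPy's `@` operator does both in one character):

```python
def mat_vec(A, v):
    """Multiply matrix A by vector v: each entry is a row dotted with v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def mat_mul(A, B):
    """Product of two matrices: rows of A dotted with columns of B."""
    Bt = list(zip(*B))  # columns of B
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

A = [[1, 2], [3, 4]]
print(mat_vec(A, [1, 1]))            # [3, 7]
print(mat_mul(A, [[1, 0], [0, 1]]))  # A times identity gives A back
```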

BIG DEAL? RANK OF A MATRIX!

  • Maximum number of linearly independent rows (equivalently, columns).
  • Dimension of the image of the transformation the matrix represents.
  • Computation of the rank is done by —
    - Gaussian elimination of the matrix.
    - Counting the number of non-zero rows left over.
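A quick pure-Python sketch of exactly that recipe (the function name is mine; in practice `numpy.linalg.matrix_rank` does the job):

```python
def rank(A, eps=1e-9):
    """Rank via Gaussian elimination: reduce to row-echelon form,
    then count the non-zero (pivot) rows."""
    M = [row[:] for row in A]      # work on a copy
    rows, cols = len(M), len(M[0])
    r = 0                          # current pivot row
    for c in range(cols):
        # find a row at or below r with a non-zero entry in column c
        pivot = next((i for i in range(r, rows) if abs(M[i][c]) > eps), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # eliminate column c from the rows below the pivot
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

print(rank([[1, 2], [2, 4]]))  # 1: the second row is twice the first
print(rank([[1, 0], [0, 1]]))  # 2: full rank
```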

There is a great blog on this. To learn more about this, click here.

Here comes the BOSS stage👾. The Determinant.

I am determinant!!

As irrelevant as this GIF might look, trust me, I could not find anything better than this.

Finding the determinant of a 25x25 matrix by cofactor expansion is a compute-intensive process: it needs on the order of n! multiplications, which comes to roughly 1.5x10²⁵ for n = 25.

Faster methods, like Gauss Elimination, bring the matrix into triangular forms, which can then be used to calculate the determinant.

Properties like row operations, transposition, and multiplication help bring the matrix to a triangular form, whose determinant is simply the product of the diagonal elements (with the sign flipped once for every row swap).
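Here's a pure-Python sketch of that idea (my own toy implementation, not a library routine); it takes on the order of n³ steps instead of n!:

```python
def det(A, eps=1e-12):
    """Determinant via Gaussian elimination: reduce to upper-triangular
    form, multiply the diagonal, and flip the sign for each row swap."""
    M = [row[:] for row in A]      # work on a copy
    n = len(M)
    sign = 1.0
    for c in range(n):
        pivot = next((i for i in range(c, n) if abs(M[i][c]) > eps), None)
        if pivot is None:
            return 0.0             # no pivot in this column: singular matrix
        if pivot != c:
            M[c], M[pivot] = M[pivot], M[c]
            sign = -sign           # each swap flips the determinant's sign
        for i in range(c + 1, n):
            f = M[i][c] / M[c][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[c])]
    prod = sign
    for i in range(n):
        prod *= M[i][i]
    return prod

print(det([[1, 2], [3, 4]]))  # -2.0
```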

Let me give you one more reason why determinants are necessary.
They help us calculate eigenvalues [det(A − λI) = 0], as well as areas and volumes of closed figures.
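For a 2x2 matrix, det(A − λI) = 0 expands to a simple quadratic, λ² − (a + d)λ + (ad − bc) = 0, which we can solve directly (a sketch with a function name of my own; it assumes real eigenvalues):

```python
import math

def eigvals_2x2(a, b, c, d):
    """Real eigenvalues of [[a, b], [c, d]] from det(A - lambda*I) = 0,
    i.e. lambda^2 - (a + d)*lambda + (a*d - b*c) = 0."""
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det       # discriminant of the quadratic
    if disc < 0:
        raise ValueError("eigenvalues are complex")
    s = math.sqrt(disc)
    return (tr - s) / 2, (tr + s) / 2

print(eigvals_2x2(2, 0, 0, 3))  # (2.0, 3.0): a diagonal matrix's diagonal
```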

Lastly, an orthogonal matrix is one in which the product of the matrix and its transpose is an identity matrix. The square of its determinant is 1, so the determinant itself is always +1 or -1.
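A tiny pure-Python check of this property on the most famous orthogonal matrix in robotics, the 2-D rotation matrix (the helper names are my own):

```python
import math

def transpose(A):
    """Swap rows and columns."""
    return [list(col) for col in zip(*A)]

def mat_mul(A, B):
    """Product of two matrices: rows of A dotted with columns of B."""
    Bt = list(zip(*B))
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def is_orthogonal(Q, eps=1e-9):
    """Q is orthogonal when Q times its transpose is the identity."""
    P = mat_mul(Q, transpose(Q))
    n = len(Q)
    return all(abs(P[i][j] - (1.0 if i == j else 0.0)) < eps
               for i in range(n) for j in range(n))

t = math.radians(30)
R = [[math.cos(t), -math.sin(t)],   # rotation by 30 degrees: a classic
     [math.sin(t),  math.cos(t)]]   # orthogonal matrix in robotics
print(is_orthogonal(R))             # True
```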

Too much maths for the day! But this is what the roboticists signed up for ;).

Adios!

Pssttt, come here, will take you to the next blog :3.
