Right and Left Matrix Multiplication

Joshua Pickard
Published in Geek Culture
4 min read · Jan 27, 2022

Matrix multiplication is one of the most important operations in math and computing. This article presents 2 better ways to think about matrix multiplication, relative to how you probably first learned it.

Image found on Wikipedia

Everyone first learns matrix multiplication where each element in the product is calculated separately. For A×b=c, where A is a matrix and b and c are vectors, each term in c is calculated by multiplying the ith row of A with the column vector b. The terms of c are calculated one at a time, making this a quick and easy way to learn matrix vector multiplication. However, this method doesn’t capture the structure of the matrix transformation well.

A better way to think about matrix vector multiplication is as a linear combination of the rows or columns of A. Rather than calculating each term in c one at a time, a series of vectors the size of c is calculated and then added together to obtain the final result. This gives insight into the type of transformation the matrix is performing. Whether it uses the rows or the columns of A depends on which side of the matrix b is on. The next sections describe these 2 ways to think about matrix vector multiplication.

Right Multiplication: Column Space

Most often, people do matrix vector multiplication as A×b, where b is on the right side of A. This is the more common way to see multiplication, because it fits well with solving linear systems, which is typically the motivation for matrix algebra. If A is an m×n matrix, so that it has m rows and n columns, then b must be a column vector with n rows.

The product of the multiplication, c, is a linear combination of the n column vectors of A. Because a vector on the right side of A produces a linear combination of the columns of A, right multiplication is said to use the column space.

Right multiplication with the column space. Image by Eli Bendersky on thegreenplace.net

In the above figure, A is a 3×3 matrix, with columns of different colors. The vector b has 3 elements. The product, all the way on the right, is a linear combination of the columns of A, where the elements in b are the coefficients of the linear combination.
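To make this concrete, here is a small NumPy sketch of the column space view. The matrix and vector are made-up values (not taken from the figure): the product is built by scaling each column of A by the matching element of b and summing.

```python
import numpy as np

# A hypothetical 3x3 matrix and coefficient vector (example values)
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
b = np.array([2.0, -1.0, 0.5])

# Build the product as a linear combination of the columns of A,
# with the elements of b as the coefficients
c = sum(b[i] * A[:, i] for i in range(A.shape[1]))

# The result matches NumPy's built-in right multiplication
print(np.allclose(c, A @ b))  # True
```

The loop never computes an individual entry of c on its own; each iteration contributes a whole scaled column, which is exactly the column space picture.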

Left Multiplication: Row Space

Left multiplication works by the same principles as right multiplication, but since the vector is on the other side of A, the column space is swapped for the row space. The setup is similar: A is m×n and b is a vector with m elements. In right, or column space, multiplication, b has as many elements as A has columns; in left, or row space, multiplication, b has as many elements as A has rows.

The product of b×A is a linear combination of the m rows of A, where the elements of b are the coefficients of the linear combination.

Left multiplication with the row space. Image by Eli Bendersky on thegreenplace.net

In the above figure, A is the matrix with rows of different colors, and b has exactly as many elements as A has rows. The product, shown on the right, is a linear combination of the rows of A with the elements of b as the coefficients.
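The row space view can be sketched the same way (again with made-up example values): each row of A is scaled by the matching element of b and the scaled rows are summed.

```python
import numpy as np

# A hypothetical 3x3 matrix and coefficient vector (example values)
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
b = np.array([2.0, -1.0, 0.5])

# Build the product as a linear combination of the rows of A,
# with the elements of b as the coefficients
c = sum(b[i] * A[i, :] for i in range(A.shape[0]))

# The result matches NumPy's built-in left multiplication
print(np.allclose(c, b @ A))  # True
```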

Python Implementations

Below are 2 functions written in Python that perform right and left matrix vector multiplication, following the intuition described above. This code and some examples can be found in this colab file.

import numpy as np

def right_multiplication(A, b):
    # A is m x n, b is an n x 1 column vector; the product C is m x 1
    m, n = A.shape
    C = np.zeros((m, 1))
    for i in range(n):
        # add the ith column of A, scaled by the ith element of b
        C += np.reshape(b[i, 0] * A[:, i], (m, 1))
    return C

def left_multiplication(b, A):
    # b is a 1 x m row vector, A is m x n; the product C is 1 x n
    m, n = A.shape
    C = np.zeros((1, n))
    for i in range(m):
        # add the ith row of A, scaled by the ith element of b
        C += np.reshape(b[0, i] * A[i, :], (1, n))
    return C

The above 2 functions are structured very similarly, differing only in the for loop. For right, column space multiplication, the loop iterates over the columns of A, and for left, row space multiplication, the loop iterates over the rows of A. Inside the loop, both functions increment C by a column or row of A, respectively, scaled by the corresponding ith element of b.
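As a quick sanity check, the same loop logic can be exercised inline on a non-square matrix and compared against NumPy's built-in operators. The matrix and vectors here are made-up example values; note that the column vector for right multiplication and the row vector for left multiplication have different lengths when A is not square.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])           # 3 x 2

b_col = np.array([[1.0], [2.0]])     # 2 x 1, for right multiplication
b_row = np.array([[1.0, 2.0, 3.0]])  # 1 x 3, for left multiplication

m, n = A.shape

# Right multiplication: accumulate scaled columns of A into an m x 1 result
c_right = np.zeros((m, 1))
for i in range(n):
    c_right += np.reshape(b_col[i, 0] * A[:, i], (m, 1))

# Left multiplication: accumulate scaled rows of A into a 1 x n result
c_left = np.zeros((1, n))
for i in range(m):
    c_left += np.reshape(b_row[0, i] * A[i, :], (1, n))

print(np.allclose(c_right, A @ b_col))  # True
print(np.allclose(c_left, b_row @ A))   # True
```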

Although NumPy has built-in functions for matrix multiplication, which are more efficient than the functions above, understanding these 2 perspectives on matrix multiplication can give insight into how matrices are really linear transformations, and into the other secrets matrices hold.
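For reference, NumPy's `@` operator covers both cases directly; which space is used depends only on which side of A the vector sits (a made-up 2×2 example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([1.0, -1.0])

right = A @ b  # column space: combines the columns of A
left = b @ A   # row space: combines the rows of A

print(right)  # [-1. -1.]
print(left)   # [-2. -2.]
```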

If you made it this far, hit the clap button. I’m new to Medium, and trying to crank out some content about how I think about math, data science, and computers. Follow me if that’s your sort of thing.



Computer Science and Bioinformatics @ University of Michigan. Website: https://jpickard1.github.io/ Twitter: @JoshuaPickard_