Eigenvectors and Eigenvalues — Linear Algebra for QC

In the last post of the Math for Quantum Computing series we talked about matrices and some of their operations. Now it is time to discuss a very important concept related to matrices: eigenvectors. Let's get started!

Harshit Gupta · Quantum Untangled · Mar 15, 2021

This is the third article in the series Linear Algebra for Quantum Computing, which covers everything you would need in order to get started in the field. This article assumes that you are familiar with the key concepts of vectors and matrices and builds new concepts on top of them. You may find the previous articles in our series useful to brush up on those topics here

In the previous posts we explored some very powerful mathematical representations of real-world entities modeled by vectors and matrices. Today we will discuss something inherent to a matrix: its eigenvectors and eigenvalues.

Intuition behind Eigenvectors

We established the fact that matrices are linear operators which, when multiplied by a vector, transform that vector into some new vector. In this transformation it is quite safe to assume that the newly formed vector may have a different length as well as a different orientation in space. Since we are not required to go beyond three dimensions for the purposes of this article, attaching an orientation in space to a vector is a safe way to think about it for now.

Building upon that fact, we can think of matrices as machines and vectors as simple rays on which they act. Through this, we can model the action of a matrix as simply squeezing, stretching or rotating our ray in space by some arbitrary amount. The graphic below may clear things up a bit.

Image showing how matrices transform vectors

What we see is that the resultant vector is a linear combination of the columns of the matrix and since the matrix can be arbitrary, the vector may rotate and change its length. What if we did not want our vector to rotate?
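If you would like to play with this yourself, here is a minimal numpy sketch of that column-combination view. The matrix and vector are arbitrary illustrative choices, not the ones from the graphic above.

```python
import numpy as np

# An arbitrary 2x2 matrix and vector (illustrative values only,
# not the ones from the graphic above).
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([1.0, 2.0])

# The matrix-vector product...
Mv = M @ v

# ...equals the columns of M combined linearly,
# weighted by the components of v.
combo = v[0] * M[:, 0] + v[1] * M[:, 1]

print(Mv)                      # [4. 6.]
print(np.allclose(Mv, combo))  # True
```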

This is where eigenvectors come in. Eigenvectors are a special class of vectors, associated with a particular matrix, such that they maintain their sense of direction in space when acted upon by the matrix (again, this analogy suffices as long as we are concerned with three or fewer dimensions). Here is the formal definition:

An eigenvector or characteristic vector of a linear transformation, or a matrix, is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it.

The term eigenvector comes from the German word 'eigen', which means proper or characteristic. This may give you a hint as to why eigenvectors are called proper: in some sense they really are a characteristic of each matrix, since each matrix has a particular set of eigenvectors defined for it.

For a simple analogy, you may think of matrices as machines which act on objects, i.e. our vectors. These machines change the objects' length and orientation, but each machine has its own special objects, the eigenvectors, which may shrink or stretch but do not change their orientation in space.

The blue and violet vectors are the eigenvectors under the above linear transformation

Given the above intuition for the meaning of eigenvectors, let us dive into their mathematical form and see how eigenvalues come into play. We shall also look at how we could actually find the eigenvectors of a particular matrix.

The Mathematics

For a matrix M acting on a vector space V, if a nonzero vector v from the vector space V satisfies the following equation:

M v = a v    (Eq. 1)

where a is a scalar quantity, then v is said to be an eigenvector of the matrix M and a is said to be the corresponding eigenvalue.

Coming this far and seeing Eq. 1, some of you may have a doubt: what if the scalar quantity is negative? Doesn't that flip our vector? The answer is yes, it does flip the vector, and that is a perfectly reasonable doubt. By convention, however, a vector that ends up parallel or anti-parallel to the initial vector is still considered to preserve its direction, since it still lies on the same line as the initial vector.

Let's look at a simple example involving the matrix I, the 2 × 2 identity matrix, and try to see what its eigenvectors and eigenvalues are.

I = [[1, 0], [0, 1]]

For this matrix, as we can easily see, each vector stays the same when multiplied by it. This means that, for any 2-dimensional vector v,

I v = 1 · v = v

This shows that every vector v is an eigenvector of I and the corresponding eigenvalue is +1.
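A quick sanity check in numpy; the test vector below is an arbitrary choice:

```python
import numpy as np

I = np.eye(2)              # the 2x2 identity matrix
v = np.array([3.0, -1.5])  # any 2-D vector will do

# I @ v leaves v unchanged, so v is an eigenvector with eigenvalue +1.
print(np.allclose(I @ v, 1.0 * v))  # True
```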

Another simple example may be —

A matrix example

with eigenvectors of the form —

Eigenvectors of matrix M

with respective eigenvalues 3 and -1.

Courtesy of 3Blue1Brown

Even in the left image you can clearly see what we mean by preservation of orientation: the vector [-1, 2] only scales by 2 (its eigenvalue) under the transformation by the original matrix.
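numpy can also compute eigen-pairs for us directly. The matrix from the example above lives in an image that is not reproduced here, so the sketch below uses [[1, 2], [2, 1]], one matrix that does have eigenvalues 3 and -1, as an assumed stand-in:

```python
import numpy as np

# [[1, 2], [2, 1]] is an assumed stand-in with eigenvalues 3 and -1;
# the article's original matrix is in an image not reproduced here.
M = np.array([[1.0, 2.0],
              [2.0, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(M)
print(eigenvalues)  # e.g. [ 3. -1.] (order may vary)

# Each column of `eigenvectors` satisfies M @ v == eigenvalue * v.
for val, vec in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(M @ vec, val * vec))  # True, True
```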

But how do you find these eigenvectors and eigenvalues by hand? For a 2-dimensional vector space, it is not very tough. The general idea is to write the matrix as:

M = [[a, b], [c, d]]

and the vectors as:

v = [v₁, v₂]

Now, for the eigenvector equation to hold true, we need

a · v₁ + b · v₂ = e · v₁
c · v₁ + d · v₂ = e · v₂

for some scalar quantity e. Moving everything to one side and demanding a nonzero solution for v (which forces the determinant of the resulting system to vanish) removes the components of v and gives us:

(a − e)(d − e) − bc = 0, i.e. e² − (a + d) · e + (ad − bc) = 0

You may easily solve for e using the quadratic formula and find the eigenvalues of the matrix. Note that they may come out complex, and that's okay!
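As a quick sketch, you can check that the roots of this quadratic match what numpy's eigenvalue routine reports; the entries a, b, c, d below are arbitrary illustrative values:

```python
import numpy as np

# Arbitrary entries for a general 2x2 matrix.
a, b, c, d = 1.0, 2.0, 2.0, 1.0
M = np.array([[a, b],
              [c, d]])

# Roots of e^2 - (a + d)e + (ad - bc) = 0 ...
roots = np.roots([1.0, -(a + d), a * d - b * c])

# ... match the eigenvalues computed directly.
print(np.sort(roots))                 # [-1.  3.]
print(np.sort(np.linalg.eigvals(M)))  # [-1.  3.]
```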

For the general case too, you could write out your matrix–vector product and equate each row element with the value e · vᵢ, where vᵢ is the iᵗʰ component of the vector v. What you will get is a polynomial of degree n, and the n roots of this polynomial correspond to the eigenvalues of your matrix.

The above task is rather tedious though, and we shall stick to 2-dimensional eigenvectors because they are the ones relevant for our discussion of quantum. For a better way to find the eigenvectors and eigenvalues of an n x n matrix, you may check this out.
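In practice you would simply hand the whole job to a library routine. A minimal sketch; the 3 × 3 matrix is an arbitrary example:

```python
import numpy as np

# An arbitrary 3x3 example; the same call works for any n x n matrix.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # the three roots (2, 3 and 5) of the degree-3 characteristic polynomial
```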

Application in Quantum

But why do we care about eigenvectors for quantum? Because eigenvectors and eigenvalues are precisely the abstract mathematical concepts used to model some very concrete real-world entities.

Which entities exactly? In quantum mechanics, any observable or measurable quantity is represented as a linear operator, more precisely a Hermitian operator. Moreover, the states of a qubit are represented as vectors belonging to a 2-dimensional complex vector space. These Hermitian operators, as stated in earlier posts, can be represented as matrices in a particular basis. They act on quantum states and produce some different resulting states, which, when measured, produce some results.

What results exactly? When these states are measured, the results produced are precisely the allowed values of our operators. For example, the energy operator would act on a state and take it into some new quantum state, and a measurement would then produce one of the possible values of energy.

Since the values of observables are real quantities, it makes sense not to have any imaginary parts in the values that represent them mathematically. This is where Hermitian matrices, and thus their eigenvectors and eigenvalues, come in. With some effort, using the definitions of Hermitian matrices and eigenvectors, you can easily prove that the eigenvalues of a Hermitian operator are always real.
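You can also check this numerically. Below is a minimal sketch; the random construction A + A† is just one convenient way to produce a Hermitian matrix for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A + A† (A plus its conjugate transpose) is always Hermitian;
# the random A here is just for illustration.
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
H = A + A.conj().T

print(np.allclose(H, H.conj().T))                 # True: H equals its conjugate transpose
print(np.allclose(np.linalg.eigvals(H).imag, 0))  # True: its eigenvalues are real
```

This takes us to a principle of quantum mechanics which ties together everything we have read so far: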

The possible results of measuring an observable are the eigenvalues of the operator which represents the observable. If we call these eigenvalues aᵢ, the state for which the result is unambiguously aᵢ is the corresponding eigenvector |aᵢ⟩.

When an observable is measured, the result is always a real number drawn from a set of possible results. For example, if the energy of an atom is measured, the result would be one of the energy levels of the atom.

The above principle defines the relation between the operator that represents an observable and the possible numerical results of measuring it: the result of a measurement is always one of the eigenvalues of the corresponding operator. Now you may have some intuition as to why observables are chosen to be Hermitian operators. It is because they have real eigenvalues, which makes them a perfect fit for modeling observables.

So, we can say that eigenvectors and eigenvalues are actually the models used to represent the unambiguous states of observables and the possible results of measuring them!
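To make this concrete, here is a small sketch built around the Pauli-Z operator, the standard observable for measuring a qubit in the computational basis (a textbook example, not one worked in the images above). Its eigenvalues +1 and -1 are the two possible measurement results, and its eigenvectors are the basis states |0⟩ and |1⟩:

```python
import numpy as np

# The Pauli-Z observable, written in the computational basis.
Z = np.array([[1.0, 0.0],
              [0.0, -1.0]])

# eigh is numpy's eigen-solver for Hermitian matrices;
# it returns the eigenvalues in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(Z)

print(eigenvalues)         # [-1.  1.]: the two possible measurement results
print(eigenvectors[:, 1])  # [1. 0.]: the state |0>, which always yields +1
print(eigenvectors[:, 0])  # [0. 1.]: the state |1>, which always yields -1
```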

Coming to the end of this post, you should now be somewhat familiar with what the concept of eigenvectors and eigenvalues is and how it comes up in quantum computation. Thank you for joining in for yet another exciting post, and stay tuned for more as we at Quantum Untangled explore new domains of math and technology related to Quantum Computation and Information.
