Spans and Linear Dependence.

Benjamin Vanous
Jun 9, 2022


This article is a follow-up to my other article here on matrix inversion; if you're not quite sure what that is or don't understand it very well, I'd recommend you check out that article first. In short, I wrote about matrix inversion and went over when it is and isn't possible to solve a problem with it. In this article I'll be going a bit deeper into that problem with the help of linear dependence and spans.

Linear Span:

Linear span, or span for short, is the linear space formed by all the vectors that can be written as linear combinations of the vectors belonging to a given set.

For a better understanding, let's first take a set of vectors:
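$$\{v_1, v_2, \dots, v_n\}, \qquad v_i \in \mathbb{R}^n$$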

As well as a set of corresponding scalar values:
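$$\{x_1, x_2, \dots, x_n\}, \qquad x_i \in \mathbb{R}$$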

In order to get all the linear combinations of this vector set in its space (ℝⁿ), we need to describe the set of all vectors b in ℝⁿ that these combinations can reach. We build each such b by multiplying every vector by its corresponding scalar and adding the results:
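$$b = x_1 v_1 + x_2 v_2 + \dots + x_n v_n$$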

The collection of all the linear combinations of this vector set is considered its span, and is denoted like this:
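$$\operatorname{Span}\{v_1, v_2, \dots, v_n\}$$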

Furthermore:
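$$\operatorname{Span}\{v_1, \dots, v_n\} = \{\, b \in \mathbb{R}^n : b = x_1 v_1 + \dots + x_n v_n \ \text{for some}\ x_1, \dots, x_n \in \mathbb{R} \,\}$$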

The above reads that the span of a vector set is the collection of every vector b you can produce by multiplying the vectors by their scalars and adding them up, where each such b lives in the same space ℝⁿ as the vectors themselves.

Let's look at a few pictures for an even better understanding:

If we have one vector v and we make multiples of it, we can get anywhere along the line below, shown as Span{v}. This is considered a 1D space (ℝ¹) because there's one span:

If we take a second vector w that's NOT a multiple of v, then we can get anywhere in the plane by taking combinations of v and w, shown with two spans, Span{v} and Span{w}. This is considered a 2D space (ℝ²) because there are two spans:

If we take a third vector u that's NOT a multiple of v or w, meaning it's impossible to find u using any scalar values multiplied by the other vectors (x₁v + x₂w ≠ u), then we are able to get anywhere within this space, shown with three spans: Span{v}, Span{w} and Span{v,w}. This is considered a 3D space (ℝ³) because there are three spans:

Now that we understand the concept of spans, we can move on to linear independence, which also plays an important role in understanding vector spaces.

Linear Independence:

In the above section, we mentioned that in order to find the span of a vector set, we needed to be able to find a solution for this equation:
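$$x_1 v_1 + x_2 v_2 + \dots + x_n v_n = b$$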

as well as proving that the vectors all live in the same space (ℝⁿ). We can go further with this and say that if one vector in the set can itself be written as such a combination of the others, then it already lies in their span, signifying that the set is linearly dependent.

To explain this further, let's first take a set of vectors like we did in the first section:
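$$\{v_1, v_2, \dots, v_n\}$$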

As well as a set of scalar values:
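$$\{x_1, x_2, \dots, x_n\}$$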

And multiply each vector by its corresponding scalar, setting the whole combination equal to the zero vector:
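$$x_1 v_1 + x_2 v_2 + \dots + x_n v_n = 0$$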

We can say that a set of vectors is linearly independent if the only scalars that satisfy this equation are all zero (the trivial solution x₁ = x₂ = … = xₙ = 0).

Furthermore, we can say that a set of vectors is linearly dependent if there exist scalars, not all zero, that satisfy it (a nontrivial solution).

How to tell independence when the scalars are unknown:

In most cases you will not have the scalar values at hand. In order to find them, we treat the vectors as the columns of a matrix and use row operations to try to reduce it down to the identity matrix; if you're unsure what this means, refer to my other article here that goes into depth about that topic. Let's look at a few examples of this.

Example 1:

Let's say we have a set of vectors like the one below:

And then multiply them by an unknown set of scalars below:

In order to find out whether this set of vectors is linearly independent or not, we use row operations to reduce it down to the identity matrix:

The set of vectors can be reduced down to the identity matrix, signifying that the only solution is the trivial solution. This also means the scalars are all equal to 0, meaning this set of vectors is linearly independent.

When this set of vectors is plotted out, you will notice the vectors sit in a 3D space as they should, or in other words they're all on different spans. This tells us that the vectors are all independent from each other:
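If you'd like to try this kind of reduction yourself, here's a minimal sketch in Python using sympy. The three vectors below are hypothetical example values of my own, but any independent set behaves the same way:

```python
from sympy import Matrix

# Hypothetical example vectors (my own values); each one is a column of A.
A = Matrix([[1, 1, 1],
            [0, 1, 1],
            [0, 0, 1]])

# rref() returns the reduced row echelon form and the pivot columns.
R, pivots = A.rref()
print(R)       # the 3x3 identity matrix
print(pivots)  # (0, 1, 2): a pivot in every column

# A pivot in every column means the only solution to A*x = 0 is x = 0,
# i.e. the trivial solution, so the columns are linearly independent.
print(len(pivots) == A.cols)  # True
```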

Example 2:

Let's use another set of vectors:

We then proceed to multiply this set of vectors by its unknown scalars again:

Then use row operations to try to reduce the set down to the identity matrix:

*Notice the row of 0’s along the bottom row of the matrix.

From the above reduction, we find that it's impossible to reduce this vector set down to the identity matrix, which signifies that the scalars have a nontrivial solution; in this example it's x = -2z, y = -z. This proves the vector set is linearly dependent.

When this set of vectors is plotted out, you will notice that rather than the vector set filling a 3D space as it should, the vectors all lie on the same plane, similar to a 2D space. This shows that they're linearly dependent on each other:
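Here's the same check in Python, this time with a hypothetical dependent set of my own, built so that v3 = 2·v1 + v2, which reproduces the relation x = -2z, y = -z from above:

```python
from sympy import Matrix

# Hypothetical example vectors (my own values), built so v3 = 2*v1 + v2.
v1, v2 = [1, 2, 3], [4, 5, 6]
v3 = [2*a + b for a, b in zip(v1, v2)]  # [6, 9, 12]

# Stack the vectors as columns and row reduce.
A = Matrix.hstack(Matrix(v1), Matrix(v2), Matrix(v3))
R, pivots = A.rref()
print(R)
# [1, 0, 2]
# [0, 1, 1]
# [0, 0, 0]  <- row of zeros: no pivot in the third column.
# Reading the rows as equations: x + 2z = 0 and y + z = 0, so x = -2z, y = -z.

# nullspace() gives a basis for all solutions of A*x = 0.
print(A.nullspace())  # [Matrix([[-2], [-1], [1]])] -- a nontrivial solution
```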

Example 3:

Let's look at one last example, continuing the same process as the others, only this time using a parametric form, similar to the equations I used in my other articles:
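$$Ax = 0$$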

And similar to before, A will be in place of our vector set, with the syntax looking similar to this:
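$$A = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}, \qquad x = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$$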

If we were to take a random set of vectors for our variable A like the one below:

And apply row operations like we did in the other examples, we would find that the scalar values x would be equal to this:

These vectors are considered linearly independent. Why is this? Let's look further:

We see from the image above that x₂ and x₃ (and likewise x₁) all come out to 0, and the same will happen in any example like it because of where the pivot entries sit. In other words, you don't need to fully reach the identity matrix: as long as the 1s run along the diagonal and there are all zeros beneath them, there's a pivot in every column, the only solution is the trivial one, and the set is independent.

Extra:

One last thing to mention: linear independence and dependence are terms that apply to a collection of vectors. It wouldn't make sense to say "this vector is linearly dependent on these other vectors", or "this matrix is linearly independent".

Conclusion:

In this article we learned that one simple equation gives us the ability to tell what the span of a vector set is, as well as whether its vectors are linearly independent or not. I hope you've enjoyed it, and if you notice anything wrong or out of place please let me know. Many thanks!

Sources:

https://textbooks.math.gatech.edu/ila/spans.html

https://www.statlect.com/matrix-algebra/linear-span

https://en.wikipedia.org/wiki/Linear_span

https://en.wikipedia.org/wiki/Linear_independence

https://www.deeplearningbook.org/contents/linear_algebra.html

https://textbooks.math.gatech.edu/ila/linear-independence.html

https://www.math.ucdavis.edu/~linear/linear-guest.pdf
