Confused about Tensors, Dimensions, Ranks, Orders, Matrices and Vectors?

When it comes to the terms _dimensionality_, _rank_ and _order_, you have to distinguish whether the term describes a vector or a matrix, or whether it describes a tensor. In the first case it refers to mathematical properties of the vector or matrix; in the second it describes the shape of a data structure called a _tensor_.

Let me explain…

A tensor is nothing more than a mathematical data structure: a way to organize data, together with a set of rules (defined in linear algebra) for the mathematical operations on it.

The _dimensionality_, _rank_ or _order_ of a tensor describes how far it extends in space, i.e. along how many axes it is laid out. The following table illustrates this:

| Dimensionality / rank / order | Data structure |
| --- | --- |
| 0 | scalar |
| 1 | vector |
| 2 | matrix |
| 3 | three-dimensional tensor |
| n | n-dimensional tensor |
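To make the table concrete, here is a minimal sketch, assuming NumPy as the tensor library (any other tensor library behaves the same way): each structure reports its rank via `ndim` and its extent along each axis via `shape`.

```python
import numpy as np

scalar = np.array(5.0)              # rank 0: a single value
vector = np.array([1.0, 2.0, 3.0])  # rank 1: a line of values
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0]])     # rank 2: rows and columns
tensor3 = np.zeros((2, 3, 4))       # rank 3: layers of matrices

for name, t in [("scalar", scalar), ("vector", vector),
                ("matrix", matrix), ("3-d tensor", tensor3)]:
    print(f"{name}: rank/ndim = {t.ndim}, shape = {t.shape}")
```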

Now that the fundamental data structures are covered, let me introduce some of the capabilities and properties of each of them.

Scalars span a one-dimensional vector space, since a scalar can unambiguously address any point in it. A one-dimensional vector space is also called a _line_. This means that with an infinite number of scalar values you can address any point on a line.

Vectors span a vector space of any dimension, depending on the number of elements in the vector, as they can unambiguously address any point in an arbitrarily large (in terms of number of dimensions) vector space. The number of elements in a vector is also called its dimension or order. This means that with an infinite number of vectors you can address any point in any vector space, regardless of how many dimensions it has. A two-dimensional space is also called a _plane_. Depending on the theory you follow, our universe has either three, four (Einstein, space-time), five (Theodor Kaluza) or eleven (Edward Witten, M-Theory) dimensions. But the MNIST data set, the "hello world" data set of neural networks, has 784 dimensions, since each image has 28 by 28 pixels and is flattened into a vector of 784 values.
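As an illustrative sketch of the MNIST example in NumPy (assuming each image arrives as a 28 × 28 array, which is how MNIST is usually distributed):

```python
import numpy as np

# A single stand-in for an MNIST image: 28 by 28 pixels.
image = np.random.rand(28, 28)

# Flattened into a vector, the image becomes one point
# in a 784-dimensional vector space.
point = image.reshape(-1)
print(point.shape)  # (784,)
```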

Matrices span a vector space of any dimension because, depending on the number of columns, they can unambiguously address any point in an arbitrarily large (in terms of number of dimensions) vector space. Each row in a matrix corresponds to one point in that space. In other words, a matrix is just a collection of points, or vectors of the same length. The number of rows tells you how many points you have in that space, and the number of columns tells you how many dimensions this space has.
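A quick NumPy sketch of this reading of a matrix (the values here are made up for illustration):

```python
import numpy as np

# Five points in a three-dimensional space:
# each row is one point, each column one dimension.
points = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0],
                   [1.0, 1.0, 1.0]])

n_points, n_dims = points.shape
print(n_points, n_dims)  # 5 3
```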

The order of a matrix is its number of rows (usually mentioned first) and columns (usually mentioned last); a 3 × 2 matrix, for example, has three rows and two columns.

The rank of a matrix is the number of its linearly independent rows (or, equivalently, columns) and is often confused with the order of a matrix.
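A short NumPy sketch of the difference (the matrix is a made-up example with one linearly dependent row):

```python
import numpy as np

m = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # 2 * the first row -> linearly dependent
              [0.0, 1.0, 1.0]])

print(m.shape)                   # (3, 3) <- the order: 3 rows, 3 columns
print(np.linalg.matrix_rank(m))  # 2      <- the rank: only 2 independent rows
```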

Three-dimensional tensors span multiple parallel vector spaces of the same dimensionality. Depending on the number of columns, they can unambiguously address any point in an arbitrarily large (in terms of number of dimensions) vector space. Each row in a three-dimensional tensor corresponds to one point in that space, and each layer can be seen as a separate vector space with the same number of dimensions and points.
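Again as a NumPy sketch (the shape is an arbitrary example):

```python
import numpy as np

# Three "layers", each a 4 x 2 matrix: three parallel
# two-dimensional spaces with four points each.
t = np.arange(24, dtype=float).reshape(3, 4, 2)

print(t.ndim)   # 3
print(t.shape)  # (3, 4, 2): layers, points per layer, dimensions
print(t[0])     # the first layer is an ordinary 4 x 2 matrix
```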

I haven’t used four-dimensional tensors for any practical application so far, but I’ve heard you need them to understand general relativity :)
