# Part 23 : Orthonormal Vectors, Orthogonal Matrices and Hadamard Matrix

Two vectors x and y are orthogonal if they are perpendicular to each other, i.e. their dot product is zero: xᵀy = 0.

# Orthonormal Vectors

If the orthogonal vectors x and y are also unit vectors (‖x‖ = ‖y‖ = 1), then they are orthonormal.

To summarize, for a set of vectors to be orthogonal :

1. They should be mutually perpendicular to each other (each pair subtends an angle of 90 degrees, so every pairwise dot product is 0).

For a set of vectors to be orthonormal :

1. They should be unit vectors.
2. They should be orthogonal.

## Property of orthonormal vectors

Assume vectors q1, q2, q3, ……., qn are orthonormal vectors. Then

qᵢᵀqⱼ = 0 when i ≠ j, and qᵢᵀqᵢ = 1 for every i

Example: the standard basis vectors e1 = (1, 0) and e2 = (0, 1) are orthonormal, since e1ᵀe2 = 0 and each has length 1.
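The property above can be checked numerically. The sketch below uses NumPy and a hypothetical orthonormal pair (the standard basis rotated by 45 degrees):

```python
import numpy as np

# Hypothetical orthonormal pair: the standard basis rotated by 45 degrees.
q1 = np.array([1.0, 1.0]) / np.sqrt(2)
q2 = np.array([-1.0, 1.0]) / np.sqrt(2)

# Dot product of two distinct orthonormal vectors is 0.
print(q1 @ q2)
# Dot product of an orthonormal vector with itself is 1.
print(q1 @ q1)
```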

# Orthonormal Basis

A set of orthonormal vectors is an orthonormal set and the basis formed from it is an orthonormal basis.

or

A set of orthonormal vectors that spans the space is an orthonormal basis (orthonormal vectors are automatically linearly independent).

# Orthogonal Matrix

A square matrix whose columns (and rows) are orthonormal vectors is an orthogonal matrix.

In other words, a square matrix whose column vectors (and row vectors) are mutually perpendicular (and have magnitude equal to 1) will be an orthogonal matrix.

## Properties of Orthogonal Matrix

1. An orthogonal matrix Q multiplied with its transpose is equal to the identity matrix: QᵀQ = QQᵀ = I.

2. The transpose and the inverse of an orthogonal matrix are equal.

For any invertible square matrix, we know that

Q⁻¹Q = I

and from the first property, we know that

QᵀQ = I

so we can conclude from both facts that

Qᵀ = Q⁻¹

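Both properties are easy to verify numerically. As a sketch (assuming NumPy, and using a rotation matrix, which is always orthogonal):

```python
import numpy as np

# A rotation matrix is orthogonal; here, rotation by 30 degrees.
theta = np.pi / 6
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Property 1: Q^T Q = I
print(np.allclose(Q.T @ Q, np.eye(2)))

# Property 2: the transpose equals the inverse
print(np.allclose(Q.T, np.linalg.inv(Q)))
```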
3. The determinant of an orthogonal matrix has value +1 or -1.

To verify this, let's take the determinant of both sides of the first property:

det(QᵀQ) = det(Qᵀ) det(Q) = det(Q)²

So if

QᵀQ = I

then

det(Q)² = det(I) = 1, which means det(Q) = +1 or -1.
## Solving a system of linear equations containing an Orthogonal matrix

Say we have to find the solution (vector x) of the equation

Ax = b

We have done this earlier using Gaussian elimination. But if matrix A is orthogonal, we can multiply both sides by the transpose of A:

AᵀAx = Aᵀb

and since AᵀA = I, this gives

x = Aᵀb

So instead of performing Gaussian elimination, you can just multiply the transpose of the coefficient matrix with the constant vector and get the solution.
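As a sketch of this shortcut (assuming NumPy, and using a permutation matrix, which is orthogonal):

```python
import numpy as np

# Orthogonal coefficient matrix: a permutation matrix that swaps coordinates.
A = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.array([3.0, 5.0])

# No Gaussian elimination needed: x = A^T b
x = A.T @ b
print(x)
# Verify the solution satisfies the original system.
print(np.allclose(A @ x, b))
```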

# Hadamard Matrix

A square matrix whose column (and row) vectors are orthogonal (not necessarily orthonormal) and whose entries are only +1 or -1 is a Hadamard matrix, named after the French mathematician Jacques Hadamard.

Hadamard matrices are used in signal processing and statistics.
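One standard way to build Hadamard matrices of size 2, 4, 8, … is Sylvester's construction, which doubles a Hadamard matrix H into [[H, H], [H, -H]]. A sketch, assuming NumPy (the helper `hadamard` is just for illustration):

```python
import numpy as np

def hadamard(n):
    """Build an n x n Hadamard matrix via Sylvester's construction
    (n must be a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        # Doubling step: H -> [[H, H], [H, -H]]
        H = np.block([[H, H], [H, -H]])
    return H

H4 = hadamard(4)
print(H4)
# Columns are orthogonal but not orthonormal: H^T H = n * I, not I.
print(np.allclose(H4.T @ H4, 4 * np.eye(4)))
```

Note that because the columns have length sqrt(n) rather than 1, HᵀH equals n·I instead of I, which matches the "orthogonal but not necessarily orthonormal" condition above.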

# References

Lecture 17 | MIT 18.06 Linear Algebra, Spring 2005