
Linear Algebra in Python

A thorough linear algebra bootcamp for the machine learning practitioner

Guangyuan(Frank) Li
Jan 24 · 11 min read
Photo by Antoine Dautry on Unsplash

Concepts you need to know in Linear Algebra

import numpy as np

a = np.random.randn(2,3)
a
# array([[ 0.39,  0.54, -0.71],
#        [-1.84, -0.6 ,  0.53]])
a.T
# array([[ 0.39, -1.84],
#        [ 0.54, -0.6 ],
#        [-0.71,  0.53]])
b = np.random.randn(3,2)
b
# array([[ 1., -0.],
#        [ 1., -0.],
#        [-3.,  1.]])
a @ b
# array([[-1.09762048,  0.96911979],
#        [-1.97859822, -3.15604207]])
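A quick sanity check on these two operations (a sketch, with a seeded generator and the same illustrative shapes): the transpose of a product reverses the order of the factors, (AB)ᵀ = BᵀAᵀ.

```python
import numpy as np

rng = np.random.default_rng(0)       # seeded so the check is reproducible
a = rng.standard_normal((2, 3))
b = rng.standard_normal((3, 2))

lhs = (a @ b).T   # transpose of the product
rhs = b.T @ a.T   # product of the transposes, in reverse order
```

`np.allclose(lhs, rhs)` returns `True`, which is a useful way to convince yourself of such identities numerically.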
x = np.random.rand(2,3)
np.linalg.matrix_rank(x)
# 2
# Row rank equals column rank, so x has 2 linearly independent rows
# (and 2 linearly independent columns): full rank for a 2x3 matrix
x = np.random.rand(4,4)
np.linalg.det(x)
# 0.08437697073565216
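Rank and determinant are tightly linked: a square matrix is singular (determinant zero) exactly when its rank is less than its dimension. A minimal sketch with a hand-picked rank-deficient matrix (not from the article):

```python
import numpy as np

m = np.array([[1., 2.],
              [2., 4.]])            # second row is 2x the first, so rank 1
rank = np.linalg.matrix_rank(m)     # 1
det = np.linalg.det(m)              # 0.0 (up to floating-point noise)
```

Since `rank < 2`, `m` has no inverse; `np.linalg.inv(m)` would raise `LinAlgError`.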
a = np.array([4,5,6])
b = np.array([7,8,9])
np.inner(a,b)  # 122
np.outer(a,b)
# array([[28, 32, 36],
#        [35, 40, 45],
#        [42, 48, 54]])
a * b  # element-wise (Hadamard) product
# array([28, 40, 54])
Projection of a onto b
a = np.array([4,5,6])
b = np.array([7,8,9])
proj_b_a = np.inner(a,b) / np.inner(b,b) * b
# array([4.40206186, 5.03092784, 5.65979381])
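The defining property of the projection is that the residual a − proj is orthogonal to b. A small check using the same vectors as above:

```python
import numpy as np

a = np.array([4., 5., 6.])
b = np.array([7., 8., 9.])
proj = np.inner(a, b) / np.inner(b, b) * b   # projection of a onto b
residual = a - proj
dot = np.inner(residual, b)                  # ~ 0: residual is orthogonal to b
```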
import scipy.linalg

a = np.random.randn(3,4)
p, l, u = scipy.linalg.lu(a)
# p, the 3x3 permutation matrix
# array([[0., 1., 0.],
#        [1., 0., 0.],
#        [0., 0., 1.]])
# l, lower triangular with unit diagonal
# array([[ 1.        ,  0.        ,  0.        ],
#        [-0.71426919,  1.        ,  0.        ],
#        [-0.85039495,  0.12237106,  1.        ]])
# u, upper triangular, shape (3,4)
# array([[-2.09219711, -0.48959089,  0.81707073,  0.77602155],
#        [ 0.        , -1.53448255, -1.72785249,  0.04775144],
#        [ 0.        ,  0.        ,  1.5052947 , -0.27769281]])
a = np.random.randn(3,4)
q, r = np.linalg.qr(a)
# q, shape (3,3), orthonormal columns
# array([[-0.47569189,  0.71339517,  0.51457221],
#        [-0.82969021, -0.55818202,  0.00685511],
#        [ 0.29211536, -0.42367461,  0.85741964]])
# r, shape (3,4), upper triangular
# array([[-1.34089268, -1.73408897, -0.07436536,  0.78464807],
#        [ 0.        , -1.66272812,  0.63477604, -1.60036506],
#        [ 0.        ,  0.        ,  0.43098896, -0.31316029]])
Eigen-decomposition (Image by Author)
ei = np.random.randn(4,4)
w, v = np.linalg.eig(ei)
# w: eigenvalues
# array([-2.21516912+1.65582705j, -2.21516912-1.65582705j,
#         1.45929568+0.99548974j,  1.45929568-0.99548974j])
# v: eigenvectors (one per column), shape (4,4)
# array([[-0.1070701 -0.40447805j, -0.1070701 +0.40447805j,
#         -0.03773179+0.56113399j, -0.03773179-0.56113399j],
#        [ 0.15294619-0.0172865j ,  0.15294619+0.0172865j ,
#          0.65763758+0.j        ,  0.65763758-0.j        ],
#        [ 0.12433426+0.47215025j,  0.12433426-0.47215025j,
#          0.3236955 +0.37453337j,  0.3236955 -0.37453337j],
#        [-0.75023815+0.j        , -0.75023815-0.j        ,
#          0.00770123+0.07813095j,  0.00770123-0.07813095j]])
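Each column v[:, i] is an eigenvector paired with eigenvalue w[i], so A v[:, i] = w[i] v[:, i]. All four pairs can be checked at once with broadcasting (a seeded sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
ei = rng.standard_normal((4, 4))
w, v = np.linalg.eig(ei)

lhs = ei @ v   # A applied to every eigenvector (column-wise)
rhs = v * w    # broadcasting scales column i of v by w[i]
```

`np.allclose(lhs, rhs)` holds even when the eigenvalues are complex, as they typically are for a random real matrix.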
D is diagonalizable

p(t) will be a polynomial with respect to t.
a = np.random.randn(3,4)
np.linalg.pinv(a)  # Moore-Penrose pseudo-inverse, shape (4,3)
# array([[-0.09988152,  0.23637842,  0.29312192],
#        [-1.07695892,  0.49075645, -0.36409413],
#        [ 0.03471007,  0.57286174, -0.13192266],
#        [ 0.83630477, -0.20498149,  0.30882525]])
svd = np.random.rand(4,5)
u, s, vh = np.linalg.svd(svd)
# u, shape (4,4)
# array([[ 0.66581103,  0.61992005, -0.21849383, -0.3530655 ],
#        [ 0.47769768,  0.06749606,  0.56339472,  0.67069784],
#        [ 0.44673911, -0.66240419,  0.31234136, -0.51389467],
#        [ 0.35906096, -0.41516755, -0.73300048,  0.40177286]])
# s: 1-D array of the 4 singular values, in descending order; the full
# (4,5) matrix S has np.diag(s) in its first 4 columns and zeros elsewhere
# array([2.35262119, 0.87561858, 0.31537598, 0.043907  ])
# vh, shape (5,5)
# array([[ 0.29743919,  0.569923  ,  0.3576598 ,  0.52216343,  0.43144237],
#        [-0.61102456,  0.15085062,  0.50163189, -0.46496796,  0.36886763],
#        [ 0.49893385,  0.39660661, -0.29349238, -0.68317228,  0.2022525 ],
#        [ 0.52538762, -0.60314147,  0.5557502 , -0.15677528,  0.16355866],
#        [-0.11494248, -0.3624299 , -0.47481454,  0.14088043,  0.78111244]])
Lp norm for a vector and Frobenius norm for a matrix
a = np.array([4,5,6])
np.linalg.norm(a, ord=3)  # L3 norm of vector a
# 7.398636222991409
x = np.random.rand(2,3)
np.linalg.norm(x, ord='fro')  # Frobenius norm of matrix x
# 1.309085506183174
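The Frobenius norm is just the square root of the sum of squared entries, i.e. the L2 norm of the flattened matrix. A seeded sketch of that equivalence:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((2, 3))

fro = np.linalg.norm(x, ord='fro')
manual = np.sqrt((x ** 2).sum())          # definition of the Frobenius norm
flat = np.linalg.norm(x.ravel(), ord=2)   # L2 norm of the flattened matrix
```

All three values coincide, which is why the Frobenius norm is the natural matrix analogue of the Euclidean vector norm.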
x = np.random.rand(2,3)
y = np.random.rand(5,3)
np.einsum('ij,kj -> ik', x, y)  # shape (2,5)
# array([[0.3676856 , 0.33156855, 0.39793874, 0.70939856, 0.8353566 ],
#        [0.19219345, 0.19312881, 1.36739081, 1.0606612 , 1.15039307]])
# einsum
x = np.random.rand(2,3)
np.einsum('ij -> ji', x)  # transpose
np.einsum('ij ->', x)     # sum of all elements
np.einsum('ij -> i', x)   # row sums (sum over columns j)
np.einsum('ij -> j', x)   # column sums (sum over rows i)

x = np.random.rand(2,3)
y = np.random.rand(5,3)
np.einsum('ij,kj -> ik',x,y) # matrix multiplication

a = np.array([4,5,6])
b = np.array([7,8,9])
np.einsum('i,i ->',a,b) # inner product
np.einsum('i,j ->ij',a,b) # outer product
np.einsum('i,i ->i',a,b) # hadamard product

y = np.random.rand(5,5)      # diagonal and trace require a square matrix
np.einsum('ii -> i', y)      # diagonal
np.einsum('ii ->', y)        # trace
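Every einsum expression in the cheat sheet above has a plain NumPy equivalent, which is a good way to verify you read the subscripts correctly. A sketch checking several of them at once:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((2, 3))
y = rng.random((5, 3))
a = np.array([4, 5, 6])
b = np.array([7, 8, 9])

checks = [
    np.allclose(np.einsum('ij -> ji', x), x.T),              # transpose
    np.allclose(np.einsum('ij ->', x), x.sum()),             # total sum
    np.allclose(np.einsum('ij -> i', x), x.sum(axis=1)),     # row sums
    np.allclose(np.einsum('ij,kj -> ik', x, y), x @ y.T),    # matmul
    np.einsum('i,i ->', a, b) == np.inner(a, b),             # inner product
    np.allclose(np.einsum('i,j -> ij', a, b), np.outer(a, b)),  # outer product
]
```

Every entry in `checks` is `True`.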

How important is Linear Algebra?

The Startup

Medium's largest active publication, followed by +775K people. Follow to join our community.

Guangyuan(Frank) Li

Written by

Bioinformatics PhD student at Cincinnati Children's Hospital Medical Center; GitHub: https://github.com/frankligy

