Tensor Operations using PyTorch — 2

shaistha fathima · Aug 25, 2019


Summary: I like to learn by playing around with code and sharing my knowledge via blogs like this. This is a wrap-up post for the “Introduction to Tensors” series, covering tensor operations such as element-wise operations, arithmetic operations, broadcasting, comparison operations, and reduction operations. For more posts like these on Machine Learning basics, follow me here or on Twitter — Shaistha Fathima.

Here are the links to all the posts of the “Introduction to Tensors” series:

If interested, you may also check out the other series, Basic Concepts You Should Know Before Starting with the “Neural Networks” (NN).

Let’s begin…

Element-wise Operations

These are operations between the corresponding elements of two tensors. So, what are corresponding elements?

Corresponding elements can be defined as two elements that occupy the same position within their respective tensors, where the position is determined by the indexes used to locate each element.

Example:

import torch

# only the [0][0] entries are given in the post; the other values are illustrative
t1 = torch.tensor([[1., 2.], [3., 4.]])
t2 = torch.tensor([[2., 4.], [6., 8.]])

# tensor element corresponding to the position at [0][0] respectively
t1[0][0]  # ----> tensor(1.)  # in tensor t1 --> 1.
t2[0][0]  # ----> tensor(2.)  # in tensor t2 --> 2.

Note: Element-wise operations can occur ONLY between tensors of the SAME SIZE or SHAPE (a lone scalar is handled by broadcasting, covered below).

Arithmetic Operations

I think you are all aware of what arithmetic operations mean: operations like addition, subtraction, multiplication, and division.

Examples:

First, we create two tensors with the same size or shape, and then perform general arithmetic operations on them.
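A minimal runnable sketch of those operations (the tensor values are illustrative, not from the original screenshots):

import torch

# two tensors with the same size or shape
t1 = torch.tensor([[1., 2.], [3., 4.]])
t2 = torch.tensor([[2., 4.], [6., 8.]])

t1 + t2   # element-wise addition    -> tensor([[ 3.,  6.], [ 9., 12.]])
t1 - t2   # element-wise subtraction -> tensor([[-1., -2.], [-3., -4.]])
t1 * t2   # element-wise multiplication
t1 / t2   # element-wise division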

All of the above follow the given rule: “Element-wise operations can occur ONLY between tensors of the SAME SIZE or SHAPE”.

But what if you want to perform a certain operation on all of the elements at once?

Example:
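A sketch of that case, adding the scalar 2 to every element (t1 as above):

import torch

t1 = torch.tensor([[1., 2.], [3., 4.]])

t1 + 2
# tensor([[3., 4.],
#         [5., 6.]])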

In the above example of adding 2 to all the elements in tensor t1, it can be seen that the rank of t1 is 2 and its shape is [2, 2], whereas the rank of the plain number 2 is 0; but still, the operation is performed and everything seems to work fine! Why is that?

This can be explained with the help of the next topic, “Broadcasting Tensors”, which explains the how of the above arithmetic operation.

Broadcasting Tensors

Broadcasting basically describes how tensors with different shapes are treated during arithmetic operations so that element-wise operations can still be performed on them.

So, in the above example, when we write t1 + 2, the scalar-valued tensor ‘2’ is being added to the rank-2 tensor t1, which is not possible as per the rule!

Here, broadcasting comes to our rescue: it elevates the rank of the scalar value ‘2’ to a tensor of rank 2, thus satisfying the rule and allowing the arithmetic operation to be performed.

Example:
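A sketch of what happens under the hood. Note that torch.broadcast_to is available in PyTorch 1.8+ and is used here just for illustration; the original screenshot may have used a different call:

import torch

t1 = torch.tensor([[1., 2.], [3., 4.]])

# the scalar is expanded to t1's shape before the element-wise addition
torch.broadcast_to(torch.tensor(2.), t1.shape)
# tensor([[2., 2.],
#         [2., 2.]])

t1 + 2  # equivalent to adding the broadcast tensor above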

But what about when we want to add a tensor of shape [1, 2] to a tensor of shape [2, 2]?

Example: Addition of a [2, 2] tensor with a [1, 2] tensor

Before the addition, t2 is broadcast to match t1.shape:
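A minimal sketch with assumed values (only the shapes come from the original):

import torch

t1 = torch.tensor([[1., 2.], [3., 4.]])  # shape [2, 2]
t2 = torch.tensor([[5., 6.]])            # shape [1, 2]

t1 + t2   # t2 behaves as [[5., 6.], [5., 6.]]
# tensor([[ 6.,  8.],
#         [ 8., 10.]])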

Example: Addition of a [2, 2] tensor with a [2, 1] tensor

Here too, before the addition, t2 is broadcast to match t1.shape:
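And similarly for the [2, 1] case (values again assumed):

import torch

t1 = torch.tensor([[1., 2.], [3., 4.]])  # shape [2, 2]
t2 = torch.tensor([[5.], [6.]])          # shape [2, 1]

t1 + t2   # t2 behaves as [[5., 5.], [6., 6.]]
# tensor([[ 6.,  7.],
#         [ 9., 10.]])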

Broadcasting comes into play ONLY for tensors with DIFFERENT shapes.

Here are the working steps that happen during broadcasting:

Compatibility check and rules:

Dimensions are compatible if:

- Both are equal to each other (or)

- One of them is ‘1’

If not, broadcasting cannot take place!

Note: The compatibility check occurs from right (R) to left (L), one dimension pair at a time.

Example 1. Same “rank”, different shape

Step-1: Dimension compatibility check for “t1 + t2”

t1.shape = [1, 3] and t2.shape = [3, 1]

Comparison begins from R -> L: the rightmost pair (3 and 1) obeys the rule (both are equal to each other, or one of them is ‘1’), so we move to the next pair (1 and 3), which also passes; hence, both are compatible.

Step-2: Final tensor shape

Here the resultant tensor will be of shape [3, 3]: since t1.shape = [1, 3] and t2.shape = [3, 1], we take the max dimension value from each pair.

Step-3: Changing the shapes of both t1 and t2, i.e., broadcasting both t1 and t2.

t1 = [[1, 2, 3],
      [1, 2, 3],
      [1, 2, 3]]
t2 = [[4, 4, 4],
      [5, 5, 5],
      [6, 6, 6]]

And the result will be:
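A runnable version of this example (t1 = [[1, 2, 3]] and t2 = [[4], [5], [6]] are recovered from the broadcasted forms above):

import torch

t1 = torch.tensor([[1, 2, 3]])      # shape [1, 3]
t2 = torch.tensor([[4], [5], [6]])  # shape [3, 1]

t1 + t2
# tensor([[5, 6, 7],
#         [6, 7, 8],
#         [7, 8, 9]])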

Example 2. Different “rank”, different shape

Here too, all of the above three steps are performed to determine whether broadcasting is possible or not.

Step-1: t1.shape = [1, 3] and t2.shape = [1, 1] (a ‘1’ is added for the extra dimension); the check passes! So moving on to..

Step-2: Final tensor shape [1, 3] (taking the max dimension from each pair!)

Step-3: t2 = [[5, 5, 5]] after broadcasting

Step-4: The result:
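A runnable version (t2 = [5] is recovered from its broadcast form [[5, 5, 5]]; t1’s values are assumed to carry over from Example 1):

import torch

t1 = torch.tensor([[1, 2, 3]])  # shape [1, 3], rank 2
t2 = torch.tensor([5])          # shape [1],    rank 1

t1 + t2
# tensor([[6, 7, 8]])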

Example 3. Same “rank”, different shape, BUT NOT compatible! So, NO broadcasting! Hence, the arithmetic operation cannot be performed.

Why did that happen? If you look closely, you will see that t1.shape = [2, 3] and t2.shape = [3, 3]. The rightmost pair is checked first and passes, as both are equal to ‘3’; but when it comes to the next pair, ‘2’ and ‘3’, clearly both are not equal, so we look at the second option of the compatibility rule, which states “one of them should be 1”, and that is not satisfied either!

Hence, as the dimensions are not compatible, broadcasting cannot take place; thus, as per the element-wise operation rule, these tensors with different shapes cannot undergo an element-wise operation!
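A sketch of the failure (the values are assumed from the shapes; the exact error text may vary across PyTorch versions):

import torch

t1 = torch.tensor([[1, 2, 3],
                   [4, 5, 6]])     # shape [2, 3]
t2 = torch.tensor([[1, 1, 1],
                   [2, 2, 2],
                   [3, 3, 3]])     # shape [3, 3]

t1 + t2
# RuntimeError: The size of tensor a (2) must match the size of tensor b (3)
# at non-singleton dimension 0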

Comparison Operations

It is a type of element-wise comparison, where the result is given out as a Boolean: either 0 (False) or 1 (True).

Examples using comparison functions:
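A minimal sketch of the comparison functions (the tensor values are illustrative):

import torch

t = torch.tensor([[0, 5, 7],
                  [6, 0, 7],
                  [0, 8, 0]])

t.eq(0)  # element-wise "equal to 0"
t.ge(5)  # element-wise "greater than or equal to 5"
t.gt(5)  # element-wise "greater than 5"
t.lt(5)  # element-wise "less than 5"
t.le(7)  # element-wise "less than or equal to 7"
# each returns a tensor of the same shape holding True/False
# (0/1 in older PyTorch versions) per element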

Reduction Operations

A reduction operation on a tensor “reduces” the number of elements contained within the tensor, giving us yet another way to manage our data, just like the other operations.

This allows us to perform operations on elements within a single tensor.

Some examples are shown below.
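A sketch of a few common reductions (the values are illustrative; the original examples were shown as screenshots):

import torch

t = torch.tensor([[1., 2.], [3., 4.]])

t.sum()   # tensor(10.)    -- reduces the whole tensor to a single element
t.prod()  # tensor(24.)
t.mean()  # tensor(2.5000)
t.std()   # tensor(1.2910)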

Argmax tensor reduction operation: This returns the index location of the maximum value inside a tensor (by default, the index into the flattened tensor).

Example:
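A sketch with assumed values:

import torch

t = torch.tensor([[1, 0, 2],
                  [4, 3, 5],
                  [7, 8, 6]])

t.max()      # tensor(8) -- the maximum value
t.argmax()   # tensor(7) -- its index in the flattened tensor
t.flatten()  # tensor([1, 0, 2, 4, 3, 5, 7, 8, 6]) -- 8 sits at index 7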

Conclusion

That’s it! You now know all the requisite basic concepts of PyTorch Tensors to get started with Machine Learning.

As always, you may get the code from here.

You might also want to check out the official documentation, here.

You may also look at my other series on Basic Concepts You Should Know Before Starting with the “Neural Networks” (NN) if interested!
