Auto gradients of tensor objects

Dipanwita Mallick
Published in IWriteAsILearn
May 18, 2021

How do PyTorch’s tensor objects use built-in functionality to calculate gradients?

Differentiation and gradient computation play a crucial role in updating the weights of a neural network, and PyTorch takes care of both automatically. Here’s an example.

First, define a tensor and set requires_grad to True so that PyTorch tracks the operations performed on it:
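The post’s original snippet was an image; a minimal sketch of this step might look like the following (the specific values are illustrative):

```python
import torch

# Illustrative input values; requires_grad=True tells autograd
# to record operations on x so gradients can flow back to it.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
print(x)  # tensor([1., 2., 3.], requires_grad=True)
```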

Now define the loss function; in this case, let’s use the sum of the squared values, i.e. the square loss.

loss = Σ xᵢ²

gradient = d(loss)/dxᵢ = 2xᵢ
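Building on the tensor defined above, the loss can be computed directly with tensor operations (a sketch with illustrative values; with x = [1, 2, 3], the sum of squares is 1 + 4 + 9 = 14):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Square loss: sum of the squared elements of x.
# The result carries a grad_fn because x requires gradients.
loss = (x ** 2).sum()
print(loss)  # tensor(14., grad_fn=<SumBackward0>)
```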

Let’s see how this gradient is calculated using PyTorch’s built-in machinery. The function used is backward().
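Calling backward() on the scalar loss walks the recorded computation graph and accumulates the gradient into the input tensor. A self-contained sketch (illustrative values):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
loss = (x ** 2).sum()

# backward() traverses the graph from loss back to x and
# accumulates d(loss)/dx into x.grad.
loss.backward()
```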

After the backward pass, the gradients are stored in the grad attribute of the input tensor x and can be obtained as shown below.

Notice that the values are, in fact, 2*x.
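The check that x.grad equals 2*x can be sketched like this (illustrative values again):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
loss = (x ** 2).sum()
loss.backward()

# The accumulated gradient matches the analytic result 2*x.
print(x.grad)          # tensor([2., 4., 6.])
print(2 * x.detach())  # tensor([2., 4., 6.])
```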

~END~

Coming up next: Why use PyTorch’s tensors over NumPy arrays?

Dipanwita Mallick
IWriteAsILearn

I am working as a Senior Data Scientist at Hewlett Packard Enterprise. I love exploring new ideas and new places !! :)