Introduction to PyTorch

Sarang Mete · Published in Analytics Vidhya · Dec 17, 2019

PyTorch is one of the most popular libraries for deep learning projects. We’ll explore PyTorch in detail in a series of articles.

First, we’ll see what PyTorch is and how it works.

PyTorch features:

  1. Easy-to-use API
  2. Easy integration with other Python libraries, which makes it very useful in data science
  3. Dynamic computation graphs (see the sketch after this list)
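
A minimal sketch of what “dynamic computation graph” means: the graph is built on the fly as ordinary Python code runs, so control flow can depend on tensor values.

import torch

# The graph is built while the code executes, so an ordinary
# Python branch can depend on the data itself.
x = torch.tensor(2.0, requires_grad=True)
y = x * x if x.item() > 0 else -x  # branch chosen at run time
y.backward()                       # backprop through the graph built above
print(x.grad)                      # dy/dx = 2x = 4.0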

Tensors: the main abstraction in PyTorch

A scalar has zero dimensions: it is a single number. A vector is one-dimensional, and a matrix is two-dimensional. A tensor generalizes all of these: it can have any number of dimensions, so scalars, vectors, and matrices are simply 0-, 1-, and 2-dimensional tensors.
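
A quick illustration of these dimensions using torch tensors (ndim reports the number of dimensions):

import torch

scalar = torch.tensor(7)                 # 0 dimensions: a single number
vector = torch.tensor([1, 2, 3])         # 1 dimension
matrix = torch.tensor([[1, 2], [3, 4]])  # 2 dimensions
cube = torch.zeros(2, 3, 4)              # 3 dimensions
print(scalar.ndim, vector.ndim, matrix.ndim, cube.ndim)  # 0 1 2 3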

Tensors are similar to NumPy arrays. However, a NumPy array can’t be used on a GPU, and GPU support is the main advantage of tensors over NumPy arrays (a small sketch of this follows the comparison below). Now let’s see some comparisons between NumPy arrays and torch tensors.

import numpy as np
import torch

np_arr = np.array(3)      # 0-dimensional NumPy array
tensor = torch.tensor(5)  # 0-dimensional torch tensor
print(np_arr)
print(tensor)
print(type(np_arr))       # <class 'numpy.ndarray'>
print(type(tensor))       # <class 'torch.Tensor'>
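
As promised, a minimal sketch of the GPU advantage; it falls back to the CPU when no CUDA device is available:

# Tensors can be moved to the GPU, something a NumPy array cannot do.
device = "cuda" if torch.cuda.is_available() else "cpu"
t = torch.tensor([1.0, 2.0, 3.0])
t_gpu = t.to(device)  # computation on t_gpu now runs on that device
print(t_gpu.device)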

We can convert a NumPy array to a tensor and vice versa.

tensor_from_np = torch.from_numpy(np_arr)  # NumPy -> tensor
print(tensor_from_np)
print(type(tensor_from_np))                # <class 'torch.Tensor'>
np_arr_from_tensor = tensor.numpy()        # tensor -> NumPy
print(np_arr_from_tensor)
print(type(np_arr_from_tensor))            # <class 'numpy.ndarray'>
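
One caveat worth knowing: torch.from_numpy (and .numpy() on a CPU tensor) shares memory with its source rather than copying it, so mutating one mutates the other:

a = np.ones(3)
t = torch.from_numpy(a)  # t shares a's memory; no copy is made
a[0] = 100.0
print(t)  # tensor([100., 1., 1.], dtype=torch.float64)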

Matrix Operations

Using NumPy:

np.random.seed(0)             # make the random numbers reproducible
mat1 = np.random.randn(2, 2)  # 2x2 matrix of standard-normal samples
mat2 = np.array([[1, 2], [3, 4]])
print(mat1)
print(type(mat1))
print(mat2)
print(type(mat2))
print(mat1 + mat2)  # element-wise addition
print(mat1 - mat2)  # element-wise subtraction
print(mat1 / mat2)  # element-wise division
print(mat1 * mat2)  # element-wise multiplication (not a matrix product)
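
Note that * above is element-wise; a true matrix product uses the @ operator (or np.matmul):

print(mat1 @ mat2)            # matrix multiplication
print(np.matmul(mat1, mat2))  # equivalent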

Using torch tensors:

torch.manual_seed(0)      # the torch way to seed the RNG
mat1 = torch.randn(2, 2)  # 2x2 tensor of standard-normal samples
# dtype=np.float32 so mat2 matches mat1's dtype for mixed arithmetic
mat2 = torch.from_numpy(np.array([[1, 2], [3, 4]], dtype=np.float32))
print(mat1)
print(type(mat1))
print(mat2)
print(type(mat2))
print(mat2.shape)   # torch.Size([2, 2])
print(mat1 + mat2)  # element-wise addition
print(mat1 - mat2)  # element-wise subtraction
print(mat1 / mat2)  # element-wise division
print(mat1 * mat2)  # element-wise multiplication (not a matrix product)
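
As with NumPy, matrix multiplication uses @ or torch.matmul:

print(mat1 @ mat2)               # matrix multiplication
print(torch.matmul(mat1, mat2))  # equivalent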

As we can see, NumPy arrays and torch tensors are very similar in usage. The notebook is available at git-repo.

In the next article we’ll build a neural network from scratch using NumPy and PyTorch.

If you liked the article or have any suggestions/comments, please share them below!

LinkedIn
