# Top 5 Statistical Functions in PyTorch to Rule in Data Science

PyTorch is an open-source machine learning library based on the Torch library, used for applications such as computer vision and natural language processing, and primarily developed by Facebook’s AI Research lab. Statistics has huge applications in deep learning. Out of the thousands of functions PyTorch offers, this article covers five of the most useful statistical ones. You can check out my notebook here.

Let’s start the discussion by importing the library.

```python
# Import torch library
import torch
```

# Function 1: torch.bernoulli()
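A minimal sketch of the example described below (the tensor name `a` and the 4×4 shape follow the surrounding text; the exact original snippet is an assumption):

```python
import torch

# A 4x4 tensor whose elements are probabilities drawn uniformly from [0, 1)
a = torch.empty(4, 4).uniform_(0, 1)

# Each output element is an independent Bernoulli draw:
# 1 with probability a[i][j], else 0
b = torch.bernoulli(a)
print(b)
```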

In the above example, using the `uniform_(0,1)` function we have generated a square matrix of order 4. The elements are probabilities and hence belong to the range [0, 1]. Applying `torch.bernoulli()` to `a`, we obtain a tensor whose elements are binary random numbers (0 or 1).

# Function 2: torch.multinomial()
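A minimal sketch of a weights tensor and a multinomial draw (the specific weight values are an assumption, not from the original snippet):

```python
import torch

# Relative probabilities for four categories (need not sum to 1)
weights = torch.tensor([0.1, 0.4, 0.3, 0.2])

# Draw 4 category indices; sampling is without replacement by default,
# so each index 0..3 appears exactly once here
samples = torch.multinomial(weights, 4)
print(samples)
```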

Here, we have created a tensor of weights containing probabilities and, applying `torch.multinomial()` to it, we get a tensor of indices sampled according to the multinomial probability distribution.

# Function 3: torch.poisson()
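A minimal sketch matching the description below: a tensor of uniform rates named `rates1` (the shape is an assumption), passed through `torch.poisson()`:

```python
import torch

# Rate parameters drawn uniformly from [0, 1)
rates1 = torch.rand(4, 4)

# Each output element is a Poisson draw with the corresponding rate
out = torch.poisson(rates1)
print(out)
```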

The `torch.rand()` function gives us a tensor of random numbers drawn from the uniform distribution on the interval [0, 1). Using the `torch.poisson()` function on the tensor `rates1`, each element of the output is a Poisson-distributed draw whose rate is the corresponding element of `rates1`.

# Function 4: torch.normal()
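A minimal sketch of drawing from element-wise normal distributions (the particular means and standard deviations are assumptions, chosen to mirror the official docs example):

```python
import torch

# Means 1.0 .. 10.0 and standard deviations 1.0 down to 0.1;
# each output element is drawn from N(means[i], stds[i])
means = torch.arange(1., 11.)
stds = torch.arange(1., 0., -0.1)

x = torch.normal(mean=means, std=stds)
print(x)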

As expected, we get back random numbers drawn from a normal distribution with the specified mean and standard deviation.

# Function 5: torch.randn()
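A minimal sketch producing the 5 standard-normal samples described below (the variable name `y` is an assumption):

```python
import torch

# 5 independent samples from the standard normal distribution N(0, 1)
y = torch.randn(5)
print(y)
```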

Here we see that the output contains 5 random numbers which follow the standard normal distribution, with mean 0 and standard deviation 1.

# Conclusion

This is my first article on Medium, and your feedback will be highly appreciated. Click the clap button if you find the blog helpful.
