Activation Functions in Neural Networks

Mkale
2 min read · Mar 7, 2022
Activation Functions

An Activation Function decides whether a neuron should be activated or not. In other words, it decides whether the neuron's input to the network is significant or not for the prediction, using simpler mathematical operations.

The purpose of the Activation Function is to derive an output from the set of input values fed to a node (or a layer).

Types of Neural Network Activation Functions

Binary Step Function

This activation function is very basic, and it is the first thing that comes to mind whenever we try to threshold the output: the neuron is activated if the input is above a certain threshold, otherwise it stays inactive.
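As a quick illustration, here is a minimal NumPy sketch of a binary step function (the threshold of 0 is just an assumed example):

```python
import numpy as np

def binary_step(x, threshold=0.0):
    # Fire (output 1) only when the input reaches the threshold
    return np.where(x >= threshold, 1.0, 0.0)

print(binary_step(np.array([-2.0, 0.0, 3.5])))  # [0. 1. 1.]
```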

Linear Function

It's a simple straight-line activation function where the output is directly proportional to the weighted sum of the inputs.
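For example, a linear activation just passes its input through, scaled by a constant (the scale factor of 1.0 below is an arbitrary assumption):

```python
import numpy as np

def linear(x, a=1.0):
    # Output is directly proportional to the input
    return a * x

print(linear(np.array([-2.0, 0.0, 3.5])))  # [-2.  0.  3.5]
```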

Non-Linear Neural Network Activation Functions

Sigmoid Function

It's generally used for models where we have to predict a probability as an output. Since the probability of anything exists only within the range of 0 and 1, sigmoid is the right choice because of its range.
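A minimal NumPy sketch of the sigmoid, showing how any real-valued input is squashed into the (0, 1) range:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real-valued input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-4.0, 0.0, 4.0])))  # ~[0.018 0.5 0.982]
```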

Tanh Function (Hyperbolic Tangent)

The tanh function is very similar to the sigmoid/logistic activation function and has the same S-shape, with the difference that its output range is -1 to 1.
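A one-line sketch using NumPy's built-in tanh makes the -1 to 1 range easy to see:

```python
import numpy as np

def tanh(x):
    # Same S-shape as sigmoid, but zero-centered with outputs in (-1, 1)
    return np.tanh(x)

print(tanh(np.array([-4.0, 0.0, 4.0])))  # ~[-0.999  0.  0.999]
```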

ReLU Function

ReLU stands for Rectified Linear Unit.

Although it gives the impression of a linear function, ReLU has a derivative and allows for backpropagation while at the same time being computationally efficient.

The main catch here is that the ReLU function does not activate all the neurons at the same time: neurons with negative inputs output zero and are effectively switched off.
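A minimal sketch of ReLU; note how negative inputs are zeroed out, which is why only some neurons are active at any given time:

```python
import numpy as np

def relu(x):
    # Pass positive inputs through unchanged, zero out the rest
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, 0.0, 3.5])))  # [0.  0.  3.5]
```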

Leaky ReLU Function

Leaky ReLU is an improved version of the ReLU function designed to solve the dying ReLU problem, as it has a small positive slope in the negative region.
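A sketch of Leaky ReLU; the slope of 0.01 for negative inputs is a commonly used default, assumed here purely for illustration:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Keep a small slope (alpha) for negative inputs so their gradient never dies
    return np.where(x > 0, x, alpha * x)

print(leaky_relu(np.array([-2.0, 0.0, 3.5])))  # [-0.02  0.    3.5 ]
```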

Softmax Function

The softmax activation function is another type of AF used in neural networks to compute a probability distribution from a vector of real numbers. This function generates an output that ranges between values 0 and 1, with the sum of the probabilities equal to 1.
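A minimal softmax sketch; subtracting the maximum before exponentiating is a standard numerical-stability trick, not part of the definition itself:

```python
import numpy as np

def softmax(x):
    # Exponentiate (shifted for stability) and normalize so the outputs sum to 1
    e = np.exp(x - np.max(x))
    return e / e.sum()

print(softmax(np.array([1.0, 2.0, 3.0])))  # ~[0.09 0.245 0.665], sums to 1
```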

ELU Function

The exponential linear unit (ELU) function is an AF that is also used to speed up the training of neural networks (just like the ReLU function).
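A sketch of ELU, with alpha = 1.0 assumed as the scale of the negative branch:

```python
import numpy as np

def elu(x, alpha=1.0):
    # Identity for positive inputs, smooth exponential curve toward -alpha for negatives
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

print(elu(np.array([-2.0, 0.0, 3.5])))  # ~[-0.865  0.     3.5  ]
```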

Conclusion

In this blog, we discussed activation functions in neural networks and the main types of activation functions.
