Softsign function — an ‘S’-shaped function similar to the Sigmoid function.

Step by step implementation with its derivative

neuralthreads
3 min read · Dec 1, 2021

In this post, we will talk about the Softsign activation function and its derivative. The shape of the Softsign function is very similar to that of the Sigmoid function, but its output range is (-1, 1), unlike the Sigmoid's (0, 1).

You can download the Jupyter Notebook from here.

Back to the previous post

Back to the first post

3.3 What is the Softsign activation function and its derivative?

This is the definition of the Softsign function.
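Written out, it is

\[
\operatorname{softsign}(x) = \frac{x}{1 + |x|}
\]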

And it is very easy to find the derivative of the Softsign function.
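Applying the quotient rule and simplifying (the step x · d|x|/dx = |x| is justified right below):

\[
\operatorname{softsign}'(x)
= \frac{(1 + |x|)\cdot 1 - x\cdot\frac{d|x|}{dx}}{(1 + |x|)^{2}}
= \frac{1 + |x| - |x|}{(1 + |x|)^{2}}
= \frac{1}{(1 + |x|)^{2}}
\]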

This may help if you are wondering how to take the derivative of the absolute value of x.
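For x ≠ 0,

\[
\frac{d|x|}{dx} = \frac{x}{|x|} = \operatorname{sign}(x),
\qquad\text{so}\qquad
x\cdot\frac{d|x|}{dx} = \frac{x^{2}}{|x|} = |x|
\]

which gives the cancellation used above. At x = 0, |x| itself is not differentiable, but both one-sided derivatives of the Softsign function equal 1, which matches the value 1/(1 + |0|)² = 1 returned by the implementation below.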

This is the graph for the Softsign function and its derivative.

Softsign function and its derivative graph
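If you want to reproduce the plot yourself, here is a minimal sketch, assuming matplotlib is installed (the plotting code and the x-range of [-10, 10] are illustrative, not part of the original notebook):

import numpy as np
import matplotlib.pyplot as plt                # assumed available; not used in the original notebook

x = np.linspace(-10, 10, 500)                  # illustrative input range
plt.plot(x, x / (np.abs(x) + 1), label="softsign(x)")
plt.plot(x, 1 / (np.abs(x) + 1)**2, label="softsign'(x)")
plt.legend()
plt.grid(True)
plt.show()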

We can easily implement the Softsign function in Python.

import numpy as np                             # importing NumPy
np.random.seed(42)

def softsign(x):                               # Softsign
    return x / (abs(x) + 1)

def softsign_dash(x):                          # Softsign derivative
    return 1 / (abs(x) + 1)**2
Defining Softsign function and its derivative

Let us have a look at an example.

x = np.array([[0.2], [0.5], [1.2], [-2.3], [0]])
x
softsign(x)

softsign_dash(x)
Example for the Softsign function and its derivative
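For reference, these calls print x itself and then two column vectors with approximately the following values (rounded to four decimal places):

softsign(x)      ≈  0.1667,  0.3333,  0.5455, -0.6970,  0.0000
softsign_dash(x) ≈  0.6944,  0.4444,  0.2066,  0.0918,  1.0000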

I hope now you understand how to implement the Softsign function and its derivative.

Watch the video on YouTube and subscribe to the channel for more videos and posts like this.
Every slide is 3 seconds long and without sound. You may pause the video whenever you like.
You may put on some music too if you like.

The video is basically everything in the post only in slides.

Many thanks for your support and feedback.

If you like this course, then you can support me at

It would mean a lot to me.

Continue to the next post — 3.4 ReLU and Leaky ReLU Activation functions and their derivatives.
