What I’ve learned from my Udacity Deep Learning course (Sigmoid Function Vs SoftMax Function)
A week ago, I decided to dive deeper into deep learning, and since then I have collected some useful resources and started learning from them. The first one was Udacity's Deep Learning by Google; I picked it because it was free.
My first lesson was From Machine Learning to Deep Learning.
In that lesson, I found many useful topics, and I want to summarise them in this series of articles.
Here is what I will be talking about:
1. Sigmoid Function Vs SoftMax Function (This article )
2. RMSE (Root Mean Square Error) Vs Cross Entropy (the next one)
Let’s start with the first one:
Sigmoid Function Vs SoftMax Function
For those who are familiar with college mathematics and some machine learning courses, I'm sure you have heard about the sigmoid function. It's defined as follows:
The sigmoid function takes any real number and returns an output value that falls in the range of 0 to 1.
In the following picture you can find its graph and its definition:
It's often used in logistic regression for binary classification, and also as an activation function in neural networks.
The output of the sigmoid function can be interpreted as a probability.
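Since the definition is so short, here is a minimal NumPy sketch of the sigmoid function (the function name and the sample inputs are my own choices, not from the course):

```python
import numpy as np

def sigmoid(z):
    """Map any real number to the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# The output can be read as a probability:
print(sigmoid(0.0))   # exactly 0.5 at z = 0
print(sigmoid(4.0))   # close to 1 for large positive scores
print(sigmoid(-4.0))  # close to 0 for large negative scores
```

Note that for large negative inputs `np.exp(-z)` can overflow in `float64`; NumPy will warn but still return 0, so the sketch is fine for illustration.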
This function can be used for many real-life tasks: classifying spam emails, classifying bank transactions, etc.
You can find more about the sigmoid function in this Quora question.
In my deep learning course I heard about the softmax function: is it the same as the sigmoid function?
I hadn't heard about the softmax function before, and the first time I heard about it I was a bit confused:
What is a softmax function?
The softmax function (used in multinomial logistic regression) is a generalization of the sigmoid function to the case where we want to handle multiple classes (multi-class classification).
Here is the mathematical formula of the softmax function:
The softmax function can take any kind of scores (an array or a vector) and returns proper probabilities: a class's probability will be large when its score is large. The sum of the values returned by the softmax function is always equal to 1.
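The behaviour above can be sketched in a few lines of NumPy (the max-subtraction trick is a standard numerical-stability detail, not something from the course itself):

```python
import numpy as np

def softmax(scores):
    """Turn a vector of scores into probabilities that sum to 1."""
    # Subtracting the max score avoids overflow in exp()
    # and does not change the result.
    exps = np.exp(scores - np.max(scores))
    return exps / exps.sum()

scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs)        # largest score gets the largest probability
print(probs.sum())  # always 1.0
```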
It can be shown that the sigmoid is a particular case of the softmax with two classes (K = 2).
The softmax function is often used in deep learning when we work with neural networks, and it can be used to classify images.
That's all for today. I hope you learned something from this article; I will come back tomorrow with something else I've learned.
To be honest, I'm not good at writing, but I push myself to do it because I want to master the English language and improve my writing skills.
Feel free to give feedback.
You can find out more about the topics above in the links below :