Onepagecode

Published in Onepagecode
Analyzing the Activation Functions of Common Neural Networks

When constructing an Artificial Neural Network (ANN), it is important to choose activation functions for the hidden and output layers that are differentiable. This is because computing the backpropagated error signal used to determine the ANN's parameter updates requires the gradient of each activation function. Three of the most common activation functions used in ANNs are the identity function, the logistic sigmoid, and the hyperbolic tangent. Figure 1 shows examples of these functions and their derivatives (in one dimension).
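As a sketch of the idea, the three activations and their derivatives can be written out in NumPy as below (the function names here are my own, not from any particular library). Note that the sigmoid and tanh derivatives are conveniently expressed in terms of the function values themselves, which is exactly what makes them cheap to evaluate during backpropagation:

```python
import numpy as np

# Identity: f(x) = x, f'(x) = 1
def identity(x):
    return x

def identity_prime(x):
    return np.ones_like(x, dtype=float)

# Logistic sigmoid: f(x) = 1 / (1 + e^{-x}), f'(x) = f(x) * (1 - f(x))
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # derivative reuses the forward-pass value

# Hyperbolic tangent: f(x) = tanh(x), f'(x) = 1 - tanh(x)^2
def tanh_prime(x):
    return 1.0 - np.tanh(x) ** 2  # also reuses the forward-pass value

x = np.linspace(-5.0, 5.0, 101)
print(sigmoid(0.0))        # 0.5
print(sigmoid_prime(0.0))  # 0.25
print(tanh_prime(0.0))     # 1.0
```

A quick sanity check is to compare each analytic derivative against a finite-difference approximation, e.g. `(sigmoid(x + h) - sigmoid(x - h)) / (2 * h)` for small `h`; the two should agree to several decimal places across the plotted range.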
