The Anatomy of Deep Learning Frameworks
Gokula Krishnan Santhanam

Hi Gokula,

Thank you for this post. I'm implementing my own deep learning framework, and your post has been very helpful.

One thing I noticed in this post is that you mention the existence of auto-differentiation tools. What algorithm do they use? Is it the same as a gradient check?
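By "gradient check" I mean the usual central finite-difference estimate, something like the rough sketch below (the name numerical_gradient and the epsilon value are just my own choices, not anything from your post):

/* Central finite-difference estimate of df/dx at a point x.
   This is what I mean by "gradient check"; eps is an arbitrary small step. */
double numerical_gradient(double (*f)(double), double x) {
    double eps = 1e-6;
    return (f(x + eps) - f(x - eps)) / (2.0 * eps);
}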

In particular, in your article, you said:

"Unfortunately, some nonlinearities like ReLU (Rectified Linear Units) are not differentiable at some points. So, we instead calculate the gradient in an iterative manner."

Which iterative algorithm are you referring to?

I calculate the ReLU derivative this way:

/* Derivative of ReLU: 1 for positive inputs, 0 otherwise
   (I simply use 0 at the non-differentiable point output == 0). */
double relu_derivative(double output) {
    if (output > 0) {
        return 1;
    } else {
        return 0;
    }
}

Is there a generic way of calculating arbitrary derivatives?

Thanks,
