For example, y = 0.01x for x < 0 makes it a slightly inclined line rather than a horizontal line. This is Leaky ReLU.
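As a quick illustration, here is a minimal NumPy sketch of Leaky ReLU using the 0.01 slope mentioned above (the function name and the alpha parameter are just illustrative, not from the article):

import numpy as np

def leaky_relu(x, alpha=0.01):
    # Pass x through unchanged when x >= 0; scale it by a small
    # slope (alpha = 0.01 here) when x < 0 instead of zeroing it.
    return np.where(x >= 0, x, alpha * x)

# Example: negative inputs are damped rather than clipped to zero.
x = np.array([-3.0, -1.0, 0.0, 2.0])
print(leaky_relu(x))  # [-0.03 -0.01  0.    2.  ]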

Does Leaky ReLU not undermine ReLU's advantage of generating a sparse and light network? Is this a trade-off that needs to be considered?
