Machine Learning Basics

Maham Shafiq
1 min read · Jun 2, 2020


This article explains several basic Machine Learning concepts that I glossed over in my other articles in order to keep those topics focused.

Leaky ReLU

We should use a leaky ReLU to allow gradients to flow backward through the layer unimpeded. A leaky ReLU is like a normal ReLU, except that there is a small non-zero output for negative input values.
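As a minimal sketch of this idea (the article names no framework, so PyTorch, the layer sizes, and the negative slope of 0.2 are my assumptions):

```python
import torch.nn as nn

# Hypothetical discriminator hidden layer using a leaky ReLU.
# For negative inputs, the activation outputs 0.2 * x instead of 0,
# so a small gradient still flows backward through the layer.
layer = nn.Sequential(
    nn.Linear(784, 256),   # example input/hidden sizes (assumed)
    nn.LeakyReLU(0.2),     # small non-zero slope for negative inputs
)
```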

Sigmoid Output

We’ll also take the approach of using a more numerically stable loss function on the outputs. Recall that we want the discriminator to output a value between 0 and 1 indicating whether an image is real or fake. To do this, we use a loss function that combines a `sigmoid` activation and binary cross-entropy loss in a single function.

So, our final output layer should not have any activation function applied to it; the raw logits are passed directly to this combined loss.
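A minimal sketch of this setup, assuming PyTorch (the batch size and tensor shapes below are illustrative, not from the article):

```python
import torch
import torch.nn as nn

# BCEWithLogitsLoss fuses the sigmoid and binary cross-entropy into one
# numerically stable operation, so the discriminator's last layer
# outputs raw logits with no activation applied.
criterion = nn.BCEWithLogitsLoss()

logits = torch.randn(16, 1)       # hypothetical discriminator outputs (raw logits)
real_labels = torch.ones(16, 1)   # label 1 for real images
loss = criterion(logits, real_labels)
```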
