Great explanation! Thank you very much.
Anam Qureshi

Just saw this. I'm not too sure anything would change. As long as the neural net is still end-to-end differentiable (i.e. uses a continuous activation function), it would be fine. I'm not sure what effect binary weights might have – I'm guessing less complexity, but maybe also some promotion of sparsity… though, again, I don't know.
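For what it's worth, one common way to keep a network with binary weights trainable end to end is the straight-through estimator (STE). This is a sketch of that idea, not something from the original post: the forward pass uses only the signs of real-valued "shadow" weights, while the backward pass pretends binarization was the identity, so gradients still flow.

```python
import numpy as np

def binarize(w):
    """Forward: replace each real-valued weight with its sign (+1 or -1)."""
    return np.where(w >= 0, 1.0, -1.0)

def forward(x, w):
    """Linear layer that only ever sees the binarized weights."""
    return x @ binarize(w)

def ste_grad_w(x, grad_out):
    """Backward: gradient w.r.t. w computed as if binarize were the
    identity function (the straight-through estimator)."""
    return x.T @ grad_out

# Toy SGD step: the real-valued shadow weights w are what gets updated,
# but only their signs are used in the forward pass.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
w = rng.normal(size=(3, 2))
y = forward(x, w)
grad_out = np.ones_like(y)          # pretend dL/dy = 1 for the demo
w -= 0.1 * ste_grad_w(x, grad_out)  # update shadow weights
```

So even though sign() has zero gradient almost everywhere, the STE trick sidesteps that, which is why binarization alone doesn't necessarily break backprop.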
