Hey, if you could fill in your ideas here -
Arjun Menon

Thanks, Arjun. The synapse direction and weight updates are part of gradient descent; here's an intro to the topic.
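As a rough illustration of what a gradient-descent weight update looks like, here is a minimal sketch for a single linear neuron fit to one (x, y) pair. The `train` function, its parameters, and the learning rate are made up for this example and are not from the article:

```python
# Minimal gradient-descent sketch: fit w*x + b to a single target y.
# All names (train, lr, steps) are illustrative, not from the article.

def train(x, y, w=0.0, b=0.0, lr=0.1, steps=100):
    for _ in range(steps):
        pred = w * x + b
        error = pred - y        # gradient of 0.5*(pred - y)**2 w.r.t. pred
        w -= lr * error * x     # step the weight against the gradient
        b -= lr * error         # step the bias against the gradient
    return w, b

w, b = train(x=2.0, y=3.0)
print(round(w * 2.0 + b, 3))    # prediction converges toward the target 3.0
```

Each pass nudges the weight and bias in the direction that reduces the squared error, which is the "synapse direction and weight update" idea in miniature.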

One of the limitations of neural nets is the need to rebuild the entire model when the training data changes. To understand this better, here's an explanation.
