How Neural Networks Work

Very nice article, probably the first time I really understood the concept, but I think there is an issue with how the network handles an all-zero input: [0, 0, 0]
would **always** produce 0.5 (I also checked my logic by adding more training sets and many more training cycles, always the same result). Since the weighted sum of an all-zero input is 0 no matter what the weights are, 1/(1 + np.exp(-x)) always evaluates to sigmoid(0) = 0.5. Testing for [0, 1, 0] gives the expected result given enough training cycles. So this should probably be fixed, likely with a bias term, since the sigmoid itself is behaving as defined.
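The behavior described above can be reproduced with a minimal sketch (the weight and bias values below are hypothetical, not from the article):

```python
import math

def sigmoid(x):
    # Standard logistic function: 1 / (1 + e^-x); sigmoid(0) is exactly 0.5.
    return 1.0 / (1.0 + math.exp(-x))

# Without a bias term, an all-zero input makes the weighted sum 0
# regardless of the learned weights, so the output is stuck at 0.5.
weights = [0.7, -1.3, 2.1]  # hypothetical trained weights
z = sum(w * x for w, x in zip(weights, [0, 0, 0]))
print(sigmoid(z))  # always 0.5, no matter how long we train

# A bias term shifts the weighted sum, so the network can learn an
# output other than 0.5 for the all-zero input.
bias = -2.0  # hypothetical learned bias
z_biased = sum(w * x for w, x in zip(weights, [0, 0, 0])) + bias
print(sigmoid(z_biased))  # no longer pinned at 0.5
```

This suggests the fix is not to the sigmoid itself but to the model: add a bias input so the pre-activation can be nonzero even when every feature is zero.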
