How Neural Networks Work
gk_

Very nice article, probably the first time I really understood the concept.. but I think there is an error in how the network handles an all-zero input: [0, 0, 0] would **always** result in 0.5. Since the weighted sum of a zero input is 0 no matter what the weights are, 1/(1 + np.exp(-x)) always evaluates to 1/(1 + e^0) = 0.5. (I also checked my logic by adding more training sets and many more training cycles — always the same result.) Testing [0, 1, 0] gives the expected result given enough training cycles. So the normalizing sigmoid setup should probably be fixed.
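
To make the point concrete, here's a minimal sketch of the failure mode, assuming the usual single-neuron setup from these tutorials (a 3-to-1 weight matrix, sigmoid activation, no bias term); the variable names are mine, not the article's:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Hypothetical single-neuron forward pass, no bias term
weights = np.random.random((3, 1)) * 2 - 1  # any weights at all

zero_input = np.array([[0, 0, 0]])
# 0 . w == 0 for every w, and sigmoid(0) == 0.5,
# so this prints [[0.5]] regardless of training
print(sigmoid(zero_input.dot(weights)))

# With a trainable bias the output is no longer pinned to 0.5
bias = np.random.random((1,)) * 2 - 1
print(sigmoid(zero_input.dot(weights) + bias))
```

No amount of weight training can escape this, since every weight is multiplied by a zero input; a bias term (or equivalently an always-on input) is one way around it.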