Mark Baynes
Apr 1 · 1 min read

Excellent article James. I did my first degree in Computing & A.I. in 1988 and had huge fun building a perceptron from scratch with this new-fangled ‘backprop’ idea in 1991 for my final project on a Mac SE/30. Those were heady days when A.I. was going to do this, that and everything else. Except it did not.

So I spent the next three decades faffing around with other careers until recently starting to look at (and program) A.I. again. I thought I might be the only person who was baffled by the fact that A.I. had hardly changed at all in 30 years, but it seems not (thanks!).

As a relative outsider (with some insider knowledge), I find the scale of resources being thrown at deep learning fascinating, if only because few people seem able to stand back and point out that these systems are brittle and are many human generations away from exhibiting the intelligence of a two-year-old child. Or a two-year-old bat. Or a two-year-old fox… And don’t get me started on the abilities of an octopus of any age. Now there is a brain to study. A reading of “Other Minds: The Octopus and the Evolution of Intelligent Life” by Peter Godfrey-Smith will explain why.

Great work, keep it up.

Mark Baynes