Hands-On Neural Networks and Time Series, with Python
From simple feed-forward neural networks to the majestic transformers: everything you need to know
During my Bachelor’s Degree, my favorite professor told me this:
Once something works well enough, nobody calls it “AI” anymore
This echoes Larry Tesler, who said that "AI is whatever hasn't been done yet." The first example of artificial intelligence was the calculator, which was (and is) able to perform very complex mathematical computations in a fraction of a second, while the same work would take a human being minutes or hours. Nonetheless, when we talk about AI today, we don't think of a calculator. We don't think of it because it simply works incredibly well, and we take it for granted. The Google Search algorithm, which is in many ways far more complex than a calculator, is a form of AI we use in our everyday lives, yet we barely think of it as AI at all.
So what really is "AI"? When do we stop calling something AI?
The question is far from simple because, if we really think about it, AI has many layers and spans many domains.