In the Land of Artificial Intelligence

AI Masterclass Tutor — The Storytelling Teaching Mode

Shahani Markus
3 min read · Nov 13, 2023


This is the third post in a sequence; links to the first and the second.

Here is my Personal GPT Tutor’s attempt to teach me complex AI topics in a storytelling manner, after I asked it to “go easy” on me. And the result? Hilarious or educational?

The Story of Deep Learning

Chapter 1: In the Land of Artificial Intelligence

In a vast land called Artificial Intelligence, there existed many methods to solve complex problems. One day, a method named Deep Learning emerged, promising to revolutionize how machines learned and processed information.

Chapter 2: The Birth of Neural Networks

Deep Learning was based on structures called Neural Networks, inspired by the human brain. Imagine a bustling city, where each building is a neuron, and the roads connecting them are the synapses (weights). Information (data) travels through these roads, transforming as it moves from building to building.

Activation Functions: They are like gatekeepers at each building, deciding how much information should pass through. ReLU, Sigmoid, and Tanh were the most renowned gatekeepers, each with its own way of controlling the flow.
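For readers who want to peek behind the allegory, here is a minimal sketch of the three gatekeepers in a few lines of PyTorch (my own addition, not the tutor's; the input values are arbitrary):

```python
import torch

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])  # signals arriving at a building

print(torch.relu(x))     # ReLU blocks negatives: tensor([0.0, 0.0, 0.0, 0.5, 2.0])
print(torch.sigmoid(x))  # Sigmoid squashes everything into (0, 1)
print(torch.tanh(x))     # Tanh squashes everything into (-1, 1)
```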

Chapter 3: The Quest for Vision with CNNs

As Neural Networks evolved, they specialized. The Convolutional Neural Network (CNN) was born, designed to interpret the visual world. Imagine a skilled artist (CNN) who could look at a landscape (image) and effortlessly pick out details like edges, textures, and shapes (features).

Layers of CNN: Each brush stroke (convolution) added depth, and the artist periodically stepped back (pooling) to simplify the picture, focusing on the essential elements.
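Stepping outside the story for a moment, here is a rough sketch of one brush stroke and one step back, assuming a small RGB image and PyTorch's standard layers (the sizes are my illustrative choices):

```python
import torch
import torch.nn as nn

image = torch.randn(1, 3, 32, 32)  # one 32x32 RGB "landscape"

conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)
pool = nn.MaxPool2d(kernel_size=2)

features = conv(image)       # a brush stroke: detect local patterns
print(features.shape)        # torch.Size([1, 8, 32, 32])

simplified = pool(features)  # stepping back: keep only the strongest strokes
print(simplified.shape)      # torch.Size([1, 8, 16, 16])
```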

Chapter 4: The RNNs and the Art of Storytelling

But there was a different challenge in the land: understanding stories and sequences, like songs or tales (sequential data). The Recurrent Neural Networks (RNNs) were the bards who excelled at this. They remembered (stored) past words (inputs) to make sense of the story as it unfolded.

The Challenge: However, these bards struggled with long epics (long sequences). Their memory (gradients) would either fade (vanish) or become overwhelming (explode).

The Heroes, LSTM and GRU: Two heroes, LSTM and GRU, rose to solve this. They carried special bags (gates) that helped them remember important parts of the story and forget the trivial ones, making them excellent storytellers, even for the longest epics.
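In code, one of these heroes might look like the following minimal sketch, using PyTorch's built-in LSTM (the tale length and feature sizes are arbitrary choices of mine):

```python
import torch
import torch.nn as nn

# A bard with a special bag: an LSTM reading a 10-word tale,
# 8 features per word, carrying a 16-dimensional memory.
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

tale = torch.randn(1, 10, 8)  # (batch, sequence length, features)
outputs, (hidden, cell) = lstm(tale)

print(outputs.shape)  # torch.Size([1, 10, 16]): the memory at every step
print(hidden.shape)   # torch.Size([1, 1, 16]):  what the bard retains at the end
```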

Chapter 5: Overcoming Challenges

Despite their powers, the Neural Networks faced the danger of becoming overconfident (overfitting), thinking they knew everything after hearing just a few tales. Techniques like dropout (randomly ignoring some buildings) and L2 regularization (penalizing large roads) helped them stay humble and adaptable.
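A minimal sketch of both humility techniques, using PyTorch's Dropout layer and the optimizer's weight_decay parameter for the L2 penalty (again my illustration, not the tutor's; the layer sizes are made up):

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # randomly ignore half the buildings while training
    nn.Linear(64, 1),
)

# weight_decay adds an L2 penalty that discourages large "roads" (weights)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)
```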

Epilogue: The Journey Continues

The world of Deep Learning continues to evolve, with new structures like Transformers emerging, promising to unravel even more complex mysteries of data. The journey of learning and discovery is endless, with each chapter revealing new wonders and challenges.

Ok. This is not how I expected to be taught! I don’t think I could tolerate reading any more about some AI La-La Land to learn AI concepts. I guess it is back to the drawing board to build a better-performing AI Masterclass Tutor…
