Lucy anticipates your needs and concerns throughout the day, and she is there to share the good times with you and to comfort you through the hard ones. She was born of the revolution that followed the deep learning one. And what is Lucy made of?
Let’s imagine that we are in that year, 2031 (a symbolic number, as Lucy may not be feasible until many years after that date), and let’s entertain a variety of hypotheses about what kind of substrate Lucy may have. Let’s do it!
AGI, artificial general…
If you are starting to work with convolutional layers in deep learning, you may at times be confused by the mix of parameters, computations and channels involved. From stride to padding, input and output channels, kernels and learnable parameters, there is a lot going on. In this article, we are going to dig all the way down to what happens inside these conv layers. We will:
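As a first taste of that bookkeeping, here is a minimal sketch of how a conv layer's learnable parameter count and output size follow from its channels, kernel size, stride and padding. The function names and the example dimensions are illustrative choices, not taken from the article.

```python
# Hypothetical helpers for conv-layer bookkeeping (illustrative, not from
# the article). Square kernels and square inputs are assumed for brevity.

def conv2d_params(in_channels, out_channels, kernel_size, bias=True):
    """Each output channel owns one kernel of shape
    (in_channels, kernel_size, kernel_size), plus an optional bias term."""
    weights = out_channels * in_channels * kernel_size * kernel_size
    biases = out_channels if bias else 0
    return weights + biases

def conv2d_output_size(input_size, kernel_size, stride=1, padding=0):
    """Spatial side length of the output feature map."""
    return (input_size + 2 * padding - kernel_size) // stride + 1

# A 3x3 conv taking 3 input channels to 16 output channels:
print(conv2d_params(3, 16, 3))                         # 16*3*3*3 + 16 = 448
print(conv2d_output_size(32, 3, stride=1, padding=1))  # 32: padding of 1 preserves size
```

Note how the parameter count is independent of the input's spatial size: only the channels and kernel size matter, which is exactly what makes conv layers so much cheaper than fully connected ones.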
The transformer architecture has revolutionized NLP and deep learning at large. A multitude of applications benefit from these models' capacity to process sequences in parallel while achieving a deeper understanding of their context through the attention mechanisms they implement. And GPT-3 is a hot topic in the deep learning community right now.
Understanding how the transformer processes sequences can be a bit challenging at first. When tackling a complex model, many people like to study how the computations of the model change the shapes of the tensors that travel through it.
In this article, you will:
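To illustrate the kind of shape-tracking the article relies on, here is a minimal NumPy sketch of scaled dot-product attention. The batch size, sequence length and model dimension below are illustrative choices, not values from the article, and the random matrices stand in for learned projections.

```python
import numpy as np

# Illustrative dimensions: 2 sequences of 10 tokens, model width 64.
batch, seq_len, d_model = 2, 10, 64
rng = np.random.default_rng(0)

x = rng.normal(size=(batch, seq_len, d_model))

# Random stand-ins for the learned query/key/value projections.
Wq = rng.normal(size=(d_model, d_model))
Wk = rng.normal(size=(d_model, d_model))
Wv = rng.normal(size=(d_model, d_model))

q, k, v = x @ Wq, x @ Wk, x @ Wv          # each: (batch, seq_len, d_model)

# Every token attends to every token: a (seq_len, seq_len) score matrix.
scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_model)   # (batch, seq_len, seq_len)
weights = np.exp(scores - scores.max(-1, keepdims=True))
weights /= weights.sum(-1, keepdims=True)              # softmax over the keys
out = weights @ v                                      # back to (batch, seq_len, d_model)

print(scores.shape, out.shape)  # (2, 10, 10) (2, 10, 64)
```

The shapes tell the story: attention briefly expands to a token-by-token matrix, then collapses back to the model dimension, which is why every token can be processed in parallel.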
“Life requires Movement” — (Aristotle, 4th century BC)
“Life is movement. The more life there is, the more flexibility there is. The more…
Leonardo slammed his fist on the table.
— Outrageous! Porca Miseria!
The courtiers of the king and other royalty contemplated the scene from the sides of the great room. Clara, his cousin and confidant, tried to comfort him.
— What happened, Maestro? What’s the matter?
Leonardo hesitated for a moment. Then sighed and raised his eyes towards Clara. Suddenly, his index finger made a massive leap that took Clara’s gaze all the way to the other end of the sumptuous room.
— Antonio? What’s the matter with your apprentice? I thought you were so very delighted, and…
Everyone whispered hurriedly…
In part 1 of this series, we explored the architecture of our neural network in depth. In part 2, we built it using Python and worked through back-propagation and the gradient descent optimization algorithm.
In the final part 3, we will use the Wisconsin Cancer dataset. We will learn to prepare our data, run it through our network and analyze the results.
It’s time to explore the loss landscape of our network.
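The preparation step can be sketched in a few lines: standardize the features, then hold out a test split. Random stand-in data is used here in place of the actual Wisconsin Cancer dataset, though the shapes (569 samples, 30 features) match the real one; the 80/20 split ratio is an illustrative choice.

```python
import numpy as np

# Random stand-in for the Wisconsin Cancer dataset: 569 samples, 30 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(569, 30))
y = rng.integers(0, 2, size=569)      # binary labels: malignant vs. benign

# Standardize each feature to zero mean and unit variance, so no single
# feature dominates the gradient updates.
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Shuffle, then hold out 20% of the samples for testing.
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
X_train, X_test = X[idx[:split]], X[idx[split:]]
y_train, y_test = y[idx[:split]], y[idx[split:]]

print(X_train.shape, X_test.shape)    # (455, 30) (114, 30)
```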
In part 1 of this article, we explored the architecture of our 2-layer neural network. Now it’s time to build it! Along the way, we will explore and understand in depth the foundations of deep learning: back-propagation and the gradient descent optimization algorithm.
In this 3-part article you are going to:
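The flavor of what part 2 builds can be condensed into a short sketch: a 2-layer network trained by back-propagation and plain gradient descent. XOR is used here as a stand-in task, and the hidden size, learning rate and iteration count are arbitrary illustrative choices, not the ones from the series.

```python
import numpy as np

# A minimal 2-layer network on XOR (illustrative stand-in for the
# network built in the series). Sizes and hyperparameters are arbitrary.
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass: input -> hidden -> output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the squared error through each sigmoid.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient descent step on every weight and bias.
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))   # predictions after training
```

Every one of these lines gets its own derivation in the series; the point of the sketch is only that forward pass, backward pass and update step fit together in a handful of matrix operations.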
A multidisciplinary engineer, researcher, creative director, artist and entrepreneur, from augmented reality to deep learning, filmmaking, 3D and beyond.