Lucy says hi — 2031, AGI, and the future of AI
In the year 2031 there is a new Alexa in town, and artificial general intelligence is one of its features. What is Lucy made of? — Lucy anticipates your needs and concerns throughout the day, sharing the good times with you and comforting you through the hard ones. She was born from the next revolution, the one that came after deep learning. …
Journey to the center of the neuron
Dive into the salty ocean of the brain and get closer to the entities that inspire our AI systems and make your thoughts possible. Explore how understanding the difference between artificial and biological neurons may give us clues about how to move towards a more flexible kind of artificial intelligence.
Convolutional layer hacking with Python and NumPy
Create a convolutional layer from scratch in Python, hack its weights with custom kernels, and verify that its results match what PyTorch produces — If you are starting to work with convolutional layers in deep learning, you may at times be confused by the mix of parameters, computations and channels involved. From stride to padding, input and output channels, kernels and learnable parameters, there is a lot going on. In this article, we are…
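The idea the teaser describes can be sketched in a few lines. This is a minimal NumPy illustration, not the article's actual code: a naive convolutional (cross-correlation) layer whose weights we set to a hand-picked kernel, here a 3×3 averaging kernel, so the output is easy to check by hand. The function name and toy input are assumptions for illustration.

```python
import numpy as np

def conv2d(x, w, stride=1, padding=0):
    """Naive conv layer forward pass.
    x: input of shape (C_in, H, W); w: kernels of shape (C_out, C_in, kH, kW)."""
    if padding:
        x = np.pad(x, ((0, 0), (padding, padding), (padding, padding)))
    c_out, c_in, kh, kw = w.shape
    _, h, wd = x.shape
    oh = (h - kh) // stride + 1   # output height
    ow = (wd - kw) // stride + 1  # output width
    out = np.zeros((c_out, oh, ow))
    for o in range(c_out):        # one output channel per kernel
        for i in range(oh):
            for j in range(ow):
                patch = x[:, i*stride:i*stride+kh, j*stride:j*stride+kw]
                out[o, i, j] = np.sum(patch * w[o])  # sum over all input channels
    return out

# Hack the weights: a 3x3 averaging kernel on a constant image returns the constant.
x = np.ones((1, 5, 5))
w = np.full((1, 1, 3, 3), 1 / 9)
y = conv2d(x, w)  # shape (1, 3, 3), every value 1.0
```

In the article the same check is done against PyTorch's convolution; here the averaging kernel plays that role, since its expected output is obvious.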
X-Ray Transformer Infographic
Dive into transformer training & inference computations through a single visual — The transformer architecture has produced a revolution in the NLP field and in deep learning. A multitude of applications benefit from the capacity of these models to process sequences in parallel while achieving a deeper understanding of their context through the attention mechanisms they implement. …
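The attention mechanism the teaser mentions can be condensed into a short sketch. This is a minimal single-head scaled dot-product attention in NumPy, assumed names and toy shapes, not the infographic's own computation: every query attends to every key at once, which is what lets transformers process a whole sequence in parallel.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention for one head.
    Q, K, V: (seq_len, d_k) matrices; returns outputs and attention weights."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # similarity of every query to every key
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V, weights              # weighted mix of the values

rng = np.random.default_rng(0)
Q = K = V = rng.standard_normal((4, 8))      # self-attention over a toy sequence of 4 tokens
out, w = attention(Q, K, V)
```

Each row of `w` is a probability distribution over the sequence, which is the "context understanding" the visual unpacks step by step.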
Loss landscapes and the blessing of dimensionality
Explore visualizations of deep learning loss landscapes, the blessing of dimensionality and other visuals including GANs & geometric DL — In this article, you will: Understand how the loss landscapes of neural networks can be visualized. Explore high-resolution visualizations (both static and animated) of loss landscapes using real data captured from a variety of networks, from convolutional nets to GANs.
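A common way such landscapes are produced is to slice the high-dimensional loss surface along two random directions around the trained weights. The sketch below is an assumed, simplified version of that idea with a toy quadratic "loss" standing in for a real network; the article's visualizations use real networks and normalized directions.

```python
import numpy as np

def loss(w):
    # Toy stand-in for a network's loss: minimized at w = 1.
    return np.sum((w - 1.0) ** 2)

w_star = np.ones(10)                       # pretend these are the trained weights
rng = np.random.default_rng(0)
d1, d2 = rng.standard_normal((2, 10))      # two random directions in weight space
d1 /= np.linalg.norm(d1)                   # normalize (a simplified stand-in for
d2 /= np.linalg.norm(d2)                   # the filter-wise normalization used in practice)

alphas = np.linspace(-1, 1, 21)
# Evaluate the loss on a 2D grid of perturbations around w_star.
surface = np.array([[loss(w_star + a * d1 + b * d2) for b in alphas]
                    for a in alphas])      # 21x21 grid, ready for a contour plot
```

The resulting `surface` is exactly what a contour or 3D plot of a loss landscape renders; the center of the grid sits at the trained minimum.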
Leonardo and the GAN dream
Explore the concept of a GAN network by travelling back in time to the Renaissance, as Leonardo faces one of his most surprising challenges — Leonardo slammed his fist on the table. — Outrageous! Porca miseria! The courtiers of the king and other royalty contemplated the scene from the sides of the great room. Clara, his cousin and confidant, tried to comfort him. — What happened, Maestro? What’s the matter?
Predict malignancy in cancer tumors with your own neural network
In the final part of this series, we predict malignancy in breast cancer tumors using the network we coded from scratch. — In part 1 of this series, we explored the architecture of our neural network in depth. In part 2, we built it using Python and worked through back-propagation and the gradient descent optimization algorithm in detail. In this final part 3, we will use the Wisconsin Cancer dataset. We will…
Coding a 2-layer neural network from scratch in Python
In the second part of this series, we code a neural network from scratch and use it to predict malignant breast cancer tumors — In part 1 of this article, we understood the architecture of our 2-layer neural network. Now it’s time to build it! In parallel, we will explore and understand in depth the foundations of deep learning, back-propagation and the gradient descent optimization algorithm.
The keys of Deep Learning in 100 lines of code
Predict malignancy in cancer tumors with a neural network. Build it from scratch in Python. — In this 3-part article you are going to: Create a neural network from scratch in Python. Train it using the gradient descent algorithm. Apply that basic network to the Wisconsin Cancer dataset. Predict if a tumor is benign or malignant, based on 9 different features. Explore deeply how back-propagation…
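The full pipeline the series builds, a 2-layer network, forward pass, back-propagation and gradient descent, can be compressed into a short sketch. This is not the series' actual code: the data here is a synthetic stand-in with 9 features and binary labels (the articles use the real Wisconsin set), and the layer sizes and learning rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy stand-in data: 64 samples, 9 features, binary labels.
X = rng.standard_normal((64, 9))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

sigmoid = lambda z: 1 / (1 + np.exp(-z))

# 2-layer network: 9 inputs -> 8 hidden units -> 1 output.
W1 = rng.standard_normal((9, 8)) * 0.1; b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.1; b2 = np.zeros(1)

lr = 0.5
for _ in range(2000):
    # Forward pass.
    A1 = sigmoid(X @ W1 + b1)
    A2 = sigmoid(A1 @ W2 + b2)
    # Back-propagation (gradients of the cross-entropy loss).
    dZ2 = A2 - y
    dW2 = A1.T @ dZ2 / len(X); db2 = dZ2.mean(0)
    dZ1 = (dZ2 @ W2.T) * A1 * (1 - A1)   # chain rule through the sigmoid
    dW1 = X.T @ dZ1 / len(X); db1 = dZ1.mean(0)
    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Training accuracy of the final network.
preds = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5
acc = (preds == y).mean()
```

Swapping the synthetic `X, y` for the Wisconsin features and labels gives the benign/malignant predictor the series builds, which is the step the articles walk through in detail.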