Anyone can make DeepFakes without writing a single line of code.

Photo by Christian Gertenbach on Unsplash

Do you dance? Do you have a favourite dancer or performer whose moves you wish you could copy? Well, now you can!

Imagine having a full-body picture of yourself. Just a still image. Then all you need is a solo video of your favourite dancer performing some moves. Not that hard now that TikTok is taking over the world…

Image animation uses a video sequence to drive the motion of an object in a picture. In this story, we see how image animation technology is now ridiculously easy to use, and how you can animate almost anything you…


Literate programming is now a reality through nbdev and the new visual debugger for Jupyter.

Photo by Max Duzij on Unsplash

Notebooks have always been a tool for incremental development of software ideas. Data scientists use Jupyter to journal their work, explore and experiment with novel algorithms, quickly sketch new approaches and immediately observe the outcomes.

However, when the time is ripe, software developers turn to classical IDEs (Integrated Development Environments), such as Visual Studio Code and PyCharm, to convert their ideas into libraries and frameworks. But is there a way to transform Jupyter into a full-fledged IDE, where raw concepts are translated into robust and reusable modules?
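As a taste of the literate-programming workflow nbdev enables, a notebook cell carries export directives as plain comments, so the same cell runs interactively in Jupyter and compiles into a library module. A minimal sketch; the module name is hypothetical, and older nbdev versions used `#export` instead of `#|export`:

```python
# A notebook cell in an nbdev project. The directives below are ordinary
# comments, so the cell executes as plain Python inside Jupyter.

#|default_exp core   # cells marked for export end up in my_lib/core.py

#|export
def greet(name: str) -> str:
    "Return a greeting; nbdev copies this cell into the generated library."
    return f"Hello, {name}!"

# Cells without an export directive stay in the notebook as docs and tests,
# which nbdev can run as the library's test suite.
assert greet("Jupyter") == "Hello, Jupyter!"
```

The point of the design is that documentation, tests, and source never drift apart: they live in the same notebook cells.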

To this end, developers from several institutions, including QuantStack, Two Sigma, Bloomberg and…


Turn your Windows machine into a developer workstation with WSL 2.

Image by OpenClipart-Vectors from Pixabay

I used to have an Apple laptop as my daily driver. I could do almost everything there: development, proposal writing, music composition, and so on. But the fear of vendor lock-in, and the concern that I was dependent on Apple's whims and vices, which are arguably very expensive, led me to seek a new solution.

I started building a machine learning workstation: a great CPU, lots of RAM, and a competent GPU, among other things. My OS of choice for almost everything was Ubuntu, except that I needed Microsoft Office for proposal writing. Office online is just not there yet and, let’s face…


What if you could use your NPM goodies to build Jupyter widgets with Python?

Photo by Myriam Jessier on Unsplash

Notebooks have always been a tool for the incremental development of software ideas. Data scientists use Jupyter to journal their work, explore and experiment with novel algorithms, quickly sketch new approaches and immediately observe the outcomes.

This interactivity is what makes Jupyter so appealing. To take it one step further, Data Scientists use Jupyter widgets to visualize their results or create mini web apps that facilitate navigating through the content or encourage user interaction.

However, IPyWidgets are not always easy to work with. They do not follow the declarative design principles pioneered by front-end developers, and the resulting components cannot…


Everything’s right at your fingertips

Photo by Clément Hélardot on Unsplash

Notebooks have always been a tool for the incremental development of software ideas. Data scientists use Jupyter to journal their work, explore and experiment with novel algorithms, quickly sketch new approaches and immediately observe the outcomes.

Moreover, JupyterLab is moving towards becoming a full-fledged IDE; just not an IDE you are used to. With its great extensions and libraries like kale and nbdev, it is certainly capable of doing much more than just drafting an idea. Check the story below to find out more.

However, once in a blue moon, we may want to edit a .py file. This file may…


A new Deep Learning framework that makes building baselines in PyTorch trivial

Image by Gerd Altmann from Pixabay

Say you are a researcher; you have an innovative idea for a brand new neural network architecture or a strange-looking optimizer step that has the potential to be a breakthrough in the area of deep learning. How should you test your approach?

On the other hand, say you are a machine learning engineer, and you want to test if the model you have built makes sense. Is it the best you can do? Wouldn’t it be better to pick an off-the-shelf, battle-tested algorithm and deploy this instead?

Lightning Flash: a new Deep Learning framework that makes building baselines in PyTorch…


How free-living, transparent nematodes provide inspiration for the next steps in Deep Learning

Image by Arek Socha from Pixabay

Caenorhabditis elegans, or C. elegans, is a free-living, transparent nematode: a roundworm, about 1 mm in length, that lives in temperate soil environments. Its nervous system consists of just 302 neurons, yet it can generate surprisingly complex and diverse behavioral patterns.

On the other hand, we have Recurrent Neural Networks with millions of parameters and thousands of nodes. Still, their behavior stays fixed after the training phase. Consequently, their adaptability to changing environments is limited.

What happens, then, when the incoming data’s characteristics change during inference? How could we address this issue?
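To make the contrast concrete, here is a hedged, pure-Python sketch of the idea behind liquid time-constant neurons, the nematode-inspired model this story builds on: instead of a fixed update rule, the neuron's effective time constant is modulated by its input, so its dynamics adapt at inference time. The constants and the gate below are illustrative choices, not values from the paper:

```python
import math

def ltc_step(x, inp, dt=0.01, tau=1.0, a=1.0):
    """One Euler step of a liquid time-constant (LTC) style neuron.

    dx/dt = -(1/tau + f(x, inp)) * x + f(x, inp) * a

    The gate f depends on the current input, so the effective time
    constant 1 / (1/tau + f) varies with the data the neuron sees --
    unlike a classic RNN cell, whose dynamics are frozen after training.
    """
    f = 1.0 / (1.0 + math.exp(-(inp + x)))      # input-dependent gate
    dx = -(1.0 / tau + f) * x + f * a
    return x + dt * dx

# Drive the neuron with a constant input: the state settles between 0 and a,
# and a different input regime would yield different effective dynamics
# without retraining any parameter.
x = 0.0
for _ in range(1000):
    x = ltc_step(x, inp=2.0)
print(round(x, 3))
```

Note that the negative leak term keeps the state bounded, which is one reason these continuous-time cells remain stable under shifting inputs.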

To this…


Nothing is stopping you from scaling out your deep learning training process on multiple GPUs today

Photo by Science in HD on Unsplash

Deep learning is data-hungry. Recent advances are mostly due to the amount of data at our disposal, the fuel that keeps the engine running. Thus, the need to scale out the model training process is higher than ever.

At the same time, the field of DevOps is gaining traction. Kubernetes is ubiquitous; monolithic legacy systems are breaking up into smaller microservices that are easier to maintain. If something is wrong, these services are treated like disposable pieces of code that can be taken down and restored in no time.

How can we bring all the pieces together and scale out…


Writing distributed applications with PyTorch: a real-world example

Image by PublicDomainPictures from Pixabay

Deep Neural Networks (DNNs) have been the main force behind most of the recent advances in Machine Learning. DNNs have transformed how we solve challenges in a diverse set of use cases from image classification to language translation and from content recommendation to drug discovery.

Recent advances are mostly due to the amount of data at our disposal, the fuel that keeps the Deep Learning engine running. Thus, the need to scale out model training to more computational resources is higher than ever.


A guide on how to write distributed deep learning applications with PyTorch

Photo by Omar Flores on Unsplash

Deep Neural Networks (DNNs) have been the main force behind most of the recent advances in Machine Learning. From image classification to language translation and from content recommendation to drug discovery, DNNs have transformed how we solve challenges in a diverse set of use cases.

Recent advances are mostly due to the amount of data at our disposal, the fuel that keeps the Deep Learning engine running. Thus, the need to scale out model training to more computational resources is higher than ever.
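The basic mechanism behind scaling out data-parallel training is sharding: with N workers, each worker processes every N-th sample, which is essentially the index assignment PyTorch's `DistributedSampler` performs. A stdlib-only sketch of that strategy (the function name is illustrative; the real sampler also shuffles and pads shards to equal length):

```python
def shard_indices(dataset_size, world_size, rank):
    """Return the sample indices a given worker (rank) should process.

    Mirrors the round-robin strategy of torch.utils.data.DistributedSampler:
    rank r out of world_size w takes indices r, r + w, r + 2w, ...
    """
    return list(range(rank, dataset_size, world_size))

# Four workers splitting a 10-sample dataset: every sample is covered
# exactly once, and the shards are nearly balanced.
shards = [shard_indices(10, 4, r) for r in range(4)]
print(shards)  # [[0, 4, 8], [1, 5, 9], [2, 6], [3, 7]]
```

Each worker then computes gradients on its own shard, and the workers average gradients before every parameter update, so the result matches training on the full batch.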

There are two…

Dimitris Poulopoulos

Machine Learning Engineer @ Arrikto | PhD(c) @ University of Piraeus, Greece
