This blog post describes how I optimised a Generative Adversarial Network and a Recurrent Neural Network, using data from 40,000 scraped Tinder accounts, to generate fake dating profiles.

Here is a link to the final website. Please note that it is not at all optimised for mobile devices!

At no point in the blog post or elsewhere on the internet will I share the data that I obtained, so please don’t ask for it — I have already deleted it.

Our Public Data

Recently I have been thinking about the privacy, ownership and availability of data.

Like many, in the wake of the Cambridge Analytica scandal, I found the magnitude of data harvested from Facebook by a third party astonishing. Collectively I…


Seasons Greetings With Pix2Pix

First and foremost, a thousand pardons for the cheesy title of this blog post. It’s only five days until Christmas, and I needed to come up with an apt theme. Once I thought of the name, it stuck, and I struggled (or didn’t want) to come up with a new one. I’m awful, I know.

TL;DR

If you can’t be bothered to read this entire post, then firstly, I don’t blame you (it’s Christmas after all; what the hell are you doing reading this?) and secondly, you can see the output of the network’s learned representation of The Snowman…


Stop Wasting Your Time, Vectorise Your Algorithms!

Last night and well into the morning, I put together a few notebooks experimenting with vectorised versions of two metaheuristic algorithms, Differential Evolution (DE) and Dispersive Flies Optimisation (DFO), using TensorFlow.

TensorFlow isn’t just a library for neural networks; it is a general-purpose library for building computation graphs that can be run on the CPU or GPU. It also goes to some lengths to mirror the NumPy library’s API. …
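To give a flavour of what "vectorised" means here, below is a minimal sketch of a DE-style mutation, crossover and selection loop written as whole-population array operations, no per-agent Python loop. The post itself uses TensorFlow; I use NumPy here since the APIs are similar, and the objective, population size and DE constants (`F`, `CR`) are illustrative choices, not values from the post. Picking partners via whole-population permutations is a simplification: canonical DE excludes the agent itself from its partner triple.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Toy objective: sum of squares per row, minimised at the origin."""
    return np.sum(x ** 2, axis=1)

pop_size, dims = 32, 10
F, CR = 0.8, 0.9  # mutation factor and crossover rate

population = rng.uniform(-5.0, 5.0, size=(pop_size, dims))
fitness = sphere(population)

for _ in range(200):
    # Mutation: pick partner triples for every agent at once by
    # shuffling the whole population three times.
    a = population[rng.permutation(pop_size)]
    b = population[rng.permutation(pop_size)]
    c = population[rng.permutation(pop_size)]
    mutants = a + F * (b - c)

    # Binomial crossover: mix mutant and parent genes elementwise.
    cross = rng.random((pop_size, dims)) < CR
    trials = np.where(cross, mutants, population)

    # Greedy selection: keep whichever of parent/trial scores better.
    trial_fitness = sphere(trials)
    improved = trial_fitness < fitness
    population = np.where(improved[:, None], trials, population)
    fitness = np.where(improved, trial_fitness, fitness)

print(fitness.min())
```

Because every phase is an array expression over the whole population, the same code translates almost line for line into TensorFlow ops and can then run on the GPU.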


This post is another tutorial on genetic algorithms with a sprinkling of DFO about halfway through! If you would like to see the other blog posts then you can find them here.

  1. The No Free Lunch Theorem
  2. Optimising Neural Nets With Biomimicry
  3. Fine Tuning Dispersive Flies Optimisation
  4. Introducing Stochastic Diffusion Search
  5. Evolving Solutions With Genetic Algorithms

This post tackles various challenges with metaheuristics; I see it as a set of bread-and-butter exercises. Hopefully you’ll enjoy working through them too.

Warming Up

Whilst I’d rather take some time out to binge Stranger Things on Netflix, my professor has had the…


Hi, and welcome to this week’s post on algorithms inspired by nature. The previous weeks’ posts can be found at the following links:

  1. The No Free Lunch Theorem
  2. Optimising Neural Nets With Biomimicry
  3. Fine Tuning Dispersive Flies Optimisation
  4. Introducing Stochastic Diffusion Search

Ultimately, as a deep learning enthusiast, I try to create neural solutions to the various problems I have to solve daily. Neural networks are composed of weights and biases, which are optimised by an algorithm called backpropagation. As there is no free lunch, there are cases where backpropagation is not feasible, such as using a neural network to make a…


This post is on a project exploring an audio dataset in two dimensions. I cover some interesting algorithms such as NSynth, UMAP, t-SNE, MFCCs and PCA, show how to implement them in Python using Librosa and TensorFlow, and also demonstrate a visualisation in HTML, JavaScript and CSS that lets us interactively explore the audio dataset in two-dimensional, parameterised plots, just like these:
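The dimensionality-reduction step at the heart of that pipeline can be sketched with nothing but NumPy. Below, PCA via the singular value decomposition projects feature vectors down to 2D coordinates for plotting; the random matrix stands in for per-clip audio features (in the post these would be MFCC-style features computed with Librosa from real audio), so the shapes and names here are illustrative, not the post's actual data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for per-clip audio feature vectors (e.g. mean MFCCs);
# the real project computes these from audio with Librosa.
n_clips, n_features = 200, 13
features = rng.normal(size=(n_clips, n_features))

def pca_2d(x):
    """Project the rows of x onto their first two principal components."""
    centred = x - x.mean(axis=0)
    # SVD of the centred data: the rows of vt are the principal axes,
    # ordered by decreasing variance explained.
    u, s, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[:2].T

coords = pca_2d(features)
print(coords.shape)  # (200, 2)
```

Each row of `coords` is then an (x, y) point for one audio clip, ready to hand to the JavaScript front end; UMAP and t-SNE slot into the same place but produce nonlinear embeddings.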

TL;DR

The results are hosted in a small web application on my university's servers — have a play with it!


As deep learning witches and wizards, there are many secret sauces that we need to stir into our predictive potions. As a non-mathematician and headstrong programmer, I often develop headaches when reading deep learning papers and desperately search for the source code.

This series is about digging into the basics of deep learning, from the bottom up, to complement the knowledge we develop as hackers who are used to building projects with awesome libraries like TensorFlow or Keras.

One of the main sources of information for this series is The Deep Learning Book, by the heavyweights Ian…


This blog post is a high level overview of my foray into using neural networks to trade bitcoins.

For about two years, I’ve doggy-paddled well out of my depth through the endless ocean of deep learning soup. During this, I’ve become ever more lazy; typically, I can never be bothered to learn a new topic, so I often find myself wondering if I can get my computer to learn and do it for me!

This article is about training learning systems (using neural networks) to trade cryptocurrencies for me, because I’m too lazy to read the thick financial trading tomes…


Hi, and welcome to this week’s post on algorithms inspired by nature. The previous weeks’ posts can be found at the following links:

  1. The No Free Lunch Theorem
  2. Optimising Neural Nets With Biomimicry
  3. Fine Tuning Dispersive Flies Optimisation

The full code for this week (C++, using the openFrameworks library) can be found at the GitHub repository here.

Stochastic Diffusion Search (SDS) belongs to the wider family of swarm intelligence algorithms. …
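To show the shape of the algorithm, here is a minimal SDS sketch on the classic toy problem of locating a short pattern in a longer string. Each agent holds a hypothesis (a start index), the test phase checks one randomly chosen character of the pattern, and the diffusion phase lets inactive agents copy active agents or restart at random. The search string, agent count and iteration count are my own toy choices, not anything from the post (whose code is C++/openFrameworks).

```python
import random

random.seed(42)

search_space = "xxxhelloxxxxxxxxhelpxxxx"
model = "hello"  # the pattern the swarm is searching for
n_agents = 50
n_iterations = 30

max_start = len(search_space) - len(model)
hypotheses = [random.randint(0, max_start) for _ in range(n_agents)]
active = [False] * n_agents

for _ in range(n_iterations):
    # Test phase: each agent checks ONE randomly chosen character of
    # the model against the search space at its hypothesised position.
    for i in range(n_agents):
        j = random.randrange(len(model))
        active[i] = search_space[hypotheses[i] + j] == model[j]

    # Diffusion phase: each inactive agent polls a random agent; if the
    # polled agent is active it copies its hypothesis, otherwise it
    # restarts at a uniformly random position.
    for i in range(n_agents):
        if not active[i]:
            other = random.randrange(n_agents)
            if active[other]:
                hypotheses[i] = hypotheses[other]
            else:
                hypotheses[i] = random.randint(0, max_start)

# The swarm's answer is the most popular hypothesis.
best = max(set(hypotheses), key=hypotheses.count)
print(best)
```

Note the distractor "help" at index 16: agents there pass the partial test often enough to recruit others for a while, but only the full match at index 3 keeps agents permanently active, so the cluster there grows until it dominates.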


This is part three in a series reviewing algorithms inspired by or attributed to nature. You can find part one, on the No Free Lunch theorem, here; part two, on applying DFO, here; and the next part, on Stochastic Diffusion Search, here.

Unfortunately, this post is simply a terse review of the DFO algorithm; I have a busy schedule this week and not a lot of time, so I will highlight some of the key features of applying it to various error surfaces.

If you want an intro to DFO, I suggest you read the second post in the…

Leon Fedden

Wizard nerd summoning TensorFlow, C++ and Python black magic from the void. https://leonfedden.co.uk/
