Evolutionary failures (Part 1)

thi.ng
13 min read · Jan 5, 2016


A journey into Genetic Programming techniques with Clojure and thi.ng

Over the years, generative techniques like Cellular Automata, Diffusion-limited Aggregation, Self-organization, Reaction-Diffusion, Genetic Algorithms etc. have been playing a key role in my work and I've more or less successfully managed to use them for creating form & music. For an equally long time Genetic Programming (GP for short, a specialization of Genetic Algorithms) has been on my to-do list of techniques to use for a project, yet it never seemed sufficiently suitable until about a year ago, when I received a chance email from my friends at HOLO, asking me to create a guest design for the second issue of their “magazine” (it really should be called a book series!). Each issue of HOLO is centered around a topic in the crossover field of arts, science and technology. The topic chosen for the new issue was an “investigation of chance, predictability, and (true) randomness”, a near-perfect fit for employing some evolutionary techniques to create visuals supporting the editorial throughout the magazine.

This article is split into two parts, documenting key stages of my process for this project and attempting to extract some more general lessons from using this form of programming for design work with Clojure.

AFAIK this still is *the* Genetic Programming textbook, jam-packed with theory and lots of different implementation approaches. I found my copy in a second-hand bookstore in Boston and its discussion of LISP was one of the reasons I picked up Clojure a few years later (2011). Hint: It’s perfect for GP.

Before going into the various design routes explored, let’s first briefly discuss what Genetic Programming actually is. Many people familiar with generative design will have heard of or even played around with GP’s cousin (superset really), Genetic Algorithms (GA). Both techniques are very similar and strongly based on the Darwinian theory of evolution, which even Creationists might find interesting to ponder and experiment with. GP & GA are iterative processes: starting with a population of initially random genomes, we decode & apply them to create concrete phenotypes (individuals with specific gene expressions), which are then scored using a domain-specific fitness function. The only real differentiator between GA and GP is that genotypes in the latter are program operators and phenotypes are actual “complete” programs, which need to be run/executed before their outputs can be used for fitness scoring. In other words, GP is a form of progressive, goal-guided, automatic code generation and optimization - all in one. The most successful members of the current population are then selected for reproduction, usually involving probabilistic mutation and mating to achieve cross-over effects and mixing of genomes. The resulting offspring then forms the majority of the next population (often along with completely new random individuals to replenish the overall gene pool and to help avoid getting stuck in local minima of the often hilly optimization landscape). The process then repeats over hundreds of generations with the aim of gradually increasing the overall fitness of the population, until a threshold/target fitness has been reached or no further improvements can be found…
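
To make that loop more concrete, here is a minimal, schematic sketch of a generational loop in Clojure. All of random-genome, decode, fitness, mutate and crossover are hypothetical placeholders for the domain-specific parts discussed above:

```clojure
;; Schematic GA/GP generation loop - the declared functions below are
;; placeholders to be supplied per problem domain.
(declare random-genome decode fitness mutate crossover)

(defn score-population
  "Decodes each genome into its phenotype and attaches a fitness score."
  [population]
  (map #(assoc % :fitness (fitness (decode (:genome %)))) population))

(defn next-generation
  "Keeps the fittest individuals, breeds children via crossover + mutation
  and tops the pool up with fresh random genomes."
  [{:keys [pop-size elite-ratio random-ratio]} scored]
  (let [elite    (take (long (* pop-size elite-ratio)) (sort-by :fitness > scored))
        fresh    (repeatedly (long (* pop-size random-ratio))
                             #(hash-map :genome (random-genome)))
        children (repeatedly (- pop-size (count elite) (count fresh))
                             #(let [a (rand-nth elite), b (rand-nth elite)]
                                {:genome (mutate (crossover (:genome a) (:genome b)))}))]
    (concat elite children fresh)))

(defn evolve
  "Iterates generations until a target fitness or generation limit is hit."
  [{:keys [pop-size max-gen target] :as opts}]
  (loop [gen 0
         pop (repeatedly pop-size #(hash-map :genome (random-genome)))]
    (let [scored (score-population pop)
          best   (apply max-key :fitness scored)]
      (if (or (>= (:fitness best) target) (>= gen max-gen))
        best
        (recur (inc gen) (next-generation opts scored))))))
```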

Since the process does not prescribe any other specifics, it’s suitable & flexible enough for a wide range of design problems and for many years has provided a fertile ground for research and experimentation, incl. arts, music, architecture & engineering, but also the nature of programming itself (see resources at the end of this article). More recent flavors of GP (e.g. Cartesian Genetic Programming), combined with modern hardware (GPUs, FPGAs), make applying this automatic programming technique more feasible than ever before, since entire populations can now be evaluated in parallel (and in hardware), potentially cutting down evolution time by orders of magnitude. That is important for use cases which rely on finding optimal solutions on-demand and in hitherto unknown situations (see second video below).

Lee Spector’s GP talk at Clojure Conj 2015 is a great intro to the field and how to use it in Clojure
Wes Faler’s talk about using Cartesian Genetic Programming to evolve camera De-Bayer filters for their moon buggy.

Defining a suitable fitness function and encoding scheme for genes can be the truly hard parts of GP/GA, though, especially if the design problem is not well defined, not easily encodable (or not yet known, as was the case for this project). Therefore, prior study of the field in which GP is to be employed is paramount and a prerequisite for success…

A section in Richard Dawkins’ “Climbing Mount Improbable” (1996) was my first general introduction to applying the concept programmatically on the computer.

Morphogen

My first step in the HOLO design process was the creation of a sufficiently flexible GP playground for my later experiments and the evolution of path-finding agents to create a (typographic) form as an initial design idea. Related to this, though a year prior, I had created the Morphogen DSL, partially as a component of my commission for the Barbican / Google DevArt exhibition. Morphogen is a small domain-specific language, written in Clojure, for defining complex 3D forms starting from a single seed shape. Each operation acts on a single input shape and produces any number of result shapes (e.g. the “subdiv” operator splits a shape along an axis into 2 or more smaller shapes). If arranged in a tree structure, these operators recursively transform e.g. a box into highly complex objects. Most interesting to me was how few operators are actually needed to create interesting objects; the current implementation has only 8 built-in operators: split, reflect, inset corners, extrude, tilt, scale face, scale edge, delete.
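
Purely as an illustration of this tree-of-operators idea (this is not the actual Morphogen syntax; the operator arguments below are invented), such a tree could be written as nested Clojure data:

```clojure
;; Illustration only - not the real Morphogen API. Each node names an
;; operator plus its arguments; child nodes are applied recursively to
;; the shapes produced by their parent.
(def example-tree
  [:split {:axis :z :parts 3}
   [:inset-corners {:d 0.05}
    [:extrude {:len 0.2}]]
   [:reflect {:axis :x}]
   [:scale-face {:face :north :scale 0.5}
    [:split {:axis :x :parts 2}]]])
```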

This structure was evolved from a single box using an operator tree ~60 levels deep and only uses five of the eight available operators (source code). Rendered with Luxrender.

The main takeaway points from this project were:

  1. Being completely code based (the DSL is merely creating syntax trees), it’s of course easy to define re-usable operator trees/sequences as ever more high-level operators.
  2. Because each generated object is tree based, its creation can be understood as recursive refinement over time and therefore easily be animated, simply by terminating tree processing at increasing depths in the hierarchy. This is both educational and fascinating to watch, and one of the nice side-effects just emerging automatically from this modeling approach. An example animation is here (and another one below).
  3. The Morphogen setup already partially satisfies the overall structure needed for a GP system (minus fitness function & reproduction): the operator tree is equivalent to the genotype and its resulting object to the phenotype. The trees can easily be crossed-over by swapping or transplanting branches between two (or more) individuals and/or can be mutated by randomly replacing/modifying individual nodes/locations in the tree (see the sketch after this list)… This is also the reason why tree-based GP in general is still one of the most popular approaches.
  4. The Morphogen operators all use the same type & arity (always a single shape) and thus are easily interchangeable and simplify the cross-over & mutation steps.
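
As a minimal sketch of the subtree swapping mentioned in point 3 (here using Clojure’s built-in clojure.zip; depth limits and operator-specific constraints are omitted):

```clojure
(require '[clojure.zip :as z])

(defn random-loc
  "Returns a random (non-root) zipper location of a vector-based tree.
  Assumes the tree has at least one child node."
  [tree]
  (->> (z/vector-zip tree)
       (iterate z/next)
       (take-while (complement z/end?))
       rest
       vec
       rand-nth))

(defn crossover
  "Transplants a random subtree of parent-b into a random location of
  parent-a. Minimal sketch: no depth limits or operator arity checks."
  [parent-a parent-b]
  (-> (random-loc parent-a)
      (z/replace (z/node (random-loc parent-b)))
      z/root))
```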
Animation created using partial evaluation of the object’s operator tree with increasing tree depth limit over time.
The operator tree of the above Morphogen object with the initial seed shape transformation shown at the top.

Path finding

So, mixing ideas from Morphogen, LOGO (the turtle concept) and GP ant colony simulations (food locations), I set out to define a LOGO-like mini “language” to evolve path-finding agents whose genome consists of only these three different operations:

  • if-target(dist) - hard-coded conditional with true & false branches, checks if the agent is currently sufficiently close to the target path and hasn’t been in this vicinity previously
  • mov(dist) - move dist units forward (in the current direction)
  • turn(theta) - rotate theta degrees (counter)clockwise, no movement

Clojure, with its rich set of built-in data types, sequence abstractions and polymorphic functions (here: defmulti), is a great environment to implement an interpreter for such language experiments without much ceremony. The program tree (AST) is represented as a nested S-expression using simple Clojure vectors and symbols. For cross-over and mutation operations I used the fast-zip library, a faster, drop-in replacement for Clojure’s built-in clojure.zip namespace. A zipper allows us to traverse a tree structure in a linear manner, whilst also supporting ascend/descend and sibling navigation at any point. That makes it very easy to pick a random tree node/branch and edit/replace it in a functional manner. And since we’re dealing with immutable data, this “editing” is of course non-destructive and would be quite involved without using zippers. Btw, Tommy Hall has a great introduction to zippers (and GP using S-expressions).
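
To give a rough idea of how little ceremony is involved, here’s a condensed, simplified sketch of such a defmulti-based interpreter (not the actual project code; the agent state map and the near-target? stand-in are assumptions):

```clojure
;; Sketch of a defmulti-based interpreter for the three ops above.
;; An agent here is just {:pos [x y] :theta angle :target [x y]};
;; the real project state & vicinity test are more involved.
(declare exec-program)

(defn near-target?
  "Simplified stand-in for the vicinity test described in the text."
  [{:keys [pos target]} dist]
  (< (Math/hypot (- (pos 0) (target 0)) (- (pos 1) (target 1))) dist))

(defmulti exec-op (fn [_agent op] (first op)))

(defmethod exec-op 'mov
  [{:keys [theta] :as agent} [_ dist]]
  (update agent :pos
          (fn [[x y]] [(+ x (* dist (Math/cos theta)))
                       (+ y (* dist (Math/sin theta)))])))

(defmethod exec-op 'turn
  [agent [_ delta]]
  (update agent :theta + (Math/toRadians delta)))

(defmethod exec-op 'if-target
  [agent [_ dist then-branch else-branch]]
  (exec-program agent (if (near-target? agent dist) then-branch else-branch)))

(defn exec-program
  "Runs a program (a vector of op vectors) against an agent state map."
  [agent program]
  (reduce exec-op agent program))

;; (exec-program {:pos [0 0] :theta 0 :target [50 0]}
;;               '[[mov 15] [turn 45] [if-target 10 [[mov 5]] [[turn -45] [mov 20]]]])
```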

A randomly generated agent program with applied peephole optimization, cutting down code size by up to ~30%.

Each generation consisted of 128 such agent programs, with many growing to the enforced limit of 2000 operations each. To keep code size down, I experimented with applying a simple peephole optimization to coalesce successive ops of the same type (e.g. the sequence “mov(15), mov(12), turn(45), turn(-45), mov(10)” could be rewritten as a single “mov(37)” - the two turns cancel each other out and can be removed). Even though this worked and noticeably cut down on execution time for single generations, it also had a negative impact on the overall fitness progress. I never spent enough time trying to figure out why, but speculate it has to do with the smaller overall surface area available for mutations and cross-over. “Junk DNA” is important indeed. If I ever get another chance in the future, I would like to implement ADFs (Automatically Defined Functions), which seem to not only provide better code re-use and support the emergence of higher-level operators, but can also be scored for their impact on fitness independently (as a unit).
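
Such a peephole pass can be expressed as a simple fold over the op sequence, repeated until a fixpoint is reached (a minimal sketch, ignoring nested if-target branches):

```clojure
(def ^:private coalescable '#{mov turn})

(defn peephole-pass
  "One pass: merges adjacent ops of the same (coalescable) type and drops
  resulting no-ops such as turn(0) or mov(0)."
  [ops]
  (->> ops
       (reduce
        (fn [acc [op arg :as node]]
          (let [[prev-op prev-arg] (peek acc)]
            (if (and (coalescable op) (= prev-op op))
              (conj (pop acc) [op (+ prev-arg arg)])
              (conj acc node))))
        [])
       (remove (fn [[op arg]] (and (coalescable op) (zero? arg))))
       vec))

(defn peephole
  "Repeats the pass until a fixpoint, so that removed no-ops let their
  neighbours merge, e.g. the mov/turn/mov example from the text."
  [ops]
  (->> (iterate peephole-pass ops)
       (partition 2 1)
       (drop-while (fn [[a b]] (not= a b)))
       ffirst))

;; (peephole '[[mov 15] [mov 12] [turn 45] [turn -45] [mov 10]])
;; => [[mov 37]]
```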

Basic path finding agent setup, random AST generator and interpreter (GP parts omitted for brevity)

The following animation shows a single simulation run of the path finding process, only showing the fittest agents of each generation:

946 generations of evolving path finding agents with the fittest program of each generation highlighted in red.

As part of the evolution, the top 10 paths of each generation were recorded and a subset of these later visualized together in a single image, thus showing the gradual disappearance of completely random walks and the appearance of the ever more complete target path over time. I used thi.ng/geom’s SVG module to export the raw paths, then used the auto-spline feature to smooth out kinks in the paths and the Parallel Transport Frames extruder to convert each 2D path into a 3D mesh tube for rendering. In the image below, a path’s generation (in terms of evolution) is mapped to its elevation, with the 1st generation at the bottom. In order to create a smooth workflow, to do as much as possible directly from the Clojure REPL and to avoid repetitive manual work for dozens of outputs, I used thi.ng/luxor to generate render scene templates for Luxrender with model, camera, light groups & material setups and could then produce hi-def renders straight from the REPL. To free up my laptop, rendering happened on dynamically spawned 16-core EC2 instances (relatively straightforward to achieve via Michael Cohen’s Amazonica library).

Selection of the fittest paths of 20 generations (from a total set of 1438 generations)

Since GP consumes an astonishing amount of random numbers, one late night last February I had the idea to build rnd.farm, a little websocket-based tool & web API to harvest randomness from human user interactions (mouse/touch & key events and timings) and inherently unpredictable network lag. Whilst producing nowhere near enough entropy for crypto applications, it was a nice opportunity to involve readers of the magazine in the process, and we collected close to 2 million numbers. Each user session collects up to 8KB of raw data, but also builds a stream of SHA-256 digests of chunks of 512 raw numbers to achieve further bit scattering. Later experiments in my process were all based on these collected random number streams.
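
The digest step is essentially the following (a minimal sketch of the idea using the JVM’s built-in MessageDigest, not the actual rnd.farm code):

```clojure
(import 'java.security.MessageDigest)

(defn sha256-hex
  "Hex-encoded SHA-256 digest of a byte array."
  [^bytes bs]
  (let [md (MessageDigest/getInstance "SHA-256")]
    (apply str (map #(format "%02x" %) (.digest md bs)))))

(defn digest-chunks
  "Partitions a raw number stream into chunks of 512 values and digests
  each chunk, scattering the collected bits further."
  [raw-numbers]
  (->> raw-numbers
       (partition 512)
       (map #(sha256-hex (byte-array (map unchecked-byte %))))))
```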

rnd.farm screenshot, Clojure/Clojurescript project source: https://github.com/postspectacular/rnd.farm

“It is a misconception, based on the stereotype of a Turing machine as executing a prearranged program one step at a time, to assume that Turing believed that any single, explicitly programmed serial process would ever capture human intelligence in mechanical form.” - George Dyson

Reading George Dyson’s “Darwin Among the Machines” at about the same time, I found it an especially good complement to what I was doing. Its anecdotal coverage of the history of computing machines and its contrasting of parallels and philosophical arguments in biology (e.g. Darwinian evolution vs. endosymbiosis) also led me to learn more about Nils Aall Barricelli, one of (if not the) first pioneer(s) of GP and computer-based artificial life simulations.

What followed next in my process was a long exploration, research and partial reconstruction of Barricelli’s approach, mixed with an additional second layer of GP to actually breed replication rules & functions. However, all of this will be discussed in detail in the upcoming second part of this article…

A book I wish I had read 10 years earlier: a fascinating and eye-opening treatise on the history of computing since the 17th century and a discussion of its parallels in biology/science/fiction. Highly recommended reading!

Tissue

Signed Distance Functions (SDFs) are one of the most exciting topics/techniques in shader programming. Popularized by demoscener iq and others, SDFs can be used not just to represent many different & complex geometries, but also to manipulate our understanding of space itself (by re-interpreting the produced distance values).

Fairlight & Alcatraz’ “Uncovering Static” demo (64KB, 2011) was a pioneering production using SDFs for mesh (re)construction. Matt Swoboda (Smash of Fairlight) also has done a few highly informative talks about realtime applications of this topic, slides here.

Fundamentally, an SDF is a mathematical function which returns the signed distance between a point and the surface of an object. For a sphere, the function would return the distance between the point and the sphere’s origin, minus the radius. If the distance is negative, the point is inside; if positive, outside; and if zero, it lies on the surface itself. Whereas SDFs are mostly used for realtime GPU rendering in pixel shaders, my main interest in them so far has been in defining voxel geometries and constructing mesh files from those for later offline processing (e.g. for 3D printing).
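
In plain Clojure, the sphere example amounts to a couple of lines (a minimal sketch, independent of any library):

```clojure
(defn sd-sphere
  "Signed distance from point p to the surface of a sphere: negative
  inside, zero on the surface, positive outside."
  [p center radius]
  (- (Math/sqrt (reduce + (map (fn [a b] (let [d (- a b)] (* d d))) p center)))
     radius))

;; (sd-sphere [0.0 0.0 0.0] [0.0 0.0 2.0] 1.0) => 1.0 (one unit outside)
```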

The mesh in the image above has been created with the handful of ready-to-run Clojure functions below (using various pieces of the thi.ng/geom library), then rendered in Blender. Instead of using the regular sphere distance function, the “sd-nested-sphere” function re-interprets & warps the space within the sphere, using the modulo of the distance and some arbitrary step value, and thus creates the strange repetitions. The fractured patterns on the surface are sampling artifacts caused by the voxel grid itself, with the function only being evaluated once for each cell (voxel) in the 3D grid: if the absolute distance from the cell center is less than a given small threshold (wall thickness), the voxel is set, else left empty. The final transformation is to extract an isosurface mesh from that voxel space and export it (here in Stanford PLY format). The following functions achieve all that (far from realtime though):
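
A rough, library-free sketch of the voxelization step, re-using the sd-sphere from above (the actual project uses thi.ng/geom’s voxel & isosurface namespaces; the exact warping formula in sd-nested-sphere below is a guess based on the description):

```clojure
(defn sd-nested-sphere
  "Warps the space inside a sphere by taking the modulo of the distance
  with a step value, producing repeated shells (assumed formula)."
  [p center radius step]
  (let [d (sd-sphere p center radius)]
    (if (neg? d) (- (mod (- d) step) (* 0.5 step)) d)))

(defn voxelize
  "Evaluates an SDF once per cell of an n^3 grid (unit cube) and returns
  the voxel coordinates whose absolute distance is below the wall thickness."
  [sdf n wall]
  (set
   (for [x (range n) y (range n) z (range n)
         :let [p (mapv #(/ (+ % 0.5) n) [x y z])]   ; cell center in unit cube
         :when (< (Math/abs (double (sdf p))) wall)]
     [x y z])))

;; e.g. a 64^3 grid of nested shells inside a sphere of radius 0.45:
;; (voxelize #(sd-nested-sphere % [0.5 0.5 0.5] 0.45 0.1) 64 0.01)
;; ...followed by isosurface extraction & PLY export in the real project.
```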

With the GP setup from earlier extended with more math (and vector math) operators, the next step was breeding random SDFs, applying them to voxels, and myself playing the fitness function, judging results purely aesthetically and manually choosing the “fittest” individuals of each generation. I also introduced two other concepts in order to create more varied results:

  1. Layers - The voxel space is divided along the Z-axis into slices and grouped into cells (see next point) which are evolved individually. The final isosurface of the whole voxel space still produces only a single mesh though, and the interfaces between the different layers (and cells) produce interesting effects/geometries.
  2. Macro cells - To break the “boring” cube shape of the voxel grid/space, a random polygon mask is applied to each voxel layer. The polygon itself is created using (unconstrained) Delaunay triangulation of a random set of points and therefore is always convex. To create more interesting (concave) shapes, a number of border triangles (never internal ones) are then removed, also randomly (btw, a border triangle in a non-closed mesh has at least one edge with only a single face assigned; see the sketch after this list). For each remaining triangle cell, a random SDF is generated and evolved.
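
Detecting those border triangles boils down to counting how many faces share each edge (a minimal sketch, with triangles given as triples of vertex indices):

```clojure
(defn- edges-of [[a b c]] [#{a b} #{b c} #{c a}])

(defn border-triangles
  "Returns the triangles of a non-closed triangle mesh which have at least
  one edge shared with no other face."
  [triangles]
  (let [edge-count (frequencies (mapcat edges-of triangles))]
    (filter (fn [t] (some #(= 1 (edge-count %)) (edges-of t))) triangles)))

;; For a triangle subdivided into four, only the central one is internal:
;; (border-triangles [[0 3 5] [3 1 4] [5 4 2] [3 4 5]])
;; => ([0 3 5] [3 1 4] [5 4 2])
```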
A single layer of “tissue”, showing only minor differences in genes (and resulting SDFs) for each macro cell (triangles). Each layer has its own base genome, which cells inherit and only mutate.
Two layers, each with different sets of macro cells and their randomly evolved SDFs.
3 tissue layers.
5 layers
7 layers

For the last two images, two other special layer types were introduced: one using a hardcoded SDF to the edges of randomized & subdivided icosahedron mesh constellations; the other a “meta layer” which intersects all other layers and uses an SDF to a random spline path.

To dial down the possibly too “organic” aesthetic for the HOLO project, another variation on this theme used randomly bred Cellular Automata instead of SDFs to fill the voxel space and also skipped computing the isosurface, creating an altogether more crystalline aesthetic…

As indicated by the title of this article, repetitive failure is the core principle of any evolutionary process, and so too in our (somewhat evolutionary) design process: all the experiments discussed so far didn’t pass the ultimate fitness function of this project. My sincere thanks go out to the HOLO team though, who proved to be super supportive during all stages!!! Thank you, thank you, guys!

In the next part of this article (coinciding with the release of the magazine) we will focus on the research & steps leading to the chosen design route for HOLO and also explore how Genetic Programming can be an interesting approach when combined with other evolutionary techniques like Cellular Automata, even in other (non-visual) disciplines.

A simple demo of using an extended system of 1D Wolfram Cellular Automata to play notes/chords on a predefined musical scale. This example is running on an STM32F4 dev board. Additional CAs could be used to create key & rhythm changes or accompanying melody lines.

To get a sneak peek of the upcoming HOLO issue and learn more about the magazine, visit:

thi.ng is a collection of libraries for creative computing (for Clojure, WebGL, OpenGL/OpenCL, and soon more). Please visit workshop.thi.ng to find details about upcoming intensive training workshops:

  • next Clojure session: Berlin, early Feb 2016
  • next ARM embedded device audio coding: London, 23–24 Jan 2016

In addition to the books mentioned above, other Genetic Programming resources I found helpful:

A closing thought from George Dyson’s book:

“Intelligence would never be clean and perfectly organized, but like the brain would remain slippery and disordered in its details. The secret of large, reliable, and flexible machines, as Turing noted, is to construct them, or let them construct themselves, from large numbers of individual parts - independently free to make mistakes, search randomly, and generally act unpredictably so that at a much higher level of the hierarchy the machine appears to be making an intelligent choice”
