A Brief Review of Analog Synthetic Circuits & Analog Computation in Living Cells
Returning to an ‘old school’ model of computation is the future
This is a review article that briefly goes over different aspects of this topic. Just a couple of quick points:
— circuits // when I refer to circuits, it’s based on the analogy of comparing cellular pathways to electronic circuits. These ‘circuits’ create a chain reaction of events that lead to the ‘programmed’ biological response to occur.
— computation // even though I explain computation as the ability to perform calculations, cells perform computation that goes beyond the conventional view (e.g. deciding whether or not to divide)
An Overview of Synthetic Biology
In the past decades, synthetic biology has made incredible leaps forward: the development of synthetic circuits in 2000, the creation of the first synthetic lifeform in 2010, and, just recently, the engineering of E. coli able to perform carbon fixation and the synthetic production of cannabinoids. Synthetic biology is the redesign of existing organisms, or the creation of novel ones, by engineering them to have new abilities and/or functions. Deplazes, a bioethics researcher who has written several papers on the ethical implications of synthetic biology, divides such scientists into two groups:
- creating artificial life — using unnatural molecules to reproduce emergent behaviors from natural biology
- unnatural systems — taking interchangeable parts from natural biology and assembling systems that perform unnaturally
Biocomputing fits into the second category, as it uses biological materials (e.g. cells, DNA, proteins) to perform computation, where computation can be thought of as the ability to perform calculations. While there is a multitude of paths within biocomputing research, this article will focus specifically on synthetic circuits.
The Current Paradigm of Synthetic Circuits
To start off, let’s look at the definition of synthetic circuits:
Synthetic biological circuits are an application of synthetic biology where biological parts inside a cell are designed to perform logical functions mimicking those observed in electronic circuits.
When I first started learning about this concept, I found it fascinating how we could essentially be ‘programming’ specific behaviors into cells where the presence of specific components could stimulate a specific biological response.
However, this directly reveals the current problem with how we talk about synthetic circuits in the first place. If we go back to the definition, it says cells are "designed to perform logical functions mimicking those observed in electronic circuits." Logical functions are based on the concept of logic gates: each gate implements a function whose rules determine the output for a given set of inputs.
In principle, the analogy is based on the idea that cells are computers, since they can take in inputs, process them, and create outputs.
While this analogy is helpful for contextualizing and understanding the concept of engineering synthetic circuits, it's imperfect. No matter how many engineering principles we bring into biology, biology still has its inherent messiness and randomness. It's similar to how highly diverse cities often have their own versions of Chinatown, Little Italy, etc.: while these neighborhoods echo aspects of the cultures they're named after, they are not complete representations of them.
And that’s not necessarily a bad thing. I think that the addition of engineering principles into biology has led to a new paradigm shift that is creating new innovations and applications that can impact the world positively. However, we have to be careful not to abstract too far, where we think that biological circuits can directly imitate and compete with traditional silicon computers. This model will only limit us in the future, as we are restricting the potential of biocomputing, where instead of harnessing the natural abilities of the cell, we are suppressing them.
There is more than one path to computing
When we hear the word ‘digital’, the first association most of us make is ‘computer’. And why wouldn’t we, considering how we call all our electronics ‘digital devices’ because they are able to perform computation? Even so, viewing this as the only model of computation inhibits our ability to go beyond the limits our classical computers currently afford.
Digital computing is computation over discrete values such as 0 and 1. Inputs are processed into outputs by logic gates, the logical functions that form the basis of all digital systems. The rules of each gate determine the output; for instance, an AND gate outputs 1 only when both of its inputs are 1.
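The gate rules above can be sketched as ordinary functions. This is just a toy illustration in Python, not a biological implementation:

```python
# Toy logic gates: each gate maps discrete 0/1 inputs to a
# discrete 0/1 output according to its rule.

def and_gate(a, b):
    return 1 if a == 1 and b == 1 else 0

def or_gate(a, b):
    return 1 if a == 1 or b == 1 else 0

def not_gate(a):
    return 0 if a == 1 else 1

print(and_gate(1, 1))  # 1: both inputs are 1
print(and_gate(1, 0))  # 0: one input is 0
```

Chaining these gates together is exactly how digital systems build up arbitrarily complex functions from a few simple rules.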
In contrast, analog computing is the creation of a model that represents a problem to be solved, where the model can take in continuous inputs and process them in parallel. Inputs can be electrical, mechanical, hydraulic, or biological. Instead of having distinct input/output values, analog computing allows for a range of values between 0 and 1.
Contrary to popular belief, the main difference between analog and digital computing is NOT the ability to take in discrete vs. continuous values, as some analog computers can exhibit digital behavior.
In fact, the difference lies in the structure: analog computers have a reconfigurable internal structure that can be changed to best fit the problem at hand, while digital computers have a fixed structure that solves problems sequentially:
[analog computers] its internal structure is not fixed — in fact, a problem is solved on such a machine by changing its structure in a suitable way to generate a model, a so-called analog of the problem. This analog is then used to analyze/simulate the problem to be solved.
~Analog Computing by Bernd Ulmann
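To make this idea concrete, here's a minimal sketch (in Python, with a made-up rate constant k = 0.5 and initial value x(0) = 1) of how an analog machine "becomes" the problem dx/dt = -kx: an integrator whose output, scaled by -k, feeds back into its own input. We simulate that feedback loop with tiny time steps:

```python
import math

# To solve dx/dt = -k*x, an analog computer is wired as an
# integrator whose output, scaled by -k, feeds back into its own
# input. The wiring itself is the model (the 'analog') of the
# problem. Here we simulate that feedback loop step by step.
k, dt = 0.5, 0.001   # made-up rate constant and step size
x = 1.0              # initial condition x(0) = 1

for _ in range(int(2.0 / dt)):  # run the 'machine' for 2 seconds
    x += (-k * x) * dt          # feedback wire + integrator

print(round(x, 2))                   # simulated result at t = 2
print(round(math.exp(-k * 2.0), 2))  # exact solution e^(-k*t)
```

Solving a different equation means re-wiring the feedback, not rewriting a sequential program; that's the structural difference Ulmann is describing.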
And when we look at biology, especially cells, it’s apparent that they mostly follow an analog model of computation. Cells are incredibly sensitive to their environment, with the slightest changes in pH, temperature, ATP concentrations, etc. impacting homeostasis and cell structure. These elements are in constant flux, allowing cells to be highly dynamic systems that are able to adapt and respond to volatile environments.
As a result, digital synthetic circuits are highly inefficient and unable to scale to the computational complexity we desire. With analog circuits, we could be more energy-, time-, and resource-efficient. Since we no longer need to discretize signals into strictly 0s and 1s, we can reduce the number of parts needed to run a computation.
Fewer Parts => Fewer Resources => Less Energy => Less Time
Fewer Parts => More Space => More Complex Computation
Additionally, with more available space, we have the freedom to add more parts if necessary to increase computational complexity.
While digital systems are great for detecting the presence of specific inputs and regulating gene expression, analog systems add a whole new level of possibility. Analog systems can take in continuous signals over a wide range of inputs, providing the opportunity to monitor a system's performance over a given time frame.
This can be incredibly beneficial for future applications in medicine, the environment, bioproduction, etc. For example, if we had a colony of bacteria and introduced an input that can cause mutations, we could monitor how those mutations accumulate over time.
Within this example, let's say that the bacteria can be activated using light as an input. Light activates a chain reaction that leads to the bacteria acquiring a specific mutation. There are two ways we can hasten the accumulation of mutations in the system:
- using a brighter light (magnitude)
- shining the light on the system for a longer period of time (duration)
Based on the proportion of bacteria in the population that have these mutations relative to those that do not, we can work backwards to determine the magnitude and duration of the input.
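As a rough sketch of this idea, here's a hypothetical toy model (all parameters made up, not taken from any real experiment) where the mutated fraction depends only on the product of light intensity and exposure duration, so magnitude and duration trade off against each other:

```python
import math

# Hypothetical toy model: each bacterium picks up the mutation at a
# rate proportional to light intensity I, so the mutated fraction
# after time t is 1 - exp(-r*I*t). 'r' is a made-up constant.
def mutated_fraction(intensity, duration, r=0.1):
    return 1.0 - math.exp(-r * intensity * duration)

# Magnitude and duration trade off: brighter-but-shorter gives the
# same readout as dimmer-but-longer.
print(round(mutated_fraction(2.0, 5.0), 3))   # intensity 2 for 5 h
print(round(mutated_fraction(1.0, 10.0), 3))  # intensity 1 for 10 h

# Working backwards: given an observed fraction and a known
# intensity, recover how long the light was on.
def infer_duration(fraction, intensity, r=0.1):
    return -math.log(1.0 - fraction) / (r * intensity)

print(round(infer_duration(0.5, 1.0), 1))  # time needed to reach 50%
```

Note that because intensity and duration only enter as a product in this toy model, recovering one requires knowing (or fixing) the other.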
Developing Analog Biological Systems
When developing synthetic circuits, synthetic biologists use the Design-Build-Test-Learn model. It's an iterative process that allows the bioengineer to improve their current design using information gained at each stage. In the first stage, once a challenge is chosen, researchers brainstorm different approaches and decide on which one would best solve the challenge.
From there, they can start conceptualizing what type of design they want to build.
For analog computing, one way you can do that is by using differential equations. Differential equations are a mathematical tool for representing change within a system without going into the specifics of how those changes occur.
For instance, let's say you are traveling all over a new city, visiting all the dazzling attractions and trying out the gourmet cuisine. You could represent your location for one of the days by saying: A –> B –> G –> C. This simple flowchart explains how your location changed that day as you navigated the city, but it doesn't say how you got from place to place: whether you walked, took public transportation, and so on.
Deriving your differential equations requires calculus (which I haven't learned yet, so this is just what I can understand of it). Based on the scenario, you can build a chemical reaction network, which describes the chemical reactions that occur in a process. In our previous analogy, the first reaction in the network would be A –> B. Once you write down these reactions, you can use mass action equations (which relate a reaction's rate to the concentrations of its reactants) to derive your differential equations.
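As a sketch of how this works, take the toy network A –> B –> G with made-up rate constants: mass action turns each arrow into terms in the differential equations, which we can then integrate numerically in Python:

```python
# Toy chemical reaction network A -> B -> G with made-up rate
# constants. Mass action says each reaction runs at a rate
# proportional to its reactant's concentration, giving:
#   dA/dt = -k1*A
#   dB/dt =  k1*A - k2*B
#   dG/dt =  k2*B
k1, k2, dt = 1.0, 0.5, 0.001
A, B, G = 1.0, 0.0, 0.0   # start with everything as species A

for _ in range(int(10.0 / dt)):  # integrate for 10 time units
    dA = -k1 * A
    dB = k1 * A - k2 * B
    dG = k2 * B
    A, B, G = A + dA * dt, B + dB * dt, G + dG * dt

print(round(G, 2))          # most mass has flowed through to G
print(round(A + B + G, 2))  # total mass is conserved: 1.0
```

Each arrow contributes one negative term (to its reactant) and one positive term (to its product), which is why mass is conserved across the whole network.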
These differential equations can be inputted into analog compilers, which essentially convert your differential equations into circuit simulations that you can run your experiment on. It’s similar to how digital compilers convert code into something the computer can understand and execute. From there, it can be built in a lab and tested.
Caveats, as all systems are imperfect
As promising as analog sounds, there are still some drawbacks. For instance, analog systems are more susceptible to noise, which is any disturbance to the signal from external factors; in biological systems, that could be fluctuating protein expression levels. Noise can lead to fluctuation and/or degradation of output signals. Because of this susceptibility, there is a trade-off between efficiency and precision.
Additionally, while analog systems provide robust final outputs, they don't restore signals at each step of the computation. For example, if your final goal was to get from Point A to Point G, you wouldn't know what the intermediate steps in between were.
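Here's a toy simulation (Python, with a made-up noise level) of why digital stages restore signals while analog stages accumulate noise:

```python
import random
random.seed(0)  # reproducible noise

# Push a signal through 20 noisy stages. The digital chain snaps the
# value back to 0 or 1 after every stage (signal restoration), so
# noise never accumulates; the analog chain carries every stage's
# noise forward. The noise level (std dev 0.05) is made up.
def noisy(x):
    return x + random.gauss(0, 0.05)

analog, digital = 1.0, 1.0
for _ in range(20):
    analog = noisy(analog)                          # noise accumulates
    digital = 1.0 if noisy(digital) > 0.5 else 0.0  # restored each stage

print(digital)                      # still exactly 1.0
print(round(abs(analog - 1.0), 3))  # drift accumulated by analog path
```

This is the efficiency/precision trade-off in miniature: restoration costs extra "parts" at every stage, but it's what keeps long digital computations exact.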
Scientists have already successfully created living cells that are capable of performing analog computations. The first group to demonstrate complex analog computation created circuits that could perform math functions such as logarithms, addition, subtraction, division, and power laws.  Another group created analog DNA circuits that utilized strand displacement to perform computation where simulations were used to model expected gate behavior. These gates were able to perform addition, subtraction, and multiplication. 
Analog circuits are also being designed to have memory functions. In 2014, Farzadfard et al. developed SCRIBE, a strategy that creates cellular 'memory' by writing past cellular events into DNA. In brief, external signals (e.g. light or biomolecules) stimulate the production of ssDNA in cells; the ssDNA can then write mutations into the genome via recombination, encoding this information. Through this method, they were able to use the encoded memory to determine the duration and magnitude of the input present.
Additionally, progress is being made on hybrid models of computation, which use both analog and digital processing of signals. Rubens et al. developed mixed-signal circuits containing analog-to-digital comparators (components that convert analog signals into digital ones).
Also, our ability to design and simulate these systems is improving with the creation of new tools. For instance, a group from MIT developed Arco, a solver that is able to generate a circuit hardware design based on the inputs (differential equations) it is given. More recently, Medley et al. developed a compiler that transforms chemical reaction networks into a cytomorphic chip configuration. Cytomorphic chips are cell-inspired electronic chips whose analog circuitry mimics the dynamics of biological components (e.g. protein networks), allowing them to simulate biological systems.
*** For the purpose of keeping this article strictly an overview, I only mentioned these examples in passing. However, I do plan on writing future articles that dive deeper into these areas!
In order to continue to push the needle forward, there are still some challenges that need to be overcome:
- Improving the theoretical framework for constructing analog circuits — more work is needed to understand the mechanics of the cell and how we can best take advantage of them to perform computation.
- Implementation of fuzzy analog signals — a fuzzy analog signal is any value between 0 and 1. Characteristics we want to evaluate in a cell (e.g. noise, expression level, etc.) operate in that in-between range in a time-based manner. Improving our understanding of how to build for them will help us move toward constructing circuits of greater complexity.
- Analog hardware is not modular — when modeling biological systems with analog computing, it's difficult to reuse analog hardware to run different types of pathways and processes. That means we often have to build unique hardware for each one, which is time-consuming and inefficient.
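For a sense of what a fuzzy signal between 0 and 1 might look like, one common modeling choice (my illustration here, with made-up parameters) is the Hill function, which maps a continuous input like an inducer concentration to a graded response:

```python
# The Hill function is a standard way to model a graded ('fuzzy')
# biological response: a continuous input (e.g. inducer
# concentration) maps smoothly onto an output between 0 and 1.
# K (half-maximal input) and n (steepness) are made-up values here.
def hill(x, K=1.0, n=2):
    return x**n / (K**n + x**n)

for x in [0.0, 0.5, 1.0, 2.0, 10.0]:
    print(x, round(hill(x), 2))  # output rises smoothly from 0 toward 1
```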
Ultimately, the idea of switching completely to analog systems isn’t the end goal. Biological systems are examples of analog-digital systems of computation, so if we want to harness their full potential, that requires moving towards a hybrid model of computation.
The main reason why I focused on analog circuits is that there is so much potential for research and improving our ability to execute analog computing.
Further progress in developing analog and hybrid biological systems will allow for advancements in creating diagnostic & therapeutics, bioremediation, bioproduction, developing wetware, and far more!
Analog Computer Returns — YouTube (even though it focuses on analog electronics, the concepts are transferable)
Analog Computing by Bernd Ulmann (you can sign up for a free trial to access the book)
Synthetic Biology: An Emerging Engineering Discipline — YouTube (22:19 is when he starts talking about analog circuits, but I recommend watching the whole thing)
Hi! My name is Maggie and I am a 16-year-old looking to impact the world through emerging biotech. At the moment, I’m exploring fascinating topics such as biocomputing, philosophy, ethics, and climate change.
If you got to the end, thank you for reading my article! Feel free to connect with me on LinkedIn or sign up for my personal newsletter if you would like to receive monthly updates on my biocomputing journey!