Equilibrium: March 25, 2018 Snippets
As always, thanks for reading. Want Snippets delivered to your inbox, a whole day earlier? Subscribe here.
This week’s theme: the fundamental way that cells and living systems operate differently from computers and digital systems. Plus Relativity Space and CommonBond both sign deals for the long haul.
Welcome back to our Snippets series on synthetic biology, where we’re thinking about biology as a platform for value creation that may rival software in its size and scope over the next century. Last week, we covered a few ways in which cells and living systems are like computers: cells process organic matter just like computers process information; they’re both extendable, abstractable platforms; and they can both scale to an almost limitless degree. But as I’m sure you’re already thinking, the story only really begins when we start to talk about how biology and computers are different from one another.
This week, we’re going to lay the groundwork for a fundamental (and, to many of you, new) way to think about how cells process information and matter, and why it’s so different from how digital systems like computers operate. And then next week, we’ll learn why that’s so important: how the properties that make cells difficult to “program” and hard to work with also make them incredibly dynamic and resilient, in ways that computers can never match.
Let’s think about the very basics of what a cell is. We’ll start by thinking of our cell just as a bag of water, with stuff in it. There’s a very limited set of stuff: let’s say there’s only one protein we care about, called Protein A, along with the gene that specifies the instructions for how to make it, which we’ll call gene x. Let’s also assume that the cell contains the basic machinery that can take the instructions in gene x as input and manufacture A as output, along with all of the basic building blocks required. We can think of the snapshot-in-time state of our cell as the current amount of A present, the rate at which x is making new proteins out of building blocks, and the rate at which A molecules disintegrate as they randomly break back down into their building blocks over time.
Let’s consider a simple example. Say you turn on the gene to make Protein A, and you start making copies at some rate, say 100 copies of A per minute. What happens to the amount of A in the cell over time?
You can see that the amount of A inside the cell climbs quickly at first (nearly 100 new copies per minute), but then levels off at a plateau. How come? We never told the cell to stop making A, so what’s going on? The key is that the greater the concentration of A in the cell, the faster A degrades. When there’s no A present in the cell at all, the amount of A degrading is zero, of course. As we add some A, the rate at which A degrades into [not A] rises too. And at some point, we’ll reach a level at which the forward process of creating A and the reverse process of breaking down A meet at an equilibrium. The amount of A inside the cell will therefore hold constant.
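You can sketch this plateau yourself in a few lines of code. Here’s a minimal simulation of the toy cell described above: A is produced at a constant rate and degrades in proportion to how much is already present. The production rate (100 copies per minute) comes from the example; the degradation constant `alpha` is an assumed illustrative value, not anything from real biology.

```python
beta = 100.0   # production: copies of Protein A made per minute
alpha = 0.1    # degradation: fraction of existing A that breaks down per minute
dt = 0.01      # simulation time step, in minutes

A = 0.0
for step in range(int(120 / dt)):      # simulate two hours
    A += (beta - alpha * A) * dt       # net change: production minus degradation

print(round(A))  # ≈ 1000: the plateau where production and degradation balance
```

Notice that the plateau lands at beta / alpha: the level where 100 copies made per minute exactly equals 10% of 1,000 copies degrading per minute.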
Now what happens if we stop making A? You guessed it: the concentration of A in the cell initially decreases quickly, but then slows down: there’s less A present, so the rate at which A disintegrates slows down too.
(In practice, we describe these rates using ordinary differential equations (ODEs), which hopefully many of you remember from your college math courses: in any situation where the rate of appearance or disappearance of A is a function of how much A is already there, then you’re dealing with ODEs.)
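For the mathematically inclined, the toy model above is a single ODE, dA/dt = beta − alpha·A, and (starting from zero) it has the closed-form solution A(t) = (beta/alpha)·(1 − e^(−alpha·t)). Here’s a hedged sketch checking that solution against the same illustrative numbers used above (again, beta and alpha are assumed values, not from the newsletter):

```python
import math

beta, alpha = 100.0, 0.1   # production rate and degradation constant, per minute

def A(t):
    """Amount of Protein A at time t (minutes), starting from A = 0."""
    return (beta / alpha) * (1 - math.exp(-alpha * t))

print(round(A(0)))    # 0: no A yet
print(round(A(120)))  # 1000: effectively at the equilibrium level beta / alpha
```

The exponential term is exactly the “starts fast, then slows down” behavior described above, in both directions: it governs the approach to the plateau and, with production switched off, the decay back toward zero.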
Whenever we say “A is present at a certain amount”, or even more generally “A has some set value”, what we mean is that the rate at which A is created is the same as the rate at which A is gotten rid of. Everything is an equilibrium. If you can understand this basic idea, then you’re ready to appreciate the three basic ways that cells process information, manage their internal state, and make stuff. The sum total of every single reaction happening inside a cell — proteins being made, proteins being broken down; signals being turned on, signals being turned off; molecules assembling, molecules disassembling — it’s all a giant equilibrium that reflects the sum of all of the different concentrations and rates of everything in the living system.
So why go through all of this? We’re really emphasizing this point — that the state of the cell represents the sum of all of the momentary equilibria — because now we can contrast the most fundamental difference of how cells work versus how computers work:
In a computer, the state of a program is the sum total of the discrete instructions that have been executed up until now. In a living system, the state of the cell is the sum of the current equilibrium. To a computer, “A equals 1” means, “A is represented in my onboard memory as a 1 or a 0. The last I heard, A was set to 1. Nothing since has happened. Therefore, A is still 1.” To a cell, “A equals 1” means something very different. It means something more along the lines of: “A is represented inside the cell by the concentration of some biological molecule. The equilibrium inside the cell is such that A equals 1 right now.”
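A toy sketch makes the contrast concrete (all values here are illustrative assumptions, not real biology): a computer-style value stays wherever you set it, while a cell-style value is constantly pulled back toward whatever its current production/degradation balance dictates.

```python
# Computer-style state: set once, stays put until something overwrites it.
computer_A = 1
assert computer_A == 1   # still 1, because nothing has happened since

# Cell-style state: even if we inject extra A, the equilibrium reasserts itself.
beta, alpha, dt = 1.0, 1.0, 0.01   # balance point is beta / alpha = 1.0
cell_A = 5.0                       # perturb A far above its equilibrium
for _ in range(int(20 / dt)):      # let the dynamics run for a while
    cell_A += (beta - alpha * cell_A) * dt

print(round(cell_A, 2))  # 1.0: back at the equilibrium, not the value we set
```

Setting the cell’s “variable” to 5 didn’t stick; the only way to make it stick would be to change the underlying rates themselves, which is exactly the point made below.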
This makes cells much harder to work with in some respects: it’s difficult to get a cell to maintain a particular state for an extended period of time, because unlike in a computer, that equilibrium is perpetually under push and pull in every possible direction. It’s hard to get a cell to behave like a state machine. But it’s easy to get them to behave like dynamic machines: cells are difficult to work with in a lot of ways where computers are easy; but as we’re beginning to learn, they’re also very good at tackling other kinds of challenges that for computers are very hard.
Now that we understand this crucial difference — between discrete state changes for the computer versus continuous equilibria for cells and living systems — we’re ready to address a question next week that’s a bit more subtle: why living systems are more resilient in the face of change and added complexity compared to artificial systems, and exactly what kind of tasks cells are more suited for than digital computers.
Heady times for batteries…
…renewable energy contracts…
…and electric vehicles:
Snippets readers may have noticed we post a lot of writing by Eugene Wei, who is a pretty fascinating guy. He has a new post out this week, plus appearances on two phenomenal podcast episodes that are worth your time:
Other reading from around the Internet:
And just for fun (although it’s actually deadly serious):
In this week’s news and notes from the Social Capital family, we’ve got some exciting, and relatively unusual, news from Relativity Space. Relativity, the additive-manufactured rocket engine technology company whose mission is to become the first organization to successfully 3D print a rocket on Mars, has signed a long-term deal with the NASA Stennis Space Center:
The deal is a 20-year partnership between Relativity and Stennis Space Center, granting Relativity the exclusive rights to use a 25-acre test facility on site until 2038. The deal was made possible through the Commercial Space Launch Act agreement, which was intended to accelerate partnerships between public and private space agencies and certainly seems to be working out here. (It won’t hurt that Relativity founder Tim Ellis has also been named to the Users Advisory Group that advises the National Space Council.) Although other NASA facilities like the Kennedy Space Center have made similar deals, notably with SpaceX, this is the first such deal with Stennis. This way, unlike some of the heavier-weight space ventures like SpaceX and Blue Origin, Relativity can enjoy the convenience of an “off-the-shelf” working test facility while still having complete autonomy and 24-hour use. (Don’t quote me, but this might be the first recorded use of the term “off the shelf” to describe a rocket testing facility.)
With their 20-year partnership now inked, founders Tim Ellis, Jordan Noone and the whole team — who are already Stennis regulars — can look forward to getting to know the area really well. In the comments section of the Spacenews article above, commenter billsimpson helpfully offers a trip down memory lane and some advice concerning the AC situation: “They had better hope the air conditioning works. I still remember riding a dirt motorcycle on the logging roads around there and getting hit by an air hotter than my body temperature, so that the wind made me hotter, rather than cooler. It is jungle humid hot there from mid May, until the first week of October. Everything metal is soaking wet by midnight from condensation. The mosquitoes at Sennis will carry you away. Rainfall totals 60 inches a year.”
Have a great week,
Alex & the team from Social Capital