(This post is an approximate transcript of what I said on the panel “Safeguarding Advanced Biotechnologies in Practice: The iGEM Safety and Security Program”, a side event hosted by the International Genetically Engineered Machine (iGEM) Foundation at the 2017 Meeting of States Parties to the Biological Weapons Convention in Geneva, Switzerland.)
Just this year, iGEM student teams (more than 300 from more than 40 countries) showed off on-site diagnostic kits for cows with antibiotic-resistant infections, DNA encryption with codon reassignment, new ways to control plasmid gene expression, and even plastic 3D-printed from astronaut poop. Today, I’d like to enthuse about one particular aspect of the iGEM competition: how it encourages students to integrate biosafety and biosecurity into their work. Here are three biosecurity lessons that I learned from iGEM:
1. Safety early, safety often
Let’s talk about antibiotic-resistant bacteria. I’m concerned, the WHO is concerned, you’re probably concerned. They’re a problem. In 2014, when I was part of the Waterloo iGEM team, we wondered how synthetic biology could help.
A bit of background: lots of bacteria depend on a cell wall to hold their shape. These walls need regular maintenance—imagine repair proteins constantly swimming around and patching them up. Beta-lactam antibiotics work by clogging up the repair proteins. Clogged proteins can’t patch walls, so the bacteria lose their shape and die.
To resist beta-lactams, some bacteria have an extra set of repair proteins, with a mutated shape that prevents the antibiotics from binding. Waterloo iGEM asked: can we use synthetic biology to intervene? Maybe we could block the production of those mutant proteins with CRISPR or RNA interference.
Now, we quickly realized that working directly with infectious, antibiotic-resistant bacteria is perhaps not the safest choice for a group of undergraduates. So, early in our project, we were kind of stuck: antibiotic resistance is a real problem and we wanted to work on it. However, we didn’t want our design to depend on engineering dangerous bacteria.
One of the nice things about synthetic biology is that many of its tools work across multiple organisms. We wanted to treat skin infections caused by antibiotic-resistant S. aureus and found we could engineer a closely related (but much less infectious) bacterium, S. epidermidis.
These two species are so closely related that, when you put them side by side, they can pass genes back and forth. We could have S. epidermidis pass our CRISPR/RNA system over to S. aureus and (we hoped) disrupt the antibiotic resistance.
We only had a few months to work on the project, so of course we didn’t finish it. Ultimately, iGEM is an educational organization, so let me tell you about what I learned:
First, even though the big synthetic biology news stories are about gene drives or artificial cells or undead woolly mammoths or what-have-you, engineering biology isn’t necessarily about creating something new. Sometimes you just want a cell to not have mutant proteins. It can be about repair as much as it’s about invention.
Second, consider the safety of the organisms you’re working with right at the beginning of the project. We might not have based our project around passing genes back and forth if we hadn’t looked at S. epidermidis. Fail early, fail often is a general principle of design thinking; the earlier you are in the design process, the easier it is to pivot your project. This applies to synthetic biology as much as anything else.
Safety early, safety often.
2. You don’t know what people want until you ask
The next year, I was still a part of Waterloo iGEM, but we were working on something quite different: using CRISPR-Cas9 as a defense against plant viruses.
CRISPR is one of the big biology buzzwords du jour. Celebrities tweet about it and everything. You may already know that CRISPR-Cas9 evolved as a bacterial immune system. Its protein-RNA complex can recognize invading viruses and slice up their DNA before those viruses can take over the cell.
Waterloo iGEM wanted to get in on the hype, so we wondered if this might be useful for food crops: could we augment plant immune systems with CRISPR?
My teammates went out and spoke to a local banana farmer (which is honestly the most surprising part of this story — who knew anyone grew bananas in Canada?) and he defied our expectations in more ways than one. We had naively assumed that our cool, buzzword-y, CRISPR-immune plants would be so valuable that farmers would just want them, no questions asked. He asked if they would still taste good.
Oh, right. That makes sense. And yet, we didn’t think of it until we asked.
After that conversation, we spent more time examining our assumptions about how to make CRISPR-immune plants genuinely valuable. We looked at the risks of off-target mutations, ran a public opinion survey, and researched Canadian regulations on genetically-modified crops.
You miss a lot of perspectives if you only ever talk to other biologists. Below is a picture from the 1975 Asilomar Conference on Recombinant DNA. The conference brought together many of the day’s leading biologists, but I think it’s important that they also invited lawyers, journalists, and other public figures. The safety guidelines they outlined more than forty years ago still govern how I do lab work today.
Great things can happen when scientists hear the perspectives from outside ivory towers. I’m glad that iGEM encourages students to go out beyond the laboratory and university (even sometimes all the way to Geneva) to ask what people want.
3. It matters when students see that safety is valued
Those two lessons, one from each year I was on an iGEM team, are important. They feel incomplete, though. I don’t think they do justice to how much iGEM informed my perspective on biosecurity. I look back on Waterloo iGEM’s projects and feel there are all sorts of interesting safety issues that we left unexplored.
My first year at the iGEM jamboree in Boston, on the very first day, I remember listening to Megan Palmer (iGEM’s Director of Human Practices) questioning one of the teams, and thinking, “Wow, she’s being such a stickler about this safety stuff… Why is she asking about that instead of all the cool science and math and technology that the team worked on?”
So often, we talk about safety like it’s the boring part.
We talk about biosafety like it’s something that we only consider out of obligation or because regulations have forced us to. I still kind of thought that safety should be a footnote to the more exciting parts of a synthetic biology project.
However, judges and PIs and iGEM HQ and even other students kept asking safety questions. It began to feel like everyone else saw safety as an integral part of engineering biology. After that first iGEM jamboree, I started to see it that way, too.
I went back to Boston the next year, and I remember thinking, “you know, Megan Palmer makes some good points”. I walked around talking to other students from around the world and found myself asking about biosafety regulations in their own countries.
This year, I was a judge at iGEM and many of the projects I found most fascinating were fascinating precisely because of innovative thinking around safety. After I saw safety highlighted and valued at iGEM, year after year, it no longer seemed like the boring part. It matters when students see that safety is valued.
As part of their human practices, iGEM asks teams to “consider whether their projects are safe, responsible and good for the world”. Safety comes first, there, and I don’t think that’s accidental.
You can’t know if you’re engineering biology for the good of the world if you haven’t considered whether you’re engineering it to be safe. That’s maybe the most important biosecurity lesson I learned from iGEM.
Are you an iGEMer who’d like to speak at events like the Meeting of States Parties to the Biological Weapons Convention? Apply for the After iGEM Delegate Program! Check out after.igem.org for more information.