Who Should Decide Whether To Put Humanity At Risk?

Photo by Walter Otto on Unsplash

Science does so much for us, but who should pull the plug if it goes too far? A few times in recent history, science has approached the possibility of ending humanity. Each time, the risks were judged negligible and the projects went forward. Obviously we’re all still alive and breathing, but as the power of science increases, these close calls may happen more often.

We’re reaching an age where private laboratories can modify viruses to resist existing treatments. Supercolliders are creating conditions that occur nowhere else in the known universe. Climate scientists are even considering plans to engineer the weather to combat perceived warming.

When science and the human desire for fame and success combine, the collision could be devastating. Could a scientist be swayed to take on a dangerous project for the prestige of a Nobel Prize? Would an entrepreneur be encouraged to ignore possible catastrophe for a billion-dollar IPO?

Those two questions might be answered in our very lifetime. Hopefully, they’ll be answered in a way that doesn’t end humanity.


The Atomic Bomb

Perhaps the first time humanity came to the brink of possible destruction was in the creation of the atomic bomb. This was a project unlike anything humans had ever embarked on. Splitting the atom promised the possibility of nearly unlimited energy, but also of unthinkable destruction.

Before the detonation of the first test bomb, no one truly knew what would happen. Many brilliant scientists had formulas and plausible theories, but the experiment was moving into uncharted waters.

What would happen when you detonate this device?

Devastation, most likely, but how much? That question leads us down one of history’s strange, rarely mentioned rabbit holes.

Among all the probabilities and possibilities that were sketched out, there was one that involved the destruction of the entire Earth.
Arthur Compton — Unknown photographer (Mondadori Publishers) [Public domain], via Wikimedia Commons

No one could ever deny Arthur Compton was a brilliant man. Awarded the Nobel Prize in Physics in 1927, Compton was a major player on the scientific team that created the atomic bomb. He led the Manhattan Project’s Metallurgical Laboratory and was in charge of the group’s Plutonium Project.

Compton would go on to supervise brilliant scientists like Enrico Fermi, Robert Oppenheimer, and Edward Teller. Before the first test of the bomb, the Trinity test, a frightening thought occurred to a few of the scientists: what would happen if the bomb set off a vast chain reaction?

As an interview in Scientific American notes, it was thought there was a possibility that the bomb would generate such intense heat a runaway chain reaction might burn up the entire atmosphere of the earth.

Two scientists on the project, Edward Teller (the father of the hydrogen bomb) and Hans Bethe, had another scientist, Emil Konopinski, run the numbers just to check whether this was a possibility. According to Konopinski’s calculations, it was nearly impossible.

The word “nearly” might make you a bit apprehensive. Apparently Compton wasn’t completely sold.

In 1959, Compton gave an interview to the writer Pearl Buck in which he explained the possible dangers of runaway fusion in the atmosphere.

“Hydrogen nuclei,” Arthur Compton explained to me, “are unstable, and they can combine into helium nuclei with a large release of energy, as they do on the sun. To set off such a reaction would require a very high temperature, but might not the enormously high temperature of the atomic bomb be just what was needed to explode hydrogen?
“And if hydrogen, what about the hydrogen in sea water? Might not the explosion of the atomic bomb set off an explosion of the ocean itself? Nor was this all that Oppenheimer feared. The nitrogen in the air is also unstable, though in less degree. Might not it, too, be set off by an atomic explosion in the atmosphere?”
“The earth would be vaporized,” I said.
“Exactly,” Compton said, and with what gravity! “It would be the ultimate catastrophe. Better to accept the slavery of the Nazis than to run the chance of drawing the final curtain on mankind!”

Enrico Fermi, showing some dark humor, took bets before the explosion on whether the atmosphere would ignite. Clearly, some of the scientists were nervous.

Of course, we know how history turned out. The catastrophe didn’t happen, and Fermi won some cash. Still, there seemed to be some calculable possibility of the world being destroyed by the use of this weapon.

At the time, the odds were thought so low that the risk was worth taking. A team of brilliant scientists and government officials made the call. There was also the threat of the “slavery of the Nazis” to worry about.

What happens when the government is not involved and there is no threat from a Hitler forcing our hand? What happens if the only gain from the possible risk is just a cool science discovery or money?


Large Hadron Collider (LHC)

Part of the LHC — Maximilien Brice [CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0)]
“It is difficult to get a man to understand something, when his salary depends on his not understanding it.”
— Upton Sinclair

In his 2003 book Our Final Hour, Britain’s Astronomer Royal, Sir Martin Rees, discusses the possible dangers of CERN’s LHC.

The designers of the giant particle accelerator found a non-zero risk that the LHC could imperil the stability of the universe. The machine would create conditions that existed nowhere else in the universe, so no one could say with certainty what would happen.

According to Rees’ book, the creators of the LHC calculated a one-in-50-million chance of creating a theoretical form of matter called a strangelet. These strangelets might interact with ordinary particles, turning them into strangelets as well.

Eventually the chain reaction could convert the earth into strange matter, possibly doing the same to the universe as well.

Now, as with the atomic bomb, this never happened. The scientific community now scoffs at the idea. But were you ever consulted about the decision to flip the switch at the LHC? A group of scientists made that decision for you. In their minds, it was worth the one-in-50-million shot at a scientific discovery.

But was it really? Were you willing to risk your life, your family, and your loved ones to find the base particles of existence, even with the low probability of disaster? In the mind of a scientist at the LHC, it definitely was.

However, what about people whose lives don’t revolve around looking into microscopes or formulating the origin of the universe? They might say otherwise.


H5N1 Virus (Avian Flu)

H5N1 Flu — Photo Credit: Cynthia Goldsmith Content Providers: CDC/ [Public domain]
“If this virus were to escape by error or by terror, we must ask whether it would cause a pandemic. The probability is unknown, but it is not zero. There are many scenarios to consider, ranging from mad lone scientists, desperate despots and members of millennial doomsday cults, to nation states wanting mutually assured destruction options, bioterrorists or a single person’s random acts of craziness.” — US National Science Advisory Board for Biosecurity (NSABB) chair Paul Keim

Avian flu is a virus that primarily affects birds, but it has shown some rare ability to jump to humans. Once a human is infected, the virus can be deadly. According to a study by the World Health Organization, 52% of cases resulted in death.

In 2011, two separate groups managed to modify the virus to make it more transmissible. This type of experiment is called gain-of-function (GOF) research. Understandably, a number of groups, including the NSABB, were alarmed that such a dangerous virus had been modified to spread more easily. Steps were taken to pressure the two groups not to publish papers about their experiments.

The papers were published anyway, although a voluntary moratorium on further experiments followed. When the experiments resumed in 2013, they were placed under stricter U.S. guidelines. In 2014, a funding pause was announced for a number of GOF experiments after more controversial papers were published and a series of mishaps occurred at federal biocontainment labs.

The National Institutes of Health (NIH) lifted the pause in 2017, after proposals for GOF experiments were placed under greater scrutiny.

However, technology has only advanced since then. Gene-editing tools make modifying viruses much easier, enabling more labs to alter various strains. That means more engineered viruses in circulation, and the possibility of release by error or terror increases accordingly. Paul Keim’s fears seem well founded.


Conclusion

“Scientists surely have a special responsibility. It is their ideas that form the basis of new technology. They should not be indifferent to the fruits of their ideas. They should forgo experiments that are risky or unethical.”
— Sir Martin Rees

Technology is wonderful and makes our lives better. Science cures diseases; it is a modern form of magic that makes the impossible possible. However, as the power of science increases, it’s in our best interest to police the earthly desires of scientists.

What actually counts as a low risk when all of humanity may be put in danger? According to a 2018 article in Politico, the U.S. spends $60 million a year on NASA’s Planetary Defense Coordination Office preparing for an asteroid strike. That figure might be bumped up to $150 million.

An asteroid larger than 1 kilometer strikes the Earth roughly every 500,000 years, according to Nick Bostrom, a professor at Oxford University. Even with that low probability of disaster, the U.S. government is shelling out a large sum of money to defend the planet against asteroids.
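To put those figures side by side, here’s a quick back-of-envelope sketch in Python. The numbers are the ones quoted above; note that the asteroid figure is a per-year rate while the strangelet figure was quoted for the experiment as a whole, so this is a rough illustration, not a formal risk analysis:

```python
# Back-of-envelope comparison of two low-probability risks.
# Figures come from the article itself, not new data.

asteroid_per_year = 1 / 500_000       # >1 km asteroid impact, per Bostrom's estimate
strangelet_odds = 1 / 50_000_000      # strangelet chance quoted in Rees' book

ratio = asteroid_per_year / strangelet_odds

print(f"Asteroid risk per year: {asteroid_per_year:.1e}")
print(f"Quoted strangelet odds: {strangelet_odds:.1e}")
print(f"The annual asteroid risk is about {ratio:.0f}x the quoted strangelet odds")
```

Even on this crude comparison, the odds society already spends millions to defend against are within a couple of orders of magnitude of the odds the LHC’s designers quoted and accepted.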

Shouldn’t the low risk of a dangerous outcome from a science experiment be weighed with the same gravity? As we’ve seen in this article, science has taken steps in dangerous directions before; the future will likely be no different.

As the power of science increases, the ability to cause possible disasters will be shared across labs of all sizes and types. In a way, annihilation could be democratized. Perhaps the desire for money, fame, or awards will shade the view of these scientists or entrepreneurs.

One can only hope that reason wins out and science treads very carefully. Otherwise, new powerful diseases or other science experiments run amok may afflict our world — possibly even destroy it.

Thank you for reading my ramblings. If you’ve enjoyed what you’ve read, please share.