Benefits of a Distributed Network for Pandemic Monitoring

Gabor Bethlendy
5 min read · Mar 13, 2020


The question we are asking at Meenta is: why focus on building the house when your foundation is on quicksand? We agree that Covid-19 monitoring, isolation, and testing are absolutely the right focus, for now. However, it risks a future where waves of viruses run through our society and economy one after another. Now Covid-19, next H1N3, and on it will go. We challenge the time-tested thinking that better healthcare comes from faster scientific instruments. From better and more complex testing. From better insurance reimbursement or accelerated drug approvals. These are all true, but they miss the fundamental problem.

The fundamental challenge we face in science is not the technology we can build, but the infrastructure on which it is built.

The fundamental challenge we face in science is not the technology we can build, but the infrastructure on which it is built. Let’s rewind history to better understand. In 1956, President Eisenhower signed the Federal-Aid Highway Act, which funded the Interstate Highway System in the US. At the time, Eisenhower thought the highway system would be used to transport military equipment. Then in 1960 one of the first supercomputers, the CDC 1604, designed by Seymour Cray at Control Data Corporation, went into use at academic institutions that could afford to buy and house one. Originally, these machines were built to handle what we would today perceive as basic computational functions; now that level of computing power fits on our smartphones. These computers sat on digital islands. Few people knew they existed. There was no Internet, WiFi, or cloud computing. The challenge of moving data between these behemoths eventually led to the development of ARPANET (Advanced Research Projects Agency Network), which became the Internet as we know it today, even though it was originally intended for defense applications. It was the Internet that unleashed the power of computing, not the other way around. Can we as the scientific community respond to crises when our biggest assets sit on digital islands?

The evolution of fundamental infrastructure components has not happened in Life Sciences, and that gap is hampering our ability to innovate and respond to crises.

Think about it. The evolution of the fundamental infrastructure components required to make resources in Life Sciences accessible has yet to happen. Akin to the supercomputers on digital islands, all of our biggest research assets sit in silos. We, the thought leaders in science, literally don’t know where all of the tools we need to do our work are. There is a lack of transparency on the availability, turnaround time, data quality, and cost of the scientific equipment and services we access. This is true across all equipment types and services.

We can’t continue to rely on tribal knowledge to find the scientific tools we need…

So what?

We can’t continue to rely on tribal knowledge to find the tools we need. We need to move faster, like the tech space. We have to stop and build the infrastructure that allows us to access the tools we need as seamlessly as CPU time in the cloud. Why can’t we book an RNA-Seq run on a NovaSeq on demand the way we book a home or buy something on Amazon? I know the skeptics… “but it’s science… it’s more complex than consumer products and services.”
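To make that idea concrete, here is a minimal sketch of what booking a sequencing run on an on-demand instrument network might look like. The endpoint URL, the payload fields, and the book_run helper are hypothetical illustrations, not a real Meenta API; the point is simply that reserving a NovaSeq lane should feel no harder than reserving a virtual machine.

```python
import requests  # standard HTTP client

# Hypothetical API endpoint for a distributed instrument network (illustrative only).
NETWORK_API = "https://api.example-instrument-network.com/v1"

def book_run(assay: str, instrument: str, samples: int, turnaround_days: int) -> dict:
    """Request an on-demand run on any available instrument in the network.

    All field names and the endpoint itself are assumptions for illustration.
    """
    response = requests.post(
        f"{NETWORK_API}/runs",
        json={
            "assay": assay,                        # e.g. "RNA-Seq"
            "instrument": instrument,              # e.g. "NovaSeq 6000"
            "samples": samples,                    # number of samples to process
            "max_turnaround_days": turnaround_days,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"run_id": "...", "provider": "...", "price": ...}

# Usage sketch: book an RNA-Seq run the way you would book a cloud VM.
# booking = book_run("RNA-Seq", "NovaSeq 6000", samples=96, turnaround_days=7)
```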

We wonder whether the R&D efforts of companies like Moderna and Gilead to create better treatments for coronavirus could be accelerated by providing access to equipment and services on demand through a distributed network rather than through isolated, locked-up instruments and assays. Could we enable more start-ups through a network of underutilized assets? Could we flatten the curve of infected individuals with more novel approaches, such as spinning up a testing assay on a distributed network with the flip of a switch, globally?

The Infrastructure Problem: Testing Capacity

As we wrote before, there is an “Empty Lane Problem” in the world. In that post, we estimated that 48% of all scientific equipment sits underutilized. Yet with Covid-19, one of the major challenges we face is a lack of testing lab capacity. How can there be idle instruments and a capacity problem at the same time? Having founded a molecular diagnostics company, I realize that a clinical testing lab is neither a research lab nor a core facility. Yet we wonder whether, with the appropriate infrastructure, these resources could be utilized for the greater good. Stopgap and reactive measures, such as New York City contracting with private labs to expand capacity, are a band-aid on the Hoover Dam.

So what?

In the context of pandemic response, monitoring, and testing, a distributed network that’s designed to be turned on during a crisis is the most immune to disruption. It is not dependent on one location, one economy, or one political system… all the benefits and burdens of providing testing services to potentially infected citizens are distributed across thousands of instruments.

The benefits of a distributed network will impact our entire field.

The benefits of a distributed network will impact our entire field. Biotechs and pharma that depend on R&D outsourcing to Asia, for example, face real risks of biological samples getting delayed or lost in transit, especially at a time when China has exposed a critical vulnerability: the US obtains 80% of its active pharmaceutical ingredients from that region.

Sure, many will look domestically for academic and commercial service providers. However, those traditional models will suffer because biotechs and pharma normally partner with only a few service labs. As more and more institutions, states, and staff are quarantined, having all their eggs in “one” basket becomes a huge risk. On a distributed network, it doesn’t matter when one provider goes offline, because there are hundreds of redundancies and live instruments to choose from, each of which can be dedicated to a project to ensure data consistency.
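As a rough sketch of that redundancy argument (the Provider record and route_job function below are illustrative assumptions, not an existing system): if every lab on the network advertises whether it is online and how deep its queue is, a job can simply fail over to the next live instrument when a provider is quarantined.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Provider:
    """One lab or instrument on the network (illustrative schema, not a real one)."""
    name: str
    online: bool
    queue_days: int  # current backlog in days

def route_job(providers: List[Provider]) -> Optional[Provider]:
    """Pick the fastest available provider; fail over automatically past offline labs."""
    live = [p for p in providers if p.online]
    if not live:
        return None  # only fails if the entire network is down
    return min(live, key=lambda p: p.queue_days)

# A single quarantined lab doesn't stop the work; the job simply routes elsewhere.
network = [
    Provider("Boston core facility", online=False, queue_days=2),   # quarantined
    Provider("Midwest CRO", online=True, queue_days=5),
    Provider("West-coast academic lab", online=True, queue_days=3),
]
print(route_job(network).name)  # -> "West-coast academic lab"
```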

In the next 100 years, science will no longer be confined by the ownership of equipment or live on digital islands. Today, almost anyone can spin up CPU time in the cloud at reasonable cost. Why can’t we spin up an instrument or a test, in a distributed fashion, when we need it? In the next 20 years, most wet-lab science will happen in the cloud, so that when the next pandemic arrives we will be ready to test 500,000,000 H1N3-infected people in days with a modern digital infrastructure.


Gabor Bethlendy

3x Founder & CEO | Experienced Senior Executive in healthtech, life sciences and diagnostics.