COVID if it were a cyber threat: a timeline of events

Kim Crawley
9 min read · Jun 16, 2022

This is continued from Part one: The infuriating, avoidable, destructive disaster of COVID and the cybersecurity industry.

I have a rare advantage when it comes to communicating COVID information to cybersecurity people — I speak your language. IT and computer science jargon is to my cybersecurity research, blogging, and book-writing trade as a wrench and a butt crack are to a plumber. Some of my peers grasp the seriousness of the vast COVID threat, but so many others clearly don’t. If you’re an infosec professional who’s appalled by users reusing passwords and by enterprise networks letting vulnerabilities that have been in the CVE database for a decade lurk in their systems, but you go to large offline events maskless and say COVID is no big deal (or “everyone will get it, we might as well let ‘er rip!”), you have no reason to feel so high and mighty. In fact, you deserve zero credibility, because your judgment is severely lacking. Anyway, for those who can be persuaded, here’s how I would describe the COVID pandemic if it were a cyber threat:

Near the end of 2019, Chinese researchers detected zero-day (biological) malware, a virus. Lots of viruses and other sorts of biological malware have targeted Homo sapiens flesh-and-blood endpoints since the dawn of humanity. Some, like the common cold (rhinovirus), are pretty tolerable to most people who are in reasonably good health. But this zero-day the Chinese researchers discovered was much more serious. Imagine WannaCry ransomware with the persistence and mysteriousness of Stuxnet. Due to the crown-like shape of the virus, it was classified as a coronavirus. From that point and throughout most of 2020, “coronavirus” is what end users called it colloquially, even though many types of coronaviruses predated COVID.

COVID was first detected in the Chinese city of Wuhan that December. Thousands of people in Wuhan became gravely ill. They had difficulty breathing, among a variety of other mainly respiratory symptoms. The Chinese government’s metaphorical incident response team acted quickly and decisively. They took people “off of the network,” a lockdown. Human “endpoints” were usually safe and cared for during the lockdown, and the lockdown was a very temporary, emergency measure. But the mainstream corporate western media treated China’s incident response measure as some sort of horrifying authoritarianism. Sinophobia was likely a factor in how the western media reported the Wuhan lockdown. But the lockdown was a relative success: most of Wuhan’s 12 million people were protected from infection, at least until most of the rest of the world made the COVID-19 virus an ever-worsening problem. By January 2020, the Chinese government had restricted flights to and from China. But the virus had probably spread to other countries by then.

By March 2020, COVID was becoming a problem in parts of the world with large numbers of white people. Shit’s getting serious! It was becoming so much of a problem that the powers that be in Europe, Canada, and the United States could no longer ignore it. On the 11th, the World Health Organization declared it a pandemic. Here in Canada, officials told us to stay at home as much as possible, to “flatten the curve.” Imagine a security team sending a group email to end users across an organization: “Don’t open those emails, don’t click on those links, don’t go to those websites.” End user education is often a crucial factor in containing a cyber threat.

Because some people had to leave home for their survival-income jobs and some other people were just reckless, COVID continued to spread in countries that predominantly use English, and beyond. Grocery store employees and healthcare workers were heralded as heroes. They were “essential workers.” “Clap for your carers!” the British hegemony instructed its citizens. Some NHS workers tweeted, “This clapping is pointless. We need decent pay, better protective equipment, and our 60-plus-hour weeks to be lessened.” Meanwhile, people here in Toronto (and throughout North America) bought way, way too much toilet paper and soap. Buying a lot of toilet paper and soap is okay, but not if it keeps other people from having the toilet paper and soap that they need. Plus, having an excess of toilet paper does nothing to mitigate the pandemic. Imagine if your end users thought they needed to install Fortnite on their office computers to mitigate WannaCry. What a non sequitur. The Centers for Disease Control and Prevention (CDC) in the United States was already deliberately misleading the public. From MarketWatch in March 2020:

“Though health officials have warned Americans to prepare for the spread of the novel coronavirus in the U.S., people shouldn’t wear face masks to prevent the spread of the infectious illness, according to the Centers for Disease Control and Prevention, the U.S. Department of Health and Human Services and the U.S. surgeon general.

But that’s not the only reason Americans may want to think twice about using masks, one expert told MarketWatch.

Most people don’t know how to use face masks correctly, and a rush to buy masks could prevent the people who need them most — health care providers — from getting them, said Dr. Amesh Adalja, a scholar at the Center for Health Security at the Johns Hopkins Bloomberg School of Public Health.

In fact the U.S. surgeon general recently urged the public to ‘STOP BUYING MASKS!’ ‘They are NOT effective in preventing general public from catching #Coronavirus, but if healthcare providers can’t get them to care for sick patients, it puts them and our communities at risk!,’ wrote Surgeon General Jerome Adams on Twitter.”

Actually, masks are one of the most effective security controls against COVID. But I’ll get to that later.

There was a lot of silliness in how “end users” and “administrators” (politicians, etc.) reacted to the COVID “cyber threat,” including outdoor parks being closed here in Toronto, and people thinking that washing their groceries does something. Nonetheless, I and many other people are nostalgic for how seriously people tended to take COVID during that first year.

China had a limited deployment of a “beta version” COVID vaccine from CanSino by June, and Russia had a similar “beta version” vaccine, Sputnik V, by August. By summer 2020, the CDC had changed its position on masks a little, and people in the United States and Canada were encouraged to wear fabric masks. By the end of 2020, some “stable release version” vaccines were approved by health agencies throughout the west, starting with Pfizer’s.

In 2021, the general consensus was that we had found a solution to the COVID pandemic, and that stopping it was just a matter of time and of getting the world vaccinated ASAP. Despite the Ontario government’s terrible disorganization when it came to letting people know when and where they could get vaccinated, I used my obsessive cyber research skills to make sure I could get vaccinated as soon as my age group (people in their 30s) was allowed. My first Pfizer shot was in May and my second was in June. Most adults got vaccinated as soon as they could throughout 2021, while vaccines weren’t yet approved for children. Our “antivirus software” was being updated. Had as much of the world as possible gotten vaccinated as quickly as possible, we could have done wonders in mitigating the pandemic! Alas, the global vaccination effort was severely impeded, largely due to corporate greed. From Mother Jones’ Jag Bhalla in August 2021:

“Among the pandemic’s many lessons, however, is that greed can easily work against the common good. We rightly celebrate the near-miraculous development of effective vaccines, which have been widely deployed in rich nations. But the global picture reveals not even a semblance of justice: As of May, low-income nations received just 0.3 percent of the global vaccine supply. At this rate it would take 57 years for them to achieve full vaccination.

This disparity has been dubbed ‘vaccine apartheid,’ and it’s exacerbated by greed. A year after the launch of the World Health Organization’s Covid-19 Technology Access Pool — a program aimed at encouraging the collaborative exchange of intellectual property, knowledge, and data — ‘not a single company has donated its technical knowhow,’ wrote politicians from India, Kenya, and Bolivia in a June essay for The Guardian. As of that month, the UN-backed COVAX initiative, a vaccine sharing scheme established to provide developing countries equitable access, had delivered only about 90 million out of a promised two billion doses. Currently, pharmaceutical companies, lobbyists, and conservative lawmakers continue to oppose proposals for patent waivers that would allow local drug makers to manufacture the vaccines without legal jeopardy. They claim the waivers would slow down existing production, ‘foster the proliferation of counterfeit vaccines,’ and, as North Carolina Republican Sen. Richard Burr said, ‘undermine the very innovation we are relying on to bring this pandemic to an end.’

All these views echo the idea that patents and high drug prices are necessary motivators for biomedical innovation. But examine that logic closely, and it quickly begins to fall apart.

A great deal of difficult, innovative work is done in industries and fields that lack patents. Has the lack of patent protections for recipes led to any dearth of innovation in restaurants? An irritating irony here is that economists who espouse the need-greed theory themselves innovate for comparative peanuts. For instance, in 2018, the median compensation for economists was about $104,000. The typical pharmaceutical CEO, meanwhile, earned a whopping $5.7 million in total compensation that year. (The hands-on innovators aren’t the need-greeders here; the median compensation for pharmaceutical employees — including benefits — was about $177,000 in 2018.) Even in Silicon Valley, writes ever-astute technology insider Tim O’Reilly, ‘the notion that entrepreneurs will stop innovating if they aren’t rewarded with billions is a pernicious fantasy.’

To be sure, it was not greed but rather a vast collaborative effort — funded largely with public dollars — that generated effective coronavirus vaccines. The technology behind mRNA vaccines such as those produced by Pfizer and Moderna took decades of work by University of Pennsylvania scientists you’ve likely never heard of. According to the New York Times, one of those scientists, Katalin Kariko, “never made more than $60,000 a year” while doing her innovative foundational research. The researchers at Oxford University who developed the technology behind AstraZeneca’s vaccine, which was mostly publicly funded, initially set out with the intention of “non-exclusive, royalty-free” licensing for their vaccine. Only after pressure from the Bill and Melinda Gates Foundation did they renege and license the technology solely to AstraZeneca.”

For more about how Bill Gates hindered the global vaccination effort, here’s Mohit Mookim’s reporting for WIRED in May 2021:

“After weeks of immense pressure, the Biden administration came out in support of waiving intellectual property rights to coronavirus vaccines. Shortly after the Biden announcement earlier this month, the Bill and Melinda Gates Foundation also reversed course and endorsed the patent waiver. But Bill Gates himself, subject to revived scrutiny around sexual misconduct and perhaps the most powerful person in global health, hasn’t budged.

While United States residents are being quickly vaccinated and may see an end to the pandemic in sight, most countries in the world will likely have to wait years for many of their vaccine doses, in a situation being described as ‘vaccine apartheid.’ Almost half of all vaccine shots have been administered in just 16 rich countries, and India is weathering a horrific coronavirus crisis.

This could have been avoided. Early last year, countries in the Global South compelled the World Health Organization to unveil a technology sharing pool, C-TAP, that would have removed intellectual property barriers for accessing Covid-19 treatments and vaccines.

Global health czar Bill Gates had other thoughts. Maintaining his steadfast commitment to intellectual property rights, Gates pushed for a plan that would permit companies to hold exclusive rights to lifesaving medicines, no matter how much they benefited from public funding. Given the enormous influence Gates has in the global public health world, his vision ultimately won out in the Covax program — which enshrines monopoly patent rights and relies on the charitable whims of rich countries and pharmaceutical giants to provide vaccines to most of the world. A chorus of support from pharmaceutical companies and the Trump administration didn’t hurt.

Should we be surprised that a monopolist-turned-philanthropist maintains his commitment to monopoly patent rights as a philanthropist too?”

Largely because at least a couple of billion people were denied vaccines in 2021, COVID had the opportunity to mutate many times over. That culminated in the COVID disaster we’re stuck with in 2022. The vaccines that we have are like woefully outdated antivirus signatures based on 2020 samples. The 2022 Omicron-era versions of COVID can easily bypass those signatures.
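For the infosec readers, the outdated-signatures analogy can be sketched in a few lines of code. This is a toy, hypothetical example (all sample names are invented for illustration, not real malware or real virology): a naive hash-based scanner flags only samples it has exact signatures for, so any mutated variant sails right past it.

```python
import hashlib

# Toy signature database built from "2020-era" samples.
# (Sample names here are purely illustrative.)
signature_db = {
    hashlib.sha256(b"covid_wild_type_2020").hexdigest(),
    hashlib.sha256(b"covid_alpha_2020").hexdigest(),
}

def matches_signature(sample: bytes) -> bool:
    """Signature-based detection: only exact known samples are flagged."""
    return hashlib.sha256(sample).hexdigest() in signature_db

# A known 2020 sample is caught...
print(matches_signature(b"covid_wild_type_2020"))  # prints True
# ...but a mutated 2022 variant bypasses the stale signatures entirely.
print(matches_signature(b"covid_omicron_2022"))    # prints False
```

This is why antivirus vendors ship constant signature updates, and it is the same reason a vaccine tuned to 2020 strains loses effectiveness against heavily mutated variants.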

As of this writing in June 2022, there have been over half a billion confirmed cases of COVID and over 6 million deaths, over 1 million of which are in the United States.

Let’s get into some medical research to examine this whole mess better.

Part three: COVID myths versus facts

