The Post-Pandemic Future of Cybersecurity

Center for Long-Term Cybersecurity
Published in CLTC Bulletin · Aug 24, 2020

Digital security — like so many aspects of life — will be meaningfully and perhaps radically different on the other side of the 2020 pandemic crisis. How can we start to prepare now for the emerging challenges and opportunities this new world will present?

Since our inception, the UC Berkeley Center for Long-Term Cybersecurity (CLTC) has sought to shape our research agenda by thinking broadly about how the future — or rather, multiple possible futures — might shape the course of human-technology interactions in ways that impact cybersecurity.

When we launched CLTC in 2015, we used a formal process called scenario planning to develop a set of narratives that looked ahead to the year 2020. While we foresaw some trends and missed others, these initial scenarios helped us anticipate developments like the rise of powerful predictive algorithms, the internet of things, and expanding markets for data. About four years later, we developed a new set of scenarios that looked ahead to 2025, weighing the potential impacts of newly emerging technologies like powerful artificial intelligence, quantum technologies, and ubiquitous sensors.

Now, in the course of the first months of the COVID-19 pandemic — and the social, political, and economic upheaval it has wrought — CLTC’s team members have developed a new set of scenarios for the year 2025. Led by our faculty director, Steven Weber, our team developed “critical uncertainties,” variable drivers that could be decisive in shaping the post-pandemic world. We used a structured scenario planning process to consider how the collision of diverse forces — social, technological, economic, environmental, political, and military — could re-shape our world over the next five years.

We bounded this work somewhat by concentrating on the country we know best: the United States. The insights here are partly shaped by that focus and should be seen only as starting hypotheses if generalized to other geographies.

We started with a set of semi-obvious but crucial questions: Will there be a backlash against technology after months of forced Zoom calls, or will we further immerse ourselves in the virtual realm? Will the advent of COVID-tracking apps usher in a new era of surveillance, in which people are willing to sacrifice privacy for a greater good of public health? Will the pandemic lead to increased global cooperation or further fragmentation based on nationalism and other narrow interests as zero-sum mindsets prevail? How might the pandemic shift public attitudes about climate change, or the role of local communities? And most relevant to our work, how might these shifts collectively shape the future of digital security, and more broadly, the relationships between humans and technology?

Ultimately, we zeroed in on the intersection of two key variables that seemed to capture many of the higher-stakes issues likely to define the next few years. The first relates to whether (and how) digital technology will continue to dominate our lives. On one end of this spectrum, “digital eats the world”: we further immerse ourselves in technology and shift our lives into the online space because it just works so well. On the other end is “clear limitations”: societies confine their use of digital technologies to the parts of life where they really are an improvement, and return to a more analog world in specific areas where the limits of digital have become painfully clear.

The second key variable relates to whether (and how) national governments — in the U.S. and around the world — establish and maintain a strong role in driving society forward (“success at large scale, renewed legitimacy”), or whether the influence of the central government wanes, leaving local governments to fend for themselves (“failure at large scale, successes local”).

The scenarios are based on two variables: whether (and how) digital technology will continue to dominate our lives, and whether (and how) national governments establish and maintain a strong role in driving society forward.

Of course, none of the narratives that emerge from the interactions of these two critical uncertainties is likely to be “right.” Scenarios are meant not to be predictions, but to generate hypotheses. And so these stories are best used as heuristics to help us think more clearly about what may be coming. As always, our goal is to use long-term thinking to shed light on the question: if this future unfolds, what cybersecurity issue will we wish we had been working on more in 2020? If the hypotheses are compelling, then the decision consequences are clear: let’s start working on those problems now.

An Imperfect Patchwork

Crippled by the post-pandemic recession, national governments face shrunken budgets and have limited capacity to coordinate an increasingly technology-based society. The shift toward a digital society continues unabated, but there are downsides. School districts go online and health-care practitioners transition to telemedicine, but quality suffers. The widespread divergence in approaches and standards across regions leads to massive inefficiency.

Dr. Sammy Patel ‘treats’ between 40 and 50 patients per day through his practice, everything from poison oak rash to influenza — all from the comfort of his living room in Omaha, Nebraska. A “country doctor” for the 21st century, Patel guides patients through a consultation on Zoom as they manipulate the cameras on their mobile phones.

“The toughest part is when I need to see a patient’s back, and they don’t have anyone there to help,” Patel says. “With the help of the personal diagnostic tools now available, I’m able to treat about 75 percent of patients without ever meeting them in person. The quality of care is perhaps not as good, but it is very convenient.”

Dr. Patel’s remote medicine practice is just one example of how the world has “gone virtual” over the past five years, a transition that was accelerated by the COVID-19 pandemic of 2020, when everything from universities to Burning Man was forced to move online. But the acceleration may have been a little too fast to take hold.

Even after the coronavirus was “fully contained” in 2022, people chose to stay inside and avoid public spaces, in large part because they doubted the government’s competence. Commercial real estate prices have collapsed. Social activities ranging from bridge clubs to bowling are mainly digital. Hundreds of private schools across the nation have abandoned in-person learning. Meanwhile, public schools, bound by law to accommodate all students, have struggled to adapt: K-12 test scores dipped to an all-time low in 2024, and experts place the blame on digital classrooms.

The demands on digital are excessive: the online world feels more broken and fragmented than ever. Crippled by the post-pandemic recession, governments have had limited capacity to coordinate basic services, and the weak regulation of technology (and most other areas) has led to a chaotic patchwork of rules about privacy and data protection. Despite the best efforts of major digital platforms to establish standards, widespread divergence has led to massive inefficiency. Individual states within the U.S. have tried to regulate the increasingly active online world on their own, but this has generally led to confusion and chaos; in early 2024, for example, interstate truckers were denied entry into Arizona from New Mexico because of incompatible digital identification systems.

Many businesses that were forced online before they were ready have failed to establish, much less sustain, strong security. Ransomware and other attacks have become a ‘normal’ part of life. The Amazon breach of 2023 — when $10 billion of the company’s virtual currency mysteriously disappeared — underscored the challenges of governing a fragmented, unruly internet. When hacktivists released video from dozens of Fortune 500 board meetings, it not only shocked the stock market, but raised new questions about Big Tech’s ability to protect even its most exclusive customers.

Integrating 5G into the internet of things (IoT) has enabled supply chain efficiencies. It has also led to uncounted and poorly secured endpoints, and there is serious concern about “death from a thousand cuts.” In what has been hailed as the “revenge of the LAN” (or local area network), many communities have abandoned cloud-based solutions and returned to operating their own networks, figuring they are better off fending for themselves.

The main upside of all this splintering is that, while digital “petty theft” is on the rise, the risks of a catastrophic, systemic cyberattack have diminished. Nation-state attacks have become less important because the rewards are no longer big enough to matter, except in a few high-profile targets, which are aggressively defended. But for most users, security has corroded further and crime rates are rising in both the analog and digital worlds.

“They used to say ‘we’re all in this together,’ but now that just sounds silly,” says Martha Johnston, a former bartender who now runs live online classes on mixing cocktails. “Now it’s all about connecting through Nextdoor to know what’s up, staying below the radar, keeping your doors locked, and making sure your passwords are strong.”

If this future unfolds, what cybersecurity issue will we wish we had been working on more in 2020?

If this kind of future were to unfold, we might wish to have in place a shared metric for evaluating upgrades and downgrades in risk and effectiveness, mapped to a variety of economic sectors and other human activities, so that the decision to move a particular activity online could be made according to a standardized cost-benefit analysis and evaluated over time. In some cases, the risk-adjusted burdens of COVID infection might be preferable to the risk-adjusted burdens of moving online. Decisions about opening schools in summer 2020 are already suffering from the lack of such a metric.
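The standardized comparison imagined above can be sketched in just a few lines: score each option as benefit discounted by weighted risk, then compare. Everything here (the class, field names, scales, and weights) is a hypothetical illustration of the idea, not a proposed standard:

```python
from dataclasses import dataclass

@dataclass
class ActivityAssessment:
    """Hypothetical inputs for one activity (e.g., a school district).
    All values are normalized to a 0-1 scale for illustration."""
    online_benefit: float    # expected benefit if the activity moves online
    online_risk: float       # probability-weighted harm of operating online
    offline_benefit: float   # expected benefit if kept in person
    offline_risk: float      # probability-weighted harm of staying in person

def risk_adjusted_score(benefit: float, risk: float, risk_weight: float = 1.0) -> float:
    """Benefit discounted by weighted risk; higher is better."""
    return benefit - risk_weight * risk

def prefer_online(a: ActivityAssessment, risk_weight: float = 1.0) -> bool:
    """True if the risk-adjusted case for moving online beats staying in person."""
    return (risk_adjusted_score(a.online_benefit, a.online_risk, risk_weight)
            > risk_adjusted_score(a.offline_benefit, a.offline_risk, risk_weight))

# Illustrative case: in-person schooling is higher-quality but carries
# high infection risk, while online schooling is lower-quality but safer.
schooling = ActivityAssessment(online_benefit=0.5, online_risk=0.2,
                               offline_benefit=0.9, offline_risk=0.7)
print(prefer_online(schooling))  # → True
```

The point is not the arithmetic but the discipline: forcing both options onto the same scale, and re-running the comparison as conditions change, is what was missing from the school-reopening debates of 2020.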

Party Like It’s 1999 (or The Roaring 20s)

Following months of a quarantine-imposed retreat into Zoom calls and other online interactions, societies rush out of the digital sphere as soon as the pandemic wanes, and people turn eagerly back to live events and in-person interactions. National governments have fallen further into dysfunction, leaving local communities and companies to struggle on their own with cybersecurity and other challenges in the face of exhaustion and disillusionment with the digital world.

It’s been five years since COVID-19 brought the world to its knees, and while the virus may be under control, the global economy is still in recession. Ironically, brick-and-mortar retail has surged again, shopping malls are all the rage, and digital juggernauts like Amazon.com have seen their stock prices plummet.

“Digital businesses that were soaring five years ago have struggled to recover from their bubble experience,” says Jorge Padrio, an investment analyst for the Kitari Investment Group. “The pandemic pushed customers to places they didn’t really want to be, and so they went running away the moment they had the chance.”

At first, the natural environment, too, took a turn for the better, as carbon emissions fell sharply between 2020 and 2022. But pollution has slowly ticked back up as industry and car usage have returned to trend. “There were a few months there where we could hear birdsong outside our apartment,” says Heather Jones, a homeowner in Irvine, California. “There was sort of this realization, like, this is actually the way it’s supposed to be.”

The surge into Zoom calls and online delivery brought on by the pandemic exposed the stark limits of all-digital life. Exhausted by the quarantine and burned out on tech, Americans turned back to in-person interactions the minute they had the chance. Similar to the “slow food” movement, the “slow tech” movement has led to a sharp reduction in usage of digital devices. In 2022, the hashtag #LookUp briefly trended on Twitter before the message it conveyed — ditch your phone, re-join the physical world around you — made social media itself less appealing. Some bars and restaurants in New York City and other urban areas have started offering “phone checks” to help guests ditch their phones. Streaming video ratings have plummeted, but attendance at sporting events and Broadway shows is at an all-time high. “Looking at your phone all the time is gauche,” says Melanie Matthews, a style columnist for Hipster Weekly.

The backlash against virtual life was triggered in part by the revelation that Facebook was storing users’ private information in servers all over the world, including in nations where the government claimed the right to access data stored on their soil. “Zoombombing was one thing — but the intentional risking of so much personal data was on another level,” says Moisha Yldirim, a security analyst for Neptune Digital Security. “People got really creeped out when they realized all their text messages, family photos, and other data could have ended up anywhere — and probably did.”

The pandemic also laid bare the federal government’s inability to protect Americans’ livelihoods. Locked in a stalemate following the third wave of the pandemic, Congress left it to state and local governments to figure out recovery on their own. The result has been an awkward hodgepodge of approaches to shared challenges like cybersecurity, at a time when data breaches and ransomware attacks continued unabated.

The “tech revolution” stagnated following the coronavirus pandemic, and much-touted technologies like VR, 5G, and the “internet of things” underwhelmed, as people were largely content with the same mobile devices they’d been using for the past decade. There have been efforts to improve “security by design,” but without clear, strong standards and regulation — and an activist government to back them up — networks are as vulnerable as ever. The so-called “Big Breaches” of 2023 — when cybercriminals plundered billions of dollars in retirement savings from public pensions, as well as from the private firms Vanguard and Schwab — further eroded trust.

“People are less focused on their tech, and continue to be willing to accept a loss of privacy — and the occasional identity theft — in exchange for simply not having to deal with it,” says Vanessa Brown, a security analyst for Gorman Research. “In reality, technology never went away, but the lack of attention has meant that security standards have fallen behind.”

With less trust that the government can come to the rescue when needed, community organizing has become essential. Non-governmental organizations — in the U.S. and internationally — have filled the gaps in providing a variety of social services. “Do it yourself” is the mantra of the day, and despite growing resistance to technology in classrooms, the American Union of Teachers endorsed a cybersecurity curriculum designed to develop students into “self-sufficient citizens in protecting their digital security and privacy.”

“Cybersecurity professionals have really struggled to navigate the myriad platforms, standards, and technologies that are continually emerging and then disappearing,” says Padrio. “We should have done more to encourage strong, open-source standards for security and encryption that are widely used. We also need to continue to train people (including the youngest generations) on the fundamentals of security and privacy protection. It feels like it’s back to basics. The world feels more like 1999 than 2025.”

If this future unfolds, what cybersecurity issue will we wish we had been working on more in 2020?

If this kind of future were to unfold, we might wish that the big technology platforms were permitted to grow even larger and more dominant in markets than they currently are, as long as they invest a much greater proportion of their resources and efforts into security. A more secure oligopoly might be a better outcome than a wide-open market of smaller and less secure solutions, particularly in a situation where governments are simply unable or unwilling to shape and regulate a large number of small players.

Analog Investment

The U.S. federal government plays a decisive role in driving an economic recovery, in large part by focusing on creating jobs and building traditional infrastructure. Global economies recover and enter a new era of prosperity, but backlash against AI follows moves by many major manufacturers to replace workers with robots, and the advancement of digital technologies slows dramatically.

As we celebrate the four-year anniversary of “Cure Day,” when a vaccine for COVID-19 was released to the general public, it’s worth recognizing just how different the United States looks today than it did when the pandemic first struck in 2020.

What’s most surprising is how respect for competent and activist central government has returned, as the varying experiences of different nations in controlling the pandemic — from winners, like Japan and South Korea, to clear failures, like Brazil and the United States — created a new awareness of how only highly competent government could orchestrate the delivery of public services on a massive scale. The resulting spirit of compromise in Washington, DC has led to the start of a reform process that, five years ago, few would have thought possible. “Frankly, we had nowhere to go but up at that stage,” said Victor Hruska, an analyst for Greenblatt & Walsh, a political consulting firm. “Who knew it would take a global pandemic and economic collapse for the government to get its act together?”

Many pundits point to the President’s massive “Back to Basics” infrastructure spending plan, which relied less on “smart” systems than on traditional needs, like bridges, housing, and roads. This 21st-century “New Deal” plan put millions of Americans back to work, and many of the new jobs focused on smart energy and climate resilience (e.g. wildfire fighting, solar power plants).

This massive investment in infrastructure was accompanied by a sharp reduction in dependence on technology tools and systems. The wave of unemployment that followed the pandemic made labor cheaper, and McDonald’s, Wendy’s, and other chains were hit with a backlash for opening “touchless” restaurants that eliminated the need for cooks and cashiers. In moderately complicated settings, AI systems performed in sub-optimal and sometimes slightly comical ways. Following a week-long outage of new 5G networks, and a series of cyberattacks that brought more than a dozen S&P 500 firms to a halt, the public essentially decided their “new” world should be built on old and proven technologies.

This generated the political will necessary for Congress to pass the Digital Infrastructure Act of 2023, the first-ever national privacy and “data rights” regulations with real teeth, one consequence of which is that platform firms and their data have been redesignated as “critical infrastructure.” Meanwhile, communities and labor unions alike rejected the call for a mass-scale “smart city” infrastructure, and the backlash toward distance learning has led school districts to invest in buildings, teachers, and staff, rather than more online education capacity. In 2025, the U.S. has more elementary school teachers per capita than ever before.

Cybersecurity has taken a surprising turn, as well. In one respect, the attack surface has remained more or less stable, as deployments of new systems slowed down. But there is a huge challenge in the large number of legacy IoT devices (especially industrial devices) that were never meant to be still in use in 2025. “IoT devices have become ‘brownfield’ assets because no investment has been made in making the software and hardware updates necessary to keep them secure,” says Jody McGovern, President of TechnologyServes, Inc.

Federal and state governments are now the largest buyers of technology solutions (including cybersecurity services and products), as they have learned to use technology in a highly instrumental way to deliver services more effectively. Talent has shifted away from private-sector tech hubs like Silicon Valley and Austin, toward Washington, DC and its lucrative government contracts.

“Cities are investing more in identifying and repairing potholes than in purchasing the latest version of AI,” says Susan Binghamton, director of the Association of American Urban Areas. “And some cities have quietly been adding ‘ransom’ to their IT budgets as a cost of doing business. Aging infrastructure isn’t just about crumbling bridges, but no one wants to invest in the upgrades necessary to keep their digital systems secure.”

If this future unfolds, what cybersecurity issue will we wish we had been working on more in 2020?

If this kind of future were to unfold, we might wish we had spent more time and effort on basic ‘blocking and tackling’ cybersecurity issues, like phishing attacks, ransomware, and brute-force password attacks on cheap IoT devices. Much of the effort spent on leading-edge issues like adversarial machine learning would have little practical value in this world, while behavioral cybersecurity interventions that aim to change the daily practices of users in fairly simple ways would be much more important.

Global Singapore


Five years after the coronavirus pandemic first swept across the globe, the physical world looks remarkably similar to 2019. But the digital infrastructure underpinning everyone’s lives has advanced dramatically. America has invested trillions of dollars in digital technologies, and economic well-being has improved through the digital transformation that was turbocharged by the COVID-19 pandemic. Citizens have (sometimes grudgingly) sacrificed some freedoms and a lot of privacy for safety and efficiency.

By late 2020, the United States became the acknowledged global epicenter of the COVID-19 pandemic; every American knew someone who was infected and someone who lost their job during the recession. But as the U.S. flailed, many other nations managed to limit the spread of the virus and resume much of normal life in early 2021, in large part by implementing large-scale, centralized efforts guided by sophisticated digital technologies. The extensive use of sensor data — from mobile phones, search engines, mapping apps, and traffic patterns, along with sophisticated AI models — helped determine how to allocate public health resources. The word ‘surveillance’ now carries a different connotation, much closer to its positive public-health meaning than to the negative “surveillance capitalism” slogans of early 2020.

Gradually, the U.S. government’s uncoordinated response simply grew impossible to justify. It became clear that a dramatic shift was needed, and support grew on both ends of the political spectrum for a data-driven and decisive federal government response. In early 2021, the White House appointed technologists, scientists, and public health experts and gave them a blank check to do “whatever it takes” on pandemic response and economic relief. The government forced major platform firms to share data through a centralized repository, and developed new methods to streamline contact tracing. Thanks to this suite of data-driven public health measures, the coronavirus pandemic was effectively contained within a year.

The success of this mobilization effort impressed even the most skeptical members of the American public and emboldened policymakers to “optimize” other aspects of American life. Privacy advocates naturally were up in arms about encroachments on civil liberties, but so much progress in public health and the economy over such a short period of time was just too much good news to be held back by hypotheticals. Ninety percent of the public now approves of the government’s response to the coronavirus pandemic, and only 5% think that the costs to civil liberties were excessive.

Political success and restored legitimacy emboldened politicians, who sought to extend data-driven policymaking and technological infrastructure across many sectors of American life. Singapore’s technocracy became a widespread inspiration. In late 2021, Congress passed a series of bills titled GovNet 2021 that invested trillions of dollars in digital tools and technologies, as well as cybersecurity and other initiatives, to upgrade everything from public transportation to police. The project’s scope, scale, and budget surpassed that of the New Deal. Officials once again relied on the expertise of researchers, technologists, computer scientists, and cybersecurity professionals, with a few tech ethicists in the mix, to inform how to responsibly govern these new digital infrastructures.

Today, the large-scale investment has largely paid off, and sophisticated digital technology is now seamlessly integrated into public infrastructure. America is a surprisingly efficient place. The 2024 presidential election was a historic moment as the United States offered online voting for the first time. Fifty percent of Americans chose to vote online, with a rate of voter fraud proven statistically to be lower than that of in-person voting, thanks in large part to the nation’s large cybersecurity budget.

That cybersecurity investment was critical, because the attack surface has become much larger much faster than anyone had foreseen before the pandemic. There is less concern about sharing data across platforms, but more worry about increased vulnerability to nation-state adversary attacks on infrastructure, given the increasingly centralized points of failure (though it’s unclear whether anyone even really knows where those points of failure are).

Anxieties about freedom and individual privacy have become luxuries that people indulge when they can. There are rumors of suppression of public dissent and social movements, but to the horror of civil libertarians, most citizens have come to accept the trade-off. Easily able to recall the trauma and insecurity they felt in 2020, the desire for health, economic, and psychosocial security has come to far outweigh what many citizens view as largely imaginary privacy concerns.

“The government now has access to an unprecedented amount of centralized data, and the fact that they tell us data is managed responsibly now is no guarantee that it is or that it will be in the future,” says Kristen Hyde, a policy specialist at the Electronic Freedom Foundation. “The American public needs to think through all the ways our information ecosystem can be compromised. We are flying blind, and there is no model or framework for what can go wrong.”

If this future unfolds, what cybersecurity issue will we wish we had been working on more in 2020?

If this future were to unfold, we might wish we had developed and broadly disseminated a more tightly integrated view and a set of models that clearly articulate the nuanced relationships and trade-offs between surveillance and privacy. The zero-sum, worst-case privacy narratives of 2020 would prove dysfunctional if the choices were perceived as binary and over-determined by a public health and economic catastrophe of this magnitude. We might also wish for more research and insight into how to reverse what is often described and perceived as a “ratchet effect” on personal data: once it’s out of the individual’s control, is there a way for it to provably “sunset,” or is there a way to take it back?

The team members of the Center for Long-Term Cybersecurity contributed to this piece through their thoughtful brainstorming and scenario development. Kayla Brown was the lead author of the Global Singapore scenario. We welcome your feedback on these scenarios. Please send any ideas, comments, or questions to cltc@berkeley.edu.
