The coronavirus could be here to stay. Your privacy may be another victim.

Mashable
Published in Mashable
11 min read · Sep 7, 2020

Your health may not be the only casualty of a pandemic future.

BY JACK MORSE

Image: Vicky Leta / Mashable

Mashable’s series Tech in 2025 explores how the challenges of today will dramatically change the near future.

You want to go grocery shopping, but the location-tracking app you were forced to download will report your movements. And because you don’t have permission to leave campus, you’ll be automatically locked out of your dorm.

You think about your part-time job, and wonder what data is being collected by the camera-tracking technology installed to monitor social distancing on the warehouse floor. If you get too close to your coworker, the system will automatically flag you. Another flag, and you might lose your job.

You consider going for a run, but worry that it will raise your body temperature, which in turn could trigger a health-monitoring device you were pressured into wearing. You might then be prevented from participating in campus activities.

Your mind turns to your kid, who every day at school is tracked by cameras monitoring her every move. If the system decides that she isn’t socially distancing enough, will she be sent home?

You can’t take time off to watch her. Lose your job and you won’t have health insurance. And you really, desperately, need to keep your health insurance. There’s a pandemic, after all.

This is not some distant future. This is real life, today, in America. And, if we’re not careful, it could get worse. Because despite what we all hope, the coronavirus is probably here to stay.

The forever problem

Let’s just get this out of the way: You are probably not getting a broadly effective coronavirus vaccine anytime soon.

Despite the cautious optimism of Dr. Anthony Fauci, the director of the National Institute of Allergy and Infectious Diseases, that we will have a vaccine by the end of 2020, there is less cause for you, specifically, to be optimistic.

In all of history, mankind has only successfully eradicated one human virus, smallpox, and that took around 200 years of sustained effort. Today, a coronavirus vaccine that is both safe and 50 percent effective would be considered a “game changer.”

Let’s say that does happen, as Dr. Fauci hopes, by the end of the year. What does that mean for you? Do you fall into the category of “critical health care and other workers?” No? Then as supplies will initially be limited, you’re probably not getting the vaccine for a while.

Even then, when you and all your friends and family do get the vaccine, the coronavirus will still be around. Importantly, not everyone will follow in your footsteps. As years of anti-vaxxer fear mongering and the backlash over wearing masks have shown us, not all Americans are inclined to believe medical science.

In mid-August, NBC News published the results of a poll of 34,269 U.S. adults. Thirty-three percent of those who identified as Republican, or lean Republican, said they would not get a coronavirus vaccine even if one became available. So did 12 percent of those who identify as Democrats or lean Democratic.

Like the ongoing measles outbreaks in the U.S., the coronavirus will likely never truly go away. Your privacy, along with your health, is at risk. And that’s worrying scores of policy experts and elected officials.

A new normal

The United States has not tamed the coronavirus. As of early September, the Centers for Disease Control and Prevention reported over 6 million cases and 185,000 deaths in the U.S. alone. With tens of thousands of new confirmed cases each day, the pandemic will likely continue to burn through the end of summer, into fall, and explode in the winter months.

None of this is news to data-hungry behemoths like Google and Facebook, both of which told staff to not bother returning to their offices until the summer of 2021 at the earliest. It is also not news to the companies actively building, marketing, and selling new forms of surveillance technology meant to help track the daily habits of millions of Americans.

In the face of all this, universities, restaurants, sports leagues, and businesses of all kinds are stumbling forward — some more blindly than others — with varying forms of invasive tracking technology.

Surveillance creep

The Berkeley Center for Law & Technology, established in 1995, seeks to “foster the beneficial and ethical advancement of technology,” and has long explored the effects of new technology on privacy and constitutional law.

Jim Dempsey, the executive director of the center, sees a clear path forward in the forever age of the coronavirus.

“[It] is likely that there will be more data collection and usage,” he explained over email. “Key principles of privacy protection should apply: collect the minimum necessary for the purpose at hand and keep the data only as long as necessary to serve that purpose.”

More data collection, however, is often fundamentally in conflict with privacy protections. Early signs of that looming conflict can be found at Michigan’s Oakland University, which still plans to open to students in the fall. To accomplish this, the school is depending in part on BioButtons — “a coin-sized, disposable medical device that measures continuous temperature and other vital signs for a remarkable 90-days” — which are to be worn by students.

“Many students are already hesitant, at best, to be tracked as they stay on campus.”

“The button will be used in conjunction with the daily health assessment to determine if you are able to participate in campus activities,” read an Oakland University statement — from a now-deleted webpage — initially reported by ClickOn Detroit.

Students pushed back, gathering more than 2,000 signatures for a petition demanding that wearing the buttons be made voluntary.

“Many students are already hesitant, at best, to be tracked as they stay on campus (and as they leave for the weekend to visit home, go to class, grocery shop, etc.),” read the petition.

University officials later clarified that the BioButtons are not mandatory, but they are still “strongly encouraging a BioButton to be worn[.]”

Oakland University isn’t the only higher-ed institution dipping its toes in the surveillance pool. Harvard has proposed a tracking system, TraceFi-MyDataCan, which piggybacks off students’ smartphones to theoretically assist with contact tracing.

Crucially, the TraceFi system is opt-out. Students have to literally turn off their WiFi on smartphones, laptops, and tablets — a nearly impossible task for a modern college student — in order to avoid being tracked.

Many people aren’t even given the opt-out option. Albion College, a private liberal arts college in Michigan, has made its contact-tracing app mandatory. That app collects real-time location data on its students, who cannot delete it without risking suspension.

We’ve seen this outside of the U.S. as well, with India’s de facto mandatory contact-tracing app Aarogya Setu. The app relies on Bluetooth and location data to track users, and shares that data with the Indian government. In May, security researcher Baptiste Robert found the app could be exploited by hackers to confirm the diagnosis of people infected with the coronavirus.

Image: Aarogya Setu

It’s worth emphasizing that contact-tracing apps are not the same thing as traditional contact tracing. The former is controversial, and may be ultimately worthless, while the latter is a tried and true public health measure that involves actually getting in touch with people who may have been exposed to the virus.

The Electronic Frontier Foundation is a San Francisco-based nonprofit that has long worked to defend civil liberties in the face of ever-expanding technology. Over the course of a lengthy conversation, EFF senior staff attorney Adam Schwartz cautioned against letting the measures we’ve seen employed overseas — and at schools like Albion — be enacted in the U.S.

“For companies, or businesses, and governments to be forcing people to put stuff on their phones that tracks them and shares information about them, that’s a red line that we can’t cross,” Schwartz said over the phone.

There are other forms of tracking being deployed in the U.S., today, that are not truly optional. As the Wall Street Journal reported in August, the Perry Township school district in Indianapolis plans to use a Motorola Solutions system of cameras to both identify if students are wearing masks and whether or not they’re socially distancing. The system “detects physical attributes” in order to keep track of specific individuals.

Perry Township lists four kindergartens, 11 elementary schools, two sixth-grade academies, two middle schools, and two high schools on its website.

“These technologies always hit low-income communities the worst, and people of color the worst.”

It’s not just schools going down this route. Amazon is now using a camera system to monitor its employees with the stated goal of enforcing social distancing. In June, Politico reported that the multinational firm PwC planned to deploy its own contact-tracing app. It uses Bluetooth to track who users come into contact with, and will reportedly be made mandatory for employees returning to the office.

“What does become inherently bad, in our opinion, is corporations saying you can’t come in here unless you download an app that’s tracking your movements or tracking who you are in contact with,” noted the EFF’s Schwartz. “We understand that some employers are talking about doing this to their employees, that some restaurants are thinking about this.”

In addition to being an invasion of privacy, such a system sets people up for potential harassment and stalking. In May of this year, a Subway restaurant employee used a woman’s phone number, provided to assist in contact tracing, to hit on her. If we are forced to hand over even more intimate data, like our location, the opportunity for abuse only grows.

Using what’s there

History has repeatedly shown that once a new form of surveillance technology has been deployed, regardless of the stated intention, it’s often repurposed for wildly different purposes. Take San Diego’s smart streetlights. Originally intended to help reduce traffic fatalities and carbon emissions, the network of cameras is now exclusively used by police to do things like surveil Black Lives Matter demonstrators.

“Often when measures are introduced for a specific purpose, they linger on because people become acclimated,” Ryan Calo, a law professor at the University of Washington, told the Wall Street Journal.

Image: Ariana Drehsler / Getty

We saw this in the early stages of the pandemic, even before the scale of the disaster was fully clear. Advertising companies with access to location data on millions of Americans decided to paint themselves heroes by using that data to track, at least in theory, whether or not people were following shelter-in-place guidelines or socially distancing.

As Motherboard reported in August, the Department of Homeland Security purchased location data generated by apps on millions of Americans’ phones. Originally gathered for advertising and marketing purposes, the location data is now being used for something much darker. As the Wall Street Journal reported in February, such data has been purchased by DHS and used for “immigration and border enforcement.”

Both the amount, and kind, of data being collected during the coronavirus crisis continues to expand. Someone will find a use for it — possibly one of the many would-be dictators looking to solidify political control, like Turkey’s Recep Tayyip Erdoğan and the Philippines’ Rodrigo Duterte.

The U.S. government has long demonstrated its willingness to engage in unconstitutional surveillance. Imagine what Trump might do with all of this new tracking data over the course of a second term.

Targeting the vulnerable

The pandemic is not the great equalizer.

We’ve seen the disproportionate effects of the coronavirus time and time again. Latinos are both hospitalized, and dying, at rates four times those of their white counterparts. Those with lower incomes have been hit the hardest by the economic downturn associated with the pandemic. But health, and wealth, aren’t the only ways the already vulnerable are at further risk.

“There is a color of surveillance.”

“When we talk about Orwellian surveillance technology, there is a color of surveillance,” observed Schwartz. “These technologies always hit low-income communities the worst, and people of color the worst.”

This matters, and not just because it is unjust.

“When you intrude on people’s privacy — their locations, their associations — it’s not just an invasion of privacy,” Schwartz explained. “It discourages people from engaging in behavior protected by the First Amendment: meeting with a union organizer, going to a protest, attending a church.”

In this way, the surveillance technology heralded as a tool to reopen schools, businesses, and the country at large might instead, in the long run, make things worse for the groups of people already struggling the most.

Glimmer of hope

As the pandemic spring shifted to pandemic summer, what was once unthinkable congealed into the new normal. With the U.S. seemingly unwilling or unable to bring the coronavirus to heel, a new future emerges on the horizon: a future where our remaining shreds of privacy have been reluctantly sacrificed.

It doesn’t have to be this way.

Legislators across the country are pushing bills specifically designed to protect Americans’ privacy in the face of an ongoing pandemic.

While we don’t have a national, comprehensive data privacy law, there are proposed COVID-19 privacy bills in Congress and the state legislatures of California, New York, and New Jersey.

While some need work, others, like New York State Senate Bill S8448D, which has been passed in the State Senate but not yet signed into law, could be effective.

“The longer the crisis continues, the greater the temptation to ‘break the glass.’”

The New York bill, supported by the New York Civil Liberties Union, the EFF, and Consumer Reports (among others), seeks to regulate the “collection of emergency health data and personal information and the use of technology to aid during COVID-19.”

We need to codify these protections into law, now, before things get worse. Because if there is anything the last few years have demonstrated, it is that things can always get worse.

In Israel, authorities used drones to literally peer into people’s home windows to ensure they were quarantining. In China, drones equipped with thermal imaging were used to keep people indoors. A mandatory contact-tracing app in Qatar exposed the names, health status, and location data of over 1 million users. A U.S. contact-tracing app secretly shared location data with Foursquare.

If all this happened in the first six months of the pandemic, imagine what the next five years could bring.

“The longer the crisis continues, the greater the temptation to ‘break the glass,’” Schwartz warned.

What lies on the other side of that glass is still, for the time being, up to us.

Originally published at https://mashable.com
