Journalism, Sensors and Privacy in an Age of Surveillance

The Ethics of Sensor Journalism Part Three


(Read part one and part two)


It would be impossible to discuss the intersection of sensor journalism and our communities without addressing the impact of recent revelations about privacy and surveillance by governments and corporations. While the privacy issues that sensors raise exist independently of the Snowden revelations, news about the National Security Agency (NSA), and commercial data collection and tracking, those events are deeply influencing the national dialogue about privacy and security in the digital age. While public response to the NSA revelations was initially muted, polls shifted dramatically as a fuller picture of the agency’s surveillance programs emerged. The Snowden documents — paired with security breaches at major retailers like Target — have spiked interest in digital privacy and security.

In many respects, sensors may seem even more intrusive to the general public than the NSA’s bulk collection of metadata. Sensors collect data about our movements and our environment, which for many people feels much more immediate than recording what we do on our phones and computers. Journalists are late to the big data game, and we are entering it amid growing questions and concerns about surveillance in our lives in general.

Image by g4ll4is, used via Creative Commons

During the Tow Center’s 2013 sensor journalism ethics panel, Robert Lefkowitz, the CTO at Sharewave, argued that the scope of the ethical questions we face is directly related to how we define sensor journalism. If sensor journalism is only about collecting environmental data, “then the issues are fairly clear,” he said, but once we begin collecting personal data they expand exponentially. Of course, defining the lines of privacy and personal information is increasingly challenging. At the same event, another panelist, Professor Joanne Gabrynowicz, pointed out, “The whole idea of public and private spaces is being turned on its head.”

This concept is reinforced in an important paper on data protection laws by Eloïse Gratton, in which she writes, “A literal interpretation of the definition of personal information is no longer workable.” Changes in technology and in the amount of data people leave behind, both intentionally and unintentionally, are challenging what we mean by personal data. As the power and accuracy of sensors expand and tools for parsing data sets develop, analysts and algorithms can connect the dots in ways that can turn seemingly anonymous information into personally identifiable data.


In her paper, Gratton argues that, given these changes, a literal definition of personal data can create three contradictory consequences when used to guide ethics and policy. The literal definition of personally identifiable information could be too inclusive (threatening the free flow of information), too lax (ignoring how context can make individual data points sensitive), or simply too vague (leaving uncertainty about the point at which a piece of data becomes personally identifiable). Thus, returning to Lefkowitz’s initial point above, it becomes more difficult to differentiate “benign” environmental data, for example, from “sensitive” personal information.

Instead of drawing arbitrary lines based on outdated notions of personal information, Gratton suggests we focus on the purpose of the protection — to avoid harm. Using risk of harm as a guide in thinking about how we collect and handle data has its own layers of uncertainty and interpretation, but it resonates both with the community-focused approach outlined earlier and with journalism ethics that have long grappled with balancing the needs of privacy with the public interest.

In a very real way, sensors are expanding the web of surveillance, and so we need to consider the various risks of harm embedded in our actions. Much as journalists should base their own security precautions on real-life threat models, we should assess what threats our sensor projects pose to our communities. This raises important questions for journalists regarding how they collect, store, and protect the data they gather (as well as legal questions around how governments can and will request access to that data).

Who Controls The Data?

In a 2012 article, Alistair Croll argues that big data is this generation’s key civil rights issue. He points out how personal data collected on and offline is being used to make discriminatory decisions in everything from credit cards to loan offers. “Data doesn’t invade people’s lives,” he writes, “lack of control over how it’s used does.”

Image by opensource.com, used via Creative Commons

Indeed, the question of who gets to control the data collected about us was also a key point in a set of recommendations delivered to the White House in early 2014 by a coalition of civil and human rights groups. “Individuals should have meaningful, flexible control over how a corporation gathers data from them, and how it uses and shares that data,” wrote the groups.

Both Croll and the civil rights coalition recognize that data can be a powerful force for positive change in health, environment, and social justice, but that without real agency from communities it can also be a double-edged sword. “While data is essential for documenting persistent inequality and discrimination,” said Wade Henderson, President and CEO of The Leadership Conference on Civil and Human Rights, “no one — no matter their color, ethnicity, or gender — should be unfairly targeted by businesses or the government for dragnet surveillance, discriminatory decisions, or any other unwarranted intrusions.”


It is easy to see how journalists bringing sensors into a community might be seen as an unwarranted, or at least unwelcome, intrusion. We have to understand the history and current state of surveillance in diverse communities around the United States, and the discrimination and threats linked to that surveillance. Our decisions have to be informed by that history and made in conversation with communities.

In addition to the transparency and community engagement outlined above, whenever possible journalists interested in employing sensors should develop meaningful ways for communities to access, control, and discuss their participation and data. Developing these systems will only come through hands-on application, and will demand flexibility based on the journalism project, the newsroom, and the community. A few questions for consideration as newsrooms develop sensor journalism projects include:

  • Who has control over the data we collect (both during collection and afterwards)? Can communities and individuals access the data?
  • Can individuals and communities opt-in or opt-out? When is such an opt-in process important and when does the public interest outweigh the need for an opt-in?
  • If publishing open data sets, can we (or our communities) control whether or not the data gets used for commercial purposes?
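The opt-out question above is one place where a policy can be encoded directly into a data pipeline. Below is a minimal sketch of that idea; the sensor IDs, field names, and the `respect_opt_outs` helper are all invented for illustration, not drawn from any real newsroom system.

```python
# Hypothetical opt-out list maintained from community feedback;
# the sensor IDs and reading fields below are invented for illustration.
OPTED_OUT = {"sensor-07", "sensor-12"}

readings = [
    {"sensor_id": "sensor-03", "noise_db": 62.1},
    {"sensor_id": "sensor-07", "noise_db": 71.4},
]

def respect_opt_outs(rows, opted_out=OPTED_OUT):
    """Drop readings from participants who have withdrawn consent,
    before any analysis or publication step sees them."""
    return [r for r in rows if r["sensor_id"] not in opted_out]

publishable = respect_opt_outs(readings)  # only sensor-03 remains
```

Filtering at the earliest stage like this means an opt-out takes effect everywhere downstream, rather than depending on each analyst remembering to honor it.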

Can We Protect The Data?

A key part of how we control the data we collect relates to how we store and protect it. There has been a growing emphasis on transparency and calls for data journalists to publish their data openly on the Web. While that level of transparency should be the standard for most data, depending on what our sensors collect and the feedback from our communities, we will need to think about whether, and how, we should share what we collect publicly.

Image by Yuri Samoilov, used under Creative Commons

In the planning stages of a data journalism project, newsrooms should discuss their policies and procedures around anonymizing and sharing data. Part of that planning must also address how the data will be stored. Newsrooms need to understand the best practices for safely storing information, and be up front about how long the data will be kept and for what purpose.
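One common first step in that planning is pseudonymization: replacing direct identifiers before a data set is shared. The sketch below is illustrative only — the field names, the salt, and the readings are invented — and, as Gratton’s point about re-identification suggests, a salted hash is not full anonymization, since tokens can sometimes be linked back using auxiliary data.

```python
import hashlib

# Hypothetical raw sensor readings tied to individual households;
# the addresses and values are invented for illustration.
raw_readings = [
    {"household_id": "123 Main St", "pm25": 14.2},
    {"household_id": "456 Oak Ave", "pm25": 9.8},
]

def pseudonymize(record, salt="newsroom-secret"):
    """Replace the direct identifier with a salted hash so the published
    data set cannot be trivially linked back to an address. The salt must
    stay private, or the tokens can be recomputed from known addresses."""
    token = hashlib.sha256((salt + record["household_id"]).encode()).hexdigest()[:12]
    return {"sensor_token": token, "pm25": record["pm25"]}

published = [pseudonymize(r) for r in raw_readings]
```

Even this simple step forces the newsroom to decide, in advance, which fields are identifiers and who holds the salt — exactly the kind of question best settled before collection begins.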

In his article on big data and civil rights, Croll notes, “If I collect information on the music you listen to, you might assume I will use that data in order to suggest new songs […] But instead, I could use it to guess at your racial background. And then I could use that data to deny you a loan.” This example should make us question how we’ll control future uses of the data we collect and store.

Also, when our sensors are our sources, there will be times when we need to consider how we will protect that data just as we would protect a source. Jonathan Peters, the press freedom correspondent for the Columbia Journalism Review, has written about the security and legal risks of journalists’ increased reliance on cloud storage for files and data. “The legal protections for journalists using the cloud is, well, cloudy,” he writes. The Privacy Protection Act of 1980 has historically protected journalists from government searches and seizures, but Peters notes:

“It’s unclear whether the things journalists store in the cloud enjoy the same legal protection as the things they store on personal computers, on local servers, and in desk drawers.”

In the early days of sensor journalism it might be difficult to imagine a project whose data set could be of interest to government investigators, but in many ways, that is the point. We should grapple with these questions now and develop ethical guidelines for sensors that address all the kinds of data we collect — recognizing that, as our identities are increasingly defined by our data, more and more of that data is personal.

Where Do We Go From Here?

The intersection of changes in technology and shifts in newsroom processes and ethics means it is nearly impossible to define any clear-cut guidelines for sensor journalism ethics. Instead, we can and should identify values and questions that can guide us appropriately. Recognizing that sensor journalism projects will vary greatly in project design, the technology they use, and the context of the communities where they are working, our ultimate goal should be to align those various elements in ways that build both new knowledge and safer communities.

Photo by Don McCullough, used via Creative Commons

Alistair Croll suggests that one of the most dangerous things we do when we collect data is “collect first and ask questions later.” In this model, he argues, “we collect information long before we decide what it’s for.” If we don’t ask these questions in advance, there is no way we can provide our communities any control, or ensure that we are protecting their privacy and storing their data securely. As Croll puts it, “This act of deciding what to store and how to store it is called designing the schema, and in many ways, it’s the moment where someone decides what the data is about. It’s the instant of context.”
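Croll’s point about schema design can be made concrete: the fields a newsroom chooses to record determine what the data can later reveal. This is a minimal, hypothetical sketch — the project, field names, and values are invented — showing a schema that deliberately records only coarse location and time rather than exact addresses or device identifiers.

```python
from dataclasses import dataclass

# Illustrative schema for a hypothetical air-quality project. The privacy
# decision is made at design time: there is simply no field for an exact
# address, precise GPS coordinates, or a persistent device ID.
@dataclass
class AirQualityReading:
    neighborhood: str   # coarse area, not an exact address
    hour: int           # hour of day, not a full timestamp
    pm25: float         # the measurement itself

reading = AirQualityReading(neighborhood="Red Hook", hour=14, pm25=12.5)
```

Data that is never collected cannot be breached, subpoenaed, or repurposed, which is why deciding the schema before deployment is itself an ethical act.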

While I’ve referenced the ongoing ethical debates and inquiries at the major journalism institutions, the questions outlined here need to be better integrated into those debates. As a field, we need to weave these threads together and consider ethical guidelines that are not so narrow that every new technology challenges them and sends us back to the drawing board. To that end, we should also draw on the expertise of those groups working at the intersection of press freedom and civil liberties like the American Civil Liberties Union, Free Press, and the Electronic Frontier Foundation. These groups, which have long worked with questions at the heart of journalism and individual privacy, can add a lot to our discussions and serve as a check against our industry’s assumptions and culture.

If sensor journalism is to expand and thrive, however, we’ll need communities, elected officials, technologists, and newsroom stakeholders to buy in. It is not enough to define the ethics of sensor journalism through the lens of newsrooms and journalism ethics; we must also shape our choices through the lens of communities and their values. We need to be aware of historic issues around power and racism that are also embedded in debates about surveillance and move forward in a way that invites more people to the table to discuss these issues. This is an organizing challenge and we should rise to meet it.
