Turning a New Leaf on Privacy in 2016

What today’s FTC PrivacyCon and the latest Pew internet privacy report might reveal about the current state of data surveillance and free society, and who’s still not part of the discussion

An allegory of privacy for thinking about our data-driven sins, committed both with intent and without: “Adam and Eve,” Lucas Cranach the Elder, oil on panel, 1528, Uffizi, Florence

Contrary to the prevailing myths circulated in the media and among vested interests, today’s privacy discussions about attitudes and industry practices are nuanced, with clearer pictures of how consumers understand what’s at stake when our data is collected and shared. We don’t need to surrender to the false dichotomy that we can have either convenient technologies or privacy, but not both. On the contrary, Americans surveyed in Pew’s latest report, released today, show varying degrees of acceptance but surprising sophistication in understanding how “it depends” on the context, the nature of the exchange, and the disclosures. Complementing the consumer attitudes Pew reveals, the scholarly research presented at the FTC’s PrivacyCon today supported the idea that the choice between privacy and economic value is not binary but is currently rife with complexities, problems, and inequities, and that it needs further research and greater openness to changing certain business practices.

With clear evidence of a business climate where disclosures are generally unusable, and occasionally dishonest, deceptive, or potentially discriminatory, we need to focus on solutions beyond “admiring the problem” and accept that the research concurs on this need. If the linear growth of ad blocking is any indication, and it correlates with Pew’s findings and the PrivacyCon research, consumers realize they have lost control of their data and privacy to companies and will increasingly take matters into their own hands, even when that action is mutually destructive. Blocking ads harms publishers far more than advertisers, hurting those who make the products we actually want to consume. This kind of conflict arises when people are not presented with other viable options.

Attacks on Privacy Accountability

The lawsuit involving Spokeo, a notorious data broker, is an example of willful corporate intent to overturn law (the Fair Credit Reporting Act) when legislation conflicts with a questionable business model. Indeed, there are multiple attacks on accountability: in the courts (binding arbitration to nullify class actions), in Congress (anti-regulation sentiment despite obsolete legislation), at the FTC (a demonstrable need for new policies), and within industry (which sets the terms exclusively on profit motive). This climate leaves consumers on their own to build, adopt, and evangelize protections as government and industry both fail and flail at providing protections, or at least more trustworthy communication. Laissez-faire privacy regulation promotes the “tragedy of the commons” and “prisoner’s dilemma” scenarios typified by the ad-blocking trend and its surrounding controversy. What are the business incentives to protect privacy? Consumers decide.

The potential First Amendment defense of tracking by industrial surveillance companies is especially troubling and epitomizes how technological threats to democracy are not exaggerations. It is especially concerning because privacy, as a right of seclusion and of control over our identity, is intrinsic to our freedom of speech and pursuit of liberty. We can avoid a constitutional showdown between business rights and consumer rights if we demand that certain members of industry set their fiduciary obligations aside for a mere moment and remember that they are citizen consumers too.

As more peer-reviewed scholarly research is published documenting the problems of online data protection and disclosure in the context of actual and potential harm, anti-regulation lobbyists no longer have a cogent talking point with which to defeat every attempt at curtailing business practices that may be profitable but are consistently shown, with empirical evidence, to harm the public interest and democracy. Until companies and their K Street mouthpieces change their tune and join the American public in articulating a nuanced understanding of what’s at stake, we should disregard what they say as merely paid corporate speech, artificially distinct from the society these businesses operate within. Consumers decide.

I’m not advocating regulation, though. I agree with free-market fundamentalists that bureaucrats are ill-equipped to solve these conflicts, now or in the future. However, it’s very clear that a class of professionals has been excluded from conversations about how to balance privacy rights with extracting and exploiting data: designers. Indeed, no designer or user-experience specialist presented at the FTC’s PrivacyCon today. (I tried, and failed, to earn a chance to contribute.)

Privacy is a Solvable Design Problem

Privacy controls on smartphones need work. At PrivacyCon, Serge Egelman of the University of California, Berkeley presented research on how users understand and respond to privacy permissions in mobile apps: the dialogs that pop up in Android and iOS when you install an app or at runtime (the first time you use it). The key takeaway from his group’s research is that the current user experience of mobile privacy interfaces is flawed because the notifications become habituated and therefore meaningless. Worse, Egelman’s team found that in half of the tested instances, users were not presented with a viable understanding of when their data gets shared. His work focused on Android, but the findings can reasonably be applied to Apple’s platform as well. Today’s privacy controls, as a designed user experience on smartphones, aren’t succeeding at providing clear notice and consent.

We need the equivalent of visual nutrition labels to augment unread and inscrutable privacy policies. Ashwini Rao from Carnegie Mellon University, along with Heather Shoenburger from the University of Oregon and Jasmine McNealy from the University of Florida, separately presented empirical evidence supporting the need for a widely adopted (or mandated?) visual summary of privacy and data-policy practices. People do not read privacy policies, yet their understanding of the data collection, sharing, and control features of a website or application greatly affects their behavior. In most cases, companies’ habit of obfuscating their policies and practices demonstrably harms the basis of trust with their customers and limits the capacity for constructive engagement. In that regard, it’s in the business community’s interest to embrace the equivalent of a nutrition label for privacy standards and practices. We should be able to understand at a glance how our data is collected, shared, and used, and what rights we have to control and delete it.
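To make the idea concrete, here is a minimal sketch in Python of what a machine-readable privacy label could look like. The class and field names are my own illustration, not any proposed standard or any researcher’s actual format:

```python
# Hypothetical sketch of a machine-readable "privacy nutrition label".
# All field names are illustrative assumptions, not a real standard.
from dataclasses import dataclass

@dataclass
class PrivacyLabel:
    collects: list         # categories of data collected (e.g. "location")
    shares_with: list      # categories of third parties data is shared with
    retention_days: int    # how long the data is kept
    user_can_delete: bool  # whether users can request deletion

    def summary(self) -> str:
        """A one-glance text summary, analogous to a nutrition label."""
        return (f"Collects: {', '.join(self.collects) or 'nothing'} | "
                f"Shared with: {', '.join(self.shares_with) or 'no one'} | "
                f"Retained: {self.retention_days} days | "
                f"Deletable: {'yes' if self.user_can_delete else 'no'}")

label = PrivacyLabel(
    collects=["location", "health"],
    shares_with=["advertisers"],
    retention_days=365,
    user_can_delete=False,
)
```

A site could publish such a structured label alongside its legal policy, and a browser extension or standardized icon set could render it visually, the way nutrition facts are rendered on packaging.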

Rao from CMU discovered that people do not expect their banking site to collect health-care information. But Bank of America does. The bank should either cease this activity or prominently disclose to website visitors, with an icon, that health data is in the mix of tracking activity. If collecting health data encourages consumers to change banks, then collecting it isn’t good business. The market should be able to address these asymmetries, which obscure deceptive yet profitable methods from us. But if self-regulation is the solution, how can consumers punish firms that misbehave? And what about the untapped profit potential of unfulfilled privacy-by-design?

Ad targeting can discriminate and violate a company’s own privacy policy, even if we don’t know exactly how. Researcher Roxana Geambasu, from a team at Columbia, presented recent work and new toolkits that reverse-engineer ad-targeting techniques despite their being shrouded in trade secrets (especially Google’s). The team empirically detected significant instances of gender discrimination and other situations where privacy policies are not being honored. Again, how can consumers punish Google for harming the job opportunities of women, and for failing to uphold its own pledge not to target ads based on factors like religion and sexuality, if we cannot understand how our privacy is exchanged for its services? Thanks to these universities, neutral participants in the privacy debates, we are beginning to benefit from “transparency infrastructures.”
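The core idea behind such transparency tools can be illustrated with a toy differential experiment: create profiles that are identical except for one attribute, then measure whether an ad is shown disproportionately to one group. The sketch below, with made-up ad identifiers and simulated logs, is only an illustration of that general approach, not the Columbia team’s actual methodology or code:

```python
# Toy sketch of differential ad-targeting detection: two sets of profiles
# identical except for one attribute; a skewed exposure rate for an ad
# suggests that attribute is being targeted. Data here is simulated.
from collections import Counter

def targeting_score(exposures_a, exposures_b, ad_id):
    """Difference in how often ad_id was shown to group A vs. group B."""
    rate_a = Counter(exposures_a)[ad_id] / max(len(exposures_a), 1)
    rate_b = Counter(exposures_b)[ad_id] / max(len(exposures_b), 1)
    return rate_a - rate_b

# Simulated ad logs for otherwise-identical "male" vs. "female" profiles.
male_ads   = ["exec_job", "exec_job", "car", "exec_job", "car", "exec_job"]
female_ads = ["car", "exec_job", "car", "car", "car", "car"]

# A large positive score suggests "exec_job" ads skew toward male profiles.
score = targeting_score(male_ads, female_ads, "exec_job")
```

Real systems add statistical significance testing and many decoy profiles to rule out noise, but the underlying controlled-comparison logic is this simple.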

Conclusion

The exclusively legal approach to establishing trust between companies and consumers is failing badly. We know this because Pew’s quantitative data and qualitative verbatim responses point toward these failures, grounded in Americans’ sentiment about privacy and their measurable reluctance to accept unfair trades of privacy for products and services. Designers have not been sufficiently involved in the task of communicating privacy and data practices to consumers, and we can now see more clearly how this is eroding trust and harming the business environment. Designers and privacy advocates should be invited to the table, joining the computer scientists, lawyers, and policy experts who dominate the discourse; they are urgently needed to iterate on and redesign the data-disclosure user experience. Indeed, the influence of lawyers and marketers needs recalibration against the voice of the consumer, represented through user-centered design practices (not just legal briefs and computer science papers). If incumbent firms have no incentive to change their practices, then startups can win the hearts and minds of consumers. The need for better privacy-by-design is unmet. Consumers decide. Will we?


Note: I wasn’t able to attend the FTC’s PrivacyCon in person, but I tuned into the livestream. I may revise this post as I review the published papers when they are released on the FTC PrivacyCon website.