Using Data to Inspire Humans and Baffle Machines
We’ve all been there: you look up some term on Wikipedia and instead you are redirected to its “Disambiguation” page. Yes, you asked for a certain term, but what exactly did you mean? Wikipedia cannot deal with this ambiguity and hence tries to discharge it. Unsurprisingly, this odd term, disambiguation, has its origins in computer science, and through Wikipedia it has crept into culture as if it always belonged. Unlike human culture, Wikipedia and other digital systems are built on binary code that simply cannot handle ambiguity. And so ambiguity is made the enemy of the digital and is eradicated in favor of what can be more easily processed.
Undoubtedly, there are benefits to this disambiguation. It gives us clearer understanding, simpler collaboration and a common ground, so we can fit nicely within the big picture that is then spat back at us from the screen. But it comes at a price. In the effort to make everything more predictable and controllable, we might lose the ability to step outside of ourselves, to challenge the norm and to maintain a sense of surprise and discovery.
Though not framed in terms of ambiguity, our digitized social lives have become major fields of contestation. The intricacies of our ever-changing relationships defy categorization and our intimacy and privacy become the price we must pay for connectedness and friendships.
“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place”
In 2009, this statement by then Google CEO Eric Schmidt was considered very undiplomatic and, at the same time, very telling. It represents not only how disconnected Silicon Valley is from the nuances of culture, but also how frustrated it has become in its monumental efforts to mine, extract, index, categorize and organize the world’s information and make it universally accessible and exclusively exploitable. On the other hand, the widespread anxiety caused by this erosion of privacy leads most of us to a somewhat similar conclusion. Overwhelmed by digital media, we feel forced to adjust to its binary premise that all information, once networked, is prone to becoming public. And therefore the best way to maintain the integrity of your communication’s context is not to communicate in the first place, at least not online.
These concerns have become even graver with the rise of so-called “Big Data”, the technological ideology that attempts to derive quality from quantity. As every bit is collected, cross-examined and analyzed against the statistical norm, it becomes even harder for us to maintain our contextual integrity.
Some choose to ignore the risks; others may assume responsibility in the form of self-censorship; others, exhausted from the race under the omnipotence of Big (Data) Brother might go as far as committing social media suicide, disconnecting from the network altogether. Yet in recent years, and more forcefully since Edward Snowden’s NSA revelations, we’ve seen the rise of Crypto-Culture, led by a highly tech-savvy community that uses strong encryption tools to conceal the content of their communication.
But while it offers some protection, cryptography does not challenge the main assumption at the core of Big Data ideology, and to some degree it even strengthens it. We go online not to hide, but to communicate and to express ourselves. Crypto Culture is presented as a counter-culture movement, but its resistance to the powers that be only goes so far. At the end of the day, both the NSA and crypto-activists share a similar perspective. The former believes we can be reduced to a set of signals and therefore attempts to collect as many of them as possible; the latter also believes we can be reduced to a set of signals and therefore attempts to conceal as many of them as possible.
It seems the Modernist project has reemerged through datafication: Post-Modernism and Relativism were just temporary hiccups, as now we can finally analyze everything in bulk as data. Now we can find those hidden patterns, define the underlying trends and at last expose the overarching big picture — finally… the Modernist truth.
The myth of big data is reductive and de-humanizing. Accordingly, acknowledging it as a myth is essential for us to embark on the humanist project of demystifying it. The Turing Test was never meant to examine whether machines are intelligent enough to pass as humans; it was meant to examine whether humans themselves can still pass as such. Let’s not insist on packaging our lives in semantic, machine-readable form; let’s practice humor, poetry and art — create signals that inspire humans and baffle machines. Rather than give in to Disambiguation, let’s celebrate Reambiguation!
One place to start in demystifying technology is the emerging school of Seamful Design. The term Seamful Design was coined in 1994 by Mark Weiser and later developed by Matthew Chalmers and Ian MacColl.
“Seamfulness is about taking account of these reminders of the finite and physical nature of digital media. Seamful design involves deliberately revealing seams to users, and taking advantage of features usually considered as negative or problematic.”
— Mark Weiser¹
Matthew Chalmers calls² for reintroducing the seams previously hidden in the name of seamlessness, through seamful technology and seamful interactions. Now, in the age of seemingly seamless big data, it is time to reintroduce seamful information design into our lives and our databases.
One such tactic for seamful data is obfuscation. AdNauseam³ is an obfuscation tool I produced in collaboration with Daniel Howe and Helen Nissenbaum. It hides ads just like an ad blocker, except that every blocked ad is then silently clicked by AdNauseam, confusing your data trackers by virtually “liking” every ad. AdNauseam is just one of many obfuscation examples chronicled in Finn Brunton and Helen Nissenbaum’s 2015 book Obfuscation: A User’s Guide for Privacy and Protest⁴. Obfuscation exposes the seams of the data collection and analysis process by turning data gluttony on its head: while the myth of big data assumes that more data leads to clearer signals, obfuscation shows that more, noisier data actually results in more ambiguity.
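The mechanics of that reversal can be sketched in a few lines. This is a toy model, not AdNauseam’s actual code: the categories, click counts and noise ratio are invented for illustration. A tracker profiles a user by their share of clicks per ad category; injecting indiscriminate clicks drowns the genuine signal in noise:

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the sketch is reproducible

CATEGORIES = ["sports", "health", "finance", "travel", "politics"]

def genuine_clicks(interest, n=20):
    # A user who only ever clicks ads from one category of real interest.
    return [interest] * n

def obfuscated_clicks(interest, n=20, noise_ratio=10):
    # Obfuscation: alongside each genuine click, inject many
    # indiscriminate clicks spread across all categories.
    clicks = genuine_clicks(interest, n)
    clicks += [random.choice(CATEGORIES) for _ in range(n * noise_ratio)]
    return clicks

def tracker_profile(clicks):
    # The tracker's view: relative click share per category.
    counts = Counter(clicks)
    total = sum(counts.values())
    return {c: counts[c] / total for c in CATEGORIES}

clean = tracker_profile(genuine_clicks("health"))
noisy = tracker_profile(obfuscated_clicks("health"))

# clean["health"] is 1.0: an unambiguous signal.
# noisy["health"] sinks toward the uniform baseline (1/5): the tracker
# can no longer tell real interest from noise.
```

More data, but more ambiguity: the tracker’s profile now says less about the user than the smaller, unobfuscated one did.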
Another opportunity to explore seamful information design came through a series of workshops I’ve been leading, titled Disinformation Visualization⁵. In an attempt to challenge the assumed neutrality of data, I encouraged workshop participants to be truthful to the data yet deceitful in its framing, and so participants used false dichotomies, biased categorization, selective sampling, deliberately counter-intuitive use of color and misleading iconic language. They also wrapped the visualizations in text that further emphasized their skewed analysis. The often over-the-top results demonstrated to both the viewers and the producers of these visualizations just how fragile and bias-prone the data gathering, analysis and representation process can be, exposing its malicious seams and problematizing the de-politicization of disambiguated, seamless data visualization.
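One of those tactics, selective sampling, can be reduced to a numeric sketch. The yearly figures below are invented for illustration: every value reported is true, yet choosing the window reverses the apparent trend:

```python
# Invented yearly figures: an overall decline with one rebound year.
values = {2008: 90, 2009: 80, 2010: 55, 2011: 70, 2012: 60, 2013: 45}

def trend(series):
    """Change from the first to the last year in the series."""
    years = sorted(series)
    return series[years[-1]] - series[years[0]]

# Honest framing: the full record shows a clear decline.
honest = trend(values)  # negative

# Deceitful framing: sample only 2010-2011 and the "trend" flips.
cherry_picked = trend({year: values[year] for year in (2010, 2011)})  # positive
```

No number was falsified; the lie lives entirely in the frame, which is exactly the seam the workshops set out to expose.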
While the short-term incentives for disambiguation are tempting, in the long run we might find the disambiguated, datafied life less worth living. The creative ethics of seamful design and the performative tactic of obfuscation invite us to reambiguate our lives, to demand more from technology, and to reject the data surveillance packaged in Silicon Valley’s self-styled Neo-Modernist propaganda; not by hiding, but through explosive expression, poking holes in the cloud and asking how big big data can really get before it collapses into itself.
This article was commissioned by Gaîté Lyrique & Lienart and initially published in French under the title “Les Big Data face à l’ambiguïté” in the exhibition reader Extra Fantômes.
1. Weiser, Mark. “Creating the Invisible Interface (Invited Talk).” Proceedings of the 7th Annual ACM Symposium on User Interface Software and Technology. ACM, 1994.
2. Chalmers, Matthew. “Seamful Design and Ubicomp Infrastructure.” Proceedings of the Ubicomp 2003 Workshop At the Crossroads: The Interaction of HCI and Systems Issues in Ubicomp. 2003.
3. Howe, Daniel C., Helen Nissenbaum, and Mushon Zer-Aviv. “AdNauseam — Clicking Ads So You Don’t Have To.” Accessed January 26, 2016. http://AdNauseam.io/.
4. Brunton, Finn, and Helen Nissenbaum. Obfuscation: A User’s Guide for Privacy and Protest. MIT Press, 2015.
5. Zer-Aviv, Mushon. “Disinformation Visualization: How to Lie with Datavis.” Visualising Information for Advocacy, January 31, 2014. Accessed January 26, 2016. https://visualisingadvocacy.org/blog/disinformation-visualization-how-lie-datavis.