

Journalism in the Age of Moore’s Law
Deception has become the norm in digital design. When Netflix’s servers are overloaded, the service swaps out its personalization algorithm for a general popularity ranking without telling the user. On Instagram, when you heart a photo, it appears to register instantly, but the action actually takes at least a few seconds to reach the server. Coinstar reportedly delays its counting process because, in user testing, the company found that people didn’t believe a machine could count change that quickly.
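The Coinstar case is the simplest to picture in code: the answer is ready almost immediately, and a pause is inserted purely to make the result feel believable. A hypothetical sketch (this is illustrative, not Coinstar’s actual code):

```python
import time

# Hypothetical sketch of an artificial delay, in the spirit of the Coinstar
# example. The function and parameter names are invented for illustration.
def count_coins(coin_values_cents, artificial_delay_s=3.0):
    total = sum(coin_values_cents)   # the count itself is effectively instant
    time.sleep(artificial_delay_s)   # the pause exists only to seem believable
    return total
```

Setting the delay to zero exposes the deception plainly: the answer was ready all along.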
You could argue that any GUI is deceptive, because it obscures underlying processes. The distinction to make is between benevolent and malevolent deception. Two researchers at Microsoft and one at the University of Michigan put it this way:
“We frame the distinction [between benevolence and malevolence] from the end-user’s perspective: If the end-user would prefer an experience based on the deceptive interface over the experience based on the ‘honest’ one, we consider the deception benevolent.”
The examples given above are forms of benevolent deception. In the Instagram example, I prefer the instant gratification of seeing the heart illuminate over knowing exactly what the server is doing.
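The Instagram pattern is often called an “optimistic” interface: show success immediately, reconcile with the slow, fallible server afterward. A minimal sketch of the idea (the class and names here are illustrative, not Instagram’s actual code):

```python
# A minimal sketch of an "optimistic" like button: the interface shows the
# heart as lit the moment you tap, then reconciles with the server afterward.
# All names are illustrative, not Instagram's actual code.

class OptimisticLikeButton:
    def __init__(self, send_to_server):
        self.liked = False        # what the user sees, updated instantly
        self.confirmed = False    # what the server has actually acknowledged
        self._send = send_to_server

    def tap(self):
        self.liked = True         # benevolent deception: shown as done instantly
        try:
            self._send()          # may take seconds, may fail entirely
            self.confirmed = True
        except ConnectionError:
            self.liked = False    # quietly roll the heart back on failure
```

The gap between `liked` and `confirmed` is exactly the gap the user is never shown.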
Malevolent deception, however, is design that benefits the system or system author over the user. A common example of malevolent deception is a virus. A Trojan that appears to be an email from a friend is designed to trick you into doing something that’s in the virus creator’s interest, but against yours.
There has never been greater opportunity for malevolent deception, because technology has never been more complex, and complexity enables deception. Those researchers continue:
“In any situation where a poor fit exists between desire (e.g., the mental model or user expectations) and reality (e.g., the system itself) there is an opportunity to employ deception. This gap — which is extremely common — both motivates and enables the deception.”
An example of this gap being exploited is Volkswagen’s recent “diesel dupe.” Millions of vehicles were equipped with a “defeat device” that could tell when the car was being tested by government regulators. The device would switch the engine into a special calibration during tests, so the car would temporarily appear to emit far less pollution than it actually did on the road.
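In software terms, a defeat device is just a branch on the question “am I being observed?” A hypothetical toy version (VW’s actual logic is not public; reports suggest it keyed on signals such as steering-wheel position, which this sketch borrows):

```python
# Hypothetical sketch of a "defeat device": one branch on whether the system
# believes it is being observed. This is a toy illustration, not VW's code.

def looks_like_a_regulator_test(speed_kmh, steering_angle_deg):
    # On an emissions dyno the car "drives" but the steering wheel never moves.
    # (A real detector would be far more elaborate; this is deliberately crude.)
    return speed_kmh > 0 and steering_angle_deg == 0.0

def engine_calibration(speed_kmh, steering_angle_deg):
    if looks_like_a_regulator_test(speed_kmh, steering_angle_deg):
        return "full emissions controls"   # clean enough to pass the test
    return "road calibration"              # the dirtier everyday mode
```

The deception lives entirely in that one `if`: the observed system and the everyday system are simply two different programs.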
Fortunately, regulators were able to uncover this. But as technological complexity increases everywhere, the opportunity for even more advanced deception will only grow. Soon, our simplest objects will have chips in them, and the gap between user expectations and system reality will widen in tandem.
This has big implications for journalism. Above all, the need for full-time investigators of the systems that power everything in our lives will rise sharply, perhaps exponentially.
Two other implications for journalism:
- In order to be able to call out malevolent deception wherever it crops up, journalists should get technical so they can investigate digital products.
- The journalistic process is itself subject to the above trend toward technological complexity. As that process gets more complicated, news organizations will have to think about whether to expose, summarize or hide their internal processes.
Which is all to say that there’s never been a more exciting or necessary time to be in this business.