Towards design that makes us better

Journalism and technology have been two sectors of staggering innovation and evolution in my lifetime. As journalists thoughtfully consider how changes in media are impacting society, they have begun a discussion that has important analogs for other creators. Journalists’ self-awareness and public conversations about the competing interests of creating the best thing and creating the most profitable/popular/viral thing are something I would like to see more of in the conversation among technologists, entrepreneurs and designers. I appreciate the sense of ownership amongst many of the journalists who are considering the global impact of their individual choices.

The “On Being” podcast is on the periphery of my podcast diet. Although I often enjoy long-form interviews, the topics and personalities featured on Krista Tippett’s podcast meander in and out of my zones of interest. The podcast recently replayed an interview from 2014 called “Cartographer of Meaning in a Digital Age,” on topics that seem especially relevant in this era of questioning the motives and ethics of digital design. When I started listening to Tippett’s interview with Maria Popova of Brain Pickings, I was unfamiliar with her guest and had only passing familiarity with the website that Popova curates. I could immediately picture the bright yellow branding of the website and was sure I had read articles from it before, but did not realize that it was the work of a single creator.

The conversation with Popova focused largely on Popova herself, her work writing and curating the content on Brain Pickings, and the relationship between creators, readers and content. Tippett frequently refers to her interviewee’s journalistic style as “old-fashioned” and describes Popova as an “old soul,” both clearly meant as compliments in contrast with the modern state of online writing. Even as someone who is the same age as Popova (we’re both 34), I don’t identify with a nostalgia for “old-fashioned journalism.” By the time I was really consuming journalism in earnest, I was in my late teens and it was already the early 2000s, a time that doesn’t seem distinct enough from our current era. Reaching further back into my childhood, I still struggle to connect to any examples within my lifetime of old-fashioned journalism that I might feel nostalgic for. Could it be the staid local newspapers of my childhood with grocery store inserts and a classifieds section dominated by car ads? The neon-colored teen magazines that my friends and I traded? My grandfather’s stack of Wall Street Journals with their dense text interrupted only by tiny hedcuts? The subscription to Rolling Stone I had in college?

I have to assume Tippett and Popova are going beyond my and Popova’s lived frame of reference to an even earlier time. My familiarity with great journalists of bygone eras is limited to those investigative journalists and muckrakers who took seriously their role as the fourth estate, holding politicians and capitalists to account, yet Brain Pickings has little in common with that style of journalism. It is to Popova’s credit that her work does not stand on the shoulders of familiar giants; rather, it is her own unique way of taking in the world across many topical dimensions, processing it and reflecting it back with the context and perspective of great minds of the last several hundred years. I don’t know of any examples in history of writers who have so deftly synthesized current events, movements and mindsets with the historical perspectives of the bright minds of many generations. The result is more literary than journalistic, more academic than commercial.

Despite the interview’s framing of her approach to writing as retro, the discussion centers on how the novelty of Popova’s work is the result of designing for different outcomes than other authors have chosen to design for, either throughout history or today. In nine years of writing Brain Pickings (at the time of the recording), her intentions for her work had evolved and she had developed an increasing awareness of her agency as a designer. While she was always conscious of the voice of Brain Pickings being fully her own, she became more thoughtful about what impact she wanted her platform to have on society.

As they discuss this evolution, Tippett shares a quote from Seth Godin’s blog: “Giving the people what they want isn’t nearly as powerful as teaching people what they need. There’s always a shortcut available, a way to be a little more ironic, cheaper, more instantly understandable. There’s the chance to play into our desire to be entertained and distracted regardless of the cost. Most of all, there’s the temptation to encourage people to be selfish, afraid, and angry. Or you can dig in, take your time, and invest in a process that helps people see what they truly need. When we change our culture in this direction, we’re doing work that’s worth sharing. But it’s slow-going. If it were easy, it would have happened already. It’s easy to start a riot, difficult to create a story that keeps people from rioting. Don’t say, ‘I wish people wanted this.’ Sure, it’s great if the market already wants what you make. Instead, imagine what would happen if you could teach them why they should.”

As I listened to this remark, from a pre-Trump presidency, pre-Cambridge Analytica scandal, pre-“Weapons of Math Destruction” era, I had to pause the podcast and listen to it again. This short paragraph effectively responds to the anxiety that many Americans are expressing as a result of feeling overpowered by digital ecosystems capable of producing malicious or dangerously apathetic consequences. It is a beacon of hope in a fog of helplessness evoked by the media coverage of the ways that people have been victimized by technological tools and a call to action for the creators who are responsible for the latest generation of tech tools that are ever-present in our lives.

As users of these tools, our feelings of helplessness seem to stem from a feeling that even though the brightest minds have been recruited to work for Facebook, Apple, Amazon, Netflix and Google, we still have poorly designed products that fail to protect us from Russian trolls, opportunistic capitalists or the lesser angels of our natures. If these geniuses can’t solve these problems, they must be unsolvable. Yet the marching orders given to the data scientists, software engineers and designers who built these systems weren’t to create a better world, they were to optimize for engagement, views, comments, likes and purchases.

Is this granular, hyper-zoomed-in perspective to blame for how these companies have time and again not been able to see the forest for the trees? Are there CEOs and product managers and even shareholders that would be interested in the long game that Godin describes? A world where we invest in tech that people need, rather than what they want? The latter is frighteningly easy to design for. We can A/B test websites, products, interfaces, and apps to measure engagement and optimize for capturing more of it, only to find ourselves drowning in vapid clickbait and shoddily made consumer goods as we endlessly scroll through banal social media content. The data about what we want is easily collected. The knowledge of what we need requires judgment and thoughtfulness beyond the capacity of even the most sophisticated AI or the most brilliant CEO.
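To make the asymmetry concrete: the “what we want” side is trivially quantifiable. Below is a minimal sketch of the kind of arithmetic behind engagement A/B testing, a standard two-proportion z-test comparing click rates between two variants (all numbers are hypothetical, not taken from any real product):

```python
from math import sqrt, erf

def ab_test_pvalue(clicks_a: int, views_a: int,
                   clicks_b: int, views_b: int) -> float:
    """Two-sided p-value for whether variant B's click rate differs from A's."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled rate under the null hypothesis that both variants are identical
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# A 10% vs. 15% click rate: the test confidently declares B the "winner"
print(ab_test_pvalue(100, 1000, 150, 1000))
```

A few lines of statistics can crown a “winning” variant with great confidence; there is no comparable test for whether the winning variant made anyone’s life better.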

For the sake of argument, let’s for a moment cast aside the inherent conflict between designing for a better world and designing for short-term profits (given that all of these companies have long histories of eschewing profitability). Let’s generously assume that these tech titans want to be a force for good more than they want to install another helipad at one of their lavish estates. Could they? In the way that Maria Popova thoughtfully designed Brain Pickings to be consistent with her values about making the world a better place, could these platforms leverage our collective knowledge about neuroscience, social psychology, sociology, emergent systems, and behavioral economics to design technology that makes us better, not worse? I think they could.

It starts with zooming out. I used to work for a company that organized and led outdoor adventures with hundreds of guides and thousands of clients each week. Leading these trips required being present 24 hours a day for several days or even several weeks at a time. From dawn until well after sundown we did a job that was incredibly rewarding while also physically, emotionally, and spiritually demanding. This work was supported by a team of hard-working logisticians who did the behind-the-scenes work of organizing food, gear and equipment to make these trips happen. At the end of our trips, we would return dirty, sweaty and exhausted to the warehouse to de-issue our gear back to the logistics staff. At the warehouse, we could pick up a copy of the weekly company newsletter.

One week that staff newsletter attempted to ply us with the promise of a fabulous party at the end of the year, if only we (the guides) could stop losing tent stakes in the field. In the last year we had lost $300 worth of tent stakes, but if we could get that number down to zero, they would use the savings to buy us some kegs and throw us a party. This was the outdoor industry’s version of being way too zoomed in. Our company brought in millions of dollars in annual revenue; the tent stake budget was small enough to be immaterial from an accounting perspective. In terms of providing an experience for our clients, coming home one or two tent stakes lighter didn’t have any measurable impact, yet from a granular and purely economic perspective, wouldn’t not losing tent stakes be better than losing tent stakes? If we only take into account the data we can measure: number of tent stakes, views/likes/shares, five-star reviews, items purchased, then yes, we should do everything we can to come home with all the tent stakes (or likes).

However, the opportunity costs of optimizing only for these measurable outcomes are immeasurable and enormous. As a wilderness guide, coming home with all the tent stakes might mean spending an extra hour each day inventorying gear instead of spending that time telling stories around the campfire. Yet there was no metric for the good vibes accrued while spending time making authentic connections rather than obsessing about gear. For massive tech companies, the cost of maximizing engagement is their users’ emotional wellbeing, mental and physical health, and trust in tech companies specifically and institutions generally. Although the harm of leveraging our dopamine systems to manipulate us into devoting increasing amounts of time to our screens is observable in the aggregate, we cannot measure it the way we can track individual engagement or purchases. In focusing on the small, measurable statistics, while ignoring their larger context, tech companies have created platforms that make us miserable. This outcome makes for an unsustainable business model in the long term, as users ultimately recognize that their lives are improved if they stop using the product. And, more importantly, it has led to the investment in and creation of a whole ecosystem of tech products that has promised better living through technology, and time and again failed to deliver on that promise.

The proliferation of these antagonistic products has driven massive anti-tech sentiment, a perspective so pervasive that screen time (especially for children) is assumed to correlate negatively with living a good life. However, I don’t believe that technology use inherently has to have a deleterious effect on our lives. We can do better. As entrepreneurs and designers, we can design like Popova and Godin. First, we need to better understand the constraints of data-informed design. The things we can easily measure are almost always short-term phenomena that may be in conflict with our loftier goals to design products that make people’s lives better and make the world better. The tendency to design for what we want rather than what we need stems from a bias towards easily available data over complicated, difficult-to-collect data. Absent context, these data can lead us to make catastrophic decisions, to prioritize the measurable over the meaningful, to lose sight of the big picture. Second, we need to know what the big-picture goal is. Setting aside whether any given metric might be difficult or impossible to measure, what is the outcome you would be designing for if you could? More time spent with friends? Less carbon in the atmosphere? More confidence when taking an exam? Better communication with your family? Fewer days struggling with a mental health challenge before seeking professional help? More laughs? More kindness? More courage?

Once you have your big picture metrics, track those alongside the user data you are collecting. You may not have quantitative data, but you can collect data from other sources about how you are meeting those goals. This is a time to be extra critical as we know there are lots of biases and heuristics that might lead you to conclude that you’ve succeeded in achieving your big picture goals when you have not. Consider confirmation bias, the halo effect and the fundamental attribution error and seek out voices that tell you things you may not want to hear (unhappy users, competitors, critics). Err on the side of believing the haters if their insights push you towards meeting a higher standard for having a positive impact.
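One way to operationalize this pairing is a scorecard that sits the easy engagement numbers next to proxies for the big-picture goals, and flags when they diverge. A sketch in Python (the field names, thresholds, and numbers here are all hypothetical, not any real product’s instrumentation):

```python
from dataclasses import dataclass, field

@dataclass
class ProductScorecard:
    """Pairs easily measured engagement with proxies for big-picture goals."""
    daily_active_users: int
    avg_session_minutes: float
    # Harder-to-collect signals, e.g. from user interviews or surveys
    users_reporting_improved_wellbeing: int
    users_surveyed: int
    critic_concerns: list = field(default_factory=list)

    def divergence_warning(self) -> bool:
        """Flag when engagement looks healthy but big-picture signals do not."""
        wellbeing_rate = self.users_reporting_improved_wellbeing / self.users_surveyed
        return self.avg_session_minutes > 30 and wellbeing_rate < 0.5

card = ProductScorecard(
    daily_active_users=50_000,
    avg_session_minutes=45.0,              # engagement looks great...
    users_reporting_improved_wellbeing=12,
    users_surveyed=100,                    # ...but only 12% feel better off
    critic_concerns=["users report compulsive scrolling"],
)
print(card.divergence_warning())  # True: measurable success, meaningful failure
```

The point of the sketch is the juxtaposition itself: a dashboard that only shows the first two fields would call this product a success.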

Finally, zoom out even further and measure the impact your product has beyond its users. What are the negative externalities of your product for the environment? The community beyond your customers? Future generations? Can you mitigate those impacts? Let’s say your mission was to make life easier by connecting people to on-demand transportation services via an app. How can future iterations of your product further reduce greenhouse gas emissions from cars? How can you more compassionately support the contractors who are providing services for you? What can you do for those workers who have been displaced by this market disruption? How will you ensure that the automation of driving doesn’t further enrich those with the means of production at the expense of the working class?

These are big questions, and at the scale of our largest tech companies would likely require teams of experts and investigators to craft and implement thoughtful solutions. One of the reasons that we have seen such thought leadership from journalists is that they don’t have the luxury of the diffusion of responsibility that employees at massive tech companies feel. Popova has complete ownership of the content of her site in a way that doesn’t really translate to the ownership a single software engineer amongst thousands feels. Even at a much smaller company with just a few designers and engineers, building software is a team sport, reducing the sense of autonomy and ownership that journalists have. This is meant to explain why tech has lagged behind journalism, not to suggest that design ethics are irrelevant to the tech industry.

As we move towards a model of either increasing individual contributor accountability for the impact of design choices or the collective responsibility of large corporations, I fantasize about large tech companies having a “Don’t Be Evil Team” or a “First Do No Harm Department,” the same way they have a legal team or PR department to mitigate risk and manage the reputation of the brand. Instead of assuming negative outcomes and hiring teams to clean up the inevitable messes, companies must invest more resources in avoiding problematic outcomes altogether. Abandoning a singular focus on the minutiae of user data in exchange for a more holistic view of how our tools impact individuals and communities can take us there.
