Wrangling chaos, leaning into tension
SXSW learnings and how they reflect Fjord Trends 2018
As one of my colleagues graciously warned me, “Overbook your schedule, but be prepared to miss out on most of what you see.”
To call South By Southwest (or as Austinites know it, “South By”) overwhelming doesn’t quite cover it. With 10 full days of main festival, six conferences, and an average of a dozen simultaneous sessions per hour in the Interactive track alone, it’s a huge world stage for Texas’s modest capital city. I’ve experienced the public bits of SX for six years, but this was my first time armed with a badge for the Interactive conference.
What did I take away from it all? An awful lot. A few themes emerged from this year’s Interactive tracks, such as tech industry ethics, trust and design leadership, the use of artificial intelligence, and the emergence of physical experiences as differentiators in customer/brand relationships. These are also many of the same themes identified in this year’s Fjord Trends, already coming to life in the early part of 2018.
Let’s dig in.
Design gets existential in ethics and leadership
The design practice is increasingly self-reflective in 2018, and many speakers offered their take on the role that designers, businesses, and people play in today’s ethics economy. Dr. Melis Senova gave a talk about the shadow side of human-centered design. In it, she shared how she creates “actor maps” of all the people who affect, or are affected by, a design change — but even those maps are aspirational, assuming the best possible circumstances and behavior. She advocates the formation of a strong “design character”: a personal guiding direction that creates a moral awareness of the possible negative effects of designers’ work.
Facebook VP of Product Design Margaret Stewart spoke to the tough decisions the company has faced in the past two years around data privacy and algorithmic recommendations. She likens designing tech for such large populations to “a new kind of urban planning,” requiring the same level of forethought and multi-organizational cooperation.
When I asked how she expects to be able to keep pushing for user needs against Facebook’s part in the surveillance economy, she advised:
“It’s about the long term. If you lose user trust, you lose your business. Your execs have to be focused on that.”
Stewart’s three recommendations for ethical design:
- Look out for and prevent misuse — things like harassment or content manipulation.
- Seek external expertise. “Asking for help is a sign of maturity, not weakness.”
- Assess success, don’t just measure it. “Metrics can serve as horse blinders when not contextualized. Not everything that’s valuable can be measured.”
It seems that the largest social network has indeed begun to lose user trust, their negligent handling of data privacy in the Cambridge Analytica scandal being the latest offense in their long history of pushing privacy boundaries. Fjord’s 2017 trend Unintended Consequences may be one of the most relevant observations for years to come, or at least until designers and leadership within tech companies can humbly steer the industry through the same ethical reckoning required of older fields like chemistry and physics.
In “Embedding Empathy in Experience,” brand expert Jon Wilkins explained that “half of the world’s consumers are Generation Z or Millennial, and they have expectations of values, consistency, personalization, and more. Smaller companies have to face the customer directly, with no middle-men or monopoly power, which means they end up with a better customer experience.” There is much more to a human’s life than consumption, so he advocates designing for “human experience” (HX) over “consumer experience.” His recommendations for brands:
- It’s not about performing the best, but rather being the preferred choice. Don’t sacrifice your experience for a short-term win. “You have a bank account of karma with your customers, and that’s all you really have.”
- Get and maintain permission. People don’t opt-in to many modern marketing behaviors. “People leave things in their online shopping carts, and then get bugged about it later. They leave them because they don’t f&#%ing want them!”
Several other sessions addressed the ethical considerations of AI, design, politics, and social impact, calling designers to take responsibility for the results of their work and broaden the use of design as strategy.
Physical experiences as a premium
It was clear from the first day of SX who stole the show this year: the real-life Westworld. HBO’s heady sci-fi western theme park was brought to life in a resurrected ghost town outside of Austin in the largest-scale, most detailed “transmedia” experience I’ve seen outside of the likes of Disney World. The feelings of exclusivity, mystery, and broken boundaries crafted by the show were carried over into real life through secret ways into the park, clues hidden inside guests’ hats or graveyards, and improvisational actors who played realistically with each other as well as with us, the guests. There was no divide between the audience and the stage.
A behind-the-scenes panel with the creators of the experience revealed how they earned HBO leadership’s trust over time with lower-investment immersive mysteries online to justify such a massive undertaking. In entertainment, especially, expect more physical experiences as a premium like Alamo Drafthouse’s themed off-site screenings and physical-digital mashups like The Void VR arena.
A panel of women from NASA shared the myriad ways that the beloved agency invites citizens into its missions and opens its publicly funded data and images to everyone. There are multiple AR/VR experiences, online tools, partnerships with toy companies and film studios… but most interestingly, a social media press pass that lets selected citizen journalists experience facility tours and launch events just like a New York Times reporter or a congressional VIP.
Last year’s Place By Design competition was moved into the main festival this year, highlighting scalable, physical creative works of artists, architects, and designers that improve shared urban landscapes.
At Fjord’s own Physical Fights Back panel, Nike’s Sean Madden reminded us that “the service is the humans; you should use software and resources to enhance that,” not simply make better software. “Humans become more autonomous and make trustworthy judgements when they’re empowered with data. This also makes them more valuable.” Lyft VP of Design Katie Dill used a coffee analogy:
“A robot could engineer the objectively best cup of coffee. But it feels different than when it’s made by a barista. Is that experience difference worth it?”
Max Burton shared how we work at Fjord: not simply passing files back and forth to collaborate, but pinning things up on walls so they can be discovered and discussed naturally.
Unlike infinitely duplicable digital experiences, physical experiences are often more exclusive just by the nature of their physicality. The continued existence of in-person conferences like SXSW stands itself as a testament to the value we place on meeting people and experiencing things in real life.
Machines need humility
Three of our 2018 Trends speak to machine intelligence and automation. At SXSW, there were dozens of sessions on AI, but little focus or grounded thinking: too much speculation about sentient, emoting robots, and not enough about how existing machine learning technology will dramatically affect our work lives in the near term.
When I asked the panelists in Humanizing Autonomy about managing the impact of existing automation, they deemed it a “doable design problem,” without need for some new approach. Automation needs to:
- admit its limitations,
- show how it makes decisions,
- communicate how you can interact with it, question it, or challenge it,
- and overall, always be transparent.
Otherwise, it confuses users and removes human agency.
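The panelists didn’t show code, so the following is purely my own illustrative sketch: a hypothetical `AutomatedDecision` record (the class name, fields, and 0.7 confidence threshold are all assumptions) that bakes those four requirements into every response an automated system gives:

```python
from dataclasses import dataclass, field

@dataclass
class AutomatedDecision:
    """A hypothetical record for a single automated decision."""
    action: str                  # what the system recommends
    confidence: float            # 0.0-1.0; used to admit limitations
    factors: list = field(default_factory=list)  # how the decision was made

    def explain(self) -> str:
        """Render a transparent explanation covering all four requirements."""
        lines = [f"Recommended action: {self.action}"]
        # 1. Admit limitations (0.7 is an arbitrary illustrative threshold).
        if self.confidence < 0.7:
            lines.append("Note: I'm not very confident about this.")
        # 2. Show how the decision was made.
        lines.append("Because: " + "; ".join(self.factors))
        # 3. Communicate how to question or challenge it.
        lines.append("You can override or appeal this decision at any time.")
        # 4. Overall transparency comes from always emitting all of the above.
        return "\n".join(lines)
```

For example, a low-confidence fraud flag would print its recommendation, a caveat about its uncertainty, the factors behind it, and a standing invitation to override it.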
The most pragmatic session on AI was Design in the Era of the Algorithm by Josh Clark of Big Medium. With any new technology, there are phases of reaction: Technical Advancement (“Wow! This will change everything”), Disillusionment (“This is not what we were promised”), and our current phase for intelligent machines, Critique (“Let’s make the best possible version of this”).
The machines make mistakes, so we must design for those mistakes — not just the happy path. Our job is to set good expectations and channel behavior toward those expected capabilities.
“Our answer machines have an overconfidence problem.”
For example, Google Search’s featured snippets present answers with more confidence than their actual accuracy warrants. We need to build in some “humility.”
A few of his A.I. design principles:
- Favor accuracy over speed. A fast wrong answer is still wrong.
- Help systems know when they’re not smart enough. “I don’t know” and “I think I know” are good answers if they’re accurate. Think about how we use vocal tone, facial expressions, and body language to signal when we are or aren’t confident.
- Add human judgement. “Hostile information zones” should carry warnings that ask people to view the content critically. (For example, Wikipedia labels pages that are disputed.)
- Advocate sunshine. Don’t hide what you’re doing with data. We need to be able to audit the machine’s logic.
- Embrace multiple systems. Humans do this when we “get a second opinion.” To get a better idea of a hurricane’s path, forecasters overlay multiple hurricane models on one map.
- Root out biases and bad assumptions. “Let’s not codify the past.” For example, an automatic faucet didn’t recognize black hands because it apparently wasn’t trained on people of color. Remember that our tools only know what we feed them.
- Take responsibility. It’s tempting to assume the engineers or the automated assistant own the user experience, but just because an interface lacks buttons or pixels doesn’t mean it needs less design. “Machine learning is just a new design material.”
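To make a few of these principles concrete, here is a toy sketch of my own (not Clark’s code; the `humble_answer` name, the `(answer, confidence)` model interface, and the 0.8 threshold are all assumptions). It combines “help systems know when they’re not smart enough” with “embrace multiple systems”: it only answers outright when independent models agree with high confidence, hedges when they agree weakly, and otherwise says it doesn’t know.

```python
def humble_answer(question, models, threshold=0.8):
    """Ask several independent models for a 'second opinion' and answer
    with appropriate humility rather than giving a fast wrong answer.

    Each model is a callable returning an (answer, confidence) pair.
    """
    results = [model(question) for model in models]
    answers = {answer for answer, _ in results}       # distinct answers given
    min_confidence = min(conf for _, conf in results) # weakest opinion

    if len(answers) == 1 and min_confidence >= threshold:
        return answers.pop()                          # consensus, high confidence
    if len(answers) == 1:
        # Consensus, but someone is unsure: hedge ("I think I know").
        return f"I think it's {answers.pop()}, but I'm not sure."
    return "I don't know."                            # the models disagree
```

With two confident, agreeing models it answers plainly; mix in a hesitant or dissenting model and it downgrades to a hedge or an honest “I don’t know” — accuracy over speed.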
More festival highlights
- Duke University’s Center for Advanced Hindsight ran a workshop on behavior design, a more measurable, outcomes-focused methodology than human-centered design. It often requires no technological change at all, instead leveraging psychological tendencies to help people take the actions or make the self-improvements they seek. “You accomplish the things that you measure.”
- In Want to Fix Corporate Innovation?, guests from BBVA, Comcast, and the U.S. Air Force shared their biggest challenges and experiments in moving large organizations forward. “You can’t be in corporate innovation and be jaded. The opposing static force is already so strong, you have to be a believer.” In one example, the Air Force’s “Spark Tank” competition surfaced one airman’s concept for a pad that saves hundreds of millions of dollars annually by preventing back and neck injuries among mid-air refueling techs. Another tip: creating KPIs that affect different parts of the organization helps get buy-in and share the wins.
- Civic innovation was out in force this year, at NYC’s Making Tech Work for People series, WE*DC, the Michigan House, and more. When it comes to new tech like autonomous cars, one NYC panelist advised “We need to stop implementing shiny new tech without a focused problem to solve.” Small behavior changes like nudging 5% of riders off peak times saved millions a year and prolonged the life of one city’s transit system. From Detroit: “You need to create a space where you can disagree but still come together and hold relationships.” Designers are needed not just as craftspeople, but as mediators of tough, messy, overdue conversations.
An industry grappling with its own large impacts
We as designers still have a lot of work to do. As you might expect at the modern equivalent of the World’s Fair, there remains a lot of focus on technology itself, and not enough on its application or impact. Human-centered software and service design are now mature, respected crafts, but the work of speaking for users and weighing human impact still can’t be taken for granted. Many here recognize experience design as a good thing, but many more stop short of taking the steps to invest in it or understand its real value. Even worse, some completely ignore any ethical concerns around privacy, and abdicate responsibility to users who aren’t informed about how their data is used or how they might be manipulated.
The good news? Collaborative, human-centered design is spreading across fields and disciplines. Public and private orgs are beginning to work together more closely with a focus on solving significant civic issues and preventing adversarial relationships or proprietary solutions. There is great work being done with very low-tech (or no-tech) quantitatively measured solutions in the emerging field of behavior design.
Now, if you’ll excuse me, I’m off to grab a noodle bowl from my favorite food truck — badge no longer required.
For more detail on Fjord’s predictions for 2018, please check out our Trends site.