Strava Heatmaps: Why Ethics in Design Matters

Ray Crowell
humble words

--

San Francisco-based fitness-tracking company Strava has found itself under scrutiny following the release of heatmap data revealing national security vulnerabilities tied to the strategic elements of geography.

Some blame the company for failing to understand the full implications of its data usage, while others blame individuals for sharing their data publicly.

So, whose responsibility is it? Or is there a shared responsibility that has yet to be explored?

Strava maintains they are “committed to helping people better understand” its privacy settings.

But is it as simple as educating users on privacy settings? One could argue that national security, federal law enforcement, the armed forces, and other professionals have the more vexing role: understanding the implications of the technology, its users’ behavior in relation to risk, and the continuous feedback loops between employee, institution, and industry. Scott Lafoy, an open-source imagery analyst, told CNN: “This is literally what 10,000 innocent individual screw-ups look like … A lot of it is going to be a good reminder to security services why you do OPSEC (operational security) and why you do manage this sort of thing, and everyone is going to really hope it doesn’t get a couple people killed in the meantime.”

These organizations have been warned for years (including by myself) about the information- and operational-security implications of social platforms and advanced analytical technology, specifically with pattern of life: the data collected and analyzed to establish an individual’s past behavior, determine their current behavior, and predict their future behavior. I spent my career stabilizing this intersection between national security and progress: on one side, a deep understanding of the protection of lives, billion-dollar weapon systems, and geopolitical assurances; on the other, the power of many of these technological advancements in enabling access to health and wellness for all.

Striking this balance requires that we not become enamored of the idea or implications of ethically sound solutions, but rather expose our design practices to ethical scrutiny.

Why ethics? To “rationally” study the moral dilemmas in “human” action allows us to analyze the implicit or explicit codes of conduct rooted in multi-generational societal beliefs and values. To oversimplify the framework, consider ethics as a benchmark for quality assurance, measuring the “quality” of the idea holistically.

In what my compatriot, Mary Iafelice, assures me should be a three-part series, let’s concentrate first and foremost on what strategic questions can tell us.

Ask Strategic Questions

1. How does ethics fit into goals?

2. Why should designers serve as shapers of users?

Last year, one of my dear friends, Yoko Sen, introduced me to what Jet Gispen was doing in Europe. For those not familiar with her work in Ethics for Designers, her toolkits help you learn from existing designs (products, processes, policy, etc.) and uncover the underlying intentions and worldview of the designer. She leverages hidden roles, brainwriting, and other techniques to better grapple with the varying degrees of ethical issues in design and to map diverse values across different stakeholders.

One of her tools, the Normative Design Scheme, exposes designers to philosophical considerations across three phases. These phases (outlined below) help connect assessment back to goals when determining the ethical considerations of a design.

Virtue Ethics (Intention Phase): “An action is morally right just because it would be done by a virtuous person acting in character.”

Deontology (Design Phase): “An action is morally right just because it abides by moral rules and serves a person’s moral duty.”

Consequentialism (Effect Phase): “An action is morally right just because it produces the best actual or expected results for the most people.”

To answer the subsequent questions within the tools, one must reflect on where one’s own values come from and formulate an ethical posture. What does it mean to do good? Don’t just consider harm; assess how harm manifests. The designer must take on the role of a shaper and abandon any foolish notion that the consumer is as uncomplicated as the word “user” would imply. The designer must be wary of personas so one-dimensional that they lose the core appreciation that each represents an actual human being. Be critical: challenge yourself to design for the user you feel would be most ethical, then reflect on and compare your point of view with those of others.

These tools are beneficial not only for the designer but for the user as well. I mention them specifically for institutions like the Defense Department, impacted by the Strava heatmap and, frankly, by many other technologies employed, both sanctioned and unsanctioned, by military members and on military installations. These tools help an institution’s leadership “reverse engineer” what technologies on the market can do by way of harm, in balance with the good. I learned a long time ago, from mentors wiser than myself, that you don’t know what you’re missing if you’re not looking to begin with.

In a must-read blog post, Part 3: How Design Designs Us, The Ethics of Design, Leyla Acaroglu asks a question most relevant to Strava’s dilemma of the day: “Who is taking responsibility for the outcomes, externalities, and downright damaging impacts of our hyper-consumer, ever-changing landscape of new gadgets and virtual arenas that are coming on board at a lightning speed pace?”

Passing the buck isn’t going to cut it, given the speed of innovation and elevated consumer adoption rates. All stakeholders must come together to navigate conflicts of vision, culture, and the moral imperatives associated with them.

Conclusion

In the relationship between producer and consumer, there is a shared responsibility for safety and well-being. Those in the business of unlocking large amounts of personal data, and those in the business of collecting and using it in automated decision-making and profiling, should take a lesson from Isaiah Berlin:

“Once men’s true interests can be made clear, the claims which they embody can be satisfied by social arrangements founded on the right moral directions, which make use of technical progress or, alternatively, reject it in order to return to the idyllic simplicity of humanity’s earlier days, a paradise which men have abandoned, or a golden age still to come.”

And we have to balance two notions: almost anything can be weaponized, yet we should not unnecessarily stifle creativity and innovation. There is a balance. I suggest we start with the grand virtue of dialogue.

--


Exiled Alabamian | Venture @SCAD | Builder-at-Large @humbleventures | Former Fellow @harvard | Veteran @USAF #getshitdone