CG models, VC values, and AI disguises

Alexis Lloyd
Ethical Futures Lab
7 min read · Aug 24, 2020


To receive Six Signals in your inbox every 2 weeks, sign up at Ethical Futures Lab.

This week, we investigate systems that are in transformation, from fashion to supply chains, and we also take a look at the tactics that people use to respond to the systems around them (even as they change). There’s also a mind-blowingly cool analysis of LEGO and UX design.

— Alexis & Matt

1: The 3 As: how we engage with imperfect tech

As technology has become integrated into every aspect of our lives, we often find ourselves confronted by systems that don’t do what we want them to. In some cases, that’s because they are designed to do something we’d rather avoid (e.g., interactions that come with troublesome privacy implications). In other cases, they simply misinterpret what we’re trying to do. The latter issue has long been a thorn in the side of geneticists who work with Excel. The names of multiple genes, like MARCH1, are automatically converted to dates by the spreadsheet software, leading to erroneous gene name conversions in about 20% of papers with Excel gene lists. There are workarounds, like adding an apostrophe before the value or converting column types to text. But instead, a few weeks ago the HUGO Gene Nomenclature Committee (HGNC) decided to actually rename all the problematic genes to avoid the issue altogether.
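To make the apostrophe workaround concrete, here is a minimal sketch of how a script preparing gene lists might guard against Excel's date conversion. The `excel_safe` helper and the list of month abbreviations are our own illustration, not anything from HGNC or the cited study:

```python
import re

# Month-like prefixes that Excel tends to read as dates when followed
# by a number (e.g. "MARCH1" becomes 1-Mar). Illustrative list only.
MONTHS = ("JAN", "FEB", "MAR", "MARCH", "APR", "APRIL", "MAY", "JUN", "JUNE",
          "JUL", "JULY", "AUG", "SEP", "SEPT", "OCT", "NOV", "DEC")
DATE_LIKE = re.compile(r"^(?:%s)\d+$" % "|".join(MONTHS))

def excel_safe(symbol: str) -> str:
    """Prefix date-like gene symbols with an apostrophe so spreadsheet
    software keeps them as text instead of converting them to dates."""
    if DATE_LIKE.match(symbol.upper()):
        return "'" + symbol
    return symbol

print(excel_safe("MARCH1"))  # 'MARCH1
print(excel_safe("TP53"))    # TP53
```

This is the Adaptive tactic in miniature: the data is contorted up front to match what the software expects, rather than fixing the software itself.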

As we’ve been tracking a variety of human interactions with technology, we’ve identified three distinct tactics that people use when their systems aren’t working for them, which we call the Three A’s. The first response is Adversarial, where people develop hacks and workarounds to achieve their goals despite the system (we write about these a lot and find them fascinating). The second is Adaptive, where we adjust our own behavior to the system’s rules and expectations, contorting ourselves to match machine perception. Examples include adding an apostrophe so Excel doesn’t turn your gene into a date, or hyper-articulating so that voice assistants can better understand you. The example above, where geneticists gave up and renamed entire genes to please a piece of software, illustrates the third tactic, which we call Acquiescence — where people stop trying to do the thing they wanted altogether because it’s too frustrating.

