Autonowashing: The Greenwashing of Automation

Definition, Application, & Consequences

Liza Dixon
Apr 1
Sophia the robot can drink water, just like us!

There it is. Smugly sitting on aisle three.

A bottle of water, just like all the others it shares a shelf with, but this one…this one has a fancy, eco-feel-good label on it. It appears to be made of the same plastic as all the others, but it says it’s made of biodegradable unicorn tears. Sure, you would like to believe that, but your gut is telling you — that sounds a little fishy.

What’s with that?

That’s greenwashing, also known as:

“...the act of misleading customers and potential customers into believing that a product or service is environmentally friendly” (source).

But what does greenwashing have to do with automation?

The promises of automated technologies — from robotic assistants to self-driving cars — to improve our safety and quality of life are seemingly boundless. But these rose-colored fantasies of tomorrow are not guaranteed.

To make these visions a reality, automated systems must be mindfully introduced to the people they support.

Unfortunately, like the greenwashing of the sustainability movement, the capabilities of automation are commonly inflated, often by those interested in short-term profits.

Enter: Autonowashing

Adapted for automation (source), autonowashing, n. (v. autonowash, n. autonowasher, adj. autonowashed) is the practice of making unverified or misleading claims which misrepresent the appropriate level of human supervision required by a partially or semi-autonomous product, service, or technology. Autonowashing may also be extended to fully autonomous systems, in cases where system capabilities are exaggerated beyond what can be performed consistently and reliably.

Autonowashing makes something appear to be more autonomous than it really is.

The objective of autonowashing is to differentiate an entity and/or give it a competitive advantage through superficial verbiage meant to convey a level of system competence that is misaligned with the system’s technical specifications.

Autonowashing may also occur inadvertently, when one unknowingly repeats erroneous information about the capabilities of an automated system to another. Autonowashing is, in a sense, viral.

Autonowashing extends to many applications of autonomy, but especially those where human-automation interaction occurs in safety-critical scenarios.

Take, for example, the automotive industry. Original equipment manufacturers (OEMs) offering driver assistance options in their vehicles (e.g., Audi, Ford, or Tesla) use a wide vocabulary to describe these options and their abilities. Issues surrounding the language used to describe vehicle autonomy are well documented in the scientific literature (for example, this study) and in the media (see this article). Because there is no regulating body overseeing the language used to describe assistive systems, OEMs have gone unchecked in their use of branded terms.

Current Autopilot option on the Tesla Model 3 (source)

Perhaps most notable is Tesla’s use of the term “Autopilot” to describe a Level 2 system (partial automation, as defined by the Society of Automotive Engineers), which requires full driver supervision at all times. And now “Full Self-Driving Capability,” a claim that is unsubstantiated and dependent upon future technological breakthroughs, not to mention regulatory approval (as Tesla itself states), which is certainly outside the company’s control.

This is autonowashing.

As a result, 40% of Americans believe that a car with a system called “Autopilot” (or the like) has the ability to drive itself (source).

Fact: The highest level of vehicle automation available on the road today is Level 2 (source).

The consequences of autonowashing include, but are not limited to:

  • Misuse of a system due to inappropriate reliance. Leading to…
  • Disuse of a system due to performance concerns. Ultimately, causing…
  • Increased public distrust of automated systems

Today, 73% of Americans are reportedly afraid to ride in a self-driving car (source). For the auto and tech industries pouring billions into the research and development of this technology, this is a problem.

According to Håkan Samuelsson, the CEO of Volvo Cars, public perceptions of automated driving systems are the responsibility of OEMs (source). PAVE (Partners for Automated Vehicle Education), whose members include GM, Waymo, and Intel, has been established to educate the public about automated driving technologies and support their safe adoption. No small task, given the uphill battle that is retraining an autonowashed public.

In the case of automated driving, delays in the release and acceptance of this life-saving technology are costly: for companies, for their investors, and for anyone who uses public roadways and risks being one of the 1.2 million people who will die this year in auto accidents worldwide (source).

The conversation about high technology like automation will continue to operate on two levels. Experts require their own vocabulary to address technical challenges, while users require “plug & play” language that supports even novices in appropriate, safe interactions with automation.

Yet, nothing happens in a vacuum; the blurring of the meaning of words is to be expected. We’re human after all, and automation is attempting to solve some of our biggest problems.

Supporting the proper adoption of automation is an effort to improve the quality of life for the humans it serves. The most efficient way to do this: deploy automated systems with language that accurately reflects their limitations.

Giving this problem a name and calling autonowashing out for what it is allows us to tackle the challenges it presents head-on.

Thanks to Amrith Shanbhag.
