The Necessity of Ethics in AI: A Lasting Legacy

Shyamala Prayaga
Digital Assistant Academy
5 min read · Dec 27, 2020

A wealth of knowledge and experience along with a diverse background has enabled Renée Cummings to pursue work in ethics in AI.

She has experience in criminology, criminal psychology, substance abuse therapy, therapeutic jurisprudence, and journalism — to name just a few areas.

Today, Renée is the Data Activist in Residence at the University of Virginia. She examines the influence of big data policing on underserved or high-needs communities and is also working to create a tool that will "build communities up and support the interface and interaction between community stakeholders."

“I think the diversity of my background is what brought me into AI,” Renée shares. “I started to look at the risk assessment tools that were being used in the criminal justice system, and how these algorithmic decision-making systems were really misbehaving when it came to the administration of justice.”

The need for ethical guardrails

Renée felt she had to speak up about the existing and potential harm she saw in AI, especially as certain technologies, such as facial recognition, made their way into the criminal justice system.


As new and emerging technologies become reality, Renée seeks to ensure that our society has “rigorous and robust ethical guardrails” in place. AI can have negative long-term impacts, and this should be considered when deploying technology.

Renée explains that, for her, ethical design doesn’t have a single definition. It should, however, start with an understanding of what good design is. “And who is good design for?” she asks. “Is it good for one community? Or is it good for all communities?”

To ensure that all communities benefit from AI, diversity, equity, and inclusion must be part of the design process.

“I look at design through the eyes of criminal justice, and the ways in which we have designed prisons, or the ways in which we design facilities for juveniles, or the kind of ways we design public spaces. When it comes to ethical design, you come with that premise that speaks to do no harm.”

The role of voice designers

Ethics matter across all relationships in a society. However, voice designers should be aware that they are also responsible for the relationships they create between users and AI.

“Voice has the ability to motivate, to inspire, but it also has the ability to harm and to disenfranchise — and you have got to understand that as a designer of voice technology, you have a role.”

Part of this role involves making sure that voice design is inclusive. Renée gives the example of an individual with a speech impediment using voice technology. “How do you merge those two? If I take much longer to pronounce a word, and I’m using a technology like Alexa, is that technology inclusive?”

Further, it’s important to ask whether technology is accessible to those in a state of trauma, whose voices may carry greater emotion, and to individuals with different life experiences.

Voice in hiring

This year in particular, employers have relied heavily on technology in the recruitment process due to COVID-19, but its misuse can do immense harm. Some candidates, such as those with heavy accents, can be discriminated against in the hiring process if the technology used isn’t designed to be culturally diverse and appropriate.


Issues like these only serve to further disenfranchise historically underrepresented, or unrepresented, people. “You don’t want to create a technology that amplifies disenfranchising individuals. What you want to do with your technology is ensure that all voices are amplified, heard, respected, appreciated, celebrated.”

Voice for sensitive conversations

Renée also explores considerations when designing for sensitive conversations, which is “one of the most critical aspects of voice technology.”

Because of differing cultural backgrounds and experiences, sensitive conversations can be relative; what may be a sensitive conversation for one individual may not be for another. Pronunciation and enunciation can also change the meaning of some words. This is why designing for diversity is crucial.

“What are the words that could impact? What are the words that can be used by bad actors? What are the words that could easily spiral into a disinformation campaign, and how are these things used?”

Designing for a better future

Renée encourages those in voice to not only think deeply about ethics, but to educate themselves and apply that learning to their work. “Technology is now part of our bloodstream,” so it’s essential that designers boldly step up and build for a more ethical world.

“We want to ensure whatever we create, the legacy that we leave is a legacy that provides the kinds of resources we need as a society to continue to grow and develop positively — to include, not exclude, to really celebrate diversity.”

About Digital Assistant Academy

Digital Assistant Academy provides Voice Interaction Design and Conversation Design training and certification. In this program, we will take you from the very basics of voice interaction and conversation design through to how voice technologies work. We’ll do a deep dive into conversation design strategy, and the program is fully hands-on with your Capstone projects. By the end of the course, you will have two voice applications successfully designed, developed, and deployed. Learn more at https://www.digitalassistant.academy/

Before you go

Clap 👏 if you enjoyed this article to help me raise awareness on the topic, so others can find it too
Comment 💬 if you have a question you’d like to ask me
Follow me 👇 on Medium to read more articles on Voice Technology and Conversation Design, and on Twitter @sprayaga


Shyamala Prayaga is the founder of the Digital Assistant Academy and a self-described evangelist for UX and voice technology.