Open Dialogue

Evan Selinger in conversation with Chris Gilliard

Illustration: Julia Moburg/Medium; Source: Getty Images

This is Open Dialogue, an interview series from OneZero about technology and ethics.

During the pandemic, educational technology companies experienced a 900% increase in business once schools started shutting down campuses and restricting visitors. These companies swooped in with A.I.-infused software designed to prevent students from cheating. These proctoring algorithms can confirm who is taking an exam through facial verification. They can also monitor test-takers, scrutinizing their behavior for irregularities that might indicate cheating, like looking away from the screen.

Critics contend the software promotes unfairness, invades privacy, and inflicts undue anxiety. The situation is so dire…


Open Dialogue

A conversation with the professor who just turned down a $60,000 grant from Google

The graphic text “Open Dialogue” is framed around different sketches of human faces and emotions.
Photo illustration: Save As/Medium; Sources: Getty Images

Emotion A.I., affective computing, and artificial emotional intelligence are all fields creating technology to understand, respond to, measure, and simulate human emotions. Hope runs so high for these endeavors that the projected market value for emotional A.I. is $91.67 billion by 2024. A few examples are revealing: The automotive industry sees value in algorithms determining when drivers are distracted or drowsy. Companies see value in algorithms analyzing how customer support agents talk and computationally coaching them to be better speakers. And researchers see value in children with autism using A.I.-infused …


Open Dialogue

Unlocking the medical potential of artificial intelligence requires being more realistic about its limitations

Photo illustration: Julia Moburg/Medium; source: Getty Images

This is Open Dialogue, an interview series from OneZero about technology and ethics.

I’m excited to talk with Muhammad Aurangzeb Ahmad. Muhammad is the principal research scientist at KenSci, Inc., a company specializing in A.I. in health care, and an affiliate professor in the Department of Computer Science at the University of Washington Bothell. I’ve known Muhammad for a long time. When I started teaching philosophy at Rochester Institute of Technology, he was one of my first students. Over the years, we’ve kept in touch as Muhammad went on to get his PhD in computer science and eventually became a…


Open Dialogue

Evan Selinger in conversation with Mary Berk

Illustration: Julia Moburg/Medium; source: Getty Images

This is Open Dialogue, an interview series from OneZero about technology and ethics.

I’m thrilled to talk with Mary Berk. Mary has a PhD in philosophy with a specialization in ethics but has spent her career working in Silicon Valley. Most recently, Mary was a product manager at Facebook and Instagram. Previously, she worked at Amazon, Google, Microsoft, and eBay. Given Mary’s many years of experience and her disposition for critical thinking, she’s the perfect person to discuss whether Big Tech can care about ethics.

Our conversation has been edited and condensed for clarity.

Evan: What got you interested…


Open Dialogue

Evan Selinger in conversation with Kate Darling from MIT Media Lab

The text “Open Dialogue” as a graphic next to a photoshopped image of a standing biped robot holding a cardboard box.
Photo illustration; source: Agility Robotics

This is Open Dialogue, an interview series from OneZero about technology and ethics.

A few years ago, I read a fascinating paper by Kate Darling, a research specialist at the MIT Media Lab, that left a lasting impression. In “Extending Legal Protection to Social Robots: The Effects of Anthropomorphism, Empathy, and Violent Behavior Towards Robotic Objects,” Kate clarifies how easy it is, given the way the human mind works, for us to become emotionally attached to all kinds of robots — robots that have humanlike, animal-like, or even basic lifelike features. Given this tendency, she proposes a radical idea: granting…


Open Dialogue

Evan Selinger in conversation with Clive Thompson

Illustration: Julia Moburg/Medium; Source: Getty Images

This is Open Dialogue, an interview series from OneZero about technology and ethics.

I’m Evan Selinger, a professor of philosophy at Rochester Institute of Technology. One of my favorite activities is talking with smart and engaging people who think deeply about responsibility and the paths for creating a better future. In the “Open Dialogue” series, I’ll reach out to academics, journalists, activists, tech workers, and scientists to explore how to better understand controversies, more thoughtfully analyze innovation, and critically determine which leading ideas and behaviors need to change.

I’m excited to talk this week with Clive Thompson about how the…


Brighter AI promises to protect protesters. But is it enough?

Gif of a man’s face getting pixelated.
Photo illustration; Image source: Carlina Teteris/Getty Images

There are many reasons why the movement to ban the police from using facial recognition technology is growing. This summer, reporters at the New York Times and Detroit Free Press revealed that Detroit police officers used faulty facial recognition to misidentify and wrongfully arrest two Black men, one for supposedly stealing watches and the other for allegedly grabbing someone else’s mobile phone. Recent reporting at Gothamist revealed that the New York Police Department deployed facial recognition technology to investigate “a prominent Black Lives Matter Activist.”

Technology companies have been harshly criticized for providing law enforcement with facial recognition technology. While IBM


Technology like thermal imaging is little more than security theater

Metro passengers are captured by thermographic, or body temperature measurement, cameras deployed to find those possibly infected with the coronavirus, in Panama City, on April 21, 2020. Photo: Luis Acosta/AFP/Getty Images

This op-ed was co-authored by Evan Selinger, professor of philosophy at the Rochester Institute of Technology, and Brenda Leong, senior counsel and director of A.I. and ethics at the Future of Privacy Forum.

The lockdown on commercial industry and personal activity in response to the global Covid-19 pandemic has been in place for almost two months in many parts of the U.S. Due to financial desperation and frustration with isolation, nonessential businesses are starting to reopen and more people are going out in public despite ongoing health concerns.

Seeking to frame this economically driven agenda with a veneer of public…


There’s an unspoken sadness whenever we join a video call

Photo: Robert Nickelsberg/Getty Images

Like so many families, mine is trying to keep things together during the pandemic by scheduling Zoom time. We Zoom to celebrate birthdays and holidays, catch up, and pass the time by playing games and solving puzzles. This is the new normal of socially distancing together.

Unfortunately, I’ve started experiencing what’s come to be known as “Zoom burnout,” or sheer exhaustion after so many video chats. Don’t get me wrong: A locked-down world without video calls would be significantly worse — more socially isolating and economically devastating. …


To shield the privacy of marginalized communities, civil society must understand what they’re going through

Credit: Colin/Wikimedia Commons

Co-authored by Albert Fox Cahn

Nearly two years ago, Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, tweeted, “If you are a woman who has been sexually abused by a hacker who threatened to compromise your devices, contact me and I will make sure they are properly examined.” Despite her vast expertise, Galperin didn’t expect what happened next: an inbox flooded with requests for help from survivors of domestic abuse, a flood that continues to this day. A determined Galperin responded by launching a multi-pronged campaign against stalkerware.

Abusers install stalkerware in order to surveil, harass, and control…

Evan Selinger

Prof. Philosophy at RIT. Latest book: “Re-Engineering Humanity.” Bylines everywhere. http://eselinger.org/
