Interview: Responsible AI with Anna Bethke
How can we more fully consider building AI for social good?
For the 3rd episode of our interview series with AI experts, we had a great conversation with Anna Bethke!
Anna Bethke is a Principal Data Scientist focused on fair, accountable, transparent, & explainable (FATE) AI on Salesforce's Ethical AI Practice Team, collaborating with product and research teams to create AI responsibly and to empower customers to use it responsibly. They research and implement innovative techniques for assessing and mitigating bias and harm in AI. Anna received their MS and BS in Aerospace Engineering from the Massachusetts Institute of Technology, concentrating on Human Factors Engineering. Anna was formerly the Head of AI for Social Good at Intel and previously worked at Facebook, MIT Lincoln Labs, Argonne National Labs, and Lab41.
Today, we dive into:
- Human factors engineering
- AI for social good in the industry setting
- Life as a gradient ascent
Key Responses & Takeaways
Note: this section has been edited and shortened for clarity. Please scroll to the next section to watch or listen to the full interview.
Could you tell us more about your background? What led you from Aerospace Engineering to AI Ethics?
It definitely seems like a little bit of a circuitous road… So one of the key things that led me here was that human factors engineering experience. When I was doing my Aero Astro [Aeronautics & Astronautics] degree, I was in the Humans and Automation Laboratory. There, we were trying to figure out how humans and computers can best work together to complete a task. Specifically, this was for a lot of flight operations. Flying a plane is very complex; the larger the plane, the more types of complexity there are. So as a designer you have to figure out: what should a computer do? What should a person do? When is it appropriate? When is it not?
That gave me a huge introduction to automated systems as well as to display design and statistics, because in human factors engineering you're always doing a lot of user tests. And that gave me the jump I needed to…