That Coding Yogi Antonella Avogadro on Social Impacts of Tech Algorithms

Robyn Smith
Published in HerProductLab
9 min read · Jan 25, 2021

When you work in product, you probably spend a lot of time asking yourself, “How can my product solve my customer’s problem?” The thing is, not every customer is the same: they come from different backgrounds, genders, races, and more. As a product manager, how can you be an advocate for every customer? We sat down with Antonella Avogadro, a computer science student at Florida International University, who shared with us why she thinks universities need to do a better job defining and teaching ethics and diversity in technology.

Could you just give us a brief overview of your background? I know you’ve had a hand in a lot of different types of work, which is cool. What got you interested in product?

I was born and raised in Argentina, where I first studied apparel and textile design. I did freelance illustration work for a few years before transitioning into computer science. I’ve been involved in a lot of computer science teaching: I’ve taught kids from ages seven to 18, including at the high school level. I’m also involved in research, specifically about the experiences that women of color have in computer science education and how we can improve those experiences so that retention rates are higher and we can bring more women into STEM fields. That’s mostly where I’ve found my passions within computer science: bringing in the human aspect, understanding how technology affects individuals who are different from myself, and figuring out what we can do to make it more inclusive, not just in its creation but also in its application.

These technologies are being created by humans, so all of the systemic racism we have within our social consciousness gets infiltrated into them.

That’s where I fell into my interest in product: how can we create products that have a good impact on everyone and don’t inadvertently discriminate against anyone?

Yeah, that’s a perfect transition into the next question: what are some vital things to consider when you’re creating a product in order to be more inclusive, like you said?

I’m sure there are many smart women who are successful in the field and have much more comprehensive answers to this. But from the perspective of a spectator, a user, and a computer science student, one vital consideration that I think needs more emphasis when creating a product is the particular social environment it will affect and interact with.

I believe this is one of the biggest considerations that is lacking right now. A recent example is the controversy Twitter had last September, when users noticed that its image-cropping system was prioritizing white faces over Black faces. When the timeline shows a preview of a posted picture, if the picture contained both a Black man and a white man, the system would completely crop out the Black man, no matter where he appeared in the image.
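For context, that preview reportedly relied on a learned saliency model: the crop is centered wherever the model scores “interestingness” highest. Here is a minimal sketch of that general pattern (hypothetical code, not Twitter’s actual implementation) showing why any bias in the saliency scores passes straight through to the crop:

```python
import numpy as np

def crop_to_salient_region(image: np.ndarray, saliency: np.ndarray,
                           crop_h: int, crop_w: int) -> np.ndarray:
    """Center a fixed-size crop on the highest-scoring pixel.

    `saliency` is an HxW score map produced by some learned model.
    The cropping logic never sees race at all; it simply trusts the
    scores. If the model systematically scores white faces higher
    than Black faces, this function will just as systematically keep
    the former in frame and crop the latter out.
    """
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    h, w = image.shape[:2]
    top = int(np.clip(y - crop_h // 2, 0, h - crop_h))
    left = int(np.clip(x - crop_w // 2, 0, w - crop_w))
    return image[top:top + crop_h, left:left + crop_w]
```

The point of the sketch is that the “decision” lives entirely in the upstream model and its training data; auditing the visible cropping code would reveal nothing.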

We’re starting to become more aware that technology is not created in a bubble. We see computers as neutral, unbiased deciders that can make decisions for us, but they are not. Deciding who gets incarcerated, what penalties people get, who gets a loan and who doesn’t: we are leaving machines with the task of making all of these decisions for us so that no person injects personal bias into them. But these technologies are being created by humans, and the information we put in to teach them what to decide is also human-created data, so all of the systemic racism we have within our social consciousness gets infiltrated into them whether we want it or not, because we’re not being conscious or mindful about keeping these things out of the products we create. It hasn’t been fully explored yet, because the technology is so new that we’re only starting to realize the impact it has on these issues.
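To make that concrete, here is a minimal, hypothetical sketch (synthetic data, scikit-learn) of how a model trained on historically biased loan decisions reproduces the bias even when the sensitive attribute is removed from the inputs, because a correlated proxy feature still carries it:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic applicants: `group` is a sensitive attribute the model
# never sees; `zip_score` is a proxy feature correlated with it
# (think neighborhood, school, or name-derived signals).
group = rng.integers(0, 2, n)
zip_score = group + rng.normal(0.0, 0.5, n)   # leaks group membership
income = rng.normal(50.0, 10.0, n)            # independent of group

# Historical labels encode human bias: at the same income, group 1
# was approved far less often.
logit = 0.1 * (income - 50.0) - 1.5 * group
approved = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# Train only on "neutral-looking" features; no `group` column.
X = np.column_stack([income, zip_score])
model = LogisticRegression().fit(X, approved)

pred = model.predict(X)
for g in (0, 1):
    print(f"group {g}: predicted approval rate {pred[group == g].mean():.2f}")
```

Nothing in the training pipeline mentions the group, yet the two printed approval rates differ sharply; the historical bias survives through the proxy. That is the “infiltration” in miniature.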

I think you brought up something really interesting; people view technology as impartial and objective. But it was still designed by people who live with these biases.

Exactly. The computing power placed behind these biases just amplifies what we already see in society without the technology.

It just opens the possibility for those biases to reach further and wider, faster. That’s really what we’re starting to see now.

I’m new to this, but it’s also interesting thinking of an algorithm as a product — that’s totally what it is.

So you were recently in an ethics class. What were some of the things you went over in the class? Any modern examples?

Clearly I’m very interested in the social and ethical concerns within technology, so I was very excited about this class because everything else I have to take is technical. I was excited to look into the social side and have meaningful conversations about it.

It requires us to look at ourselves and our own biases. If we are affected negatively and emotionally by these things, imagine how the people affected directly feel.

I was unfortunately very disappointed, because the subjects were very vague and surface-level. The things that, at least for me, feel most important and relevant, topics such as race and gender, were barely touched upon. Race in the context of technology was never covered, and I remember we had one lecture on police brutality that was questionable at best. It really put me off from the class completely.

We had one assignment regarding women in computer science where one of the answers was: “Women choose not to go into computer science because it’s nerdy and they just like doing other things. Nothing more to it.” Coming from being a research assistant in gender and computer science, and knowing how women fall out of the pipeline toward graduation and the stories around that, it felt uneducated, first of all, to hear someone say “it’s just that women are not interested,” because there’s such a social background to that. There are years and years of indoctrination, of a girl being told, consciously or subconsciously, “You shouldn’t do science; you’d be better at humanities.” There’s an excellent article in The New York Times Magazine, “The Secret History of Women in Coding,” that takes a deep dive into the history of women in computing and the research on why the rate of women in the field has dropped.

Now things are changing, but back when I was a kid, I didn’t feel like I could do something in engineering. I always thought that was a guy’s thing: “My brother’s going to go into engineering, but I’m not going to do that. I’ll do fashion design,” which is what I tried initially. Then I realized, “Oh, I’m interested in that other thing I didn’t pursue because I thought I wasn’t good enough.”

For some people, I guess it’s not that evident because it’s not their lived experience. And unless they get some formal education about the reality we live in as women, we get that alienating, surface-level discourse that lacks an empirical grounding, with people from other backgrounds dismissing our struggles as a simple lack of interest.

The things we went through all came from a utilitarian perspective: if it serves the common good, the largest number of people, then it’s okay. I feel like that simplifies things way too much, because then you’re leaving out people in the margins whom we should consider. They may not be in the majority, but we’re affecting them in awful ways.

By the end of the course I spoke up about my concerns regarding the gender biases in some of the assignments, and thankfully the response I received from my professor was very gracious. He expressed a willingness to hear why I found the questions biased and how I would reword them. He was very vocal about wanting to do better and even expressly said he had failed. That response gave me a lot of hope.

My biggest concern is that it seems to be our only required ethics course: the one point of contact we have, as computer science majors, with the ethical implications of what we are creating.

This is why I believe so strongly that we need a lot more education within the degree about race and gender studies.

You previously mentioned that a lot of companies are now trying to ensure their staff are more inclusive and diverse, and that that alone isn’t necessarily leading to a diversity of ideas or to solutions that are fairer for everyone. We know that diversity and inclusion in product is a must. Beyond making sure your staff is diverse, what do you think is the best way to achieve this diversity of ideas and create things that are more accessible to people?

I believe there are two main components. The first is knowing that a product we make with good intentions, one that may help some people, may not only fail to help others but actually make their circumstances worse. It could be Indigenous people, Black people, Asian people: any background that we are not treating as the “neutral” during the creation process. The neutral point tends to be white, and only then do we look at other groups and other races. So we need to change that approach from a more academic point of view: having a background in racial studies to understand people’s experiences, and starting from a base of looking at everyone, rather than designing for one group and then expanding to the sidelines.

We can’t really make ethical decisions and choices if we don’t know who we are deciding for or what their experiences are like.

The second one is just listening. Having a genuine interest, and then cultivating that interest over time, to understand and listen to BIPOC people and their experiences. Having the willingness to set aside whatever discomfort we may have in those conversations. I know it can be a difficult avenue to take sometimes, because it requires us to look at ourselves and our own biases. If we are affected negatively and emotionally by these things, imagine how the people affected directly feel. So it’s about cultivating that mentality within us so that we can listen, understand how to be better allies, and come from a place of humility, not “Oh, I’m feeling attacked, so I’m not even going to try.”

I feel like that can relate back to a lot of conversations in society.

Exactly — we can feel very attacked and we have a hard time stepping out of that place of ego.

There was an article recently in which AOC received pushback for making the statement that AI can have bias. This sort of reinforces your point, doesn’t it?

Wow — that’s very unfortunate, but that’s exactly why we need to have a lot more of these conversations and share things online that relate to this, because that’s the only way we can start normalizing the thought that, ‘Wait, we should be analyzing these things from a rational point of view and looking into how these machines actually work.’

What do you think is a better way to structure teaching ethics to people going into product and technology?

I think it takes the courage to go beyond the superficial, and not even just courage but the willingness to put in the work, because it takes a lot more mental energy, time, and effort to consider all of these different avenues and possibilities. And then we need compassion, too. There has to be, at the very least, a sliver of compassion for people who are different from us. We can’t really make ethical decisions and choices if we don’t know who we are deciding for or what their experiences are like.

You have to ask: what is the greater good I am choosing to focus on? The course was very focused on productivity; if something led to productivity, then it was most likely ethical. That framing doesn’t take human experience into consideration. We make things and do things, but outside of all that, we just exist. How will these things affect us while we are simply existing? That’s what I worry about.

Want to learn more from Antonella and our talented network of women in product? Check out our website where you can sign up for our monthly newsletter, and our learning series — on Feb. 5, Antonella is hosting! For daily updates, follow us on Instagram and LinkedIn.
