Technology and Human Rights… in Comics!
I have a 10-year-old living in me. She’s shorter than the average 10-year-old. She has long, straight brown hair and big, curious dark brown eyes: fierce, free, and more confident than my 32-year-old self.
My 32-year-old self is married, is planning to have kids, and has managed to line up some university degrees and a 4-page CV. She has some age lines around her lips and eyes (zero white hair though — I often say that the world has been tough on Middle Easterners, but at least it bestowed some slow-aging genes upon us!)
My inner 10-year-old sometimes frustrates me; she tends to awaken the childish, selfish self I try so hard to get rid of. Despite this, she sometimes makes me seek joy in small things: climbing trees, stealing fresh fruit, laughing until my stomach aches.
So, why am I telling you this?
I often find it difficult to explain to people outside my professional circle what it is that I do. I start with, “I’m a researcher and consultant in technology and human rights.” And when they give me a look of “cool, what the heck does that mean,” I start throwing out some acronyms, sometimes veering into giving too many details, sometimes being too vague, and always using jargon. Just to give you a sense of it, it took me 5 kilometers of a medium-paced walk to tell my mom what I mean by “gender-based discrimination in Artificial Intelligence systems.”
But my whole passion for doing what I do is to show how tangible issues about technology and human rights are, and how we live them every day.
My 32-year-old self has failed me many times in trying to reach this goal. But today, in honor of the holiday season — which allows you to be fun and silly — I want to let my 10-year-old tell you what I mean when I say that I work at the intersection of technology and human rights.
From 2012 to 2015, I was a DFT engineer working at AMD.
Don’t know what “DFT” means, you say? Well, to be totally honest with you…
My work was challenging but not fulfilling. Luckily, I was also involved in volunteer work that I really enjoyed: working for a women’s rights non-profit in Austin, Texas.
This non-profit, Asian Family Support and Services (AFSSA), was doing great work helping women in abusive relationships. Volunteering there, I had many conversations like the following:
This was a real wake-up call for me. It led me to think about the effects of the work that I and other fellow technologists do every day — how it impacts people in the real world, and what the unintended consequences of our projects might be, beyond what our managers and executives were telling us.
So in 2015, I decided to leave my job and go back to school to study human rights with a focus on the intersection of technology and human rights.
I wrote my thesis on the role of information and communication technologies in the livelihood of refugees. (If you are interested you can read it here, or a shorter paper here.)
In 2016, there was a massive interest in developing apps and online services to address refugees’ needs. I saw this firsthand when I met with refugees and aid workers in Greece.
Throughout this period, you would often see headlines such as this: “Smartphones as a lifeline for refugees.”
But what about the risks?
So now I’ve been involved in technology and human rights for the past four years. It’s a topic that’s always changing and is more relevant than ever.
Want to know more examples? Here’s one.
Automated decision-making systems might perpetuate our society’s racial and gender biases. Just like our kids, these systems learn from us: if we train them on data that reflects our biases and stereotypes, they will learn those biases and make decisions based on them.
This is what data scientists and machine learning practitioners call…
Garbage in = Garbage out!
These systems are sometimes called “Black Boxes” because it is often not possible to know exactly how they made certain decisions.
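Here’s a toy sketch of how “garbage in, garbage out” can play out in practice. Everything below is invented for illustration — a hypothetical hiring model that does nothing smarter than learn approval rates from past decisions, and so faithfully reproduces the bias baked into them:

```python
# Toy illustration: a "model" that learns hiring rates from biased
# historical decisions and then repeats that bias in its predictions.
# All data below is made up for illustration.

from collections import defaultdict

# Biased history: equally qualified candidates, unequal outcomes.
history = [
    ("female", "qualified", "rejected"),
    ("female", "qualified", "hired"),
    ("female", "qualified", "rejected"),
    ("male",   "qualified", "hired"),
    ("male",   "qualified", "hired"),
    ("male",   "qualified", "rejected"),
]

def train(records):
    """Learn the fraction of past candidates hired, per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
    for group, _, outcome in records:
        counts[group][1] += 1
        if outcome == "hired":
            counts[group][0] += 1
    return {g: hired / total for g, (hired, total) in counts.items()}

def predict(model, group):
    """'Hire' if the learned rate for the group exceeds 0.5 —
    the historical bias carries straight over into new decisions."""
    return "hired" if model[group] > 0.5 else "rejected"

model = train(history)
# Two equally qualified candidates get different predictions:
print(predict(model, "male"))    # hired
print(predict(model, "female"))  # rejected
```

A real system would use far more features and a real learning algorithm, but the failure mode is the same: the model has no notion of fairness, only of the patterns — including the unfair ones — in its training data.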
Another example of the intersection of technology and human rights is one you’ve almost certainly seen in the news, and maybe even had conversations about: how social media platforms affect our right to privacy, freedom of expression, safety, access to information, and more.
But there’s much more to think about when it comes to the human rights implications of the technologies surrounding us. And you can bet there’ll be even more in the years to come.
These are all things that will keep me busy in 2020! Going forward, I’ll post more cartoons on specific issues in my field.
If you are interested, subscribe to my newsletter Humane AI for new posts. Thanks!