Technology and Human Rights… in Comics!

Roya Pakzad
Taraaz
Jan 3, 2020 · 5 min read

I have a 10-year-old living in me. She’s shorter than the average 10-year-old. She has long, straight brown hair and big, curious dark brown eyes: fierce, free, and more confident than my 32-year-old self.

My 32-year-old self is married, is planning to have kids, and has managed to line up some university degrees and a 4-page CV. She has some age lines around her lips and eyes (zero white hairs, though; I often say that the world has been tough on Middle Easterners, but at least it bestowed some slow-aging genes upon us!).

My inner 10-year-old sometimes frustrates me; she tends to awaken the childish, selfish self I try so hard to get rid of. Despite this, she also makes me seek joy in the small things: climbing trees, stealing fresh fruit, laughing until my stomach aches.

So, why am I telling you this?

I often find it difficult to explain to people outside my professional circle what it is that I do. I start with, “I’m a researcher and consultant in technology and human rights.” And when they give me a look that says “cool, but what the heck does that mean,” I start throwing out acronyms, sometimes veering into too much detail, sometimes being too vague, and always using jargon. To give you a sense of it, it took me 5 kilometers of a medium-paced walk to explain to my mom what I mean by “gender-based discrimination in Artificial Intelligence systems.”

But my whole passion for doing what I do is to show how tangible technology and human rights issues are, and how we live them every day.

My 32-year-old self has failed me many times in trying to reach this goal. But today, in honor of the holiday season (which allows you to be fun and silly), I want to let my 10-year-old tell you what I mean when I say that I work at the intersection of technology and human rights.

Roya working at her cubicle, not happy about her day-to-day job and thinking she shouldn’t spend her entire life doing this.

From 2012 to 2015, I was a DFT engineer working at AMD.

Don’t know what “DFT” means, you say? Well, to be totally honest with you…

Roya saying “Doesn’t matter” in response to what DFT means.

My work was challenging but not fulfilling. Luckily, I was also involved in volunteer work that I really enjoyed: working for a women’s rights non-profit in Austin, Texas.

Roya driving to a non-profit organization called AFSSA. She is happy about it.

This non-profit, Asian Family Support and Services (AFSSA), was doing great work helping women in abusive relationships. Volunteering there, I had many conversations like the following:

Roya at AFSSA, a woman telling her that her partner stalks her online and hacked her computer.

This was a real wake-up call for me. It led me to think about the effects of the work that I and my fellow technologists do every day — how it impacts people in the real world, and what the unintended consequences of our projects might be, beyond what our managers and executives were telling us.

Roya thinking about the unintended consequences of technology: GPS-enabled apps, CCTV, hacking and stalkerware

So in 2015, I decided to leave my job and go back to school to study human rights with a focus on the intersection of technology and human rights.

Roya at her desk studying human rights & tech: privacy-by-design for IoT devices, Internet freedom, UN treaties

I wrote my thesis on the role of information and communication technologies in the livelihood of refugees. (If you are interested, you can read it here, or a shorter paper here.)

Roya at a refugee camp discussing issues with social media and refugees. Rumors circulating on Facebook are one of the main concerns.

In 2016, there was a massive interest in developing apps and online services to address refugees’ needs. I saw this firsthand when I met with refugees and aid workers in Greece.

Throughout this period, you would often see headlines such as this: “Smartphones as a lifeline for refugees.”

A phone showing all apps for refugees: e-education, e-business, e-health, chatbots for legal services, blockchain, digital ID

But what about the risks?

Roya looking at her phone thinking about all the risks of apps for refugees: privacy, security, dignity

So now I’ve been involved in technology and human rights for the past four years. It’s a topic that’s always changing and is more relevant than ever.

Want to know more examples? Here’s one.

Automated decision-making systems can perpetuate our society’s racial and gender biases. Just like our kids, these systems learn from us: if we train them on data that reflects our biases and stereotypes, they will learn those biases and make decisions based on them.
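Here is a minimal sketch of that idea, using entirely made-up “historical hiring records” (the data, function names, and numbers are invented for illustration, not from any real system): a naive model that simply copies the hiring rates in its training data will faithfully reproduce whatever bias those records contain.

```python
# Hypothetical, biased historical records: equally qualified women
# were hired far less often than men in this invented dataset.
historical_hires = [
    ("man", True), ("man", True), ("man", True), ("man", False),
    ("woman", False), ("woman", False), ("woman", False), ("woman", True),
]

def hire_rate(records, gender):
    """Fraction of applicants of a given gender who were hired."""
    outcomes = [hired for g, hired in records if g == gender]
    return sum(outcomes) / len(outcomes)

def predict_hire(records, gender, threshold=0.5):
    """'Predict' by copying the historical rate: garbage in, garbage out."""
    return hire_rate(records, gender) >= threshold

print(predict_hire(historical_hires, "man"))    # True  (historical rate 0.75)
print(predict_hire(historical_hires, "woman"))  # False (historical rate 0.25)
```

Real machine-learning models are far more complex than this frequency lookup, but the failure mode is the same: nothing in the training step asks whether the historical outcomes were fair.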

Left: racial discrimination by an automated decision-making tool. Right: gender discrimination in hiring.
The drawing on the left is based on an image from ProPublica’s article, “Machine Bias.”

This is what data scientists and machine learning practitioners call…

Garbage in = Garbage out!

These systems are sometimes called “black boxes” because it is often impossible to know exactly how they arrived at a certain decision.

A black box and a magic wand showing that algorithmic decision-making is often uninterpretable

Another example of the intersection of technology and human rights is one you’ve almost certainly seen in the news, and maybe even had conversations about: how social media platforms affect our right to privacy, freedom of expression, safety, access to information, and more.

At the US Congress, people talking about social media issues: Russian bots, targeted ads, hate speech in Myanmar
(Joking about the Trump thing!!)

But there’s much more than this to think about when it comes to the human rights implications of the technologies surrounding us. And you can bet there’ll be even more in the years to come.

Roya thinking about other tech & human rights issues: Smart cities, public data, ethics for engineers, internet shutdowns
Iran’s map showing an Internet shutdown is from Small Media’s blog post, “Iran Shutdown Monitor.”

These are all things that will keep me busy in 2020! Going forward, I’ll post more cartoons on specific issues in my field.

If you are interested, subscribe to my newsletter, Humane AI, for new posts. Thanks!

Researching technology & human rights, Founder of Taraaz (royapakzad.co)