Source: https://www.youtube.com/kids/

As tech companies continue to collect large amounts of personal data on Internet users, questions arise around the digital rights of societies and individuals. National, regional, and international bodies have scrambled to create laws and regulations that keep up with the rise of invasive technologies. However, regulation, industry practice, and current digital education have failed to adequately address the protection and rights of one of the largest and most vulnerable populations online: children. Children represent one-third of all Internet users and are susceptible to both the benefits and the harms of the digital world. This three-part series will explore local and global factors affecting child privacy online. …


New technology is often initially met with scepticism, with a small number of innovators and early adopters paving the way for the majority to eventually move beyond their initial reservations. What causes these innovators, early adopters, and eventually the public to adopt new technologies? Having confidence in something new requires trust. When people overcome key trust barriers, they can take a leap into the unknown and feel more comfortable using new technologies that once seemed alien.

However, sometimes new innovations are discarded because the trust barriers prove insurmountable. Innovation in and adoption of automated facial recognition (AFR) surveillance cameras can be linked to three key trust barriers: fear of Big Brother, the tension between privacy and security, and a general fear that the technology will make costly mistakes. …


Source: FaceApp

You can’t look anywhere on the internet anymore without seeing celebrities, friends, and family using the popular FaceApp age filter to imagine what they’ll look like in 50 years. While it may seem like an innocuous and amusing way to spend 10 seconds of your time, FaceApp’s technology isn’t exactly harmless.

For one, the app has a shaky policy when it comes to accessing your photos. Some have speculated that the app uploads users’ entire photo libraries in the background, but these claims remain unsubstantiated. Still, there are concerns that users who have set photo access to “never” can still share individual photos with the app. It is likely that as people upload their photos to the app, they don’t have a full understanding of what they are actually handing over to this company. FaceApp’s own privacy policy acknowledges: “We may share User Content and your information (including but not limited to, information from cookies, log files, device identifiers, location data, and usage data) with businesses that are legally part of the same group of companies that FaceApp is part of, or that become part of that group”. …



Earlier this month, the UNHCR reported that it had rolled out an iris-identification program (a form of biometric identification) that gave Rohingya refugees a form of identification for the first time. The initiative was met with plenty of positive press, with many applauding IrisGuard, the private entity that developed the technology, for providing new opportunities for refugees. Having an identity will allow refugees access to basic human rights and economic opportunities, and will provide additional data to aid agencies tracking population movements.

This sounds like a win-win-win: refugees get an identity, aid agencies get more accurate data, and tech companies get access to data and global recognition. But my alarm bells went off as I recalled Anand Giridharadas’s concept of the win-win fallacy: there are likely to be unintended consequences that disproportionately harm the most vulnerable (in this case, the refugees). My discomfort arose from two blind spots this program seems to have missed: lack of true choice, and data security. Overlooking these crucial blind spots could both perpetuate existing human rights inequities and further endanger the lives of those who are tracked. …



“We have to move beyond talking about AI for good and AI ethics. We simply cannot build just, equal, and fair automated systems on top of corrupt toxic sludge.”

Tanya O’Carroll’s mic-drop statement at the end of her talk brought on a raucous round of applause at the Artificial Intelligence (AI) and Human Rights panel at the 2019 Skoll World Forum. She had just addressed the major elephant in the room: the business models built on extracting personal data at any cost, which have ushered in an age of surveillance capitalism, need to be challenged.

The AI and Human Rights panel at the 2019 Skoll World Forum brought to light some incredibly pertinent insights at the intersection of technology, society, and business. First, there was a questioning of “AI for good” and a plea for nuance: discussions of AI ethics should build on what has already been done in human rights. Second, there was a challenge to the capitalist structures that created the need for “AI for good” in the first place. Lastly, in keeping with the optimistic nature of the Skoll World Forum, there were examples of the power of collective genius in addressing human rights challenges in the digital age. …


A common refrain in news coverage about the spread of the Internet is that it is a potential equalizer in society because it gives new power to those who do not traditionally have it. Recent coverage touts digital technologies as having the ability to alleviate gender inequalities, particularly in emerging markets, because these technologies theoretically empower women to move beyond societal norms surrounding gender roles. However, if we examine the issue more closely, questions arise. How do women actually use technology that may not have been created for them in the first place? Does this technology alleviate or exacerbate existing inequalities? …


Marking your territory

Driving to BKC today, I saw seven different men peeing on the side of the road. They didn’t even bother to try and find a secluded enclosure; they simply planted their feet firmly right next to their car or bike and generously turned away from oncoming traffic.

This isn’t anything new in India. I’ve noticed it from a young age, always recoiling in disgust and shaking my head. I attributed it to poor personal hygiene habits and a blatant cultural disregard for public places.

But yesterday I thought that these unfortunate incidents I was unwillingly privy to may symbolize something more than just poor hygiene. The only perpetrators are men. Men who feel that public spaces are their spaces. Men who feel they can stare at, harass, and rape women in public places because a public place isn’t really for the entire public. Public places are reserved for men only, and the men I saw on my drive were demonstrating this reality. …


Lessons in bilingual texting

I received a slew of WhatsApp messages on my phone today from an unknown number. “Hy tulsi Kaiesi ho” (“Hi Tulsi, how are you?”) was the first message, followed by “Helo”, and “Fn nhi Uthati mera” (“You don’t pick up my phone calls”). It looked like gibberish to me, and I was about to block the sender when my sister pointed out that this might not in fact be a spammer. This was Divya, our nanny/housekeeper from about 10 years ago.

Divya had recently come to our house to visit us. She was a wonderful addition to our family for the short time she was with us in Bangalore. …

About

Tulsi Parida

I study the Internet, social justice, and responsible business at the University of Oxford. I care about intersectional feminism and inclusive tech.
