Design by Annalise Huynh

Not like the aunties

Digital Justice Lab
May 7, 2019 · 6 min read

By Aliya Bhatia

How the Canadian government is boosting the surveillance of immigrants without any regard for their humanity

I loved living in Regent Park. I moved there in 2015 after graduating from university, having already worked there for years as a political campaign organizer. As an organizer, I would knock on thousands of doors and speak to hundreds of aunties.

Regent Park is one of the most diverse neighbourhoods in Toronto and across Canada, with almost 60% of residents reporting that English isn’t their primary language. Languages from Bangladesh, Sri Lanka, China and Taiwan, and Eastern Africa can be heard throughout the neighbourhood. The scent of turmeric and browning ginger wafts through the hallways of the drab brutalist buildings typical of public housing projects built in the late 70s. Later, while on a campaign, I would play a game with the candidate I worked for where we would guess, door to door, what was for dinner that night based on scent, and often an illicit taste. (Sometimes residents would offer us food; we were discouraged from taking it, although if pressed, I would sneak a bite.)

During this time, I spoke to thousands of residents: voters and nonvoters, children and their parents, seniors, newcomers and fourth-generation Canadians. But aunties were omnipresent. Aunties, the mothers and older women who resided in the area, played an integral role as an informal network of surveyors of the community; they knew everyone’s business. Once, when I asked a group of aunties what their favourite part of living in Regent Park was, they smiled and unanimously said, “the kids always have moms looking out for them.”

The community they described was akin to a local public housing panopticon: a network of mothers and aunties looking out for children who weren’t biologically theirs and for teenagers they had informally begun to feed, chastise, and, when necessary, impose curfews on. The panopticon, as philosopher Michel Foucault described it, is a prison design with a central watchtower from which administrators can watch inmates at all times without being seen themselves. Because the administrators cannot be seen, they do not actually need to be there; the fear of being watched and chastised leads inmates to discipline their own behaviour without any actual monitoring. Similarly, the kids of Regent Park, the aunties noted, never knew when an auntie was watching, but they knew they were being watched.

Regent Park is used to surveillance of a more pernicious kind. For years, it was stigmatized for being home to a large population of people of colour and, in particular, low-income communities. A quick Google search of “Regent Park Toronto” returns a similarly myopic view of the downtown neighbourhood, with suggested terms including “shooting”, “gangs”, “safety”, and “crime.” In response to this stigma, politicians of every political affiliation have had no problem boosting police presence under the guise of increased criminality, a dog whistle if I’ve ever heard one. Just last year, newly re-elected mayor John Tory announced an expansion of overnight police presence in downtown “hotspots” in response to growing “community concerns”. And mere months ago, Selwyn Pieters, self-proclaimed Mr. Toronto Lawyer on Twitter, joined a group of lawyers fighting the racial profiling of youth in the area, saying, “They’re giving the kids tickets for crossing on the red light and a group of white people crossed on the red light and nothing happened”. Complaints have come in the form of protests at police stations, Know Your Rights workshops in Regent Park, and town hall meetings. Nevertheless, investments continue to fuel the police-accompanied surveillance of residents of Regent Park and beyond.

But that’s just a small glimpse of the problem. The Canadian government, in May of this year, started soliciting proposals from private companies for new digital tools to assist in making humanitarian visa decisions. Several problems arise from this decision. First, there is the collection and privacy of data that could give a skewed representation of immigrants and immigration applicants. The data used to feed these digital tools will include crime data, which means that if an individual has ever been stopped by the police, that stop can be used against them. This will disproportionately affect the many people in Regent Park who are immigrants or asylum seekers in the thick of the process and who are over-policed.

The government’s decision to create an algorithm to make immigration decisions is also worrying because it is being made in a policy vacuum: there are almost no laws, codified or otherwise, that give individuals rights over their own data, and no guidelines that determine what is and is not allowed.

At present, the government has paid over $20,000 in licensing fees to Blue J Legal, a legal technology company that has built similar algorithmic tools for tax decisions, to use an off-the-shelf tool. This has worrying implications, as decisions of this sensitive nature require nuance and human discretion, to say nothing of empathy, none of which an impersonal tool can offer.

The algorithm the government seeks will make computer-generated decisions about applicants applying for permits or residency under the vaguely termed “Humanitarian and Compassionate Grounds,” a claim often made by very vulnerable individuals.

Humanitarian and Compassionate Grounds applications allow those who were denied residency or a visa to appeal the government’s decision with a claim for asylum. The appeal is almost completely at the discretion of the court, and soon, of an algorithm. In assessing the claim, the court looks at whether the individual has met the usual requirements: residency in Canada, time spent in the country, and educational attainment. Additionally, and more worryingly, they hope to look at social media accounts.

Citizen Lab, Gizmodo, and others have pointed out the danger of using computer-generated decision making in matters of immigration. For one, no two immigration applications are the same, just as no two immigrants are the same. Relatedly, creating an archetypal immigrant or immigration application would be an exercise in futility, an attempt to distil a person into an impersonal rubric. My family’s own experience applying for Canadian citizenship is an example of this: my father’s application was stalled indefinitely because medical records relating to a complex chronic illness were misunderstood (something that is happening increasingly in the United States, too).

The prospect of an applicant’s claim to reside in Canada being judged by agents looking at their social media data, or any qualitative data for that matter, should terrify us. This kind of analysis is tantamount to surveillance and will have long-lasting repercussions on the behaviour and expression of immigrants. Monitoring applicants’ behaviour online will only discourage applicants and those wishing to reside in Canada from expressing their views online or amongst friends on platforms like WhatsApp, a platform already known to supply information to governments under the guise of national security.

Another concern is that censoring their own speech in order to stay under the radar of the immigration algorithm may keep immigrants from truly becoming engaged in the political fabric, lest something they say online be seen as contradicting Canadian values. Studies show that political participation and speech online dampen when people know they are being monitored, consistent with Foucault’s theory of the panopticon: when you know you can be watched at any time, you obey all of the time.

Unlike the aunties of Regent Park, the watchful eyes of the police and of an immigration algorithm are far from forgiving, let alone caring. Where aunties understand the cultural context, the dynamics and relationships between their children and their friends, and their motivations, the surveillance tools pursued by the government have no regard for the humanity of these individuals, seeing a predominantly low-income community of colour as criminal and, worse, as undeserving of a place in Canadian society. These tools will birth a generation of fearful immigrants kept outside the sphere of political engagement and full civil liberties. Before deploying new tools for the surveillance of communities in Regent Park and beyond, the government must pursue consultation, outreach, and a deep understanding of the motivations of these immigrants. For now, immigrants in Regent Park are under the watchful eye of aunties. Perhaps the Canadian government would be better off consulting with them to perfect its immigration decisions.

The author, Aliya Bhatia, can be found here.

This article is part of a new series hosted by the Digital Justice Lab that explores technology and its impact on our daily lives.

The series is funded by a Shuttleworth Foundation Flash Grant received by Nasma Ahmed at the Digital Justice Lab. The Digital Justice Lab is a national organization that engages and collaborates with diverse communities to build alternative digital futures, with a mission to build towards a more just and equitable future. It is a member of the Tides Canada Shared Platform.
