We Need to Talk About “Digital Ageism”

Bridging critical gerontology and data and information scholarship is a win-win.

Data & Society: Points · March 29, 2023

By Clara Berridge and Alisa Grigorovich

Photo by Damir Bosnjak via Unsplash

The long-term care industry is increasingly focused on using digital monitoring and artificial intelligence to supervise residents and older adults who require support in their homes. Driven by the promises of prevention, timely identification of health problems, and personalization in the context of staffing shortages, companies in this industry are collecting a range of data for predictive and automated decision-making. This data collection includes the repurposing of sensors and simple location tracking devices for digital contact tracing, as well as the development of newer products marketed to the senior living industry to comprehensively monitor older adults and, often, support staff. (These include services like CarePredict and Caspar.ai, which boasts “30+ actionable analytics.”)

Within the field of gerontology, aging research has long been critiqued for being “data-rich but theory-poor,” a status quo that results in a tragic lack of critical analysis when it collides with the expanded use of, and funding for, AI and surveillance technologies. With some exceptions, researchers have paid little attention to the potential harms these technologies pose to older adults, workers, and our broader communities — or to the ways they might contribute to ageism. (For more on ageism, see Reframing Aging.)

Critical scholarship in fields that have produced pioneering work on the impacts and harms of AI and algorithmically mediated decision-making has largely overlooked aging, and gerontology has largely failed to engage that critical work. We’re interested in ways that critical race, feminist, and disability scholarship and activism can be mobilized to illuminate algorithmic harms as they relate to the problem of ageism, harms that extend beyond the consequences of bias and problems of negative or inadequate representation.

How can critical data and information scholarship strengthen age studies?

Inspired by efforts to fight racism and sexism and to strengthen fairness and inclusion, momentum is growing to address the newly labeled problem of “digital ageism” and, with it, to “add age to AI.” The gist of these critiques will be familiar: older adults are represented poorly or negatively in data, design does not reflect the heterogeneity of older users, and a youth-centric tech industry excludes older adults. Even in discourse on digital ageism, the scope and nature of the problems introduced by AI are confined to and/or articulated on AI’s terms, rather than through an understanding of how AI technologies reorganize subjects and relationships regardless of their nominal fairness or inclusivity.

The concept of digital ageism is shaping up to follow the bias literature’s path, one in which structural ageism is under-accounted for, expertise stays in the realm of tech, and the default analytical whiteness of ageism is carried over. Still more elements of ageism are obscured by calls for inclusion. In addressing this, we see opportunities to engage the brilliant work of critical information and data technology scholars. To that end, in our article, “Algorithmic harms and digital ageism in the use of surveillance technologies in nursing homes,” we outline ideas for those of us in the aging space to engage generatively with critical race, feminist, and disability studies and activism. Grounded in the nursing home context, the paper underscores the need to address the racism and ableism on which the U.S. and Canadian long-term care systems are built.

The grounds for concern with the deployment of AI in nursing homes become obvious once we understand the problematic context of nursing homes themselves. Most are chronically understaffed institutions where residents have limited autonomy. AI and its predecessor devices, like closed-circuit cameras and bed-exit sensors, are positioned to further institutionalize residents and restrict their self-determination, which conflicts directly with both widely popular nursing home reform efforts and (less popular) abolition efforts. Engaging with critical scholarship on information and data technologies would enhance understanding of the social harms of AI in relation to the increasing adoption of surveillance technologies within nursing homes. In turn, it would enable those of us who study aging to take seriously the implications of increased surveillance and control of both older adults and workers, and to incorporate refusal into our analytical lens. It would also help us understand, accommodate, and develop alternatives to (individual) consent models, and call out the opportunity costs of prioritizing further investment in surveillance technologies over structural change in long-term care.

How could thinking through age strengthen critical data scholarship and activism?

Thinking about how critical information and data studies could benefit gerontology also raises the reverse question: How can non-gerontologists learn from turning to questions of old age, aging, and older adults, and from engaging with the problem of ageism? We have some ideas.

First, research on older adults’ agentic interactions with technology illustrates the need for refusal scholarship to include distributed (non-organized) practices of resistance and refusal. (The lens of refusal is only just beginning to be turned to old age.) The drive to counter the ageist narrative that older adults are technologically incompetent or uninterested passive actors can drown out their resistances and refusals. For example, older adults who refuse a given technology — such as activity tracking sensors — may be taking a stance against the devaluation of privacy. But studies that seriously engage with the basis of older adults’ non-tech-positive preferences and choices are quite rare.

Expressions of refusal are seldom grounded in coordinated political action, yet they hold value, often of a political nature. Take an example from the field: a Medicaid-eligible resident in her 80s angrily confronted her senior housing building’s social worker over the unfair treatment of a fellow resident. This resident had support from a home aide in her independent living apartment, but her friend and fellow resident had been denied the in-home aide she needed and was instead offered a sensor system to detect changes in her routine and inactivity. Finding this inadequate, the resident demanded that a home aide be hired for her friend, too. The social worker characterized this resident’s reaction and demand for access to an aide as “socialist thinking” because she understood what the resident would not accept: that one can be near-poor and unable to afford a home aide, yet ineligible for any assistance. This resident refused to accept the technological solution on offer, and she refused to accept the U.S.’s stringent means-tested long-term care system as just. Her refusal included demands for what she deemed necessary and acceptable to benefit a fellow resident.

There are significant barriers to political organizing around issues of ageism. There’s no mass movement to reclaim “old.” Being placed in an othering category can be isolating. And while the need for long-term care is among the most shared human experiences, a socio-political system that privatizes long-term care isolates individuals and their networks of care. Experiences of older adults offer researchers and activists another way to understand the importance of accounting for distributed moments of resistance and refusal.

Second, research on older adults in nursing homes demonstrates the need to develop more ethical and power-aware approaches to AI ethics and surveillance technologies, ones capable of protecting the interests of individuals who cannot participate in informed consent or who are treated as though they cannot. Surveillance technologies often target people living with dementia, who gradually lose the capacity to give, refuse, or withdraw consent. Decisions about the use of surveillance technologies are often ceded to family members or legal representatives, who tend to be more enthusiastic about these technology practices than older adults themselves. Moreover, given ongoing staff shortages and gaps in residents’ and staff members’ awareness of data flows, ensuring that consent is an ongoing process — that people can change their minds — is practically impossible in the average nursing home, especially when an investment in a given device or surveillance system has already been made. In this way, reliance on informed consent as the ethical benchmark serves as a legal loophole to circumvent more critical questions about what and whose values we are protecting. How is the best interest of an economically vulnerable resident living with dementia assessed, given unbalanced power and countervailing interests? While this is not a new consideration for critical data and information scholarship, the limitations of a consent model in the nursing home setting add weight to the need to develop creative articulations of this ethical problem — and creative solutions.

We cannot consider the harms of surveillance technologies to residents of long-term care facilities without also considering their implications for direct care workers, who are disproportionately BIPOC and immigrant women. Care workers in nursing homes are in an environment that is already characterized by “ambient criminalization.” The use of surveillance technologies is likely to reinforce racist labor inequalities by extending the power of employers to control low-wage BIPOC workers and to undermine their labor rights and protections.

Finally, centering age within analyses of surveillance technologies demonstrates the wider, multigenerational implications of the potential social harm of automating ageism. The use of these technologies as proposed today could reify and further entrench cultural norms about old age and care. Surveillance technology in elder care has the potential to help shape how people of all generations view their futures, and it risks entrenching an impoverished imaginary of ways of relating through care in old age. The reach of these potential harms demands broader attention.

Clara Berridge is an associate professor in the School of Social Work at the University of Washington and core faculty in the Disability Studies Program. She developed the first tool to engage people living with dementia in informed decisions about technologies used in their care.

Alisa Grigorovich is an assistant professor in the Department of Recreation and Leisure Studies at Brock University in St. Catharines, Ontario. Her research primarily focuses on the design and use of gerontechnologies and their social, ethical, and policy implications.
