Towards an Ethic of Care

Introduction: Care Ethics and Care Politics in Technology Labor

I’ve been thinking about care work and technology lately, and last week, I gave this talk at Open Source Bridge, here in Portland. Here’s Part I.

“Caregiving is the act of providing assistance and support…Caring for others generally takes on three forms: instrumental, emotional, and informational caring. Instrumental help includes activities such as shopping (or) cleaning… Caregiving also involves a great deal of emotional support, which may include listening, counseling, and companionship. Finally, part of caring for others may be informational in nature, such as learning how to alter the living environment of someone in the first stages of dementia.”

-Drentea, P. (2007). Caregiving. In G. Ritzer (Ed.), Blackwell encyclopedia of sociology. Blackwell Publishing (via Work and Family Researchers Network)


I’m here today to talk about care: the work of caring, the costs of caregiving, the visibility and invisibility of that work, and how all of this figures into how we build, support, and maintain software and other technologies.

First, I’ll introduce the concept of care ethics, and then I’m going to explore the political dimensions of care and the importance of how we think about care work, especially in relation to technology.

Then, I’m going to talk about the relationship between caregiving and technology and about some of the ways I have observed caregiving take place in and around technology, along with some of the implicit assumptions that surround these dynamics.

To illustrate this, I’m going to draw on some examples from the great Texts of television, namely, Cheers and Star Trek.

Then, to conclude, I’ll briefly explore the business case for taking a different approach to care work in technology companies, both in our workplaces and in our work. More specifically, I want to propose some measures for how we might adjust our thinking and allocate resources to support and recognize care work in the interest of productivity, innovation, and inclusion.

So, who am I to talk about this, and why should you listen to me? I’m going to give you a few reasons. First, I’ve been working in technology and academic STEM settings for ten years now (longer if you count the years I spent as a librarian, which was actually the only job I’ve had where I wrote code most of the time), and I’ve been a parent for five, long enough to see the statistics on burnout and attrition come to life in a very real way.

My perspectives on care and caregiving have radically shifted during this time, both in my professional and personal lives, and often in ways that intersect, in questions like “Should I put my partner’s career before my own?”, “Should I shift to a career that is less rigid than academia?”, “How do I support my own parents as they care for my aging grandparents?”, “Should I work late tonight and get a sitter for my daughter?”, and “How much should I pay my daughter’s sitter?”

I speak from my own experiences, and my own biases and privileges, and I’m indebted to the friends and colleagues who have shared their experiences with me, and to the other folks who have shared, collected, and analyzed these issues in a public way.

Secondly, I was trained as a feminist scholar in the field of science and technology studies, which means I’ve been working at this intersection of technology and labor, race, class, and gender. Although I was trained as a social scientist and had, during my time researching human-computer interaction, searched for frameworks to understand labor and gender, I found that so much of traditional scholarship ignored how caregiving figures into almost any human endeavor.

It’s a big part of what I do as a User Experience consultant: look at situations where people use technology for a given purpose, and look at how folks work with and around the capabilities and limitations of technology and find ways to improve the working relationship by improving the technology. Doing this makes you really aware of systems in play, both for humans and for the computing they do. You can’t turn it off when you’re off the clock.

This is an ongoing project, and I’m grateful to have this audience to share it with and get feedback. I’ve been thinking about care work and technology for a while, and I realized this spring, writing in response to some personal accounts that care and support workers in tech companies had shared, that there was a lot more work to be done, and that it was a crucial, timely topic.

This project is an attempt to turn the lens a bit: how do we build better interactions, better systems, for human care work? How do we recognize and, if not quantify, at least explore the qualitative value of care work in technology and in society as a whole?

Care Ethics

In the early 1980s, the Harvard psychologist Carol Gilligan launched a study of teen girls. According to all of the mainstream psychological metrics in place, from Freud to Piaget to Lawrence Kohlberg, girls were deemed inferior, not as developed. They scored lower on quote-unquote moral reasoning.

Gilligan’s study set out to examine this and to offer a different perspective. The participants in her research, adolescent girls, weren’t deficient, she found. She argued, instead, that the standards of moral reasoning were. Instead of thinking of rights and wrongs as absolutes, the girls thought about context, about relationships. Rather than abstract ideas about justice and truth, her research participants had an advanced ethic of care, which she summarized as:

ethics of justice and care, the ideals of the human relationship — the vision that self and other will be treated as of equal worth, that despite differences in power, things will be fair; the vision that everyone will be responded to and included, that no one will be left alone or hurt.

This concept sparked discussion across philosophy and applied ethics, and it was super resonant, especially in domains that tended to be devalued by the usual patriarchal or capitalist frameworks: in nursing, in medicine, in teaching, the so-called “caring professions.” It also caught on as a way to think about the social and cultural aspects of care in industry. Aside from the market-assigned values of products and services, do organizations do a good job of treating people fairly, or do they leave certain folks behind, either systematically or anecdotally?

I’m going to spend most of my time in this talk exploring how care and care ethics play out in tech, and in open source in particular. I’ll come back to this, but first, I want to look at the political contexts of care.

Care Politics

I will warn that this is a very brief and incomplete review of the politics of care and the activism around it. These are references for how we might think about how care has been politicized: in first- and second-wave feminism’s exploration of care work; in the civil rights movement and the significance of African American domestic workers’ political organizing in the 1960s; and in how caregiving became grounds for political action and organizing in the queer community in the face of the AIDS crisis. To conclude, I’ll look at care in the face of contemporary austerity politics.

Feminism and care work

I’m going to give a purposely light exploration of the relationship between feminism and care work, because it’s not one with a tidy summation. Both first- and second-wave feminism’s push to quote-unquote liberate women from care responsibilities gave us no real change in how we value care work, and no real alternative other than relying on the underpaid labor of women of color.

Yet it is worthwhile to look at how recognizing the work of caring has been a central tenet of much feminist exploration. Again, feminism’s relationship to the labor of poor women and women of color continues to be problematic and annoying, but looking at these problems is pretty illuminating.

More work for mother: Marxist feminist thought, material feminisms and technologies for living

Dreams of Feminist Utopian Architecture

The material feminists of the late nineteenth- and early twentieth-century first wave, such as Charlotte Perkins Gilman, focused on questioning the separation between domestic work and public life, advocating for collective resources for care, and adapting technology to suit them.

Dolores Hayden’s book about the movement, The Grand Domestic Revolution, has these great sketches of utopian “kitchenless houses,” industrial baby rockers, and collective housing communities with communal automated kitchens. Needless to say, it never got built; the rest of the twentieth century happened… and hi, maybe living in single-family houses and driving around in cars isn’t the best thing to have happened to everyone?

From the See Red Women’s Workshop

Another relevant point of interest in the second wave is Marxist feminism, and the work of Silvia Federici in exploring the relationship between domestic labor and capital. This work has influenced social movements ranging from the Wages for Housework campaigns of the 1970s to the contemporary Universal Basic Income movement.

I identify as a Caring Cyborg

Next, without much introduction necessary, Donna Haraway. She’s explored care work’s relationship to technologies since the early 1980s, and her work remains relevant to understanding how technology is culturally situated in our everyday lives. Her essay on the homework economy is oddly prescient in how it predicts our current state of precarity and technological integration in everyday life:

In the prototypical Silicon Valley, many women’s lives have been structured around employment in electronics-dependent jobs, and their intimate realities include serial heterosexual monogamy, negotiating childcare, distance from extended kin or most other forms of traditional community, a high likelihood of loneliness and extreme economic vulnerability as they age. The ethnic and racial diversity of women in Silicon Valley structures a microcosm of conflicting differences in culture, family, religion, education, and language.

Black feminism, the civil rights movement, and domestic laborers

It’s also crucial that we look at the history of the civil rights movement, and recognize the role that organizing African American domestic workers in the American south played in both galvanizing the movement and seeing real economic change in their communities.

Premilla Nadasen, a labor historian who has written about the movement, describes how domestic workers “testified, they lobbied, they shared their stories, they wrote codes of conduct and they worked to educate employers.” It’s oddly familiar, is it not?

The Black feminist movement of the 1970s, embodied by collectives such as the Combahee River Collective and the Third World Women’s Collective, centered the care labor of Black women and women of color as key to understanding systemic oppression: until we recognize the work of women of color, we’re not well positioned to change much of anything.

ACT UP, fight back: care work and AIDS

Act Up Die In, San Francisco, 1990

Another point of reference to consider is the role that care and caregiving played in AIDS activism. As queer historians like Lillian Faderman point out, public advocacy for AIDS and HIV treatment ran parallel to community-based care efforts, often out of necessity due to neglect and lack of support from traditional institutions. Ann Cvetkovich also writes about this relationship between caregiving, activism, memory, and trauma in her excellent book, An Archive of Feelings.

Precarity and Austerity

Housing is A Human Right, Tarp Quilt, Wynde Dyer, 2016

If we shift to today, and our current political climate, there are immediate opportunities to think about how care work is political and politicized. Growing gaps in income, and in access to housing, healthcare, and education, demand our attention and action. It is not a coincidence that we’re living in the wake of decades of austerity politics and cuts to care and social services.

Madeleine Bunting describes what she calls a “crisis of care” in the face of “a cultural preoccupation with independence and a profound aversion to dependence, vulnerability and need” that makes austerity politics, of the kind we’ve seen since the Reagan administration, possible. The social safety net has been eroded through what the performance artist Harry Giles calls the “defunding and demolition of care,” establishing “precarity as the norm.”

Tech’s Crisis of Care

Sometimes it’s hard to reconcile the fact that engineering and computing culture exists within our current climate of austerity politics and eroding structures of care. We’re comfortable, for the most part. We’re told that technology work, engineering in particular, is special, that it’s best done in the most comfortable situations possible, and that our desire for convenience is easily justified.

As I argue, this is bad for business. It’s short-sighted, it’s bad for both workers and consumers, and it greatly limits what we can accomplish and who we can include in building and using technology.

In computing, we glorify people who go it alone. We glorify learning things on your own, as in the “self-taught programmer.” We hear so many stories about solo geniuses working on their own, solving problems independently, automating something that used to require far more steps.

We glorify machine intelligence, and for the situations where the algorithms need some help, we quietly rely, in increasing numbers, on faceless workers through services like Amazon’s Mechanical Turk. We still need humans making the human connections that make automation possible.

We try to make this as invisible as possible. Much of automated crowdwork centers around doing what Joanne McNeil calls “the kind of things a computer can’t do” in segmented bits. We admire those doing the segmentation, and try to erase those who are doing the work.

The tech industry’s instrumental role in precarity as the norm is undeniable, and fairly multifaceted. And yet, we seem to proceed as usual. Over the past few years, we’ve heard accounts of really horrifying and unfortunate things from care workers in tech companies: from content curators in the Philippines, from support workers at Yelp and at Squarespace. From folks working in cafeterias and in contract positions, not making a living wage or having access to sick days.

The ideal, and the norm, it seems, in fast-growing tech projects is scaling technology for products and services without regard for the work required to support and maintain them. “Oh, we’ll outsource that.”

The average single mother makes less than $28k a year, as does the average customer service rep (via Glassdoor)

Tech companies, in their efforts to make workers as comfortable, and as productive, as possible, rely on growing numbers of outsourced laborers to feed, clean up after, transport, entertain, and smooth out the edges of their labor force, yet that work is underpaid, unrecognized, and devalued. We silo care work; we are unconcerned with it.

We make it invisible in supporting our technologies: in the majority of large tech companies, support teams rarely have contact with product teams, and never as equals. As Brooks advocated, we seem to adhere to the opinion that technologies are best developed with minimal contact with the people who use and support them.

The silo that we place around care work, around humans doing the things computers can’t yet do, pervades both the inner workings of building technology and technology’s role in the larger world. We exist without much in the way of care ethics.

Yet, as a single woman, a parent, an adult child of aging parents, and someone who tries to be a part of my community, I see a lot of necessary care around me. And it is valuable, life-affirming work. We are nothing without the care of others, and without the other characters in our lives that we care for and interact with.

Characters like the folks in our open source community, who are fallible humans. Who need regular medical care, benefit from therapy, who have dogs to walk, kids to see, lives to have. They’re not 10x people, but they’re pretty good on other metrics beyond productivity. (Don’t get me wrong, I love gathering data on things, especially other people!)

People aren’t perfect. They’re not autonomous; they need care. Computers and code do too, duh, that’s why we’re here geeking out about them. We’re showing off new things we’ve taught them, things we’ve nurtured, things we’ve struggled with. We’re hula hooping with open hardware:

Jennie Rose Halperin hooping at Lindsey Bieda’s brilliant “Hardware, Hula Hoops, and Flow” session

And we’re here, together, trying to figure this all out.

Next: Care, Automation and Design