Cultures of Computing

Sanna Sharp
Published in Campuswire · 9 min read · Feb 22, 2021

Instructed by Ricky Crano at Tufts University


Modern computational technologies have been in a period of rapid development since their initial introduction to the public. In the past three decades, cellphones have shrunk in size while dramatically increasing in power and capability. Social media platforms have shattered records in winning billion-dollar valuations –– and the hearts of their users –– while building a global community of creatives. Major medical advancements have been made possible thanks to crowdsourcing platforms and cloud data-sharing.

As the technology industry has grown, so too has our reliance upon it; this is perhaps best evidenced by Zoom’s recent rise to prominence. Our world is becoming increasingly digital. But what does that mean for our culture?

Tufts University instructor Ricky Crano sets out to examine the causal relationship between technological advancement and society in his course, Cultures of Computing.

School: Tufts University

Course: Cultures of Computing

Instructor: Ricky Crano

Course Description:

Cultures of Computing examines computers and computation as sociocultural phenomena and questions universalizing narratives of technological progress by exploring the variety of human experience with computing. Topics include social media, postcolonial computing, the gender of artificial intelligence, the social analysis of mathematics, and the sociocultural implications of big data and contemporary algorithmic systems.


Ask the Instructor: Ricky Crano

Tufts University instructor Ricky Crano, speaking at a panel on media theory & politics

Can you start by telling me a little bit about your background in education?

Sure. I did my PhD in Comparative Studies at Ohio State in Columbus, where I focused specifically on Critical Media Theory and the philosophy of visual culture. After graduating, I taught classes within OSU’s New Media and Visual Culture programs. I joined Tufts University’s English department as a part-time lecturer in 2015.

As a part-time lecturer at Tufts, do you feel that you’ve been afforded the freedom to assign the course materials you want to assign, and teach the topics that you want to teach?

You know, I have pretty good latitude. There are certainly some classes that I’d like to design and teach that I haven’t been able to yet, but there are also opportunities to be creative. Recently, I spoke with a few faculty members from the School of Fine Arts about developing a media theory class that would be specifically aimed towards artists.

The first iteration of the Cultures of Computing course was created and taught by Dr. Nick Seaver about five years back, and he taught it just once before going on leave. I was asked to instruct in his stead and have now been teaching the class for four years. It’s housed within the Anthropology department, and I’m not an anthropologist; I work more closely with the Media Studies and Science, Technology, and Society programs and –– as I mentioned –– my home is within the English department. But because the course was born out of Professor Seaver’s anthropological expertise, it used to be almost entirely ethnographic in nature. I’ve been able to introduce a lot more of the methods and approaches that the humanities take within the class, as those practices align more closely with my background and expertise.

I’ve held part-time lecturing and adjunct-type roles at numerous other schools that have not provided me with the same degree of freedom in course design that Tufts has –– so I feel lucky.

How does Cultures of Computing examine computation and technology within the context of American culture?

The course was designed to push against the standard narratives of media and technological determinism, and that’s really its impetus. There’s this growing idea of, you know, “computers are here, and they’re going to take over our lives! They’re ruining our relationships!”

In practice, this idea cuts both ways; computers can be used to achieve both utopian and dystopian ends. Within the course we try to bring complexity to each of these trajectories. The goal isn’t necessarily to find a middle ground, but rather to cut an alternative path which considers elements of each perspective. I want to teach students that these new technologies aren’t the sole determinant of the negative — or positive — socio-psychological occurrences we see them associated with. People adopt technology within specific contexts, and that contextualization of tech and media begets the events that we see play out in our culture.

I bring a number of different approaches to considering these contexts: critical theory, multimodal discourse analysis, visual and cultural studies approaches –– and even some philosophical, political, and economic theories. My students come from a broad range of Tufts departments, so I want to give them as complete a suite of tools as possible. Then it’s a matter of them using those tools to engage with the same technologies they rely on each day, but in a more sophisticated manner.

There are so many events unfolding right now — in politics, in business, in civics — that are really relevant to the topics you cover in Cultures of Computing: our elections have obviously been heavily influenced by social media algorithms. There’s been a lot of discussion about Facebook and the Cambridge Analytica scandal; just last week, the Australian government announced that they plan to ban all news links from being shared on the platform. Google has recently been criticized for the firing of Timnit Gebru, an employee and woman of color who had been vocal about improving diversity and equalizing outcomes within the company’s AI team. Is it challenging to work all of these cultural happenings into your class each term?

Tufts students are, across the board, very civically engaged. They have a very strong sense of social justice and no qualms about questioning authority, which makes them an ideal group for examining these topics.

One of the most fun aspects of teaching this class is being able to take the concepts I’ve planned for at the beginning of the term and apply them to current, unfolding events. Viewing what is occurring in America through the analytical framework we build in class helps students to better understand the realities of these events beyond the media’s sensationalization of them.

How long did it take you to originally design that analytical framework?

It’s hard to ascribe a specific length of time to its development. I’ve taught this course for four years now, and it’s been significantly different each time, because I try to keep the topics as culturally relevant as possible. We always start the term by working through the foundational narratives espoused about the web and internet — ideologies of freedom and personal authenticity. Then, pretty swiftly, we get into more contemporary issues.

This past fall, the class was organized around the idea of power and resistance. We discussed how activists have utilized social media over the past few years to raise awareness and organize. We have a unit on selfhood and objectification in which we look at algorithmic bias and the cultures in which artificial intelligence, among other emerging technologies, is developed; that unit really examines the gender and racial inequities at play in cultures like Silicon Valley. And then we also have a unit on how technology has been used to commodify work and leisure.

How do you select the materials and readings that you assign in class?

This past year we read a great book called ‘Ghost Work’ by the anthropologist Mary Gray, which examines the invisible workforce of workers who label and manage the training data used to build artificial intelligence. Because they’re responsible for the data fed into these systems, the biases these workers hold can affect what the AI learns. We also discussed how computer scientists are applying crowdsourcing within the health field — anyone can contribute to biomedical research by running programs in their computer’s background that model protein folding.

I believe that exposing students to the different contexts of computing is critical for their understanding of how computers uphold and define our society’s hierarchical power structures. So, to go back to your earlier question, yes –– trying to reorganize the primary refrain of the course around these developing technologies and associated events is challenging, and time-consuming. But it also keeps the course relevant and the students engaged.

Netflix released a documentary called ‘The Social Dilemma’ last year, which shocked a lot of people by revealing the “insidious” nature of algorithmic bias and ads targeting. I personally felt that the documentary was unnecessarily alarmist and that the processes the filmmakers framed as being ‘shocking’ and ‘destructive’ are, at heart, fairly neutral.

It’s the same sort of question, to me, as: is a hammer evil? Or is it evil to hit another person with it? Tools themselves aren’t inherently deleterious, and Facebook doesn’t magically drive people to behave badly. People behave badly. Then the platform feeds its record of that toxic behavior, and a user’s interactions with it, into the algorithm that defines the user’s timeline... compounding their exposure to like-minded individuals.

Have you seen the film? What do you think of its message?

You nailed it. I don’t assign that film, but it is a constant topic of discussion in my class — I get students emailing me all the time, saying, “have you seen this? This is crazy, this is mind-blowing…”. At the same time, Silicon Valley spends millions of dollars each year on advertising. The rhetoric in these ads is always that ‘this is the next big thing’, or ‘this is going to change the world’.

Most of these platforms are not the next big thing, and ultimately won’t change the world. As critical thinkers, we don’t need to buy into their messaging –– just as we don’t need to buy into the idea that technology is intrinsically detrimental to society.

So, my personal beliefs align closely with your own. Social media technologies are used to uphold existing power structures, but are also used to create new power structures and give a platform to those who have historically gone voiceless. Technology can be used for utopian or dystopian ends, and what determines those ends is the context the technology resides within and the intentions of the person wielding it.

The last four years have been marked by the end of the Millennial Generation’s college years and the beginning of Gen Z’s. Have you noticed a difference between the students of each generation and their relationship to technology?

I’ve been teaching courses which look critically at social media, technology, and civics for about fifteen years now, at a variety of different institutions. For the longest time, critical approaches to these topics were few and far between; they didn’t often come up in popular discourse or in my students’ other classes.

Now, technology is so ubiquitous that it has become a massive talking point in civics and politics. It was only in the past few years that the government began to enforce antitrust law in the tech sector, and just this year that it brought antitrust cases against Facebook and Google. So as the field has grown, the public’s perception of it has as well.

Gen Z is the first generation to be considered ‘digitally native’. I’ve noticed that students now enter my class with a far more sophisticated and savvy sense for social media. It used to take a bit of convincing to get my students to think about the potential perils of these platforms — now, they regularly raise concerns that I haven’t considered yet. I’ve learned a lot from my students, because many of them come from diverse backgrounds and work in fields that I don’t have experience with.

Out of the expansive list of topics in technology that you cover within Cultures of Computing, what is the one lesson that you want your students to walk away having learned?

That’s a great question. I think — and you know, I don’t wanna sound too pithy, but — it would be that technology is really what we make it.

I don’t mean that on an individual level; one person deleting their Facebook does not change America’s culture. It’s about how we engage with social media in the biggest, most collective, most civically-minded sense. It’s about how we apply our social struggles, political struggles, economic struggles, and personal struggles to the technologies which have become integral to our lives. Ultimately, our culture must choose to move towards a more utopian future.

