While most Computer Science students are still years from joining the workforce, it’s vital to remember they are among the C-suite decision-makers and full-stack developers of tomorrow. A healthier, more ethical technological future is in their hands, so how they view their responsibilities as technologists is critical.
To help students develop a richer sense of that responsibility, the Tech and Society Solutions Lab partnered with Major League Hacking (MLH) this past year to support an Ethical Tech Initiative at its 250+ global hackathons with 65,000 participating students. Over the course of the “season,” more than 3,000 students engaged with our initiative that asked them to think more deeply about the potential impacts of their products.
In supporting the Ethical Tech Initiative, we aimed to:
1. Identify a baseline of student attitudes towards ethics in their curriculum.
We asked questions like “Have you ever had a conversation about ethics during a class in computer science or programming?” to better understand how and where students encounter ethical considerations in formal and informal settings.
2. Assess the viability of informal education as an acupressure point for intervention at the undergraduate level.
Even something simple, like distributing “Ethical Hacker” stickers for students to place on their laptops, let us gauge how deeply participants engaged with the material.
3. Inspire a deeper level of inquiry on second and third order consequences of products that may seem innocuous.
By tasking students with considering the impacts of their product at scale, we could see whether they were able to engage in a substantive process of discernment about something they themselves had created.
4. Introduce the notion that ethics should be embedded in the entire product design cycle.
It is no longer viable for ethical considerations to lie solely in the hands of policy or legal teams. We hope to expose future technologists to the responsibility inherent to their own decisions. From the very first line of code, we want a developer to keep in mind the possibility that their work might be abused or used maliciously, and to strive to mitigate such possible consequences.
To say the initiative produced some enlightening results would be a major understatement. Notably:
1. 50% of the students told us they had never discussed ethics in a Computer Science classroom.
As a result of this extreme lack of exposure, most students are unable to think beyond data privacy and security when considering ethics. Indeed, when asked if their product could lead to unintended consequences, the vast majority who responded “yes” essentially said “it isn’t very secure right now because I didn’t have time to write a bunch of security protocols.” Critical thinking about code is normative; critical thinking about the impacts of that code is not.
2. Ethics education in CS is not as developed as in other disciplines.
Law, medicine, biology, civil engineering, nursing, journalism, accounting, education…each has a code of professional ethics that informs how students are educated. These disciplines are light years ahead of CS when it comes to the maturity of their approach to teaching ethics and responsibility in the classroom and beyond.
3. Solutions may be simple.
Building “ethical products” may be a hard problem with a relatively easy solution. We can create a process of discernment, asking the evident and close-to-evident questions, to regard each product from all angles. That’s not really all that hard, as there’s a relatively small set of initial questions that would make a big difference. “What’s the worst thing you can imagine someone doing with your product?” “How could you prevent it?” “Have you tested your product with a diverse set of users, representing diversity of age, gender, race, socioeconomic status and income, geography, political affiliation, language, ability, sexual orientation, religion, and education?” Even this would represent a giant leap forward.
AND THE WINNER IS…
From all our incredible submissions, one really stood out: Keefer Rourke, Alex Parent, and Salem Abuammer, from the University of Guelph and Illinois Institute of Technology, whose original project — Stegamsg — was submitted at “MHacks X” hosted at the University of Michigan.
Stegamsg’s submission got right to the very heart of our own mission. As they wrote:
“Undercutting trust to justify a use case for a technology, or for the sake of profit and innovation, without evaluating the risks that it may pose with concern to social wellness is harmful.”
This team distinguished themselves with the sophistication of their thinking about how to mitigate the potential negative impacts of tech, clearly articulating the challenges posed by individual products such as Google Duplex (the human-sounding voice assistant, unveiled at Google I/O, that could place scheduling calls on your behalf). As for Stegamsg’s own potential vulnerabilities, while other teams could foresee possible misuses or abuses of their product, Keefer’s team were the clear standouts in their ability to also propose concrete solutions in the form of encrypted group channels and user authentication codes built directly into their anonymous communication platform.
Our team at the Tech and Society Solutions Lab looks forward to advancing solutions based on three important recommendations coming out of this experience:
1. Integrate ethical inquiry into CS classrooms.
This will provide students valuable experience confronting and working through ethical questions throughout their work in CS, building moral muscle memory and better equipping them to produce socially and ethically responsible tech down the road.
2. Provide support for students once they leave academia.
Without a supportive environment that encourages ethical actions, even the most well-trained computer scientists are doomed to failure. Industry should provide an environment in which tech workers feel empowered to speak up, one that rewards ethical behavior. When hiring, companies might even consider giving preference to students with CS ethics training.
3. Find new champions and role models.
A generation that’s grown up lionizing the “geek pirate” needs new examples of ethical and moral leadership. For too long, we’ve idolized rogue, garage-based hackers and titans of industry with entrepreneurial zeal who assume the benefits of their ideas are obvious and therefore don’t need to think about their potential downsides. It’s time we elevate better tech gurus for students to admire and aspire to be. (Have someone in mind? Tweet suggestions to @yschlesinger)
And congratulations Keefer and team!