The Ethics of a Machine Learning Object

Jennifer Kim
CCA IxD Thesis Writings
Dec 1, 2017

Sara Wachter-Boettcher’s book Technically Wrong addresses many of the issues that exist within the tech and design world, particularly discrimination, abuse, and bias. It has given me a deeper understanding of the kinds of design decisions that can lead to these outcomes, and it has made me more aware while I am designing something. It has also prompted me to look back at previous projects where my decisions could have been “technically wrong.” This helps me with my thesis project by pushing me to think ahead about the implications that could exist within my own work.

The goal of my thesis is to create an interactive, machine learning object that reacts and responds to a musician’s playing in order to encourage more practice time and self-expression, which would in turn lead to self-regulation for the player. When I tried to think about the ethics of the space I am working in, I had to take a step back and look at what I was doing from the perspective of someone seeing my project for the first time, or through the feedback I had received during previous stand-ups with my classmates. Earlier in my process, I was asked whether I had considered designing the object so it could be used not just by musicians, but as a household object for anyone. While I did think about what that could look like, I decided to keep my audience limited to musicians. I realized, however, that anyone approaching my work as a newcomer might think my project is biased toward talented musicians. Although I explain that it could be used by someone with any level of playing ability, I still haven’t figured out how to communicate that through the object itself… or whether I even need it to do that.

When I thought back to some of the initial sketches and ideas from my brainstorming sessions, many of them involved instruments that have a mind of their own. While they were interesting and at times amusing, they also made me think about how such an instrument could affect the person who owns it. How would it shape their mentality toward the object? Would it affect how they play? Would it get to a point where the instrument is either annoying its owner or even threatening them into putting in some practice time? I started asking similar questions about a machine learning object that encourages playing. Would the user be able to control the frequency with which the object bothers them, and if so, how would they determine that? Is it ethical to allow the instrument or the object to dictate the user’s schedule?

Questions like these are helping me think through the details and make sure nothing is missing or at risk of misinterpretation. I hadn’t thought very critically about my thesis project until now, and this process is allowing me to go in depth with every part of my idea.
