Data Night Show

Bringing design-led research on algorithmic bias to life from an intersectional feminist standpoint

Katharina L. Brunner
Designing Fluid Assemblages
3 min read · Dec 3, 2020


Data is never neutral, and what gets counted matters. This project explores how data technology and society are entangled, how these interdependencies embody power dynamics and forms of oppression, and how design can offer approaches to tackle algorithmic injustice. The outcome is three episodes of a fictional TV show that explain intersectionality and AI bias and introduce a platform and an app, which serve as an inspirational visualization of practical tools for creating awareness of design and data justice. They are a proposal for what a democratic way to produce and compare data might look like, and for how individuals might be involved.

Since Fluid Assemblages refers to things that do not only take a physical form, I defined ‘things’ as that which strongly reflects social inequality in the systems and dynamics of the world we live in: they manifest patriarchal structures, which materialize in our daily lives through algorithmic bias, often with great and unforeseeable consequences.

Trailer

https://vimeo.com/487132618/9fbd106b8c

Visualization through storytelling

One of the main methodologies used in the process was to create a storyboard with a coherent narrative. Part of taking this path was finding out how research through design may contribute useful knowledge. Within the “Data Night Show” I aimed to establish knowledge on algorithmic bias: the format we choose and the narrative we create are central aspects, because they determine an object’s expression and its intention.

Episode 1: Feminism. Not again.

“Why should that be relevant, some may ask. That is why I am hosting guests like Peter Jordanson (feminist philosopher) and a renowned medical scientist, who want to help us understand what it means when certain data is simply not produced, with examples ranging from snow clearing to urinals and medical research.” (Data Night Show Production Team)

Episode 2: Algorithmic bias. But how?

“Data reflects the biases and views of those who make the ordering system.
Mark Zuckerberg has confirmed he will talk to us directly on the show and give some insight into algorithmic bias and how it affects us.” (Data Night Show Production Team)

Episode 3: Meet the new cycle tracking app

“Another week, another guest: we mainly talk about the development of data justice initiatives, with the developer of a new cycle tracking app, and how feminists have spent a lot of time thinking about classification systems and the criteria by which people are divided into categories.” (Data Night Show Production Team)

Platform and ‘Self-ness’ in cycle tracking apps

Although my main focus lay in showing, communicating, and creating awareness, one question ran through the whole process: what can a democratic way to produce and compare data look like, and how might individuals be involved and given a voice in such an ongoing process?

I created a fictional platform that aims for ethical and responsible technologies, enabling marginalized groups to collect and interrogate data that pertains to their own concerns. This is further articulated through a practice-oriented design example, as I have been investigating cycle tracking apps. Such apps offer a quantitative understanding of one’s body and its functions, a so-called ‘objectivity’. As a critique of the existing apps, which were mainly developed from male-centric assumptions, I suggest a postphenomenological approach: an exploration of their potential from an intersectional feminist standpoint, more specifically with regard to Donna Haraway’s (1988) ‘situated knowledges’.

Building on this, I offer a concept of self-tracking that demands a more personal approach to data, one that rejects outside policing of individual experiences and sees participation as a contribution to self-initiated development.
