Playing with algorithms (#mscedc Week 8)

Stephen Abblitt
6 min read · Nov 14, 2018

Paranoia

Algorithmic culture makes me paranoid––not that anyone or anything is out to get me, but rather in the sense that Eve Kosofsky Sedgwick writes of a paranoid reading: expecting to take issue with the text before I have begun reading, anticipating what I will critique about it (Sedgwick, 2003; Love, 2010). Building on Paul Ricoeur’s “hermeneutics of suspicion” (Ricoeur, 1970; Felski, 2012, 2015), Sedgwick theorises a method of reading and writing (especially about systemic oppression of sex, gender, sexuality, race, culture, class, etc.––and diverse intersections thereof) that is, among other things, anticipatory, expository, and broad in scope. (On the other hand, Sedgwick theorises reparative reading, which focuses on playfully allowing oneself to be surprised, to find pleasure and sustenance in texts.) Paranoid reading hinges on revealing that things are worse than you think they are; or maybe, that they’re just as bad as we (should) already know. Read between the lines to make a path…

But it’s only really paranoia if they’re not out to get you. “Strange how paranoia can link up with reality now and then,” wrote Philip K. Dick in A Scanner Darkly. We live in a postdigital society where algorithms all too easily reproduce and even amplify existing human prejudices and biases that oppress gender, sexual, racial and other minorities (Noble, 2018; Cossins, 2018)––whether deliberately or inadvertently. Part of the issue is that our digital knowledge infrastructures are fundamentally opaque and inscrutable (Edwards, 2015), concealed beneath technoscientific layers of software, databases, ontologies, codes, and Ridiculously Complicated Algorithms.

Algorithms are not the “purely formal beings of reason” (Goffey, 2008, p.16), and “[c]ode is not purely abstract and mathematical” (Montfort et al., 2012, p.3). Like any technology, “algorithms are created for purposes that are often far from neutral” (Kitchin, 2017, p.18)––necessarily implicated in, shaping as they are shaped by, the socio-material technoscientific conditions of their production and consumption.

On the other hand, algorithms and other artificial intelligences aren’t necessarily that smart:

The lesson really is that the non-human actors are still only as intelligent as the human actors programming them, intra-acting with them.

Play (i)

I hadn’t thought too deeply about the influence of algorithms on popular music until recently, upon reading a scathing review of the forgettable, retro-fetishistic debut album Anthem of the Peaceful Army by Greta Van Fleet (Larsen, 2018).

Greta Van Fleet exist to be swallowed into the algorithm’s churn and rack up plays, of which they already have hundreds of millions. They make music that sounds exactly like Led Zeppelin and demand very little other than forgetting how good Led Zeppelin often were.

People love Led Zeppelin, so sound enough like Led Zeppelin that you can trick the algorithm into dropping the song into everyone’s Discover Weekly playlist. The algorithm is necessarily inscrutable and opaque, although software engineer Sophia Ciocca excavates the three models used by Spotify to make its recommendations:

  1. Collaborative filtering models, which analyse both your behaviour and others’ behaviours based on implicit feedback data obtained from stream counts of tracks played. Spotify predicts which genres and microgenres you like based on songs, artists and playlists, and recommends new songs from similar categories. On the flip side, it also learns which songs, artists, and playlists you don’t like based on the songs you skip often (see the sketch after this list).
  2. Natural Language Processing (NLP) models, which analyse text. The source data for these models include track metadata, news articles, blogs, and other text pulled and scraped from around the internet.
  3. Audio models, which analyse the raw audio tracks themselves, using neural networks to identify songs that share common characteristics such as estimated time signature, key, mode, tempo, and loudness (Ciocca, 2017, n.p.).
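
As a rough illustration of the first of these models, here is a minimal sketch of implicit-feedback collaborative filtering in Python, scoring unheard tracks by item-item cosine similarity over play counts. The toy play-count matrix and function names are invented for illustration; this is a sketch of the general technique, not Spotify’s actual system.

```python
import numpy as np

# Toy implicit-feedback data: rows are users, columns are tracks,
# values are play counts (a stand-in for the implicit signals Spotify collects).
plays = np.array([
    [12, 0, 3, 0],
    [10, 1, 0, 0],
    [0,  8, 0, 5],
    [1,  7, 0, 6],
], dtype=float)

# Item-item cosine similarity: tracks played by the same users
# end up "close" to one another in this space.
norms = np.linalg.norm(plays, axis=0, keepdims=True)
unit = plays / np.where(norms == 0, 1.0, norms)
similarity = unit.T @ unit  # shape: (n_tracks, n_tracks)

def recommend(user, k=2):
    """Rank the tracks this user hasn't played by similarity to their history."""
    scores = similarity @ plays[user]
    scores[plays[user] > 0] = -np.inf  # exclude tracks already played
    return np.argsort(scores)[::-1][:k]

print(recommend(0))  # tracks favoured by users who play what user 0 plays
```

Ciocca (2017) describes Spotify’s real approach as matrix factorisation over vastly larger play matrices, but the underlying intuition is the same: co-listening implies similarity.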

Posthuman

“The levels of abstraction, the functionalities of code, the relations of code and hardware and human somatics, and the temporalities of computation and internet transmission do not appear, do not engage nor operate at friendly or at any conscious level of human perception. In this human imperceptibility in service to circulation and value extraction, the computational algorithm offers itself as an artefact of the posthuman and the Capitalocene.

“For many users, critical knowledge of this artefactuality is conscious, if not consensual, but the materiality, functionalities and modalities of algorithms remain, in the most classic sense of the term, black-boxed, a knowing by demonstrated effects without comprehension of process. And demonstrated effects constitute only a small intersection of designed (including non-conscious) affects: desiring, somatic and rhythmic” (Bianco, 2018, p.24).

Play (ii)

The algorithm was too hard to game––as a long-time and frequent Spotify user, I found it knew my musical tastes too well. Logging out, I created a few fake new user profiles to see how a fresh start might influence my recommendations. In one case the recommendations became far more popular and generic (against my own obscure and avant-garde musical taste), and in another I fell hard down an ambient rabbit hole. Despite the algorithmic models described above, I don’t know that musical taste can be reduced to a mathematical formula.

In the meantime, I discovered a really interesting hack called Nelson, which lets you play with different genres and audio features to make your own algorithmically generated playlists.
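
I don’t know how Nelson is actually built, but a tool like it could plausibly sit on top of Spotify’s public Web API. The sketch below is an assumption-laden illustration: the access token is a placeholder, the parameter values are arbitrary, and it targets the recommendations endpoint the Web API exposed at the time of writing.

```python
import requests

ACCESS_TOKEN = "YOUR_SPOTIFY_OAUTH_TOKEN"  # hypothetical placeholder

# Steer the generated playlist with genre seeds and audio-feature targets.
params = {
    "seed_genres": "ambient,idm",  # comma-separated seed genres
    "target_danceability": 0.2,    # 0.0 (least) to 1.0 (most danceable)
    "target_tempo": 90,            # rough BPM target
    "limit": 20,
}

resp = requests.get(
    "https://api.spotify.com/v1/recommendations",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params=params,
    timeout=10,
)
resp.raise_for_status()

for track in resp.json()["tracks"]:
    print(track["name"], "by", track["artists"][0]["name"])
```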

But it works just as opaquely as Spotify’s own algorithms do: maybe aspects such as popularity and tempo (BPM) can be positively and objectively measured, but what weird algorithm tracks danceability or valence (“the positivity of the track”)! Danceability, for example, is calculated thus:

Danceability describes how suitable a track is for dancing based on a combination of musical elements including tempo, rhythm stability, beat strength, and overall regularity. A value of 0.0 is least danceable and 1.0 is most danceable.
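
Spotify does not publish the actual formula, so any reconstruction is pure speculation. Still, the definition above implies some weighted blend of normalised musical elements squashed into the 0.0–1.0 range; the deliberately naive sketch below invents weights out of thin air just to make that shape visible.

```python
# Purely speculative: Spotify's real danceability model is proprietary.
# This only illustrates how the elements named in the definition above
# might be blended into a single 0.0-1.0 score; the weights are invented.
def danceability(rhythm_stability, beat_strength, regularity):
    """Combine already-normalised (0.0-1.0) musical elements into one score."""
    weights = {"rhythm_stability": 0.3, "beat_strength": 0.4, "regularity": 0.3}
    score = (weights["rhythm_stability"] * rhythm_stability
             + weights["beat_strength"] * beat_strength
             + weights["regularity"] * regularity)
    return max(0.0, min(1.0, score))

print(danceability(0.8, 0.9, 0.7))  # ~0.81 for a steady, strongly-beaten track
```

Whatever the real weights are, someone chose them, and that choice encodes a judgement about what “danceable” means.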

Possibilities

What strikes me most about algorithms is the assumed loss of human agency, and the downright positivist notion that ‘neutral’ mathematical formulae calculate objectivist answers.

An algorithm is only as intelligent as the human who programs it, and this relationship between human and non-human actor is where the greatest radical potential lies. Against the algorithm, consider the altergorithm.

References

  • Bianco, J. 2018. Algorithm. In Braidotti, R. & Hlavajova, M. (eds.). Posthuman Glossary. 23–26. London: Bloomsbury.
  • Ciocca, S. 2017. How Does Spotify Know You So Well? Medium, 11 October 2017. Retrieved from: https://medium.com/s/story/spotifys-discover-weekly-how-machine-learning-finds-your-new-music-19a41ab76efe
  • Cossins, D. 2018. Discriminating algorithms: 5 times AI showed prejudice. New Scientist, 12 April 2018. Retrieved from: https://www.newscientist.com/article/2166207-discriminating-algorithms-5-times-ai-showed-prejudice/
  • Dick, P. K. 1977. A Scanner Darkly. New York, NY: Doubleday.
  • Edwards, R. 2015. Knowledge infrastructures and the inscrutability of openness in education. Learning, Media and Technology, 40(3): 251–264.
  • Felski, R. 2012. Critique and the hermeneutics of suspicion. M/C Journal, 15(1): n.p.
  • Felski, R. 2015. The Limits of Critique. Chicago, IL: The University of Chicago Press.
  • Goffey, A. 2008. Algorithm. In Fuller, M. (ed.). Software Studies: A Lexicon. 15–20. Cambridge, MA: MIT Press.
  • Kitchin, R. 2017. Thinking critically about and researching algorithms. Information, Communication & Society, 20(1): 14–29.
  • Larsen, J. D. 2018. Greta Van Fleet: Anthem of the Peaceful Army. Pitchfork, 23 October 2018. Retrieved from: https://pitchfork.com/reviews/albums/greta-van-fleet-anthem-of-the-peaceful-army/
  • Love, H. 2010. Truth and consequences: On paranoid reading and reparative reading. Criticism, 52(2): 235–241.
  • Montfort, N., Baudoin, P., Bell, J., Bogost, I., Douglass, J., Marino, M. C., … Vawter, N. 2012. 10 PRINT CHR$ (205.5 + RND (1)): GOTO 10. Cambridge, MA: MIT Press.
  • Noble, S. U. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York, NY: New York University Press.
  • Ricoeur, P. 1970. Freud and Philosophy. An Essay on Interpretation. Trans. D. Savage. New Haven, CT: Yale University Press.
  • Sedgwick, E. K. 2003. Paranoid Reading and Reparative Reading, or, You’re So Paranoid, You Probably Think This Essay Is About You. In Touching Feeling: Affect, Pedagogy, Performativity. 123–151. Durham, NC: Duke University Press.

