Augmented collaboration: social movements, creativity and automation
The final session of Creativity & Collaboration: Revisiting Cybernetic Serendipity focused on collective knowledge creation, accumulation and dissemination in citizen science and social media. While session 4a examined citizen science infrastructure in detail, session 4b dealt with the power (or lack thereof) of collectives and crowds mediated by algorithms, both in social movements and in creativity research. Panel chair Alyssa Goodman framed the session as presenting both hopeful and troubling examples of how collaboration works among communities of technologically mediated humans.
Niki Kittur of the Human-Computer Interaction Institute at Carnegie Mellon University had the more optimistic vision of human-machine collaboration to share. His presentation, titled “Scaling Up Serendipity: Augmenting Analogical Innovation with Crowds and AI”, covered his research in streamlining the generation of novel, useful ideas by combining crowd-sourcing and artificial intelligence. He started by defining analogical innovation, drawing on Manuel Castells’ description of humanity’s unique ability to “creatively accumulate and recombine knowledge in non-routine ways”. He showed how this ability could be improved by drawing a massive range of ideas from non-experts and sorting them algorithmically to find the most relevant, then putting those “relevant ideas” in front of experts who could select the most exciting to put to work in their own domains. According to his analysis, this frees experts from “fixation” on ideas that have come before and gets around the restrictions of disciplinary silos. He closed by suggesting that as automation replaces human jobs, idle human hands and minds might be put to work in a model where crowd-sourcing of novel ideas accelerates an unending cycle of faster innovation, producing a “golden age of progress on society’s thorniest problems”.
Zeynep Tufekci, a self-proclaimed “technosociologist”, New York Times columnist and professor at the University of North Carolina, had a more critical perspective. While claiming to be optimistic about the potential of research like Kittur’s, she was quick to point out the context in which such work is produced. The rarefied world of the academy, with its controlled conditions, can’t possibly reflect the multiplicity of possible uses and effects of artificial intelligence and machine learning in the broader world. Tufekci’s thorough approach to situating herself for her audience was instrumental in helping us understand her focus on a broader, non-academic context for the social effects of technology. She provided background about her youth in Turkey, where information flow was restricted and censored, how she ended up working in computing, and how her high expectations for freedom and connectivity in the internet era led her to technosociology as a means of tracking and analyzing its effects. Drawing on her recent book Twitter and Tear Gas, she described how this background led her to analyze the role of social media in facilitating the rapid growth of protest movements and their subsequent decline. For Tufekci, such movements are vulnerable to a contemporary form of censorship that deploys distraction and confusion to mask messages deemed undesirable or unprofitable by those in power, be they politicians or algorithms.
The specificity of Kittur’s talk resonated in productive ways with Tufekci’s more expansive approach, demonstrating why it’s so essential to have technologists and sociologists thinking broadly, together, about human-technology collaborations. While Kittur was optimistic about computer-assisted innovation in the future, Tufekci provided a glimpse into how social media algorithms function in the current moment. Tied up in these visions of the present and future are questions about what productivity and progress look like under late capitalism. As Tufekci reminded us in her closing comments, the logic of capitalism is not the logic of social movements. Perhaps it is not the algorithms we should fear (or embrace) but rather their implementation in a system built on perpetuating profound inequality. As Stephen Hawking wrote in what has been described as his last public message on the internet, “The outcome will depend on how things are distributed. Everyone can enjoy a life of luxurious leisure if the machine-produced wealth is shared, or most people can end up miserably poor if the machine-owners successfully lobby against wealth redistribution.” Strong social movements will be essential to fighting our way towards the wealth-sharing version of Hawking’s prognosis.