From the “Führer” to the “sextoy”:
The techno-politics of algorithmic work control
I am sitting at my workstation in the factory hall, assembling plug sockets. Since I am completely inexperienced, I lean over to the worker next to me every once in a while to ask him what to do. In one of these moments, a foreman walks by on his patrol through the factory hall. He stops immediately and delivers a philippic. “There is a prescribed distance between workers that is to be kept,” he says. “Otherwise, as one can see here, there would always be talking instead of working. Time is money, after all.”
It is fall 2017, and a dream of mine has come true: I have landed an internship as an unskilled assembly worker at a factory in rural Germany.
As a sociologist, I research the organizational impact of algorithmic work control in industry. In short, algorithmic control means that orders and feedback are given to workers not by a human supervisor but by a computer. Many sociologists conduct their studies by distributing and analyzing questionnaires. I take a different approach: I try to grasp first-hand what it means to work under digital supervision, simply by doing it myself. Over the course of one year, I worked at a factory several times, before and after the introduction of a set of digital technologies. This method of participant observation substantially changed my understanding of how workers deal with algorithmic control. It showed me that digitalization is by no means only a process of technical innovation; it is an arena of conflicts that can best be described as techno-politics. With the potential for drastic surveillance on one side and visions of industrial democracy on the other, however, the outcome of these techno-politics is far from certain.
Obviously, I did not start my PhD research without any preconceptions on the matter. I had previously studied practices of digital self-surveillance, by which I mean the intentional self-measurement of everyday life by means of digital media, also known as “self-tracking” or the “quantified self” movement. In a comprehensive analysis of advertising campaigns for self-tracking technologies, I noticed a decidedly economic rhetoric. Potential customers were addressed as “entrepreneurs,” even if the programs were meant for health, fitness, or other private matters.
Even making friends and deciding with whom to stay in touch was presented as a problem of “management” that seemed to require a statistical basis for optimization.
To explain this, I drew on the concept of the entrepreneurial self developed by the sociologist Ulrich Bröckling. He diagnoses a general imperative of perpetual self-optimization modeled on corporate governance. Following this, I have argued that in post-industrial societies permanent self-optimization has become a prerequisite for a successful job hunt. Digital self-surveillance should therefore not be dismissed as narcissistic obsession, as it is often presented in the media. Rather, self-optimization is an essential part of the modern labor process, and the development of rationalization techniques for self-optimization becomes a structural necessity. In that sense, I have analyzed self-measurement as a form of bookkeeping for the company of the self.
In this, self-tracking revived the old idea of feedback-based control from cybernetics, a science that developed from the steering of anti-aircraft guns into a universal science of “communication and control.”
Cybernetics was founded after World War II. Its advocates argued for a new mode of control, emphasizing self-organization instead of deterministic hierarchical order. This was to be based on continuous data collection and immediate feedback. Lacking sufficiently advanced technology, however, the original cyberneticists were not able to realize their ideas; only today does the ubiquity of digital tracking allow their realization for the first time. From the huge media hype around “Industrie 4.0” in Germany, I got the sense that cybernetic ideas had made their way into industry. I therefore decided to study the use of technologies comparable to self-tracking in the context of industrial work.
There is a major sociological difference between self-tracking and algorithmic work control in industry: the latter is used within hierarchical organizations. In such a situation, the ‘users,’ i.e., the controlled workers, can normally not ‘opt out.’ The curious element of voluntary self-surveillance therefore hardly plays a role in the context of industrial work. Nonetheless, I assumed that I would find a similar situation in this field. I expected that corporate ideology would make workers consent to their permanent evaluation and that deviant behavior would be absorbed by feedback-based self-optimization.
I started my research by conducting interviews with managers, engineers and works councils, wanting to understand their respective ideas of the digital transformation. It turned out that the visions of the industrial managers and engineers were indeed implicitly or explicitly based on the cybernetic control model of feedback-based self-regulation. For example, the manager of the factory where I would later do my internship explained that he wanted to implement a variety of tracking technologies that would allow for automatic evaluation and self-optimization of the production process:
“I track everything. This way, I capture everything: When is the desk moved up? How does the worker hold the soldering iron? Everything.” He then reasoned that this intensive data collection was not meant as repression but as automatic feedback:
“Don’t give the data to me as supervisor, but give the workers the time they need to compare themselves with each other and to understand who is doing the best job. This way, they can see their own weaknesses and start to tune themselves.”
On another occasion, an engineer explained his latest product to me: a glove for manual production work, equipped with a scanner and with sensors in the fingers to track every movement of the hand. Its special feature: if it detected an undesired movement, it would vibrate. It was also supposed to give feedback on the optimization of the workflow. The developer of the glove explained to me that the workers would receive “instant feedback, directly to their bodies.”
This way, “management would get more tools” to “improve processes or get a better sense of what was happening.” Moreover, the glove would “empower the workers to proceed in a more self-organized way.”
Both the manager and the engineer seemed to apply the cybernetic idea of control through feedback.
A sociologist at the factory
Then, in the second half of my PhD project, I started doing participant observation, trying to put myself in the situation of the workers. I was lucky: I had the opportunity to work at the same factory twice, once before and once after a system of algorithmic work control was introduced.
Initially, it seemed that the strategy of feedback-based self-optimization functioned as envisioned by the cybernetic ideas of the managers.
During my first stay at the factory, a human foreman controlled the assembly workers, so every act of disciplining resulted in a personal conflict. For example, after the foreman scolded me for talking too much, nearby workers ridiculed him as “the Führer.” When I returned to the factory six months later, a digital assistance system guided my work. A screen showed me pictures of what I had to do, and I had to confirm every step along the way with a click. Then a time clock, exact to the tenth of a second, started to race on the display. In the end, it was thus exactly clear how long I had worked on the product. At that point, I still did not know my way around the factory, so when the system suddenly displayed a part I had never seen before, it took me quite a while to find it. Once I was finally finished, the system told me that I had fallen behind my average working time three times. I turned to my seatmate to make fun of the system, but he only stared at his screen and paid no attention to me. I turned back to my socket panel and increased my working speed.
At this moment, the act of resistance against disciplining had been erased.
The role of the foreman was taken over by a machine that sent immediate feedback and interpellations for self-optimization. Conflict seemed impossible. And yet, I soon realized that the workers had by no means stopped resisting the disciplining. Instead of ridiculing their human supervisors, they were now making fun of the digital technology. They referred to the autonomous transportation robot driving around the factory only as “Fiffi.” With this stereotypically German name for a stupid dog, they highlighted that it never found its way. A shelving unit was supposed to store high-value tools and to register digitally whenever one was taken out, but the workers only referred to it as the “candy machine” and actually used it to store their chocolate bars. Finally, the “smart glove” that was supposed to control them went by the name of “the sex toy,” a jab at its vibrations.
“I will outsmart you, you system!”
The workers had clearly noticed the new control strategy. “You have the feeling that you are monitored permanently,” one of them told me.
“You are always in a rat race with the others because the competition is pushed to the maximum. This creates an incredible burden for your head. Everybody has to run permanently and be the best. Are we all supposed to become professional athletes or what?”
To alleviate the pressure, the workers developed appropriation strategies of their own. “One or the other clever worker is thinking to himself: ‘I will outsmart you, you system!’,” explained a works council member.
“I’ll just go one gear slower, and then the colleagues will also try to go a bit slower; to outsmart the system, so to speak. It is possible to build up resistance to protect yourself.”
Workers were indeed very creative in “outsmarting” the system. For example, when the assistance system analyzed the production process and predicted that no errors were likely to occur, the collective work unit simply took an unofficial cigarette break. Without a foreman, there was nobody around to scold them.
What is more, the workers developed strategies to fight back against what they saw as technological attacks on their autonomy.
In this, they explicitly took advantage of the vulnerabilities of the new technologies. For example, in a high-bay warehouse managed by software, they would intentionally hide a few items, which eventually led to a complete crash of the warehouse system. Through this and other actions, the workers expressed their dissent over being controlled algorithmically, a dissent they felt had not been articulated properly by their formal representatives such as the works council or the trade union. They thus built up pressure, which among other things led to the cancellation of the “smart glove” experiment.
Algorithmic work control as techno-politics
After experiencing these struggles, I had to revise my understanding of algorithmic control quite a bit. On the one hand, there clearly was the managerial vision of transforming the whole factory into a cybernetic system: a fully functional machine without inner frictions, regulating itself on the basis of ubiquitous digital feedback. However, that was only half the truth, as most workers did not buy into these visions at all. Instead, they built up humorous subcultures of resistance whose jokes focused on ridiculing the new production regime: the “Führer” had turned into a “sex toy.” These subcultures became the ground for practical acts of resistance.
In most media coverage of this ‘hot topic,’ digitalization is depicted as an inevitable process. It comes down on us almost like a natural catastrophe, and with no way out, we seem forced to adjust. Yet the findings of my PhD project indicate that the implementation of digital technology is a highly political process. Algorithms are organizational technologies. They function in ways similar to laws: they set rules and execute sequences of actions based on these rules. But unlike law-making, the implementation of algorithmic management is usually ignored as a political process.
Algorithms are shaped by diverging interest groups, some of which have more power to articulate their interests than others. It is therefore important to mark out techno-politics as one of the central political arenas of our time.
As algorithms become ever more ubiquitous, this arena is the battleground on which conflicts are fought over the organization of large parts of our society. Understanding how digitalization is a political process is, therefore, a precondition for shaping it in such a way that it supports a dignified human existence and not only the accumulation of capital.
The University of Basel has an international reputation of outstanding achievements in research and teaching. Founded in 1460, the University of Basel is the oldest university in Switzerland and has a history of success going back over 550 years.