Platform Workers to be Seen and Heard

Reshaping Work
Feb 16, 2022

by Dr. Deborah Giustini, KU Leuven

Platform work has become a major global broker of labour, assets, and services bought and sold through a variety of digital platforms. In the EU, over 28 million people work through digital platforms; this workforce is projected to reach 43 million by 2025. The European Commission’s new proposed measures recognise that platform work comes with specific challenges related to the incorrect classification of employment status. In addition, the use of algorithms raises concerns over transparency and accountability, as well as inadequate workers’ rights.

These challenges can be abstracted as a paradigmatic example of conflicting regimes of “invisible labour”. Invisible labour occurs within employment relationships which are not clearly or explicitly conceptualised as such, and which hence remain hidden from consumers, policy, and public imagination.

In this sense, invisible work is a functional labour logic of how work is organised and performed in specific employment settings, such as the platform economy, which can result in workers’ devaluation and the neglect of their labour rights. These dynamics of invisible labour are encapsulated in the vulnerabilities created by the way platform work is structured through algorithmic management: the opaqueness of job allocation; the precarious and often under-paid nature of the work; its fragmentation into small tasks; and workers’ competition through self-branding and online identity management as key to being noticed and securing work.

Overall, the nature of platform work points to an invisible infrastructure that shirks social responsibilities — from recognising employment status to fragmenting work activities. This makes platform workers de facto invisible to the public sphere and the policy eye. Notwithstanding significant concerns over the continuing lack of social dialogue, the European Commission’s proposed measures target these hidden spots of the platform economy.

The first objective addresses the misclassification of employment status, to ensure applicable labour and social protection rights. As sociologist Erin Hatton argues, work is invisible when it is excluded from legal definitions of employment or is misclassified (for instance as bogus self-employment), and is therefore not monitored and regulated by states as such. The consequences are blatant. Platform workers are largely unprotected by employment laws and, given their uncertain legal status, often uncounted in official employment data. As a result, there is little systematic knowledge of platform labour, so that its boundaries remain largely unseen, particularly when it comes to working conditions and real economic impact. Thus, aside from counteracting platforms’ gains from such misclassification, the proposal can steer platform workers towards the labour and social rights that come with a clarified status, in turn dismantling their blurred legal standing.

The second objective concerns algorithmic fairness. Algorithms are “the invisible hand” of the platform economy, a form of control and information asymmetry that sets the criteria for work allocation. Algorithmic management determines workers’ access to projects while preventing them from manipulating their rating evaluations. Insidiously, it can make workers’ profiles “invisible”, so that they no longer appear in clients’ search results. Article 7 of the directive stipulates the need to contain this invisible infrastructure, proposing that digital platforms ensure sufficient, competent human resources to monitor algorithms and exercise functions that protect workers from automated negative consequences, such as sanctions or dismissal. This may counteract workers’ inability to learn from and control their opaque platform evaluations, giving them the right to contest automated decisions.

The third objective concerns the traceability of platform work. Because algorithmic and employment data are inaccessible, platform workers’ activities are further invisibilised by a lack of clear information. The directive aims to facilitate information access by requiring platforms to declare key labour information to national authorities. The Commission’s proposal is likely to enhance workers’ recognition and protection through a better understanding of where and by whom platform work is performed.

These actions can represent an important step towards platform workers’ collective power. Transparency, traceability, and accountability to national authorities can secure solidarity, defeating at least some of the strategies of exploitation and fragmentation imposed by employers. This is particularly important for the future employment setting of platform work, as currently asymmetric power relationships (as evinced, for example, in algorithmic management) are ingrained through the lack of a traditional employer-employee relationship and workers’ weak position at the bargaining table.

The metaphorical take on invisibility reflects the complexity of casting light on the mechanisms obscuring platform work, and the directive’s potential in exposing what is recognised as work, who is recognised as a worker, and how these questions are negotiated in a political space.

Although the platform economy structure is unlikely to survive in Europe as it stands, its business model will not be given up without a fight, and Member States could spend years untangling and implementing the proposal’s intricacies. The Commission has adopted a bold positioning, reinforcing public and policy views that challenge platforms’ absolutism and initiating the fundamental steps that illuminate the vulnerabilities of platform employment.

However, the directive should clarify some of its visions if workers are to fully enjoy those safeguards. Regarding employment status, the proposal presumes that a platform controls labour when it meets at least two broadly defined criteria, including: limits on remuneration; control over workers’ performance, work organisation, and working hours (e.g. by digital means); and restriction of the client base. It remains unclear how the directive envisions these presumptions being rebutted, and whether rebuttals would need to be channelled through local employment laws.

Furthermore, the measures envisage that significant decisions taken by automated systems, such as sanctions or profile deactivation, must include appropriate human channels for discussing and requesting review of such decisions. This proposition implies that workers cannot effectively counteract platforms’ algorithmic systems, as they can contest decisions and interface with human staff only at the very last stage. This initiative will thus need to be adjusted organically, within the frame of overlapping legislation such as the Artificial Intelligence Act and the Digital Markets Act. Otherwise, visible enforcement might be challenged.

The opinions and views expressed in this publication are those of the authors. They do not purport to reflect the opinions or views of Reshaping Work.


