Explanatory Memorandum to COM(2021)762 - Improving working conditions in platform work



1. CONTEXT OF THE PROPOSAL

Reasons for and objectives of the proposal

One of the objectives of the Union is the promotion of the well-being of its peoples and sustainable development of Europe based on a highly competitive social market economy, aiming at full employment and social progress. 1 The right of every worker to working conditions which respect their health, safety and dignity, and workers’ right to information and consultation are enshrined in the Charter of Fundamental Rights of the European Union. The European Pillar of Social Rights states that “regardless of the type and duration of the employment relationship, workers have the right to fair and equal treatment regarding working conditions and access to social protection.” 2

In her political guidelines, President von der Leyen stressed that “digital transformation brings fast change that affects our labour markets” and undertook the commitment to “look at ways to improve the labour conditions of platform workers”. 3 The proposed Directive delivers on this commitment and supports the implementation of the European Pillar of Social Rights Action Plan, welcomed by Member States, social partners and civil society at the Porto Social Summit in May 2021, by addressing the changes brought by the digital transformation to labour markets.

The digital transition, accelerated by the COVID-19 pandemic, is shaping the EU’s economy and its labour markets. Digital labour platforms 4 have become an important element of this newly emerging social and economic landscape. They have continued expanding in size, with revenues in the digital labour platform economy in the EU estimated to have grown by around 500% in the last five years. 5 Today, over 28 million people in the EU work through digital labour platforms. In 2025, their number is expected to have reached 43 million. 6 Digital labour platforms are present in a variety of economic sectors. Some offer services “on-location”, such as ride-hailing, delivery of goods, cleaning or care services. Others operate solely online with services such as data encoding, translation or design. Platform work varies in terms of level of skills required as well as the way the work is organised and controlled by the platforms.

Digital labour platforms promote innovative services and new business models and create many opportunities for consumers and businesses. They can efficiently match supply and demand for labour and offer possibilities to make a living or earn additional income, including for people who face barriers in access to the labour market, such as young people, people with disabilities, migrants, people with minority racial and ethnic background or people with caring responsibilities. Platform work creates opportunities to establish or broaden a client base, sometimes across borders. It brings businesses a much wider access to consumers, opportunities to diversify revenues and develop new business lines, thereby helping them to grow. For consumers it means improved access to products and services which would be otherwise hard to reach, as well as access to a new and more varied choice of services. Still, as digital labour platforms introduce new forms of work organisation, they challenge existing rights and obligations related to labour law and social protection.

Nine out of ten platforms active in the EU currently are estimated to classify people working through them as self-employed. 7 Most of those people are genuinely autonomous in their work and can use platform work as a way to develop their entrepreneurial activities. 8 Such genuine self-employment is making a positive contribution to job creation, business development, innovation, accessibility of services, and digitalisation in the EU.

However, there are also many people who experience subordination to and varying degrees of control by the digital labour platforms they operate through, for instance as regards pay levels or working conditions. According to one estimate, up to five and a half million people working through digital labour platforms could be at risk of employment status misclassification. 9 Those people are especially likely to experience poor working conditions and inadequate access to social protection. 10 As a result of the misclassification, they cannot enjoy the rights and protections to which they are entitled as workers. These rights include the right to a minimum wage, working time regulations, occupational safety and health protection, equal pay between men and women and the right to paid leave, as well as improved access to social protection against work accidents, unemployment, sickness and old age.

Digital labour platforms use automated systems to match supply and demand for work. Albeit in different ways, digital labour platforms use them to assign tasks and to monitor, evaluate and take decisions concerning the people working through them. Such practices are often referred to as “algorithmic management”. While algorithmic management is used in a growing number of ways in the wider labour market, it is clearly inherent to digital labour platforms’ business model. It creates efficiencies in the matching of supply and demand but also has a significant impact on working conditions in platform work. Algorithmic management also conceals the existence of subordination and control by the digital labour platform over the persons performing the work. The potential for gender bias and discrimination in algorithmic management could also amplify gender inequalities. Understanding how algorithms influence or determine certain decisions (such as the access to future task opportunities or bonuses, the imposition of sanctions or the possible suspension or restriction of accounts) is paramount, given the implications for the income and working conditions of people working through digital labour platforms. Currently, however, there is insufficient transparency regarding such automated monitoring and decision-making systems and people lack effective access to remedies in the face of decisions taken or supported by such systems. Algorithmic management is a relatively new and – apart from EU data protection rules – largely unregulated phenomenon in the platform economy that poses challenges to both workers and the self-employed working through digital labour platforms.

Difficulties in enforcement and lack of traceability and transparency, including in cross-border situations, are also thought to exacerbate some instances of poor working conditions or inadequate access to social protection. National authorities do not always have sufficient access to data on digital labour platforms and people working through them, such as the number of persons performing platform work on a regular basis, their contractual or employment status, or digital labour platforms’ terms and conditions. The problem of traceability is especially relevant when platforms operate across borders, making it unclear where platform work is performed and by whom. This, in turn, makes it difficult for national authorities to enforce existing obligations, including in terms of social security contributions.

The general objective of the proposed Directive is to improve the working conditions and social rights of people working through platforms, including with a view to supporting the conditions for the sustainable growth of digital labour platforms in the European Union.


The specific objectives through which the general objective will be addressed are:


–to ensure that people working through platforms have – or can obtain – the correct employment status in light of their actual relationship with the digital labour platform and gain access to the applicable labour and social protection rights;

–to ensure fairness, transparency and accountability in algorithmic management in the platform work context; and

–to enhance transparency, traceability and awareness of developments in platform work and improve enforcement of the applicable rules for all people working through platforms, including those operating across borders.

The first specific objective will be attained by establishing a comprehensive framework to tackle employment status misclassification in platform work. This framework includes appropriate procedures to ensure correct determination of the employment status of people performing platform work, in line with the principle of primacy of facts, as well as a rebuttable presumption of employment relationship (including a reversal of the burden of proof) for persons working through digital labour platforms that control certain elements of the performance of work. This legal presumption would apply in all legal and administrative proceedings, including those launched by national authorities competent for enforcing labour and social protection rules, and can be rebutted by proving that there is no employment relationship by reference to national definitions. This framework is expected to benefit both the false and the genuine self-employed working through digital labour platforms. Those who, as a result of correct determination of their employment status, will be recognised as workers will enjoy improved working conditions – including health and safety, employment protection, statutory or collectively bargained minimum wages and access to training opportunities – and gain access to social protection according to national rules. Conversely, genuine self-employed people working through platforms will indirectly benefit from more autonomy and independence, as a result of digital labour platforms adapting their practices to avoid any risk of reclassification. Digital labour platforms will also gain from increased legal certainty, including with respect to potential court challenges. Other businesses that compete with digital labour platforms by operating in the same sector will benefit from a level playing field as regards the cost of social protection contributions. Member States will enjoy increased revenues in the form of additional tax and social protection contributions.

The proposed Directive aims at attaining the second specific objective of ensuring fairness, transparency and accountability in algorithmic management by introducing new material rights for people performing platform work. These include the right to transparency regarding the use and functioning of automated monitoring and decision-making systems, which specifies and complements existing rights in relation to the protection of personal data. The proposed Directive also aims at ensuring human monitoring of the impact of such automated systems on working conditions with a view to safeguarding basic workers’ rights and health and safety at work. To ensure fairness and accountability of significant decisions taken or supported by automated systems, the proposed Directive also includes establishing appropriate channels for discussing and requesting review of such decisions. With certain exceptions, these provisions apply to all people working through platforms, including the genuine self-employed. As regards workers, the proposed Directive also aims at fostering social dialogue on algorithmic management systems by introducing collective rights regarding information and consultation on substantial changes related to use of automated monitoring and decision-making systems. As a result, all people working through platforms and their representatives will enjoy better transparency and understanding of algorithmic management practices as well as more effective access to remedies against automated decisions, leading to improved working conditions. These rights will build on and extend existing safeguards in respect of processing of personal data by automated decision-making systems laid down in the General Data Protection Regulation as well as proposed obligations for providers and users of artificial intelligence (AI) systems in terms of transparency and human oversight of certain AI systems in the proposal for an AI Act (see more details below).

Finally, concrete measures are proposed to achieve the third objective of enhancing transparency and traceability of platform work with a view to supporting competent authorities in enforcing existing rights and obligations in relation to working conditions and social protection. This includes clarifying the obligation for digital labour platforms which are employers to declare platform work to the competent authorities of the Member State where it is performed. The proposed Directive will also improve labour and social protection authorities’ knowledge of which digital labour platforms are active in their Member State by giving those authorities access to relevant basic information on the number of people working through digital labour platforms, their employment status and their standard terms and conditions. These measures will help those authorities in ensuring compliance with labour rights and in collecting social security contributions, and thus improve working conditions of people performing platform work.

Consistency with existing policy provisions in the policy area

In order to prevent a race to the bottom in employment practices and social standards to the detriment of workers, the EU has created a minimum floor of labour rights that apply to workers across all Member States. The EU labour and social acquis sets minimum standards through a number of key instruments.

Only workers who fall under the personal scope of such legal instruments benefit from the protection they afford. 11 Self-employed people, including those working through platforms, fall outside the scope and typically do not enjoy these rights, making worker status the gateway to the EU labour and social acquis. (The only exceptions are the equal treatment directives, which also cover access to self-employment 12, owing to their broader legal bases.)


Relevant legal instruments for employed people working through platforms include:


–The Directive on transparent and predictable working conditions 13 provides for measures to protect working conditions of people who work in non-standard work relationships. This includes rules on transparency, the right to information, probationary periods, parallel employment, minimum predictability of work and measures for on-demand contracts. These minimum standards are particularly relevant for people working through platforms, given their atypical work organisation and patterns. However, while the Directive ensures transparency on basic working conditions, the information duty on employers does not extend to the use of algorithms in the workplace and how they affect individual workers.

–The Directive on work-life balance for parents and carers 14 lays down minimum requirements related to parental, paternity and carers’ leave and flexible work arrangements for parents or carers. It complements the Directive on safety and health at work of pregnant workers and workers who have recently given birth or are breastfeeding 15 , which provides for a minimum period of maternity leave, alongside other measures.

–The Working Time Directive 16 lays down minimum requirements for the organisation of working time and defines concepts such as ‘working time’ and ‘rest periods’. While the Court of Justice of the European Union (CJEU) has traditionally interpreted the concept of ‘working time’ as requiring the worker to be physically present at a place determined by the employer, in recent cases the Court has extended this concept in particular when a ‘stand-by’ time system is in place (i.e. where a worker is not required to remain at their workplace but shall remain available to work if called by the employer). In 2018, the Court made clear that ‘stand-by’ time, during which the worker's opportunities to carry out other activities are significantly restricted, shall be regarded as working time. 17

–The Directive on temporary agency work 18 defines a general framework applicable to the working conditions of temporary agency workers. It lays down the principle of non-discrimination, regarding the essential conditions of work and of employment, between temporary agency workers and workers who are recruited by the user company. Due to the typically triangular contractual relationship of platform work, this Directive can be of relevance for platform work. Depending on the business model of a digital labour platform and on whether its customers are private consumers or businesses, it might qualify as a temporary-work agency assigning its workers to user companies. In some cases, the platform might be the user company making use of the services of workers assigned by temporary-work agencies.

–The Occupational Health and Safety (OSH) Framework Directive 19 lays down the main principles for encouraging improvements in health and safety at work. It guarantees minimum safety and health requirements throughout the EU. The Framework Directive is accompanied by further directives focusing on specific aspects of safety and health at work.

–The Directive establishing a general framework for informing and consulting employees 20 plays a key role in promoting social dialogue by setting minimum principles, definitions and arrangements for information and consultation of workers’ representatives at the company level within each Member State.

–When adopted, the proposed Directive on adequate minimum wages 21 will establish a framework to improve the adequacy of minimum wages and to increase the access of workers to minimum wage protection.

–When adopted, the proposed Directive on pay transparency 22 will strengthen the application of the principle of equal pay for equal work or work of equal value between men and women.

In addition, regulations on the coordination of national social security systems apply to both employed and self-employed people working through platforms in a cross-border situation 23 .

Finally, the Council Recommendation on access to social protection for workers and the self-employed 24 recommends Member States to ensure that both workers and the self-employed have access to effective and adequate social protection. The Recommendation covers unemployment, sickness and health care, maternity and paternity, invalidity, old-age and survivors’ benefits and benefits in respect of accidents at work and occupational diseases.


Consistency with other Union policies

Existing and proposed EU internal market and data protection instruments are relevant for digital labour platforms’ operations and the people working through them. Still, not all identified challenges in platform work are sufficiently addressed by those legal instruments. While they tackle algorithmic management in certain respects, they do not specifically address the perspective of workers, labour market specificities and collective labour rights.


Relevant EU internal market and data protection instruments include:


–The Regulation on promoting fairness and transparency for business users of online intermediation services (or ‘Platform-to-Business Regulation’) 25 aims at ensuring that self-employed ‘business users’ of an online platform’s intermediation services are treated in a transparent and fair way and that they have access to effective redress in the event of disputes. The relevant provisions focus, among others, on transparency of terms and conditions for business users, procedural safeguards in case of the restriction, suspension and termination of accounts, transparency regarding ranking, and complaint-handling mechanisms. These are linked to algorithmic management, but the Regulation does not cover other key aspects, such as transparency of automated monitoring and decision-making systems (other than ranking), human monitoring of such systems and specific rights regarding the review of significant decisions affecting working conditions. The Regulation does not apply to persons in an employment relationship or to digital labour platforms that are considered, as a result of an overall assessment, as not providing ‘information society services’, but for instance a transport service.

–The General Data Protection Regulation (GDPR) 26 lays down rules for the protection of natural persons concerning the processing of their personal data. It grants people working through platforms a range of rights regarding their personal data, regardless of their employment status. Such rights include, in particular, the right not to be subject to a decision based solely on automated processing which produces legal effects concerning the data subject or similarly significantly affects them (with certain exceptions), as well as the right to transparency on the use of automated decision-making. Where automated processing is permitted under the exceptions, the data controller must implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express their point of view and to contest the decision. While these rights are particularly relevant for people working through platforms subject to algorithmic management, recent court cases have highlighted the limitations and difficulties that workers – and most notably persons performing platform work – face when aiming to assert their data protection rights in the context of algorithmic management. 27 This concerns in particular the difficulty of drawing the line between algorithmic decisions that do or do not affect workers in a sufficiently ‘significant’ way. Moreover, while the GDPR grants individual rights to the people affected, it does not encompass important collective aspects inherent in labour law, including those related to the role of workers’ representatives, information and consultation of workers and the role of labour inspectorates in enforcing labour rights. The legislator therefore provided for the possibility of more specific rules to ensure the protection of workers’ personal data in the employment context, including as regards the organisation of work (Article 88 GDPR).

–When adopted, the proposed Artificial Intelligence Act 28 will address risks linked to the use of certain artificial intelligence (AI) systems. The proposed AI Act aims to ensure that AI systems placed on the market and used in the EU are safe and respect fundamental rights, such as the principle of equal treatment. It tackles issues related to development, deployment, use and regulatory oversight of AI systems and addresses inherent challenges such as bias (including gender bias) and lack of accountability, including by setting requirements for high-quality datasets, helping to tackle the risk of discrimination. The proposed AI Act lists AI systems used in employment, worker management and access to self-employment that are to be considered as high-risk. It puts forward mandatory requirements that high-risk AI systems must comply with, as well as obligations for providers and users of such systems. The proposed AI Act provides specific requirements on transparency for certain AI systems, and will ensure that digital labour platforms, as users of high-risk AI systems, will have access to the information they need to use the system in a lawful and responsible manner. Where digital labour platforms are providers of high-risk AI systems, they will need to test and document their systems appropriately. Furthermore, the proposal for an AI Act imposes requirements on providers of AI systems to enable human oversight and to issue instructions in this regard. By ensuring transparency and traceability of high-risk AI systems, the AI Act aims to facilitate the implementation of existing rules on the protection of fundamental rights, whenever such AI systems are in use. Nonetheless, it does not take into account the diversity of rules on working conditions in different Member States and sectors, and it does not provide safeguards in relation to the respect of working conditions for the people directly affected by the use of the AI systems, such as workers.

2. LEGAL BASIS, SUBSIDIARITY AND PROPORTIONALITY

Legal basis

The proposed Directive is based on Article 153(1)(b) of Treaty on the Functioning of the European Union (TFEU), which empowers the Union to support and complement the activities of the Member States with the objective to improve working conditions. In this area, Article 153(2)(b) TFEU enables the European Parliament and the Council to adopt – in accordance with the ordinary legislative procedure – directives setting minimum requirements for gradual implementation, having regard to the conditions and technical rules obtaining in each of the Member States. Such directives must avoid imposing administrative, financial and legal constraints in a way which would hold back the creation and development of small and medium-sized undertakings.

This legal basis allows the Union to set minimum standards regarding the working conditions of people performing platform work where they are in an employment relationship and thus classified as ‘workers’. The CJEU has ruled that the classification of a ‘self-employed person’ under national law does not prevent that person being classified as a worker within the meaning of EU law if their independence is merely notional, thereby disguising an employment relationship. 29 False self-employed people are thus also covered by EU labour legislation based on Article 153 TFEU.

The proposed Directive is also based on Article 16(2) TFEU insofar as it addresses the situation of persons performing platform work in relation to the protection of their personal data processed by automated monitoring and decision-making systems. This Article empowers the European Parliament and the Council to lay down rules relating to the protection of individuals with regard to the processing of personal data.

Subsidiarity (for non-exclusive competence)

Flexibility and constant adaptation of business models are key features of the platform economy, whose primary means of production are algorithms, data and clouds. As they are not tied to any fixed assets and premises, digital labour platforms can easily move and operate across borders, swiftly starting operations in certain markets, sometimes closing down for business or regulatory reasons and re-opening in another country with laxer rules.

While Member States operate in one single market, they have taken different approaches on whether or not to regulate platform work, and in what direction. More than 100 court decisions and 15 administrative decisions dealing with the employment status of people working through platforms have been observed in the Member States, with varying outcomes but predominantly in favour of reclassifying people working through platforms as workers. 30 In addition to the legal uncertainty this entails for the digital labour platforms and for those working through them, the high number of court cases points to difficulties in maintaining a level playing field among Member States as well as between digital labour platforms and other businesses, and in avoiding downward pressure on labour standards and working conditions. Certain digital labour platforms may engage in unfair commercial practices with respect to other businesses, for example, by not complying with the same rules and operating under the same conditions. Consequently, EU action is needed to ensure that the highly mobile and fast-moving platform economy develops alongside the labour rights of people working through platforms.

Digital labour platforms are often based in one country, while operating through people based elsewhere. 59% of all people working through platforms in the EU engage with clients based in another country. 31 This adds complexity to contractual relationships. The working conditions and social protection coverage of people performing cross-border platform work is equally uncertain and depends strongly on their employment status. National authorities (such as labour inspectorates, social protection institutions and tax authorities) are often not aware of which digital labour platforms are active in their country, how many people are working through them and under what employment status the work is performed. Risks of non-compliance with rules and obstacles to tackling undeclared work are higher in cross-border situations, in particular when online platform work is concerned. In this context, relevant actions aimed at tackling the cross-border challenges of platform work, including notably the lack of data to allow for a better enforcement of rules, are best taken at EU level.

National action alone would not achieve the EU’s Treaty-based core objectives of promoting sustainable economic growth and social progress, as Member States may hesitate to adopt more stringent rules or to strictly enforce existing labour standards, while they compete with one another to attract digital labour platforms’ investments.

Only an EU initiative can set common rules that apply to all digital labour platforms operating in the EU, while also preventing fragmentation in the fast-developing single market for digital labour platforms. This would ensure a level playing field in the area of working conditions and algorithmic management between digital labour platforms operating in different Member States. Hence, the specific EU added value lies in the establishment of minimum standards in these areas which will foster upward convergence in employment and social outcomes across the Union, and facilitate the development of the platform economy across the EU.

Proportionality

The proposed Directive provides for minimum standards, thus ensuring that the degree of intervention will be kept to the minimum necessary in order to reach the objectives of the proposal. Member States which already have more favourable provisions in place than those put forward in the proposed Directive will not have to change or lower them. Member States may also decide to go beyond the minimum standards set out in the proposed Directive.

The principle of proportionality is respected, considering the size and nature of the identified problems. For instance, the rebuttable presumption proposed to address the problem of misclassification of the employment status will only apply to digital labour platforms that exert a certain level of control over the performance of work. Other digital labour platforms will thus not be concerned by the presumption. Similarly, the provisions on automated monitoring and decision-making systems do not go beyond what is necessary to achieve the objectives of fairness, transparency and accountability in algorithmic management.

Choice of the instrument

Article 153(2)(b) TFEU in combination with 153(1)(b) TFEU provides explicitly that Directives may be used for establishing minimum requirements concerning working conditions to be implemented by Member States. Rules based on Article 16(2) TFEU may also be laid down in Directives.

3. RESULTS OF STAKEHOLDER CONSULTATIONS AND IMPACT ASSESSMENTS

Stakeholder consultations

In line with Article 154 TFEU, the Commission has carried out a two-stage consultation of the social partners on possible EU action to improve working conditions in platform work. In the first stage, between 24 February and 7 April 2021, the Commission consulted social partners on the need for an initiative on platform work, and its possible direction. 32 In the second stage, between 15 June and 15 September 2021, the Commission consulted social partners on the content and legal instrument of the envisaged proposal. 33

Both trade unions and employers’ organisations agreed with the overall challenges as identified in the second-stage consultation document, but differed on the need for concrete action at EU level.

Trade unions called for a Directive based on Article 153(2) TFEU providing for a rebuttable presumption of an employment relationship with a reversed burden of proof and a set of criteria to verify the status. They maintained that such an instrument should apply to both online and on-location platforms. Trade unions also supported the introduction of new rights related to algorithmic management in the employment domain, and generally opposed a third status for people working through platforms. They highlighted the need for social dialogue.

Employers’ organisations agreed that there are issues that should be tackled, such as those regarding working conditions, misclassification of employment status or access to information. However, in their view, action should be taken at national level, on a case-by-case basis and within the framework of the different national social and industrial relations systems. Regarding algorithmic management, they highlighted that the focus should be on efficient implementation and enforcement of existing and upcoming legal instruments.

There was no agreement among social partners to enter into negotiations to conclude an agreement at Union level, as foreseen in Article 155 TFEU.

In addition, the Commission held a substantial number of exchanges with many stakeholders to inform this initiative, including dedicated and bilateral meetings with platform companies, platform workers’ associations, trade unions, Member States’ representatives, experts from academia and international organisations and representatives of civil society. 34 On 20 and 21 September 2021, the Commission held two dedicated meetings with platform operators and representatives of platform workers to hear their views on the possible directions for EU action.

The European Parliament has called 35 for strong EU action to address employment status misclassification and to improve transparency in the use of algorithms, including for workers’ representatives. The Council of Ministers of the EU 36, the European Economic and Social Committee 37 and the Committee of the Regions 38 have also called for specific action on platform work.

Collection and use of expertise


The Commission contracted external experts to produce several studies gathering relevant evidence, which was used to support the impact assessment and prepare this proposal:


–“Study to support the impact assessment of an EU Initiative on improving the working conditions in platform work” (2021) by PPMI. 39

–“Digital Labour Platforms in the EU: Mapping and Business Models” (2021) by CEPS. 40

–“Study to gather evidence on the working conditions of platform workers” (2019) by CEPS. 41

The Commission also based its assessment on the reviews by the European Centre of Expertise in the field of labour law, employment and labour market policies (ECE):

–“Thematic Review 2021 on Platform work” (2021) based on country articles for the 27 EU Member States. 42

–“Case Law on the Classification of Platform Workers: Cross-European Comparative Analysis and Tentative Conclusions” (2021). 43

–“Jurisprudence of national Courts in Europe on algorithmic management at the workplace” (2021). 44


The Commission additionally drew on external expertise and used the following studies and reports to support the impact assessment:


–Eurofound reports: “Employment and Working Conditions of Selected Types of Platform Work” (2018). 45

–JRC reports: “Platform Workers in Europe Evidence from the COLLEEM Survey” (2018), 46 and “New evidence on platform workers in Europe. Results from the second COLLEEM survey” (2020). 47

–ILO report: “The role of digital labour platforms in transforming the world of work” (2021). 48

Moreover, the Commission’s assessment has also relied on its mapping of policies in Member States, relevant academic literature and CJEU case-law.

Impact assessment

The Impact Assessment 49 was discussed with the Regulatory Scrutiny Board (RSB) on 27 October 2021. The RSB issued a positive opinion with comments, which have been addressed by further clarifying the coherence with linked initiatives, explaining why and how the issues related to algorithmic management are particularly relevant for the platform economy and better reflecting the views of different categories of stakeholders, including digital labour platforms and people working through them. The combination of measures put forward in this proposal was assessed in the Impact Assessment as the most effective, efficient and coherent. The quantitative and qualitative analysis of the preferred combination of measures shows that a substantial improvement in the working conditions and access to social protection for people working through platforms is expected. Digital labour platforms will also benefit from increased legal certainty and conditions for sustainable growth, in line with the EU’s social model. As a spillover effect, other businesses competing with digital labour platforms will benefit from a newly levelled playing field.

As a result of actions to address the risk of misclassification, between 1.72 million and 4.1 million people are expected to be reclassified as workers (circa 2.35 million on-location and 1.75 million online, based on the higher estimates). This would grant them access to the rights and protections of the national and EU labour acquis. People who currently earn below the minimum wage would enjoy increased annual earnings of up to EUR 484 million, as statutory laws and/or industry-wide collective agreements would cover them as well. This means an average annual increase of EUR 121 per worker, ranging from EUR 0 for those already earning above the minimum wage before reclassification to EUR 1 800 for those earning below it. In-work poverty and precariousness would thus decrease as a result of reclassification and the resulting improved access to social protection. Hence, income stability and predictability would improve. Up to 3.8 million people would receive confirmation of their self-employment status and, as a result of actions by platforms aimed at relinquishing control to avoid reclassification as employers, they would enjoy more autonomy and flexibility. New rights related to algorithmic management in platform work may lead to improved working conditions for over 28 million people (both workers and self-employed) and greater transparency in the use of artificial intelligence (AI) at the workplace, with positive spill-over effects for the wider market of AI systems. The initiative would also improve transparency and traceability of platform work, including in cross-border situations, with positive effects for national authorities in terms of better enforcement of existing labour and fiscal rules, as well as improved collection of tax and social protection contributions. As a result, Member States could benefit from up to EUR 4 billion in increased tax and social protection contributions per year.
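As a rough consistency check of these figures (assuming, which the text does not state explicitly, that the EUR 121 average is taken over roughly 4 million reclassified workers, i.e. the upper end of the estimate):

\[
\frac{\text{EUR } 484\,000\,000 \text{ per year}}{\approx 4\,000\,000 \text{ reclassified workers}} \approx \text{EUR } 121 \text{ per worker per year,}
\]

with individual gains ranging from EUR 0 to around EUR 1 800 depending on prior earnings relative to the minimum wage.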

Actions to address the risk of misclassification could result in an increase in costs of up to EUR 4.5 billion per year for digital labour platforms. Businesses relying on them and consumers may be faced with part of these costs, depending on how digital labour platforms decide to pass them on to third parties. New rights related to algorithmic management and the foreseen measures to improve enforcement, transparency and traceability imply negligible to low costs for digital labour platforms. The initiative may negatively affect the flexibility enjoyed by people working through platforms. However, such flexibility, especially in terms of arranging work schedules, may already be more apparent than real, since actual working times depend on the real-time demand for services, the supply of workers and other factors. It was not possible to meaningfully quantify what this would entail in terms of change in full-time equivalents and potential job losses, given the very high number of variables such a calculation would involve (e.g. evolving national regulatory landscapes, shifts in platforms’ sources of investment, reallocation of tasks from part-time false self-employed to full-time workers). For some people working through digital labour platforms currently earning above the minimum wage, reclassification might lead to lower wages, as some digital labour platforms might offset higher social protection costs by reducing salaries.

Other measures considered in the Impact Assessment included: non-binding guidelines on how to deal with misclassification cases; a combination of a shift in the burden of proof with out-of-court administrative procedures for dealing with the misclassification of the employment status; non-binding guidelines on algorithmic management; algorithmic management rights for workers only; data interoperability obligations for digital labour platforms; setting up national registers to improve relevant data collection and keep track of platform work developments, including in cross-border situations. They were overall considered less efficient, less effective and less coherent with respect to the stated objectives of the initiative, as well as to the overarching values, aims, objectives and existing and forthcoming initiatives of the EU.

Regulatory fitness and simplification

The initiative includes different sets of measures, some of which aim at minimising compliance costs for micro, small and medium-sized enterprises (SMEs). While measures addressing the risk of misclassification cannot be mitigated because they directly pertain to fundamental workers’ rights, the administrative procedures required by the measures on algorithmic management and on improving enforcement, traceability and transparency allow for mitigations tailored to SMEs. Notably, these include longer deadlines for responding to requests for review of algorithmic decisions and a reduced frequency for updating relevant information.

Fundamental rights

The Charter of Fundamental Rights of the European Union protects a broad range of rights in the employment context. These include the workers’ right to fair and just working conditions (Article 31) and to information and consultation within the undertaking (Article 27), as well as the right to the protection of personal data (Article 8) and the freedom to conduct a business (Article 16). The proposed Directive promotes the rights contained in the Charter in the platform work context by addressing employment status misclassification and by putting forward specific provisions regarding the use of automated monitoring and decision-making systems in platform work. It also strengthens information and consultation rights for platform workers and their representatives on decisions likely to lead to the introduction of or substantial changes in the use of automated monitoring and decision-making systems.

4. BUDGETARY IMPLICATIONS

The proposal does not require additional resources from the European Union's budget.

5. OTHER ELEMENTS

Implementation plans and monitoring, evaluation and reporting arrangements

Member States must transpose the Directive two years after it enters into force and communicate to the Commission the national execution measures via the MNE-Database. In line with Article 153(3) TFEU, they may entrust the social partners with the implementation of the Directive. The Commission stands ready to provide technical support to Member States to implement the Directive.

The Commission will review the implementation of the Directive five years after it enters into force and propose, where appropriate, legislative amendments. Progress towards achieving the objectives of the initiative will be monitored by a series of core indicators (listed in the Impact Assessment Report). The monitoring framework will be subject to further adjustment according to the final legal and implementation requirements and timeline.

Explanatory documents (for directives)

The proposed Directive touches on labour law, specifies and complements data protection rules and contains both substantive and procedural rules. Member States might use different legal instruments to transpose it. It is therefore justified that Member States accompany the notification of their transposition measures with one or more documents explaining the relationship between the components of the Directive and the corresponding parts of national transposition instruments, in accordance with the Joint Political Declaration of 28 September 2011 of Member States and the Commission on explanatory documents. 50

Detailed explanation of the specific provisions of the proposal

Chapter I – General provisions

Article 1 – Subject matter and scope

This provision establishes the purpose of the Directive, namely to improve the working conditions of persons performing platform work by ensuring correct determination of their employment status, by promoting transparency, fairness and accountability in algorithmic management in platform work and by improving transparency in platform work, including in cross-border situations.

This article also defines the personal scope of the Directive, which includes persons performing platform work in the Union, irrespective of their employment status, albeit to a varying extent depending on the provisions concerned. As a general rule, the Directive covers persons who have, or who based on an assessment of facts may be deemed to have, an employment contract or employment relationship as defined by the law, collective agreements or practice in force in the Member States, taking into account the case-law of the CJEU. This approach is meant to include situations where the employment status of the person performing platform work is not clear, including instances of false self-employment, so as to allow correct determination of that status.

However, the provisions of the chapter on algorithmic management, which are related to the processing of personal data and thus covered by the legal basis of Article 16(2) TFEU, also apply to persons performing platform work in the Union who do not have an employment relationship, i.e. the genuine self-employed and those with another employment status that may exist in some Member States.

The digital labour platforms concerned by this proposal are those which organise platform work in the Union, irrespective of their place of establishment and irrespective of the law otherwise applicable. The decisive element for the territorial applicability is thus the place where platform work is performed and not the place where the digital labour platform is established or where the service to the recipient is offered or provided.

Article 2 – Definitions

This provision defines a number of terms and concepts necessary to interpret the provisions of the Directive, including ‘digital labour platform’, ‘platform work’ and ‘representative’. It distinguishes between ‘persons performing platform work’ – irrespective of their employment status – and ‘platform workers’ – who are in an employment relationship.

Chapter II – Employment status

Article 3 – Correct determination of the employment status

This article requires Member States to have in place appropriate procedures to verify and ensure the correct determination of the employment status of persons performing platform work, so as to allow persons that are possibly misclassified as self-employed (or any other status) to ascertain whether they should be considered to be in an employment relationship – in line with national definitions – and, if so, to be reclassified as workers. This will ensure that false self-employed have the possibility to obtain access to working conditions laid down in Union or national law in line with their correct employment status.

The provision also clarifies that the correct determination of the employment status should be based on the principle of the primacy of facts, i.e. guided primarily by the facts relating to the actual performance of work and the remuneration, taking into account the use of algorithms in platform work, and not by how the relationship is defined in the contract. Where an employment relationship exists, the procedures in place should also clearly identify who is to assume the obligations of the employer.

Article 4 – Legal presumption

This provision establishes the legal presumption that an employment relationship exists between the digital labour platform and a person performing platform work, if the digital labour platform controls certain elements of the performance of work. Member States are required to establish a framework to ensure that the legal presumption applies in all relevant administrative and legal proceedings and that enforcement authorities, such as labour inspectorates or social protection bodies, can also rely on that presumption.

The article defines criteria that indicate that the digital labour platform controls the performance of work. The fulfilment of at least two of these indicators should trigger the application of the presumption.
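Purely as an illustration of this trigger logic – not of the legal criteria themselves, whose wording is set out in the proposed Directive – a minimal sketch, assuming five hypothetical yes/no control indicators and a threshold of two:

```python
# Illustrative sketch only: the indicator names are hypothetical placeholders,
# not the legal wording of the proposed Directive's criteria.
from dataclasses import dataclass

@dataclass
class ControlIndicators:
    """Hypothetical yes/no indicators of platform control over the performance of work."""
    fixes_level_of_remuneration: bool
    supervises_work_by_electronic_means: bool
    restricts_freedom_to_organise_work: bool
    sets_binding_rules_on_conduct: bool
    restricts_building_a_client_base: bool

def presumption_applies(ind: ControlIndicators, threshold: int = 2) -> bool:
    """The legal presumption is triggered when at least `threshold` indicators are met."""
    met = sum([
        ind.fixes_level_of_remuneration,
        ind.supervises_work_by_electronic_means,
        ind.restricts_freedom_to_organise_work,
        ind.sets_binding_rules_on_conduct,
        ind.restricts_building_a_client_base,
    ])
    return met >= threshold

# Two indicators met -> the presumption of an employment relationship applies
# (rebuttable under Article 5).
example = ControlIndicators(True, True, False, False, False)
print(presumption_applies(example))  # True
```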

Member States are also required to ensure effective implementation of the legal presumption through supporting measures, such as disseminating information to the public, developing guidance and strengthening controls and field inspections, which are essential to ensure legal certainty and transparency for all parties involved.

The provision also clarifies that the legal presumption should not have retroactive effects, i.e. should not apply to factual situations before the transposition deadline of the Directive.

Article 5 – Possibility to rebut the legal presumption

This provision ensures the possibility to rebut the legal presumption in relevant legal and administrative proceedings, i.e. to prove that the contractual relationship at stake is in fact not an ‘employment relationship’ in line with the definition in force in the Member State concerned. The burden of proof that there is no employment relationship will be on the digital labour platform.

Chapter III – Algorithmic management

Article 6 – Transparency on and use of automated monitoring and decision-making systems

This provision requires digital labour platforms to inform platform workers of the use and key features of automated monitoring systems – which are used to monitor, supervise or evaluate the work performance of platform workers through electronic means – and automated decision-making systems – which are used to take or support decisions that significantly affect platform workers’ working conditions.

The information to be provided includes the categories of actions monitored, supervised and evaluated (including by clients) and the main parameters that such systems take into account for automated decisions. The article specifies in what form and at which point in time this information is to be provided and that it should also be made available to labour authorities and platform workers’ representatives upon request.

In addition, the article provides that digital labour platforms must not process any personal data concerning platform workers that are not intrinsically connected to and strictly necessary for the performance of their contract. This includes data on private conversations, on the health, psychological or emotional state of the platform worker and any data while the platform worker is not offering or performing platform work.
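As a purely illustrative sketch of the kind of information package Article 6 describes (the structure and field names below are assumptions for illustration, not the legal text):

```python
# Illustrative sketch only: field names are hypothetical placeholders for the
# categories of information described in Article 6, not the legal wording.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AutomatedMonitoringInfo:
    """Key features of an automated monitoring system to be disclosed to platform workers."""
    system_name: str
    actions_monitored: List[str]       # categories of actions monitored, supervised or evaluated
    includes_client_evaluations: bool  # whether evaluations by clients feed into the system

@dataclass
class AutomatedDecisionInfo:
    """Key features of an automated decision-making system to be disclosed."""
    system_name: str
    decisions_affected: List[str]      # e.g. task assignment, account restrictions, remuneration
    main_parameters: List[str]         # main parameters the system takes into account

@dataclass
class TransparencyNotice:
    """Information to be provided to platform workers and, on request, to labour
    authorities and platform workers' representatives."""
    monitoring_systems: List[AutomatedMonitoringInfo] = field(default_factory=list)
    decision_systems: List[AutomatedDecisionInfo] = field(default_factory=list)

# Example notice covering one monitoring system and one decision-making system
notice = TransparencyNotice(
    monitoring_systems=[AutomatedMonitoringInfo("rating-tracker", ["task completion time"], True)],
    decision_systems=[AutomatedDecisionInfo("dispatcher", ["task assignment"], ["proximity", "rating"])],
)
print(len(notice.monitoring_systems), len(notice.decision_systems))  # 1 1
```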

Article 7 – Human monitoring of automated systems

This provision requires digital labour platforms to regularly monitor and evaluate the impact of individual decisions taken or supported by automated monitoring and decision-making systems on working conditions. In particular, digital labour platforms will have to evaluate the risks of automated monitoring and decision-making systems to the safety and health of platform workers and ensure that such systems do not in any manner put undue pressure on platform workers or otherwise put at risk the physical and mental health of platform workers.

The article also stipulates the need for digital labour platforms to ensure sufficient human resources for this monitoring of automated systems. The persons charged by the digital labour platform with that task must have the necessary competence, training and authority to exercise their function and must be protected from negative consequences (such as dismissal or other sanctions) for overriding automated decisions.

Article 8 – Human review of significant decisions

This provision establishes the right for platform workers to obtain an explanation from the digital labour platform for a decision taken or supported by automated systems that significantly affects their working conditions. For that purpose the digital labour platform should provide the possibility for them to discuss and clarify the facts, circumstances and reasons for such decisions with a human contact person at the digital labour platform.

In addition, the article requires digital labour platforms to provide a written statement of reasons for any decision to restrict, suspend or terminate the platform worker’s account, to refuse the remuneration for work performed by the platform worker, or affecting the platform worker’s contractual status.

Where the explanation obtained is not satisfactory or where platform workers consider their rights infringed, they also have the right to request the digital labour platform to review the decision and to obtain a substantiated reply within a week. Digital labour platforms have to rectify the decision without delay or, if that is no longer possible, provide adequate compensation if the decision infringes the platform worker’s rights.
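A minimal sketch of the review timeline described above, assuming a simple calendar-week deadline and hypothetical field names (the proposed Directive itself defines the exact obligations):

```python
# Illustrative sketch only: models the one-week reply deadline and the
# rectify-or-compensate follow-up described for Article 8; names are placeholders.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

REPLY_DEADLINE = timedelta(weeks=1)  # substantiated reply due within a week of the request

@dataclass
class ReviewRequest:
    decision_id: str
    requested_on: date
    replied_on: Optional[date] = None
    rights_infringed: Optional[bool] = None

    def reply_overdue(self, today: date) -> bool:
        """True if no substantiated reply was given within one week of the request."""
        return self.replied_on is None and today > self.requested_on + REPLY_DEADLINE

    def follow_up(self) -> str:
        """If the decision infringed the worker's rights, it must be rectified without
        delay or, where that is no longer possible, adequately compensated."""
        if self.rights_infringed:
            return "rectify without delay, or compensate if rectification is no longer possible"
        return "no further action required"

# Example: request made on 1 March with no reply by 10 March -> reply is overdue
req = ReviewRequest("decision-42", date(2022, 3, 1))
print(req.reply_overdue(date(2022, 3, 10)))  # True
```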

Article 9 – Information and consultation

This provision requires digital labour platforms to inform and consult platform workers’ representatives or, if there are no representatives, the platform workers themselves on algorithmic management decisions, for instance if they intend to introduce new automated monitoring or decision-making systems or make substantial changes to those systems. The aim of this provision is to promote social dialogue on algorithmic management. Given the complexity of the subject matter, the representatives or the platform workers concerned can be assisted by an expert of their choice. This article is without prejudice to existing information and consultation requirements under Directive 2002/14/EC.

Article 10 – Persons performing platform work who do not have an employment relationship

This provision ensures that the provisions on transparency, human monitoring and review of Articles 6, 7 and 8 – which relate to the processing of personal data by automated systems – also apply to persons performing platform work who do not have an employment contract or employment relationship, i.e. the genuine self-employed. This does not include the provisions on health and safety at work, which are specific to workers.

This is without prejudice to the provisions of the Platform-to-Business Regulation (2019/1150). Its provisions prevail if they cover specific aspects of the Directive in respect of self-employed ‘business users’ within the meaning of the Regulation. Article 8 does not apply to ‘business users’ at all.

Chapter IV – Transparency on platform work

Article 11 – Declaration of platform work

This provision clarifies that digital labour platforms which are employers have to declare work performed by platform workers to the competent labour and social protection authorities of the Member State in which the work is performed and to share relevant data with those authorities, in accordance with national rules and procedures. This clarification is particularly relevant for digital labour platforms which are established in another country than the one where platform work is performed.

Article 12 – Access to relevant information on platform work

This provision requires digital labour platforms to make certain information accessible to labour, social protection and other relevant authorities ensuring compliance with legal obligations, as well as to the representatives of persons performing platform work. This information includes the number of persons performing platform work through the digital labour platform concerned on a regular basis and their contractual or employment status, as well as the general terms and conditions applicable to those contractual relationships. The information should be regularly updated and further clarifications and details should be provided on request.

Chapter V – Remedies and enforcement

Article 13 – Right to redress

This provision requires Member States to provide access to effective and impartial dispute resolution and a right to redress and, where appropriate, adequate compensation, for infringements of the rights established under the Directive.

Article 14 – Procedures on behalf or in support of persons performing platform work

This provision enables representatives of persons performing platform work or other legal entities which have a legitimate interest in defending the rights of persons performing platform work, to engage in any judicial or administrative procedure to enforce any of the rights or obligations under this proposal. Such entities should have the right to act on behalf or in support of a person performing platform work, with the person’s approval, in such procedures and also to bring claims on behalf of more than one person performing platform work. This aims at overcoming the procedural and cost-related obstacles that persons performing platform work face in particular when seeking to have their employment status correctly determined.

Article 15 – Communication channels for persons performing platform work

This article requires digital labour platforms to create the possibility for persons performing platform work to contact and communicate with each other, and to be contacted by representatives of persons performing platform work, through the digital labour platforms’ digital infrastructure or similarly effective means. The aim is to ensure the possibility for persons performing platform work to get to know and communicate with each other, also with a view to defending their interests, despite the lack of a common place of work.

Article 16 – Access to evidence

This article ensures that national courts or other competent authorities can order the digital labour platform to disclose relevant evidence within its control, during proceedings concerning a claim regarding correct determination of the employment status of persons performing platform work. This includes evidence containing confidential information – such as relevant data on algorithms – where they consider it relevant to the claim, provided that effective measures are in place to protect this information.

Article 17 – Protection against adverse treatment or consequences

This provision requires Member States to provide persons performing platform work who complain about breaches of provisions adopted pursuant to the Directive with adequate judicial protection against any adverse treatment or consequences by the digital labour platform.

Article 18 – Protection from dismissal

If a person performing platform work considers that he or she has been dismissed or subject to equivalent detriment (such as the deactivation of the account) on the ground that he or she exercises rights established in the Directive, and is able to establish facts which support this assertion, this provision places on the digital labour platform the burden to prove that the dismissal or alleged detrimental treatment was based on other objective reasons.

Article 19 – Supervision and penalties

This provision clarifies that the procedural framework for the enforcement of GDPR rules, in particular as regards supervision, cooperation and consistency mechanisms, remedies, liability and penalties, applies to the provisions on algorithmic management which are based on Article 16 TFEU, and that the data protection supervisory authorities are competent to monitor the application of those provisions, including the power to impose administrative fines.

The provision requires labour and social protection authorities and data protection supervisory authorities to cooperate, including by exchanging relevant information.

It also requires Member States to provide for effective, proportionate and dissuasive penalties for breaches of the obligations under this Directive, and to make sure that they are applied.

Chapter VI – Final provisions

Article 20 – More favourable provisions

This provision allows Member States to provide a higher level of protection for workers than that guaranteed by the Directive and prevents the Directive from being used to lower existing standards in the same fields. This applies to self-employed persons only insofar as more favourable rules are compatible with internal market rules.

Article 21 – Implementation

This provision establishes the maximum period that Member States have in order to transpose the Directive into national law and communicate the relevant texts to the Commission. This period is set at two years after the date of entry into force. Moreover, it highlights that Member States may entrust the social partners with the implementation of the Directive, where social partners request to do so and as long as the Member States take all the necessary steps to ensure that they can at all times guarantee the results sought under this Directive.

Article 22 – Review by the Commission

This is a standard provision requiring the Commission to review the implementation of this Directive five years after its entry into force, and to assess the need to revise and update the Directive.

Articles 23 and 24 – Entry into force and Addressees

These provisions stipulate that the Directive is to enter into force on the twentieth day following its publication in the Official Journal and is addressed to the Member States.