Explanatory Memorandum to COM(2018)640 - Preventing the dissemination of terrorist content online - Contribution to the Leaders’ meeting, September 2018




1. CONTEXT OF THE PROPOSAL

1.1. Reasons for and objectives of the proposal

The ubiquity of the internet allows its users to communicate, work, socialise, create, obtain and share information and content with hundreds of millions of individuals across the globe. Internet platforms generate significant benefits for users’ economic and social wellbeing across the Union and beyond. However, the ability to reach such a large audience at minimal cost also attracts criminals who want to misuse the internet for illegal purposes. Recent terrorist attacks on EU soil have demonstrated how terrorists misuse the internet to groom and recruit supporters, to prepare and facilitate terrorist activity, to glorify their atrocities, to urge others to follow suit and to instil fear in the general public.

Terrorist content shared online for such purposes is disseminated through hosting service providers that allow the upload of third-party content. Terrorist content online has proven instrumental in radicalising and inspiring so-called lone-wolf attackers in several recent terrorist attacks within Europe. Such content not only has a significantly negative impact on individuals and society at large, but it also reduces users’ trust in the internet and damages the business models and reputation of the companies affected. Terrorists have misused not only large social media platforms, but increasingly smaller providers offering different types of hosting services globally. This misuse of the internet highlights the particular societal responsibility of internet platforms to protect their users from exposure to terrorist content and the grave security risks this content entails for society at large.

Hosting service providers, responding to calls from public authorities, have put in place certain measures to tackle terrorist content on their services. Progress has been made through voluntary frameworks and partnerships, including the EU Internet Forum, which was launched in December 2015 under the European Agenda on Security. The EU Internet Forum has promoted voluntary cooperation and action by Member States and hosting service providers to reduce the accessibility of terrorist content online and to empower civil society to increase the volume of effective alternative narratives online. These efforts have contributed to increased cooperation, improved responses by companies to referrals from national authorities and Europol’s Internet Referral Unit, the deployment of voluntary proactive measures to improve the automated detection of terrorist content, increased cooperation within the industry (including the development of the “database of hashes” to prevent known terrorist content from being uploaded to connected platforms), and increased transparency about these efforts. While cooperation under the EU Internet Forum should continue in the future, the voluntary arrangements have also shown their limitations. Firstly, not all affected hosting service providers have engaged in the Forum and, secondly, the scale and pace of progress among hosting service providers as a whole is not sufficient to adequately address this problem.
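The “database of hashes” mentioned above works by fingerprinting content that one platform has already identified and removed, so that connected platforms can flag identical re-uploads before publication. The following is an illustrative sketch only: the real shared database relies on perceptual hashing and matching techniques not described in this memorandum, and all function names here are hypothetical. An exact cryptographic hash is used purely to show the principle.

```python
import hashlib

# Hypothetical shared index of fingerprints of content already
# identified and removed by participating platforms.
known_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Derive a stable fingerprint for an uploaded file.

    A real deployment would use perceptual hashing so that re-encoded
    or slightly altered copies still match; an exact SHA-256 digest is
    used here purely for illustration.
    """
    return hashlib.sha256(content).hexdigest()

def register_removed_content(content: bytes) -> None:
    """Add removed content's fingerprint to the shared database."""
    known_hashes.add(fingerprint(content))

def is_known_terrorist_content(upload: bytes) -> bool:
    """Check an incoming upload against the shared database."""
    return fingerprint(upload) in known_hashes
```

Under this scheme, once content is removed and registered on one platform, a byte-identical re-upload on any connected platform can be flagged at upload time without each provider re-assessing the material.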

Given these limitations, there is a clear need for enhanced action from the European Union against terrorist content online. On 1 March 2018, the Commission adopted a Recommendation on measures to effectively tackle illegal content online, building upon the Commission Communication of September 2017 [1] as well as efforts under the EU Internet Forum. The Recommendation included a specific chapter identifying a number of measures to effectively stem the uploading and sharing of terrorist propaganda online, such as improvements to the referral process, a one-hour timeframe for responding to referrals, more proactive detection, effective removal and sufficient safeguards to accurately assess terrorist content [2].

[1] Communication (COM(2017) 555 final) on tackling illegal content online.

The need to enhance action in relation to terrorist content online has also been reflected in calls by EU Member States, some of which have already legislated or have expressed plans to do so. Following a series of terrorist attacks in the EU and given that terrorist content online continues to be easily accessible, the European Council of 22-23 June 2017 called for industry to “develop new technology and tools to improve the automatic detection and removal of content that incites to terrorist acts. This should be complemented by the relevant legislative measures at EU level, if necessary”. The European Council of 28 June 2018 welcomed “the intention of the Commission to present a legislative proposal to improve the detection and removal of content that incites hatred and to commit terrorist acts”. Furthermore, the European Parliament, in its resolution on online platforms and the digital single market of 15 June 2017, urged the platforms concerned “to strengthen measures to tackle illegal and harmful content” and called on the Commission to present proposals to address these issues.

To address these challenges and respond to the calls by Member States and the European Parliament, this Commission proposal seeks to establish a clear and harmonised legal framework to prevent the misuse of hosting services for the dissemination of terrorist content online, in order to guarantee the smooth functioning of the Digital Single Market, whilst ensuring trust and security. This Regulation seeks to provide clarity as to the responsibility of hosting service providers to take all appropriate, reasonable and proportionate actions necessary to ensure the safety of their services and to swiftly and effectively detect and remove terrorist content online, taking into account the fundamental importance of the freedom of expression and information in an open and democratic society. It also introduces a number of necessary safeguards designed to ensure full respect for fundamental rights such as freedom of expression and information, in addition to the judicial redress possibilities guaranteed by the right to an effective remedy as enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the EU.

By setting a minimum set of duties of care on hosting service providers, including some specific rules and obligations, as well as obligations on Member States, the proposal intends to increase the effectiveness of current measures to detect, identify and remove terrorist content online without encroaching on fundamental rights such as freedom of expression and information. Such a harmonised legal framework will facilitate the provision of online services across the Digital Single Market, ensure a level playing field for all hosting service providers directing their services to the European Union, and provide a solid legal framework for the detection and removal of terrorist content, accompanied by appropriate safeguards to protect fundamental rights. In particular, transparency obligations will increase trust among citizens, and in particular internet users, and improve the accountability and transparency of companies’ actions, including with respect to public authorities. The proposal also sets out obligations to put in place remedies and complaint mechanisms to ensure that users can challenge the removal of their content. Obligations on Member States will contribute to these objectives, as well as improve the ability of relevant authorities to take appropriate action against terrorist content online and to combat crime. Where hosting service providers fail to comply with the Regulation, Member States may impose penalties.

[2] Recommendation (C(2018)1177 final) of 1 March 2018 on measures to effectively tackle illegal content online.

1.2. Consistency with existing EU legal framework in the policy area

The present proposal is consistent with the acquis related to the Digital Single Market, and in particular the E-Commerce Directive. Notably, any measures taken by a hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for, under certain conditions, in Article 14 of the E-Commerce Directive. A decision by national authorities to impose proportionate and specific proactive measures should not, in principle, amount to a general obligation to monitor, which Article 15(1) of Directive 2000/31/EC prohibits Member States from imposing. However, given the particularly grave risks associated with the dissemination of terrorist content, decisions under this Regulation may exceptionally derogate from this principle under an EU framework. Before adopting such decisions, the competent authority should strike a fair balance between public security needs and the affected interests and fundamental rights, including in particular the freedom of expression and information, the freedom to conduct a business, and the protection of personal data and privacy. Hosting service providers’ duties of care should reflect and respect this balance as expressed in the E-Commerce Directive.

The proposal is also consistent with and closely aligned to Directive (EU) 2017/541 on Combating Terrorism, the aim of which is to harmonise Member States’ legislation criminalising terrorist offences. Article 21 of that Directive requires Member States to take measures ensuring the swift removal of online content constituting public provocation, leaving the choice of measures to the Member States. This Regulation, given its preventative nature, covers not only material inciting terrorism but also material for recruitment or training purposes, reflecting other offences related to terrorist activities which are also covered by Directive (EU) 2017/541. This Regulation directly imposes duties of care on hosting service providers to remove terrorist content and harmonises procedures for removal orders, with the aim of reducing the accessibility of terrorist content online.

The Regulation complements the rules laid down in the future Audiovisual Media Services Directive insofar as its personal and material scope are broader. The Regulation covers not only video-sharing platforms but all kinds of hosting service providers. Moreover, it covers not only videos but also images and text. Furthermore, the present Regulation goes beyond the Directive in terms of substantive provisions by harmonising rules for requests to remove terrorist content as well as proactive measures.

The proposed Regulation builds upon the Commission’s Recommendation [3] on illegal content of March 2018. The Recommendation remains in force, and all those who have a role to play in reducing the accessibility of illegal content, including terrorist content, should continue to align their efforts with the measures identified in the Recommendation.

1.3. Summary of the proposed Regulation

The personal scope of the proposal includes hosting service providers that offer their services within the Union, regardless of their place of establishment or their size. The proposed legislation introduces a number of measures to prevent the misuse of hosting services for the dissemination of terrorist content online, in order to guarantee the smooth functioning of the Digital Single Market, whilst ensuring trust and security. The definition of illegal terrorist content is in accordance with the definition of terrorist offences set out in Directive (EU) 2017/541: information which is used to incite and glorify the commission of terrorist offences, to encourage contributions to and provide instructions for committing terrorist offences, and to promote participation in terrorist groups.

[3] Recommendation (C(2018)1177 final) of 1 March 2018 on measures to effectively tackle illegal content online.

To ensure the removal of illegal terrorist content, the Regulation introduces a removal order which can be issued as an administrative or judicial decision by a competent authority in a Member State. In such cases, the hosting service provider is obliged to remove the content or disable access to it within one hour. In addition, the Regulation harmonises the minimum requirements for referrals sent by Member States’ competent authorities and by Union bodies (such as Europol) to hosting service providers to be assessed against their respective terms and conditions. Finally, the Regulation requires hosting service providers, where appropriate, to take proactive measures proportionate to the level of risk and to remove terrorist material from their services, including by deploying automated detection tools.

The measures designed to reduce terrorist content online are accompanied by a number of key safeguards to ensure the full protection of fundamental rights. As part of the measures to protect content which is not terrorist content from erroneous removal, the proposal sets out obligations to put in place remedies and complaint mechanisms to ensure that users can challenge the removal of their content. In addition, the Regulation introduces obligations on transparency for the measures taken against terrorist content by hosting service providers, thereby ensuring accountability towards users, citizens and public authorities.

The Regulation also obliges Member States to ensure that their competent authorities have the necessary capacity to intervene against terrorist content online. In addition, Member States are obliged to inform and cooperate with each other and may make use of channels set up by Europol to ensure coordination with regard to removal orders and referrals. The Regulation also foresees obligations on hosting service providers to report in more detail on the measures taken and to inform law enforcement when they detect content which poses a threat to life or safety. Finally, there is an obligation on hosting service providers to preserve the content they remove, which functions as a safeguard against erroneous removal and ensures that potential evidence is not lost for the purposes of the prevention, detection, investigation and prosecution of terrorist offences.

2. LEGAL BASIS, SUBSIDIARITY AND PROPORTIONALITY

2.1. Legal basis

The legal basis is Article 114 of the Treaty on the Functioning of the European Union, which provides for the establishment of measures to ensure the functioning of the Internal Market.

Article 114 is the appropriate legal basis to harmonise the conditions for hosting service providers to provide services across borders in the Digital Single Market and to address differences between Member State provisions which might otherwise obstruct the functioning of the internal market. It also prevents the emergence of future obstacles to economic activity resulting from differences in the way national laws might develop.

Article 114 TFEU can also be used to impose obligations on service providers established outside the territory of the EU where their service provision affects the internal market, since this is necessary to achieve the internal market objective pursued.

2.2. Choice of the instrument

Article 114 TFEU gives the Union’s legislator the possibility to adopt regulations and directives.

As the proposal concerns obligations on service providers usually offering their services in more than one Member State, divergence in the application of these rules would hinder the provision of services by providers operating in multiple Member States. A regulation allows for the same obligation to be imposed in a uniform manner across the Union, is directly applicable, provides clarity and greater legal certainty and avoids divergent transposition in the Member States. For these reasons the most appropriate form to be used for this instrument is considered to be a regulation.

2.3. Subsidiarity

Given the cross-border dimension of the problems addressed, the measures included in the proposal need to be adopted at Union level in order to achieve the objectives. The internet is by its nature cross-border, and content hosted in one Member State can normally be accessed from any other Member State.

A fragmented framework of national rules to tackle terrorist content online is emerging, and the risk of further fragmentation is growing. This would burden companies with complying with diverging regulations and would create unequal conditions for companies as well as security loopholes.

EU action therefore enhances legal certainty and increases the effectiveness of hosting service providers’ actions against terrorist content online. This should allow more companies to take action, including companies established outside the European Union, strengthening the integrity of the digital single market.

This justifies the need for EU action, as echoed by the European Council Conclusions of June 2018 inviting the Commission to present a legislative proposal in this area.

2.4. Proportionality

The proposal lays down rules for hosting service providers to apply measures to expeditiously remove terrorist content from their services. Key features limit the proposal to only that which is necessary to achieve the policy objectives.

The proposal takes into account the burden on hosting service providers and includes safeguards, such as the protection of freedom of expression and information as well as other fundamental rights. The one-hour timeframe for removal applies only to removal orders, for which competent authorities have determined illegality in a decision which is subject to judicial review. For referrals, there is an obligation to put in place measures to facilitate the expeditious assessment of terrorist content, without, however, imposing an obligation to remove the content or to do so within absolute deadlines; the final decision remains a voluntary decision of the hosting service provider. The burden on companies to assess the content is alleviated by the fact that the competent authorities of Member States and Union bodies provide explanations of why the content may be considered terrorist content. Hosting service providers shall, where appropriate, take proactive measures to protect their services against the dissemination of terrorist content. Specific obligations related to proactive measures are limited to those hosting service providers exposed to terrorist content, as evidenced by the receipt of a removal order which has become final, and should be proportionate to the level of risk as well as the resources of the company. The preservation of the removed content and related data is limited to a period of time proportionate to the purposes of enabling administrative or judicial review proceedings and the prevention, detection, investigation and prosecution of terrorist offences.

3. RESULTS OF EX-POST EVALUATIONS, STAKEHOLDER CONSULTATIONS AND IMPACT ASSESSMENT


3.1. Stakeholder consultations

In preparing this legislative proposal, the Commission consulted all relevant stakeholders to understand their views and a potential way forward. The Commission conducted an open public consultation on measures to improve the effectiveness of tackling illegal content, receiving 8,961 replies, of which 8,749 were from individuals, 172 from organisations, 10 from public administrations, and 30 from other categories of respondents. In parallel, a Eurobarometer survey on illegal content online was conducted with a random sample of 33,500 EU residents. The Commission also consulted Member States’ authorities as well as hosting service providers throughout May and June 2018 with regard to specific measures to tackle terrorist content online.

By and large, most stakeholders expressed that terrorist content online is a serious societal problem affecting internet users and the business models of hosting service providers. More generally, 65% of respondents to the Eurobarometer survey [4] considered that the internet is not safe for its users, and 90% of respondents considered it important to limit the spread of illegal content online. Consultations with Member States revealed that while voluntary arrangements are producing results, many see the need for binding obligations on terrorist content, a sentiment echoed in the European Council Conclusions of June 2018. While the hosting service providers were overall in favour of the continuation of voluntary measures, they noted the potential negative effects of emerging legal fragmentation in the Union.

Many stakeholders also noted the need to ensure that any regulatory measures for removal of content, particularly proactive measures and strict timeframes, should be balanced with safeguards for fundamental rights, notably freedom of speech. Stakeholders noted a number of necessary measures relating to transparency, accountability as well as the need for human review in deploying automated tools.

3.2. Impact Assessment

The Regulatory Scrutiny Board issued a positive opinion with reservations on the impact assessment and made various suggestions for improvement [5]. Following this opinion, the Impact Assessment report was amended to address the main comments of the Board, focusing specifically on terrorist content while further emphasising the implications for the functioning of the Digital Single Market and providing a more in-depth analysis of the impact on fundamental rights and the functioning of the proposed safeguards in the options.

If no additional measures were taken, voluntary actions under the baseline would be expected to continue and to have some impact on reducing terrorist content online. However, voluntary measures are unlikely to be taken by all hosting service providers exposed to such content, and further legal fragmentation is expected to emerge, introducing additional barriers to cross-border service provision. Besides the baseline scenario, three main policy options were considered, with increasing levels of effectiveness in addressing the objectives set out in the impact assessment and the overall policy goal of reducing terrorist content online.


The scope of these obligations in all three options covered all hosting service providers (personal scope) established in the EU or in third countries, insofar as they offer their services in the Union (geographic scope). Given the nature of the problem and the need to avoid the abuse of smaller platforms, no exemptions are foreseen for SMEs under any of the options. All options would require hosting service providers to establish a legal representative in the EU, including companies established outside the EU, so as to ensure the enforceability of EU rules. Under all options, Member States were foreseen to develop sanction mechanisms.

[4] Eurobarometer 469, Illegal content online, June 2018.

[5] Link to the RSB opinion on RegDoc.

All options envisaged the creation of a new, harmonised system of legal removal orders relating to terrorist content online, issued by national authorities to hosting service providers and the requirement to remove that content within one hour. These orders would not necessarily require an assessment on the part of the hosting service providers, and would be subject to judicial redress.

Safeguards, notably complaint procedures and effective remedies, including judicial redress as well as other provisions to prevent the erroneous removal of content which is not terrorist content, whilst ensuring compliance with fundamental rights, are all common features of the three options. Furthermore, all options include reporting obligations in the form of public transparency and reporting to Member States and the Commission, as well as towards authorities for suspected criminal offences. In addition, cooperation obligations between national authorities, hosting service providers, and where relevant Europol are foreseen.

The main differences between the three options relate to the scope of the definition of terrorist content, the level of harmonisation of referrals, the scope of proactive measures, co-ordination obligations on Member States, as well as data preservation requirements. Option 1 would limit the material scope to content disseminated to directly incite the commission of a terrorist act, following a narrow definition, while options 2 and 3 would adopt a more comprehensive approach, also covering material concerning recruitment and training. On proactive measures, under option 1, hosting service providers exposed to terrorist content would need to carry out a risk assessment, but proactive measures addressing the risk would remain voluntary. Option 2 would require hosting service providers to prepare an action plan, which may include deploying automated tools to prevent the re-upload of already removed content. Option 3 includes more comprehensive proactive measures, requiring service providers exposed to terrorist content to also identify new material. In all options, the requirements related to proactive measures would be proportionate to the level of exposure to terrorist material as well as the economic capacities of the service provider. With regard to referrals, option 1 would not harmonise the approach to referrals, whereas option 2 would do so for Europol referrals and option 3 would additionally include Member State referrals. Under options 2 and 3, Member States would be obliged to inform, coordinate and cooperate with each other, and in option 3 they would also have to ensure that their competent authorities have the capacity to detect and notify terrorist content. Finally, option 3 also includes a requirement to preserve data as a safeguard in cases of erroneous removal and to facilitate criminal investigations.

In addition to the legal provisions, all legislative options were envisaged to be accompanied by a series of supporting measures, in particular to facilitate cooperation across national authorities and Europol, as well as the collaboration with hosting service providers, and Research, Development and Innovation support for development and take-up of technological solutions. Additional awareness-raising and supporting instruments for SMEs could also be deployed following the adoption of the legal instrument.

The Impact Assessment concluded that a series of measures are required to achieve the policy objective. The comprehensive definition of terrorist content capturing the most harmful material would be preferable to a narrow definition (option 1). Proactive obligations limited to preventing the re-upload of terrorist content (option 2) would be less impactful than obligations related to the detection of new terrorist content (option 3). Provisions on referrals should include referrals from both Europol and Member States (option 3) rather than being limited to referrals from Europol (option 2), as referrals from Member States are an important contribution to the overall effort of reducing the accessibility of terrorist content online. Such measures would need to be implemented in addition to the measures common to all options, including robust safeguards against the erroneous removal of content.

3.3. Fundamental rights

Terrorists’ online propaganda seeks to incite individuals to carry out terrorist attacks, including by equipping them with detailed instructions on how to inflict maximum harm. Further propaganda is commonly released after such atrocities, glorifying these acts and encouraging others to follow suit. This Regulation contributes to the protection of public security by reducing the accessibility of terrorist content that promotes and encourages the violation of fundamental rights.

The proposal could potentially affect a number of fundamental rights:

(a) rights of the content provider: right to freedom of expression; right to protection of personal data; right to respect of private and family life, the principle of non-discrimination and the right to an effective remedy;

(b) rights of the service provider: right to freedom to conduct a business; right to an effective remedy;

(c) rights of all citizens: the right to freedom of expression and information.

Taking into account the relevant acquis, appropriate and robust safeguards are included in the proposed Regulation to ensure that the rights of these persons are protected.

A first element in this context is that the Regulation establishes a definition of terrorist content online in accordance with the definition of terrorist offences in Directive (EU) 2017/541. This definition applies to removal orders and referrals, as well as to proactive measures. It ensures that only illegal content corresponding to a Union-wide definition of related criminal offences is to be removed. In addition, the Regulation includes general duties of care for hosting service providers to act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding the removal of content which is not terrorist content.

More specifically, the Regulation has been designed to ensure the proportionality of the measures taken with respect to fundamental rights. Regarding removal orders, the assessment of the content (including legal checks, where necessary) by a competent authority justifies the one-hour removal time limit for this measure. Furthermore, the provisions in this Regulation that relate to referrals are limited to those sent by competent authorities and Union bodies providing explanations of why the content may be considered terrorist content. While the responsibility for removing content identified in a referral remains with the hosting service provider, this decision is facilitated by the aforementioned assessment.

For proactive measures, the responsibility for identifying, assessing and removing content remains with the hosting service providers, and they are required to put in place safeguards to ensure content is not removed erroneously, including through human review, particularly where further contextualisation is required. Furthermore, unlike in the baseline scenario, where the most affected companies set up automated tools without public oversight, the design of the measures as well as their implementation would be subject to reporting to competent bodies in Member States. This obligation reduces the risk of erroneous removals both for companies setting up new tools and for those already using them. In addition, hosting service providers are required to provide user-friendly complaint mechanisms through which content providers can contest the decision to remove their content, and to publish transparency reports to the general public.

Finally, should any content and related data be removed erroneously despite these safeguards, hosting service providers are required to preserve it for a period of six months so that it can be reinstated, thereby ensuring the effectiveness of complaint and review procedures with a view to protecting freedom of expression and information. At the same time, the preservation also serves law enforcement purposes. Hosting service providers need to put in place technical and organisational safeguards to ensure the data is not used for other purposes.

The proposed measures, in particular those related to removal orders, referrals, proactive measures and the preservation of data should not only protect internet users against terrorist content but also contribute to protecting the right of citizens to life by reducing the accessibility of terrorist content online.

4. BUDGETARY IMPLICATIONS

The legislative proposal for a Regulation does not have an impact on the Union’s budget.

5. OTHER ELEMENTS

5.1. Implementation plans and monitoring, evaluation and reporting arrangements

The Commission will establish within [one year from the date of application of this Regulation] a detailed programme for monitoring the outputs, results and impacts of this Regulation. The monitoring programme shall set out the indicators and the means by which and the intervals at which the data and other necessary evidence will be collected. It shall specify the actions to be taken by the Commission and by the Member States in collecting and analysing the data and other evidence to monitor the progress and evaluate this Regulation.

On the basis of the established monitoring programme, the Commission will report on the implementation of this Regulation within two years of its entry into force, based on the transparency reports published by companies as well as on information provided by Member States. The Commission will carry out an evaluation no sooner than four years after the Regulation entered into force.

Based on the findings of the evaluation, including whether certain gaps or vulnerabilities remain, and taking into account technological developments, the Commission will assess the need to enlarge the scope of the Regulation. If necessary, the Commission will submit proposals to adapt this Regulation.

The Commission will support the implementation, monitoring and evaluation of the Regulation through a Commission expert group. The group will also facilitate cooperation between hosting service providers, law enforcement and Europol; foster exchanges and best practices for detecting and removing terrorist content; provide expertise on the evolution of terrorists' modus operandi online; and provide advice and guidance where appropriate to support the implementation of the provisions.

The implementation of the proposed Regulation could be facilitated through a number of supporting measures. These include the possible development of a platform within Europol to assist in the coordination of referrals and removal orders. EU-funded research into how terrorists' modus operandi is evolving enhances the understanding and awareness of all relevant stakeholders. In addition, Horizon 2020 supports research with a view to developing new technologies, including the automated prevention of uploads of terrorist content. Furthermore, the Commission will continue analysing how to support competent authorities and hosting service providers in the implementation of this Regulation through EU financial instruments.

5.2. Detailed explanation of the specific provisions of the proposal

Article 1 sets out the subject matter, indicating that the Regulation lays down rules to prevent the misuse of hosting services for the dissemination of terrorist content online, including duties of care on hosting service providers and measures to be put in place by Member States. It also sets out the geographical scope, covering hosting service providers offering services in the Union, irrespective of their place of establishment.

Article 2 provides definitions of the terms used in the proposal. It also establishes a definition of terrorist content for preventative purposes, drawing on the Directive on Combating Terrorism, to capture material and information that incites, encourages or advocates the commission of, or contribution to, terrorist offences, provides instructions for the commission of such offences, or promotes participation in the activities of a terrorist group.

Article 3 provides for duties of care to be applied by hosting service providers when taking action in accordance with this Regulation, in particular with due regard to the fundamental rights involved. It requires hosting service providers to include appropriate provisions in their terms and conditions and to ensure that these are applied.

Article 4 requires Member States to empower competent authorities to issue removal orders and lays down a requirement for hosting service providers to remove content within one hour of the receipt of a removal order. It also sets out the minimum elements removal orders should contain and procedures for hosting service providers to give feedback to the issuing authority, and to inform the latter if it is not possible to comply with the order or if further clarification is required. It also requires the issuing authority to inform the authority overseeing proactive measures of the Member State of jurisdiction of the hosting service provider.
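Purely as an illustration of the mechanics described above, and not of the actual template (which the Regulation itself prescribes in its annexes), the minimum elements of a removal order and the one-hour compliance window could be modelled as a simple record; all field names here are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of a removal order's minimum elements (cf. Article 4).
# Field names are illustrative only; the binding template is set by the Regulation.
@dataclass
class RemovalOrder:
    issuing_authority: str      # competent authority of a Member State
    content_url: str            # exact location of the content to be removed
    statement_of_reasons: str   # why the content is considered terrorist content
    received_at: datetime       # moment of receipt by the hosting service provider

    def compliance_deadline(self) -> datetime:
        # The content must be removed within one hour of receipt of the order.
        return self.received_at + timedelta(hours=1)

order = RemovalOrder(
    issuing_authority="Competent authority of Member State X",
    content_url="https://example.com/item/123",
    statement_of_reasons="Incitement to commit a terrorist offence.",
    received_at=datetime(2018, 9, 12, 10, 0, tzinfo=timezone.utc),
)
print(order.compliance_deadline().isoformat())
```

A provider's intake system would compare the deadline against the actual removal time and, where compliance is impossible or clarification is needed, send the feedback to the issuing authority that the Article provides for.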

Article 5 lays down a requirement for hosting service providers to put in place measures to expeditiously assess content referred to them by a competent authority in a Member State or by a Union body, without, however, imposing a requirement to remove the referred content or setting specific deadlines for action. It also sets out the minimum elements referrals should contain and procedures for hosting service providers to give feedback to, and request clarification from, the authority which referred the content.

Article 6 requires hosting service providers to take effective and proportionate proactive measures where appropriate. It sets out a procedure ensuring that certain hosting service providers (i.e. those having received a removal order that has become final) take additional proactive measures where necessary to mitigate the risks, in accordance with the level of exposure to terrorist content on their services. The hosting service provider should cooperate with the competent authority on the necessary measures and, if no agreement can be reached, the authority may impose measures on the service provider. The Article also provides for a review procedure against the authority's decision.

Article 7 requires hosting service providers to preserve removed content and related data for six months for review proceedings and for investigative purposes. This period can be extended in order to allow the finalisation of the review. The Article also requires service providers to put in place safeguards to ensure the preserved content and related data is not accessed or processed for other purposes.

Article 8 lays down an obligation for hosting service providers to explain their policies against terrorist content and to publish annual transparency reports on the actions taken in this regard.

Article 9 provides for specific safeguards regarding the use and implementation of proactive measures when using automated tools to ensure that decisions are accurate and well-founded.

Article 10 requires hosting service providers to implement complaint mechanisms for removals, referrals and proactive measures and to examine promptly every complaint.

Article 11 establishes an obligation for hosting service providers to make available information about the removal to the content provider, unless the competent authority requires the non-disclosure for public security reasons.

Article 12 requires Member States to ensure that competent authorities have sufficient capability and resources in order to fulfil their responsibilities under this Regulation.

Article 13 requires Member States to cooperate with each other and, where appropriate, with Europol to avoid duplication and interference with investigations. The Article also provides for the possibility for Member States and hosting service providers to make use of dedicated tools, including those of Europol, for the processing of and feedback on removal orders and referrals, and to cooperate on proactive measures. It also requires Member States to have appropriate communication channels in place to ensure the timely exchange of information in implementing and enforcing the provisions of this Regulation. The Article also obliges hosting service providers to inform the relevant authorities when they become aware of any evidence of terrorist offences within the meaning of Article 3 of Directive (EU) 2017/541 on Combating Terrorism.

Article 14 provides for the establishment of points of contact by both hosting service providers and Member States to facilitate communication between them, particularly in relation to referrals and removal orders.

Article 15 establishes the Member State jurisdiction for the purposes of overseeing proactive measures, setting penalties and monitoring efforts.

Article 16 requires hosting service providers which do not have an establishment within any Member State but which do offer services within the Union, to designate a legal representative in the Union.

Article 17 requires Member States to designate authorities for issuing removal orders, for referring terrorist content, for overseeing the implementation of proactive measures and for enforcement of the Regulation.

Article 18 sets out that Member States should lay down rules on penalties for non-compliance and provides criteria for Member States to take into account when determining the type and level of penalties. Given the particular importance of expeditious removal of terrorist content identified in a removal order, specific rules should be put in place on financial penalties for systematic breaches of this requirement.

Article 19 lays down a faster and more flexible procedure for amending the templates provided for removal orders and authenticated submission channels through delegated acts.

Article 20 lays down the conditions under which the Commission has the power to adopt delegated acts to provide for necessary amendments to the templates and technical requirements for removal orders.

Article 21 requires the Member States to collect and report specific information related to the application of the Regulation with a view to assisting the Commission in the exercise of its duties under Article 23. The Commission shall establish a detailed programme for monitoring the outputs, results and impacts of this Regulation.

Article 22 sets out that the Commission shall report on the implementation of this Regulation two years after its entry into force.

Article 23 sets out that the Commission shall report on the evaluation of this Regulation no sooner than three years after its entry into force.

Article 24 establishes that the proposed Regulation will enter into force on the twentieth day after its publication in the Official Journal and will then apply six months after its date of entry into force. This deadline is proposed considering the need for implementing measures while also recognising the urgency of full application of the rules of the proposed Regulation. This deadline of six months has been set on the assumption that negotiations will be conducted swiftly.