Considerations on COM(2018)640 - Preventing the dissemination of terrorist content online - Contribution to the Leaders’ meeting, September 2018

(1)This Regulation aims to ensure the smooth functioning of the digital single market in an open and democratic society, by addressing the misuse of hosting services for terrorist purposes and contributing to public security across the Union. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers and users’ trust in the online environment, as well as by strengthening safeguards to the freedom of expression, including the freedom to receive and impart information and ideas in an open and democratic society and the freedom and pluralism of the media.

(2)Regulatory measures to address the dissemination of terrorist content online should be complemented by Member State strategies to address terrorism, including the strengthening of media literacy and critical thinking, the development of alternative and counter narratives, and other initiatives to reduce the impact of and vulnerability to terrorist content online, as well as investment in social work, deradicalisation initiatives and engagement with affected communities, in order to achieve the sustained prevention of radicalisation in society.

(3)Addressing terrorist content online, which is part of a broader problem of illegal content online, requires a combination of legislative, non-legislative and voluntary measures based on collaboration between authorities and hosting service providers, in a manner that fully respects fundamental rights.

(4)Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, the services of hosting service providers are in certain cases abused by third parties for the purpose of carrying out illegal activities online. Of particular concern is the misuse of those services by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit followers, and to facilitate and direct terrorist activity.

(5)While not the only factor, the presence of terrorist content online has proven to be a catalyst for the radicalisation of individuals which can lead to terrorist acts, and therefore has serious negative consequences for users, citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, hosting service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help address terrorist content disseminated through their services online, while taking into account the fundamental importance of the freedom of expression, including the freedom to receive and impart information and ideas in an open and democratic society.

(6)Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers. Those efforts need to be complemented by a clear legislative framework in order to further reduce the accessibility of terrorist content online and adequately address a rapidly evolving problem. The legislative framework seeks to build on voluntary efforts, which were reinforced by Commission Recommendation (EU) 2018/334 (3), and responds to calls made by the European Parliament to strengthen measures to address illegal and harmful content online in line with the horizontal framework established by Directive 2000/31/EC of the European Parliament and of the Council (4), as well as by the European Council to improve the detection and removal of content online that incites terrorist acts.

(7)This Regulation should not affect the application of Directive 2000/31/EC. In particular, any measures taken by a hosting service provider in compliance with this Regulation, including any specific measures, should not in themselves lead to that hosting service provider losing the benefit of the liability exemption provided for in that Directive. Moreover, this Regulation does not affect the powers of national authorities and courts to establish the liability of hosting service providers where the conditions set out in that Directive for liability exemption are not met.

(8)In the event of a conflict between this Regulation and Directive 2010/13/EU of the European Parliament and of the Council (5) in relation to provisions governing audiovisual media services as defined in point (a) of Article 1(1) of that Directive, Directive 2010/13/EU should prevail. This should leave the obligations under this Regulation, in particular with regard to video-sharing platform providers, unaffected.

(9)This Regulation should set out rules to address the misuse of hosting services for the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market. Those rules should fully respect the fundamental rights protected in the Union and, in particular, those guaranteed by the Charter of Fundamental Rights of the European Union (the ‘Charter’).

(10)This Regulation seeks to contribute to the protection of public security while establishing appropriate and robust safeguards to ensure the protection of fundamental rights, including the right to respect for private life, to the protection of personal data, to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and to an effective remedy. Moreover, any discrimination is prohibited. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information and the freedom and pluralism of the media, which constitute the essential foundations of a pluralist and democratic society and are values on which the Union is founded. Measures affecting the freedom of expression and information should be strictly targeted to address the dissemination of terrorist content online, while respecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas, in accordance with the law. Effective online measures to address terrorist content online and the protection of freedom of expression and information are not conflicting but complementary and mutually reinforcing goals.

(11)In order to provide clarity about the actions that both hosting service providers and competent authorities are to take to address the dissemination of terrorist content online, this Regulation should establish a definition of ‘terrorist content’ for preventative purposes, consistent with the definitions of relevant offences under Directive (EU) 2017/541 of the European Parliament and of the Council (6). Given the need to address the most harmful terrorist propaganda online, that definition should cover material that incites or solicits someone to commit, or to contribute to the commission of, terrorist offences, solicits someone to participate in activities of a terrorist group, or glorifies terrorist activities including by disseminating material depicting a terrorist attack. The definition should also include material that provides instruction on the making or use of explosives, firearms or other weapons or noxious or hazardous substances, as well as chemical, biological, radiological and nuclear (CBRN) substances, or on other specific methods or techniques, including the selection of targets, for the purpose of committing or contributing to the commission of terrorist offences. Such material includes text, images, sound recordings and videos, as well as live transmissions of terrorist offences, that cause a danger of further such offences being committed. When assessing whether material constitutes terrorist content within the meaning of this Regulation, competent authorities and hosting service providers should take into account factors such as the nature and wording of statements, the context in which the statements were made and their potential to lead to harmful consequences in respect of the security and safety of persons. The fact that the material was produced by, is attributable to or is disseminated on behalf of a person, group or entity included in the Union list of persons, groups and entities involved in terrorist acts and subject to restrictive measures should constitute an important factor in the assessment.

(12)Material disseminated for educational, journalistic, artistic or research purposes or for awareness-raising purposes against terrorist activity should not be considered to be terrorist content. When determining whether the material provided by a content provider constitutes ‘terrorist content’ as defined in this Regulation, account should be taken, in particular, of the right to freedom of expression and information, including the freedom and pluralism of the media, and the freedom of the arts and sciences. Especially in cases where the content provider holds editorial responsibility, any decision as to the removal of the disseminated material should take into account the journalistic standards established by press or media regulation in accordance with Union law, including the Charter. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered to be terrorist content.

(13)In order to effectively address the dissemination of terrorist content online, while ensuring respect for the private life of individuals, this Regulation should apply to providers of information society services which store and disseminate to the public information and material provided by a user of the service on request, irrespective of whether the storing and dissemination to the public of such information and material is of a mere technical, automatic and passive nature. The concept of ‘storage’ should be understood as holding data in the memory of a physical or virtual server. Providers of ‘mere conduit’ or ‘caching’ services, as well as of other services provided in other layers of the internet infrastructure, which do not involve storage, such as registries and registrars, as well as providers of domain name systems (DNS), payment or distributed denial of service (DDoS) protection services, should therefore fall outside the scope of this Regulation.

(14)The concept of ‘dissemination to the public’ should entail the making available of information to a potentially unlimited number of persons, namely making the information easily accessible to users in general, without requiring further action by the content provider, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to be disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access. Interpersonal communication services, as defined in point (5) of Article 2 of Directive (EU) 2018/1972 of the European Parliament and of the Council (7), such as emails or private messaging services, should fall outside the scope of this Regulation. Information should be considered to be stored and disseminated to the public within the meaning of this Regulation only where such activities are performed upon direct request of the content provider. Consequently, providers of services, such as cloud infrastructure, which are provided at the request of parties other than the content providers and only indirectly benefit the latter, should not be covered by this Regulation. This Regulation should cover, for example, providers of social media, video, image and audio-sharing services, as well as file-sharing services and other cloud services, insofar as those services are used to make the stored information available to the public at the direct request of the content provider. Where a hosting service provider offers several services, this Regulation should apply only to the services that fall within its scope.

(15)Terrorist content is often disseminated to the public through services provided by hosting service providers established in third countries. In order to protect users in the Union and to ensure that all hosting service providers operating in the digital single market are subject to the same requirements, this Regulation should apply to all providers of relevant services offered in the Union, irrespective of the country of their main establishment. A hosting service provider should be considered offering services in the Union if it enables natural or legal persons in one or more Member States to use its services and has a substantial connection to that Member State or those Member States.

(16)A substantial connection to the Union should exist where the hosting service provider has an establishment in the Union, its services are used by a significant number of users in one or more Member States, or its activities are targeted towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in the Member State concerned, or the possibility of ordering goods or services from such Member State. Such targeting could also be derived from the availability of an application in the relevant national application store, from providing local advertising or advertising in a language generally used in the Member State concerned, or from the handling of customer relations such as by providing customer service in a language generally used in that Member State. A substantial connection should also be assumed where a hosting service provider directs its activities towards one or more Member States as set out in point (c) of Article 17(1) of Regulation (EU) No 1215/2012 of the European Parliament and of the Council (8). The mere accessibility of a hosting service provider’s website, of an email address or of other contact details in one or more Member States, taken in isolation, should not be sufficient to constitute a substantial connection. Moreover, the provision of a service with a view to mere compliance with the prohibition of discrimination laid down in Regulation (EU) 2018/302 of the European Parliament and of the Council (9) should not, on that ground alone, be considered to constitute a substantial connection to the Union.
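Purely for illustration, the connecting factors described above can be read as a disjunctive test. The following Python sketch (with hypothetical field and function names, not drawn from the Regulation) captures that reading, bearing in mind that the actual assessment turns on all relevant circumstances and that mere accessibility of a website is expressly insufficient.

```python
# Illustrative sketch only; all names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class ProviderProfile:
    has_establishment_in_union: bool
    member_states_with_significant_users: set[str] = field(default_factory=set)
    targeted_member_states: set[str] = field(default_factory=set)  # e.g. language, currency, app store presence, local advertising


def has_substantial_connection(profile: ProviderProfile) -> bool:
    """Establishment in the Union, a significant number of users in one or
    more Member States, or activities targeted towards one or more Member
    States each indicate a substantial connection (recital 16)."""
    return (
        profile.has_establishment_in_union
        or bool(profile.member_states_with_significant_users)
        or bool(profile.targeted_member_states)
    )
```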

(17)The procedure and obligations resulting from removal orders requiring hosting service providers to remove or disable access to terrorist content, following an assessment by the competent authorities, should be harmonised. Given the speed at which terrorist content is disseminated across online services, an obligation should be imposed on hosting service providers to ensure that the terrorist content identified in the removal order is removed or access to it is disabled in all Member States within one hour of receipt of the removal order. Except in duly justified cases of emergency, the competent authority should provide the hosting service provider with information on procedures and applicable deadlines at least 12 hours before issuing the first removal order to that hosting service provider. Duly justified cases of emergency occur where the removal of or disabling of access to the terrorist content later than one hour after receipt of the removal order would result in serious harm, such as in situations of an imminent threat to the life or physical integrity of a person, or when such content depicts ongoing events resulting in harm to the life or physical integrity of a person. The competent authority should determine whether cases constitute emergency cases and duly justify its decision in the removal order. Where the hosting service provider cannot comply with the removal order within one hour of its receipt, on grounds of force majeure or de facto impossibility, including for objectively justifiable technical or operational reasons, it should inform the issuing competent authority as soon as possible and comply with the removal order as soon as the situation is resolved.
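The timing rules in this recital, one-hour execution and advance information at least 12 hours before the first order except in emergencies, can be illustrated with a small, hedged Python sketch; the function names and timestamps are assumptions for illustration and do not come from the Regulation.

```python
# Minimal sketch of the timing rules described in this recital.
from datetime import datetime, timedelta, timezone

REMOVAL_DEADLINE = timedelta(hours=1)       # removal or disabling after receipt of the order
ADVANCE_INFO_PERIOD = timedelta(hours=12)   # before the first removal order, except emergencies


def removal_deadline(received_at: datetime) -> datetime:
    """Content must be removed or disabled in all Member States within
    one hour of receipt of the removal order."""
    return received_at + REMOVAL_DEADLINE


def advance_information_due(first_order_issued_at: datetime, emergency: bool) -> datetime | None:
    """Information on procedures and applicable deadlines is due at least
    12 hours before the first removal order, unless the case is a duly
    justified emergency."""
    if emergency:
        return None
    return first_order_issued_at - ADVANCE_INFO_PERIOD


# Example: an order received at 14:00 UTC must be executed by 15:00 UTC.
received = datetime(2024, 1, 1, 14, 0, tzinfo=timezone.utc)
assert removal_deadline(received) == datetime(2024, 1, 1, 15, 0, tzinfo=timezone.utc)
```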

(18)The removal order should contain a statement of reasons qualifying the material to be removed or access to which is to be disabled as terrorist content and provide sufficient information for the location of that content, by indicating the exact URL and, where necessary, any other additional information, such as a screenshot of the content in question. That statement of reasons should allow the hosting service provider and, ultimately, the content provider to effectively exercise their right to judicial redress. The reasons provided should not imply the disclosure of sensitive information which could jeopardise ongoing investigations.
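As a hedged illustration of the minimum contents described here, a removal order record might look like the following sketch; the field names are assumptions and are not the standardised template annexed to the Regulation.

```python
# Hypothetical sketch of the information a removal order carries under this recital.
from dataclasses import dataclass


@dataclass
class RemovalOrder:
    statement_of_reasons: str        # qualifies the material as terrorist content,
                                     # without disclosing sensitive investigative details
    exact_url: str                   # locates the content precisely
    screenshot: bytes | None = None  # optional additional identification where necessary
```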

(19)The competent authority should submit the removal order directly to the contact point designated or established by the hosting service provider for the purposes of this Regulation by any electronic means capable of producing a written record under conditions that allow the hosting service provider to establish the authenticity of the order, including the accuracy of the date and the time of sending and receipt thereof, such as by secured email or platforms or other secured channels, including those made available by the hosting service provider, in accordance with Union law on the protection of personal data. It should be possible for that requirement to be met through the use of, inter alia, qualified electronic registered delivery services as provided for by Regulation (EU) No 910/2014 of the European Parliament and of the Council (10). Where the hosting service provider’s main establishment is or its legal representative resides or is established in a Member State other than that of the issuing competent authority, a copy of the removal order should be submitted simultaneously to the competent authority of that Member State.

(20)It should be possible for the competent authority of the Member State where the hosting service provider has its main establishment or where its legal representative resides or is established to scrutinise the removal order issued by competent authorities of another Member State to determine whether it seriously or manifestly infringes this Regulation or the fundamental rights enshrined in the Charter. Both the content provider and the hosting service provider should have the right to request such scrutiny by the competent authority in the Member State where the hosting service provider has its main establishment or where its legal representative resides or is established. Where such a request is made, that competent authority should adopt a decision on whether the removal order comprises such an infringement. Where that decision finds such an infringement, the removal order should cease to have legal effects. The scrutiny should be carried out swiftly so as to ensure that erroneously removed or disabled content is reinstated as soon as possible.

(21)Hosting service providers that are exposed to terrorist content should, where they have terms and conditions, include therein provisions to address the misuse of their services for the dissemination to the public of terrorist content. They should apply those provisions in a diligent, transparent, proportionate and non-discriminatory manner.

(22)Given the scale of the problem and the speed necessary to effectively identify and remove terrorist content, effective and proportionate specific measures are an essential element in addressing terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers exposed to terrorist content should put in place specific measures taking into account the risks and level of exposure to terrorist content as well as the effects on the rights of third parties and the public interest in information. Hosting service providers should determine which appropriate, effective and proportionate specific measures should be put in place to identify and remove terrorist content. Specific measures could include appropriate technical or operational measures or capacities such as staffing or technical means to identify and expeditiously remove or disable access to terrorist content, mechanisms for users to report or flag alleged terrorist content, or any other measures the hosting service provider considers appropriate and effective to address the availability of terrorist content on its services.

(23)When putting in place specific measures, hosting service providers should ensure that users’ right to freedom of expression and information as well as the freedom and pluralism of the media as protected under the Charter are preserved. In addition to any requirement laid down in the law, including legislation on the protection of personal data, hosting service providers should act with due diligence and implement safeguards, where appropriate, including human oversight and verifications, to avoid any unintended or erroneous decision leading to the removal of or disabling of access to content that is not terrorist content.

(24)The hosting service provider should report to the competent authority on the specific measures in place in order to allow that authority to determine whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary capacity for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters, including the number of removal orders issued to the hosting service provider, the size and economic capacity of the hosting service provider and the impact of its services in disseminating terrorist content, for example on the basis of the number of users in the Union, as well as the safeguards put in place to address the misuse of its services for the dissemination of terrorist content online.

(25)Where the competent authority considers that the specific measures put in place are insufficient to address the risks, it should be able to require the adoption of additional appropriate, effective and proportionate specific measures. The requirement to implement such additional specific measures should not lead to a general obligation to monitor or to engage in active fact-finding within the meaning of Article 15(1) of Directive 2000/31/EC or to an obligation to use automated tools. However, it should be possible for hosting service providers to use automated tools if they consider this to be appropriate and necessary to effectively address the misuse of their services for the dissemination of terrorist content.

(26)The obligation on hosting service providers to preserve removed content and related data should be laid down for specific purposes and limited to the period necessary. There is a need to extend the preservation requirement to related data to the extent that any such data would otherwise be lost as a consequence of the removal of the terrorist content in question. Related data can include data such as subscriber data, in particular data pertaining to the identity of the content provider, as well as access data, including data about the date and time of use by the content provider and the log-in to and log-off from the service, together with the IP address allocated by the internet access service provider to the content provider.

(27)The obligation to preserve the content for administrative or judicial review proceedings is necessary and justified in view of the need to ensure that effective remedies are in place for content providers whose content has been removed or access to which has been disabled, as well as to ensure the reinstatement of that content, depending on the outcome of those proceedings. The obligation to preserve material for investigative or prosecutorial purposes is justified and necessary in view of the value the material could have for the purpose of disrupting or preventing terrorist activity. Therefore, the preservation of removed terrorist content for the purposes of prevention, detection, investigation and prosecution of terrorist offences should also be considered to be justified. The terrorist content and the related data should be stored only for the period necessary to allow the law enforcement authorities to check that terrorist content and decide whether it would be needed for those purposes. For the purposes of the prevention, detection, investigation and prosecution of terrorist offences, the required preservation of data should be limited to data that are likely to have a link with terrorist offences, and could therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security. Where hosting service providers remove or disable access to material, in particular through their own specific measures, they should inform the competent authorities promptly of content that contains information involving an imminent threat to life or a suspected terrorist offence.

(28)To ensure proportionality, the period of preservation should be limited to six months to allow content providers sufficient time to initiate administrative or judicial review proceedings and to enable access by law enforcement authorities to relevant data for the investigation and prosecution of terrorist offences. However, upon the request of the competent authority or court, it should be possible to extend that period for as long as necessary in cases where those proceedings are initiated but not finalised within that six-month period. The duration of the period of preservation should be sufficient to allow law enforcement authorities to preserve the necessary material in relation to investigations and prosecutions, while ensuring the balance with the fundamental rights.
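The retention logic described in this recital, a default six-month period extendable on request while review or investigative proceedings are pending, could be sketched as follows; the record fields and the day-based approximation of six months are assumptions for illustration only.

```python
# Illustrative retention check based on this recital; not an implementation of the Regulation.
from datetime import datetime, timedelta

DEFAULT_PRESERVATION = timedelta(days=183)  # roughly six months, for the sketch


def preservation_expired(removed_at: datetime, now: datetime,
                         extension_until: datetime | None = None) -> bool:
    """Preserved content and related data may be deleted once the six-month
    period has elapsed, unless a requested extension by a competent authority
    or court is still running."""
    deadline = removed_at + DEFAULT_PRESERVATION
    if extension_until is not None:
        deadline = max(deadline, extension_until)
    return now > deadline
```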

(29)This Regulation should not affect the procedural guarantees or procedural investigation measures related to access to content and related data preserved for the purposes of the investigation and prosecution of terrorist offences, as regulated under Union or national law.

(30)The transparency of hosting service providers’ policies in relation to terrorist content is essential to enhance their accountability towards their users and to reinforce trust of citizens in the digital single market. Hosting service providers that have taken action or were required to take action pursuant to this Regulation in a given calendar year should make publicly available annual transparency reports containing information about action taken in relation to the identification and removal of terrorist content.

(31)The competent authorities should publish annual transparency reports containing information on the number of removal orders, the number of cases where an order was not executed, the number of decisions concerning specific measures, the number of cases subject to administrative or judicial review proceedings and the number of decisions imposing penalties.
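As an illustration, the data points such a report would cover can be grouped in a simple structure; the field names below are hypothetical and merely mirror the enumeration in this recital.

```python
# Hedged sketch of a competent authority's annual transparency report.
from dataclasses import dataclass


@dataclass
class AuthorityTransparencyReport:
    year: int
    removal_orders_issued: int
    orders_not_executed: int
    specific_measure_decisions: int
    cases_under_admin_or_judicial_review: int
    penalty_decisions: int
```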

(32)The right to an effective remedy is enshrined in Article 19 of the Treaty on European Union (TEU) and in Article 47 of the Charter. Each natural or legal person has the right to an effective remedy before the competent national court against any of the measures taken pursuant to this Regulation which can adversely affect the rights of that person. That right should include, in particular, the possibility for hosting service providers and content providers to effectively challenge the removal orders or any decisions resulting from the scrutiny of removal orders under this Regulation before a court of the Member State whose competent authority issued the removal order or took the decision, as well as for hosting service providers to effectively challenge a decision relating to specific measures or penalties before a court of the Member State whose competent authority took that decision.

(33)Complaint procedures constitute a necessary safeguard against the erroneous removal of or disabling of access to content online where such content is protected under the freedom of expression and information. Hosting service providers should therefore establish user-friendly complaint mechanisms and ensure that complaints are dealt with expeditiously and in full transparency towards the content provider. The requirement for the hosting service provider to reinstate content that has been removed or access to which has been disabled in error should not affect the possibility for the hosting service provider to enforce its own terms and conditions.

(34)Effective legal protection in accordance with Article 19 TEU and Article 47 of the Charter requires that content providers are able to ascertain the reasons upon which the content they provide has been removed or access to which has been disabled. For that purpose, the hosting service provider should make available to the content provider information for challenging the removal or the disabling. Depending on the circumstances, hosting service providers could replace content which has been removed or access to which has been disabled with a message indicating that the content has been removed or access to it has been disabled in accordance with this Regulation. Further information about the reasons for the removal or disabling as well as the remedies for the removal or disabling should be provided upon request of the content provider. Where the competent authorities decide that for reasons of public security, including in the context of an investigation, it is inappropriate or counterproductive to directly notify the content provider of the removal or disabling, they should inform the hosting service provider accordingly.

(35)For the purposes of this Regulation, Member States should designate competent authorities. This should not necessarily imply the establishment of a new authority and it should be possible to entrust an existing body with the functions provided for in this Regulation. This Regulation should require the designation of authorities competent for issuing removal orders, scrutinising removal orders, overseeing specific measures and imposing penalties, while it should be possible for each Member State to decide on the number of competent authorities to be designated and whether they are administrative, law enforcement or judicial. Member States should ensure that the competent authorities fulfil their tasks in an objective and non-discriminatory manner and do not seek or take instructions from any other body in relation to the exercise of the tasks under this Regulation. This should not prevent supervision in accordance with national constitutional law. Member States should communicate the competent authorities designated under this Regulation to the Commission, which should publish online a register listing the competent authorities. That online register should be easily accessible to facilitate the swift verification of the authenticity of removal orders by the hosting service providers.

(36)In order to avoid duplication of effort and possible interferences with investigations and to minimise the burden to the hosting service providers affected, the competent authorities should exchange information, coordinate and cooperate with each other and, where appropriate, with Europol, before issuing removal orders. When deciding whether to issue a removal order, the competent authority should give due consideration to any notification of an interference with an investigative interest (deconfliction). Where a competent authority is informed by a competent authority of another Member State of an existing removal order, it should not issue a removal order concerning the same subject matter. In implementing the provisions of this Regulation, Europol could provide support in line with its current mandate and existing legal framework.

(37)In order to ensure the effective and sufficiently coherent implementation of specific measures taken by hosting service providers, competent authorities should coordinate and cooperate with each other with regard to the exchanges with hosting service providers as to removal orders and the identification, implementation and assessment of specific measures. Coordination and cooperation are also needed in relation to other measures to implement this Regulation, including with respect to the adoption of rules on penalties and the imposition of penalties. The Commission should facilitate such coordination and cooperation.

(38)It is essential that the competent authority of the Member State responsible for imposing penalties is fully informed of the issuing of removal orders and of the subsequent exchanges between the hosting service provider and the competent authorities in other Member States. For that purpose, Member States should ensure appropriate and secure communication channels and mechanisms allowing the sharing of relevant information in a timely manner.

(39)To facilitate the swift exchanges between competent authorities as well as with hosting service providers, and to avoid duplication of effort, Member States should be encouraged to make use of the dedicated tools developed by Europol, such as the current Internet Referral Management application or its successors.

(40)Referrals by Member States and Europol have proven to be an effective and swift means of increasing hosting service providers’ awareness of specific content available through their services and enabling them to take swift action. Such referrals, which are a mechanism for alerting hosting service providers of information that could be considered to be terrorist content for the provider’s voluntary consideration of the compatibility of that content with its own terms and conditions, should remain available in addition to removal orders. The final decision on whether to remove the content because it is incompatible with its terms and conditions remains with the hosting service provider. This Regulation should not affect the mandate of Europol as laid down in Regulation (EU) 2016/794 of the European Parliament and of the Council (11). Therefore, nothing in this Regulation should be understood as precluding the Member States and Europol from using referrals as an instrument to address terrorist content online.

(41)Given the particularly serious consequences of certain terrorist content online, hosting service providers should promptly inform the relevant authorities in the Member State concerned or the competent authorities of the Member State where they are established or have a legal representative of terrorist content involving an imminent threat to life or a suspected terrorist offence. In order to ensure proportionality, that obligation should be limited to terrorist offences as defined in Article 3(1) of Directive (EU) 2017/541. That obligation to inform should not imply an obligation on hosting service providers to actively seek any evidence of such imminent threat to life or a suspected terrorist offence. The Member State concerned should be understood to be the Member State with jurisdiction over the investigation and prosecution of those terrorist offences based on the nationality of the offender or of the potential victim of the offence or the target location of the terrorist act. In case of doubt, hosting service providers should submit the information to Europol, which should provide the relevant follow-up action in accordance with its mandate, including by forwarding that information to the relevant national authorities. The competent authorities of the Member States should be allowed to use such information to take investigatory measures available under Union or national law.

(42)Hosting service providers should designate or establish contact points to facilitate the expeditious handling of removal orders. The contact point should serve only for operational purposes. The contact point should consist of any dedicated means, in-house or outsourced, allowing for the electronic submission of removal orders and of technical or personal means allowing for the expeditious processing thereof. It is not necessary that the contact point be located in the Union. The hosting service provider should be free to make use of an existing contact point for the purpose of this Regulation, provided that the contact point is able to fulfil the functions provided for in this Regulation. With a view to ensuring that terrorist content is removed or that access thereto is disabled within one hour of receipt of a removal order, the contact points of hosting service providers exposed to terrorist content should be accessible at any time. The information on the contact point should include information about the language in which it can be addressed. In order to facilitate the communication between the hosting service providers and the competent authorities, hosting service providers are encouraged to allow for communication in one of the official languages of the Union institutions in which their terms and conditions are available.

(43)In the absence of a general requirement for hosting service providers to ensure a physical presence within the territory of the Union, there is a need to ensure clarity under which Member State’s jurisdiction the hosting service provider offering services within the Union falls. As a general rule, the hosting service provider falls under the jurisdiction of the Member State in which it has its main establishment or in which its legal representative resides or is established. That should be without prejudice to the rules on competence established for the purpose of removal orders and decisions resulting from the scrutiny of removal orders under this Regulation. With regard to a hosting service provider which has no establishment in the Union and does not designate a legal representative, any Member State should, nevertheless, have jurisdiction and therefore be able to impose penalties, provided that the principle of ne bis in idem is respected.

(44)Hosting service providers that are not established in the Union should designate in writing a legal representative in order to ensure compliance with and the enforcement of the obligations under this Regulation. It should be possible for hosting service providers to designate, for the purposes of this Regulation, a legal representative already designated for other purposes, provided that that legal representative is able to fulfil the functions provided for in this Regulation. The legal representative should be empowered to act on behalf of the hosting service provider.

(45)Penalties are necessary to ensure the effective implementation of this Regulation by hosting service providers. Member States should adopt rules on penalties, which can be of an administrative or criminal nature, as well as, where appropriate, fining guidelines. Non-compliance in individual cases could be subject to penalties while respecting the principles of ne bis in idem and of proportionality and ensuring that such penalties take account of systematic failure. Penalties could take different forms, including formal warnings in the case of minor infringements or financial penalties in relation to more severe or systematic infringements. Particularly severe penalties should be imposed in the event that the hosting service provider systematically or persistently fails to remove or disable access to terrorist content within one hour of receipt of a removal order. In order to ensure legal certainty, this Regulation should set out which infringements are subject to penalties and which circumstances are relevant for assessing the type and level of such penalties. When determining whether to impose financial penalties, due account should be taken of the financial resources of the hosting service provider. Moreover, the competent authority should take into account whether the hosting service provider is a start-up or a micro, small or medium-sized enterprise as defined in Commission Recommendation 2003/361/EC (12). Additional circumstances, such as whether the conduct of the hosting service provider was objectively imprudent or reprehensible or whether the infringement has been committed negligently or intentionally, should be taken into account. Member States should ensure that penalties imposed for the infringement of this Regulation do not encourage the removal of material which is not terrorist content.

(46)The use of standardised templates facilitates cooperation and the exchange of information between competent authorities and hosting service providers, allowing them to communicate more quickly and effectively. It is particularly important to ensure expeditious action following the receipt of a removal order. Templates reduce translation costs and contribute to a higher standard of the process. Feedback templates allow for a standardised exchange of information and are particularly important where hosting service providers are unable to comply with removal orders. Authenticated submission channels can guarantee the authenticity of the removal order, including the accuracy of the date and the time of sending and receipt of the order.
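A feedback template of the kind referred to here, used when a hosting service provider is unable to comply with a removal order, might carry fields along the following lines; this is a hedged sketch, not the standardised template annexed to the Regulation.

```python
# Illustrative sketch of a standardised non-compliance feedback message; all names are hypothetical.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class NonComplianceFeedback:
    order_reference: str
    received_at: datetime
    reason: str                                      # e.g. force majeure or de facto impossibility
    expected_compliance_at: datetime | None = None   # when the situation is expected to be resolved
```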

(47)In order to allow for a swift amendment, where necessary, of the content of the templates to be used for the purposes of this Regulation, the power to adopt acts in accordance with Article 290 of the Treaty on the Functioning of the European Union should be delegated to the Commission in respect of amending the annexes to this Regulation. In order to be able to take into account the development of technology and of the related legal framework, the Commission should also be empowered to adopt delegated acts to supplement this Regulation with technical requirements for the electronic means to be used by competent authorities for the transmission of removal orders. It is of particular importance that the Commission carry out appropriate consultations during its preparatory work, including at expert level, and that those consultations be conducted in accordance with the principles laid down in the Interinstitutional Agreement of 13 April 2016 on Better Law-Making (13). In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States’ experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts.

(48)Member States should collect information on the implementation of this Regulation. It should be possible for Member States to make use of the hosting service providers’ transparency reports and complement them, where necessary, with more detailed information, such as their own transparency reports pursuant to this Regulation. A detailed programme for monitoring the outputs, results and impacts of this Regulation should be established in order to inform an evaluation of the implementation of this Regulation.

(49)Based on the findings and conclusions in the implementation report and the outcome of the monitoring exercise, the Commission should carry out an evaluation of this Regulation within three years of the date of its entry into force. The evaluation should be based on the criteria of efficiency, necessity, effectiveness, proportionality, relevance, coherence and Union added value. It should assess the functioning of the different operational and technical measures provided for by this Regulation, including the effectiveness of measures to enhance the detection, identification and removal of terrorist content online, the effectiveness of safeguard mechanisms as well as the impacts on potentially affected fundamental rights, such as the freedom of expression and information, including the freedom and pluralism of the media, the freedom to conduct a business, the right to private life and the protection of personal data. The Commission should also assess the impact on potentially affected interests of third parties.

(50)Since the objective of this Regulation, namely ensuring the smooth functioning of the digital single market by addressing the dissemination of terrorist content online, cannot be sufficiently achieved by the Member States but can rather, by reason of its scale and effects, be better achieved at Union level, the Union may adopt measures, in accordance with the principle of subsidiarity as set out in Article 5 TEU. In accordance with the principle of proportionality, as set out in that Article, this Regulation does not go beyond what is necessary in order to achieve that objective,