Annexes to COM(2011)556 – Application of two recommendations on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry – PROTECTING CHILDREN IN THE DIGITAL WORLD

agreements with third countries[24].

Enhanced cooperation and harmonised protection concerning problematic Internet content seem desirable. Although this content originates mostly outside the EU, some Member States consider such an approach more realistic at European level than through the involvement of third countries.

Media literacy and awareness-raising

All Member States are committed to promoting media literacy and enhancing awareness of the risks of online media and of existing protection tools as effective preventive instruments.

In particular, a growing number of relevant initiatives in the Member States take the form of public-private partnerships. According to feedback from Member States, the European Commission's Safer Internet Programme and the EU Kids Online project have proven valuable frameworks in these fields[25].

Media literacy and awareness-raising initiatives are partly integrated into formal education and some efforts are being made to sensitise parents and teachers, too. However, an assessment carried out by the Commission in 2009 showed that even though the topic is included in national curricula in 23 European countries, actual delivery of such education is fragmented and inconsistent[26].

While increasing integration of media literacy and awareness-raising in school education is positive, universal coverage of all children and parents and consistency across schools and Member States remain significant challenges.

Access restrictions to content

Ensuring that minors access only content that is appropriate for their age requires two things: on the one hand, age-rating and classifying content, and on the other, ensuring respect for these ratings and classifications. The latter task falls primarily within parents' responsibility, but technical systems – filtering, age verification systems, parental control systems, etc. – provide valuable support.

Age rating and classification of content

The age-rating and content classification systems in place for audiovisual content are in principle considered sufficient and effective by 12 Member States[27], whereas 13 Member States[28] and Norway deem that they should be improved.

16 Member States[29] and Norway responded that they have diverging age ratings and classifications for different types of media. Ten Member States[30] and Norway consider this to be a problem. Eight Member States[31] and Norway report that measures or initiatives are being considered to introduce greater consistency in this field.

Altogether, 15 Member States[32] and Norway consider cross-media and/or pan-European classification systems for media content helpful and feasible. This is contradicted by nine Member States[33], which point to cultural differences.

This is the area of most extreme fragmentation – conceptions of what is necessary and useful diverge significantly between and within Member States.

Technical systems (filtering, age verification systems, parental control systems, etc.)

Overall, there seems to be a consensus that technical measures alone cannot protect minors from harmful content and can only be one element in a bundle of measures.

Regarding technical measures aimed at avoiding potentially harmful content by ensuring respect for the relevant ratings and classifications, the Member States are divided over their usefulness, appropriateness (with a view to the right to information and possible misuse for censorship), technical feasibility and reliability[34]. In addition, there was a shared emphasis on the need for transparency as regards the inclusion of certain content in a black list and the possibility of its removal.

20 Member States[35] report that efforts have been made by industry or public authorities to develop a filtering and rating system for the Internet. 24 Member States[36] and Norway report that parental control tools are used. They are available free of charge in 15 Member States and upon payment in four Member States[37].

Moreover, there are growing efforts to inform subscribers about the available filtering and rating systems and age verification software, which is an obligation – by law or in relevant codes of conduct for ISPs or mobile operators – in 16 Member States[38].

While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature of online content, ways to better align such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available, but linking them to appropriately rated content relies upon case-by-case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classification systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of “appropriateness” and reflecting the established approaches to the liability of the various Internet actors.

Audiovisual Media Services

As regards co-/self-regulatory systems for the protection of minors from harmful content, on-demand audiovisual media services – where such systems are in place in eight Member States, seven of which have a code of conduct – lag behind television programmes, for which such systems are in place in 14 Member States, 11 of which have a code of conduct[39].

The most common techniques to signal to parents the presence of harmful content and the need for parents to restrict access are on-screen icons and/or acoustic warnings immediately prior to the delivery of potentially harmful content. This is true of both television broadcasts and on-demand audiovisual media services.

Most Member States consider such signals useful; some require them by law or their use is stipulated by codes of conduct. Technical filtering devices or software, including pre-locking systems and PIN codes, are less used. Age classifications and transmission time restrictions for on-demand audiovisual media services are applied only in a small number of Member States[40].

As regards the reliability of labelling and warning systems, some Member States stressed the importance of parental responsibility and the fact that such systems can only work when parents ensure their effectiveness by controlling what their children are watching.

The variety of actions carried out in this field reflects the distinctions made in the AVMS Directive, but also the difficulty of reaching consensual policy responses to this challenge. Universally available technical means for offering children selective access to content on the Internet, such as parental control tools linked to age-rated and labelled content, are very diverse; the solutions developed for linear/TV broadcasting (e.g. transmission times) often seem ill-adapted to the Internet and other on-demand audiovisual media services.

Video games

A total of 17 Member States and Norway consider the functioning of their age rating systems to be satisfactory[41]. With the exception of Germany, Member States rely on PEGI (Pan-European Games Information System)[42] and PEGI Online[43].

As regards online games, PEGI Online is considered to be a good solution in principle, but a number of Member States are concerned by the still limited participation of industry in this system.

Evaluation systems for the assessment of possible favourable or adverse effects of video games on minors' development or health are in place in only five Member States[44] and Norway.

As regards possible further measures to protect minors from harmful video games, those mentioned most often were media literacy and awareness-raising, in particular to better signal the risks posed by video games and to promote existing protection tools. However, such measures are integrated into school education in only eight Member States and Norway.

The replies given by the Member States furthermore confirm the need for more action on the retail sale of video games in order to deal with "underage" sales. Relevant awareness-raising measures have been taken in only six Member States and Norway[45], and retailers have implemented relevant codes of conduct in only four Member States[46].

While age rating systems (notably PEGI) function well in most Member States, the reported challenges include their limited application to online games and "underage" sales of games in the retail market. In addition, more awareness-raising measures (e.g. media literacy at schools) would have useful preventive effects.

Right of reply in online media

16 Member States[47] provide for a right of reply covering online newspapers/periodicals; in 13 Member States[48], it covers Internet-based news services; in 17 Member States[49], it covers online television services; in 15 Member States[50], it covers online radio services; and in nine Member States[51], it covers other online services.

Member States' assessments of the level of protection from an assertion of facts[52] in online media, and of the effectiveness of the respective system(s) in place, are split about equally between "sufficient and effective" and "unsatisfactory".

The introduction of a right of reply covering online media in the Member States is inconsistent and differs for each type of online medium. Moreover, there is scope for improving the effectiveness of the systems in place.

CONCLUSIONS

As a positive general result, the survey of Member States on the various dimensions of the 1998 and 2006 Recommendations shows that all Member States are conscious of the challenges for the protection of minors online and are increasingly making efforts to respond to them. A policy mix, with a significant component of self-regulatory measures, seems best suited to addressing, in as flexible and responsive a way as possible, the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content.

However, the detailed assessment of the policy responses that Member States have developed presents a landscape made up of very diverse – and in a number of cases even diverging – actions across Europe. This is particularly true of tackling illegal and harmful content, making social networks safer places and streamlining content rating schemes.

Quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers who try to identify the “dos” and “don'ts” of protecting and empowering children who go online.

This report and the detailed responses gathered in this survey of Member States[53] demonstrate that further action at European level may build on the best practices of the Member States and reach economies of scale for the ICT sector that will help children to safely reap the benefits of the constantly evolving digital world.

[1] 1998: Council Recommendation of 24 September 1998 on the development of the competitiveness of the European audiovisual and information services industry by promoting national frameworks aimed at achieving a comparable and effective level of protection of minors and human dignity (98/560/EC, OJ L 270, 7.10.1998, p. 48–55; http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31998H0560:EN:NOT). 2006: Recommendation of the European Parliament and of the Council of 20 December 2006 on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry (2006/952/EC, OJ L 378, 27.12.2006, p. 72–77; http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32006H0952:EN:NOT).

[2] At the same time it should be ensured that all co- or self-regulatory measures taken are in compliance with competition law.

[3] COM(2010) 245 final/2: Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions – A Digital Agenda for Europe (26 August 2010 – corrected version) ( http://ec.europa.eu/information_society/digital-agenda/index_en.htm)

[4] See Staff Working Paper page 7 and footnote 27.

[5] See Staff Working Paper pages 7, 8 and footnotes 31, 32.

[6] http://www.inhope.org/gns/home.aspx

[7] Hotlines from 35 countries worldwide are members of INHOPE.

[8] See Staff Working Paper footnote 35.

[9] See Staff Working Paper pages 8, 9. For the limited liability and responsibility of ISPs according to the E-Commerce-Directive, see footnote 13 of this Report.

[10] See Staff Working Paper footnote 39.

[11] See Staff Working Paper page 9.

[12] See Staff Working Paper page 9.

[13] According to the E-Commerce Directive (Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce), OJ L 178, 17.7.2000, p. 1–16; http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000L0031:en:NOT), ISPs have no general obligation to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity (Art. 15(1)). ISPs benefit from limited liability for the information transmitted (Art. 12(1)), for the automatic, intermediate and temporary storage of that information (Art. 13(1)) and for the information stored at the request of a recipient of the service (Art. 14(1)).

[14] See Staff Working Paper footnote 46.

[15] See Staff Working Paper footnote 48.

[16] See Staff Working Paper footnote 49.

[17] See Staff Working Paper footnote 50.

[18] See Staff Working Paper footnote 52.

[19] See Staff Working Paper footnote 58.

[20] See Staff Working Paper page 12.

[21] http://ec.europa.eu/information_society/activities/social_networking/docs/sn_principles.pdf.

[22] See Staff Working Paper footnote 60.

[23] See Staff Working Paper page 13.

[24] See Staff Working Paper page 13 and footnote 63. In terms of fighting online distribution of child sexual abuse material, the Safer Internet Programme focuses on international and European cooperation, in particular by supporting the INHOPE network of hotlines.

[25] See Staff Working Paper page 14.

[26] See Staff Working Paper footnote 65.

[27] See Staff Working Paper footnote 81.

[28] See Staff Working Paper footnote 82.

[29] See Staff Working Paper footnote 83.

[30] See Staff Working Paper footnote 85.

[31] See Staff Working Paper footnote 86.

[32] See Staff Working Paper footnote 87.

[33] See Staff Working Paper footnote 88.

[34] The Safer Internet Programme has commissioned a benchmarking study of the effectiveness of the filtering solutions available in Europe. The first results were published in January 2011: http://ec.europa.eu/information_society/activities/sip/projects/filter_label/sip_bench2/index_en.htm

[35] See Staff Working Paper page 16.

[36] See Staff Working Paper footnote 77.

[37] See Staff Working Paper page 16 and footnote 78.

[38] See Staff Working Paper footnote 76.

[39] See Staff Working Paper pages 20-22 and footnotes 93, 94, 99, 100.

[40] See Staff Working Paper pages 20-22.

[41] See Staff Working Paper footnote 107.

[42] http://www.pegi.info/en/

[43] http://www.pegionline.eu/en/

[44] See Staff Working Paper footnote 118.

[45] See Staff Working Paper pages 24, 25 and footnote 119.

[46] See Staff Working Paper footnote 120.

[47] See Staff Working Paper footnote 128.

[48] See Staff Working Paper footnote 129.

[49] See Staff Working Paper footnote 130.

[50] See Staff Working Paper footnote 131.

[51] See Staff Working Paper footnote 132.

[52] In the sense of 2006 Recommendation, Annex 1 – Indicative Guidelines for the Implementation, at national level, of measures in domestic law or practice so as to ensure the right of reply or equivalent remedies in relation to on-line media.

[53] Staff Working Paper