Opening keynote speech by Commissioner Mariya Gabriel at the 9th Annual European Data Protection and Privacy Conference

Source: M.I. (Mariya) Gabriel, published on Wednesday, March 20, 2019.

Dear international guests,

Dear Andrea Jelinek,

Dear Ladies and Gentlemen,

First of all, I would like to thank the organisers for putting together a particularly rich programme for this 9th annual data protection and privacy conference. Today's panels will cover many of the most important digital policy topics of the moment, such as the data economy and the challenge of online disinformation.

So thank you very much for inviting me. I am very pleased to be with you and to open this event this morning.

Ladies and Gentlemen,

For the Digital Single Market to flourish, individuals must above all be able to trust digital products and services. Several tools help us achieve this.

First and foremost, there is of course the General Data Protection Regulation. Not yet one year in force but already a global standard. I know that it will be much discussed during the day.

But there is more. I can only briefly mention the European Electronic Communications Code, the new regulatory framework for the telecom sector agreed last year, as well as several measures for reinforcing the level of cybersecurity in the EU, namely the NIS Directive, also in force since last year, and the Cybersecurity Act, a regulation that will become binding in April. Finally, in the area of online privacy we have of course the e-Privacy Directive, which has been in force since 2013, and its designated successor, the e-Privacy Regulation as proposed by the Commission in 2017.

These instruments all complement each other and together provide a solid framework for digital trust. They constitute the foundations for the future competitiveness of European companies that develop services based on trusted data technologies.

But today I will focus my remarks, firstly, on the ePrivacy Regulation and the importance of this proposal for completing our legal framework for online privacy. And secondly, I will explain what the Commission has done so far to help prevent and tackle disinformation online because this challenge is closely linked to the use, and indeed misuse, of our personal data and our privacy online.

Let me start with the ePrivacy Regulation. The basic idea behind this proposal is to guarantee the confidentiality of electronic communications in the same way for all citizens of the EU, just like the GDPR brought us harmonised rules for the protection of personal data. The starting point could not be simpler: our electronic communications are confidential. Nobody should be allowed to read our mail or listen to our telephone conversations. Of course this applies to our private communications; I think everybody naturally expects that. But the right to privacy also protects the professional communications of businesses, governments and other organisations. By the way, this includes the increasing amount of data that is generated and exchanged with, and also between, machines and software. For example, when you speak to your bank's computer on the telephone, or when a truck is “phoning home” to continuously update its company on the progress of its journey. Even companies that use artificial intelligence algorithms to tie their business together across the continent need to know that nobody has the right to simply take their data.

Beyond ensuring the high level of protection for the confidentiality of communications, the two other main objectives of the proposal are to create a level playing field and to support innovation.

This is achieved, first, by ensuring that all functionally equivalent communications services are subject to the same rules, independently of the technology used to communicate. This means that our communications should be protected regardless of whether we use a landline or mobile phone, whether we send an SMS or an instant message and also when we use an online chat. It might be difficult to believe, but that is not the case today!

Second, the ePrivacy Regulation will create additional possibilities for operators to process electronic communications data compared to the current situation. Certain types of processing are simply forbidden today. The new regulation would give providers the chance to earn the trust and the consent of their users, for example for processing information about their location.

At a time when we seemingly have to talk about data losses and other breaches of online trust every day, the need for this new instrument is clearer than ever. We were therefore very happy to see the European Parliament, and its rapporteur on e-privacy, MEP Birgit Sippel, who is invited to speak at one of the later panels, rise to the occasion and put forward its position as early as the autumn of 2017. There is no time to lose when it comes to the fundamental rights all of us depend on.

Unfortunately, on the side of the Council, the work has taken longer despite very intensive discussions over the last two years. But we are now finally beginning to see light at the end of the tunnel. I believe it would be a very good signal to all citizens if the Romanian Presidency could arrive at a Council position on this important fundamental rights proposal before this year's European Parliament elections.

And this reference to the European elections, ladies and gentlemen, brings me to my second topic, disinformation.

Ahead of the European elections, we need to ensure that social media are not used to spread disinformation, that is, false information spread deliberately to deceive.

Elections must be free and fair, as President Juncker pointed out in his State of the Union speech last September. In our digital world, the risk of interference and manipulation has never been so high. It is high time to bring our electoral rules into line with the digital age.

The European Commission's strategy to combat this threat was set out in the Communication of 26 April 2018 and reinforced by an Action Plan in December, which focuses on four key areas. It aims to strengthen the EU's capacity to counter disinformation and to step up cooperation between the Member States and the Union.

At our request, as part of these initiatives, the main players in the internet sector and advertisers have subscribed to a Code of Practice.

With this Code, industry has committed to a wide range of actions, from transparency in political advertising to the closure of fake accounts and the demonetisation of providers of disinformation.

Specifically in view of the European Parliament elections, Google and Facebook are providing training to candidates, political parties and campaigners on how to manage their online presence and on how to protect their campaigns.

These actions should contribute to a rapid and measurable reduction of online disinformation. To this end, the Commission is paying particular attention to their effective implementation. Since the end of last year, we have been doing so through a monthly monitoring system.

Today we are releasing the third such report and analysis, and yesterday I convened Twitter, Facebook and Google to assess the situation.

My view, and this was the essence of my message to them, is that we are not yet there despite a lot of progress.

There is no doubt that these platforms know what needs to be done. On the positive side, for example, all of them have put in place a tool to monitor political ads, which is particularly helpful for fact-checkers and academics: it shows who has advertised what, and which population was targeted. This is a great achievement.

Yet other areas remain patchy and uneven across platforms. Take fake and malicious accounts: YouTube reports having removed an impressive number of them in February, more than 600 000, but this is a worldwide figure and does not differentiate between political disinformation and commercial scams. But at least they seem to be acting.

Facebook, for its part, only reports on this issue on a quarterly basis, so we cannot judge how the trend is developing, but they say they have shut down three networks in the UK, Romania and Moldova.

Twitter, on the other hand, did not report anything.

Beyond the Code of Practice, the Action Plan also seeks to ensure better coordination between Member States. A specific rapid alert system has just been set up between the EU institutions and Member States to facilitate the sharing of data and the analysis of disinformation campaigns, and to report potential threats in real time.

On our side of this rapid alert system, which is managed by High Representative Mogherini, we are mobilising more resources by increasing our strategic communication budget for countering disinformation from EUR 1.9 million in 2018 to EUR 5 million in 2019.

This overview of disinformation would not be comprehensive without a few words about the set of concrete measures to address potential threats to elections that we adopted last September. Under this package, election cooperation networks were set up to quickly detect potential threats, exchange information and ensure a swift and well-coordinated response. Furthermore, the Commission recommended greater transparency in online political advertisements and in the targeting of such ads, for instance by disclosing which party or political support group is behind individual political ads.

This is not all. The Commission is also supporting the creation of a network of independent fact-checkers and researchers to detect and expose disinformation campaigns across social networks.

Finally, we are paying particular attention to digital education actions that strengthen the resilience of our societies by equipping citizens with the critical and digital skills needed to analyse media. To this effect, this week is European Media Literacy Week! Just yesterday, I had the honour of awarding the most innovative, the most European and the most educational projects, and more than 200 projects are taking place right now across Europe.

In conclusion, I think one cannot stress enough the importance of privacy and data protection, not just for creating trust in the Digital Single Market, but also for creating and maintaining trust in our democratic processes.

Political and government organisations increasingly use personal data, sophisticated profiling techniques and big data analytics to monitor and target voters and opinion leaders on social media. They send highly personalised messages to groups of people based on their particular interests, lifestyle and values. This targeting relies on the complex online advertising ecosystem.

Indeed, the same processes used to sell us shoes and cars are used to influence our political views. While some of these political uses appear legitimate, the processing of data for political purposes may also pose serious risks, not only to our privacy but also to trust in the integrity of the democratic process.

The processing of personal data, given its impact on society as a whole, should be transparent, fair and lawful. Individuals should understand why they are receiving targeted messages, and should also know who is attempting to influence them. They should also be able fully to exercise their rights when it comes to the data concerning them, including accessing the profile that an organisation has built about them. The Cambridge Analytica scandal has illustrated how the breach of the right to protection of personal data could affect other fundamental rights, such as freedom of expression and freedom to hold opinions, and the possibility to think freely without manipulation.

The GDPR and the e-Privacy Regulation together bring a set of rules that protects the privacy and individual autonomy of all Europeans from any form of disinformation, propaganda and unfair attempts at persuasion. The protection applies irrespective of whether our data is processed by private or public entities, and of whether this is done for economic or political gain.

Data protection and privacy laws must allow businesses as well as political organisations opportunities to take advantage of new innovative ways of processing personal data. Businesses must be able to continue to innovate and offer new services to customers. Similarly, political and government organisations should be able to communicate and engage with citizens in new and innovative ways.

Trust must, however, be earned. This means giving citizens transparency and control over the processing of their personal data when it is used to deliver a service. Data protection and privacy laws, when properly implemented, play a key role in ensuring that cases where trust is misused are few and far between.

Thank you very much.

SPEECH/19/1789

 
