Enforcing Data Privacy: A Conversation With Giovanni Buttarelli

Tuesday, November 20, 2018
Yves Herman/Reuters
Speaker
Giovanni Buttarelli

European Data Protection Supervisor, European Commission

Presider
Nicholas Thompson

Editor in Chief, Wired

The General Data Protection Regulation (GDPR) went into effect in the European Union in May 2018, providing more protection for consumers’ personal data on the internet. Giovanni Buttarelli discusses the challenges of implementing and enforcing privacy regulations, market impacts of the GDPR, and the ethical debate over protecting personal data.

THOMPSON: Good morning. Welcome to today’s Council on Foreign Relations meeting with Giovanni Buttarelli. A great honor to be here with him.

I’m Nicholas Thompson. I’m the editor in chief of Wired. Let’s get going. Good morning. Oof. (Laughter.)

All right. My first question.

BUTTARELLI: I’m back. It was the 10th of March, 2015, when I was in D.C. And so—

THOMPSON: Since the last time you were here?

BUTTARELLI: Yes, at your premises in D.C. So I’m so proud to be here, and I’ll try to come back to that later—thank you for hosting—and I would be proud to go back to the strategy I presented three years and a month ago to see where we are now.

THOMPSON: Well, you’ve gone pretty far.

BUTTARELLI: At that time I was not called Mr. GDPR. I don’t know in which sense this is a compliment. (Laughter.)

THOMPSON: It wouldn’t have made sense to anybody. (Laughs.)

BUTTARELLI: As I’ve noted—(inaudible)—this is a compliment or not.

THOMPSON: (Laughs.) All right. Well, you’ve been working on privacy—data privacy—for twenty-three years, which is longer than many of us have used the internet. You rolled out the big law six months ago. Let’s start with this: tell me, what is the most surprising thing you’ve learned in the last six months about the way data privacy works?

BUTTARELLI: That things are going as expected. So—

THOMPSON: What does that mean?

BUTTARELLI: I was expecting a little bit of a black swan or some accidents, but actually we succeeded as European Data Protection Supervisor in getting the ball rolling from day one—midnight on the 25th—with all the relevant rules and procedures in place, the platform security. So when I started in 2009 in Brussels, we were a couple of people in a room. Now we have one, let’s say, task force which is composed of twenty people. And all the rules are in place.

So months after, people are asking, where are the fines? Austria started applying fines, Portugal with regard to a hospital, and the ICO as a follow-up to the Cambridge Analytica scandal, but focusing also on political movements. Of course, sanctions can only be applied under the GDPR to breaches from the 25th of May onward, while business continues as usual with regard to the previous legal framework.

I think we are all committed to speak with one voice. I see no tensions between data protection authorities. Some inconsistencies at national level because of the (homeworks ?) by national legislators. Not all are good guys. Romania is still late. And others, particularly Spain for instance, need to do it again.

And so I consider these last six months as an unavoidable test to see to which extent consistency mechanisms and the one-stop-shop pre-litigations can be—can be—(inaudible). What I see is that some companies are late in identifying their main establishment, and this is a novelty, since it’s relevant to identify which, let’s say, contact point would act as the lead authority. And therefore, there are important consequences for data subjects.

The last one is that, regrettably, we will not succeed in passing the e-privacy regulation to specify and complement the GDPR. While we are on the right track on many other important dossiers—e-evidence, which is in parallel with the CLOUD Act; the election package, which is extremely relevant for the EU. This is why data protection is on top of the political agenda. We have elections in May in thirteen member states, political elections plus, of course, (the renewal of ?) the European Parliament. And therefore, my opinion on online manipulation, fake news, and fairness during political campaigns is on top of the relevant discussions.

Some important initiatives also have been passed: the free flow of data, for instance, regulation; copyright reform; some other important initiatives on business-to-consumer, business-to-business.

So we are looking now to the big dossier, which is Brexit. It affects data protection a little bit. I continue to bet that something surprising will happen before the 29th of March. Many options are still available, although we have a withdrawal agreement, and it contains a few provisions on privacy.

THOMPSON: Well, data protection is complicated enough. If we get into Brexit, this meeting’s toast.

All right. So, if I heard you correctly, we’re going to have the first fines soon. Are we going to have the first bans soon?

BUTTARELLI: True.

THOMPSON: Yes, OK. Oh, I want to talk about some of the critiques of GDPR. I think that would be the best way to spend the next few minutes.

BUTTARELLI: Yes, please. (Laughs.)

THOMPSON: So one of the first critiques of GDPR is that it’s extremely hard to comply with—very hard to hire the lawyers to make sure that you’ve done everything GDPR requires you to do—and that only big companies can do that; and, in fact, GDPR is inadvertently anticompetitive. For example, if I wanted to start a new search engine or a new social media platform, how am I going to hire all of the lawyers it takes to do that? And in fact, I’ve heard that from some of the people at the big tech companies. So let’s start with that one. Right or wrong?

BUTTARELLI: I’m not saying this is fake news, and I pay—(laughter)—a lot of attention to this concern. It’s on top of my priorities since, to me, GDPR is not one-size-fits-all. So scalability is an implicit principle. And therefore, in dealing with privacy impact assessments, the design of notice-and-choice approaches, the identification of different legal requirements as legal basis, you have a lot of options.

Data protection was and still remains complicated. So if you are not so expert, you suffer a little bit. But there is also a lot of continuity. So some of the concerns by small and medium enterprises actually are related to the previous legal framework. So if you see the list of legal requirements—so it’s business as usual—I think for small and medium enterprises, startups, to implement privacy by design and privacy by default is much easier than expected.

Anyway, I recommended to the legislator, to data controllers, to my colleagues not to have a conservative approach—I continue to say data protection should become digital. I would like to focus more on effective safeguards, not on formal requirements. We need to de-bureaucratize this procedure as much as possible. And I’m also speaking with consultants and law firms to refrain from using a lot of Pater Noster, Ave Gloria, where there are endless documents with “whereas,” “whereas,” “whereas,” but then at the end, where is the analysis of the relevant risks?

I’m a member of the judiciary, though detached to the EU institutions. And being in front of, let’s say, a case about a data breach, my first question would be not what you did—so, tell me to which extent you comply with Article A, B, C. I would ask the data controller or processor: Do you have a policy on privacy? How creative are you in, let’s say, developing the principles into your scheme? How much do you have an overview and a helicopter view, a long-term experience? And although perhaps you are late in introducing certain improvements, perhaps you are on the right track. And to me, this answer would be key.

So I see some panic modes, but also, let’s say, a reasonable start.

THOMPSON: Can I go back to one thing in your answer—there were many things in your answer that struck me. One that particularly struck me was your notion about small and medium companies and privacy by design. So privacy by design, as I understand it, is in GDPR. If you start a company that collects data, you have to think about privacy from the beginning, you have to think about it in the middle, you have to think about it at the end. That’s easy for a startup, right—they got the law: privacy by design, I’m starting a new company, I’m in. But for a medium-sized company that’s been working for a while, figuring out how the rules of privacy by design apply to you, it’s kind of hard, isn’t it?

BUTTARELLI: Yes and no. I mean, what is essential is that data protection, which is not privacy, is better embedded into the day-to-day management. So yesterday at the dinner someone was speaking about the complete separation of, let’s say, activities in a big company in the U.S. where lawyers are, let’s say, excluded from the engineer building. So they look to the legal dimension when it’s a done deal. This is exactly the opposite of what we are looking for.

I noted on the midnight on May—between May 24 and May 25, a tsunami of notices were—

THOMPSON: (Laughs.) I’m still getting them.

BUTTARELLI: Yeah. From what I call very small enterprises, and they were useless. They didn’t change their business model, so nothing new was requested, particularly consent. But what I also noted is an entirely different approach—and I’m now referring to big companies here in the U.S.—where we feel a little bit betrayed in terms of the spirit of the GDPR, because these legalese notices are designed to protect the data controller. They’re using a language which is incompatible with the rules on the clarity of notices to data subjects. And they depart from privacy by design and privacy by default, to go to your question, since they say, OK, consent is now to be really freely given, can be withdrawn, and therefore, OK, certain services—could be facial recognition or fingerprints or the sharing of information—are not anymore presented as an option subject to consent; they are presented in a take-it-or-leave-it approach. So: this is our business model—would you like to be my customer? This is what I offer. Otherwise, please choose another social network, another platform. This kind of business model is currently under our analysis.

THOMPSON: Right, understood. All right. Let me go to another critique, which is one of the—I think probably the most important one. And that is the U.S. and China are heading into a—in the middle of some kind of very complicated tech competition, and at stake is foreign policy influence, who will build the 5G networks of the world, all kinds of things. And there is a belief that having all of this regulation in Europe, which in ways spreads to America as well, is hamstringing the American companies. China doesn’t have nearly the same concerns about privacy that we have in the United States, certainly not the concerns that are held in Europe, and hence is able to centralize much more data. And when you’re building an AI system to monitor your citizens, if you have lots of data you can build a much more efficient system. And so the argument would be in the West we are tying rope around our ankles and around our hands as we head into this competition of artificial intelligence by limiting the data that companies have to analyze and build products with.

BUTTARELLI: Much more than you may imagine, China is looking to GDPR. So they are harassing me to visit China—(laughter)—and I refuse because of the international conference I hosted in October. If I—

THOMPSON: If I could interview you in China, that would be my dream.

BUTTARELLI: Yeah. I counted twelve weeks more to fully recover, and then I promised to travel to China in between March and April. And I know what they are looking for. I think they will have a problem in terms of softer language one day to make their services fully interoperable with the rest of the world.

THOMPSON: That’s interesting.

BUTTARELLI: So I was told that the way in which they design—they already design, although it’s (sparkling ?) and subject to limited developments—may create a problem in offering goods and services in the EU and in profiling people. And as you know, the GDPR is applicable to the extent that, even from China, you operate in the EU offering goods and services. So question number one: of course, the Chinese market will be (sparkling ?) a few years from now and may even prevail against the U.S. Silicon Valley—to which extent are they in a position to be really, let’s say, dominant in the EU?

But to answer your question, I would also say that now the emphasis is on GDPR. So Europeans are so proud to say we like to lead by example. The Council of Europe has modernized Convention 108, which is open to ratification by many non-EU or non-European countries. But a big change is now, for me, around two things.

First of all, the debate regulation versus self-regulation is over. And therefore, now we have 128 countries in the world with data protection laws, seventy-one out of which are outside Europe geographically—so I’m not speaking only about the European Union; I’m also considering in Europe the northern countries, Russia, Israel. Seventy-one countries outside this region means that data protection is now a reality everywhere, including Brazil.

I was invited to Japan in 2014—one thousand four hundred people from every kind of public and private sector discussing what they should do to be GDPR-like or GDPR-lite. And I was impressed, because they had nothing at that time. They had a DPA composed of three people, named as the specific data protection authority. I’ve never heard of something like that. Now they have a big task force. They amended their national law twice. They are deeply committed to establishing the biggest single market area in the world.

So, as you know, we are discussing the trade agreement with Japan. Europe has been successful in establishing the principle that data protection should not be regulated within trade agreements, so agreements are to be linked with the GDPR. And therefore, in return, we offered Japan the chance to consider Europe in terms of an adequacy finding. So Japan will be the first country in the world assessing the adequacy of the European Union legal framework on privacy.

I see there are some friends here at the tables assessing our adequacy in terms of surveillance, for instance, by intelligence in the EU, and I see their point. But now this exercise will be made with regard to the private sector.

So I’m expecting the trade agreement with Japan to be passed in early spring, and South Korea will come soon after. This will be extremely relevant for the future of the Privacy Shield.

So U.S. and EU continue to be, let’s say, strategic partners. We have many developments. And therefore, I guess that this would also influence a little bit the debate at the U.S. federal level on—

THOMPSON: Well, you already have. I mean, we’ve got our California laws. We’ve got privacy laws kicking around in other states.

BUTTARELLI: Yeah, not only because of the GDPR.

THOMPSON: But there will certainly be no privacy regulation passed in our Congress in the next few years, given recent elections. I mean, you can’t imagine the Congress and the—but let’s leave that aside, because if we get into that, it’s as messy as Brexit.

OK. One more question before we move to audience questions, which is as the world all goes towards privacy, as you’ve just laid out—and even China, absolutely, is building better privacy into their systems—there are tradeoffs, right? Privacy isn’t free. So one of the tradeoffs that interests me is the tradeoff between privacy and safety. And as you increase privacy, in some ways you decrease safety because a platform no longer has the data to identify whether there are terrorists organizing, or whether someone is coming to the Council on Foreign Relations to do something damaging, or whether a kid—if you can’t analyze all their data—is heading towards self-harm or suicide. So how do you think about the tradeoffs between privacy and safety as we move to a world in which privacy is given much more value?

BUTTARELLI: The mantra is everywhere the same, so the two values are not incompatible. Some of you are familiar with my background, which was related to organized crime, anti-mafia legislation, intelligence. (I don’t ?) know why I moved to data protection then.

But to me, law enforcement bodies and intelligence services are now to be equipped with modern tools. They should be invasive where necessary, including intelligence services. What is essential is transparency in what they can do, assessing proportionality, selectiveness. So in my view, they can even commit a crime where necessary to preserve and defend certain central values.

The question is that, as the Court of Justice and the Council of Europe’s European Court of Human Rights said, what they do should not be subject to a discretionary power but to a framework with a little bit of flexibility, where in advance you know what they can do.

But I know—(inaudible)—police forces, the judiciary—they think that it is essential to have an approach where data are collected as much as possible. So there is an appetite in terms of public big data to be used for enforcement. So mandatory data retention of all traffic data, for example GPS, and now they are following the Internet of Things. And they believe that if you have nothing to hide, it is better to collect as much data as possible, just in case. The Court of Human Rights said that the just-in-case approach—meaning you collect and retain, under secure measures, every kind of data to be used where indispensable—is incompatible with EU values.

So to me, the European Union passenger name record is unconstitutional, because there is not only mandatory data retention—so you retain the data that someone traveled to Boston or took a domestic flight in the EU—but you start profiling people, analyzing to which extent this trip can be considered risky. So everybody is considered a potential suspect. And this is different. I think there is another way to deal with big data more intelligently.

THOMPSON: All right. Let’s go to questions. Standard Council on Foreign Relations policies. Here you go, right up here.

Q: Hi. My name is Alex Yergin. I work at a company called Datawallet. It’s a consumer to business data exchange.

I have a question on your point about AI and, sort of, data ownership. As a way to counter this world where, yes, there are emerging AI superpowers where people have no data rights, how much do you see, as enabled by GDPR, giving people ownership of their data, and then allowing them, potentially for compensation, to permission it to companies for use? Would that sort of solve that problem for free societies?

BUTTARELLI: Good point. I published yesterday a blog focusing on our latest input on AI. Let me say that for years everybody was speaking about privacy. Then we moved to data protection. More recently, everybody is speaking about big data. Now it seems that everything is artificial intelligence. So we need to focus on what is real AI. The three documents I recommended in the blog, which is publicly available on my website, relate to the background document I produced in 2016 for the international conference in Marrakesh; the resolution we adopted—236 regulators from eighty-one countries—in, let’s say, a proactive way, focusing all efforts to boost the development of AI; and a document adopted by Public Voice, to which I took the liberty to contribute with my signature, where there are some other concerns. And finally, I also recommended a document by SIPOL (ph) that has interesting, interesting points.

So GDPR doesn’t speak exactly about artificial intelligence. You will fail in finding anything relevant. Two days ago, the Austrian presidency of the Council of the European Union failed in finding a compromise on the e-privacy regulation. Why? Since Germany and France consider that there are problems from an AI viewpoint, in effect. So I understand that perhaps we are approaching elections, and therefore this is more than natural. So if you ask me, with regard to both big data and artificial intelligence, to which extent GDPR may be an obstacle, I will be very optimistic. On the contrary, from our viewpoint, some principles—such as purpose limitation, transparency, fairness, ownership, controllership, and even the notion of data subject, because we are grouped increasingly into different clusters—are a little bit under challenge now.

So this is why we need to make, as I said, data protection digital. We need to make these principles more effective in practice. Although this is not a Chatham House event, let me say that GDPR is not the one of my dreams. I’m called Mr. GDPR, but I did my best to simplify it and to—(inaudible)—in the best way. It’s the best piece of legislation we could achieve, much better than many other GDPR-lites in the world. And the big success is that now twenty-five out of forty-seven provisions in the original proposal of the European Commission speaking about additional rules—could you imagine forty-seven other sets of rules on data protection in the EU? this would be good for lawyers and law firms—are now replaced by flexible guidelines. And therefore, I think artificial intelligence could be one of these areas. So I’m confident.

THOMPSON: But we can’t let you go there. What are the other things you don’t love about GDPR, Mr. GDPR? (Laughter.)

BUTTARELLI: Well, they have been shy in reducing the kind of formalities. And there are some exercises, such as those on privacy impact assessments and notifications, where we run the risk of being lost. And the establishment of the board is currently the best achievement we could have. We had a difficult debate between proximity—so, proximity of data subjects and data protection authorities—and centralization. So in May, I will publish my manifesto for the years to come. GDPR will not be profoundly amended for ten, up to fifteen years, which is more than a century compared to the existing directive.

Discussion on substantive changes will not start before ten years from now. But we need to start with an exercise where, in return for the extended scope of application to everybody in the world, we need to speak more with one voice. So it’s time to introduce a truly European board, with every national DPA. And my solution can be the center—something like the European Central Bank approach. I am in the U.S.; I can even say something similar to the FTC. But then I would be criticized by my colleagues living in the EU. (Laughter.)

THOMPSON: All right. In the back there.

Q: Very good to see you, Dr. Buttarelli.

I’m a member of the European Union high-level group fighting disinformation. And of course, when we’re meeting with our different colleagues from other European countries, the issue of data protection always pops up. Does it help the tackling of disinformation, or does it actually make disinformation more difficult to tackle? And also, when GDPR first came into effect, there were worries about international crime and organized crime. Did you see any evidence of that or not? Thank you.

BUTTARELLI: Not at all. I met the director of the European Union FBI, Europol. And my institution is the new supervisor of this police force. And because of the new specific regulation, I didn’t see any, let’s say, adverse effects. The same with regard to Eurojust, which we will supervise soon, and the European Public Prosecutor’s Office, competent for anti-fraud activities against the interests of the EU. And I will be meeting the director of OLAF, which is the operational arm on anti-fraud, in a few days from now. And so I didn’t receive any specific concern.

What I see is that there are many unjustified concerns. And the lack of cooperation depends on something else: competition between different police forces. When Europol was established, it was a (cathedral ?) in the desert, in The Hague. So everything was ready, but not the data, because police forces used to exchange data via Interpol. Why? Because via Interpol, I do you a favor today and I’m expecting something back one day. In Europol, OK, I give all my data and then you get the benefit of using the same information. So it’s a change of culture.

There are many obstacles from a criminal procedure viewpoint which are connected to data protection. But they are not—there are many mutual legal assistance treaties saying that data exchanged between certain parties cannot be used for other purposes. But this is not for data protection; this is for certain safeguards from a criminal code viewpoint. So what I say is that Europe should do more from a national security viewpoint. As you know, the EU is a separate entity compared to national member states. The EU is not competent for national security unless you, as a member state, in developing your national security approach, are acting in breach of the values—for instance, those of the (charter ?). But we have a fragmented approach. And here we have an area of concern. If Brexit occurs—the hard Brexit—we will have an interesting exercise on the adequacy finding of the U.K. system, since in this case, as happened for Privacy Shield and Safe Harbor, we need to assess the necessity and proportionality of their surveillance system.

THOMPSON: Over here, right at the front.

Q: Ricardo Tavares from Techpolis.

Cars are becoming completely digital machines and producing an enormous amount of data—data about performance, but also data about how people move. An interesting aspect is agricultural machines, because they are moving a little bit faster, because their risk is lower. With agricultural machines, some of the manufacturers are controlling the data. So they don’t allow systems integrators to access it, and sometimes they don’t allow even the owner, unless the purchase agreement included data sharing. So how will the GDPR deal with an issue like that? Because it’s an issue of who owns the data, right?

BUTTARELLI: GDPR is technologically neutral and future-proof. So you will never find any specific provision dealing with connected automated vehicles. Then there are technologies, platforms, scientific and technological developments where we—data protection regulators—arrive when it’s too late; this was the case with social networks, for instance. While there are other happy-ending stories where we intervened at the right time. Drones have been one of these examples. And connected automated vehicles are the second one. Last year, in Hong Kong at the privacy conference of all the data protection commissioners, we adopted a wonderful resolution unanimously, I think—only the FTC abstained, but for a question of a procedural viewpoint. But you will find a lot of creative input to be tailored depending on the system.

And of course, integrity—in the sharing of data—is on top of our concerns. And the resolution is not drafted in a prescriptive way. So it depends on the scheme. And therefore people are encouraged to reflect on where the data are to be located in terms of centralization, decentralization, opt-in, opt-out, privacy by default, privacy by design—where, because of security needs or ethical principles, the public concern prevails—and which kind of personal data—(inaudible)—information or metadata are to be collected, for instance, for the public good. And some ethical principles, because as you know some of the, let’s say, settings are not only related to legal issues, no?

THOMPSON: Back there.

Q: Angela Sun from Western Union.

Can you share your perspective on how to change the conversation among companies for data protection from a more risk and compliance-driven discussion to a broader business and opportunity-driven conversation?

BUTTARELLI: Well, I continue to say that data protection is an asset, and it’s also a big business opportunity in terms of trust and confidence. So we can go back to the debate on e-commerce. E-commerce was considered risky because of data protection at that time. Now we know why it didn’t develop in that way. I think Cambridge Analytica and similar cases—Cambridge Analytica is only the tip of the iceberg—demonstrate that trust and confidence are essential. We have concerns about the predominant business model. So discussions are taking place to see which legitimate model can be developed to, let’s say, offer more space for competition from a privacy viewpoint.

But there are opportunities from a consultancy viewpoint: in terms of accreditation, certification, seals, new professions—particularly those concerning data protection officers—communication campaigns, the availability of data subjects to share their information, the experience of data (vaults ?) described by Tim Berners-Lee in Brussels, where companies benefit from data validated in real time by data subjects, simplifying a lot of what they’re doing in terms of notice and choice. It’s another technical approach, where you switch on, you switch off the data, which remain at the disposal of the data subject, who decides what can be used for business purposes.

What I would recommend is to profit from the new definition of (pseudonymous ?) information. So do you really need to have the full name of the relevant data subjects? Are you in a position to, let’s say, perform your marketing campaigns, to single out people, without, let’s say, approaching them? So scalability also relates to the kind of information you are processing. It seems not yet a reality, but perhaps it may be relevant, since there are facilities in terms of transfer of data, for instance, or sharing of information, provided that you satisfy certain criteria. So I would appreciate it if companies invest a lot and become partners in the accreditation/certification system, which may help in having external audits, and de-bureaucratizing, and simplifying.

THOMPSON: Oh my gosh. I have, like, six follow ups to that answer, but there are nine questions here in the audience. So, OK, let’s go to the middle. And remember this—

BUTTARELLI: I can speak to you this evening. Don’t worry. I have a flight at 9:00. (Laughter.)

THOMPSON: Great. We’ll be here all day. And this meeting is not GDPR compliant, so please state your name and your organization. Right here in the center.

Q: Thank you. Hi. I’m Jason Kint with DCN.

I guess my question is, so last week there was a ruling in France, French DPA, and it seemed to have gone unnoticed here in the U.S.

BUTTARELLI: About consent.

Q: Yes. I haven’t seen press here in the U.S. about it for the most part. But it would seem to have some implications on the entire digital advertising market and the way it works, and maybe even one of the large platforms. Can you—should we be paying more attention to that ruling? Or are we making too much of it?

BUTTARELLI: May I contact you after the fifth of December? (Laughs.)

THOMPSON: What happens on the fifth of December?

BUTTARELLI: We have a plenary session to discuss the French case.

THOMPSON: Ah. Well, you’re sitting next to the press, Jason, so you can talk to Sue (sp) about it.

Who has another question? Right here in the front table.

Q: Alan Raul, Sidley Austin.

Supervisor Buttarelli, you mentioned national security and the lack of competence of the EU on that topic, which inheres in the member states. But you talked about the e-evidence regulation that’s been proposed and the counterpart in the United States, the CLOUD Act. I think you said it was proceeding in parallel. And you mentioned that if Brexit occurs, there would need to be a review of the British privacy system for adequacy, in light of its rather formidable surveillance regime. But the e-evidence regulation is really quite aggressive in Europe as well. And if you think that governments in Europe are going to want to access digital information in the United States under the CLOUD Act, which is an option, do you think that the U.S. and Europe are going to get closer on electronic surveillance and privacy, and abate some of the tensions that arose after Snowden, which really, I think, started the war on data between the United States and Europe?

BUTTARELLI: We have the e-evidence proposal because of the Microsoft case, and because of the CLOUD Act. I support the initiative, which is likely to be passed by 2020. So they’re not in a position to have a deal before February. Everyone is disappearing in Brussels in a few weeks because of the elections. There are some, let’s say, technical discussions. And the aim is to ensure as much as possible, as you say, synergy and consistency between the two. So I think Europe needs an initiative. There is an agreement that Articles 42, 43, 44 of the GDPR should remain untouched.

So anytime there is a third-country decision to provide access to certain data, this is to be in some way validated by a national DPA or a national court in the EU, provided there are no agreements, such as bilateral agreements or mutual legal assistance treaties, or in case you cannot follow the council procedure. So it will take more than expected. We were pushing for a deal before February, but regrettably this is not anymore on the priority list, because of the complicated debate on certain details that go beyond data protection.

THOMPSON: OK. Go ahead.

Q: (Off mic)—Techonomy.

You know, Brexit is one form of split in the EU. But there seems to be a developing split even around the very concept of democracy and freedom with Orbán and an orbit around him beginning to really argue that—in a way that I would suspect would affect the very notion of privacy. And you are arguing against the idea of preemptive data collection for sort of potential future use. But how do you think the evolving politics of the EU itself could affect the notion of privacy in Europe, which from here seems like a worrisome set of developments?

BUTTARELLI: Well, not only Orbán, but also Poland, or you can even say Italy now, yes, are interesting countries to be better analyzed. There’s a big debate in between nationalism and the EU. I don’t have a crystal ball, but I expect that the power of the bigger political parties, the center democrats—the social democrats and the liberals will be a little bit reduced. But perhaps the tsunami—the anti-EU tsunami will be, let’s say, smaller than expected. In between now and May, many things may happen.

In any event, I don’t see an anti-privacy, let’s say, approach by these countries. On the contrary, what I see is that from France and Germany in particular there is a concern that the technological gap between the EU and U.S. needs to be rebalanced. And therefore, we continue to hear a sort of call from the jungle that the GDPR may hurt the European Union market, European Union businesses, against the over-the-top players. So we have to clarify. I have to appear before the different fora and demonstrate that this is another piece of fake news.

In terms of fairness and transparency, the electoral package to be passed before February will have a lot. It will introduce fast-track procedures in terms of cooperation by national DPAs. For instance, imagine the investigation by the U.K. ICO and the body to be reinforced in Brussels to apply fines where necessary, even to political parties. So, on transparency in the use of social networks, we need, let’s say, full transparency about what they are doing. Intensive use of the data, yes. Some facilities, such as those regulated in the Spanish law, but within the umbrella of the GDPR. So to answer your question, I see some possible counter-effect on the e-privacy regulation, but not on the GDPR.

THOMPSON: Here on the left.

Q: Hi. Stuart Levi from Skadden Arps.

You talked earlier about the GDPR being technology agnostic, but also talked about, you know, missing a couple of trends. I was wondering if you could comment on blockchain and the GDPR. I know the CNIL came out with its report recently. The EU Observatory had its own report. But do you think that that’s an area that needs to be addressed, you know, in a more sort of EU-wide level in terms of how the GDPR and blockchain will interrelate? Because the committee report had a lot of—you know, there’s a lot of good questions out there, but not a lot of good answers.

BUTTARELLI: It’s an area where I’m investing a lot of energy, differently from some of my colleagues. They consider that this is not a reality in their country, that this is not a priority. Blockchain is a technology; it is difficult to say to what extent it will have a big future. As such, it’s not incompatible with some data protection principles. It may help in terms of integrity, documentation of certain transactions. The question is how the technology is implemented in practice, and in which areas. So if you are looking at e-procurement, the public sector, perhaps you will have minor, let’s say, effects. It may trigger discussion around the purpose limitation principle, according to which, as you know, data are to be collected and kept in an identifiable manner only to the extent this is essential to achieve the original purposes, or other purposes compatible with the original one. While blockchain is designed to last forever.

But data protection is now a consolidated legal framework, yet it is not forever. So I cannot exclude that, because of prevailing needs around blockchain, we may even amend certain data protection principles in the long term. You may have some practical consequences in terms of controllership. Who is the—not the owner—but the entity in charge, the counterpart of data subjects for access to the data, provided that certain data are encrypted. But all in all, these are, let’s say, modalities, specificities. I don’t see any fundamental, let’s say, negative trend.

THOMPSON: Have you been asked to investigate any potential GDPR violations by blockchain companies so far?

BUTTARELLI: No. But we are deeply involved in many discussions. Blockchain in the EU is still in its infancy.

THOMPSON: We have time for one more question. In the back.

Q: Hi. Natasha Singer from The New York Times.

We’re starting to see social networks in the U.S. perform health scans of people’s posts, particularly—I’m sure you know about the Facebook suicide AI that scans every post and then scores it for suicidal thoughts. And we’re also seeing government agencies interested in scanning social media for the same thing in North America. So I’m interested under GDPR how that would be viewed, both by private companies and by government agencies. There’s no consent. You neither can consent to have it done nor can you opt out.

BUTTARELLI: It depends on what you mean by scanning, and depending by whom. So where applied to public entities, you need to have a legal basis. And therefore the discussion is about transparency, but largely necessity and proportionality. And the legal system for access by law enforcement bodies may vary depending on the country. For instance, in my country of origin, Italy, police forces do not have any kind of access, since the confidentiality of communications is protected at a higher level than, let’s say, your physical freedom. While in other countries, secret services or other law enforcement bodies may have reasonable access to the data.

As applied to companies, I go back to my comments on the tsunami of notices. I noted exactly a take-it-or-leave-it approach in some of the examples you mentioned, as I shared in an interview with The New York Times weeks ago, and they are under investigation now.

THOMPSON: All right. Thank you very much. It’s 9:00. We all have to go back to work. Thank you so much. It was a great honor to have you.

BUTTARELLI: Thank you so much. (Applause.)

THOMPSON: It was wonderful. Thank you.

(END)
