Will the GDPR serve its purpose? Three difficult lessons after its first year

Article
21.05.2019
11 min. read

Legal regulations take shape through practice. The GDPR - as a legal document - was born in 2016. After two years of incubation it was thrown into the market. A year later, many are curious to see whether it can swim. But expectations were higher: we wanted it to change the whole ecosystem, to change the distribution of power over data. Are these hopes lost already?

Since I was involved in the European data protection reform from its very beginning - from the moment Viviane Reding announced her vision and opened consultations in 2012, through all the smaller and bigger fights in the Parliamentary committees and the challenging trilogue - I share a sense of responsibility for this brainchild. Have we set it up for success or failure? Will it enable a more sustainable, human-centred digital environment? Can we ensure its coherent enforcement not only throughout the EU but also beyond, as it will have effects on other continents? How many years should we wait before making definite judgements?

I don’t have all these answers yet, not even for myself. One year is certainly not enough to judge the value of a reform that took six years to prepare and is here to stay for a couple of decades. But it is enough to show us where the obstacles are or, if you like, what needs to be fixed if we want to see the GDPR develop to its full potential in the near future. In my own practice of using and testing the GDPR over the last 12 months I have learnt three lessons, which I am happy to share.

Users give up because they see no alternative

It is not true that people don’t care about their data. It was not true back in 2011 either, when Eurobarometer showed that 70% of Europeans were concerned about how companies use their data and felt they had only partial, if any, control over it. Back then 74% wanted to give their specific consent before their data was collected and processed on the internet. In 2019 many are frustrated with pop-ups that prompt them to give consent to something they don’t really understand. But if they have stopped reading information clauses and moved on to clicking these pop-ups away, it is a sign of resignation rather than acceptance. One cannot accept something one does not comprehend. And the complexity of the data flows involved in an average internet service far exceeds the average user’s capacity to follow, understand and control them.

“Informed consent” is a fallacy when we look at the complexity of data flows and the power of behavioural analysis.

Still, the number of users across the EU who decided to use their rights under the GDPR - in particular the right to access and verify their own data - has probably exceeded 100,000 so far. Just one tool meant to support such requests, provided by Bits of Freedom, attracted 17,000 active users in only seven months. Internet service providers that I spoke to in the first months of the GDPR’s application admitted to being flooded with requests numbering in the thousands (per company). On the other hand, based on data collected by EDRi, no more than 0.05% of citizens across the EU have filed a complaint under the GDPR. Should we then say that - as Europeans - we are not concerned with data protection, or have no reason to complain? I don’t think that would be a fair conclusion. We have to see this individual effort - or the lack of it - in the right perspective, which is the standard of GDPR interpretation adopted by major commercial players.

Even with the GDPR in place it still takes a lot of clicking to refuse consent to targeted advertising, while in some cases you may give in without even noticing it (for example by closing a pop-up!). The Norwegian Consumer Council, in its report, documented the “dark patterns” applied by the biggest online companies in the design of their services to make sure that users “choose” less privacy-friendly options and share more data than is necessary to run the service. But it is not just a matter of user experience and web design. Above all, it is a matter of the narrative we have been told over and over for the last 20 years: the free internet will not work without your data! Either you accept this transaction or we will have to close the net. Choose!

Users have been convinced that the internet cannot survive without tracking and without letting third parties use their data. This narrative must change if we want to talk about free choice.

There is no space for choice in this blackmail. Other business models are possible, but the companies that asserted their dominance based on user data have no interest in developing them. Users have this interest, but - in order to push the market in this direction - they would first need to understand what is wrong with the current model: that they are not “users of free services” but merely fuel for the development of AI and B2B services based on prediction (including advertising). In this market, the stake is influence. Their data is used to build statistical models and train algorithms designed to change or control their behaviour.

The GDPR, with respect for human dignity and autonomy as its cornerstone, could be a great legal tool to fight the current model, but the battle has to start with promoting a new narrative. First, we need to reinvent the concept of a “user” to mean a person who has a voice and rights vis-à-vis service providers, but also the willingness to pay a fair price for the service - a price that will not be calculated in data. Draft guidelines from the European Data Protection Board already show the way by defining what data can be deemed necessary for a specific service. While users need to wake up, data protection authorities need to fight cookie walls and other “forced transactions” involving data. The premise has already been established: personal data cannot be treated as a tradable commodity. Full stop.

Cooperation is difficult. But it is a must!

European regulation needs European institutions to shape its interpretation and control its enforcement. And we have them, on paper. The GDPR makes it very clear that cooperation between national DPAs is a must and creates many ways in which it can be facilitated or, if necessary, enforced. It is not up to lead authorities to decide whether they need to engage other DPAs. If a “significant number of data subjects” from a certain country is “likely to be substantially affected by processing operations”, the national supervisory authority has the right to participate in joint operations. Cases considered to have an impact in more than one member state may be referred to the EDPB. But there is no need to wait for a case to come up: the EDPB can also issue opinions on any matter of general application of the GDPR or with effects in more than one member state. Any DPA, as well as the Chair of the EDPB, can start such a procedure.

On paper it does look pretty straightforward. In real life, it will take a lot of red tape and inside games before these mechanisms come alive. We hear inside stories about the first proceedings that are, apparently, pending before the EDPB. We hear about the attempts of various DPAs to get involved in cases that affect citizens under their jurisdictions. However, not a single opinion on a matter of general application has been issued so far, and no joint operations are formally running. The first Google case was resolved by the French authority CNIL without even notifying other supervisory bodies. Now the spotlight is on Ireland, as the lead authority for the cases brought against Facebook.

Data protection authorities need to learn how to cooperate while maintaining their legal independence. This means they will also need staff with political skills.

There is no doubt that cooperation is difficult. Before the GDPR, supervisory authorities operated in different legal and political cultures. Under the regulation they remain independent from one another but, at the same time, they are expected to establish a culture of cooperation. In case of obstacles, each of them can try to engage the European Data Protection Board and seek more forceful measures. In practice, this step means dealing with a huge bureaucracy in Brussels and playing politics with other DPAs. There is no doubt that moving from nice-looking principles on paper to real cooperation between DPAs will require diplomatic and political skills from their staff. But there is really no other way, so we had better not hide this challenge.

The GDPR is not what commercial consultants have made of it. This harm can be reduced if the European Data Protection Board starts pushing its own interpretations.

It is high time for the EDPB to start explaining how the GDPR should be applied and interpreted, and to reclaim that narrative from commercial players. For the last two years this space has been occupied by consultants and law firms, which have their own interest in picturing the regulation as complex, formalistic and dangerous (the vision of financial fines that can fall on “innocent companies” almost without warning). In 2019 it is clear that DPAs will not rush to impose fines and that, even with regard to the biggest companies, such as Google in France, they are starting with reasonable amounts. But the discourse around this regulation remains rather negative.

It is now the task of public bodies, including the European Data Protection Board itself, to correct this image. The GDPR was miscommunicated and misunderstood to such an extent that it became, in common perception, the opposite of what it was supposed to be. Fortunately, on paper we still have sound principles, a risk-based approach, and an encouragement to cut red tape and think strategically about data management. If this perspective is consistently promoted in EDPB guidelines and opinions, good practices on the market will follow.

We cannot fix technology if we don’t understand it

We live surrounded by and immersed in complex technology. Our brains have not evolved to grasp invisible and intangible dangers, such as targeted messages, information bubbles or data leaks. Data is an abstract concept, and so are algorithms and the other tech buzzwords that come with the GDPR. For decades we were not able to respond to data exploitation because we did not experience it in the way we experience a storm or a disease. Now we have a strong regulation, which can be applied to all kinds of technologies processing our data across the globe (as long as we - as data subjects - are located in the EU), but a common understanding of how it all works is still missing.

Users are lured into thinking that they control their data just because they do the clicking and choose what to share. Big data is presented as pure statistics. Powerful marketing profiles that determine what we get to see online turn out to be “just guessing” and “nothing personal” when we demand access to them. Algorithms and AI are marketed as “black boxes” that cannot be explained. This narrative does not do justice to what really happens behind our screens. It hides from users the fact - shocking, once you realize it - that constant observation of their behaviour generates massive amounts of truly sensitive personal data. This is how tech companies make their real profit, and this is where our informational autonomy truly does not exist.

We need powerful authorities to check what happens behind our screens. Independent research sheds light on the patterns of data exploitation but, by itself, it will not deliver evidence.

The GDPR leaves no doubt that, as individuals, we have the right to control personal data that has been generated by algorithms and statistical analysis. We can also demand an explanation of the logic behind the use of an algorithm if it leads to decisions with a significant impact. These are sound principles and great starting points for strategic litigation. However, we have to honestly admit that one year after the GDPR became applicable we are nowhere near opening these black boxes.

We have filed the first complaints questioning advertising models based on constant, hidden tracking and predictive profiling, and we have learnt that DPAs will need our support and access to independent research in order to handle such cases with understanding. While we are prepared to share all the expertise we have, there are also barriers to our understanding that only strong authorities can break. Without access to servers, databases and code, we can only make educated guesses about what happens inside these systems; we cannot present reliable evidence. This is where DPAs, civil society and academia need to cooperate beyond specific cases. We can call these pre-control workshops, or use another name and formula, as long as they bring us closer to understanding hidden data flows and opening algorithmic black boxes.

Katarzyna Szymielewicz

For the past 10 years Panoptykon has been fighting for strong protection of freedom and privacy. Support our work! Donate to the Panoptykon Foundation.