2018 was an exciting year that clearly demonstrated how our society is changing in the context of digitisation and the ever-increasing relevance of data. In March, the Cambridge Analytica & Facebook scandal was reported by various media outlets; in May, the world’s most modern data protection law, the General Data Protection Regulation, came into effect, posing major challenges and uncertainty for many companies; and in October, Gartner even named “Privacy & Ethics” one of the top strategic trends for 2019.
Another highlight for us was the German podcast “The Future of Privacy” by netzpolitik.org, which was recorded during the Netzpolitik Conference 2018. For this podcast they gathered such notable privacy experts as the privacy researcher Wolfie Christl, Florian Glatzner from the German Consumer Advice Center and Frederike Kaltheuner from Privacy International.
The discussion runs for over an hour and highlights several interesting points, such as the data economy, AI and algorithmic decision making, the new role of data protection authorities, big data, tracking, the public perception of the GDPR, and many more.
It’s highly recommended for everyone who is interested in data privacy and its future implications – and who is able to understand German. For those who can’t – don’t worry! We have an edited and translated transcript of the last part below, in which they discuss anonymization methods, data ownership, and their demands for the future of privacy, and conclude with the following recommendations:
- Enforce the General Data Protection Regulation
- Strengthen the supervisory authorities
- Adopt a strong ePrivacy regulation that protects digital communications
- Focus on market power and competition issues and connect with the data protection debate
- Regulate discrimination problems in algorithmic decision making and AI beyond simple data protection issues
- Take a closer look at the effects of the data economy on marginalised groups
- Promote the use of data for the common good
- Intensify research into anonymization techniques
Ingo Dachwitz is a media and communication scientist, editor at netzpolitik.org and member of the “Verein Digitale Gesellschaft” (Digital Society Association). He writes and speaks about data capitalism, data protection and the digital structural transformation of the public sphere.
Frederike Kaltheuner is Data Exploitation Programme Lead at Privacy International and also develops PI’s positions on the privacy and security challenges of connected spaces. Frederike regularly speaks at tech, policy and art conferences and comments on emerging technologies in the British and international press.
Florian Glatzner has been a consultant at the Federation of German Consumer Organisations (Verbraucherzentrale Bundesverband e.V., vzbv) since 2011 and is responsible for data protection in its digital & media team.
Wolfie Christl is a technologist, researcher, writer, educator, and digital rights activist based in Vienna, Austria.
Anonymization Tools for the Future of Privacy
Ingo Dachwitz: We have to talk about anonymization – do you think it is possible? As far as I know, there are at least two telephone providers in Germany that have developed quite elaborate anonymization procedures — at least according to their own statements — in order to be able to use data from their mobile customers. Telefonica and Telekom have built anonymization solutions, have achieved accreditation from the [German] Federal Data Protection Commissioner, and are now analysing mobile phone data that is no longer on a personal level. Is that the solution? Will we ever have anonymization procedures that are good enough to withstand these “triangulation possibilities”?
Wolfie Christl: I think we have to differentiate between two things. The first is that companies very often use the word anonymization when really nothing is anonymized at all. That really has to be said. Very often these are pseudonymised data. In the cases [you mentioned] it’s probably real anonymization in the sense you mean. I don’t know [their projects] in detail, but it is certainly possible. I think it basically makes sense to use anonymization methods and then try to use data more extensively. If this works, I would even say that in many cases we need more access to such evaluations for society as a whole, especially for the common good. The real problem, however, is that we can hardly solve this technically, because almost everything can be deanonymized. And — this is not a clear matter for me, and I don’t know all the objections — what if we really had something like a re-identification ban? That exists in the UK, or has at least been discussed, right?
Frederike Kaltheuner: It has been discussed.
Ingo: Can you further explain what you mean?
Wolfie: The point is that if companies anonymize data with the latest technologies — at least in the best possible way, since perfect anonymization is not possible — then deanonymization, i.e. linking data back to individuals, would be illegal. That would be a restriction of data usage in a specific form, and from my point of view it would solve a lot of problems with Big Data. A legal concept like that has its problems, but I would really like to discuss it in more detail.
Florian Glatzner: In any case, anonymization is not a one-time, static process. What is effectively anonymized today may no longer be so in a few years. You have to keep that in mind.
Frederike: I think anonymization should perhaps be considered like “security” – there is no one hundred percent security. Nevertheless, we take steps to secure data. One hundred percent security doesn’t exist — it’s more of a spectrum than a “yes/no”.
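The distinction drawn above between pseudonymization and real anonymization can be illustrated with a minimal Python sketch. The data and field names are hypothetical, not from any real provider: pseudonymization merely replaces identifiers while keeping the data at the individual level, whereas aggregation discards the individual records entirely.

```python
import hashlib

# Hypothetical mobile-phone records (illustrative only, not real data)
records = [
    {"user": "alice", "cell_tower": "A", "visits": 3},
    {"user": "bob",   "cell_tower": "A", "visits": 5},
    {"user": "carol", "cell_tower": "B", "visits": 2},
]

def pseudonymize(rows):
    """Replace names with hashes: the data stays at the individual level,
    so records remain linkable and potentially re-identifiable."""
    return [
        {**r, "user": hashlib.sha256(r["user"].encode()).hexdigest()[:8]}
        for r in rows
    ]

def aggregate(rows):
    """Keep only per-tower totals and drop the individual rows entirely,
    which is closer to what anonymization is supposed to achieve."""
    counts = {}
    for r in rows:
        counts[r["cell_tower"]] = counts.get(r["cell_tower"], 0) + r["visits"]
    return counts

pseudo = pseudonymize(records)  # still one row per person
totals = aggregate(records)     # {'A': 8, 'B': 2}
```

Even the aggregated form is not automatically safe — as the discussion notes, small or unusual groups can sometimes still be singled out, which is why anonymization is better seen as a spectrum than a binary property.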
Data Ownership vs. Data Monetization
Frederike: That’s the one thing I wanted to say. Then I wanted to go back to something: Ingo started with “data property”. I have the impression that at least some of the people who are now demanding data ownership actually want data protection, or have not understood what data protection can actually do. The hopes are often the same. The demand comes not only from the car industry but also from consumer protection organisations, and also from countries that do not have data protection: I want to have control over my data. And that is a powerful discourse. “My data belong to me” sounds much better than “You have rights to your personal data” – 90% have already stopped listening at that point, and I think it’s important that people recognise that. My nightmare scenario is that privacy becomes a luxury. That in this scenario of ownership, [the] only people [who] have real rights over their data [are those] who can afford it, and everyone else has no real rights at all. And the moment I give up my property, it is unclear what happens to it and how it will continue to be protected.
Florian: It is also really problematic — and there are such proposals — to involve individuals in the monetary value creation of their data. As soon as you do that, you create absolutely wrong incentives: “Oh, I was young and needed the money, I sold all my data and now it suddenly no longer belongs to me.” What if the data I provided is incorrect? Can I then be held liable, or something similar? The whole construct is, from my point of view, absolutely nonsensical.
Frederike: And my Cambridge [Analytica] data was sold for 6 cents – that wouldn’t have made me rich either…
Wolfie: …because the value of the data is always context specific.
Ingo: We sit here together and try to think about the topic from a civil society perspective. What are the next steps where you say: OK, we have to get down to it now?
Wolfie: Enforce the GDPR and bring [the] ePrivacy [regulation] to a good conclusion. Then break up or contain data monopolies. And the third point, which we didn’t discuss further: On the one hand, I am relatively radical in my opinion that companies misuse data in an unbelievable commercial way, that a Wild West has emerged, and that this is dangerous for society as a whole, for fundamental rights, for justice, and for many other things. I want this to be curbed, but I also want data that used to be personal data to be available for the common good of society. This is just as important an issue for me, but one that is very difficult to discuss, because then the industry lobbies usually jump right on it, their eyes light up: “How can we soften all sorts of laws and claim that we are only doing this for the benefit of society as a whole?”
Ingo: So in the sense of hacker ethics – “protect private data, use public data”?
Wolfie: Of course it goes in that direction, yes.
Florian: I would pick up that point again: encourage and strengthen research in anonymization techniques. I think there is a lot of movement there, with new developments and various startups researching it. Apple is now also working with anonymization, and I think we simply have to intensify that research. The second point from our discussion is to strengthen the data protection supervisory authorities: a very concrete step that needs to be taken, in my view. And third, as I have just indicated, the discussion about how we deal with automated or algorithmic decision-making and AI raises many issues outside data protection that need to be discussed by society as a whole.
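One concrete strand of the anonymization research Florian mentions is differential privacy, which is also the basis of Apple’s work in this area. It makes Frederike’s “spectrum” framing literal: a parameter ε tunes the trade-off between privacy and accuracy. A minimal sketch in Python (the function name and numbers are illustrative, not from any real system):

```python
import math
import random

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Return a differentially private count by adding Laplace noise
    with scale sensitivity/epsilon (inverse-CDF sampling).
    Smaller epsilon means more noise and therefore more privacy."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5          # uniform in [-0.5, 0.5)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Strong privacy (small epsilon): answers vary a lot between queries.
noisy_strong = dp_count(1000, epsilon=0.1)
# Weak privacy (large epsilon): answers stay close to the true count.
noisy_weak = dp_count(1000, epsilon=10.0)
```

The point of the sketch is only that privacy here is a dial, not a switch: no single setting of ε gives “one hundred percent” protection, but each setting gives a quantifiable guarantee.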
Frederike: It’s difficult to add anything now. It is still important [to understand] that data protection will not solve all problems and challenges in the field of AI. There is a whole series of issues that we have to deal with. I also find competition exciting, and I think in this context we should, for example, talk more about Amazon than we do, or about Chinese companies. The second point is: if we really want to use the GDPR to shake surveillance capitalism to its foundations at least a little, or at least to question it, that is not easy and it will take years. The battle still lies ahead of us. We now have the tools; we still have to use them. That is not a small project. And maybe the last one: I would like to see more focus on those who are really the weakest and most marginalised groups. That we do not only talk about data in a general, abstract fashion, but also look, as civil society, at where new procedures, new techniques and data are used against those who are already particularly vulnerable. I think that is the most important thing.
Ingo: Hopefully we will be able to see you here again, or listen to you at the conference in a year’s time. Maybe we will come together again in this or another context to continue this discussion. What remains is perhaps the impulse for those in our civil or digital society to take up these questions and proposed solutions, because I have the feeling that there is still a “void” when it comes to moving beyond the “My data belong to me” discourse and actually making regulatory proposals, proposing how things could or should be done better. I think there are a few more steps to be taken, and I hope this podcast can be a good starting point. Thank you very much!
Disclaimer: Aircloak is a provider of an anonymization solution and is not connected with netzpolitik.org. We value and encourage discussions about data privacy and would like to contribute to the future of privacy through privacy-enhancing technologies like Aircloak Insights.
Netzpolitik.org “Die Zukunft des Datenschutzes” (“The Future of Privacy”) podcast
Licence: CC-BY-NC-SA 4.0
Courtesy of http://netzpolitik.org and the involved experts