Bart van der Sloot

“The realization that we have to deal with new technologies has really changed”

Article by Corine Schouten (4 min. read)

Since April, Dr. Bart van der Sloot has been seconded to the Dutch Data Protection Authority (Autoriteit Persoonsgegevens, DPA) one day a week. There, he provides input to a strategy team reviewing the DPA’s policy. An interesting place for cross-fertilization for a lawyer and philosopher at the Tilburg Institute for Law, Technology and Society (TILT) who specializes in personal data protection.

Why did you respond to this challenge?

The Dutch Data Protection Authority was enthusiastic about the idea of having me and a colleague from another university on board in its Strategy Department for six months. I was doubtful at first – I was afraid my own research might suffer. On the other hand, I get frustrated often enough when one of my reports ends up gathering dust somewhere. I wanted to experience first-hand how policies sometimes evaporate in practice. In the past, I worked for the Netherlands Scientific Council for Government Policy (Wetenschappelijke Raad voor het Regeringsbeleid) and saw in detail how the Ministry of General Affairs operates. So I hope to find out what the sticking points are at the DPA.


Why did the DPA invite you at this time?

I contribute ideas on reviewing the role of the Dutch Data Protection Authority in connection with new technologies that are likely to have an enormous impact on society. I was pleasantly surprised by the amount of discussion: the Dutch DPA itself is very reflective. We are now at the point where the EU’s General Data Protection Regulation (GDPR) has been in force for five years. The DPA’s priority was to let that sink in, and now it must review its position. So they ask themselves questions like: what have we achieved? Are we effective enough? What is our future role?

The new big questions are about how to deal with disinformation, algorithms, cybersecurity, and generative AI like ChatGPT; in other words, synthetic reality. It’s a very broad range of issues, and sound choices need to be made while resources are limited.

So, will you be able to work it out?

It is not easy to determine the DPA’s role, because the issues play out in all kinds of border areas: law enforcement, the use of technology in different domains, criminal law aspects of deepfakes, or the protection of democracy against manipulation by means of fake news. Should the DPA address these kinds of issues or not? What tools can it use? Is it going to impose fines or issue guidelines? I try to provide input and develop models.

One of the best things is that I work with a very mixed group from different disciplines. At TILT, we also have a team of lawyers, techies, economists, philosophers, and so on. You need them all to solve the social issues around technology. Cookies are a case in point: the legal issue is whether you really have freedom of choice in accepting or rejecting cookies. What if the design of the website nudges you in a certain direction? That is why we also need technical and psychological knowledge.


What exactly are you doing at the DPA?

Many different things, which is what makes it so interesting: co-writing a speech on the DPA’s vision of the future, providing input on the redesign of the organization, but also contributing ideas about questions like: how do we address the cookie problem? Given my research perspective, I have ideas about the developments and issues around personal data protection in digital society. I give my opinion, based on my expertise. Sometimes it is received well, and sometimes it isn’t. Sometimes I can confirm what they already thought, and sometimes I am rebellious. It is good that I am seconded to the DPA, so I can keep my academic independence.

Shouldn't we all be learning to make better use of new technology?

I’m happy to say that there is growing awareness of concerns about technology, including among civil servants and politicians. There is now a Parliamentary Standing Committee on Digitalization and a dedicated Minister. The realization that we really need to address the subject is quite different now from when the GDPR was introduced. At the time, the general feeling was: do we really have to? Now everyone realizes that it is extremely important.

But I think we should be stricter with citizens. They hold a lot of data about other people and also use these technologies against others. People have responsibilities as well as obligations, and these continue to be crucial.


Is there something in your work at the DPA that contributes to your own research?

Cross-fertilization has always been my objective when I work for public organizations. I have written a lot of reports, translating science to practice. That really teaches you what is feasible. To my mind, science and practice need each other. For me, that has always been the reason to stay closely in touch with practice, and it’s the policy context in particular that I find fascinating. It is great fun, but it is also essential that you keep your ear to the ground.

Are you currently working on any other practice-oriented research?

With my colleagues Aviva de Groot and Esther Keymolen, I am working on the use of facial recognition in football stadiums to prevent racist and discriminatory behavior. We had already published an academic report about facial recognition and now we are part of a broader consortium. The football world is really trying to do something about the problem, but the question is whether facial recognition is the right instrument. Is it permitted to violate people’s privacy with a technology to prevent racist chanting? Does it even work? Is it permitted to identify people who have a stadium ban when they are in a crowd? I don’t know the answers to these questions either, but that makes it quite interesting. There is no universal truth. These are really tricky issues.

Besides this, I am writing a book on synthetic reality, in which I also want to include these dilemmas. In the post-truth era, is the government going to determine what is true and what isn’t? Does politics decide the truth? Look at the Covid crisis: that was also about the question of truth. I find these kinds of questions increasingly fascinating; they make my philosophical heart beat faster. I like to define the essence of a dilemma. And then you hope it gets picked up.

Date of publication: 7 September 2023