
Deepfakes may in time disrupt society

Published: 5 January 2022 Last updated: 13 January 2022

With the rise of deepfakes, the question of what is real and what is not will increasingly be on the political and social agenda. Deepfakes can be used to spread fake news, influence elections, introduce realistic fake evidence in court proceedings or make fake pornographic videos of women. Each of these applications potentially has a very large impact on society, social structures and the rule of law. This is evident from research published today by the Tilburg Institute for Law, Technology, and Society (TILT), commissioned by the WODC, a knowledge institute linked to the Dutch Ministry of Justice and Security.

A deepfake is an image, sound or other digital material that is fake but appears highly realistic. Machine learning and artificial intelligence are used to create this new digital content from existing videos, voices and images. Entirely new (virtual) persons or images can also be produced. High-quality deepfakes are difficult or impossible to distinguish from authentic material. Moreover, anyone can access this technology easily and free of charge. Experts therefore predict that within five years, more than 90% of all online content will be wholly or partially manipulated. Deepfake detection technologies are limited in capacity and can only filter out part of the fake material automatically.

More regulation and protection

The researchers note that the current legal regime, as laid down in particular in the General Data Protection Regulation, tort law and freedom-of-speech doctrine, places many restrictions on deepfakes, but they also find that enforcement of these rules lags behind. To alleviate the enforcement burden on data protection authorities and the public prosecutor, the researchers discuss a wide range of regulatory options. One is to prohibit the production, sale, use and possession of deepfake technology for the consumer market. The current regulatory regime does not restrict the creation of deepfakes or their distribution on social media: deepfakes are assessed on their legitimacy only after they have been distributed, and by then the damage has already been done. In addition, it may be worthwhile to consider granting post-mortem rights to the deceased, for example if they are 'brought back to life' against their will by a former spouse or a commercial entity.

Societal debate

The researchers argue that a broad political and social debate about the desirability of deepfakes is necessary and that more awareness and regulation should be considered, particularly with regard to a number of areas:

  • Elections can be influenced through fake videos, for example by foreign powers.
  • The media may eventually run into trouble because so much digital material has been manipulated that they will have to choose between Scylla (publishing quickly, at the risk of spreading fake news) and Charybdis (waiting until material has been checked for authenticity, which may require considerable technical expertise and expense and may delay publication by days).
  • Police, prosecutors and judges will have to develop systems to ensure that faked material is not introduced in the courtroom, so as to avoid miscarriages of justice.
  • Finally, and perhaps most importantly, it is estimated that more than 95% of all deepfakes currently concern so-called non-consensual porn: porn videos are made and distributed in which, for example, a woman appears to perform sexual acts that she never performed. The distribution of such videos is known to have a severe impact on the private lives, social position and self-image of women in general and young girls in particular.

Report

The report 'Deepfakes: The legal challenges of a synthetic society' is available here (pdf) and at the WODC. Authors: Bart van der Sloot, Yvette Wagensveld and Bert-Jaap Koops, Tilburg Institute for Law, Technology, and Society.

Note to editors

For questions please contact Dr. Bart van der Sloot, Tilburg Institute for Law, Technology, and Society, via b.vdrsloot@tilburguniversity.edu.