The alleged Pope in a white down jacket, the supposed arrest of Trump, or Franziska Giffey’s conversation with a fake Klitschko – what looks deceptively real at first glance turns out, in retrospect, to be a so-called deepfake: an image, audio recording, or video that has been manipulated or fabricated to deceive. In a video, for example, someone can be made to say or do something they never said or did in reality. A slight majority of Germans (60 percent) have already heard or read about deepfakes. However, only 15 percent can explain well what the term means; 23 percent know at least roughly, while 22 percent have read or heard of deepfakes but do not know exactly what they are. A third (33 percent), by contrast, have never read or heard of deepfakes. These are the findings of a representative survey of 1,002 people in Germany aged 16 and over, commissioned by the digital association Bitkom.
What’s real, what’s fake? For many people, deepfakes create uncertainty: 8 out of 10 Germans (81 percent) say they would not recognize a deepfake, and 44 percent say they have already been fooled by one. 70 percent believe that photos and videos can no longer be trusted, and 63 percent even say that deepfakes scare them. 60 percent see deepfakes as a threat to our democracy. On the other hand, more than half also see positive uses: 55 percent think deepfakes could be put to good use, for example in cinema or art. “Retouching is as old as photography, and audio and video recordings have been edited and altered for almost as long. Such interventions used to be reserved for highly specialized experts; today they can be done with just a few clicks, even without any relevant training. This makes it all the more important to raise awareness of this phenomenon and sensitize people to it,” says Bitkom CEO Dr. Bernhard Rohleder.
The vast majority have encountered deepfakes in news and informational programs: 63 percent say they have seen deepfakes in reports on the topic. Only 2 percent have identified deepfakes on the Internet that were not marked as such, while 8 percent have come across deepfakes that were labeled as such. And 3 percent have themselves tried out software for creating deepfakes.
A broad majority (84 percent) calls for mandatory labeling of deepfakes, and 60 percent say they should be banned altogether. Rohleder: “Even if there were a labeling requirement or even an outright ban, the very actors we want to protect ourselves from, such as cybercriminals or the troll factories of hostile states, would not comply with it. Education and media literacy are indispensable in the fight against deepfakes. Each and every individual should carefully check whether a text, image, or video is authentic before liking or sharing it on social media, for example.”
Note on methodology: The data is based on a survey conducted by Bitkom Research on behalf of the digital association Bitkom, in which 1,002 people in Germany aged 16 and over were interviewed by telephone. The survey is representative. The questions asked were: “Have you ever read or heard about deepfakes?” and “Which of the following statements about deepfakes apply to you, or are accurate in your opinion?”