Peter Eisert: AI scientist sees new deepfake dangers (Photo: hu-berlin.de)
AI researchers at Humboldt University demonstrate the limits of conventional detection methods
Fake videos created with artificial intelligence (AI) often show none of the subtle skin changes seen in real people, which is precisely why they have been easy to detect until now. In particular, deepfakes have so far lacked a heartbeat and the tiny skin-colour variations it causes. But here, too, the technology is catching up, warn researchers at Humboldt University (https://www.hu-berlin.de/de) in Berlin.
Skin shading and pulse
The experts have simulated natural-looking skin and a realistic pulse. ‘We are now showing for the first time that current high-quality deepfake videos can feature a realistic heartbeat and tiny changes in facial colour, making them much more difficult to detect,’ says Peter Eisert, a specialist in visual data processing.
Deepfake creators alter facial expressions and gestures in real videos, for example, or swap faces between different people. Their intentions are not necessarily malicious: apps that let users turn themselves into a cat or age their face digitally are popular, harmless fun.
Detection method fails
Analysing how light reflects off the skin and the pulsating blood vessels beneath it, a technique known in medicine as photoplethysmography, is used to measure vital signs. The same technique can also be used to expose deepfakes. Eisert and his colleagues have, however, succeeded in designing deepfakes in such a way that this method no longer works.
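To illustrate the principle behind such detectors, here is a minimal sketch in Python of how a pulse can be read out of a face video, assuming a list of RGB face crops as input. The green-channel choice and the frequency band are standard for this technique, but everything else is a simplified stand-in, not the researchers' actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(frames, fps):
    """Estimate a pulse rate from subtle skin-colour changes.

    frames: sequence of HxWx3 uint8 RGB face crops
    fps:    frames per second of the video
    """
    # Mean green-channel intensity per frame: blood absorbs
    # green light most strongly, so the pulse shows up here.
    signal = np.array([f[..., 1].mean() for f in frames], dtype=float)
    signal -= signal.mean()

    # Keep only the physiologically plausible band,
    # 0.7-4.0 Hz (roughly 42-240 beats per minute).
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal)

    # The dominant frequency in that band is the pulse estimate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    pulse_hz = freqs[band][np.argmax(spectrum[band])]
    return pulse_hz * 60.0  # beats per minute
```

A conventional deepfake carries no physiological signal, so a sketch like this returns only noise for it; that contrast is exactly what detectors have exploited so far.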
The scientists created a video from images of a real person and transferred the real skin tones and their pulsation into it. A state-of-the-art deepfake detector picked up both features and wrongly accepted them as genuine: the fake was classified as a real video.
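In heavily simplified form, such a transfer amounts to modulating the skin pixels of each fake frame with a heartbeat signal. The sketch below uses a synthetic sine wave as that signal purely for illustration; the researchers transferred the pulsation measured in real footage, which is what made the result convincing.

```python
import numpy as np

def inject_pulse(frames, fps, bpm=72.0, strength=0.4):
    """Overlay a heartbeat-like colour modulation on video frames.

    frames:   sequence of HxWx3 uint8 RGB face crops
    strength: peak brightness shift in 8-bit intensity units,
              kept tiny so the change stays invisible to the eye
    """
    pulse_hz = bpm / 60.0
    out = []
    for i, frame in enumerate(frames):
        # Periodic shift, strongest in the green channel,
        # mimicking blood-volume changes under the skin.
        delta = strength * np.sin(2.0 * np.pi * pulse_hz * i / fps)
        f = frame.astype(float)
        f[..., 1] += delta        # green: main pulse component
        f[..., 0] += 0.5 * delta  # red: weaker secondary component
        out.append(np.clip(f, 0, 255).astype(np.uint8))
    return out
```

Run through the pulse extractor above, frames treated this way yield a clean 72 bpm reading, which is precisely the signal a photoplethysmography-based detector takes as evidence of a living subject.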
Eisert is nevertheless certain that even such deepfakes can be exposed, albeit with a new strategy. 'In future, we will have to prove that something in a video has remained unchanged instead of recognising that something is fake,' emphasises the researcher. If nothing in a video can be shown to be original, it must be a fake.