Fake videos soon impossible to detect

May 4, 2025

Peter Eisert: AI scientist sees new deepfake dangers (Photo: hu-berlin.de)

AI researchers at Humboldt University demonstrate the limits of conventional detection methods

Fake videos created with artificial intelligence (AI) typically show no changes in the skin of the people depicted, and this is precisely why they have been easy to detect until now: the fakes have lacked a heartbeat and the subtle skin-colour changes it causes. But deepfake technology is catching up here too, warn researchers at Humboldt University (https://www.hu-berlin.de/de) in Berlin.

Skin shading and pulse

The experts have simulated natural-looking skin and a realistic pulse. ‘We are now showing for the first time that current high-quality deepfake videos can feature a realistic heartbeat and tiny changes in facial colour, making them much more difficult to detect,’ says Peter Eisert, a specialist in visual data processing.

Deepfake creators alter facial expressions and gestures in real videos, for example, or swap faces between different people. Their intentions are not necessarily malicious: apps that let users turn themselves into a cat or age themselves digitally are popular, harmless fun.

Detection method fails

In medicine, the light reflected by the skin and the pulsating blood vessels beneath it is analysed to measure vital signs such as heart rate. The same technique has also been used to expose deepfakes. Eisert and his colleagues have now succeeded in designing deepfakes so that this detection method no longer works.
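The detection principle described above can be sketched in a few lines. This is a minimal illustration, not the researchers' actual pipeline: it assumes the face region has already been cropped from each frame, averages the green channel per frame (where blood-volume changes are most visible), and reads the dominant frequency in the heart-rate band as the pulse. A real detector would then check whether such a plausible pulse signal is present.

```python
import numpy as np

def estimate_pulse_bpm(frames, fps):
    """Rough pulse estimate from a stack of face-ROI frames
    (shape T x H x W x 3, RGB). Averages the green channel per
    frame, removes the mean, and picks the strongest frequency
    in a plausible heart-rate band via FFT."""
    signal = np.array([f[..., 1].mean() for f in frames])
    signal = signal - signal.mean()              # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to 0.7-4.0 Hz, i.e. roughly 42-240 beats per minute
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0
```

On a genuine video this returns a physiologically plausible rate; on older deepfakes the band contains no clear peak, which is what made them detectable.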

The scientists created a video from images of a real person and transferred the person's skin shades and their pulsation into it. A latest-generation deepfake detector picked up both features and, misled by them, classified the fake as a real video.

Eisert is nevertheless certain that even such deepfakes can be exposed, albeit with a new strategy. ‘In future, we will have to prove that something in a video has remained unchanged instead of recognising that something is fake,’ emphasises the researcher. If nothing verifiably original remains, the video must be treated as a fake.
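One common way to implement this "prove it is unchanged" idea is cryptographic integrity checking, sketched below. This is an illustration of the general strategy, not the researchers' proposal: it assumes a trusted digest of the original footage was published (e.g. signed by the camera or newsroom) at recording time, so anyone can later verify that the bytes have not been altered.

```python
import hashlib

def verify_provenance(video_bytes: bytes, trusted_digest: str) -> bool:
    """Return True if the footage still matches a digest obtained
    out of band. `trusted_digest` is hypothetical here; in practice
    it would come from a signed provenance record."""
    return hashlib.sha256(video_bytes).hexdigest() == trusted_digest
```

Any edit to the footage, however subtle, changes the digest, so a fake can be rejected without having to spot visual artefacts at all.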
