June 1, 2025

Deepfakes just got more convincing — and harder to catch. A new study shows AI-generated videos can now simulate human heartbeats, defeating some of the most advanced deepfake detection systems.
Published in Frontiers in Imaging, the research reveals that modern deepfakes can replicate subtle physiological signals such as pulse variations across a person’s face. These signals are what detectors based on remote photoplethysmography (rPPG) look for: the technique reads tiny, blood-flow-driven color changes in facial skin, much as a hospital pulse oximeter measures a pulse at the fingertip.
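To make the idea concrete, here is a minimal sketch of how an rPPG-style pulse estimate can be pulled from video. It assumes `frames` is a NumPy array of RGB face crops sampled at a known frame rate; the function name, array layout, and frequency band are illustrative assumptions, not the study's actual detector, which uses far more robust signal processing.

```python
import numpy as np

def estimate_pulse_bpm(frames: np.ndarray, fps: float) -> float:
    """Rough rPPG pulse estimate from (num_frames, height, width, 3) RGB face crops."""
    # Blood volume changes faintly modulate skin color; average the green
    # channel over the face region in each frame to get a 1-D signal.
    signal = frames[:, :, :, 1].mean(axis=(1, 2))

    # Remove the constant baseline (overall brightness of the face).
    signal = signal - signal.mean()

    # Find the dominant frequency inside a plausible heart-rate band.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)  # roughly 42 to 240 beats per minute
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0  # convert Hz to beats per minute
```

A detector built on this principle checks whether a face shows a heartbeat-like signal at all; the study's finding is that modern deepfakes can now pass exactly this kind of check.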
That method, once considered reliable, may no longer be effective. “A detector that worked nicely two years ago begins to completely fail today,” said Prof. Peter Eisert, one of the study’s co-authors. His team found that even deepfakes never explicitly designed to show a heartbeat often displayed one anyway, inheriting the pulse signal from the real driving video used to generate them.
This raises new concerns about misinformation, identity fraud, and public trust in digital media. As deepfakes improve, experts like Eisert warn that traditional detection may no longer be enough. Instead, they suggest focusing on cryptographic digital fingerprints, created at the point of capture before any manipulation can occur, so a video’s authenticity can be verified rather than its fakery guessed at after the fact.
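The fingerprinting idea can be sketched with nothing more than a hash: record a digest of the footage when it is captured, and any later edit changes the digest. The file names below are placeholders, and a real provenance scheme would also cryptographically sign the hash and metadata at capture time; this is only a minimal illustration of the principle, not a specific system endorsed by the researchers.

```python
import hashlib

def fingerprint(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a video file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# At capture time: store (and ideally sign) the original fingerprint.
original = fingerprint("clip_from_camera.mp4")

# At verification time: any re-encode, edit, or face swap changes the hash.
assert fingerprint("clip_received.mp4") == original, "video was modified"
```

The appeal of this approach is that it does not depend on spotting artifacts in the fake at all; authenticity is established from the genuine recording onward.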
The findings underscore how fast detection technology is falling behind — and how the deepfake arms race may be nearing a tipping point.