Abstract
Objective: As GAN-based deepfakes have become increasingly mature and realistic, effective deepfake detectors are in urgent demand. Motivated by the observation that the normal pulse rhythms present in a real face video are diminished or even completely disrupted in a deepfake video, we introduce a new deepfake detection approach based on remote heart rate estimation using a 3D Central Difference Convolution Attention Network (CDCAN).
Methods: Our proposed fake detector consists mainly of a 3D CDCAN with an inverse attention mechanism and an LSTM architecture. It uses 3D central difference convolution to enhance the spatiotemporal representation, capturing rich physiology-related temporal context by aggregating time-difference information. A soft attention mechanism focuses the network on the skin region of interest, while the inverse attention mechanism further denoises the rPPG signals.
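To illustrate the core operation, the following is a minimal sketch of a 3D central difference convolution in plain numpy. It is a hypothetical, unoptimized reference (the function name `cdc3d`, the single-channel layout, and the default theta are illustrative assumptions, not the paper's implementation): the output blends the vanilla convolution response with a central-difference term that subtracts the kernel-weighted center intensity, which emphasizes subtle temporal intensity changes such as pulse-induced skin color variation.

```python
import numpy as np

def cdc3d(x, w, theta=0.7):
    """Sketch of 3D central difference convolution (single channel, no padding).

    x: input volume of shape (T, H, W); w: kernel of shape (kt, kh, kw).
    theta in [0, 1] blends vanilla convolution with the central-difference
    term:  y(p0) = sum_pn w(pn) * x(p0 + pn)  -  theta * x(p0) * sum_pn w(pn)
    theta = 0 recovers the vanilla 3D convolution.
    """
    kt, kh, kw = w.shape
    T, H, W = x.shape
    ot, oh, ow = T - kt + 1, H - kh + 1, W - kw + 1
    w_sum = w.sum()
    y = np.zeros((ot, oh, ow))
    for t in range(ot):
        for i in range(oh):
            for j in range(ow):
                patch = x[t:t + kt, i:i + kh, j:j + kw]
                vanilla = (patch * w).sum()          # standard convolution term
                center = x[t + kt // 2, i + kh // 2, j + kw // 2]
                y[t, i, j] = vanilla - theta * center * w_sum
    return y
```

Note that for a uniform kernel the central-difference term cancels the response wherever the center pixel equals the patch mean, so flat regions are suppressed and fine temporal gradients are retained.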
Results: The performance of our approach is evaluated on the two latest datasets, Celeb-DF and DFDC, where the experimental results show that our proposed approach achieves accuracies of 99.5% and 97.4%, respectively.
Conclusion: Our approach outperforms state-of-the-art methods, demonstrating the effectiveness of our deepfake detector.