Digger – Deepfake Detection
When we cannot trust our eyes and ears anymore, what can we trust?
Deepfakes are hyper-realistic video forgeries in which people appear to say or do things they have never actually said or done. The visual quality of deepfakes will soon become so convincing that veracity can no longer be judged by visual inspection alone.
Digger will approach the problem from a different side: by using state-of-the-art audio forensics technologies to detect audio tampering in videos. Our aim is to make the audio forensics toolkit available to verification experts, either as a stand-alone solution or integrated into a (video) verification application such as Truly Media.
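To give a rough idea of what "detecting audio tampering in a video" can look like in practice, here is a minimal, purely illustrative sketch: it extracts the audio track with ffmpeg and flags abrupt spectral discontinuities that might hint at splices. This is a naive heuristic for illustration only, not the Fraunhofer IDMT forensics technology used in Digger; the file names and threshold are hypothetical, and ffmpeg plus the librosa library are assumed to be installed.

```python
# Illustrative sketch only: a naive spectral-discontinuity check,
# NOT the audio forensics technology developed for Digger.
import subprocess
import numpy as np
import librosa


def extract_audio(video_path: str, wav_path: str = "audio.wav") -> str:
    """Pull the audio track out of a video as 16 kHz mono WAV via ffmpeg."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", video_path, "-vn", "-ac", "1", "-ar", "16000", wav_path],
        check=True,
    )
    return wav_path


def suspicious_moments(wav_path: str, threshold: float = 6.0):
    """Flag frames whose spectral flux deviates strongly from the median,
    a crude indicator of possible cuts, splices, or re-encoded segments."""
    y, sr = librosa.load(wav_path, sr=None)
    spec = np.abs(librosa.stft(y))
    # Frame-to-frame change in the magnitude spectrum (spectral flux).
    flux = np.sqrt(np.sum(np.diff(spec, axis=1) ** 2, axis=0))
    z = (flux - np.median(flux)) / (np.std(flux) + 1e-9)
    frames = np.where(z > threshold)[0]
    return librosa.frames_to_time(frames, sr=sr)  # seconds worth inspecting


if __name__ == "__main__":
    wav = extract_audio("interview.mp4")  # hypothetical input file
    print("Check around these timestamps (s):", suspicious_moments(wav))
```

Real audio forensics goes far beyond this, looking at compression traces, microphone and environment characteristics, and synthesis artifacts; the sketch only shows where such analysis plugs into a verification workflow.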
The project team working on Digger will consist of Fraunhofer IDMT (audio forensics technology), Athens Technology Center (product development) and Deutsche Welle (project lead and concept development). The effort will be co-funded by Google DNI.
This Wall Street Journal video explains why deepfakes are something the media industry needs to prepare for:
Luckily, deepfake-driven misinformation is not yet a widespread real-life problem. However, it is important to prepare for such cases and safeguard veracity. No reason to panic, but reason to be alert; this time we could be a step ahead.
"As a global media organisation constantly researching and verifying stories on social media platforms, DW has to deal with misinformation on a daily basis. Deepfakes are a new challenge. We are grateful for the opportunity of working on this project which will enhance our toolkit for verification of manipulated audio or video content. This will support our efforts to safeguard DW as a trustworthy source of information." Peter Limbourg, Director General DW
The Digger project also aims to develop a community to share knowledge and initiate collaboration in the field of deepfake detection. Want to join? Follow us on Twitter: @DeepfakeDigger!