Scientist: Henry Ajder

Henry Ajder is Head of Communications and Research Analysis at Deeptrace, a Dutch start-up building an "anti-virus for deepfakes": deep-learning technologies for detecting and understanding fake audiovisual media.
He is the lead author of Deeptrace’s pioneering report entitled ‘The State of Deepfakes’ and writes for Tracer, an acclaimed weekly newsletter covering the latest developments in deepfakes, disinformation, and emerging cyberthreats.
Previously, Henry worked as a researcher for Nesta's Technology Futures programme. He holds a Master of Philosophy from Queens' College, University of Cambridge.


Henry Ajder: “We need to make finding and accessing these tools as difficult as possible”

In 2019, you released a report which found that 96% of all deepfakes produced depict some kind of image abuse. What has changed since then? Back then, no one really knew what the deepfake landscape looked like. Our report, The State of Deepfakes, was the first research that mapped it comprehensively. One of the most ...


Fighting abusive deepfakes: the need for a multi-layered action plan

Deepfake videos hold some promise for the media industry, but at present they are mostly used for malicious purposes, including the production of non-consensual pornography. The law is failing to protect victims as it struggles to keep pace with technological advances.

A scientist’s opinion : Interview with Henry Ajder about Deepfakes

The key point I would make about regulating deepfakes is that, in some cases, new laws may be needed, but much of the harm deepfakes cause is covered by existing laws, or could be covered by amending existing laws.

Deepfakes, shallowfakes and speech synthesis: tackling audiovisual manipulation

Despite alarmist news stories about deepfakes heralding the end of democracy or truth itself, the technology – for better or worse – is far from perfect, which suggests that there is still a window of opportunity to prepare society, institutions and regulatory frameworks for the moment it becomes so.