Mariëtte van Huijstee: “Video footage is perceived as evidence”

Mariëtte van Huijstee is an expert on responsible business conduct. She focuses on the societal responsibility of technology and innovation companies, and on how ongoing digitalisation is changing the fabric of society.


Are deepfakes going to play a decisive role in future fake news?

Mariëtte van Huijstee: I’m afraid they will. The mere potential of deepfakes circulating is already changing how we perceive the news and the truthfulness of information, and it is eroding our trust. Can we still trust what we see? We have already had two cases in the Netherlands in which public figures made statements on the news that were so unexpected that the audience thought they were looking at deepfakes. As it turned out, the statements were authentic. So the very prospect that we could encounter deepfake videos anywhere changes the dynamics of trust and suspicion.


What particular insight has changed the way you look at deepfakes?

Mariëtte van Huijstee: How we deal with information. I have realised that you cannot trust what you see or even what you hear. We have had Photoshop for a while, so everybody knows that still images can be manipulated. If an image looks particularly great, we immediately suspect it has been adjusted with software. But for most people, a video clip is intuitively proof that something happened exactly the way it is depicted. It is perceived as evidence. The same goes for audio clips. If my daughter called me in distress and asked me to wire her money, I would have to restrain myself from jumping into action and instead verify that it really was her calling me. We will all have to develop skills and routines that enable us to question information and deal with it critically.


What does the rapid rise of deepfake pornography tell us about society?

Mariëtte van Huijstee: We have identified three levels at which deepfakes have an impact: individual, organisational and societal. Sometimes they overlap. Pornographic deepfakes can cascade from individual to societal effects. There is a new app that can undress people in a picture: you upload a photo of a dressed woman, and with one tap you can undress her and see a preview of the generated result. If such apps are installed by hundreds of thousands of people who have fun undressing non-consenting women, that has a societal impact. And I find it very striking that the algorithm has not been trained on male bodies. When you upload a picture of a man, it simply won’t work.
