Deepfake videos hold some promise for the media industry, but at present they are mostly used for malicious purposes, including the production of non-consensual pornography. The law is failing to protect victims as it struggles to keep pace with technological advances.
The key point about regulating deepfakes is that new laws may be needed in some cases, but much of the harm deepfakes cause is already covered by existing laws, or could be covered by amending them.
Synthetic audio will make it possible to make any person appear to say anything. This makes politicians and celebrities especially easy targets for deepfakes, since large amounts of audio of their voices are publicly accessible.
Despite alarmist news stories heralding deepfakes as the end of democracy or of truth itself, the technology – for better or worse – is far from perfect, which suggests there is still a window of opportunity to prepare society, institutions and regulatory frameworks for the moment it is.

The state of play

Deepfakes (i.e. synthetic ...
Rhetoric about the ‘end of truth’ plays into the hands of people who are already claiming you cannot believe anything – and that is neither true of most audiovisual material, nor yet true of deepfakes. We should not panic, but prepare.
December 4, 2019