Following calls for reform, the UK Government has announced plans to criminalise creating sexually explicit deepfakes without consent. This plan has been widely reported, and I called it a bold and powerful step forward when speaking to BBC Radio 4's Today programme and on Channel 4 News.
Then, in a dramatic move, the largest deepfake sexual abuse website withdrew access for UK users, followed by the notorious nudify app at the centre of the abuse of many young women and teenagers. Ending the easy access to, and normalisation of, deepfake sexual abuse has always been my main justification for criminalisation.
This is a seismic moment in the fight against deepfake sexual abuse, and these changes are very welcome. However, as I comment in this BBC News report and in Glamour, there is a long way to go.
The Government's proposal is limited: it only creates a criminal offence where it can be proven that the perpetrator intended to cause distress or was motivated by sexual gratification. This will limit the effectiveness of the law and, in turn, reduce the power of regulators to demand greater action from internet platforms.
Further information:
My comment in Glamour setting out the need for a straightforward, comprehensive law.
My comment in The Conversation calling for a new law criminalising creation in light of the threat of deepfake sexual abuse pervading the lives of all women and girls.
My Policy Briefing outlining why a comprehensive, consent-based creation offence is required.
My short video reacting to the withdrawal of UK access by notorious deepfake and nudify apps.
Watch the Channel 4 News report on the new deepfake law, including my interview.