Fakeporn, Deepfakes and Updating the Criminal Law

‘Fakeporn’ is the use of technology to alter videos or images to make them sexual: for example, taking a profile picture from Facebook and digitally altering it to make it sexual or pornographic. It is a growing problem, with these fake videos commonly being shared online. English law must do more to criminalise this practice: it is a form of image-based sexual abuse.

While ‘photoshopping’ technology has been available for decades, the use of artificial intelligence through new apps (‘deepfakes’) is making the alteration of videos far more straightforward and sophisticated. To the untrained eye, it is very difficult to tell the difference between fake and real images, and so the harm and harassment felt by victim-survivors is just as significant.

To date, few criminal laws cover the creation or distribution of ‘fakeporn’. Together with MPs and other campaign organisations, I recommended that the upskirting law be extended to include fakeporn, but the Government refused.

In 2016, the Ministry of Justice refused to extend the law to cover altered images, arguing that while ‘distressing’, fakeporn ‘does not have the potential to cause the same degree of harm as the disclosure of images that record real private sexual events’. This runs counter to what victim-survivors tell us.

For more background on deepfakes and the US debates, read here.
