Looming crisis of deepfakes and fakeporn

Non-consensual deepfake videos that humiliate and demean women are racking up millions of views on mainstream porn sites. Nothing is being done about them.

Deepfakes and fake porn – where technology is used to create new pornographic images and videos by superimposing non-sexual pictures onto existing porn videos without consent – are rising rapidly, and there are few laws against the practice.

While ‘photoshopping’ technology has been available for decades, the use of AI through new apps (‘deepfakes’) is making the altering of videos much more straightforward and sophisticated. To the untrained eye, it is very difficult to tell the difference between fake and real images, and so the harm and harassment felt by victim-survivors is just as significant. To date, few criminal laws cover the creation or distribution of ‘fakeporn’.

Report recommending reform: In my report with colleagues on the need for law reform, we recommended an immediate change in the law to cover deepfakes and fakeporn. Our research found that many victims experienced considerable harms from their images being used in this way. While the Law Commission is currently reviewing this area, change is a long way off and will come too late for many victims, for whom justice is being delayed.

In 2018, together with MPs and other campaign organisations, I recommended that the upskirting law be extended to include fakeporn, but the Government refused.

In 2016, the Ministry of Justice refused to extend the law to cover altered images, arguing that while ‘distressing’, fakeporn ‘does not have the potential to cause the same degree of harm as the disclosure of images that record real private sexual events’. This runs counter to what victim-survivors tell us.

For more background on deepfakes and the US debates, read here.
