The latest research has found an exponential rise in the amount of deepfake porn being made available and viewed, as reported in Wired in August 2020. Contributing to that report, I commented that while many current victims of deepfakes are celebrities, this is a looming crisis for everyone else: the technology is becoming ever easier to use, and there are few laws to protect individuals or to challenge perpetrators.
While ‘photoshopping’ technology has been available for decades, the use of AI through new apps – deepfakes – is making the altering of videos far more straightforward and sophisticated. To the untrained eye, it is very difficult to tell fake images from real ones, and so the harm and harassment felt by victim-survivors is just as significant. Yet, to date, few criminal laws cover the creation or distribution of fakeporn or deepfakes, and little is being done to change this.
Report recommending reform: In my 2019 report with colleagues on the need for comprehensive law reforms relating to image-based sexual abuse, we recommended an immediate change to cover deepfakes and fakeporn. Our research found that many victims experienced considerable harms from their images being used in this way. While the Law Commission is currently reviewing this area, change remains a long way off and will come too late for many victims; for them, justice is being delayed.
There is a danger that the harms of deepfakes and fakeporn are not properly understood. In 2016, the Ministry of Justice refused to extend the law to cover altered images, arguing that while ‘distressing’, fakeporn ‘does not have the potential to cause the same degree of harm as the disclosure of images that record real private sexual events’. This runs counter to what victim-survivors tell us.