Why We Need To Criminalise Cyberflashing Now

S&LS Blog

Kelly Johnson
Assistant Professor of Criminology, Durham University

Clare McGlynn
Professor of Law, Durham University

… and let’s get the law right first time

During the Covid-19 pandemic, we’ve seen new forms of online sexual harassment, such as ‘zoomflashing’, where men have infiltrated Zoom calls and exposed themselves, and ‘zoombombing’, where unwanted pornographic images are flashed onscreen. Despite taking place on a new platform, these are not ‘new’ behaviours – rather, they are another version of what we already know as ‘cyberflashing’.

Cyberflashing is where a man – and yes, it is almost always a man – sends a picture of his penis to another person without their consent. Often the images are sent via Bluetooth or AirDrop, so the perpetrator remains anonymous. Sometimes referred to as ‘unsolicited dick pics’, this form of sexual harassment is commonly experienced by women in public spaces and on public transport. It’s also common on dating…


The looming crisis of deepfakes

Non-consensual deepfake videos that humiliate and demean women are racking up millions of views on mainstream porn sites. Nothing is being done about them.

The latest research, reported in Wired in August 2020, has found an exponential rise in the amount of deepfake porn being made available and viewed. Contributing to that report, I commented that while many victims of deepfakes are celebrities, this is a looming crisis for everyone else: the technology is getting easier to use, and there are few laws to protect individuals and challenge perpetrators.

While ‘photoshopping’ technology has been available for decades, the use of AI through new apps is making the altering of videos – creating so-called deepfakes – far more straightforward and sophisticated. To the untrained eye, it is very difficult to tell the fake images from the real, so the harm and harassment felt by victim-survivors is just as significant. Yet, to date, few criminal laws cover the creation or distribution of fakeporn or deepfakes, and little is being done about this.

Report recommending reform: In my 2019 report with colleagues on the need for comprehensive law reform relating to image-based sexual abuse, we recommended an immediate change in the law to cover deepfakes and fakeporn. Our research found that many victims experienced considerable harms from their images being used in this way. While the Law Commission is currently reviewing this area, change is a long way off and will come too late for many victims, leaving justice delayed.

This is not a new issue: I and others called for change in 2018, recommending that the upskirting law be extended to cover fakeporn, but the Government refused.

There is a danger that the harms of deepfakes and fakeporn are not properly understood. In 2016, the Ministry of Justice refused to extend the law to cover altered images, arguing that while ‘distressing’, fakeporn ‘does not have the potential to cause the same degree of harm as the disclosure of images that record real private sexual events’. This runs counter to what victim-survivors tell us.
