A key focus of my work is securing better laws and policies to challenge all forms of image-based sexual abuse – a term covering the non-consensual taking, making and/or sharing of intimate images, including threats to share and ‘deepfakes’.
What is image-based sexual abuse? My early work with Erika Rackley developed the term ‘image-based sexual abuse’ to better explain the nature and harms of these abuses; you can read more in our academic research and in this blog. We have argued for comprehensive legal change to enable victim-survivors to seek justice, and for improved support and legal assistance to help victims reclaim control of their lives. We are currently participating in the Law Commission’s review of this area of law, and you can read our policy briefing and consultation response here.
Our landmark report published in 2019 – Shattering Lives and Myths – identified legal and policy failings that must urgently be addressed. We interviewed over 50 victims and stakeholders across the UK about their experiences and recommendations for change, as part of a larger project involving 75 victims and 6,000 survey participants across the UK, Australia and New Zealand. More info here and in our book.
EU Digital Services Act: I have recently worked with the charity HateAid and Prof Lorna Woods on proposals to hold large porn companies accountable for the non-consensual sexual imagery on their websites. We produced an expert opinion which justifies these measures.
Over recent years, I have worked closely with politicians, policy-makers and campaign groups to introduce new laws criminalising all forms of image-based sexual abuse – a term that includes ‘revenge porn’, ‘fakeporn’ and ‘upskirting’. English law was reformed in 2015, with Scots law following the year after. These laws are a welcome start, but more must be done to challenge these abuses and protect victims.
I have given evidence before the Scottish Parliament’s Justice Committee on reform proposals, and recommended comprehensive law reform to Parliament’s Women & Equalities Select Committee. I actively participate in policy and political debates, most recently suggesting improvements to the laws on ‘upskirting’ (including here on the BBC), outlining the harms of photoshopped images and ‘deepfakes’ in the Guardian, and advocating the protection of victims by granting them automatic anonymity when reporting to the police.
This policy activity includes working with tech companies such as TikTok and Facebook. For example, in 2018 I addressed Facebook’s Global Safety team at their HQ in Silicon Valley, alongside Durham Sociology’s Kelly Johnson, and participated in their global roundtable on next steps to challenge the sharing of non-consensual sexual imagery.
Quick and easy explainers of my suggestions and recommendations can be found in my blogs, including one on why ‘Revenge Porn’ Is A Form Of Sexual Assault, another on why the Law Must Protect All Victims of Image-based Sexual Abuse, Not Just Upskirting, and most recently on why laws on ‘upskirting’ must not require proof of sexual gratification.
You can read the research in more depth in my article with Erika Rackley in the Oxford Journal of Legal Studies (access the full research article), which examines the harms of image-based sexual abuse and sets out the ways in which laws and policies need to be reformed.
We have further developed our ideas in another article which argues that all forms of image-based sexual abuse are part of a pattern of sexual violence, a form of sexual assault, and should be recognised as such. This article is published in the journal Feminist Legal Studies and is called: Beyond ‘Revenge Porn’: the continuum of image-based sexual abuse.
In this video produced by Durham Law School, I discuss strengthening the law on image-based sexual abuse.