Laws Don’t Prevent Harm: 5 Things That Will Protect Women from Deepfake “Porn”
AI deepfake videos become a hot topic with each passing election, but they are a concern for women year-round. Image-based abuse – the creation and sharing of intimate or sexual photos without consent – is a form of online gender-based violence, and it is only growing with the popularity and accessibility of AI. The Labour Government has called violence against women and girls a national emergency and voiced its commitment to addressing it. During this year’s Labour Party Conference, Home Secretary Yvette Cooper announced that Labour will introduce a law against online image-based abuse as part of its mission to cut violence against women and girls in half within the decade.
Last month, “Jodie” partnered with the End Violence Against Women Coalition, #NotYourPorn, GLAMOUR UK and sexual violence scholar Professor Clare McGlynn to launch a petition demanding the government address sexual deepfakes. In this petition, she shared how she learned her face had been used in AI-generated sexual images. This was in 2021, when the UK had no laws against using AI to create and share nude photos, and before the Online Safety Act received Royal Assent. Jodie did not get support when she reported the abuse to the police; she believes changing the law is a crucial step towards the changes she would like to see for women’s safety. Glitch is supporting their petition to demand better action to stop image-based abuse. However, we do not believe justice is guaranteed for everyone targeted by image-based abuse. We know women and girls’ safety will not come from another law. We also know Black women, girls, and other marginalised people will not be protected by the justice system in the same way because of systemic racism. For example, Victim Support’s research shows Black and ethnic minority domestic violence victims are disproportionately dismissed and have to report abuse to the police up to three times before they see action.
However, we do believe the government must prioritise and resource actions to stop women and girls from being targeted in the first place. Here’s how we can prevent image-based abuse from harming more women and girls.
1. Understand that image-based abuse and sexual deepfakes are not porn
If we want women and girls to be safe, we need to be on the same page about what is harming them. Deepfakes are AI-generated photos or videos that use someone’s face or likeness; sexual deepfakes use this technology to create sexually explicit content. What’s concerning is that image-based abuse makes up 98% of all deepfake videos online.
The key word here is consent. Deepfakes that include sexual content are popularly, yet incorrectly known as “deepfake pornography.” When sexually explicit deepfakes are created without consent, this is not sex work or pornography. It is abusive and deeply harmful to victims.
When images are taken or manipulated without consent, women and girls are stripped of their agency over their bodies. It needs to be named for what it is — abuse — instead of being treated as a sexual commodity.
2. Create specialist services for victims and survivors
The government should develop support services for the women and girls who have been impacted by image-based abuse – and survivors should be the ones leading these conversations. Survivor-led organisations should be financially resourced so that they can dictate approaches for redress that are grounded in their lived experience, trauma-informed training and expertise. Our Manifesto on Tech Accountability for Ending Online Gender-Based Violence accounts for how this can be financed. The government can use the revenue they already collect from Big Tech companies and use it for preventative online gender-based interventions. This can include by-and-for organisations providing their expertise to support online harm survivors.
Historically, victims and survivors are re-traumatised by police when reporting sexual abuse, domestic abuse, or online abuse, because of victim-blaming and lack of support. The domestic abuse organisation Refuge found that Black women were 14% less likely to be referred to them after reporting abuse to police, compared to white survivors.
Victims and survivors need services that can meet them where they are – whether they are sharing their abuse for the first time or seeking assistance a week, a month, or even a year later. These services should also be culturally informed and available for people who are racialised. Black women and other women of colour need specific interventions, appropriate to their race and gender, to heal and recover. This means support from organisations with people who share and understand their marginalised identities and lived experience.
3. The Labour government needs to focus on prevention instead of criminalisation
If the Labour government is serious about their pledge to reduce violence against women and girls, they need to focus on stopping image-based abuse before it happens. Their recent election campaign centred on bringing provisions to keep everyone safe online, specifically when using social media, so they should be held to that commitment. It is not enough to simply create new laws, relying on legislation to deal with abusers after women and girls are on the receiving end of violence.
In 2019, 14,678 deepfake videos were circulating on social media. An estimated 500,000 videos were shared in 2023. By 2025, we could see up to 8 million deepfake videos online. We are already seeing nudify apps becoming popular with children.
The government should be concerned with the attitudes and technologies that encourage and facilitate the sharing of these manipulated images.
4. Pass legislation that defines online abuse
Image-based abuse victims need legislation that defines and illustrates how online abuse is a form of gendered abuse – one that we know disproportionately impacts Black and other women of colour. These issues exist within a broader scope of racism and sexism in our society that the UK’s police and criminal justice system have been known to ignore, perpetuate, and uphold. Our research, alongside others’, illustrates that Black women are disproportionately harmed by online abuse. The systemic racism that allows Black women to be disproportionately targeted online will also allow perpetrators to be absolved of accountability – unless the laws are explicit about forms of racialised online gender-based violence, the consequences of online abuse, and how we as a society can protect each other from it.
5. The Labour government needs to hold tech companies accountable
Ofcom, the regulator responsible for enforcing the Online Safety Act, should ensure that, at the very least, tech companies report on the types of image-based abuse that appear among their content, how frequent it is, and how they moderate and respond to it as a platform.
Glitch recently responded to Ofcom’s proposed transparency guidelines on how platforms should report on various incidents, including abuse. One of our suggestions was that tech companies provide data on their harm prevention plans. We also suggested that tech companies share which internal policies, and how many, guide their trust and safety work – and what their trust and safety teams’ priorities are overall – to ensure they are doing everything in their power to prevent online harm.
So, what can you do to help?
Glitch is asking people concerned about AI and online gender-based violence to sign this petition, so we can demand government action to stop image-based abuse.
Image-based abuse does not have to be an inevitability of our increasingly online worlds. It can and should be stopped.