Minimum Standards: Glitch’s Response to Ofcom’s VAWG Guidance, “A Safer Life Online for Women and Girls”

In November 2025, Ofcom published its long-awaited violence against women and girls (VAWG) Guidance, ‘A Safer Life Online for Women and Girls’ (hereafter, ‘the Guidance’) (1). The Online Safety Act requires Ofcom to publish this guidance (2), a requirement that resulted from the years-long campaign we and others undertook to ensure that women and girls were considered in legislation for the online environment (3). Given the original scope of the Online Safety Bill (4), the existence of this Guidance should not be taken for granted. It is no small thing to now have a regulator setting out steps to tackle online VAWG, and the Government, specifically the Secretary of State, continually affirming that platforms are expected to abide by it (5).

However, while we are glad to see the fruits of our collective efforts materialise, more can still be done to prevent online VAWG. In this analysis, an expansion of our initial shorter response here, we highlight where the Guidance falls short and what we consider necessary to develop a solid baseline of support for victims and mitigation of technology-facilitated gender-based violence on and by social media platforms.

In this response, we begin by providing an overview of the structure of the Guidance, then consider how far platform compliance with the Guidance can be achieved through regulatory enforcement and transparency. We then examine the contents of the Guidance: how far it addresses violence against Black women and Black gender-expansive people online, and its role in bringing about safety-by-design and structural change on online platforms.

We argue that to effectively tackle violence against women and girls, platforms’ duties under the Online Safety Act need to be strengthened via the legislation itself. This could be done by introducing an explicit duty on platforms to have adequate systems in place to tackle gender-based violence on their services. This would allow the full Guidance to be made into a Code, and Ofcom to take greater enforcement action where platforms fail to design their services in ways that adequately reduce the risks of gender-based and racialised violence.

Our Recommendations

  • The Government could strengthen platforms’ VAWG duties in the Online Safety Act by introducing an explicit duty on platforms to have adequate systems in place to tackle gender-based violence on their services.

This would enable the Guidance to be elevated to the status of a Code of Practice, with the 'good practice' steps within the VAWG Guidance enjoying the same enforceability as the 'foundational' steps.

  • To ensure such a duty on platforms can be robustly enforced, including where risks persist despite compliance with a Code, the Government could:

    • Remove the safe harbour provision in the Act 

    • Strengthen the obligations in the Act for platforms to mitigate the risks they identify in their risk assessments (6).

The Government could also:

  • Agree on and insert into the Act a definition of safety-by-design that incorporates design justice principles, and require Ofcom to produce a safety-by-design Code of Practice as part of the Guidance

  • Implement a data access framework to enable independent monitoring of violence against women and girls, which would support Ofcom’s compliance monitoring and the bringing of super-complaints.

There are also steps Ofcom can take, without requiring Parliament to revisit the Act, that would strengthen the Guidance and ensure the regulatory regime better protects those most disproportionately impacted by technology-facilitated gender-based violence: Black women and Black gender-expansive people.

For instance, Ofcom could: 

  • Give greater consideration, throughout the implementation of the Guidance, to how online harms are both racialised and gendered, and ensure this is measured when monitoring platforms’ implementation, including by:

    1. Setting clear measures for how platforms should assess whether Black women and Black gender-expansive people are disproportionately impacted by safety and moderation features

    2. Setting higher expectations on platforms to take measures to reduce the risks of violence against LGBT+ people.

  • Integrate safety-by-design principles, founded in design justice, throughout its Codes and Guidance, including by:

    1. Targeting more specific recommendations in future iterations of the Guidance at the more fundamental platform decisions that affect the safety of women and girls, including how business models may need to be adapted

    2. Setting higher standards for how human content moderators should be supported and compensated.

With better legislative backing, we might begin to see changes in how social media platforms design their products and respond to the risks of technology-facilitated gender-based violence, leaving Black women, girls and gender-expansive people better protected.

Read our full response here.


Endnotes
