Glitch Consultation Response to Ofcom’s Additional Safety Measures Consultation

Executive Summary

Glitch welcomes the opportunity to respond to Ofcom’s consultation on additional safety measures under the Online Safety Act (OSA), with a particular interest in Sections 8 and 9 on proactive technologies (PTs) and Section 14 on recommender systems. We do so as a Black feminist digital rights charity committed to ensuring technology does not replicate or deepen discrimination against Black women and Black gender-expansive people, including survivors of intimate image abuse (IIA) and all those at these intersections.

In this consultation response, we focus on recommender systems and proactive technologies (automated content moderation and hashmatching), as these are the tools that either facilitate or mitigate the spread of online abuse and IIA against our focal population. On IIA and hashmatching to stop the spread of non-consensually shared images: victim-survivors are routinely retraumatised by the burden of repeated reporting to secure content takedown. Relatedly, with regard to automated content moderation, Black women are disproportionately targeted by digital misogynoir, with automated moderation tools frequently misclassifying explicitly racist content as not meeting the threshold for reporting, while simultaneously over-censoring Black women’s bodies and perspectives through shadow banning(1).

For Glitch, this is not only about technical standards but about whether Black women can trust these systems to protect them fairly and with real accountability. We believe that a racial and gender justice approach, robust audits and enforcement, and independent oversight are essential if PTs are to reduce harm rather than reinforce injustice.

Our key points are:

  1. Mandatory independent audits for accuracy, bias, and transparency: incorporating independent audits, specifically Human Rights and Equality Impact Assessments (HREIAs), into the proactive technology proposals overall, with findings made public and available to Ofcom for review.

  2. Clearer guidance on humans in the loop: given the reduction of trust and safety staff across all major platforms, we suggest following the model of the Digital Services Act’s Trusted Flagger scheme.

  3. Race and gender impact standards, ensuring misogynoir, transphobic abuse, and other forms of gendered racism are recognised as distinct harms.

  4. Inclusion of IIA among the relevant illegal harms covered by the hashmatching and proactive technology proposals.

  5. Redress pathways for those who have had content banned or taken down due to PTs.

You can read our full consultation submission to Ofcom here.


Endnotes
