Glitch’s Response to the UK Government’s Draft Statement of Strategic Priorities on the Online Safety Act

Last week, Peter Kyle, Secretary of State for the Department for Science, Innovation and Technology (DSIT), published the UK government’s draft Statement of Strategic Priorities (SSP) for Ofcom’s implementation of the Online Safety Act (OSA). Ofcom must now take these priorities into account as it develops how it will implement and enforce the Online Safety Act, which became law last year.

The purpose of the Statement of Strategic Priorities

The SSP is a regulatory lever available to the Government under Section 172 of the Online Safety Act. It allows the Secretary of State responsible for a regulator to set out their expectations of that regulator. Once finalised, the SSP remains in force for five years unless amended, which can happen if the Government believes its online safety priorities need a significant update. While the SSP is set to be finalised in early 2025, this first draft provides insight into how the Government expects Ofcom to act. It focuses on the following five areas:

  • Safety by design 

  • Transparency and accountability of platforms

  • Agile regulation (what we at Glitch see as regulation ‘future-proofing’)

  • Inclusivity and resilience in online society

  • Innovation in online safety technologies

The regulatory process of the SSP requires Ofcom to respond to the Secretary of State, outlining what actions it has taken against the priorities the Government put forth in the SSP. Ofcom’s response will shape DSIT’s next steps and the finalisation of the SSP, moving it from draft status to completion.

It is made clear in the SSP that the Government believes Ofcom has – under the OSA – a strong and clear mandate to regulate platforms. The Government writes, “We will retain a relentless focus on these priorities, and we expect the same from Ofcom and the online services industry.”

Our analysis 

We welcome the Government’s use of the SSP. It sets out in no uncertain terms the Government’s view that Ofcom’s expanded powers under the OSA give it scope to regulate platforms thoroughly and effectively on online safety. The emphasis on safety by design aligns with industry approaches to design justice, online safety and moderation more broadly. However, while tech companies can apply these principles when designing and building their products, the same is not true for Ofcom or indeed DSIT. Any safety by design approach needs to be supported by mandatory codes of practice, not voluntary ones (such as the guidance on violence against women and girls), to ensure the most robust protections.

Moreover, a genuine safety by design approach would recognise what Design Justice frameworks underscore: design choices are not benign, and decisions made in the design process can, for example, privilege some users over others. The Government should support Ofcom in examining a platform’s whole design approach, including in the context of the platform’s business model, so that safety by design elements can be properly contextualised. Given companies’ long-standing reluctance to examine their designs in this way, the Government must include this as a priority within safety by design. And while the Government notes that it focuses on children because they should have the “strongest protection”, other marginalised and vulnerable groups need protection too.

The Government’s safety by design approach is defined as one that, “Embed(s) safety by design to deliver safe online experiences for all users but especially children, [and] tackle violence against women and girls…” However, in the entire draft SSP, violence against women and girls is only mentioned once, where the Government says:

“The Government is particularly concerned about the amount of abuse women and girls receive online, with Ofcom’s annual Online Nation report finding that women are more likely to encounter misogynistic content or content relating to negative body image and are more likely to be negatively affected by harmful content they encounter.”

Glitch, alongside academics, civil society and activists, has researched, campaigned and outlined how Black women and other women of colour are disproportionately impacted by online misogyny and misogynistic abuse and harassment. A safety by design framework should include a specific focus on the racialised and gendered aspects of violence against women and girls. If the Government’s goal is to “prevent…harm from occurring in the first place, wherever possible”, the regulator needs enforceable regulatory levers it can act upon. The Government should, for example, impress upon Ofcom to prioritise safety by design protections for groups on the basis of “any of the following characteristics (a) race, (b) religion, (c) sex, (d) sexual orientation, (e) disability, or (f) gender reassignment”, as per sections 62 and 16 of the Act.

Threats from AI-generated content and activity are effectively mitigated

The SSP also discusses how the Government is thinking about threats from AI-generated content. On this, we are encouraged to see the Government state that Ofcom should “utilise the full provisions of the Act to mitigate the use of existing/new technologies for harmful purposes, monitoring and guiding the platforms’ own risk assessments and reinforce the duty of platforms to put in place measures to ensure that new technologies are safe for users”. We are less encouraged by yet another push for more research to build an evidence base on emerging AI risks and harms. While we support the transparency regime of the OSA, there is already a substantial body of reporting and research from civil society, the media and academia on AI harms. Furthermore, it is not clear how Ofcom will be expected to engage with this research, analyse it in a timely manner, and then disseminate its findings so they can be used by civil society and other stakeholders.

It’s now up to Ofcom to respond to the SSP. It has the Government’s backing and expectation to be ambitious and rigorous in its implementation of the OSA. We’ll continue working with partners and Ofcom to ensure Black women’s safety is prioritised in platform regulation. 
