Protecting the Children? Our Position on Social Media Bans

Our position 

On 20 January 2026, the Government announced that it would consult on children’s social media use “to protect young people’s wellbeing and ensure safer online experiences” (1).

We empathise with struggling parents and are also deeply concerned about children and young people viewing negative and harmful content online, whether they encounter it intentionally or not. However, we caution against legislative proposals promoted and ushered in amid a Government and media moral panic over young people’s social media use, and then framed as a panacea, particularly given the difficulties that Ofcom and other regulators worldwide already face in regulating Big Tech.

Even more concerning are bans approved in the absence of evidence and without input from children and young people themselves, especially when the potential consequences of social media bans and restrictions are not yet well understood. We should be concerned about restricting children and young people’s use of ‘user-to-user’ services, which encompass far more than the major social media apps, for information seeking, connection and support.

There is much we can learn from Black women that can equip us all with the capabilities and tools to make use of the internet and online tools more carefully. As Catherine Knight Steele writes in Digital Black Feminism:

“What if liberal politicians and progressive writers asked Black women how we made political calculations amid a barrage of fake news and disinformation? What if we inquired about Black women’s relationship with social media and technology, a relationship that did not shield us from exposure but provided a skill set to navigate trolling and hate speech online? What if we tried to learn how the history of Black women’s use of technology and long-developed skills in intra- and intercultural communication better equipped us to be purveyors of social media, making better decisions for ourselves and society?” (2)

Amidst proposals for a social media ban, we would draw attention to the cultures of fear, exclusion and punishment from which bans are formed. There is a risk that young people — who will inevitably seek loopholes to access social media — have negative or dangerous experiences on alternative platforms and then, because a ban is in place, are discouraged from disclosing any experience of harm (3). Bans in other contexts - such as the prohibition of alcohol - have rarely been effective in bringing about meaningful behaviour change (4). Similarly, the prohibition of illegal substances has led to preventable deaths, racist policing and the criminalisation of those most marginalised in society, particularly unhoused people, sex workers, Black communities and those who exist at these intersections (5).

For many children, particularly LGBTQ+ young people and those from other marginalised communities, social media can be a lifeline — one that is not easily replicated offline. For children who are poorer, who live outside cities, or who lack access to the cultural opportunities their wealthier counterparts enjoy and instead turn to platforms like YouTube, a social media ban risks locking them out of the benefits of cultural participation. And it’s hard not to hear the “middle class nostalgia” (6) and elitism in retorts that children should rather be ‘playing outside’, while major cities with higher proportions of Black and brown children go without the right to clean air (7), vital youth clubs close (8), community spaces shrink, and young Black people’s friendships are criminalised (9). As the Government debates lowering the voting age to 16, prohibiting young people from the key spaces where they develop much of their political and democratic awareness would be a mistake. However, this could also be by design. We know that social media, particularly TikTok, equips Gen Z with access to more progressive viewpoints (10), and we’ve seen the US’s response to this - forcing ByteDance to place TikTok under US ownership.

Bans on children accessing social media could also reduce the incentive and pressure on platforms to take meaningful steps to address harm on their services for both adults and children. The impact on children and young people of harmful content encountered through passive exposure raises important questions about platform design and accountability, which the Online Safety Act, if enacted as intended, aims to address. More broadly, blanket bans also shift the burden away from a whole-society approach (such as proper funding of youth clubs and libraries, reducing the cost of childcare, increasing child benefit payments and revamping PSHE curriculums) and away from ending systemic online abuse, including digital child sexual abuse material (CSAM).

Given the existing legislative framework, and given that platforms are failing to ensure adequate protections to mitigate children seeing harmful content, we would suggest that this calls for stronger scrutiny of the Online Safety Act’s implementation, with amendments where necessary, rather than additional legislation to introduce a ‘ban’.

Recently, an amendment was passed in the Lords to prevent children from using VPNs (Virtual Private Networks) (11), amidst concerns that VPNs would be used to circumvent any social media ban (12). We would caution against any undermining of privacy and safety tools such as VPNs, which are commonly used as a cybersecurity measure to protect against risks, like cyberstalking, that both adults and children face.

In light of this, and considering our belief in transformative justice approaches as the best means of driving systemic change for social problems, we do not support proposals for blanket bans - such as barring children under the age of 16 from social media platforms. Instead, we advocate for alternatives that support harm minimisation and equip young people, and adults, with digital literacy. These include:

Improving online safety regulation

  • The Government should ensure that Ofcom’s enforcement of the Online Safety Act — as a matter of priority — examines how the design and function of social media platforms, alongside their business model, are negatively impacting children

  • The introduction of statutory duties on platforms to act to reduce the risks of violence against women and girls on their services, so that young girls are better protected online

  • The introduction of robust AI regulation that upholds children’s rights, and can keep pace with technological developments.

Investing in media and information literacy 

  • Introducing a statutory, mandatory media and information literacy educational entitlement in schools (13)

  • Supporting caregivers and young people to navigate conversations about safe social media use, for example by funding the local councils and civil society organisations that provide such services

  • Further guidance from the Government on the phones-in-schools policy (14), which would see primary and secondary schools ensuring mobile phones are not accessible or available to children during the school day.

Above all, before rushing into legislative changes which could affect the lives of millions, we believe that the Government should conduct a robust review of the academic literature, consult with children and young people and the children’s rights and digital justice experts who represent them, and prioritise an evidence-first and human rights-based approach to policymaking in this arena. Three months is not long enough.

Glitch’s Executive Director, tèmítópé lasade-anderson, says:

“In a world where tech companies hold significant power over what we see online and how we use their products, the instinct to introduce a social media ban to protect children is understandable. But a social media ban is not a panacea.

This proposal risks driving children into much more negative and darker online forums, reduces the pressure on tech companies to make their products safer for all of us, and lends credence to a State-controlled ban and surveillance infrastructure which always ends up disproportionately harming Black and racialised people.

Rather than fanning the flames of a moral panic around children and social media, we need to take an evidence-based, harm reduction and whole-society approach. This looks like ensuring the Online Safety Act is fit for purpose; having a robust digital and media literacy strategy; the introduction of a holistic AI Bill, and, crucially, increased financial support for parents and carers, and funding libraries and youth centres.”

Background 

Over the past few weeks, concerns about young people and social media have been exacerbated by Twitter/X’s AI, Grok, which is primarily used in a chatbot format, and the ease with which users were able to use it to create and share undressed images of people – which may amount to intimate image abuse – and “sexualised images” of children that may amount to child sexual abuse material (CSAM) (15). You can read Glitch’s analysis of this issue, and of Ofcom’s investigation into the extent to which Twitter/X adhered to its duties under the Online Safety Act to protect its users in the UK (16), here.

After numerous debates and political statements on the issue, the Government brought forward a ban on nudification apps, and also announced it would be bringing forward the enforcement provisions under the Data (Use and Access) Act that would seek to criminalise the creation of nonconsensual intimate images.

But then Kemi Badenoch got involved, announcing on 11 January that the Conservative Party would ban under-16s from social media outright (17), following in the footsteps of the Australian government, which introduced a mandatory minimum age of 16 for accounts on certain social media platforms, effective from 10 December 2025 (18).

Shortly after, on 18 January, MPs sent a cross-party letter citing constituents’ reports that “children are anxious, unhappy, and unable to focus on learning” (19). Peers in the House of Lords also began lobbying in support of an amendment to the Children’s Wellbeing and Schools Bill that would enact a ban within a year of the Bill passing (20).

Buckling under pressure, the Government announced it would consult on various measures, launching a “national conversation” with parents on the impact of technology on children’s wellbeing, including nationwide events to hear views (21). Other stakeholders mentioned include organisations representing children and bereaved families, technology companies, independent experts, and children and young people themselves (though it was unclear whether there would be a public call for evidence and submissions).

What does existing legislation do? 

The Online Safety Act introduced duties on social media platforms and other providers to “identify, mitigate and manage the risks of harm … from illegal content and activity, and content and activity that is harmful to children” (22). These duties are extensive, and include requirements on platforms to have systems in place to remove illegal content from their services, as well as to prevent or protect children from encountering harmful or age-inappropriate content. This includes preventing children from encountering pornographic content and suicide, self-injury and eating disorder content, as well as protecting them from abuse, bullying, violence and other harmful content (23).

Platforms which are likely to be accessed by children are required to have safety measures in place, such as “highly effective” age verification and/or age estimation, in order to meet these duties. These duties are now in force, as we saw last summer during the furore over age verification on pornography platforms and the resulting spike in VPN usage in the UK. As we mentioned above, Ofcom has already opened investigations into platforms which may not be in compliance (24).

This means that social media platforms likely to be accessed by children already have legal obligations to protect children on their services from harm. Consequently, if the Online Safety Act is implemented as intended, a social media ban to protect children from harmful content online should not be necessary. But writing legislation is one piece of the puzzle, and ensuring businesses respond accordingly is another. When a company has more power and income than some nation-states (25), following a law made by the UK may look like an option rather than a strict requirement when weighing up the opportunity cost.

An amendment to the Children’s Wellbeing and Schools Bill, passed in the House of Lords last week (26), suggests that the proposed parliamentary approach to enacting a ban would be to age-gate platforms, i.e. “requir[ing] all regulated user-to-user services to use highly-effective age assurance measures to prevent children under the age of 16 from becoming or being users” (27).

Although this amendment would give the Secretary of State powers to modify which online platforms are placed under this duty, at least as it is written it would apply to all regulated user-to-user services. This means it would include any “internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service” (28). As a result, it may cover a wide and diverse range of online services, including Wikipedia, for example, that fall well beyond colloquial and practical definitions of ‘social media’. Such a sweeping ban would result in unintended negative ramifications for children’s access to information and freedom of expression. For instance, unless the Secretary of State exempts them, some online support services - such as Childline - might have to block children from using them.

What does research tell us about social media bans?

On the same day the Government announced the launch of its consultation (20 January), a Scientific Consortium comprising 14 leading UK scholars, who collaborated to synthesise existing evidence on population-level impacts of social media, smartphones and AI, published their report reviewing current research funding in this area, and recommended strategic research projects for strengthening the causal evidence base over the next two to three years (29). 

The Scientific Consortium’s findings support our desk-based research undertaken while writing this position paper: that there has been no high-quality scientific study of social media reduction in children (30); that the evidence supporting the effectiveness of bans and severe restrictions on social media for adolescents is limited (31); and that research on the impact of social media on adolescent mental health is also limited (32).

Of the evidence that does exist, findings remain mixed and do not support universal claims that social media use is inherently harmful (33). Some recent research suggests that time on social media is not predictive of mental health outcomes; meanwhile, research across various types of bans affecting adolescents (from smartphone bans and book bans to banning transgender youth from sports participation) indicates that such restrictions carry several negative consequences for young people’s mental health (34).

Crucially, time spent on social media has not been proven to be a reliable indicator of risk on its own. According to DSIT’s Scientific Consortium report, “our ability to determine whether there is a causal impact of time spent on social media on population-level mental health and wellbeing remains poor.” (35) Rather, it is the type of content that young people are exposed to - from distressing or sexualised content to unrealistic body ideals - that can amplify harmful outcomes. That this consumption of content can occur via passive exposure (i.e. through feeds and recommendations), rather than being actively sought out by young people, raises important questions about the ways platforms are designed and about the use of algorithms and advertising as their business model — what Shoshana Zuboff terms “surveillance capitalism” — but it also challenges assumptions made about the diverse types of content that young people might consensually and intentionally search for online in order to seek understanding, connection or support (36). Social media can also play an important role in young people’s lives, supporting their psychosocial needs, the building of connection and the provision of safe spaces, particularly for marginalised groups (37). Context is key: exposure alone has not been proven to cause harm; risk instead arises from the interaction of different factors, such as family circumstances, peer relationships, school environments and sleep.

Recognising this lack of an evidence base, the Scientific Consortium convened for DSIT has recommended “strategic research projects for strengthening the causal evidence base.” (38) Notably, the University of Cambridge announced it would be conducting the “world’s first major scientific trial looking at the effects of reducing social media use among adolescents, and whether it improves mental health and wellbeing.” (39) The ‘IRL Trial’, funded by the Wellcome Trust, will launch in the spring with an initial feasibility study and accompanying report, followed by a full 6-week study taking place in autumn/winter 2026. The researchers, led by Professor Amy Orben and Dr Dan Lewer, are aiming to have data analyses completed by the middle of summer 2027.

Footnotes

(1) DSIT/DfE (2026), Government to drive action to improve children’s relationship with mobile phones and social media, https://www.gov.uk/government/news/government-to-drive-action-to-improve-childrens-relationship-with-mobile-phones-and-social-media

(2) Steele, CK. (2021), Digital Black Feminism, NYU Press, p.6

(3) Champion, KE., Birrell L., Smout S., Teesson M., Slade T. (2025) Beyond the ban - empowering parents and schools to keep adolescents safe on social media. https://acamh.onlinelibrary.wiley.com/doi/full/10.1111/camh.70032 

(4) Beck, J. (1998) 100 years of “just say no” versus “just say know”: re-evaluating drug education goals for the coming century. https://pubmed.ncbi.nlm.nih.gov/10183299/

(5) Eastwood, N. (2025), Drug-related deaths increase as government response falls dangerously short, Release, https://www.release.org.uk/blog/drug-related-deaths-increase-government-response-falls-dangerously-short

(6) Livingstone, S. (2026), The UK shouldn’t rush to a social media ban for children under 16, LSE Blogs, https://blogs.lse.ac.uk/politicsandpolicy/the-uk-shouldnt-rush-to-a-social-media-ban-for-children-under-16/?123

(7) Kelly, J.W., and Warren, J. (2024), Air pollution death settlement is not a win - mum, BBC News, https://www.bbc.co.uk/news/articles/c5yx6leg4nqo

(8) UNISON (2024), Closure of more than a thousand youth centres could have lasting impact on society, https://www.unison.org.uk/news/2024/06/closure-of-more-than-a-thousand-youth-centres-could-have-lasting-impact-on-society/

(9) Liberty (2023), New figures reveal Black people 16 times more likely to be prosecuted under ‘racist’ joint enterprise laws, https://www.libertyhumanrights.org.uk/issue/new-figures-reveal-black-people-16-times-more-likely-to-be-prosecuted-under-racist-joint-enterprise-laws/

(10) Carnegie, M. (2022), Gen Z: How young people are changing activism, BBC, https://www.bbc.co.uk/worklife/article/20220803-gen-z-how-young-people-are-changing-activism

(11) Lord Nash's amendment, After Clause 27, Children’s Wellbeing and Schools Bill, Amendment 92, https://bills.parliament.uk/bills/3909/stages/20215/amendments/10027478

(12) Contribution by Lord Nash (2026), Debate on Children’s Wellbeing and Schools Bill, Hansard, https://hansard.parliament.uk/Lords/2026-01-21/debates/FDF32A4B-6004-4C08-8995-EB06C45C0B65/Children%E2%80%99SWellbeingAndSchoolsBill?highlight=vpn#contribution-C045A781-8929-4F0A-B018-0EC863AA7FBE

(13) Media and Information Literacy Alliance (2025), Media & Information Literacy Experts’ Joint Statement, https://mila.org.uk/wp-content/uploads/2025/12/Joint-statement-on-Curriculum-Assessment-Review-FINAL.pdf 

(14) BBC (2026), Schools in England told to ban mobile phones all day, https://www.bbc.co.uk/newsround/articles/cly3j3rm0lwo

(15) UK Parliament Education Committee (2026), MPs ‘deeply alarmed’ by Grok AI images, says Education Committee Chair, https://committees.parliament.uk/committee/203/education-committee/news/211282/mps-deeply-alarmed-by-grok-ai-images-says-education-committee-chair/

(16) Ofcom (2026) Ofcom launches investigation into X over Grok sexualised imagery, https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/ofcom-launches-investigation-into-x-over-grok-sexualised-imagery 

(17) Nathoo, L. and Seddon, P. (2026), Tories would ban under-16s from social media, BBC News, https://www.bbc.co.uk/news/articles/cx2wyeqw3gpo

(18) Livingstone, H. (2025), Australia has banned social media for kids under 16. How does it work?, BBC News, https://www.bbc.co.uk/news/articles/cwyp9d3ddqyo

(19) Khalil, H. (2026), More than 60 Labour MPs urge PM to ban social media for under-16s, BBC News, https://www.bbc.co.uk/news/articles/c1dk0g5yk06o

(20) Children's Wellbeing and Schools Bill, Running List of All Amendments on Report, Tabled up to and including 9 December 2025, https://bills.parliament.uk/publications/63901/documents/7465

(21) DSIT/DfE (2026), Government to drive action to improve children’s relationship with mobile phones and social media, https://www.gov.uk/government/news/government-to-drive-action-to-improve-childrens-relationship-with-mobile-phones-and-social-media

(22) Online Safety Act, 1 (2)(a), https://www.legislation.gov.uk/ukpga/2023/50

(23) Online Safety Act, (12) and (61), https://www.legislation.gov.uk/ukpga/2023/50

(24) Ofcom (2025), Enforcing the Online Safety Act: Ofcom opens nine new investigations, https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/enforcing-the-online-safety-act-ofcom-opens-9-new-investigations

(25) Fernandez R., Klinge, T.J., Hendrikse, R., and Adriaans, I. (2021), How Big Tech Is Becoming the Government, Tribune, https://tribunemag.co.uk/2021/02/how-big-tech-became-the-government 

(26) Seddon, P. (2026), Lords back UK social media ban for under-16s, https://www.bbc.co.uk/news/articles/cz0pnekxpn8o 

(27) Lord Nash's amendment, Clause 27, Children’s Wellbeing and Schools Bill, Amendment 94, https://bills.parliament.uk/bills/3909/stages/20215/amendments/10027477

(28) Online Safety Act, 3 (1), https://www.legislation.gov.uk/ukpga/2023/50

(29) DSIT (2026), Understanding the impact of smartphones and social media on children and young people, https://www.gov.uk/government/publications/understand-the-impact-of-smartphones-and-social-media-on-children-and-young-people/understand-the-impact-of-smartphones-and-social-media-on-children-and-young-people-executive-summary

(30) Lewsey, F. (2026) Thousands of UK schoolchildren to take part in major study of social media use and teen mental health, University of Cambridge, https://www.cam.ac.uk/stories/irl-trial-social-media-study-launch

(31) Wiederhold BK. (2025) Are social media bans the solutions to the youth mental health crisis? Some governments think so. https://journals.sagepub.com/doi/full/10.1089/cyber.2025.0116#B10-cyber-2025-0116

(32) Valkenburg et al. (2022) Social media use and its impact on adolescent mental health: An umbrella review of the evidence. https://www.sciencedirect.com/science/article/pii/S2352250X21001500

(33) Science Media Centre (2026) Expert comments on evidence on benefits and harms of social media and social media bans on young people, https://www.sciencemediacentre.org/expert-comments-on-evidence-on-benefits-and-harms-of-social-media-and-social-media-bans-on-young-people/

(34) McAlister K., Beatty, C., Smith-Caswell, J., Yourell, J., Huberty, J. (2024) Social media use in adolescents: bans, benefits and emotion regulation behaviours. https://mental.jmir.org/2024/1/e64626/#ref26

(35) DSIT (2026), Understanding the impact of smartphones and social media on children and young people, https://www.gov.uk/government/publications/understand-the-impact-of-smartphones-and-social-media-on-children-and-young-people/understand-the-impact-of-smartphones-and-social-media-on-children-and-young-people-executive-summary

(36) Bear H., Fazel M., Skripkauskaite S. (2025) Isolation despite hyper-connectivity? The association between adolescents’ mental health and online behaviours in a large study of school-aged students. https://link.springer.com/article/10.1007/s12144-025-07643-z

(37) Charmaraman L., Hernandez JM., Hodes R. (2022) Minoritized and understudied populations using digital media. In Nesi J, Telzer EH, Prinstein MJ, editors. Handbook of Adolescent Digital Media Use and Mental Health. Cambridge: Cambridge University Press

(38) DSIT (2026), Understanding the impact of smartphones and social media on children and young people, https://www.gov.uk/government/publications/understand-the-impact-of-smartphones-and-social-media-on-children-and-young-people/understand-the-impact-of-smartphones-and-social-media-on-children-and-young-people-executive-summary

(39) Lewsey, F. (2026) Thousands of UK schoolchildren to take part in major study of social media use and teen mental health, University of Cambridge, https://www.cam.ac.uk/stories/irl-trial-social-media-study-launch
