Research

Our research aims to raise awareness and understanding of online harms experienced by Black women and other marginalised groups. Our methods show how to apply intersectionality as best practice. Here are a few examples of our work so far.

The Digital Misogynoir Report

‘The Digital Misogynoir Report: Ending the dehumanising of Black women on social media’ reveals the widespread and alarming prevalence of digital misogynoir across five social media platforms, and calls into question efforts by tech companies and the UK government to make online spaces safer, as dehumanising abuse directed at Black women is allowed (and often enabled) to proliferate online.

The report analyses almost one million social media posts and is the first to examine the prevalence of digital misogynoir on social media at this scale. Despite this widespread and unchecked abuse, current online safety research, policy efforts and the actions of major tech companies have largely ignored the combined racialised and gendered nature of online harms.

Glitch’s report provides a statistical analysis of digital misogynoir, and we call for immediate action to combat the violent dehumanisation of Black women online.

READ our full report

FUND this work by donating towards the report recommendations

JOIN our community to learn more about the findings, and how to get involved in campaigns to make a change

The Ripple Effect Report

1.8 million people experienced threatening behaviour online in the past year. Glitch and EVAW’s Ripple Effect Report showed that this abuse is worse for women and significantly worse for women of colour, and that it sadly increased during the Covid-19 pandemic.

Roundtables on AI harms: deepfake abuse & non-criminal redress

We are currently organising roundtables where experts on racism and AI harms come together to exchange ideas on how to move research forward in this area, with a particular focus on non-carceral responses to deepfake abuse.