Black women have been leading movements for equality offline and sounding the alarm about digital misogynoir for years: the very same people who defend our right to safety in the streets are subject to disproportionate harm on social media. Our recently released Digital Misogynoir Report draws on a long tradition of Black feminist scholarship and activism to provide a statistical analysis of digital misogynoir so that tech companies, governments and concerned digital citizens can bring the necessary energy and action to end online abuse for good.
The internet is both a mirror and a magnifying glass. The unchecked and alarming prevalence of digital misogynoir revealed in our report reflects the structural and interpersonal violence enabled by the dominant culture offline. It also brings the links between various forms of hate into focus, revealing how they strengthen and grow as they move between online platforms. We also see reflections of humanity at its best in our report — generous, joy-filled and encouraging — as well as examples of the ways communities self-organise to create spaces that promote and protect mental health and wellbeing.
If we want to create real change in our society, we must include online spaces in our understanding of what liberation looks like. Whether on the streets or online, we must demand, build and protect spaces that reflect the world we want to live in — one that is joyful, safe, kind and fair.
Our Digital Misogynoir Report is a step towards a more joyful world. It adds to the small but important evidence base about the ways Black women are harmed online and is the first study to examine digital misogynoir across multiple online platforms. In the report, we unpack what digital misogynoir looks like, providing further evidence of how Black women are disproportionately affected by online abuse and toxicity.
It’s helpful to define some terms and name the problem.
Online abuse is any and all forms of abuse, intimidation and violence in online spaces, from hate speech targeting marginalised communities to racist, sexist, transphobic or xenophobic comments. Online gender-based violence is any form of online abuse that focuses on or makes reference to a person’s actual or perceived sex or gender identity.
To help name and research the way online abuse and online gender-based violence meet and compound in the lives of Black women, we utilise the term digital misogynoir. Misogynoir — coined by Dr. Moya Bailey and made viral by Trudy (aka @thetrudz) — is the “particular venom directed at Black women”. Digital misogynoir is the continued, unchecked and often violent dehumanisation of Black women online, including on social media.
Digital misogynoir’s negative impact on the mental and emotional health of Black women is particularly dangerous because of its ability to incite offline violence. In the US, for example, after spending time on far-right social platforms, white supremacist Dylann Roof went on to murder nine Black church members, six of whom were women, while they were at Bible study. In the UK, misogynoir helps explain the sustained and targeted harassment of Meghan Markle in the tabloid press and online. More recently, we see digital misogynoir in the treatment of Kelechi Okafor and Dr Shola Mos-Shogbamimu, who raised important questions, in response to the Lucy Letby case, about the ways white women are enabled to cause harm.
If tech companies and governments aren’t researching and resourcing defences against digital misogynoir, they will continue to miss a key element of online abuse and its offline impacts. If Black women are not safe online, no one is.
We see evidence of this in the report’s findings. Misogynoir and misogyny are prevalent across all five social media platforms studied in the Digital Misogynoir Report, with posts about women significantly more toxic than the average social media post. Out of almost 1 million posts about women, 20% were highly toxic, amounting to over 1,000 highly toxic posts per day. As external researchers into this problem, we have fewer resources, tools and data at our disposal than major companies have on their own systems, so our findings only begin to scratch the surface. Imagine this toxicity at its true scale.