Our role in Trust and Safety work is to hold tech companies to account, represent the safety needs of Black women on the platform and ensure that our expertise shapes policy and practice in the day-to-day running of the platform. Our position as a Trust and Safety Council member was voluntary and advisory: we did not make decisions on behalf of the company and could only advise based on evidence and expertise.
We know we cannot do our work separately from tech, but we also know that Trust and Safety councils can be used by tech companies to claim they are working with civil society to improve policy and practice, even when neither is truly taking place. For this reason we do not see ourselves as a partner to tech companies; instead, we take on our role as a ‘critical friend’ with great responsibility, determination and care for our community.
As a Black feminist organisation we hold, as core to our strategic advocacy goals, the breakdown of power structures within tech companies that contribute to harm in our community. The vast majority of the world’s largest tech companies are developed by, run by and run for white western communities, upholding the harms created by white capitalist heteronormative patriarchy and making millions or billions of dollars in profit from the engagement of users on their platforms in the process. We know the business models that drive decisions and policies within these companies are often in direct opposition to the safety, joy and empowerment of Black women all over the world, alongside many other marginalised groups. As highlighted by Access Now’s Marwa Fatafta: “Their modus operandi has been: if it’s not in English, if it’s not happening in ‘the west,’ and it’s not affecting their profit margins, then it’s not worth investing time and energy into.” This has been a long-standing, fundamental barrier to our work over recent years. Despite this, we also know Black women have been instrumental in creating safe and joyful spaces online for activism, education and social connection — harnessing the power of tech despite being structurally marginalised by the way it has been developed.
Tech platforms like Twitter are not a “public square”. Social media platforms are businesses, many of which have become instrumental in our communities, democracies and public debate because of how these online spaces can shape the perceptions, views, access to information, participation and visibility of their users. We must acknowledge the power and responsibility that tech platforms hold, alongside our awareness of the profits from advertising and data collection that drive their decision making. Platform users may treat these spaces as “public” forums for discussion; however, because platforms are driven by extracting wealth from users, they represent specific private interests (usually their shareholders). This means that although safety is a systemic issue, it is not the primary interest driving decision making in these companies.
Despite the challenges we’re up against, our role in Twitter’s Trust and Safety work in recent years has included advising the platform on developing a Hateful Conduct policy that (until now) acknowledges the differential impact of hate speech on “marginalized and historically underrepresented communities”, and that “for those who identify with multiple underrepresented groups, abuse may be more common, more severe in nature and more harmful”. This work was advised and shaped by the now disbanded Trust and Safety Council, a coalition of diverse, global civil society partners that achieved crucial progress on safety policies at the company. This specific policy is particularly important in supporting and protecting Black women, who experience abuse in the context of misogynoir and white supremacy, so we hope it will remain in place under the new leadership.
Our Trust and Safety work at other platforms includes working with TikTok to stop a harmful audio clip of a domestic abuse incident from trending, and advising the company on responding to prominent influencers, including Andrew Tate, who spread hateful and dangerous messages about women.
We also worked to develop the Harassment Manager Tool at Google Jigsaw, which is now run by the Thomson Reuters Foundation as TRFilter, supporting journalists experiencing abuse to filter and collect evidence of it and submit that evidence to platforms and authorities. This work, although not preventative, is important in creating trauma-informed ways for victim-survivors of abuse to evidence it and take action if they’d like to. Our Trust and Safety work therefore involves working with tech platforms to improve safety tools, policies and processes, and in doing so helps hold those same platforms accountable for implementing changes and responding to harms spotlighted by our work.
Our role as critical friend to tech companies is an important part of what we do, but it is not the only way we work to hold tech platforms to account. Alongside our Trust and Safety work, we do research, lobbying and awareness-raising on how platforms continue to uphold harmful and discriminatory practices and systems that result in harm for Black women: harm that censors the freedom of expression of Black women who, our research shows, are more likely to experience abuse and more likely to modify their behaviour because of it (including self-censorship, self-removal and reduced participation online).
We know that this harm can be prevented, mitigated and stopped with the right investment and political will from leadership. The potential impact and significance of the decisions and will of tech leadership have never been clearer than in the recent changes at Twitter. Following the high-profile and abrasive change in leadership at Twitter, we’ve seen a number of measures introduced that are impacting the Glitch community and our work, including drastic workforce changes, a new business model, amateur decisions on verification, the removal or loosening of safety policies and the disbanding of the Trust and Safety Council.
We are primarily concerned that Black women may be at increasingly greater risk of harm on the platform, given that self-declared “free speech absolutist” billionaire Elon Musk’s leadership immediately led to a 500% increase in the use of the n-word on Twitter. We are also concerned that the number of Trust and Safety workers at Twitter who have been fired or have left the company could lead to more ineffective, amateur and unsafe practices due to the loss of sector knowledge and experience, as we saw in the verification decisions made in the early months of the new leadership.
Finally, some of the longest-standing coalition partners from Twitter’s Trust and Safety Council resigned before the council was disbanded. This was due to their concerns about the impact of the new leadership on the Council’s work, including the declining safety and wellbeing of Twitter users, widespread dismissals of Twitter employees, and potentially damaging changes to content moderation practices. We shared many of their concerns and had planned to raise them directly with the platform via the Council. We are also deeply concerned about the response to these resignations, which included abuse, misinformation and direct comments from Twitter CEO Elon Musk linking these members to decisions they played no part in. Our concerns about abusive attacks on the council members led us to raise concerns directly with the Trust and Safety team yesterday, demanding the company stop misrepresenting the council’s role. We, alongside other members, were concerned that the company’s actions have endangered current and former Council members, while intimidating us into fearing resignation or speaking out.
Glitch fulfilled its position on the Trust and Safety Council in line with what would have the most impact on safety for Black women on the platform. Despite the Council being disbanded, we will continue and strengthen our lobbying efforts to influence policy and practice internally, in order to make the social media platform safe for Black women.