Activist Seyi Akiwowo: 'We must protect Black mental health by stopping the circulation of violent footage online'

With footage of Rayshard Brooks' killing the latest in a series of distressing videos circulating online showing the police's deplorable treatment of Black people, Seyi Akiwowo, the founder of Glitch, a not-for-profit organisation working to end online harassment, is campaigning for individuals and tech giants to do their part to make social media a safe space for all

'Black communities are already disproportionately impacted by online abuse and violence, and a new development is further threatening their ability to occupy online spaces safely: videos and images capturing the brutalisation of Black people are being shared widely. With so many of us spending more time online during lockdown than ever before, our responsibility to act with compassion on social media platforms has never been more important.

Although some social media companies offer basic settings that users can apply to control the display of violent material, in recent weeks these settings have failed. With images of Desmond Ziggy Mombeyarara's violent arrest and George Floyd's murder still circulating on social media, footage of the shooting of Rayshard Brooks outside a Wendy's in Atlanta just last Friday has now surfaced, without warnings or any option to pre-filter the content. At Glitch, a charity dedicated to making the online space safe, we champion digital citizenship. That means personal responsibility, but it also means social media companies playing their part, so through our petition we're calling on tech companies to fulfil their duty to protect users' welfare by blurring graphic content and warning users before they see it.

The internet, Instagram, Twitter and new players such as TikTok have become extensions of our offline public spaces, adopting all the beauty of human interaction, creativity and expression as well as all the ugliness. However, unlike our offline spaces, where we have the rule of law and social norms, we have yet to establish equivalent rules and norms online. While Instagram has introduced a feature on Stories that allows users to blur images and videos showing the brutalisation of Black people, Instagram posts, Facebook stories and posts, Twitter posts and TikTok content all currently lack these settings, taking away users' ability to decide whether or not to engage with this deeply distressing content. If these companies don't act now to respect the lives of Black people, we're in real danger of losing sight of our own responsibility for our digital footprint and actions.

This also goes for online hate speech, which should be treated just as it would be if it were said aloud outside a shopping centre. We should treat others online the way we would like to be treated. When someone is in life-threatening danger offline, our instant reaction is usually to call 999, but we can't do that in our online spaces. Facebook, for example, has 2.38 billion monthly active users, yet it has no democratically elected governance structure and no emergency services. And with support services for victims of online abuse so scarce and under-resourced around the world, it is all the more important that we are that support for each other online.

I am certainly by no means absolving governments or billion-dollar private social media companies of their responsibilities to us. But alongside their interventions, I encourage everyone to demonstrate digital citizenship on their own platforms. Digital citizenship is about our digital rights as well as our digital responsibilities – all the more so if you've cultivated a large platform. Just as we would offline, we cannot sit back and watch online bullying, trolling and videos of violent racial abuse escalate on our timelines, all in the name of engagement or entertainment, or because we've fallen for the myth that 'it's not real'.

First, there is no ‘real world’ versus the ‘online world’. The online world is very much real with consequences, from earning significant amounts of money and sponsorship deals right through to censoring and increase death by suicide rates linked to social media. Amnesty International’s survey highlighted the psychological trauma of online abuse: 61% of those who said they’d experienced online abuse or harassment said they’d experienced lower self-esteem or loss of self-confidence as a result and 55% said they had experienced stress, anxiety or panic attacks after experiencing online abuse or harassment. As well as this, being continuously exposed, without consent or warning, to the last moment’s of Black people’s lives is incredibly disturbing and unsafe for all users, particularly Black communities. Research has shown a quarter of people who see content of violent events develop symptoms of PTSD.

It is time to start a conversation about how we can be active bystanders online and to ask social media companies to take responsibility for making their platforms safer for Black people. The online space can only be a positive social good if we reflect on our own behaviour as digital citizens and adopt zero tolerance for online abuse and the circulation of racially abusive content.'

Please sign and share Glitch’s petition calling for social media companies to step up and make their platforms safer for Black people, because Black Lives Matter.