Pole dancer and trans activist say Instagram censored their posts while not acting on racism

A pole dance instructor and a trans activist have condemned Instagram’s ‘aggressive’ moderation of their content – despite the platform failing to crack down on racist abuse against England footballers.

Eva Echo, founder of Pass It On – an online campaign sharing conversations about trans and non-binary experiences – and Dr Carolina Are say their Instagram profiles have been ‘inaccurately’ moderated over the past three years, for nudity and for breaching unspecified community guidelines.

It comes after footballers Marcus Rashford, Bukayo Saka and Jadon Sancho were subjected to racist abuse on their Instagram profiles just hours after the Euro 2020 final on 11 July.

i found that racist posts flagged to Instagram the day after the final, including monkey emojis and derogatory slurs, were still visible on the profiles of England footballers three days later.

Instagram acknowledged it had made mistakes in moderating racist abuse directed at England footballers, with Instagram boss Adam Mosseri saying that racist posts had “mistakenly” been identified as benign by moderation technology – rather than being referred to humans to be checked.

But some say the platform has been too strict in moderating other types of content.

“I think Instagram’s moderation is pretty aggressive,” Dr Are told i. “What’s interesting is that it’s aggressive against a very specific group of people and a very specific type of content.

“A lot of my posts feature me dancing and performing tricks in a bikini, because pole dancing requires bare skin for friction and grip.”

On Tuesday she realised her profile had been disabled without warning. The next day a spokesperson for Facebook, which owns Instagram, told her it was an error and her account would be reinstated.

In the summer of 2019, much of her pole dancing content was hidden from public view without her knowledge. It later emerged she had been shadowbanned for using hashtags associated with pole dancing.

Shadowbanning is blocking or partially blocking a user or their content from an online community so that others will not see it.

Dr Are said Instagram later apologised to her and other pole dancers, but she is still fighting for her account to be safe from moderation.

She says she has struggled to grow her brand online because of the multiple setbacks.

Over the past few years, members of the trans community say they have had their images taken down and been told it was because of nudity. However, they claim similar images posted by cisgender people have not received the same action.

Eva Echo, who is also a member of the Crown Prosecution Service’s hate crime panel, said her images were taken down but she was told only that her content had breached community guidelines. She said she has seen similar posts by cisgender people allowed to remain on the site.

“[Instagram’s] moderation is full of inconsistencies and gaps, which can be exploited. I’m forever wondering which community their so-called community guidelines are designed to protect because it’s certainly not the vulnerable or the marginalised,” she told i.

On top of this, abuse she has received on the site has not been dealt with effectively, she claimed.

“When trying to appeal against hateful comments, I’m met with standard responses that are just as infuriating.

“As a platform, they hide behind algorithms and offer no human interaction when things go wrong.”

Both Echo and Dr Are are frustrated at the platform’s approach to moderation and say it cannot continue.

“There’s a clear discrepancy between the moderation of nudity and sexuality on Instagram and the moderation of any type of online hate or any type of online abuse,” Dr Are said.

She believes this is because Facebook’s algorithm struggles to understand nuance, which is why it can tackle something as straightforward as nudity so aggressively.

In 2018, Mark Zuckerberg said: “It’s much easier to build an AI system that can detect a nipple than it is to determine what is linguistically hate speech.”

Between January and March last year, Facebook took down 39.5 million pieces of content for adult nudity or sexual activity, and 99.2 per cent of it was removed automatically, without a user reporting it, according to its Community Standards Enforcement report.

In the same period, the platform took down just 9.6 million posts for hate speech – a significant rise compared with 5.7 million in the previous quarter.

The platform now has a Sensitive Content Control, allowing users to restrict content which does not break its guidelines but could be upsetting to them, such as sexually suggestive or violent posts.

Instagram has been contacted for comment.

Author: Peter Davis