Facebook and Twitter struggle to control racist use of emojis

A wave of online racism aimed at some of England’s Black soccer players has highlighted how social media companies’ content moderation systems are failing to monitor the use of emojis.

On Sunday, England’s men’s soccer team, playing in their first major tournament final since 1966, fell to Italy on penalties. In the aftermath, a wave of racist abuse was leveled at three Black England players — Marcus Rashford, Jadon Sancho and Bukayo Saka — and messages on social networks like Twitter, Facebook and Instagram included monkey and banana emojis.

The digital abuse isn’t a new phenomenon. The Professional Footballers’ Association and data science company Signify found in a 2020 study of tweets sent to some players that there were more than 3,000 explicitly abusive messages, with 29% of the racially abusive posts in the form of emojis, the tiny images and symbols used in texts, emails and other digital communications.

But despite the long-standing problem, abuse via emojis has continued. A more recent analysis published Monday flagged almost 2,000 tweets targeting some Black players during the European tournament as potentially abusive, and found that although a number of the tweets were deleted, Twitter Inc. didn’t permanently suspend the accounts behind them.

Social media companies such as Facebook Inc., Twitter and Google, which owns YouTube, have spent years developing algorithms to detect offensive speech so that it can be removed.

But experts say the companies have invested less effort and developed less expertise in analyzing emoji language — and that has left an opening.

Spokespeople for Twitter and Facebook said the companies have been removing posts and disabling accounts since Sunday’s final. Twitter said it acted proactively, removing more than 1,000 tweets and permanently suspending accounts in the hours after the game.

U.K. leaders condemned the hate speech, with Prime Minister Boris Johnson saying he warned executives from Facebook, Twitter, ByteDance Ltd.’s TikTok, Snapchat Inc. and Instagram at a Tuesday meeting that they need to crack down on online abuse.

Players and officials also spoke out, including Rashford in a widely shared statement on social media. “I’ve grown into a sport where I expect to read things written about myself,” he wrote. “I can take critique of my performance all day long, my penalty was not good enough, it should have gone in, but I will never apologise for who I am and where I came from.”

Bertie Vidgen, a research fellow in online harms at the Alan Turing Institute, has been working with colleagues from Oxford University to test how speech detection models, including one from Google’s Jigsaw unit, respond to offensive emojis. The findings so far have not been encouraging, and Vidgen said it’s not because emojis necessarily pose a more difficult technical challenge.
