Removing extremist groups from social media limits impact, report shows

Tommy Robinson was banned from Facebook (Aaron Chown/PA)

Removing extremist groups from major social media platforms such as Facebook is an effective way to limit their impact and reach, new research suggests.

A report by the Global Research Network on Terrorism and Technology cites the removal of far-right group Britain First from Facebook in 2018 as evidence that taking such groups off major platforms can cut the exposure they receive.

It claims that since the group was removed from Facebook, its following has dropped from the 1.8 million it had on the site to around 11,000 on a smaller, alternative platform, Gab.

The page of Britain First and those of its leaders Paul Golding and Jayda Fransen were removed from Facebook last year for what the social network said were violations of its community standards.

The platform has since gone further, permanently banning a number of far-right groups under its dangerous organisations policy.

In February, Facebook also announced a permanent ban for far-right activist Tommy Robinson, real name Stephen Yaxley-Lennon, for behaving “in ways that violate our policies around organised hate”.

The new report encourages social media companies to continue removing extremist groups and urges the UK and US governments to work with smaller, more fringe social networks in order to better regulate content on those sites.

The researchers argue that banning extremist groups from major social media platforms leaves them without a “gateway” to larger pools of potential recruits and removes their ability to signpost people to their pages on other platforms.

Earlier this year, the UK Government published its online harms white paper, which proposes new regulation under which social media firms could face large fines and other penalties if they fail to protect users from hateful, extremist and other inappropriate content.

The proposals followed several years of criticism of platforms such as Facebook and Twitter, which have been accused of not doing enough to prevent the spread of such content.

After Facebook announced its permanent ban of some far-right groups in April, Yvette Cooper, chairwoman of the House of Commons Home Affairs Select Committee, said the steps were “long overdue”.

“For too long social media companies have been facilitating extremist and hateful content online and profiting from the poison,” she said.

Chris Price