Social media bosses to discuss suicide and self-harm content with the Samaritans


Representatives from social media giants Facebook, Google, Snapchat and Instagram have been summoned by the Government to meet the Samaritans over plans to rid the internet of content promoting self-harm and suicide.

Health Secretary Matt Hancock has convened the summit and will ask tech giants to commit to developing ways that might identify and tackle harmful content, including that promoting suicide.

The summit in Whitehall comes three weeks after the Government announced plans to make tech giants and social networks more accountable for harmful material online.

The behind-closed-doors meeting on Monday will be the second involving social media companies, but will mark the first time the Samaritans have been involved.

The maiden summit in February resulted in Instagram agreeing to ban graphic images of self-harm from its platform.

Speaking ahead of the latest meeting, which will also serve as a progress update from all involved, Mr Hancock said: “I want the UK to be the safest place to be online and give parents the confidence to know their children are safe when they use social media.

“As set out in our Online Harms White Paper, the Government will legislate to tackle harmful content online, but we will also work with social media companies to act now.

“I was very encouraged at our last summit that social media companies agreed normalising or glamorising of eating disorders, suicide and self-harm on social media platforms is never acceptable and the proliferation of this material is causing real harm.

“We have made good progress working with companies to take proper steps to improve the safety of their platforms, but we’re clear there is more to be done.

“So I am delighted to announce this world-leading partnership, which will see us team up with Samaritans to enable social media companies to go further in achieving our goal of making the UK the safest place to be online.”


Social media companies and the Government have been under pressure to act following the death of 14-year-old Molly Russell in 2017. The schoolgirl’s family found material relating to depression and suicide when they looked at her Instagram account following her death.

In a statement, a spokesman for Facebook, which also owns Instagram, said: “The safety of people, especially young people, using our platforms is our top priority and we are continually investing in ways to ensure everyone on Facebook and Instagram has a positive experience.

“Most recently, as part of an ongoing review with experts, we have updated our policies around suicide, self-harm and eating disorder content so that more will be removed.

“We also continue to invest in our team of 30,000 people working in safety and security, as well as technology, to tackle harmful content. We support the new initiative from the Government and the Samaritans, and look forward to our ongoing work with industry to find more ways to keep people safe online.”

Ruth Sutherland, chief executive of the Samaritans, said: “The internet has evolved rapidly to be a force for good and a new forum to connect with others.

“However, there has been a worrying growth of dangerous online content which is an urgent issue to combat and something we cannot solve alone.

“There is no black and white solution that protects the public from content on self-harm and suicide, as they are such specific and complex issues.

“That is why we need to work together with tech platforms to identify and remove harmful content whilst being extremely mindful that sharing certain content can be an important source of support for some.

“This partnership marks a collective commitment to learn more about the issues, build knowledge through research and insights from users and implement changes that can ultimately save lives.”

Molly Russell took her own life in November 2017 (Family handout/PA)

The Online Harms White Paper sets out a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services. Compliance with this duty of care will be overseen and enforced by an independent regulator.

Failure to fulfil this duty of care will result in enforcement action, such as fines for companies or individual liability for senior management.

Chris Price