Social media firms must ‘purge’ harmful content, warns Health Secretary


Social media firms need to “purge” the internet of harmful content that promotes self-harm and suicide, the Health Secretary has said.

Matt Hancock delivered the message after the father of a teenager who took her own life said Instagram “helped kill my daughter”.

The minister has written to a number of internet giants following the death of 14-year-old Molly Russell, telling them they have a duty to act.

Molly Russell (Family handout/PA)

Mr Hancock said he was “horrified” to learn of Molly’s death, and was “desperately concerned to ensure young people are protected”.

In his letter, he said: “I welcome that you have already taken important steps, and developed some capabilities to remove harmful content. But I know you will agree that more action is urgently needed.

“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.

“It is time for internet and social media providers to step up and purge this content once and for all.”

He added that the Government is developing a white paper addressing “online harms”, and said it will look at content on suicide and self-harm.

He said: “I want to work with internet and social media providers to ensure the action is as effective as possible. However, let me be clear that we will introduce new legislation where needed.”

Molly was found dead in her bedroom in November 2017 after showing “no obvious signs” of severe mental health issues. Her family later found she had been viewing material on social media linked to anxiety, depression, self-harm and suicide.

Molly’s father Ian Russell said the algorithms used by Instagram enabled her to view more harmful content, possibly contributing to her death.

Mr Russell, a television director, told the Sunday Times: “It’s such a mystery. She went to bed so happy. What could have caused it? The only thing she had access to were her two electronic devices. What had triggered her?”

An Instagram spokesman said the platform does “not allow content that promotes or glorifies eating disorders, self-harm or suicide” and that it “works hard to remove it”.

He added: “However, for many young people, discussing their mental health journey or connecting with others who have battled similar issues is an important part of their recovery.

“This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most.”

An inquest into Molly’s death is expected later this year.

Chris Price