Ofcom to help protect children as Online Harms Bill introduced

With the introduction of the Online Harms Bill, UK watchdog Ofcom is set to gain the power to block access to online services that fail to protect children and other users.

The regulator will also be able to fine Facebook and other tech giants billions of pounds, and to require them to publish an audit of their efforts to tackle posts that are harmful but not illegal.

Under the proposed legislation, social media firms will face fines of up to 10 per cent of their turnover for breaching Duty of Care laws, as ministers pledged there will be “no safe space online for horrors like child sexual abuse or terrorism”.

Unveiling the new Duty of Care in an article for The Telegraph, Oliver Dowden, the Culture Secretary, said the Government will take powers to shut down firms that fail to remove child abuse, terrorism or suicide content from their sites, by blocking their access to UK users.

But he stopped short of immediately introducing criminal sanctions against named directors whose companies failed to comply with the Duty of Care – as demanded by children’s charities and campaigners including the NSPCC.

Instead, these will be held back as “reserve” powers, to be marshalled against the tech giants if they fail to clean up their act or refuse Ofcom access to the algorithms that have been blamed for promoting harmful content to children.

Mr Dowden told Parliament that the legislation represented “decisive action” to protect both children and adults online.

“A 13-year-old should no longer be able to access pornographic images on Twitter, YouTube will not be allowed to recommend videos promoting terrorist ideologies and anti-Semitic hate crimes will need to be removed without delay.”

The Children’s Commissioner for England, Anne Longfield, said there were signs that new laws would have “teeth”, including strong sanctions for companies found to be in breach of their duties. She welcomed the requirement on messaging apps to use technology to identify child abuse and exploitation material when directed to by the regulator.

Melanie Dawes, Ofcom’s Chief Executive, added:

“We’re really pleased to take on this new role, which will build on our experience as a media regulator. Being online brings huge benefits, but four in five people have concerns about it. That shows the need for sensible, balanced rules that protect users from serious harm, but also recognise the great things about online, including free expression. We’re gearing up for the task by acquiring new technology and data skills, and we’ll work with Parliament as it finalises the plans.”

Plans to introduce the law were spurred on by the death of 14-year-old Molly Russell, who killed herself after viewing online images of self-harm.

In 2019, her father Ian Russell accused Instagram of being partly to blame, leading ministers to demand social media companies take more responsibility for harmful online content.

Chris Price
For the latest tech stories go to TechDigest.tv