Ofcom issues new guidance to protect users from harmful video content

People who use online video-sharing sites and apps should be better protected from harmful content, as Ofcom issues new guidance for tech companies today.

Video sharing platforms (VSPs) are a type of online video service where users can upload and share videos. VSPs established in the UK – such as TikTok, Snapchat, Vimeo and Twitch – are required by law to take measures to protect under-18s from potentially harmful video content, and to protect all users from videos likely to incite violence or hatred, as well as from certain types of criminal content.

The Internet Watch Foundation reported a 77% increase in the amount of “self-generated” abuse content in 2020. Adult VSPs carry a heightened risk of child sexual abuse material and the rise in direct-to-fans subscription sites specialising in user-generated adult content has potentially made this risk more pronounced.

Given this heightened risk, Ofcom expects that VSPs’ creator registration processes and subsequent checks should be strong enough to significantly reduce the risk of child sexual abuse material being uploaded and shared on their platforms.

Ofcom research (PDF, 4.6 MB) shows that 70% of users say they have been exposed to some form of potential online harm: 32% to hateful content, 26% to bullying, abusive behaviour and threats, 26% to violent or disturbing content, and 21% to videos or content that encouraged racism.

If Ofcom finds a VSP provider has breached its obligations to take appropriate measures to protect users, it has the power to investigate and take action against a platform. This could include fines, requiring the provider to take specific action, or – in the most serious cases – suspending or restricting the service.

Dame Melanie Dawes, Ofcom Chief Executive, said:

“Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them.

“The platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”


Chris Price
For latest tech stories go to TechDigest.tv