One in three children lie about their age to access adult content on social media, according to research commissioned by the regulator, Ofcom.
Yonder Consulting found that the majority of children aged between 8 and 17 (77%) who use social media now have their own profile on at least one of the large platforms. And despite most platforms having a minimum age of 13, the research suggests that 6 in 10 (60%) children aged 8 to 12 who use these platforms are signed up with their own profile.
Among this underage group (8 to 12s), up to half had set up at least one of their profiles themselves, while up to two-thirds had help from a parent or guardian.
When a child self-declares a false age to gain access to social media or online games, their claimed user age advances with them as they grow older. This means they could be placed at greater risk of encountering age-inappropriate or harmful content online. Some platforms, for example, unlock features and functionalities once a user reaches age 16 or 18 that are not available to younger users – such as direct messaging and the ability to see adult content.
Yonder’s study sought to estimate the proportion of children that have social media profiles with ‘user ages’ that make them appear older than they actually are. The findings suggest that almost half (47%) of children aged 8 to 15 with a social media profile have a user age of 16+, while 32% of children aged 8 to 17 have a user age of 18+.
Among the younger, 8 to 12s age group, the study estimated that two in five (39%) have a user age of 16+, while just under a quarter (23%) have a user age of 18+.
Anna-Sophie Harling, from Ofcom, told BBC News the way social media platforms categorised users by age had a “huge impact” on the content they were shown.
She cited the recent Molly Russell inquest: “That was a very specific case of harmful content that had very detrimental impacts and tragic outcomes on a family in the UK.
“When we talk about potentially harmful content to under-18s, it’s content that might have more significant negative consequences for under-18s because they’re still developing.”
Risk factors that can lead children to harm online
In line with its duty to promote and research media literacy, and as set out in its roadmap to online safety regulation, Ofcom is publishing a series of research reports designed to further build its evidence base as it prepares to implement the new online safety laws.
Given that the protection of children sits at the core of the regime, today's wave of research explores children's experiences of harm online, as well as children's and parents' attitudes towards certain online protections.
Commissioned by Ofcom and carried out by Revealing Reality, a second, broader study into the risk factors that may lead children to harm online found that providing a false age was only one of many potential triggers.
The study identified a range of risk factors that can make children more vulnerable to online harm, especially when several of these factors coincide or frequently co-occur with the harm experienced.
The study indicated that the severity of any impact can vary between children, ranging from minimal, transient emotional upset (such as confusion or anger), through temporary behaviour change or deep emotional impact (such as physical aggression or short-term food restriction), to far-reaching, severe psychological and physical harm (such as social withdrawal or acts of self-harm).