Snapchat Removes Few Children From Its Platform Every Month in Britain: Ofcom
Snapchat is kicking dozens of children in Britain off its platform each month compared with tens of thousands blocked by rival TikTok, according to internal data the companies shared with Britain’s media regulator Ofcom and which Reuters has seen.
Social media platforms such as Meta’s Instagram, ByteDance’s TikTok, and Snap’s Snapchat require users to be at least 13 years old. These restrictions are intended to protect the privacy and safety of young children.
Ahead of Britain’s planned Online Safety Bill, aimed at protecting social media users from harmful content such as child pornography, Ofcom asked TikTok and Snapchat how many suspected under-13s they had kicked off their platforms in a year.
According to the data seen by Reuters, TikTok told Ofcom that between April 2021 and April 2022, it had blocked an average of around 180,000 suspected underage accounts in Britain every month, or around 2 million in that 12-month period.
In the same period, Snapchat disclosed that it had removed approximately 60 accounts per month, or just over 700 in total.
A Snap spokesperson told Reuters the figures misrepresented the scale of work the company did to keep the under-13s off its platform. The spokesperson declined to provide additional context or to detail specific blocking measures the company has taken.
“We take these obligations seriously and every month in the UK we block and delete tens of thousands of attempts from underage users to create a Snapchat account,” the Snap spokesperson said.
Recent Ofcom research suggests both apps are similarly popular with underage users. Children are also more likely to set up their own private account on Snapchat, rather than using a parent’s, when compared to TikTok.
“It makes no sense that Snapchat is blocking a fraction of the number of children that TikTok is,” said a source within Snapchat, speaking on condition of anonymity.
Snapchat does block users from signing up with a date of birth that puts them under the age of 13. Reuters could not determine what protocols are in place to remove underage users once they have accessed the platform and the spokesperson did not spell these out.
Ofcom told Reuters that assessing the steps video-sharing platforms were taking to protect children online remained a primary area of focus, and that the regulator, which operates independently of the government, would report its findings later this year.
At present, social media companies are responsible for setting the age limits on their platforms. However, under the long-awaited Online Safety Bill, they will be required by law to uphold these limits, and demonstrate how they are doing it, for example through age-verification technology.
Companies that fail to uphold their terms of service face being fined up to 10 percent of their annual turnover.
In 2022, Ofcom’s research found 60 percent of children aged between eight and 11 had at least one social media account, often created by supplying a false date of birth. The regulator also found Snapchat was the most popular app for underage social media users.
Risks to young children
Social media poses serious risks to young children, child safety advocates say.
According to figures recently published by the NSPCC (National Society for the Prevention of Cruelty to Children), Snapchat accounted for 43 percent of cases in which social media was used to distribute indecent images of children.
Richard Collard, associate head of child safety online at the NSPCC, said it was "incredibly alarming" how few underage users Snapchat appeared to be removing.
Snapchat “must take much stronger action to ensure that young children are not using the platform, and older children are being kept safe from harm,” he said.
Britain, like the European Union and other countries, has been seeking ways to protect social media users, in particular children, from harmful content without damaging free speech.
Enforcing age restrictions is expected to be a key part of its Online Safety Bill, along with ensuring companies remove content that is illegal or prohibited by their terms of service.
A TikTok spokesperson said its figures spoke to the strength of the company’s efforts to remove suspected underage users.
“TikTok is strictly a 13+ platform and we have processes in place to enforce our minimum age requirements, both at the point of sign up and through the continuous proactive removal of suspected underage accounts from our platform,” they said.
© Thomson Reuters 2023