TikTok bans under 16s from private messaging
The registered date of birth of users will determine whether they can use messaging (illustration)
The new rules on the extremely popular TikTok app mean that children under 16 will no longer be able to send or receive direct messages.
It is the first time that a major social media platform has blocked teenagers’ private messages on a global scale.
A poll by the British regulator Ofcom suggested that TikTok was used by 13% of 12 to 15-year-olds last year.
Critics say the new rules won’t stop children from lying about their age online.
Until now, all users have been able to send direct messages to others when both accounts follow each other.
The change means that under-16s will no longer be able to communicate privately on the platform under any circumstances.
They will still be able to post publicly in the comments sections of videos.
TikTok says affected users will soon receive an in-app notification and will lose access to direct messages on 30 April.
The restriction is based on the date of birth entered when the account was created, but no verification is performed, so the system relies on trust.
In 2018, Facebook introduced rules making WhatsApp available only to over-16s across the EU, to comply with the bloc's General Data Protection Regulation.
“The interesting thing here is that the largest group of TikTok users are teenagers,” said social media consultant Matt Navarra.
“This restriction will have an impact on a large part of its user base.
“Also, blocking the use of a basic feature like messaging among its largest subset of users is a bold move.”
But Navarra added: “Depending on how cynical you are, you might see this as TikTok following the same strategy as Facebook and others, whereby they launch new ‘digital wellbeing’ or safety features ahead of any hearings or regulatory investigations.
“It gives these platforms something to fight back with.
“It is possible that TikTok has observed some incidents or activities on the platform and is now trying to anticipate the problem with this new restriction.”
NSPCC online child safety policy officer Andy Burrows said: “This is a bold move by TikTok, as we know groomers use direct messaging to cast the net widely and contact large numbers of children.
“Offenders are leveraging the current climate to target children who spend more time online.
“But this shows that proactive measures can be taken to make sites safer and stop groomers from taking advantage of unsafe design choices.
“It’s time for tech companies to do more to identify which of their users are children and make sure they are given the safest accounts by default.”
John Carr, secretary of the British children’s charities’ coalition on internet safety, said: “It is good that TikTok is showing awareness of these problems, but without a meaningful way of verifying children’s ages it amounts to much less than it seems.”
He said research from the time “when Facebook was the dominant app among children” had suggested that in some countries about 80% of children over the age of eight had a Facebook account, with the proportion at about two-thirds in the United Kingdom.
“Nobody did it specifically for TikTok but all the evidence we have shows that there are a huge number of underage children on the site,” he said.
“We all know why children do it.
“If all the older kids are on, that’s where you want to be.
“It’s potentially dangerous, because parents could allow children onto an app believing that the age limit means something, when it doesn’t, because it is never checked.”