Australia is about to introduce a historic digital safety reform: a nationwide ban on social media accounts for children under 16. Beginning on December 10, TikTok, Snapchat, Facebook, Instagram, and Threads will start notifying young users to download their data before their accounts are frozen; otherwise, the accounts will be deleted.
The legislation arrives as governments around the world grapple with how to protect minors from the negative effects of social media. Watching Australia's experiment, many policymakers believe it could set the standard for child online safety worldwide.
Compulsory Age Checks Drive Industry Change
Approximately 20 million Australians, about 80% of the population, will continue using the platforms as normal. For users under 16, however, the companies must close existing accounts and block new registrations by underage users.
For years, social media companies claimed that mandatory age checks would infringe on privacy, prove inaccurate, and be easy to evade. Yet the answer was already in their hands: the same algorithmic tools they use for targeted advertising, which infer a user's age from behavioural signals such as likes, follows, and engagement patterns.
Users who believe they have been misclassified will be redirected to third-party age verification apps, which use selfies and, where applicable, official documents to confirm age. But the system is not flawless: tests have revealed alarming error rates. Some users aged 16 and 17 have been blocked, while some 15-year-olds have been wrongly approved.
For the companies, the cost of error is high: a fine of AUD 32 million (EUR 17.9 million) for giving minors unauthorized access.
Yoti, a major age verification provider working with Meta and TikTok, says disruptions will be temporary. Users will adjust within two or three weeks, according to policy director Julie Dawson.
Meanwhile, TikTok announced it is developing a report button allowing users to flag suspected underage accounts. In recent parliamentary hearings, every company except Google committed to complying and said it would begin notifying young users.
A Response to Growing Concern over Adolescent Mental Health
The legislation is the strongest political response yet to mounting evidence of social media's adverse effects on adolescents. The leak of internal Meta documents in 2021 sparked global outrage, and public pressure intensified with the 2024 bestseller The Anxious Generation and a large-scale advocacy campaign by News Corp.
Supporters argue the law is necessary to reduce screen time and exposure to harmful content, safeguarding young people's mental and physical health. Critics, including children's rights groups, free-speech advocates, and content creators, warn of overreach and censorship.
For 16- and 17-year-olds, the age group for which estimation systems are least accurate, blocked access could persist for days or even weeks until verification is resolved.
The process is further complicated by the fact that many in this group lack government-issued ID, such as a driver's licence.
Government statistics show that approximately 600,000 Australians are in this age bracket.
Global Implications: Will Other Countries Follow Australia's Example?
Analysts say this landmark law will shape the global debate on digital safety. Other nations are already taking steps of their own:
- UK/France: Have adopted strict age verification for pornographic sites.
- Denmark: Has announced plans to ban social media for under-15s.
- EU: Is building a digital adulthood framework under which minors must obtain parental permission to use platforms such as TikTok and Instagram.
Nevertheless, legal hurdles remain. Some bans elsewhere have been blocked over privacy, feasibility, and freedom-of-expression concerns. The Australian law requires platforms to take reasonable steps to keep out underage users, including detecting attempts to circumvent restrictions via VPNs. Researchers caution, however, that adolescents may simply migrate to newer platforms that have not yet been regulated.
Dr. Hassan Asghar of Macquarie University warned that banned platforms could easily be replaced by others.
Even so, Australia's approach stands as the most detailed and enforceable attempt yet to protect children in the digital age, a case study the rest of the world will be watching closely.