Story so far: Australia-based users under the age of 16 will be banned from Meta’s social media platforms, including Facebook, Instagram, and Threads, from December 10, as a direct result of Australia’s law barring teens under 16 from social media. To comply with the new rules being enforced this month, Meta will start deactivating existing accounts and blocking new sign-ups by users under 16 from December 4. What will implementation of the ban look like on Meta’s side?
Why are social media platforms like Meta complying?
Meta said it has been sending warnings, two weeks in advance, to thousands of Australian teenagers aged between 13 and 15, advising them to download their digital history and delete their accounts. The Facebook parent confirmed that account removals will only be carried out after the sweeping law takes effect on December 10.
However, it is unclear whether the process will be completed on that same day, given that verification is a lengthy, multi-step undertaking. The country’s internet regulator, the Australian Communications and Media Authority, has noted that there are close to 1,50,000 Facebook users aged between 13 and 15, as well as 3,50,000 Instagram users in the same age group. The law does not currently apply to Meta’s Messenger app.
Meta’s prompt actions are in line with the restrictions imposed by Prime Minister Anthony Albanese’s government on multiple social media platforms including TikTok, Snapchat, YouTube, X, Reddit, Twitch and the livestreaming website Kick.
These companies are required to take “reasonable steps” to keep underage users off their platforms, failing which they face fines of up to 49.5 million Australian dollars (around $32 million).
Despite doubts about how enforceable the law will be and whether the regulation will lead to better mental health in children, companies are begrudgingly following instructions.
A Meta spokesperson said that while the company is committed to fulfilling its legal obligations, it has raised concerns about the regulation, saying a “blanket ban” is hardly the solution. The Mark Zuckerberg-led company claimed the move will isolate teenagers from online communities and information while also offering “inconsistent protection.”
Albanese has responded by saying that since this is the first time a law like this is being passed, there will be flaws in implementing it. And even though the system may be imperfect, it will send a strong message to society, according to him.
How is Meta verifying children’s ages?
Meta has advised the affected users to update their contact details so the company can notify them by SMS or email once they turn 16. Once they cross the cut-off age, users can resume their accounts just as they left them and find the same Reels, posts and messages. Users can also choose to delete their accounts completely, if they wish.
However, there is a fair chance that Meta might inaccurately flag a user as being under 16. An age estimation report published by the Australian government found that age verification systems using facial recognition showed false rejection rates higher than “acceptable levels”: 8.5% for 16-year-olds and 2.6% for 17-year-olds.
If an account is incorrectly flagged, Meta has said users can verify their age either with a government ID or with a video selfie processed by the third-party facial age-verification platform Yoti. Meta has assured users that the platform deletes personal data once verification is complete.
Critics have voiced concerns about the surveillance risks of checking children’s ages with age-verification technology.
What are the drawbacks?
Meta vice president and global head of safety Antigone Davis stated that the company would like app stores such as Apple’s App Store and Google Play to collect age-related data when users sign up, and to verify on Meta’s behalf whether they have turned 16. Davis added that this would ensure a standard procedure and also preserve user privacy.
For now, Meta has not disclosed exactly which methods it will use to determine users’ ages, partly so that children under 16 cannot find a loophole through which to evade the ban.
But various options have been discussed, including government IDs, facial or voice recognition, and age inference methods that analyse online user data, such as interactions, to estimate a user’s age. The government has encouraged platforms to look into their own age-verification tools.
Gaming platforms like Roblox and Discord have recently introduced age restrictions for specific features, amid fears that they could be targeted next.
Other platforms on the list are also expected to follow suit and explain how they plan to proceed. While TikTok and Snapchat have agreed to comply with the law, YouTube has objected to the Australian government’s decision to include it in the ban. The video streaming platform has not said whether it will comply with the law, but has hinted at taking legal action against it.
Published – December 01, 2025 08:00 am IST


