Meta Platforms Inc. said it has removed almost 550,000 accounts in Australia to comply with the country’s new under-16 social media rules. The action covers multiple Meta-owned platforms and was disclosed in a company blog post outlining its initial response to the legislation.
Australia’s decision to bar children from mainstream social media is moving from legislation to large-scale enforcement.
Less than a month after the law came into force, the impact is already visible across major platforms operating in the country.
Meta said it removed about 330,000 under-16 users from Instagram, 173,000 from Facebook, and 39,000 from Threads between December 4 and December 11.
Australia’s minimum age ban formally took effect on December 10, though Meta began enforcement a week earlier.
The Albanese government is expected to publish data this week showing how many young Australians were removed across all affected platforms.
Meta questions effectiveness of the ban
In an update released overnight, Meta argued that the ban was not achieving the government’s stated aim of improving youth safety and wellbeing.
The US-based company said the policy risks isolating vulnerable teenagers and pushing them toward less regulated online spaces.
Meta also criticised the age-verification approaches being used to enforce the law, describing them as inconsistent.
The company questioned the underlying rationale of the legislation.
“The premise of the law, which prevents under-16-year-olds from holding a social media account so they aren’t exposed to an ‘algorithmic experience,’ is false,” Meta said in a blog post.
“Platforms that allow teens to still use them in a logged-out state still use algorithms to determine content the user may be interested in – albeit in a less personalised way that can be appropriately tailored to a person’s age.”
Despite its objections, Meta said it would continue to comply with Australian law.
Government aims and regulatory penalties
Australia passed its social media minimum age laws in 2024, positioning itself as one of the world’s most aggressive regulators of youth access to social platforms.
The legislation is designed to shield children from targeted algorithms and harmful content.
Under the law, companies face fines of up to A$49.5 million if they fail to take “reasonable steps” to prevent under-16-year-olds from holding accounts.
The rules apply to a broad list of platforms, including Facebook, Instagram, Snapchat, TikTok, X, YouTube, Reddit, Twitch, Threads, and Kick.
The eSafety Commissioner, the regulator overseeing enforcement, has warned that additional platforms could be added if they meet the criteria for inclusion.
However, the framework includes exemptions for services where gaming, health, or education is the predominant use.
Platforms can adopt a range of methods to verify users’ ages, including government-issued identification, facial age estimation, and age inference technologies.
Meta urged the Australian government to work more closely with technology companies on alternative solutions rather than relying on blanket bans.
“We call on the Australian government to engage with industry constructively to find a better way forward, such as incentivising all of industry to raise the standard in providing safe, privacy-preserving, age-appropriate experiences online, instead of blanket bans.”