Meta begins booting Australian children from Instagram and Facebook ahead of new social‑media age law


On 4 December 2025, Meta — owner of Instagram, Facebook and Threads — started deactivating accounts belonging to Australian users believed to be under 16.

The shutdown comes a week before a sweeping new law — the Online Safety Amendment (Social Media Minimum Age) Act 2024 — becomes enforceable on 10 December 2025.

Meta sent out notifications via in‑app messages, email, and SMS to users aged 13–15, giving them a short window to back up or delete their data before access is blocked.

New account sign‑ups by under‑16s are also blocked effective immediately in Australia.

Law and background: Australia’s world‑first social‑media age ban

The law, passed in Parliament last year, requires major social‑media platforms to prevent children under 16 from holding accounts.

If companies fail to take “reasonable steps” to block under‑16s, they face fines of up to A$49.5 million.

Affected platforms include not only Facebook, Instagram and Threads, but also other major services such as TikTok, Snapchat, X (formerly Twitter), YouTube, Reddit, Twitch and others.

The move has been framed by supporters as a necessary step to protect children from online harms — including mental‑health risks, exposure to harmful content, cyberbullying and addiction.


Meta’s compliance — what the company says

Meta said its priority is compliance: the company will use a mix of existing account data and third‑party age‑verification tools (such as ID checks or “video selfies” via verification services) to validate users’ ages.

Users who are over 16 but mistakenly flagged can appeal through those verification options.

The company has warned some inaccuracies may occur, acknowledging limitations in age‑estimation technologies and urging parents to help ensure correct birthdates.


Officials and regulators have welcomed Meta’s early move as a sign of compliance. The eSafety Commissioner, the independent regulator enforcing the new law, framed the ban as a significant precedent — one that could ripple globally.

But critics warn the ban is a blunt instrument. Some argue that rather than automatically locking out teens, platforms should have focused on improving safety features and moderation.

There are also reports of wrongly flagged accounts: dozens of Australians have already filed complaints about being locked out erroneously. Avenues for dispute resolution also remain unclear — as noted by the Telecommunications Industry Ombudsman, which currently lacks the mandate to handle digital‑platform complaints.

Additionally, some under‑16s and advocacy groups are pushing back. As one 15‑year‑old told the press, forcing teens offline may “drive them underground,” intensifying isolation rather than keeping them safe.


What this means for teens, parents and platforms — immediate and long‑term

  • Many adolescents will lose access to social media platforms overnight. For some, this may mean losing connections, creative outlets, and digital memory banks (photos, messages, posts).
  • Platforms will rely heavily on age‑verification technologies, which remain imperfect, so errors and wrongful removals are likely — perhaps disproportionately affecting users without documentation or those with complex identity backgrounds.
  • Parents, educators and policymakers will need to reconsider how teens access online spaces: the ban does not prevent them from browsing sites while logged out, using VPNs, or fleeing to platforms not yet covered, potentially exposing them to more dangerous corners of the internet.
  • For Meta and other big‑tech companies, it’s the beginning of a new regulatory reality — one where governments expect platforms to proactively police age and content, rather than relying on voluntary safety tools.

Broader implications — a potential global precedent

Australia’s enforcement of the social‑media ban marks the world’s first comprehensive, legally mandated age restriction on teen social media use.

Regulators abroad — particularly in Europe and North America — are already watching closely. If Australia’s approach proves feasible or effective, it may encourage similar regulations elsewhere. Indeed, even within Australia, there is speculation the list of restricted platforms could expand if teens migrate to alternatives.

But whether the policy actually delivers on its promise — of making social media safer for youth — remains to be seen. Early reports of wrongful exclusions, legal challenges and concerns about digital exclusion suggest a rocky transition.


Final thoughts: safety, rights and digital childhood

The purge of under‑16 accounts by Meta in Australia signals a radical shift in how societies regulate digital life for young people. In a single stroke, tens of thousands of teenage accounts have been disconnected — reshaping experiences of friendship, identity, creativity and community online.

For many parents and regulators, this was a long‑awaited move to protect children from harmful content. For teens, however, it may feel like being cut off mid‑stream — no longer part of the online social fabric they had built.

By Admin