Technology

Five social media giants face A$49.5M fines over teen ban failures

Facebook, Instagram, TikTok, Snapchat and YouTube under investigation as up to 70% of under-16s with pre-ban accounts retain access

Five platforms face fines of up to A$49.5 million per breach over failures to enforce Australia's teen social media ban. Illustration: Bushletter
By Caleb Reed · April 2, 2026 · 5 min read

Five social media platforms are under investigation for failing to enforce Australia's under-16 age ban.

KEY TAKEAWAYS

01 Five platforms under investigation: Facebook, Instagram, TikTok, Snapchat, YouTube
02 60-70% of teens with pre-ban accounts retained access to Facebook, Instagram, Snapchat and TikTok
03 Maximum penalty: A$49.5 million ($34M USD) per breach per platform
04 Decision on enforcement action expected mid-2026
05 eSafety alleges platforms allow repeated age verification attempts until users pass

Facebook, Instagram, Snapchat, TikTok and YouTube are under investigation for "potential noncompliance" with legislation that took effect December 10. Communications Minister Anika Wells announced the investigations March 31.

"Big tech is taking the piss, to be honest," Wells told reporters in Canberra. "If these companies want to do business in Australia, they must obey Australian laws."

Two-thirds of teens kept accounts

A survey of 898 parents in late January found around one-third of children still held social media accounts post-ban, down from 50% before the law took effect.

Of under-16s who had accounts before December 10, between 60% and 70% maintained access to Facebook, Instagram, Snapchat and TikTok. Just under 50% still had YouTube accounts.

"That isn't the law failing, that isn't Australian parents or Australian kids not complying, that is big tech taking the piss," Wells said.

Age verification 'poor practices'

eSafety's first compliance report since the ban identified what the regulator called "poor practices" by platforms.

The report alleges some platforms allow underage users to repeatedly attempt age verification methods until they receive a passing result. It also claims companies fail to prevent teenagers from opening new accounts after being removed.

eSafety Commissioner Julie Inman Grant said facial age estimation technology has higher error rates for users close to the 16-year cut-off. She said platforms would have known some 14 and 15-year-olds would receive false results showing them as over 16.

"We are moving into an enforcement stance," Inman Grant said. "These platforms can comply today, and we certainly expect companies operating in Australia to comply with our safety laws."

A$49.5 million penalty per breach

Under the Online Safety Amendment (Social Media Minimum Age) Act 2024, platforms must show they are taking reasonable steps to prevent underage users or face fines up to 150,000 penalty units per breach.

At the current penalty unit value of A$330, that equals A$49.5 million ($34 million USD).
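The statutory ceiling follows directly from the penalty-unit formula; a quick check of the arithmetic (the A$330 unit value is the figure cited above):

```python
# Statutory maximum under the Online Safety Amendment
# (Social Media Minimum Age) Act 2024: 150,000 penalty units per breach.
PENALTY_UNITS = 150_000
UNIT_VALUE_AUD = 330  # value of one Commonwealth penalty unit, per the report

max_fine_aud = PENALTY_UNITS * UNIT_VALUE_AUD
print(f"A${max_fine_aud:,}")  # A$49,500,000
```

Because the cap applies per breach and per platform, total exposure across the five platforms under investigation could be a multiple of this figure.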

eSafety must pursue penalties through Federal Court civil action. The regulator said it needs to build evidence that companies failed to introduce adequate systems and processes, not simply demonstrate that some children still have accounts.

A decision on whether to pursue enforcement action is not expected until mid-2026.

"This isn't a police officer issuing a speeding fine on the spot," Wells said. "This is world-leading law that requires the eSafety Commissioner to go to the Federal Court of Australia and to do that we need to build the evidence base."

Meta pushes for app store controls

Meta said accurately determining age "is a challenge for the whole industry."

"The most effective, privacy protective and consistent approach is to require age verification and parental approval at the app store and operating system level before a teen can download an app or create an account," a Meta spokesperson said.

"In the meantime, we'll keep investing in enforcement to detect and remove under-16 accounts and support parents, while advocating for a system that's workable in practice and delivers better safety outcomes for young people."

TikTok and Google did not respond to requests for comment by publication time.

4.7 million accounts deactivated

The government announced in January that 4.7 million accounts were deactivated, removed or restricted in the first two days after the ban.

A further 310,000 accounts had been blocked by early March.

Ten platforms fall under the ban: Facebook, Instagram, Snapchat, Threads, TikTok, X, Reddit, YouTube, Kick and Twitch.

Discord, Google Classroom, WhatsApp and Roblox are excluded.

Last week, Wells announced the definition of platforms covered by the ban would be updated to include those with infinite scroll, "feedback features" such as likes or upvotes, and time-limited elements like disappearing stories.

TLDR

Australia's online safety regulator is investigating Facebook, Instagram, TikTok, Snapchat and YouTube for potential breaches of the world's strictest social media age ban. A survey of nearly 900 parents found 60-70% of under-16s who had accounts before the December ban still have access to the major platforms. Each platform faces fines of up to A$49.5 million per breach if found non-compliant.

FREQUENTLY ASKED QUESTIONS

When did Australia's social media age ban take effect?
The ban came into effect on December 10, 2025, following passage of the Online Safety Amendment (Social Media Minimum Age) Act 2024.
Which platforms are covered by the ban?
The ban applies to Facebook, Instagram, Snapchat, Threads, TikTok, X (formerly Twitter), Reddit, YouTube, Kick and Twitch. Discord, Google Classroom, WhatsApp and Roblox are excluded.
What are the penalties for non-compliance?
Platforms face fines up to 150,000 penalty units (A$49.5 million or $34M USD) per breach if they fail to take reasonable steps to prevent under-16s from holding accounts.
Can parents or children be fined for using these platforms?
No. The law only penalises platforms, not parents or children who circumvent the ban.
How many children still have accounts despite the ban?
An eSafety survey found 60-70% of under-16s who had accounts before December 10 still have access to Facebook, Instagram, Snapchat and TikTok. Just under 50% retained YouTube access.
Editor

The Bushletter editorial team. Independent business journalism covering markets, technology, policy, and culture.
