Cop this: Australia’s government has just handed Mark Zuckerberg a sheriff’s badge and asked him to keep teens off his own casino floor. That’s not policy; that’s a Monty Python sketch with cabinet approval.
Of course it’s a win-win. In today’s world, Zuck and his fellow Tech Bros, Musk and MrBeast among them, own all the moral high ground.
Zuckerberg built his career on Facemash, a college site that invited Zuck and his fellow studs to rate young women on their hotness. Lucky his rich Mum and Dad chipped in. It all nearly went tits-up when the Winklevoss twins sued him, alleging he stole their ConnectU code and ideas to build Facebook, but the case was settled in 2008 for a reported $65 million.
Today Zuck’s lucked in; locked and loaded. Cock of the walk. What could possibly go wrong?
Everything. Australia’s social media ban for under-16s takes effect on December 10, and it’s shaping up as the most spectacular piece of performative theatre masquerading as law since someone decided poker machines were “harmless entertainment.”
Mounting research from mental health experts and digital scholars says the ban is buggered: studies show outright bans isolate young people, jigger digital literacy, and drive them toward unregulated corners of the internet, dark web included. By deputising social media corporations as age-verification bouncers, moreover, our lawmakers manufacture a false sense of security among adults. This reinforces the arc of modern parenthood: leaving teens to their own devices.
Go play in peak traffic. Juggle chainsaws. Sniff paint. This isn’t child protection. It’s political theatre with toxic risk.
Act One: The Performance Begins
Officials promise to shield children from the digital boogeyman lurking behind every TikTok clip. Cue News Corp’s “Let Them Be Kids” campaign: more smoke than a climate-fuelled megafire and twice the moral panic, driven not by concern for teens but by the fact that social media giants sell advertising far better than dying corporate media ever will. So News Corp, Canberra’s apex predator, pushes the government ever harder for regulatory favours to hobble its competition. This ban is less about kids and more about legacy media’s Danse Macabre: a structural death rattle dressed up as moral crusade.
Act Two: The Fox Guards the Henhouse
Comedy now becomes farce: enforcement is handed to platforms that profit wildly from keeping teens glued to screens, and armed with crapware at that. Age-verification technology is poor: out by as much as 18 months in 85 per cent of cases, misreading 15-year-olds as adults in their 20s and 30s. That’s not security, that’s fortune-cookie forecasting with million-dollar fines attached. Meanwhile, teens Houdini past these restrictions with ease; digital natives mocking their elders’ technological innocence.
Asking Meta to police teen access? You may as well ask Barnaby Joyce to deliver a keynote on water rights integrity, party loyalty, or the proper pastoral care of staffers at bush-doof ute musters. Theoretically possible. Practically ludicrous. A cockamamie, hare-brained charade dressed up as digital virtue, just long enough to dodge regulation and sell more ads.
The Bureaucratic Placebo
Psychological research confirms that banning does not mitigate harm; it isolates youth and impairs the digital literacy and emotional resilience crucial to navigating online spaces. Prohibition doesn’t build resilience, it builds black markets. (Who’d have thunk?) Banning doesn’t teach critical thinking; it outsources parenting to VPN providers and drives kids toward unregulated platforms that make TikTok look like Play School.
A survey of 1,054 high school adolescents found one-third report feeling less alone because of social media, and 72% report it has either no impact or a positive impact on their mental health. For LGBTQ+ youth in regional areas, for kids managing chronic illness, for teens building communities around shared interests, social media provides connection that saves lives. This ban severs those lifelines with all the nuance of a blunt, badly-tuned chainsaw.
The Grand Finale: False Security Theatre
Behind this shit-show lies something grimmer. Parents settle into their seats, convinced something’s been done. They’ve bought tickets to “Child Safety: The Musical” and leave feeling productive. Virtuous. Responsible. Back home, their kids are three clicks deep into spaces that make Instagram look like a kindergarten sandpit, navigating entirely unsupervised. The government’s handed parents a security blanket stitched from confirmation bias, self-interest and parliamentary press releases. It’ll keep exactly nobody warm.
Meanwhile, trauma from graphic violence and toxic content festers unabated, often encountered unwillingly across the entire internet, not just on social media. Trauma isn’t legislated away by age gates. It demands education, platform accountability, and mental health support. This ban provides none of these.
This policy is Rube Goldberg meets Heath Robinson: an elaborate, spectacular contraption designed by people who’ve forgotten how contraptions work, guaranteed to achieve the opposite of its stated purpose. Instead of protecting children, it delivers their first masterclass in creative non-compliance while platform algorithms keep printing money from adult users who think the problem’s solved.
What Policy Theatre Obscures
Prominent researchers show that age estimation from facial scans cannot achieve acceptable accuracy. In 2023 the eSafety Commissioner advised that the age-assurance market was immature, with significant gaps, and recommended media literacy and education instead. The government ignored that advice, preferring symbolic action to effective policy.
As policy scholars warn, most studies linking social media use to mental health problems show weak and inconsistent associations, with researchers unable to establish whether social media causes poor mental health or whether the relationship is bidirectional or influenced by other factors. The evidence base is contested, not conclusive, but contested science makes for terrible political optics. Easier to ban first, ask questions during the inevitable review.
Oscar Wilde saw it coming: “Democracy means simply the bludgeoning of the people by the people for the people.” The tragedy is we’re treating complex problems with blunt instruments while pretending to sophistication. Next it will be anti-tank missiles on Venezuelan fishermen.
The Alternative Nobody’s Funding
What if we built safe, age-appropriate digital spaces instead? Advanced critical thinking curricula? Empowered parents and educators with actual tools rather than false confidence? Imposed genuine duty-of-care obligations on platforms; not age gates they’ll circumvent, but real accountability for algorithmic harm?
Leading researchers advocate for digital literacy, emotion regulation strategies, and fostering skills that enable adolescents to navigate online spaces; recognizing both risks and the neutral and positive experiences young people have online. But that’s complex, expensive, and doesn’t generate press releases. Much easier to ban something, declare victory, and move on before anyone checks whether it worked.
The Verdict
This dud symbolises a government eager to appear decisive but deaf to expert advice. It lazily outsources responsibility to profit-addicted corporations, mistaking symbolic action for meaningful change. Nine platforms now face the impossible task of verifying age without reliable technology, government ID requirements, or clear enforcement guidelines; they are set to fail spectacularly while politicians claim clean hands.
So let’s stop applauding the emperor’s new clothes while teenagers nick the wardrobe. Protecting young people demands courage, complexity, and a bloody-minded refusal to mistake performance for policy. Our kids don’t need politicians to look busy; they need adults willing to do the hard work of building safe digital spaces, teaching critical thinking, and holding platforms accountable for actual harm. Not deputising them as bouncers at their own profit centres.
Anything less isn’t child protection. It’s child abandonment with better PR; a screen door on a submarine, solemnly installed by people congratulating themselves on keeping the water out.
CODA: The Interview
A tribute to the past grand masters, John Clarke and Bryan Dawe, who took the mickey, the ham and the sham, the pretension and the piffle out of our political bigwigs. They deserve far more than a pallid imitation, the sincerest form of flattery, but it’s all that’s on offer.
CLAWE: Thanks for your time.
DARKE: Pleasure.
CLAWE: You’re the Minister for Keeping Children Off The Internet.
DARKE: Digital Child Safety.
CLAWE: And you’ve banned children under 16 from social media.
DARKE: We have. December 10th. Historic day for Australian families.
CLAWE: How does it work?
DARKE: Simple. Social media companies verify that users are over 16.
CLAWE: And how do they do that?
DARKE: They use age verification technology.
CLAWE: Does it work?
DARKE: Absolutely.
CLAWE: The eSafety Commissioner says it can’t tell a 15-year-old from a 25-year-old.
DARKE: That’s why we’re asking the platforms to take reasonable steps.
CLAWE: What are reasonable steps?
DARKE: Whatever the platforms decide is reasonable.
CLAWE: So the platforms decide what’s reasonable?
DARKE: Correct.
CLAWE: The same platforms that make billions keeping teenagers on their sites?
DARKE: Yes, but now they’re responsible for keeping them off.
CLAWE: Won’t they just pretend to try?
DARKE: They could face fines.
CLAWE: How much?
DARKE: Up to $49 million.
CLAWE: And how much does Meta make per day?
DARKE: About $100 million.
CLAWE: So that’s half a day’s revenue.
DARKE: Bryan, we’re sending a very strong message.
CLAWE: What about VPNs?
DARKE: What about them?
CLAWE: Won’t teenagers just use VPNs?
DARKE: That would be illegal.
CLAWE: So would being on social media under 16.
DARKE: Exactly.
CLAWE: But there’s no penalty for teenagers.
DARKE: No, we’re not punishing children.
CLAWE: Or parents.
DARKE: Or parents. We’re punishing the platforms.
CLAWE: Who can’t tell how old anyone is.
DARKE: They’ll work it out.
CLAWE: What about the queer kid in a country town who uses social media for support?
DARKE: They can wait until they’re 16.
CLAWE: What if they can’t wait?
DARKE: Bryan, we can’t let perfect be the enemy of good.
CLAWE: Is this good?
DARKE: It’s action.
CLAWE: The experts say it won’t work.
DARKE: Which experts?
CLAWE: 140 academics, the eSafety Commissioner two years ago, the Human Rights Commission, digital researchers—
DARKE: Bryan, you can always find someone to criticize.
CLAWE: They’re specialists.
DARKE: So are we. We specialize in listening to parents.
CLAWE: What about listening to teenagers?
DARKE: They don’t vote.
CLAWE: What happens when the kids work around it in 24 hours?
DARKE: We’ll review the policy.
CLAWE: When?
DARKE: Two years.
CLAWE: So for two years, parents will think their kids are safe—
DARKE: Their kids will be safe.
CLAWE: —while their kids are on VPNs accessing completely unregulated platforms?
DARKE: That’s why we’ve made it illegal.
CLAWE: But there’s no enforcement.
DARKE: The platforms—
CLAWE: Can’t verify age.
DARKE: They’ll take reasonable steps.
CLAWE: Who decides what’s reasonable?
DARKE: They do.
CLAWE: The platforms.
DARKE: Correct.
CLAWE: Who profit from teenagers being online.
DARKE: But not these teenagers.
CLAWE: These specific teenagers.
DARKE: Under 16.
CLAWE: Who’ll be 16 eventually.
DARKE: Precisely. We’re building future customers.
CLAWE: So you’re helping the platforms?
DARKE: We’re protecting children, Bryan.
CLAWE: From platforms you’ve asked to protect them.
DARKE: Now you’re getting it.
CLAWE: What about digital literacy?
DARKE: What about it?
CLAWE: Shouldn’t we teach children how to use social media safely?
DARKE: That’s the parents’ job.
CLAWE: How can parents teach children about platforms the children can’t access?
DARKE: They’ll learn when they’re 16.
CLAWE: Without any preparation.
DARKE: Exactly. Baptism of fire. Character building.
CLAWE: So we’re banning them for two years then throwing them in unprepared?
DARKE: That’s democracy, Bryan.
CLAWE: One last question.
DARKE: Fire away.
CLAWE: Does this actually protect children?
DARKE: It protects the appearance of protecting children, which in politics is the same thing.
CLAWE: Is it?
DARKE: It is until the next election.
CLAWE: Thanks for your time.
DARKE: Pleasure, Bryan.
[Darke stands, adjusts his jacket, and walks off purposefully toward a door marked “Exit Strategy.”]