YouTube is exempt from Australia’s social media ban. Why?

4:58 am on 3 May 2025
Photo: RNZ

Explainer: Regardless of the outcome of Saturday's Australian federal election, the country will implement a world-first social media ban in December.

Australians will have to prove they are older than 16 to use most social media sites, though exactly how that will happen remains to be seen.

Communications Minister Michelle Rowland has said YouTube, the country's most popular social media service among children, will be exempt from the ban. Rivals TikTok, Snapchat, and Meta aren't happy.

The exemption has also been criticised in New Zealand, where in 2019 an Australian white supremacist opened fire in two mosques, killing 51 people and injuring dozens of others. A Royal Commission of Inquiry highlighted YouTube's role in radicalising the terrorist.

Meanwhile, experts' opinions on whether other countries should follow Australia's lead are mixed.

The Online Safety Amendment (Social Media Minimum Age) Bill 2024

In November, the bill passed by 34 votes to 19.

It places the onus on the technology companies rather than relying on social media users to disclose their age when creating an account. Companies that fail to introduce adequate safeguards will face fines of up to A$49.5 million.

The minimum age will apply to platforms whose "sole or significant purpose" is "to enable online social interaction between two or more end-users", that "allow end-users to link to, or interact with, some or all of the other end-users", and that allow "end-users to post material on the service".

This includes Snapchat, TikTok, Facebook, Instagram, X, and others, the Albanese Government said in November.

Services and apps "primarily for the purposes of education and health support", such as "Headspace, Kids Helpline, Google Classroom, and YouTube", are excluded, as are messaging and gaming services.

Local and global reaction

Research suggests children are being exposed to the internet at increasingly younger ages. Australia's online safety regulator eSafety found almost two-thirds of 14 to 17-year-olds have viewed "extremely harmful content including drug abuse, suicide or self-harm, as well as violent and gory material".

The ban has attracted global praise and calls in other jurisdictions to follow suit.

But academics and advocacy groups have warned the ban could backfire, driving teenagers to more dangerous places online, or making them feel more isolated.

Associate Professor of Sociology at Monash University Brady Robards told RNZ while he understood the impulse to ban social media, "what's missing from the conversation is how social media can be used productively".

A lot of his previous research on LGBTQ+ people showed positive connections facilitated via social media.

"We risk pushing young people into even less regulated spaces online.

"I also just don't think it'll work. Young people are good at finding ways around these sorts of things."

Although a harder path, his preferred approach "is to work with young people, parents, and platforms to make these places safer".

Netsafe chief executive Brent Carey expressed similar concerns: "Our focus is on creating a safer internet for all New Zealanders based on evidence-backed solutions. We strongly support the need for more in-depth studies focused on the unique experiences and challenges faced by young New Zealanders in the digital environment."

It's unclear exactly how the ban will be implemented, and whether that will involve sharing sensitive data with technology giants.

Australia's eSafety Commissioner Julie Inman Grant has said about 30 different age-verification technologies are being tested in collaboration with various social media platforms.

YouTube's exemption

Around 80 percent of children aged eight to 12 in Australia (1.3 million) used one or more social media services in 2024, according to eSafety. The most popular platforms were YouTube (68 percent of children surveyed), TikTok (31 percent), and Snapchat (19 percent).

Among children aged 13 to 15, YouTube was again the most popular, with 73 percent of the age bracket reporting having used the platform.

Documents published by Australian media outlets reveal strong lobbying from YouTube attempting to avoid the ban.

TikTok argued YouTube has been given an "exclusive carve out", likening it to "banning the sale of soft drinks to minors but exempting Coca-Cola".

Snapchat described a case of "preferential treatment" and Meta said it "makes a mockery of the government's stated intention".

YouTube told RNZ it had been given the exemption, for health and education reasons, from the get-go. In a statement, Minister Rowland's office said the same thing.

Associate Professor Robards and colleagues have pointed out YouTube hosts the same sort of dangerous content as prohibited sites, and its algorithm has been criticised for delivering addictive video content to young people.

"There's this risk that people who don't have high levels of media literacy will be served more and more controversial content. They can also be referred from YouTube to other [more hateful] spaces."

Brainbox Institute director Dr Ellen Strickland told RNZ: "YouTube has some of the same types of content both in subject but also format of delivery as the [banned] platforms."

Even YouTube's Shorts section, similar to TikTok and Instagram's Reels, is exempt.

"From a New Zealand perspective, it's a particularly odd exclusion. Around the time of the [Christchurch] terrorist attack, we heard about the role of YouTube," Strickland said. "And it came up again in the creation of the Christchurch Call [a commitment by governments and technology companies to eliminate violent extremism online]."

Strickland said New Zealand should take the opportunity to learn from what Australia and the United Kingdom do next, but she would be "very concerned" if Aotearoa adopted Australia's approach.

"The focus on platforms [rather than content] creates a lot of complexity and unintended risks."

A YouTube spokesperson told RNZ: "Our policies prohibit content that promotes terrorism, such as content that glorifies terrorist acts or incites violence. Content produced by or in support of terrorist organisations is prohibited on our products that host user-generated content. Terrorist organisations are also prohibited from using these services for any purpose, including recruitment."

The Christchurch connection

On 15 March 2019, a white supremacist opened fire at two Christchurch mosques, killing 51 people and injuring dozens of others. The attack was livestreamed on Facebook.

An Australian citizen, Brenton Tarrant, moved to Dunedin in 2017. He is now serving a life sentence without parole and has been classed as a terrorist entity.

After the attack, a Royal Commission of Inquiry detailed Tarrant's long involvement with the far right and his use of YouTube, in particular.

"The individual claimed that he was not a frequent commenter on extreme right-wing sites and that YouTube was, for him, a far more significant source of information and inspiration," the final report said.

"Although he did frequent extreme right-wing discussion boards [...] the evidence we have seen is indicative of more substantial use of YouTube and is therefore consistent with what he told us."

He also used YouTube tutorials to modify his firearms.

Commissioners Sir William Young and Jacqui Caine declined to comment when contacted by RNZ, with both saying the report speaks for itself.

The Federation of Islamic Associations of New Zealand's Abdur Razzaq, who also took part in the inquiry's Muslim Community Reference Group, said social media was just one source of radicalisation online. Radicalisation, online bullying, and harm also took place on gaming platforms, for example.

"We know the [Christchurch] terrorist started his online radicalisation in 2004, through gaming platforms."

While Australia's new legislation "is an important start and something [New Zealand] should consider", "we need to upskill our young people, in terms of discerning what's fake and harmful online".

Social cohesion and education programmes were a "more sustainable" and evidence-based route than legislation, he said.

He pointed to a report to be published by the Classification Office on Tuesday, highlighting the extreme and at times illegal content young people in Aotearoa are exposed to online. The report, seen by RNZ, has prompted further work on practical resources.

"If we can raise a generation of young people who have the right tools to discern misinformation and harm online, we're giving them lifelong skills," Razzaq said.

Anjum Rahman, from Inclusive Aotearoa Collective Tahono, and former spokesperson for the Islamic Women's Council of New Zealand, said balancing free speech with mitigating harms online was "a juggle", but praised digital safety regulation progress made in Australia, the United Kingdom, and Canada.

"New Zealand has really dropped the ball on regulation. We're just not doing anything at all to protect our young people from big and small platforms. We could be doing things at a systemic level that are likely more effective than an outright [social media] ban."

YouTube has said it's changed its algorithm post-2019, she noted. "But how do we verify that?"

In many cases, Tarrant's included, it's impossible to separate the influence of online and offline harms.

"He had violence against him at home [...], the problems are complex. The people who go down these rabbit holes have a lot of other factors going on that make them more open to absorbing these kinds of [radical] messages.

"How we think about this holistically is critical."

New Zealand's political appetite for a ban

Prime Minister Christopher Luxon has previously said he's open to looking at proposals to introduce a minimum age for children to access social media.

New Zealand's Internal Affairs Minister Brooke van Velden this week told RNZ: "A minimum age for social media is not something the New Zealand Government is considering.

"I note that the minimum age to use social media in Australia will not come into force until later this year, and I am interested to see how the policy will be implemented."

Labour leader Chris Hipkins said he was concerned about the potential harm social media platforms can create "and this is something we are actively looking at".

While he wouldn't comment on specific platforms, he added: "We are following developments there closely as we develop our own policies for the next election and beyond."
