By Byron Kaye
SYDNEY (Reuters) – Australia’s government carved out an exemption for YouTube when it passed laws banning social media access for children under 16, but some mental health and extremism experts say the video-sharing website exposes young users to addictive and harmful content.
Australia will block the video-sharing platforms TikTok and Snapchat, Meta-owned Instagram and Facebook, and Elon Musk’s X for minors by the end of 2025, forcing them to impose strict age restrictions on access or face hefty fines. At the same time, the government will keep Alphabet-owned YouTube open to all ages because it considers the site a valuable educational tool and not “a core social media application”.
The initial ban was meant to include YouTube but after hearing from company executives and children’s content creators who use the site, the government granted an exemption.
“While YouTube undoubtedly functions as a source of entertainment and leisure, it is an important source of education and informational content, relied on by children, parents and carers, and educational institutions,” a spokesperson for Communications Minister Michelle Rowland said, adding that the exemption “matched broad sentiment in the Australian community that YouTube is not a core social media application”.

The landmark legislation, passed in November, sets some of the world’s most stringent social media limits. However, six extremism and mental health researchers interviewed by Reuters say the exemption undermines Australia’s main goal of protecting young users from harmful content.
Surveys show YouTube is the country’s most popular social media website among teenagers, used by 9 in 10 Australians aged 12-17.
FAR-RIGHT MATERIAL
The academics interviewed by Reuters said YouTube hosts the same sort of dangerous content as the prohibited sites.
“YouTube is deeply problematic, not just because of its role in terms of extremism and the spreading of extremist content and violent content and pornographic content, but also because it delivers highly addictive video content to young people,” said Lise Waldek, a lecturer at Macquarie University’s Department of Security Studies and Criminology who has run two government-commissioned studies on extremist content on YouTube.
Helen Young, a member of the Addressing Violent Extremism and Radicalisation to Terrorism Network, echoed those concerns, saying YouTube’s “algorithm feeds really far-right material, whether it’s primarily racist or primarily sort of misogynist, anti-feminist stuff, to users that it identifies as young men and boys.”
The academics interviewed by Reuters acknowledged that all social media platforms struggle to control the flow of harmful content, but questioned why the country’s most popular site was granted an exemption.
When asked about these criticisms, a YouTube spokesperson said the platform promoted content that met quality principles such as encouraging respect while limiting “repeated recommendations of content that, while innocuous in a single view, can be potentially problematic if viewed in repetition for some young viewers”.
In addition, YouTube has said in public online statements that its moderation is becoming more aggressive and that it has broadened its definition of the harmful content its automated detection system will pick up.

TESTING YOUTUBE’S ALGORITHM
To test what content YouTube’s algorithm would deliver to minors, Reuters set up three accounts using fictitious names of children under 16 years of age. Two searches, one on sex and the other on COVID-19, led to links promoting misogyny and extreme conspiracy theories within 20 clicks. A third search on “European History” led to racist content after 12 hours of intermittent scrolling on the platform.
Searches deliberately seeking out misogynist and racist commentators all landed on harmful content in less than 20 clicks. Reuters shared its methodology and results with YouTube, which said it would review the material.
Reuters also flagged to YouTube six videos that came up during the experiment. YouTube has since taken one down – an interview with an Australian neo-Nazi leader – for violating the site’s hate speech rules. It also removed an account promoting misogynistic content. Four of the videos remain online.
YouTube said it has “strict policies prohibiting hate speech, harassment, and violent or graphic content” and that after a review of the flagged videos, it found two of them violated these policies. It did not comment on the ones left online.
(Reporting by Byron Kaye; Editing by Saad Sayeed)