Australia has passed a groundbreaking law banning social media use for children under 16, marking the first such legislation globally. Social media platforms such as TikTok, Facebook, and Instagram face fines of up to AUD 50 million (about US$33 million) for systemic failures to prevent minors from creating accounts.
The legislation, which swiftly moved through Parliament, reflects growing concerns about online harms to children. Prime Minister Anthony Albanese emphasized the law’s intent to prioritize child safety, giving platforms a year to develop compliance measures. However, critics, including Meta Platforms, argue the law was rushed and lacks clarity on practical implementation.
Amendments to the law protect user privacy by prohibiting platforms from demanding government-issued identification or digital verification for age assurance. While the law aims to safeguard children, critics fear it could inadvertently undermine the privacy of all users.
Some child welfare and mental health advocates are concerned the ban may isolate vulnerable children, particularly those in regional areas or from marginalized communities, such as LGBTQ+ youth, who rely on social media for support and connection. Exemptions for educational and health-related services, such as YouTube and Google Classroom, aim to mitigate these effects.
Supporters of the legislation, including families of victims of online exploitation, celebrated the move as a significant step toward protecting children. Advocates like Sonya Ryan and Wayne Holdsworth, each of whom lost a child to online predators or scams, hailed the law as a vital measure to combat the risks posed by social media.
They believe it addresses long-standing issues of platforms prioritizing profit over safety. However, mental health organizations such as Suicide Prevention Australia caution that the law neglects the positive roles social media can play in fostering mental health and connection among young people.
Social media companies expressed concerns about the law’s practicality and potential negative consequences. They criticized the government for rushing the legislation without sufficient consultation or consideration of existing measures to ensure age-appropriate experiences.
Platforms like Snapchat pledged to collaborate with the government to refine implementation during the 12-month rollout period. Still, many argue that the law's hasty passage undermines its effectiveness and raises questions about how it will be enforced.
The law has sparked debate about its motivations and broader implications. Critics suggest the government is leveraging the legislation to appeal to parental concerns ahead of an election, while some experts warn of unintended consequences, such as driving children to less-regulated online spaces.
Opponents argue the ban could erode parental authority, discourage reporting of harm, and reduce platforms’ incentives to improve safety. As the implementation period begins, the law faces scrutiny over whether it can effectively balance safety, privacy, and practical enforcement.