Search Engine Safety Code and Social Media Ban
Summary
The Commission welcomes the opportunity to contribute to the Senate Inquiry into the Internet Search Engine Services Online Safety Code (SES Code) and the Social Media Ban under the Online Safety Act 2021 (Cth).
The Commission supports efforts to improve online safety for children and young people. However, it holds concerns about the human rights implications of the proposed regulatory framework - particularly in relation to privacy, access to information, equality and accountability.
Age assurance
Current age assurance methods (e.g. government-issued identification, facial recognition and age inference) raise significant privacy and inclusion concerns. These technologies may disproportionately affect marginalised groups, normalise surveillance practices and erode public trust in digital systems.
Access to information
Measures aimed at restricting harmful content must be carefully designed to avoid unintentionally blocking legitimate health and educational resources. This is particularly important for LGBTIQA+ young people and others who rely on technology to access inclusive, evidence-based information that supports their wellbeing and development.
Transparency and oversight
The SES Code does not currently provide strong mechanisms for public reporting and independent review, while the social media ban lacks clear criteria for the exercise of ministerial discretion. In the absence of these safeguards, decisions about age assurance and content access may lack transparency, consistency and accountability.
Global experience
International examples, such as the UK's Online Safety Act, highlight the risks of over-censorship and reduced access to legitimate content. These experiences underscore the importance of clear definitions, transparent enforcement and meaningful consultation with affected communities - particularly children and young people.
Digital duty of care
A legislated Digital Duty of Care would place a positive obligation on service providers to take reasonable steps to prevent foreseeable harms. If designed with robust human rights safeguards, this approach offers a coherent and proportionate framework for addressing systemic online risks and promoting safer digital environments.
Why this matters
Online safety measures should be designed and implemented in ways that respect, protect and promote human rights. The Commission encourages a shift toward proportionate, inclusive and privacy-preserving safety-by-design approaches that enhance, not diminish, children's rights in digital environments.
To learn more, download the submission and read our recommendations for reform.