Labor Senators' dissenting report
1.1 On 27 August 2025, the 48th Australian Parliament referred the implementation of regulations aimed at protecting children and young people online, with particular reference to the Internet Search Engine Services Online Safety Code (the Online Safety Code) and the Social Media Minimum Age (SMMA) obligation, to the Environment and Communications References Committee for inquiry and report. The appointment and conduct of the inquiry are set out in Chapter 1 of the committee's report.
1.2 Labor Senators would like to thank all submitters and witnesses who provided evidence during the inquiry. While the goal of improving the safety of children and young people online and minimising their exposure to harm was widely supported, we acknowledge the range of views expressed about how to achieve this in practice, including views on the Online Safety Code and the SMMA obligation.
1.3 As noted in Chapter 2 of the committee’s report, the inquiry received evidence on a variety of subjects, including: (a) the extensive use of technology by children and young people; (b) the harms associated with age-inappropriate material; (c) the importance of online access for wellbeing and development; (d) the privacy and data implications of age assurance measures; (e) the efficacy of age assurance measures and technical limitations; and (f) the adequacy of oversight mechanisms for age assurance measures.
1.4 Balancing the objectives, risks and concerns that emerged from the evidence to this inquiry will require holistic, coherent, practical, and evidence-based policies.
1.5 Labor Senators appreciate the evidence and analysis set out in the committee’s report and support, in whole or in part, several of its recommendations. However, we have reservations about some of the conclusions drawn from this evidence and about some of the recommendations made in the committee report. Labor Senators' views on aspects of the evidence and the committee report recommendations are outlined below.
1.6 Evidence received by the inquiry (and outlined in the committee’s report) emphasised that young people use technology extensively, often have access to electronic devices from a young age, and use search engines and social media frequently.
1.7 The Alannah and Madeline Foundation has found that ‘use of digital technologies is almost ubiquitous among Australian children and typically starts at an age when children are far too young to fully understand or manage the risks.’
1.8 Based on surveys, it is estimated that four in five Australian children aged eight to twelve used at least one social media service in 2024 and around 36% of children aged eight to twelve who used social media had their own account.[1]
1.9 There was evidence that children and young people are more susceptible to harms associated with social media usage than older users,[2] and that social media use was linked to a decline in the mental wellbeing of young people.[3]
1.10 Some of the specific concerns raised included:
(a) the availability of dangerous self-harm material online and its potential to cause serious harm, particularly to vulnerable groups;[4]
(b) the damaging effects of exposure to pornography and sexual material at a young age either online or through social media, its impact on attitudes to women, relationships and sex, and its influence in shaping inappropriate and unhealthy sexual behaviours and norms;[5]
(c) the tendency of algorithms to promote extreme, polarising, sensationalist, and discriminatory content to young people;[6] and
(d) the link between social media use and increases in behavioural issues in classrooms[7] and eating disorders.[8]
1.11 The eSafety Commissioner has indicated that almost two-thirds of 14–17-year-olds are exposed to extreme or harmful content.
1.12 Evidence also suggested that unsafe social media content disproportionately impacts vulnerable groups. Surveys of children aged 10 to 17 showed that exposure to content that suggests self-harm or suicide was higher among trans and gender-diverse children (46%), sexually diverse teens (43%), First Nations children (31%) and children with disability (27%).[9]
1.13 It is clear from the evidence received by the inquiry that earlier safeguards and regulations were inadequate to keep young people safe online.
1.14 There was evidence that many parents felt insufficiently equipped to monitor the online activities of their children. The Alannah and Madeline Foundation found ‘only 43% of parents used controls or other means of blocking or filtering websites, with usage dropping once children reached their teens. Another survey found only half of Australian parents believed they could apply controls or change filter preferences without help’.[10]
1.15 In addition, many Australian parents felt social media was not suitable for children, and significant proportions of Australian parents believed children were not safe on social media platforms.[11]
1.16 Evidence indicates that the incentive structures of many online platforms prioritise engagement over safety and that protections have not kept pace with emerging risks. Some of those risks are set out above and in the committee report.
1.17 Research undertaken by the eSafety Commissioner indicates that 95% of caregivers say online safety is one of their most difficult parenting challenges.
1.18 The Online Safety Code and the SMMA obligation represent important measures to try to improve the safety of children and young Australians in the online world. By creating obligations on platforms and companies, the measures take steps to strengthen platform accountability without penalising Australian children or parents.
1.19 The SMMA obligation sets a nationally consistent minimum age of 16, which reflects research showing heightened vulnerability in early adolescence and a need for additional safeguards during this developmental stage.
1.20 The goal is to protect, not isolate, children and young people by trying to exclude harmful online content and behaviour while maintaining access to connection, support, and learning online.
1.21 The SMMA obligation creates a norm that will assist in reducing online harm to children and young people and supports Australian parents.
1.22 The Online Safety Code endeavours to remove harmful and age-inappropriate content in an online environment with constantly evolving technology.
1.23 It was developed by industry with consultation from eSafety and complements the SMMA obligation by introducing additional safeguards.
1.24 Labor Senators acknowledge the range of views expressed to the inquiry about the methods of age assurance and the best ways to manage the safety of people online while continuing to protect individual privacy.
1.25 Labor Senators acknowledge the community expectation that personal information be collected in a way that upholds privacy.
1.26 The Alannah and Madeline Foundation found that ‘56% of Australian parents agree 'It is not clear to me how I can protect my child / children's personal information while using a service'; 60% of parents say they often have no choice but to sign their child up to a particular service; and 55% of teens agree 'It's important to me that my personal information is kept private, but it's confusing and I don't really understand it’.[12]
1.27 The Age Assurance Technology Trial (the Trial) illustrates that age assurance is both technically feasible and is already being used in Australia and internationally.
1.28 The Trial found a wide range of age assurance approaches existed, including official identification checks, AI-based age estimation, and inference from user behaviour. The Trial also concluded that there is no one-size-fits-all solution to age assurance, and that multiple effective technologies exist that can be matched to the context and risk profile of the service.[13]
1.29 Overall, a staged or ‘waterfall’ approach to age assurance was recommended by a number of inquiry participants.[14]
1.30 Importantly, no Australian will be forced to use government identification for age assurance under the SMMA obligation as the legislation requires that ‘platforms must not collect government-issued identification or require the use of Digital ID (provided by an accredited service, within the meaning of the Digital ID Act 2024), unless a reasonable alternative is also offered’.[15]
1.31 Labor Senators also note the evidence received about other measures that may help to minimise privacy risks associated with data collection by corporations for age assurance purposes, such as third-party age verification providers.
1.32 The legislation contains stringent data protection requirements. As Collective Shout noted in its submission, the legislation will require companies to ‘ring-fence to destroy data collected for age assurance once the age check is complete’.[16]
1.33 Mr Iain Corby from the Age Verification Providers Association (UK) also noted ‘[t]he OAIC have provided very detailed guidance on what their expectations are for privacy in this area’.[17]
1.34 Labor Senators also recognise the Online Safety Code and methods of age assurance need to be continually reviewed and updated such that providers will improve their approaches to age assurance over time, including ‘where new approaches are more effective, privacy-preserving or decrease the burden on users’.[18]
1.35 Labor Senators recognise the concern of stakeholders including sex educators and health service providers that there exists a risk that genuine health information and sex education material may be misclassified as Class 2 material, which includes online pornography. Labor Senators also note that gender diverse and sexually diverse young people may ‘face barriers to comprehensive sex education’, and as a result may seek online spaces in search of understanding, connection and information.
1.36 However, genuine health information and sex education material does not constitute Class 2 material within the definition of the Online Safety Act, and ongoing review by the eSafety Commissioner, as provided for under the Act, will permit adjustment of how material is classified in practice.
1.37 Labor Senators do not agree with committee report recommendation 1. Labor Senators do not believe it is suitable to further delay the implementation of the SMMA obligation. Social media platforms have been consulted extensively and have had more than 12 months to prepare for the SMMA obligation. Further, the Trial showed that while age assurance may not be perfect and may require review and adjustment, it is an important tool in efforts to minimise harms caused by exposure to inappropriate content online.
1.38 Labor Senators agree with committee report recommendation 2 that the Australian Government should legislate a digital duty of care to make online platforms safer for all users, and we note that the government’s consultation process on a digital duty of care has already commenced.[19]
1.39 Labor Senators note committee report recommendation 3 and observe that the SMMA obligation will significantly reduce the ability of social media companies to harvest and exploit the data of minors and young people.
1.40 Labor Senators agree with the intent of committee report recommendation 4 and note that the Australian Government has significantly increased funding for eSafety in Australia, including by funding eSafety awareness campaigns, supporting the eSafety Champions Network in schools, as well as providing toolkits and resources on the eSafety website. Additionally, the Albanese Government has invested $6 million in the Alannah and Madeline Foundation to improve online safety in schools.
Senator Varun Ghosh
Deputy Chair
Labor Senator for Western Australia

Senator Charlotte Walker
Member
Labor Senator for South Australia
Footnotes
[1] eSafety Commissioner, Submission 8, p. 10.
[2] eSafety Commissioner, Submission 8, p. 13.
[3] See, Collective Shout, Submission 33, p. 3; Alannah and Madeline Foundation, Submission 3, p. 5.
[4] Alannah and Madeline Foundation, Submission 3, p. 5.
[5] See, Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 13 October 2025, p. 71; NSW Advocate for Children and Young People, Submission 1, p. 2; eSafety Commissioner, Submission 8, pp. 14–15; Collective Shout, Submission 33, p. 6.
[6] See, Ms Harini Kasthuriararchchi, Policy Officer, Centre for Multicultural Youth, Committee Hansard, 13 October 2025, p. 36; ANU Law Reform and Social Justice Research Hub, Submission 28, p. 3.
[7] Collective Shout, Submission 33, p. 6.
[8] Collective Shout, Submission 33, p. 3.
[9] eSafety Commissioner, Submission 8, p. 15.
[10] Alannah and Madeline Foundation, Submission 3, p. 6.
[11] Collective Shout, Submission 33, p. 5.
[12] Alannah and Madeline Foundation, Submission 3, p. 8.
[13] AIIA, Submission 11, p. 2.
[14] Mr Iain Corby, Executive Director, Age Verification Providers Association, Committee Hansard, 13 October 2025, p. 57.
[15] DITRDCA, Submission 27, p. 12.
[16] Collective Shout, Submission 33, p. 8.
[17] Mr Iain Corby, Executive Director, Age Verification Providers Association, Committee Hansard, 13 October 2025, p. 59.
[18] eSafety Commissioner, Submission 8, p. 12.
[19] Dr Jennifer Duxbury, Director, Policy, Regulatory Affairs and Research, Digital Industry Group Inc, Committee Hansard, 24 September 2025, p. 2. See also Australian Human Rights Commission, Submission 53, p. 8.