Chapter 1 - Introduction

1.1 From 10 December 2025, certain social media platforms will be required to take reasonable steps to prevent Australians under 16 from creating or holding an account. These obligations are part of a range of regulations being introduced to protect children and young people online from risks they may be exposed to through social media accounts. Additionally, the co-regulated Internet Search Engine Online Safety Code comes into effect on 27 December 2025, including requirements to ensure the highest safety settings are applied for logged-in Australian children. Amongst other things, both these measures require the relevant platforms to implement a mechanism for age assurance in order to determine a user’s age.

1.2 This report considers many of the important issues raised by inquiry participants about the implementation of these online safety regulations.

1.3 This chapter first provides a brief outline of the Online Safety Act 2021 (the Online Safety Act) and the role of the eSafety Commissioner. The chapter then explores the two significant online safety measures central to this inquiry: the Internet Search Engine Online Safety Code and the under 16 social media ban, also known as the Social Media Minimum Age (SMMA) obligation. These are two distinct regulatory measures being pursued by the government, although they deal with similar topics and often involve similar stakeholders. The meaning of ‘age assurance’, as it may apply to the implementation of these regulatory measures, is then explained. Finally, this chapter briefly outlines the conduct of the inquiry.

1.4 Chapter 2 explores the views and concerns of inquiry participants in relation to the use of age assurance measures, including privacy and data implications, efficacy and technical limitations, and oversight mechanisms.

1.5 Chapter 3 considers complementary and alternative approaches to online safety for children, as raised by some inquiry participants advocating for systemic change.

The Online Safety Act

1.6 Online safety in Australia is primarily regulated under the Online Safety Act. Broadly, the Online Safety Act can be described as codifying Australia’s approach to the protection of individuals online in two main ways. Firstly, it establishes an independent eSafety Commissioner with powers to respond to specific online harms.[1] This is complemented by a systems-based approach which aims to prevent harm by placing direct expectations on industry.[2]

1.7 Some key features of the Online Safety Act include a set of basic online safety expectations, various complaints and objections systems, and an online content scheme. The online content scheme regulates illegal and restricted online content, with the Online Safety Act giving the eSafety Commissioner the power to ‘direct an online service or platform to remove illegal content or ensure that restricted content can only be accessed by people who are 18 or older’.[3]

1.8 Illegal and restricted online content is classified as either Class 1 or Class 2. In general:

Class 1 material—is ‘material that is or would likely be refused classification under the National Classification Scheme’; and

Class 2 material—is material that is, or would likely be, classified as either X18+ or R18+.[4]

Role of the eSafety Commissioner

1.9 The eSafety Commissioner is the primary body responsible for ensuring the safety of Australians online. It describes itself as ‘Australia’s independent regulator, educator and coordinator for online safety’.[5] It is an independent statutory office, and the Commissioner is appointed by the Minister for Communications.[6]

1.10 The Online Safety Act explicitly defines and explains the functions of the eSafety Commissioner. These functions include:

promoting online safety for Australians;

administering a complaints system for cyber-bullying and cyber-abuse material;

administering the online content scheme;

coordinating activities of Commonwealth Departments, authorities and agencies relating to online safety for Australians; and

performing various functions related to the social media minimum age provisions.[7]

1.11 The eSafety Commissioner has several compliance and enforcement mechanisms at its disposal, though these mechanisms differ according to the specific function it is performing or scheme it is overseeing.[8] The eSafety Commissioner can issue online service providers and end-users with removal notices, end-user notices, remedial directions, link deletion notices, app removal notices and directions to comply with an industry code, among other notifications.[9] The eSafety Commissioner can also take stronger enforcement action where a civil penalty provision has been contravened, including giving a formal warning, giving an infringement notice, seeking a court-ordered injunction and seeking a court-ordered penalty.[10] Non-compliance with elements of the Online Safety Act can lead to a maximum penalty of $49.5 million.[11]

Internet search engine services online safety code

1.12 The online content scheme contained within Part 9 of the Online Safety Act regulates illegal and restricted content online. Among other things, Part 9 of the Online Safety Act:

defines the type of material that is considered illegal or restricted (class 1 and class 2 material);

establishes a framework for the eSafety Commissioner to give removal notices to online service providers relating to class 1 and 2 material; and

provides a framework for the development of industry codes and standards for online service providers.[12]

1.13 The eSafety Commissioner explained that the codes are primarily intended to address issues of ‘access, exposure and distribution’ of class 1 and class 2 material online.[13] The codes are also intended to ‘standardise and uplift industry’s safety practices’.[14] Development of the industry codes is the responsibility of relevant industry bodies. As described by the eSafety Commissioner, the codes are ‘co-regulatory’ in that they are ‘drafted by industry for industry’, but with registration of the codes and compliance with the codes overseen by the eSafety Commissioner.[15]

1.14 The Online Safety Act specifies that there are eight distinct sections of the online industry, with each expected to develop its own industry code.[16] Providers of internet search engine services are one of the eight identified industry sections.[17]

1.15 The eSafety Commissioner’s submission indicated that the process for adopting industry codes was split into two phases following discussion with industry.[18] The Phase 1 industry codes deal with Class 1A and Class 1B material (child sexual exploitation material, pro-terror content, extreme crime and violence, and drug-related content).[19] Phase 2 industry codes deal with Class 1C, 2A and 2B material (online pornography, other high-impact material, and simulated gambling).[20]

1.16 As described in the Department of Infrastructure, Transport, Regional Development, Communications, Sport and the Arts’ (DITRDCSA) submission, the Phase 1 codes and standards ‘require industry to detect, remove and combat the generation of Class 1 material’.[21] The codes are accompanied by common ‘Head Terms’, which provide a general principles-based framework designed to apply to all online service providers.[22]

1.17 Following the development and introduction of the Phase 1 codes, the Phase 2 codes were developed to deal with material that is ‘legally age restricted and designated as harmful for children by the Australian Government under the National Classification Scheme’.[23]

1.18 The Internet Search Engine Services Online Safety Code (Class 1C and Class 2 Material), representing the Phase 2 code for search engines, is set to come into effect on 27 December 2025.[24] Another set of Head Terms also accompanies the Phase 2 codes, enshrining ‘principles that will sit alongside the safety measures for every layer of the technology stack’.[25]

1.19 The eSafety Commissioner noted that the Phase 2 codes adopt ‘some key good practice measures’ that it claimed are already being implemented by major platforms.[26] The submission also noted that the codes implement best practice approaches from comparable jurisdictions, ensuring ‘greater regulatory parity that will enable stronger compliance by industry’.[27]

1.20 There are 25 compliance measures detailed in the code, ranging from requirements to adopt specific types of technology, to instructions on how to engage with the eSafety Commissioner and its policies, to requirements to improve existing technology and maintain dedicated trust and safety teams.[28]

1.21 The eSafety Commissioner’s submission drew attention to the compliance measure requiring that, by 27 June 2026, search engine services must ‘implement appropriate age assurance mechanisms for logged-in account holders to ensure that the highest safety settings are applied when a service’s systems detect that an account holder is likely to be an Australian child’.[29]

1.22 Additionally, the eSafety Commissioner highlighted the requirement that ‘advertising for online pornography, high-impact violence material and self-harm material is not served to children’.[30] Further, it noted that the code ‘provides enhanced protections for users who are not logged in’, including default blurring of certain material to reduce the risk of accidental exposure and downranking of harmful content in search results.[31] The Commissioner’s submission also stated that many of the age assurance requirements contained in the code ‘expand on existing practices already routinely applied’.[32] DITRDCSA’s submission echoed many of the eSafety Commissioner’s main points, acknowledging the significance of similar key requirements.[33]

Social media minimum age rules

1.23 The Online Safety Amendment (Social Media Minimum Age) Bill 2024 received royal assent on 10 December 2024.[34] The bill introduced an additional Part to the Online Safety Act which creates ‘an obligation for age-restricted social media platforms to take reasonable steps to prevent Australian children under 16 from having accounts on their platforms’.[35] This is referred to as the Social Media Minimum Age (SMMA) obligation.

1.24 According to the eSafety Commissioner, the SMMA obligation requires providers of age-restricted social media platforms to ‘take reasonable steps to prevent Australian children under 16 from having accounts on their platforms’.[36] In its regulatory guidance on the SMMA obligation, the eSafety Commissioner defined reasonable steps as consisting of ‘systems, technologies, people, processes, policies and communications that support compliance with the SMMA obligation’.[37]

1.25 Its regulatory guidance also indicated that reasonable steps should ultimately serve several purposes, including: determining which accounts are held by age-restricted users and deactivating or removing those accounts, preventing age-restricted users from creating new accounts, and mitigating circumvention of measures employed by platforms.[38] Additionally, the regulatory guidance pointed to a series of guiding principles that ‘should inform providers’ reasonable steps to comply’ with the SMMA obligation.[39]

1.26 The eSafety Commissioner has published a view that, as of 5 November 2025, Facebook, Instagram, Snapchat, Threads, TikTok, X, YouTube, Kick and Reddit are age-restricted platforms, noting that this list continues to be updated prior to the SMMA obligation coming into effect.[40]

What is an age-restricted social media platform?

1.27 The Minister for Communications has four major responsibilities in the implementation of the SMMA obligation.[41] Firstly, the Minister for Communications can specify services that are not age-restricted social media platforms.[42] Section 63C of the Online Safety Act defines the term ‘age-restricted social media platform’ as an electronic service whose sole purpose, or a significant purpose, is to enable online social interaction between two or more users, where the service allows users to interact with some or all other users, and where the service allows users to post material on the service.[43]

1.28 Using the power to specify services that do not constitute age-restricted social media platforms, the Minister for Communications subsequently limited the scope of this broad definition. On 29 July 2025, the Minister made the Online Safety (Age-Restricted Social Media Platforms) Rules 2025, which clarified that several services that would otherwise meet the definition of an ‘age-restricted social media platform’ are not to be considered age-restricted social media platforms.[44] These services included online gaming and standalone messaging apps.[45]

1.29 The Minister for Communications can also make legislative rules that specify the ‘kinds of information that providers of age-restricted social media platforms must not collect for purposes of complying with the SMMA obligation’.[46] The Minister is yet to make any such rules, nor have any rules been proposed.[47]

1.30 Additionally, the Minister for Communications is responsible for specifying when the SMMA obligation will take effect.[48] The Minister has specified that the SMMA obligation will take effect on 10 December 2025.[49]

1.31 Finally, the Minister for Communications is responsible for initiating an independent review of the SMMA obligation.[50] The committee notes that this independent review must be initiated within the two years following the 10 December 2025 compliance start date.[51]

1.32 The eSafety Commissioner’s submission noted that the Office of the Australian Information Commissioner (OAIC) also has some responsibilities in the implementation of the SMMA obligation. Specifically, the OAIC is responsible for enforcing the Privacy Act 1988 in circumstances where a provider uses or discloses information about an individual for purposes other than determining whether the individual is an age-restricted user, or where it does not destroy material collected for age assurance purposes.[52]

1.33 Finally, the eSafety Commissioner has several responsibilities in the implementation of the SMMA obligation. It is responsible for developing regulatory guidance for the SMMA obligation, which was published in September 2025.[53] The eSafety Commissioner is also responsible for monitoring and enforcing compliance with the SMMA obligation, which requires platforms to take reasonable steps to prevent age-restricted users from having accounts with age-restricted social media platforms.[54] Further, it is responsible for enforcing compliance with the requirement that entities not collect government-issued identification material.[55]

Age assurance

1.34 In its submission to the inquiry, the eSafety Commissioner noted that while the terms of reference for this inquiry refer to ‘age verification’, the broader and more commonly used term among regulators is ‘age assurance’.[56] The eSafety Commissioner also stated that the Phase 2 industry codes and the SMMA obligation incorporate age assurance to protect children from online harms.[57]

1.35 Age assurance refers to a variety of processes and methods used to determine a person’s age or age range.[58] This can include age verification, age estimation and age inference.

Age verification refers to the process of identifying a person’s age by finding, locating or sourcing their date of birth from a reliable document or source, ensuring that the source genuinely refers to the person in question, and communicating that finding to a relying party.[59]

Age estimation is a method of determining a person’s likely age or age-range by ‘analysing physical or behavioural characteristics using artificial intelligence or machine learning models’.[60] Age estimation methods include facial analysis, voice modelling, and motion pattern recognition.[61]

Age inference refers to the method of determining a person’s likely age using verifiable contextual, behavioural, transactional or environmental signals.[62] This can include verifiable life-stage indicators such as electoral enrolment, school year, transaction history, email data and device usage patterns.[63]

1.36 In November 2024, DITRDCSA announced that the Age Check Certification Scheme would conduct an Age Assurance Technology Trial.[64] This trial would ‘undertake a point-in-time evaluation of market maturity – gathering evidence on the technical feasibility of existing age assurance technologies, having regard to a range of criteria including accuracy, privacy, security and accessibility’.[65]

1.37 The headline findings from the trial included that:

age assurance can be done in Australia;

there are no substantial technological limitations preventing age assurance from being deployed;

there was a robust understanding of secure data handling practices;

tested age assurance systems performed broadly consistently across demographic groups; and

tested age assurance systems were generally secure.[66]

1.38 The following chapter will explore age assurance concerns raised by inquiry participants.

Conduct of the inquiry

Inquiry referral

1.39 On 27 August 2025, the Senate referred an inquiry into the implementation of regulations aimed at protecting children and young people online, with particular reference to the Internet Search Engine Online Safety Code and the under 16 social media ban, to the Senate Environment and Communications References Committee (the committee) for inquiry and report.[67]

1.40 The committee was initially scheduled to report by 31 October 2025. The Senate subsequently granted the committee an extension of the reporting date to 26 November 2025.[68]

1.41 The committee advertised the inquiry on its website and called for written submissions by 22 September 2025. The committee also wrote to various stakeholders to invite them to make a submission.

1.42 The committee received 101 submissions, as listed at Appendix 1.

1.43 The committee also received approximately 3400 items of correspondence, which focused on very similar themes and appeared to have been prepared in response to a campaign strategy. The committee agreed to publish a representative sample of these documents as de-identified correspondence under ‘additional information’ to the inquiry.

Public hearings

1.44 The committee held three public hearings for the inquiry, as follows:

24 September 2025—Parliament House, Canberra

13 October 2025—Parliament House, Canberra

28 October 2025—Parliament House, Canberra

1.45 The details of witnesses who appeared at the hearings are listed at Appendix 2.

Acknowledgements

1.46 The committee thanks the participants in the inquiry who provided substantial evidence on the implementation of regulations aimed at protecting children and young people online. The committee has carefully considered inquiry participants’ evidence and has drawn on that evidence to prepare this report.

Footnotes

[1] Department of Infrastructure, Transport, Regional Development, Communications, Sport and the Arts (DITRDCSA), Submission 27, September 2025, p. 4.

[2] DITRDCSA, Submission 27, p. 4.

[3] eSafety Commissioner, Illegal and restricted online content, 2024 (accessed 27 October 2025).

[4] eSafety Commissioner, Illegal and restricted online content, 2024 (accessed 27 October 2025).

[5] eSafety Commissioner, Submission 8, p. 2.

[6] eSafety Commissioner, Submission 8, p. 2.

[7] Online Safety Act 2021, s. 25, p. 27.

[8] eSafety Commissioner, Compliance and Enforcement Policy, October 2024, p. 4.

[9] eSafety Commissioner, Compliance and Enforcement Policy, October 2024, p. 4.

[10] eSafety Commissioner, Compliance and Enforcement Policy, October 2024, p. 4.

[11] Online Safety Amendment (Social Media Minimum Age) Bill 2024, Explanatory Memorandum, p. 6.

[12] Online Safety Act 2021, s. 105, p. 104.

[13] eSafety Commissioner, Submission 8, p. 18.

[14] eSafety Commissioner, Submission 8, p. 19.

[15] Ms Julie Inman Grant, eSafety Commissioner, Proof Committee Hansard, 13 October 2025, p. 70.

[16] eSafety Commissioner, Submission 8, pp. 19–21.

[17] Online Safety Act 2021, s. 135, pp. 127–128.

[18] eSafety Commissioner, Submission 8, p. 18.

[19] eSafety Commissioner, Submission 8, pp. 35–36.

[20] eSafety Commissioner, Submission 8, pp. 35–36.

[21] DITRDCSA, Submission 27, p. 8.

[22] eSafety Commissioner, Submission 8, p. 22.

[23] eSafety Commissioner, Submission 8, p. 20.

[24] eSafety Commissioner, Register of industry codes and industry standards for online safety, 2025 (accessed 27 October 2025).

[25] eSafety Commissioner, Submission 8, p. 22.

[26] eSafety Commissioner, Submission 8, p. 21.

[27] eSafety Commissioner, Submission 8, p. 21.

[28] eSafety Commissioner, Schedule 3 – Internet Search Engine Services Online Safety Code (Class 1C and Class 2 Material), 27 June 2025, pp. 5–14.

[29] eSafety Commissioner, Submission 8, p. 23.

[30] eSafety Commissioner, Submission 8, p. 24.

[31] eSafety Commissioner, Submission 8, p. 24.

[32] eSafety Commissioner, Submission 8, p. 24.

[33] DITRDCSA, Submission 27, p. 10.

[34] Parliament of Australia, Online Safety Amendment (Social Media Minimum Age) Bill 2024, 2025 (accessed 27 October 2025).

[35] eSafety Commissioner, Submission 8, p. 24.

[36] eSafety Commissioner, Submission 8, p. 25.

[37] eSafety Commissioner, Social Media Minimum Age Regulatory Guidance, September 2025, p. 19.

[38] eSafety Commissioner, Social Media Minimum Age Regulatory Guidance, September 2025, p. 19.

[39] eSafety Commissioner, Social Media Minimum Age Regulatory Guidance, September 2025, p. 21.

[40] eSafety Commissioner, Social media age restrictions, https://www.esafety.gov.au/about-us/industry-regulation/social-media-age-restrictions (accessed 13 November 2025).

[41] eSafety Commissioner, Submission 8, p. 25.

[42] eSafety Commissioner, Submission 8, p. 25.

[43] Online Safety Act 2021, s. 63C, p. 63.

[44] eSafety Commissioner, Submission 8, p. 25.

[45] eSafety Commissioner, Social media age restrictions, 2025 (accessed 27 October 2025).

[46] eSafety Commissioner, Submission 8, p. 25.

[47] eSafety Commissioner, Submission 8, p. 25.

[48] eSafety Commissioner, Submission 8, p. 25.

[49] eSafety Commissioner, Submission 8, p. 25.

[50] eSafety Commissioner, Submission 8, p. 26.

[51] eSafety Commissioner, Social Media Minimum Age Regulatory Guidance, September 2025, pp. 6–7.

[52] Online Safety Act 2021, s. 63F, pp. 69–70.

[53] eSafety Commissioner, Submission 8, p. 26.

[54] eSafety Commissioner, Submission 8, p. 26.

[55] eSafety Commissioner, Submission 8, p. 26.

[56] eSafety Commissioner, Submission 8, p. 6.

[57] eSafety Commissioner, Submission 8, p. 7.

[58] eSafety Commissioner, Submission 8, p. 6.

[59] DITRDCSA, Age Assurance Technology Trial Main Report, August 2025, p. 56.

[60] DITRDCSA, Age Assurance Technology Trial Main Report, August 2025, p. 73.

[61] DITRDCSA, Age Assurance Technology Trial Main Report, August 2025, p. 73.

[62] DITRDCSA, Age Assurance Technology Trial Main Report, August 2025, p. 89.

[63] DITRDCSA, Age Assurance Technology Trial Main Report, August 2025, p. 91.

[64] eSafety Commissioner, Submission 8, p. 10.

[65] DITRDCSA, Submission 27, p. 5.

[66] DITRDCSA, Age Assurance Technology Trial Main Report, August 2025, pp. 14–19.

[67] Journals of the Senate, No. 10, 27 August 2025, pp. 324–325. For the full Terms of Reference, see page vii of this report.

[68] See Senate Environment and Communications References Committee, Progress report: Internet Search Engine Services Online Safety Code, September 2025, p. 1.