Chapter 1 - Introduction

1.1 On 21 November 2024, the Senate referred the provisions of the Online Safety Amendment (Social Media Minimum Age) Bill 2024 (the bill) to the Environment and Communications Legislation Committee (the committee) for inquiry and report by 26 November 2024.[1]

Conduct of the inquiry

1.2 In accordance with its usual practice, the committee advertised the inquiry on its website and invited submissions by 22 November 2024.

1.3 The committee published 107 submissions from organisations and individuals, which are listed in Appendix 1 and available on the committee's website.

1.4 A public hearing was held on 25 November 2024 in Canberra. A list of witnesses who gave evidence at the hearing is available in Appendix 2.

Structure of the report

1.5 Chapter 1 outlines the context for the bill and the conduct of the inquiry.

1.6 Chapter 2 sets out the key provisions of the bill. It summarises the perspectives on the bill and contains the committee’s views and recommendations.

Acknowledgements

1.7 The committee thanks the organisations and individuals who made submissions to this inquiry, particularly given the tight timeframe.

Financial impact statement

1.8 The Explanatory Memorandum (EM) stated that the measures in the bill are expected to have a minor financial impact on Commonwealth expenditure and will primarily require funding for the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) and the Office of the eSafety Commissioner (eSafety Commissioner).[2]

Scrutiny of Bills Committee examination

1.9 At the time of writing, the bill had not been considered by the Senate Standing Committee for the Scrutiny of Bills.

Human rights compatibility

1.10 The EM to the bill stated that the bill is compatible with the human rights and freedoms recognised or declared in the international instruments listed in section 3 of the Human Rights (Parliamentary Scrutiny) Act 2011.[3] The full statement of compatibility with human rights is attached to the EM.

1.11 At the time of writing, the Parliamentary Joint Committee on Human Rights had made no comment on the bill.

Overview of the bill

Purpose of the bill

1.12 The bill would amend the Online Safety Act 2021 (Online Safety Act) with the aim of establishing a minimum age for social media use and placing responsibility on social media platforms for the safety of their users.[4]

1.13 The aim of the bill is to ‘help to mitigate the risks arising from harmful features that are largely associated with user accounts, or the ‘logged-in’ state, such as persistent notifications and alerts which have been found to have a negative impact on sleep, stress levels, and attention’.[5]

1.14 In this regard, the bill would set clear parameters similar to ‘other age-based laws that seek to limit or minimise access to known harms, such as for the sale of alcohol and cigarettes’.[6]

1.15 However, the bill also has mechanisms to ensure ‘that users under the minimum age retain access to platforms that predominately provide beneficial experiences, such as those that are grounded in connection, education, health and support’.[7]

Background to the bill

1.16 Social media is ubiquitous in the modern world. The technology and associated digital spaces have altered, and will continue to rapidly alter, how we communicate, socialise, and gain access to news and information. While the benefits of social media are many and integral to life in today’s world, social media is also home to harmful content and products.[8]

1.17 The EM stated that the use of social media by young people is a complex issue, as:

no two children’s experiences on social media are the same;

social media services vary greatly in their primary purpose and design features, and therefore present a different level of risk to end-users; and

children also vary substantially in how they use social media, including which platforms they access, the content and communities they engage in, and the digital features they are exposed to.[9]

1.18 A recent survey conducted by the eSafety Commissioner found that 95 per cent of caregivers reported that ‘children’s online safety is the hardest parenting challenge they face’.[10]

1.19 And yet, as stated in the EM, the current business model of social media companies creates an incentive for them to maximise user engagement with, and time spent on, their platforms.[11]

1.20 Indeed, the recent final report of the Joint Select Committee on Social Media and Australian Society, tabled on 18 November 2024, found that approximately 81 per cent of Australia’s total population were active users of social media in 2023:

Significant amounts of time each month are spent by users on social media platforms such as Facebook, Instagram, and TikTok, for social connection, communication, entertainment, news, and information.[12]

1.21 Further, that report stated that social media companies do whatever they can to stop users from logging off by using ‘opaque algorithms that keep users scrolling, continuously feeding them what they think they want to see, even if it's harmful’.[13]

1.22 These matters appear to be particularly problematic for users under 15 years of age:

A United Kingdom (UK) study published in 2022, which examined longitudinal data from more than 17,400 participants, found that adolescent social media use is predictive of a subsequent decrease in life satisfaction for certain developmental stages, including girls aged 11 to 13 years and boys aged 14 to 15 years.[14]

Current minimum age of access

1.23 The current minimum age of access under the Terms of Service of all major social media services is 13 years. This stems from the 1998 decision by the United States (US) Congress in the Children’s Online Privacy Protection Act, which prohibits websites from collecting information on children younger than 13 years without consent.[15]

Provisions of the bill

1.24 This section sets out an overview of the key provisions of the bill.

Reasonable steps

1.25 The bill would introduce ‘an obligation on providers of an age-restricted social media platform to take reasonable steps to prevent age-restricted users from having an account with the platform’.[16]

1.26 The bill would put the onus ‘on platforms to introduce systems and processes that can be demonstrated to ensure that people under the minimum age cannot create and hold a social media account’.[17]

1.27 The EM states that age-restricted social media platforms must be able to demonstrate having taken reasonable steps to prevent age-restricted users from ‘having an account’.[18]

1.28 While the bill does not dictate how platforms must comply with the minimum age obligation, the EM does set out some expectations:

…it is expected that at a minimum, the obligation will require platforms to implement some form of age assurance, as a means of identifying whether a prospective or existing account holder is an Australian child under the age of 16 years. Whether an age assurance methodology meets the ‘reasonable steps’ test is to be determined objectively, having regard to the suite of methods available, their relative efficacy, costs associated with their implementation, and data and privacy implications on users, amongst other things.[19]

1.29 The EM also advises that the outcomes of the Australian Government’s age assurance trial ‘are likely to be instructive for regulated entities and will form the basis of regulatory guidance issued by the Commissioner, in the first instance’.[20]

1.30 Further, the EM notes that ‘reasonable steps’ is a standard that has been imposed for the purpose of demonstrating compliance and features in national security legislation, privacy law, and elsewhere in the Online Safety Act.[21]

Age-restricted social media platform

1.31 The bill introduces a new term to the Online Safety Act, ‘age-restricted social media platform’.[22]

1.32 The EM notes that the definition of this term largely draws on the existing meaning of ‘social media service’ in section 13 of the Online Safety Act. However, the bill would expand the ‘sole or primary purpose’ test to a ‘significant purpose’ test when examining whether a service enables online social interactions between 2 or more users.[23]

1.33 The EM also notes that this definition will not apply to other parts of the Online Safety Act, with the existing definition in section 13 remaining in effect.[24]

Regulatory instruments

1.34 The bill provides flexibility to reduce the scope of, or further target, the definition. This flexibility would be exercised through delegated legislation in the form of disallowable legislative rules made by the Minister for Communications.[25]

1.35 In exercising the rule-making power, the Minister will be required to seek and have regard to advice from the eSafety Commissioner and may also seek advice from other relevant Commonwealth agencies.[26]

1.36 Rule-making powers are also available to:

provide additional conditions that must be met in order to fall within the definition of age-restricted social media platform; and

respond to emerging technologies and services that may need to be considered or captured by the definition.[27]

1.37 The EM advises that the Government proposes to make legislative rules to exclude the following services from the definition of age-restricted social media platforms before the minimum age obligation takes effect:

messaging apps;

online gaming services; and

services with the primary purpose of supporting the health and education of end-users.[28]

Regulated activity

1.38 The EM states that regulating the act of having an account would ‘prevent age-restricted users from accessing the content and features that are available to signed-in account holders on social media platforms’.[29]

1.39 The EM states that the obligation would not affect user access to ‘logged-out’ versions of a social media platform, and provides the following examples:

…the obligation would not affect the current practice of users viewing content on YouTube without first signing into an account; and

similarly, Facebook offers users the ability to view some content, such as the landing page of a business or service that uses social media as its business host platform, without logging in.[30]

Penalties

1.40 The bill would create new civil penalty provisions and require age-restricted social media platforms to take reasonable steps to prevent age-restricted users from having an account.[31]

1.41 A provider that fails to do so would be liable for a civil penalty of up to 30,000 civil penalty units (currently equivalent to $9.9 million). This increases to 150,000 penalty units (currently equivalent to $49.5 million) if the provider is a body corporate, due to the application of section 82 of the Regulatory Powers (Standard Provisions) Act 2014 (Regulatory Powers Act).[32]
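The dollar figures above follow directly from the penalty unit arithmetic. The short sketch below is illustrative only: it assumes the Commonwealth penalty unit value of $330 that applied at the time of writing (Crimes Act 1914, section 4AA), and applies the five-fold multiplier for bodies corporate under section 82(5) of the Regulatory Powers Act.

```python
# Illustrative sketch only: reproduces the EM's dollar figures from penalty
# units. Assumes the $330 Commonwealth penalty unit value current at the time
# of writing (Crimes Act 1914, s 4AA).

PENALTY_UNIT_DOLLARS = 330        # assumed current penalty unit value
BODY_CORPORATE_MULTIPLIER = 5     # Regulatory Powers Act, s 82(5)

def max_penalty_dollars(units: int, body_corporate: bool = False) -> int:
    """Maximum civil penalty in dollars for a given number of penalty units."""
    if body_corporate:
        units *= BODY_CORPORATE_MULTIPLIER
    return units * PENALTY_UNIT_DOLLARS

assert max_penalty_dollars(30_000) == 9_900_000                        # $9.9 million
assert max_penalty_dollars(30_000, body_corporate=True) == 49_500_000  # $49.5 million
```

The same arithmetic underlies the increased penalties for industry codes and standards discussed at paragraph 1.44.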

1.42 Penalties do not apply to age-restricted users who may gain access to an age-restricted social media platform, or to their parents, carers or educators.[33]

1.43 The bill would also increase maximum civil penalties in the Online Safety Act for:

non-compliance with a direction to comply with industry codes; and

non-compliance with industry standards.[34]

1.44 Penalties would rise from 500 to 30,000 civil penalty units (currently equivalent to $9.9 million). For bodies corporate, this increases to 150,000 penalty units (currently equivalent to $49.5 million).[35]

1.45 These amounts are consistent with the maximum penalties currently available for contravention of the Australian Consumer Law under the Competition and Consumer Act 2010, and for serious and repeated interferences with privacy under the Privacy Act 1988 (the Privacy Act).[36]

1.46 The EM notes that the penalty amounts are intentionally large to reflect the significance of the harms the bill is intended to safeguard against and to signal the expectation that age-restricted social media platforms treat the minimum age obligation seriously.[37]

1.47 The EM argues that the civil penalties are appropriate and in line with existing Commonwealth guidelines:

The Guide to Framing Commonwealth Offences, Infringement Notices and Enforcement Powers outlines that larger penalties are more appropriate for bigger companies, such as platforms, as they provide an adequate deterrent against non-compliance. The civil penalties are therefore appropriate for regulatory and disciplinary purposes, while ensuring paying a financial penalty does not become a cost of doing business.[38]

1.48 The EM states that the penalty provisions would operate consistently with Part 10 of the Online Safety Act and Part 4 of the Regulatory Powers Act. These Acts provide that, in determining pecuniary penalties, a court must take all relevant matters into account, including the circumstances of the contravention, the nature of the contravening conduct, the size of the organisation involved and whether the entity has previously been found to have engaged in similar conduct. Therefore, a court would have discretion to consider the seriousness of the contravention and impose a penalty that is appropriate in the circumstances.[39]

Privacy protections

1.49 The bill introduces privacy protections, including prohibiting platforms from using information collected for age assurance purposes for any other purpose, unless the individual has provided their consent. This consent ‘must be voluntary, informed, current, specific and unambiguous’. This requirement precludes platforms from seeking consent through preselected settings or opt-outs.[40]

1.50 Once the platform has used the information for age assurance or any other agreed purpose, the information must be destroyed by the platform (or any third party contracted by the platform).[41]

1.51 Serious and repeated breaches of these privacy provisions could result in penalties of up to $50 million under section 13G of the Privacy Act.[42]

Deferred commencement

1.52 The bill provides that the minimum age obligation will come into effect on a day to be specified by the Minister for Communications by way of notifiable instrument, not earlier than 12 months after Royal Assent.[43]

1.53 The deferred commencement is intended to provide industry and the eSafety Commissioner with sufficient time to develop and implement appropriate systems.[44]

Review

1.54 An independent review of the part that the bill would introduce into the Online Safety Act (Part 4A) will be conducted within two years of the minimum age obligation taking effect.[45]

1.55 The review will examine whether the measures are effective and delivering the desired outcomes for Australians, including whether any changes to the scope or the minimum age are required.[46]

Age assurance trial

1.56 The EM explains that the government provided funding in the 2024–25 Budget

to conduct a broad, three-phase trial of age assurance, including an assessment of technologies, to examine options to protect children from harmful online content, including on social media, and age-restricted content such as pornography.[47]

1.57 The objective of the trial is to determine the effectiveness of available age assurance technologies as an option to:

prevent access to online pornography by people under the age of 18; and

age-limit access to social media platforms for an age range between 13 and 16 years.[48]

1.58 The three elements of the trial are:

The technology trial: an independent testing and assessment of currently available age assurance technologies;

Research: including consumer research into Australians’ attitudes towards the use of age assurance technologies for access to the 2 use cases; and

Consultation: targeted stakeholder consultation with young Australians, parent groups, academics, the digital industry (including platforms), community and civil society groups, and First Nations representative groups.[49]

Consultation

1.59 During the development of the bill, the DITRDCA conducted extensive consultation with young people, parents, mental health professionals, legal professionals, community and civil society groups, state and territory first ministers, and industry representatives.[50]

1.60 The next chapter addresses views on the bill and sets out the committee’s view and recommendations.

Footnotes

[1] Journals of the Senate, No. 142, 21 November 2024, p. 4321.

[2] Explanatory Memorandum, p. 7.

[3] Explanatory Memorandum, pp. 9–16.

[4] Explanatory Memorandum, p. 1.

[5] Explanatory Memorandum, p. 4.

[6] Explanatory Memorandum, p. 3.

[7] Explanatory Memorandum, p. 3.

[8] Joint Select Committee on Social Media and Australian Society, Chair’s Foreword, Final report, 18 November 2024, pp. ix–x; Explanatory Memorandum, p. 3.

[9] Explanatory Memorandum, p. 1.

[10] Explanatory Memorandum, p. 2.

[11] Explanatory Memorandum, p. 1.

[12] Joint Select Committee on Social Media and Australian Society, Chair’s Foreword, Final report, 18 November 2024, p. ix.

[13] Joint Select Committee on Social Media and Australian Society, Chair’s Foreword, Final report, 18 November 2024, p. ix.

[14] Explanatory Memorandum, pp. 1–2.

[15] Explanatory Memorandum, p. 1.

[16] Explanatory Memorandum, p. 2.

[17] Explanatory Memorandum, p. 2.

[18] Explanatory Memorandum, p. 4.

[19] Explanatory Memorandum, p. 3.

[20] Explanatory Memorandum, p. 3.

[21] Explanatory Memorandum, p. 4.

[22] Explanatory Memorandum, p. 3.

[23] Explanatory Memorandum, p. 3.

[24] Explanatory Memorandum, p. 3.

[25] Explanatory Memorandum, p. 3.

[26] Explanatory Memorandum, p. 3.

[27] Explanatory Memorandum, p. 3.

[28] Explanatory Memorandum, p. 4.

[29] Explanatory Memorandum, p. 4.

[30] Explanatory Memorandum, p. 4.

[31] Explanatory Memorandum, p. 6.

[32] Explanatory Memorandum, p. 6.

[33] Explanatory Memorandum, p. 6.

[34] Explanatory Memorandum, p. 6.

[35] Explanatory Memorandum, p. 6.

[36] Explanatory Memorandum, p. 6.

[37] Explanatory Memorandum, p. 6.

[38] Explanatory Memorandum, p. 6.

[39] Explanatory Memorandum, p. 6.

[40] Explanatory Memorandum, p. 7.

[41] Explanatory Memorandum, p. 7.

[42] Explanatory Memorandum, p. 7.

[43] Explanatory Memorandum, p. 4.

[44] Explanatory Memorandum, p. 4.

[45] Explanatory Memorandum, p. 7.

[46] Explanatory Memorandum, p. 7.

[47] Explanatory Memorandum, p. 5.

[48] Explanatory Memorandum, p. 5.

[49] Explanatory Memorandum, p. 5.

[50] Explanatory Memorandum, p. 2.