Bills Digest No. 39, 2024-25

Online Safety Amendment (Social Media Minimum Age) Bill 2024

Infrastructure, Transport, Regional Development, Communications and the Arts

Authors

Nell Fraser and Owen Griffiths


Key points

  • The Online Safety Amendment (Social Media Minimum Age) Bill 2024 (the Bill) introduces an obligation on certain social media platforms to take reasonable steps to prevent children under 16 years of age from having an account.
  • The eSafety Commissioner will be responsible for formulating guidelines on the ‘reasonable steps’ to be taken by age-restricted social media platforms.
  • Details of what may be included in these guidelines, such as which age estimation or age verification technologies may be used, are not set out in the Bill.
  • The obligation for certain social media platforms to restrict under-age account holders will not commence for at least 12 months, with the date to be set by the Minister.
  • Research, including by the eSafety Commissioner, suggests that there are both benefits and risks to social media use by children, and that these benefits and risks are individualised.
  • The Bill has bipartisan support. However, some experts and researchers in relevant fields have encouraged the government to pursue alternative action.
  • The Bill has been referred to the Senate Environment and Communications Legislation Committee for inquiry and report by 26 November 2024.
  • At the time of writing, the Bill has not been considered by any parliamentary scrutiny committees.

Date of introduction: 21 November 2024

House introduced in: House of Representatives

Portfolio: Communications

Commencement: The day after Royal Assent

 

Purpose of the Bill

The purpose of the Online Safety Amendment (Social Media Minimum Age) Bill 2024 (the Bill) is to amend the Online Safety Act 2021 to introduce an obligation on certain social media platforms to take reasonable steps to prevent children under 16 years of age from having an account. The Bill will also make an amendment to the Age Discrimination Act 2004 to facilitate the reform.

 

Background

The regulation of online safety in Australia

The safety of Australians online is primarily supported by the Online Safety Act 2021 (the Act) and the associated regulator, the eSafety Commissioner. Children’s online safety is a key focus of eSafety’s work, and of the Act – as indicated by the office’s original title, the ‘Children’s eSafety Commissioner’.

Australia was considered a world leader in online safety when eSafety was established in 2015. Since then, numerous international jurisdictions have legislated their own approaches to safeguarding people online – notably the United Kingdom’s Online Safety Act (which received Royal Assent in 2023) and the EU’s Digital Services Act (enacted in 2022).[1]

The roles and capabilities of digital platforms are quickly evolving, and the potential harms of social media have been a key recent focus both locally and internationally. Discussion in this area has been stoked by the release of books such as Jonathan Haidt’s The Anxious Generation, and by the prominent role of social media in crisis events such as the Wakeley stabbing.

Seeking to keep abreast of these developments, the government has recently completed, or is currently undertaking, a number of online safety reforms, including:

  • a Statutory Review of the Online Safety Act, which commenced in November 2023 and provided its final report to government on 1 November 2024 (the report has not yet been made public)
  • the Joint Select Committee on Social Media and Australian Society, which published its final report on 18 November 2024.

Social media minimum age

There is no current legislated minimum age for social media use in Australia. Children under the age of 13 are generally unable to sign up for social media accounts. However, this restriction is imposed by global platforms operating in line with the United States’ Children's Online Privacy Protection Rule (COPPA), rather than enforced by Australian regulation, and it can be easily circumvented.

In November 2023, Shadow Minister David Coleman introduced a private member’s bill to compel the government to conduct a trial of age-verification technology, in line with a recommendation of the Roadmap for age verification – released by the eSafety Commissioner in the context of children accessing online pornography. Mr Coleman’s Bill specified that the trial should include social media platforms. While the Bill did not proceed, the suggested trial was funded in the May 2024 Budget (p. 150).

Introducing a minimum age for social media gained popular support over the ensuing months. The Federal Government expressed support for a ban on under-16s accessing social media in June 2024, the Opposition pledged to implement a ban within 100 days of taking office, state premiers made similar commitments, and campaigns were launched by a range of media outlets. The Government of South Australia appointed former Chief Justice of the High Court Robert French AC to undertake an Independent Legal Examination into banning children’s access to social media; the resulting report was published in September 2024.

On 8 November 2024, the Federal Government announced its intention to ‘legislate 16 as the minimum age for access to social media, following endorsement by National Cabinet’. 

 

Policy position of non-government parties/independents

As noted above, the Opposition has also committed to introducing legislation to restrict access to social media by those under 16. On 8 November 2024, Shadow Minister for Communications, David Coleman, welcomed the government’s proposed legislation and stated that he did not support any major social media platforms being exempt from the obligations.

However, Nationals Senator Matt Canavan and Liberal Senator Alex Antic have voiced concerns over the Bill.

The Australian Greens have criticised the Bill, stating its introduction to Parliament is ‘rushed, reckless and goes against the evidence’. Senator Sarah Hanson-Young – Greens Spokesperson for Communications and Deputy Chair of the recent Joint Select Committee on Social Media and Australian Society – noted that the Committee did not find an age ban to be an appropriate solution to online harms and has proposed alternative approaches. Senator Hanson-Young also called on the Government to release the report of the Online Safety Act review.

The positions of other members of the cross-bench have been mixed:

  • Kylea Tink MP issued a statement saying that ‘social media bans are not the answer to youth mental health challenges’ and calling for evidence-based policy.
  • Zoe Daniel MP similarly stated that a ban is not the right ‘pathway to go down’.
  • In September 2024, Senator Jacqui Lambie supported greater protection against online harms but suggested a ban may not be the most appropriate solution.
  • The ABC reports that Senator David Pocock ‘is broadly supportive of the bill but still has concerns around the detail and process’.
  • Senator Tammy Tyrrell has questioned whether enforcement of a ban will work.
  • One Nation Senator Malcolm Roberts suggested the Bill was ‘government overreach’ and supported the Bill being referred to a Senate inquiry.
  • United Australia Party Senator Ralph Babet similarly stated that the Bill was ‘overreach’.
 

Key issues and provisions

Social media minimum age

The key amendment of the Bill is item 7 which inserts a new Part 4A—Social media minimum age into the Online Safety Act. The object of Part 4A is ‘to reduce the risk of harm to age-restricted users from certain kinds of social media platforms’ (proposed section 63B). The new term age-restricted user will be inserted into section 5 of the Act meaning ‘an Australian child who has not reached 16 years’ (item 2).

What platforms are affected?

Age-restricted social media platform (ARSMP) is defined at proposed section 63C. This section outlines the scope of the proposed obligation to be introduced, while also providing the Minister with flexibility to specify which platforms are or are not covered by the provisions through making legislative rules.

The definition of ARSMP draws on the existing definitions in the Act, in particular electronic service (section 5). This broad definition, with some exclusions, covers:

  • a service that allows end‑users to access material using a carriage service
  • a service that delivers material to persons having equipment appropriate for receiving that material, where the delivery of the service is by means of a carriage service.

Under proposed section 63C, an ARSMP would be an electronic service which satisfies the following conditions:

  • the sole purpose, or a significant purpose, of the service is to enable online social interaction between 2 or more end-users[2]
  • the service allows end-users to link to, or interact with, some or all of the other end-users
  • the service allows end-users to post material on the service
  • such other conditions (if any) as are set out in the legislative rules.

Further, only services that include material that is accessible to, or delivered to, one or more end-users in Australia are included within the definition of ARSMP.[3]
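Read together, these provisions operate as a cumulative test. The following sketch is purely illustrative: the attribute names are hypothetical, not drawn from the Bill, and the statutory text (and any legislative rules) would prevail.

def is_age_restricted_platform(service) -> bool:
    """Illustrative only: the cumulative test in proposed section 63C.

    Attribute names are hypothetical; the statutory text and any
    legislative rules made by the Minister would prevail.
    """
    return (
        # sole purpose, or a significant purpose, is to enable online
        # social interaction between 2 or more end-users
        service.significant_purpose_is_social_interaction
        # end-users can link to, or interact with, other end-users
        and service.allows_linking_or_interaction
        # end-users can post material on the service
        and service.allows_posting_material
        # any further conditions set out in the legislative rules
        and service.satisfies_rule_conditions
        # material is accessible to, or delivered to, one or more
        # end-users in Australia
        and service.has_australian_end_users
    )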

In addition to this definition, the Bill provides that the Minister may, through legislative rules, specify services to be ARSMPs, or not to be ARSMPs.[4] In her second reading speech, Minister for Communications Michelle Rowland indicated that these provisions will allow ‘flexibility to reduce the scope or further target’ the definition of ARSMP:

Achieving this through rules, rather than primary legislation, enables the government to be responsive to changes and evolutions in the dynamic social media ecosystem. Rules can be made to allow for additional conditions that must be met, in order to fall within the definition of 'age-restricted social media platform'.

To be clear, the government expects that this broader definition will capture services that are commonly accepted to be social media, and the services that are causing many parents the most concern. This will, at a minimum, include TikTok, Facebook, Snapchat, Reddit, Instagram, X (formerly Twitter), among others. These services will be required to take reasonable steps to prevent persons under 16 years of age from creating or holding an account.

Further, the Explanatory Memorandum (EM) states that, ‘in the first instance’, the Government proposes to make legislative rules to exclude messaging apps, online gaming services and ‘[s]ervices with the primary purpose of supporting the health and education of end-users’ from the definition of ARSMPs (p. 4). These legislative rules are intended to be ‘consulted on, settled and made’ before the commencement of the civil penalty provision.

There are some constraints on which services the Minister may specify as being or not being an ARSMP. Proposed subsection 63C(4) provides that the Minister may only make legislative rules specifying that an electronic service is an ARSMP if ‘satisfied that it is reasonably necessary to do so in order to minimise harm to age-restricted users’. Further, before the Minister specifies in the legislative rules that an electronic service is or is not an ARSMP, the Minister must seek the advice of the eSafety Commissioner and may seek advice from ‘any other authorities or agencies of the Commonwealth that the Minister considers relevant …’. The legislative rules will be disallowable instruments.

There also appears to be an attempt to exclude electronic services used for business purposes or interactions from the definition of ARSMP.[5]

Key issue: is social media harmful to children?

There are well-acknowledged risks associated with social media use; risks which may be heightened for children, who may not yet have the skills, experience and cognitive capacity to navigate complex online environments. There is also emerging research suggesting that social media may affect children’s mental health. For example, analysis of the Longitudinal Study of Australian Children suggests that:

increased frequency of SNS [social networking sites] use over time is associated with greater levels of depressive symptoms… Although no causal effects can be drawn from the present study, the findings align with emerging evidence showing that increased social media use may lead to poorer mental health. [emphasis added]

However, there are also benefits to be gained from social media use.

This duality of benefit and risk of harm is well summarised in Chapters 3 and 4 of the final report of the Joint Select Committee on Social Media and Australian Society. The report acknowledges the undeniable harms and deep pain inflicted on some young people and their families by social media – including the effects of disordered eating, bullying and self-harm – yet it also concludes that social media amplifies, rather than causes, many harms (p. 19).

Further, the report considers the benefits and positive outcomes that social media may bring (pp. 34–38). The report quotes, for example, recent research by UNICEF Australia which found that ‘81% of [young] social media users says it has a positive influence on their lives’ (p. 6).[6]

As noted by the American Psychological Association in a May 2023 discussion of social media use in adolescence, risks and benefits are individualised, and risk does not necessarily amount to harm:

using social media is not inherently beneficial or harmful to young people...the effects of social media likely depend on what teens can do and see online, teens’ preexisting strengths or vulnerabilities, and the contexts in which they grow up.

Adolescents’ experiences online are affected by both 1) how they shape their own social media experiences (e.g., they choose whom to like and follow); and 2) both visible and unknown features built into social media platforms. [emphasis added]

Research by the eSafety Commissioner supports this view, highlighting that the opportunities and risks for marginalised groups often differ from those of young people generally.[7]

Is the measure proportionate to the risk?

There is limited evidence to suggest that a blanket ban prohibiting children from using social media is the most effective way to address online harms.

Evidence to the inquiry referred to above does suggest some support for a ban (p. 91); however, many submissions proposed this approach only as a stopgap solution in the context of an otherwise unregulated social media sphere, or recommended against the policy.[8]

The Committee itself refrained from recommending a minimum age for social media, instead recommending several other measures relating to the further regulation of social media networks (pp. 119–121), including Recommendation 7:

 the Australian Government support research and data gathering regarding the impact of social media on health and wellbeing to build upon the evidence base for policy development.

Further, many researchers and academics of media and communications in Australia and internationally have criticised a blanket ban. Of note, a recent letter signed by over 140 researchers and academics expresses concern ‘that a “ban” is too blunt an instrument to address risks effectively’.[9]

The eSafety Commissioner has also expressed concerns. In its submission to the inquiry, it noted that ‘a particular concern for eSafety is that restriction-based approaches may limit young people’s access to critical support’ (p. 11). In June 2024, the Commissioner also stopped short of endorsing a ban, suggesting that – like learning to swim – children benefit from education and guardrails rather than from being kept out.

Implementation and Enforcement

How will the ban be implemented?

The Bill does not specify how platforms are expected to enforce the age restriction.

Proposed section 63D outlines that providers of ARSMPs ‘must take reasonable steps to prevent age-restricted users having accounts with the age-restricted social media platform’. Failing to meet this requirement may result in a maximum civil penalty of $49.5 million.[10]

However, what is meant by ‘reasonable steps’ is not defined within the Bill. Item 5 of the Bill proposes to amend the functions of the eSafety Commissioner in subsection 27(1) of the Act to include:

(qa) to formulate, in writing, guidelines for the taking of reasonable steps to prevent age-restricted users having accounts with age-restricted social media platforms; and
(qb) to promote guidelines formulated under paragraph (qa)

The guidelines formulated by the eSafety Commissioner will not be legislative instruments (and therefore not subject to parliamentary disallowance).[11]

It is expected that the guidelines will be informed by the Age Assurance Technology Trial, which is currently underway. Which technological solutions or expectations the guidelines may recommend is not yet known. The Roadmap for age verification, released by eSafety in March 2023, found that ‘the age assurance market is immature but developing [and that] Each technology has benefits and trade-offs’ (p. 8). The Guardian has also reported that ‘no countries have implemented an age verification mandate without issue’.[12]

When will the ban be implemented?

Restricting under-age users from social media will not take immediate effect. Proposed section 63E provides that the Minister may, by notifiable instrument (again not subject to parliamentary disallowance), specify the day on which the civil penalty provision takes effect. This must not be earlier than 12 months after the day proposed section 63E commences.[13]

The Explanatory Memorandum states:

In deciding the date on which the minimum age obligation will commence, the Minister will consider affected stakeholders, as well as the Government’s age assurance trial, funded to take place throughout 2024-25, which will inform guidance to industry on what age assurance technologies would be considered ‘reasonable’ and consistent with minimum age obligation. (p. 23)

While this statement suggests the Minister will take specific matters into account in determining when the civil penalty provision commences, there is no obligation for the Minister to do so.

Will everyone have to be age-verified?

In a Senate Estimates hearing on 5 November 2024, a representative from the Department of Infrastructure, Transport, Regional Development, Communications and the Arts confirmed that all account holders on ARSMPs will have to verify their age, not only those under the age of 16 (p. 27).

Will children circumvent the age restriction?

The civil penalty provision at the centre of the Bill states that providers of ARSMPs must take reasonable steps to prevent age-restricted users from having accounts. It does not otherwise place any obligations on ARSMPs to prohibit people under the age of 16 from accessing content on their platforms. There is no civil penalty for parents or other people who provide access to ARSMPs for children under 16.

Further, some commentary has suggested that children may bypass age verification restrictions.

The eSafety Commissioner has also expressed concern for those who do bypass age restrictions and who may use social media in secrecy:

This may mean that they access social media without adequate protections in place and are more likely to use less regulated non-mainstream services that increase their likelihood of exposure to serious risks. Restriction-based approaches can also reduce young people’s confidence or inclination to reach out to a trusted adult for help if they do experience harm, which is a key protective factor for safer internet use.

… Banning children of a certain age also doesn’t work to build the capacity of young people to engage online safely. Bans also place the onus on children to keep themselves safe, rather than putting the onus on online platforms and services to keep young people safe. (p. 12)

Further:

Even if social media could be demarcated and separated from other media, a primary concern is that children would migrate to other services and platforms with fewer safeguards. (p. 11)

Privacy concerns

The eSafety Commissioner’s July 2024 Tech Trends Issues Paper: Age assurance identifies that ‘age assurance technologies can pose privacy risks due to the type and amount of data they collect, store, use, and share’ (p. 11).

With regard to privacy concerns, proposed section 63F will establish privacy obligations where an ‘entity’ holds personal information about an individual that was collected for the purpose of, or for purposes including the purpose of, taking reasonable steps to prevent age-restricted users having accounts with an age-restricted social media platform.

If the entity uses or discloses the information without falling within one of the exceptions, this will be ‘taken to be’:

  • an interference with the privacy of the individual for the purposes of the Privacy Act 1988
  • covered by section 13 of that Act (which deals with interferences with privacy).

The Explanatory Memorandum notes that serious and repeated interferences with privacy ‘could result in maximum penalties of $50 million or above (per section 13G of the Privacy Act)’ (p. 24).

There will be exceptions if the entity uses or discloses the information:

  • for the purpose of determining whether or not the individual is an age-restricted user
  • in circumstances where certain Australian Privacy Principles exceptions apply (including where required by law, in a ‘permitted general situation’ listed in section 16A, or a ‘permitted health situation’ listed in section 16B)
  • with the consent of the individual (specified in new subsection 63F(2)).

There will also be an obligation on entities to destroy the collected information ‘after using or disclosing it for the purposes for which it was collected’. A failure to destroy the information will also be ‘taken to be’ an interference with privacy and covered by section 13 of the Privacy Act.
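In effect, proposed section 63F establishes a default prohibition with enumerated exceptions, plus a separate destruction obligation. A simplified sketch follows; the attribute names are hypothetical, not drawn from the Bill, and the statutory text would prevail.

def is_taken_to_interfere_with_privacy(use) -> bool:
    """Illustrative only: the structure of proposed section 63F.

    Attribute names are hypothetical; the Bill's text would prevail.
    """
    # Exceptions: determining whether the individual is an
    # age-restricted user; certain Australian Privacy Principles
    # exceptions (e.g. required by law, a 'permitted general situation'
    # or 'permitted health situation'); or the individual's consent.
    exception_applies = (
        use.for_determining_age_restricted_status
        or use.app_exception_applies
        or use.individual_consented
    )
    # A use or disclosure outside the exceptions is 'taken to be' an
    # interference with privacy (Privacy Act, section 13).
    if not exception_applies:
        return True
    # Separately, the information must be destroyed after it has been
    # used or disclosed for the purposes for which it was collected.
    if use.information_not_destroyed_after_use:
        return True
    return False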

The Bill expressly proposes to use the definitions of ‘entity’ and ‘personal information’ from the Privacy Act, both of which have broad coverage. In particular, ‘entity’ means an agency, an organisation or a small business operator (section 6). ‘Organisation’ is also a defined term: section 6C of the Privacy Act provides that it means (with some exceptions) an individual (a natural person), a body corporate, a partnership, any other unincorporated association or a trust.

Age discrimination and the rights of the child

Age discrimination exemption

The Age Discrimination Act 2004 contains a range of protections against discrimination on the basis of age. For example, section 31 provides that it is unlawful for a person who is responsible for the administration of Commonwealth laws and programs to ‘discriminate against another person on the ground of the other person’s age in the performance of that function, the exercise of that power or the fulfilment of that responsibility’.

Item 17 of the Bill will amend Schedule 2 of the Age Discrimination Act to exclude the eSafety Commissioner’s new guideline-making functions and persons acting in compliance with proposed Part 4A from the coverage of the legislation. The Explanatory Memorandum states this is intended to ‘make clear that compliance with the provisions of the Online Safety Act is lawful and should not give rise to age discrimination complaints’ (p. 28).

Co-design with children

Several witnesses who provided evidence to the Joint Select Committee on Social Media and Australian Society cited the Convention on the Rights of the Child (CRC; p. 102). The Australian Human Rights Commission highlighted several pertinent articles of the CRC including Article 12:

1. States Parties shall assure to the child who is capable of forming his or her own views the right to express those views freely in all matters affecting the child, the views of the child being given due weight in accordance with the age and maturity of the child.

2. For this purpose, the child shall in particular be provided the opportunity to be heard in any judicial and administrative proceedings affecting the child, either directly, or through a representative or an appropriate body, in a manner consistent with the procedural rules of national law.

The Bill’s statement of compatibility with human rights notes that the Bill:

only restricts and penalises the behaviour of platforms, not that of children…

To the extent that the Bill engages with the right to freedom of expression, the restrictions are reasonable, necessary and proportionate to promote the best interests of children, and protect children from the harms outlined in this statement (p. 12)

The Committee recommended that:

any features of the Australian Government's regulatory framework that will affect young people be co-designed with young people.

Safety by design

As noted above, there is broad consensus that there are risks associated with young people’s use of social media. On 14 November 2024, the Government announced its intention to legislate a ‘Digital Duty of Care’ to ‘place the onus on digital platforms to proactively keep Australians safe and better prevent online harms.’

This proactive approach is in line with the concept of Safety by Design promoted by the eSafety Commissioner.

The Digital Duty of Care is notably absent from this Bill. However, in her second reading speech, the Minister noted that:

Legislating a digital duty of care is a separate body of work […]

Legislating a duty of care will mean services can't 'set and forget'. Instead, their obligations will mean they need to continually identify and mitigate potential risks, as technology and service offerings change and evolve.

Exemptions clause

As discussed above, proposed subsections 63C(6) and (7) provide that the Minister may specify that a digital service is out of scope of the definition of ARSMP. The Explanatory Memorandum notes that:

For example, the Minister could use this power to disapply the definition to messaging services, where users do not generally face the same harmful features as other mainstream social media services, including algorithmic content recommendation, endless scroll and other psychological manipulation techniques which encourage near-endless engagement. (p. 20)

It remains unclear whether social media platforms designed for age-restricted users – such as Instagram Teen Accounts – will be provided an exemption. However, the ABC has reported that the Minister:

has urged platforms to think about setting up dedicated, age-appropriate channels for young people.

Speaking to Nine radio, the minister pointed to YouTube Kids as a specific example.

"Part of what we want to do is encourage platforms to develop low-risk services," she said.

Other provisions

Information gathering powers and platform provider notifications

Under proposed section 63G, the eSafety Commissioner will be able to issue written notices to persons to require them to give the eSafety Commissioner information. Notices may be issued where the eSafety Commissioner ‘believes on reasonable grounds’:

  • a person is a provider of an ARSMP who has information relevant to the person’s compliance with section 63D (the ‘reasonable steps’ obligation) or
  • a person is a provider of an electronic service and has information relevant to whether or not the service is specified in the legislative rules.

Failure to comply with a notice may lead to a maximum civil penalty of 500 penalty units (currently $165,000) (proposed section 63H).

Under proposed section 63J, the eSafety Commissioner will also be able to prepare statements, copy them to providers of ARSMPs and publish them (where the eSafety Commissioner considers it appropriate) when an ARSMP:

  • has contravened section 63D (the ‘reasonable steps’ obligation) or
  • has used, disclosed or failed to destroy information in a way that is taken to be an interference with privacy (the privacy obligations in proposed section 63F).

Increase in penalties for non-compliance with industry codes and standards

The Online Safety Act sets out a regime for registration and compliance with industry codes and standards. Items 8 and 11 significantly increase the maximum penalties when a person fails to comply with a direction by the eSafety Commissioner to comply with an industry code under section 143 or fails to comply with an industry standard under section 146. These maximum penalties will increase from 500 penalty units (currently $165,000) to 30,000 penalty units ($9,900,000). Further, the Explanatory Memorandum notes ‘[f]or a body corporate, the maximum penalty increases to 150,000 penalty units (currently equivalent to $49.5 million), per section 82(5) of the Regulatory Powers Act’ (pp. 26–27).
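The dollar figures above follow from the current Commonwealth penalty unit value of $330 (as implied by the 500-unit/$165,000 figure), with bodies corporate facing up to 5 times the individual maximum under subsection 82(5) of the Regulatory Powers Act. A quick illustrative calculation:

PENALTY_UNIT = 330  # current Commonwealth penalty unit, in dollars

def max_penalty_dollars(units: int, body_corporate: bool = False) -> int:
    # Bodies corporate face up to 5 times the individual maximum
    # (subsection 82(5) of the Regulatory Powers Act).
    multiplier = 5 if body_corporate else 1
    return units * PENALTY_UNIT * multiplier

max_penalty_dollars(500)                          # $165,000
max_penalty_dollars(30_000)                       # $9,900,000
max_penalty_dollars(30_000, body_corporate=True)  # $49,500,000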

Infringement notices, enforceable undertakings and injunctions

The ‘reasonable steps’ obligation in proposed section 63D and compliance with the eSafety Commissioner’s information-gathering notices under proposed section 63H will be added to the parts of the Online Safety Act that allow regulators to use infringement notices, enforceable undertakings and injunctions under Parts 5, 6 and 7 of the Regulatory Powers Act.

Review of Part 4A

Item 16 inserts proposed section 239B which requires the Minister to cause an ‘independent review’ to be conducted of the operation of Part 4A. This must occur within 2 years after the day the civil penalty provision in proposed section 63D takes effect. The written report of the review must be tabled in both Houses of the Parliament within 15 sitting days of being received by the Minister.