Chapter 3 - Complementary and alternative approaches to online safety

3.1 This chapter considers inquiry participants’ evidence on complementary and alternative approaches to regulation aimed at improving children’s and young people’s online safety. In particular, the chapter considers evidence on:

digital duty of care;

education, digital literacy and social change;

device level controls; and

enhanced privacy laws.

3.2 The chapter concludes with the committee’s view and recommendations.

Overview

3.3 Although the majority of inquiry participants indicated significant concerns about the risks for children and young people in the digital environment and on social media, many did not believe the industry codes, the legislated Social Media Minimum Age (SMMA) obligation, or the use of age assurance to give effect to those restrictions were effective solutions.

3.4 Instead, many participants advocated for a more systemic approach to children’s online safety, calling for social media and online spaces to be made safer for young people, rather than restricting their access.[1] The Australian Human Rights Commission (AHRC), for example, argued:

Regulatory efforts like this must focus on embedding safety by design features that respect, protect and promote human rights. Age assurance should not be treated as the default or preferred method of online safety, particularly where it risks undermining privacy, equality and freedom of expression.[2]

3.5 Inquiry participants supported a range of alternative and complementary regulatory approaches to protect children and young people online, which they argued are more effective, systemic and future-proof. The AHRC summarised this sentiment:

Other safety by design approaches (such as content filtering, crisis response tools, education and Digital Duty of Care) could offer a more proportionate and rights-respecting pathway to protecting children online. These approaches have potential to shift responsibility from users to service providers and can be implemented with less risk of compromising privacy, autonomy or inclusion.[3]

3.6 Dr Rys Farthing similarly expressed support for an approach that targets the risks platforms have engineered, not who the user is.[4] Dr Farthing argued for an upstream, systemic approach that would see platforms conducting risk assessments on content, explaining:

Every single piece of how they work, from what the like button looks like to how the content recommender system prioritises content in your feed, is code that's written by humans. That can be changed. If we take that systemic focus that looks at what the systems and processes that digital platforms and services build are and actually place requirements on them to drive up safety, to drive up security and to drive up privacy, then I think we start to see the risk profile of the digital world decrease.[5]

3.7 The Tech Policy Design Institute raised additional concerns that the current legislative measures risk being interpreted as a ‘job done’ solution, rather than as part of an ongoing process of improvement. Whilst noting that Australia is making positive changes in this space, Ms Johanna Weaver, Co-Founder and Executive Director of the Tech Policy Design Institute, advised that urgent reform needs to continue and should extend to duty of care, privacy and competition if systemic change is to be realised.[6]

Digital duty of care

3.8 Inquiry participants indicated strong support for digital duty of care obligations to be placed on digital platforms, as a wide-reaching mechanism to strengthen online safety.[7]

3.9 In 2024, the Government announced the introduction of new Duty of Care obligations, as recommended in the statutory review of the Online Safety Act 2021.[8] A digital duty of care would place a legal responsibility on all digital platforms to proactively keep Australians safe and prevent harms more effectively. The report of the independent statutory review of the Online Safety Act explained:

A duty of care shifts the emphasis of regulation from reactively tackling specific pieces of material to remediate the harm, to taking a preventative and systems-based approach.[9]

3.10 Ms Lucy Thomas OAM of Project Rockit explained that a duty of care obligation is ‘a much more future-proofed, future-focused and effective method’ than potentially excluding young people from digital participation. She explained that the duty of care approach:

… embeds a safety-by-design ethos by ensuring that platforms that design these digital environments—and profit from our use of them—are compelled to create environments that are worthy of young people.[10]

3.11 Inquiry participants highlighted that the SMMA obligation does not capture all platforms, nor all mechanisms of harm.[11] A similar point was made by the report of the independent statutory review of the Online Safety Act 2021,[12] which noted ‘[t]here are other vectors of harm not properly captured by a focus on content, including contact and conduct.’[13]

3.12 The committee was advised that duty of care obligations should be prioritised to address these shortfalls, as well as ‘deeper issues’ such as algorithms, recommender systems, infinite scrolling, addictive designs, misinformation, AI, and more.[14] Inquiry participants also highlighted that duty of care obligations can apply to all age groups, and can address both harmful content and the extraction of personal data. Mr John Pane of Electronic Frontiers Australia, for example, explained:

A digital duty of care requires platform design to not only inherently minimise content or function related harm but also severely restrict personal data extraction and prohibit algorithmic manipulation.[15]

3.13 Some participants noted that a digital duty of care also provides a flexible and open-ended approach that incentivises and encourages better behaviour in minimising inappropriate content, and gives rise to proactive, ongoing regulatory engagement.[16] The abovementioned 2024 review of the Online Safety Act echoed this sentiment, noting:

It is also an approach that can deal with technologies and harms not yet dreamed of. It can help future proof regulation. Algorithms, recommender systems, addictive design, artificial intelligence, and generative artificial intelligence, business decisions and more are all factors that shape an individual’s online experience and have the potential to create significant harm.[17]

3.14 Significantly, the committee was advised that duty of care obligations can be implemented in parallel with the current legislative approach. Dr Rys Farthing, for example, highlighted:

At the moment, in Australia, the focus appears to be more on finding out which piece of content is what and which user is what. We're looking at really specific instances of use and users, rather than looking at how platforms engineer risk. Those approaches can go hand in hand. You could do both at the same time.[18]

Education, digital literacy and social change (parental empowerment)

3.15 Many inquiry participants championed enhanced education and digital literacy as essential to a comprehensive approach to children’s online safety.[19] For example, Ms Lauren Frost of the Youth Affairs Council Victoria argued:

We believe we should be investing in co-designed education programs and resources for young people, parents, carers, youth workers and educators to ensure that young people are supported and confident to engage with online spaces safely and, importantly, to ensure they know where to go for support if they experience online harm.[20]

3.16 The Centre for Multicultural Youth echoed this sentiment, noting that education, training and support for digital literacy are critical, particularly if children find a way around the proposed controls or are driven underground. Ms Harini Kasthuriarachchi explained:

It's important to remember that this ban isn't intended to punish young people; its intention is to try [and] manage the risks that social media pose. Alongside the ban and implementation, it's really important that we communicate really clearly with young people that, if they do encounter issues if and when they do access social media, there are supports for them and they should reach out to adults or the eSafety Commission—whoever might be relevant. We must ensure that they're not being punished for platforms continuing to be unsafe. There are risks we know are continuing to be perpetuated.[21]

3.17 QUT Media Centre praised the emphasis on digital literacy and civic education in Finland, Sweden and Indonesia, where digital literacy resources and training are targeted at a broad cross-section of the community. QUT Media Centre explained that a whole-of-society approach to digital literacy builds resilience against misinformation and harmful online content, recognising ‘that critical thinking and digital literacy skills are important sites of development for young people as well as a lifelong skill for all current and future users of digital systems.’ It further noted:

These strategies work not by seeking to eliminate risk entirely, which is impossible, but by equipping young people with the skills and confidence to navigate complex digital environments safely.[22]

3.18 Some participants also emphasised the need to empower parents and to initiate social change.[23] Ms Elizabeth O’Shea from Digital Rights Watch, for example, noted there is a lot of work to be done in digital literacy that cannot be ignored and is going to be ‘generations long’.[24] She encouraged having meaningful conversations with parents and ‘encouraging greater conversations between parents and children about responsible use’, whilst also acknowledging the difficulties parents can face.[25]

3.19 Mr Pane of Electronic Frontiers Australia highlighted the need for social change to support regulatory reforms. He contended that introducing digital safety and digital health into the curricula at primary and secondary school levels must be accompanied by parental responsibility in relation to the types of devices young children are given.[26]

3.20 Online safety technology company Qoria reinforced the importance of parental empowerment in achieving better outcomes in a child’s digital life, including via technology loaded on a device. Mr Tim Levy, Qoria Founder and Managing Director, emphasised that ‘safety starts with protecting the device’ because ‘that's the gateway to the internet.’[27] Mr Levy also raised concerns that ‘the rights and roles of parents in online safety seem to be completely absent from all these discussions’, describing this as a fatal flaw.[28]

Device level controls

3.21 The committee heard from inquiry participants that although duty of care obligations are important, ‘outsourcing safety to global platforms’[29] will not be adequate to achieve the wide-reaching results necessary for children’s online safety and must be backed by device-level controls.[30]

3.22 Mr Levy called for a focus on policy settings ‘that facilitate the simple, reliable, and interoperable use of these tools, ensuring parents' choices are respected across all devices and platforms.’[31] He explained to the committee that parental use of downloadable device controls will not prevent the use of VPNs ‘to get around the kind of geofenced protections that we're trying to put in place in this country’.[32] Instead, Mr Levy emphasised that greater interoperability between the parental control systems of tech giants, and greater access to enterprise safety technologies for families, were key to achieving effective device-level controls.[33]

3.23 Ms Weaver of the Tech Policy Design Institute echoed this advice, stating that interoperability and addressing barriers to on-device protection need to be part of a systemic change.[34]

3.24 Qoria submitted that enterprise technology controls are ‘the method (so-called “end point protection”) which is prioritised by businesses to protect their information, services and devices.’[35] Mr Levy explained that these technologies can control whether and how VPNs are used, can stop access to the dark web, and can direct children to safer versions of search engines.[36] Qoria’s experience in the United States demonstrates that, when installed at an ‘enterprise’ level, such as by schools, safety tools ‘ensure age-appropriate access to all online platforms … and the entirety of the web.’[37] Mr Levy noted:

Enterprise safety technology provides almost all the security, privacy and safety measures that parents are begging for. But, that technology is being withheld by Google, Apple and Microsoft who only provide that access to enterprise app developers.[38]

3.25 The committee was advised that the buying power of, and demand from, enterprises have driven enterprise access to safety technology. However, Google, Apple and Microsoft are ‘not allowing true competition in safety technology in the consumer world because parents don't have buying power.’[39]

3.26 Mr Levy noted that licensing issues can be solved with enhanced competition, including through pressure such as utilising levers in government procurement practices.[40] He stated ‘[i]f you free us up with interoperable access to this technology, the market will solve the problem of parents.’[41]

Prohibiting monetisation of children’s data

3.27 In addition to the concerns outlined above relating to the potential misuse of data collected for the purposes of age verification, inquiry participants submitted that a broad prohibition on the monetisation of children’s data is a necessary feature of a systemic approach to online safety.

3.28 Submitting that ‘data is the currency of the online world’, UNICEF outlined some of the emerging risks for children relating to data and privacy, including ‘through data monetisation, microtargeted advertising, profiling and automated decision-making.’[42]

3.29 Digital Rights Watch Chair, Ms Elizabeth O’Shea, advised that prohibiting the monetisation of children’s data is one mechanism to minimise profiling and ‘the tendency for social media platforms to send young people down rabbit holes’.[43] Further, Ms O’Shea emphasised that ‘there is a whole advertising ecosystem that we might wish to interrogate’, calling for greater transparency around where advertising revenue for digital platforms comes from. She explained:

… we know that gambling companies, alcohol companies and junk-food companies are big advertisers on social media. A prohibition on gambling advertising for young people is a straightforward policy reform that could reduce harm and that, as I understand it, is supported by the vast majority of Australians.[44]

3.30 Additionally, one inquiry participant questioned advertising claims made by social media platforms, stating:

It may well be true that they're not serving ads of organisations that pay for advertising on their platform, but those platforms are full of influencers, and a lot of them are doing so to promote a product. That's advertising, so I would question that.[45]

Enhanced privacy laws

3.31 Enhanced privacy protections were also noted as essential for a comprehensive online safety regulatory framework.

3.32 The Privacy Commissioner, Ms Carly Kind, noted that there are new privacy protections in the Social Media Minimum Age scheme, generally relating to the purpose limitation of information collected and associated requirements to destroy information collected for age-assurance purposes once that purpose is fulfilled.[46]

3.33 Additionally, the Children’s Online Privacy Code is being drafted by the Office of the Australian Information Commissioner (OAIC). The Alannah and Madeline Foundation expressed its support for the ‘development and appropriate resourcing of a meaningful, comprehensive Children's Online Privacy Code which treats the best interests of the child as its central priority.’[47]

3.34 However, some participants were concerned that the framework within which the Children’s Online Privacy Code is being drafted is fundamentally flawed, particularly the Australian Privacy Principles (APPs).[48] Mr Pane of Electronic Frontiers Australia explained to the committee:

The existing APPs are already fundamentally flawed and have not kept pace with technology, despite their claims of being technologically neutral. This means, again, the government is poised to pass privacy regulation based on a redundant and antiquated regulatory precedent.

A genuinely safe internet for children begins with a mandated digital duty of care backed by privacy laws strong enough to prevent and prosecute against the systemic harms caused by surveillance based capitalism…[49]

3.35 Some inquiry participants argued there is a need for amendments to the Privacy Act, and urged the adoption of the remaining recommendations of the Privacy Act Review Report that were accepted, or accepted in principle, by the government.[50] Digital Rights Watch explained:

We would advocate that we need to move to a world of data minimisation and that that should be enshrined in the Privacy Act through various amendments that the government has accepted need to be made to the Privacy Act but have not yet progressed.[51]

3.36 In particular, inquiry participants indicated support for privacy reforms introducing a fair and reasonable test,[52] requiring ‘that the collection, use and disclosure of personal information must be fair and reasonable in the circumstances.’[53]

Committee view

3.37 Like many inquiry participants, the committee welcomes efforts to improve online safety for Australian children and young people. The committee recognises that online safety is a difficult space to regulate given the complex, decentralised and evolving nature of the digital world.

3.38 Nonetheless, it is vital that Australia pursues highly effective safeguards that enable children and young people to participate in the online world without being exposed to age-inappropriate material and without experiencing anti-social behaviour or unlawful conduct, such as bullying and sexual harassment. Such harms demonstrably undermine children’s and young people’s wellbeing.

3.39 At the same time, children and young people are growing up in a world where online technologies are ubiquitous and, indeed, instrumental to social participation. Search engines and digital media platforms serve as vast repositories of information from which young people can learn. Similarly, social media platforms are often spaces where young people engage with their peers and the world.

3.40 Achieving the right balance between online protections for children and young people and supporting their online participation is clearly a challenging task. The age assurance measures under the Search Engine Services Code and the SMMA obligation are substantial measures that will have a major effect on the online experience of millions of Australians. Indeed, the SMMA obligation is a world-first regulation.

3.41 However, a central concern for the committee is whether the current Australian regulations reflect world’s best practice for children’s and young people’s online safety and, if not, what needs to be improved.

3.42 Unfortunately, much of the evidence to this inquiry highlights serious concerns regarding the implementation of age assurance measures under the Search Engine Services Code and the SMMA obligation. In the committee’s view, many of these concerns remain unresolved and represent significant problems that could meaningfully undermine the efficacy of Australia’s soon-to-be-implemented age assurance measures, and with them the intent of improved online safety for children and young people.

3.43 At the time of the committee’s report, age assurance under the SMMA obligation is two weeks away from commencing and age assurance under the Search Engine Services Code will commence in a month. Given the significance of these changes, the committee is concerned that so many of the details around the implementation of age assurance remain uncertain. Crucially, this includes a lack of detail on the specific age verification mechanisms that search engine operators and social media companies intend to apply to their respective platforms.

3.44 The committee is concerned that simply banning young people from a certain number of platforms will drive them to other, less safe and less controlled platforms. Platforms like Telegram, Roblox and other gaming platforms also contain serious risks for young people but are not currently captured by the SMMA obligation.

3.45 The SMMA obligation could mean that platforms no longer provide age-appropriate accounts for young people, for example Instagram accounts that have direct messages automatically turned off. In addition, young people accessing platforms like YouTube in a logged-out state will lose the age-appropriate safeguards that those platforms currently provide.

3.46 The committee is concerned that an adequate education program has not been rolled out for the young people who will be cut off from their online communities and connections when the SMMA obligation comes into effect, especially those who are already socially isolated.

Recommendation 1

3.47 The committee recommends that the implementation of the Social Media Minimum Age obligation be delayed until 10 June 2026 to allow time for implementation and compliance issues to be properly considered, and for an education campaign for affected young people to be rolled out.

Recommendation 2

3.48 The committee recommends that the Australian Government legislate a digital duty of care to make online platforms safer for all users.

Recommendation 3

3.49 The committee recommends that the Australian Government legislate, as a matter of priority, to prohibit platforms from harvesting and exploiting the data of minors and to protect young people from targeted, unsolicited advertisements and algorithms, with a view to extending these protections to all users in the long term to protect all Australians’ safety and privacy.

Recommendation 4

3.50 The committee recommends that the eSafety Commissioner roll out an education program, including through schools, that delivers clear information to young people about which platforms are covered by the Social Media Minimum Age obligation and what the impacts will be, as well as information about digital literacy and online safety.

Senator Sarah Hanson-Young

Chair

Greens Senator for South Australia

Footnotes

[1] See, for example, Australian Human Rights Commission, Submission 53, p. 13; Dr Rys Farthing, Committee Hansard, 13 October 2025, p. 67; Ms Lauren Frost, Advocacy Manager, Policy and Communications, Youth Affairs Council Victoria, Committee Hansard, 13 October 2025, p. 34; Ms Johanna Weaver, Tech Policy Design Institute, Committee Hansard, 13 October 2025, p. 66.

[2] Australian Human Rights Commission, Submission 53, p. 13.

[3] Australian Human Rights Commission, Submission 53, p. 4.

[4] Dr Rys Farthing, Committee Hansard, 13 October 2025, p. 66.

[5] Dr Rys Farthing, Committee Hansard, 13 October 2025, p. 68.

[6] Ms Johanna Weaver, Co-Founder and Executive Director, Tech Policy Design Institute, Committee Hansard, 13 October 2025, p. 66.

[7] See, for example, Australian Human Rights Commission, Submission 53, p. 5; Ms Lauren Frost, Advocacy Manager, Policy and Communications, Youth Affairs Council Victoria, Committee Hansard, 13 October 2025, p. 34.

[8] The Hon Michelle Rowland, ‘New Duty of Care obligations on platforms will keep Australians safer online’, Media Release, https://minister.infrastructure.gov.au/rowland/media-release/new-duty-care-obligations-platforms-will-keep-australians-safer-online

[9] Delia Rickard PSM, Report of the statutory review of the Online Safety Act 2021, October 2024.

[10] Ms Lucy Thomas OAM, Chief Executive Officer, Project Rockit, Committee Hansard, 13 October 2025, p. 32.

[11] Ms Nicola Palfrey, Head of Clinical Practice, Headspace National Youth Mental Health Foundation, Committee Hansard, 13 October 2025, p. 32.

[12] Under section 239A of the Online Safety Act 2021, the Minister for Communications is required to initiate an independent review of the Act within three years of the Act’s commencement. The initial review of the Act was brought forward by one year.

[13] Delia Rickard PSM, Report of the statutory review of the Online Safety Act 2021, October 2024, p. 50.

[14] See, for example, Ms Lauren Frost, Advocacy Manager, Policy and Communications, Youth Affairs Council Victoria, Committee Hansard, 13 October 2025, p. 34.

[15] Mr John Pane, Chair, Electronic Frontiers Australia Inc., Committee Hansard, 13 October 2025, p. 43.

[16] Ms Elizabeth O’Shea, Chair, Digital Rights Watch, Committee Hansard, 13 October 2025, p. 46.

[17] Delia Rickard PSM, Report of the statutory review of the Online Safety Act 2021, October 2024, p. 51.

[18] Dr Rys Farthing, Committee Hansard, 13 October 2025, p. 66.

[19] See, for example, Ms Harini Kasthuriarachchi, Policy Officer, Centre for Multicultural Youth, Committee Hansard, 13 October 2025, pp. 39-40; Ms Lauren Frost, Advocacy Manager, Policy and Communications, Youth Affairs Council Victoria, Committee Hansard, 13 October 2025, p. 34.

[20] Ms Lauren Frost, Advocacy Manager, Policy and Communications, Youth Affairs Council Victoria, Committee Hansard, 13 October 2025, p. 34.

[21] Ms Harini Kasthuriarachchi, Policy Officer, Centre for Multicultural Youth, Committee Hansard, 13 October 2025, pp. 39-40.

[22] QUT Media Centre, Submission 14, [p. 12].

[23] See, for example, Ms Elizabeth O’Shea, Chair, Digital Rights Watch, Committee Hansard, 13 October 2025, p. 47; Mr John Pane, Chair, Electronic Frontiers Australia Inc., Committee Hansard, 13 October 2025, p. 47; Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 51.

[24] Ms Elizabeth O’Shea, Chair, Digital Rights Watch, Committee Hansard, 13 October 2025, p. 47.

[25] Ms Elizabeth O’Shea, Chair, Digital Rights Watch, Committee Hansard, 13 October 2025, p. 47.

[26] Mr John Pane, Chair, Electronic Frontiers Australia Inc., Committee Hansard, 13 October 2025, p. 47.

[27] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 55.

[28] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 51.

[29] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 52.

[30] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 54.

[31] Qoria, Submission 4, p. 4.

[32] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 56.

[33] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, pp. 51, 55.

[34] Ms Johanna Weaver, Tech Policy Design Institute, Committee Hansard, 13 October 2025, p. 66.

[35] Qoria, Submission 4, p. 3.

[36] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, pp. 51, 56.

[37] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 51.

[38] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 51.

[39] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 53.

[40] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 52.

[41] Mr Tim Levy, Founder, Managing Director, Qoria, Committee Hansard, 13 October 2025, p. 54.

[42] UNICEF, Submission 13, [p. 2].

[43] Ms Elizabeth O’Shea, Chair, Digital Rights Watch, Committee Hansard, 13 October 2025, p. 47.

[44] Ms Elizabeth O’Shea, Chair, Digital Rights Watch, Committee Hansard, 13 October 2025, p. 47.

[45] Mr John Pane, Chair, Electronic Frontiers Australia Inc., Committee Hansard, 13 October 2025, p. 48.

[46] Ms Carly Kind, Privacy Commissioner, Committee Hansard, 13 October 2025, p. 79.

[47] Alannah & Madeline Foundation, Submission 3, p. 5.

[48] Electronic Frontiers Australia, Submission 32, p. 2.

[49] Mr John Pane, Chair, Electronic Frontiers Australia Inc., Committee Hansard, 13 October 2025, p. 44.

[50] Alannah & Madeline Foundation, Submission 3, p. 5.

[51] Ms Elizabeth O’Shea, Chair, Digital Rights Watch, Committee Hansard, 13 October 2025, p. 44.

[52] See, for example, Queensland Council for Civil Liberties, Submission 18, p. 4; Alannah & Madeline Foundation, Submission 3, p. 8; Ms Johanna Weaver, Co-Founder and Executive Director, Tech Policy Design Institute, Committee Hansard, 13 October 2025, p. 66.

[53] Attorney-General’s Department, Privacy Act Review Report, December 2022, p. 14.