Chapter 2 - Views on the bill


Introduction

2.1 Due to the short timeframe that the committee had to consider the evidence, this chapter summarises the key views on the bill as received in submissions.

2.2 A transcript of the hearing held on 25 November was not available at the time of writing, and will be published when available.

2.3 Almost all submitters raised concerns about the extremely short consultation period for the bill.

Support for a minimum age for access to social media

2.4 The Explanatory Memorandum (EM) cited a 2024 New South Wales Government survey of over 21,000 people regarding social media use and impacts:

87% of respondents expressed support for a minimum age for social media. For respondents aged 16 or older, 16 years was the most commonly suggested minimum age (40 per cent) and 18 years the second most common (25 per cent).[1]

2.5 Safe on Social stated that the bill is an urgent, enforceable measure that will protect children now, but acknowledged that a ban would not be perfect. It argued that if the bill reduced harm by even 10 per cent, that would be progress, particularly if it was supported by education, enforcement, and transparent algorithms.[2]

2.6 Similarly, the Hon Peter Malinauskas MP, Premier of South Australia, said:

No legislative regime will be perfect. There will undoubtedly be workarounds by knowledgeable child users, just as some children still get around drug, alcohol or cigarette laws. That is not a reason not to legislate — the perfect should not be the enemy of the good.[3]

2.7 Toby Walsh from the AI Institute, UNSW, submitted:

An age ban for social media will incentivise social media companies to improve services that fall both within and outside a ban. For instance, under a ban, a website like YouTube Kids will likely become a better and more useful service for young people.[4]

2.8 The Common Sense Party submitted that ‘the bill is beneficial and as a world first, could be phenomenal in setting up a precedence for the international community to follow’ and stated:

Legislation is needed, as opposed to some pathetic voluntary code of conduct or simply asking parents or young people to be aware of the risks, because these social media platforms are run for profit. As they do not charge users, their revenue is from advertisers. The more engagement they have with their platform, the more they can charge. So the business model is about driving engagement.[5]

2.9 Similarly, Youth Law Australia expressed its support for ‘stronger regulation to prevent children and young people from experiencing online harm and welcome[d] the introduction of an obligation on providers’.[6]

2.10 Finally, the committee also heard from some young people supportive of the bill via the submission made by Bravehearts. Representatives from the Bravehearts Youth Advisory Committee said:

I fully support the push to change social media access to age 16. Bullying, triggering content, and predators make social media a hostile and unsafe environment for young people to be in whilst their brains and emotions are still developing.[7]

2.11 And also:

The status quo isn't really acceptable and social media needs some guardrails. I am glad YouTube isn't included and nor should it be because it fundamentally differs from other social media e.g. FB where two end users can interact.

It's an attempt to pursue a meaningful reform and I think the penalties for Meta are good.[8]

Mental health and behavioural harms from social media use

2.12 The committee received evidence in both submissions and at the public hearing about the negative mental health and behavioural impacts of social media on young adolescents.

2.13 At the public hearing, Dr Danielle Einstein questioned whether there are any mental health benefits from social media. She was of the view that any minor benefits, if they existed, were massively outweighed by the downsides.[9]

2.14 Dr Einstein supported a minimum age for social media use, arguing that depression and anxiety invariably improve when social media use is restricted. She also pointed out that the bill does not limit access to the internet, one-to-one messaging or phone calls, and that the educational benefits of the internet would remain available.[10]

2.15 Dr Einstein also argued that the unhealthy, addictive element of social media is notifications, and that this is what leads to overuse. This element can cause children to ignore face-to-face interactions in the playground or after school and instead turn to online interactions, yet mental health at a young age is built by a multitude of face-to-face connections. The bill would, therefore, improve the environment for the majority of young people.[11]

2.16 Similarly, Dr Simon Wilksch stated that he was not aware of any research evidence that social media was safe for children under 16. By contrast, there was ‘substantial evidence of harm’.[12]

2.17 Professor Susan Sawyer from the Murdoch Children's Research Institute stated that their research on the child-to-adult transition, using a large longitudinal cohort, suggested that the greatest harms appear to be in younger female adolescents in the 10–13-year-old age group who were using social media for more than two hours a day:

In a study of 1200 young people from Melbourne that recruited a group of grade three, 8-year-olds and then studied them annually…the analyses demonstrate that in relationship to depression there is significantly greater evidence of depression in particular and also reduced wellbeing overall but very much it’s younger adolescent girls aged 10-13.[13]

2.18 Professor Sawyer acknowledged that while social media provides benefits in terms of education, information seeking, and connection to peer networks, the benefits in terms of the mental health of younger people are far less than is commonly assumed.[14]

2.19 Professor Sawyer also stated that one of the greatest benefits of a potential ban is the message it gives to parents around setting expectations for the younger cohort of 10- to 12-year-old children that they will not be participating in social media at that age.[15]

2.20 Catholic School Parents WA raised concerns about the harmful behavioural effects of social media:

Parents are worried that children and young people are becoming desensitised to some of the content that they are seeing, and that it is leading to a distorted understanding of some serious topics. We see the instances of assault, domestic violence and misogynistic behaviour towards women increasing at an alarming rate and one cannot help but make a link back to what is being viewed online.[16]

2.21 Catholic School Parents WA submitted that scamming and ‘sextortion’ are now becoming prevalent:

Scamming, and particularly in the case of young people, sextortion, is a growing concern for many parents. It is incredibly common these days to speak to parents who have had this issue raised by one of their children, either because they have been approached by a scammer or know someone who has.[17]

2.22 The Australian Gaming and Screens Alliance (AGASA) submitted there was an extensive body of evidence around the negative mental health and developmental impacts on children and teenagers arising from social media. AGASA supported the bill and argued that it could be justified on numerous grounds including the harms associated with overuse (including of games that are accessed via social media platforms), inappropriate content and associated risks of harm, as well as the dangers to wellbeing associated with cyberbullying.[18]

2.23 AGASA considered the bill ‘opens the path for working with tech companies around structural change of their platforms, persuasive design, algorithms, and more robust parental controls’.[19]

The bill would empower parents and reduce the burden on them

2.24 Officials from the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) told the committee that it had received ‘very strong feedback from parents’ that ‘they do not want to bear the burden or responsibility of making decisions that should be better reflected in the law’:

So, it’s really important for parents to point to a standard law, an age limit, that will apply to everybody and it’s also feedback we’ve received from a lot of children as well, that they would rather that there was a universal law that applied to all children under the age of 16, rather than have a situation where some children have it and some don’t, and then all of the harms that we’re aware of through exposure to social media continue to magnify.[20]

2.25 The DITRDCA also noted a ‘sense of overwhelming fatigue for parents’ during its consultation, a finding supported by research as well.[21]

2.26 Parental fatigue was a point reiterated by Mr Chisholm, who stated:

Parents have told us that their fatigue would be reduced by having a very clear law in place that imposes, for example, a 16-year age limit, that that will be welcome by parents rather than have to fiddle around with apps to see if they work.[22]

2.27 A representative of Headspace National Youth Mental Health Foundation stated that ‘families are completely overwhelmed by having to monitor their children’s social media use’, and parents ‘are panicked and fearful and frankly they are looking for solutions’.[23]

2.28 Ms Emma Mason stated that a legislated minimum age for access to social media would empower parents:

Even in my own household with my two other daughters, despite everything that has happened I fight the kids to get off their phones – to try and find peace against the never-ending 'need' to connect. It's completely out of control. A change to the law to not permit children access to social media accounts before the age of 16 is crucial to give the power back to parents.[24]

2.29 Similarly, Catholic School Parents WA stated that legislation would help parents:

Legislation supporting an increase to the age at which young people can access social media would provide a safety net for parents and would assist them in supporting each other in creating boundaries to keep their children and young people safe.[25]

2.30 Vanessa Treloar stated: ‘As a parent, I am deeply fearful of the world my two-year-old child will grow up in if we do not take proactive steps to mitigate these risks … This policy change would support parents in their efforts to protect their children.’[26]

Opposition to the bill

2.31 The committee received a body of submissions that expressed opposition to the bill. These submissions raised various concerns, including that the bill:

does not appropriately reflect expert evidence in the field;

will have unfortunate and unintended consequences, including that persons under 16 will circumvent the restrictions and risk being driven into hidden spaces where they may be less safe;

would have a disproportionate and negative effect on vulnerable and marginalised young people;

would undercut the important role of education in the development of young people;

provides a broad delegation of power to the minister; and

raises privacy and security risks.

2.32 Pride in Swan stated that it ‘firmly believe[d] that the proposed blanket ban on social media access for individuals under 16 is neither the most effective nor equitable solution’.[27]

2.33 The Internet Association of Australia recognised ‘the risks and real harms associated with children and young people accessing social media’. However, it did not consider the bill an appropriate solution.[28]

Bill does not appropriately reflect expert evidence

2.34 Several submitters argued that there is a lack of scientific evidence to support the claims of harm from social media.

2.35 The Shooters, Fishers and Farmers Party of Tasmania submitted that while studies such as the 2022 UK study cited by the bill suggest links between social media use and decreased life satisfaction for certain adolescent groups, these findings are highly contextual. The submission argued that broader research, including a 2023 Oxford Internet Institute study, found no consistent evidence that social media use has significant negative effects on adolescent mental health overall.[29]

2.36 Professor Marcus Carter, Dr Taylor Hardwick, and Dr Ben Egliston quoted Lucy Thomas OAM, co-founder and CEO of Project Rockit (Australia's youth-driven movement against bullying, hate and prejudice), who pointed to bias in the government’s social media summit, describing it as:

…carefully curated to amplify the (heavily contested) views of a select group of international speakers whose findings conveniently aligned with the government’s pre-determined stance. Meanwhile, globally respected Australian research was sidelined. By focusing exclusively on extreme harms—handpicked to suit a political agenda—the Summit created a public climate where balanced evidence and alternative perspectives have been erased and discredited.[30]

Safety risks: children’s social media use will become more hidden and secret

2.37 UNICEF Australia submitted that children under 16 will inevitably bypass restrictions, landing in unregulated, darker spaces. It argued that this not only undermines the effectiveness of the legislation but also exposes children to greater risks online.[31]

2.38 The Shooters, Fishers and Farmers Party of Tasmania noted that when access to social media is restricted, young people often resort to creating fake accounts or using VPNs to bypass restrictions. The submission cited a 2021 Tech Transparency Project report that found nearly 70 per cent of teens under 16 circumvented age restrictions on popular platforms.[32]

2.39 Professor Marcus Carter, Dr Taylor Hardwick, and Dr Ben Egliston submitted that some children will certainly circumvent the methods for age verification and create adult accounts on social media platforms, effectively exacerbating the risk of online harm:

Thus, instead of having children’s accounts on these platforms—which afford parents the ability to choose the appropriate safety settings for their children and monitor their children’s online activity—platforms will no longer be able to improve the design of social media for young people, and parents will not be able to support children to learn how to use social media safely. Children’s social media use will become more private, and more hidden from those able to support children to navigate the online world.[33]

2.40 Young Labor Left NSW argued that banning access to goods and services makes it harder for users to report, and seek help for, harm caused. It submitted that legally restricting social media for people under 16 years old will not significantly reduce the rate at which young people access social media and will not decrease instances of cyberbullying, abuse, or other forms of digital harm.[34]

Implications for children’s rights

2.41 UNICEF Australia submitted that the proposed bill has significant implications for children's rights:

Article 12 of the United Nations Convention on the Rights of the Child (UNCRC) states that every child and young person under 18 has the right to participate and have their opinions included in decision-making processes that relate to their lives.[35]

Disproportionate impact on vulnerable and marginalised young people

2.42 6 News stated that the organisation is a streaming news channel founded in 2019 and run entirely by young people, the majority of whom are teenagers in high school, with ‘tens of thousands of followers and millions of views across our social media platforms, including Facebook, Twitter/X, Instagram and YouTube’.[36]

2.43 6 News opposed the bill on the basis that it will only cause further problems for young people online:

In particular, we are deeply concerned about the impact it will have on vulnerable teenagers who may use social media as a necessary escape from their day-to-day lives. Marginalised young people who may feel safe in an online community (including those who may not feel safe at home), teens in remote parts of our nation who have found friends online, and those who have built up important relationships and friendships online all benefit.

Speaking to many young people in recent weeks, we have heard how they have been able to get through extremely difficult times in their own lives by using various social media platforms. Although we understand that students may experience bullying online, restricting all teenagers from using social media is not the solution.[37]

2.44 Pride in Swan submitted that for many vulnerable youth, particularly those from the LGBTIQA+SB community, social media is far more than a digital pastime; it is a critical lifeline:

It is often the only way they can connect with others who share their experiences, access affirming spaces, and find support networks that validate their identities. These connections are essential to their mental health and emotional well-being, especially in rural or remote areas where physical support systems may be unavailable.[38]

2.45 That being said, Pride in Swan advocated ‘for a balanced approach that prioritises safety, accountability, inclusion, and education, without isolating young people who rely on these platforms for connection and support’.[39]

2.46 Minus 18 submitted that for LGBTQIA+ youth, social media is a lifeline to safety, connection, and affirmation, and pointed to its recent survey of almost 1000 LGBTQIA+ young people across Australia:

95.7% of respondents rely on social media to access friends and emotional support.

91.5% report that platforms, particularly Instagram, have been crucial for forming meaningful friendships with other queer people.

83% fear that a ban on social media would sever them from their community and leave them feeling disconnected and isolated.[40]

2.47 Minus 18 argued that these findings underscore the critical role social media plays in supporting a demographic that already faces heightened risks of discrimination, rejection, and isolation.[41]

2.48 Minus 18 submitted that the bill ‘inadvertently poses significant risks for LGBTQIA+ youth’ who:

Are more likely to lack supportive family environments, leaving social media as their primary source of connection.

Face unique barriers to accessing offline community support, such as fear of discrimination or lack of local services.

Are more likely to experience social isolation, which can compound mental health risks when disconnected from affirming online spaces.[42]

2.49 Young Labor Left NSW noted that the ban will have an adverse effect on young Australians, who will be cut off from online support networks of friends and family, as well as people with similar identities, interests and passions. It argued that, amid a globally acknowledged loneliness crisis, restricting social media would worsen an already extreme epidemic.[43]

2.50 The First Nations Peoples Aboriginal Corporation (FNPAC) raised concerns that many Aboriginal and Torres Strait Islander children and families, particularly in remote areas, lack access to the identity documentation that age verification may require. This could effectively block them from accessing platforms and services essential for education, healthcare, and community support, increasing their digital isolation.[44]

2.51 The FNPAC also raised concerns that in remote communities, reliable internet and digital infrastructure are often unavailable or insufficient. Age verification systems requiring stable online access may further restrict Aboriginal and Torres Strait Islander people's ability to engage in digital activities like distance learning or online commerce.[45]

2.52 In contrast, the Premier of South Australia made the point that a minimum age for social media access would in fact help protect vulnerable cohorts:

I recognise this is even more important for vulnerable cohorts, including First Nations young people and young people who are living with disability, LGBTIQ+, culturally and linguistically diverse, or from rural and remote areas. I have been presented with evidence that suggests that while online spaces can offer connection for vulnerable cohorts, social media is disproportionately harming these cohorts and protections would likely benefit them the most.[46]

Importance of social media in education and social causes

2.53 Andrew Hamilton, solicitor at JPB Liberty, submitted that social media forms an important part of the lives of adults and is the main source of news, information and educational material for younger people.[47]

A blanket ban on social media for U16s will cause severe damage to the education of children and teenagers. It will put them in an artificial bubble and they will suffer significant trauma when they are suddenly exposed to the world of social media at age 16 when it may be too late for parents and teachers to provide suitable guidance.[48]

2.54 Young Labor Left NSW pointed to the importance of social media in pushing for progressive changes in society, such as the #MeToo movement, Black Lives Matter, and School Strike 4 Climate. Young Labor Left NSW argued that social media ‘is indispensable for movements like these, and restricting young people's access will cut them off from the movements that inspire and represent them’:[49]

Finally, with more and more news consumed on social media, digital literacy is an increasingly important skill. Allowing young people to navigate the internet freely and with unimpeded support from family and educators is vital to ensuring misinformation and disinformation are combatted on the internet. By legally restricting access, young Australians will have less time to develop an understanding of digital environments upon reaching adulthood, leaving them vulnerable to dangerous online narratives. This must be combatted by education and guidance, not bans.[50]

2.55 The Shooters, Fishers and Farmers Party of Tasmania pointed to a 2023 ReachOut survey which found that 84 per cent of Australian teens believe social media is an important tool for maintaining friendships and accessing support networks and that restrictive policies risk isolating these teens from valuable online communities.[51]

Delegation of power to the minister

2.56 The Digital Industry Group Inc. (DIGI) submitted that the bill does not have sufficient detail on the ‘technical implementation regarding privacy, security, human rights or regulatory cost implications’, leading to concerns that there is too much weight on the discretionary powers of the Minister and the eSafety Commissioner.[52]

2.57 Valid Agenda argued that the bill's reliance on future legislative rules to refine what constitutes an ‘age-restricted social media platform’ allows future regulators to decide what falls under this category, without current safeguards or specifics, potentially leading to overreach or misinterpretation in application.[53]

2.58 The Tech Council of Australia submitted:

We also strongly recommend that the government introduce a requirement in legislation for the Minister to consult with industry and affected stakeholders on the legislative rules, and provide that any such rules be disallowable instruments to ensure adequate parliamentary oversight.[54]

2.59 ReachOut submitted that research shows that 73 per cent of young people access mental health support on social media and 49 per cent of ReachOut youth service users find the service via social media. Therefore, ReachOut welcomed the exemptions for digital mental health services.[55]

2.60 That being said, ReachOut also urged ‘the Government to consult with the experts on how young people use digital technology to communicate, learn and connect as it develops the exemption framework—young people themselves’.[56]

Digital Duty of Care

2.61 The committee received evidence on the importance of a Digital Duty of Care to ensure that digital platform providers take appropriate steps to monitor content and be held responsible for users’ safety. Some submitters expressed the view that a Digital Duty of Care would be a more productive approach than the bill’s minimum age restriction.

2.62 For example, Bravehearts Foundation submitted:

We need the technology sector to do more to monitor what is happening online, supported by government rules and regulation. The recently announced Digital Duty of Care, is a positive step towards companies being held responsible for users’ safety.[57]

2.63 Movember also submitted:

We also acknowledge the proposed introduction of Digital Duty of Care, bringing stronger penalties for both reacting to harms through content regulation and expanding their approach to include exploring systems-based prevention. This is a positive step, placing greater responsibility onto the tech companies to keep young Australians safe online.[58]

2.64 Similarly, the ARC Centre of Excellence for the Digital Child stated:

In regulatory terms, the government’s plan to focus on a Digital Duty of Care, which requires platforms to evaluate the potential risks of their tools before they release them, is a much more productive legislative direction, placing the initial burden on platforms, not parents.[59]

Privacy and security risks

2.65 Ms Carly Kind, Privacy Commissioner at the Office of the Australian Information Commissioner, expressed support for the additional privacy protections in the bill but noted that the introduction of a minimum age for access to social media will have privacy impacts for all Australian users of social media:

While the bill does not dictate what ‘reasonable steps’ social media platforms must take to ensure users below the minimum age are prevented from holding an account, the inevitable outcome is that age assurance checks will need to be conducted for all Australian social media users (not just children). This is a significant departure from the status quo, and is likely to incentivise the collection, use and storage of additional personal information about all users of the service, which increases privacy risks and impacts.[60]

2.66 The Privacy Commissioner argued that wholesale reform of the Privacy Act 1988 (Privacy Act) is the most effective way of tackling the most harmful aspects of the digital ecosystem:

In particular, the introduction of a fair and reasonable test for the collection, use and disclosure of personal information would dramatically increase the ability of the OAIC to address harmful and unfair data practices in the online environment.[61]

2.67 At the public hearing, the Privacy Commissioner added:

This bill enshrines a much more robust definition of consent than exists already in the Privacy Act and it would be remiss of me not to remark that there are proposals on the table to strengthen the definition of consent in the Privacy Act writ large.

But importantly, in the context of this bill, it does impose that higher level of protections.[62]

2.68 The FNPAC noted that Indigenous communities often have heightened concerns about data privacy and the misuse of personal information:

Broad and poorly defined requirements for age and identity verification could deter families from using digital services, reducing engagement in education, commerce, and social support networks critical to their well-being.

Verification processes that collect personal data might not align with cultural sensitivities or privacy expectations within Indigenous communities. Fear of data misuse or lack of trust in external systems could discourage participation in online platforms, exacerbating digital exclusion.[63]

2.69 Young Labor Left NSW had:

…deep reservations should the legislation require the provision of government ID documents to social media companies, given how prone these companies are to leaks and data misuse. Young Labor Left sees this as a blatant infringement on people’s right to privacy and on their civil liberties.[64]

2.70 The Shooters, Fishers and Farmers Party of Tasmania submitted that age ‘verification methods often require sensitive user data, such as government-issued IDs or biometric scans, creating privacy concerns’.

2.71 UNICEF Australia submitted that population-wide age verification raises serious concerns about data security and privacy:

We know that data is the currency of the online world, and children's data - where it's collected, traded and sold on mass scales - is considered a big business. Implementing the measures under the proposed bill would require the collection and storage of sensitive personal information, increasing the risk of data breaches and misuse.[65]

2.72 JPB Liberty submitted that privacy is a fundamental human right:

Requiring social media companies to conduct age verification will inevitably involve identifying all users of social media in a way that infringes privacy. People have the right to be anonymous online if they choose to be. There are many very good reasons including personal safety, the ability to express opinions privately that an employer or friends may not like.

Again, tyrannies have always tried to prevent privacy and anonymity. Free countries preserve these freedoms.[66]

2.73 The FNPAC proposed a range of recommendations to mitigate the risks posed by the bill to Indigenous children and communities, including to:

Create provisions to exempt educational, cultural, and community-led platforms from broad regulations to guarantee access for Indigenous children to digital resources, learning tools, and cultural materials.[67]

Age assurance technology

2.74 The Age Verification Providers Association (AVPA) is the global trade body representing 30 providers of privacy-preserving online age assurance technology. AVPA confirmed that ‘the age assurance technologies required to facilitate the implementation of this bill are sufficiently accurate, accessible, privacy-preserving, cost-effective, and aligned with international standards’.[68]

2.75 The Australian Democrats raised concerns regarding privacy and identification systems:

There are many Australians who wish to live a private life, and ID verification requirements may disincentivise them from using social media, restricting their access to important information.[69]

2.76 The Australian Democrats also expressed concern regarding the ease of circumventing ‘geo-protections’ by using VPNs and the ‘unprecedented risk for surveillance’ associated with ID-based online verification.[70]

2.77 The DITRDCA provided evidence about the current capabilities of social media companies, such as Instagram, to verify the age of their users on digital platforms:

I point out that Instagram, in their announcement about their teen accounts, they already verify age when you are changing your age which effectively amounts to an admission that you are lying – why would you change your age otherwise?

But they are already building technology – and I’ll quote from their announcement here – “to proactively find accounts belonging to teens, even if the account lists an adult birthday. This technology will allow us to proactively find these teens and place them in the same protections offered by teen account settings. We’ll start testing this change in the US early next year.”[71]

2.78 The committee also heard evidence about the ability of other social media platforms, such as TikTok, to remove underage accounts using both technology and human moderation to determine a user’s age.[72] TikTok Australia outlined that it had ‘removed more than 20 million suspected underage accounts globally’ between April and June 2024 alone.[73]

Digital ID

2.79 Through the course of the inquiry there was some commentary relating to the Australian Government’s ‘Digital ID’ system and whether it would be used by platforms in order to comply with the bill. On this, officials from the DITRDCA were clear:

I know there have been questions raised in a few areas about whether there’s any linkage to the government’s Digital ID system and to be clear, there is no linkage, that is not the intention.[74]

2.80 And:

There’s no requirement for platforms to use the Digital ID system to comply with the obligation.[75]

2.81The DITRDCA provided evidence that it could not see any way the eSafety Commissioner could designate Digital ID under the legislation as a means of age verification.[76] This would suggest the legislation could be strengthened to ensure this mechanism is not available.

The need for greater consultation with children and young people

2.82Various submitters to the inquiry emphasised the need for greater consultation with those who would be directly affected by the bill, especially children and young people. These submitters advocated that the government should prioritise genuine and meaningful consultation with young people in developing online safety policies, recognising young people as experts in their own online experiences.

2.83For example, Youth Law Australia (YLA) recommended that ‘further meaningful consultation is undertaken with children and young people to determine how to effectively monitor their experiences of online harms and to engage children and young people in earlier help seeking’.[77] YLA also supported the deferred commencement of the bill, to allow time to properly implement the changes and to educate children and young people about them:

In the implementation, children should be provided with child-sensitive and age-appropriate information on how the Bill impacts them, including reporting and complaint mechanisms if they do experience online harm, and services and remedies available to them. Similar information should be provided to parents and caregivers.[78]

2.84Youth Affairs Council Victoria expressed concern about the short timeframe for the inquiry. It emphasised the importance of co-designing reforms with young people, as well as co-designing support and education programs, to ensure that online safety reforms are ‘effective and fit-for-purpose’.[79]

Committee view

2.85The committee acknowledges the immense interest in the bill and the broad commitment to ensuring the best outcomes for young people and children.

2.86At the outset, therefore, the committee acknowledges that the inquiry timeframe was extremely short and that almost all submitters and witnesses expressed grave concerns that a bill of such import was not afforded sufficient time for thorough inquiry and report. The committee thanks the many submitters and witnesses who engaged with the inquiry despite the tight timeframe.

2.87The committee recognises that persuasive arguments were put forward both in support of, and in opposition to, the bill. The committee also recognises that while many inquiry participants supported the aims of the bill, concerns were raised about its implementation.

2.88Many of these concerns are reflected in the recommendations below. While it may be unusual for a legislation committee to make so many recommendations, the committee is firmly of the view that the recommendations set out below would improve the implementation of the bill and would address many of the concerns put forward by inquiry participants.

2.89The committee recognises that the rule-making powers vested in the Minister for Communications (the minister) under the bill are extensive and important in ensuring the optimum implementation of the bill and avoiding unintended consequences.

2.90The committee also notes that the bill seeks to capture services that are of the greatest community concern. The minister will have the power to determine that a particular service is in scope if its features evolve, and harms emerge to the point that action is considered necessary to reflect community standards.

2.91Importantly, the committee notes that these powers are in the form of disallowable legislative instruments and, as such, are subject to parliamentary scrutiny and potential disallowance.

2.92The committee also notes that in exercising the rule-making power, the minister will be required to seek and have regard to advice from the eSafety Commissioner and may also seek advice from other relevant Commonwealth agencies. The committee is therefore reassured that users under the minimum age will retain access to platforms that predominantly provide beneficial experiences, such as those that are grounded in connection, education, health and support.

2.93The committee considers that the requirement for the minister to have regard to advice from the eSafety Commissioner is very important given that many inquiry participants advocated for a balanced approach that prioritised safety, accountability, inclusion, and education, without isolating young people who rely on social media platforms for connection and support.

2.94To that end, the committee considers it vital that the government consults appropriately on all rule making associated with implementation of the bill, including with young people and their representative organisations.

2.95The committee notes that the explanatory memorandum to the bill currently provides that an independent review of the part the bill will introduce into the Online Safety Act (Part 4A) will be conducted within two years of the minimum age obligation taking effect. The committee considers that this independent review should occur sooner, and recommends that it be conducted within 18 months of the minimum age obligation taking effect.

2.96The committee also received evidence on the importance of a Digital Duty of Care to ensure that digital platform providers take appropriate steps to monitor content and are held responsible for users’ safety.

2.97The committee also acknowledges that several organisations calling for a legislated Digital Duty of Care were also of the view that legislating a minimum age restriction for social media may simply absolve technology companies from designing platforms that are safe and have children’s rights and best interests at the fore.

2.98To that end, the committee recommends that the Australian Government legislate a Digital Duty of Care to place a legal obligation on digital platforms to take proactive steps to protect their users.

2.99The committee also notes the concerns raised with respect to privacy. Nevertheless, the committee notes that the bill contains robust privacy provisions, over and above what is set out in existing privacy laws. The provisions in the bill will require platforms to destroy data collected for age assurance purposes once the age assurance process is complete. Failure to destroy that data would be a breach of the Privacy Act, with penalties of up to $50 million.

2.100Some submitters argued that the age assurance trial should be completed prior to the passage of the legislation so that it could inform the detail and implementation of the bill. However, the committee notes that there is a twelve-month implementation period for this legislation, and that the results of the age assurance trial will inform the scope and detail of implementation, rather than particular technologies being prescribed in the bill.

2.101Further, the committee is reassured by the evidence from the Age Verification Providers Association that the age assurance technologies required to facilitate the implementation of the bill are sufficiently accurate, accessible, privacy-preserving, cost-effective, and aligned with international standards.

2.102Finally, the committee also recognises that many parents have expressed deep concern about the harmful impacts of social media on their children, including screen addiction, the hazards of excessive use, and deeply addictive platform features.

2.103The committee notes that the bill places the onus on platforms to introduce systems and processes that can be demonstrated to ensure that people under the minimum age cannot create and hold a social media account.

2.104The committee therefore considers that the bill will have a normative effect that will enable parents to say ‘no’. On that basis, the committee recommends that the bill be passed.

Recommendation 1

2.105The committee recommends that the Australian Government legislate a Digital Duty of Care to place a legal obligation on digital platforms to take proactive steps to protect their users.

Recommendation 2

2.106The committee recommends that the Australian Government meaningfully engage young people in the implementation of the legislation.

Recommendation 3

2.107The committee recommends that the Minister for Communications provide a progress report to the Parliament on the Age Assurance trial by no later than 30 September 2025.

Recommendation 4

2.108The committee recommends that the bill be amended to prohibit providers of age-restricted platforms from compelling a person to use an accredited service within the meaning of section 9 of the Digital ID Act 2024, or other government ID such as passports, and to require that alternative methods for assuring age be set out as reasonable steps, with consideration given to the age assurance trial.

Recommendation 5

2.109The committee recommends that the Minister for Communications commit to setting the implementation date within 12 months, and no later.

Recommendation 6

2.110The committee recommends that there be appropriate consultation on all rule making associated with the bill.

Recommendation 7

2.111The committee recommends that the Australian Government amend the review period for the legislation such that an independent review of the proposed new Part 4A of the Online Safety Act 2021 is conducted within 18 months of the minimum age obligation taking effect.

Recommendation 8

2.112The committee recommends that the bill be amended to enable the Minister for Communications to have review power over any ‘reasonable step’ rules determined by the eSafety Commissioner.

Recommendation 9

2.113Subject to consideration of the above recommendations, the committee recommends that the bill be passed.

Senator Karen Grogan

Chair

Senator Ross Cadell

Member

Footnotes

[1]Explanatory Memorandum, p. 2.

[2]Safe on Social, Submission 15, p. 2.

[3]The Hon Peter Malinauskas MP, Premier of South Australia, Submission 41, p. 6.

[4]AI Institute, Submission 72, p. 1.

[5]Common Sense Party, Submission 17, p. 1.

[6]Youth Law Australia, Submission 59, p. 2.

[7]Bravehearts, Submission 34, p. 6.

[8]Bravehearts, Submission 34, p. 5.

[9]Dr Danielle Einstein, Public hearing, 25 November 2024.

[10]Dr Danielle Einstein, Public hearing, 25 November 2024.

[11]Dr Danielle Einstein, Public hearing, 25 November 2024.

[12]Dr Simon Wilksch, Additional information, ‘Psychologist explains why raising minimum age can reduce social media harm’, tabled by Senator Karen Grogan on 25 November 2024.

[13]Professor Susan Sawyer, Public hearing, 25 November 2024.

[14]Professor Susan Sawyer, Public hearing, 25 November 2024.

[15]Professor Susan Sawyer, Public hearing, 25 November 2024.

[16]The Catholic School Parents WA, Additional documents, Submission 121 to the Joint Committee on Social Media and Australian Society, p. 2, tabled by Senator Karen Grogan on 25 November 2024.

[17]The Catholic School Parents WA, Additional documents, Submission 121 to the Joint Committee on Social Media and Australian Society, p. 3, tabled by Senator Karen Grogan on 25 November 2024.

[18]Australian Gaming and Screens Alliance (AGASA), Submission 11, p. 1.

[19]AGASA, Submission 11, p. 1.

[20]Mr James Chisholm, Deputy Secretary, Communications and Media Group, Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA), Public hearing, 25 November 2024.

[21]Mr Andrew Irwin, Assistant Secretary, Online Safety Branch, DITRDCA, Public hearing, 25 November 2024.

[22]Mr Chisholm, DITRDCA, Public hearing, 25 November 2024.

[23]Ms Nicola Palfrey, Head of Clinical Leadership, Headspace National Youth Mental Health Foundation, Public hearing, 25 November 2024.

[24]Ms Emma Mason, Additional documents, Submission 207 to the Joint Committee on Social Media and Australian Society, p. 7, tabled by Senator Karen Grogan on 25 November 2024.

[25]Catholic School Parents WA, Additional documents, Submission 121 to the Joint Committee on Social Media and Australian Society, p. 3, tabled by Senator Karen Grogan on 25 November 2024.

[26]Vanessa Treloar, Additional documents, Submission 70 to the Joint Committee on Social Media and Australian Society, pp. 1–2, tabled by Senator Karen Grogan on 25 November 2024.

[27]Pride in Swan Inc., Submission 5, p. 1.

[28]Internet Association of Australia, Submission 13, p. 1.

[29]The Shooters, Fishers & Farmers Party of Tasmania, Submission 8, p. 1; see also Young Labor Left NSW, Submission 10, p. 1.

[30]Professor Marcus Carter, Dr Taylor Hardwick, and Dr Ben Egliston, Submission 90, p. 3.

[31]UNICEF Australia, Submission 9, p. 2.

[32]The Shooters, Fishers & Farmers Party of Tasmania, Submission 80, p. 1.

[33]Professor Marcus Carter, Dr Taylor Hardwick, and Dr Ben Egliston, Submission 90, p. 1.

[34]Young Labor Left NSW, Submission 10, p. 1.

[35]UNICEF Australia, Submission 9, p. 1.

[36]6 News, Submission 6, p. 1.

[37]6 News, Submission, p. 1.

[38]Pride in Swan Inc., Submission 5, p. 1.

[39]Pride in Swan Inc., Submission 5, p. 1.

[40]Minus 18, Submission 1, p. 1.

[41]Minus 18, Submission 1, p. 1.

[42]Minus 18, Submission 1, p. 2.

[43]Young Labor Left NSW, Submission 10, p. 1.

[44]First Nations Peoples Aboriginal Corporation (FNPAC), Submission 8, p. 2.

[45]FNPAC, Submission 8, p. 2.

[46]The Hon Peter Malinauskas MP, Premier of South Australia, Submission 41, p. 5.

[47]JPB Liberty Pty Ltd, Submission 87, p. 1.

[48]JPB Liberty Pty Ltd, Submission 87, p. 1.

[49]Young Labor Left NSW, Submission 10, p. 1.

[50]Young Labor Left NSW, Submission 10, p. 2.

[51]The Shooters, Fishers & Farmers Party of Tasmania, Submission 80, p. 1.

[52]Digital Industry Group Inc, Submission 43, p. 2.

[53]Valid Agenda, Submission 3, p. 1; see also Professor Marcus Carter, Dr Taylor Hardwick, and Dr Ben Egliston, Submission 90, p. 1.

[54]Tech Council of Australia, Submission 46, p. 3.

[55]ReachOut, Submission 30, p. 2.

[56]ReachOut, Submission 30, p. 2.

[57]Bravehearts Foundation, Submission 34, p. 3.

[58]Movember, Submission 44, p. 1.

[59]ARC Centre of Excellence for the Digital Child, Submission 23, p. 2.

[60]Ms Carly Kind, Privacy Commissioner, Office of the Australian Information Commissioner Submission 12, p. 1; see also Professor Marcus Foth, Submission 88, p. 1.

[61]Ms Carly Kind, Privacy Commissioner, Submission 12, p. 3.

[62]Ms Carly Kind, Privacy Commissioner, Public hearing, 25 November 2024.

[63]FNPAC, Submission 8, p. 2.

[64]Young Labor Left NSW, Submission 10, p. 1.

[65]UNICEF Australia, Submission 9, p. 2.

[66]JPB Liberty Pty Ltd, Submission 87, p. 2.

[67]FNPAC, Submission 8, p. 2.

[68]Age Verification Providers Association (AVPA), Submission 7, p. 1.

[69]Australian Democrats, Submission 20, p. 1.

[70]Australian Democrats, Submission 20, p. 2.

[71]Mr Andrew Irwin, Assistant Secretary, Online Safety Branch, DITRDCA, Public hearing, 25 November 2024.

[72]Article by Angira Bharadwaj, ‘Tech Giants’ fib on kids’, tabled by Senator the Hon Sarah Henderson, 25 November 2024.

[73]TikTok Australia, Submission 102, p. 1.

[74]Ms Sarah Vandenbroek, First Assistant Secretary - Digital Platforms, Safety and Classification Division, DITRDCA, Public hearing, 25 November 2024.

[75]Mr Chisholm, DITRDCA, Public hearing, 25 November 2024.

[76]Mr Chisholm, DITRDCA, Public hearing, 25 November 2024.

[77]Youth Law Australia, Submission 59, p. 2.

[78]Youth Law Australia, Submission 59, p. 3.

[79]Youth Affairs Council Victoria, Submission 60, p. 3.