Dissenting Report from Senator the Hon Matt Canavan

1.1 The passage of the Online Safety Amendment (Social Media Minimum Age) Bill 2024 (the bill) has not been the Parliament's finest moment. The Government has described its proposal to ban children under 16 from social media as ‘world leading’. Yet the change has been forced through the Parliament with a haste not befitting its radical and unprecedented nature.

1.2 The public first saw the legislation to ban children under 16 from social media last Thursday, submissions closed on Friday, a public hearing was held on Monday and the committee's report is due on Tuesday. The bill is likely to be voted on by the end of this sitting week. That will mean that a bill that introduces a complex regulation of modern and evolving technologies will be subject to less than one week of public scrutiny.

1.3 Such a condensed process removes the right of Australians to be involved in the creation of laws that affect them. Despite the less-than-24-hour window for submissions, 15,000 submissions have been received on this bill. While the determination of people to make such an effort to submit their views should be applauded, Senators on this committee have had no ability to examine even a fraction of these submissions. At the time of writing, only 91 submissions (just 0.6 per cent) have been published on the committee's website.

1.4 The abuse of process also risks producing perverse outcomes and failing to reduce the real harms of young people accessing social media. These risks are evident in the Chair's draft recommending substantial changes to the bill just days before its passage. While I welcome these recommendations, they are being made ‘on the run’.

1.5 Given that even Government Senators have found significant defects in this bill in just a few days' examination, the prudent approach would be for this inquiry to be extended into next year so that a more thorough examination can occur. For example, many supporters of the bill became frustrated at the hearing that the social media industry body (DIGI) did not have answers to their specific questions.

1.6 Senator Henderson asked DIGI specifically why some social media platforms, like Instagram and Snapchat, do not use the same methods as TikTok for age verification and the removal of under-age accounts. Ms Sunita Bose, Managing Director, DIGI, replied that DIGI is ‘an industry association …we cannot speak about specific companies.’[1]

1.7 Dr Jennifer Duxbury, Director of Policy, Regulatory Affairs and Research, DIGI, added that ‘social media companies are currently under an obligation under the phase one codes to take steps to enforce the terms of use, including age restrictions.’[2]

1.8 Under further questioning, Senator Henderson pointed out that Meta claims the technology to ban underage users is not there yet, stating: ‘the big social media companies know how to get underage kids off their sites but they don't want to because this is going to cost them money. And you are simply making excuses for this failure.’[3]

1.9 Such questions could be put directly to the social media companies if the committee is given more time to hold additional hearings. Effectively, the Senate is being asked to pass this bill ‘blind’ without being able to talk directly to the companies that will have to enforce an age verification process.

1.10 There is no great urgency to pass this bill before Christmas. Even if it becomes law, the bill will not take effect until 12 months later. It would be much wiser to use this time to get the law right. If the Government still wanted to keep to a 2026 start date, we could shorten the ‘grace’ period before enforcement occurs.

1.11 I share the concerns expressed in the Chair's report about the harms of social media use by young people. While bullying and harassment can never be eradicated, social media intensifies anxiety because it becomes almost impossible to escape from. Because of these harms, this is a highly emotional issue and there is an understandable demand for politicians to be seen to be ‘doing something’ about it. Yet it is also a highly complex area that should be examined carefully, not in the hasty fashion that has beset this process.

1.12 Australians have lost trust in the political process at an unprecedented rate in recent years. Trust will not be restored by the Senate ignoring good process and effectively blocking the Australian people from having their say on significant laws before their Parliament.

Recommendation 1

1.13 That the Senate extend the time for the Environment and Communications Legislation Committee to review the Online Safety Amendment (Social Media Minimum Age) Bill 2024.

Proposed amendments

1.14 While I think it would be prudent to delay consideration of this bill, I realise there is a high chance that the Senate will move to vote on this bill within days. With that in mind, and given the limited time I have had to review the legislation, I propose the following changes to the bill.

Preventing the use of Digital ID

1.15 I support the Chair's proposal to amend the bill to rule out the use of Digital ID. However, further clarity is needed on whether the proposed amendment would rule out Digital ID completely or simply ensure that alternatives to Digital ID are permissible.

1.16 Given the nascent stage of Digital ID's development, all use of Digital ID should be ruled out. The Parliament could always seek to allow Digital ID by amending the law in the future if it becomes a widely accepted form of identification.

Recommendation 2

1.17 The bill should be amended to prevent any use of Digital ID for age verification purposes.

Make ‘reasonable steps’ guidelines subject to Parliamentary accountability

1.18 Under Item 5 of the bill, the eSafety Commissioner would have the role of writing guidelines on the taking of reasonable steps to prevent age-restricted users from having accounts. At the hearing, Departmental officials provided little guidance on what these steps would be, except that they would be guided by the Government's age verification trial, which is not due to finish until mid-2025.

1.19 The committee did not even have the time to take evidence from the eSafety Commissioner.

1.20 The reasonable steps provisions are important because it is these details that will have the greatest impact on adult Australians. While Australians over 16 are not directly subject to constraints, verifying who is under 16 will require all Australians to undergo some kind of age verification.

1.21 Mandating age verification for all raises privacy and practicality issues. I am concerned that if the reasonable steps are made too onerous, they may inadvertently lock older Australians (or Australians uncomfortable with technology) out of their accounts. Many older Australians rely on social media to stay in touch with their family and friends.

1.22 Ironically, younger Australians are likely to have the savviness to circumvent the restrictions. If not designed properly, the social media ban may kick more 80-year-olds off social media than 8-year-olds.

1.23 While the Explanatory Memorandum states that the eSafety Commissioner's guidelines ‘will not be binding’, in practice, and given the hefty fines for non-compliance, social media platforms are unlikely to depart far from the safe harbour of the written guidelines.

1.24 The eSafety Commissioner's guidelines will not be a regulatory instrument, so they will not be subject to Parliamentary scrutiny, such as through a disallowance motion. This is not appropriate given the ramifications of getting the guidelines wrong.

1.25 A better way would be for the Minister to make the guidelines as a regulatory instrument, which would then be subject to Parliamentary disallowance. The Minister may consult the eSafety Commissioner on the guidelines, but a decision of this magnitude should be made by an elected official subject to Parliamentary accountability.

Recommendation 3

1.26 The bill be amended such that the ‘reasonable steps’ guidance on how to conduct age verification be made by the Minister as a regulatory instrument disallowable by the Parliament.

Destruction of age verification material

1.27 The bill seeks to protect the privacy of personal information collected by social media platforms by requiring its destruction (see s 63F(3)). However, the requirement is not triggered until the entity has used it ‘for the purposes for which it was collected.’ If the bill only allowed the collection of this material for the purpose of age verification, then these provisions would protect privacy.

1.28 However, the bill allows social media companies to use age-verification information for any other purposes with the ‘consent of the individual’. If the other purpose is for an ongoing need, then the entity would in effect never have to destroy the information.

1.29 The Department stressed that the consent provisions of this bill are stronger than those normally contained in similar legislation. In summary, the bill requires that consent be voluntary, informed, current, specific and unambiguous. The individual must also be able to easily withdraw consent.

1.30 While this is welcome, it is not clear precisely how such consent will be provided.

1.31 No convincing reason was provided as to why social media companies need an open-ended ability to use age verification information, even with consent. The only specific example given was of a parent company owning multiple social media platforms (such as Meta owning Facebook and Instagram). Notionally, these provisions would allow a subsidiary to disclose age-verification information to its partner companies, avoiding the need for multiple age verification requests.

1.32 Note that this example itself undermines the destruction provisions, because an entity may be able to keep the information in case someone opens another account in the future.

1.33 Ultimately, this is not a strong enough reason to give social media companies broad powers to retain personal information. Given the recent, multiple instances of sensitive personal information being hacked, age verification information should be destroyed immediately once it has been used for age verification. The only exceptions to this rule should be compliance with another law or a court ruling, and these exemptions are already covered in the bill through the reference to the Australian Privacy Principles.

Recommendation 4

1.34 The bill be amended to remove the ability of social media platforms to use or disclose information collected for age verification purposes for any other purpose, apart from the need to comply with the law or a court ruling.

Inserting a role for parents

1.35 Regardless of whether this bill passes, parents will remain at the frontline of monitoring and restricting children's use of social media. It is reasonable for the Government to help parents in this battle. It is unreasonable to expect any Government to completely replace the role of parents.

1.36 It is strange, then, that this bill makes no mention of parents at all. This is despite the Government admitting that no law can prevent all harmful use of social media by children. A better approach would be for Government and parents to work together to limit harms.

1.37 The original proposal for a social media ban by the South Australian Government included parental involvement. South Australia's proposal was for a ban covering children under 14. Under this approach, 14- and 15-year-olds could open social media accounts with parental consent. This approach has also been adopted in Florida.

1.38 This would be a sensible alternative to the Government's current heavy-handed approach. Parents are in the best position to understand the needs of their children. A partnership between Government and parents offers the best approach to reducing social media harm.

1.39 To achieve this, the bill could be amended so that the Minister has the power to exempt classes of under-16-year-olds from the provisions of the bill, provided that parental consent is given in all such cases. One class could be children between 14 and 16 years old. Another could be children with certain learning disabilities. Another could be children engaged in educational or vocational activities, such as Leo Puglisi's @6NewsAu current affairs channel. Note that while the Minister can currently exempt entire social media platforms, there is no such ability to exempt particular children who may have a legitimate need for social media access.

Recommendation 5

1.40 The bill be amended to allow the Minister to exempt classes of children from the age-verification restrictions provided that parental consent is given in these circumstances.

Tightening the definition of social media platform

1.41 In evidence to the committee, the Department accepted that the definition of social media platform has been made intentionally broad, and that the Minister can restrict the definition through particular exemptions. The current definition includes a service that enables ‘online social interaction between 2 or more end-users’. This would cover almost all interaction on the internet, from Facebook to Strava and from WhatsApp to newspaper apps (most newspapers allow comments sections). There is no evidence that many of these services have played any role in harming young children.

1.42 It is understandable that the Government has rested on a definition close to one that is already in the Online Safety Act. But that Act covers a much wider range of issues. For the social media ban to be successful, it is essential that it remains focused on those apps that do the most damage to young children.

1.43 A better approach would be to narrow the definition to cover only platforms with the features most associated with harms to young children. This has been the approach of a Florida law passed earlier this year.

1.44 The Florida law defines a social media platform as one that has all of the following features:

allows users to upload content or view the content of other users;

has ten per cent of daily active users who are under 16 using the service for an average of 2 hours per day or longer; and

employs algorithms to select content for users,

1.45 In addition, the platform must have at least one of the following features:

infinite scrolling,

push notifications,

displays individual metrics (likes, shares, etc.),

auto-playing of videos, or

live-streaming.

1.46 Ideally, the committee would have more time to evaluate the suitability of a more targeted definition appropriate for the Australian context. Yet, given the condensed time frame, it would be better to err on the side of a narrowly focused law that concentrates on the main harms, rather than a broad law that would distract the Minister by requiring him or her to make endless exemptions.

Recommendation 6

1.47 That the bill be amended to define a social media platform more narrowly, similar to the definition of a social media platform under the Florida Online Protection for Minors Act.

Senator the Hon Matthew Canavan

Participating Member

Footnotes

[1] Ms Sunita Bose, Public hearing, Canberra, 25 November 2024.

[2] Dr Jennifer Duxbury, Public hearing, Canberra, 25 November 2024.

[3] Senator the Hon Sarah Henderson, Public hearing, Canberra, 25 November 2024.