Chapter 5 - Proposed powers for the Australian Communications and Media Authority

5.1 This chapter outlines the key issues raised during the inquiry about the Australian Communications and Media Authority's (ACMA) proposed regulatory powers under the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 (the bill).

5.2 This includes views about the ACMA's proposed information-gathering powers, industry code and standard-making powers, enforcement powers and the penalty provisions in the bill, as well as the transparency of the ACMA's decision making.

5.3 The following chapter concludes with the committee's view and recommendations.

The ACMA's proposed regulatory powers under the bill

5.4 As discussed in Chapter 2, the bill would provide the ACMA with a range of regulatory powers to address misinformation and disinformation on digital communications platforms, including the ability to make digital platform rules.

5.5 The ACMA would be given certain information-gathering and record-keeping powers, and the ability to approve an industry code of practice or determine a misinformation standard.

5.6 As noted in a previous chapter, the ACMA stated that it would not have a role in assessing the seriousness of harm under the bill. The ACMA also stated it would not determine what constitutes misinformation and disinformation.[1] Submitter views about the scope of the ACMA's role in determining misinformation and disinformation are set out in Chapter 3.

5.7 The ACMA outlined that it has developed expertise in relation to digital platform activities, including platforms' efforts to address misinformation and disinformation in Australia. This includes its role in overseeing industry efforts under the Australian Code of Practice on Disinformation and Misinformation (the Code) since 2021.[2]

5.8 The ACMA further outlined that it would 'leverage its experience overseeing compliance with transparency requirements and co-regulation of codes of practice for the broadcasting and telecommunications industries when undertaking compliance and enforcement activities under this bill'.[3]

5.9 Diverse views were expressed about the scope and effectiveness of the ACMA's proposed regulatory powers as provided for in the bill; these views are set out below.

Proposed information-gathering powers

5.10 Under the bill, the ACMA would have the power to obtain information from digital platforms, and make rules requiring them to create and retain records relating to misinformation and disinformation. Digital platform providers could be required to provide reports to the ACMA periodically.[4]

5.11 The bill provides protections for end-users of digital platforms in Australia by limiting the types of individuals required to produce information or documents to: 'a platform employee, content moderator, fact checker or a person providing services to the provider of the digital platform'.[5]

5.12 The Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) stated that these powers would enhance transparency, and allow the regulator to track the progress of digital platforms in addressing misinformation and disinformation on their services.[6]

5.13 This would allow the ACMA to gain insights into the extent of misinformation and disinformation on digital platforms and the effectiveness of measures to combat their spread, including the effectiveness of voluntary codes, approved misinformation codes or misinformation standards.[7] DITRDCA also outlined that these provisions would 'set a clear expectation that digital platforms must be transparent with the Australian public'.[8]

5.14 The bill would also allow the ACMA to obtain information and documents from a wider range of sources than digital communications platform providers.

5.15 Officials from the ACMA stated that it would use the proposed information-gathering powers to assess whether a code made under the bill is operating effectively, drawing on data collected under the new information-gathering powers or record-keeping rules, or on research conducted by the ACMA:

By gathering that information, we are able to ascertain whether their processes and systems are actually working. We would seek to know, for example, how they're handling their complaints? How many complaints are they receiving and resolving? We're not particularly interested in what those complaints may be about, but the system and the process that they've put in place. Have they employed fact checkers? Have they done what they said they would do under the voluntary code as it exists at the moment? We'd probably make an assessment against that voluntary code to some extent when we're deciding whether there is a need to take the next step forward.[9]

5.16 ACMA officials noted that the proposed information-gathering powers could be used to increase transparency around the use of algorithms and recommender systems on services in future.[10]

Concern about the breadth of the information-gathering powers

5.17 Several submitters expressed concern that the bill would provide the ACMA with overly broad information-gathering powers.[11] For example, the Institute of Public Affairs characterised the ACMA's proposed powers as 'coercive' and was of the view that these proposed powers could 'violate fundamental legal rights' including the right to silence and the privilege against self-incrimination.[12]

5.18 Australians for Science and Freedom contended that the proposed information-gathering powers would enable the 'ACMA to act as a spy agency, keeping dossiers on people who have been "wrong" on the internet'.[13]

5.19 The Public Interest Journalism Initiative and Digital Rights Watch expressed concern about the breadth of these proposed powers, highlighting 'the absence of parliamentary oversight or review in relation to the exercise of these powers' and the absence of any requirement that personal or sensitive information be exempted from disclosure or anonymised.[14]

5.20 Similarly, the Digital Industry Group (DIGI), while supportive of the ACMA being granted the power to require platforms to keep and provide records and information concerning misinformation and disinformation on their services, argued that these powers 'remain broadly drafted and do not sufficiently restrict the ACMA's discretion to request information to the policy intent of the bill'.[15]

5.21 The Australian Broadcasting Corporation recommended that the information-gathering powers in the bill should be subject to limitations similar to those contained in the Broadcasting Services Act 1992, including protection of journalists' sources.[16]

The ability to approve industry codes and register a misinformation standard

5.22 As set out in Chapter 2, the ACMA would have the power to approve and register enforceable misinformation codes developed by sections of the digital platforms industry and, in certain circumstances, determine misinformation standards for sections of the digital platforms industry.[17]

5.23 A code or standard could include obligations on platforms to have robust systems and processes in place, such as reporting tools, links to authoritative information, support for fact checkers and 'demonetisation' of disinformation.[18]

Support expressed for the ACMA's code approval and misinformation standard powers

5.24 Reset Tech Australia, a research organisation focused on digital risks, expressed support for enhanced regulatory powers for misinformation and disinformation, citing failures in the transparency and accountability mechanisms under the existing voluntary code (the Australian Code of Practice on Disinformation and Misinformation):

… the threshold for ACMA to develop an industry standard has long passed … ACMA should be immediately empowered to bypass industry codes and set a standard …[19]

5.25 The Australian Communications Consumer Action Network, the peak body for consumers on communications issues, argued that the bill should go further and 'require the ACMA to publish information on measures taken by digital communications platforms in response to misinformation codes'.[20]

5.26 Similar views were expressed by the Tasmanian Climate Collective, which recommended that the bill be amended 'by removing provisions for industry self-regulation and empower independent regulatory bodies with the authority to establish new standards'.[21]

Concern about the scope of the ACMA's code and misinformation standard powers

5.27 In contrast to these views, other submitters argued for greater constraints on the ACMA's proposed approval powers for industry codes and standards.

5.28 For example, DIGI recommended that 'additional minimum guardrails' should be 'set around the exercise of the power to request codes'. According to DIGI, the ACMA should only be able to request industry codes for 'those sections which are at a high risk of dissemination of mis and disinformation and which are not voluntarily implementing adequate measures to protect the community'.[22]

5.29 Similarly, Snap Inc. emphasised the need for the ACMA to consider the measures taken by digital platform service providers to comply with existing regulatory codes 'before requesting the development of new codes for misinformation and disinformation'.[23]

5.30 Google contended that there should be better guardrails and transparency around the ACMA's proposed powers to request and register codes and implement standards. These include requiring the ACMA to publish:

the basis for any decision to request a code, accept or reject a variation of a code;

the grounds for refusing to approve a code;

the grounds on which the ACMA has determined that a code is deficient or totally deficient; and

the grounds for determining a standard and varying a standard.[24]

5.31 Calls for greater transparency about the ACMA's decision making are set out later in this chapter.

5.32 A range of other views were expressed about these proposed powers. These included:

limiting the scope of the ACMA's proposed code and standard-making powers to disinformation, as it is more objectively defined than misinformation;[25]

better defining the 'exceptional circumstances' in which a standard is made and including a sunset clause;[26]

lengthening the timeframe for industry to develop a code, and for the ACMA to assess whether a misinformation code has failed;[27] and

expanding the examples of matters that may be dealt with in a code or standard to include consumer harms or the undermining of consumer trust in the digital economy.[28]

5.33 The ACMA outlined certain practical considerations that may lead it to request a misinformation code. These include:

that the voluntary code does not include participation from platforms that have large Australian user bases or does not include effective reporting or governance arrangements;

data that platforms publish under the bill, including the outcomes of risk assessments, may point to deficiencies that may require measures to be put in place;

complaints data published under the digital platform rules indicates that concerns about misinformation or disinformation are not being addressed;

independent and ACMA-commissioned research that points to ongoing community concern about misinformation or disinformation on digital platforms; and

records kept and reported on, including an increasing volume of content violating platforms' policies.[29]

Enforcement mechanisms including penalty provisions

5.34 As set out in Chapter 2, the bill proposes that the ACMA would take a graduated and proportionate approach to enforcement, with a range of tools to address non-compliance by digital platforms. These include the use of formal warnings, remedial directions, infringement notices, injunctions and civil penalties.

5.35 This approach is consistent with the ACMA's broader compliance and enforcement approach across its broadcasting, radiocommunications, telecommunications and certain online content remit.[30]

5.36 The UTS Centre for Media Transition expressed broad support for the bill's proposed enforcement regime, stating:

Overall, we think the enforcement regime under the bill is well-designed and improves on other parts of the Broadcasting Services Act in offering ACMA a more extensive range of enforcement options.[31]

5.37 While there was support for mechanisms to hold digital platforms accountable, various submitters highlighted the need to ensure proportionality, transparency and appropriate safeguards.

Concern that the penalty provisions are excessive

5.38 Several submitters expressed concerns about the severity of potential penalties, with some arguing that these penalties would encourage digital platforms to 'over-enforce' and over-censor, and would disproportionately impact smaller platforms.

5.39 The need for a proportionate approach to penalties was raised by some submitters. For example, Google advocated that penalties should be available only where there has been a serious breach of a key requirement of the bill, and 'should be proportionate to the nature of the relevant conduct'. Google expressed concern that some requirements, such as obligations to publish certain information like a media literacy plan, are captured by the civil penalty provisions.[32]

5.40 DIGI recommended that the proposed penalties should be comparable to those in the Broadcasting Services Act 1992, including for breaches of broadcast codes and standards.[33]

Whether criminal penalties would apply

5.41 Concerns were also expressed during the inquiry about the possibility of criminal penalties applying to contraventions of the bill. However, DIGI noted that the criminal penalties included in the 2023 exposure draft of the bill had been removed.[34]

5.42 DITRDCA also confirmed that there are no criminal penalties in the bill.[35] It further clarified, in relation to criminal proceedings against individuals:

There are no circumstances in this bill where an individual could be the subject of [criminal] proceedings in relation to this. Again, the bill is focused on digital platforms and the commitments they make under the codes they submit to ACMA. It is not about the people that post information to those platforms; it's focused on the platforms … There are circumstances in the bill where individuals with knowledge of the platforms' compliance with the codes, such as employees of the platforms, might be asked to provide information to assist ACMA, but there are no circumstances in this bill at all where individuals would be the subject of such [criminal] proceedings.[36]

Transparency of the ACMA's decision making

5.43 The transparency of the ACMA's decision making under the bill's provisions was also highlighted during the inquiry, including various calls for greater clarity about how the ACMA would make decisions.

5.44 For example, the Australian Muslim Advocacy Network Foundation emphasised the importance of transparency on the part of both digital communications platform providers and the Government, recommending that the ACMA 'publish information about actions they take that impact human rights'.[37]

5.45 Digital Rights Watch suggested that greater transparency in reporting by the ACMA would be welcome, in particular reporting on platforms' decision making around content amplification.[38]

5.46 DITRDCA highlighted the range of transparency and accountability measures which would apply to the ACMA under the bill, including:

certain regulatory actions by the ACMA, such as the making of digital platform rules, approval of codes or the determination of standards, are all disallowable legislative instruments subject to Parliamentary scrutiny;

reporting on the ACMA's regulatory actions, including its decision-making, as part of the review of the operation of the bill; and

a requirement for the ACMA to publish on its website an annual report on the operation of its powers under the bill.[39]

5.47 The ACMA welcomed the transparency and accountability measures for its decision making powers proposed in the bill, including:

The ACMA's powers to develop digital platform rules, approve codes or make standards under the bill are all subject to parliamentary scrutiny and disallowance.

Clause 69 of the bill requires the ACMA to give an annual report on the exercise of its powers under the bill to the Minister. This report would provide details on the use of the ACMA's powers during the financial year, and comment on whether misinformation codes or standards are necessary. The Minister must present the report to the Parliament and the ACMA must also cause a copy of this report to be published on its website.

Clause 70 of the bill requires that triennial reviews be conducted of the operation of the bill.[40]

5.48 These are in addition to existing transparency and accountability mechanisms in place for the ACMA, including annual reports, a published compliance and enforcement policy, the publication of annual compliance priorities and investigations, and ongoing parliamentary scrutiny.[41]

The ACMA's intended activities

5.49 The ACMA outlined that it anticipated taking a number of immediate steps, if the bill were passed, to inform digital platforms and the public about its proposed powers and responsibilities. This would include:

educating the public and platforms about its new powers, and the obligations placed on the ACMA in exercising them;

publishing industry guidance about the providers and services captured by the legislation;

communicating to platforms the impact of the legislation and its expectations for initial compliance with transparency obligations, including on data access for researchers and related obligations under the Code;

continuing to work with industry to improve voluntary transparency reporting, including the adoption of pilot metrics to improve reporting under the Code; and

using its information-gathering powers should voluntary efforts to improve transparency fail or prove inadequate.[42]

Footnotes

[1] Australian Communications and Media Authority (ACMA), written questions on notice from Senator Grogan, 22 October 2024 (received 1 November 2024).

[2] ACMA, written questions on notice from Senator Grogan, 12 November 2024 (received 13 November 2024).

[3] ACMA, written questions on notice from Senator Grogan, 12 November 2024 (received 13 November 2024).

[4] Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA), Submission 74, p. 3.

[5] DITRDCA, Submission 74, pp. 3–4.

[6] DITRDCA, Submission 74, p. 3.

[7] Explanatory Memorandum, p. 90.

[8] DITRDCA, Submission 74, p. 3.

[9] Ms Creina Chapman, Deputy Chair, ACMA, Committee Hansard, 11 November 2024, p. 55.

[10] Ms Kelly Mudford, Manager, Disinformation and Platforms, ACMA, Committee Hansard, 11 November 2024, p. 73.

[11] See, for example, Institute of Public Affairs, Submission 44, [p. 21]; Digital Rights Watch, Submission 35, p. 6.

[12] Institute of Public Affairs, Submission 44, p. 14; Digital Rights Watch, Submission 35, p. 6.

[13] Australians for Science and Freedom, Submission 17, p. 7.

[14] Public Interest Journalism Initiative, Submission 82, p. 4.

[15] Digital Industry Group (DIGI), Submission 79, p. 3.

[16] Australian Broadcasting Corporation, Submission 37, p. 5.

[17] Explanatory Memorandum, p. 24.

[18] DITRDCA, Submission 74, p. 7.

[19] Reset Tech Australia, Submission 25, p. 9.

[20] Australian Communications Consumer Action Network, Submission 77, p. 1.

[21] Tasmanian Climate Collective, Submission 92, p. 3.

[22] DIGI, Submission 79, pp. 27–28.

[23] Snap Inc., Submission 63, p. 2.

[24] Google, Submission 86, p. 16.

[25] See, for example, DIGI, Submission 79, pp. 8–9; Google, Submission 86, p. 9.

[26] DIGI, Submission 79, p. 3.

[27] DIGI, Submission 79, p. 35. DIGI notes that the timeframe of 120 days for the making of a code is insufficient to 'allow for meaningful engagement with industry, the regulator, and other stakeholders'.

[28] Australian Communications Consumer Action Network, Submission 77, p. 4.

[29] ACMA, written questions on notice from Senator Grogan, 12 November 2024 (received 14 November 2024).

[30] ACMA, Compliance and enforcement policy (accessed 12 November 2024).

[31] UTS Centre for Media Transition, Submission 98, p. 6.

[32] Google, Submission 86, p. 20.

[33] DIGI, Submission 79, p. 38.

[34] DIGI, Submission 79, p. 38.

[35] DITRDCA, Submission 74, p. 11.

[36] Mr James Chisholm, Deputy Secretary, Communications and Media Group, DITRDCA, Committee Hansard, 11 October 2024, p. 60.

[37] Australian Muslim Advocacy Network Foundation, Submission 29, p. 3.

[38] Ms Elizabeth (Lizzie) O'Shea, Founder and Chair, Digital Rights Watch, Committee Hansard, 11 October 2024, p. 34.

[39] DITRDCA, written questions on notice from Senator Grogan, 22 October 2024 (received 5 November 2024).

[40] ACMA, written questions on notice from Senator Grogan, 22 October 2024 (received 1 November 2024).

[41] ACMA, written questions on notice from Senator Grogan, 22 October 2024 (received 1 November 2024).

[42] ACMA, written questions on notice from Senator Grogan, 12 November 2024 (received 13 November 2024).