Chapter 4
Transparency, integrity and compliance measures for digital platform providers
4.1 This chapter discusses the transparency, integrity and compliance measures contained in the provisions of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 (the bill), and inquiry participants' views on these measures.
4.2 It outlines the bill's provisions and inquiry participants' views relating to:
current requirements for transparency and compliance in the voluntary Australian Code of Practice on Disinformation and Misinformation (the Code), and concerns identified by the Australian Communications and Media Authority (ACMA) about the effectiveness and operation of the Code;
the need for transparency and compliance strengthening measures in the bill;
the proposed requirement for digital platform providers to publish current media literacy plans; and
proposed measures for complaints handling and compliance.
4.3 The next chapter discusses views about the ACMA's proposed regulatory powers, and related transparency measures, under the bill.
Current transparency requirements on digital platform providers
4.4 One of the bill's objectives is 'to increase transparency regarding the way in which digital communications platform providers manage misinformation and disinformation'.
4.5 The bill's Explanatory Memorandum stated that this objective stemmed from a 'sharp increase in Australians' concerns about misinformation', and that Australians' concerns are 'well above the global average'.
4.6 The bill would 'impose core transparency obligations on digital platforms requiring them to be upfront about what they are doing on their services to combat misinformation and disinformation'.
4.7 Under the bill, digital platforms would be required to publish media literacy plans setting out how users of their platform would be better able to identify misinformation and disinformation. Further, digital platform providers would be required to publish their current policies for misinformation and disinformation, and the results of risk assessments for misinformation and disinformation on their services.
4.8 The ACMA would be able to make digital platform rules which relate to media literacy plans, risk management plans, and complaints and dispute handling processes.
4.9 The Office of the eSafety Commissioner (eSafety Commissioner) highlighted the mandatory industry codes and standards for online safety under the Online Safety Act 2021. These codes and standards 'now create an enforceable obligation on companies to take reasonable steps' towards mitigating the spread of harmful online content such as child sexual exploitation material.
4.10 The eSafety Commissioner stated that, from its perspective, 'asking tech companies to explain how they are enforcing their own declared house rules does not seem unreasonable or out of step with existing regulatory schemes'. Further, the eSafety Commissioner stated that 'without meaningful transparency, there can be no accountability'.
The operation of the voluntary industry code
4.11 The bill's proposed transparency obligations are intended to broadly align with the objectives of the existing voluntary Code developed for digital communications platforms by the Digital Industry Group Inc (DIGI).
4.12 The Code requires signatories to have systems and processes in place to minimise the risk of harm from misinformation and disinformation, and places responsibility on platform providers to determine whether content contravenes their policies.
4.13 TikTok, one of the signatories to the Code, described the voluntary Code as 'a blueprint for addressing the complex and multifaceted challenge of harmful online misinformation and disinformation at scale'.
4.14 The Code places a mandatory commitment on digital platform providers to publish an annual transparency report. Transparency reports provided to DIGI under the Code have shown that significant amounts of content were identified as violating misinformation policies in 2023. For example, TikTok removed 28 511 pieces of content for violating misinformation policies in 2023, and warnings were displayed on more than 9.2 million distinct pieces of content on Facebook, and more than 510 000 on Instagram in Australia.
4.15 Snap Inc. noted that it is the 'only major platform to provide country-specific information about reports received, and our responses to these, in all the countries in which we operate around the world'. Snap Inc. stated that there is no regulatory mandate to provide this additional information, and that the ACMA should recognise 'responsible efforts to provide transparency information when considering application of its powers'.
ACMA's recommendations to strengthen the Code's transparency measures
4.16 As discussed in Chapter 1, the voluntary Code's operation and effectiveness have been reviewed by the ACMA three times since its development. The ACMA has consistently raised the need for industry to improve the quality of its reporting against the Code.
4.17 The first of the ACMA's three reports to the Australian Government, in 2021, put forward recommendations intended to strengthen transparency. These included new powers to make record-keeping rules, and new powers to register industry codes, enforce industry code compliance and make standards for the activities of digital platform corporations.
4.18 The ACMA's reports outline observations on the effectiveness of transparency arrangements in the Code, including that:
there remains an urgent need to improve the level of transparency about the measures platforms are taking to address misinformation and their effectiveness
better reporting by signatories is needed to enable an assessment of progress and impact
industry needs to take further steps to review the scope of the code and its ability to adapt quickly to technology and service changes
transparency reports lack sufficient levels of consistent Australian trended data either by individual signatories or across signatories
data integrity issues limit insights into the effectiveness of some signatories' measures and undermine the ability of Australians to be confident that platforms are delivering on their commitments
there is no discernible progress towards developing key performance indicators (KPIs) by both individual signatories and across the code.
4.19 The ACMA stated that the recommendations set out in its 2021 report 'remain relevant and their implementation is more urgent than ever, given community concern'. The ACMA stated that the bill would enable it to collect information to improve transparency of digital platforms, including those platforms which are not signatories to the Code.
4.20 The ACMA considered that the bill's provisions would 'significantly improve current arrangements and provide greater confidence to Australians that mis/disinformation was being actively addressed by digital platforms'.
Views that the code should be strengthened
4.21 Reset Tech Australia highlighted its previous research, which had documented that the existing accountability mechanisms within the Code are deficient. Similarly, the Australian Communications Consumer Action Network noted that voluntary industry codes can lead to weak compliance and falling consumer trust, citing the Telecommunications Consumer Protections Code.
4.22 The Australian Forest Products Association was supportive of the bill's intention to 'address the identified shortcomings of the voluntary code of practice on disinformation and misinformation through greater regulation'.
4.23 According to the ACMA, as the Code is voluntary, 'there is no regulatory backstop to either compel digital platforms to become signatories or hold them accountable if they breach their obligations'. The ACMA gave the following example of an original signatory to the Code failing to meet its requirements:
X Corp (an original code signatory) lost its signatory status in November 2023 after the code's Independent Complaints Subcommittee found that it had committed a serious breach of the code for failing to provide a mechanism to the public to make reports of breaches of its misinformation policies for an extended period. This was the strongest enforcement option available to the Subcommittee. While this decision demonstrates that the code's governance arrangements are functional, it also highlights the limitations of voluntary arrangements.
4.24 The Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) explained the need for the bill's transparency and integrity measures:
The ACMA has continuously highlighted the need for industry to improve the quality of its monitoring and reporting against the voluntary code's outcomes, noting that a robust performance measurement framework is critical to its success. The ACMA has found that the transparency reports made under the voluntary code lack consistent, trended, Australia-specific data on the effectiveness of digital platforms' efforts to address misinformation and disinformation on their services.
4.25 Further, DITRDCA stated that the bill's provisions were required to address the need for stronger transparency:
It is increasingly clear that existing regulation is insufficient in dealing with seriously harmful misinformation and disinformation. Digital communications platform providers are not incentivised to proactively promote an online information environment that minimises the spread of misinformation and disinformation.
Support for greater transparency of digital platforms
4.26 There was broad support for increased transparency and accountability of digital platforms, including their approach to content moderation.
4.27 The eSafety Commissioner stated that while the main technology companies have made public commitments to ensure that platforms are doing the right thing, it is very difficult to evaluate their effectiveness:
All the main technology companies have made public commitments to ensuring their platforms do not become havens of lies, deception, deceit and polarisation.
They also have rules to combat harmful content and conduct because online toxicity does not attract advertisers or help retain users.
4.28 The eSafety Commissioner noted that digital platform policies promise strong measures to tackle misinformation and disinformation, including: human and automated fact-checking and moderation; warnings and labels for suspected misinformation and disinformation; promoting authoritative sources; user reporting; and penalties. However, the eSafety Commissioner was not assured of their success, and stated that 'the answer is we don't really know' whether these policies work.
4.29 The Department of Home Affairs (Home Affairs) welcomed the bill and 'its flow on implications for efforts to strengthen social cohesion and democratic resilience'. Home Affairs explained that the proposed transparency obligations would enhance its ability to understand the nature and extent of current risks which are posed by misinformation and disinformation.
4.30 Officials from Home Affairs expressed support for measures which 'empower users to form their own views and advance their own critical thinking about those issues'. Home Affairs stated that:
We would also point to, again, the transparency obligations of this bill as a contribution to increasing the supply of trustworthy and credible information within the public debate, including the ability of government but also end users to understand the risks and the information flows, the content and nature of that information, balanced with appropriate protections for freedoms within our democracy.
4.31 The Mudgee District Environment Group supported action to prevent misinformation and disinformation, stating that increased transparency for Australians to assess information would work to 'assure the Australian people that the information they have access to is true and correct'.
4.32 While the Fighting Harmful Online Communication Hallmark Research Initiative at the University of Melbourne supported measures to inform digital platform users, it suggested that there is a lack of detail as to 'how plans should be written to satisfy the changes being proposed in this bill'.
4.33 The ACMA explained that its role is firstly to oversee and enforce transparency provisions, including measures related to the publication of policies, risk assessment reports and media literacy plans. The ACMA's proposed powers to set rules in relation to these issues, and to complaints handling, would be part of this role. The ACMA stated that 'the parameters around which we can make these decisions are quite specifically set out in the bill'.
4.34 The ACMA's proposed powers under the bill are addressed in the next chapter.
Concerns raised in relation to the proposed transparency and integrity measures
4.35 Inquiry participants put forward a range of views on the bill's provisions relating to transparency and integrity.
Concerns over government accountability
4.36 Strong concerns were raised in relation to accountability for government. The Australian Christians Party, for example, considered that the bill's lack of accountability for government was 'the most troubling aspect'. The Australian Christians Party stated:
Exemptions for government content, combined with the broad powers granted to ACMA, create an environment where the government can silence dissent without consequence. Transparency and accountability are critical to maintaining public trust, and this Bill offers neither. Freedom of speech belongs to all Australians, not just to those who align with the government's views.
4.37 A key change from the 2023 Exposure Draft of the bill was to refine the categories of content that would be excluded from the bill, with the intention of strengthening accountability while reinforcing the protections for freedom of speech. This included removing the exclusions for government-authorised content and authorised electoral matter.
4.38 The Australian Christian Lobby stated that 'Government transparency and a willingness to be held accountable is an essential prerequisite for the trust that enables democracy in Australia'. It considered that the bill proposes to limit transparency.
Protections for independent fact-checking services
4.39 The Australian Associated Press (AAP) considered that the bill 'positions third-party fact-checkers as stakeholders', and would allow employees of organisations like AAP FactCheck to be called upon to provide information and documents to the ACMA. AAP FactCheck stated that it reports to the platforms in line with their record-keeping requirements, and considered that the bill's provisions could discourage platforms from engaging fact-checking organisations 'for fear they may be compelled by [the] ACMA to reveal otherwise confidential information'.
Proposed media literacy plans provisions
4.40 As set out in Chapter 2, the bill would impose certain requirements on digital communications platform providers, including to publish a current media literacy plan for their platform setting out the measures the platform will take to enable end-users to better identify misinformation and disinformation, and to identify the source of content disseminated on the platform.
4.41 Inquiry participants provided a range of views on the bill's provisions relating to the publication of current media literacy plans by digital platform providers.
Support for the bill's media literacy plan provisions
4.42 The Institute for Civil Society supported the bill's provisions for transparency in media literacy plans, arguing that this would 'empower' users of digital platforms 'to better identify misinformation and disinformation'.
4.43 Reset Tech suggested that 'a single, coordinated repository of information, administered by ACMA' would make it easier for the public to access the media literacy plans and policies of digital platforms:
We note that the Bill (s 17) provides for digital platforms to publish their risk assessment report, misinformation and disinformation policies, a media literacy plan, and other information specified in digital platform rules. We commend the spirit of this section, but note that it would be more straightforward for public users to access a single, coordinated repository of information, administered by ACMA.
4.44 Home Affairs particularly welcomed the bill's proposed requirement for digital platforms to publish a media literacy plan:
Greater transparency around these plans will mean the Department can also better target its media literacy efforts in response to specific risks (e.g., foreign interference) in a manner that complements and is consistent with various digital platform efforts.
Opposing views regarding media literacy plans
4.45 Digital Industry Group Inc (DIGI), the digital industry peak body, did not support the proposed requirement to publish current media literacy plans, and argued that media literacy should 'be part of a coordinated, systematic and evidence-informed literacy policy, developed and implemented by government'.
4.46 Further, DIGI stated that this should use 'input from stakeholders including educators at the primary and secondary school level, vocational educators, and educators at higher education institutions'. According to DIGI, delegating this responsibility to platforms would 'lead to fragmented, inconsistent approaches' that 'are unlikely to be effective'.
4.47 DIGI representatives elaborated on these views, stating that a number of low-risk businesses would be subject to mandatory media literacy plans, including large and small-scale content aggregators. DIGI stated that product review sites which aggregate information on matters such as road conditions, travel and hobbies would be required to have a media literacy plan, which would be an 'onerous requirement'.
Industry reporting and data
4.48 The bill proposes to introduce greater transparency and accountability requirements for digital communications platforms, including the publication of the results of their risk assessments relating to misinformation and disinformation on their platforms. Inquiry participants put forward a range of views on these provisions.
Calls for the bill's industry reporting provisions to be strengthened
4.49 Reset Tech argued that the bill should be strengthened to improve public accountability and transparency by requiring that the ACMA release all industry reports and data submitted to it within a reasonable period:
For completeness, it would be prudent to prevent potential content carve-outs and provide expressly that any further materials submitted to ACMA by platforms are also provided to the public via this repository after a reasonable period of time. The effect of this is there is a central location for comprehensive information on the outputs generated by the Bill, including information made available to the regulator. This extra step of public transparency is, in our view, key for building public trust and understanding of the Bill's functionality.
4.50 Similarly, the Australian Democracy Network called for industry reports and data submitted to the ACMA to be publicly released. The network stated that while the bill would provide for the ACMA to be able to publish certain information on its website, 'it creates no obligation to do so'. To enhance transparency, and therefore improve public trust, 'it's vital that ACMA doesn't become a black box for information about platform measures to curb misinformation and disinformation'.
4.51 The Human Rights Law Centre also recommended that reports submitted to the ACMA under digital platform rules be publicly released. It suggested that the ACMA create a 'single, coordinated repository for public users' to access the digital platforms' risk assessment reports, misinformation and disinformation policies, and media literacy plans required under the bill's current provisions, as well as the recommended additional information.
Concern about the regulatory burden
4.52 Other submitters, including digital platforms, highlighted the additional regulatory burden imposed by the proposed reporting provisions.
4.53 Snap Inc., the technology company responsible for Snapchat, was concerned about the potential impact of the increased regulation on the digital platforms market in Australia:
Ultimately, the companies who are best served by overly burdensome and complex regulation are the largest firms, with the largest compliance teams, who can easily deal with the bureaucracy involved, while smaller companies struggle to handle the workload.
4.54 It stated that any new powers conferred on the ACMA should be 'exercised in a risk-based and proportionate way that minimises unnecessary bureaucratic and administrative burden on industry participants'. Snap Inc. also stated that responding to regulatory requests for information, complying with regulatory codes of practice or standards, and providing transparency information 'requires significant time and resources'.
4.55 Snap Inc. recommended that the bill be amended to require the ACMA to 'consider existing efforts made by digital platform service providers to publish transparency information before exercising its [proposed] new powers related to information-gathering or code development'.
Complaints and dispute handling
4.56 As part of its administration role, DIGI can accept and investigate complaints about platforms' alleged breaches of the Code and, as noted above, can withdraw a company from the Code if necessary.
4.57 The bill would empower the ACMA to make rules requiring digital communications platform providers to implement and maintain a process for handling complaints and resolving disputes about misinformation and disinformation.
4.58 As set out in the Explanatory Memorandum, these rules could, for example, require digital platform providers to implement and maintain complaints and dispute handling processes regarding misinformation, or set minimum standards for these processes.
Calls for the provisions in the bill to be strengthened
4.59 The Australian Communications Consumer Action Network, the peak body for consumers on communications issues, suggested that the bill should empower the ACMA to set minimum complaints and dispute handling processes for all types of complaints on digital platforms, rather than only those relating to misinformation and disinformation. It also suggested allowing complaints to be escalated to an independent third-party body such as the Telecommunications Industry Ombudsman.
Calls for an appeals process
4.60 The Institute of Public Affairs (IPA) was strongly opposed to the bill, including its proposed measures for complaints handling. The IPA stated that Australians whose content is found to violate the bill's provisions on misinformation and disinformation would not have a right of appeal or review against the decision made by the ACMA, and that:
An aggrieved Australian may lodge a complaint with the digital platform, but the platforms are private entities and as such are not bound by the rule of evidence and the common law protections for procedural fairness.
4.61 The Institute for Civil Society also stated that the bill 'does not guarantee a right of appeal against any decision made by [the] ACMA', and that the bill would delegate, through regulations, the decision as to whether something may be subject to appeal. The Institute called for this to 'require direct Parliamentary endorsement'.
4.62 The Combined Faith Leaders were also concerned at the 'conspicuous lack of appeal rights and transparency with regard to removed content'. The Combined Faith Leaders called for an appeals process for individuals whose content is considered to be misinformation or disinformation, with the individual to be notified immediately and given reasons. They also called for independent review bodies to be available to review decisions on appeal, with the information published.
4.63 This concern was echoed by a representative of the Victorian Bar, who stated that the bill does not have 'any obvious redress for people whose content is wrongly labelled as being misinformation'.
4.64 DITRDCA's submission made it clear that the bill would allow the ACMA to create digital platform rules dealing with complaints and dispute handling processes, and requiring platforms to keep and report on specific information such as the number of complaints and reports about misinformation and disinformation made by Australian users. DITRDCA also stated that a number of decisions by the ACMA could be reviewed in the Administrative Review Tribunal.
Compliance measures
Concerns that the bill will lead to 'over censorship'
4.65 The IPA put forward concerns that the bill would incentivise 'over-compliance with obligations to censor, whereas digital platforms are under no obligation to protect freedom of speech'. The IPA highlighted provisions for financial penalties on digital communications platform providers for non-compliance with misinformation standards it characterised as 'vague', stating that this 'will incentivise platforms to excessively censor to mitigate risk'.
4.66 The IPA stated that while the bill proposes obligations and penalties for the failure to manage content, 'similar obligations and penalties do not apply in situations where a platform censors content that is not misinformation'.
4.67 The IPA further stated that there is a conflict between 'censoring content' to address the risks of misinformation and the principle of freedom of speech:
For the digital platforms, the material risk is only on one side of the conflict. If contentious information is being circulated on a digital platform, censoring that information comes at no cost to the platform. But there is a significant potential cost for the same information being left to circulate. If in doubt, the platforms are incentivised to censor so as to mitigate risk.
4.68 Similarly, the Institute for Civil Society raised concerns that private sector organisations (the digital communications platform providers) would be required to implement the bill's provisions. The Institute argued that platform providers have a 'tendency … to over-censor', and called for 'maximum transparency as to what information is being censored' and for 'the right to rapid review and redress when material is incorrectly labelled "misinformation" or "disinformation"'.
4.69 The Combined Faith Leaders also raised concerns that digital platform providers might be 'highly motivated to over-censor content'. The Combined Faith Leaders highlighted existing concerns over the platforms' ability to remove or restrict access to content, and considered that the bill would provide the ability for platforms to 'claim Government approval—and even mandate—for removing posts, shadow banning, and other forms of censorship'. They stated that:
Digital content platform providers are privately owned or publicly traded companies with a primary responsibility to maximise profit and share value. It would be foolish to rely on these organisations to prioritise freedom of speech, the good of society, or anything other than profit. By and large, these organisations do not have a history of behaving in the public interest.
4.70 DITRDCA's submission states:
Nothing in the Bill enables the ACMA to take down individual pieces of content or user accounts. The Bill takes a system level approach and digital platforms would remain responsible for managing content on their services.
…
It is expected that the ACMA will use a graduated, proportionate and risk-based approach to non-compliance and enforcement, which could include issuing formal warnings, remedial directions and infringement notices, and applying for injunctions and civil penalties, depending on the particular provision.
…
[T]he amount of civil penalties payable by digital communications platforms for breaches of approved misinformation codes and misinformation standards would be determined by the courts (up to the maximum amounts in the Bill).
There are no criminal penalties in the Bill.
Expanding data access for researchers and fact checkers
4.71 Various submitters and witnesses to the inquiry advocated for expanded access to digital platform data for researchers and fact checkers in Australia, with some citing the European Union's Digital Services Act 2022 as an example of legislation that mandates data access for researchers.
4.72 Reset Tech Australia stated that it had been 'concerned since the exposure draft stage that there won't be enough public accountability and transparency', and called for researchers to be able to examine misinformation in Australia. It drew attention to the challenges faced by independent researchers in accessing data, as platforms may limit access to their data.
4.73 Similarly, the Human Rights Law Centre outlined that '[a]t present, platforms are not voluntarily providing key data to researchers in Australia that would allow independent review'.
4.74 The Australian Muslim Advocacy Network Foundation recommended that accredited independent researchers be given immediate access to digital platform data relating to misinformation and disinformation. It suggested this access should also extend to data from the ACMA about its requests to digital platforms under the bill.
Government amendments made in the House of Representatives
4.75 As outlined previously, on 7 November 2024 the House of Representatives passed several amendments to the bill. These included proposed data-related measures to require digital communications platform providers to publish information about their policy or policy approach for supporting access by researchers, and to provide the ACMA with the power to make digital platform rules establishing one or more data access schemes.
4.76 As set out in the Supplementary Explanatory Memorandum, the data access related amendments reflect 'increasing recognition worldwide of the value of independent research about misinformation and disinformation on digital communications platforms'. This includes developments in other jurisdictions, including moves in the European Union to implement a data access scheme supporting independent research.
4.77 The Human Rights Law Centre indicated that it supported these amendments.