Chapter 2 - The changing media landscape in Australia

Overview

2.1 Evidence presented to the inquiry pointed to a complex relationship between digital platforms and the traditional news media.

2.2 With audiences moving increasingly online for news and information, digital platforms have become 'unavoidable trading partners' for traditional media organisations. The ability of digital platforms to harness user data to sell highly targeted advertising has seen advertising revenue diverted from traditional media organisations to the platforms. This shift has altered the media landscape in Australia, with serious consequences for the sustainability of traditional news outlets.

2.3 At the same time, the committee heard that digital platforms have evolved into online gatekeepers[1]—wielding significant control over the news content seen by users—but without any of the transparency, accountability and oversight applying to traditional publishers.

2.4 The lack of transparency and accountability of digital platforms was a common theme emerging from the inquiry. There was also a sense that existing regulation has not kept pace with the evolving role of digital platforms, or with the challenges posed by social and technological developments, including the manipulation of algorithms as part of online disinformation campaigns and the widespread use of generative Artificial Intelligence (AI).

2.5 In addition, while there was near universal agreement about the importance of public interest journalism as a bulwark against mis- and disinformation, questions were raised about its ability to fulfil this function in the face of falling advertising revenues, declining trust in journalism, increasing news avoidance, and the sheer volume of misleading and false material circulating online.

2.6 The remainder of this chapter focuses on the impact of digital platforms on the Australian media landscape, the proliferation of mis- and disinformation on digital platforms, the importance of public interest journalism in countering 'fake news', and the effectiveness of current approaches to the regulation of digital platforms.

The impact of digital platforms on the Australian media landscape

2.7 The Australian media landscape has changed rapidly since the advent of the internet and the emergence of digital platforms, particularly social media platforms.

2.8 Prior to the digital era, print, television and radio news media funded their journalism through advertising revenue, with print outlets reaching audiences via the direct sale of newspapers and magazines.[2]

2.9 However, as noted by the Media, Entertainment and Arts Alliance (MEAA), the rise of digital platforms 'fundamentally upended the way the news media interacts with the community', 'opened up greater choice for users to seek different sources of information' and 'given consumers power over what news is covered and how'.[3]

2.10 While these changes were initially seen in a positive light for both news producers and consumers, the MEAA went on to argue that:

This "democratisation" of the media was initially hailed as a boon for news producers and consumers. But consolidation of platforms, search engines, e-commerce sites, and other online services has given those companies that are still standing incredible power. There are now a small number of giant multi-national companies controlling how people access information and interact with each other online.[4]

The use of digital platforms to access news and information

2.11 Overall, television is still the most frequently accessed news source in Australia. However, Australians are increasingly turning to digital platforms for news and information. This is particularly true for younger Australians.

2.12 Ms Nerida O'Loughlin of the Australian Communications and Media Authority (ACMA) told the committee that 53 per cent of Australian adults access news via free-to-air television, with 48 per cent accessing news via social media. For 20 per cent of Australian adults, social media communications and websites are their main source of news.[5]

2.13 ACMA's findings are borne out by the University of Canberra's Digital News Report: Australia, which found social media is closing in on television as the primary source of news for Australians and has become the main source of news for Gen Z Australians:

Although television remains the most popular news source, its popularity declined to 56% (-2pp). Almost half of Australians (49%) use social media to access news, marking a 4-percentage point increase since last year. Nearly two thirds of Gen Z (60%) rely on social media as their main news source, which is a significant increase of 17 percentage points from last year. This year Gen Z's use of Instagram for news increased by 8pp and the platform is now the top social media platform for news among this generation (34%).[6]

2.14 Of the Australian adults who had accessed social media in the preceding seven days, ACMA's survey found that 68 per cent accessed professionally produced media, 43 per cent accessed official/reputable sources of information, and 38 per cent accessed information via community or special interest groups.[7]

2.15 Free TV Australia (Free TV) drew particular attention to ACMA's findings that younger Australians rely more on celebrities and influencers (31 per cent) and unknown sources (33 per cent) for news content.[8] SBS also pointed out social media usage is higher among multilingual Australians and First Nations communities.[9]

Digital platforms as 'unavoidable trading partners' for news media outlets

2.16 The committee heard repeatedly that digital platforms have become 'unavoidable trading partners' for news media outlets.[10] For example, Mr Tim Duggan of the Digital Publishers Alliance (DPA) highlighted publishers' reliance on digital platforms to reach their audiences:

Meta has been, and can be, considered an unavoidable trading partner. The truth is that, if you want to build a modern media business, you need to go where your audiences are. Many of them are on social media, particularly Facebook and Instagram.[11]

2.17 This point was reinforced by Ms Catriona Lowe of the Australian Competition and Consumer Commission (ACCC), who spoke about the market share built by digital platforms as they evolved into the 'unavoidable gatekeepers between Australian consumers and businesses'.[12]

2.18 The Local & Independent News Association (LINA) highlighted the positive aspects of this relationship and noted that digital platforms provide new ways to engage with target audiences, reach particular communities more quickly, and gather story leads and community feedback:

Facebook and Instagram therefore play a valuable role in supporting the visibility of publishers' content and helping to establish direct relationships with their audience, providing a platform for audience feedback and discussion, as well as promoting community engagement and the representation of diverse voices within media coverage.[13]

2.19 While recognising the symbiotic relationship between publishers and digital platforms, some participants suggested that platforms have exploited the content and credibility of news media organisations to build their market share and create dependency. For example, Mr Anthony Kendall of Australian Community Media (ACM) explained:

As a business we have participated in social media to help grow our audience and because we believe users of those platforms deserve access to credible local news and information. But the platforms have used our content and our brand equity to build their profits to the point where they are now monopolies.[14]

2.20 This sentiment was reflected in evidence from Mr Mike Sneesby of Nine Entertainment, who argued that:

… over a number of years, Meta has used the content from media organisations to build a scale audience reaching most of the Australian population, creating that unavoidable business partner and a dependency on that distribution channel by media businesses.[15]

2.21 Participants also told the committee about being incentivised to build audiences on digital platforms, to the benefit of those platforms. For example, as the founder of digital media company Junkee Media, Mr Duggan described being 'heavily incentivised and courted' to create content for Meta and Instagram:

… we were encouraged, incentivised—we were given training; we were given funding—to create products specifically for the digital platforms, specifically for Meta and Instagram. What that means is that we created content and we built audiences. We spent a lot of time building up these very large audiences on these platforms, to the benefit of the digital platforms. We would put our content onto Facebook and Instagram, and ads would be surrounding that content. Audiences would go towards there.[16]

2.22 However, after being encouraged to build their audience on digital platforms, publishers began to find their access to those audiences restricted as a result of algorithmic changes.[17] According to Mr Duggan, 'Meta are able to dial up and dial down what people see on their platforms, and what they have done progressively over time is to dial down the audience'.[18]

2.23 Ms Karen Percy of the MEAA labelled this tactic as a 'bait and switch':

You have this situation where the platforms have made news organisations dependent on the platforms. You have to go where the audience is, because they're not going to traditional platforms or traditional ways of getting it anymore. They have drawn us in and then done a bait-and-switch. You used to be able to promote your content at a low cost. It's much harder to find news content these days, though we know that people get their news that way. I think it's important to understand what the back end of these organisations [are] doing.[19]

2.24 Similarly, Mrs Natalie Harvey described Mamamia's experience of this for the committee:

Facebook encouraged publishers to go on the platform and create audiences that would help drive platform usage as well. So that benefits Facebook. It does also benefit the publishers, because you're creating these communities and you're having a very deeply engaged conversation with them. But, over time, it gets harder and harder to reach those audiences, because they change the algorithm so that your content is not appearing in an organic fashion and, therefore, you have to invest money to be able to reach those audiences. … And, over time, it has got harder and harder to reach them.[20]

2.25 Likewise, Mr Nicholas Shelton of Broadsheet Media (Broadsheet) indicated that after initially courting Broadsheet, Meta slowly degraded its organic referral traffic—with a decline of 50 per cent in the last 12 months. Broadsheet now spends approximately $500,000 annually to reach its audience.[21]

2.26 We Are Explorers likened this situation to having its audience 'effectively held hostage until payments are made',[22] while Mr Duggan described it as Meta 'basically overnight, breaking up with the news industry' after heavily courting them to build content and audiences for digital platforms.[23]

The impact of digital platforms on Australian news media sustainability

2.27 While the reach of digital platforms—and the speed at which information can be disseminated on those platforms—expanded audiences for traditional media organisations, this has come at a heavy cost. According to Australians for a Murdoch Royal Commission (AMRC), the rise of digital platforms has seen traditional media outlets lose control of content distribution, instead becoming reliant on web traffic and social media algorithms. At the same time, businesses began to move their advertising online.[24]

2.28 This shift was noted by the Human Technology Institute (HTI), which explained that the ability of digital platforms to offer highly personalised content (based on intensive data collection) resulted in advertising revenue being channelled away from traditional media and toward the platforms.[25]

2.29 According to Mr Shelton, this allowed platforms such as Meta to take control of the economics of internet content:

They do that very effectively, and when you control the distribution you control the commerce and you control the economics. That's what they've done so effectively. Now they've decided they don't need us anymore, so they're happy to shut the door.[26]

2.30 The decline in advertising revenue for traditional media outlets was highlighted by numerous inquiry participants.[27] For example, the MEAA stated that print, television and radio outlets have lost a significant portion of the advertising revenue that previously sustained their operations.[28] Similarly, Mr Kendall reflected that the rise of digital platforms had 'decimated the ad revenues that funded quality journalism'.[29]

2.31 The scale of this loss was captured in a 2018 University of Technology Sydney report (UTS report) commissioned by the ACCC. The UTS report explained that 'between 2011 and 2015, Australian newspaper and magazine publishers lost $1.5 billion and $349 million respectively in print advertising revenue'. Over the same period, these publishers gained only $54 million and $44 million in digital advertising revenue. Further, by 2016, Google and Facebook were in receipt of three quarters of Australia's total online advertising expenditure.[30]

2.32 More recent research by the University of Sydney's Department of Media and Communications (USYD MECO) found that by 2022, online advertising was a $14.7 billion industry in Australia—larger than the newspaper, broadcast television, radio, and online video service industries combined—with Google and Facebook accounting for 55.4 per cent and 16.6 per cent of the market share respectively.[31]

2.33 Further, Free TV submitted that while media organisations are struggling to sustain investments in public interest journalism, 'Meta's first quarter profits for 2024 more than doubled to USD 12.37 billion, suggested to be largely as a result of higher advertising revenue and increasing the cost of ad placement'.[32]

2.34 The MEAA contended that the downturn in revenue has led to significant redundancies across news organisations, 'damaged the media workforce' and 'left the Australian media greatly weakened'.[33] Likewise, the Australian Broadcasting Corporation (ABC) pointed to its 'corrosive effect' on 'the quality, quantity, reach and impact of public interest journalism'.[34]

2.35 AMRC argued that the loss of advertising revenue was 'the final blow' for many independent and community news outlets:

Across regional and metropolitan Australia, local newspapers and radio programs either closed down, reduced their frequency, merged with neighbours, or were absorbed into the corporates like Murdoch, Fairfax, or Packer's media empires. Between 2014-2019 alone, 16,000 media jobs were lost. In 2020, 164 media businesses closed down. By 2024, over thirty Australian local government areas now exist without any access to a local news outlet in either print or digital form. (Submission 188, p. 3)

2.36 More broadly, the Public Interest Journalism Initiative (PIJI) warned that 'news market failure is becoming a more likely prospect across the country' as the producers of public interest journalism face the overlapping challenges of declining advertising revenue and an increasing reliance on digital platforms for referral traffic.[35]

2.37 PIJI's Australian News Data Report for March 2024 showed the stark impact of these challenges—particularly in non-metropolitan areas (see Box 2.1).

Box 2.1 Australian News Data Report – March 2024[36]

161 news outlets closed in the five years between January 2019 and March 2024:

this is a sharp acceleration from previous ACCC data showing 106 news outlet closures in the ten years from 2008 to 2018.

Even where news outlets remained operational, there has been significant overall contraction in news production and availability:

between 2019 and 2024 there were 337 newsroom contractions and only 175 newsroom expansions.

Regional areas, already underserved relative to metropolitan areas, have been the hardest hit by these closures and contractions:

while 90 new regional outlets opened between 2019 and 2024, that benefit was outweighed by the 109 regional outlets that closed, representing two-thirds of total closures nationally; and

ninety-one per cent of the outlets across the country that experienced a contraction in services were also in regional areas.

Eighty-eight per cent of the news outlets in Australia are local in scale, servicing a single Local Government Area (LGA) or cluster of LGAs.

2.38 However, LINA pointed to the expansion of digital newsrooms in the wake of these contractions and closures and suggested that its member businesses represented the 'green shoots' of the industry. LINA emphasised the role these publishers play 'in representing diverse and regional voices, providing public interest news services in areas where access to information has been significantly impacted by newsroom closures and the syndication of services'.[37]

2.39 Despite this, some participants cautioned that emerging technologies could worsen current challenges for publishers. For example, Mr James Chisholm of the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) suggested that AI has the potential to further concentrate user reliance on digital platforms at the expense of traditional media organisations:

We will see a situation whereby consumers will more and more just rely on platforms for their information and, as AI brings all that information together in such a way that you don't really have the incentives to click away from the ecosystem that you're in, that is going to make life incredibly difficult, even more difficult, for publishers than it is now.[38]

The rise of mis- and disinformation on digital platforms

2.40 While online harms, including the spread of mis- and disinformation, are not a new phenomenon, they have been turbocharged by the rise of digital platforms. For example, Mr Peter Lewis of the Centre of the Public Square, Per Capita (CPS Per Capita) contended that:

We know there have always been lies in politics and in the media. What's different here is that lies drive more virality and more engagement.[39]

2.41 Mr Bill Calcutt PSM suggested that this has been enabled by social media's transformation of social participation, which allows people—many anonymously—'to have almost instant and largely unmoderated access to a global audience'.[40]

2.42 The 'hyper-connectivity' of digital platforms was also noted by DITRDCA, which observed the potential of social media to amplify online harms.[41] In addition, participants such as HTI and PIJI suggested that the emergence of generative AI and the ability to rapidly create and disseminate content has supercharged the challenge of separating fact from fiction on digital platforms.[42]

2.43 According to the Australian Human Rights Commission (AHRC), 'the Global Risks Report 2024 declared that misinformation and disinformation would be the "most severe global risk anticipated over the next two years"'. The AHRC noted that mis- and disinformation can have 'devastating effects on human rights, social cohesion and democratic processes. Indeed, this can be the very purpose intended by the release of disinformation'.[43]

2.44 This was reflected in evidence from the Department of Home Affairs, which noted that 'foreign actors are increasingly using disinformation campaigns in a bid to increase polarisation, reduce trust in government, and foment extremism in democratic societies'.[44]

2.45 Indeed, there was near universal concern among inquiry participants about the proliferation of mis- and disinformation on digital platforms.[45] For example, the ABC stated that the risk of malicious actors using digital platforms to 'share false or misleading information, now increasingly generated by AI, in order to manipulate public opinion is widely acknowledged'.[46]

2.46 Likewise, Reset.Tech Australia listed a 'deteriorating information environment, with upticks in "fringe" and palpably false content, including a rise in AI-generated content with unclear provenance' as a key digital challenge facing Australia.[47]

2.47 In a similar vein, the HTI noted that 'the rise of social media has been accompanied by a rise in mis- and disinformation' fuelled by uncertainty and heightened social anxiety. HTI further contended that generative AI applications have 'provided new avenues to create and spread mis- and disinformation'.[48]

2.48 Ms Elizabeth O'Shea of Digital Rights Watch (DRW) told the committee that digital platforms have incentivised the creation of mis- and disinformation, resulting in the pollution of the information ecosystem by 'content that is at best low quality and at worst deceptive and extremist'.[49]

2.49 Further, the MEAA asserted that digital platforms are threatening news media sustainability 'and distorting public discourse by failing to control misinformation and disinformation on their platforms'. It argued that the reduction in news on Meta's platforms had already resulted in an increase in mis- and disinformation:

One researcher, for example, found that when news was removed from the platform, it was replaced by 'viral content producers' who produced 'misleading or false' information.[50]

Wider community concerns

2.50 In addition to concerns raised by publishers, submitters such as SBS pointed to research showing that concerns about mis- and disinformation are shared by the wider Australian community.[51] For example, both LINA and ACMA referred to the University of Canberra's Digital News Report: Australia, which found that concern about mis- and disinformation in Australia is among the highest in the world,[52] with 75 per cent of Australians concerned about misinformation in 2024. This is an increase of 11 per cent since 2022 and is far higher than the global average of 58 per cent.[53]

2.51 While the Digital News Report: Australia found that this increase had occurred across all demographics, it was particularly noticeable among medium and highly educated participants, those living in cities, and among Generations Z and X (since 2022).[54]

2.52 Likewise, a 2022 survey by ACMA found that 78 per cent of respondents 'agreed or strongly agreed that misinformation is prevalent in Australia', with 'over half of all Australians wanting the government to play a more active role in addressing harmful content'.[55]

2.53 In addition, a June 2024 survey by Squiz Kids found that 98 per cent of parents identified 'exposure to misinformation, deep fakes and biased media as their number one concern for their kids when online'. This was higher than exposure to pornography (76 per cent), and meeting strangers online (55 per cent).[56]

The role of algorithmic recommender systems in promoting mis- and disinformation

2.54 According to Reset.Tech EU, content on digital platforms is 'distributed and presented through algorithmic recommender systems, which usually rely on engagement-based ranking'.[57] As noted by the Department of Home Affairs, these systems 'significantly shape our information environment' by 'determining the content that is served to users'.[58]

2.55 Accordingly, numerous participants pointed to algorithmic recommender systems as key drivers of mis- and disinformation.[59] For example, SBS argued that the exponential growth in online mis- and disinformation was being 'fuelled by algorithms which favour polarisation' and the impact of AI, which 'can generate and publish misleading information and/or content from extreme viewpoints'.[60]

2.56 The Department of Home Affairs also noted how malicious actors can 'use a network of fictitious personas' to manipulate recommender algorithms, with disinformation often featuring in such campaigns.

2.57 AMRC emphasised the 'systematic bias' of digital platforms 'towards content that prompts an intensely negative emotional experience'—noting that the strong affective impact of misinformation means that it spreads six times faster than factual content.[61]

2.58 Indeed, HTI concurred and explained how algorithms function to increase user engagement with content, meaning they often prioritise highly emotional, divisive or controversial content at the expense of factual information:

The algorithms that sort content on social media platforms tend to prioritise this kind of content and are capable of sharing it rapidly and at great scale with audiences, often at the expense of truthful content. As a result, factual, balanced reporting can be deprioritised or drowned out, while false or misleading news likely to elicit emotion is frequently bumped up and reshared.[62]

2.59 This de-prioritisation of reputable news sources was also highlighted by Man of Many, which noted its role in creating echo chambers, 'further entrenching misinformation and reducing critical engagement with diverse viewpoints'.[63]

2.60 As a result, the HTI argued that 'Australians today are less likely to benefit from a shared understanding of relevant public information which was more characteristic of the age of legacy media'.[64]

2.61 CPS Per Capita highlighted the commercial imperatives driving platforms' amplification of provocative and divisive content:

Digital platforms like YouTube, Facebook, X and TikTok are also underpinned by commercial models which, above all else, seek to monetise our attentions and interactions, trapping Australians in algorithmically-induced content 'holes' ...[65]

2.62 Further, Digital Rights Watch (DRW) highlighted the perversity of revenue sharing schemes that incentivise the creation and sharing of viral content, citing the example of users with Premium X accounts who made money sharing both Islamophobic and anti-Semitic misinformation in the wake of the Bondi Junction attack in April 2024.[66]

2.63 In addition, some submitters argued that the convergence of platforms' recommender systems and the growing use of AI could supercharge existing disinformation challenges. For example, DRW argued that:

The impact of personalised disinformation campaigns is likely to be exacerbated in the future by developments in generative AI, which promises to deliver increasingly granular forms of content customisation at scale.[67]

2.64 At the same time, the ABC contended that the integration of generative AI with internet search tools is 'likely to make it even more difficult for Australians to both identify the source of information and determine whether or not it is trustworthy'.[68]

2.65 In addition, Free TV stated that mis- and disinformation disseminated via social media platforms is also difficult for third parties—such as Australian regulators—to track, with it being near impossible for third parties to determine the identity of those behind misleading content.[69]

2.66 To this end, The Conversation noted that NewsGuard has to date identified 966 AI-generated news and information websites that rely on digital platforms to reach audiences but operate with little to no human oversight.[70]

Lack of transparency around algorithm-based recommender systems

2.67 Currently, the Australian Code of Practice on Disinformation and Misinformation (ACPDM) requires signatories to provide annual transparency reports about their efforts to address mis- and disinformation on their platforms. Seven of the nine signatories to the ACPDM have also opted in to Outcome 1e, which requires signatories to:

… provide transparency about the use of recommender systems and efforts to provide users with options that relate to content suggested by recommender systems.[71]

2.68 Despite this, the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S) contended that 'there is very little systematic knowledge about how platforms' algorithms, recommender systems and business tactics influence what Australians see (and hear)'.[72]

2.69 Indeed, the committee heard significant concerns about the lack of transparency around algorithm-based recommender systems and the impact this has on our understanding of the role they play in promoting mis- and disinformation.[73]

2.70 For this reason, a number of participants, such as Children and Media Australia, argued for direct regulation of recommender systems, as well as greater transparency around the operation of platforms' algorithms.[74]

2.71 However, according to the eSafety Commissioner, Mrs Julie Inman Grant, the industry is becoming more opaque, rather than more transparent:

[The industry is] pushing back against transparency reports and the ability for third-party researchers, academics and regulators to be able to measure toxicity, for example, on their platforms through scraping and access to APIs and data hoses and the like.[75]

2.72 Similarly, USYD MECO described how research to understand the influence of recommender systems is currently 'hampered by limited access to the inner workings of these systems, relying on indirect methods such as observation, data donations and external experimentation, which can only provide a partial view'.[76]

2.73 To this end, Mrs Inman Grant suggested that Australia would likely need regulatory powers like those in the European Union's (EU) Digital Services Act in order to facilitate third-party research:

They've set up something called the ECAT in Spain, which is basically a group of data scientists and technology specialists who are looking under the hood. They're looking at how the algorithms and the recommender systems are working so that they can validate what the companies are telling them through their transparency reports. I think we're going to need to move in that direction, but we need to have the right people, processes and systems built to be able to do that effectively.[77]

2.74 However, Dr Rys Farthing from Reset.Tech Australia told the committee that the issue was less about the transparency of the algorithms being used than about the complexity of the code and understanding its outcomes. Dr Farthing pointed to the importance of platforms being accountable and having a duty of care for their users, which should include detailed risk assessment requirements that:

… require platforms to identify their risks, whether it's the risk of persuasive design elements producing addictive-like behaviours, the risk in the way they create their settings that create grooming risks for young people or the risks of misinformation for adults for electoral threats.[78]

2.75 According to Dr Farthing, risk assessment and mitigation frameworks in Europe have had some success in mitigating the risks of platforms' recommender systems. However, Dr Farthing also highlighted the need to look beyond algorithms and 'place safety expectations and … requirements' around the broader suite of systems and processes used by platforms:

… I think it misses a trick to try and improve algorithms without also requiring similar improvements to all of the systems and processes that platforms deploy, whether it's content moderation systems, ad approval systems or the privacy systems underpinning the platform. I think the metaphor is like if you've got a busted road—fixing algorithms is like fixing the broken traffic lights but not looking at the potholes, the lack of lighting or the road markings. You need all of them to give you a safe drive.[79]

The impact of online mis- and disinformation

2.76 According to DRW, research has shown the effectiveness of mis- and disinformation in 'entrenching and enhancing pre-existing beliefs'. This effect is amplified on digital platforms, where the design of algorithms—which promote content in which users have already shown interest—can reinforce users' confirmation bias towards online mis- and disinformation.[80]

2.77 Unsurprisingly, the committee heard widespread concern about the impact of online mis- and disinformation.[81] For example, Ms O'Loughlin told the committee that ACMA's 2023 survey had found 88 per cent of respondents 'either agreed or strongly agreed that misinformation is generally harmful to individuals, groups or society'.[82]

2.78 This was also reflected in evidence from Cancer Council Australia (Cancer Council), which argued that online mis- and disinformation compounds health inequities 'by making it even harder for people, especially those with lower levels of health and media literacy, to make informed decisions'. The Cancer Council pointed to the 'devastating' consequences of misinformation on people with cancer who 'delay or refuse evidence-based treatment' or opt for alternative, unproven interventions. Worryingly, the Cancer Council reported:

Nearly one-third of popular social media cancer content contained misinformation and the majority (76.9%) of these contained harmful and incorrect information. Most concerning, among the most popular cancer articles on Facebook, articles containing misinformation and harmful information received statistically significantly more online engagement.[83]

2.79More broadly, the PIJI cited research in the United States that concluded that the risks posed by mis- and disinformation are 'not primarily technological, but institutional and cultural, that is to say, political'.[84]

2.80According to the PIJI, these risks are being exacerbated by market failure that is leaving a growing vacuum to be filled by 'destructive narratives' that impact on trust in institutions and governments:

As the number of producers of public interest journalism contract, the threat posed by online misinformation and disinformation increases; not simply because 'fake news' goes unchallenged by 'real news', but because as public interest journalism retreats, the narrative of strong institutions and accountable government that it supports retreats with it. Destructive narratives can fill that vacuum, motivated by vested interests and partisan political considerations. Those narratives provide the fertile ground in which misinformation and disinformation can take root.[85]

2.81This was highlighted in evidence from ReachOut, Beyond Blue and the Black Dog Institute, which pointed to the role of algorithms, disinformation and harmful content in undermining social cohesion, and creating 'echo chambers that seed and reinforce attitudes that can deepen divisions in society'.[86]

2.82At the most extreme level, this has the potential to result in 'deepening national security threats of ideologically motivated extremism, with intensifying links to content recommender systems'—which was identified by Reset.Tech Australia as one of the key digital challenges facing Australia.[87]

2.83Further, several participants linked the spread of mis- and disinformation on digital platforms to an erosion of public trust and confidence in authoritative statements of fact, events and institutions.[88] According to DITRDCA, sowing distrust is the aim of much mis- and disinformation:

The aim of spreading mis- and disinformation is to pollute the information environment and sow distrust in government and civil institutions, such as science, journalism and education, and, at its worst, discredit democratic processes. Globally, there are examples of how mis- and disinformation have caused extreme distress among sectors of a society and led to the breakdown of trust within communities, leading to social and public disorder.[89]

2.84The HTI noted that this decline in trust has occurred even when these statements are 'supported by clear evidence, expertise or traditionally reliable sources'.[90]

2.85According to Mr Lewis, the success of mis- and disinformation in sowing distrust was enhanced in situations where 'calculated attempts to undermine a fact base' interact with real community concerns.[91] For example, Mr Lewis pointed to the disinformation driving scepticism about the transition to renewable energy, which is attached to very real concerns about 'large-scale developments that are poorly planned, poorly managed and poorly communicated to communities'.[92]

2.86Other submitters, such as Schwartz Media, also pointed to the role of mis- and disinformation in driving mistrust of legitimate news and fuelling news avoidance.[93]

2.87A similar view was shared by AMRC, which noted that almost 70 per cent of Australians deliberately avoid engaging with news, with half of Australians believing that 'journalists are dividing Australian society'. AMRC also expressed concern that the loss of trust in news is greatest among those who 'need to believe in the press the most for democracy to function':

News avoidance is most pervasive in those communities in Australia most disconnected and affected. For example, 80% of Australians who report 'doing it tough' in the cost-of-living crisis report avoiding the news, compared to just 59% of those who are well-off. Without faith in traditional information structures, they are already turning to 'alternative' explanations for their social and material difficulties.[94]

2.88Further, AMRC stressed that the rise of misinformation represents 'an expectation of manipulation—that is that there are such high levels of mistrust that people expect the news is going to manipulate them'. It went on to describe the impact of this as 'poisonous for democracy':

In a democracy, deliberation over public affairs occurs under the assumption that your counterparts are good-faith, rational actors with broadly accurate beliefs about the nature of reality. Where there is vast mistrust and expectation of manipulation, those who believe mis- and disinformation believe that they are not susceptible to false information. In turn, this shuts off political communication and civil debate. Polarisation and siloing become the norm ...[95]

2.89The impact on democratic systems was also underlined by DRW, which noted that the business models of digital platforms—which incentivise and amplify mis- and disinformation—'undermine our capacity to empathise and compromise across social, political and cultural divides, which are essential features of a functioning democracy'.[96]

2.90The HTI went on to warn that a failure to reverse this trend would likely lead to 'further erosion of the quality of our public discourse, and a loss of legitimacy for our democratic institutions'.[97]

This means that digital platforms are 'not the right hosts of our civic lives'

2.91While some inquiry participants highlighted the role social media can play in fostering engagement and collaboration, and driving positive social change,[98] Dr Keith Heggart and others also highlighted its failure to deliver on its early 'democratic promise':

… while it is true that anyone with an internet connection and a device can now lay claim to being a journalist, rather than this democratising knowledge and improving the state of civil debate, it has instead proven to be fertile ground for the spread of misinformation and disinformation.[99]

2.92Further, in response to claims that digital platforms operate as a virtual town square—allowing people to share news and engage in political debate and advocacy—the HTI argued that the commercial nature of the platforms and their lack of neutrality undermines access to verifiable information:

… unlike the physical public square, digital platforms, in their current form, are not 'neutral' platforms. Their AI-based business model shapes our access to information, and the debates we have in new and sometimes insidious ways. Fundamentally, digital platforms have undermined public access to verifiable information about civil and political issues.[100]

2.93A similar point was also made by Mr Lewis who stated that 'our public square has become a private mall', with digital platforms connecting users via a business model that centres on 'extracting and exploiting our attention'. Mr Lewis went on to describe this as 'a fundamental chink in the armour when building a society that is based around … collective decisions and finding points of connection'.[101]

2.94Mr Shelton of Broadsheet Media agreed and argued that unlike the locally governed town squares of old, digital platforms are instead 'unaccountable fiefdoms run by global corporations with obvious little regard for the health of Australian society and people'.[102] A similar view was expressed by Ms Lisa Watts of The Conversation, who described Facebook as 'an American company that is about making profit. They're not interested in social cohesion or public good'.[103]

2.95Likewise, DRW emphasised that 'collective concerns such as the public interest, human rights, community responsibility and upholding democracy struggle to compete with the profit motive, and in practice are not prioritised by commercial social media platforms'.[104] A similar view was expressed by former Australian Senator, Mr Rex Patrick, who reflected that 'in the conflict between public good and money in [Facebook's] pocket, the conflict is always resolved in favour of money in their pocket'.[105]

2.96For this reason, the CPS Per Capita argued that digital platforms are 'not the right hosts of our civic lives' and described their operational models as:

… the antithesis of what healthy civic participation requires – shared spaces to facilitate considered debate and deliberation, consensus and bridge building, respect and representation for all citizens, a healthy public square powered by facts, reason and truth.[106]

2.97In response to the above, some participants called for a renewed focus on the responsibilities of digital platforms and their social licence to operate. For example, Mr Shelton expressed the view that Australia 'must insist on a social licence for digital platforms with a commitment to contribute positively to Australian society'.[107]

2.98Likewise, Mr Miller floated the idea of a social licence tied to stronger regulation, including:

… a set of laws with both new, partly accelerated laws and existing laws. It'd cover things like algorithms, making the platforms liable for the content that is amplified, curated and controlled by their recommender engines; customer complaints so there is a recourse for the things that Australian telcos and banks have to meet minimum standards on; and a license fee for access to Australian people.[108]

2.99Mr Schreyer went further in describing the responsibility of platforms like Meta to 'do what is right':

It is one thing to rip billions from our economy but the social, mental and physical damage caused by what is allowed to prosper on Meta's platforms takes things to another level. Meta long ago ceased to be just a provider of social media platforms. Once upon a time Facebook was a place for pet videos, holiday snaps, family updates and social banter but now the beast has evolved to a point where the US Surgeon General this week called for social media to have warnings attached to it over mental health concerns for users. It has evolved into an antisocial entity that has provided a haven for toxicity, fake news scams, blackmail, cyberbullying, doxing, revenge porn, trolling, deepfakes, political interference, surveillance capitalism, and the spread of mis- and disinformation. This has caused so much damage within our communities. The live streaming of massacres, images of unrealistic so-called beautiful people and conspiracies are also part and parcel of social media today. It is one of the great paradoxes of our time that, rather than tackle these unacceptable elements and accept responsibility for the damage they have caused by providing habitat for such a scornful matter, Meta has instead opted to diminish the presence of real news and of truth. As a corporate citizen Meta has a responsibility to do what is right, just as we publishers do.[109]

The importance of public interest journalism in countering mis- and disinformation

2.100According to the PIJI, one of the main barriers to be overcome is the Australian media's declining ability to 'produce sustainable, reliable public interest journalism that blunts the impact of misinformation and disinformation':

As the number of producers of public interest journalism contract, the threat posed by online misinformation and disinformation increases; not simply because 'fake news' goes unchallenged by 'real news', but because as public interest journalism retreats, the narrative of strong institutions and accountable government that it supports retreats with it. Destructive narratives can fill that vacuum, motivated by vested interests and partisan political considerations. Those narratives provide the fertile ground in which misinformation and disinformation can take root.[110]

2.101PIJI has defined public interest journalism as 'original content that records, reports or investigates' issues of public significance for Australians, issues relevant to engaging Australians in public debate and informing democratic decision making, and content which relates to community and local events.[111]

2.102The benefits of public interest journalism were captured in the News Media Assistance Program Consultation Paper published by the DITRDCA (see Box 2.2).

2.103Numerous participants highlighted the importance of public interest journalism to building an informed population and a functioning democracy.[112] For example, in highlighting the importance of a sustainable news media sector, Free TV argued that:

Robust, accountable and independent public interest journalism educates and informs citizens, holds power to account and is an essential component of a well-functioning democracy.[113]

2.104Similarly, PIJI underscored the reliance of liberal democracies on public interest journalism to provide 'a shared baseline of agreed facts about our society' and demand 'transparency and accountability from those in power'.[114]

2.105The USYD MECO elaborated:

News media are critical to shaping the ideas, values and beliefs that shape a culture, and particularly the ideologies or 'mental maps' through which social reality is collectively perceived and acted upon. As communications scholar Michael Schudson has argued, 'The world will survive without a lot of the journalism we have today, but the absence of some kinds of journalism would be devastating to the prospects for building a good society, notably a good democratic political system'.[115]

2.106Unsurprisingly, a range of participants, including the Australian Associated Press (AAP), described access to public interest journalism as a 'counterpoint to the mis- and disinformation narratives that seek to undermine social cohesion and democratic norms in Australia by targeting vulnerable demographics, institutions like science or medicine, or elements of democracy'.[116]

2.107However, submitters such as the ABC were careful to differentiate between preventing the spread of mis- and disinformation—which falls largely to platforms and government regulation—and reducing the effects of mis- and disinformation, which journalism is able to do.[117]

Box 2.2 Benefits of public interest journalism[118]

Informed democratic participation – informed citizens contribute to proper functioning of democracy by improving the likelihood that outcomes will reflect community expectations and preferences, helping to dispel and discourage misleading practices, and better aligning expectations with outcomes.

Informed public administration and policy – informed decision-makers are better able to respond to and reflect community expectations. News and journalism also provide citizens with a platform for advocacy to government and public activism.

Trusted and accountable institutions – scrutiny and accountability deliver greater compliance and accountability for all sectors of society, and government that is fairer, more transparent and more responsive. In turn, this 'watchdog' function may also increase trust in democracy and public institutions.

Tested and shared ideas – the sharing and testing of ideas can contribute to better informed decision-making and the selection of more robust ideas. In turn, respectful and informed debate may also contribute to greater trust in democracy, public institutions and the media.

Greater inclusion and social welfare – news and journalism can contribute to social welfare and cohesion through the portrayal or representation of diverse social groups, provision of information about social issues, advocacy for social or political causes, or the provision of a platform for advocacy by others.

Informed decision making by individuals – news and journalism inform the private decisions made by individuals, including in relation to social, financial and health decisions.

2.108While the importance of public interest journalism was not disputed, its ability to counter online mis- and disinformation was seen to depend to a large degree on its accessibility and quality, as well as the media literacy of audiences. Views were also mixed in relation to the role fact checking plays in countering mis- and disinformation.

Accessibility of public interest journalism

2.109As stated in the DITRDCA's News Media Assistance Program Consultation Paper, news content 'provides no benefit if Australians cannot access it, or otherwise glean its informational content'.[119]

2.110As detailed earlier in this chapter, numerous inquiry participants commented on the high degree of control digital platforms exercise over the content seen on their platforms.[120]

2.111In relation to news accessibility, Mr Shelton described how platforms amplify harmful content to maximise engagement, while 'legitimate journalism is deemed "complicated" and deliberately suppressed by the platforms'.[121]

2.112This point was reinforced by the ABC, which noted that Meta 'has made no secret' about adjusting its algorithms to favour content that attracts interactions (such as comments) and to deprioritise politics-related content. According to the ABC, this reduced aggregate Facebook traffic to news and media properties by 48 per cent.[122]

2.113Similarly, The Conversation described a 'significant' decline in traffic of 40 per cent following an algorithmic change by Meta in May 2024,[123] meaning that fewer Australians now see 'evidenced based, trusted news and analysis' on the digital platforms where they are spending considerable time.[124]

2.114A number of submitters also highlighted the growing problem of 'news deserts', where there are no local media outlets.[125] For example, Dr Carson spoke about PIJI's mapping of news deserts in Australia, which found they are getting larger, with regional areas 'particularly hard hit'.[126]

2.115The importance of local news in countering mis- and disinformation was noted by LINA, which referred to research showing that the loss of local news on Facebook—following its news ban—'had "profound consequences for Canadians" during wildfires that spread through Northern Canada', such as confusion and misinformation about evacuation centres, financial compensation and the progression of fires, all exacerbated by the absence of local news.[127]

Quality of public interest journalism

2.116The quality of public interest journalism was also considered important to its effectiveness in countering online misinformation. For example, the Department of Home Affairs pointed to the importance of 'quality and trustworthy media reporting' to 'building resilience to mis- and disinformation that can exacerbate community tensions and threaten social cohesion, especially during crises'.[128]

2.117Various participants made a link between journalistic standards, media diversity and the quality of the news media in Australia.[129]

Journalistic standards

2.118Many submitters noted the complex interplay between social media and traditional outlets in relation to media standards and the spread of mis- and disinformation. For example, DRW argued that the spread of misinformation was not purely a social media problem but was 'inextricably connected to problems in traditional broadcast (mainstream) media':

We note that while Australian journalism can play an important role in countering mis- and disinformation on digital platforms, there are also many news outlets that actively contribute to the creation and dissemination of sensationalist, misleading and divisive material.[130]

2.119Similarly, submitters such as Mr Calcutt PSM pointed to the role social media has played in increasing sensationalist and divisive content in the mainstream media:

Conflict, shock, and fear have always been staple elements of the mainstream media diet. But in the face of a ubiquitous social media, it has been forced to increasingly turn to sensationalism and populism to retain market share.[131]

2.120Further, the AMRC submitted that financial considerations have impacted standards and constrained the ability of media outlets to produce local interest stories, long-form content and other 'commercially unreliable news products':

The freedom of both editors and journalists to cover stories which serve a public interest function without being commercially viable has narrowed. Investigative, complex, and long form reporting have all decreased. Reporting on local government, health issues, science, community events and community issues has decreased significantly – not just in the absolute number of articles, but also in the proportion of space in publications devoted to these matters.[132]

2.121This was reinforced by the MEAA, which stated that falling revenues and workforce cutbacks 'have severely affected the news media's abilities to fully scrutinise and report upon matters of considerable public interest'. According to the MEAA, 'this can be seen in diminished news reporting in crucial areas—from local council decisions to major state and federal government projects and corporate business dealings'.[133]

2.122In addition, Free TV noted the role that media coverage can play in amplifying 'highly emotive and engaging posts within small online conspiracy groups'.[134] As an example of this, the Digital Industry Group Inc. (DIGI) pointed to well-known instances of news outlets amplifying false claims made on social media—most recently in relation to the 'repetition of defamatory claims on traditional media in the aftermath of the Bondi Junction attacks'.[135]

2.123Likewise, Dr Carson referred to research that showed the role of mainstream media in spreading misinformation about a 'death tax' during the 2019 election:

This is something that we saw firsthand in a study on the so-called 'death tax' during the 2019 election. The amount of disinformation around the death tax moved from the online environment back into the offline environment with news stories about it. In 53 per cent of cases, we found that mainstream news media were not disabusing the public of the fact that that was not ALP policy at the time.[136]

2.124This point was reinforced by the ADM+S, which observed that 'the practice of sourcing stories from social media, including by reporting on viral TikTok videos or Twitter controversies, is a key source of this amplification'.[137]

2.125To this end, Ms O'Loughlin told the committee that there were actions traditional media could take, including applying existing codes of practice to their broadcast video on demand service—as SBS has done—and considering whether the terms of those codes adequately cover mis- and disinformation:

We've been discussing with the commercial—and, indeed, national—broadcasters why they don't extend their current codes, which include accuracy and impartiality, onto their digital services. That's a continuing conversation. We think that the broadcasters should also turn their minds to whether or not accuracy and impartiality—those sorts of traditional terms—capture what we see in misinformation and disinformation, and that's an ongoing conversation we're having with them.[138]

2.126Further, Ms Anna Draffin of PIJI highlighted her organisation's work in relation to integrity measures and argued that expectations around accountability and a social licence to operate should apply equally to digital platforms and news media organisations:

It's for all players that are involved in the ecosystem and that have a social licence to operate. Certainly, as part of PIJI's next-stage work, we're looking at areas around integrity measures et cetera as part of bringing in greater alignment between news consumption and news production, particularly in an era of genAI, where finding trustworthy is going to become more difficult. We'll all be deluged with synthetic content at speed and scale, so how do we actually (a) deal with the media literacy part but (b) decide what the indicators are that we need to have in place to demonstrate a source of public interest journalism?[139]

Media diversity

2.127Multiple participants highlighted the importance of media diversity to reducing the spread of mis- and disinformation.[140] LINA defined diversity both in terms of 'the range and relevance of information available to audiences; and the range of voices and perspectives represented in the media landscape'.[141]

2.128According to the ACMA, Australia's media diversity policies aim to encourage a diversity of news and information across the media market and prevent 'any single media voice from exerting unacceptable levels of influence over public discourse'. Australia's media diversity rules are set out in the Broadcasting Services Act 1992 (Broadcasting Services Act), which regulates the number of media 'voices' in a market.[142]

2.129However, the concentration of media ownership in Australia was viewed by some participants as impacting negatively on the quality and sustainability of journalism in Australia and the level of trust in the media.[143] For example, Private Media contended that, by one measure, Australia is the 'third worst market globally for media diversity', with only China and Egypt having 'more monopolistic media markets'.[144]

2.130According to Capital Brief, this means that the dominance of News Corp, Nine Entertainment and Seven West Media has created a situation in which 'a small group of editors can dictate which stories are told and how the narrative takes shape'. Capital Brief argued that 'this concentration of editorial decision-making inhibits the ability of Australian journalism to effectively counter the mis- and disinformation present on social media platforms'.[145]

2.131The role of digital platforms was also highlighted by AMRC, which argued that both deregulation and the emergence of digital platforms had impacted on the concentration of media ownership in Australia, with flow on effects in terms of 'trust in the media, polarisation, misinformation, and widespread news avoidance'.[146]

2.132Similarly, submitters such as LINA asserted that small publishers are being disproportionately impacted by digital platforms, whose algorithms are 'throttling' news content and 'constraining traffic and revenue in places where revenue is already scarce, further compounding pressures on media diversity in Australia'.[147]

2.133To this end, the ACMA noted weaknesses in the current measures to address media diversity in Australia and indicated that it is implementing a new Media Diversity Measurement Framework (Diversity Framework), which will 'present a more complete picture of the contemporary news landscape'.[148] Once fully implemented (over four years), the framework will allow:

… levels of diversity across Australia's print, radio, TV and online media to be monitored. … this work will inform and support the government's broader News Media Assistance Program (News MAP) initiative, providing a robust evidence base to help inform advice to government and policymakers …[149]

2.134According to LINA, policies that support media diversity—particularly local news publishers—can help sustain quality reporting and reduce the impact of mis- and disinformation:

Media diversity provides audiences with reporting that meets editorial standards and draws from a range of sources, from which individuals can make their own decisions on any given topic. In relation to local and independent news, media diversity provides an opportunity to draw on local expertise, deep subject matter knowledge and community connections to include perspectives in news reporting that would not otherwise be shared.[150]

2.135In her evidence to the committee, Ms Claire Stuchbery explained how local journalists, who are embedded in their communities, are 'uniquely positioned to identify mis- and disinformation circulating within their community when it starts to gather momentum'.[151]

Media literacy

2.136According to Dr Tanya Notley, media literacy refers to 'the ability to apply critical thinking to digital and non-digital media through analysis, evaluation and reflection', a skill which she described as 'essential for full participation in society' and important to avoiding misinformation:

Multiple studies have demonstrated that increasing people's capacity for critical thinking in relation to their media use increases their ability to detect and avoid misinformation and avoid scams and other media harms. Media literacy initiatives can also empower citizens to become competent and responsible media producers and consumers.[152]

2.137A range of participants highlighted the importance of media literacy in allowing Australians to critically engage with news and assess the reliability of online content.[153] For example, Ms Stuchbery stressed the importance of audiences understanding 'what a verifiable source of information is and how to recognise trustworthy news'.[154] Similarly, SBS described media literacy as the key to 'identifying "fake news" and mis- and disinformation'.[155]

2.138Concerningly, AMRC reported that 'less than half of Australian adults can confidently identify misinformation online'.[156] In addition, the Australian Library and Information Association (ALIA) pointed to research by the Australian Media Literacy Alliance (AMLA), which found that only 56 per cent of Australians feel confident they can find information online and only 39 per cent of adult Australians say they can check if online information is true.[157]

2.139AMLA highlighted low levels of media literacy among young Australians, with only 41 per cent believing they know how to identify fake news.[158]

2.140According to the University of Canberra's 2024 Digital News Report: Australia, participants found it hardest to identify untrustworthy information on TikTok (34 per cent) and X (32 per cent), followed by Facebook and Instagram (both 26 per cent).[159]

2.141While there was a significant focus on media literacy for young Australians, Dr Carson underscored its importance for older Australians, with studies showing they are often the most taken in by mis- and disinformation.[160]

2.142This was borne out by evidence from ALIA, which cited lower media literacy levels for particular cohorts of Australians, including those aged 75 and older and those in regional Australia.[161]

The role of fact checking in countering mis- and disinformation

2.143The important role of fact checking in countering mis- and disinformation was emphasised by a range of participants. For example, ADM+S described fact checking as a 'critical function' and suggested that fact checking organisations have 'more of a direct impact on limiting the circulation of mis/disinformation than the general provision of news and public interest media'.[162]

2.144Similarly, AAP told the committee that its fact checking work has 'a direct and tangible impact on the circulation of disinformation'. As a third-party fact checker for Meta and TikTok, AAP indicated that it publishes dozens of debunk articles to Meta and provides recommendations to TikTok on content containing potential mis- and disinformation. AAP also noted that the impact of its work is 'exponentially increased by Meta's use of AI technology to automatically label mis/disinformation similar to that already debunked by fact-checkers'.[163]

2.145According to AAP, this work can result in the downranking or removal of content by these platforms. However, AAP also stressed the limitations of its influence, with any interventions being 'determined and applied solely by the platforms'.[164]

2.146Some submitters, such as Reset.Tech Australia, questioned the effectiveness of digital platforms' responses to third-party fact checking:

Like other platforms, Meta only label or remove the exact post that fact checkers address and near-identical copies. Posts that repeat the same falsehoods, attach images or swap the order of words remain unaffected even when reported.[165]

2.147Schwartz Media also argued that 'when advertising money is involved, [Meta] will freely publish and promote factually incorrect claims that other media businesses refuse to run'. As an example, Schwartz Media pointed to Facebook's refusal to take down an ad by former President Donald Trump's 2020 re-election campaign—even though CNN refused to run the ad because various fact-checkers had shown it to be demonstrably false.[166]

2.148Despite the value of fact checking, many participants cautioned against an overreliance on it to counter online mis- and disinformation. For example, Dr Carson agreed that fact checking has a role in identifying—and potentially moderating the impact of—disinformation, but she also noted that it did not stop the spread of disinformation.[167]

2.149Further, DRW described fact checking and content removal as 'well-intentioned and useful' surface solutions, but argued that they lack the scalability required to deal with the 'industrial levels of low quality and polarising content that pollute our information ecosystem'.[168] A similar point was made by SBS, which noted that the volume of mis- and disinformation means 'fact-checking and verification of information takes longer and requires more resources'.[169] To this end, ADM+S pointed out that fact checking organisations are 'mostly dependent on funding from digital platforms and donations for revenue'.[170]

2.150Likewise, Reset.Tech Australia acknowledged the 'crucial role' that fact checkers play but asserted that Australia's 'uniquely small and brittle' fact checking system faces a number of challenges, including its small size, the risk posed by perceptions of bias and a lack of trust, and the increased workload arising from the use of generative AI.[171]

2.151Reset.Tech Australia also highlighted a new challenge to the effectiveness of fact-checking—where bad actors explicitly attempt to overload Australian fact checkers as part of bad-faith attacks designed 'to disrupt and undermine trust in institutions'.[172]

2.152This was reflected in evidence from the Department of Home Affairs, which described how 'Australian outlets were among more than 800 organisations around the world recently targeted by a pro-Russian campaign aimed at clogging fact-checking systems in newsrooms'.[173]

2.153Given the impossibility of fact checking all claims, the ABC underscored the importance of improving media literacy. It argued that 'by developing the awareness and skills of audiences, the ABC can try to inoculate them against mis- and disinformation'.[174]

Regulation of mis- and disinformation on digital platforms

2.154Participants provided extensive evidence in relation to the regulation of mis- and disinformation on digital platforms. This included views on:

the effectiveness of current regulatory approaches to mis- and disinformation—including the operation of the ACPDM;

the different standards that apply to media organisations and digital platforms; and

jurisdictional challenges involved in regulating digital platforms.

Effectiveness of current regulatory approaches to mis- and disinformation

2.155Since 2021, Australia has employed an industry-led self-regulatory approach to address the risks of misinformation and disinformation on digital platforms. This is enacted primarily via the industry-led ACPDM.[175]

2.156The committee heard that other legislation affecting digital platforms also has an emphasis on self-regulation. For example, according to USYD MECO, under the Online Safety Act 2021 (which includes mechanisms to remove abusive and harmful online content), the powers of the eSafety Commissioner are intended only as 'a backstop and legislator of last resort'.[176]

2.157Overall, there was a view that these self-regulatory approaches are insufficient to address the digital threats and challenges facing Australia, including the spread and impact of online mis- and disinformation.[177] For example, the HRLC submitted that 'reliance on rules written by the tech industry itself has led to demonstrably weaker protections against harm and is clearly at odds with the expectations of the community'.[178]

2.158Indeed, the HRLC suggested that self-regulation and co-regulation are inappropriate for a powerful and high-risk sector where platforms' commercial imperatives and the public interest may be in conflict. The HRLC also argued that effective regulation requires the ability to 'impose and enforce measures to curb risks'—noting that digital platforms are 'flouting existing online safety rules and dismissing the imposition of even quite modest fines or penalties'.[179]

2.159In a similar vein, USYD MECO drew the committee's attention to the example of X (formerly Twitter), which removed video of the Wakeley church stabbing from its Australian site but 'rejected the call to apply a global takedown' and 'successfully contested the takedown notice in the courts'. According to USYD MECO, this demonstrates the limitations of the light-touch approach:

X's preparedness to contest eSafety Commissioner directives in the courts, as well as its previous withdrawal from the [ACPDM] … this 'soft law' based approach, reliant as it is upon industry goodwill as an alternative to sanctions and coercion.[180]

2.160Further, USYD MECO suggested that previous arguments made by digital platforms about the need for self-regulatory approaches have been weakened by repeated scandals and heightened scrutiny. For example, the Cambridge Analytica scandal and the widespread circulation of misinformation during political campaigns raised concerns about the ability of platforms to 'adequately safeguard users and administer their platforms in the public interest'.[181]

2.161More recently, some platforms also acknowledged the limits of self-regulation:

Facebook (now Meta) CEO Mark Zuckerberg famously conceded before the U.S. Congress that his company ultimately bore responsibility for the content on its platform, and that the question was not whether social media companies should be regulated, but how such regulation should occur.[182]

2.162The limits of self-regulation were also recognised by Ms O'Loughlin, who noted ACMA's advice to government about the need for a mechanism to 'bring parties to the table if they didn't sign up and effectively operate under a self-regulatory code or if they bunked out of that self-regulatory code completely' (as seen with the removal of X from the ACPDM):

You can't just have major players who say, 'We're not going to,' or an industry association who finds that there is a problem with one of the signatories and removes them from the code, and then they're not covered by anything. … You can't leave that empty space. There has to be something, some way of applying rules to those types of players.[183]

2.163While Reset.Tech Australia noted Australia's 'proud history as a "first mover" and innovator in digital platform regulation', it argued that Australia has now 'slipped behind' other jurisdictions:

New risks, driven by increasingly powerful algorithms and an explosion of data harvesting, have now surpassed the ability of existing digital regulatory frameworks to effectively manage them. Australia is not alone in facing these risks, but other countries are now making substantial progress, in particular the UK and the EU, with emerging progress in Canada. These jurisdictions have drawn upon the innovations and exemplars of Australian policy but introduced more comprehensive, preventative and muscular regulatory models. These models encourage platform conduct that ensures user safety and is more commensurate with public expectations for digital regulation more broadly. By contrast, Australia is still largely reliant on a hopeful but outdated desire for industry-led and largely self-regulated processes.[184]

2.164Reset.Tech Australia also told the committee that over the last decade, it has become clear to governments worldwide that:

voluntary or co-regulatory schemes do not provide sufficient protections as they can be ignored by platforms as a 'reputational risk approach' is not sufficient incentive to safeguard the public interest; and

legislation and fine regimes are vulnerable to being ignored by platforms if they are not seen as significant.[185]

2.165USYD MECO also reflected on the worldwide movement toward greater regulation, following initial light-touch approaches:

The early history of platforms such as Facebook, Twitter/X, YouTube and TikTok was marked by a strong orientation towards light-touch moderation that dealt primarily with issues such as copyright infringement and clearly illegal, threatening, or defamatory content. In recent years, conversations have increasingly turned towards the question of online safety and harms.[186]

2.166This was reflected in evidence from the ABC, which noted that 'governments around the world are moving to regulate technology platforms and social media spaces more effectively'. As an example, it cited the introduction of the Digital Services Act, the Digital Markets Act, and the Artificial Intelligence Act in the European Union. According to the ABC, these measures 'oblige online and social media platforms to actively fight disinformation, ensure transparency in relation to the algorithms used to prioritise content, and warrant online safety'.

2.167A similar view was expressed by the HRLC, which argued that 'regulator-drafted industry standards should be the norm' and noted that introduction of the Digital Services Act in the EU was 'driven by growing recognition that self- and co-regulatory models are inadequate and ineffective'.[187]

2.168To this end, Dr Rys Farthing of Reset.Tech Australia told the committee that in markets where regulation has driven change in the way platforms operate, there appeared to be five building blocks underpinning that change. These were:

measures to drive platform accountability (such as a duty of care);

detailed risk assessment requirements for platforms;

strong, proactive risk mitigation measures by platforms;

strong transparency requirements; and

strong enforcement measures.[188]

2.169The 'legitimate difficulty' of regulating to limit mis- and disinformation was recognised by AMRC, which pointed to the primacy of free speech as a democratic principle.[189] However, AMRC also noted how the changes wrought by digital platforms are necessarily challenging the way we think about free speech:

… the original principle that free speech should be defended on the basis that truth will prevail in the marketplace of ideas is outdated, when, in the modern media landscape, we are drowning in misinformation.[190]

2.170Similarly, USYD MECO described how changes in the media landscape are challenging the effectiveness of 'counter speech' as a remedy against mis- and disinformation. These changes include:

distribution systems which can incentivise fake news over legitimate news;

diminished gatekeeping barriers due to the proliferation of news and information outlets;

data-driven interactivity and personalisation targeting consumers with tailored news;

the rise of partisan news outlets and selective news exposure;

reduced ability (of both content providers and consumers) to distinguish between fake and legitimate news; and

the speed of fake news production, dissemination, and consumption.[191]

2.171Accordingly, USYD MECO contended that regulation of digital platforms would need to consider 'how to balance free speech on social media and the freedom to innovate with the broader freedoms of a liberal society'.[192]

2.172To this end, ACMA highlighted development of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 (misinformation bill). The misinformation bill would provide the ACMA with powers to act where voluntary arrangements are ineffective, as well as powers to enhance transparency about digital platforms' responses to mis- and disinformation on their services.[193]

2.173On 12 September 2024, the misinformation bill was introduced in the House of Representatives.[194] The bill aims to:

empower ACMA to require that digital platform providers take steps to manage the risk posed by mis- and disinformation on their platforms;

increase transparency about the way digital platform providers manage mis- and disinformation; and

empower digital platform users to identify and respond to mis- and disinformation on digital communications platforms.[195]

2.174The bill would impose core obligations on digital platform providers to:

assess risks relating to mis- and disinformation on their platforms, and publicly report on the outcomes of that assessment;

publish their policy or approach to managing mis- and disinformation; and

publish a media literacy plan detailing the measures that will be taken to help users better identify mis- and disinformation.[196]

2.175The bill also contains a range of graduated compliance and enforcement measures, including formal warnings, remedial directions, infringement notices, injunctions, and civil penalties.[197]

2.176While concerns have been raised about the bill's impact on freedom of speech and the potential for censorship, the bill contains a range of strong protections for both privacy and freedom of speech. These include:

a focus on encouraging digital platforms to have robust systems and measures in place to address mis- and disinformation on their services, rather than the ACMA directly regulating individual pieces of content;

ACMA not having the power to request specific content or posts be removed from digital platform services;

the code and standard-making powers under the bill not applying to authorised electoral and referendum content and other types of content such as professional news and satire; and

private messages sent on instant messaging services not being within scope of the powers.[198]

2.177Claims about censorship were also refuted by Mr Sam Kursar of DITRDCA. In particular, Mr Kursar noted the addition of clause 67, which states 'there is nothing in the bill, whether it's the digital platform rules, the codes or the standards, that will require the take-down of content, unless it's disinformation involving inauthentic behaviour'. He continued:

That is a very strong protection and safeguard that has been added to the bill. It includes content such as bots and troll farms from foreign actors, in research institutes abroad, who want to undermine our democratic way of life and cause vilification of groups of people across Australia based on certain attributes, whether it's imminent damage to critical infrastructure. This particular clause will provide that safeguard. What it will mean is that there are a range of other options, when it comes to a code or standard under the bill, which include things like providing authoritative sources of information or links to factual information, the de-monetisation of disinformation and so on.[199]

2.178Mr Kursar also highlighted the strength of the safeguards in the bill, including the high threshold for serious harm, demonstrated by the narrow definitions of harm set out in clause 14:

As I said, we've provided strong safeguards in that you've got to have a very high threshold of serious harm. We have types of harm which are very narrow; they align with our international human rights obligations. When it relates to vilifying a group in Australia or undermining the operation of an election, the government has taken a view that platforms should be held to account in dealing with that kind of content on their servers. Again, we have provided protections in the bill that nothing in the bill will require takedowns unless it is disinformation involving inauthentic behaviour.[200]

2.179The criteria for serious harm, as set out in the bill, were then clarified by Mr Kursar:

…it's got to meet the definitions in clause 13—so it's got to be verifiably false. If it is verifiably false and reasonably likely to cause serious harm…if it is at scale, and if you've got a large number of users. It looks at the context of the environment. It looks at if you've made a claim using an authoritative source that's not correct—so if your content claims it's from the AC and it's not, it's more reasonably likely to cause serious harm. If your serious harm relates to the vilification of a group of Australians, if it's going to cause imminent damage to critical infrastructure, imminent disruption of emergency services and public harm, it's more than likely to be in scope.[201]

2.180Further, Mr Kursar stressed that any potential fines that may be levied in relation to mis- and disinformation would 'apply to the digital platforms and not to everyday Australians'—this bill is 'not about fining an Australian about their social media posts. That's not what this bill does'.[202]

2.181In addition, Mr Kursar indicated that the bill would require a triennial review of the framework, including any potential impact of the bill on freedom of speech and whether there has been over-censorship or overreach.[203]

The ACPDM

2.182The ACPDM requires signatories to 'take action to identify, assess and address misinformation and disinformation on their services'.[204] The ACPDM was developed by DIGI—the industry body for Australia's digital industry—and currently has nine signatories (Adobe, Apple, Google, Meta, Microsoft, Redbubble, TikTok, Twitch and Legitimate).[205]

2.183In relation to combatting mis- and disinformation, DIGI stated that its members:

… share a strong commitment to ensuring the transparency and integrity of Australian democratic political processes, and institutions recognising that as important actors in the information ecosystem, they have a critical role and responsibility in reducing the spread of disinformation and misinformation online.[206]

2.184While recognising that further progress can be made, DIGI highlighted a range of measures it has implemented to strengthen the ACPDM since 2021, including:

appointment of an independent complaints committee to resolve complaints about possible breaches by signatories of their code commitments;

creation of a portal on DIGI's website for the public to raise such complaints;

appointment of an independent reviewer to fact check and attest all signatories' transparency reports;

development of best practice reporting guidelines to drive improvements and consistency in transparency reports;

encouraging greater participation in the code by smaller digital platforms;

updating the definition of 'harm' in relation to mis- and disinformation; and

additional commitments reflecting updates to the strengthened EU Code of Practice in relation to recommender systems, and deterring advertisers from repeatedly placing digital advertisements that propagate mis- and disinformation.[207]

2.185Despite this, some participants raised concerns about the operation of the ACPDM. For example, Free TV highlighted concerns raised by ACMA about the implementation of the ACPDM in a June 2023 report, including:

a lack of public transparency about the measures signatories are taking under the ACPDM and the effectiveness of those measures;

a lack of key performance indicators by which signatories track progress against the aims and objectives of the ACPDM; and

a lack of awareness of signatories' commitments under the ACPDM hindering the complaints function.[208]

2.186This was also reflected in direct evidence from ACMA, which noted improvements in the operation of the ACPDM but suggested that further work was needed, including the development of key performance indicators and urgent improvements to transparency. Ms O'Loughlin also told the committee that ACMA had recommended greater investigative and enforcement powers for itself in relation to the ACPDM:

As I said, we think significant improvements to the [ACPDM] are needed. That's partly why we made recommendations to government around some powers for ourselves, for example greater powers around gathering information from platforms, formal investigative powers, formal information-gathering powers and the ability to step in if the [ACPDM] fails, using our own regulatory tools to create a standard or code which we would oversee.[209]

2.187The importance of enforcement was also highlighted by AMRC, which argued that without enforcement, the ACPDM is 'effectively redundant'.[210] Similarly, Reset.Tech Australia highlighted instances where voluntary requirements have simply been ignored by signatories to the ACPDM:

For example, despite being a signatory to the voluntary [ACPDM], which places clear obligations on platforms to enable end-users to report misinformation, X turned off the ability for Australians to report misinformation two weeks ahead of the Voice referendum.[211]

2.188Enhanced powers for ACMA, as a 'regulatory backstop' to the ACPDM, were also supported by DIGI.[212]

Different standards applying to media organisations and digital platforms

2.189In addition to concerns about the effectiveness of the ACPDM, various submitters highlighted the higher standards that apply to content generated by traditional media organisations versus content appearing on digital platforms.[213] For example, Schwartz Media described how lower standards for social media platforms fuel online mis- and disinformation:

Misinformation is provided a clear path across social platforms as they are exempt from the majority of rules and legislation that govern what domestic media companies can publish, and become a firehose of locally and internationally produced content that is designed only to confuse and mislead.[214]

2.190Likewise, Free TV noted that user-generated content posted on digital platforms 'lacks the robust regulation that applies to our members'.[215]

2.191A similar point was made by We Are Explorers, which stated that digital platforms are not held to the same standards as media organisations 'and in fact encourage sensationalised, out-of-context, or downright false information through their algorithms'.[216]

2.192Mr Andrew Schreyer of Country Press Australia (CPA) argued that while Meta could be considered a publisher, it is not subject to the oversight or controls that apply to CPA members:

… its platforms are not subject to laws, including defamation and contempt of court, editorial standards and regulatory frameworks. Meta's Facebook has little if any checking mechanism prior to content and comments being published. Its algorithms serve up content based on popularity regardless of what may be appropriate for the audience. It has no transparent complaints process, is not held to account in any way and is not required to respond to complaints.[217]

2.193In a similar vein, Mr Michael Miller of News Corp Australasia commented that digital platforms lack accountability and respect for Australian laws:

News Corp journalists and commentators do get criticised for their reporting and exchange of views, but what they do is in a totally accountable, open environment governed by rule of law, and I stand by them. It's entirely different to the world of social media, who prey on women with fake porn, peddle scams and advertisements to rob the elderly, and push violent conspiracy theories with no respect for our laws and no accountability whatsoever.[218]

Jurisdictional challenges involved in regulating digital platforms

2.194The committee heard evidence in relation to the challenge of effectively regulating entities that may not operate from Australia.[219] For example, Mr McDonald of the Department of the Treasury told the committee that while digital platforms may have infrastructure and personnel in Australia, the question of where they operate may not be straightforward from a legal perspective:

… one of the challenges of the digital space is where [platforms] actually operate. Where do social media organisations operate? From a practical point of view, we see them on our phones and think they must be here. … but legally the questions are not quite as straightforward.

Essentially, parliament can pass laws but, if we're not able to enforce them and validly impose sanctions, the question I'm grappling with is how we're able to effectively enforce the laws the parliament passes.[220]

2.195While the approach to regulation in the EU was held up as a potential model for Australian legislators,[221] some participants, including Mr McDonald, noted that platforms had voluntarily agreed to be subject to EU jurisdiction:

If we look through how Europe has been able to bring digital platforms into its regulatory net, you see that they have voluntarily accepted being within their jurisdiction. If we look at some of the jurisprudence that's evolving, we see that it's more contested in other areas.[222]

2.196Similarly, USYD MECO explained that while the 'internet has long been governed to some degree through national laws, … this is made possible by platform companies agreeing to be subject to national jurisdiction, and contesting laws they don't agree with through the political process rather than outright defiance'. However, according to USYD MECO, recent court actions by X in Australia had shown 'a willingness to test the boundaries of nation-state authority over social media content'.[223]

2.197Some participants saw the testing of enforceability as inevitable rather than an insurmountable barrier to regulation, focusing instead on how legislation is crafted. Ms O'Loughlin pointed to regulation of the interactive gambling sector as an example:

We have a similar issue: we regulate interactive gambling, and most of those businesses are external to Australia, but we do have regulatory powers that we can use. So I think it's crafting the regulation to give the right regulatory powers to a regulator to be able to do it. Of course, that will be tested along the way, but, as I think our ACCC colleagues mentioned earlier today, I think we are all learning from what works internationally. There's a huge amount of cooperation going on between regulators internationally, and I think we can advise the government and the parliament of what we're seeing internationally that we think will work.[224]

2.198Ms O'Loughlin also refuted arguments put forward by some platforms about the desirability of a global approach to regulation:

I think what gets put to regulators internationally by some of these parties is that we should have global regulation, because we can't have different regulation in different countries. No company operates like that globally. Every country has its own things that are important to it, either culturally, socially or economically, and it's well within the wit of any government to decide what should be in place for a particular country.[225]

2.199Overall, participants took a strong view that digital platforms needed to comply with Australian laws to continue operating here. For example, Mr Jeffrey Howard of Seven West Media contended that:

Australia should not acquiesce to the demands of the digital platforms. They should be made to play by our rules. Other multinational industries are compelled to comply with all manner of laws and regulations when they want to trade here. It's time for the social media exemption to be addressed.[226]

Footnotes

[1]The University of Technology Sydney (UTS) report, The Impact of Digital Platforms on News and Journalistic Content, published in October 2018, described how (as opposed to their early incarnation as content distribution platforms), digital platforms now perform a hybrid role that includes shaping the news agenda by distributing news and deciding what is and isn't acceptable content. They are, therefore, 'less than content producers, but more than mere intermediaries'.

[2]Media, Entertainment and Arts Alliance (MEAA), Submission 53, [p. 4].

[3]MEAA, Submission 53, [p. 3].

[4]MEAA, Submission 53, [p. 3].

[5]Ms Nerida O'Loughlin, Chair and Agency Head, Australian Communications and Media Authority (ACMA), Proof Committee Hansard, 21 June 2024, p. 47.

[6]Park, S., Fisher, C., McGuinness, K., Lee, J., McCallum, K., Cai, X., Chatskin, M., Mardjianto, L. & Yao, P. (2024). Digital News Report: Australia 2024. Canberra: News and Media Research Centre, University of Canberra, p. 10. This report was cited by multiple inquiry participants, including SBS, Submission 45, pp. 2 and 3; Schwartz Media, Submission 36, p. 3; The Daily Aus, Submission 43, [p. 7]; Australian Broadcasting Corporation (ABC), Submission 65, p. 3; Nine Entertainment, Submission 159, [p. 2]; Country Press Australia (CPA), Submission 196, p. 3; ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S), Submission 120, p. 7.

[7]ACMA, Submission 52, p. 8.

[8]Free TV Australia (Free TV), Submission 54, p. 18.

[9]SBS, Submission 45, p. 5. Multilingual Australians are more likely to use Facebook than the total population or English-only speakers. First Nations communities have higher rates of social media usage compared to non-Indigenous Australians.

[10]See, for example, Free TV, Submission 54, pp. 3 and 27; Public Interest Journalism Initiative (PIJI), Submission 158, p. 23; Schwartz Media, Submission 36, p. 3; Mr Jeffrey Howard, Managing Director and Chief Executive Officer, Seven West Media; Proof Committee Hansard, 21 June 2024, p. 2; Mr Michael Miller, Executive Chairman, News Corp Australasia, Proof Committee Hansard, 21 June 2024, p. 11; Mr Tony McDonald, Assistant Secretary, Competition and Consumer Branch, Department of the Treasury, Proof Committee Hansard, 25 June 2024, p. 4.

[11]Mr Tim Duggan, Chair, Digital Publishers' Alliance (DPA), Proof Committee Hansard, 21 June 2024, p. 36.

[12]Ms Catriona Lowe, Deputy Chair, Australian Competition and Consumer Commission (ACCC), Proof Committee Hansard, 21 June 2024, p. 17.

[13]Local & Independent News Association (LINA), Submission 44, p. 4.

[14]Mr Anthony Kendall, Managing Director, Australian Community Media (ACM), Proof Committee Hansard, 21 June 2024, p. 26.

[15]Mr Mike Sneesby, Chief Executive Officer (CEO), Nine Entertainment, Proof Committee Hansard, 21 June 2024, p. 10.

[16]Mr Tim Duggan, Chair, DPA, Proof Committee Hansard, 21 June 2024, p. 37.

[17]See, for example, MEAA, Submission 53, [p. 9]; LINA, Submission 44, p. 5; Man of Many, Submission 33, p. 4.

[18]Mr Tim Duggan, Chair, DPA, Proof Committee Hansard, 21 June 2024, p. 38.

[19]Ms Karen Percy, Media Federal President, MEAA, Proof Committee Hansard, 30 September 2024, p. 13.

[20]Mrs Natalie Harvey, CEO, Mamamia, Proof Committee Hansard, 10 July 2024, p. 15.

[21]Mr Nicholas Shelton, Founder and Publisher, Broadsheet Media, Proof Committee Hansard, 10 July 2024, p. 17.

[22]We Are Explorers, Submission 116, [pp. 1–2].

[23]Mr Tim Duggan, Chair, DPA, Proof Committee Hansard, 21 June 2024, p. 38.

[24]Australians for a Murdoch Royal Commission (AMRC), Submission 188, p. 3.

[25]Human Technology Institute (HTI), Submission 146, p. 13.

[26]Mr Nicholas Shelton, Founder and Publisher, Broadsheet Media, Proof Committee Hansard, 10 July 2024, p. 18.

[27]See, for example, ABC, Submission 65, pp. 3–4; AMRC, Submission 188, p 3; Mr James Chisholm, Deputy Secretary, Communications and Media Group, Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA), Proof Committee Hansard, 2 July 2024, p. 3.

[28]MEAA, Submission 53, [p. 4].

[29]Mr Anthony Kendall, Managing Director, ACM, Proof Committee Hansard, 21 June 2024, p. 26.

[31]University of Sydney Department of Media and Communications (USYD MECO), Submission 154, p. 16.

[32]Free TV, Submission 54, p. 21.

[33]MEAA, Submission 53, [p. 4]. According to the MEAA, there are now 4000 to 5000 fewer editorial roles in Australia than there were in 2010.

[34]ABC, Submission 65, p. 3.

[35]PIJI, Submission 158, p. 2.

[36]PIJI, Submission 158, p. 3.

[37]LINA, Submission 44, p. 3.

[38]Mr James Chisholm, Deputy Secretary, Communications and Media Group, DITRDCA, Proof Committee Hansard, 2 July 2024, p. 9.

[39]Mr Peter Lewis, Founder, CPS Per Capita, Proof Committee Hansard, 10 July 2024, p. 6.

[40]Mr Bill Calcutt PSM, Submission 22, p. 3.

[41]DITRDCA, Submission 12, p. 4.

[42]HTI, Submission 146, p. 14 and PIJI, Submission 158, p. 2.

[43]Australian Human Rights Commission, Submission 79, p. 7.

[44]Department of Home Affairs, Submission 41, p. 5.

[45]See, for example, Media.com, Submission 148, pp. 1 and 2; Free TV, Submission 54, p. 14; Australian Experts to the International Holocaust Remembrance Alliance, Submission 135, p. 1; Mr Adrian McMahon, Submission 75, p. 1; Mrs Christy Hutt, Ms Dani Phelan, Ms Anne Hibbert and Ms Jayne Aguiar, Submission 74, [p. 2]; Dr Keith Heggart, Dr Damian Maher, and Associate Professor Bhuva Narayan, Submission 83, p. 4; LINA, Submission 44, p. 6; Mr Luke Arnold, Submission 34, p. 1.

[46]ABC, Submission 65, p. 5.

[47]Reset.Tech Australia, Submission 16, p. 2.

[48]HTI, Submission 146, p. 13.

[49]Ms Elizabeth O'Shea, Chair, Digital Rights Watch (DRW), Proof Committee Hansard, 10 July 2024, p. 2.

[50]MEAA, Submission 53, [pp. 2 and 8].

[51]SBS, Submission 45, p. 3.

[52]LINA, Submission 44, p. 8. See also, ACMA, Submission 52, p. 5.

[53]University of Canberra News and Media Research Centre, Submission 47, p. 2.

[54]Park, S., Fisher, C., McGuinness, K., Lee, J., McCallum, K., Cai, X., Chatskin, M., Mardjianto, L. & Yao, P. (2024). Digital News Report: Australia 2024. Canberra: News and Media Research Centre, University of Canberra, p. 126.

[55]ACMA, Submission 52, p. 5.

[56]Squiz Kids, Submission 114, [p. 2]. Squiz Kids surveyed more than 1000 people, with the majority being teachers and parents of school-aged children.

[57]Reset.Tech EU, Submission 137, p. 3.

[58]Department of Home Affairs, Submission 41, p. 5.

[59]See, for example, ACMA, Submission 52, p. 11; Broadsheet Media, Submission 198, p. 1; Public Interest Publishers Alliance, Submission 133, [p.1]; Digital Rights Watch (DRW), Submission 17, p. 13; We Are Explorers, Submission 116, [p. 3]; Mr Adrian McMahon, Submission 75, p. 1.

[60]SBS, Submission 45, p. 1.

[61]AMRC, Submission 188, p. 5.

[62]HTI, Submission 146, p. 14.

[63]Man of Many, Submission 33, p. 4.

[64]HTI, Submission 146, p. 14.

[65]CPS Per Capita, Submission 50, p. 2.

[66]DRW, Submission 17, p. 12.

[67]DRW, Submission 17, p. 13.

[68]ABC, Submission 65, p. 5.

[69]Free TV, Submission 54, p. 14.

[70]The Conversation, Submission 6, p. 2.

[71]ACMA, Submission 52, p. 11.

[72]ADM+S, Submission 120, p. 2.

[73]See, for example, USYD MECO, Submission 154, p. 22; Dr Rob Nicholls, Senior Research Associate, The University of Sydney, Proof Committee Hansard, 28 June 2024, p. 52.

[74]Children and Media Australia, Submission 140, p. 4. See also, Business News Australia, Submission 5, p. 3; Mr Les Daniel, Submission 10, p. 1; Fremantle Herald, Submission 204, p. 7.

[75]Mrs Julie Inman Grant, Commissioner, eSafety Commissioner, Proof Committee Hansard, 21 June 2024, pp. 57 and 62. As examples, Mrs Inman Grant referred to Meta's deprecation of its public insights tool, CrowdTangle, and the decision by X to charge USD$40 000 per month for access to its API.

[76]USYD MECO, Submission 154, p. 22.

[77]Mrs Julie Inman Grant, Commissioner, eSafety Commissioner, Proof Committee Hansard, 21 June 2024, p. 57.

[78]Dr Rys Farthing, Director of Research, Reset.Tech Australia, Proof Committee Hansard, 10 July 2024, p. 4. According to Dr Farthing, the algorithms used by X and TikTok are available to view.

[79]Dr Rys Farthing, Director of Research, Reset.Tech Australia, Proof Committee Hansard, 10 July 2024, p. 14.

[80]DRW, Submission 17, pp. 12–13.

[81]See, for example, Australian Human Rights Commission, Submission 79, pp. 8–9; LINA, Submission 44, p. 10; AMRC, Submission 188, p. 6; eSafeKids, Submission 112, p. 7; Man of Many, Submission 33, p. 4; Playhouse, Submission 110, [p. 4].

[82]Ms Nerida O'Loughlin, Chair and Agency Head, ACMA, Proof Committee Hansard, 21 June 2024, p. 46.

[83]Cancer Council Australia (Cancer Council), Submission 189, p. 3 (citation omitted).

[84]PIJI, Submission 158, [p. 3].

[85]PIJI, Submission 158, p. 2.

[86]ReachOut, Beyond Blue and the Black Dog Institute, Submission 168, p. 7.

[87]Reset.Tech Australia, Submission 16, p. 2.

[88]See, for example, Department of Home Affairs, Submission 41, p. 5; Dr Andrea Carson, Professor of Political Communication, La Trobe University, Proof Committee Hansard, 28 June 2024, p. 51; HTI, Submission 146, p. 13; ABC, Submission 65, p. 4; Man of Many, Submission 33, p. 5; Mr Luke Arnold, Submission 34, p.1; Schwartz Media, Submission 36, p. 6; The Daily Aus, Submission 43, [p. 7].

[89]DITRDCA, Submission 12, p. 10.

[90]HTI, Submission 146, p. 13.

[91]Mr Peter Lewis, Founder, CPS Per Capita, Proof Committee Hansard, 10 July 2024, p. 7.

[92]Mr Peter Lewis, Founder, CPS Per Capita, Proof Committee Hansard, 10 July 2024, p. 8.

[93]Schwartz Media, Submission 36, p. 6.

[94]AMRC, Submission 188, p. 8.

[95]AMRC, Submission 188, p. 8.

[96]DRW, Submission 17, p. 13.

[97]HTI, Submission 146, p. 13.

[98]See, for example, Dr Keith Heggart, Dr Damian Maher, and Associate Professor Bhuva Narayan, Submission 83, p. 1; Ms Kyla Raby, Submission 76, [p. 1]; Australian Publishers, Submission 63, [p. 2].

[99]Dr Keith Heggart, Dr Damian Maher, and Associate Professor Bhuva Narayan, Submission 83, p. 4 (citation omitted).

[100]HTI, Submission 146, p. 13.

[101]Mr Peter Lewis, Founder, CPS Per Capita, Proof Committee Hansard, 10 July 2024, p. 6.

[102]Mr Nicholas Shelton, Founder and Publisher, Broadsheet Media, Proof Committee Hansard, 10 July 2024, p. 16.

[103]Ms Lisa Watts, CEO, The Conversation, Proof Committee Hansard, 10 July 2024, p. 17.

[104]DRW, Submission 17, p. 13.

[105]Mr Rex Patrick, Submission 182, [p. 4].

[106]CPS Per Capita, Submission 50, p. 2.

[107]Mr Nicholas Shelton, Founder and Publisher, Broadsheet Media, Proof Committee Hansard, 10 July 2024, p. 16.

[108]Mr Michael Miller, Executive Chairman, News Corp Australasia, Proof Committee Hansard, 21 June 2024, p. 13.

[109]Mr Andrew Schreyer, President, Country Press Australia (CPA), Proof Committee Hansard, 21 June 2024, p. 27.

[110]PIJI, Submission 158, [p. 7].

[111]PIJI, Submission 158, [p. 4]. PIJI groups relevant issues into four pillars of public interest journalism: government, courts and crime, community, and public services (e.g. health, education and emergency).

[112]See, for example, ACMA, Submission 52, p. 9; PlayHouse, Submission 110, [p. 3]; HTI, Submission 146, p. 3; NSW Council for Civil Liberties, Submission 147, p. 5; AMRC, Submission 188, pp. 2–3; Business News Australia, Submission 5, p. 3.

[113]Free TV, Submission 54, p. 8.

[114]PIJI, Submission 158, [p. 3].

[115]USYD MECO, Submission 154, pp. 16–17.

[116]Australian Associated Press (AAP), Submission 55, p. 2. See also, LINA, Submission 44, pp. 6–7; Department of Home Affairs, Submission 41, p. 4; PIJI, Submission 158, p. 2.

[117]ABC, Submission 65, p. 4.

[118]DITRDCA, News Media Assistance Program Consultation Paper, December 2023, pp. 9–10.

[119]DITRDCA, News Media Assistance Program Consultation Paper, December 2023, p. 10.

[120]See, for example, Schwartz Media, Submission 36, p. 11; MEAA, Submission 53, [p. 4]; Mr Rex Patrick, Submission 182, [p. 4]; USYD MECO, Submission 154, p. 22.

[121]Mr Nicholas Shelton, Founder and Publisher, Broadsheet Media, Proof Committee Hansard, 10 July 2024, p. 16.

[122]ABC, Submission 65, p. 6.

[123]Ms Lisa Watts, CEO, The Conversation, Proof Committee Hansard, 10 July 2024, p. 17.

[124]The Conversation, Submission 6, p. 2.

[125]See, for example, MEAA, Submission 53, [p. 8]; Mr Anthony Kendall, Managing Director, Australian Community Media, Proof Committee Hansard, 21 June 2024, p. 26; Dr Andrea Carson, Professor of Political Communication, La Trobe University, Proof Committee Hansard, 28 June 2024, p. 55; ABC, Submission 65, p. 3.

[126]Dr Andrea Carson, Professor of Political Communication, La Trobe University, Proof Committee Hansard, 28 June 2024, p. 55.

[127]LINA, Submission 44, p. 7.

[128]Department of Home Affairs, Submission 41, p. 4.

[129]See, for example, DPA, Submission 39, p. 1; Capital Brief, Submission 64, pp. 2 and 3; LINA, Submission 44, p. 8; Free TV, Submission 54, pp. 6–7.

[130]DRW, Submission 17, pp. 11–12.

[131]Mr Bill Calcutt PSM, Submission 22, p. 3. See also, Dr Keith Heggart, Associate Professor Simon Knight, Dr Damian Maher, and Associate Professor Bhuva Narayan, Submission 83, p. 4.

[132]AMRC, Submission 188, p. 4.

[133]MEAA, Submission 53, [p. 4].

[134]Free TV, Submission 54, p. 14.

[135]Digital Industry Group Inc. (DIGI), Submission 57, p. 16.

[136]Dr Andrea Carson, Professor of Political Communication, La Trobe University, Proof Committee Hansard, 28 June 2024, p. 53.

[137]ADM+S, Submission 120, p.2.

[138]Ms Nerida O'Loughlin, Chair and Agency Head, ACMA, Proof Committee Hansard, 21 June 2024, p. 51.

[139]Ms Anna Draffin, CEO, PIJI, Proof Committee Hansard, 28 June 2024, p. 44.

[140]See, for example, Man of Many, Submission 33, p. 4; LINA, Submission 44, p. 8; DPA, Submission 39, p.1; Ms Anna Draffin, CEO, PIJI, Proof Committee Hansard, 28 June 2024, p. 41.

[141]LINA, Submission 44, p. 8.

[142]ACMA, Submission 52, pp. 8 and 9.

[143]See, for example, LINA, Submission 44, p. 8; AMRC, Submission 188, p. 2; Fremantle Herald, Submission 204, pp. 20 and 21.

[144]Private Media, Submission 56, [p. 1].

[145]Capital Brief, Submission 64, p. 2. According to Capital Brief, News Corp, Nine Entertainment and Seven West Media own the leading news mastheads in every state, the national broadsheet, the national business publication, the most-read free news websites, the top news/talk radio stations and the leading free and subscription television networks.

[146]AMRC, Submission 188, p. 2. According to AMRC, Australia has the third most concentrated media market in the world, with the top four media companies in Australia (NewsCorp, Seven, Nine Entertainment/Fairfax, and ACM) controlling 95 per cent of total revenue from daily newspapers, 75 per cent of total revenue from free to air TV, and 70 per cent of total revenue from radio broadcasting.

[147]LINA, Submission 44, p. 4. See also, We Are Explorers, Submission 116, [p. 4].

[148]ACMA, Submission 52, p. 9.

[149]ACMA, Submission 52, p. 9. The first report under the Media Diversity Measurement Framework will be published by the end of 2024.

[150]LINA, Submission 44, p. 8.

[151]Ms Claire Stuchbery, Executive Director, LINA, Proof Committee Hansard, 28 June 2024, p. 42.

[152]Dr Tanya Notley, Submission 176, p. 2.

[153]See, for example, Australian Media Literacy Alliance (AMLA), Submission 134, p. 1; LINA, Submission 44, p. 9; Cancer Council, Submission 189, p. 3; The Daily Aus, Submission 43, [p. 15]; SBS, Submission 45, p. 8; Playhouse, Submission 110, [p. 3].

[154]Ms Claire Stuchbery, Executive Director, LINA, Proof Committee Hansard, 28 June 2024, p. 42.

[155]SBS, Submission 45, p. 8.

[156]AMRC, Submission 188, p. 8.

[157]Australian Library and Information Association (ALIA), Submission 62, p. 1.

[158]AMLA, Submission 134, p. 3.

[159]Park, S., Fisher, C., McGuinness, K., Lee, J., McCallum, K., Cai, X., Chatskin, M., Mardjianto, L. & Yao, P. (2024). Digital News Report: Australia 2024. Canberra: News and Media Research Centre, University of Canberra, p. 121.

[160]Dr Andrea Carson, Professor of Political Communication, La Trobe University, Proof Committee Hansard, 28 June 2024, p. 57.

[161]ALIA, Submission 62, p. 1. The percentage of Australians with low media literacy levels varied by cohort: Australians aged 75 years and older (75 per cent), those aged 56–74 (57 per cent), those with low education levels (56 per cent), those living with a disability (48 per cent), low-income Australians (43 per cent), and those living in regional Australia (41 per cent). See also, AMLA, Submission 134, p. 2.

[162]ADM+S, Submission 120, p. 9.

[163]AAP, Submission 55, pp. 3 and 4.

[164]AAP, Submission 55, p. 4.

[165]Reset.Tech Australia, Submission 16, p. 4.

[166]Schwartz Media, Submission 36, p. 10.

[167]Dr Andrea Carson, Professor of Political Communication, La Trobe University, Proof Committee Hansard, 28 June 2024, p. 53.

[168]DRW, Submission 17, p. 12.

[169]SBS, Submission 45, p. 3.

[170]ADM+S, Submission 120, p. 10.

[171]Reset.Tech Australia, Submission 16, p. 4.

[172]Reset.Tech Australia, Submission 16, p. 4.

[173]Department of Home Affairs, Submission 41, p. 4.

[174]ABC, Submission 65, p. 5.

[175]ACMA, Submission 52, p. 5. DIGI administers the Australian Code of Practice on Disinformation and Misinformation (ACPDM) and governance arrangements to encourage signatory compliance. ACMA reports to the Australian Government on the effectiveness of the ACPDM.

[176]USYD MECO, Submission 154, p. 18.

[177]See, for example, Mrs Julie Inman Grant, Commissioner, eSafety Commissioner, Proof Committee Hansard, 21 June 2024, p. 59; Schwartz Media, Submission 36, p. 8; eSafeKids, Submission 112, p. 7; Professor Lorna Woods, Submission 191, [p. 2]; Dr Susie Alegre, Submission 18, p. 2.

[178]Human Rights Law Commission, Submission 184, Attachment 3 (Submission to the Senate Economics References Committee inquiry into the influence of international digital platforms), p. 9.

[179]Human Rights Law Commission, Submission 184, Attachment 1 (Submission to the Statutory Review of the Online Safety Act 2021), p. 17.

[180]USYD MECO, Submission 154, p. 19 (citation omitted).

[181]USYD MECO, Submission 154, p. 18.

[182]USYD MECO, Submission 154, p. 18. The Cambridge Analytica scandal involved third parties directing Facebook user data to political campaigns under misleading pretences.

[183]Ms Nerida O'Loughlin, Chair and Agency Head, ACMA, Proof Committee Hansard, 21 June 2024, p. 54. Ms O'Loughlin stated this advice was provided in relation to the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023.

[184]Reset.Tech Australia, Submission 16, p. 1 (citations omitted).

[185]Reset.Tech Australia, Submission 16, p. 2.

[186]USYD MECO, Submission 154, p. 18.

[187]Human Rights Law Commission, Submission 184, Attachment 3 (Submission to the Senate Economics References Committee inquiry into the influence of international digital platforms), p. 9.

[188]Dr Rys Farthing, Director of Research, Reset.Tech Australia, Proof Committee Hansard, 21 June 2024, p. 4.

[189]AMRC, Submission 188, p. 8.

[190]AMRC, Submission 188, p. 9.

[191]USYD MECO, Submission 154, p. 19 (citation omitted).

[192]USYD MECO, Submission 154, p. 19 (citation omitted).

[193]ACMA, Submission 52, p. 7.

[194]The Hon. Michelle Rowland MP, Minister for Communications, Proof House of Representatives Hansard, 12 September 2024, p. 7.

[195]Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, Explanatory Memorandum, p. 1.

[196]Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, Explanatory Memorandum, p. 1.

[197]Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, Explanatory Memorandum, p. 4.

[198]DITRDCA, Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023—Fact sheet, June 2023, p. 2.

[199]Mr Sam Kursar, Acting Assistant Secretary, Digital Platforms, International and Policy Branch, DITRDCA, Proof Committee Hansard, 1 October 2024, p. 51.

[200]Mr Sam Kursar, Acting Assistant Secretary, Digital Platforms, International and Policy Branch, DITRDCA, Proof Committee Hansard, 1 October 2024, p. 55.

[201]Mr Sam Kursar, Acting Assistant Secretary, Digital Platforms, International and Policy Branch, DITRDCA, Proof Committee Hansard, 1 October 2024, p. 54.

[202]Mr Sam Kursar, Acting Assistant Secretary, Digital Platforms, International and Policy Branch, DITRDCA, Proof Committee Hansard, 1 October 2024, p. 53.

[203]Mr Sam Kursar, Acting Assistant Secretary, Digital Platforms, International and Policy Branch, DITRDCA, Proof Committee Hansard, 1 October 2024, p. 54.

[204]ACMA, Submission 52, p. 5.

[205]ACMA, Submission 52, p. 5. DIGI administers the Australian Code of Practice on Disinformation and Misinformation (ACPDM) and governance arrangements to encourage signatory compliance. ACMA reports to the Australian Government on the effectiveness of the ACPDM.

[206]DIGI, Submission 57, p. 13.

[207]DIGI, Submission 57, pp. 14, 15 and 16.

[208]Free TV, Submission 54, p. 15.

[209]Ms Nerida O'Loughlin, Chair and Agency Head, ACMA, Proof Committee Hansard, 21 June 2024, p. 46.

[210]AMRC, Submission 188, p. 13.

[211]Reset.Tech Australia, Submission 16, p. 3 (citation omitted).

[212]DIGI, Submission 57, p. 14.

[213]See, for example, HTI, Submission 146, p. 14; Mr Michael Miller, Executive Chairman, News Corp Australasia, Proof Committee Hansard, 21 June 2024, p. 12; The Conversation, Submission 6, p. 2.

[214]Schwartz Media, Submission 36, p. 7.

[215]Free TV, Submission 54, p. 14.

[216]We Are Explorers, Submission 116, [p. 3].

[217]Mr Andrew Schreyer, President, Country Press Australia, Proof Committee Hansard, 21 June 2024, p. 27.

[218]Mr Michael Miller, Executive Chairman, News Corp Australasia, Proof Committee Hansard, 21 June 2024, pp. 1–2.

[219]See, for example, Mr Bruce Meagher, Head, Public Affairs, Tattarang, Proof Committee Hansard, 28 June 2024, p. 40; Ms Rosie Thomas, Director of Campaigns, CHOICE, Proof Committee Hansard, 10 July 2024, p. 49; Ms Kate Reader, General Manager, Digital Platforms Branch, ACCC, Proof Committee Hansard, 21 June 2024, pp. 20 and 23.

[220]Mr Tony McDonald, Assistant Secretary, Competition and Consumer Branch, Department of the Treasury, Proof Committee Hansard, 25 June 2024, p. 6.

[221]See, for example, Dr Rys Farthing, Director of Research, Reset.Tech Australia, Proof Committee Hansard, 10 July 2024, p. 5; Ms Chandni Gupta, Deputy Chief Executive Officer and Digital Policy Director, Consumer Policy Research Centre, Proof Committee Hansard, 10 July 2024, p. 48; Ms Rosie Thomas, Director of Campaigns, CHOICE, Proof Committee Hansard, 10 July 2024, p. 52; Dr Rob Nicholls, Senior Research Associate, The University of Sydney, Proof Committee Hansard, 28 June 2024, p. 52.

[222]Mr Tony McDonald, Assistant Secretary, Competition and Consumer Branch, Department of the Treasury, Proof Committee Hansard, 25 June 2024, p. 6.

[223]USYD MECO, Submission 154, p. 19. USYD MECO referred to the case X successfully contested concerning whether Australian restrictions on violent content can be applied outside of Australia.

[224]Ms Nerida O'Loughlin, Chair and Agency Head, ACMA, Proof Committee Hansard, 21 June 2024, p. 48.

[225]Ms Nerida O'Loughlin, Chair and Agency Head, ACMA, Proof Committee Hansard, 21 June 2024, p. 49.

[226]Mr Jeffrey Howard, Managing Director and CEO, Seven West Media, Proof Committee Hansard, 21 June 2024, p. 2.