Chapter 1 - Introduction

1.1 The Joint Select Committee on Social Media and Australian Society (committee) was appointed by resolution of the Senate on 15 May 2024 and resolution of the House of Representatives on 16 May 2024, to inquire into and report on the influence and impacts of social media on Australian society, with reference to:

(a) the use of age verification to protect Australian children from social media;

(b) the decision of Meta to abandon deals under the News Media Bargaining Code;

(c) the important role of Australian journalism, news and public interest media in countering mis- and disinformation on digital platforms;

(d) the algorithms, recommender systems and corporate decision making of digital platforms in influencing what Australians see, and the impacts of this on mental health;

(e) other issues in relation to harmful or illegal content disseminated over social media, including scams, age-restricted content, child sexual abuse and violent extremist material; and

(f) any related matters.[1]

Conduct of the committee's inquiry

1.2 The committee advertised the inquiry on its website and wrote to relevant stakeholders and other interested parties inviting them to make a written submission by 28 June 2024.

1.3 The committee received 217 submissions, as well as additional information and answers to questions on notice.

1.4 At the time of reporting, the committee had held nine public hearings in Canberra:

Friday, 21 June 2024;

Tuesday, 25 June 2024;

Friday, 28 June 2024;

Tuesday, 2 July 2024;

Wednesday, 10 July 2024;

Wednesday, 4 September 2024;

Monday, 30 September 2024;

Tuesday, 1 October 2024; and

Wednesday, 2 October 2024.

1.5 Links to public submissions, hearing programs (including witnesses), Hansard transcripts of evidence, answers to questions on notice, and other additional documents published by the committee are available on the committee website.

Acknowledgements and references

1.6 The committee thanks those individuals and organisations who contributed to the inquiry by providing written submissions and giving evidence at public hearings. In particular, the committee would like to thank those individuals who shared their personal experiences with the committee. These important testimonies have helped to inform the findings of this report.

1.7 References in this report to the Hansard transcripts of public hearings are to the proof Hansard transcripts. Page numbers may vary between the proof and official Hansard transcripts.

Scope of this report

1.8 This report is an interim report that will focus on the decision of Meta to abandon deals under the News Media and Digital Platforms Mandatory Bargaining Code (Code), and on the important role of Australian journalism, news and public interest media in a healthy democracy and in countering mis- and disinformation on digital platforms.

1.9 The report comprises three chapters, including this introductory and background chapter, with the remaining chapters as follows:

Chapter 2 looks at the impact of social media on the Australian media landscape, the rise of mis- and disinformation on digital platforms, the importance of public interest journalism, and the effectiveness of current approaches to the regulation of digital platforms.

Chapter 3 examines the operation of the Code and the impact of Meta's decision to abandon deals under the Code, and outlines the committee view and recommendations.

1.10 While the focus of this interim report is on the Code, the committee did hear evidence in its initial tranche of hearings from many witnesses on the broader aspects of the committee's terms of reference.

1.11 These issues included age verification; online safety; algorithms and recommender systems; the impacts on the mental health of users; the lack of accountability of social media platforms; the privacy and data issues inherent in using social media; and the complexity of addressing these issues.

1.12 While this report provides some consideration of algorithms and recommender systems, these will be explored further in the committee's final report, which will look at how social media operates, the impacts that social media has on various cohorts across Australian society, and ways in which the potential harms can be mitigated or prevented.

Operation of digital platforms in Australia

1.13 This section provides background on social media services in Australia, including the key Australian regulatory schemes, as well as recent examples of international jurisdictions that have taken steps to strengthen the effectiveness of their regulatory frameworks.

What are digital platforms?

1.14 The term 'digital platforms' refers to online systems that facilitate the creation, exchange and consumption of digital content and services. This includes, but is not limited to, internet search engines, digital content aggregators, social media services, private messaging services, media referral services and electronic marketplaces.[2]

1.15 These platforms can take various forms and serve diverse purposes—such as enabling social connections, providing access to information and entertainment, conducting business, or buying goods and services.[3]

1.16 Australians rely on digital platforms for many of these services, which are provided by a small group of global companies. The five largest digital platforms are Apple, Microsoft, Google, Amazon and Meta.[4] These companies own and operate multiple digital platforms—for example, Meta owns Facebook, Instagram and WhatsApp, and Google owns YouTube.[5]

1.17 Social media services are a component of digital platforms that enable user-to-user interactions and/or the sharing of user-generated content.[6] These services can include e-commerce as well as 'purely social networking services, as well as professional networking, dating, chat, messaging, social gaming, and user generated content-sharing services, among others'.[7]

1.18 Digital platforms generate their revenue primarily from advertising, generally through collecting and harnessing user data and capturing user attention.[8] The Australian Competition and Consumer Commission (ACCC) has noted that 'the monetisation of user attention and data through selling digital advertising is essential to the business model of social media platforms'.[9] The business model used by some of the larger platforms is discussed in more detail below.

Australian regulation of digital platforms

1.19 There are a range of laws and schemes in Australia covering online content, with responsibility for regulating digital platforms primarily shared between the Australian Communications and Media Authority (ACMA), the ACCC, the Office of the Australian Information Commissioner (OAIC), and the Office of the eSafety Commissioner (eSafety Commissioner). Together, these agencies comprise the Digital Platforms Regulators Forum (DP-REG).[10]

1.20 While not a decision-making body, DP-REG members share information about, and collaborate on, overlapping issues and activities involving the regulation of digital platforms. This includes consideration of how competition, consumer protection, privacy, online safety and data issues intersect.[11]

1.21 Several regulatory reforms have been announced and are in the process of being implemented or are currently before Australian parliaments.[12]

Australian Communications and Media Authority

1.22 The ACMA oversees content regulation in Australia, including matters related to some online content, broadcasting standards and classification, radiocommunications and telecommunications. Digital platforms may be subject to obligations regarding harmful or illegal content, such as child exploitation material, violence, or hate speech.[13] The ACMA also oversees the Australian Code of Practice on Disinformation and Misinformation (CPDM), which relates to the targeting of a person with dis- and misinformation.[14]

1.23 The Commonwealth Government (government) is currently considering amendments to an exposure draft of its Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023. The proposed bill would enable the ACMA to better monitor and assess the effectiveness of platform moderation activities, while incentivising greater participation and performance by industry under the existing CPDM.[15]

Australian Competition and Consumer Commission

1.24 The ACCC conducts inquiries and law enforcement cases in relation to Australian Consumer Law and anti-competitive behaviour by digital platforms. The ACCC has an ongoing 5-year inquiry into the supply of digital platform services, including digital advertising and data collection, storage, supply, processing and analysis services by digital platforms.[16] The ACCC also operates the National Anti-Scam Centre and is engaging with the government to support the development of mandatory industry codes on scams.[17]

1.25 In response to the recommendations of the ACCC's fifth interim report for its Digital Platform Services Inquiry, the government has written to digital platforms requesting that they develop a voluntary internal dispute resolution code by July 2024.[18]

Office of the Australian Information Commissioner

1.26 The OAIC has three main functions: protecting the privacy of individuals under the Privacy Act 1988 (the Act) and other legislation; promoting access to information held by the government in accordance with the Freedom of Information Act 1982; and information management as set out in the Australian Information Commissioner Act 2010.[19]

1.27 In its interim response to the Privacy Act Review, the government concluded that it was necessary to overhaul Australia's privacy laws to ensure they remain fit-for-purpose in the digital age.[20] The government agreed with several of the report's proposals including:

updating and clarifying the definition of 'personal information' to ensure that it includes technical and inferred information where this can be used to identify individuals;

requiring that the collection, use and disclosure of personal information must be fair and reasonable in the circumstances; and

introducing a Children's Online Privacy Code that applies to online services that are likely to be accessed by children.[21]

eSafety Commissioner

1.28 The eSafety Commissioner is the primary agency tasked with regulating online safety, and its legislative functions are set out in the Online Safety Act 2021 (OSA). This includes administering complaints and investigation schemes for four types of online harms—cyberbullying of children, cyber abuse of adults, the non-consensual sharing of intimate images, and illegal or restricted online content.[22]

1.29 The OSA also provides the eSafety Commissioner with a mechanism to request or require the blocking of material that promotes, incites, instructs in or depicts abhorrent violent conduct if the material is likely to cause significant harm in the Australian community.[23]

1.30 On 31 May 2024, the Online Safety (Basic Online Safety Expectations) Amendment Determination 2024 came into effect, which includes new core expectations that service providers:

take reasonable steps to proactively minimise material or activity that is unlawful or harmful, and ensure users can use a service in a safe manner;

protect children from content that is not age appropriate, such as pornography;

take reasonable steps to prevent harmful use of anonymous and encrypted services;

put in place user-reporting mechanisms, and clearly outline their terms of service and enforce penalties for people who breach these terms;

cooperate with other service providers; and

respond to requests for information from the eSafety Commissioner.[24]

1.31 As part of the 2024–25 Budget, the government announced funding for the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) to conduct a trial of age assurance technology to protect children from age-restricted online content. The trial will complement the eSafety Commissioner's development of Phase 2 industry codes.[25]

1.32 The government is also conducting an independent statutory review of the OSA, which will consider whether additional protections are needed for harmful online material such as hate speech and image-based abuse.[26] A report is due to be provided to the government by 31 October 2024.

1.33 A summary of each agency's responsibilities in relation to digital platforms is shown in Figure 1.1 below.

Figure 1.1 Digital platform regulators' roles and responsibilities

Source: The Digital Platform Regulators Forum, Submission 35, p. 9.

International regulatory frameworks

1.34 Several international jurisdictions have taken steps to strengthen the effectiveness of their regulatory frameworks, with the European Union, United Kingdom and Canada seeking to address the harms associated with online platforms and the need to increase the accountability and responsibility of those platforms to consumers.

European Union (EU)

1.35 The EU has recently introduced a new regulatory regime, including the Digital Services Act (DSA) and the Digital Markets Act (DMA).[27] The DSA regulates online intermediaries and platforms, including online marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.[28] It also provides for additional, stricter obligations for Very Large Online Platforms (VLOPs) and search engines.[29]

1.36 Under the DSA, VLOPs are required to undertake annual risk assessments to identify any significant systemic risks arising from the functioning and use of their services, including their algorithms, recommender systems, content moderation systems, user terms and conditions, advertising systems and data-related practices.[30]

1.37 Similarly, the DMA establishes obligations for specifically designated 'gatekeeper' platforms.[31] Gatekeepers are digital platforms with a systemic role in the internal market that function as bottlenecks between businesses and consumers for important online services, such as online search engines, app stores and messenger services.[32]

1.38 On 6 September 2023, the European Commission designated six digital platform providers as gatekeepers under the DMA: Alphabet (Google's parent company), Amazon, Apple, ByteDance, Meta and Microsoft. These companies were required to make their combined 22 core platform services compliant with the DMA by March 2024.[33]

United Kingdom (UK)

1.39 The UK Online Safety Act (UK OSA) addresses illegal content and content harmful to children online.[34] The UK OSA became law in October 2023 and established the Office of Communications (Ofcom) as the regulator for online safety, with appropriate enforcement, oversight and designation responsibilities.[35]

1.40 The UK OSA also established requirements for platforms to scan for child sexual exploitation and abuse material and to report such material to the UK National Crime Agency, as well as duty of care obligations requiring platform providers to act against a range of illegal and harmful online material.[36]

Canada

1.41 The Online News Act in Canada establishes a framework for digital news intermediary operators and news businesses to enter into agreements to make news content available on digital platform services.[37]

1.42 On 26 February 2024, the Canadian government also tabled the landmark Bill C-63, which seeks to enact the Online Harms Act.[38] If passed, it will set in place a legislative scheme to hold a wide range of services accountable for reducing the risk of harmful content online. This would include:

requirements for online platforms to monitor and remove seven categories of harmful content;

the establishment of three new regulatory bodies, such as a Digital Safety Commission, to monitor and order removal of content;

new offences under the Criminal Code of Canada, to address hate speech content and messaging; and

new powers to impose cost recovery charges on social media service providers, to fund the operation of the regulations.[39]

Influence of social media platforms in Australia

1.43 Digital platforms have become a part of daily life for many Australians. The ACCC has noted that as of January 2023, there were 21.3 million active users of social media platforms—representing 81 per cent of the total Australian population.[40] The average monthly active users by social media platform between 2019 and 2022 is shown in Figure 1.2.

Figure 1.2 Average monthly active users by social media platform 2019–22

Source: ACCC analysis of Sensor Tower data. This chart reflects the yearly average of Australian monthly active users of social media platforms in 2019 and 2022. Extracted from the ACCC, Digital Platform Services Inquiry Interim Report No. 6 – Social media services in Australia, March 2023, p. 32.

How Australians use social media

1.44 Australians use social media services for significant amounts of time each month. For example, between 2020 and 2022 the time spent by Australian monthly users increased from 16.9 to 17.2 hours per month on Facebook, and from 9.8 to 9.9 hours per month on Instagram.[41]

1.45 Similarly, between 2020 and 2022 the average Australian adult's time spent on YouTube's mobile apps increased from 21.5 to 22.6 hours per month. TikTok had the greatest increase in total time spent on its mobile app, with the average adult's time spent on TikTok increasing from 6.5 hours per month in 2020 to 13.1 hours per month in 2022.[42]

1.46 According to the University of Canberra's latest Digital News Report: Australia, Australians are also increasingly relying on social media to get their news, with one in two Australians using social media as a general source of news, and 25 per cent using it as their main source (see Figure 1.3).[43] The study also noted that 60 per cent of Gen Z audiences rely on social media as their main source of news, and that Instagram was their top news source among social media platforms.[44]

Figure 1.3 General sources of news 2016–24 (%)

Source: Digital News Report: Australia 2024. Canberra: News and Media Research Centre, University of Canberra, p. 81.

Operational/business models

1.47 The business model used by some of the larger consumer-facing digital platforms has been to charge a zero monetary price to consumers in return for their attention, the collection of their data, and the subsequent ability to sell targeted advertising opportunities.[45]

1.48 Digital platforms use this data to understand their users' interests, activities, location and demographics, and in turn monetise it by allowing advertisers to target certain types of users.[46] The relationships between digital platforms, consumers, and businesses are illustrated in Figure 1.4 below.

Figure 1.4 Relationships between digital platforms, consumers, businesses and media content creators

Source: Extracted from the ACCC's Digital Platforms Inquiry – Final Report, June 2019, p. 61.
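The targeting side of this business model can be illustrated with a minimal sketch. The data model below is entirely hypothetical (the `UserProfile` fields and criteria names are invented for illustration); real platforms draw on far richer behavioural and inferred data.

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    """A simplified stand-in for the data a platform holds on a user."""
    user_id: str
    age: int
    location: str
    interests: set[str]

def matches(user: UserProfile, *, min_age: int, location: str,
            interest: str) -> bool:
    """Return True if the user falls within the advertiser's chosen audience."""
    return (user.age >= min_age
            and user.location == location
            and interest in user.interests)

def build_audience(users: list[UserProfile], **criteria) -> list[str]:
    """Select the users an advertisement would be shown to."""
    return [u.user_id for u in users if matches(u, **criteria)]
```

In this toy model an advertiser specifies demographic and interest criteria, and the platform, rather than the advertiser, holds the underlying user data and resolves the audience.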

Algorithms and recommender systems

1.49 The way digital platforms collect and present content to their users has changed over time, and the processes they have used for doing so have in part driven their growth. Many platforms initially presented content in chronological order, but as technology advanced, they began to use algorithms to distribute content to their users in a more personalised manner.[47] Algorithms are used to target advertisements to specific groups of users, making display advertising opportunities highly effective at reaching their desired audience.[48]

1.50 Algorithms facilitate a level of personalisation and help users navigate online material to discover content of relevance and interest. They are also used by some digital platforms to assist with content moderation and the identification of harmful material, as well as for targeted advertising.[49]

Box 1.1 What are algorithms?

An algorithm is a coded sequence of instructions that is often used by online service providers to prioritise content a user will see.

These instructions are determined by platforms based on many factors, such as user attributes and patterns, and can involve personalised suggestions to achieve a particular goal, such as discovering new artists, friends, products, activities and ideas, as well as helping businesses and creators efficiently reach a target audience.

For these reasons, algorithms are used by almost all digital platforms to amplify, prioritise and recommend content and accounts to their users. Their use and sophistication continue to grow, with multiple algorithms typically being active within a platform at any given time, all completing different tasks with different outcomes.[50]

1.51 While the specific process by which content is prioritised can be called a 'recommender algorithm', large social media platforms do not use a single algorithm to sort content. Most platforms have complex and interlocking algorithmic systems that use varying kinds of algorithms for a range of purposes—including but not limited to content recommendation.

1.52 A set of algorithms that prioritises content or makes personalised content suggestions on an online service is referred to as a 'recommender system' or 'content curation system'.[51] Not all social media content is driven by recommender algorithms, but their use has significantly increased over time because of how effective the algorithms are in driving user engagement and, therefore, profit.[52]

1.53 Where used, recommendation algorithms seek to increase engagement with platform content (such that users stay on the platform longer).[53]
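The engagement-driven ranking described above can be sketched in miniature. Everything in this example is illustrative: the `Post` fields, the weights and the recency decay are invented stand-ins for the thousands of tuned signals a real platform combines.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_like_rate: float   # model-estimated probability the user likes the post
    predicted_watch_time: float  # model-estimated seconds of attention
    recency_hours: float         # hours since the post was published

def score(post: Post) -> float:
    """Combine engagement predictions into a single ranking score.

    Newer posts decay less; the weights (5.0 and 0.1) are arbitrary
    illustrative values, not those of any real platform.
    """
    recency_decay = 1.0 / (1.0 + post.recency_hours / 24.0)
    return (5.0 * post.predicted_like_rate
            + 0.1 * post.predicted_watch_time) * recency_decay

def recommend(candidates: list[Post], k: int = 3) -> list[str]:
    """Return the ids of the k highest-scoring candidate posts."""
    ranked = sorted(candidates, key=score, reverse=True)
    return [p.post_id for p in ranked[:k]]
```

The point of the sketch is the feedback loop it implies: because the score rewards predicted engagement, content that holds attention is surfaced more often, which in turn generates more engagement data to train the predictions.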

Concerns about algorithm use

1.54 There are significant concerns that algorithms used by digital platforms do not operate in a way that adequately supports community values, such as fairness, accuracy, privacy and user safety.[54] These concerns are discussed in more detail in Chapter 2 of this report.

The Bargaining Code

1.55 The Code commenced in March 2021 following the passage through the Parliament of the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Act 2021.[55]

1.56 The Code aims to govern commercial relationships between Australian news businesses and 'designated' digital platforms that benefit from a significant bargaining power imbalance.[56] It allows eligible registered news media businesses to bargain individually or collectively with digital platforms over payment for the inclusion of news on these platforms and services.[57]

1.57 News media businesses can be registered by ACMA if they satisfy tests relating to annual revenue, the type of news content published, professional and editorial standards, and having a predominantly Australian audience.[58] Currently, the revenue test for corporations is set at $150 000 per annum (either in the most recent year for which there are accounts, or in three of the five most recent years for which there are accounts).[59]
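The revenue test described above can be expressed as a simple check. This is a sketch only: the function name and input shape are assumptions, and the Code's other eligibility tests (content, standards and audience) are deliberately ignored.

```python
def meets_revenue_test(annual_revenues: list[float],
                       threshold: float = 150_000) -> bool:
    """Check the Code's revenue test for a news business corporation.

    `annual_revenues` lists revenue for the most recent years for which
    there are accounts, most recent first. The test is met if revenue
    reached the threshold in the most recent year, or in three of the
    five most recent years.
    """
    if not annual_revenues:
        return False
    # Limb 1: threshold met in the most recent year with accounts.
    if annual_revenues[0] >= threshold:
        return True
    # Limb 2: threshold met in three of the five most recent years.
    recent_five = annual_revenues[:5]
    return sum(r >= threshold for r in recent_five) >= 3
```

The second limb means a business with one poor recent year can still register if its revenue was above the threshold in most of the preceding years.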

1.58 The Code was developed by the ACCC at the direction of the former Australian Government in response to the ACCC's Digital Platforms Inquiry Final Report. The report identified a significant power imbalance between Australian news organisations and two digital platforms, namely Google and Facebook:[60]

While the digital platforms clearly value the news media content that they are able to display to their users, Google and Facebook each appear to be more important to the major news media businesses than any one news media business is to Google or Facebook. As set out above, this provides each of Google and Facebook with substantial bargaining power in relation to many news media businesses.[61]

1.59 This imbalance, coupled with the reliance of news organisations on traffic from Google and Facebook, had led to news organisations 'accepting commercial deals with these platforms that are less favourable than they would otherwise agree to'. It was also seen to be 'undermining the ability and incentives for Australian news businesses to produce news content'.[62]

1.60 The ACCC inquiry heard particular concerns about:

the lack of warning provided by digital platforms to news media businesses of changes to key algorithms relating to the display of news content or news referral links;

the implementation of policies and formats that may have a significant and adverse impact on the ability of news media businesses to monetise their content and/or to build or sustain a brand and therefore an audience; and

the impact of such policies on the incentives for news and journalistic content creation, particularly where significant effort is expended to research and produce original content.[63]

1.61 Government intervention in the relationship between news organisations and digital platforms was considered necessary due to the 'public benefit provided by the production and dissemination of news and the importance of a strong independent media in a well-functioning democracy'.[64]

Operation of the Code

1.62 Following the introduction of the Code, Google and Meta reached voluntary commercial agreements with many news media organisations.[65] The ACCC has separately authorised both Country Press Australia and Commercial Radio Australia to collectively bargain with Google and Facebook for remuneration for news content featured on those platforms without breaching Australian competition laws.[66]

1.63 The ACCC has similarly published two collective bargaining class exemption notices lodged by the Minderoo Foundation on behalf of 23 small publishers, which allow them to collectively bargain with Google and Meta for remuneration for news content featured on those platforms without breaching Australian competition laws.[67]

1.64 According to the Treasury, as at the end of 2021, Google had entered into 20 agreements with news organisations and Meta had entered into 14 agreements.[68] While the details of these agreements remain confidential, it is estimated that the total funding secured by news organisations following the introduction of the Code exceeds $200 million.[69]

1.65 Under the Code, the Minister may designate a digital platform corporation and its digital platform services to participate in the Code.[70] At the time of writing, no digital platforms have yet been designated.[71]

1.66 In deciding whether to designate a digital platform, the Minister must consider whether there is a significant bargaining power imbalance between the platform and Australian news businesses, and whether the platform has made a significant contribution to the sustainability of the Australian news industry, including through agreements to remunerate those businesses for their news content.[72]

1.67 On 29 February 2024, Meta announced on its website that it would be discontinuing Facebook News—a dedicated tab for news content—in the United States and Australia.[73] A similar announcement to discontinue Facebook News in the UK, France, Germany and Canada was made in 2023.[74]

1.68 As part of the announcement, Meta stated that it had decided not to enter or renew commercial deals for news content in Australia and that no new Facebook products would be offered for news publishers in the future.[75]

1.69 Further discussion of the impact of Meta's decision to abandon deals under the Code in Australia is provided in Chapter 3.

Review of the Code

1.70 In November 2022, the Department of the Treasury published the report of its review of the Code.[76] The review considered it 'reasonable to conclude that the Bargaining Code has been a success to date'.[77] It also noted that over '30 commercial agreements had been struck, agreements that were highly unlikely to have been made without the Bargaining Code'.[78]

1.71 The review made several recommendations for improving the future operation of the Code, including that the government consider:

directing the ACCC to prepare periodic reports on extending the Code to other platforms;

the need for additional ACCC information gathering powers in relation to agreements between platforms and news businesses, in the context of its response to the ACCC's proposed major digital platforms reforms; and

commissioning another review of the Code after it has operated for four years.[79]

1.72 The government response to the Treasury review indicated support for all five of its recommendations.

Footnotes

[1]Journals of the Senate, No. 110, 15 May 2024, pp. 3357–3359; House of Representatives, Votes and Proceedings, No. 119, 16 May 2024, pp. 1530–1531.

[2]Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA), Submission 12, p. 3.

[3]Australian Competition and Consumer Commission (ACCC), Digital Platform Services Inquiry Interim Report No. 6 – Social media services in Australia, March2023.

[6]DITRDCA, Submission 12, p. 3.

[7]DITRDCA, Submission 12, p. 3. Section 13 of the Online Safety Act 2021 (Cth) defines the term ‘social media service’ in relatively broad terms. The powers in the Act apply in relation to social media services and relevant electronic services, covering a wide range of digital interaction methods.

[10]The Digital Platform Regulators Forum (OP-REG), Submission 35, pp. 1–2. OP-REG is an initiative of Australian independent regulators to share information about and collaborate on. cross-cutting issues and activities on the regulation of digital platforms.

[11]OP-REG, Submission 35, pp. 1–2.

[12]DITRDCA, Submission 12, pp. 8–13.

[13]Australian Communications and Media Authority, Submission 52, pp. 5–10.

[14]Australian Communications and Media Authority, Submission 52, pp. 5–6.

[15]Australian Communications and Media Authority, Submission 52, p. 7. See also, DITRDCA, Submission 12, pp. 10–11.

[16]The ACCC must provide the Minister an interim report every six months until the inquiry concludes, with a final report to be provided to the Minister by 31 March 2025.

[17]OP-REG, Submission 35, p. 3.

[18]Government Response to ACCC Digital Platform Services Inquiry, Government Response to ACCC Digital Platform Services Inquiry,2023. See also, DITRDCA, Submission 12, pp. 11–12.

[19]Office of the Australian Information Commissioner, Annual report 2022–23, p. 7.

[20]Australian Government, Attorney-General's Department, Privacy Act Review Report 2022, 16February 2023. The Privacy Act Review originated out of the recommendations of the ACCC's 2019 Digital Platforms Inquiry final report.

[22]Office of the eSafety Commissioner, Submission 1, pp. 2–3.

[23]Office of the eSafety Commissioner, Submission 1, pp. 2–3.

[24]Office of the eSafety Commissioner, Submission 1, pp. 4–5.

[25]Office of the eSafety Commissioner, Submission 1, p. 14. DITRDCA, Submission 12, p. 10.

[26]The Hon Michelle Rowland MP, Minister for Communications, 'Ensuring our online safety laws keep Australians safe', Media Release, 13 February 2024.

[27]EU 2022 Digital Services Act, https://eur-lex.europa.eu/eli/reg/2022/2065/oj; EU 2022 Digital Markets Act,https://digital-markets-act.ec.europa.eu/index_en (accessed 23 July 2024). See also, DITRDCA, Submission 12, pp. 13–14; Australian Gaming and Screens Alliance, Submission 59, p. 12; CHOICE, Submission 51, pp. 14–15.

[31]DITRDCA, Submission 12, p. 14.

[32]DITRDCA, Submission 12, p. 14.

[33]DITRDCA, Submission 12, p. 14. On 29 April 2024, the European Commission designated Apple with respect to its iPadOS, its operating system for tablets, as a gatekeeper under the DMA. On13May 2024, the Commission also designated under the DMA, Booking as a gatekeeper for its online intermediation service Booking.com. In total, 24 core platform services provided by those gatekeepers have been designated.

[34]UK 2023 Online Safety Act 2023https://www.legislation.gov.uk/ukpga/2023/50/enacted. Under the UK OSA, with the agreement of the courts, Ofcom can require payment providers, advertisers and internet service providers to stop working with a site, preventing it from generating revenue or being accessed from the UK. See also, Australian Human Rights Commission, Submission 79, p. 6; DITRDCA, Submission 12, p. 14.

[35]Government of the United Kingdom, 'UK children and adults to be safer online as world-leading bill becomes law', Media Release, 26 October 2023. See also, DITRDCA, Submission 12, p. 14.

[36]Government of the United Kingdom, 'UK children and adults to be safer online as world-leading bill becomes law', Media Release, 26 October 2023. See also, Reset.Tech Australia, Submission 16, p. 1; DITRDCA, Submission 12, p. 14.

[37]DITRDCA, Submission 12, p. 15.

[38]Canada Online Harms Bill 2024, https://www.parl.ca/LegisInfo/en/bill/44-1/c-63. See also, DITRDCA, Submission 12, p. 15.

[39]DITRDCA, Submission 12, p. 15.

[43]News and Media Research Centre, University of Canberra, Submission 47, p. 2.

[44]Park, S., Fisher, C., McGuinness, K., Lee, J., McCallum, K., Cai, X., Chatskin, M., Mardjianto, L. & Yao, P. (2024). Digital News Report: Australia 2024. Canberra: News and Media Research Centre, University of Canberra, p. 81.

[47]See, for example, Meta, Submission 46, p. 45; Office of the eSafety Commissioner, Recommender systems and algorithms position statement, December 2022, p. 1.

[49]Office of the eSafety Commissioner, Recommender systems and algorithms position statement, December 2022, p. 1.

[50]Office of the eSafety Commissioner, Submission 1, p. 15.

[51]Office of the eSafety Commissioner, Recommender systems and algorithms position statement, December 2022, p. 1.

[52]Office of the eSafety Commissioner, Recommender systems and algorithms position statement, December 2022, p. 1.

[53]Office of the eSafety Commissioner, Recommender systems and algorithms position statement, December 2022, p. 1.

[54]ACCC, Digital Platform Services Inquiry Interim Report No. 6 – Social media services in Australia, March 2023, pp. 156–157.

[55]The Hon Josh Frydenberg MP, Treasurer, and the Hon Paul Fletcher MP, Minister for Communications, Urban Infrastructure, Cities and the Arts, 'Parliament passes News Media and Digital Platforms Mandatory Bargaining Code', Media Release, 25 February 2021.

[56]Australian Government, the Treasury, Review of the News Media and Digital Platforms Mandatory Bargaining Code (Consultation Paper, 1 April 2022).

[57]Australian Government, the Treasury, Review of the News Media and Digital Platforms Mandatory Bargaining Code (Consultation Paper, 1 April 2022).

[58]Australian Government, the Treasury, Review of the News Media and Digital Platforms Mandatory Bargaining Code (Consultation Paper, 1 April 2022).

[59]Australian Communications and Media Authority, News media bargaining code: Eligibility guidelines, July 2022, p. 5.

[61]ACCC, Digital Platforms Inquiry Final Report, June 2019, p. 16.

[63]ACCC, Digital Platforms Inquiry Final Report, June 2019, p. 16.

[65]Australian Government, the Treasury, Review of the News Media and Digital Platforms Mandatory Bargaining Code (Consultation Paper, 1 April 2022).

[66]Australian Government, the Treasury, Review of the News Media and Digital Platforms Mandatory Bargaining Code (Consultation Paper, 1 April 2022).

[67]Australian Government, the Treasury, Review of the News Media and Digital Platforms Mandatory Bargaining Code (Consultation Paper, 1 April 2022).

[68]Australian Government, the Treasury, Review of the News Media and Digital Platforms Mandatory Bargaining Code (Consultation Paper, 1 April 2022).

[69]Media, Entertainment and Arts Alliance, Submission 53, [p. 5]. See also, Mr Terry Flew, Professor of Digital Communication and Culture, The University of Sydney, Proof Committee Hansard, 28 June 2024, p. 50.

[70]Australian Government, the Treasury, Review of the News Media and Digital Platforms Mandatory Bargaining Code (Consultation Paper, 1 April 2022).

[71]Australian Communications and Media Authority, Submission 52, p. 2.

[72]Australian Government, the Treasury, Review of the News Media and Digital Platforms Mandatory Bargaining Code (Consultation Paper, 1 April 2022).

[73]Meta, 'An update on Facebook news', Newsroom, 29 February 2024.

[74]Meta, 'An Update on Facebook News in Europe', Newsroom, 5 September 2024.

[75]Meta, 'An update on Facebook news', Newsroom, 29 February 2024.