Dissenting Report from Senator David Pocock

Introduction

1.1 Social media companies have extraordinary power and influence, and have become a central part of our society. While there is broad support for action on mis- and disinformation circulating on social media, few agree that this bill takes the right approach. Based on the evidence the committee heard, I believe a more effective approach to addressing the harms caused by social media companies would be to focus on transparency, through data access for accredited researchers, and to eliminate inauthentic, non-human actors, otherwise known as bots.

1.2 The bill focuses on minimising the harmful effects of mis- and disinformation on digital platforms. Mis- and disinformation are problems on digital platforms, but they stem from the platforms’ algorithms, which are driven by outrage and polarisation. This leads to the amplification of mis- and disinformation, but also of other harmful content, such as material promoting eating disorders and suicide, as well as other graphic and extreme content. These digital platforms understand which content demands our attention most and use algorithms to keep feeding it to users until, in some cases, there is no turning back. The algorithms, and the incentive structures behind them, are causing serious harm.

1.3 For this reason, I think the bill takes the wrong approach and in doing so, creates an unnecessary risk to freedom of speech. As constitutional law expert Professor Anne Twomey AO said in her opening statement to the committee:

Misinformation and disinformation are insidious and widespread through social media platforms. I would like very much to see it all disappear. But I worry that in the process of ineffectually trying to do this, we create worse problems through large scale censorship of contested views and the undermining of democracy in the name of cleansing it from misinformation. What is particularly worrying about this legislation is that responsibility is being outsourced to large overseas companies over which Australia has little control or influence. It could all go very wrong.[1]

1.4 While mis- and disinformation do cause harm, it seems strange to target them specifically while neglecting broader online safety measures, such as implementing all the recommendations from the review of the Privacy Act 1988 (Privacy Act). These Privacy Act changes should be implemented to create a stronger base for all other legislative changes we make to our digital ecosystem. From there, we can take a systems approach to regulating platforms rather than trying to address individual issues one by one. As the Human Rights Commissioner said:

Dealing with these things in isolation raises concerns for us about the consistency of the framework that we'll end up with, and that raises concerns for the ability of individual Australians to actually be able to engage and understand the laws that they're meant to comply with.[2]

Data Access

1.5 The Australian Government initially said it was unconstitutional to provide data access to researchers, then moved an amendment to allow access in limited circumstances. I believe this data access should be expanded beyond data related only to mis- and disinformation. Researchers should be able to analyse all types of data so we can better understand how social media companies and platforms are impacting our communities through algorithms and the targeting of content. As Miss Alice Dawkins, Executive Director, Reset Tech Australia, said:

With access to the same data available to EU researchers under the Digital Services Act scheme, we would be able to much more comprehensively understand what's going on with bot accounts. Of course it wouldn't be perfect data—I want to make that very obvious. We are working with compromises constantly…we would have the ability to get much bigger sample sizes and much more creditable estimations of inauthentic activity on Australian social media. It is squarely in the public interest for that work to be able to be done.[3]

1.6 This is further highlighted by researchers studying the impact social media platforms have on polarisation, who said:

Many critics raise concerns about the prevalence of ‘echo chambers’ on social media and their potential role in increasing political polarisation. However, the lack of available data and the challenges of conducting large-scale field experiments have made it difficult to assess the scope of the problem.[4]

Bot accounts

1.7 Bots do not have the human right to free speech. The committee heard convincing evidence that bots, as artificial entities, do not possess the right to freedom of speech. As the Australian Human Rights Commission (AHRC) pointed out, ‘human rights by their very definition reside in humans.’

1.8 Numerous other witnesses agreed that bots do not have the human right to free speech. A representative of the Human Rights Law Centre (HRLC) stated clearly that non-human actors do not have freedom of expression or any human rights.[5] When I asked if non-human actors, essentially bots, have freedom of expression and some sort of human right, the HRLC representative definitively responded ‘No.’[6]

1.9 And from the Institute of Public Affairs (IPA): ‘I don't think bots do have freedom of speech, so I don't think that they should be entitled to the same rights and freedoms that people have.’[7]

1.10 These quotes reflect a consensus that the right to freedom of speech is reserved for humans and should not extend to artificial systems or entities like bots.

1.11 The committee also heard powerful evidence about how bots can be used to influence public perception on certain topics and weaponised by foreign adversaries to further their own agendas.

1.12 CyberCX described how it uncovered the ‘Green Cicada Network’, a China-linked disinformation campaign on X, while Ms Julie Inman Grant, eSafety Commissioner, said:

…we have botnets or bot armies and bots that are proliferating dis- and misinformation. There was a study done in the US last week which said that some of the Russian inspired bots that were active during the 2020 election are still all over X.[8]

1.13 Miss Alice Dawkins, Executive Director, Reset Tech Australia, explained how researchers have comprehensively and very effectively mapped Kremlin disinformation and pro-Russian bot networks. They have also examined potential Chinese state interference through bot networks, using advanced technical methodologies. These methodologies are available to European researchers under the Digital Services Act scheme.[9]

1.14 While bots may be good for business bottom lines by bolstering account numbers and perceived engagement, they are a risk to good-faith online discourse and social cohesion. They do not have a right to freedom of speech and are being weaponised to influence views on different topics, including in elections.

Concluding comments

1.15 The bill in the form that passed the House of Representatives should not be passed.

1.16 The Australian Government should bring forward a bill that legislates access for researchers to examine what is happening inside social media platforms: from algorithms to mis- and disinformation, the moderation of harmful content, advertising and non-human accounts.

Recommendations

Recommendation 1

1.17 The bill should not be passed.

Recommendation 2

1.18 The Australian Government should introduce legislation to provide researchers with the access to social media companies’ data necessary to study, understand and communicate the impact of social media on our society.

Recommendation 3

1.19 The Australian Government should introduce legislation that places an obligation on social media companies to remove bot accounts that impersonate humans, and to clearly label all legitimate bot accounts.

Recommendation 4

1.20 The Australian Government should introduce legislation for media literacy training throughout Australian schools as well as broader society, with the objective of helping people understand how to think critically about the information they see and how to verify claims.

Senator David Pocock

Participating Member

Footnotes

[1] Professor Anne Twomey AO, Opening Statement, 11 November 2024.

[2] Mrs Lorraine Finlay, Human Rights Commissioner, Australian Human Rights Commission, Proof Committee Hansard, 17 October 2024, p. 31.

[3] Miss Alice Dawkins, Executive Director, Reset Tech Australia, Proof Committee Hansard, 11 October 2024, p. 21.

[4] Brendan Nyhan et al., ‘Like-minded sources on Facebook are prevalent but not polarizing’, Nature, 27 July 2023.

[5] Mr David Mejia-Canales, Senior Lawyer, Human Rights Law Centre (HRLC), Proof Committee Hansard, 11 November 2024, p. 4.

[6] Mr Mejia-Canales, HRLC, Proof Committee Hansard, 11 November 2024, p. 4.

[7] Mr Daniel Wild, Deputy Executive Director, Institute of Public Affairs (IPA), Proof Committee Hansard, 11 October 2024, p. 27.

[8] Ms Julie Inman Grant, eSafety Commissioner, Office of the eSafety Commissioner, Proof Committee Hansard, 17 October 2024, p. 31.

[9] Miss Dawkins, Reset Tech Australia, Proof Committee Hansard, 11 October 2024, p. 20.