Senator Andrew Bragg's Additional Comments

The Social Media (Anti-Trolling) Bill 2022 (the Bill) places responsibility in the hands of social media platforms. This is a welcome development that will see scrutiny finally applied to how content is generated on these platforms, including through opaque algorithms.
The Coalition Government has led the way in holding Big Tech to account through major reforms on media bargaining and online safety. This Bill builds on that work by seeking to alter the defamation framework in Australia.
I have long argued that the Big Tech platforms are publishers, and they proved this beyond all doubt when they removed former US President Donald Trump from their platforms. They also showed they can switch off users and pages whenever they desire, as demonstrated during the media bargaining debate when Facebook switched off community pages.
Social media platforms have a distinct business model. Individual users generate data about themselves, which is then monetised by the platform through algorithmic targeted advertising. These platforms are, for better or worse, an integral part of life. Nearly two-thirds of Australians are active Facebook users, including 75 per cent of persons aged 13 to 17.1 A staggering number of data points has been collected and monetised by these corporations. In turn, these platforms have become the primary forum for Australians to receive news, impart opinions, view entertainment, and communicate with friends and family.
The ethics of a business model that depends on appropriating personal information require urgent attention from policymakers all over the world. These platforms have immense power and resources. Such power cannot come without responsibilities, and those responsibilities must be defined and enforced in law.
The application of defamation law to social media platforms is one such area. It is this issue which the Bill attempts to address. In announcing this legislation, the Government stated that these laws are intended to 'better protect Australians online' with 'some of the strongest powers in the world when it comes to tackling damaging comments from anonymous online trolls and holding global social media giants to account'.2
The Bill attempts to do this by providing a suite of mechanisms aimed at ensuring that social media platforms and users are held to account for defamatory content. First, the Bill establishes an immunity from defamation claims for page owners. Second, the Bill designates social media platforms as 'publishers'. Third, the Bill provides a defence for social media platforms that put in place a complaints scheme in accordance with the Bill's requirements.
A number of issues have been flagged by submitters. These additional comments will focus on two aspects of the framework set out in the Bill: the immunity for page owners provided at section 14 and the defence for social media platforms provided at section 16.

Immunity for Page Owners

Section 14 of the Bill provides that, for the purposes of the general law of the tort of defamation, page owners are taken not to be publishers. This overrides the High Court's decision in Fairfax Media Publications Pty Ltd v Voller [2021] HCA 27 (Voller), which held that the owners and administrators of social media pages are publishers of third-party comments posted on those pages.3
The protection from liability for page owners is extensive. It was submitted that the effect would be to remove any form of liability for page hosts or administrators for defamatory content on their pages. It was contended that this would grant a 'complete immunity'4 for page owners. Professor David Rolph argued that this would enable page owners to 'act as negligently, irresponsibly or maliciously as they please without fear of any legal consequences.'5
It was argued that this would lead to anomalous outcomes, granting greater protection from liability in the online world than in the physical world. This is illustrated in Submission 14:
A person who has a physical noticeboard may become liable as a publisher of third party comments posted on it if they become aware of the presence of those comments and, having the power to remove them, fail to do so within a reasonable period of time. If the proposed reform is to be enacted as it currently stands, the explanatory materials would need to provide a compelling policy justification for the disconformity between the legal protection afforded in the online and physical worlds.6
It was submitted that these anomalous outcomes were partly a function of the Bill overturning High Court precedent. As Sue Chrysanthou SC, Patrick George, Rebekah Giles, Nicholas Olson, Richard Potter SC and Kieran Smark SC submitted:
A proper reading of that case (the Voller decision) is that there has been no change to the law of publication. … It is simply a restatement of the law. No legislative change is needed to deal with the decision given it effectively has no impact on the law.7
As the Law Council noted, the current drafting of the Bill provides that page owners would not be liable even if they fail to remove content which they have been explicitly warned is defamatory.8 As a result, page owners would not be incentivised to moderate their social media pages.9 Page owners can moderate content on their pages quickly and effectively.10 By contrast, seeking moderation by making an application to the platform can often be a cumbersome and lengthy process.
Further, the Law Council noted that the exemption for page owners under section 14 would also apply if page owners 'seek or conduce defamatory material' while falling short of posting the material themselves.11 For example, a page owner would not be liable in defamation even if they are 'actively encouraging conversations which the users ultimately end up defaming each other'.12
As a result, a number of submitters took the view that this liability protection would have the unintended effect of increasing, rather than decreasing, the proliferation of defamatory content on social media.
It is for this reason that I support the Chair's recommendation that the Bill be amended so that the immunity does not apply where a page owner has notice or awareness of defamatory content, or has knowingly encouraged defamatory content.

The Complaints Scheme Defence

Having granted an immunity from liability for page owners, the Bill instead deems social media providers to be the publishers of material hosted by their services. Social media providers have access to a defence where a complaints scheme is put in place in accordance with the requirements established in the Bill.
Among other requirements, the complaints scheme would require social media companies to have a nominated entity in Australia and, if requested, to provide the complainant with the relevant contact details of the poster of the allegedly defamatory material.
The Attorney-General's Department submitted that the complaints scheme is a 'quick and low-cost way for victims to raise concerns about defamatory material and obtain the poster's contact details.'13 However, it is not clear that this mechanism would be more effective than the status quo, where page owners have the primary responsibility for moderating defamatory content.
The availability of the defence may lead to unintended consequences which are likely to further increase the significant power imbalance between individual users and social media platforms.

Scope of the defence

This concern was supported in submissions. In Submission 35, Sue Chrysanthou SC and others contended that this defence could leave claimants without effective legal recourse.14 It was also contended that identifying defamatory commentators, and enforcing court orders against them, would remain a protracted and expensive process.15
The Law Council submitted that there were practical difficulties associated with collecting the contact details of users. The requirement would incentivise end-users to provide false information or use software to obscure or alter their location.16
This was disputed by the Attorney-General's Department, which stated in response to questions on notice that:
To obtain access to the conditional defence the social media company must provide the poster's relevant contact details to the complainant. If the social media company is unable or unwilling to disclose the poster's contact details to the complainant, it will not be able to access the defence.17
Noting this significant difference of opinion, clarification was sought from Sue Chrysanthou SC regarding the effect of establishing this defence for social media companies. Ms Chrysanthou stated that:
The defence gives social media companies a complete defence even where they keep such material online and do not pass on the troll's contact details to the victim of the post, that effectively removes the best chance members of the public have to get such material taken down. … The broad defence given by s16(2) makes it harder, not easier, for members of the public to take action to protect themselves against publications. … Put shortly, enacting s16(2) would be worse than doing nothing.18

Effectiveness and unintended consequences

Numerous submitters expressed concern about the cost, complexity and delay arising from the existence of the defence. Defamation is an area of law that has long been criticised on these grounds. It was submitted that these issues would produce unsatisfactory outcomes for complainants.
It was submitted that this defence would remove the incentive for platforms to take down defamatory material, allowing them to 'neutralise their risk completely by simply asking the author to agree to have his/her contact details passed on'.19 In conjunction with the protection from liability for page owners, this would remove the major incentives for defamatory content to be taken down swiftly and effectively. The ultimate effect could be to increase the proliferation of defamatory content online.
It is possible that large, established social media companies would be more likely to have the resources and knowledge to benefit from, and comply with, the section 16 defence. The same resources and knowledge may not be available to nascent challengers. The ultimate effect could be to benefit established players while leaving disruptors exposed to liability.
Recommendations 2 and 3 of the Chair's report seek to address these issues. However, I am open-minded about these and other suggestions to improve this important initiative.
Senator Andrew Bragg
Liberal Senator for New South Wales

1 eSafety Commissioner, 'Young people and social media usage', https://www.esafety.gov.au/research/youth-digital-dangers/social-media-usage (accessed 24 March 2022).
2 Prime Minister and Attorney-General, 'Combatting Online Trolls and Strengthening Defamation Laws', Media Release, 28 November 2021.
3 Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14.
4 Ibid.
5 Professor David Rolph, Submission 14, p. 8.
6 Ibid.
7 Sue Chrysanthou SC, Patrick George, Rebekah Giles, Nicholas Olson, Richard Potter SC and Kieran Smark SC, Submission 35, p. 4.
8 Law Council of Australia, Submission 1, p. 14.
9 Ibid.
10 Ibid.
11 Ibid.
12 Ibid.
13 Attorney-General's Department, answers to written questions on notice, 11 March 2022 (received 16 March 2022).
14 Sue Chrysanthou SC, Patrick George, Rebekah Giles, Nicholas Olson, Richard Potter SC and Kieran Smark SC, answers to written questions on notice, 16 March 2022 (received 17 March 2022).
15 Ibid.
16 Ibid.
17 Attorney-General's Department, answers to written questions on notice, 11 March 2022 (received 16 March 2022).
18 Sue Chrysanthou SC, Patrick George, Rebekah Giles, Nicholas Olson, Richard Potter SC and Kieran Smark SC, answers to written questions on notice, 16 March 2022 (received 17 March 2022).
19 Ibid.
