Chapter 1

Introduction

Referral and conduct of the inquiry

1.1
On 10 February 2022, the Senate referred the Social Media (Anti-Trolling) Bill 2022 (the bill) to the Legal and Constitutional Affairs Legislation Committee (the committee) for inquiry and report by 24 March 2022.1
1.2
Details of the inquiry were advertised on the committee's webpage, and the committee wrote to a number of organisations and individuals inviting them to make a submission. The committee also issued a media release on 4 March 2022 inviting expressions of interest from people who had experienced online harm and who may wish to appear before the committee. The committee called for submissions to be received by 28 February 2022. The committee received 37 submissions, which are listed at Appendix 1.
1.3
The committee held public hearings in Canberra on 10 and 15 March 2022. A list of the witnesses who appeared is at Appendix 2.

Structure of this report

1.4
This report comprises two chapters:
Chapter 1 outlines the key provisions of the bill and provides administrative details relating to the inquiry; and
Chapter 2 examines the key issues raised in evidence and provides the committee view and recommendations.

Background

Development of the bill

1.5
The bill was introduced on 10 February 2022 by the Minister for Communications, Urban Infrastructure, Cities and the Arts, the Hon. Paul Fletcher MP, who stated:
Social media services2 have provided Australians significant social, educational and economic benefits for many years. However, these benefits have also come with greater exposure to online harms—including exposure to emotional, mental health and financial harms that can result where malicious online trolls post defamatory material to attack a person's reputation.
Today we introduce the Social Media (Anti-Trolling) Bill, to ensure defamation law is fit-for-purpose in the digital age and to empower Australians to respond appropriately to defamatory attacks by anonymous social media trolls.3
1.6
The Attorney-General's Department's (the department) Social Media (Anti-Trolling) Bill 2021 explanatory paper further explained that:
The challenges presented by defamation over social media are particularly acute, given the speed at which such material can spread and the limited scope to contain the spread of a defamatory post once it has been published on a social media network. Where such material is posted anonymously, complainants may have limited ability to identify the poster, and the capacity to seek vindication by commencing defamation proceedings may be limited.4
1.7
In addition to these challenges, the bill seeks to respond to the High Court's decision in Fairfax Media Publications v Voller [2021] HCA 27 (Voller), which was handed down on 8 September 2021.5 In Voller, a majority of the High Court held that individuals and organisations with social media pages on which third party material can be posted may be publishers of that material for the purposes of defamation law.6
1.8
The Minister for Communications, Urban Infrastructure, Cities and the Arts, the Hon. Paul Fletcher MP, outlined the potential impact the Voller decision could have on individuals and business in Australia who maintain social media pages, stating:
A person who maintains a social media page may be a 'publisher' of defamatory comments posted by third parties. This means that they may be liable in defamation, even if they are not aware of the defamatory material, and do not intend its publication. Liability concerns may have a chilling effect on free speech, as people who administer social media accounts may censor comments or disable functionalities due to a fear of being held liable in defamation for content that they did not post.7
1.9
Similarly, in its explanatory paper the department outlined the potential negative impacts that the Voller decision could have on freedom of speech, including effectively requiring individuals and businesses to monitor social media pages to avoid being considered a publisher for the purposes of a defamation action. The explanatory paper also highlighted that, while anonymity is a legitimate feature of the digital ecosystem, the anonymity provided by social media services can be abused by users to post harmful comments and it can be very difficult for victims to respond appropriately.8
1.10
In developing the bill, the department undertook consultation with a number of stakeholders, including the Department of Infrastructure, Transport, Regional Development and Communications, the Defamation Working Party (which is comprised of representatives from the states and territories operating under the direction of the Meeting of Attorneys-General), the Office of the Australian Information Commissioner, the eSafety Commissioner, the legal community, academics, social, digital, and traditional media stakeholders, as well as advocacy and specific interest groups.9 The department also received 44 written submissions and 25 survey responses via an online consultation hub.10

Interaction with other legislation and reforms

1.11
In developing the bill, the department considered the broader policy landscape. Consequently, the bill is intended to complement existing regulations aimed at preventing other forms of online harm, many of which occur on social media; namely, cyber abuse and cyber bullying under the Online Safety Act 2021 (Online Safety Act), and use of a carriage service to menace, harass or cause offence under the Criminal Code Act 1995 (Cth), set out below.11 The bill addresses a legislative gap by introducing protections for harm specifically caused by defamatory material published on social media.12

Online Safety Act 2021

1.12
The eSafety Commissioner is empowered under the Online Safety Act to investigate adult cyber abuse material13 that is published on social media. While in some cases, defamatory material published on social media can meet the threshold for cyber abuse,14 comprising 'a measurable proportion of reports under [the] serious adult cyber abuse scheme',15 the commissioner's existing powers are not intended to capture defamatory material published online that causes reputational harm.16

Criminal Code Act 1995 (Cth) (Criminal Code)

1.13
Alongside the Online Safety Act, the Criminal Code provides protection from online harm by containing an offence for use of a carriage service17 to menace, harass or cause offence.18

Review of Model Defamation Provisions

1.14
Defamation is a tort at common law, which can be modified by state, territory and Commonwealth legislation. In 2004, an agreement between the Attorneys-General of the states and territories, aimed at creating uniformity in defamation law, resulted in Model Defamation Provisions (MDP) being enacted at the state and territory level.19 At this time, the then federal Attorney-General, Mr Philip Ruddock, stated that a failure by the states and territories to enact uniform defamation laws would result in the Commonwealth developing legislation to override state and territory defamation laws.20
1.15
On 8 June 2018, the Council of Attorneys-General (now the Meeting of Attorneys-General (MAG)) reconvened a working party to review the MDPs. The working party is currently progressing reform in two stages. Stage 1, completed in July 2020, focused on reviewing the MDPs to ensure their policy objectives remain valid and up to date. Stage 2, currently in progress, is focusing on the responsibilities and liability of digital platforms for defamatory material published online.21
1.16
While the review is ongoing, as noted in the department's submission, the consequences of the Voller decision and the harmful effects of defamatory material on social media require urgent reforms to provide clarity to Australian social media users and internet intermediaries. Accordingly, the bill is narrowly focussed to address a specific issue around liability for defamatory material published on social media.22

Privacy Act 1988 (Cth)

1.17
The Privacy Act 1988 (Privacy Act) is the primary piece of Australian legislation that protects the privacy of individuals, and restricts how government and industry can collect, use, and disclose an individual's personal information.23
1.18
The bill protects social media service providers from civil liability under the Privacy Act where they comply with obligations to collect and disclose country location data24 and relevant contact details. The explanatory memorandum emphasises that 'disclosure mechanisms are only enlivened where there is reason to believe that there may be a right for the complainant to obtain relief against the poster in a defamation proceeding.' Further, it notes that 'the definition of 'relevant contact details' is narrowly drafted to include a person's name (or the name by which they are known), e-mail address, telephone number and such other information as is specified in legislative rules'.25
1.19
In December 2019, the Australian Government made a commitment to conduct a review of the Privacy Act. The review is currently being undertaken by the department and will consider issues relating to the collection and security of personal information by social media services.26 As this review is still underway, this report considers the compatibility of the bill with the existing provisions of the Privacy Act.

Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021 (Online Privacy Bill)

1.20
In March 2018, the Australian Government committed to strengthening privacy protections by introducing a binding code of practice for social media and other online platforms that trade in personal information, and by enhancing enforcement mechanisms and penalties provisions under the Privacy Act.27
1.21
Consultation is currently being undertaken by the department on an exposure draft of the Online Privacy Bill. The exposure draft seeks to empower the Australian Information Commissioner to create a privacy code that would be binding on social media service providers (OP code). The OP code would address how certain existing and new privacy obligations would apply to organisations such as social media service providers, and how obligations would apply in relation to children and vulnerable groups. For example, the OP code would require social media services to obtain the consent of a parent or guardian before collecting, using or disclosing the personal information of a child under the age of 16.28

Purpose of the bill

1.22
The bill creates a framework to allow Australians to respond to defamatory material posted on social media. Broadly speaking, the bill seeks to achieve the following outcomes:
First, the bill clarifies who is a publisher of material posted on social media for the purposes of defamation law.29
Second, the bill creates new mechanisms for Australians to ascertain whether potentially defamatory material on social media was posted in Australia and, if so, to obtain the poster's relevant contact details for the purposes of instituting defamation proceedings in an Australian court.30
Third, in circumstances where user information is often held outside of Australia, the bill requires social media providers to have a nominated entity in Australia with access to country location data and relevant contact details of posters in relation to material posted in Australia to discharge key obligations of the bill.31
Fourth, the bill allows the Attorney-General to intervene in certain cases and provide legal assistance where necessary.32

Outline of the bill

1.23
The bill contains five parts:
Part 1—Introductory matters
Part 2—Liability
Part 3—End-user information disclosure orders
Part 4—Nominated entities
Part 5—Miscellaneous.33
1.24
Part 1 of the bill deals with introductory matters including commencement, definitions, when material is posted in Australia and the bill's interaction with the general law of the tort of defamation, and state and territory defamation laws. Part 1 clarifies that the bill is intended to operate in conjunction with the current (and future) corpus of defamation law from all of its sources, except to the extent modified by the bill.34
1.25
Part 2 of the bill provides that an Australian person who maintains or administers a page of a social media service is taken to not be a publisher of third-party material posted on the page for the purposes of the general law of the tort of defamation, and thereby cannot be liable in defamation for the posting of that material. Part 2 of the bill also clarifies that, if material is posted in Australia, social media service providers are deemed publishers of the material posted on their service for the purposes of the general law of the tort of defamation. This applies both to material posted by account holders on a page which they maintain or administer (page owners), and to material posted on a page which another individual administers or maintains. Part 2 of the bill also provides a conditional defence for social media service providers in defamation proceedings that relate to material posted on their platform, provided the material is posted in Australia. Social media service providers will benefit from the defence where they meet conditions specified in the bill. Part 2 also sets out the prescribed requirements for the complaints scheme.35
1.26
Part 3 of the bill establishes a framework for end-user information disclosure orders (EIDO). EIDOs are a new form of court order which may be obtained from an Australian court that has jurisdiction to hear a substantive defamation proceeding, or from the Federal Circuit and Family Court of Australia (Division 2). EIDOs will allow an applicant to obtain country location data in relation to material posted on a social media service and, if the material is posted in Australia, obtain the poster's relevant contact details. Part 3 clarifies that the jurisdiction to grant EIDOs does not otherwise affect the jurisdiction of the Federal Circuit and Family Court of Australia to hear and determine substantive defamation cases.36
1.27
Part 4 of the bill requires a social media service provider that is a body corporate incorporated in a foreign country and meets specified thresholds to establish an Australian incorporated entity, known as the 'nominated entity', that is an agent of the provider and has an office in Australia. The nominated entity is required to have access to country location data and relevant contact details of posters in relation to material posted in Australia, and has authority to receive complaints and requests under the complaints scheme. The obligation to establish a nominated entity is enforceable by a civil penalty provision, and establishing a nominated entity is also a prerequisite for accessing the conditional defence under Part 2 of the bill.37
1.28
Part 5 allows the Attorney-General to intervene in matters arising under the bill, or in defamation proceedings to which a social media service provider is party, where the proceedings are before a court exercising federal jurisdiction. Where the Attorney-General chooses to intervene, the Commonwealth is taken to be a party to the proceeding, and the court may make costs orders against the Commonwealth as it sees fit. Under this Part, the Attorney-General may also authorise payment to the applicant by the Commonwealth for costs reasonably incurred in relation to the proceeding.38
1.29
The bill provides for different commencement periods for different provisions. The provisions which deem page owners to not be publishers would commence on the earlier of proclamation or 6 months after the bill receives Royal Assent. The remaining substantive provisions of the bill would commence on the earlier of proclamation or 12 months after the bill receives Royal Assent.39

Key provisions

Protecting social media page owners from liability

1.30
Clause 14 applies to end-users of social media services (page owners) who maintain or administer a page on that service. This clause explicitly affects the general law of the tort of defamation, by clarifying that a page owner is not a publisher of material posted on the relevant page by a third party for the purposes of defamation law, if the page owner is an Australian person, regardless of where the poster was located when the material was posted. As a result, the page owner cannot be liable under Australian defamation law in respect of the material posted.40
1.31
Clause 14 addresses the implication of the High Court's decision in Voller. Specifically, it overrides part of that decision, by providing that end-users who maintain or administer pages on social media (as page owners) are not taken to be publishers of third-party comments on their pages.41
1.32
The poster's status as a publisher of the material is not affected by the bill. This includes where a person posts material to their own social media page.42
1.33
The bill is not intended to affect the characterisation of any separate act of publication of the material by the page owner, such as reposting. It is also not intended to affect any liability for the page owner on a basis other than the law of defamation in respect of the material posted by the third party.43

Classifying social media providers as publishers for the purposes of defamation law

1.34
Clause 15 of the bill provides that, for the purposes of defamation law, a social media service provider is a publisher of material posted on a page of the social media service.44
1.35
Subclause 15(1) applies where one end-user of the social media service posts material on the page of another end-user. Subclause 15(2) applies where an end-user of the social media service posts material on their own page.45
1.36
Subclauses 15(1) and (2) address the implications of the Voller decision. They clarify a consequence of that decision by providing that the provider of the social media service is a publisher of such material if it is posted in Australia. The provider may, in any case, be a publisher as a consequence of the High Court's reasoning in Voller. However, subclauses 15(1) and (2) remove the need for a complainant to have to prove in legal proceedings that the social media service provider was a publisher of the material.46
1.37
Subclause 15(3) explicitly displaces section 235 of the Online Safety Act 2021, and the defence of innocent dissemination (whether under general law or the law of a State or Territory), in cases where a provider of a social media service is a publisher of material that is posted in Australia. This applies whether the provider is a publisher as a result of subclause 15(1) or (2), or otherwise.47

End-user information disclosure orders

1.38
Clause 19 establishes a framework for a prospective applicant to apply to a court for an EIDO. An EIDO can require the provider of a social media service to disclose the poster's country location data and (if the material was posted in Australia) the poster's relevant contact details to the prospective applicant.48
1.39
An application for an EIDO may be made to either an Australian court that would have jurisdiction to hear the defamation proceeding, or the Federal Circuit and Family Court of Australia (Division 2).49
1.40
There is no requirement for a prospective applicant to first go through the social media service provider's complaints scheme before they are entitled to apply for a court order. A person who has been subjected to defamatory material posted on social media can choose to pursue a complaint under the social media service provider's complaints scheme, to apply to the court for an EIDO, or both. This gives a prospective applicant the flexibility to use the less formal complaints mechanism where they consider it will be helpful, or to go straight to applying for a court order where they consider the complaints mechanism is unlikely to be helpful.50

Defence available to social media providers

1.41
Clause 16 of the bill creates a conditional defence for the provider of a social media service in a defamation proceeding in which they are a defendant.51
1.42
Subclause 16(1) makes clear that the clause applies where:
(i)
a person has posted material on a page of a social media service;
(ii)
the provider of the social media service is a publisher of that material;52
(iii)
a person has instituted defamation proceedings in relation to the material; and
(iv)
the provider, a related body corporate or nominated entity is a defendant in the proceeding.53
1.43
Subclause 16(2) outlines when the defence will be available. It provides that, subject to subclause (3), it is a defence if the defendant referred to above proves the following elements:
(i)
the material was posted in Australia; and
(ii)
the social media service has a complaints scheme that meets the prescribed requirements in clause 17; and54
(iii)
where the applicant (or a person acting on behalf of the applicant) has made a complaint to the provider about the material—the provider has complied with the complaints scheme in relation to that material, including by disclosing the country location data of the poster in relation to the material; and
(iv)
any of the following conditions are satisfied:
(1)
the applicant (or a person acting on their behalf) has requested disclosure of the poster's relevant contact details under the complaints scheme, and the provider has disclosed the relevant contact details to the applicant;
(2)
where an EIDO has been made against the social media service provider—the provider has disclosed the required information and has otherwise complied with the order; or
(3)
the applicant (or a person acting on their behalf) has not applied for an EIDO; and
(v)
at the time the material was posted, the social media service provider had established an Australian nominated entity satisfying the requirements set out in Part 4 of the bill.55
1.44
The defence is only available where the material is posted in Australia. Where the material is posted from a location outside of Australia, the general law of the tort of defamation will apply.56
1.45
Provided the provider has a nominated entity (where required) and has established and complied with a complaints scheme, the provider is entitled to access the defence where it discloses the poster's relevant contact details to the applicant, or where the applicant chooses not to apply to the court for an EIDO. However, where a court has declined to grant an EIDO, or has granted an EIDO but the provider does not (or is not able to) disclose the poster's relevant contact details, the provider is not entitled to access the defence.57
1.46
The explanatory memorandum notes that the objective of subclauses 16(1) and (2) is to make clear that, in all cases, the provider of the social media service must have established a complaints scheme that meets prescribed requirements and established an Australian nominated entity satisfying the requirements set out in Part 4 of the bill in order to have access to the defence.58
1.47
Subclause 16(3) provides for an exception to the defence. If the poster is the social media service provider, a related body corporate of the provider or the nominated entity of the provider, then the defence will not apply so far as the defamation proceeding is against the poster. This ensures that the referenced entities remain accountable for their own material, even where the conditions in subclause 16(2) are satisfied.59
1.48
The references throughout clause 16 to a person acting on behalf of an applicant are intended to take account of circumstances in which another person acts on behalf of the applicant in the complaints scheme or court proceedings in respect of which the clause applies. This could occur in relevant proceedings where, for example, the applicant is a child or has capacity issues, and a parent or guardian has commenced proceedings on behalf of the applicant.60

Requirement for social media provider to have a nominated entity in Australia

1.49
Clause 22 creates a requirement, where the provider of a social media service is a body corporate incorporated in a foreign country, for the provider to ensure it has an Australian 'nominated entity' capable of meeting the various obligations that may arise under, or in connection with, the bill. The purpose of this clause is to ensure that any EIDO made by a court can be readily enforced in Australia against an Australian entity.61
1.50
Paragraph 22(1)(b) provides that this obligation only applies where either: there are at least 250,000 Australian persons who hold accounts with the service; or the service is specified in the legislative rules. This threshold is intended to avoid imposing a disproportionate burden on any social media service provider that has only a small number of Australian users. The ability to specify services in the legislative rules will serve the purpose of avoiding the need to prove this element in civil penalty or defamation proceedings.62
1.51
Paragraphs 22(1)(c), (d) and (e) make clear that the provider of the social media service must ensure that its nominated entity is a body corporate that:
is an agent of the provider; and
is incorporated in Australia; and
has an office in Australia.63
1.52
Additionally, paragraph 22(1)(f) requires the nominated entity to have access to the relevant contact details and country location data of end-users of the service who have posted material on pages of the service, where the material is posted in Australia. There is flexibility in the way that providers of social media services can meet this requirement. For instance, clause 22 does not require data localisation or transfer of data into Australia. If relevant contact details and country location data relating to users are held offshore, the nominated entity may establish arrangements that allow it to access that data on a case-by-case basis as required.64
1.53
Paragraph 22(1)(g) makes clear that, if a social media service provider has a complaints scheme that meets the prescribed requirements under clause 17, the nominated entity must have authority to receive complaints and requests made under that complaints scheme. This aligns with the principle that Australians seeking to make complaints about potentially defamatory material on a page of a social media service that is made in Australia should be able to approach an Australian entity in relation to that material.65
1.54
Paragraph 22(1)(h) makes clear that if the provider of the social media service has related bodies corporate in Australia, the nominated entity must be one of those related bodies corporate. This provision clarifies that, if a provider has at least one related body corporate incorporated in Australia, it may not nominate an unrelated body corporate. The provision also clarifies that, if a provider has multiple related bodies corporate incorporated in Australia, not every related body corporate will be required to act in the capacity of a nominated entity.66

Empowering the Attorney-General to intervene in proceedings brought under the bill

1.55
Clause 25 allows the Attorney-General to intervene in particular proceedings on behalf of the Commonwealth. Subclause 25(1), in particular, provides that the Attorney-General may intervene in a proceeding where:
a poster has posted material on a page of a social media service; and
an Australian person has instituted defamation proceedings in relation to the material; and
the provider of the social media service is a party to the defamation proceeding; and
the proceeding is before an Australian court that is exercising federal jurisdiction; and
the Attorney-General believes it is in the public interest to do so.67
1.56
The explanatory memorandum outlines the purpose of subclause 25(1), noting that:
Defamation cases can be complex, and can sometimes involve a significant power imbalance between a large publisher and an individual whose reputation has been harmed. In part, the power to intervene under subclause 25(1) would allow the Attorney-General to address this power imbalance where it is in the public interest to do so.68
1.57
Subclause 25(2) provides an alternative and additional basis of intervention. The provision makes clear that the Attorney-General may intervene in a proceeding before an Australian court in a matter arising under the bill. This includes proceedings that are not defamation proceedings, but still arise under the bill. For example, the Attorney-General may intervene in proceedings that concern an EIDO (where substantive defamation proceedings have not yet commenced). Subclause 25(2) also serves the additional purpose of allowing the Government to put its views to the court about how this novel legislation is intended to apply.69
1.58
Subclauses 25(3), (4) and (5) concern legal costs in any proceeding where the Attorney-General has intervened under subclause 25(1) or (2). The court may award costs against the Commonwealth and the Attorney-General in these instances and may authorise the payment of reasonable costs incurred by the applicant. These subclauses recognise that intervention by the Attorney-General, particularly in a matter where an uncertain area of law is being settled, may result in additional costs being incurred by the applicant and defendant.70

Consideration by other parliamentary committees

House Select Committee on Social Media and Online Safety (Social Media Committee)

1.59
The Social Media Committee was established by a resolution of appointment that passed the House of Representatives on 1 December 2021, to inquire into the prevalence and impact of online harm in Australia. The Social Media Committee also considered existing legislative frameworks and solutions for improving online safety, particularly for vulnerable users. The inquiry report was tabled on 15 March 2022.71
1.60
Recognising that defamation is part of a broader suite of harms that can arise in the digital world, the Social Media Committee also sought to review the measures in the bill in their broader context.72 The Social Media Committee found that:
Online harm is experienced by 'any user of the internet' with vulnerable groups being recognised as most at risk.73
A 'child's best interest approach' could increase levels of safety on social media platforms.74
In relation to anonymity on social media platforms, submitters were divided on how to strike the correct balance between ensuring rights, such as privacy protections, were maintained whilst also limiting harm caused by individuals who operate anonymously online.75
Education on digital wellbeing could have a role in reducing online harm.76
Existing models for social media platforms which put the onus on the user to take responsibility for their own safety were 'fundamentally unsafe'.
These models encourage the proliferation of harmful 'toxic content' which is particularly profitable as it attracts user engagement.77
As such, social media platforms, in addition to other digital services, had not demonstrated a willingness to put the safety of their users before other considerations.78
1.61
The inquiry report made 26 recommendations with the summarised elements of the recommendations being:
enforce default pro-privacy settings for users under 18 years of age and requirements for manufacturers to ensure all relevant digital products have parental control functionalities.79
establish an educational campaign targeted at all Australians, focusing on digital citizenship, and respectful online interaction.80
appoint a House Standing Committee on Internet, Online Safety and Technological Matters to examine the role of social media in relation to democratic health and, potentially, other relevant issues.81
review and increase funding to support victims of technology-facilitated abuse, through existing Australian Government-funded programs.82
conduct a Digital Safety Review on the existing regulatory framework in relation to the digital industry, with a view to simplify existing regulations, increase platform transparency, strengthen Basic Online Safety Expectations, and implement a duty of care framework.83
for the eSafety Commissioner to:
undertake research focusing on issues such as promoting positive cultural change in the digital space as well as preventing repeat offences of online harm and volumetric attacks.84
become the single point of entry service for victims of online abuse.85
collaborate with relevant Government agencies to review the use of algorithms on digital platforms and examine the need to regulate end-to-end encryption technology in the context of harm prevention.86
consider specific issues outlined in recommendations 11 and 12 of the report when drafting new social media industry codes and implementing the Basic Online Safety Expectations.87

Senate Standing Committee for the Scrutiny of Bills

1.62
The Senate Standing Committee for the Scrutiny of Bills (the scrutiny committee) considered the bill in Scrutiny Digest 2 of 2022, published on 18 March 2022.
1.63
The scrutiny committee noted that significant matters, such as the scope of terms central to a legislative scheme, should be set out in primary legislation unless a sound justification for the use of delegated legislation is provided. In particular, it highlighted the lack of explanation for leaving the definition of 'exempt service' to the legislative rules.
1.64
The scrutiny committee also raised concerns in relation to clause 22, stating that it is unclear in which circumstances it would be appropriate to prescribe social media services with fewer than 250,000 Australian account holders. Given the significance of these terms to the proposed new defamation framework, the scrutiny committee considered that it may be appropriate to include at least high-level guidance in relation to this matter.
1.65
At the time of tabling this report, the scrutiny committee sought further information from the Minister regarding the following matters:
why it is considered necessary and appropriate to leave key elements of the definitions of 'social media service' and 'exempt service' to delegated legislation;
why it is considered necessary and appropriate to leave the scope of the application of the requirement to have a 'nominated entity' to delegated legislation; and
whether the bill can be amended to include at least high-level guidance regarding these matters on the face of the primary legislation.

Acknowledgement

1.66
The committee thanks all submitters and witnesses for the evidence they have provided to this inquiry.

Note on references

1.67
References to the Committee Hansard are to the proof Hansard. Page numbers may vary between the proof and official Hansard transcripts.

  • 1
Journals of the Senate, No. 136, 10 February 2022, pp. 4527-4529.
  • 2
    The bill does not seek to introduce a new definition of 'social media service'. Instead, the bill uses the existing definition under section 13 of the Online Safety Act 2021 which, in brief, states that a 'social media service' is a type of electronic service that enables online social interaction between end-users in Australia. This does not include broadcasting services. (Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 12.)
  • 3
The Hon Paul Fletcher MP, Minister for Communications, Urban Infrastructure, Cities and the Arts, House of Representatives Hansard, 10 February 2022, p. 10. 'Trolling' is not a legal term, and can refer to a range of inappropriate or abusive behaviours online, up to and including serious criminal offences (Attorney-General's Department, Submission 10, p. 1).
  • 4
Attorney-General's Department, Social Media (Anti-Trolling) Bill 2021: Explanatory Paper, (explanatory paper) 1 December 2021, p. 2. (Please note this explanatory paper related to the exposure draft version of the Bill.)
  • 5
    Fairfax Media Publications v Voller [2021] HCA 27.
  • 6
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 2.
  • 7
    The Hon Paul Fletcher MP, Minister for Communications, Urban Infrastructure, Cities and the Arts, House of Representatives Hansard, 10 February 2022, p. 10.
  • 8
    Attorney-General's Department, explanatory paper, p. 3. (Please note this explanatory paper related to the exposure draft version of the Bill.)
  • 9
    Attorney-General's Department, Submission 10, p. 2.
  • 10
    Attorney-General's Department, Submission 10, p. 2.
  • 11
    See Criminal Code 1995 section 474.17(1) and Online Safety Act 2021 sections 6 and 7.
  • 12
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 13.
  • 13
    Under the Online Safety Act, adult cyber abuse material is made out when there is material that is posted with the apparent intention of causing serious harm, and the material is offensive, harassing or menacing in all the circumstances (eSafety Commissioner, Submission 5, p. 3).
  • 14
The threshold of serious harm is serious physical harm, serious psychological harm or serious distress, but does not include 'mere' ordinary emotional reactions such as simple distress, grief, fear or anger (eSafety Commissioner, Submission 5, p. 3).
  • 15
    eSafety Commissioner, Submission 5, p. 1.
  • 16
    eSafety Commissioner, Submission 5, p. 3.
  • 17
    A carriage service is a service that enables users to access the internet (Telecommunications Act 1997 (Cth) section 7).
  • 18
    Attorney-General's Department, Submission 10, p. 2; Criminal Code 1995 (Cth) section 474.17(1).
  • 19
    Attorney-General's Department, Submission 10, pp. 2-3.
  • 20
    Parliamentary Library, 'The Commonwealth plan for reforming defamation law in Australia', research note, No. 4, 21 July 2004, pp. 1-2.
  • 21
    Attorney-General's Department, Submission 10, pp. 2-3.
  • 22
    Attorney-General's Department, Submission 10, p. 3.
  • 23
Attorney-General's Department, Privacy, <https://www.ag.gov.au/rights-and-protections/privacy> (accessed 12 March 2022).
  • 24
    The bill defines country location data as data determined according to the geolocation technology deployed by the social media service provider (Social Media (Anti-Trolling) Bill 2022, clause 6).
  • 25
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 19.
  • 26
    Attorney-General's Department, Review of the Privacy Act 1988, <https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988> (accessed 12 March 2022).
  • 27
    Attorney-General's Department, Online Privacy Bill Exposure Draft, <https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/> (accessed 15 March 2022).
  • 28
    Attorney-General's Department, Online Privacy Bill Exposure Draft, <https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/> (accessed 15 March 2022); Alannah & Madeline Foundation and Dolly's Dream, Submission 3, pp. 5–6.
  • 29
    Attorney-General's Department, Submission 10, p. 1; Social Media (Anti-Trolling) Bill 2022, explanatory paper, p. 4. (Please note this explanatory paper related to the exposure draft version of the Bill.)
  • 30
    Attorney-General's Department, Submission 10, p. 1; Social Media (Anti-Trolling) Bill 2022, explanatory paper, p. 4. (Please note this explanatory paper related to the exposure draft version of the Bill.)
  • 31
    Attorney-General's Department, Submission 10, p. 1; Social Media (Anti-Trolling) Bill 2022, explanatory paper, p. 4. (Please note this explanatory paper related to the exposure draft version of the Bill.)
  • 32
Attorney-General's Department, Submission 10, p. 1; Social Media (Anti-Trolling) Bill 2022, explanatory paper, p. 4. (Please note this explanatory paper related to the exposure draft version of the Bill.)
  • 33
Social Media (Anti-Trolling) Bill 2022, clause 6.
  • 34
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 2.
  • 35
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 3.
  • 36
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 3.
  • 37
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 3.
  • 38
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 3.
  • 39
    Attorney-General's Department, Submission 10, p. 8.
  • 40
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 13.
  • 41
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14.
  • 42
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14.
  • 43
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14.
  • 44
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14.
  • 45
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14.
  • 46
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14.
  • 47
    The defence of innocent dissemination provides a defence to the publication of defamatory matter. This defence can apply in circumstances where the defendant was a subordinate distributor of the material, and neither knew nor ought reasonably to have known that the matter was defamatory (in circumstances where that lack of knowledge was not due to negligence). Section 235 of the Online Safety Act concerns internet service providers and is not relevant to the subject matter of the bill (Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14).
  • 48
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 19.
  • 49
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 20.
  • 50
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 20.
  • 51
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 16.
  • 52
    Whether as a result of subclauses 15(1) or (2), or otherwise.
  • 53
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 16.
  • 54
    Clause 17 outlines the prescribed requirements for a complaints scheme implemented by the provider of a social media service, so that the provider may receive the benefit of the defence in clause 16.
  • 55
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 16.
  • 56
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 16.
  • 57
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 16.
  • 58
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 16.
  • 59
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 16.
  • 60
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 16.
  • 61
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 22.
  • 62
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 22.
  • 63
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 22.
  • 64
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 23.
  • 65
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 23.
  • 66
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 23.
  • 67
Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, pp. 24-25.
  • 68
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 25.
  • 69
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 25.
  • 70
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 25.
  • 71
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) p. 1.
  • 72
House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) pp. 135-137.
  • 73
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) p. 29.
  • 74
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) p. 169.
  • 75
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) pp. 100-104.
  • 76
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) p. 170.
  • 77
House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) pp. 180 and 160-162.
  • 78
House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) p. 182.
  • 79
House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) p. xxi (see recommendations 16 and 17).
  • 80
House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) (see recommendations 4, 21, 22 and 23).
  • 81
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) (see recommendations 1, 2, and 7).
  • 82
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) (see recommendations 8 and 25).
  • 83
House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) (see recommendations 15, 18, 19 and 20).
  • 84
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) (see recommendations 3 and 5).
  • 85
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) (see recommendation 6).
  • 86
House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) (see recommendations 10 and 13); relatedly, the Committee also called for mandatory reporting obligations in relation to the use of algorithms in recommendation 14.
  • 87
    House Select Committee on Social Media and Online Safety, Inquiry into Social Media and Online Safety, (15 March 2022) p. xvii (see recommendations 11 and 12).