Chapter 2

Key Issues

2.1
Social media platforms are among the most prominent digital actors in the lives of everyday Australians. Platforms can act as a means of connecting people around the world, enabling Australians to communicate, engage and share their experiences with others. However, it is evident that some aspects of this engagement can intensify and amplify social harm, such as through the proliferation of defamatory material.
2.2
While reforms have been introduced to moderate online spaces, and others are currently under consideration, developments in the functionality of social media platforms have outpaced the existing legal and regulatory framework in Australia.
2.3
Accordingly, this chapter examines the Social Media (Anti-Trolling) Bill 2022 (the bill) as a proposed response for Australians subject to defamatory material on social media platforms. In considering the bill, this chapter will also examine Australia's existing regulatory regime and its interaction with the evolving nature of social media platforms. In addition, this chapter provides an overview of the key issues raised in evidence and concludes by providing the committee view and recommendations.

Need for the bill

Overturning the Voller decision1

2.4
As noted in Chapter 1, the primary intent of this bill is to address the issues raised by the High Court's decision in Fairfax Media Publications v Voller [2021] HCA 27 (Voller) which made clear that individuals and organisations with social media pages on which third party material can be posted may be 'publishers' of that material for the purposes of defamation law.
2.5
The bill seeks to overturn the Voller decision and clarify that social media page owners are not 'publishers' of defamatory material posted on their pages by third parties, thereby protecting them from defamation liability.2
2.6
The Hon. Paul Fletcher MP stated in the bill's second reading speech that:
Following Voller, it is clear that Australians who maintain an ordinary social media page could be liable for defamatory material posted by someone else—even if they do not know about the material. Many individuals and organisations simply do not have the resources to continuously monitor and moderate their social media pages, merely to manage their defamation liability risk. This bill urgently responds to the risks created by the Voller decision to protect all Australians who maintain a normal social media page from inappropriate defamation liability.3
2.7
In discussing the implications of the Voller decision, Free TV Australia included in its submission a number of case studies illustrating the impact that the decision is having on the industry (see Box 2.1).
2.8
The committee also heard that page owners may unwittingly take on defamation liability if they permit comments and other material to be posted. This point was echoed by Ms Bridget Fair, Chief Executive Officer of Free TV Australia, who expressed the view that the Voller decision may limit public debate and engagement with news, as commercial broadcasters are often page owners on social media platforms. Ms Fair stated:
the Voller decision has led to a significant increase in risk for defamation liability. This case found that page owners, including media companies, were liable for the material posted by users on their pages. Most often this relates to comments on posts. This resulted in an unmanageable risk for media companies. We're often posting multiple times each day and there's often a high volume of comments on those posts. Making us responsible for the content of those comments was not manageable for us. There are often situations where we've had to turn off commenting features on posts. We have provided examples of that in our submission. That impacts on the reach of a lot of that material. It's not something that we want to do. It limits the engagement of people with news and information and it limits important discussions that happen around the content.4
2.9
The Hon. Bruce Billson, Australian Small Business and Family Enterprise Ombudsman (ASBFEO), also expressed concern at the prospect that businesses that administer social media pages could be liable for third party defamatory material. Mr Billson stated:
There are clearly far more learned people around the defamation law interplay and how that's activated. I hope to be bringing to the committee a very field-evidence perspective of just what small businesses are facing. If they need to defend their economic interests by going to the Federal Court, it is often very expensive, very delayed, very complex, with lots of risks to be evaluated, including cost orders against them.5
2.10
Further, Digital Industry Group Inc. (DIGI) supported the initiative to legislate following the Voller decision, observing that:
…this decision, along with others that have impacted internet intermediaries, illustrates that Australia's defamation laws require updating for the digital age. The implications of the Voller decision are wide-ranging and will most likely result in increased defamation litigation against administrators of social media accounts and those who may not necessarily be responsible for posting defamatory material, including individuals and businesses with social media pages.6

Box 2.1:   Case studies: impact of the Voller decision

In an article published in the New York Times in September 2021, the ABC's then-Director of News, Gaven Morris, said 'We do choose not to publish some stories on Facebook and social platforms because we can't ourselves control the comment streams that follow it…[t]he load on our end in having to be vigilant across all of the comments is increasing all of the time…[w]e've talked long and hard particularly to Facebook about this and say there's got to be a better way… to moderate comments as they're going up.' Dave Earley, audience editor at Guardian Australia said '…we won't post stories about politicians, Indigenous issues, court decisions, anything that we feel could get a problematic reaction from readers.'
Community legal service Justice Connect issued a warning to the non-profit sector that '…essentially, if your not-for-profit organisation operates a social media page, it is responsible under defamation law for everything that everyone posts on that page. It doesn't matter if your not-for-profit organisation disagrees with a defamatory comment or is even unaware that someone has posted a comment. Every not-for-profit organisation that operates a social media page should put measures in place to ensure that posts or comments made by others are not defamatory.'
In December 2019, media industry publication Mumbrella decided to stop sharing news stories on Facebook altogether, telling readers the risk outweighed the reward. Mumbrella founder Tim Burrowes said that for a small publisher, even a single defamation claim can be very time-consuming and expensive to defend, stating 'It can have a chilling effect on whether they share perfectly legitimate news.'7
2.11
The alternative view presented to the committee was that the bill is premised on a misreading of the Voller case. Ms Sue Chrysanthou SC contended:
The issue of publication has been defined by the common law for centuries. Basically, if you participate in publication, you're liable. The High Court's decision in Voller was predictable and predicted by defamation lawyers because all it does is recite and restate hundreds of years of common law. Voller has not changed the law of defamation at all. Anyone who knows anything about defamation will tell you that.8
2.12
Professor Michael Douglas contended that the bill conflates publication of defamation with liability for defamation. Professor Douglas explained that 'publication does not entail liability. A person who publishes defamation may avoid liability via the innocent dissemination defence that has been a part of Australian law for more than 15 years; see, for example, Defamation Act 2005 (NSW) s 32(1)'.9
2.13
The Law Council of Australia (LCA) also expressed the view that, following the Voller decision, 'intervention at the federal level in the law of defamation should not occur until the completion of the Stage 2 Review process and should form part of any package of reforms to the liability of online intermediaries more broadly'.10 The LCA stated:
…it is appropriate to reconsider the circumstances in which internet intermediaries will be liable in defamation for third-party content. In the Law Council's view, the law should place a greater onus on the originator of potentially defamatory online material and not emphasise internet intermediaries as the primary entities to be liable in defamation.11
2.14
However, the Law Council recognised that:
The [Voller] decision has significant implications for those who operate online forums (forum administrators) which allow third parties to make comments, post material and the like, including the media, businesses, community groups and individuals.12

A rapidly changing digital environment

2.15
As outlined in Chapter 1, Australia's current regulatory framework was not designed to respond effectively and efficiently to increasing volumes of defamatory material on social media platforms. The bill's explanatory memorandum observes that:
…the challenges presented by defamation over social media are particularly acute, given the speed at which such material can spread and the limited scope to contain the spread of a defamatory post once it has been published on a social media network. Where such material is posted anonymously, complainants may have limited ability to identify the poster, and the capacity to seek vindication by commencing defamation proceedings may be limited.13
2.16
A number of submitters also expressed concern about the speed and volume at which anonymous, defamatory material is proliferated on social media platforms and the lack of avenues for redress available to those subject to this material.14
2.17
Ms Julie Inman Grant, eSafety Commissioner, told the committee that the nature of social media ecosystems amplifies the velocity and volume of defamatory social media commentary:
There's no question that online defamation remains a serious and intractable challenge that we must tackle. But online defamation is particularly insidious because it can be created with great malice with the touch of a button, and it can be amplified online globally and instantaneously with relative impunity. Defamation laws were developed for the print age, not the digital age, and cannot keep pace with the velocity and volume of social media commentary.15
2.18
Similarly, Ms Nyadol Nyuon explained how the challenges presented by defamation and online abuse on social media are particularly acute, given the speed at which such material can spread once it has been published on a social media network:
I receive a lot of online abuse from a lot of anonymous trolls. I receive racist abuse that refers to me in derogatory terms, that is dismissive, that is purely racist in its delivery, but the sheer volume of it makes it hard to be able to deal with each individual who sends it. If I was to dedicate my time to trying to expose each one, it would be unmanageable.16
2.19
Additionally, Ms Sonya Ryan OAM, Founder and Chief Executive Officer of The Carly Ryan Foundation, commented that the social media industry's business model relies on the amplification of sensationalist content to maximise engagement:
…the way in which social media companies construct their businesses means that someone's story or opinion or vendetta can be amplified across a physical location and into rumour mills of thousands or millions of other people. So too can the good things, but, tragically, the more horrifying, outrageous or unsavoury it is, the more quickly it spreads.17
2.20
Mr Anthony Seibold noted that these issues are compounded by the added element of anonymity. In discussing his experiences on social media platforms, Mr Seibold described an escalation in the use of defamatory language in content from anonymous users. He remarked:
In the anonymous posts, whether on Twitter, Facebook or Instagram, that had fake names, profiles or handles in Twitter's case, I was certainly harassed significantly and in greater detail when the person's identity wasn't attached. To add to that: I found that anonymous profiles can be more defamatory than ones with identities. I did find that people who had their identity there were happy to forward on those messages. Even when I made complaints, or my solicitor or family members made complaints, to the social media platforms around people that had their profile on there, it still wasn't taken down.18

Existing regulatory framework

2.21
Throughout the inquiry, concerns were raised about Australia's existing regulatory framework and its ability to ensure Australians who are subjected to defamatory material on social media platforms have access to effective mechanisms of redress.19 The Attorney-General's Department (the department) told the committee that the bill seeks to address the challenges posed by reform in the digital era through a lens that is complementary to both the Online Safety Act 2021 (Cth) regime and ongoing defamation reform.20 A departmental representative, Mr Michael Johnson, advised:
This bill, whilst it's certainly complementary to the range of government measures for online harm across the spectrum, seeks to attach to the existing defamation law regime, which itself is a private basis of claim and therefore utilises the individual who suffers the harm to initiate steps to rectify that. Because of that, the mechanisms in this bill are all aimed at empowering, enabling and assisting those persons to utilise the mechanisms that already exist but for which there might be barriers to access—and the primary barrier here is anonymity.21
2.22
Ms Erin Molan explained to the committee that she had been subject to a significant amount of online defamatory material and the subsequent difficulties in obtaining an avenue for redress:
Then you move into the other space of things that are incredibly defamatory that you really feel like you have no avenue to address whatsoever. Over so many years, things would appear about me on sites like Facebook, Instagram and Twitter suggesting how I got certain jobs. People were saying things about what I would do with players. They were not written as, 'I've heard this' but written as fact. They were completely untrue, and I had no means to defend myself from them.22
2.23
A number of witnesses also told the committee that the lack of available remedies and the speed at which defamatory material can spread on social media platforms has resulted in long-term reputational damage, exhaustion and emotional trauma.23 Mr Anthony Seibold outlined that defamatory remarks made about him on social media caused a long-term impact on his family. He told the committee:
…over the next 48 to 72 hours I was harassed significantly and defamed significantly on social media, including platforms such as Facebook. The comments, the harassment and the defamatory remarks were hard on not only myself but my daughter, who, as I said, was in year 12 at the time and finishing off her schooling in Sydney. There was a significant impact on her to the point where she had professional help. It had a significant impact on myself and I've had some professional help—which was instigated by my employer at the time, the Brisbane Broncos. My wife continues to have professional help. So it's had a significant impact. The best way to describe it for my family members and myself is that it's almost like PTSD [Post Traumatic Stress Disorder].24
2.24
Nonetheless, some witnesses raised concerns about the bill as an appropriate regulatory response, given the existing legal mechanisms25 that are available.26 Professor David Rolph argued that the bill is unnecessary, as powers already exist 'under the Federal Court Rules 2011 (Cth) as well as under the rules of court in the States and Territories. These allow prospective applicants to obtain orders from the court to compel persons to provide information about prospective respondents'.27 However, Mr Toby Dagg, Executive Manager, Investigations, Office of the eSafety Commissioner, informed the committee that '[t]here may be some matters that reach the threshold of adult cyberabuse that could also be considered potentially defamatory'.28
2.25
Social media provider Meta contended that the Online Safety Act 2021 (Cth) was a more appropriate mechanism for tackling defamation on social media, noting that the Act grants the eSafety Commissioner powers to require the removal of serious and targeted online content 'within 24-hours and to require the disclosure of contact details for a user to support'.29 However, Ms Julie Inman Grant, eSafety Commissioner, noted the threshold that allows her to invoke her powers 'specifically excludes defamatory material'.30
2.26
Some submitters highlighted the ongoing Stage 2 review into Model Defamation Provisions31, with DIGI, for example, raising concerns about the potential risk of inconsistency between the mechanisms proposed by the bill and those under consideration in the Stage 2 Review.32
2.27
Professor David Rolph echoed these concerns as a member of the Model Defamation Provisions Expert Panel, submitting:
…proceeding with this proposed reform at this time may be both premature and liable to detract from the uniformity of Australian defamation law. This is because the proposed reforms pre-empt many of the issues being considered in Stage 2 of the review of the Model Defamation Provisions currently being undertaken by the States and Territories and led by New South Wales.33
2.28
Other witnesses, however, observed that in seeking to address the immediate implications of the Voller decision,34 the bill would assist in 'avoiding the spectre, the threat, of defamation liability impacting page owners utilising their social media accounts in ways that are appropriate and that bring great benefit to society'.35
2.29
The department also noted that while the Stage 2 Review of the Model Defamation Provisions remains a priority for the Meeting of Attorneys-General (MAG) in 2022, 'it is unclear what reform will arise from that review and when it will be implemented', elaborating that:
The Government considers that the issues addressed in the Bill – the consequences of the Voller decision and the harmful effects of defamatory material on social media – require urgent reforms.36

Accountability for hosting defamatory material

2.30
As noted in Chapter 1, clause 15 of the bill provides that, for the purposes of defamation law, a social media provider is a 'publisher' of material posted on a page of the social media service.37
2.31
In discussing the potential impacts of clause 15, submitters and witnesses focussed on the need to have proportionate legislative safeguards in place to effectively mitigate the impacts of defamatory material on social media.38 A common view raised by submitters was that social media platforms should bear liability for content hosted on their platforms as a baseline or first step in protecting users of social media.39
2.32
Witnesses pointed to examples of inadequate industry self-regulation. For example, Reset Australia asserted that:
The social media sector has demonstrated a track record of systemic compliance issues, including multiple breaches of existing legislation and a generally anaemic response to self-regulation. This warrants a pivot towards primary and subordinate legislation and regulation for the sector.40
2.33
Additionally, Ms Sonya Ryan from the Carly Ryan Foundation explained:
…social media companies have a responsibility to look after their users and do what they can to provide relevant information and, therefore, hold a person accountable online for their actions, just as we are held accountable offline for how we choose to treat others.41
2.34
Ms Erin Molan was of the view that social media providers need to be held accountable for publishing harmful and defamatory material and for its impacts on Australians:
look at the two kids who have taken their lives over the past month or so, and look at the adults that have taken their lives. This stuff is damaging. This stuff kills people. There has got to be a responsibility somewhere when it comes to the platforms that publish it and allow it to be written.42
2.35
Mr Anthony Seibold also noted that social media providers should have a duty of care in relation to the content that is published on their platforms, emphasising the need for a balance between hosting anonymous accounts that allow for freedom of expression and accountability to complainants:
I think they [social media providers] have a duty of care. I think that, if you provide platforms where there can be anonymous accounts set up, and you do not act when complaints are made, that is something that is really distressing. If you go to a shopping centre, there is a duty of care in some regard, so I think that's the message to the social media platforms: there is a duty of care.43
2.36
The ASBFEO expressed support for shifting liability to social media providers, reflecting that 'small business[es] might find it difficult to navigate those [social media] platforms, and they may in fact be hosting comments of that kind and not be aware of what their obligations and options are'.44 In reference to overturning the Voller decision, Mr Billson stated:
In our view, perfect shouldn't be the enemy of good. There are clearly far more learned people around the defamation law interplay and how that's activated. I hope to be bringing to the committee a very field-evidence perspective of just what small businesses are facing. If they need to defend their economic interests by going to the Federal Court, it is often very expensive, very delayed, very complex, with lots of risks to be evaluated, including cost orders against them.45
2.37
Conversely, some submitters cautioned against a narrow approach to reform in such a complex policy environment.46 Mr Tass Liveris, President of the Law Council of Australia, stated:
It is absolutely fundamental, we say, that a reset and a rethink of the liabilities of the various players, as my colleague Ms Carnabuci referred to, in social media transactions takes place. We really do need to look at the entire digital world, and this does not do that.47
2.38
Additionally, Twitter submitted that some complainants may choose not to use either of the proposed mechanisms for redress to identify the commenter if liability ultimately lies with a social media provider as a publisher. It remarked:
This would have the opposite effect of the Government's publicly-stated intention, which is to ensure that post-Voller, individual commenters (i.e. trolls) are held accountable for their comments or material posted online, and would result in a bypass mechanism rather than bringing wrongdoers to justice, which is the intended purpose of the legislation.48

Anonymity and pseudonymity

2.39
Anonymity is used to hide the real identity of online users, while pseudonymity refers to a name, term or descriptor that is different to an individual's actual name.49
2.40
The bill seeks to enable victims of defamation to identify anonymous users who post defamatory material, either by engaging with their social media provider's complaints mechanism to obtain the originator's contact details or through an 'end-user information disclosure order' (EIDO) from a court.50
2.41
In discussing the bill's intention to balance issues of anonymity and access to mechanisms for redress, the department emphasised that the bill aims to:
disincentivise or create consequences for making defamatory comments online. The big-picture goal for all of these mechanisms is to reduce the number of defamatory comments being made in the first place.51
2.42
Overall, submitters had contrasting views about the bill's intention to address anonymity/pseudonymity, drawing the committee's attention to a number of considerations.

Anonymity/pseudonymity as a safeguard for online participation

2.43
Some submitters noted that anonymity is critical to an open internet and enables individuals to exercise autonomy over their online identity, especially for those from vulnerable communities.52
2.44
Reset commented that 'anonymity is important for many different communities and user groups, including for [example] women, LGBTIQ+ and communities of colour'.53
2.45
Some submitters and witnesses also emphasised the importance of anonymity in enhancing an open internet, genuine public interest criticism and democratic debate.54 Electronic Frontiers Australia posited that:
Some people, because of their jobs or other circumstances, cannot speak freely under their own name. This applies both to people who offend powerful Australians, and also to people who speak critically, from Australia, about powerful individuals in other countries. The very people anonymous users most fear discovering their identity are exactly the people most likely to try to use the powers under the draft to unmask their anonymous critics. Even if the complaint is subsequently found to be meritless, the consequences of being unmasked could be potentially career-ending or similarly catastrophic.55
2.46
Similarly, social media provider Meta argued that the bill's intention to identify anonymous users could:
stifle freedom of expression as users may no longer feel comfortable expressing themselves if there is a risk that their identity and contact details could be disclosed to another user. This could have a particularly chilling effect on whistle-blowers, survivors of sexual assault, victims of domestic violence and other users who could be put at risk if their identity was disclosed.56

Anonymity/pseudonymity as a barrier for redress

2.47
Conversely, other submitters considered the need to balance the right to engage anonymously on social media against the need to prevent defamatory comments being made with impunity.57 To this end, the committee heard concerns about the lack of regulation, legislative safeguards and mechanisms available to those who are subject to defamatory material posted by anonymous users on social media platforms.58
2.48
Drawing on her lived experience of being subjected to defamatory remarks on social media, Ms Erin Molan observed that anonymity emboldens commenters to post defamatory material:
It's [anonymity] like anything in the world if there are no consequences, or if you're not going to be held to account for something. I don't think most people are this way inclined, but, for those who might not be that pleasant, absolutely that's almost a green light for a lot of them. I'm not of the belief, either, that anyone who goes on social media should have to have their full name in their bio, but what needs to happen is they need to register somewhere so they can be found if they break the law or if they do something that crosses a line, whatever that line might be or however it's defined.59
2.49
Mr Anthony Seibold added that the core provisions in the bill would make a difference for Australians who are subject to defamation on social media by empowering them to respond to anonymous users. Mr Seibold explained:
Going back to my experience, I think the lack of legislation and the lack of power that the police told me they had in regard to identifying individuals through the social media platforms highlighted the fact that there are some things that need to change. I feel as though if people need to provide their identity in some way, shape or form to social media platforms, it becomes easier for legislation to garner the information that's required to make people accountable if they want to defame or harass.60

Compliance and enforcement of the bill's mechanisms

2.50
As noted in the bill's explanatory memorandum, the bill seeks to:
provide new mechanisms for Australians to ascertain whether potentially defamatory material on a page of a social media service was posted within Australia and, if so, to obtain the relevant contact details of the poster.61
2.51
This section considers issues raised by submitters around the bill's proposed mechanisms for redress, which primarily relate to the proposed complaints scheme, the conditional defence for social media providers and EIDOs.
2.52
Under the proposed arrangements in the bill, some submitters outlined the practical difficulties for social media providers in collecting the information necessary to avail themselves of the conditional defence.62 Additionally, submitters raised concerns about the enforceability of EIDOs due to jurisdictional reach.

Complaints scheme

2.53
Clause 16 provides social media providers with access to a 'safe harbour' defence on the condition that they establish and comply with the proposed complaints scheme. This includes providing the complainant with the 'relevant contact details' of the commenter making the defamatory remarks, defined in clause 6 of the bill to mean:
the name of the person or the name by which they are usually known;
an email address and phone number that can be used to contact the person; and
such other information as enabled by the legislative rules.63
2.54
Some submitters welcomed the proposed complaints handling requirements for social media providers.64 For example, Digital Rights Watch observed:
It is critical that social media companies better resource their complaints mechanisms so that the harms generated by the platform are not socialised. None of this is inconsistent or incompatible with allowing users to remain anonymous or pseudonymous. While there may be challenges in navigating the resolution of complaints with the right to freedom of expression, these are not insurmountable. We encourage social media companies to be open and accountable when it comes to reporting on online harms and the functionality of their complaints processes.65
2.55
Similarly, the eSafety Commissioner submitted that:
it is unlikely that a commenter responsible for material that is potentially capable of leading to commencement of defamation proceedings will consent to being exposed as a defendant in litigation. However, we do see value in the proposed complaints scheme. We believe the conditional defence to defamation for services can be linked to this scheme in a way that balances the imperatives of user safety, privacy and security.66
2.56
Medical Group Insurance Australia (MGIA) noted that the conditional defence associated with the complaints scheme will incentivise social media providers to provide meaningful, workable and fair complaints processes:
MGIA considers it imperative to ensure there are clear incentives for social media services and review websites to provide meaningful, workable and fair complaints processes, which include scope for removal of a defamatory review. The best way to do this is to require such a process to exist and be followed appropriately as a condition of such services and websites avoiding liability for a defamation claim.67
2.57
However, the committee heard that there may be limitations on social media providers' ability to verify the identity of their users, given the large volume of social media accounts operating in Australia. Some submitters noted that social media users may also provide false information to obscure or alter their geographical location (such as through the use of a virtual private network to change an Internet Protocol address) to avoid potential liability.68
2.58
Social media providers Twitter, YouTube and Meta noted that the disparate nature of Australia's regulatory framework can create inconsistencies in the implementation of laws, citing the lack of prescribed requirements for social media providers in the complaints scheme provision as an example. For instance, Meta described the 72-hour time frames proposed to collect relevant contact details as 'onerous, particularly where a complaint may be vague, lack critical information (such as a URL for the material) or refer to a large volume of material'.69
2.59
Other submitters observed that the proposed timeframe and difficulty in verifying the contact details of users could mean that social media providers consider alternatives (such as increased censorship) to limit the impact of the proposed reforms on their operations.70
2.60
Professor Michael Douglas discussed the practical difficulties of removing a defamatory comment with the commenter's consent,71 suggesting that the scheme has two key limitations that should be considered by the committee:
First, a commenter can simply withhold consent. It is easy to foresee that many people would do this; they may only be prompted to act, if at all, when the threat of litigation is imminent (e.g., once they receive a concerns notice). If the commenter is still effectively anonymous, there is even less incentive for the commenter to provide consent. Second, the language provides the provider with a discretion to not remove the comment even if consent is provided. US-based entities may defer to be laissez-faire, and keep harmful content online, even where consent is provided by the commenter, out of First Amendment concerns.72
2.61
Ms Kara Hinesley, Director of Public Policy, Australia and New Zealand, Twitter, argued that further regulatory intervention, such as the complaints scheme, should align with the Stage 2 review of the Model Defamation Provisions:
One of the key things with the stage 2 provision reform process is that we've been going through trying to work out what would be a complaints mechanism or a complaints notice process that would be able to work well for end users and potential plaintiffs and also be workable from the platform perspective. We also want to, of course, take into consideration the role of page owners and providers, making sure that they also have defences that are available in certain situations.73

End-user information disclosure orders

2.62
Clause 19 establishes a framework for a prospective applicant to apply to a court for an EIDO.74 An EIDO can require the provider of a social media service to disclose the poster's country location data and (if the material was posted in Australia) the poster's relevant contact details to the prospective applicant. This mechanism gives the complainant the option to apply for a court order where they consider the complaints mechanism is unlikely to be helpful in their circumstances.75 However, the committee received evidence pertaining to potential jurisdictional issues associated with the proposed court order.
2.63
In terms of extraterritorial application76, the department explained the perceived challenges in enforcing the outcome of a defamation proceeding against a foreign defendant:
…strictly speaking the majority of the bill applies either in Australia or to things that have a meaningful connection to Australia. One thing the bill does not do is seek to address broader aspects of defamation proceedings.
There are already measures in place to try to make that easier. Having assets locally is one element, but, unless there is a legal entity which a court order or a remedy can be executed against, there are challenges there. But the primary mechanism is through bilateral treaties with relevant countries to enable the reciprocal recognition of judgements. That's a large issue, and lots of work happens on that in other spaces, and this certainly would latch onto that. But this bill does not seek to fix that larger issue; this bill seeks to address the entry issue, which is to commence the legal proceedings.77
2.64
Professor David Rolph suggested that the committee consider amending the bill to overcome jurisdictional issues and prioritise enforcement. If amendments were not made, Professor Rolph contended that an individual accessing mechanisms for redress under the bill may:
have no claim in defamation against the social media page owner or administrator because they have a complete immunity under clause 14(1)(c);
have a claim in defamation against the provider of a social media service at common law and under clause 14(1)(d) but cannot pursue it because the social media service may not be amenable to jurisdiction in Australia or a defamation judgment of an Australian court may not be enforceable in a jurisdiction where the social media service is located;
find that the nominated entity of the social media service is not liable for the provider of the social media service's liabilities in defamation; and
be unable to identify or locate the commenter.78
2.65
However, Professor Michael Douglas noted that the High Court has already affirmed the power of Australian courts to issue orders with extraterritorial effect in relation to assets based in foreign jurisdictions. He submitted:
As Australians' wellbeing online may depend on people and entities outside of Australia, it is illogical and unnecessary to frame legislation designed to protect Australians online with respect to harmful comments 'made' in Australia.79

Nominated entity requirement

2.66
Part 4 of the bill provides that foreign social media providers must have a nominated entity in Australia.80 The nominated entity is required to have access to country location data and relevant contact details of posters in relation to material posted in Australia and have authority to receive complaints under the complaints scheme.81 Evidence received by the committee in relation to the requirement concerned the potential collection of users' data offshore and the enforcement of court orders.

Balancing privacy and access to redress

2.67
Some submitters expressed concern over data being held offshore by social media providers.82
2.68
The department stated that the bill allows flexibility in the way that providers and their nominated entities meet the requirement to have access to relevant data, clarifying:
It does not require data localisation or transfer of data into Australia. For instance, if relevant contact details and country location data relating to users are held offshore, the nominated entity may establish arrangements that allow it to access that data on a case-by-case basis as required.83
2.69
In discussing the balance between collecting information and providing access to redress, Ms Erin Molan supported the collection of information as a proportionate response to unlawful activity on social media platforms. Ms Molan told the committee:
I'm not of the belief, either, that anyone who goes on social media should have to have their full name in their bio, but what needs to happen is they need to register somewhere so they can be found if they break the law or if they do something that crosses a line, whatever that line might be or however it's defined. I think people can go on anonymously and make jokes or have an account where they have a bit of fun with something or whatever; that's fine. But it's when they break the law that they lose their right to be anonymous. That's how I feel about it. It empowers and it emboldens them.84
2.70
However, Twitter highlighted that the collection of even greater volumes of Australians' personal information may contravene 'other government initiatives and general best practice privacy principles, such as data minimisation.'85
2.71
Additionally, Professor Michael Douglas explained that obtaining location data may not be enough to identify anonymous users, limiting the impact of the complaints scheme in facilitating justice:
…country location data can be easily fabricated with little technological savvy. Many Australians are familiar with the use of proxies. If a person wants to cause harm online anonymously, they can do so without much difficulty.86
2.72
The eSafety Commissioner commented that inconsistency between social media providers in how they obtain, validate, store and log location data provides varying investigative value for uncovering anonymous identities:
In our experience, obtaining BSI [basic subscriber information] is only a first step in opening further lines of inquiry to try to identify an end-user. It is not always the case that a person's identity or even contact details can be adequately established through information held by a digital service. For example, while an Australian mobile phone number may be useful information, the identity of the subscriber can generally only be ascertained by querying restricted databases or seeking further end-user information from the service provider.87
2.73
Some submitters recommended that the committee consider amending the bill to include further safeguards, which are discussed below.88
2.74
Dolly's Dream and the Alannah & Madeline Foundation advised that the committee should consider the bill in the context of the provisions of the Online Privacy Bill 2021 to protect vulnerable users:
…the exposure draft of the Online Privacy Bill 2021 proposed to enable the creation of an Online Privacy (OP) code for industry. This code would require social media services to obtain the consent of a parent or guardian before collecting, using or disclosing the personal information of a child under the age of 16. Social media services would be expected to take 'all reasonable steps' (as yet undefined) to verify that this parental / guardian consent is genuine. Thus, changes may be afoot to how social media services handle children's personal data, both in response to a future OP code under an Online Privacy Act 2022 and in response to the Social Media (Anti-Trolling) Bill 2022. Aligning these changes may present a challenge.89
2.75
The Office of the Australian Information Commissioner (OAIC) encouraged the committee to consider clarifying whether the 'name of the person' in the definition of 'relevant contact details' means a person's actual or legal name. It submitted that the use of a person's legal name to satisfy the compliance requirements may result in:
…social media services seeking to collect identity information, such as government issued credentials like a driver's licence or passport, in order to verify the authenticity of an individual's name. This is problematic given the privacy and security risks associated with the mishandling of this information. For instance, government issued credentials contain significantly more personal information than may already be collected and held by social media services (such as address, date of birth and other identifiers like Medicare number). Further, compromise of identity credentials and information can lead to identity theft, which has significant consequences for individuals.90
2.76
In response to these issues, the department clarified that the bill 'includes a number of limitations and safeguards to minimise its impact on the right to privacy and on legitimate anonymous use of social media'. The department noted that any impacts on privacy are reasonable, necessary and proportionate to achieve the bill's policy objectives:
Facilitating the disclosure of the poster's contact details in relevant circumstances will impact on the privacy of individuals. However, this legitimate policy end requires the disclosure of personal information to be effective.91

Jurisdictional reach

2.77
The committee also heard that the nominated entity requirement poses potential legal challenges regarding the enforcement of EIDOs. Mr Michael Johnson, Assistant Secretary of the Defamation Taskforce at the Attorney-General's Department, explained:
Whilst the companies have asserted in their submissions that, by all means, they'll comply with an order of the court—because, of course, they will—if push were to come to shove, there would be no means of enforcing those orders against a company that's situated in a foreign jurisdiction. So, to deal with that challenge, the bill has a requirement on foreign social media companies to have a local nominated entity that is able to do a number of things prescribed in the bill. As a result of that, among other things, the information disclosure order that the bill provides for can be served against the nominated entity, and the nominated entity is required to fulfil that order, essentially on behalf of the foreign social media company.92
2.78
In response to discussion around potential remedies to strengthen the enforceability of EIDOs, Mr Johnson continued:
it [enforceability for foreign entities] is an issue that rises well above defamation. There are already measures in place to try to make that easier. Having assets locally is one element, but, unless there is a legal entity which a court order or a remedy can be executed against, there are challenges there. But the primary mechanism is through bilateral treaties with relevant countries to enable the reciprocal recognition of judgements. That's a large issue, and lots of work happens on that in other spaces, and this certainly would latch onto that. But this bill does not seek to fix that larger issue; this bill seeks to address the entry issue, which is to commence the legal proceedings.93
2.79
Meta told the committee that 'when Australian law enforcement regulators or courts seek to [inaudible], we work to comply with the relevant directions that are made. Certainly, we've set up systems particularly for law enforcement to make it easy for them to request that data regardless of jurisdiction or entity'.94
2.80
Ms Angelene Falk, Australian Information Commissioner and Privacy Commissioner, supported the intent of the bill, but suggested that the Government consider its interaction with the extraterritorial operation of the Privacy Act 1988 and the Online Privacy Bill once finalised.95
2.81
Additionally, the Business Council of Australia and YouTube submitted that the nominated entity requirement may be in tension with Article 10.5 of the Australia-United States Free Trade Agreement [2005] ATS 1 (AUSFTA), which reads:
Neither Party may require a service supplier of the other Party to establish or maintain a representative office or any form of enterprise, or to be resident, in its territory as a condition for the cross-border supply of a service.96
2.82
In response to this concern, the department informed the committee of an applicable exception that exists:
…in the General Agreement on Trade in Services, which is incorporated into the US free trade agreement by express reference, which allows measures that might otherwise be inconsistent with a primary obligation in the free trade agreement which are necessary to ensure compliance with other regulations or laws that are otherwise consistent with the free trade agreement.97

Other issues raised in evidence

2.83
A range of other issues were raised over the course of the committee's inquiry in relation to the bill's provisions, with some of these issues outlined below.

Title of the bill

2.84
The majority of witnesses and submitters expressed concern with the title of the bill, specifically the use of the term 'trolling'.98
2.85
In reference to the use of the term trolling, the department explained:
…'trolling' is not a legal term, and can refer to a range of inappropriate or abusive behaviours online, up to and including serious criminal offences. The Bill addresses one form of online abuse, namely posting defamatory material on social media.99
2.86
When providing evidence to the committee, the department clarified that the bill:
…is about trolling to the extent that making defamatory comments on social media is one form of trolling. But you are right: it is about defamatory comments made on social media, which is one form of trolling.100
2.87
The Minister for Communications, Urban Infrastructure, Cities and the Arts, the Hon. Paul Fletcher MP, likewise stated:
Social media services have provided Australians significant social, educational and economic benefits for many years. However, these benefits have also come with greater exposure to online harms—including exposure to emotional, mental health and financial harms that can result where malicious online trolls post defamatory material to attack a person's reputation.101
2.88
Many submitters expressed concern about the framing of the bill as a legal remedy to trolling, suggesting that the scope of the title is too narrow and does not contain clear references to the behaviour it is intended to address.102
2.89
For example, the eSafety Commissioner submitted that the bill poses a 'risk of public confusion over what the bill seeks to achieve if its present name is preserved', further asserting that:
Trolling has become a commonplace term to describe a wide array of abuse. Over time this could serve to normalise or trivialise the very real suffering that those seeking to deliberately menace or abuse others inflict on their targets.103
2.90
The Centre for Digital Wellbeing explained that while defamation is a type of online abuse that people experience, it is only a very small subset of the types of abuse committed online:
…the Bill focuses only on material posted online anonymously, which is an even smaller subset of online abuse. The material must also have been posted in Australia, further limiting the scope of the Bill. With its narrow focus on defamation, the Bill does not address the issues that the majority of victims of online abuse experience.104
2.91
Additionally, Twitter suggested that while trolling can be harmful to its intended target in certain contexts, trolling by itself may not rise to a level sufficient to give its victim a cause of action in defamation:
Australian courts have previously stated that 'mere vulgar abuse' is not defamatory if it fails to reach the standard of seriously harming a person's reputation or diminishing their standing in the community. With regards to "trolling," it is generally defined as inflammatory, insincere, digressive, extraneous, or off-topic messages posted in an online community.105
2.92
In response, the department submitted that there is overlap between defamation and other forms of online abuse, referencing findings released by the eSafety Commissioner:
…around 40 per cent of adult cyber abuse complaints received in the first four weeks of operation of the Online Safety Act 2021 appear to be defamatory in nature or concern purely reputational damage. Between 1 July 2017 and 13 December 2021 (prior to the commencement of the Online Safety Act 2021), approximately 25.4% of reports to the eSafety Commissioner about adult cyber abuse related to defamatory material.106
2.93
Ms Julie Inman Grant, eSafety Commissioner, recommended that the committee consider changing the name of the bill:
We have reached a level of sophistication in this space, where there are gradations of harm that we are dealing with. Parliament has set a threshold in the set of definitions about what the serious cyberabuse is, what youth based cyberbullying is, so to lump that all together under the guise of trolling, I don't think is constructive. You could change the name of the bill and it would get rid of a lot of confusion and conflation.107

Cost and access to defamation proceedings

2.94
The committee heard that some Australians are unlikely to have the time or resources to pursue defamation proceedings, possibly limiting access to the mechanisms offered by the bill.
2.95
Victim-survivors of domestic and family violence are among a cohort of Australians least likely to be able to seek redress through a court process, due to the high cost of defamation proceedings. Women's Services Network (Wesnet) explained that the bill:
…may in fact impact adversely on women and children. It is possible that a victim-survivor's anonymous comments on social media - which are often surveilled and monitored by abusers - could become the subject of defamation proceedings. These proceedings would not necessarily take into account a history of family violence perpetrated by the plaintiff.108
2.96
Similarly, Ms Nyadol Nyuon submitted that research from the Harmony Alliance found that:
migrant and refugee people face additional [inaudible] to accessing courts, including limited knowledge of their legal rights, limited understanding of court processes, and communication and cultural barriers.109
2.97
Professor David Rolph also observed that while one vulnerable group may be protected by the bill, another may be further disadvantaged if they are the alleged poster of the defamatory material. Professor Rolph explained:
A commenter may have a fear of a threat posed by the prospective applicant to a person other than the commenter themselves. For instance, in a situation of family or domestic violence, where a prospective applicant seeks an end-user information disclosure order, the commenter may want to oppose the making of such an order not because the commenter fears a risk to their own safety but rather a risk to the safety of their children.110
2.98
The bill does provide the Court with the discretion not to make an EIDO where an individual may fear for his or her safety. The explanatory memorandum makes this explicit, stating that 'in addition to its ordinary discretion to refuse to grant an order, the court may refuse to make an EIDO where doing so may present a risk to the poster's safety'.111
2.99
In response to the cost barriers outlined by submitters and witnesses, the department submitted that:
…in recognition that ministerial intervention can lead to an increase in legal costs, the Attorney-General may authorise payment to the applicant by the Commonwealth for costs reasonably incurred in relation to the proceeding. A request for legal financial assistance under this provision will be considered in accordance with guidelines made by the Attorney-General.112

Right to obtain relief

2.100
In relation to how the public would engage with the complaints scheme,113 submitters discussed how the proposed mechanism is predicated on a complainant having a right to obtain relief.
2.101
The committee heard that the complex nature of defamation law may limit the accessibility of the bill. DIGI explained:
Where a person has been the subject of trolling, it may be particularly unclear whether there is a right to relief in defamation. The Bill aims to provide fast and efficient access to recourse for harmful online content; however, it seems victims of trolling may either be unable to rely on the mechanism due to having no right to relief in defamation, or be driven to seek legal advice in order to determine this matter, and incur expenses of doing so.114
2.102
The eSafety Commissioner echoed these statements, noting that the bill may require the complainant to have at least a working knowledge of defamation law, and be able to distinguish between offensive, hurtful and insulting comments and material harmful to a person's reputation:
In our experience, the general degree of awareness of what constitutes potentially defamatory material posted online is likely low among most Australian end-users of social media services. The need to self-assess whether there is a right to obtain relief under the law of defamation is likely to lead to many complaints under the scheme falling short of what might be considered actionable under the law of defamation.115
2.103
The department advised, however, that the bill does not seek to address all forms of online harm, or address all avenues of response to reputational harm on social media:
The points the deputy chair mentioned are quite accurate, which is not everybody can access defamation proceedings. Not all objectionable material on social media meets the criteria of defamation let alone is even roughly about reputational harm. So this bill seeks to do a very specific and narrow thing, which is to assist people who are faced with reputational harm on social media, who would, in light of their circumstances and the nature of the reputational harm that is being effected, be minded to commence defamation proceedings to rectify that situation. But the thing that stands in their way from doing so is the anonymity of the originator. This bill seeks to open up that door and help those people commence defamation proceedings in those circumstances.116

Committee view

2.104
In 2004, the Australian Government engaged in a process to pass uniform defamation laws. In the face of proposed Commonwealth legislation being considered by the Parliament, Model Defamation Provisions (MDPs) were enacted at the state and territory level.117 In 2018, the Meeting of Attorneys-General (MAG) reconvened a working party to review the MDPs. Whilst the Stage 1 review is complete, the Stage 2 review, which includes the laws relating to online defamation, is still ongoing.118
2.105
Given the considerable uncertainty surrounding this Stage 2 review and the length of time it is taking, the committee strongly endorses the Australian Government's decision to promptly clarify the operation of the law with respect to the responsibilities and liability of digital platforms for defamatory material published online.
2.106
The committee also strongly supports efforts to improve the ability of Australians to pursue civil proceedings in relation to defamatory material posted on social media platforms, and commends the Australian Government for seeking to implement a framework to provide a safer online environment for Australians.
2.107
One of the key challenges raised by witnesses and submitters to this inquiry is the need to balance privacy, anonymity and freedom of expression against the need to protect Australians who are subject to defamatory material on social media platforms.
2.108
In considering these issues, the committee remains highly concerned about the apparent lack of accountability demonstrated by social media providers hosting defamatory material on their platforms. The committee notes the evidence received by the Social Media Committee about social media providers' use of algorithms and design features, such as filters, shares, likes and infinite scrolling, to drive engagement, a core element of each social media provider's business model. The committee is also highly concerned about the profound failure of social media providers to take responsibility for, and adequately protect Australians from, harmful defamatory material posted on their platforms, including by anonymous trolls.
2.109
To this end, the committee believes that both social media providers and the current regulatory framework have failed to respond effectively to increasing volumes of defamatory material on social media platforms.
2.110
More broadly, as proposed by some submitters, given the complexity of enacting state and territory uniform defamation laws which are fit for purpose in a rapidly evolving landscape of digital communications, the committee considers that, in the longer term, a federal defamation act may be preferable.
2.111
The committee is also concerned that an increasing number of online defamation disputes will give rise to major access to justice issues, given the cost and complexity of these types of proceedings. Whilst beyond the scope of this bill, the committee considers that alternative, low-cost forms of online defamation dispute resolution may be necessary, such as giving applicants the opportunity to obtain 'take down' orders only, at minimal cost.
2.112
That said, the committee believes the bill can be strengthened by making several amendments as explained below.

Access to the conditional defence

2.113
The committee refers to concerns raised by some submitters that the conditional defence in section 16(2) gives social media service providers a defence even if they (a) do not pass on the poster’s contact details to the complainant and (b) keep the material online.
2.114
The committee considers this is not a fair characterisation of the way the bill operates. The bill is focused on social media service providers accessing the conditional defence when a poster's relevant contact details have been handed over. For example, if requested under the complaints scheme, a social media service provider can access the conditional defence if it has disclosed the relevant contact details of the poster to the applicant: section 16(2)(d)(i). Similarly, if a court grants an EIDO directing the social media service provider to disclose the poster's relevant contact details to the applicant, the provider must comply with the order to access the conditional defence: section 16(2)(d)(ii).
2.115
The concern outlined above holds only where the applicant (or a person acting on behalf of the applicant) has not applied to a court for an EIDO in relation to the (potentially defamatory) material: section 16(2)(d)(iii). The committee recognises that subsection (iii) strikes a fair balance in circumstances where an applicant has decided not to pursue the matter further after completing a complaints process. If the applicant later seeks an EIDO from a court and is successful, the social media service provider will lose the conditional defence.
2.116
As for the suggestion that social media service providers should be able to access the conditional defence merely by removing the defamatory material, the committee's position is that this would incentivise social media service providers to remove material at the slightest suggestion of it being defamatory, thereby encouraging inappropriate censorship on social media in Australia.

Liability of social media page owners

2.117
The bill currently ensures that social media page owners may still be sued for defamation for endorsing third party defamatory material, such as by re-posting or sharing it. This reflects the existing law.
2.118
However, some companies and organisations benefit from a business model which seeks to encourage the publication of defamatory and abusive material. In these circumstances, the committee is concerned that a social media page owner's failure to remove a poster's defamatory material promptly, upon request, means that the page owner has contributed to the publication of the defamatory material, not merely facilitated it.
2.119
Accordingly, there is little reason why social media page owners should not also be liable for defamation as publishers where they:
(a)
knowingly encourage the publication of a poster's defamatory material; and
(b)
have been notified of the poster's defamatory material and have failed to remove it promptly.

Complaints scheme and End-user Information Disclosure Orders

2.120
The committee notes that the proposed complaints scheme and end-user information disclosure orders would also incentivise digital platforms to have in place complaints systems which are fit for purpose. By introducing penalties for non-compliance, the bill would also ensure that social media providers are accountable to the Australian people.
2.121
While the committee appreciates that the Australian Government has been very reluctant to diminish free speech outside a judicial process, many submitters support a quick and easy 'take down' remedy.
2.122
Accordingly, the committee is of the view that the complaints scheme should be amended so that social media service providers can 'take down' a poster's allegedly defamatory material, within 72 hours, without the poster's consent, if the poster has been requested to remove the material but has not responded.
2.123
This will address concerns about the consequences of a poster failing to comply with the bill's complaints process. It is important to reiterate, however, that the committee remains concerned that a mandatory 'take down' scheme may enable or encourage social media service providers to adopt a 'trigger happy' approach, resulting in material being removed in a manner which constitutes unjustifiable censorship of free speech.
2.124
In addition, while acknowledging their importance, the committee believes that an EIDO should not be granted if the Court is satisfied that the disclosure of relevant contact details or country location data is likely to present a risk to the safety of the poster or any other person, such as in domestic violence situations.

Recommendation 1

2.125
The committee recommends that the bill be amended so that a social media page owner may be liable for a poster's defamatory material where the social media page owner:
knowingly encourages the publication of a poster's defamatory material; and
has been notified or is aware of the poster's defamatory material and has failed to remove it promptly.

Recommendation 2

2.126
The committee recommends that the bill be amended so that the complaints scheme provides that a social media service provider can 'take down' the poster's allegedly defamatory material, within 72 hours, without the poster's consent, if the poster has been requested to remove the material but has not responded.

Recommendation 3

2.127
The committee recommends that the bill be amended so that an end-user information disclosure order (EIDO) must not be granted if the Court is satisfied that the disclosure of relevant contact details or country location data is likely to present a risk to the safety of the poster or any other person, such as in domestic violence situations.

Recommendation 4

2.128
Following the implementation of recommendations 1, 2 and 3, the committee recommends that the Senate pass the bill.
Senator the Hon Sarah Henderson
Chair

  • 1
    Fairfax Media Publications Pty Ltd v Voller [2021] HCA 27.
  • 2
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 2.
  • 3
    The Hon Paul Fletcher MP, Minister for Communications, Urban Infrastructure, Cities and the Arts, House of Representatives Hansard, 10 February 2022, p. 10.
  • 4
    Ms Bridget Fair, Chief Executive Officer, Free TV Australia, Committee Hansard, 15 March 2022, p. 7.
  • 5
    The Hon. Bruce Billson, Australian Small Business and Family Enterprise Ombudsman, Committee Hansard, 10 March 2022, p. 29.
  • 6
Digital Industry Group Inc. (DIGI), Submission 26, p. 4.
  • 7
    Free TV Australia, Submission 18, pp. 8-9.
  • 8
    Ms Sue Chrysanthou SC, Committee Hansard, 15 March 2022, p. 21.
  • 9
    Professor Michael Douglas, Submission 11, p. 1.
  • 10
    Law Council of Australia, Submission 1, p. 5.
  • 11
    Law Council of Australia, Submission 1, p. 5.
  • 12
    Law Council of Australia, Submission 1, p. 11.
  • 13
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 2.
  • 14
Ms Nyadol Nyuon, Committee Hansard, 10 March 2022, p. 9; Ms Erin Molan, Committee Hansard, 10 March 2022, pp. 2-3; Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 7; Reset, Submission 27, p. 3; Digital Rights Watch, Submission 17, p. 6; Dolly's Dream and the Alannah and Madeline Foundation, Submission 3, p. 7.
  • 15
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 10 March 2022, p. 12.
  • 16
    Ms Nyadol Nyuon, Committee Hansard, 10 March 2022, p. 9.
  • 17
    Ms Sonya Ryan, OAM, Founder and Chief Executive Officer of The Carly Ryan Foundation, Committee Hansard, 15 March 2022, p. 2.
  • 18
    Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 7.
  • 19
    See Digital Rights Watch, Submission 17, p. 1; Dolly's Dream and the Alannah and Madeline Foundation, Submission 3, pp. 4 and 6; Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 7; Ms Erin Molan, Committee Hansard, 10 March 2022, p. 3; Reset Australia, Submission 27, p. 4; and Technology Council of Australia, Submission 36, pp. 1-2.
  • 20
    See NSW Department of Communities and Justice, Review of Model Defamation Provisions, available at: https://www.justice.nsw.gov.au/defamationreview (accessed 16 March 2022).
  • 21
    Mr Michael Johnson, Assistant Secretary, Defamation Taskforce, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 42.
  • 22
    Ms Erin Molan, Committee Hansard, 10 March 2022, pp. 2-3.
  • 23
    See Mr Anthony Seibold, Committee Hansard, 10 March 2022, pp. 1-2; Ms Erin Molan, Committee Hansard, 10 March 2022, pp. 1-2; and Ms Nyadol Nyuon, Committee Hansard, 10 March 2022, p. 8.
  • 24
    Mr Anthony Seibold, Committee Hansard, 10 March 2022, pp. 1-3.
  • 25
See Communications Legislation Amendment (Online Content Services and Other Measures) Act 2018, available at: https://www.legislation.gov.au/Details/C2018A00028 (accessed 16 March 2022); Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019, available at: https://www.legislation.gov.au/Details/C2019A00038 (accessed 16 March 2022); Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018, available at: https://www.legislation.gov.au/Details/C2018A00148 (accessed 16 March 2022); and Telecommunications Legislation Amendment (International Production Orders) Act 2021, available at: https://www.legislation.gov.au/Details/C2021A00078 (accessed 16 March 2022).
  • 26
    See Professor David Rolph, Submission 14, p. 11; See also Meta, Submission 7, p. 10; Digital Industry Group Inc. (DIGI), Submission 26, p. 6; Professor Michael Douglas, Submission 11, p. 8.
  • 27
    Professor David Rolph, Submission 14, p. 11.
  • 28
    Mr Toby Dagg, Executive Manager, Investigations, Office of the eSafety Commissioner, Committee Hansard, 10 March 2022, p. 13.
  • 29
    Meta, Submission 7, p. 10. See also the eSafety Commissioner's reporting schemes in relation to cyberbullying, adult cyber abuse, image-based abuse and illegal and restricted content at https://www.esafety.gov.au/report/forms.
  • 30
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 10 March 2022, p. 13.
  • 31
    See Law Council of Australia, Submission 1, p. 13; Meta, Submission 7, p. 28; DIGI, Submission 26, pp. 1 and 6; Sue Chrysanthou SC, Patrick George, Rebekah Giles, Nicholas Olson, Richard Potter SC and Kieran Smark SC, Submission 35, p. 5.
  • 32
    Digital Industry Group Inc. (DIGI), Submission 26, p. 6.
  • 33
    Professor David Rolph, Submission 14, p. 5.
  • 34
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 47; Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 3; Ms Erin Molan, Committee Hansard, 10 March 2022, pp. 2-4.
  • 35
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 47.
  • 36
    Attorney-General's Department, Submission 10, pp. 2-3.
  • 37
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14.
  • 38
    Twitter, Submission 8, p. 9; eSafety Commissioner, Submission 5, p. 13; OAIC, Submission 9, p. 3; Dr Evita March, Submission 13; Human Rights Law Alliance, Submission 16, pp. 4 and 5.
  • 39
Attorney-General's Department, Submission 10, pp. 1-2; Ms Erin Molan, Committee Hansard, 10 March 2022, p. 3; The Hon. Bruce Billson, Australian Small Business and Family Enterprise Ombudsman, Committee Hansard, 10 March 2022, p. 29; Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 7; Reset Australia, Submission 28, p. 10; Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 47.
  • 40
    Reset Australia, Submission 28, p. 10.
  • 41
    Ms Sonya Ryan, OAM, Founder and Chief Executive Officer of The Carly Ryan Foundation, Committee Hansard, 15 March 2022, p. 3.
  • 42
    Ms Erin Molan, Committee Hansard, 10 March 2022, p. 3.
  • 43
    Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 7.
  • 44
The Hon. Bruce Billson, Australian Small Business and Family Enterprise Ombudsman, Committee Hansard, 10 March 2022, p. 28.
  • 45
The Hon. Bruce Billson, Australian Small Business and Family Enterprise Ombudsman, Committee Hansard, 10 March 2022, p. 29.
  • 46
    Sue Chrysanthou SC, Patrick George, Rebekah Giles, Nicholas Olson, Richard Potter SC and Kieran Smark SC, Submission 35, p. 3; Law Council of Australia, Submission 1, p. 7; Professor Michael Douglas, Submission 11, p. 3; YouTube, Submission 6, p. 2.
  • 47
    Mr Tass Liveris, Law Council of Australia, Committee Hansard, 10 March 2022, p. 23.
  • 48
    Twitter, Submission 8, p. 6.
  • 49
    Office of the Australian Information Commissioner, Chapter 2: APP 2- Anonymity and pseudonymity, 22 July 2019, https://www.oaic.gov.au/privacy/australian-privacy-principles-guidelines/chapter-2-app-2-anonymity-and-pseudonymity (accessed 16 March 2022).
  • 50
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 3.
  • 51
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 47.
  • 52
DIGI, Submission 26, p. 11; Reset, Submission 27, p. 3; Electronic Frontiers Australia, Submission 29, p. 11; Meta, Submission 7, p. 32; Ms Erin Molan, Committee Hansard, 10 March 2022, p. 5; Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 7; eSafety Commissioner, Submission 5, p. 17; Dolly's Dream and the Alannah and Madeline Foundation, Submission 3, p. 7.
  • 53
    Reset, Submission 27, p. 3.
  • 54
Twitter, Submission 8, p. 12; Meta, Submission 7, p. 4; Electronic Frontiers Australia, Submission 29, p. 4; Law Council of Australia, Submission 1, p. 10; Reset Australia, Submission 27, p. 2.
  • 55
    Electronic Frontiers Australia, Submission 29, p. 11.
  • 56
    Meta, Submission 7, p. 32.
  • 57
Meta, Submission 7, p. 32; Ms Erin Molan, Committee Hansard, 10 March 2022, p. 5; Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 7; OAIC, Submission 9, p. 4; Medical Insurance Group Australia (MIGA), Submission 20, p. 2; Business Council of Australia, Submission 4, p. 1; yourtown, Submission 30, p. 5.
  • 58
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 45; Ms Sonya Ryan, OAM, Founder and Chief Executive Officer of The Carly Ryan Foundation, Committee Hansard, 15 March 2022, p. 2; Ms Erin Molan, Committee Hansard, 10 March 2022, p. 5; Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 7.
  • 59
    Ms Erin Molan, Committee Hansard, 10 March 2022, p. 5.
  • 60
    Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 7.
  • 61
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 2.
  • 62
    Free TV Australia, Submission 18, p. 6; MIGA, Submission 20, p. 3; OAIC, Submission 9, p. 4; Professor David Rolph, Submission 14, p. 4; Australian Small Business and Family Enterprise Ombudsman, Submission 2, p. 1.
  • 63
    Social Media (Anti-Trolling) Bill 2022, clause 6.
  • 64
    Digital Rights Watch, Submission 17, p. 6; Mr Anthony Seibold, Committee Hansard, 10 March 2022, p. 2; Ms Erin Molan, Committee Hansard, 10 March 2022, p. 2; Attorney-General's Department, Submission 10, p. 5.
  • 65
    Digital Rights Watch, Submission 17, p. 6.
  • 66
    eSafety Commissioner, Submission 5, p. 15.
  • 67
Medical Insurance Group Australia (MIGA), Submission 20, p. 4.
  • 68
Law Council of Australia, Submission 1, p. 16; Electronic Frontiers Australia, Submission 29, p. 4; Meta, Submission 7, p. 32; Western Australian Government, Submission 28, p. 1; DIGI, Submission 26, p. 12; eSafety Commissioner, Submission 5, p. 10.
  • 69
Meta, Submission 7, p. 35.
  • 70
Law Council of Australia, Submission 1, p. 7.
  • 71
    Social Media (Anti-Trolling) Bill 2022, s16(1)(e).
  • 72
    Professor Michael Douglas, Submission 11, p. 8.
  • 73
Ms Kara Hinesley, Director of Public Policy, Australia and New Zealand, Twitter, Committee Hansard, 15 March 2022, p. 17.
  • 74
    Social Media (Anti-Trolling) Bill 2022, clause 19, p. 16.
  • 75
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 20.
  • 76
Defined as existing or taking place outside the territorial limits of a jurisdiction. Judicial College of Victoria, 1.3 - Territorial Limits, judicialcollege.vic.edu.au (accessed 15 March 2022).
  • 77
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 48.
  • 78
Professor David Rolph, Submission 14, p. 11.
  • 79
Professor Michael Douglas, Submission 11, p. 13.
  • 80
    Attorney-General's Department, Submission 10, p. 8.
  • 81
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 16.
  • 82
    eSafety Commissioner, Submission 5; OAIC, Submission 9; Law Council of Australia, Submission 1; Centre for Digital Wellbeing, Submission 23; Digital Rights Watch, Submission 17.
  • 83
    Attorney-General's Department, Submission 10, p. 7.
  • 84
    Ms Erin Molan, Committee Hansard, 10 March 2022, p. 5.
  • 85
    Twitter, Submission 8, p. 4.
  • 86
    Professor Michael Douglas, Submission 11, p. 8.
  • 87
    eSafety Commissioner, Submission 5, p. 7.
  • 88
Dolly's Dream and the Alannah and Madeline Foundation, Submission 3, p. 9; OAIC, Submission 9, p. 6; Attorney-General's Department, Submission 10, p. 3.
  • 89
Dolly's Dream and the Alannah and Madeline Foundation, Submission 3, p. 9.
  • 90
    OAIC, Submission 9, p. 6.
  • 91
Attorney-General's Department, Submission 10, p. 3.
  • 92
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 49.
  • 93
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 48.
  • 94
Ms Mia Garlick, Director of Policy, Meta, Committee Hansard, 10 March 2022, p. 61.
  • 95
    Ms Angelene Falk, Australian Information Commissioner and Privacy Commissioner, Committee Hansard, 10 March 2022, p. 38.
  • 96
    Article 10.5, Australia-United States Free Trade Agreement [2005] ATS 1 (AUSFTA).
  • 97
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 49. See Article XIV(c) of the World Trade Organization's General Agreement on Trade in Services [1995] ATS 8 (GATS). Please note Article XIV of the GATS is expressly incorporated into the AUSFTA by virtue of Article 22.1(2) of the AUSFTA.
  • 98
Centre for Digital Wellbeing, Submission 23; eSafety Commissioner, Submission 5; OAIC, Submission 9; Law Council of Australia, Submission 1; Digital Rights Watch, Submission 17; Twitter, Submission 8; Professor David Rolph, Submission 14; Meta, Submission 7; Women's Services Network (Wesnet), Submission 15; Sue Chrysanthou SC, Patrick George, Rebekah Giles, Nicholas Olson, Richard Potter SC and Kieran Smark SC, Submission 35.
  • 99
    Attorney-General's Department, Submission 10, p. 1.
  • 100
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 43.
  • 101
The Hon Paul Fletcher MP, Minister for Communications, Urban Infrastructure, Cities and the Arts, House of Representatives Hansard, 10 February 2022, p. 10.
  • 102
See DIGI, Submission 26; eSafety Commissioner, Submission 5; OAIC, Submission 9; Law Council of Australia, Submission 1; Centre for Digital Wellbeing, Submission 23; Digital Rights Watch, Submission 17; Twitter, Submission 8; Professor David Rolph, Submission 14; Reset Australia, Submission 27; Meta, Submission 7; Professor Michael Douglas, Submission 11; Women's Services Network (Wesnet), Submission 15; Dr Jennifer Beckett, Submission 31; Sue Chrysanthou SC, Patrick George, Rebekah Giles, Nicholas Olson, Richard Potter SC and Kieran Smark SC, Submission 35; Bennett + Co, Submission 22; Dr Evita March, Submission 13; Northern Territory Government, Submission 25; Harmony Alliance, Submission 24.
  • 103
    eSafety Commissioner, Submission 5, pp. 11-12.
  • 104
    Centre for Digital Wellbeing, Submission 23, p. 3.
  • 105
    Twitter, Submission 8, p. 4.
  • 106
    Attorney-General's Department Submission 10, p. 1.
  • 107
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 10 March 2022, p. 19.
  • 108
Women's Services Network (Wesnet), Submission 15, p. 2.
  • 109
Ms Nyadol Nyuon, Committee Hansard, 10 March 2022, p. 8.
  • 110
    Professor David Rolph, Submission 14, p. 12.
  • 111
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 9.
  • 112
    Attorney-General's Department, Submission 10, p. 1.
  • 113
    Social Media (Anti-Trolling) Bill 2022, explanatory memorandum, p. 14.
  • 114
    DIGI, Submission 26, p. 11.
  • 115
eSafety Commissioner, Submission 5, p. 14.
  • 116
    Mr Michael Johnson, Attorney-General's Department, Committee Hansard, 10 March 2022, p. 46.
  • 117
    Sue Chrysanthou SC, Patrick George, Rebekah Giles, Nicholas Olson, Richard Potter SC and Kieran Smark SC, Submission 35, p. 1.
  • 118
    Attorney-General's Department, Submission 10, pp. 2-3.
