Australian Greens' additional comments

The Australian Greens thank everyone who made a public submission and/or public representation to this inquiry into the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021 and Online Safety Bill 2021 (the bill).
We are concerned that this inquiry has been unnecessarily truncated and therefore has not allowed for a full canvassing of potential issues and unintended consequences.
The Australian Greens consider online safety to be a significant issue and an important concern for Australian internet users. Online abuse directed at women, children, and minority groups is particularly abhorrent. The Australian Greens are also alarmed by the online proliferation of extremist groups, interests, and rhetoric.
The Australian Greens believe more can be done, and should be done, to protect vulnerable internet users, and stamp out abuse, violence, and extremism, both online and offline.
The bill does, in part, seek to provide protections and remedies for people subjected to online bullying, abuse, and the non-consensual sharing of intimate images. These aspects of the bill are important and to be commended, although some issues remain with these aspects of the legislation as it stands.
Despite some sections and objectives of the bill being commendable, many concerns were raised during consultation on the legislation's exposure draft. This is why the Australian Greens, along with many stakeholders, were disappointed that the Liberal and Labor parties colluded to ram the bill, which differs little from the exposure draft, through such a truncated inquiry process. In its submission, the Prostasia Foundation raised concerns that:
…the bill was introduced at first reading a mere 10 days after 370 public submissions on the exposure draft were received, with no amendments reflecting the very serious concerns that we and other respondents raised. Furthermore, less than a week was provided for submissions to the Senate Committee. This creates the appearance that the government is pushing this legislation through with only a token attempt at public consultation. Given that the law would significantly impact the human rights of millions of Australians, this lack of commitment to democratic accountability is disturbing and unacceptable.1
As the bill is being rammed through the Parliament, and through a truncated inquiry process by this committee, it has not yet been considered by either the Senate Standing Committee for the Scrutiny of Bills or the Parliamentary Joint Committee on Human Rights. This is disappointing because the bill, despite its many commendable sections and objectives, will also, as argued by former Senator for Western Australia and Greens spokesperson for digital rights Scott Ludlam, provide:
…intrusive and coercive powers with wide ranging effect, in the hands of an e-Safety Commissioner without recourse to adequate rights of appeal.2
Under powers provided by the bill, the eSafety Commissioner (the Commissioner) will become the sole arbiter of internet content suitability for all Australians. The Commissioner will be guided by the National Classification Code (the Code), which is currently being reviewed. As submitted by the Communications Alliance, given the review of the Code has not yet concluded:
…it is not clear how potential findings of this review will interact with the proposed new Online Safety Act, especially with Part 9, Online Content Scheme, of the draft legislation.3
Numerous submitters argued the Code was inappropriate for online mediums, and therefore not fit for the purposes of the bill. As submitted by the QUT Digital Media Research Centre (QUT):
By using the outdated Classification rules, this Bill creates new restrictions on lawful content that are a significant interference with the right to freedom of expression.4
Several submitters also argued that the Class 1 and Class 2 designations provided by the bill could capture lawful sexual content.
In particular, Class 2 captures non-violent sexual activity, including nudity and implied or simulated sexual activity, as well as materials considered "unsuitable for a minor to see".
The bill creates extraordinary power for a single, unelected bureaucrat to wield, with little to no oversight. This is why many submitters raised concerns regarding accountability. These concerns ranged from a lack of consultation, transparency, procedural fairness, and reporting required of the Commissioner, to a lack of quick and practical review and appeals processes with appropriate remedies, through to a lack of independent oversight of activities covered by the bill. As submitted by Electronic Frontiers Australia (EFA):
This is a breathtaking amount of power to be handed to a single person, regardless of the level of oversight. The severe lack of checks and balances over the exercise of power granted by the Bill only compounds the danger. Granting extraterritorial jurisdiction over all Internet content to an unelected person appointed by the government of the day is an astounding proposition in a country that holds itself out as a liberal democracy.5
A lack of oversight could lead to the bill being weaponised by people with moral or political agendas. People opposed to sex work, pornography, or sexual health for minorities (e.g. LGBTIQ+ people) could abuse the complaints process to seek to have lawful online adult content removed.
As submitted by Domestic Violence Victoria to consultation on the exposure draft of the bill, this legislation:
…provides opportunities for vexatious and malicious use of technology by perpetrators to further perpetrate family violence.
The Australian Greens see value in creating capacity for the quick takedown of violent extremist content. However, with limited checks, balances, and appeals, and no limitations on how many times a three-month blocking notice can be renewed, the bill could result in significant and detrimental unintended consequences for innocent businesses and individuals.
Vulnerable minority groups could also be adversely affected by the bill's requirement that content providers and platforms must take reasonable steps to prevent children from accessing Class 2 material. Due to the vast amounts of content many platforms host and/or curate, much of the responsibility for the monitoring and removal of Class 2 material will fall to automated processes, such as algorithms and artificial intelligence (AI). However, as submitted by Digital Rights Watch, these automated technologies:
…disproportionately remove some content over others, penalising Black, Indigenous, fat, and LGBTQ+ people.6
Digital Rights Watch also noted similarities between the bill and the controversial Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) and Stop Enabling Sex Traffickers Act (SESTA) legislation in the United States (US). In its submission, it argued that FOSTA-SESTA had led to platforms applying blanket bans and removing all sexual content as a simple way to avoid penalty, rather than applying more rigorous, and more onerous, processes to determine what content is actually harmful to users and viewers. Furthermore, Digital Rights Watch submitted that:
…when sex workers are forced offline they are often pushed into unsafe working environments, in turn, creating direct harm.7
The chilling effect of FOSTA-SESTA was also raised by Scarlet Alliance in its submission, evidence, and answers on notice. In its submission, Scarlet Alliance further argued that, as with FOSTA-SESTA in the US, the bill could lead to online platforms in Australia:
…over-comply[ing], capturing content that may or may not actually be illegal and leaving those impacted with little recourse as private companies become proxy censors.8
Scarlet Alliance was one of ten submitters specifically representing sex workers and content creators, along with Assembly Four, Behind Closed Doors Radio Show, Eros Association, Leila, Red Files, Respect Inc, Sex Work Law Reform Victoria, Victorian Pride Lobby, and Working Man. All of these submitters, who accounted for over one-quarter of all organisational submissions, raised significant concerns with the bill and how it would affect their industries, workers, and customers. The bill, they argued, would affect not only online content, but also online advertising of legitimate sex products and services. As Scarlet Alliance submitted, it holds significant concerns with:
…the way the Bill frames online harm and safety, and with the way it fails to consider the impact of action taken under the Bill on sex worker safety, both online and in real life ... the Bill fails to differentiate between actual harm and a subjective, moralistic construction of harm.9
Their concerns are amplified by the COVID-19 pandemic and its associated restrictions and job losses, which have not only impacted sex workers but also driven people from that and other industries into pornographic content creation. As submitted by Sex Work Law Reform Victoria:
Without the ability to see clients in person, many sex workers turned to alternative forms of generating income, such as creating online porn content on rapidly growing adult content websites such as OnlyFans. This move to online porn creation appears to be ongoing, with many sex workers maintaining an online porn presence even after coronavirus restrictions were lifted.10
As currently drafted, the bill could give the Commissioner power to limit, restrict, or undermine encrypted services and communications, as well as information-gathering and investigative powers that could cover encrypted services. This is particularly concerning, given the Commissioner has publicly argued:
Encryption can result in serious harms by hiding or exacerbating criminal activities…[and] make investigations…significantly more difficult.
For the bill to cover encrypted services and communications would be unacceptable scope creep. But this is a Government with a track record of trampling on people's right to privacy, as perhaps best demonstrated by the Assistance and Access Act 2018. Each time, it presents arguments around national security based on the dangerous notion that people need only fear intrusion if they have something to hide. But as argued by the United Kingdom's Don't Spy on Us coalition:
"If you have nothing to fear, you have nothing to hide" is not the language of a democratic society. Our right to privacy forms the bedrock upon which all of our other rights and freedoms are built.
The bill provides yet another example of why we need an Australian Charter of Rights. In its submission, EFA recommended:
…delaying the Bill until after a Federal enforceable human rights framework—one that includes protections for freedom of speech—is introduced into Australian law.11
Numerous submitters also drew attention to Australia's lack of privacy and digital rights protections. In its submission, Reset Australia argued for privacy and digital rights on par with the European Union's General Data Protection Regulation (GDPR), and in particular recommended the following rights be afforded to Australians:
Right to Erasure, as in Article 17 GDPR ... Right to Object, Article 21 GDPR …[and] Automated individual decision-making, including profiling, Article 22 GDPR.12

Recommendation 1
The Australian Greens recommend that the bill be withdrawn and redrafted to take account of concerns raised by submitters, including:
  • use of the National Classification Code, which is currently under review;
  • potential for elements of the bill to be used against lawful online content and content creators;
  • inadequate rights of appeal and remedy for businesses and individuals whose content is wrongly blocked or removed, either by the Commissioner or online platforms;
  • inadequate transparency and accountability regarding discretionary decisions made by a single, unelected officer;
  • powers covering restricted access/encryption services; and
  • potential significant and detrimental effects on sex workers.

Recommendation 2
The Australian Greens recommend that Australia introduce a constitutionally or legislatively enshrined Charter of Rights which includes privacy and digital rights consistent with the European Union's General Data Protection Regulation.
Nick McKim
Senator for Tasmania
Participating Member
Sarah Hanson-Young
Senator for South Australia
Deputy Chair

1 Prostasia Foundation, Submission 6, p. 1.
2 Mr Scott Ludlam, Submission 13, p. 1.
3 Communications Alliance, Submission 18, p. 5.
4 Digital Media Research Centre, Queensland University of Technology, Submission 38, p. 4.
5 Electronic Frontiers Australia, Submission 30, p. 9.
6 Digital Rights Watch, Submission 27, p. 5.
7 Digital Rights Watch, Submission 27, p. 3.
8 Scarlet Alliance, Submission 36, p. 2.
9 Scarlet Alliance, Submission 36, p. 2.
10 Sex Work Law Reform Victoria, Submission 12, p. 3.
11 Electronic Frontiers Australia, Submission 30, p. 4.
12 Reset Australia, Submission 20, pp. 15–16.
