Chapter 2
Key issues
2.1While most participants in the inquiry broadly supported the provisions of the Privacy and Other Legislation Amendment Bill 2024 (the bill), issues were raised mainly in relation to the provisions concerning:
Australian Privacy Principle (APP) codes;
emergency declarations;
Children's Online Privacy Code (COP Code);
overseas data flows;
penalties for interference with privacy;
automated decision making (ADM) and privacy policies;
statutory tort for serious invasions of privacy;
doxxing offences; and
future privacy reforms.
2.2Speaking on behalf of several civil society organisations with an interest in digital technologies, Ms Elizabeth O'Shea, Chair, Digital Rights Watch (DRW), broadly reflected on the bill:
Collectively, we welcome this bill and think that it is a good first step in the journey to improve and strengthen the Privacy Act 1988 [Privacy Act]. The Privacy Act has not been meaningfully updated in nearly four decades and it's simply not capable of responding to the significant technological changes we've seen in this time.
2.3Reset.Tech Australia agreed with that assessment by stating 'Australia's privacy laws are substantially out-of-date and ineffective in the digital age'.
2.4The Attorney-General's Department (AGD) highlighted the importance of reform to Australian privacy laws:
The rapid evolution of technology is transforming the way Australians engage with each other, providing significant benefits and opportunities. However, advances in technology are also facilitating harms, like scams, fraud and doxxing, and research indicates the law has not kept pace with community expectations.
2.5It is necessary to reform the Privacy Act to ensure that it is able to provide an appropriate level of protection to Australians in the digital age. As the AGD highlighted:
Strong privacy protection reforms are critical to the Government's efforts across a range of areas to ensure technology is deployed for the benefit of Australians. The measures in this Bill and future privacy reforms will support initiatives to protect children and adults from online harms, uplift cyber security across the economy, prevent and address scam and other fraudulent activity, ensure the adoption of safe and responsible Artificial Intelligence [AI] and provide a comprehensive and consistent legal framework to support the use of automated decision-making by Government.
Australian Privacy Principle codes
2.6Access Now argued the ability for the Information Commissioner to develop APP codes in the public interest would 'provide a regulatory backstop in case industry initiatives are slow or inadequate'.
2.7The Human Rights Law Centre (HRLC) also welcomed the proposal to grant 'greater powers and flexibility' to the Information Commissioner. Allowing the commissioner to develop temporary APP codes would be 'particularly important' where there is a heightened need to safeguard personal information in rapidly evolving or uncertain situations. The HRLC considered it appropriate that the bill includes a safeguard to prevent a temporary APP code from being in force for longer than 12 months.
2.8The Internet Association of Australia (IAA) suggested the bill be amended to require the Information Commissioner to:
…consult with entities that would be subject to the APP Code or temporary APP Code (as the case may be), and any other person the Commissioner considers appropriate in the development of APP Codes and temporary APP Codes.
2.9Mr Harry Godber, Head of Policy and Strategy, Tech Council of Australia, questioned the appropriateness of temporary APP codes for the technology industry. Given that they would only operate for up to 12 months:
I would question whether a temporary 12-month code would reflect the reality of what is required for tech companies to comply, which typically, depending on the nature of the reform, could include restructuring the way that data is collected, managed and held. It would require systems reforms that would take potentially longer than 12 months to implement.
2.10The Business Council of Australia (BCA) opposed the provisions that would expand the Information Commissioner's power to make APP codes, as doing so would increase the commissioner's 'role through delegated legislation, and risks imposing confusing and misinformed compliance obligations on participants'.
2.11The BCA warned the bill 'does not specify what the Minister must consider in determining that the Commissioner should make a code'. It suggested that could lead to situations where the minister makes decisions about 'how industry should operate, without consideration of the experiences of those directly impacted'.
2.12The Law Council of Australia (Law Council) did not support the proposal in the Privacy Act Review Report to enhance the Information Commissioner's code-making powers, upon which the provisions related to the APP code-making process are based. It argued that the proposal lacked 'sufficient clarity and, consequently, carried the potential for conflict and uncertainty'.
2.13The Law Council argued the Information Commissioner has significant 'knowledge of developments in the technological and privacy sphere…[and] is, therefore, well-placed to identify matters that need to be addressed by way of an APP code'.
2.14If the provisions that would enhance the Information Commissioner's code-making powers are to remain, the Law Council recommended the bill be amended:
…to empower the Information Commissioner to advise the Minister of the necessity for an APP code (or temporary code), and so that the Minister is required to consider this request prior to issuing a direction under proposed sections 26GA and 26GB of the Privacy Act.
2.15The AGD argued the amendment proposed by the Law Council is unnecessary:
The Information Commissioner already has advice-related functions as set out in section 28B of the Privacy Act which may be performed by the Information Commissioner on request or on the Commissioner's own initiative and include: Subsection (1)(a) providing advice to a Minister about any matter relevant to the operation of the Privacy Act, and Subsection (1)(c) providing recommendations to the Minister in relation to any matter concerning the need for, or the desirability of, legislative or administrative action in the interests of the privacy of individuals.
2.16The AGD explained that 'APP codes provide entities with certainty on how to comply with their obligations and provide individuals with transparency about how their information will be handled'. The amendment to the APP code-making process would 'provide clarity on how new privacy obligations will apply in different circumstances, including in the context of new and emerging technologies'.
2.17The Office of the Australian Information Commissioner (OAIC) stated consultation with entities affected by an APP code 'is a critical component of any code development process'. That office also has guidelines for the development of APP codes, which include consultation requirements. Those guidelines will be updated to reflect any amendments to the code-making process if the relevant provisions of the bill are passed.
Emergency declarations
2.18The main issue raised in relation to the emergency declaration provisions of the bill concerned a potential drafting error.
2.19The ABC and the Special Broadcasting Service (SBS) alerted the committee to that error. As currently drafted, the bill would permit the ABC and SBS to be provided with personal information during declared emergencies because they are 'national broadcasters' rather than 'media organisations'.
2.20Ms Lyn Kemmis, Senior Legal Counsel, SBS, and Member, Australia's Right to Know (ARTK), queried why the national broadcasters could be provided with personal information during declared emergencies when commercial media outlets would not be privy to that information. Her understanding of the provision:
…is about the provision of people's personal information in the case of an emergency, like a COVID outbreak. It is permitting government entities to quickly distribute information about individuals who might need to know their exposure. The media is not the way to do that.
2.21The AGD confirmed the national broadcasters were unintentionally excluded from the relevant provisions of the bill and should be treated in the same way as commercial broadcasters.
2.22The AGD explained the bill would 'enable emergency declarations to be more targeted'. Its provisions would require the kinds of personal information that would be shared during emergencies to 'be specified within the declaration, instead of allowing wide sharing of personal information in a declared emergency or disaster'. The amendment is intended to 'strike a better balance between protecting individuals' privacy, and enabling effective and coordinated responses to an emergency or disaster'.
Children's Online Privacy Code
2.23Most inquiry participants welcomed the development of a COP Code.
2.24ChildFund Australia emphasised 'the existing Privacy Act does not adequately address privacy and security challenges faced by children in the digital age'. It voiced concern 'about how children's personal data is collected, shared, and used by a range of actors, including technology companies'. There are potential life-long risks associated with inadequately protecting children's data.
2.25Those risks are associated with a wide range of potential online harms, many of which may not be immediately apparent. The Alannah and Madeline Foundation (AMF) suggested that the public conversation is focussed on the most visible risks associated with children's use of the digital environment. Personal information is collected by online platforms and monetised, often in ways that adults, let alone children, struggle to understand. Depending on how it is developed, the COP Code 'would provide a strong foundation for change'.
2.26UNICEF Australia (UNICEF) referred to data as 'the currency of the online world, and children's data – where it's collected, traded and sold on mass scales – is considered big business'. It saw the proposal to develop the COP Code as an opportunity to:
…ensure children's data is only collected and used in a way that serves their best interests and will provide them with the protections they are entitled to. It will hold tech companies accountable, ensuring they are transparent with how they use children's data, and that terms and conditions of apps are clear and straightforward.
Scope of the Children's Online Privacy Code
2.27UNICEF recommended that the COP Code apply 'to all online services likely to be accessed by children including educational technology providers, games and commercial health apps'.
2.28AMF agreed the COP Code should apply to a wide range of online services including 'apps, connected toys and devices, search engines, streaming services, online games and education products'. That approach would be consistent with the UK legislation upon which the COP Code is intended to be modelled.
2.29Reset.Tech Australia argued that the COP Code could be strengthened by expanding its application beyond 'social media, designated internet services and relevant electronic services'. For example, it suggested that it could be extended to 'EdTech and data brokers'. Learning from similar codes that have been developed in international jurisdictions could also improve the development of the Australian COP Code.
2.30Professor Elizabeth Handsley, President, Children and Media Australia (CMA), argued 'there is no justification' to exclude edtech from the COP Code. She alerted the committee to the situation in the United States where 'there has been a law firm set up…for the sole purpose of suing edtech companies for the way they treat children's data in that country'. In her view, 'the existence of a law firm solely dedicated to dealing with edtech's breaches of children's privacy tells you there is a major issue there that this legislation should really be addressing'.
2.31Additionally, children do not limit their online usage to child-specific platforms or services. For example:
Instagram and TikTok are very popular among Australian teens and younger children, but more than 90% of Australian Instagram users and approx two-thirds of TikTok users are recorded as aged 18 and over. This could enable their providers to state, plausibly, that they do not provide 'children's services' and therefore are not in scope of a [COP Code].
2.32The BCA indicated that whether a service is 'likely to be accessed by children' would be difficult for 'business to assess'. Based on experience from the UK, organisations had 'found it difficult to operationalise' similar provisions.
2.33Meta outlined the services that are included in 'industry codes that have been and are being developed under the [Online Safety Act 2021]'. They include:
…app stores, search engines, on-demand program services (e.g. some but not all streaming apps), third-party hosting services (cloud providers), ISPs and equipment makers and providers (for example, a company that makes phones or laptops).
2.34These services are regularly used by young people and should be included in the COP Code.
2.35The Food for Health Alliance argued that the term 'likely to be accessed by children' must be retained as the COP Code should apply 'to the services children use, not only those services designed for them'.
2.36Professor Handsley contended that '[i]f you're really serious about protecting children, you look at where they are, not at where people want them to be'. She explained:
Children are curious and they do have a right to go online, find information and get experiences, and they need to be kept safe in all of those places. If we really are serious about keeping children safe, then we apply the legislation and regulations where the children are and we limit their exposure to the relevant risks, rather than cutting it down to be a matter of what the aim was in putting that information out.
2.37Dr Mark Zirnsak, Senior Social Justice Advocate, Uniting Church in Australia, Synod of Victoria and Tasmania, agreed that industry should not be left to regulate itself:
Ideally in this space, you would want the regulator to almost be checking what kind of things children are accessing in reality. Just because a provider says, 'My content is targeted at adults', like it's an old game or whatever does not mean children are not accessing it. You'd almost want this checking to say, 'This is really a product that children are not accessing?', or in this case, 'Are they likely to be accessing?'.
2.38The Interactive Games and Entertainment Association (IGEA) and the Digital Industry Group Inc. (DIGI) suggested that international experiences of adopting similar children's privacy codes should be considered in the drafting of the COP Code.
2.39There should be alignment between the COP Code and international children's privacy codes. For example, the bill should be amended so that the COP Code applies to providers of social media services, relevant electronic services or designated internet services 'targeted at, or directed to, children'. Similar terminology is used in the US Children's Online Privacy Protection Act (COPPA). The bill would provide greater clarity about which services are expected to adhere to the COP Code if it is aligned with the COPPA.
2.40Ms Jessie Mitchell, Advocacy Manager, AMF, stated the term 'likely to be accessed' is similar to the terminology used in the UK children's code:
They say it's a digital environment where the probability of children accessing it is higher than the probability of them not accessing it, and they do provide some guidance to services as to how to make that assessment. I do agree it can be a challenging point for a service to assess, and that is a space where we'd want to see the regulator, the Office of the Australian Information Commissioner, being adequately resourced to provide that expert guidance.
2.41The IAA cautioned against broadening the scope of the COP Code. It considered that the terms 'relevant electronic service' and 'designated internet service' are already 'extremely broad'.
2.42The OAIC indicated the range of online services likely to be accessed by children would include, 'but not [be] limited to, social media services, websites, apps, instant messaging services, and online gaming services'.
2.43The COP Code would be required to 'set out how one or more of the APPs are to be applied or complied with in relation to children'. The code would be able to 'impose additional requirements provided those requirements are not inconsistent with the existing APPs. In this way, the requirements in the Code must be grounded in the APPs'. The COP Code could build upon the requirement for APP entities to have privacy policies and collection notices by:
…set[ting] out how organisations should tailor privacy policies and collection notices for a child so that they are clear and easy to understand, for example, by using graphics, video and audio content, rather than relying solely on written communication.
2.44The COP Code would apply to APP entities 'that provide social media services, designated internet services or relevant electronic services that are likely to be accessed by children'. It would be developed as 'children merit special privacy protection as they may be less aware of the risks and consequences associated with the handling of their personal information, particularly online'.
2.45Regulated entities would be required 'to design their services in a manner that protects children from harm'. The government:
…will consider additional proposals to increase privacy protections for children – including in relation to harmful targeting and trading in children's personal information, and requiring entities to have regard to the best interests of the child when handling their personal information.
2.46Ms Catherine Fitch, Assistant Secretary, Privacy Reform Taskforce, Integrity Frameworks Division, AGD, indicated the term 'likely to be accessed by children' was modelled on similar terminology used in the UK age-appropriate design code:
The UK Information Commissioner's Office guidance outlines what is meant by 'likely to be accessed by children', and it provides some guidance that I think many organisations would be familiar with. In our bill currently the subject of this inquiry it also provides factors relevant to whether a service is likely to be accessed by children, such as the nature and content of the service, whether it has a particular appeal and so on.
Exclusion of health service providers
2.47Ms Ariana Kurzeme, Director, Policy and Prevention, AMF, agreed with the intent of the provision of the bill that would exclude health service providers from the COP Code. However, she stated her organisation is:
…concerned about the proliferation of commercial platforms marketing health products and services of varying quality to children, including nominally free products, presumably based on handling personal data. We believe that protection should extend to all digital spaces where children's personal information is at risk of exploitation or misuse.
2.48The Law Council similarly queried why the bill would exempt health service providers from complying with the COP Code. As it is currently drafted, the exemption would:
…exclude many APP entities that should be covered by the COP Code, given that 'health service' is broadly defined in section 6FB of the Privacy Act (and includes physical and psychological health). This exclusion is also much wider than entities providing 'preventative or counselling services', as was agreed to in the Government Response to Proposal 16.5, and would potentially exclude many digital providers whose tools are targeted at children.
2.49The Law Council was not certain that the blanket exemption for health service providers is necessary, given the bill would 'allow for the OAIC to specify within the COP Code itself which APP entities are, and are not, covered'. On that basis it recommended '[t]he breadth of the exclusion of health service providers…should be narrowed to exclude counselling services only, not health services more generally'.
2.50The OAIC clarified the COP Code would not apply to health service providers 'which ensures the code is not a barrier to providing essential services to children'. Health service providers would include entities 'such as online counselling and advice services, and telehealth'.
2.51The AGD similarly explained health service providers would be excluded 'to ensure the COP Code is not inadvertently a barrier to providing essential services to children, and allows more detail about the scope of the COP Code to be determined through the code-making process', noting there is another provision in the bill that allows specified health service providers or types of health service providers to be bound by the COP Code.
Consultation with stakeholders in the development of the Children's Online Privacy Code
2.52The bill specifies that the Information Commissioner may consult with stakeholders when developing the COP Code. DIGI and Meta recommended the bill be amended to require the Information Commissioner to consult with stakeholders, including industry.
2.53DIGI agreed it is appropriate for the Information Commissioner to 'consult with children, children's welfare organisations, the eSafety Commissioner, and the National Children's Commissioner' when developing the COP Code. However, in addition to those stakeholders the bill should be amended to require the commissioner to 'consult with providers of Social Media Services, Relevant Electronic Services or Designated Internet Services'. That consultation would help to ensure the COP Code is 'fit for purpose and technically feasible'.
2.54Reset.Tech Australia welcomed the development of the COP Code by the OAIC, seeing that office's involvement as 'critical to the initiative's success'.
2.55Based on survey data provided by Reset.Tech Australia, the Australian public overwhelmingly supports an independent regulator, such as the Information Commissioner, developing the COP Code (see Figure 2.1).
Figure 2.1People's preference about who should write the codes for privacy safety for children (n=1508)

Source: Reset.Tech Australia, Submission 3, p. 5.
Note: Respondents were asked: If you had to choose, who would you most prefer to write the codes about online privacy for children?
2.56Miss Alice Dawkins, Executive Director, Reset.Tech Australia, elaborated on this evidence by providing a case study demonstrating the potential shortcomings of having industry draft its own codes. In that case, 'where industry attempted to insert safety standards for young Australians, particularly privacy defaults…we saw some unacceptably low standards of protection'. She maintained that industry should not be permitted to draft the COP Code and 'it's excellent news that the commissioner is being empowered to do this'.
2.57UNICEF reflected on the importance of including children in the design of the COP Code and suggested they must be consulted in its design:
Every child and young person under 18 has the right to participate and have their opinions included in decision-making processes that relate to their lives…Including the voices of children and young people in the development of policy isn't just the right thing to do, it's the smart thing to do – policies co-designed with children and young people are better placed to respond to their needs and deliver better outcomes.
2.58As it can be difficult for children to understand privacy matters, AMF suggested the consultation period be extended from 40 days to a minimum of 60 days. A longer consultation period would allow for meaningful consultation with children about their privacy.
2.59The IGEA also argued the bill should be amended to require the Information Commissioner to consult with industry on the development of the COP Code, rather than leaving that consultation to the commissioner's discretion.
2.60To address its concerns, the IGEA recommended:
As the Government has committed to developing the COP Code, the Code should be referring to services that are 'targeted at, or directed to, children', which is less ambiguous than the term 'likely to be accessed by children'.
Should the OAIC be assigned with the responsibility for developing and consulting on the COP Code, the Bill should explicitly require the OAIC to meaningfully consult with relevant industry stakeholders who are directly impacted by the COP Code. Consultation should at least occur during the development and public consultation stages of the Code.
2.61According to the Australian Human Rights Commission (AHRC), the COP Code would 'significantly strengthen privacy protections for children and young people'. To ensure that the views of children and young people are taken into account during the drafting of the code, the AHRC recommended the bill be amended to specify the Information Commissioner must consult with children and other stakeholders.
2.62In developing the COP Code, the OAIC:
…intend[s] to adopt a transparent and collaborative approach…and will consult widely with children, parents, child development experts, child welfare advocates, civil society, other regulators and across the online industry to ensure different voices are heard and represented throughout the process.
Definition of a child
2.63Some inquiry participants questioned the bill's definition of a child as an individual under the age of 18.
2.64BCA suggested the definition should be consistent with other legislation such as 'the age of criminal responsibility, minimum working age (which varies by State jurisdiction), and proposed definitions for social media platforms (and associated policies)'.
2.65Privacy 108 similarly discussed 'the ongoing debate around restricting children's access to social media and other digital platforms until they reach a certain age, likely between 14 and 16 years'. In its view, there should be a consistent definition in the Privacy Act and any other legislation that regulates children's use of technology.
2.66DIGI suggested a challenge associated with the COPPA is that it 'only applies to young people under the age of 13, where it is much easier to distinguish between material targeted to a child of that age vs an adult'. If the COP Code applies to anyone under the age of 18 'there will be challenges distinguishing between children under 18 and young adults'. The UK has provided guidance to regulated entities on this matter to assist them in determining if their service is likely to appeal to children.
2.67The Uniting Church in Australia, Synod of Victoria and Tasmania observed the internationally accepted definition of a child is someone under the age of 18. On that basis, it agreed with the definition included in the bill.
2.68It was suggested age restrictions may not be the most effective way of ensuring that online platforms protect children's data. For example:
…if a platform can claim that children are not supposed to use their service due to age restrictions, it may avoid the additional legal obligations that would otherwise apply to protect children's data. This loophole allows companies to argue they are not "knowingly" collecting data from children, even though children may still access their platforms.
2.69That loophole is evident in other jurisdictions, including the United States, 'where some platforms avoid compliance by stating their service is not intended for children under 13'. To avoid this loophole, online platforms should be required to 'adopt tailored privacy policies, and heightened security measures to ensure the privacy of children is genuinely protected'.
2.70AMF argued the international definition must be retained in the bill. That would align:
…it with both the Online Safety Act 2021 and the UN Convention on the Rights of the Child. This is crucial, as we've seen efforts by the tech industry to limit protections to younger age groups, for example, the first round of the industry codes under the Online Safety Act. We see these reforms as a once-in-a-generation opportunity to reduce the risk to children in digital environments, protect them from exploitation of their personal information and other intrusive practices, and uphold their rights.
2.71Ms Mitchell suggested that in instances where the definition of a child has been set at an age lower than 18 there have been 'lower default protections for children'. She cited the example of industry safety codes:
These codes were drafted by industry and they introduced a definition of what they called a young Australian child, meaning a child under 16, and it was only for that category that social media services were put into the new code to introduce high privacy settings by default. That's a lower threshold of protection than a number of other countries have. We think that children up to 18 should have a right to these inbuilt protections for their personal information, because of the flow-on effects that has for their experience and safety.
2.72The Law Council cautioned that defining a child as an individual under the age of 18 years 'may lead to unintended consequences'. The insertion of that definition would make it applicable to all matters in the Privacy Act, which would 'erode many of the existing privacy-enhancing practices that respect the agency of young people under 18 years'.
2.73Additionally, the definition of 'child' proposed in the bill could introduce inconsistencies regarding a child's capacity to consent in relation to their health and privacy.
2.74According to the Law Council, 'the generally accepted position regarding a child's capacity to consent' is as follows:
…the Government agrees in-principle that the Privacy Act should codify the principle that valid consent must be given with capacity…The guidance provides sufficient flexibility by allowing entities to decide if an individual under the age of 18 has capacity to consent on a case-by-case basis. If that is not practical, as a general rule, an entity may assume an individual over the age of 15 has capacity, unless there is something to suggest otherwise.
2.75To avoid those potential unintended consequences, the Law Council recommended 'the proposed definition of 'child' should be limited to the use of that term in the COP Code only'.
2.76The AGD responded directly to the concern raised by the Law Council, stating:
The proposed definition of 'child' in the Bill will apply across the Privacy Act unless the contrary intention appears. The Law Council refers to the importance of respecting the agency of young people under 18 years and the issue of capacity to consent. Currently, the approach to capacity to consent under the Privacy Act is set out in [OAIC] guidance rather than in the legislation and specifies that an individual must have capacity to give consent. Proposal 16.2, which was agreed in principle in the Government Response to the Privacy Act Review Report would codify in the Act the principle that valid consent must be given with capacity. The proposed definition of child would not be determinative of capacity to consent where required under the Privacy Act.
Overseas data flows
2.77Submitters broadly agreed with the provisions that would simplify the regulation of overseas data flows.
2.78BSA | The Software Alliance (BSA) reminded the committee that overseas data transfers are already permitted under the Privacy Act. By prescribing countries that have 'substantially similar' privacy protections to Australia, the amendment would 'provide businesses with greater legal certainty and substantially reduce compliance burdens'.
2.79It was not clear to BSA 'what would constitute a "substantially similar" level of protection'. It was concerned a strict interpretation that would require foreign privacy laws 'to mirror, point-by-point, the APPs, would defeat the purpose of the mechanism'. To avoid that situation, BSA suggested that further consultation be undertaken 'on the process for, and factors involved in, determining whether a country or certification scheme offers the appropriate level of protection'.
2.80BSA and the Global Data Alliance suggested two certification schemes that Australia could consider prescribing for this purpose. Those schemes are:
The Cross Border Privacy Rules; and
International Organization for Standardization (ISO) 27701.
2.81Some foreign privacy laws, including the General Data Protection Regulation (EU) 2016/679 (GDPR), have mechanisms that simplify the flow of data between countries. The Jeff Bleich Centre for Democracy and Disruptive Technologies, Flinders University (Jeff Bleich Centre), considered the limited reforms contained in the bill could mean 'Australian privacy law may not meet the privacy standards of other countries and jurisdictions'. That limitation would 'represent a problem for data controllers or processors attempting to transfer data to Australia'. For example, it could 'act as an impediment for scientific research between Australia and European Union countries'.
2.82The Law Council submitted 'the Bill does not expressly harmonise with existing cross-border mechanisms widely used by many APP entities to address the requirements of the EU GDPR and APP 8'. For that reason:
…the Bill should amend APP 8.2(a)—and the Privacy Regulation should also be amended—so as to expressly reference some of the mechanisms that are widely used by APP entities to address Article 46 of the EU GDPR ('transfers subject to appropriate safeguards'). Reference should particularly be made to safeguards, such as Standard Contractual Clauses, adopted by the European Commission in accordance with the examination procedure referred to in Article 92(2) of the EU GDPR.
2.83By expressly referring to those mechanisms, the bill would better 'avoid the unintended consequences of potentially conflicting measures being described, or adopted, by APP entities, especially if some countries may be added, or subsequently removed, by the regulations'.
2.84The AGD advised:
APP 8.1 requires entities to take such steps as are reasonable to ensure that an overseas recipient of personal information does not breach the APPs in relation to the information. APP 8.2 provides that APP 8.1 does not apply where certain circumstances are established. One such circumstance is where the entity reasonably believes that the recipient of the information is subject to a law or binding scheme that has the effect of protecting the information in a way that is at least substantially similar to the way in which the APPs protect the information and there are mechanisms that the individual can access to take action to enforce the protection of the law or binding scheme (APP 8.2(a)). Schedule 1, subclause 37 introduces a mechanism to enable countries and certification schemes to be prescribed as providing substantially similar protection to the APPs under APP 8.2(a). Entities may still make their own assessment about whether countries or schemes that are not prescribed meet this test for the purposes of APP 8.2(a). The Government has also agreed in principle to progress Proposal 23.3 of the Privacy Act Review to make standard contractual clauses available to APP entities for transferring personal information overseas.
2.85The AGD submitted the intention of the amendment is to 'provide greater certainty to disclosing entities about the standard of privacy protections in prescribed countries enhancing the flow of information across national borders while ensuring privacy is respected'. Future reforms will consider how overseas data flows can be further enhanced.
Penalties for interference with privacy
2.86Submitters broadly agreed with the proposed amendments that would introduce tiered civil penalties for breaches of privacy.
2.87CHOICE argued a tiered approach would 'better protect people from breaches of privacy that do not fulfil the criteria for a "serious" interference, but which are nevertheless harmful and should be deterred'. It suggested the penalty provisions could be improved by:
…allowing the courts to determine the penalty based on the value of the benefit resulting from the interference, or as a percentage of turnover. This would ensure that large businesses take the provision seriously.
2.88The IAA recommended the penalty provisions be accompanied by a 12-month education and awareness raising program to give industry and other stakeholders time to understand the compliance and enforcement approach.
2.89Clubs Australia agreed that education and remediation should be applied to minor infractions or unintentional breaches of privacy. That approach 'would foster a culture of compliance and continuous improvement' and ensure that smaller organisations are not burdened with large fines.
2.90The Law Council submitted that the penalty provisions 'are broadly consistent with Proposals 25.1 and 25.2 of the Privacy Act Review Report, to which the Government agreed in its Response'.
2.91The Law Council generally supported the penalties for interference with privacy contained in the bill. It suggested that those penalties:
…may not be appropriate if the Privacy Act is eventually extended to smaller organisations, noting that the Government agreed, in principle, to the removal of the small business exemption in its Response to the Privacy Act Review Report.
2.92The Law Council's main concern with the penalty provisions related to their clarity and proportionality to the offence. It queried whether the 'principles-based obligations [included in the bill]…are sufficiently prescriptive to enable certainty in compliance by entities'. The Law Council explained the civil penalties would apply to entities that have been found to have breached one or more of the APPs. Many of those APPs require entities to take '"reasonable" (as opposed to absolute) steps to address compliance. These are typically not prescriptive or binary matters that lend themselves to a simple determination of liability'.
2.93The provisions of the bill would allow the Information Commissioner to issue infringement notices without explaining how entities could better comply with their obligations under the Privacy Act. The Law Council suggested that 'may disincentivise—rather than promote—open and consultative communications with the OAIC'. The Law Council recommended the bill be amended so that the OAIC would be required to provide the entity with a 'notice that clearly outlines what is required to remedy the issue' before issuing an infringement notice.
2.94According to the AGD, the infringement notice power would be 'limited to specified provisions' and was designed in accordance with the AGD's Guide to Framing Commonwealth Offences. That guide:
…states that an infringement notice scheme is appropriate for 'relatively minor offences, where a high volume of contraventions is expected, and where a penalty must be imposed immediately to be effective' and 'an enforcement officer can easily make an assessment of guilt or innocence'. The specified provisions were selected to align with this guidance. The provisions selected are similarly prescriptive to provisions subject to infringement notice powers of other regulators including the ACCC, ASIC and ACMA.
2.95The infringement notice power would enable the Information Commissioner 'to issue infringement notices in relation to alleged minor contraventions of the Act. This would allow the Commissioner to ensure compliance with privacy obligations without the need for protracted litigation'.
2.96The OAIC welcomed the proposed introduction of a new civil penalties regime for interference with privacy and saw it as a means to enhance the enforcement powers available to it:
The enhanced civil penalty framework would provide more enforcement options to deter non-compliance and fill a gap where previously the Commissioner was only able to seek civil penalties for serious and repeated interferences with privacy, while the new infringement notice regime for administrative breaches of the Act would be a quick and cost-effective way for the OAIC to respond to non-compliant behaviour without the need for court proceedings.
2.97The AGD similarly explained that the tiered penalties regime would:
…provide more enforcement options to the Information Commissioner to deter non-compliance and address a gap in the enforcement of privacy protections which allowed the Information Commissioner to seek civil penalties only for the most serious or egregious interferences with privacy. Lesser penalties and an infringement notice scheme for breaches of the Act that are less serious will allow the Information Commissioner to resolve matters more efficiently and proportionately.
Automated decision making and privacy policies
2.98The Tech Council of Australia raised concerns about the lack of clarity in the drafting of APPs 1.7-1.9. It suggested they will require 'substantial refinement for entities to interpret and apply consistently across the economy'. It may be appropriate to introduce the controller/processor distinction to provide the necessary clarity.
2.99It argued the ADM provisions should 'be carefully reconsidered'. As they are currently drafted, they are:
…likely to inadvertently capture ADM activities that contribute positively to a safe and functioning internet, which includes automated decision-making systems to manage spam, scam, ADM systems that support vital cyber threat detection and security measures to combat hacking and fraud, as well as systems that assist in the moderation of harmful online content.
2.100The BCA similarly warned increased transparency 'about ADM processes could result in security risks'. For example, detailing how 'ADM processes are used to detect fraud…would tip off fraudsters and help them avoid detection'.
2.101The disclosure of information about ADM processes could also have implications for the protection of intellectual property and commercially sensitive information. A requirement to disclose that information could 'discourage or undermine business innovation'. The GDPR provides an exemption for 'trade secrets or intellectual property which, if disclosed, will adversely affect the rights or freedoms of others'.
2.102The Australian Chamber of Commerce and Industry (ACCI) agreed '[i]t will be important that information required to be disclosed by these [APP] entities is not commercially sensitive'.
2.103The Law Council argued there is insufficient clarity around some of the terms used in the provisions related to automated decision making:
For instance, we are concerned that the Bill fails to provide certainty as to the meaning of 'automated decisions' and imposes an unnecessarily high bar with the proposed requirement for the computer program to 'make, or do a thing that is substantially and directly related to making, a decision'.
2.104The Law Council provided the following terminology related to automated decision making in the GDPR:
…decisions based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
2.105The Explanatory Memorandum (EM) provides the following meaning of 'computer program':
The term 'computer program' in APP 1.7(a) is intended to take its ordinary meaning and encompass a broad range of matters, including pre-programmed rule-based processes, artificial intelligence and machine learning processes to make a computer execute a task.
2.106These differing definitions of automated decision making may limit the harmonisation and interoperability of the Privacy Act and the GDPR. In the Law Council's view:
Alignment to existing frameworks is required to address the need for consistent practices and harmonisation with existing regimes that already regulate this field of activity and type of technology. This need for clarity is further reinforced by the fact that non-compliant disclosures will be the subject of new civil penalty provisions under the Bill.
2.107To ensure that the Privacy Act better aligns with existing legal regimes in foreign jurisdictions, the Law Council recommended:
The terminology in Part 15 of Schedule 1 to the Bill should be aligned with Article 22 of the EU GDPR, which regulates 'a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly affects him or her'.
2.108The Law Council also suggested that the bill appears to have been drafted with the understanding that automated decisions are made in relation to a single decision. It argued that, in some circumstances, ADM processes:
…may be the result of several decisions that follow a series of 'decision trees'—some of which may include the use of computer programs in deciding what branch of the decision tree is taken next. This may be a complex process, potentially involving sensitive commercial-in-confidence information, that is not appropriate for disclosure in that entity's privacy policy.
2.109The Law Council was not certain about whether the bill was 'drafted to capture these circumstances'. To provide further clarity, the Law Council recommended the bill be amended to explain how entities are expected to comply with their new obligations 'in circumstances where a series of decisions are made, some of which may include the use of computer programs and commercial-in-confidence information'.
2.110If the bill is passed, it is possible that an entity required to disclose how personal information will be used in automated decision making processes will include generic statements in its privacy policy. Such a statement would fulfil the new obligation but may not provide any 'substantive information to meet the commendable objective of providing meaningful information to individuals'.
2.111The Law Council recommended the bill:
…be amended to include a list of factors that must be considered by APP entities, prior to determining whether an automated decision may reasonably be expected to affect the rights or interests of an individual.
2.112To help individuals better understand how their personal data may be used in ADM processes, the Law Council recommended the bill 'be amended to provide for a right for individuals to request meaningful information about how substantially automated decisions with 'legal or similarly significant effect' are made'.
2.113The AHRC accepted the bill would generally improve transparency of ADM processes. However, that transparency 'may be limited due to the inaccessible nature of privacy policies, with people either not reading them or struggling to understand them'.
2.114The AGD explained the provisions related to ADM are intended to increase:
…transparency about substantially automated decisions which significantly affect individuals' rights or interests. Entities will be required to include information in their privacy policy about the kinds of decisions and kinds of personal information used in these decisions. The use of the language 'rights or interests' is intended to have broad coverage. Rights do not have the same application in Australian law as in Europe which has more developed rights-based frameworks. Interests may include things that are not rights under the Australian law – for example the provision of benefits under an Act or denial of significant services or support.
2.115ADM has the potential to provide 'significant opportunities for enhancing productivity and facilitating economic growth, and improving outcomes for Australians across the areas of health, environment, defence and national security'. However, there are possible privacy risks associated with those potential benefits:
…ADM systems may pose privacy risks as they can use personal information about individuals to assist or replace the judgement of human decision makers in ways which may have significant impact, with little transparency. Providing individuals with greater transparency on ADM allows them to understand how an entity handles their personal information and for what purposes, and allows them to take further action if there has been a breach of their personal privacy.
2.116If the bill passes, entities would be required:
…to include information in privacy policies about the kinds of personal information used in, and types of decisions made by, computer programs that use personal information to make decisions that could reasonably be expected to significantly affect the rights or interests of an individual.
2.117According to guidance materials provided by the OAIC, 'a privacy policy is general in nature, and focuses on the entity's information handling practices'. Privacy policies are 'not expected to involve any commercial-in-confidence information'.
Information about the use of personal data
2.118Some inquiry participants suggested that the bill take a stronger approach by providing individuals with information about how their personal data has been used in ADM.
2.119For example, the Jeff Bleich Centre agreed the bill would 'bring openness and transparency to the processing and use of personal information', but argued that it 'fails to bring Australian privacy law in line with other jurisdictions'.
2.120In the European Union, the GDPR:
'requires a data controller to inform a data subject whether their personal data will be processed as part of automated decision making';
'requires meaningful information about the logic used in processing'; and
'allows a person to opt out of automated decision making if it would produce legal effects that significantly affect this person'.
2.121The Jeff Bleich Centre recommended the bill be amended 'to reflect emerging regulatory practices and the need for openness and transparency in data processing'.
2.122Privacy policies do not limit the collection and use of personal information or generally inform consumers about how that information is collected and used.
2.123Those policies are generally very lengthy, and it would take the average person 14 hours to read the privacy policies of all the sites and applications they use in a single day. CHOICE argued:
Rather than adding to the length of privacy policies nobody reads, a far more effective reform would be to introduce the fair and reasonable use test recommended by the Review. This would require that businesses limit their collection, use and disclosure of personal information (including in automated decision-making) to purposes in line with a reasonable person's expectations.
2.124The BCA agreed the provisions could increase 'information overload in a way that is less, not more, understandable to our customers'.
2.125The bill would also place a large burden on businesses as it would require them 'to review all decision-making processes in their business or organisation that rely on personal information'. They would then need:
…to assess the extent to which those decisions relate to ones that could reasonably be expected to significantly affect the rights or interests of an individual and depend solely or substantially on automated means.
2.126Businesses would then need to update their privacy policies after the implementation of new ADM processes. In the process of doing that, businesses would be required to assess whether that new ADM process makes, or does 'a thing that is substantially and directly related to making, a decision'. In the BCA's understanding of the bill, that would include everything 'from basic use of spreadsheet formulas to complex AI systems'. In global privacy laws, similar requirements 'are typically limited to decisions that are 'solely' based on automated processing or are made without any meaningful human involvement'. For example, if an ADM process is used to provide a human decision-maker with data to make a decision, 'this could constitute an 'automated decision' under the Australian law but would not elsewhere in the world'.
2.127While the bill would increase transparency around the use of automated decision making in line with some of the recommendations of the Privacy Act Review, it would not incorporate Proposal 19 in its entirety. The bill does not include an associated accountability mechanism, which means that it 'do[es] not live up to the spirit of Proposal 19 as a whole'.
2.128Consumers should also have the 'right to request meaningful information about how substantially automated decisions with legal or similarly significant effects are made'. The introduction of that right would fulfil one of the recommendations made in the Review and complement new APP guidance on automated decisions. Without this right in the bill 'consumers [would be] aware that automated decision-making was used by a service, but without any idea how or to what effect'.
2.129CHOICE argued the right for consumers to be able to know how their data is being used:
…is a baseline expectation…[B]usinesses should just say how they are using the data across the board. That's really the bare minimum. We think that businesses should be obliged to use it in a fair and safe way, and not just let consumers know how they are using it.
2.130Amending the bill to incorporate similar language to that used in the GDPR would assist in better incorporating Proposal 19:
…the GDPR requires 'data controllers' to provide 'meaningful information about the logic involved in the decision-making, as well as the significance and the envisaged consequences of such processing for the data subject'.
2.131An amendment to APP 5 to require entities to specify the use of ADM, and include 'meaningful information about the personal information and logic used' would similarly satisfy the spirit of Proposal 19.
2.132The Insurance Council of Australia (ICA) indicated the bill would 'apply to decisions that may affect an individual's interests as well as their rights'. That differs from the proposal put forward in the Privacy Act Review Report, which only mentioned individuals' rights. The ICA argued that by including interests the bill would broaden:
…the scenarios and use cases where organisations may have to disclose the use of ADMs. And given the penalties for non-compliance, this could perversely incentivise organisations to over-disclose information, leading to cognitive overload for customers.
2.133The Australian Banking Association (ABA) echoed that view:
The focus on 'interests of an individual' is not a position that was previously contemplated, it introduces unnecessary ambiguity and broadens the application of the APP without any likely consumer benefit. Contrary to the intention of the reforms, inundating consumers with excessive and unnecessary disclosures could lead to general customer confusion.
2.134In its view, privacy policies should be required to outline the use of 'substantially automated decisions that have a legal or similarly significant effect on an individual's rights'.
2.135The Financial Advice Association of Australia recognised 'consumer rights, whilst vital in empowering individuals, cannot be a replacement for concerted and collective action by Government targeted at the misuse of information in this space'.
2.136For example, BSA argued that consumers should be able 'to know how their personal data is used and protected'. That right 'should be backstopped by strong legal obligations on companies that collect or process personal information'.
2.137The Human Technology Institute, University of Technology Sydney (HTI) submitted that while the requirement to include information about ADM in privacy policies 'would improve transparency regarding when automation is used in decision making, it is unlikely to have a significant practical impact'. The provision of additional information in a privacy policy that an individual would need to consent to prior to accessing a product or service would do little to support them in 'seeking a review of an adverse decision'.
2.138The HTI recommended:
…the Bill should be amended to include a provision that would provide individuals with the right to request meaningful information about how substantially automated decisions with legal or similarly significant effects are made – as recommended by the Privacy Act Review report.
2.139Privacy 108 took a similar view, arguing that 'individuals must have stronger rights beyond transparency when AI is used', including:
The right to human intervention in AI-driven decision-making processes;
The right to refer AI uses to a specialist regulator for review based on fairness and the protection of broader human rights, particularly regarding their impact on individuals.
2.140Arca warned that the bill does not provide guidance on 'how specific the disclosures about automated decision making need to be'. A lack of guidance could result in entities 'making their disclosures as complete as possible'. In some instances, that would lead to very lengthy disclosures that are unlikely to be read by consumers. In other cases, some APP entities may disclose less information than others. Divergent approaches are likely to 'increase consumer confusion and uncertainty'.
2.141To overcome the limited guidance, Arca recommended either the:
addition of a new subclause to explain 'that subclauses 1.7 and 1.8 do not require APP entities to set out every single piece of information used by a computer program, or the effect the information has on the decision'; or
EM be amended to provide more guidance, specifically that APP 1.8 does not require 'very granular detail'.
2.142Ms Louise McGrath, Head of Industry Development and Policy, Australian Industry Group (Ai Group), suggested it would be difficult for businesses to comply with the proposed ADM provisions:
Due to the generally commercial or proprietary nature of automated decision-making or other artificial intelligence tools, it will also be extremely difficult for most employers to be open and transparent in an APP privacy policy. A better approach would be to support our members in their adoption of these tools, including with reference to the 10 voluntary guardrails recently introduced by the government to support safe and responsible AI use.
2.143According to the AGD, future privacy reforms are expected to focus on:
…measures to increase the transparency of ADM – including by providing individuals a right to request meaningful information about automated decisions that have a significant effect on an individual's rights or interests. This proposal will be further considered as part of a second package of privacy reform measures, and in the context of the [sic] developing a consistent legislative framework for the use of ADM in the delivery of government services to ensure consistency in approaches.
2.144Ms Virginia Jay, Director, Privacy Reform Taskforce, AGD, explained:
The Privacy Act report proposed that privacy policies should set out the types of personal information that would be used in substantially automated decisions which have a legal or similarly significant effect on an individual's rights. That wording was drawn from similar wording used in the European Union's GDPR. The bill has applied that transparency requirement to decisions which could be reasonably expected to affect the rights or interests of an individual. The application of that transparency requirement to decisions with significant effect on an individual's interests, in addition to rights, was in recognition that rights do not have the same application in Australian law as in Europe, which has a more developed rights based framework. In Australian law, the concept of interests covered the types of similarly significant effects it was anticipated that the new requirement should cover—that is, the provision of benefits under an act or the denial of significant services or support.
Use of artificial intelligence in automated decision making
2.145Several submitters referred to the consultation underway by the Department of Industry, Science and Resources (DISR) on introducing mandatory guardrails for high-risk uses of AI.
2.146Some of those submitters suggested the ADM provisions in the bill be implemented after the other regulatory reforms related to AI and ADM are completed. Implementing the provisions prior to that could increase complexity, introduce conflicting requirements, add to compliance burdens, and lead to uncertainty.
2.147Dr Lisa Archbold et al disagreed with that perspective. In their view, 'the possibility of overlap with AI regulation is not a good reason to delay the implementation of Proposal 19.3'.
2.148Ms Celeste Moran, First Assistant Secretary, Integrity Frameworks Division, AGD, acknowledged the AGD is working with DISR on AI policy. The ADM provisions in the bill are designed to complement 'the work DISR are doing in relation to AI to ensure that safe and appropriate use of AI'.
2.149The government has agreed to implement Proposal 19.3 of the Privacy Act Review. According to the AGD:
This proposal is proposed to be advanced in a further package of reforms, alongside other proposals to expand and introduce new individual rights. This would allow implementation of these reforms to be informed by the Government's work to develop guardrails for safe and responsible AI and a legal framework to support [ADM], consistent with the principles recommended by the Robodebt Royal Commission.
Statutory tort for serious invasions of privacy
2.150Most inquiry participants broadly supported the introduction of a statutory tort for serious invasions of privacy.
2.151There is wide community support for legal mechanisms that address invasions of privacy. According to research conducted by CHOICE, 85 per cent of Australian consumers 'believe in the right to sue a business that breaches their privacy'. CHOICE considered that as the proposed tort:
…would only apply to intentional or reckless invasions of privacy…[the] impact on businesses would be minimal and would only impose a bare minimum standard that consumers should be able to expect of all businesses.
2.152The HTI submitted the introduction of a statutory tort for serious invasions of privacy would 'provide protection for individuals from some of the worst forms of privacy breach, and provide for a right to remedy for some people affected by such privacy violation'.
2.153The main concerns raised in relation to the statutory tort related to the:
increased legal risk for certain professions and industries;
fault element and whether it should be broadened to include negligent acts that lead to serious invasions of privacy;
exemptions for journalists and enforcement bodies and whether a similar exemption should be included for businesses; and
public interest test.
Increased legal risk
2.154The ACCI and the Ai Group did not support the introduction of the statutory tort provisions as currently drafted. The ACCI argued the provisions 'would create significant legal issues, incentivise class action lawfare, and is inconsistent with the approach taken by like-minded countries like New Zealand, the US and Canada'.
2.155The Australian Medical Association (AMA) was similarly concerned about the potential for the tort to expose 'healthcare providers and researchers to significant legal risks'. The tort would operate:
…independently of the [APPs] and the rest of the Privacy Act, meaning an individual or organisation can be sued under Schedule 2, even if they have complied with the Privacy Act, or are not subject to its provisions. It also introduces a dual liability system, where medical professionals may face penalties under the Privacy Act and damages under Schedule 2 for the same act of collecting, using, or disclosing personal information.
2.156Uncertainty about what would be considered a 'reckless' or 'serious' invasion of privacy 'also creates ambiguity for medical professionals, hospitals, researchers and public health bodies who routinely handle sensitive personal data'. Healthcare providers are also protected by exemptions under the Privacy Act, which would not apply to the tort. If the tort provisions were passed, those providers would be put at increased risk of litigation for breaches of privacy.
2.157Some of those risks could arise from 'actions that are routine in medical practice', such as collecting family medical history or reports from other specialists without express consent. Medical professionals often engage in other practices that are necessary for them to comply with other legal obligations associated with their work or provide medical care and treatment. Some of those practices that could expose medical professionals to the tort include:
'disclosing health information to family members or authorities';
'raising concerns about colleagues'; and
undertaking 'medical research using personal data'.
2.158The AMA was also concerned about the lack of an exemption from the statutory tort for peer-reviewed scientific journals, including the Medical Journal of Australia. Without an exemption, 'the author, the editor, and potentially the directors and other senior officers' could be subject to the tort if they publish information about a person's financial links to a health product or service.
2.159The Justice and Equity Centre suggested the proposed tort and its defences contain provisions that would 'alleviate many of the concerns raised' by the AMA. As the tort would only apply 'in circumstances where a person has a 'reasonable expectation of privacy'', many of the AMA's concerns would not be subject to the tort. Furthermore, 'permitted health situations' are exempt from the APPs. The tort is also limited to the 'misuse of personal information'. In the Justice and Equity Centre's view, it is unlikely that the legitimate use of medical information would reasonably constitute such a misuse. The obligation of medical professionals to make mandatory reports to regulatory bodies, such as the Australian Health Practitioner Regulation Agency, 'would attract the lawful authority defence' contained in the bill.
2.160The AICD also queried whether a company that experienced a cyber attack involving the theft and misuse of personal information by a third party would be subject to the tort. It was concerned that such a company could take all reasonable steps to protect itself from a cyber attack and still find itself in a situation where personal information is misused.
2.161Google proposed 'an innocent dissemination defence' for digital platforms that have acted as an intermediary in the serious invasion of an individual's privacy. It argued '[i]t would be impractical for a digital intermediary to operate under any other premise'.
2.162Mr Harry Godber, Head of Policy and Strategy, Tech Council of Australia, raised similar concerns:
…if a tort of privacy is introduced that is actionable per se that does not require demonstrated significant loss in order to be actionable, you would risk having large classes of action against any company at any time where a data breach might have occurred. The crux of the proposed tort is that a serious breach has occurred—a serious invasion of privacy—but does not necessarily require an individual to have experienced any loss in order to pursue what would be nominal damages but in a class action environment could pose a significant threat to the just, quick and cheap operation of the legal system.
2.163The AGD explained the tort has several protections for the legitimate use of personal information:
The tort already provides a number of safeguards to protect legitimate and necessary activities in the medical sector. This includes a public interest balancing element that requires a plaintiff to satisfy the court that the public interest in protecting their privacy outweighs any public interests the defendant may raise. The non-exhaustive list of public interests explicitly includes public health. The tort also includes a range of defences, including consent, and a necessity defence where the defendant reasonably believed that the invasion of privacy was reasonably necessary to prevent or lessen a serious threat to the life, health, or safety of a person. The model of the tort was deliberately crafted to ensure that privacy is balanced with other important public interests. This design of the tort should ensure that legitimate practices in the course of medical care or research do not attract liability under the tort.
Fault element
2.164The AHRC suggested the bill is too restrictive by limiting the cause of action to serious invasions of privacy that are caused by either intrusion upon seclusion or the misuse of private information. That 'approach is not responsive to a rapidly changing world where digital innovation has sometimes aimed to 'move fast and break things''. A more responsive approach would reflect the human right to privacy set out in the International Covenant on Civil and Political Rights (ICCPR).
2.165Ms Georgia-Kate Schubert, Head of Policy and Government Affairs, News Corp; and Member, ARTK, argued that approach would give primacy to the right to privacy and ignore the right of free speech that is also enshrined in the ICCPR. In her view, that would not be appropriate as '[b]oth of those rights actually have equivalence'.
2.166Emeritus Professor Barbara McDonald confirmed that the ALRC had 'no interest in preferring one [right] over the other' when it proposed the model statutory tort.
2.167According to the EM:
The statutory tort for serious invasions of privacy is intended to operate similarly to other torts, in that it would be developed through jurisprudence. It is distinct from the regulatory regime established in the Privacy Act, which requires compliance with the APPs and is overseen by a regulator. As such, it is intended that courts would draw on key concepts from other torts, including privacy torts in other jurisdictions.
2.168The AGD stated there is broad public support for the introduction of the statutory tort, based on public consultation held in March 2024.
2.169The cause of action would only apply if an individual:
…suffer[s] a serious invasion of their privacy, either by an intrusion into their seclusion or by misuse of information, in circumstances where a person in their position would have a reasonable expectation of privacy. Only intentional or reckless invasions are actionable. A mistake – such as an accidental data breach – or mere negligence would not be sufficient. There is no requirement for a plaintiff to prove damage as a result of the invasion, however, the damage or harm a plaintiff suffers will be a relevant factor in assessing the seriousness of the invasion, and the remedies that may be awarded.
2.170The tort would balance the right to privacy against the public interest:
Where there is evidence of a competing public interest, the plaintiff must satisfy the court that the public interest in protecting their privacy outweighs it. This balancing exercise is a key element of the cause of action, and recognises that a plaintiff should not be able to claim that a wrong has been committed where an invasion of privacy is justified in the public interest. For example, an individual's privacy may legitimately be invaded in the course of taking action to protect them from serious harm, for example in a bushfire or other emergency situation.
Negligent acts
2.171The HTI argued 'the tort imposes an appropriately high threshold; it does not, for example, include invasions that are negligent, but requires recklessness or intent to invade the plaintiff's seclusion or misuse of their information'.
2.172Emeritus Professor McDonald and Professor David Rolph argued:
…it is appropriate, at this early stage, to limit the statutory tort to intentional and reckless conduct, for several reasons: the novelty of the cause of action in Australia; other available remedies for negligent conduct or for breaches of the Australian Privacy Principles and similar data protection legislation in the States and Territories; and the fact that intentional and reckless invasions of privacy comprise the most egregious forms of invasions of privacy. The recommended fault element underlies the recommendation that the cause of action be actionable without proof of damage.
2.173In their view, 'if the fault element were to be broadened to include negligence or abandoned in favour of strict liability' the cause of action without proof of damage should be rethought.
2.174The AICD argued 'it is critical that a statutory tort be confined to 'serious' invasions of privacy and require a fault element of 'intentionality or recklessness''. In its view:
Mere 'negligence' is not sufficient – particularly where it is proposed that the statutory tort be actionable even without proof of actual loss or damage. Further, what system or processes are deemed negligent in terms of protection of data is unclear given the evolving nature of data governance and the dynamic threat environment.
2.175Confining the fault element to intentional or reckless acts 'would be consistent with comparable jurisdictions such as the UK, New Zealand and the US'.
2.176Google supported limiting the tort to serious invasions of privacy. If the bill was amended to require a plaintiff to prove damage, the legislation 'would be better aligned with the considered reforms in defamation and would strike a better balance between the efficient use of court resources and individuals' rights to privacy'.
2.177In contrast, Electronic Frontiers Australia recommended the bill be amended to include negligent acts in the tort.
2.178The AHRC suggested 'the fault element should also cover negligent acts'. It explained the fault element includes an entity conducting an intentional or reckless act that seriously infringes on an individual's privacy. The term 'reckless' is defined in the Criminal Code Act 1995 and 'requires awareness of a substantial risk. In contrast negligence requires a 'great falling short of the standard of care that a reasonable person would exercise in the circumstances''.
2.179As the fault element would currently require 'a specific awareness of risk, it potentially rewards ignorance of privacy obligations and allows individuals to shield themselves from litigation by pleading ignorance'. The AHRC recommended the bill remove the fault element as it is 'too high'. If the fault element is retained, it should be redrafted 'to include negligent acts'.
2.180The ALRC considered including negligent acts in its model tort. Its report stated:
If the tort were not confined to intentional or reckless invasions of privacy, but was extended to include negligence or provide for strict liability, this would undermine an important justification for making the tort actionable without proof of damage. Rather, such an extension would require proof of actual damage to be consistent with other tort law.
2.181Emeritus Professor McDonald suggested that expanding the definition beyond 'intentional or reckless invasions of privacy' to include negligent acts would require a re-examination of the whole statutory tort framework:
…the design needs to be taken as a whole—particularly the point that it is actionable without proof of 'actual damage', as that is known in the law. 'Actual damage' means personal injury, psychiatric illness, property damage or economic loss. The law traditionally does not treat emotional distress as damage for the purposes of any negligence action. It was a contentious issue as to whether we should extend it to negligence. In our view, looking at what has happened in other jurisdictions, the most egregious forms of invasions of privacy have been deliberate.
…
Extending it to negligence without requiring proof of actual damage flies in the face of negligence law generally.
2.182The tort 'is consistent with the model the ALRC recommended in that it doesn't include a serious harm threshold. It does only apply to serious invasions of privacy, though'. According to Ms Fitch:
…this recognises that the harm suffered by a serious privacy invasion might often be emotional distress rather than actual damage. Requiring proof of actual damage would prevent damages being awarded for the significant distress and mental anguish which may be caused by a serious invasion of privacy but isn't generally considered under law as actual damage.
Exemptions
2.183Some inquiry participants highlighted the importance of achieving an appropriate balance between protecting privacy and ensuring that the public interest is not harmed. In their view, the bill should ensure there are safeguards for whistleblowers and journalists acting in the public interest.
2.184Google argued the bill should expressly address some additional exemptions. In its view, the tort should not apply to:
any inadvertent capture of images of private activities by street photography, CCTV cameras, or drone usage; and
digital intermediaries associated with third party invasions of privacy of which they have no knowledge.
2.185The HTI recognised that while the tort is broadly based on a model developed by the ALRC, it differs from that 'model in some key respects'. In its view, 'the broad exemptions in the Bill will unnecessarily weaken the scope of the tort's application, and the effectiveness of the tort as a tool to address serious infringements of privacy'.
2.186The HTI explained:
International human rights law, and rule of law principles, generally provide that any legal exemption or exception should flow from a specific, demonstrated justification based on the particular circumstance or activity in question, rather than merely the status of an organisation as operating within a context such as 'law enforcement,' 'intelligence' or 'journalism'.
2.187There are grounds to limit the right to privacy:
…where the limitation is lawful and not arbitrary. In order for interferences not to be arbitrary, they must seek to achieve a legitimate aim (such as public interest, national security), and be reasonable, necessary and proportionate to achieving that aim.
2.188The AHRC similarly submitted that there are limitations to the right to freedom of expression:
The right to freedom of expression has been described as constituting 'the foundation stone for every free and democratic society' and is enshrined in a range of international and regional human rights instruments. The right is not absolute and its exercise 'carries with it special duties and responsibilities'.
2.189Restrictions on the right to freedom of expression:
…must be provided for by law and may only be imposed for 'respect of the rights or reputations of others' or 'for the protection of national security or of public order or of public health or morals'. Any such restrictions must also meet strict tests of necessity and proportionality. This requires that any proposed restriction pursues a legitimate aim, is proportionate to that aim, and is no more restrictive than is required for the achievement of that aim.
2.190The AHRC explained:
The proposed criminal offence of doxxing will limit the human right to freedom of expression by restricting certain forms of sharing information. The key issue is ensuring any offence is carefully tailored to meet the strict tests of necessity and proportionality and to avoid capturing reasonable online discourse about a person.
2.191The AHRC recognised there may be instances where there are legitimate public interest grounds to share or publicise personal information. It also suggested:
While the sharing of information online in this way has the potential to enhance public safety, there is also a potential risk of digital vigilante activity, which may see individuals seek to enforce a 'parallel form of criminal justice', and can undermine rule of law protections.
2.192The doxxing offences could also 'unreasonably capture public interest whistleblowing and journalism'.
2.193The bill would:
…guard against these risks by providing that the offences only apply where 'the person engages in the conduct in a way that reasonable persons would regard as being, in all the circumstances, menacing or harassing towards those individuals'. However, including a provision in the Bill which expressly protects the release of information for legitimate public interest purposes would help to further strengthen the protection of freedom of expression while still effectively addressing the harms caused by doxxing.
2.194To provide greater protection of freedom of expression, the AHRC recommended the bill include a provision that protects 'the release of information for legitimate public interest purposes'.
2.195The AGD explained the exemptions were included in the bill to save the time and resources of courts and litigants:
The main rationale for including exemptions rather than relying on defences or the public interest balancing test itself is that it makes it clearer at an earlier stage that, on the face of the act, for the tort a broad range of entities are not captured. That would be intended to indicate there's no need to begin an inquiry into these matters at all or for those entities to even raise a defence. That's ultimately directed at not wasting time and resources, either of the litigants or the courts.
Exemption for journalistic activities
2.196The bill's exemption for journalism was welcomed by some inquiry participants.
2.197For example, the Queensland Council for Civil Liberties argued there should be an exemption for 'professional journalism'. The exemption should apply to journalistic material that is 'genuinely in the public interest'.
2.198The ABC reminded the committee that 'national broadcasters, as agencies, benefit from an exemption derived from the Freedom of Information Act 1982, which operates differently to the media organisation exemption'. The ABC strongly emphasised 'it is critical that the national broadcasters' exemptions continue to apply'.
2.199To ensure that the exemption for national broadcasters continues to apply, the ABC proposed 'that an additional exemption be created to align the privacy tort exemption regime' to the existing exemption in the Privacy Act.
2.200ARTK argued 'media organisations and those engaged in the activity of professional journalism should not be subject to the statutory tort either directly or indirectly'.
2.201The Free Speech Union of Australia (FSU) and Australian Christian Lobby suggested that those engaged in journalistic activities should include 'citizen journalists'. Electronic Frontiers Australia Inc (EFA) agreed it is necessary to include citizen journalists to 'hold them accountable for any privacy harms that their reporting might cause individuals'.
2.202Per Capita, Centre of the Public Square argued the journalism exemption 'should be removed and replaced with a blanket public interest journalism defence. This would require journalists to prove that an invasion of privacy was in the public interest'.
2.203ARTK expressed a preference for an exemption for journalistic activity, rather than a defence. The exemption as currently drafted, however, would 'not be fully effective' as it:
…is a narrow exemption which is not adapted to the circumstances of the proposed cause of action. Many persons involved in journalism are not covered. It does not cover:
publishers who are not the employer of the journalist—the drafting assumes the publisher is the journalist's employer, but in modern news organisations the publisher is often a different entity to the employer entity;
publishers who are licensees of the original publisher;
publishers whose journalists are engaged as contractors or contributors, rather than as employees;
others involved in journalism/publishing who are not journalists, including but not limited to printers, distributors, retailers and production personnel;
journalists' sources who are not acting in a 'professional capacity'.
2.204ARTK argued that these limitations to the proposed exemption would lead to 'a serious chilling of public accountability, as its operation will both indirectly and directly undermine journalism and curtail freedom of expression'. The exemption would hinder journalists from being able:
…to investigate, gather information on and report stories of public interest, particularly where the reporting relates to a public figure or wealthy individual with the means to launch legal action to prevent publication of the story via an indirect injunction and/or pre action discovery.
2.205ARTK argued the term 'journalistic material' is defined too narrowly to adequately protect all aspects of journalism from the tort.
2.206The bill would define 'journalistic material' as any material that:
(a)has the character of news, current affairs or a documentary; or
(b)consists of commentary or opinion on, or analysis of, news, current affairs or a documentary.
2.207In ARTK's view, the term 'journalistic material' as defined in the bill only refers to:
…one subset of journalism…Free speech should not be constrained by the 'character' or mode of expression by a journalist or the media as this imports an unnecessary and undesirable subjective element which inhibits that freedom. In contrast, it is all the activities of a free press which underpin the public's interest in the free flow of information. Those activities are matters of fundamental importance in a democratic society. Thus the proposed exemption should protect the activity of journalism.
2.208Ms Kiah Officer, Executive Counsel, Nine Entertainment Co, and Member, ARTK, maintained journalism plays an important role in Australian society and it should be properly protected:
Journalism is a very wide range of services that professional media provide to the community. All of those services have importance. There is a public interest in freedom of discourse, freedom of expression and transparency, and we would say that those principles should certainly receive protection.
2.209The proposed exemption would not extend to a journalist's sources, which would further constrain the work of the media as those sources may be less willing to participate in journalistic activity:
…the narrow scope of the exemption raises the prospect of sources being sued for the provision of the information to a journalist or media organisation. No doubt many people—sources—will be reluctant to assist in those circumstances and this also will have a considerable impact on the flow of information and reporting.
2.210Ms Officer, Nine and ARTK, opined that the lack of an exemption for sources would leave those sources 'potentially vulnerable to being sued under the tort'. Even sources who offer information confidentially might not be protected as they would not:
…have the benefit of the shield law protections and whistleblower protections which we currently have under various jurisdictions and which are not necessarily adequate to provide a level of protection to those sources. We have not examined, nor are we certain about, the interplay between whatever current protections there may be, but we have concerns that the bill as currently drafted appears to create some exposure for confidential sources.
2.211Ms Georgia-Kate Schubert, Head of Policy and Government Affairs, News Corp, and Member, ARTK, argued there are similar frameworks internationally that protect freedom of speech and the media:
…including article 10 of the European Convention on Human Rights, obviously, the First Amendment in the US and, of course, Australia's own Privacy Act, which provides an exemption for journalism, as it currently stands.
2.212Ms Clare O'Neil, Director, Corporate Affairs, SBS, and Member, ARTK, added '[t]here is already a wide range of regulations and laws that govern anything that might be considered malicious or nefarious activity—things like the Surveillance Devices Act and defamation'.
2.213Mr Hamish Thomson, Head of Legal, Guardian Australia, and Member, ARTK, explained journalists are bound by professional codes and complaints processes that deter them from acting outside of the public interest:
There is a very healthy claims lawyer industry…We are all organisations that are facing enormous financial pressures. The point of an exemption here is to avoid us getting into those preliminary, very costly and really inappropriate ways to deal with what is already something for which we have systems in place. Just to go to court, to have to prove a defence, to have to prove initial issues of serious harm, will have enough of a chilling effect on our journalists and on our editors to stop scrutinising the very people that are employing these kinds of lawyers—the wealthy, powerful people who are employing these lawyers.
2.214Ms Officer echoed that view:
…international experience suggests that these are not laws utilised by everyday citizens to protect individual breaches of privacy. They very much become tools of celebrities, politicians and wealthy public figures to essentially stifle the publication of information that might be at odds with whatever public persona they seek to portray…This tort is unlikely to provide a remedy for individual citizens seeking to protect their privacy.
2.215She further argued that the serious harm test might not limit the number of cases that go before the courts as that has not been the experience with Australian defamation law:
Protection of reputation is very well catered for in current Australian law. Even so, the legislature saw fit to introduce a serious harm test to defamation, to recognise that there is a threshold there, to try to minimise the number of claims before the courts and to limit those to only the most serious cases. Even so, we still have a booming defamation industry. We would say that is a comparable test to apply.
2.216Ms Bridget Fair, Chief Executive Officer, Free TV Australia, and Member, ARTK, indicated there has been 'a complete explosion in the use of defamation claims as a means of trying to shut stories down'. The increase in defamation cases against media organisations:
…in itself has a chilling effect on the ability of media organisations to report stories, both large and small. Boardrooms around the country, from small organisations to listed companies, are actively debating the amount of money they can put aside to fund legal actions that are brought speculatively because we have a defamation claims industry in this country. We do not want to replicate that in the privacy arena.
2.217Ms Sarah Kruger, Chief Legal and Government Affairs Officer, Commercial Radio and Audio, and Member, ARTK, also highlighted the pressures experienced by the commercial radio industry that deter it from acting outside of the public interest:
Any risk, whether that is in terms of additional administrative burden, financial burden, litigation or the risk of some kind of adverse consequence, will have a chilling effect because our stations simply cannot afford to go down a litigious route.
2.218Commercial Radio and Audio (CRA) posited that the exemption for journalistic activities would:
…be developed through jurisprudence. This means that Australian media organisations, and individual journalists, will be subject to costly and protracted litigation to determine that scope. That threat of litigation, in itself, will have a chilling impact on freedom of speech and journalism.
2.219CRA suggested Australia has the opportunity to avoid the weaponisation of its proposed statutory tort, by:
…ensur[ing] that journalism is entirely exempted from the tort. This will ensure the protection of journalism and free speech, as well as the public's right to be fully informed, which is essential in a democracy.
2.220That more comprehensive exemption could include:
a broader definition of the exempt journalist content, particularly to include other material of [public] interest…There is no need to link the definition to a particular class of persons such as journalists or their employers, as any defendant should be able to rely on this exemption in the event that proceedings are commenced against them in relation to exempt journalist content;
an additional statutory right for defendants to apply for a claim to be struck out at an early stage of proceedings where the matter complained about relates to the collection, preparation for dissemination or dissemination of exempt journalist content; and
if a defendant is successful in having the proceedings struck out then:
a Court may not order the defendant to pay the plaintiff's costs except where in the Court's view misconduct of the defendant in relation to the claim justifies such an order; and
the defendant will be entitled to indemnity costs relief, unless in the Court's view the defendant has engaged in misconduct.
2.221ARTK put forward two alternative options that would replace the journalism exemption in Item 15 of Part 3 in Schedule 2 of the bill. Its preferred option would broaden the exemption to include a wider definition of journalistic material:
This Schedule does not apply to an invasion of privacy by any person engaged in activities related or incidental to the provision of information for, collection, preparation for publication, or other activities related or incidental to reporting news, presenting current affairs, expressing editorial or other content in news media or documentary media.
2.222The other alternative proposed by ARTK would add further text (underlined in its submission) to proposed clause 15(1):
This Schedule does not apply to an invasion of privacy by any of the following, to the extent that the invasion of privacy involves the provision of information for, collection, preparation for publication, communication or other activities related and incidental to reporting news, presenting current affairs, expressing editorial or other content in news media or documentary media:
(a)a journalist;
(b)an employer of a journalist, or person or organisation engaging a journalist;
(c)a person assisting a journalist who is employed or engaged by the journalist's employer or person or organisation engaging a journalist;
(d)a person assisting a journalist in the collection, consideration and or preparation for publication, or who otherwise assists in the provision of information for, preparation of or in the course of reporting news, presenting current affairs or expressing editorial or other content in news media.
2.223In contrast, several inquiry participants raised concerns about the exemption for journalistic activities.
2.224Peter Clarke suggested that instead of a blanket exemption, the bill should be redrafted to include '[r]obust defences in support of journalists properly undertaking their profession'. In his view, the exemption 'goes beyond free expression and supporting the legitimate exercise of journalism but provides a blanket coverage to the excesses committed by journalists which should not be tolerated'.
2.225Emeritus Professor McDonald & Professor Rolph, University of Sydney, submitted:
The exemption for journalists in the bill is surprising, in that we know of no similar exemption anywhere else in the common law world, and particularly in its broadness. Once engaged, there is no limit on the exemption. Even if a journalist were to commit a crime in the serious invasion of the plaintiff's privacy the exemption would apply. We note that an exemption was neither proposed nor recommended by the ALRC in its 2014 Discussion Paper or Report, nor has it been the subject of widespread public consultation before this Bill.
2.226Emeritus Professor McDonald argued that if the exemption is retained:
…at the very least, the exemption should have limitations on it. At the moment it's absolute. The exemption applies even where the relevant parties committed a crime in invading someone's privacy.
…
But that's no help for the victim. The victim isn't able to take action. The victim relies on the law enforcement authorities being interested in pursuing that. There's no recompense for the victim. We could, for example, after every crime, every criminal provision, put in a provision that the court may, on application, provide compensation—for example, for breach of the Surveillance Devices Act or the Telecommunications Act and so on. We could give courts the power to grant compensation for that. That would fill some of the gaps that we have, rather than this wholesale new tort.
2.227The HTI raised the same point:
…the journalism exemption would exclude any acts ostensibly done while developing 'journalistic material', regardless of the content or purpose of the journalism, including its merits as public interest journalism. This would exclude from coverage blatantly unlawful, even criminal, and unjustifiable infringements of privacy by journalists that are not in the public interest.
2.228The HTI provided the following examples of journalistic material that would not be covered by the tort:
illegal phone hacking by a news corporation, as occurred during the News of the World phone hacking scandal in the United Kingdom; or 'upskirting' photos taken of celebrities by a tabloid.
a journalist maliciously publishing the home addresses and contact details of government officials. Doxxing would not fall within the scope of the tort if it occurred in the course of journalistic activities. However, doxxing would be covered by the criminal provisions for doxxing offences in Schedule 3 of the Bill, where it meets the relevant criteria.
2.229The HTI recognised the importance of public interest journalism and suggested that the bill be amended to make it clear that the defences and exemptions for journalistic activities 'only applies to journalism that is in the public interest'.
2.230Professor Santow proposed that instead of a broad exemption for journalistic activities:
…there should be a really strong focus on protecting public-interest journalism, and that would be by having a broad, robust defence for anyone who is accused of this statutory tort [inaudible] that they are engaging in public-interest journalism, not simply that they have the status of a journalist and so whatever they do cannot be impugned.
2.231DRW endorsed Professor Santow's view and suggested there are other avenues available to protect public interest journalism:
…there are also other court processes that exist, including the use of things like summary dismissal, striking out and the like that exist as part of normal court procedure and would assist media organisations where they've engaged in proper practices and engaged in public-interest journalism. I don't think there needs to be an exemption to still have the functionality of trying to deal with vexatious litigation. There are other processes that exist within court rules already that would provide for that. So I share HTI's concerns about the breadth of the exemption and its functionality. I think we'd better redraft it as a defence.
2.232The tort contains other protections for journalism:
Apart from other defences available to defendants, the cause of action imposes a 'seriousness threshold' that operates in addition to the public interest balancing test, a construction which the ALRC acknowledges was intended to 'further ensure the new tort does not unduly burden competing interests such as freedom of speech'. This was recognised in the Privacy Act Review Report, which noted that the '[ALRC] model slightly preferences other public interests over the public interest in privacy as the test requires that privacy outweigh other interests'.
2.233The journalism exemption was viewed as unnecessary and unjustified as:
Senior and lower courts across Australia, and indeed the common law world, have a demonstrated history of being able to balance the public interest in individual privacy protection against the broader public interest in journalistic activity. The proposed exemption is a clear outlier in that regard.
2.234The ABC indicated the bill may expose a third party that distributes or publishes national broadcaster content to the tort. It gave the example that 'an airline that shows news to passengers will be vulnerable to the tort, whereas the news provider will not be'. There is a similar risk that if a national broadcaster sourced licensed content from a provider that was not protected by the exemption, the national broadcaster could be exposed to the tort. The ABC requested further clarity on these issues.
2.235The Law Council suggested that organisations involved in the publication of journalistic material may not be the employer of the journalist and, therefore, may not be exempt from liability. It explained 'that publishers often source material from self-employed journalists or other content providers'.
2.236There is also a possibility that the tort could 'have a chilling effect on legitimate free speech'. The Law Council argued:
the exemption for journalists will not cover many journalists' sources, with the effect being that tortious action could be taken against a source instead of a journalist as a means of bypassing the exemption.
2.237On the related matter of journalistic freedom, the Law Council opined:
the definition of 'journalistic material' in proposed subclause 15(3) with reference to 'news, current affairs or a documentary' is unduly narrow in scope, despite the evolving nature of information transmission methods and delivery, and the broad range of topics they may touch upon—thereby risking journalistic freedom to pursue matters of genuine public interest.
2.238The proposed definition of 'journalistic material' would also fail to provide an exemption for several:
…other legitimate forms of free speech (for example, works including biographies or memoirs) or other media content that otherwise offers a valuable contribution to cultural and public life (for example, comedy, satire and other entertainment).
2.239The AGD explained the definition of journalistic material:
…is intended to be 'platform neutral'; it covers those materials considered relevant for the additional protection provided by a journalism exemption – i.e. material that has the character of news, current affairs or a documentary, or consists of commentary, opinion on, or analysis of news, current affairs or a documentary. Material, activity or expression that does not meet the requirements of the exemption could potentially still be subject to the tort where the elements were established – including that the privacy invasion was serious, the plaintiff had a reasonable expectation of privacy, the defendant's conduct was intentional or reckless, and if the public interest balancing test were met.
2.240The exemption is not intended to extend to journalists' sources. The AGD indicated:
…there are policy reasons for taking this approach. Whistleblower laws are intended to address concerns about liability in certain public interest contexts in a consistent manner that extends beyond specific causes of action. Every Australian jurisdiction also has 'journalist shield' laws that prevent journalists from being required to disclose the identity of sources.
2.241The exemption also 'extends to journalists' employers. It is not clear the extent to which extending its application more broadly would satisfy the policy intent of protecting the beneficial role of journalism'.
2.242The AGD advised the exemption for journalistic activities:
…recognises the important and beneficial role of journalism in a free and democratic society; it is intended to mitigate the risk that the mere prospect of litigation would have a chilling effect on reporting…Conduct that does not meet the requirements of the exemption could potentially still be subject to the tort where the elements were established—including that the privacy invasion was serious, the plaintiff had a reasonable expectation of privacy, and the defendant's conduct was intentional or reckless, as well as the public interest balancing element.
Exemptions for enforcement bodies and intelligence agencies
2.243The HTI submitted that enforcement bodies would be exempt from the tort. Those bodies would include:
…agencies such as police, and a range of other bodies at the state, territory and federal level with powers to issue civil penalties or sanctions, ranging from the Department of Home Affairs to Sports Integrity Australia.
2.244An enforcement body is exempt from the tort when it 'reasonably believes that the invasion of privacy is reasonably necessary for one or more enforcement related activities'. In the HTI's view, ''[e]nforcement related activity' is another broad term, which includes pursuing minor civil fines, and the prevention of minor crimes, such as speeding'.
2.245The HTI suggested:
The exemption for enforcement bodies could hypothetically cover unlawful actions by enforcement bodies, as long as the body 'reasonably believes the invasion of privacy is reasonably necessary'. This very low expectation of what the Bill deems to be proper behaviour by enforcement bodies, coupled with the extraordinary breadth of activities in respect of which enforcement bodies may claim this exemption, means that this exemption almost offers carte blanche to those bodies to flout the privacy protection in this new statutory tort.
2.246It illustrated its point by explaining that the tort would not apply to an 'unlawful search and seizure by the Australian Tax Office, where it was 'reasonably believed it was reasonably necessary' to prevent potential tax avoidance'.
2.247The HTI indicated that 'any activity by intelligence agencies would be fully exempt from the scope of the tort, including activities conducted for malign or illegal purposes'.
2.248It suggested that the tort would not apply to 'illegal surveillance of Australian citizens by national security organisations due to their race, religion, sexuality or political beliefs'.
2.249Peter Clarke asserted that the exemption for 'enforcement bodies is too wide'. He indicated that enforcement bodies, including police services, 'have had a chronic problem of misusing information or abusing their positions. The exemption should not exist. There should be a defence available to enforcement bodies'.
2.250The HTI argued that the exemption provisions for enforcement bodies and intelligence agencies are unnecessary as those bodies would be 'covered by the 'authorised by law' defence' elsewhere in the bill. It suggested the proposed defence should be amended to make it clear that it would apply as follows:
Enforcement bodies are covered by this defence if they are conducting lawful investigations in respect of serious crime, with independent authorisation. This wording would encompass authorisation through judicial warrants, and warrants issued by a judge or magistrate acting persona designata.
Intelligence agencies are covered by this defence if they are conducting a lawful national security operation, with independent authorisation.
2.251The defence provision should be further amended to make it clear that it would:
…be restricted to invasions of privacy required or expressly authorised by law. This would prevent arguments claiming that privacy interferences were merely impliedly authorised by law – for example, the use of spyware in devices that is not explicitly included in a warrant authorising some limited search activities.
2.252Emeritus Professor McDonald also saw the exemption for enforcement bodies and intelligence agencies as unnecessary:
We didn't feel that was needed because we've got the lawful authority defence and also there can be no reasonable expectation of privacy for someone where the law enforcement bodies are lawfully pursuing their investigations. Obviously, we want our law enforcement bodies to investigate the commission of crimes and so on. That's what they are there for. It may be that, firstly, there's no reasonable expectation of privacy and, secondly, the lawful authority defence would apply.
2.253Emeritus Professor McDonald did not understand why the exemption had been included in the bill:
…other than perhaps they feel there is some ancillary conduct. Take a case like Smethurst v Commissioner of Police, which went to the High Court a couple of years ago. There, a journalist's home was raided under an invalid warrant by the Australian Federal Police, who seized data from her phone, put it onto their own USB stick and then refused to give it back. In the end she couldn't get it back. She couldn't get a mandatory injunction for it to come back. The High Court stated many times in their judgements…
2.254The AGD explained the exemption for law enforcement bodies is intended to ensure:
…these entities are not unduly restricted in carrying out their functions which may need to be privacy invasive. An invasion of privacy by an enforcement body is exempt only to the extent that an enforcement body reasonably believes it is reasonably necessary for one or more of the enforcement-related activities it is undertaking.
2.255The Australian Federal Police advised that the proposed exemption for law enforcement agencies 'is consistent with existing provisions in the Privacy Act'.
Exemptions for businesses
2.256The Council of Small Business Organisations Australia (COSBOA) and the Ai Group suggested there should be an exemption for small businesses.
2.257COSBOA argued that without an exemption for small businesses:
The introduction of the statutory tort will create a new risk of vicarious liability, potentially exposing small businesses of all sizes to legal responsibility for the actions of their employees or agents where the invasion of privacy occurred within the course of employment.
2.258Professor Santow was not aware of a similar small business exemption in any other comparable jurisdiction and considered the position of COSBOA and the Ai Group 'a very difficult position to maintain'. He explained why the position is difficult to maintain:
The right to privacy is a human right. As just an ordinary member of the public, it is no less harmful to you if your right to privacy has been seriously invaded by a large company or a small company. It is just harmful if it's a small company. So I don't think it's helpful to focus on this idea of offering certain companies some sort of blanket exemption. Rather, it would be more helpful to focus on what the small business ombudsman has been calling for, which is really clear practical guidance for small- and medium-sized enterprises so that they really understand how they can stay on the right side of the law and some of the easy things that they can do to make sure that they don't inadvertently fall foul of this new law.
2.259The Business Council of Australia recommended there be an exemption in relation to employee records.
2.260Ms Louise McGrath, Head of Industry Development and Policy, Ai Group, reminded the committee:
…we have consistently pressed for the employee records and small business exemptions to be retained and for the government to instead pursue best practice outcomes for members' dealing with personal information. We maintain this position.
2.261She outlined that employers deal with the personal information of their employees:
…for legitimate reasons and in the public interest. We ask the committee to consider that. This includes where employers manage their workers' conduct and performance to ensure compliance with their obligations under workplace laws and to keep people and property safe and free from injury or damage.
2.262Ms Yoness Blackmore, Principal Advisor, Workplace Relations Policy, Ai Group, elaborated on the reasons that businesses might collect and store the personal information of their employees:
…privacy can't be viewed in isolation, and there is a complex web of workplace relations laws. Employers must, and are entitled to, manage their workplace and run an efficient business. Part of doing so requires that they deal with data of their workers, too—for example, ensure that those workers work in a safe way, ensure that sexual harassment isn't happening in the workplace, and so forth.
2.263In her view, there should be definitional consistency between the Privacy Act and the proposed tort:
…both the small business exemption and the employee records exemption be applied to the statutory tort, because otherwise you would have the Privacy Act going along and then you would have the case law happening on the side. We would also propose that information in terms of the statutory tort should have the same definition as that applied under the Privacy Act. At the moment the Privacy Act has a definition of 'personal information' and 'health and sensitive information', but that is not the same as information under the statutory tort. Even on the grounds of consistency, we think that there needs to be alignment of those things.
2.264EFA argued employee records are treated no differently to 'the information organisations or government agencies collect about their customers or clients'. For that reason:
There's no reason why the employee record exemption cannot be repealed and work equally well, amongst the new privacy principles, by way of any necessary minor exemptions that might need to be made to protect employers' positions in relation to legal claims.
Public interest test
2.265The bill would allow a defendant to adduce 'evidence that there was a public interest in the invasion of privacy'. To counter that evidence, a 'plaintiff must satisfy the court that that public interest was outweighed by the public interest in protecting the plaintiff's privacy'.
2.266Emeritus Professor McDonald and Professor Rolph, University of Sydney, indicated this provision of the bill differed from the recommendation made by the ALRC, as outlined in Table 2.1.
Table 2.1Difference between ALRC recommendation and clause 7(3) of the bill
| ALRC recommendation | Clause 7(3) of the bill |
| 9.1 The Act should provide that, for the plaintiff to have a cause of action, the court must be satisfied that the public interest in privacy outweighs any countervailing public interest. | If the defendant adduces evidence that there was a public interest in the invasion of privacy, the plaintiff must satisfy the court that that public interest was outweighed by the public interest in protecting the plaintiff's privacy. |
| 9.3 The Act should provide that the defendant has the burden of adducing evidence that suggests there is a countervailing public interest for the court to consider. The Act should also provide that the plaintiff has the legal onus to satisfy the court that the public interest in privacy outweighs any countervailing public interest that is raised in the proceedings. | |
| Suggested clause 7(3) | |
| The court may only determine that the plaintiff has a cause of action if the court is satisfied that the public interest in the plaintiff's privacy outweighs any countervailing public interest. A defendant may adduce evidence that there is a countervailing public interest that outweighs the plaintiff's interest in privacy. (These will probably need to be two separate sub-clauses). | |
Source: Emeritus Professor Barbara McDonald and Professor David Rolph, University of Sydney, Submission 27, pp. 3-5.
2.267In their view, the bill should require a court to 'consider the matters of public interest that justify the invasion of the plaintiff's privacy'. It should also clarify 'the onus/burden, if any, on a defendant to adduce evidence'.
2.268Emeritus Professor McDonald and Professor Rolph suggested there are cases where the public interest is self-evident without the adducing of evidence. For that reason:
We do not think that defendants should be required to adduce evidence of public interest in every case before they might raise a countervailing public interest for the court to consider. So to require would unnecessarily increase the cost and complexity of litigation without any demonstrable public benefit. However, defendants should be permitted to adduce evidence relevant to public interest considerations if they think it is appropriate or necessary. The statutory cause of action should be redrafted to make it clear that the defendant may adduce evidence to establish public interest grounds to be balanced with the public interest in the plaintiff's privacy, but are not required to do so.
2.269Their suggested clause 7(3) would overcome this issue while:
…also mak[ing] it abundantly clear that liability depends upon balancing competing interests, including any public interest ground identified or argued by the defendant, and that the public interest is not a defence, properly so called.
2.270The HTI endorsed that proposal 'recognising that it will not always be necessary or practical for the defendant to adduce evidence as to the public interest limb in every case'.
2.271Emeritus Professor McDonald argued against the inclusion of a public interest defence:
…it is a matter of judgement as to whether something is in the public interest. How can you prove that it is in the public interest to reveal somebody's residential address, or something like that? We couldn't see how it could be a matter of proof on the balance of probabilities that more probably than not it was in the public interest. There are other contexts in which public interest is seen as a matter for judgement. For example, in sub judice contempt, where the court is considering whether the media have published information which would have a tendency or a likelihood of interfering with a fair trial, one of the issues will be whether or not there is sufficient public interest in the revelation of the material to outweigh any risk. So again, it is a judgement which the court brings.
2.272Emeritus Professor McDonald argued that, instead of a public interest defence, the onus of proof should include a public interest element:
There was great concern, particularly amongst privacy advocates, that too much of a burden was being put on claimants or plaintiffs, and that the defendants could just sit back and leave it to the plaintiffs to show everything about their case—remembering that this hurdle of public interest is instead of a defence of public interest. It is something which we felt needed to be brought right up front. It would be quite appropriate that, if there are facts which ought to be brought to the attention of the court, and if the plaintiff hasn't done so, then the defendant may bring those to the court. Perhaps that is obvious. It is just recognising that the defendant may bring those to the court; the defendant doesn't have to, or may not need to. It may be blindingly obvious that this is a matter of legitimate public interest.
2.273Emeritus Professor McDonald and Professor Rolph put forward a 'suggested compromise, which goes back to the ALRC recommendation'. That compromise proposed:
The court may only determine that the plaintiff has a cause of action if the court is satisfied that the public interest in the plaintiff's privacy outweighs any countervailing public interest. A defendant may adduce evidence that there is a countervailing public interest that outweighs the plaintiff's interest in privacy.
2.274Emeritus Professor McDonald contended that the tort should not:
…be like a defamation case. In a defamation case, the plaintiff proves that they have been defamed; that's it. They are entitled to a remedy, although now there is a 'serious harm' requirement, which has modified that, importantly. They can say, 'I am entitled to success unless the defendant on the balance of probabilities can prove a defence.' In our design, the court must consider any countervailing public interest up front before the plaintiff can go ahead, or can succeed.
2.275She clarified:
We don't say that the plaintiff must persuade the court. We say that the plaintiff only has a cause of action if the court is satisfied. So both parties will have to use persuasion to the court as to whether or not the public interest outweighs the privacy interest.
2.276In Emeritus Professor McDonald's view, the court should make this decision 'partly because the whole issue of onus is murky—whether you can have an onus about something which is a judgement'.
2.277Emeritus Professor McDonald also suggested that, if clause 7(3) is redrafted, clause 7(4) should be redrafted to state that it is a non-exhaustive list of matters the court could take into account when applying the public interest test. Amending the bill in that way would bring the statutory tort closer to the model proposed by the ALRC:
…recommendation 9(3) of the Law Reform Commission report did say that the act should also provide that the plaintiff has the legal onus to satisfy the court that the public interest outweighs any countervailing public interest that is raised in the proceedings. I am departing from that in saying that I think a better way is to say that the court should be satisfied.
2.278Emeritus Professor McDonald indicated that artistic expression should be among the kinds of public interest included in the bill. She saw 'no explanation' for not including it in the non-exhaustive list of public interests. Artistic expression is:
…an important aspect of freedom of speech and freedom of expression in many different contexts: plays, films, novels, books and so on. There may be cases where there is sort of intrusion but it's something that has to be balanced.
2.279The bill would allow a court to grant an injunction that restrains the defendant from invading the plaintiff's privacy. Emeritus Professor McDonald brought to the committee's attention that the section heading of clause 9 of Schedule 2 'is problematic and needs to be changed'. The provision would not limit the court's power 'to an interim injunction'. For that reason, '[t]he word “Interim” should be omitted' from the bill.
2.280The AGD contended '[t]he drafting of clause 7(3) was intended to give effect to the approach set out in recommendation 9.3 of the ALRC Report 123'. The public interest test:
…is intended to allow judicial consideration of relevant countervailing public interests. It requires a plaintiff to satisfy the court that the public [interest] in their privacy outweighs any countervailing public interests in the invasion of privacy raised by the defendant. It provides that the defendant has the burden of adducing evidence that there is a countervailing public interest for the court to consider. It recognises that the public interest balancing element of the tort will not need to be satisfied in all cases; there may be matters in which competing public interests do not exist and the plaintiff should not need to prove the non-existence of public interests that have not been raised.
2.281The AGD advised the redrafted version of the clause proposed by Emeritus Professor McDonald and Professor Rolph:
…removes the evidential burden from the defendant and requires the court to consider any/all countervailing public interests in determining whether the public interest balancing element of the cause of action is made out. The department is now considering how the proposed formulation in the suggested redraft would operate procedurally to ensure the plaintiff is aware of the case it must meet and the court is properly informed of the matters it must consider.
2.282In relation to the proposal to expressly include 'artistic expression' as a public interest, the AGD advised:
These amendments could be made but arguably are not technically necessary. The Explanatory Memorandum to the Bill clarifies that freedom of expression includes (among other things) artistic expression and that the public interest in the freedom of the media pertains to the responsible investigation and reporting of matter of public concern and importance.
Doxxing offences
2.283Participants in the inquiry broadly supported the doxxing offences contained in the bill.
2.284The Australian Information Industry Association (AIIA), for example, submitted:
The AIIA commends the government's efforts to address the escalating dangers associated with doxxing. However, while punitive measures such as the proposed statutory tort for serious invasions of privacy and amendments to the Criminal Code Act are a positive development, the focus must also include measures to mitigate harm to victims swiftly and effectively.
2.285Dr Zirnsak, Uniting Church, observed that the proposed doxxing offences fill a gap in the law and, for that reason, they should be passed:
The risk always is the danger that somebody is being targeted for harmful behaviour and the legal team who are defending the perpetrator will try to argue the existing offences don't cover it. The more that we can make sure there aren't gaps in the law, the better it is. Our view is that this was filling gaps to make sure that, even if others want to argue you might possibly be able to use another law on some of this, this just makes it really crystal clear and ensures that on these matters of doxxing there will be a very clear and specific piece of law that will allow that behaviour to be addressed.
2.286The AFP contended that the criminalisation of doxxing 'may deter offenders and will provide a clear message to the community that this conduct is not tolerated'. In its view, '[t]he new offences complement existing offences in the Criminal Code Act 1995 for using a carriage service to menace, harass or cause offence'.
2.287While ACCI was not opposed to the introduction of the doxxing offences, it was concerned they could have 'unintended consequences':
…there may be good reasons to identify someone who is acting anonymously online. Anonymity facilitates a range of abuse and trolling on social media and the internet more broadly. People acting anonymously can bully and harass, issue death threats and threats of sexual abuse all without consequence. It may be that the Bill provides those persons with statutory protection to stay anonymous and continue their abuse of others. Additionally, “doxxing” may be in the public interest – many journalistic investigations have exposed political scandals and corruption by making public private text messages and other modes of communication.
2.288The Queensland Council for Civil Liberties argued that the doxxing offences should not be passed and should be subjected to 'substantial further consultation'. If the offences are introduced, they should only apply to cases of doxxing where the behaviour 'can be equated to harassment or stalking'.
2.289The FSU agreed that doxxing should be addressed as it 'can have a chilling effect on public discourse'. In its view, the doxxing provisions:
…are remarkably expansive in their breadth. They would seemingly cover misgendering, dead-naming, and even pointing out which societies or organisations Politicians frequent…Such criminalisation is inappropriate in a democratic society. The Criminal law should be slow to regulate the dissemination of personal information, given the Freedom of Expression issues that arise.
2.290The Law Council took a similar stance in relation to the broadness of the term 'doxxing'. It acknowledged that while doxxing can be harmful and deleterious to the wellbeing of individuals:
…there are instances in which doxxing behaviour is legitimate and should not be circumscribed. For example, doxxing can be part of public interest journalism where it involves the unveiling of private information that exposes contradictory, unethical, or illegal behaviour by public officials or business people.
2.291To better ensure that the proposed offences are not misused, the Law Council suggested amending the bill to 'provide further guidance on what constitutes 'menacing' or 'harassing' behaviour'.
2.292The AGD indicated that the terms 'menacing' and 'harassing' are well defined in legislation and are understood by the courts:
…the concept of menacing and harassing is well established in the Criminal Code, in existing section 474.17, which is 'use of a carriage service to menace or harass'. Those provisions have been in place since 2004. They are routinely prosecuted through the courts.
2.293Mr Parker Reeve, Assistant Secretary, High Tech Crime Branch, AGD, explained:
To menace or harass involves conduct that is threatening or repeatedly vexatious to such an extent as to cause—and I think we've set this out in the explanatory memorandum—would cause a person to fear for their safety or limit their ability to go about their daily life. It is a serious threshold, as you would expect for a criminal offence attracting a high maximum penalty, and appropriately so.
That goes to why we have not included a defence of the kind suggested in the offence. I think as we've set out in the explanatory memorandum, the objective requirement that a reasonable person would consider is the conduct to be menacing or harassing—that is, involving threats or vexatious behaviour et cetera—is such that legitimate conduct would not be captured by the offence, in the first instance, and the inclusion of a defence then would provide a permission, an authority, a licence, for people to engage in what is quite harmful behaviour.
Personal data
2.294Some inquiry participants commented on the definition of 'personal information' used in the Privacy Act and the term 'personal data' that would be introduced into the Criminal Code.
2.295Privacy 108 indicated that the bill would introduce the term 'personal data' into the Criminal Code. That term would be distinct from the term 'personal information' used in the Privacy Act. It argued in favour of consistency between the Criminal Code and the Privacy Act. The term 'personal information' should be consistently used and 'updated to reflect modern digital realities'. The updated definition should 'include metadata, geolocation data, and digital identifiers such as IP addresses, usernames, and other indirect identifiers, which are currently not comprehensively covered under the Privacy Act'.
2.296Those reforms 'would ensure clarity and consistency across Australian law'. They would also improve 'both privacy protection and the criminalisation of malicious online behaviours, including doxxing'.
2.297The FSU argued that the term 'private information' is 'unduly broad'. It argued the 'most appropriate approach' to the criminalisation of doxxing:
…would be to have a small and complete list of categories. We would expect that residential addresses would cover most doxxing, and possibly home phone numbers. Indecent images are already covered by existing law, as is harassment using a carriage service.
Penalties
2.298The FSU considered the maximum penalty of seven years' imprisonment to be 'disproportionate'. In its opinion, a maximum penalty of one year's imprisonment would be more appropriate. It argued that the distribution of 'information that potentially enables an offence should not have a far higher penalty than the offence itself'.
2.299The application of the penalty should also not be predicated on 'harassment' which, in the FSU's view, 'is too broad for the offence'. Instead, the penalty should only be applied where there is 'a substantial risk of physical harm to an individual, which is the main legitimate concern with doxxing'.
2.300The AGD explained that the maximum penalty for using a carriage service to make personal data available would be six years' imprisonment. The penalty would be higher than that associated with using a carriage service to menace, harass or cause offence to reflect 'the serious harms caused by doxxing, the potentially enduring nature of these harms, and the significant steps that a victim may need to take to mitigate these harms'.
2.301Similarly, the proposed penalty for using a carriage service to make personal data available about members of certain groups would be seven years' imprisonment. The penalty would be higher to reflect:
…the seriousness of such conduct. Doxxing persons because of a belief that they are part of a group that shares one or more protected attributes is particularly serious in nature, as it is likely to instil fear or anxiety in victims where there is a history of, or ongoing, persecution or prejudice and can encourage or incite other persons who share discriminatory views in relation to the protected group to engage in similar menacing or harassing conduct towards the victims.
Future privacy reforms
2.302Several inquiry participants highlighted the government's intention to introduce a second tranche of privacy reforms.
2.303Those participants mainly focused on the:
timeline for future consultation and reform;
small business exemption;
fair and reasonable test;
controller-processor distinction; and
definition of personal information.
Timeline for future consultation and reform
2.304Some participants in the inquiry suggested the government introduce the second tranche of privacy reforms as soon as possible or commit to a timeline for their introduction.
2.305For example, the AIIA highlighted considerable concern within the digital technology industry about the time it has taken to introduce reforms to the Privacy Act:
The government's apparent proposal to introduce two tranches of reforms, with the present Bill representing the first tranche, is a cautious step forward. However, we remain concerned that deferring critical reforms to a potential second tranche has fostered cynicism within the industry…[T]his approach has been viewed as timid and insufficient in addressing the pressing need for comprehensive privacy reforms. The delay in updating the Privacy Regime risks leaving Australian citizens vulnerable to privacy breaches, while businesses are left without clear guidance on their obligations.
2.306Reset.Tech Australia argued further reforms:
…are vital to ensuring Australia's privacy framework is fit for the digital era, especially including the updates to the definition of personal data and the inclusion of the fair and reasonable test for processing data.
2.307Some of those reforms will also strengthen privacy protections for children and have 'the capacity to be far more protective for children than a children's code'.
2.308Reset.Tech Australia recognised that the first tranche of reforms would take 'some vital first steps that are necessary to updating Australia's privacy framework'. It did not see any 'reason to delay or impede the progress of these proposed reforms'.
2.309The HTI similarly viewed the bill 'as an important first step in reforming the Privacy Act. This reform is long overdue, and much more is needed to modernise Australia's privacy law for the 21st century'.
2.310The HTI encouraged the government:
…to commit publicly to a clear and specific timeline for introducing the next bill. This timeline should be as expeditious as possible, noting that many of the reforms have been the subject of extensive public and broader stakeholder consultation – some going back almost two decades. Hence, this Committee should recommend that the Government commit to introducing the second reform bill to the Australian Parliament no later than six months after the forthcoming federal election.
2.311The next tranche of privacy reforms 'should incorporate the remaining recommendations that were agreed to, or agreed in principle, by the Government in its Response to the Privacy Act Review Report'.
2.312BSA recommended the publication of exposure drafts of future privacy bills to allow for public consultation ahead of their introduction to Parliament.
2.313Similarly, auDA - .au Domain Administration Ltd 'encourage[d] the Government to consider the need for certainty for Australian businesses on what future regulatory or legislative steps are planned'. Future consultation should include multiple stakeholders and provide 'an adequate timeframe for consideration and response'.
2.314Professor Handsley, CMA, expressed concern about amending the bill to include tranche two reforms at this stage:
I would have some slight concerns about the process of adding things in that would otherwise be tranche 2 that haven't been subject to the same consultation process these particular reforms have…[W]e wouldn't have had the opportunity to comment on them and we might not agree with precisely what the parliament was proposing to do.
2.315While it appreciated that incorporating additional reforms into the bill 'is likely to be practically difficult', the HTI proposed that the definition of personal information be expanded. The Privacy Act Review Report recommended that the term be expanded to make clear:
…personal information is an expansive concept
2.316DRW stated that the second tranche of privacy reform should not impede passage of the bill:
…there is a real urgency around privacy reform. In part, that's because the current act is so significantly out of date. We are out of line with comparable jurisdictions from a basic privacy rights perspective. Of course, the public largely support these measures to the extent they've been asked.
2.317Professor Santow, HTI, shared Ms O'Shea's view about passing the first tranche of privacy reforms as soon as possible:
In a perfect world, of course, all 100-plus recommendations that the federal government has committed to should be in a single bill, but we don't live in a perfect world…The sad history of privacy reform in Australia, particularly over the last 19 years, has been that government has found it very, very difficult to progress to a bill, let alone to actually enact the bill, so I would treat stakeholders who are saying, 'We haven't been properly consulted. This has all been foisted upon us. It's all new information to us,' with some real scepticism.
2.318EFA opined that splitting the privacy reforms into tranches has potential benefits for the businesses that will need to implement compliance changes:
…it's essential this bill pass. Understandably, from a business perspective, it's been put into two tranches, because the other approach would have been a big bang change—very expensive and very difficult for business to pass. So this is actually quite intelligent, from a business perspective, even wearing my civil society hat. But it's critical that we get some promise from the current government to enact a bill within six months of the election or get some sort of promise around tranche 2 not going off into 2027 or something like that.
2.319While the Law Council was pleased there will be a second tranche of privacy reforms, it:
…call[ed] for a roadmap, or strategy, to publicly detail how these reforms will be progressed—similar to the materials that the Government issued in 2023 for the Security of Critical Infrastructure Act 2018.
2.320That information would 'promote much-needed certainty for the multitude of sectors that expect to be impacted by these significant changes'.
2.321The AHRC welcomed the bill but indicated it failed to include 'many of the needed reforms'. It was particularly disappointed that the bill did not include the 'fair and reasonable' test, which would operate as a 'shield' against excessive data collection.
2.322The AHRC recommended the government provide 'a clear timeline for when each 'agreed' and 'agreed in principle' amendment will be introduced in future tranches'.
2.323The OAIC agreed a second tranche of reforms is required and saw the bill as 'an important first step in strengthening Australia's privacy framework'.
2.324Ms Carly Kind, Privacy Commissioner, elaborated by explaining the bill would 'fundamentally contribute to the range of regulatory levers we have as an effective regulator and enable us to respond effectively to privacy harms and emerging threats in a strategic and proportionate way'.
2.325While Ms Kind argued that the bill should be passed, she suggested the next tranche of privacy reform should be developed urgently. Further reform is required quickly 'to keep ahead of the privacy threats and advancing skills of malicious cyberactors'.
2.326Ms Celeste Moran, First Assistant Secretary, Integrity Frameworks Division, AGD, stated:
The government is implementing its privacy reform agenda in a measured and considered way.
…
The bill implements a first tranche of proposals from the government's response to the Privacy Act review ahead of further work towards the second package of reforms. The Attorney-General has stated publicly that the department will develop the next tranche of privacy reform for targeted consultation, including draft provisions in the coming months. We have commenced work towards this.
2.327The AGD similarly explained the importance of passing the bill prior to the introduction of further privacy reforms:
The Bill provides individuals with greater protection, transparency and control over their privacy. These protections are vital to the Government's broader agenda to ensure Australians' personal information is safe and secure and is used responsibly and in the interests of the Australian community. The Government will continue advancing a further package of privacy reforms based on proposals from the Review that were agreed in-principle, through further targeted consultation with stakeholders on draft provisions.
2.328The AGD stated it:
…will undertake targeted consultation on draft provisions to progress the outstanding legislative proposals from the Government response to the Review, which will ensure the details of an updated privacy framework for the digital age are appropriate and workable in a diverse range of contexts.
2.329The consultation would add to:
…the extensive consultation to date with a broad range of stakeholders interested in protecting Australians' privacy and will ensure that the right balance is struck between protecting people's personal information, and allowing it to be used and shared in ways that benefit individuals, society and the economy. The targeted consultation will inform the Government's decision making on next steps in relation to Privacy reforms.
2.330Ms Moran indicated that consultation would be undertaken:
…with entities including industry, tech organisations, small businesses, the media and civil society as well as experts in academia and peak bodies. This is intended to avoid unintended consequences and ensure the right balance is struck between protecting people's personal information and allowing it to be used and shared in ways that benefit individuals, society and the economy.
2.331The AGD anticipated beginning the consultation process on draft provisions before the end of 2024. However, '[t]he final make up and timing for any second package of reforms will be informed by this consultation and is ultimately a matter for the government'.
Small business exemption
2.332Some inquiry participants called for the removal of the small business exemption.
2.333Under the small business exemption, around 95 per cent of Australian businesses are not required to comply with the Privacy Act. CHOICE indicated that this includes 'high risk organisations like real estate firms'.
2.334Mr Rafi Alam, Senior Campaigns and Policy Advisor, CHOICE, shared the results of research conducted in December 2023:
…our polling found that 65 per cent of consumers did not trust businesses to use their data responsibly and in their best interest; 78 per cent were concerned about businesses selling their personal information to data brokers; and 70 per cent were concerned about their data being used in automated decision-making. Sadly, this widespread lack of trust in businesses is warranted. Across many sectors, businesses have failed to demonstrate their ability to self-regulate and protect personal information, and consumers have paid the price.
2.335Mr Alam highlighted that there is broad consumer support for the repeal of the small business exemption. According to polling conducted by CHOICE, '83 per cent of people agreed that small businesses should be required to follow privacy laws'.
2.336The Consumer Policy Research Centre (CPRC) argued the removal of the exemption 'would ensure all businesses who collect, share and use consumer data are held accountable to treat people's data with care and respect'.
2.337Ms Rosie Thomas, Director of Campaigns, CHOICE, referred to research that examines the privacy practices of 'the 'rent tech' sector; that is, third-party rental applications'. The research conducted by CHOICE found:
Broadly, a large number of businesses there collect large amounts of data. We cannot definitively say whether they are small businesses; that is part of the challenge. From a consumer's perspective, they have no way to know whether the business that is collecting large amounts of their personal data for rental applications or for other reasons is subject to the Privacy Act. It is an example of a sector where it is likely that the Privacy Act is not always applying, yet they are dealing in large amounts of personal information.
2.338Ms Thomas reported that CHOICE 'supported a proportionate approach to regulation'. It:
…would be open to calibrating the obligations in a way that is proportionate to the risk, while recognising that businesses might need time to transition, an appropriate transition period, and appropriate resourcing for the OAIC to be able to help and educate business on this journey. Fundamentally, it isn't acceptable that well over 90 per cent of the businesses in the country are not captured by the Privacy Act at the moment.
2.339Mr Jordan Carter, Internet Governance and Policy Director, auDA, indicated that the vast majority of small businesses surveyed by his organisation support stronger privacy regulations.
2.340Ms Elizabeth O'Shea, Chair, DRW, argued that the exemption can be damaging to small businesses. She explained that while there are some small businesses that do not treat their customers' privacy carefully:
…there's a lot of small businesses who I think want to do the right thing by their customers and actually end up being sold substandard products from a cybersecurity perspective because they don't have that requirement to comply with the Privacy Act, and I would say there is significant benefit that would come to small business from an elevation in the standards of handling personal information that would improve the experience. It would mean off-the-shelf products sold to small business and purchased by them which they use on [a] daily basis would adhere to a higher standard of privacy protection, which is a good thing. We shouldn't be creating a situation where small business has to contend with substandard products because the people selling them know the standards are low, so I think there's a lot of benefit that would come from the removal of that exemption. It's not all downsides.
2.341The AIIA acknowledged 'that all businesses, regardless of size, should be held accountable for protecting the privacy of Australians and their personal and sensitive information'. The small business exemption:
…not only heightens the risk to individuals but also overlooks the significant role these businesses play in the broader digital economy, forming 97% of businesses in Australia and contributing approximately one-third of Australia's [gross domestic product].
2.342Mr John Pane, Chair, EFA, supported the removal of the small business exemption arguing it is an anachronistic measure that does not align with privacy legislation overseas. In all privacy legislation that has developed since the 1990s:
…there is no similar or equivalent exception for small businesses. In fact GDPR also applies to small businesses. Small businesses have adapted, and with regard to economic and other impacts there has not been any significant negative reporting. Times have changed, particularly since the year 2000, when the national privacy principles were first drafted. We now have digitally enabled businesses that don't have a shopfront. They're all online, and so is our data. That's why it's important to ensure that the data is protected, that Australians have rights in relation to their data and that small businesses who process and collect that data have obligations to manage it and protect it.
2.343COSBOA assured the committee:
Small businesses of all sizes already actively process data with appropriate care and concern, many of which already have a turnover of over $3 million and are therefore already subject to the Act.
2.344If small businesses with an annual turnover of less than $3 million were required to comply with all APPs, it would undermine 'the viability of those businesses already facing a laundry list of increased red tape and regulation'.
2.345Mrs Adele Sutton, Head of Policy and Advocacy, COSBOA, explained the small business 'exemption was introduced as recognition of the significant additional compliance burden facing small businesses without the extensive resources available to large companies'. She observed that 'small businesses with turnovers between three and $10 million are required to comply with the Privacy Act'. If the proposed amendments are passed, those small businesses will be required to comply with them.
2.346Exemptions for small businesses exist in similar legislation in foreign jurisdictions. For example:
…under the GDPR, although it applies to small business, there are specific exemptions within that European legislation from certain types of principles. For example, I don't believe that businesses with less than 250 employees necessarily need a data protection officer.
2.347Mrs Sutton clarified the GDPR does not contain an exemption for small businesses, but there are exemptions for small businesses from compliance with some of its provisions. In the Australian context:
For a small business, particularly a sole trader, without guidance as to what they need and what they can practically do to comply—and we haven't got the figures as to how much compliance with all 13 [APPs] would cost—what does this look like? Our understanding is that it is cost prohibitive to comply with such a broad brush of [APPs] without assistance and guidance.
2.348Mrs Sutton stated:
…COSBOA is relieved that small businesses with turnovers under $3 million, including 1.1 million sole traders, appear to have been spared further complexity due to the government's decision not to proceed with the removal of the small business exemption from the Privacy Act.
2.349The Financial Advice Association of Australia urged caution on proposals to increase the regulation of businesses, particularly those engaged in the provision of financial advice.
2.350The AIIA proposed a middle course between rescinding the small business exemption and maintaining the status quo. It argued small and medium enterprises (SMEs) do not necessarily have to comply with the same legislative requirements as large businesses. The AIIA proposed 'penalties could be tailored for SMEs, providing them with a longer implementation timeframe to ensure compliance'.
2.351Small businesses with an annual turnover of $3 million or less are currently exempt from the Privacy Act. The exemption reflects a view that extending the Act to those businesses would place an unreasonable burden on them that is not proportionate to the privacy risk they pose. The government has agreed in-principle that the small business exemption should be rescinded; however, further consultation with affected stakeholders is required before that occurs.
Controller-processor distinction
2.352Some inquiry participants expressed disappointment that the bill did not include provisions that address the controller-processor distinction.
2.353BSA suggested the bill be amended to introduce the controller-processor distinction. If it is not possible to make that amendment, it 'strongly urge[d] any subsequent tranches of reform to prioritise introducing this distinction due to how fundamental it is to the function and structure of a privacy law'.
2.354The Tech Council stated the introduction of the distinction should be prioritised. It explained the difference between controllers and processors:
Controllers own the end-user relationships and are generally the first point of contact for individuals who wish to exercise their rights. They have the primary responsibility for ensuring compliance with privacy laws. This includes obtaining valid consent and responding to individual requests…[P]rocessors have responsibilities to comply with controller instructions, implement appropriate security measures, and ensure that any sub-processors are also in compliance with applicable privacy laws.
2.355BSA explained the controller-processor 'distinction has existed for more than 40 years and is foundational to privacy laws worldwide'. Privacy frameworks in foreign jurisdictions reflect 'the different roles of controllers (which decide how and why to process personal data) and processors (which handle personal data on behalf of other companies and pursuant to their instructions)'. By distinguishing between controllers and processors 'a privacy law can better craft obligations that fit both types of organisations'.
2.356Ms Erika Ly, Policy Manager, Tech Council of Australia, further explained how a distinction between data controllers and processors would be helpful:
When we're starting to think about how we may necessarily apply privacy obligations in the context of automated decision-making, being able to clarify the roles of controller and processor folks who are operating across the technology stack becomes very important, because it clarifies the appropriate entity that then will apply the disclosure notices for automated decision-making. At the moment in terms of our read of those particular provisions, it is fairly unclear.
2.357The inclusion of that distinction in the Privacy Act would 'not only align Australia's privacy law with other international laws and frameworks, but also, provide much-needed clarity for businesses and consumers'.
2.358The AIIA argued that introducing the distinction into legislation:
…would increase the efficiency of the Privacy Act by allocating responsibilities relating to notification, consent, and security. The clear separation between controllers and processors forms a fundamental aspect of effective privacy frameworks worldwide, including the European Union's GDPR. The AIIA cautions that a failure to provide a clear delineation between these concepts would result in an inability to clearly define responsibilities in the management of personal data.
2.359Mr Godber understood that the controller/processor distinction:
…would apply to the tranche 2 provisions, which we also understand are reasonably advanced. The introduction of a controller and processor distinction is just one of a number of fundamental reforms that we are expecting in tranche 2 that critically will make Australia's privacy regime interoperable with key regimes overseas.
2.360In its response to the Privacy Act Review Report, the government stated:
Complexity and regulatory burden for entities acting as 'processors' would likely increase following reforms proposed in the Report. To recognise that different entities have differing degrees of control over the handling of personal information, the Government agrees in-principle that a distinction between controllers and processors of personal information should be introduced into the Privacy Act (proposal 22.1). This will bring Australia into line with other jurisdictions, reflect the operational reality of modern business relationships, and reduce the compliance burden for entities acting as processors.
Updating the definition of personal information
2.361Some inquiry participants called for an updated definition of 'personal information'.
2.362EFA argued the definition of 'personal information' has not kept pace with technological advances. To remain relevant and protect people's identity, it needs 'to include online tracking technologies, individuation, the use of location data, face and voice recognition, and other emerging methods of identifying individuals'.
2.363Ms Chandni Gupta, Deputy Chief Executive Officer and Digital Policy Director, CPRC, argued privacy reform should include an update to the kinds of data that are included in 'personal information'. She gave the example of location data:
We haven't specifically had anything from government about why or why not, or what is included or not included. However, we do believe it is those points of data, such as location data, that are very much personal to you. If you think about it, your journey and your different locations that you visit across a day are very much specific to you, and there is probably no other person that follows your pathway in the same way.
2.364CPRC argued that this kind of data, even if it is anonymised or not associated with your name in any way, would be 'easy to re-identify because of how unique our map is. You do not need to know who the person is. Businesses can very easily single you out'. The definition of personal information that was drafted in 1988:
…would have been enough to protect us, but it's not 1988. At that time there had only been one data breach, which was the Morris Worm, and we know now that there are multiple data breaches every year. We are living in a very different time.
2.365For that reason, the definition of 'personal information' should be modernised.
2.366Professor Santow, HTI, called for a broadening of the term 'personal information'. He suggested the term should include information such as IP addresses, device identifiers, and geolocation data. That information 'should fall within that subset of personal information known as sensitive information' as it has the potential 'to cause quite serious harm'. The Privacy Act already makes a distinction between personal and sensitive information.
2.367Ms O'Shea, DRW, argued that the technology industry should not be surprised by reform to the definition of personal information:
Industry have been involved in this consultation process. They've had their say and they've participated in this throughout. This is not a novel proposal. They are also, I think, on notice that they require a social licence to operate, and, at present, polling surveys of the Australian public show support for stronger privacy protections over their personal information. The element of surprise is not there, I think, from the perspective of this process or in general, in terms of social attitudes to privacy.
2.368Mrs Lorraine Finlay, Human Rights Commissioner, expressed the AHRC's disappointment that the bill did not include more of the reforms from the Privacy Act Review that the government had agreed, or agreed in principle, to implement. The bill should have included 'a fair and reasonable test which would serve to strengthen privacy protections in the modern era'.
2.369Ms Kind, OAIC, agreed that the introduction of a fair and reasonable test and an updated definition of personal information should be prioritised. Broadening the definition of personal information 'would greatly expand our ability to be an effective regulator'.
2.370She suggested there is legal uncertainty about the definition of 'personal information':
…there has been some legal uncertainty for some time in the aftermath of a case, Privacy Commissioner v Telstra Corporation Ltd [2017] FCAFC 4, which has left some legal uncertainty about the definition of 'personal information'.
2.371Broadening the definition and making it clearer would:
…take away that uncertainty, aid the regulated community in being clear about what their obligations are and aid our office in unqualified enforcement with respect to the kinds of personal information that might not instinctively be classified as personal information but which would relate to an individual as that broader definition proposes.
2.372Ms Kind also argued the introduction of a fair and reasonable test would provide:
…a way to raise the general standard of personal information handling across the economy, and would address the current power imbalance inherent in the existing framework by shifting the responsibility to businesses and organisations to proactively consider the impact that their data-handling practices may have on individuals.
2.373The introduction of that test would modernise Australian privacy legislation in line with community expectations as it:
…would change and also level up the nature of the obligations in the Privacy Act, commensurate with the kinds of harms and challenges we see, particularly in the digital economy. At the moment, the Privacy Act regime really focuses on the collection of personal information as the relevant process to regulate and control. What the fair and reasonable test gives us is a way to also try to ensure that the ways in which personal information are being used are consistent with community expectations and the interests of the individuals. We think that that's really the linchpin of bringing Australia's privacy regime into the 21st century.
2.374Mrs Finlay argued there is no reason to delay the introduction of these reforms:
…while public policy requires consultation, there has been consultation on this point…[I]t is something that has been discussed for a lengthy period of time. While we certainly understand the need to proceed in a measured and considered way, we would say there is also a need to proceed in a timely way, and we would like to see a clear timetable in terms of when that reform and others will be progressed.
2.375Ms Catherine Fitch, Assistant Secretary, Privacy Reform Taskforce, Integrity Frameworks Division, AGD, stated the AGD understands the importance of revisiting the definition of personal information and introducing a fair and reasonable test. In the AGD's view:
…they both clearly do assist in providing an upgraded baseline standard for personal information. The government at this stage has committed to progressing the remaining agreed-in-principle proposals, of which those were two, in a second package.
2.376The rationale for introducing the reforms in two tranches was based on stakeholder feedback during the consultation phase. As Ms Fitch explained:
…this first tranche bill was really intended to address areas where reform had been identified as urgent and can be progressed largely without a direct or significant regulatory impact on entities. The reason for that I think is that the government's seeking to balance an uplift in privacy protection with a reasonable and proportional impact on entities. In our consultations to date, several stakeholders have made clear that they would welcome the things that directly impact them being progressed in one group.
…
…the government has made clear that it does seek to do another tranche of reform and, in certain cases, that requires further consultation on the detail of the draft proposals.
Committee view
2.377The bill would enact a first tranche of reforms to the Privacy Act 1988 agreed by the government in its Response to the Privacy Act Review. The bill would also introduce a new statutory tort for serious invasions of privacy and targeted criminal offences to respond to doxxing.
2.378Multiple civil society organisations told the committee that privacy reform is overdue. They did not support delaying the bill to incorporate further reforms.
2.379The committee heard evidence from children's advocacy organisations. They highlighted the importance of including children's views in the design of the Children's Online Privacy Code. Given the logistical difficulties associated with consulting children, the committee considers that the 40-day consultation period is insufficient. To gain meaningful input from children, the committee recommends the Information Commissioner consult stakeholders over a minimum of 60 days.
Recommendation 1
2.380The committee recommends that the minimum consultation period for the Children's Online Privacy Code is extended to at least 60 days.
2.381The committee received evidence from representatives of a broad range of industries. Many of them indicated a strong desire to engage in the development of the Children's Online Privacy Code. While the bill would allow the Information Commissioner to consult with industry on the development of that code, the committee is of the view that the commissioner should be required to engage in that consultation to ensure that all relevant stakeholders are consulted on the design of the code. Input from a wide range of stakeholders would help to ensure the COP Code is fit for purpose and technically feasible.
Recommendation 2
2.382The committee recommends that the bill is amended to include a requirement for the Information Commissioner to consult with relevant industry bodies or organisations when developing the Children's Online Privacy Code.
2.383The ABC and the Special Broadcasting Service drew the committee's attention to a possible drafting error in the bill. The Attorney-General's Department confirmed the provision that would permit national broadcasters, such as the ABC and Special Broadcasting Service, to access personal information during declared emergencies was a drafting error. On that basis, the committee recommends that the error is rectified.
Recommendation 3
2.384The committee recommends that the exclusion preventing media organisations from accessing personal information during declared emergencies is extended to cover national broadcasters such as the ABC and the Special Broadcasting Service.
2.385The bill would provide the Information Commissioner with the power to issue infringement notices to entities subject to the Australian Privacy Principles that have not complied with their obligations. The committee understands these provisions would not necessarily explain to the affected entity what steps could be taken to remedy their breach of the principles. The committee is concerned that, without the power to issue a discretionary notice in the first instance, the bill could disincentivise entities from engaging in open and consultative dialogue with the Office of the Australian Information Commissioner.
Recommendation 4
2.386The committee recommends that the bill is amended to empower the Information Commissioner to issue a discretionary notice to an entity to remedy an alleged breach of one or more of the provisions in section 13K before issuing an infringement notice.
2.387The bill would require Australian Privacy Principle entities to include information in their privacy policies about how personal information is used in automated decision making processes. The committee received evidence that the provision could compromise sensitive business information and potentially stifle innovation. The committee considers it appropriate that sensitive commercial information is protected.
Recommendation 5
2.388The committee recommends that the Explanatory Memorandum to the bill is amended to make clear that the level of information required in privacy policies is not expected to compromise commercial-in-confidence information about automated decision-making systems.
2.389The committee acknowledges concerns raised about the public interest test outlined in clause 7 of the bill, particularly those discussed by Emeritus Professor Barbara McDonald and Professor David Rolph. The committee is of the view that a defendant should not be required to adduce evidence of a public interest in every case and that the court should be required to consider countervailing public interests in determining whether the statutory tort cause of action is made out. The committee also considers that the bill should make clear that 'artistic expression' is a form of freedom of expression.
Recommendation 6
2.390The committee recommends that the Commonwealth government considers amending clause 7 of the bill to:
require a court to consider the matters of public interest that justify the invasion of the plaintiff's privacy;
not require a defendant to adduce evidence of public interest in every case; and
provide that 'artistic expression' is a form of freedom of expression.
2.391The committee received evidence that the journalism exemption from the statutory tort for serious invasions of privacy should be broadened to include a wider range of journalistic activities. In the committee's view, individuals involved in the publication, re-publication or distribution of journalistic material should also be exempt from the tort.
Recommendation 7
2.392The committee recommends that the Commonwealth government considers amending Schedule 2 of the bill to ensure that the journalism exemption applies to a person involved in the publication, re-publication or distribution of journalistic material.
2.393The bill would exempt journalism that comprises commentary or opinion on, or analysis of, news, current affairs or a documentary from the statutory tort for serious invasions of privacy. The committee is concerned that the bill would not provide adequate certainty that editorials are a form of journalistic material. In the committee's view, editorials are journalistic material and should therefore be exempt from the tort.
Recommendation 8
2.394The committee recommends that Schedule 2 of the bill is amended to make clear that the concept of 'journalistic material' for the serious invasions of privacy tort includes 'editorial' material.
2.395Clause 9(1) of the bill would allow a court to grant an injunction restraining the defendant from invading the plaintiff's privacy. That power is not limited to interim injunctions as the section title suggests. The bill should be amended to make it clear that this power is not limited to the issuance of an interim injunction.
Recommendation 9
2.396The committee recommends that Schedule 2 is amended to make clear that the power conferred on a court to issue an injunction is not limited to an 'interim' injunction.
Recommendation 10
2.397Subject to the preceding recommendations, the committee recommends that the Senate passes the bill.
Senator Nita Green
Chair