Chapter 2 - Complexity and accountability

Complexity

2.1 It is undeniable that social media, and the broader online environment, is a highly complex space that requires a correspondingly complex regulatory response. This was acknowledged by several submitters. However, many were also optimistic that, despite the scale of the task, there is now a valuable opportunity to construct a holistic policy framework that will provide a lasting foundation for regulating a dynamic and continuously evolving sector that affects all Australians.

2.2 The core of this complexity is not simply technological. It also lies in the fact that, while social media can cause many harms, it brings significant benefits to users and is integral to the way people, particularly young people, interact in the modern world. This makes regulating social media to promote and protect those benefits, while minimising harm, a complex policy problem to solve.

2.3 Despite these difficulties, the committee heard that Australia has a proud history of tackling difficult issues. Beyond Blue provided tobacco regulation as an example:

There are many examples of complex public policy that affect the whole of the Australian population where governments successively from both sides of politics designed multifaceted policy responses that actually are effective. The example that sprung to mind when I was thinking about this was tobacco control. This is a wicked health and social policy issue. When we all started smoking back in the day, no-one really understood the impacts.[1]

2.4 batyr, a youth mental health organisation, also cited Australia's record of being at the forefront of difficult policy issues:

We have done this many times in our history. Australia is really an innovator in this space of looking at how we keep people safe and encourage skill development and access in ways that are appropriate for our culture.

I think that is the really exciting thing about this committee—to be having these conversations about what the answer is. If it is not a blanket no or yes, because we know that doesn't work, what is it? I think what my colleagues are saying is the complexity of the way that we deal with this is something we've done in the past and we can do now.[2]

2.5 The challenge of developing a policy and regulatory response to the impact of social media—which has both benefits and risks—does differentiate it from examples such as tobacco that are overwhelmingly harmful:

It's really important, I think, to distinguish between social media, which is a very different beast to tobacco, not least because [not] every piece of social media content is doing you harm, unlike every cigarette. There are positive aspects to social media; we've touched on them today. Australian governments have a really long and proud history of designing policy responses that are multilayered, multifaceted and sustainable.[3]

2.6 This report looks in some depth at areas where this complexity is most evident, including how social media contributes to harm, what can be done to minimise that harm while maximising the benefits, age verification, and how best to ensure that accountability is properly attributed and enforced. In this regard the committee is mindful of advice it received from one witness that this complexity should not be used by social media platforms as an excuse not to progress an age-appropriate online experience:

It's the narrative of complexity that tech is selling us that they want us to get stuck on. It's not insurmountable. It's just that they want us to keep talking about this complexity.

From where I sit, it's a difficult and complicated or poisoned debate that we're led into because it's complex, and this is what tech wants us to get stuck in and write about. The tech isn't ready and there are privacy concerns about age verification.

I think it's a debate that's a bit of a trap.[4]

Who is responsible for the operation and content of social media

2.7 A key element of ensuring that social media is a positive force in society is to hold digital platforms accountable for the impact of their operations, and to ensure they are subject to Australian laws and regulations to meet the expectations of Australian society.

2.8 In terms of accountability, the crux of the argument is who is responsible for keeping users of social media safe from online harm. Many online platforms place that burden on their users, supporting them with tools to manage their own online space. However, many submitters were of the view that the platforms are abdicating their responsibility to provide safe environments, and that, rather than designing regulations to hold platforms accountable for harms after the fact, regulation should require digital platforms and their operations to be safe by design.

2.9 This is the area on which the Australian Government (government) should focus its regulatory efforts, according to the eSafety Youth Council, which argued that platforms are currently not accountable for the impacts of their operations on society:

The […] Government should focus on … the lack of concern for safety and broader socio-ethical implications from the Big Tech industry and their lack of transparency and accountability impacting young people's experiences online. Government should prioritise and regulate the incorporation of Safety by Design features in the Big Tech industry's digital products and platforms to ensure the burden of safety doesn't fall solely on us and our families.[5]

2.10 Safety by design is a well-established concept in social media and digital spaces, aimed at mitigating online harms by building protections into the design of digital products. In Australia, the Office of the eSafety Commissioner (eSafety Commissioner) has developed a comprehensive approach that 'encourages technology companies to alter their design ethos'[6] to ensure that safety is built into their operational models.

2.11 The eSafety Commissioner's initiative is supported by implementation guides, assessment tools, and bespoke information for various company types and sizes. The principles underpinning the concept are central to the debate about who should be accountable for users' online experience. They are:

service provider responsibility;

user empowerment and autonomy; and

transparency and accountability.

2.12 The principle of service provider responsibility is that 'the burden of safety should never fall solely upon the user'. Enacting this principle would involve social media platforms assessing the risk of harms in the design and operation of their platforms and taking steps to 'engineer' these out.[7]

2.13 User empowerment and autonomy is a rights-based principle which serves to ensure that products and services 'align with the best interests of users'. Building this principle into design involves features and functions that 'preserve fundamental consumer and human rights', and 'understanding that abuse can be intersectional, impacting on a user in multiple ways for multiple reasons, and that technology can deepen societal inequalities'.[8]

2.14 The third principle of safety by design is that the operations of platforms and services should be transparent, and that the operators should be accountable for their content and services. This means not only that services should operate safely, but that there are mechanisms available for users to report and address safety concerns if unsafe practice is discovered. The platforms and services also have a duty to inform and educate users on what steps they can take to preserve their safety.[9]

2.15 These three principles reflect the concerns that many submitters had about the operations and operating model of social media. How these concerns are addressed, and by whom, is one of the key questions.

2.16 Placing the onus of accountability on platforms, rather than on users, was supported by Reset.Tech Australia, which told the committee:

The first ingredient is some sort of measure to drive towards accountability, such as a duty of care or some sort of statement that flips the table and says, 'Actually, platforms have a responsibility towards their users. The responsibility is on the platforms not on the individual users to keep themselves safe'.[10]

2.17 The Alannah and Madeline Foundation accepted that shifting the onus of accountability from the individual to the platforms raises the prospect of the business model of social media platforms having to change and, by extension, of individuals having to pay to access the services:

… the accountability for feeding them [information] should not be on the child. It should be on the design of that service.

[…]

And I know it's hard because it means that the business model will have to change, and it is likely we're going to have to pay with money for the value of the services that we want to access.[11]

2.18 Meta acknowledged that there were concerns around the accountability of social media platforms:

… we certainly acknowledge the need for greater transparency and accountability of digital platforms. That's why we've long advocated for government regulation of digital platforms. We were a founding signatory to the DIGI Code of Practice on Disinformation and Misinformation, and we've constructively engaged with policymakers and regulators on how to address many of the issues being considered by the committee.[12]

Complaints process

2.19 Central to platforms' accountability is a robust and meaningful complaints process at the platform level. A common theme of the inquiry was that complaints or concerns raised often go nowhere or are dismissed by the platform, with no further avenue available to users. Reset.Tech Australia cited the example of raising a system-level complaint under the Australian Code of Practice on Disinformation and Misinformation administered by the Digital Industry Group Inc. (DIGI):

… the complaints really have nowhere to go. We provide painstaking evidence to underpin our complaints and we prepare a multipage, rigorous report which goes to an administrator and decision-maker, and then industry can simply disagree and there is really no avenue for a meaningful and proper debate on the empirical findings we get to.[13]

2.20 Similarly, Collective Shout expressed its considerable frustration that it is often unable to get a response from the platforms when concerns are raised or complaints made:

We are constantly frustrated by the lack of action by the platforms. Instagram rarely responds […] We track the activities of online predators and are constantly reporting to Instagram, with very little action.[14]

2.21 Collective Shout noted that it had tried numerous avenues, but to no avail:

We use every lever at our disposal and go direct through the platforms. That's usually the least effective action. We go to eSafety, to the National Center for Missing and Exploited Children, to the Australian Centre to Counter Child Exploitation, to global policing bodies. We try every mechanism. But it is hard. The power of these platforms is stacked against civil society organisations.[15]

2.22 When questioned on the avenues available at the user and system level, Meta provided an explanation of how complaints are processed:

In terms of the overall content policies and how they're set, we have teams that are based in different offices. They look at how our complaint systems are handling feedback that we're getting from different stakeholders. They set up a product policy forum—I think it's at least once a month—which engages with a wide range of academics and other experts to give us feedback about adjustments that need to be made, and we test those with feedback from local offices and different stakeholders around the world.

There are tools on the platform, and then there are different ways for us to escalate issues or concerns.[16]

2.23 Legal accountability, and the ability of platforms to avoid legal redress or accountability for harms done on their platforms, was also a strong theme from some witnesses. For example, Tattarang, which has been acting in response to a series of high-profile scams involving Dr Andrew Forrest AO, submitted:

Despite the best intentions of lawmakers to mitigate this and other harms, our courts and regulators have limited powers. Social media platforms such as Meta and X Corp. engage in jurisdictional arbitrage to avoid effective accountability to Australian regulators and courts and deny redress to the victims of these scams and other harms.

By structuring their businesses so that all relevant operations are managed and controlled by US based companies with no relevant entities based in Australia they can frustrate attempts at service, refuse to comply with codes of practice, refuse to comply with legislation, render voluntary their compliance with injunctions and other court orders and force litigants to go through a convoluted process to sue or get a court order enforced in the US. Furthermore, they claim absolute immunity for virtually all their activities thanks to section 230 of the Communications Decency Act 1996.[17]

2.24 On the question of avenues available to Australians attempting to take legal action against Meta, the answer given was less than clear about how that could occur in Australia:

Senator HANSON-YOUNG: If I wanted to sue Meta, could I come to your office in Sydney, Ms Garlick, and serve document on you?

Ms Garlick: I'm sure people can and do, and have, but in terms of whether it would be effective and legally binding—I'm not a lawyer. Often, when people ask to be able to serve us with documents, we try to direct them to outside counsel to support that process.

Senator HANSON-YOUNG: So you don't actually know whether anyone in Australia can be served on directly?

Ms Garlick: I don't think we're authorised, under the technical legal processes, to be served on, but that doesn't mean that there aren't ways to find solutions…[18]

How to better regulate social media platforms

2.25 Meta maintained that it has 'long advocated for government regulation of digital platforms',[19] while DIGI's submission outlined a series of active regulatory processes that the digital industry is working on with governments. These include:

the statutory review of the Online Safety Act 2021;

the recent Online Safety (Basic Online Safety Expectations) Amendment Determination;

the government's pilot of age assurance technology;

the second stage of the modernisation of Australia's National Classification Scheme;

development of industry codes for Class 2 material;

processes for the making of subordinate regulation (rules/standards) as a consequence of the recently passed Digital ID Bill 2024, together with the Digital ID (Transitional and Consequential Provisions) Bill 2024;

the review of the Privacy Act 1988 (with a foreshadowed introduction into Parliament in August 2024), including government agreement to implement a Children's Online Privacy Code to promote the design of certain services in the 'best interests of the child';

the Misinformation and Disinformation Bill 2023 and associated processes;

the voluntary code for online dating services;

the voluntary AI Safety Standard;

establishment of the Select Committee on Adopting Artificial Intelligence; and

the anticipated government response in relation to dispute and complaints resolution processes of digital platforms, flowing from the Australian Competition and Consumer Commission's (ACCC) Digital Platforms Inquiry.[20]

2.26 Strong government regulation was supported by Reset.Tech Australia, which told the committee that 'only hard laws can achieve accountability in a digital platforms market'. Reset.Tech Australia argued:

Governments continue to defer to industry as the technical experts, and in doing so they concede vital opportunities to maximise public protections and secure accountability. Our failed experiment with industry-led regulation has lured us into a state of protracted regulatory capture.[21]

2.27 The Digital Services Act (DSA) in the European Union (EU) was cited as a model of strong regulation for Australia, and many other jurisdictions, to follow. One of the most significant elements of the DSA is that it requires large social media platforms to consider the rights of their users, and the impacts of their services on society more broadly, and then to mitigate those risks. The Human Rights Law Centre provided a synopsis of the legislation in a submission to the Senate Economics References Committee inquiry into the influence of international digital platforms:

Under the DSA, very large platforms are required to undertake annual risk assessments to identify, analyse and assess any significant systemic risks stemming from the functioning and use of their services, including their algorithms, recommender systems, content moderation systems, terms and conditions, advertising systems or data-related practices. In their risk assessments, very large platforms are specifically required to consider the following systemic risks:

Actual or foreseeable negative impacts on a range of fundamental rights, including the right to dignity, to respect for private and family life, protection of personal data, freedom of expression and information, the prohibition on discrimination, the rights of the child, and to a high-level of consumer protection.

Any foreseeable negative effects on civic discourse and electoral processes, and public security.

Any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person's physical and mental well-being.

Dissemination of illegal content through their services.[22]

2.28 The question of where best to target regulation was also a contentious topic for many submitters to the inquiry. The eSafety Commissioner was of the view that appropriate responsibility should be placed at all levels of the online environment, to avoid blame shifting or avoidance of responsibility:

we have to take an ecosystem approach ultimately, and I believe that every element or every sector of the tech ecosystem has a role—from the device manufacturers, on the devices, to the operating systems, the search engines and the social media companies. We're seeing people pointing in directions saying, 'Well, it's not our responsibility; it is theirs.' Everyone has to take a little bit more responsibility to provide safeguards up and down the stack. That is my view—along with effective regulation, of course, to gain more of that transparency. You can't have accountability without transparency.[23]

2.29 DIGI was also of the view that accountability cannot be ensured by only one part of the online ecosystem, or by only one approach:

Online harms are multi-faceted social problems that cannot be fixed with technical and legal safeguards alone; this is why we are a proponent of multi-stakeholder approaches in relation to online harms that continue to ensure strong accountability and responsibility on the part of online platforms, while also situating platform-level responses in a wider context.[24]

2.30 CHOICE submitted comments from supporters that went to the crux of the business model of the platforms, and how they monetise content without bearing any responsibility for that content:

The scam site came up on the top of the Google search list and was tagged as 'Sponsored'. The site basically required a payment for something that was free - ABN. I have emailed the scam group and received no response. I believe if Google (and similar web companies) are making money from 'sponsored' sites then they're making money from scams where this is the case. There should be an accountability that also goes back to them.[25]

2.31 To this end, participants advocated for a range of measures to combat social media-based scams, including (but not limited to):

a requirement that foreign social media companies be domiciled in Australia and subject to Australian laws;[26]

placing duties on digital platforms to protect users from fraudulent advertisements and scammers;[27]

access to reimbursement and a single dispute resolution pathway;[28] and

improving digital literacy to help social media users identify likely scams.[29]

2.32 The ACCC also highlighted the need for 'additional enforceable protection' in relation to the broader scams ecosystem:

We think that that can be achieved through the currently proposed model, which has some broad legislative principles that will apply across the ecosystem consistently, with specific sectoral codes that will go to how those obligations are, in effect, given practical life given the different nature of the players across the ecosystem. So we do think that there is more that is required to ensure that there are no weak links in our system that may enable scams to get to Australians.[30]

2.33 Ms Catriona Lowe of the ACCC welcomed the government's commitment to introduce 'mandatory and enforceable scam codes' applying across the scams ecosystem, which will:

… clarify the roles and responsibilities for government regulators and the private sector including banks, telecommunications providers and, most relevantly to this inquiry, digital platforms.[31]

2.34 The proposed new legislative regime was supported by other participants such as CHOICE, which advocated for imposing strong obligations on digital platforms that would require them to:

prevent fraudulent ads being hosted on their platforms;

maintain accessible and searchable ad libraries;

have effective processes for searching for, identifying and acting on similar users or similar fraudulent content once they become aware of a scam on their platforms;

meet minimum standards to authenticate and verify the identity and legitimacy of users who sell or advertise on their platform; and

calculate and publish the revenue generated from scam advertisements, or report to an appropriate regulator to be published in a public database.[32]

2.35 In relation to online marketplaces, CHOICE suggested that platforms should offer optional verification for users (and the option for users to interact only with other verified users), and should use technology to monitor suspect marketplace activity.[33]

2.36 Similarly, the Australian Banking Association supported the mandatory codes and proposed the addition of:

an obligation for social media platforms to promptly block, take down or otherwise restrict access to scam content that has been identified by the National Anti-Scam Centre and other trusted parties (in addition to the proposed obligation for platforms to investigate and act on potential scams flagged or reported by consumers);

providing powers for an appropriate regulator to direct platforms to block, take down or otherwise restrict access to scam content; and

obligations to enhance monitoring, prevention, detection and removal of scam content on the platform, especially advertising on the platform and accounts that are used to place advertisements.[34]

2.37 To this end, the Australian Banking Association suggested that Singapore's recently introduced Online Criminal Harms Act 2023 could provide a model for building on Australia's ecosystem approach.[35] Likewise, CHOICE highlighted the United Kingdom's Online Fraud Charter and the EU's Digital Services Act as possible models.[36]

2.38 More broadly, further legislative change, both in Australia and overseas, to address the lack of accountability of platforms was supported by various organisations. For example, the Butterfly Foundation advocated for the Online Safety Act 2021 to be reviewed to place a duty of care on platforms, alongside various other accountability mechanisms:

[W]e recommend that: the Online Safety (Basic Online Safety Expectations) Determination 2022 (Cth) be modified so that social media must take 'reasonable steps' to promptly remove pro-eating disorder content and advertisements from their platforms; and that…the Online Safety Act 2021 (Cth) review result in: an overarching duty of care to protect the health and wellbeing of young people, risk assessments and risk mitigation obligations, meaningful transparency measures to make publicly visible the risks and mitigation measures created by systems and elements, and strong accountability and enforcement mechanisms.

2.39 A practical way forward in terms of government oversight was suggested by the Centre of the Public Square:

That's why we advocate for either a senior minister with overriding responsibility for tech platform accountability or at least a more coordinated internal process to look at these reforms as a package.[37]

2.40 Safe on Social supported reform of Section 230(c) of the Communications Decency Act 1996 in the United States, as this Act has 'inadvertently allowed social media companies to evade responsibility for harmful content on their platforms'. It added that '…developing global standards and regulations to address cross-border online threats and lobbying for Section 230(c) changes will help ensure social media companies are held accountable worldwide'.[38]

Footnotes

[1] Ms Georgie Harman, Chief Executive Officer, Beyond Blue, Proof Committee Hansard, 1 October 2024, p. 3.

[2] Ms Katie Acheson, Chief Executive Officer, batyr, Proof Committee Hansard, 1 October 2024, p. 4.

[3] Ms Georgie Harman, Chief Executive Officer, Beyond Blue, Proof Committee Hansard, 1 October 2024, p. 3.

[4] Dr Rys Farthing, Director of Research, Reset.Tech Australia, Proof Committee Hansard, 10 July 2024, p. 5.

[5] eSafety Youth Council, Submission 60, p. 3.

[6] Office of the eSafety Commissioner, Safety by Design (accessed 26 July 2024).

[7] Office of the eSafety Commissioner, Safety by Design (accessed 26 July 2024).

[8] Office of the eSafety Commissioner, Safety by Design (accessed 26 July 2024).

[9] Office of the eSafety Commissioner, Safety by Design (accessed 26 July 2024).

[10] Dr Rys Farthing, Director of Research, Reset.Tech Australia, Proof Committee Hansard, 10 July 2024, p. 4.

[11] Ms Sarah Davies, Chief Executive Officer, Alannah and Madeline Foundation, Proof Committee Hansard, 10 July 2024, p. 46.

[12] Ms Mia Garlick, Regional Director, Policy, Australia, Japan, Korea, New Zealand and Pacific Islands, Meta, Proof Committee Hansard, 28 June 2024, p. 2.

[13] Ms Alice Dawkins, Executive Director, Reset.Tech Australia, Proof Committee Hansard, 10 July 2024, p. 9.

[14] Ms Melinda Tankard Reist, Movement Director, Collective Shout, Proof Committee Hansard, 10 July 2024, pp. 26–27.

[15] Ms Melinda Tankard Reist, Movement Director, Collective Shout, Proof Committee Hansard, 10 July 2024, p. 27.

[16] Ms Mia Garlick, Regional Director, Policy, Australia, Japan, Korea, New Zealand and Pacific Islands, Meta, Proof Committee Hansard, 28 June 2024, p. 6.

[17] Mr Bruce Meagher, Head of Public Affairs, Tattarang, Proof Committee Hansard, 28 June 2024, p. 40.

[18] Ms Mia Garlick, Regional Director, Policy, Australia, Japan, Korea, New Zealand and Pacific Islands, Meta, Proof Committee Hansard, 28 June 2024, p. 7.

[19] Ms Mia Garlick, Regional Director, Policy, Australia, Japan, Korea, New Zealand and Pacific Islands, Meta, Proof Committee Hansard, 28 June 2024, p. 2.

[20] DIGI, Submission 57, Appendix 2, p. 31. DIGI's founding members are Apple, eBay, Discord, Google, Linktree, Meta, Microsoft, Snap, Spotify, TikTok, Twitch, X, and Yahoo.

[21] Ms Alice Dawkins, Executive Director, Reset.Tech Australia, Proof Committee Hansard, 10 July 2024, p. 1.

[22] Human Rights Law Centre, Submission 84 to the Senate Economics References Committee inquiry into the influence of international digital platforms, Attachment 3, p. 10.

[23] Mrs Julie Inman Grant, eSafety Commissioner, Proof Committee Hansard, 21 June 2024, p. 57.

[24] DIGI, Submission 57, p. 2.

[25] CHOICE, Submission 51, p. 10.

[26] Tattarang, Submission 58, p. 2; Australian Banking Association, Submission 144, p. 4.

[27] NSW Service for the Treatment and Rehabilitation of Torture and Trauma Survivors (STARTTS), Submission 195, p. 4.

[28] CHOICE, Submission 51, p. 5.

[29] STARTTS, Submission 195, p. 4.

[30] Ms Catriona Lowe, Deputy Chair, Australian Competition and Consumer Commission, Proof Committee Hansard, 21 June 2024, pp. 24–25.

[31] Ms Catriona Lowe, Deputy Chair, Australian Competition and Consumer Commission, Proof Committee Hansard, 21 June 2024, p. 18.

[32] CHOICE, Submission 51, p. 5.

[33] CHOICE, Submission 51, p. 5.

[34] Australian Banking Association, Submission 144, p. 2.

[35] Australian Banking Association, Submission 144, p. 3. Under Singapore's Online Criminal Harms Act 2023, a designated person can issue directions to specified types of persons, including social media companies and other digital platforms. These directions include stop communication directions, disabling directions, access blocking directions, account restriction directions and app removal directions.

[36] CHOICE, Submission 51, pp. 14–15.

[37] Mr Peter Lewis, Founder, Centre of the Public Square, Per Capita, Proof Committee Hansard, 10 July 2024, p. 2.

[38] Safe on Social Media Pty Ltd, Submission 37, pp. 1–2.