In recent years, disinformation has influenced democratic processes around the world, most notably the 2016 United States presidential election and the United Kingdom's European Union membership referendum (the Brexit referendum). Although the Committee addressed disinformation as part of its report on the 2016 federal election, it considers that–like oversight of the AEC–the effect of disinformation on democratic processes is a matter requiring long-term parliamentary oversight.
Accordingly, in November 2018, the Committee resolved to incorporate this issue into its ongoing oversight inquiry, with the following focus areas:
the extent to which social media bots may have targeted Australian voters and political discourse in the past;
the likely sources of social media manipulation within Australia and internationally;
ways to address the spread of online ‘fake news’ during election campaigns; and
measures to improve the media literacy of Australian voters.
Threats to democracy
Digital technologies provide platforms for greater political engagement than has ever been possible. While greater engagement in political and policy discussions has the potential to enhance our democracy, it also poses threats:
Internet communication is posing two interlocked challenges to Australian democracy: hostile strategic actors are attempting to sow division in society by weaponising controversial or misleading information; the self-selection of news and disappearance of attitude-challenging content in some parts of the population’s news diet can lead to the rise of ‘echo chambers’ which facilitate the dissemination of misinformed opinion.
This information threat touches every elector. The Committee’s preliminary findings on this issue identify the need for improved community media literacy and critical analysis of disinformation. The term ‘fake news’, while accurate, has also been politically weaponised in recent years. Many countries are now using the terms ‘disinformation’ or ‘misinformation’ to identify this threat.
Dr Carlo Kopp affirmed the Committee’s early findings that media literacy and critical thinking skills should be improved. He suggested that this needs to be wider than the education system:
One of the difficulties that we have had with a free-for-all in media and social media for many years is that a lot of improper and deceptive conduct has ended up being normalised in the community. In other words, people just don't take any notice of it and nobody gets upset if somebody goes out and makes false statements or makes blatantly false statements—ho-hum!
He further explained:
…the problem, particularly with fake news, in many ways resembles an epidemiological problem with an infectious disease. In other words, you have something that's in effect toxic infecting an individual and when they share it with other individuals via social media they're propagating it. There has been a very vigorous debate about this issue of immunisation and whether it's appropriate for a person to refuse immunisation if it means they become a public risk by distributing an infectious disease. That probably sets a conceptual precedent for the problem that we see here in social media with people irresponsibly propagating things they may not know to be false or may know to be false but may not care whether or not it's false.
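Dr Kopp's epidemiological analogy can be illustrated with a minimal susceptible–infected–recovered (SIR) style simulation. This sketch is purely illustrative and forms no part of the evidence before the Committee; all names and parameter values (share_rate, debunk_rate, population size) are hypothetical.

```python
# Illustrative sketch only: a simple SIR-style model of how a false story
# might propagate through sharing, echoing the infectious-disease analogy.
# All parameters are hypothetical, chosen to show the dynamics.

def simulate_spread(population=10_000, initially_exposed=10,
                    share_rate=0.3, debunk_rate=0.1, steps=100):
    """Discrete-time SIR dynamics: susceptible -> sharing -> stopped sharing."""
    s = population - initially_exposed  # have not yet seen the story
    i = initially_exposed               # currently sharing the story
    r = 0.0                             # have stopped sharing (e.g. debunked)
    peak_sharing = i
    for _ in range(steps):
        new_shares = share_rate * s * i / population  # contact-driven spread
        new_stops = debunk_rate * i                   # sharers drop out
        s -= new_shares
        i += new_shares - new_stops
        r += new_stops
        peak_sharing = max(peak_sharing, i)
    return s, i, r, peak_sharing

if __name__ == "__main__":
    s, i, r, peak = simulate_spread()
    print(f"Never exposed: {s:.0f}, still sharing: {i:.0f}, "
          f"stopped: {r:.0f}, peak concurrent sharers: {peak:.0f}")
```

Because the assumed sharing rate exceeds the debunking rate, the story reaches most of the population before dying out, which is the dynamic Dr Kopp describes: each individual act of sharing matters little, but collectively the falsehood saturates the network.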
Dr Kopp noted that these threats will continue to evolve and become more sophisticated, pointing to the development of ‘deep fakes’ as an emerging threat.
As well as the threats posed by deliberately false information, computer networks are also a target for malicious activity. For this reason, as noted in Chapter 1, this Committee suggests a future Electoral Matters Committee consider inquiring into whether Australia’s electoral network–including access to, and management of, the electoral roll–should be considered a national critical infrastructure asset.
In early 2019, the Australian Parliament computing network, alongside the networks of the Labor, Liberal and National parties, was the target of malicious interference. About the attempt, the Prime Minister said:
…there is no evidence of any electoral interference. We have put in place a number of measures to ensure the integrity of our electoral system. I have instructed the Australian Cyber Security Centre to be ready to provide any political party or electoral body in Australia with immediate support, including making their technical experts available. They have already briefed the electoral commissions and those responsible for cybersecurity for all states and territories. They have also worked with global antivirus companies to ensure Australia's friends and allies have the capacity to detect this malicious activity.
The Committee notes that political parties may need more support to ensure the security of their systems. In response to the Prime Minister’s statement, the Leader of the Opposition stated:
Political parties are small organisations with only a few full-time staff, yet they collect, store and use large amounts of information about voters and communities. These institutions can be a soft target, and our national approach to cybersecurity needs to pay more attention to non-government organisations. Our agencies shouldn't be just providing advice to political parties but actively assisting in their defence. I have to say today to people listening in parliament that my words are not just about political parties but also about Australia's small and medium businesses, our educational institutions and individuals.
Like the threats, the solutions to these issues are multifaceted. The Committee has already identified the need for a permanent taskforce to prevent and combat cyber-manipulation in Australia’s democratic processes, and for clarification of the legal framework surrounding social media services and their status as platforms or publishers.
Shortly following the release of the Committee’s report, the Australian Competition and Consumer Commission (ACCC) released a preliminary report on its digital platforms inquiry. The report notes:
…we are at a critical point in considering the impact of digital platforms on society. While the ACCC recognises their significant benefits to consumers and businesses, there are important questions to be asked about the role the global digital platforms play in the supply of news and journalism in Australia, what responsibility they should hold as gateways to information and business, and the extent to which they should be accountable for their influence.
The ACCC report made a range of preliminary recommendations to:
address Google and Facebook’s market power;
monitor digital platforms’ activities and the potential consequences of those activities for news media organisations and advertisers;
address regulatory imbalance;
assist more effective removal of copyright infringing material;
better inform consumers when dealing with digital platforms and to improve their bargaining power.
Any legislative responses will need to balance the need to combat disinformation against freedom of speech and freedom of political communication. As Table 3.1 demonstrates, governments around the world are developing approaches that range from educative to punitive actions. Journalists have raised concerns that punitive actions in some countries specifically target reporters for publishing stories critical of the government. While similar punitive laws would not be considered in the Australian context, this highlights the complexity of imposing regulation in this area.
Table 3.1: Global anti-misinformation actions
Laws that impose prison terms for spreading ‘propaganda’ and posting ‘aggressive and frightening’ content.
Laws allowing prosecution of people spreading false information online and blocking websites.
Expert group of journalists and scholars formed to develop potential solutions. ‘Stop Fake News’ website launched, aimed at public education.
Government taskforce, legislation and platform agreements. Authors of ‘fake news’ can be prosecuted with penalties ranging from fines to imprisonment.
Laws allowing the government to block media that threatens national security, fines and jail time for publishers of ‘fake news’, and a live television program addressing misinformation.
Media regulation–it is illegal to ‘report any news without being able to prove either its truth or that he had good reason to believe it to be true.’
Funding projects aimed at public media literacy, a ‘critical Election Incident Public Protocol’ to monitor and inform the public about disinformation attempts, draft legislation to ‘compel tech companies to be more transparent about anti-disinformation and advertising policies.’
Task force for addressing misinformation, distribution of social media literacy material and strengthening of the intelligence services to avoid foreign interference in parliamentary elections.
Commission report on misinformation recommending a range of global policy approaches to combatting misinformation.
Specifically focussed on elections, laws now define ‘fake news’ reported in order to change a vote, impose strict rules on media during, and in the three months prior to, an election, require greater financial transparency in advertising, and give a judge the power to order the removal of ‘false news’ within 48 hours of a notification.
Jail terms of up to four years for spreading false information on social media. Government taskforce to combat misinformation, a team to monitor social media traffic and remove posts that promote ‘fake news’ and weekly government briefings to debunk misinformation.
Online portal through which citizens can report ‘fake news’ to the police unit that investigates cyber-crime. A recent successful prosecution of a user posting Trip Advisor reviews under fake profiles demonstrated that creating a false identity online is a violation of existing Italian law.
Campaign to make people more ‘critical news consumers’. The army conducts a live radio broadcast to debunk Facebook misinformation and hotlines for citizens to report misinformation.
Government authority aimed at promoting factual content.
Parliamentary investigations (see DCMS Committee below) and the establishment of a government unit to combat disinformation by state actors and others.
California has passed a law that requires the education authorities to list resources on how to evaluate trustworthy media.
Bills are being debated to introduce more transparency in the funding of political ads.
Laws requiring internet service providers to disclose user data so the origin of posts can be traced. Platforms must delete content at the government’s request. Spreading false information is a crime that can result in imprisonment.
Source: Summary taken from www.poynter.org/ifcn/anti-misinformation-actions/. Also listed are actions being taken in China, Côte d’Ivoire, Croatia, Egypt, Germany, India, Ireland, Kazakhstan, Kenya, Malaysia, Myanmar, Pakistan, the Philippines, Russia, Rwanda, Saudi Arabia, Singapore, South Korea, Spain, Sri Lanka, Taiwan, Tanzania, Thailand, Turkey, Uganda and the United Arab Emirates–ranging from bills under debate and community education to internet shutdowns.
Disinformation campaigns are now known to have influenced electoral processes with subversive political messaging in the United States (2016 Presidential election) and the United Kingdom (European Union membership referendum, commonly referred to as Brexit).
The Committee held in-camera hearings with both Twitter and Facebook in November and December 2018 to discuss these issues. The Committee took the unusual step of agreeing to in-camera hearings in order to hear about measures that are being put in place to address malicious actors.
The two platforms have different audiences and play different roles in hosting the dissemination of information, and they have approached global concerns with differing degrees of transparency.
In February 2019, Twitter announced stronger advertising regulations aimed at ensuring that political advertising is run by verified individuals, organisations, or registered political parties, candidates or groups.
The Committee was concerned to see February 2019 reports that Facebook is not adequately applying the authorisation rules set out by the Electoral Act to advertising on its platform.
Documents revealed by an ABC News investigation indicate that Facebook executives took a month to respond to AEC concerns about sponsored content from a group campaigning on foreign donations laws. This is unacceptable, particularly when an election period may be only five to six weeks long.
ABC News also reported that ‘Facebook said it will make an announcement on political advertising rules after the federal election is called.’
This is also unacceptable. The political advertising authorisation rules set out by the Electoral Act apply to all political advertising in Australia, regardless of which platform is used. Like all businesses operating in Australia, Facebook has a responsibility to ensure that it complies with local content laws.
Further, the Committee is also concerned with reports that Facebook is restricting independent scrutiny of advertising on its platform.
Digital advertising in Australia is governed by a combination of federal and state laws, and industry self-regulation. Third party transparency campaigners are also an important part of scrutinising social media to ensure compliance. Any attempt to limit the access of these third parties would be a move away from full transparency and accountability.
International Parliamentary consideration
As noted above, parliamentary committee investigations have formed an important component of considering the impact of disinformation on democracy. Specifically, the Committee notes the work undertaken by the United Kingdom House of Commons Select Committee on Digital, Culture, Media and Sport (the DCMS Committee) and the Canadian House of Commons Standing Committee on Access to Information, Privacy and Ethics (the ETHI Committee).
In the ETHI Committee’s December 2018 report, the Committee recommended that the Government:
subject political parties and political third parties to the Personal Information Protection and Electronic Documents Act (PIPEDA) [Canada’s data privacy law];
provide additional resources to the Office of the Privacy Commissioner to ensure efficient exercise of its additional powers;
ensure that no foreign funding has an impact on elections in Canada;
ensure transparency in online political advertisements;
impose certain obligations on social media platforms regarding the labelling of content produced algorithmically, the labelling of paid advertisement online, the removal of inauthentic and fraudulent accounts, and the removal of manifestly illegal contents such as hate speech;
provide to an existing or a new regulatory body the mandate to proactively audit algorithms;
include principles of data portability and interoperability in PIPEDA;
study the potential economic harms caused by data-opolies and determine whether the Competition Act should be modernized;
study how cyber threats affect democratic institutions and the electoral system;
conduct research regarding the impacts of online disinformation and misinformation as well as the cognitive impacts of digital products which create user dependence; and
invest in digital literacy initiatives.
In concluding its report the ETHI Committee stated:
As the Committee concludes this study, it continues to believe that changes to Canada’s legislative and regulatory landscape are needed in order to neutralize the threat that disinformation and misinformation campaigns pose to the country’s democratic process.
In the DCMS Committee’s February 2019 report, the Committee recommended the following actions:
development of a regulatory definition of tech companies, including social media platforms, and the imposition of appropriate regulation;
strengthening the application of data privacy and anti-competition laws to social media companies, particularly in light of evidence that Facebook acted in a manner ‘considering themselves to be above and beyond the law’;
updates to electoral law to cover modern campaigning, including online, micro-targeted campaigns;
extension of transparency laws around political donations;
addressing foreign influence in elections–including advertising originating from foreign sources or funding–and making public government investigations into foreign influence, foreign-funded campaigns, disinformation and voter manipulation;
introduction of a foreign agents registration scheme to require political actors to make public disclosure of relationships with foreign principals; and
digital literacy as the fourth pillar of education, alongside reading, writing and maths, and a comprehensive cross-regulator strategy developed to promote digital literacy.
The DCMS Committee found:
…the dominance of a handful of powerful tech companies has resulted in their behaving as if they were monopolies in their specific area, and that there are considerations around the data on which those services are based. Facebook, in particular, is unwilling to be accountable to regulators around the world. The Government should consider the impact of such monopolies on the political world and on democracy.
The Australian Parliament must pay attention to the work undertaken by the Canadian and UK parliamentary committees. While there are some areas of law, particularly transparency and electoral law, in which Australia has been proactive, the threats to democratic processes that both committees have uncovered should not be ignored. Australia cannot wait until an electoral crisis occurs, and we should not be complacent about, or discount the probability of, this threat.
Other parliaments have also been investigating the extent to which disinformation campaigns have interfered with elections. Committees from these parliaments have convened an International Grand Committee to bring a joint focus to the issue and to call for accountability from major social media platforms.
The International Grand Committee has been underpinned by the substantial work undertaken by the United Kingdom House of Commons DCMS Committee and the Canadian House of Commons ETHI Committee.
The consistent refusal of the CEO of Facebook, Mark Zuckerberg, to appear before the committees of the Canadian and British Parliaments, as well as the International Grand Committee, despite the many millions of users in these countries, is contemptuous. Interference with democracies through disinformation spread through social media is a global problem that needs a unified global approach. The Committee encourages its successor to continue to engage with its global peers on this issue and in pursuing accountability from global social media platforms.
On 27 November 2018, members of the International Grand Committee on Disinformation and ‘Fake News’ signed a declaration on principles governing the internet (see Appendix B). While the declaration has no legal effect, it articulates the starting principles that prioritise the strength and health of representative democracies and will form the basis of that Committee’s work.
The International Grand Committee plans to meet on a regular basis, and the Committee hopes its successor is able to engage with its work.
Legislatures play a fundamental role in ensuring the integrity of elections. Since 1983, this Committee has played a key role in advising the Parliament. As recent reports by counterpart committees have found, legislatures are increasingly confronting transnational challenges. These challenges threaten the integrity of national elections, but cannot be easily resolved by any one nation in isolation.
The ability to share information and solutions to transnational issues outside of traditional venues is a strength of the International Grand Committee. Although co-operative work between executive governments is ongoing, developing similar inter-parliamentary networks will be essential in tackling the international threats to democracy that will be addressed by International Grand Committee members.
It is clear to this Committee that this inquiry should continue into the 46th Parliament and the recommendations made by committees of international Parliaments be considered in an Australian context.
It may be appropriate for other parliamentary committees to also examine aspects of the spread of disinformation; however, given the identified threat to democratic processes, the Committee considers it a core issue for the Electoral Matters Committee.
The Committee notes that addressing disinformation raises issues of national intelligence and security. However, the Committee considers that, wherever possible, these matters must be publicly investigated and discussed under the core electoral principle of transparency. It is only by openly examining our democracy that we will build and maintain public faith in its processes.
Senator the Hon James McGrath
22 March 2019