Luke Buckmaster and Tyson Wils, Social Policy
Governments and parliaments around the world have increasingly been focused on the issue of fake news, with some identifying it as a major threat to democratic and social institutions. How has Australia been responding to the issue, and what are some of the tensions associated with taking action in this area?
What is fake news?
The term ‘fake news’ is used in a variety of ways but, in this article, describes disinformation intentionally spread for political purposes via social media platforms such as Twitter, Facebook or WhatsApp. This can include information fabricated to mimic genuine news content and a range of other false material.
Fake news is said to have influenced the 2016 US presidential election and the UK's European Union membership (Brexit) referendum. More recently, the spectre of fake news is said to have loomed over various other elections around the world, including the 2019 Australian federal election. The spread of false information has also been connected with the persecution of minority groups in various countries.
It has been argued that the term ‘fake news’ should be avoided for various reasons, including the lack of an agreed definition and the politicisation of the term. Some also suggest that ‘fake news’ does not adequately convey the complexity of the various practices associated with the term, and propose instead that words such as ‘misinformation’ and ‘disinformation’ be used. Nevertheless, the political salience of the term ‘fake news’ means that it is likely to continue to be widely used, alongside other terms.
Is fake news new?
Deliberate dissemination of false information for political purposes is nothing new. A key difference between past and current modes of dissemination is the role of social networks using digital technologies. According to US researcher on digital disinformation, Claire Wardle:
Previous attempts to influence public opinion relied on ‘one-to-many’ broadcast technologies, but social networks allow ‘atoms’ of propaganda to be directly targeted at users who are more likely to accept and share a particular message. Once they inadvertently share a misleading or fabricated article, image, video or meme, the next person who sees it in their social feed probably trusts the original poster, and goes on to share it themselves. These ‘atoms’ then rocket through the information ecosystem at high speed powered by trusted peer-to-peer networks.
Digital technologies also increase opportunities for reaching specific audiences via the harvesting of personal data from social media and other digital platforms, which can enable better targeting of political advertising. One way such ‘dark advertising’ (or ‘micro-targeting’) differs from traditional advertising is that ads tailored for specific recipients can only be seen by the advertisement’s publisher and the target group. The best-known example of this type of activity is the harvesting of data from up to 87 million Facebook profiles by British political consultancy, Cambridge Analytica—over which Facebook was fined £500,000 by the British Information Commissioner’s Office.
A further important aspect of the newness of fake news arises from the way in which social media can enable foreign actors to interfere in the social and political processes of other countries. Russia’s alleged intervention in the 2016 US election via ‘troll’ or ‘sockpuppet’ accounts and botnets is probably the best-known example of this.
Fake news is increasingly seen as a threat to political legitimacy, democratic institutions and social cohesion by governments around the world. Governments have responded to the threat posed by digital disinformation in various ways, including:
- laws against spreading false information online
- blocking of social media and other online platforms
- laws against foreign interference in elections
- increased regulation of social media (advertising, bots)
- media literacy campaigns
- social media monitoring and
- promotion of factual content.
Some responses have been highly contentious and attracted criticism, particularly those legislating against false information, due to a perception that such measures are aimed at extending government control and silencing dissent.
Governments and parliaments across the world have also sought to investigate the extent and impact of fake news and possible responses. Over the past year, there have been significant reports tabled by the European Commission and committees of the Canadian and UK parliaments.
In early 2018 the European Commission set up a high-level group of experts (the HLEG) to provide advice on how to counter fake news and online disinformation. The HLEG report was completed in March 2018. The report says that any measures to counter the spread and influence of disinformation must not violate fundamental principles of freedom of expression—including the right to receive and share information—or undermine the technical functioning of the Internet. To this end, the HLEG supports targeted, non-regulatory approaches to problems of disinformation, rather than broad regulatory interventions (although it does not rule out the potential use of regulation in some instances).
A key recommendation of the HLEG report was that multiple stakeholders collaborate to ensure digital media companies are more transparent regarding news production and distribution. This has a number of aims, including supporting users to identify who is behind a certain type of information and their motivations and funding sources; and the use of ‘source transparency indicators’ to help users better discern between credible journalistic content and various kinds of disinformation. Other key recommendations include developing tools for empowering and educating public and professional users of platforms; and strategically implementing large-scale media and information literacy programs across schools, industry sectors and communities.
In March 2018 the Canadian House of Commons Standing Committee on Access to Information, Privacy and Ethics began an inquiry into ‘the breach of personal information involving Cambridge Analytica and Facebook’. The Canadian committee’s preliminary report contained a number of recommendations (mostly amendments to the Personal Information Protection and Electronic Documents Act). The Canadian committee’s final report (December 2018) included a number of potential regulatory responses to the problem of misinformation and disinformation on social media platforms. One recommendation was that social media companies be required to be more transparent about the processes behind online dissemination of material, including clear labelling of automated or algorithmically produced content. Other suggestions were that there should be obligations on social media companies to take down illegal content, including disinformation, and more investment by both social media companies and governments in digital literacy programs and public awareness campaigns.
The UK House of Commons Digital, Culture, Media and Sport Committee has examined the issue of disinformation and fake news since January 2017, focusing on issues such as the definition, role and legal liabilities of social media platforms; data misuse and targeting; and foreign interference in elections.
The UK committee tabled an interim report on 29 July 2018, and a final report in February 2019. Recommendations made by the committee include a compulsory code of ethics for technology companies overseen by an independent regulator with powers to launch legal action; changes to electoral communications laws to ensure transparency of political communications online; and obligations on social media companies to take down known sources of harmful content, including proven sources of disinformation.
International Grand Committee
As part of its efforts to promote cooperation across countries, the UK House of Commons Digital, Culture, Media and Sport Committee invited elected representatives from Argentina, Belgium, Brazil, Canada, France, Ireland, Latvia and Singapore to establish an International Grand Committee on Disinformation and ‘Fake News’ (International Grand Committee), which held its inaugural session in November 2018.
Following the session, members of the International Grand Committee signed a declaration on the ‘Principles of the Law Governing the Internet’, which ‘affirms the Parliamentarians’ commitment to the principles of transparency, accountability and the protection of representative democracy in regard to the internet’. Further sessions of the International Grand Committee are planned for 2019.
Australia is in the early stages of responding to fake news and disinformation. The main Australian Government response so far has been the creation of a taskforce to address threats to electoral integrity, though the foreign interference laws, which passed the Parliament in June 2018, also have some relevance to the issue. In addition, the Australian Electoral Commission (AEC) commenced a social media literacy campaign (discussed below) and other activities to coincide with the 2019 federal election.
There have also been several recent parliamentary inquiries and an inquiry by the Australian Competition and Consumer Commission (ACCC) examining issues related to fake news.
Electoral Integrity Assurance Taskforce
Prior to the July 2018 federal by-elections held across four states, the Turnbull Government established a multi-agency body, the Electoral Integrity Assurance Taskforce, to address risks to the integrity of the electoral system—particularly in relation to cyber interference. Agencies involved include the AEC, Department of Finance, Department of Home Affairs and the Australian Cyber Security Centre.
According to the Department of Home Affairs, the taskforce’s role is to provide the AEC with ‘technical advice and expertise’ in relation to cyber interference with electoral processes. A media report on the establishment of the taskforce suggested that its central concern is cybersecurity, including interference with the electoral roll or AEC systems. The media report added that:
… the potential use of disinformation and messaging, and any covert operations designed to disrupt the by-elections, would also be closely monitored, with the foreign interference ‘threat environment’ having escalated even within the two years since the last federal election in 2016.
Foreign interference laws
The National Security Legislation Amendment (Espionage and Foreign Interference) Act 2018 inserted new foreign interference offences into the Commonwealth Criminal Code. The elements of these foreign interference offences could arguably be applied to persons who ‘weaponise’ fake news in certain circumstances. The offences extend to persons who, on behalf of a foreign government, engage in deceptive or covert conduct intended to ‘influence a political or governmental process of the Commonwealth or a State or Territory or influence the exercise (whether or not in Australia) of an Australian democratic or political right or duty’.
Australian Electoral Commission
The AEC commenced an advertising campaign, Stop and consider, on 15 April 2019 on social media platforms (such as Facebook, Twitter and Instagram) to encourage voters to ‘carefully check the source of electoral communication they see or hear’ during the 2019 federal election campaign. Stop and consider was essentially a media literacy campaign that alerted voters to the possibility of ‘disinformation, or false information’ intended to influence their vote; and helped them ‘check the source of information so [they] can cast an informed vote’.
The AEC is reported to have sought to establish protocols for social media companies to address advertising on their platforms that contravenes Australia’s electoral laws (such as those relating to authorisation).
Select Committee on the Future of Public Interest Journalism (2018)
In May 2017 the Senate established the Select Committee on the Future of Public Interest Journalism. The impact that digital disruption has had on the business model of traditional media was a key focus of the inquiry. It also considered the effect that search engines and social media have had on public interest journalism, including through the spread of fake news. To this end, the committee sought to find out more about the ‘sources and motivation of fake news in Australia and elsewhere’.
The committee said it was ‘struck by the number of journalist jobs that have been lost in the last few years’ due to industry restructuring. It received evidence that such changes were, in some cases, leading to a decline in quality journalism and continuing to erode the public’s trust in media. A number of witnesses and submitters to the inquiry saw a correlation between the impact of these changes and the increased prevalence of fake news.
The Senate committee’s report made a number of recommendations in regard to supporting public interest journalism, including adequate levels of funding for the national broadcasting sector and an audit of current laws that impact on journalism. The Senate committee specifically referred to fake news in relation to education, proposing that the Commonwealth work with states and territories to see if the national curriculum can be strengthened—not only to enhance student awareness of fake news, but to improve digital media literacy skills more generally.
ACCC digital platforms inquiry
Since December 2017 the ACCC has been conducting an inquiry into the impacts of digital platforms on media competition in Australia, including the implications of these impacts for quality news and journalism.
The ACCC’s preliminary report discusses a range of intersecting factors that pose a risk of increasing audience exposure to less reliable news. These factors include commercial incentives for media companies to produce sensational ‘click bait’ stories, which are optimised for search engines and news aggregators and designed to go ‘viral’. Other potential problems include those associated with news feeds on digital platforms. Such feeds show users a mix of individual news stories without context regarding source credibility, making it difficult for users to discern the quality of information. In addition, platforms select and prioritise news on the basis of users’ past behaviours and preferences, so news stories that share the same perspectives may be repeatedly displayed to consumers.
Broadly, the ACCC proposes that further analysis be done into:
- how to make digital platforms more transparent so that users can recognise where information in their news feeds is coming from and assess the credentials of the media entities responsible for the stories surfaced and displayed
- whether additional measures are needed to develop media literacy programs in Australia and
- whether more mechanisms can be introduced to improve the capacity for news media businesses to produce news and journalistic content, including stories in the public interest.
In its response to the ACCC’s preliminary report, the Australian Communications and Media Authority stated that a new, single regulatory framework should be developed; and that—as the authority with primary responsibility for regulation of media and communications content in Australia—it is best placed ‘to take on the role of a single content regulator’.
Joint Standing Committee on Electoral Matters
The Parliamentary Joint Standing Committee on Electoral Matters (JSCEM) examined matters relating to fake news and disinformation in two inquiries during the 45th Parliament.
In the Report on the conduct of the 2016 federal election and matters related thereto, the JSCEM examined the question of whether ‘cyber-manipulation of elections—the interference of social media bots and foreign interference in electoral events’ had any impact on the 2016 federal election. The JSCEM noted that it had not received any evidence of any cyber-manipulation in the 2016 federal election, but that it is ‘essential that this issue be actively considered as part of Australian elections’.
The JSCEM’s interim recommendations included that:
- a permanent task force be established to prevent cyber-manipulation in Australian elections
- greater legal clarity be brought to the standing of social media services as ‘platform’ or ‘publisher’ and
- consideration be given to how media literacy can be incorporated into school education programs, including civics education.
The JSCEM also briefly examined ‘democracy and disinformation’ in a chapter of its report on the AEC’s 2017–18 annual report. The JSCEM raised concerns about media reports that Facebook was not meeting its obligations relating to authorisation rules under the Commonwealth Electoral Act, and that it had been ‘restricting independent scrutiny of advertising on its platform’.
In the concluding section on the AEC’s annual report, the JSCEM expressed hope that its successor committee in the 46th Parliament would be engaged with the work of the International Grand Committee, and consider recommendations made by committees of international parliaments.
A key challenge in responding to fake news is the complex, overlapping nature of the problem—which involves issues pertaining to cybersecurity; national security; privacy; electoral integrity; media and advertising standards; transparency; and media literacy. Any effective response will need to account for these various aspects.
Essentially, taking action to address fake news involves grappling with how to regulate the multitude of content shared and distributed (often rapidly, anonymously and from overseas) by users of social media. This raises questions such as:
- What responsibilities do social media companies have for content shared on their sites (are they platforms or publishers)?
- What are the appropriate mechanisms for ensuring these responsibilities are enforced?
- Can measures to combat fake news be designed to avoid unintentionally applying to genuine news?
- What responsibilities do users of social media have in relation to the material they share and consume?
- What role do governments have in developing greater information literacy among citizens?
Australia is at the early stages of its response to these questions. As the JSCEM has indicated, an effective response to fake news will most likely require engagement with work on this issue by overseas parliaments.
Joint Standing Committee on Electoral Matters, Inquiry into the Australian Electoral Commission annual report 2017–18: Status report, Commonwealth of Australia, Canberra, March 2019.
United Kingdom House of Commons Select Committee on Digital, Culture, Media and Sport, Disinformation and ‘fake news’: Final report, 8th report of Session 2017–19, HC 1791, February 2019.
Canada House of Commons Standing Committee on Access to Information, Privacy and Ethics, Democracy under threat: Risks and solutions in the era of disinformation and data monopoly, 1st Session, 42nd Parliament, December 2018.
© Commonwealth of Australia
With the exception of the Commonwealth Coat of Arms, and to the extent that copyright subsists in a third party, this publication, its logo and front page design are licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Australia licence.