Mis- and Disinformation in Australia and the United States: A Comparison

The following piece is an essay written for my Democratic Politics unit at university. What’s a political science or communications/journalism course without… yet another piece on misinformation on the internet? So I made a point of bashing the media too, you’re welcome! Reference list at the end.

Mis- and disinformation, fake news, propaganda – these are all terms that have been rejuvenated in public discourse over the past decade, particularly in the wake of the election campaign (and subsequent electoral victory) of Donald Trump in the United States in 2016. They often mean different things to different people, can overlap significantly, and (most importantly) can be spread by anyone, whether they are aware of it or not. The aim of this essay is to compare the prevalence and influence this influx of (mis)information has had on the democratic processes and systems of the United States and Australia, and how their respective systems and institutions have in turn influenced its dissemination.

Definitions of key terms will be discussed, followed by a look at how Americans and Australians consider and approach them. Focus will be given to how traditional (mainstream) media have been both vectors of this phenomenon and targets of accusations, alongside social media as a platform for the creation and dissemination of information within and beyond the mainstream conception of media. To finish, a discussion of regulatory and countering strategies will canvass possible avenues for combatting misinformation on social media and how successful these have been.

Mis- and disinformation are extremely broad terms used in a variety of contexts, and often interchangeably. Misinformation can be characterised as information that is “misleading or harmful”, while disinformation is information that is explicitly false (Sharevski et al., 2021). Another, perhaps overly lenient, way of separating the two is that misinformation campaigns are unintentional while disinformation campaigns are deliberate (Weber et al., 2022). This is a problematic way of comparing the two, but it does highlight the importance of the intent behind the sharing of false information and of how that information is shared (for example, through the use of bots or organically by genuine accounts on social media).

Another way to conceive of disinformation that is useful for analysing media is Lash’s “disinformation society” (Lash, 2002). For Lash, there is a distinction between knowledge as a product, such as mainstream media output or posting on social media, and knowledge gained through deliberation and reflection. In the age of social media, Mahailidis (2023) illustrates this idea with an analogy of cars: the promise of automobiles, like that of mediatisation, is to open up a wider world, but in reality both leave us isolated, interacting through brief, instantaneous moments rather than through genuine engagement with society. While this does not map exactly onto more recent and general definitions of mis- and disinformation, this lack of deliberation and reflection undoubtedly plays a critical role in their spread.

Manufactured Consent

Before comparing the role misinformation currently plays in US and Australian democracies, it is prudent to discuss the role of government and media in proliferating it prior to the advent of social media. Not only are the comparisons between the two countries on this front enlightening in their own right, but this history also feeds the growing lack of trust in the institutions of media and government that accelerates current misinformation cycles. Accusations of fake news may often be unsubstantiated or conspiratorial, but there are legitimate criticisms of the media that can and should be made to understand how this came about.

In Manufacturing Consent, Chomsky and Herman (1988) outline a propaganda model wherein the mass media act as proponents of state and capital power (namely, that of US interests). While the examples are older and US-centric, the book makes a clear and strong case that “disinformation” (misinformation does not seem to appear as often in older literature) is an integral part of traditional media fare. For instance, Edward Bernays (who wrote Propaganda in 1928) and others, working with the CIA and the United Fruit Company, swung US public opinion against the fairly moderate government of Jacobo Arbenz in Guatemala, leading to public approval of the coup against him in 1954 (Schlesinger & Kinzer, 2005). McNair (2018) writes that “disinformation is, of course, a form of military public relations”, referring primarily to coverage of the invasion of South Vietnam, which was initially suppressed and then endorsed, with an increased media presence, following coverage of the Gulf of Tonkin “incident”.

A more recent example involving Australia is the illegal invasion of Iraq in 2003 by the US and the “coalition of the willing”. Both governments justified their involvement in the war largely on fears that Saddam Hussein had (or was attempting to develop or obtain) weapons of mass destruction, and on his government’s supposed connections with al Qaeda and other Islamic extremists – claims heavily pushed by the governments, intelligence agencies and portions of the Western media with little evidence (Rice & Bartlett, 2006; Rane et al., 2014; The Age, 2003). As an aside, there had been WMDs (specifically chemical weapons) in Iraq in the 1980s, but at that time Hussein was an ally of the Reagan administration against Iran, so not only their existence but their use was endorsed by the US (Ozdemir, 2022).

While it dropped over time, US belief in the above claims was high and, once the invasion began, so was support for it, as public and (ostensibly) global opinion appeared to be on side (Kull et al., 2003; Boussios & Cole, 2010). In Australia, meanwhile, there was greater scepticism among the general public and (to an extent) the media towards the justifications for the war and the disinformation fed to them by the government than there was in the United States (Rice & Bartlett, 2006; Lewandowsky et al., 2005). Lewandowsky et al.’s (2005) study notes that Australians were more sensitive to media retractions of false claims, while Americans, even when aware of the retractions, remained strikingly committed to false memories and information.

These instances of state and media channelling of mis- and disinformation were obviously extremely harmful for the Iraqi population, and the consequences are still playing out today. They were also harmful to democracy in the invading countries, with governments knowingly lying to their constituents and the media playing along with one ear to public opinion. And this is by no means a complete list of such manipulation. Though the arenas and topics may differ from the accusations of “fake news” thrown at it over the past few years, the mainstream media does have a healthy tradition of omitting, framing and selectively choosing the information it presents. In terms of trust in media, government and democracy in general, there is not much to compare between Australia and the US in this regard – both are losing trust in these institutions over time (Edelman, 2022).

Social Media

Mis- and disinformation today, however, is more polarising and clearer-cut, often coming from sources beyond the mainstream institutions of media, government and even academia (with exceptions in the US, discussed below). Social media platforms, particularly Facebook, Twitter and YouTube, are major sites of contemporary misinformation and “alternative” narratives. Climate change, vaccines (including during the COVID-19 pandemic) and rigged elections are US favourites, and the industry and audience for such content are alarmingly prominent. While the first two of these also hold some mild sway in Australian politics, election conspiracy is a much more fringe concern here. Misinformation about specific events or proposals, however, is not infrequent.

For instance, Twitter has become a vital method of communication during natural disasters (Bruns & Liang, 2012). While it helped spread information and updates quickly during the tragic 2019–2020 bushfires, it also helped spread two pieces of misinformation – that the fires were sparked by arson, and that they were made worse by green (both party and group) activism preventing backburning (Weber et al., 2020). These claims picked up some traction on media networks such as Sky News, but social media was where the majority of the accusations and false information spread, with the help of bots and trolls.

Another case is the impending Voice referendum later this year, in which Australians will vote on whether First Nations peoples shall have an advisory body enshrined in the Constitution. Claims that such a body would divide the country by race, giving one group more privilege than another (imagine that), and that it would be a third house of Parliament with legislative or veto powers to undermine democracy, are becoming increasingly common (Courty, 2023; Davidson, 2022).

On the contrary, the Voice is quite a conservative approach: a merely advisory body that can be ignored, that will, in Anthony Albanese’s words, be “subservient” to the government, and that (genuine) critics say “gives [First Nations’ peoples] no rights of self-determination as outlined in the United Nations Declaration of the Rights of Indigenous People” (Stephenson, 2023; Twomey, 2023; Karp, 2023; Butler & Kolovos, 2023). In other words, ineffective pablum and hardly a radical proposal.

How the campaigns for and against the Voice will play out is yet to be seen, but social media will certainly have a role in spreading misleading and false information about many aspects of it.

While these bouts of misinformation on social media (occasionally picked up by the conservative side of politics) in Australia can have damaging effects, they are for the most part isolated. There are two core differences between Australia and the US in this regard that, when compared, highlight the US as drastically more at risk of dangerous levels of mis- and disinformation. The first is the global hegemony of the US, which makes it both more vulnerable and a more enticing target to swarm with potentially destabilising content. A notable example is the Russian bot campaigns that spread and amplified false information regarding the safety of COVID-19 vaccines produced by the American pharmaceutical industry – claims that were, regardless of one’s thoughts on the corporate dominance of healthcare, simply false (Warner et al., 2022; Jemielniak & Krempovych, 2021).

The second contrast is the scale of the audience and industry surrounding domestic misinformation, both from social media personalities and from the Republican Party following the ascension of Donald Trump after he began his presidential bid in 2015. YouTube has been a vital platform for far-right figures to build substantial audiences. Between just five people and groups in this community – Ben Shapiro, Steven Crowder, Candace Owens, Jordan Peterson and PragerU as prominent examples – there is a collective view count of over 6.3 billion. A number of political and socioeconomic factors have led to what some have called a “supply and demand” loop for this style of content in the US that simply does not exist in Australia (Munger & Phillips, 2022). That is not to say Australia does not experience similar – if perhaps less pronounced – socioeconomic struggles, or that we are immune to such content, but it remains a relatively fringe import with no established figures of our own beyond Rebel News’ Avi Yemini.

A more extreme reason for this contrast is the adoption of such content and tactics by the Republican Party once Donald Trump became its nominee in 2016. As a communicator utilising social media, Trump mobilised immense support and has since seemingly become a necessary part of the Republican Party’s survival. One of the major tipping points for misinformation becoming a staple of American political communication was the advent of “pizzagate”, a conspiracy theory that spread around the 2016 election and later fed into the QAnon movement. After WikiLeaks released emails from the DNC and from Clinton campaign chair John Podesta, it was suggested that high-profile Democrats were involved in the Satanic and sexual abuse of children (Bleakley, 2023). This was thoroughly debunked, the emails in truth mostly revealing biased discussion against the campaign of Bernie Sanders in favour of Clinton.

This was closely followed by Trump’s false claim that over three million illegal immigrants had voted for Clinton and that, despite winning the election through the Electoral College, he would otherwise also have won the popular vote (Thomas & Werner, 2017). It is worth noting that the reaction to much of Trump’s rhetoric among his supporters paralleled the reaction to Iraq war misinformation among Americans mentioned earlier. If claims were attributed to Trump, supporters were much more likely to believe them and, even if they did reconsider the facts once corrected, to vote for him regardless of the veracity of such claims (Swire et al., 2017).

In an Australian context, all of this is exceedingly outlandish. Beyond offensive and untrue statements by certain right-wing politicians and commentators on select topics, there has been no successful upheaval or questioning of the democratic process, nor a popular figure who has mobilised anything more than a fringe minor party. There is an argument to be made that most Australians are simply more sceptical of, and less interested in, such polarising political and cultural debates.

Regulation and Countering

As social media is an international phenomenon, the means of regulating misinformation on it are fairly similar in Australia and the US – that is, nigh on impossible. One difference, however, is the First Amendment’s protection of free speech in the US. Viewing COVID-19 misinformation through the lens of free speech, Sage and Yang (2022) note that the government itself is limited in its ability to regulate or punish it unless there are other legitimate legal grounds (and the political will) to do so. This constraint does not, however, necessarily extend to private businesses, which are not bound by the same constitutional limitations. In their context this relates to medical businesses and bodies, where the speech of an individual could infringe on the speech rights of the organisation itself, but the same logic would apply to any private business.

While Australia does not have an explicit “right” to free speech and expression in our Constitution, the High Court has held that “an implied freedom of political communication exists as an indispensable part of the system of representative and responsible government created by the Constitution. It operates as a freedom from government restraint, rather than a right conferred directly on individuals” (Australian Human Rights Commission, n.d.). There are, of course, laws and consequences regarding harmful and offensive speech, such as defamation or discrimination, but the state cannot in theory regulate political expression, even if it is false.

This brings us to social media, where the vast majority of mis- and disinformation comes from. Arguments about whether the state and private companies should have the power to regulate and moderate speech on social media platforms are beyond the scope of this piece; for now it will be assumed that, as is currently the case, they do, and are working to find a balance between free speech and the spread of false information. Private social media companies are able to set terms of service and codes of conduct on their platforms that generally cover content of an extremely offensive, violent or pornographic nature, but the free expression of political beliefs and opinions has been the subject of lengthy and often vicious debate.

In Australia, technology companies have adopted a Code of Practice on Disinformation and Misinformation (known as the “DIGI CoP”) at the government’s urging, based on recommendations by the Australian Competition and Consumer Commission (ACCC) and informed by similar EU measures (Hurcombe & Meese, 2022). While the government called for technology companies to introduce voluntary codes of conduct and tasked the Australian Communications and Media Authority (ACMA) with overseeing the code, it did not give ACMA any powers to enforce it. Hurcombe and Meese (2022) view some of its measures favourably, stating that its “diverse objectives better address the contemporary platform misinformation ecology, which includes not just coordinated and monetised misinformation campaigns but also misinformation spread by ordinary users.” They do, however, note that the code does not extend to media and political organisations, drawing attention to media outlets and politicians who have spread misinformation in political commentary and advertising (Hurcombe & Meese, 2022).

So what are some of the measures employed by social media companies, and do they work? This again comes back to trust – not just in institutions like the state and media, but also in the tech companies that must decide what does and does not constitute misinformation. In the US, many of the people involved in spreading mis- and disinformation are distrustful of mainstream narratives and of the companies they perceive as pushing them. Throughout the COVID-19 pandemic, Twitter and Facebook used fact-checking and accuracy warnings in an attempt to “nudge” users toward correct information, as well as promoting expert corrections through digital media literacy campaigns. In theory this sounds useful, but in practice this “soft moderation” can in fact bolster convictions and entrench “belief echoes” among those fighting against the mainstream (Ali, 2020; Sharevski et al., 2022; Vraga et al., 2020). The pandemic’s first year saw more misinformation than factual information spread on Twitter, for instance (Ali, 2020).

In comparison, Weber et al. (2022) note that once Australian bushfire disinformation and trolling had been picked up by social media users and, after reaching a sizeable level of impact, by news outlets, more people combatted the false information by linking to and sharing articles disproving it. Unlike in the American examples, where people believed claims based on who they were attributed to and were more inclined to continue supporting them, those labelled “Unaffiliated” in the Australian case were more likely to share content opposing the misinformation (Weber et al., 2022).

Mis- and disinformation are a global concern and can come from any avenue of society. Alongside other existential threats such as climate change, pandemics and war, the amount and risk of false information have grown dramatically over time, exacerbating those other issues. While Australia is by no means immune to misinformation or conspiracy, both in its traditional institutions and media and in the new social media paradigm, it is in a much better position than the US. This can largely be attributed to Australians’ more moderate and sceptical approach to false claims, and to the absence of extremely polarising figures and political stances with the ability and audience to mobilise such damaging campaigns. Conspiracy is also a fringe (if arguably growing) concern in Australia, whilst in the US half of the political system has fallen back on conspiracy as a means of retaining voters.

Much more needs to be done globally to combat mis- and disinformation, but how – and if – this will happen is something that will not be resolved any time soon. Australia needs to remain vigilant and learn lessons from what could be considered a significant collapse in US democracy over the past eight years. If our (still imperfect) liberal democracy is to avoid a similar fate to our Pacific ally, further investigation into the differences in approach to issues such as misinformation will be vital to prevent its expanded proliferation here.

Reference List

Ali, S. (2020). Combatting Against Covid-19 & Misinformation: A Systematic Review. Human Arenas, 5, 337-353.

Australian Human Rights Commission. (n.d.) Freedom of information, opinion and expression. https://humanrights.gov.au/our-work/rights-and-freedoms/freedom-information-opinion-and-expression.

Bleakley, P. (2023). Panic, pizza and mainstreaming the alt-right: A social media analysis of Pizzagate and the rise of the QAnon conspiracy. Current Sociology, 71(3), 509-525.

Boussios, E. G., & Cole, S. (2010). Americans’ Attitudes toward War: Trend Analysis of Public Opinion for the Iraq War. Journal of Applied Security Research, 5(2), 208-226.

Bruns, A., & Liang, Y. E. (2012). Tools and methods for capturing Twitter data during natural disasters. First Monday, 17(4).

Butler, J., & Kolovos, B. (2023, February 8). Greens’ First Nations conveners side with Lidia Thorpe and say they do not support voice to parliament. The Guardian. https://www.theguardian.com/australia-news/2023/feb/08/greens-first-nations-conveners-side-with-lidia-thorpe-and-say-they-do-not-support-voice-to-parliament.

Chomsky, N., & Herman, E. S. (1988). Manufacturing Consent: The Political Economy of the Mass Media. Vintage.

Courty, A. (2023, May 22). Peter Dutton says Indigenous Voice will ‘re-racialise’ the country in a speech Linda Burney describes as ‘disinformation’. ABC News. https://www.abc.net.au/news/2023-05-22/peter-dutton-says-indigenous-voice-will-re-racialise-the-country/102378700.

Davidson, R. (2022, September 14). Will the proposed Indigenous Voice to Parliament become a third chamber? RMIT. https://www.rmit.edu.au/news/factlab-meta/will-the-proposed-indigenous-voice-to-parliament-become-a-third-

Edelman. (2022). Edelman Trust Barometer. https://www.edelman.com/sites/g/files/aatuss191/files/2022-01/2022%20Edelman%20Trust%20Barometer%20FINAL_Jan25.pdf.

Hurcombe, E., & Meese, J. (2022). Australia’s DIGI Code: what can we learn from the EU experience? Australian Journal of Political Science, 57(3), 297-307.

Jemielniak, D., & Krempovych, Y. (2021). An analysis of AstraZeneca COVID-19 vaccine misinformation and fear mongering on Twitter. Public Health, 200, 4-6.

Karp, P. (2023, January 10). Albanese rejects Dutton’s call to legislate Indigenous voice before referendum. The Guardian. https://www.theguardian.com/australia-news/2023/jan/10/albanese-rejects-duttons-call-to-legislate-indigenous-voice-before-referendum.

Kull, S., Ramsay, C., & Lewis, E. (2003). Misperceptions, the Media, and the Iraq War. Political Science Quarterly, 118(4), 569-598.

Lash, S. (2002). Critique of Information. SAGE Publications.

Lewandowsky, S., Stritzke, W. G. K., Oberauer, K., & Morales, M. (2005). Memory for Fact, Fiction, and Misinformation: The Iraq War. Psychological Science, 16(3), 190-195.

McNair, B. (2018). An Introduction to Political Communication (6th Edition). Routledge.

Munger, K., & Phillips, J. (2022). Right-Wing YouTube: A Supply and Demand Perspective. The International Journal of Press/Politics, 27(1), 186-219.

Ozdemir, S. (2022). Iran-Iraq War: The Employment of Chemical Weapons. Journal of Iranian Studies, 6(1), 105-133.

Rane, H., Ewart, J., & Martinkus, J. (2014). Media Framing of the Muslim World: Conflicts, Crises and Contexts. Palgrave Macmillan.

Rice, B., & Bartlett, J. L. (2006). Legitimating organisational decisions: A study of media framing of the Australian Government’s legitimacy strategy and public opinion on the war in Iraq. Journal of Communication Management, 10(3), 274-286.

Sage, W. M., & Yang, T. (2022). Reducing “COVID-19 Misinformation” While Preserving Free Speech. Journal of the American Medical Association, 327(15), 1443-1444.

Schlesinger, S., & Kinzer, S. (2005). Bitter Fruit: The Story of the American Coup in Guatemala. Harvard University Press.

Sharevski, F., Alsaadi, R., Jachim, P., & Pieroni, E. (2022). Misinformation warnings: Twitter’s soft moderation effects on COVID-19 vaccine belief echoes. Computers & Security, 114.

Stephenson, S. (2023, February 23). No, the Voice isn’t a ‘radical’ change to our Constitution. The Conversation. https://theconversation.com/no-the-voice-isnt-a-radical-change-to-our-constitution-200056.

Swire, B., Berinsky, A. J., Lewandowsky, S., & Ecker, U. K. H. (2017). Processing political misinformation: comprehending the Trump phenomenon. Royal Society Open Science, 4(3), 160802.

Thomas, K., & Werner, E. (2017, January 23). AP report: Trump advances false claim that 3-5 million voted illegally. PBS. https://www.pbs.org/newshour/politics/ap-report-trump-advances-false-claim-3-5-million-voted-illegally.

Twomey, A. (2023, February 28). What happens if the government goes against the advice of the Voice to Parliament? The Conversation. https://theconversation.com/what-happens-if-the-government-goes-against-the-advice-of-the-voice-to-parliament-200517.

Vraga, E. K., Bode, L., & Tully, M. (2020). Creating News Literacy Messages to Enhance Expert Corrections of Misinformation on Twitter. Communication Research, 49(2), 245-267.

Warner, E. L., Barbati, J. L., Duncan, K. L., Yan, K., & Rains, S. A. (2022). Vaccine misinformation types and properties in Russian troll tweets. Vaccine, 40(6), 953-960.

Weber, D., Falzon, L., Mitchell, L., & Nasim, M. (2020). #ArsonEmergency and Australia’s “Black Summer”: Polarisation and Misinformation on Social Media. Disinformation in Open Online Media, 12259, 159-173.

Weber, D., Falzon, L., Mitchell, L., & Nasim, M. (2022). Promoting and countering misinformation during Australia’s 2019–2020 bushfires: a case study of polarisation. Social Network Analysis and Mining, 12(1), 64.
