Covid-19 Vaccine Conspiracy Campaigns on Twitter

I initially didn't plan to post this, but given the high grade it received and the fact that I'm posting other university essays here today, let's go! This essay covers misinformation and conspiracy theories surrounding Covid-19 and its vaccines on Twitter, as well as some of the (failed) attempts at combatting them. We truly did unleash a virtual portal to hell with the internet and social media, and no one has figured out how to fix it.

The digital age has seen exponential growth not just in individuals' access to information and data, but also in the sheer volume of information created and uploaded to the internet. Alongside this powerful resource in what is called the information society, there are also what Lash (2002) called the "unintended consequences" of an "overload of information", which produce a "disinformation society". While Lash was writing in a rather different context from what is considered dis- and misinformation today, one distinction he makes is worth noting: between information and knowledge gained through discourse and reflection, and knowledge as a product, such as through the media. Today this is most obvious on social media, and it is this lack of reflection and discourse that helps fuel disinformation (Lash, 2002). Misinformation is information that is "misleading or harmful", often for a particular audience and purpose; disinformation is simply false information, such as conspiracy theories (Sharevski et al., 2022).

On online networks, from media outlets to social media websites, misinformation is more likely to go viral and spreads faster than factual information (Lobato et al., 2020). Twitter, one of the largest social media platforms and one often referred to as the "town square", has been a battleground for many competing narratives filled with misinformation and conspiracy theories. These span wide-ranging topics, from mass shootings in the US being "hoaxes" to the anti-Semitic and monolithic QAnon movement. One of the most prolific subjects for misinformation is public health, particularly in the last few years with the advent of Covid-19 and the rapid development of a vaccine. Twitter is a major vector for these theories, and it has a history of serving as a platform for misinformation during past pandemics as well (Krittanawong et al., 2020).

Vaccine misinformation and conspiracy theories are not new. Since Edward Jenner created the first (rather primitive) vaccine in 1796, there has been concern about the safety and effectiveness of vaccines and about the role of the State in mandating and distributing them (Mnookin, 2011). While the latter point raises salient questions about civil liberties and State power, the health concerns have only become more prolific – and more unfounded. In a now-retracted 1998 study, Andrew Wakefield gave the modern anti-vax movement new life by claiming the MMR vaccine caused autism to develop in children (Motta & Stecula, 2021). Despite the study being thoroughly and repeatedly refuted, anti-vax sentiment persisted and fuelled the rise of conspiracy theories online and, as a result, an increase in vaccine hesitancy (Muric et al., 2021).

The most common threads of misinformation and conspiracy surrounding the Covid-19 vaccine (in a medical context) on Twitter concerned the ineffectiveness of the vaccine, the potential for the flu or Covid-19 vaccines to give recipients the virus or a positive test result, flu statistics being added to the Covid-19 numbers, and supposed links to 5G and microchip technologies used to track vaccinated individuals (Krittanawong et al., 2020). Others included misinformation about adverse reactions to vaccines, claims of high risk and death rates in children, and a "depopulation" effort by "global elites" (Muric et al., 2021; Warner et al., 2022).

Sources of Covid-19 and vaccine misinformation are varied, but there are some common threads here as well. In a study of the factors that predict who is willing to share misinformation related to Covid-19, that willingness was linked to political conservatism and conspiracy ideation, in line with previous research (Lobato et al., 2020). While some pro-vaccine messages also contained misinformation based on misunderstandings of pathogens and vaccines, or quoted it in order to discuss and refute it, most of the misinformation on Twitter came from, and was shared by, anti-vax and heavily politically conservative accounts and outlets (Muric et al., 2021; Warner et al., 2022; Lobato et al., 2020).

For the individual, misinformation like this tends to proliferate because conspiracies

“appeal to precedent, self-heroization, contempt for the benighted masses, a claim to be only asking “disturbing questions,” invariably exaggerating the status and expertise of supporters, […] circularity in logic, hydra-headedness in growing new arguments as soon as old ones are chopped off, and […] the exciting suggestion of persecution.”

(Aaronovitch, cited in Mnookin, 2011).

While conservative media outlets, such as Breitbart or Fox News, and newer sites, such as "vactruth", "childrenshealthdefense", or "humansarefree", make up the majority of websites accessed and linked for misinformation, there have also been bot account campaigns on Twitter to spread misinformation. Some of these have been linked to Russian networks with seemingly dual motives: the first, as with many bot campaigns, is simply to promote misinformation in order to sow unrest and division among populations; the second is a suggested financial interest on the part of the Russian government in discrediting certain vaccines, like AstraZeneca's, in favour of promoting its own vaccine product (Warner et al., 2022; Jemielniak & Krempovych, 2021).

Twitter and other social media platforms have undertaken a number of measures to combat misinformation, often with mixed results. In the first year of the pandemic, Twitter had the highest rate of Covid-19 misinformation of the major social media platforms, and tweets spreading it "were comparatively higher than accurate information by ordinary users" (Ali, 2020). There are strong debates about how platforms like Twitter should approach the spread of misinformation, and about the effectiveness of certain methods over others.

For instance, the most direct method of handling misinformation is simply removing it and the accounts posting it. Twitter removes and suspends bot accounts and accounts that break its terms of service, the most notable of which was US President Donald Trump, suspended in January 2021 following his 2020 election loss. While Twitter, as a private company, has the power to do this, blunt censorship in the hands of such a massive source of media raises legitimate concerns and is not, some argue, a viable solution for medical misinformation, particularly for new research that may itself be inaccurate or misunderstood (Niemiec, 2020).

Another method visible across most social media websites is fact checking and accuracy warnings that try to "nudge" readers towards correct information (Ali, 2020). While this sounds useful, some have called these tactics "soft moderation" and argue that they actually entrench the views of people who believe and spread misinformation about Covid-19 by creating "belief echoes" (Sharevski et al., 2022). Rather than changing their minds and accepting the correction, people already distrustful of the mainstream narrative and big tech companies take the warnings as proof that bolsters their own convictions. Attempts to use "expert corrections" to improve news literacy online are also shaky, further evidencing readers' cynicism and the lack of attention and reflection that comes with the large volumes of content being consumed (Vraga et al., 2020; Lash, 2002).

Covid-19 and vaccine misinformation and conspiracy theories are extremely prevalent online, and they gain a lot of traction on social media platforms, where they can spread faster than more accurate information. While companies like Twitter have tried many different methods of combatting this "infodemic", many of them appear inadequate to their intended goals, and in the worst cases they actually cement users' belief in the misinformation they consume. While further efforts to increase media and news literacy among the general public should not be ignored, a fresh approach may be needed by Twitter and others to tackle the issue of misinformation.

References

Ali, S. (2020). Combatting Against Covid-19 & Misinformation: A Systematic Review. Human Arenas, 5, 337-353.

Jemielniak, D., & Krempovych, Y. (2021). An analysis of AstraZeneca COVID-19 vaccine misinformation and fear mongering on Twitter. Public Health, 200, 4-6.

Krittanawong, C., Narasimhan, B., Virk, H. U. H., Narasimhan, H., Hahn, J., & Wang, Z. (2020). Misinformation Dissemination in Twitter in the COVID-19 Era. The American Journal of Medicine, 133(12), 1367-1369.

Lash, S. (2002). Critique of Information. SAGE Publications.

Lobato, E. J. C., Powell, M., Padilla, L. M. K., & Holbrook, C. (2020). Factors Predicting Willingness to Share COVID-19 Misinformation. Frontiers in Psychology, 11.

Mnookin, S. (2011). The Panic Virus: Fear, Myth and the Vaccination Debate. Black Inc.

Motta, M., & Stecula, D. (2021). Quantifying the effect of Wakefield et al. (1998) on skepticism about MMR vaccine safety in the U.S. PLOS ONE, 16(8).

Muric, G., Wu, Y., & Ferrara, E. (2021). COVID-19 Vaccine Hesitancy on Social Media: Building a Public Twitter Data Set of Antivaccine Content, Vaccine Misinformation, and Conspiracies. JMIR Public Health and Surveillance, 7(11), e30642.

Niemiec, E. (2020). COVID-19 and misinformation: Is censorship of social media a remedy to the spread of medical misinformation? EMBO Reports, 21(11).

Sharevski, F., Alsaadi, R., Jachim, P., & Pieroni, E. (2022). Misinformation warnings: Twitter's soft moderation effects on COVID-19 vaccine belief echoes. Computers & Security, 114.

Vraga, E. K., Bode, L., & Tully, M. (2020). Creating News Literacy Messages to Enhance Expert Corrections of Misinformation on Twitter. Communication Research, 49(2), 245-267.

Warner, E. L., Barbati, J. L., Duncan, K. L., Yan, K., & Rains, S. A. (2022). Vaccine misinformation types and properties in Russian troll tweets. Vaccine, 40(6), 953-960.
