Dealing with two pandemics: why is the ‘infodemic’ a threat?
Since the beginning of the COVID-19 pandemic in 2020, people have been turning to social media as a source of news about the global spread of the virus, hoping it would ease their anxiety (Gruzd et al. 2021, 1). However, the WHO warns that the affordances of social media, which enable fast and easy interactions between users and the shared production of knowledge, have given rise to another pandemic fueled by the first one: the ‘infodemic’ (United Nations 2020). Society’s urgent demand for explanations of the measures governments have taken to stop the spread of COVID-19 has led unreliable sources to publish excessive amounts of misinformation about the epidemic, which was found trending among social media users (Radu 2020, 1). Research on Facebook’s algorithms by Avaaz, a campaigning community fostering people-initiated change, has shown that ‘the top 10 websites spreading health misinformation generated almost four times as many views as equivalent content from the top 10 websites of leading health institutions’ (Avaaz 2020, 6). Academics highlight the importance of designing proper legal remedies to regulate the spread of misinformation online, since it is thought to undermine the democratic trust relationship between professional institutions and non-experts and thereby endanger individuals’ lives and health (Radu 2020, 1; Van Dijck and Alinejad 2020, 3).
The networked model of science communication
Van Dijck and Alinejad use the trust relations between the various parties and the flow of information between them to distinguish the institutional model of science communication from the networked one that enabled the emergence of the ‘infodemic’ (2020, 2). The former is based on a one-way path of communication in which respected and trusted communities of experts share specialized knowledge with laypeople (Van Dijck and Alinejad 2020, 2). The latter is established on social media, where all participants of the group exchange expertise on various topics. Ramsay calls the product of this type of knowledge production ‘a writerly text’ (2010, 8). Because all online users have the same ability to create content, such a text presents a wide variety of approaches, theories, and opinions. However, Ramsay notes a concerning issue: the affordances of contemporary online networks and the sheer variety of their content allow individuals to stay within one flow of thought at all times, so that they are easily confined to a particular representation of reality instead of being able to weigh the differences between all possible views (2010, 8). For example, within the networked model of science communication, algorithmic processes assess the importance of information by the number of reactions left by network participants and thus place the most popular content at the top of the page (Van Dijck and Alinejad 2020, 3; Avaaz 2020, 6). Therefore, in this case, non-experts influence which information is displayed more prominently than the rest.
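To make this mechanism concrete, the following minimal Python sketch illustrates ranking by engagement alone. It is purely hypothetical: the post fields, numbers, and ranking rule are my own assumptions for illustration, not Facebook’s actual code. The point is simply that when reactions are the only ranking signal, the credibility of the source plays no role in what surfaces first.

```python
# Illustrative sketch (assumed data and ranking rule, not an actual platform algorithm):
# posts are ordered solely by the reactions users leave, so source credibility
# has no influence on what appears at the top of the feed.

from dataclasses import dataclass


@dataclass
class Post:
    source: str     # who published the item (expert institution or not)
    topic: str
    reactions: int  # likes, shares, and comments left by network participants


def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts by engagement alone, ignoring the reliability of the source."""
    return sorted(posts, key=lambda p: p.reactions, reverse=True)


feed = [
    Post("health_institution", "vaccine guidance", reactions=12_000),
    Post("anonymous_blog", "miracle cure claim", reactions=45_000),
]

for post in rank_feed(feed):
    print(post.source, post.topic, post.reactions)
# The unverified post surfaces first simply because it attracted more reactions.
```

In this toy example, the non-expert audience effectively decides the ranking: whichever post they react to most is promoted, which is the dynamic the networked model of science communication describes.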
Health misinformation on Facebook
Unfortunately, as shown by Avaaz, health misinformation is usually highly controversial and therefore provokes strong user involvement and reaction (2020, 6). This was indeed captured by researchers analyzing the spread of misleading content about the COVID-19 pandemic on Facebook. Based on a sample of 82 websites that repeatedly shared deceitful news about the coronavirus and 42 Facebook pages that amplified this content on the platform, Avaaz estimated that these outlets together generated over 130 million interactions and 3.8 billion views on Facebook over a year (2020, 5).
Figure 1: ‘(…) the total estimated views of the 82 health misinformation spreading websites on Facebook over one year’ (Avaaz 2020, 5)
On this basis, the researchers identified Facebook as the main driving force behind the popularity and shareability of pandemic misinformation (Avaaz 2020, 4). They accused the platform’s algorithms of wrongly ascribing high importance to deceitful content by considering only users’ engagement with it, rather than its reliability and legitimacy (Avaaz 2020, 6). Academics point to the severity of the negative effects of the ‘infodemic’ to emphasize the need for proper legal remedies that would assign responsibility for the spread of misinformation (Graham-Harrison and Hern 2020). The untrustworthy content shared online included not only useless but also deceptive advice that posed serious threats to readers’ health and lives. For example, disinformation claiming that pure methanol was the most effective remedy against the virus was found to have caused 800 deaths and 60 cases of blindness among individuals who followed the ‘advice’ (Graham-Harrison and Hern 2020).
Why should we care?
The ‘infodemic’ is undermining society’s long-established trust in democratic institutions such as hospitals, universities, courts, and governments (Van Dijck and Alinejad 2020, 3). The affordances of online networks enable non-experts not only to create content and share knowledge but also to influence its virality on the platform by engaging with it (Avaaz 2020, 6). Non-experts therefore now have the same possibility to spread information as experts. However, lacking specialized knowledge, they frequently share misleading content. Unfortunately, as Ramsay observes, contemporary digital technologies enabling the participatory creation of knowledge are presented as ‘(…) more useful (and practical)’ than academic sources of knowledge. News found online is indeed easier to find and comprehend, but, unlike reliable information from a specialist, its legitimacy cannot easily be assessed. The spread of misinformation has tremendous effects on society, as shown by the false advice for fighting the coronavirus shared on Facebook (Graham-Harrison and Hern 2020). Regular individuals should not be expected to have expertise in many different disciplines. That specialized knowledge is outsourced to experts who, within a democracy, are granted society’s trust to act in the interest of the common good (Van Dijck and Alinejad 2020, 3). However, contemporary digital environments are dismantling these trust relations.
Bibliography:
Avaaz. 2020. ‘Facebook’s Algorithm: A Major Threat to Public Health’. Accessed 25 September 2021. https://secure.avaaz.org/campaign/en/facebook_threat_health/.
Graham-Harrison, Emma, and Alex Hern. 2020. ‘Facebook Funnelling Readers towards Covid Misinformation – Study’. The Guardian, 19 August 2020, sec. Technology. https://www.theguardian.com/technology/2020/aug/19/facebook-funnelling-readers-towards-covid-misinformation-study.
Gruzd, Anatoliy, Manlio De Domenico, Pier Luigi Sacco, and Sylvie Briand. 2021. ‘Studying the COVID-19 Infodemic at Scale’. Big Data & Society 8 (1): 20539517211021116. https://doi.org/10.1177/20539517211021115.
Radu, Roxana. 2020. ‘Fighting the “Infodemic”: Legal Responses to COVID-19 Disinformation’. Social Media + Society 6 (3): 2056305120948190. https://doi.org/10.1177/2056305120948190.
Ramsay, Stephen. 2010. ‘The Hermeneutics of Screwing Around; or What You Do with a Million Books’. 17 April 2010.
United Nations. 2020. ‘UN Tackles “Infodemic” of Misinformation and Cybercrime in COVID-19 Crisis’. Accessed 25 September 2021. https://www.un.org/en/un-coronavirus-communications-team/un-tackling-%E2%80%98infodemic%E2%80%99-misinformation-and-cybercrime-covid-19.
Van Dijck, José, and Donya Alinejad. 2020. ‘Social Media and Trust in Scientific Expertise: Debating the Covid-19 Pandemic in The Netherlands’. Social Media + Society 6 (4): 2056305120981057. https://doi.org/10.1177/2056305120981057.