Covid-19 Vaccine Fake News: Google as Co-Producer of Knowledge Production

October 3, 2021 | Jessica Blom

Abstract

Fake news and misinformation about Covid-19 vaccines spread and gain credibility through Google’s positive relevance feedback loops, because the average user treats Google as a neutral source of information. Google thereby becomes a co-producer of knowledge production. Users need digital literacy to engage critically with digital technologies and to weaken the positive relevance feedback loops that amplify misinformation.

Keywords: Covid-19 – vaccine – fake news – Google – knowledge production – positive relevance feedback loops – digital literacy

Introduction

Search engines such as Google and Google Scholar help users cope with the information overload on the internet. Because Google is the main search engine used around the world by both everyday users and researchers, the results it returns, and especially those on the first page, partly determine what users and researchers know and can access. The average user believes that the first page of search results contains the most relevant answers to the query and will therefore be the most informative (Van Dijck 2010, 577). Van Dijck (2010, 574) argues that search engines such as Google thereby become co-producers of knowledge production, as well as holders of considerable networked power (Van Dijck 2010, 575). Through Google’s influential ranking system, users mainly engage with the information that Google makes most visible. Because users mostly click on first-page results, and especially those at the top, a positive feedback loop is created that keeps the same results on top regardless of whether they contain misinformation: the search algorithm registers clicked links as relevant to the query. This is called relevance feedback (Shah 2021).
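To make this mechanism concrete, the sketch below is a minimal Python simulation of a click-driven ranking loop. It is purely illustrative and not Google’s actual, undisclosed algorithm: the result titles, base scores, click probabilities, and boost factor are all invented for the example.

```python
import random

# Toy search results for one query: base relevance scores and click counts.
# All titles and numbers are invented for illustration.
results = {
    "sensational claim": {"base": 0.55, "clicks": 0},
    "fact-check article": {"base": 0.50, "clicks": 0},
    "health authority page": {"base": 0.45, "clicks": 0},
}

def ranked(results):
    # Score = base relevance + a small boost per past click,
    # a crude stand-in for relevance feedback.
    return sorted(
        results,
        key=lambda r: results[r]["base"] + 0.01 * results[r]["clicks"],
        reverse=True,
    )

def simulate_click(order):
    # Most users click the top link; few scroll further down.
    return random.choices(order, weights=[0.7, 0.2, 0.1])[0]

random.seed(0)
for _ in range(1000):
    order = ranked(results)
    clicked = simulate_click(order)
    results[clicked]["clicks"] += 1  # the click is fed back as a relevance signal

print(ranked(results))
# The result that started on top attracts the most clicks and stays on top,
# regardless of its accuracy: a positive relevance feedback loop.
```

Because the simulated users mostly click whatever sits on top, the result that starts highest accumulates the most clicks and its lead only grows, which is exactly the loop described above.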

The notion of relevance no longer covers only genuinely relevant information: entertaining search results also count as relevant through these positive relevance feedback loops (Shah 2021). This becomes problematic when search results fill up with conspiracies, controversies, and sensationalism (Shah 2021), pushing conspiracy theories, for example, higher in the ranking while real news and facts disappear to the second page of Google. When a user Googles a controversial topic and is exposed to fake news and misinformation, the average, uncritical Google user often trusts what Google shows and thereby continues the positive relevance feedback loop of misinformation and fake news (Shah 2021).

It can thus be said that users’ knowledge is partly produced by the search results of engines such as Google. This raises questions about Google’s responsibility and accountability when misinformation or fake news spreads through its results. Because Google is profit-driven, there is danger in combining “corporate profit motive and individual susceptibility” (Shah 2021). Misinformation spreads easily because users are drawn to sensationalism and controversial news (Shah 2021), and the positive feedback loop keeps these results on top. This creates echo chambers, since the fake news stays on top until Google itself takes note of it and removes it (Tuters 2018, 60). Users are thus unconsciously and repeatedly confronted with fake news, which can make them doubt what is true and what is false. Moreover, exposure to misinformation is more likely among already skeptical users, which further sustains the positive relevance feedback loops (Guess et al. 2020).

Commentary

In 2020 and 2021, a great deal of misinformation and fake news arose, and continues to arise, around Covid-19 and the vaccines, because the positive feedback loop makes popular content more visible than informational content (Polizzi 2020). Assuming that most anti-vaxxers and Covid-19 conspiracists do their research via social media and search engines such as Google, these platforms and engines can be argued to indirectly facilitate the spread of misinformation and fake news on these topics. The most prominent vaccine misinformation to circulate and gain media attention claimed that the vaccine would make people magnetic and that a microchip would be implanted through the injection. Even though many doctors and researchers shared scientific evidence debunking these stories, many people still believe(d) them.

I would argue that this can partly be traced to the positive feedback loops in the search results. A suggestive query such as ‘the covid-19 vaccine makes you magnetic’, posed in the midst of the circulating conspiracy, could return results that appear to confirm the claim rather than debunk it, since the debunking results attract fewer clicks and therefore a weaker positive relevance feedback loop. An example of this lexical pull is sketched below. Together with filter bubbles and personalization on social media, this could have kept the misinformation circulating and gaining traction. Additionally, quarantine and social distancing during the pandemic reduced social contact and made people rely more heavily on the internet for information, so doubt about whether that information is true or false grows, because no counter-information is offered through social contacts (Polizzi 2020).
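As a purely hypothetical illustration of why a query phrased as the claim tends to surface confirming pages, the sketch below scores two invented page snippets with a naive bag-of-words overlap measure. Real search engines use far more sophisticated signals than this, but the lexical pull works in the same direction.

```python
# Hypothetical page snippets; the query wording mirrors the claim itself.
query = "covid-19 vaccine makes you magnetic"
pages = {
    "conspiracy forum post": "proof the covid-19 vaccine makes you magnetic watch this video",
    "fact-check article": "no, vaccines contain no magnetic ingredients and cannot magnetize your arm",
}

def term_overlap(query, text):
    # Naive relevance: count how many query terms appear in the page text.
    return len(set(query.lower().split()) & set(text.lower().split()))

for title, text in pages.items():
    print(title, term_overlap(query, text))
# The page that repeats the claim verbatim scores higher on this naive
# measure, so a query phrased as the claim tends to surface confirming pages.
```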

Google and other big tech companies have come under increasing pressure to deal with misinformation and fake news about Covid-19. The resulting increase in monitoring has led anti-vaccine groups to turn to platforms such as Telegram, which monitor their users and groups less closely (Giles and Spring 2021). With big tech companies increasingly taking responsibility, the discussion of responsibility and accountability shifts to other platforms and websites. Fake news and misinformation thus continue to spread; however, users now have to seek out this misinformation themselves rather than being confronted with it unknowingly on Google.

The problem is that once people start believing in conspiracies, fake news, and misinformation, it is hard for them to change their views and ideologies. They are constantly surrounded by these forms of “information” and by filter bubbles, and they become part of like-minded communities and social networks (Giles and Spring 2021). Additionally, average users trust search engines to be “neutral mediators of knowledge” (Van Dijck 2010, 574), because most users are unaware of the manipulation that goes into the ranking of search results and of the lack of transparency surrounding that manipulation (Van Dijck 2010, 582). This lack of transparency is dangerous, as users’ ability to distinguish fake news from facts may keep decreasing, making facts more malleable and open to interpretation (Tuters 2018, 59).

Conclusion

What people need in order to deal with misinformation and fake news is “information literacy enriched with analytical skills and critical judgement” (Van Dijck 2010, 588). With these skills, users could weaken the positive relevance feedback loop of fake news and misinformation. As misinformation circulates through search engines, digital literacy is needed so that users can “engage practically and critically with digital technologies” (Polizzi 2020). Additionally, search engines such as Google could prioritize exposing users to accurate information in their search results (Guess et al. 2020, 7804).

Bibliography

Dijck, José van. 2010. “Search Engines and the Production of Academic Knowledge.” International Journal of Cultural Studies 13 (6): 574–92. https://doi.org/10.1177/1367877910376582.

Giles, Christopher, and Marianna Spring. 2021. “How Anti-Vaxxers Are Living and Loving in a Covid World.” BBC News, August 12, 2021, sec. BBC Trending. Accessed September 23, 2021. https://www.bbc.com/news/blogs-trending-58146525.

Guess, Andrew M., Brendan Nyhan, Zachary O’Keeffe, and Jason Reifler. 2020. “The Sources and Correlates of Exposure to Vaccine-Related (Mis)Information Online.” Vaccine 38 (49): 7799–7805. https://doi.org/10.1016/j.vaccine.2020.10.018.

Polizzi, Gianfranco. 2020. “Fake News, Covid-19 and Digital Literacy: Do What the Experts Do – Gianfranco Polizzi.” Inforrm’s Blog. Accessed September 23, 2021. https://inforrm.org/2020/06/25/fake-news-covid-19-and-digital-literacy-do-what-the-experts-do-gianfranco-polizzi/.

Shah, Chirag. 2021. “It’s Not Just a Social Media Problem – How Search Engines Spread Misinformation.” The Conversation. Accessed September 23, 2021. http://theconversation.com/its-not-just-a-social-media-problem-how-search-engines-spread-misinformation-152155.

Tuters, Marc. 2018. “Fake News.” Krisis: Journal for Contemporary Philosophy 2018 (2): 59–61. https://dare.uva.nl/search?identifier=407ec75e-a5b2-431c-9240-f01b0dbfef9c.
