by Karlijn Roex
The SARS-CoV-2 pandemic has posed serious challenges for science communication. Since the first months of the outbreak, scientific work has received unprecedented levels of social media attention. During the first three months of 2020, 6,162 COVID-19-related scientific publications had already received nearly 1.4 million mentions on Twitter. Researchers have played a vital role on this platform by efficiently serving wider audiences with the latest scientific updates.
However, studies warn that not all of the COVID-19-related information disseminated on social media is of high quality. Several scientific studies have been retracted after they were hastily rushed into publication. Due to the urgent nature of this global health crisis, every minute counts, and more public attention than ever is drawn to the fast science disseminated through ‘pre-prints’.
Pre-prints are short reports communicating early research results before they have been carefully reviewed by academic peers. To get your work published as a full scientific journal article, peer review is eventually required, but this obviously adds extra time between finishing the research and communicating the results to the outside world. Pre-prints therefore foster up-to-date academic debate and facilitate quick science-based interventions during crises.
However, the price of winning time is the risk of spreading ‘knowledge’ that later turns out to be flawed or low-quality. In this way, science risks contributing to the fast spread of COVID-19 misinformation – and losing its credibility. The diffusion of well-intentioned but sloppy research conclusions is perhaps the one feature that characterizes the COVID-19 infodemic. This is caused by what some have called the Covidisation of research. “It is as if everyone is compelled to write about COVID-19, regardless of their prior work or expertise,” Madhukar Pai complains. Several actors are pushing their sloppy research into a published format.
Superspreaders of misinformation
It makes sense to turn towards COVID-19. Many researchers have seen their lives and research projects turned upside down. They know they have research skills and want to help tackle this pandemic. Others, however, are mainly eager for the spotlight and want their name on the first drugs against COVID-19. Some did not even bother to complete their promised data collection at all, yet submitted a paper claiming they did.
This happened in a hydroxychloroquine (HCQ) study based on data provided by an obscure company whose employees had little or no experience in data or research. After a tremendously ambitious project that claimed to survey as many as 1,200 hospitals worldwide in record time, the study concluded that administering HCQ was related to increased death rates among hospitalized COVID-19 patients.
When the results were published in one of the largest medical journals, The Lancet, more than a hundred independent academics raised concerns about the study. Besides mistakes in the number of reported COVID-19 deaths, it turned out that several hospitals “that would have been crucial for the patient numbers in the database to be reached” had never been involved in the study. The study was then retracted. Nevertheless, its impact had been substantial: after the publication in The Lancet, the WHO and universities changed their policies on HCQ and stopped all running trials on the drug.
On top of confusion caused by scientific work, some tweets became so-called ‘superspreaders’ of misinformation. A notable example was the single tweet that made the #FilmYourHospital conspiracy go viral in March and April, especially in the United States. It encouraged people to film empty hospital rooms and hallways in order to ‘debunk’ the ‘coronavirus pandemic myth’. According to these conspiracy endorsers, the country was being shut down for reasons hidden from the public, not because of a deadly virus.
The digital stockpiling of such videos fueled false denial of the fact that, elsewhere in hospitals, intensive care personnel were being confronted with an unmanageable influx of very sick patients and distressing choices of whom to help and whom to sacrifice. Indeed, such videos only further encouraged the kind of behaviour that would increase hospitalization and mortality rates. What can researchers and others do to protect themselves and others against misinformation?
It is clear that we are not only facing a global pandemic, but also a global infodemic. First, let’s get the terminology straight: what is an infodemic? It is an overabundance of COVID-19-related information – accurate and not – that overflows people’s timelines and overloads their minds, allowing nonsensical news to spread fast as well: from deliberately false studies to 5G conspiracy theories to sensational panic headlines accidentally fueling misunderstandings.
The consequences of this are serious. Shocking headlines stick more firmly in our memories and are more frequently retweeted. Exemplary were the headlines we saw in June and July, spreading pessimistic predictions about vaccine efficacy based on research showing quickly fading antibodies. Even when some immunologists refuted the journalists’ strong conclusions, the myth remained vivid among many laypeople. The same applies to other myths that cause people to believe that vaccines, once ready, are doomed to be ineffective. Months after initial fears of a rapidly and significantly mutating SARS-CoV-2 virus were sparked and then falsified (at least for now), many people still believe them today.
This is nothing new. Producing false or even nonsensical assertions typically costs less energy than refuting them – as programmer Alberto Brandolini warned. In the book Calling Bullshit, freshly published this month, biologist Carl Bergstrom and information scientist Jevin West mention the famous example of the ‘vaccines cause autism’ myth. This myth, sparked by one single low-quality study, has proved very resistant to decades of falsifying research.
It is too easy to attribute the stubborn persistence of the myth to the idiosyncrasies of ‘anti-vaxxers’. The COVID-19 pandemic shows that all people are susceptible to misinformation. Its omnipresence and, especially, its resistance to later correction are fueled by broader social, psychological and technological processes – not by particular groups of people being ‘idiots’ or ‘nuts’. That explanation is just a lazy denial of the deeper roots of infodemics. As scientists, we need to research these roots, as well as study interventions that work.
The danger is that when I google ‘SARS-CoV-2 mutates quickly’, I will receive supporting articles – and the same happens when I google the opposite. When I rephrase my search request as a question, ‘Does SARS-CoV-2 mutate quickly or slowly?’, the articles that appear on top are not necessarily the most credible ones. Which information we tend to believe will likely depend more on our psychological coping mechanisms. Some people, overwhelmed by a catastrophic impact on global life that they have never experienced before, may adopt the belief: ‘Deep inside, one knows that we are all more doomed than we dare to believe’. These people are then very sensitive to panic headlines, while being more skeptical towards any positive updates.
Research shows that people generally favor negative information over positive, and emotional stories over technical elaborations. Informational infrastructures should be designed to address these biases. Right now, instead, they amplify our biases by feeding us more of what we have liked.
The importance of debunking myths
So far, I have used the word ‘misinformation’ to refer to information that is incorrect by accident. Information that is intentionally incorrect is called ‘disinformation’ in the infodemics literature. At the beginning of this month, an obscure website run by unknown authors caught the attention of several scholars. The website mimics the style of a genuine scientific study and could easily mislead laypeople into believing that the previously mentioned drug HCQ is effective against COVID-19.
The website was called out by the earlier-mentioned biologist Carl Bergstrom. As someone who has conducted cross-country time-series analyses on public health myself, I was immediately cautious when the authors claimed to have conducted a ‘country-randomised trial’ with two billion people across more than 15 countries. This would mean that one had managed to randomly assign many countries to the experimental condition (receiving the medicine) while many others were assigned to the control condition (not receiving it). Such a trial would be tremendously expensive and time-consuming, and would require difficult negotiations with as many as 15 national governments.
Even those who are racing towards a COVID-19 vaccine with large budgets and governmental collaboration have not been capable of such a thing. Nor will they be. It is virtually impossible that the authors of the HCQ website were able to conduct this study. In addition, those familiar with the COVID-19 literature on HCQ suspected that the study ‘cherry-picked’ countries to obtain the desired result. The fact that the authors are nowhere identifiable, and therefore cannot be held accountable for what they wrote, further feeds the suspicion that this website constitutes an attempt at disinformation.
For now, I will stick to the term ‘misinformation’, because in most cases it is impossible to know whether falsehoods are being spread on purpose. Regardless of the intentionality behind them, we need scientists such as Carl Bergstrom who publicly fact-check circulating information.
Others are doing a great job at this too, such as Yale immunologist Akiko Iwasaki. Other scientists have been helping platforms such as Wikipedia keep their COVID-19-related information up to date and credible. There may be many more best practices to learn from. Therefore, together with information scientist Giovanni Colavizza, I am conducting a study surveying researchers’ information behaviour during this pandemic. We are about to finish data collection for our first survey wave and will start the second in late September. By that time, we expect to publish some early results.
Other infodemic scholars have been monitoring this issue closely as well. For instance, Manlio De Domenico, a complex systems researcher, has founded the Covid19 Infodemics Observatory. There, one can see the gravity of the infodemic, based on an analysis of more than 100 million tweets worldwide. A preprint of the research was published here. Interestingly, infodemic risk decreases during a local resurgence of COVID-19: a new epidemic wave may re-alert local authorities and audiences to the importance of using credible information.
The time to address misinformation is now
Experts say that vaccines or quick at-home test kits will be the fastest way out of this pandemic. Likely it will be a combination of the two. Yet we see an upsurge of conspiracy theories that increase distrust towards these very solutions: vaccines and tests. Vaccination has traditionally been among the areas most plagued by misinformation. The problem was particularly large in Italy, a country that witnessed some substantial big pharma scandals and consequently saw a growing anti-vaccination movement. Then the virologist Roberto Burioni went viral – no pun intended – after finishing a talk show with one downright take-home statement cutting off all the bullshit that had been said that evening.
His small act of rebellion against a talk-show culture dominated by anti-vax entertainment celebrities was rewarded. Burioni has remained an important public voice against vaccine hesitancy. In my opinion, we urgently need a Burioni right now. Some popular polls estimated that only half to two-thirds of the American population would want to get vaccinated against COVID-19. Meanwhile, that nation is clearly being ravaged by this virus in several respects. In Italy, one of the first and most severely hit countries, 41% are still hesitant towards vaccination. In the Netherlands, where I live, popular polls estimated that slightly more than 80% would accept vaccination; others estimate that this is only 60%.
Besides those who refuse vaccination based on conspiracy ideas, there is a large group that worries about safety being sacrificed to meet a coveted deadline somewhere in 2021. This worry makes sense, but often reflects a lack of information about how exactly COVID-19 vaccine developers have won time. Transparency is therefore very important. Moreover, many people’s vaccine hesitancy is rooted in a concern that is valid and needs to be addressed: the profit-maximizing motives of Big Pharma. Indeed, these motives could slow down the entire undertaking of vaccinating the world population against this deadly virus. In this respect, vaccine skeptics and vaccine promoters have a common enemy: big business.
All in all: the time to address misinformation is now, not later. Once we have found and produced the medical technologies to end the pandemic, we would not want to remain caught in it any longer just because of infodemics. Fortunately, just as with vaccine development, we can learn from Ebola and other previous outbreaks. Last year provided us with ‘Anti/Vax: Reframing the Vaccination Controversy’, a book by literature scholar Bernice Hausman that contains much-needed lessons for fighting this pandemic. Anti-vax is a social phenomenon. We need to seriously ask: how does our society fuel the anti-vax phenomenon? What responsibility do we all have in maintaining it?
There is a body of scientific literature on increasing vaccine acceptance. For instance, a 2018 psychological experiment showed that pro-vaccination content was communicated much more effectively when it put human experiences (e.g. the horrible consequences of not vaccinating against a preventable disease) in the spotlight rather than statistics. Moreover, current psychological research shows some benefits of letting people play a humorous ‘fake news game’: the game makes people less vulnerable to being infected by misinformation. This works a lot better than paternalistically ‘educating’ people we treat as fanatic ‘idiots’. These times need more such creative innovations, and I am happy to see talented people working on them.
Dr. Karlijn Roex is a sociologist and science studies scholar who investigates infodemics and health discourses. She is based at the University of Amsterdam and Leiden University. See her research project here and her personal website here.
Copyright title image © Arco Mul
Image in text by iQoncept via Shutterstock
Carl T. Bergstrom and Jevin D. West, Calling Bullshit: The Art of Skepticism in a Data-Driven World, Random House (2020)
Bernice L. Hausman, Anti/Vax: Reframing the Vaccination Controversy (The Culture and Politics of Health Care Work), Cornell University Press (2019)