
As scientists report their research on SARS-CoV-2, the disease it causes, and potential treatments, vaccines, and other measures to slow COVID-19’s spread, the public consumes and shares it. But not all of that sharing is accurate, and some of what is shared is spun to support false narratives. In May 2020, echoing earlier comments from World Health Organization Director-General Tedros Adhanom Ghebreyesus, UN Secretary-General António Guterres said the world was experiencing a “pandemic of misinformation.” Indeed, according to a recent survey from the Kaiser Family Foundation, 78 percent of adults reported having heard one or more false statements about COVID-19.

While SARS-CoV-2 spreads via social contact, erroneous information spreads largely via social media. From promoting unproven treatments such as hydroxychloroquine to misrepresenting the severity of the pandemic, inaccurate information shared on Facebook, Twitter, and other sites has encouraged dangerous behavior and, according to some, deepened social and political divides. But these same platforms also serve as primary channels for scientists invested in correcting the record and educating the public. Twitter, for example, has grown substantially since the pandemic began, with daily active users climbing by 20 percent, from 166 million in early 2020 to 199 million in the first part of 2021, and scientists are no strangers to the platform. While there are no official estimates, one 2017 study identified 45,867 Twitter users whose names and profile descriptions contained job titles associated with scientific careers, such as microbiologist or epidemiologist. In the face of growing public attention on COVID-19 science, some of these users are now seeing their followings surge. Life science researchers who spoke with The Scientist say they see this as an opportunity to disseminate accurate information to a wide audience while ensuring that scientists play a central part in the discourse on issues related to their areas of expertise.

“[It’s] something the academic community has to do,” says Ionica Smeets, a science communication expert at Leiden University in the Netherlands. “If you withdraw from the dialogue, you let the misinformation win.”

See “Opinion: Scientists Must Battle the Disinformation Pandemic”

A public channel

Cecile Janssens, an epidemiologist at the Emory University Rollins School of Public Health, has been teaching a course for undergraduate students on how to understand the science behind the news since 2013. The inspiration for the course was a study that had been covered by the press and made the rounds on social media, but for all the wrong reasons. The study claimed that American women were spending less time on housekeeping than women had a half century earlier, and that this shift contributed to an increasing prevalence of obesity among these women. This conclusion was based on self-reports about daily activities, Janssens quickly noticed; there were no data on actual weight or other potentially relevant factors such as diet.

If you withdraw from the dialogue, you let the misinformation win.

—Ionica Smeets, Leiden University

The study was covered “totally wrong in the news, and it was everywhere,” she recalls—including in The New York Times and Medscape, a site targeted to clinicians. No one covering the study, it seemed, was taking the time to properly evaluate it, and she realized that such critical thinking was often lacking among people, including her own students, who were consuming the news reports. So she crafted a curriculum on how to carefully examine science covered in the news; she has been teaching that course for more than eight years and is now writing a book on the topic. In parallel to the course, she regularly tweets about research methods with the goal of “bringing poor methodology in the spotlight.” Social media is invaluable as a platform for science communication, Janssens says, adding that she has experienced firsthand how engaging with people to answer their questions can be mutually satisfying. “There is an audience out there who really follows the scientists because of what scientists have to say,” she says. “The public is craving for better information.”

Baylor College of Medicine’s Peter Hotez agrees that people are eager to understand science, especially as it pertains to COVID-19, and that social media is an important avenue in a broader effort to provide science news that is not only accurate, but well explained. He regularly tweets pandemic-related research and commentary in addition to making appearances on news programs and otherwise making himself available to journalists. There’s an antiquated view “that says you have to talk about science to the American people like they’re in the fourth grade or sixth grade,” says Hotez, who codirects the Texas Children’s Hospital Center for Vaccine Development and has been working on coronavirus vaccines for a decade. “I’ve always maintained that the American people have a far deeper appreciation of complexity and nuance than we give [them] credit for.” When the pandemic struck, Hotez says, he not only launched a research program to create a COVID-19 vaccine for low-income countries—made in collaboration with companies in India and elsewhere, that vaccine has recently undergone Phase 3 trials—but also held steady on his stated mission “to basically provide the background, nuance, and complexity that I think people appreciate.”

This approach also applies to other areas of science that have historically been misconstrued by groups with ulterior motives, adds Hotez, who has a long-standing interest in researching coordinated campaigns to spread disinformation (misinformation shared with the intention to deceive) and the groups behind them. “[I’m] using [the pandemic] as a teachable moment about the disinformation movement,” he says, having recognized early on that “the anti-vaccine people and anti-science people would see this as their moment and seize upon it to rev up conspiracy theories. . . . It just didn’t come out of nowhere. This has been building.”

Hotez is one of several scientists who have seen their Twitter audiences balloon since the pandemic started; his following swelled from what he estimates to have been around 30,000 before COVID-19 to 254,000 as of mid-December. Similarly, Kent State University infectious disease epidemiologist Tara Smith now has more than 117,000 followers, up from what she estimates to be around 25,000 or 30,000 prior to the pandemic. University of Washington epidemiologist Trevor Bedford, who has tweeted about his own work using genetics to track SARS-CoV-2 spread and about others’ COVID-19 research, is now followed by more than 387,000 Twitter users, and virologist Christian Drosten, director of Charité – Universitätsmedizin Berlin’s Institute of Virology, by more than 877,000.

“COVID became such an overarching, massive issue that it’s hard to ignore it and not be part of the conversation,” says Gordon Pennycook, a behavioral scientist at the University of Regina in Canada. “[It] put a lot of people in a position they weren’t in before, where it matters what you share.”


Better Approaches for Correcting the Record

Social media platforms have launched campaigns to stem the spread of misinformation during the pandemic, but their efforts have drawn widespread criticism. Last fall, site executives—Mark Zuckerberg, CEO of Facebook, and Jack Dorsey, then-CEO of Twitter—appeared before the Senate Commerce Committee to answer questions about how they had handled misinformation on their platforms.


Some researchers are studying how and why misinformation spreads, in hopes of coming up with better strategies for stopping it. For example, Pennycook, along with David Rand, a cognitive scientist at MIT’s Sloan School of Management, and colleagues, has found that simply sending Twitter users who had previously shared misinformation a message asking them to assess the accuracy of particular headlines seemed to reduce how much misinformation they subsequently shared. “Often they’re sharing it because it seems important or other people might like it,” Pennycook surmises. “Whether it’s true, they may not be thinking that much about it.”

Rand and Pennycook have recently partnered with Google’s technology incubator Jigsaw to develop approaches that social media companies could implement to remind users to think about accuracy and thereby improve the quality of information being shared on the platforms. (Rand’s work has been funded by gifts from Google and Facebook, and he serves on the advisory board for Birdwatch, Twitter’s crowdsourced fact-checking program.) Any positive effect in this regard should be amplified by network effects, the researchers note—if fewer people tweet a piece of false information, fewer followers will see it and have the opportunity to retweet it.
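The compounding logic behind that network effect can be illustrated with a toy branching-process simulation. This is a minimal sketch under made-up assumptions (every user has the same number of followers and the same probability of resharing; the numbers are illustrative and not drawn from the researchers’ studies), but it shows how a modest drop in each individual’s willingness to share shrinks the whole cascade:

```python
import random

def cascade_size(share_prob, followers=5, max_depth=6, seed=None):
    """Size of one simulated retweet cascade: each sharer exposes
    `followers` users, each of whom reshares with probability `share_prob`."""
    rng = random.Random(seed)

    def spread(depth):
        if depth == max_depth:  # stop after a fixed number of "generations"
            return 0
        shares = 0
        for _ in range(followers):
            if rng.random() < share_prob:
                shares += 1 + spread(depth + 1)
        return shares

    return spread(0)

def average_cascade(share_prob, trials=2000):
    # Average over many seeded runs so the result is reproducible.
    return sum(cascade_size(share_prob, seed=i) for i in range(trials)) / trials

# A small per-user reduction in share probability compounds across
# generations of followers, so total reach falls much faster than
# the individual change alone would suggest.
for p in (0.25, 0.20, 0.15):
    print(f"share_prob={p}: average cascade size {average_cascade(p):.1f}")
```

Because each generation multiplies the previous one, nudging the per-user share probability below the point where one share begets one more is what collapses the cascade, which is why interventions aimed at individual sharing decisions can have outsized aggregate effects.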

“I think it’s interesting to think about what you can do [to slow] the spread of misinformation online,” says Smeets. “How can you do that, not as an individual but as a system?”

A hostile environment

While Twitter offers a direct channel to the general public, it is frequently a combative environment, as many scientists have experienced firsthand, particularly during the pandemic. “The fact that everyone’s engaging on the same topic that has relevance to everybody but there are large differences of opinion . . . that’s going to lead to more animosity,” says Pennycook, who himself uses Twitter mostly for communicating with other academics but has researched how scientists can best stem misinformation on Twitter. (See “Better Approaches for Correcting the Record” above.) Indeed, the hostility that has become a hallmark of the platform has driven some researchers away from Twitter.

In one high-profile example, the Scripps Research Institute virologist Kristian Andersen deactivated his Twitter account last June after two news outlets published an email he had written to National Institute of Allergy and Infectious Diseases (NIAID) Director Anthony Fauci mentioning the possibility that SARS-CoV-2 had been genetically manipulated, fueling speculation that the pandemic virus had been engineered in a lab. In his attempts to respond on Twitter, Andersen told The New York Times, “I found that information and comments I posted were being taken out of context or misrepresented to push false narratives, in particular about the origins of SARS-CoV-2.”

The Twitter storm that surrounded the release of Andersen’s email, among other government communications, was “really evidence of how this has gone off the rails,” says human population geneticist Jedidiah Carlson. A few years ago, when he was a postdoc at the University of Washington, Carlson took a closer look at the users sharing and talking about bioRxiv preprints covering various topics on Twitter and found that some discussions were dominated by communities of science denialists and conspiracy theorists. “We found basically 10 percent of preprints had a significant fraction of their audience that were associated with far-right ideology, and that would be even higher for human genetics and neuroscience,” says Carlson, now a bioinformatics scientist at biotech company TwinStrand Biosciences. (He tells The Scientist that he is speaking in a personal capacity.) “In extreme cases, some preprints had more than fifty percent of their audience [that] were coming from these communities.” Based on these findings, he surmises that “scientists never stood a chance of maintaining control over the conversation if it was played out on Twitter, for those controversial, conspiratorial aspects of the pandemic.” 

Even for scientists who are careful about what they make public, there’s always a risk of online harassment, including over subject matter that is decidedly less sensitive than the pandemic’s origins. “There can be threats. Especially in the last year and a half or so, the threats have escalated,” says Kent State’s Smith. And overall, “I think there’s a lot more trolling now. Just contrarian people—or bots too, who knows—that follow you just to argue and disagree. . . . Now almost everything I post it seems like somebody just comes out of the woodwork to argue with everything.”

People have their own thresholds for what they can handle in terms of battling vocal critics and the misinformation they spread. But for researchers who aren’t deterred by the current climate on social media, a new level of caution is warranted, says Smith. “Now not only is the public paying attention, but lots of us are followed by more journalists. So when we put out a tweet that is maybe meant for colleagues . . . that two years ago might not [have gotten] any press . . . now we’ve seen those things can explode.” She recalls a time when she included “Ugh” in a tweet responding to the Texas governor’s decision in March to end statewide social distancing restrictions and mask mandates. “That got picked up by The New York Times. . . . I did get some grief about that” from colleagues, says Smith. While that incident was relatively harmless, she adds, it’s a reminder that communications on Twitter are publicly available.  “I do consider my tweets a little more than I would have two years ago.” 

Now almost everything I post it seems like somebody just comes out of the woodwork to argue with everything.

—Tara Smith, Kent State University

Heightened public interest in science over the past couple of years has had the opposite effect on other researchers, notes Janssens. She says she’s noticed more tweets from some scientists that have “a clickbait style,” crafted to generate likes and shares rather than to present information accurately. Carlson agrees that the pandemic has driven some researchers to alter their tone. “Twitter incentivizes being confrontational, being a devil’s advocate, being opinionated and snarky and funny,” says Carlson. “I think with those early days of the pandemic, where scientists viewed themselves and were viewed by the public in many ways the heroes that would save us . . . there was that lure of celebrity and influence in our culture. Scientists aren’t immune to that. . . . I think a lot of scientists with credentials remotely adjacent to biology had this compulsion to be the ‘expert in the room’ on social media.”

Janssens agrees that Twitter is now flooded with “experts,” noting that infectious disease epidemiology is “a totally different ballgame” than general epidemiology. She adds that she usually refrains from tweeting about COVID-19. “I’m an epidemiologist, but I don’t know anything about COVID.”

See “Opinion: Being Scientists Doesn’t Make Us Science Communicators”

But when communicating within their areas of expertise, researchers who spoke with The Scientist reiterate that Twitter is an essential tool in the fight against misinformation, and the increased attention that scientists are now getting makes the platform that much more powerful. As for whether the new followers will stick around as the pandemic is brought under control, Smith says it’s hard to say. “During previous epidemics (Ebola, Zika, MERS) I’d gotten a bump in followers that never seemed to decrease over time, so I suspect some of the larger audiences will be here to stay,” she says. “But I think the constant presence in the news cycle will decrease once COVID-19 no longer dominates. I’m looking forward to spending more time on other topics.” 


As scientists take to Twitter to communicate among themselves and with the public, they must live up to their titles, says Cecile Janssens of the Emory University Rollins School of Public Health. Simply put: “If you want to be treated like a scientist, then you should tweet like a scientist,” she says. Janssens and others who spoke with The Scientist give some tips for how to maintain a professional profile and a productive interaction on Twitter.


For sharing research findings, consider a connected series of tweets, says David Rand of MIT’s Sloan School of Management. “[T]he Twitter thread is an ideal intermediate between an abstract (which is really short and high-level) and the full paper (which takes a big time investment to read),” he tells The Scientist in an email. “[T]hese threads help make the science much more accessible.”


Don’t just preach, says Janssens—engage in a back and forth. Even when someone seems to challenge your position, explain your reasoning, she adds. “Surprisingly, people appreciate that.” Ionica Smeets of Leiden University in the Netherlands agrees that conversational interactions are important. “It’s also a way of building trust, if you are having a discussion and responding to people’s concerns.”


Don’t engage with the segment of the public that is pushing conspiracy theories or other false narratives, or accusing scientists of lying, says Smeets. “You’re not going to get anywhere there no matter how much energy you put into it.” Peter Hotez of Baylor College of Medicine and Texas Children’s Hospital Center for Vaccine Development is of the same mind. “I try to be very strategic in how I use Twitter. I try not to get into Twitter fights.”


If you’re going to tweet about science, let people know where you’re coming from, says Tara Smith of Kent State University. “There are lots of people who had never published on infectious disease before, and now they’re COVID experts and that is concerning to me.” She keeps a series of tweets pinned on her profile that describe her own credentials, so people can learn about her background and assess “where my qualifications lie and, sometimes more importantly, where they don’t.” 


Avoid tweets that read like clickbait, or content simply designed to generate likes and shares, says Janssens. “As a scientist, I always feel like you would like to have this layer of objectivity in your voice,” she says. The clickbait style undermines the credibility of the researcher blasting the information, she adds, and tends to “generate a lot of responses that are not really very helpful for the conversation.”


Explain the science for what it is, while recognizing that it is just one part of the overall debate surrounding COVID-19–related policies, Janssens notes. “There’s public pressure too, there’s unrest, there’s activism, it’s a whole system,” she says. As such, researchers should not assume that policies supported by science are always the best ones for a given community, she continues. “But you see that scientists often have no respect or no consideration for everything else that matters too when you have to make public policy. And I think it’s not helpful if you want the science to be this kind of objective source of knowledge.”