Why coronavirus disinfo is spreading so far, so fast
I’ve been thinking about this tweet from Jeff and Jane because I was asking myself this week why all the efforts to fact-check seemed Sisyphean compared to the waves of disinfo out there. And in fact, since she tweeted this, the number of debunks she’s done has risen to 41, while Jeff has pointed out here (in French) that the International Fact-Checking Network had documented 132 separate bits of disinfo circulating as of Thursday morning.
Here’s why I think we’re seeing what we’re seeing right now. The last big global flu pandemic was H1N1 in the 2009-2010 flu season, and it originated in Veracruz, Mexico. Social media was not yet a serious competitor for news distribution at that point.
In fact, Facebook in September of 2009 had only just unveiled the ability to tag your friends in status updates and comments. The “Like” button had been introduced earlier in the year, in February - and users wouldn’t get the ability to like individual comments until June of 2010.
Twitter, meanwhile, still regularly crashed because of traffic and we all got to spend a lot of time staring at the fail whale.
People were really, really into flash mob videos. And flash mob marriage proposals. I remember we used to play these videos on CBC News Network as a tease into commercial breaks. It was a different world.
Social media, then, was not our primary source of news. Social content didn’t spread as quickly or as easily because of the limitations of the platforms and the hardware itself. We were still mostly getting our news from mainstream media organizations, who were getting their updates from health authorities and governments. That’s not to say disinformation didn’t exist then, but it was much more contained.
The next chapter begins with the Ebola outbreaks in West Africa (Liberia, Sierra Leone and Guinea) from 2013 to 2016, and in the Democratic Republic of Congo in 2014 and again from 2018 to now.
By this point, social media usage had increased for all of us, and we increasingly lived online and got our news from our feeds.

At the same time, Russian trolls helped boost disinformation around the Ebola outbreaks in order to increase distrust of the United States.
Within some of these countries, rumours tended to spread by word of mouth, as fewer people had access to the internet. (Time estimates here that the DRC has internet “penetration” of just 7 per cent of the country.) That disinformation helped Ebola spread because of incorrect information on how to prevent catching it, and further prevented people from getting needed help because they were distrustful of aid workers.
I think a key difference in this situation compared to H1N1, however, is that far fewer people visit West Africa or travel from West Africa around the world. So, unless you were planning a trip there, the spread of Ebola was more contained, even though it is much more deadly than H1N1 or the current coronavirus outbreak (I think we can call it 2019-nCoV now). That’s not to say there wasn’t racist disinformation and fear, but it was more limited on a global scale because people in rich Western countries did not think it applied to them.
That brings us to this year’s outbreak. We now have a world where people can live entirely through social networks and get their news from friends, through group chats and meme accounts.
Couple that with the fact that China is the world’s most populous country, and that plenty of people travel to and from there for work, school or pleasure, or to visit family around the world for Chinese New Year, and there’s the potential for the virus to spread much more quickly. At the same time, there’s much greater access to the internet, which means rumours can spread much more quickly, both within China and without. The Wall Street Journal has a look at how government and censors are battling social media posts here. From the piece: “The fact that criticism keeps appearing on social media—despite China’s far-reaching censorship—speaks to the challenge that Beijing faces in maintaining stability as fear and frustration spread.”
And, with the election of Donald Trump in 2016, white supremacists and the far right have been emboldened to share their views and push them into the mainstream. They’re also key drivers of racist disinformation this time around, stoking fear and pushing debunked narratives.
Lastly, I think that platforms need to do more to spot disinformation and stop it from circulating, because an individual debunker can only stick so many fingers in the holes in the dam, when what you really need is for the dam not to leak in the first place.
And for your reference, here are a few numbers regarding the flu, to help put things in perspective:
During the 2018-2019 flu season in Canada, 48,818 cases of influenza were detected by laboratory reporting. (Obviously flu cases were even higher, because there would be people who were sickened but whose illness was not detected by a lab.) Of those recorded cases, 613 people were admitted to an ICU, and 224 deaths were reported.
If you’re a data nerd you can see all this and more in the annual reports that the Public Health Agency of Canada puts out.
The Globe’s health reporter, André Picard, wrote a thoughtful piece about fear of coronavirus and how media influences that:
The mainstream media fearmongers, however inadvertently, by using exaggerated language like “killer virus” and by fixating on body counts. When you constantly update the number of cases and deaths, you wildly amplify incremental change. Of course people will be scared. Imagine if we sent out push alerts for every tuberculosis death (1.5 million a year) and every measles death (140,000 annually).
If you want to keep up to speed on the latest in the disinfo, follow Jane Lytvynenko, who has been doing excellent work this week and deserves a huge pat on the back.
And for French disinformation debunks (and some English ones too!), follow Jeff Yates.
I’ll leave the last word to Joan Donovan, one of my favourite scholars researching the internet.
In other news:
Kevin Chan, the head of public policy at Facebook Canada, has been appointed the new election integrity scholar at Carleton University.
Charlie Angus, who in the last session of Parliament was the vice-chair of the Standing Committee on Access to Information, Privacy and Ethics, has feelings about it: “It sounds more like you’ve invited Dracula to oversee the security of the blood supply in the country,” he said.
What Will You Do When the Culture War Comes for You? Charlie Warzel on why the Post was wrong to suspend Felicia Sonmez over her tweets about Kobe Bryant, and why newsrooms are unprepared for attack campaigns on their journalists.
Quick reads:
The Hidden Human Labour in A.I.
The Base updates. Jason Wilson has done a stellar job reporting on The Base for the Guardian; check out these links if you haven’t been staying on top of it.
Revealed: the true identity of the leader of an American neo-Nazi terror group
That’s it, thanks for reading this week!