Welcome to the first edition of All Systems Go, a weekly newsletter on disinformation, tech, privacy and labour with a focus on Canadian stories.
I called it All Systems Go because it’s important to talk about the systems and structures that allow disinformation to flourish, not just individual companies or actors (though there will be some of that too). And I don’t know about you, but I feel like I’m always getting new information, always getting ready to respond or update or retweet, so I felt the title captured the always-on nature of our online lives.
The disinfo world can sometimes (often) be summed up like this:
And I would like to help us all feel a bit less like that.
So, let’s dive in:
Obviously Iran is a huge focus for disinformationists right now.
No, this video wasn’t taken inside the plane shot down in Iran
Today my colleague Jeff Yates posted this quick debunk (it’s in French):
Basically, someone took a video of a plane fire in Russia from May of 2019, edited it by cutting off the start, and then shared it, claiming it was filmed inside Ukrainian Airlines Flight PS752. It wasn’t, but this stuff spreads fast.
Given how many citizen videos have been posted online, it’s not surprising that fakes are getting mixed in.
But for those who have posted videos that depict the truth, the stakes have gotten higher. BuzzFeed has a story about one Iranian man who went into hiding after posting a video that shows two missiles - not just one - hitting PS752.
The video in question was first verified by the New York Times. The team there has been doing some first-rate work on video verification around the Iran plane crash.
And, when it comes to cyber warfare, these analysts think we’re likely to see “an expansion of cyberwarfare in the Middle East – and around the world.” Another thing to note: “According to industry sources, Iran has ongoing operations in at least 2,200 public and private facilities regulating everything from electricity to water.”
Lastly, the Iranian fact-checker behind FactNameh (Book of Facts), which examines Iranian government claims, lives here in Toronto. However, there are still questions over where the group gets its funding.
Last year, Jeff Yates, Roberto Rocha and I took a look at YouTube’s recommendation algorithm in Canada and found that, for logged-out users, it doesn’t recommend conspiracy theories anymore - and in fact can steer users away from political content pretty quickly. It was a sign that YouTube was taking the problems with its algorithm seriously, even if the fix appeared to create some new problems.
Now Becca Lewis, a researcher who covers media manipulation and political digital media at Stanford and Data & Society, says that even without the algorithm’s help, YouTube is still a hotbed of far-right propaganda, for two reasons: celebrity culture (because YouTube is still a starmaker), and community.
Creators can build up relationships with their followers and bring fringe ideas to the mainstream because of the inherently non-hierarchical feel of YouTube. The idea is: I’m not telling you what to think, we’re just two friends having a conversation.
And, says Lewis, the ways that influencers get their content shared by building networks, like by teaming up with other influencers, means that the good, old-fashioned idea of community with people you know (or feel that you do) helps radical ideas to spread. Lastly, audiences can demand their favourites provide them with more radical content.
Key quote: “All of this indicates that metaphor of the “rabbit hole” may itself be misleading: it reinforces the sense that white supremacist and xenophobic ideas live at the fringe, dark corners of YouTube, when in fact they are incredibly popular and espoused by highly visible, well-followed personalities, as well as their audiences.”
Basically, YouTube isn’t sending people down radicalization rabbit holes - the content is in plain sight.
Also of note from Avaaz: YouTube found promoting climate denial to millions. The non-profit found that YouTube is recommending climate denial videos to users and it’s making money off those videos by running pre-roll ads from Greenpeace and the World Wildlife Fund, along with other major brands.
The Toronto Star’s Betsy Powell highlights a few cases where curious jurors have gone to Google and that’s led to “sometimes forcing mistrials or worse, verdicts that could be based on misleading or false information gleaned from the internet.”
The U.K. has created rules against this, which means jurors there could face criminal sanctions, but there are no similar rules in Canada.
Small bits to note:
Not strictly disinfo-related, but Nieman Lab got a whole bunch of thinkers in journalism to talk about what the next decade in news looks like. I think this in particular sums everything up well: 20 questions for 2020
And, while looking to the future of news, Whitney Phillips suggests calling disinfo and the like “polluted information,” which I quite like. (Phillips notes she is building on the suggestion of one of the leaders in the field, Claire Wardle of First Draft News. If you follow no one else in the disinfo sphere, follow these two women - and Joan Donovan.)
If you live in Toronto, you’ve been hearing a lot about smart cities, data regulation and Sidewalk Labs’ plans for a waterfront neighbourhood. Here’s another proposal: dumb cities. Turns out they’re not only better for your data privacy, they’re better for fighting climate change, too!
That’s it for now, keep your skepticism alive and remember: don’t share it unless you’ve read it first!