Craig Silverman, now the Toronto-based media editor for BuzzFeed News, has been digging into unhappy facts for years. Back in 2015, he came out with a report that foretold the misinformation tsunami soon to arrive. Published by the Tow Center for Digital Journalism at Columbia University, “Lies, Damn Lies and Viral Content: How News Websites Spread (and Debunk) Online Rumors, Unverified Claims and Misinformation” pretty much speaks for itself.
In the post-truth era, Silverman is known as a “fake news” expert – the go-to guy on digital rumors, misinformation and all manner of unverified information in the virtual media space. It was Silverman’s work around the 2016 US presidential election – including stories about Macedonian teens running pro-Trump websites – that propelled him firmly into the expert seat, where he’s in demand on shows like NPR’s Fresh Air and making the rounds at international media conferences, which is what he was up to in Seoul in early October when GIJN caught up with him.
But Silverman didn’t just emerge a couple of years ago with a PowerPoint and some clever buzzwords. Starting out as a freelancer who wrote a blog about media accuracy, he did time as managing editor of PBS MediaShift, was a columnist with The Globe and Mail and created a real-time rumor tracker called Emergent.info while a fellow at Columbia’s Tow Center. He also happened to edit one of the gold standards for digital content verification, the Verification Handbook.
At the Uncovering Asia conference earlier this month, Silverman spoke to a packed room at the Millennium Hotel in Seoul – clicking laptops straining to keep up with his insights – about the massive global disinformation space which, he said, feeds off polarization and human bias, exploiting networks and algorithmic programming at the same time.
Tanya Pampalone, GIJN’s managing editor, talked with him afterward to hear more about disinformation, as well as deep fakes, white noise, Hurricane Sandy, the Arab Spring, his first blog and what it is like to be the most depressing person in the room.
Here’s an edited extract of their conversation:
GIJN: One thing we learned at GIJN, when working on a story about deep fake technology and how investigative journalists might dissect and combat it, was that even the experts told us the technology was in such early phases that there was no standard way to fight back, other than good old-fashioned analysis. That’s scary to hear.
Craig: Deep fakes are already good enough that they can fool a decent number of people. At this stage, what we’ve seen them mostly deployed for is inserting the faces of actresses into porn videos. But it is only a matter of time before the technology is used to impact an election.
GIJN: Until now, “having the video” was pretty darn good proof. How do we recover truth from that?
Craig: I think it is likely a deep fake might last minutes or hours online before it is refuted. But I don’t think it is likely that we are going to see something drag on for days or weeks. Deep fakes involve real people, and a real person will be able to confirm what their location was or was not. And so, as much as this is a new technology, some of our existing approaches for verifying video are still going to apply.
The technology isn’t yet extremely well developed, which means there are some telltale signs if you look closely at a video; around the mouth, for instance, you can see if someone’s speech is being fabricated. But along with that, one of the things that’s somewhat encouraging is that some of the research groups working to advance deep fake technology are also, at the same time, working on helping detect deep fakes; one lab released a set of fabricated and manufactured images that researchers can use as training data to help spot deep fakes.
You won’t believe what Obama says in this video 😉 pic.twitter.com/n2KloCdF2G
— BuzzFeed (@BuzzFeed) April 17, 2018
GIJN: BuzzFeed’s Barack Obama and Jordan Peele deep fake made a great – and frightening – point about how an actual fake could go viral.
Craig: You could imagine deep fakes going into existing polarized communities that feed into perceptions they already have, and who then accept it at face value. But I don’t think a deep fake, in that sense, is all that different from a false or misleading article. So, I don’t know that it’s a game changer when you’re dealing with people who are already very polarized.
In the short term, where it might be effective is when a video comes out and lots of newsrooms and others spend a ton of energy figuring out if it’s real or not; it just distracts and takes up oxygen in the news cycle.
GIJN: How does this feed into the notion of “white noise” which you referred to in your talk?
Craig: People understandably fixate a lot on false information, on hyper-partisan narratives and on the seeding of those messages. But there is an equal danger just in distracting people from a narrative that a ruling party, a politician, a certain entity, may find uncomfortable. China is probably the biggest practitioner of this. They have large groups of people who, when an inconvenient narrative is starting to spread, will come out and start posting about some pop star or entertainment thing or a completely different event.
There is still a risk of censorship, but in some ways the new censorship is drowning people in an abundance of information. Rather than keeping it away from you, it’s preventing you from getting at the information you really should be seeing. It also has the effect of suppressing an important conversation, which gets completely washed out by this white noise and creates the impression that people actually don’t care about that particular topic. It’s one of the weird byproducts of a media ecosystem characterized by abundance. We have so many different sources of information, different platforms, so many people out there talking, and that’s a great element of democratization. But it also means that dealing with abundance is a big challenge, and if you can create a huge onslaught of a counter-narrative then you can get people to stop paying attention to the thing that maybe they should care about.
GIJN: It’s really creepy – a sort of real-life Brave New World scenario.
Craig: Yeah, and there are so many vectors of attack. You can have the distraction and drowning-out attack of inundating people with different kinds of information. There’s also the ability to target people with very specific kinds of messaging to change their mind or create action. And then there is the de-legitimization of real media, which makes it easier and more fertile ground for you to put your own messaging out there. There are so many different things happening, and they’re all happening at the same time, to different degrees, at a global level.
GIJN: Someone posed an interesting question to you earlier about why it is so hard for journalists to be credible right now – and that whole de-legitimization aspect is what she was getting at.
Craig: There are governments that are actively telling people, “That’s not real journalism. What I’m giving you is real journalism.” And that creates a crisis of trust in society. It adds to the challenge that media always has of getting people to pay attention and read or watch the things we produce. The difference is that, over the last few years, there is a concerted effort by many governments to de-legitimize independent and fact-based media. And that is absolutely a worrying trend.
It’s not a coincidence that we have more journalists in jail right now than we have had in several years. It’s not a coincidence that media is struggling in some places to have a financially viable model. The time that we most need fact-based credible media is also a time when fact-based media is experiencing huge headwinds and challenges in many places around the world.
GIJN: It’s so disturbing.
Craig: It’s not fun to talk to me. I know, I’m sorry.
GIJN: Does this keep you awake at night?
Craig: I’m so in it and motivated to work on it that I don’t sit back and kind of marvel at how terrible things have gotten, because there’s a lot of work to do. I am optimistic about the fact that it’s been roughly a decade that I’ve been looking at this stuff, and it hasn’t been until now, over the last year or two, that so many different key players in this — technology companies, academics, researchers, civil society groups and some governments — have recognized the problem.
There is actually a global awareness and a sense of outrage about it, and now people are working on solutions. That wasn’t the case prior to 2016; there was a weird, small group of researchers and journalists and other people who were looking at this and saying this is getting really bad and it’s going to get worse. On the other side, my biggest concern is that this is a fad and, in a year or two, when there isn’t as much public interest and outrage around it, the money for research funding starts to dry up.
GIJN: What else is scaring you right now?
Craig: Everything is so interconnected and complex — and one of the things that concerns me so much is that it’s hard to solve one area and say, alright, we took care of that, so now let’s move on to something else. All of these elements: the rise of more authoritarian-leaning governments, the ongoing decline of trust in institutions in Western democracies, growing income inequality, the dominance of social platforms, the erosion of traditional media business models, and on and on and on. It’s not just about the disinformation. It’s also about global trends in society.
I worry just that populations are less resilient to this stuff. And if your population is less resilient then it’s very easy for folks to sow division, and it’s very easy for folks to spread fake stuff. So the hard part is building that resilience in populations and helping educate people.
GIJN: Can we backtrack a little about how this all started for you? I wanted to ask you about your blog, Regret the Error, which I understand you started as a freelancer in 2004.
Craig: I was reading media blogs like Gawker and others at the time and blogging was still relatively new and I just wanted to launch something of my own. I ended up focusing on corrections because they could be funny, and they could be shocking. But also, journalists talk so much about accuracy and about our ethic of correction and I didn’t see it was something that was being looked at or talked about. By looking at mistakes every day, by looking at corrections every day, I started to connect the dots and see things that other people who weren’t doing that work every day didn’t see. And that’s the same thing that happens now, looking at the disinformation and hyper-partisan content. By making it a habit and dedicating yourself to it, you can start to peel back the layers and learn more.
GIJN: During that time, from about 2004 to 2014, newsrooms around the world were experiencing a massive shift, with many young journalists coming into the newsroom in this information-overload age with social media at the center of things, where “according to reports” – as opposed to making the call yourself – was one part of what they believed was acceptable reporting. What concerns you most about young journalists coming into the post-truth era?
Craig: In 2015, I did a research project looking at how news websites were treating unverified claims that were going viral on social media. It was really depressing because you saw really reputable news organizations just taking a tweet that someone wrote and writing a story around it. You would see them using a headline that took an unverified claim and made it seem like it was true, and then the body text would walk it back. And so that culture of quick aggregation and going after things to capture the traffic and the audience was, in retrospect, getting towards its zenith.
You can look at BuzzFeed as a case study, where journalists had always written up viral stuff. But now BuzzFeed has introduced many more standards around verifying anything that’s going viral. The good news is that, as media brands are realizing that the Facebook firehose of traffic is not what it used to be and will never be what it was, they’re having to get back to creating real relationships with their readership and their audience.
GIJN: I’m thinking back further, to Amina, the young Arab lesbian blogger in Syria covered by the Guardian back in 2011 – who ended up being a straight man – as a turning point, where newsrooms were looking more closely at themselves during the digital onslaught and figuring out the frightening new ways they needed to verify information.
Craig: For me, Hurricane Sandy (in 2012) was a bit of a watershed in terms of the amount of misinformation that was spreading. But I don’t recall one specific event that made me look at it more. It’s just over time, I started realizing how easy it was for people to create fake accounts, realizing there was this whole evolving ecosystem of media manipulation that was coming along with what I felt were hugely positive developments in media.
I look back and I feel a bit naive that early on I viewed social media as a wholly positive thing and didn’t really see a lot of the downsides. Looking back at the Arab Spring, we felt like these platforms were going to be engines of democracy and they were going to prevent censorship and repression. At that point, I didn’t see that they could become powerful tools for autocratic governments, and powerful tools for repression and censorship. Post-Arab Spring, I started to wake up a little bit, as I think a lot of people did, when you started to see how these platforms could be used and misused.
GIJN: So what do you like to underscore with journalists about the verification process?
Craig: The first thing that I say is that they need to understand that the process of verifying online information doesn’t take a huge amount of time, and it doesn’t take technical expertise. So you actually don’t have an excuse to not look into that tweet before you write it up. I try to empower them with the skills to be able to do that, and for them to understand that people will forget those stories, but they’ll remember if you were the one who aggregated something that turned out to not be true, or that you fell for a trolling campaign. Most journalists don’t really want to just spend their whole time aggregating other people’s work and they don’t want to be driven by traffic alone. The good news is that the incentives in newsrooms are starting to shift away from that a little bit, and to realize that they need to create content that maybe helps drive subscriptions or helps build credibility.
You look at where BuzzFeed started — it was not in news, it was entertainment-driven and in the early days it would publish stuff that we would never publish now. Over the span of the last six years, it’s been the introduction of a news culture, an introduction of standards, the separation of news from entertainment. And now we have a team of more than 20 dedicated investigative journalists. We battle every day to establish credibility and we know that it takes a long time to build trust and to establish a news brand. And we know that if we publish stuff that turns out not to be true, if we don’t adhere to our standards, then we’re hurting our chances of building that trusted brand.
GIJN: Okay, I think we’ve asked you just about everything.
Craig: Sure. Now you can go curl up in a ball somewhere.
Be sure your toolbox includes Craig Silverman’s Verification Handbook – he says he’s hoping to work on an updated version soon – and his ever-growing list of Verification and Digital Investigations Resources.
Tanya Pampalone is GIJN’s managing editor. The former executive editor of South Africa’s Mail & Guardian, Tanya taught creative nonfiction and digital storytelling at the University of the Witwatersrand and ethics and editorial integrity at Rhodes University’s Sol Plaatje Institute for Media Leadership. Her most recent book is I Want To Go Home Forever: Becoming and Belonging in South Africa’s Great Metropolis (Wits Press, 2018), and she contributed to Southern African Muckraking: 300 Years of Investigative Journalism That Has Shaped the Region (Jacana, 2018).