Transcript: Beware the “New Google” (And Much More)
Interview with Steven Brill & Gordon Crovitz, co-founders, NewsGuard
For podcast release Monday, September 26, 2022
KENNEALLY: Grapefruit peel and lemon peel simmered slowly in water to extract the maximum quinine and vitamin C. It’s not a recipe for a trendy homemade energy drink, but a DIY prescription for hydroxychloroquine and touted online as a cure for COVID-19. You can find the phony pharmaceutical on the world’s most popular website. No, not Google – the new Google, TikTok.
Welcome to Copyright Clearance Center’s podcast series. I’m Christopher Kenneally for Velocity of Content.
In growing numbers, people take questions about healthcare, politics, or finding the best restaurants not to Google, but to TikTok, the short-form video platform. This month, a NewsGuard investigation revealed that such TikTok searches consistently feed false and misleading claims to users, most of whom are teens and young adults.
NewsGuard is a journalism and technology tool that rates the credibility of news and information websites and tracks online misinformation for search engines, social media apps, and advertisers. About 40% of all the news and information sites assessed by NewsGuard receive a red rating, categorizing them as untrustworthy. NewsGuard’s co-founders join me today to report on the state of fake news in late 2022 and to share how to fight it.
Steven Brill, welcome to Velocity of Content.
BRILL: Thank you.
KENNEALLY: And Gordon Crovitz, welcome to the program as well.
CROVITZ: Thank you, Chris.
KENNEALLY: Gordon Crovitz, you and Steven started NewsGuard in 2018, the same year TikTok became widely available. Fake news seems even bigger and badder than ever in 2022. How is NewsGuard fighting it?
CROVITZ: We fight misinformation on behalf of readers, on behalf of brands, and on behalf of democracies. So we rate all the news and information sources that account for 95% of engagement. News consumers can find those ratings and detailed nutrition labels through services like Microsoft. We similarly help brands keep their ads off of misinformation sites. With the nature of programmatic advertising and algorithmically driven advertising, the ads of every blue-chip company are going to end up on the worst websites unless they take some step against it. And then we also help democracies around the world, particularly against Russian and Chinese disinformation being aimed at populations in the US and in Europe.
KENNEALLY: And, Gordon Crovitz, TikTok is a problem, but it’s not the only problem. In fact, NewsGuard research says that misinformation is a $2.6 billion problem and one that’s especially dangerous for brands. How are companies unwittingly contributing to the problem?
CROVITZ: This is the biggest shock that we’ve had in our time at NewsGuard – to be able to quantify the amount of online advertising unintentionally going to support misinformation sites, from Russian propaganda sites to healthcare hoax sites that’ll sell you a subscription to peach pits to cure cancer – chock-full of ads from every brand you could think of. It’s $2.6 billion a year. And the reason it happens is that with programmatic advertising, which is advertising selected by a computer – that’s the biggest category of advertising – an ad will end up on all kinds of sites unless the brand itself or the ad agency or somebody in the ad tech world takes some step to advertise responsibly. That means not advertising on Vladimir Putin’s websites or crazy healthcare hoax sites and instead steering ads to high-quality sites. Or in local communities, we have inclusion lists that include sites serving the Black community, Hispanic, Asian, LGBTQ+ communities – sites that get relatively little advertising, because they’re not that well known to advertisers.
When people think about the problem of misinformation, they may not understand that economics drives a lot of it. And if we can get that $2.6 billion down to $1.3 billion, down to $650 million, eventually down to nothing, that will really go a long way to solve the problem.
KENNEALLY: Steven Brill, what kind of searches lead to misinformation on TikTok? Is it only controversial topics that are liable to fake news? And we should say, of course, that TikTok has said it does not allow harmful misinformation, including medical misinformation, on its platform, and that it does take action to direct users to trustworthy sources. But what’s at stake here? What’s happening to drive the rise of misinformation on TikTok?
BRILL: Well, it may be taking steps, but it’s plainly true that it does allow disinformation and misinformation, as evidenced by the fact that casual searches by our team turned up clear disinformation or misinformation in one of every five results.
In that regard, let me add one thing. You noted accurately in your introduction that of the 8,000 news and information sites we’ve rated, close to 40% we’ve rated as publishing false news. I don’t want to give the impression that we’re a bunch of Puritans, and that if we see a site we don’t like, because we don’t agree with its politics, or it has a different policy position on climate change than one of us may have, that it gets a red rating. These sites have to be really bad to get a red rating. They have to be saying that 9/11 was an inside job by the Bush administration, or as Gordon mentioned, that peach pits will cure cancer, or that hydroxy will cure or prevent COVID, or that Barack Obama was born in Kenya. This is provably false, bad stuff. It’s not the subject of on the one hand, on the other hand debates.
So with TikTok, to come back to your question, we found just everywhere you looked provably bad stuff, stuff that normal people would, one assumes, laugh at, except in this country, and I guess in other countries, there aren’t normal people laughing at it. There are a lot of people believing it.
CROVITZ: Chris, those searches were on neutral terms, like 2022 election, mRNA vaccine. So these were not on highly charged topics. And our analysts also found that when they did searches, TikTok would actually complete the search for them. For example, when our analyst did a search for COVID vaccine, which is a kind of search that a young person might very well do to learn more about it, TikTok suggested that the search be for COVID vaccine injury or COVID vaccine truths or COVID vaccine exposed, COVID vaccine HIV, and COVID vaccine warning – in other words, highlighting the alarmist and often false claims about the COVID vaccine.
KENNEALLY: Steven Brill, that NewsGuard study did note a difference, then, in the kind of results that come up on TikTok and Google. Tell us about that. If one were to search on the war in Ukraine, or on COVID vaccines, as Gordon Crovitz was just describing, what would be the difference in the results that Google would yield?
BRILL: Well, the Google results are significantly tamer, shall we say. I think it’s because they’ve used humans to intercede a little bit. I never thought a year ago, before we started looking heavily at TikTok, that we’d be comparing any entity to Google and saying Google does a relatively good job, because they’re pretty awful. But TikTok has really put them to shame when it comes to disinformation and misinformation.
Part of that, I think – and maybe it’s just me – is that TikTok is owned and controlled by the Chinese Communist Party, and the one thing that we know for sure about TikTok is that TikTok has decided for whatever reasons not to expose the children of China to the content that our report demonstrates they are exposed to in the US and in Western democracies.
So you’ve got to ask yourself – you have an information machine that is controlled by an adversary of the United States, and lo and behold, that misinformation, disinformation, is going to the United States from that adversary and not being seen by any of the people in that adversary country. How would anyone explain that? If it’s dangerous for the children of China, why is it OK for them to make money giving it to the children of the United States?
KENNEALLY: Steven Brill, I’d like to follow up on the point about the NewsGuard ratings. You’re a highly regarded journalist. It’s a team of journalists who are making these ratings, right? And they’re doing so on the basis of really well established journalistic ethics.
BRILL: Right. It’s nine specific criteria. It’s not just a bunch of people sitting around saying, oh, yeah, that looks like a good site. We like that. Or I’ve heard of the Boston Globe. That’s legit. It’s a scrupulous, careful, multi-person look at how every one of these websites scores against nine specific criteria. Does it have a transparent policy to make a correction when they realize they’ve made a mistake? Do they mix news and opinion in a way that people can’t tell which is which? The basics that any journalist learns and adheres to.
KENNEALLY: Steven Brill, an easy answer to stopping the spread of misinformation would be for the social platforms to throw the carriers of fake news off them. Twitter has done that with former president Donald Trump, and he responded by starting his own social media site, Truth Social, earlier this year. NewsGuard has found, though, that the conspiracy theories of QAnon, for example, traveled there with him. What’s the concern there?
BRILL: Before I answer that, let me say one thing. We don’t believe in kicking anybody off of any site. We believe in the right of editors, which is what the platforms are, to make their own editorial decisions. But we don’t think the government ought to tell Facebook or Twitter or anybody that they have to ban certain content. What we believe in is providing information about who’s feeding you the content so you can make your own decision.
Having said that, Truth Social has taken that to what only Donald Trump could produce, which is the logical extreme, which is in essence, let’s just promote the really bad stuff under the flag of the First Amendment. That’s what they’ve done, and that’s what he’s done. It’s now being talked about everywhere. He’s now doing it at his campaign rallies. He’s promoted QAnon. And the executives at Truth Social, apparently still scratching and clawing to try to raise the money to keep the thing going, are appealing to what they apparently perceive to be their base, which is the QAnon believers. So they’re promoting QAnon, and Trump’s promoting QAnon. It’s sort of hard to believe, but it’s true.
KENNEALLY: Gordon Crovitz, when a video of a UFO turns up on TikTok or anywhere else online, what steps should a child or should I take to see if it was just the Goodyear blimp?
CROVITZ: (laughter) Well, I think we always encourage people to look at the source and to be as familiar as possible with the source. Does this come from an established news operation? And if it does, hopefully you’re looking at it on a platform that includes an explanation of the nature of that news source from us or from somebody else. That’s the easiest way. We actually have agreements with a number of entities serving schools and universities so that when they do a search on UFOs or any other topic, they’ll instantly see whether a result comes from a generally reliable source or not.
I think in the earlier era – even with the internet, but before social media – it was still possible for people to teach themselves how to become news-literate. I frankly no longer believe that. I think it’s become impossible. The nature of these platforms, the algorithmic engines that promote falsehoods and send people down rabbit holes, the utter absence of disclosure about the nature of the sources on so many platforms really make it hard for people to know what to believe or not to believe.
I think the platforms have an obligation to provide tools so that their users can protect themselves against what they’re seeing on the platforms. And I think it is significant that responsible companies like Microsoft do that, and most of the social media platforms still do not. Knowing, as we can now see ourselves, that a significant percentage of people are getting most of their news on social media platforms from unreliable sources, it’s not surprising that we have people believing things that are untrue.
KENNEALLY: Gordon Crovitz, co-founder of NewsGuard, thank you for speaking with me today.
CROVITZ: Thank you for having us.
KENNEALLY: And Steven Brill, also co-founder of NewsGuard, thank you for joining me.
BRILL: Thank you. It’s been a pleasure.
KENNEALLY: That’s all for now. Our producer is Jeremy Brieske of Burst Marketing. You can subscribe to the program wherever you go for podcasts, and please do follow us on Twitter and on Facebook. I’m Christopher Kenneally. Thanks for joining me on Velocity of Content from CCC.