How humans can hold back tech-enabled disinformation
Andy Carvin of the Atlantic Council explains the real-world stakes of virtual disinformation — and why real people are key to solving it.


It’s easy to get caught up in discussion about virtual worlds and forget that they have an impact on the real world as well. The intersection of reality and information is fraught today and could get worse in the future. Andy Carvin, managing editor of the Digital Forensic Research Lab at the Atlantic Council, and his global team research the spread of disinformation in real time and trace its sources. Rampant disinformation is a problem arguably made worse by technology. Carvin thinks the solution, however, is human, and the window to solve it may well be closing.
Matt Carmichael: What does the DFR Lab do?
Andy Carvin: As we all learned in the 2016 election, there are governments willing to spend a lot of money to interfere with democratic processes around the world. Our team tries to track them down, catch them in the act, figure out the reasons why they’re running these campaigns and, whenever possible, identify the institutions behind them and reverse engineer how they were put together.
Carmichael: Is the lab mostly focused on political disinformation?
Carvin: COVID-19 was an enormous shift for us. We were already looking into public health as a disinformation space, but then more than half the team ended up shifting gears for the better part of two years, trying to identify all the COVID disinformation.
Carmichael: How big of a problem is this today?
Carvin: The problem is larger than I think any of us can get our heads around, because there’s really no cost barrier to entry. But it’s also gotten more sophisticated: entire troll farms or call centers are running personas with talking points, very clear influence goals, and very clear persona histories that have been built up, sometimes over seven or eight years. As best we can tell, it’s in every corner of the globe and every country.
Carmichael: Why is disinformation so effective?
Carvin: It’s hard to be immune to disinformation when a community or population has lost trust in government and public institutions. That makes it ripe for exploitation, whether you’re trying to make a quick buck with some snake oil you’re selling during the pandemic, or you’re trying to change the course of history by planting excuses for engaging in war against a country that has not attacked you first.
Carmichael: What will it take to make any of this better?
Carvin: While fact-checking is an enormously important role played by news organizations and other entities, it’s fundamentally defensive because [the disinformation] has already circulated and quite likely the damage has been done.
Carmichael: Is there any way to combat this?
Carvin: Some of the most effective people I’ve ever seen talking with American veterans about disinformation, for example, are not only veterans themselves, but veterans who’ve gone down their own rabbit holes and had to dig themselves out. Solving and mitigating the impact of disinformation ultimately has to happen at a community level, driven by the idiosyncrasies of any given culture and by identifying people in those communities [who are] still trusted by those who’ve lost hope in everyone else.
Carmichael: Technology, particularly synthetic media, is probably going to make this worse in a lot of ways.
Carvin: Yep.
Carmichael: Will we eventually have AI assistants to tell us what’s real and what’s not — like a browser plugin?
Carvin: I think tools like that are inevitable. But we’re also in a cat-and-mouse game in which the bad actors creating new technologies or new methodologies for spreading disinformation read the latest research in journals that study information warfare. I once had the chance to interview someone employed by a company that engaged in disinformation for hire, and the very first thing he said was that he was a big fan of the DFR Lab. It’s flattering and maddening at the same time.
Carmichael: So, it’s back to trusting each other and our shared values?
Carvin: Yeah, it goes to the question of what we can do to rebuild trust and bonds within a community, so that neighbors are more likely to give each other the benefit of the doubt and people are less likely to judge others based on slices of their demographics or their associations or affiliations. In some ways, the reason we have such a huge disinformation problem right now is that the internet and public discourse had already deteriorated to the point where too many people were looking for an excuse not to trust someone. Disinformation is an enormous problem, but the loss of trust in our culture and in our institutions is what’s allowing it to happen.