RACHEL MARTIN, HOST:
The midterm elections are less than a month away. And Facebook is ramping up its fight against disinformation. In 2016, the social media network was caught pretty much flat-footed as foreign actors spread false narratives by masquerading as Americans. Facebook has now set up a system to try to detect and disrupt bad actors on its network who are trying to delegitimize elections, spread fake information and suppress the vote. NPR's Tim Mak has a look inside.
TIM MAK, BYLINE: We all live in an increasingly digital world. But even within a large tech firm like Facebook, there is no substitute for face-to-face communication. So last month, Facebook set up an elections war room on their Menlo Park, Calif., campus.
UNIDENTIFIED PERSON: Is now when I should take video? Or...
MAK: In the room, a large American flag hangs above custom-built dashboards that monitor viral news, spam and voter suppression efforts. Fox News, CNN and a rolling Twitter feed play on large screens along the wall. And a rotating group of data scientists, engineers and operations specialists occupy approximately 20 posts in the room. Here's Samidh Chakrabarti, the head of the civic engagement team at Facebook.
SAMIDH CHAKRABARTI: So we've built dashboards. And these dashboards actually have alarms on them so that if there's any spike in unusual activity, the war room is alerted to them. And then our data scientists will be able to look at any sort of anomalies that are detected, figure out if there are actually problems underneath those anomalies that we see and if there are, pass them along to our operations team.
MAK: We just want to note that Chakrabarti is the brother of Meghna Chakrabarti, host of WBUR's On Point. Facebook offered him as a source for the story. Public officials are watching closely to see if Facebook will be able to prevent disinformation like that seen during the 2016 campaign. Here's what Senator Mark Warner, the top Democrat on the Senate Intelligence Committee, told NPR.
MARK WARNER: Facebook was asleep at the switch. And even in the immediate aftermath of the elections, they denied that there were any foreign influencers on their platform. They were dead wrong.
MAK: These concerns reflect the centrality of Facebook in American civic life. Forty-three percent of Americans get news on Facebook - by far the website Americans most commonly use for news, according to a recent survey by Pew Research. The social media network doesn't want to police content, generally speaking. But Facebook officials think that cracking down on inauthentic behavior, such as fake accounts and spamming, will help curb foreign influence operations. Here's Chakrabarti again.
CHAKRABARTI: We have actually made huge advances in artificial intelligence and machine learning. And we have been able to block, in a recent six-month period, 1.3 billion fake accounts from forming.
MAK: But Facebook faces a formidable challenge. Kevin Mandia, the CEO of FireEye, a cybersecurity firm which counts Facebook as a client, recently told NPR that there are many foreign operations that analysts have yet to discover.
KEVIN MANDIA: I strongly doubt that we've caught a hundred percent of the ones Iran's doing or Russia's doing. And I would say we're more on the 3 to 5 percent on what we've found, meaning there's a ton of it going on right now that we're wholly unaware of.
MAK: Facebook told NPR that it has seen foreign information operations increasing as the midterms near, which is what they expected. Here's Nathaniel Gleicher, their head of cybersecurity policy.
NATHANIEL GLEICHER: Information operations, what we're talking about here - it's a security challenge, which means that you have sophisticated adversaries that are continually trying to figure out new ways to cause harm and to manipulate your platform.
MAK: Now, with less than three weeks to go until Election Day, this war room will be at the heart of the effort to keep foreign misinformation out of the democratic system.
Tim Mak, NPR News, Menlo Park.
(SOUNDBITE OF TOR'S "VAULTS")

Transcript provided by NPR, Copyright NPR.