Deep in the bowels of Facebook’s serpentine campus in Menlo Park, California, is a room about 25 feet square that may have a lot to do with how the world thinks about the company in the coming months. It looks like a Wall Street trading floor, with screens on every wall and every desk. And 20 hours a day—soon to be 24 hours a day—it’s jammed with about two dozen geeks, spooks, hackers, and lawyers trying to spot and quash the next bad thing to happen on the company’s networks.
It’s known, appropriately, as the War Room, and it was set up just a month ago—ahead of the Brazilian presidential election and the US midterm elections—as perhaps Facebook’s most dramatic and visible step toward ensuring that the fraud and manipulation that were rampant on Facebook’s networks during the 2016 US presidential election don’t recur.
In years past Facebook would have worked hard to keep an effort like this under wraps for fear of letting competitors know what it was up to or of signaling an imperfection in one of Silicon Valley’s biggest success stories. But Facebook’s reputation has been damaged by the manipulation it did not detect in 2016, its arrogant response after the vote, and this year’s Cambridge Analytica scandal. So Wednesday morning it invited about two dozen journalists to take a look, to ask questions, and to hopefully tell the world that Facebook is at least trying to get things right this time.
The room doesn’t look like much from the hallway of Building 20. In fact, blackout paper on the windows makes it look like an unlit empty conference room. But inside are screens monitoring viral content, spam, hate speech, and voter suppression across all Facebook’s networks, as well as external sites like Twitter and Reddit. “We’ve been doing all this work virtually for two years. But when stuff needs to be done fast, there is no substitute for face-to-face contact,” said Samidh Chakrabarti, Facebook’s elections and civic engagement boss.
Spokesperson Tom Reynolds said the two dozen teams represented inside the War Room are backed up by 20,000 people Facebook has dedicated to better policing its platforms. The teams include specialists in threat intelligence, data science, engineering, research, operations, legal policy, and communications. For now, it is staffed about 20 hours a day. That will increase to 24 hours five days before Brazil’s October 28 vote and again five days before the November 6 US midterm elections.
The room is set up as an acknowledgment that Facebook is in an arms race against tricksters and manipulators, and that threats can arrive from anywhere. They can start with someone creating a fake account, with a legitimate account launching a fake news campaign, or with spammers out to make a quick buck, and they can start in any language in any country. With representatives of so many teams working within earshot, the hope is that threats can be addressed within minutes, before they gain traction on the platform.
On Wednesday, executives wouldn’t commit to using the War Room beyond the US midterms. But they talked so effusively about its work that it’s hard to imagine why they’d consider shutting it down.
The jaded will quickly contend that this is nothing but spin from a company that continues to see the problems it has faced over the last two years primarily as public relations issues. It’s hard to take its fix-Facebook campaign seriously when, a few times every month, the company does something that makes users, investors, and employees shake their heads in exasperation. Just in the past month it announced the largest hack in its history, affecting about 30 million accounts. It was surprised when its Washington boss, Joel Kaplan, appeared on television as a Brett Kavanaugh supporter during his confirmation hearings. Then last week it botched the messaging surrounding its in-home speaker, known as Portal. First it said that no information from the device would be used to target advertising at users. Then it admitted this week that data about calls and app usage can in fact be used for targeting on Facebook and Instagram. During Wednesday’s briefing, a Brazilian journalist asked why Facebook in Brazil continued to be filled with fake news despite Facebook’s efforts to stamp it out. Executives did not answer the question directly.
But it’s also hard not to notice that Facebook, the world’s biggest digital communications company, is now on record extolling the virtues of analog, face-to-face conversation. It would never have done something like this two years ago. Back then, CEO and founder Mark Zuckerberg believed the less human intervention in content on Facebook, the better.
Despite its missteps, the company has taken concrete steps this year to better police its platform. It’s revamped the News Feed algorithm. It’s imposed a slew of new restrictions on political advertising. It’s doubled the size of the teams working on these issues to 20,000 people. And it’s shocked investors by suggesting that it might need to spend about $4 billion before it felt like it had adequately addressed the problems.
It appears that Facebook is also trying to be more transparent about the issues it faces. Hardly a week now goes by without Facebook disclosing attacks it’s discovered and pages or accounts it has disabled. Compare that with Facebook’s response to the Cambridge Analytica scandal in March, when neither Zuckerberg nor COO Sheryl Sandberg said anything publicly or to employees for a week, and it’s obvious there have been changes.
Today, Facebook talks about using computers and people to solve problems this way: “Policing the platform is like finding a needle in a haystack,” Reynolds said. “We use AI to shrink the size of the haystack. Then we use people to find the needle.”
It would be great for all of us if this approach worked as well as Facebook makes it sound.