Facebook launches ‘war room’ to combat manipulation
In Facebook's 'War Room', a nondescript space adorned with American and Brazilian flags, a team of 20 people monitors computer screens for signs of suspicious activity.
(FILE) An illustration picture taken on April 28, 2018 shows the logo of Facebook displayed on a screen and reflected on a tablet in Paris. / AFP PHOTO / Lionel BONAVENTURE
The freshly launched unit at Facebook’s Menlo Park headquarters in California is the nerve centre for the fight against misinformation and manipulation of the largest social network by foreign actors trying to influence elections in the United States and elsewhere.
Inside, the walls have clocks showing the time in various regions of the US and Brazil, maps and TV screens showing CNN, Fox News and Twitter, and other monitors showing graphs of Facebook activity in real time.
Facebook, which has been blamed for doing too little to prevent misinformation efforts by Russia and others in the 2016 US election, now wants the world to know it is taking aggressive steps with initiatives like the war room.
“Our job is to detect … anyone trying to manipulate the public debate,” said Nathaniel Gleicher, a former White House cybersecurity policy director for the National Security Council who is now heading Facebook’s cybersecurity policy.
“We work to find and remove these actors.”
Facebook has been racing to get measures in place and began operating this nerve center — with a hastily taped “WAR ROOM” sign on the glass door — for the first round of the presidential vote in Brazil on October 7.
It didn’t take long to find false information and rumors being spread which could have had an impact on voters in Brazil.
“On election day, we saw a spike in voter suppression (messages) saying the election was delayed due to protests. That was not a true story,” said Samidh Chakrabarti, Facebook’s head of civic engagement.
Chakrabarti said Facebook was able to remove these posts in a couple of hours before they went viral.
“It could have taken days.”
Humans and machines
At the unveiling of the war room for a small group of journalists including AFP this week, a man in a gray pork pie hat kept his eyes glued to his screen where a Brazilian flag was attached.
He said nothing but his mission was obvious — watching for any hints of interference with the second round of voting in Brazil on October 28.
The war room, which will ramp up activity for the November 6 midterm US elections, is the most concrete sign of Facebook’s efforts to weed out misinformation.
Staffed with computer science and cybersecurity experts as well as legal specialists, the center currently operates during peak hours for the US and Brazil, with plans to eventually run 24/7.
The war room adds a human dimension to the artificial intelligence tools Facebook has already deployed to detect inauthentic or manipulative activity.
“Humans can adapt quickly to new threats,” Gleicher said of the latest effort.
Chakrabarti said the new center is an important part of coordinating activity — even for a company that has been built on remote communications among people in various parts of the world.
“There’s no substitute for face-to-face interactions,” he said.
The war room was activated just weeks ahead of the US vote, amid persistent fears of manipulation by Russia and other state entities, or efforts to polarise or inflame tensions.
The war room is part of a stepped-up security push announced by Facebook, under which the company is adding some 20,000 employees.
“With elections we need people to detect and remove (false information) as quickly as possible,” Chakrabarti said.
The human and computerized efforts to weed out bad information complement each other, according to Chakrabarti.
“If an anomaly is detected in an automated way, then a data scientist will investigate, will see if there is really a problem,” he said.
The efforts are also coordinated with Facebook’s fact-checking partners around the world including media organisations such as AFP and university experts.
Gleicher said the team will remain on high alert for any effort that could lead to false information going viral and potentially impacting the result of an election.
“We need to stay ahead of bad actors,” he said. “We keep shrinking the doorway. They keep trying to get in.”