WASHINGTON – Facebook plans to double its staff for combating fake news, terrorism and other security issues, a company representative told Congress on Wednesday, more than two months after CEO Mark Zuckerberg announced that the strategic shift would significantly eat into profits.
Facebook’s head of product policy and counterterrorism, Monika Bickert, and Twitter’s director of public policy and philanthropy, Carlos Monje, Jr., both appeared before the Senate Committee on Commerce, Science, and Transportation.
Bickert said that Facebook’s safety and security team is expected to grow from 10,000 to 20,000 by the end of 2018. Facebook currently employs 23,000 people.
Facebook, which has 2 billion users, reported about $10.3 billion in revenue last quarter. Zuckerberg said in November that he expects Facebook’s operating expenses to grow by 45 percent to 60 percent in 2018, while revenue is only expected to grow 30 percent.
Of the 10,000 employees who work either full-time or part-time on safety and security, Bickert said about 180 focus specifically on counterterrorism efforts.
Though he didn’t offer exact numbers, Monje told lawmakers that Twitter’s security team is far smaller, given that the company employs only about 3,700 people. But like Facebook, Twitter has about 2 billion users.
Monje said that Twitter’s entire engineering and design team is actively fighting about 4 million malicious, automated accounts each week, twice the number recorded weekly in 2017.
“They keep coming back, and they try different methods to get back on the radar screen,” Monje said, while adding that many fake users brag about their ability to rejoin the network. He called it a “cat and mouse” game that requires Twitter to constantly evolve.
Twitter’s process for dealing with automated and malicious accounts, he said, involves sending messages to the owners to verify that they are human. Twitter, which recorded about $590 million in revenue last quarter, also employs contractors, consultants and academics to focus on this area, he added.
“In order to make progress on this issue, you do need to have humans, and we have former law enforcement,” Monje said. “We have experts.”
Sen. Brian Schatz (D-Hawaii) asked if the American public can expect interference again in the upcoming election.
“We think we’re better prepared for this election than we’ve ever been,” Monje said.
Chairman John Thune (R-S.D.) said in his opening remarks that the companies have a very difficult task: “Preserving the environment of openness upon which their platforms have thrived, while seeking to responsibly manage and thwart the actions of those who would use their services for evil.”
Foreign Policy Research Institute scholar Clint Watts offered lawmakers an assessment of the kinds of threats these fake, malicious accounts pose to American democracy. He argued that the end goal for malicious users, like troll farms in Russia, is to create chaos, not to support one party over the other.
In non-election years, malicious users sound off on social issues to gain an audience, and then in the election year they pivot toward candidates they want to support, he said. After the election, they spread information that is meant to make the American electorate suspicious about democratic institutions and voting systems, he added.
“It’s about destroying democratic institutions and confidence in the U.S. government or democratic institutions to govern properly, that the system is always rigged,” Watts said. “You can’t trust anyone. That’s really the focal point of all those efforts the Russians might run or any authoritarian regime that wants to run a campaign against the U.S. government.”
The Russian troll accounts, he continued, have had a ripple effect on the entire system, as legitimate campaigns now feel the need to resort to the same tactics.
“If there is no sort of regulation put around ads and social media, every political campaign, whether it’s in the U.S. or around the world, will have to use a dark social media campaign through either super PACs or candidates, to keep up with their competitors, and it will not only harm the societies which it is in, but it will actually harm the social media companies and their platforms,” Watts said. “They will actually make the environment so terrible and so polarized, as we’ve seen over the past few years, that it will create just a nasty sense for democracy. … It dilutes the line between fact and fiction, and when that happens, you can’t actually keep a democracy moving forward.”