Zuckerberg Promises Anyone Yanked Off Facebook for Content Will Be Able to Appeal

Facebook's CEO Mark Zuckerberg speaks during the VivaTech trade fair in Paris on May 24, 2018. (Eliot Blondet/Abaca/Sipa via AP Images)

Facebook CEO Mark Zuckerberg told reporters on a conference call Thursday that the social media behemoth isn’t too big to manage, but “you have to hire a big team of experts, you have to hire a lot of people to do enforcement, people to be able to get into the nuances of the decisions, the AI systems, to be able to flag stuff and take it down proactively in some cases, reduce borderline content.”


“And then you also have to deal with the fact that you need to address bias in those systems, and the system is going to make errors, so you need to address that, and then [there] are going to be multiple levels of that,” Zuckerberg said. “And then on top of that, you want to create transparency and you do that in a number of ways. You do transparency reports and calls and engaging people and encouraging academic research. I just think that this is the reality for any platform that is even at a fraction of the size that we are at. Not — these were not new problems [once] we reached a billion people or two billion people.”

The Facebook boss said the company has been making progress on its goal of “getting harmful content off our services more broadly — things like hate, bullying and terrorism.”

He said the company is “building a much more robust appeals process” for those who find themselves suspended from the platform because of content. “We have a team of about 30,000 people today who work hard to enforce our policies. But many of these judgments require some nuance,” he added. “For example, to determine if a post violates our hate speech standards, a reviewer might need to determine if a post is using a racial slur to condemn the slur — or using it to attack someone.”


Content reviewers, though, “make a fair number of mistakes, and how we handle them is very important… our next step is to let anyone appeal a decision on content they reported and get a clear explanation of why it did or did not violate our standards.”

Monika Bickert, a former criminal prosecutor who leads the team developing Facebook’s community standards, said “greater transparency was really important” so users “could understand where we draw these lines and how our policies apply to their own posts or photos.”

“There are so many people and organizations with valuable experience on hard issues — like child exploitation, terrorism, hate speech, bullying — and we really want to learn from them,” she said. “We regularly ask them for advice, including when we think we need to change or update a policy, either because social norms are changing or because we’re seeing a new issue in the world and we want to address it.”

Zuckerberg said the company is “basically going through and rearchitecting a lot of systems, not only Newsfeed but the recommendations that we make for Groups and Pages to make sure that the content is not borderline content, whether that’s misinformation or mean content that doesn’t maybe violate our hate speech or bullying policies but may still be a little bit negative and make people feel not-good.”


“But we don’t want to ban it because it doesn’t cross the line of what we think should be prevented and — from giving people a voice, but we want less of that stuff in the network,” he added. “That’s generally very important.”
