New York Times CEO: Facebook’s Approach to News ‘Profoundly Dangerous’ for Democracy


WASHINGTON – New York Times CEO Mark Thompson called Facebook’s process of defining “broadly trusted” news sources “profoundly dangerous” for the news business and democratic debate.


“We face an immediate threat here which is that Facebook’s catalogue of missteps with data and extreme and hateful content will lead it into a naïve attempt to set itself up as the digital world’s editor-in-chief, prioritizing and presumably downgrading and rejecting content on a survey and data-driven assessment of whether the provider of the content is ‘broadly trusted’ or not,” Thompson said recently at the Open Markets Institute conference.

“Now, you might expect The New York Times to favor such a scheme. Indeed, Mark Zuckerberg, whose idea this seems to be, told us the Times should expect to do well in such a ranking. In fact, we regard the concept of ‘broadly trusted’ as a sinister one, which misunderstands the role journalism plays in an open society and is likely to damage and distort not just the news business but democratic debate,” he added.

Thompson continued, “Democracy depends in part on unbounded competition between different journalistic perspectives and the clash of different judgments and opinions. History suggests that mainstream news organizations frequently get it right, but also that, not infrequently, it is the outliers who should be listened to.”

“At any given moment, think of mainstream media today in Russia, or in continental Europe in the ’20s and ’30s – a majority of the public may judge trustworthiness incorrectly.”


To filter out “fake news” from appearing on its platform, Facebook has started to “prioritize news from publications that the community rates as trustworthy.”

“We surveyed a diverse and representative sample of people using Facebook across the U.S. to gauge their familiarity with, and trust in, various different sources of news. This data will help to inform ranking in News Feed,” read an official Facebook announcement earlier this year. “We’ll start with the U.S. and plan to roll this out internationally in the future.”

Thompson argued that Facebook’s approach is wrong and could lead to more fake news.

“To feed transient majority sentiment about trust back into the editorial decision-making process – and to do it essentially behind closed doors – is profoundly dangerous,” he said. “The process of citizens making up their own mind which news source to believe is messy, and can indeed lead to ‘fake news,’ but to rob them of that ability, and to replace the straightforward accountability of editors and publishers for the news they produce with a centralized trust algorithm will not make democracy healthier but damage it further.”

Thompson said he is in favor of “full transparency about both algorithmic and human editorial selection” done by digital platforms such as Facebook and Google.


“It would be best if this were done voluntarily, but even if it requires regulation or legislation, it must be done and done promptly,” he said.

Robert Thomson, chief executive of News Corp, also criticized Facebook’s approach to determining trusted publishers.

“In the midst of the modern morass, Facebook has made a contribution by highlighting the importance of ‘trusted publishers,’ but who is to judge trustworthiness?” he asked. “The very citing of ‘trusted publishers’ reinforces the rightness of Facebook paying premium publishers and premium journalists for the reputational and experiential services they provide.”

Thomson pitched the idea of creating an “Algorithm Review Board” to hold “omnipotent algorithms to account” and bring more “transparency” to digital techniques used at Facebook, Google, YouTube and Amazon.

“If you buy a small bar of chocolate in the U.S., you’ll be told the precise ingredients on the pack and generally how many calories per serving. There will be stark health warnings on even a low-alcohol bottle of beer,” he said. “Clothing labels are often synthetic screeds, in multiple languages, to ensure compliance. And yet the powerful, mind-altering, behavior-shifting, mood-changing algorithms are allowed to work their invisible alchemy on our personalities, on our societies and on our young people.”


Thomson elaborated on what an Algorithm Review Board should look like, explaining that it should consist of experts in related fields.

“Call it an Algorithm Transparency Board, if you like, or, if you must, an Algorithm Altruism Board. It’s obviously important that experts in the related fields preside, not politicians, and that which should be confidential is kept confidential, but that which should change is changed,” he said.

NYT’s Thompson expressed his opposition to Facebook’s new advertising policy, which many other publishers have also criticized lately.

“The depth of Facebook’s lack of understanding of the nature and civic purpose of news was recently revealed by their proposal – somewhat modified after representations from the news industry – to categorize and label journalism used for marketing purposes by publishers as political advocacy, given that both contained political content,” Thompson said.

“This is like arguing that an article about pornography in The New York Times is the same as pornography. Facebook admitted to us that their practical problem was that they were under immense public pressure to label political advocacy, but that their algorithm was unable to tell the difference between advocacy and journalism. This would be the same algorithm, which will soon be given the new task of telling the world which news to trust,” he added.


Facebook has said it will not exempt news publishers from the policy.

Thompson said Facebook’s ad policy demonstrates the danger of “algorithmic control.”

“When it comes to news, Facebook still doesn’t get it. In its efforts to clear up one bad mess, it seems set on joining those who want to blur the line between reality-based journalism and propaganda,” he said.

“But the underlying danger of the agency of editors and public alike being usurped by centralized algorithmic control is present with every digital platform where we do not fully understand how the processes of editorial selection and prioritization take place, which right now means all of them – thus, the urgent need for transparency,” he added.
