Leaks, Lawsuits, and Government Likely to Reveal Facebook Algorithm, Cyber Lawyer Says
Whenever conservatives on Facebook find their posts censored as "hate speech" or their accounts temporarily blocked, the social media company apologizes and points to its algorithm, rather than a human being, as the cause. This allows the company to reject claims of targeting, scapegoating a secret algorithm rather than addressing what many conservatives fear is a pervasive pattern of bias and censorship.
This week, tech companies like Facebook, Google, and Twitter have come under more scrutiny, with President Trump tweeting about alleged censorship of conservatives and employees inside Facebook complaining about anti-conservative bias within the company. A cybersecurity lawyer told PJ Media that Facebook will not be able to keep its algorithm secret, as government intrusion, private lawsuits, and internal leaks will eventually reveal whether or not the social media platform systematically censors conservatives.
"I'm not sure how Facebook's algorithms become public, but there's no doubt that greater transparency in how those algorithms are written would allow a more intelligent discussion about what's happening," the cybersecurity lawyer told PJ Media. He predicted that the algorithm may become public in any combination of four different ways.
"If the algorithms aren't discoverable, you'll see groups arising to pool the data available to them to reverse engineer what Facebook is doing," the lawyer predicted. "One of the most effective of these groups may be political groups that want to buy ads to targeted readers. How does Facebook meet that demand without revealing what it knows, and how does Facebook vet the 'authenticity' of those groups beyond asking the simple question of whether they have the ability to pay?"
Facebook's revenue depends largely on selling advertising to these groups, and political advertising is a very large market for the social media platform. If those advertisers work together and go public about how Facebook's process works, that can reveal a great deal.
Other methods of discovering the algorithm would prove more intrusive, however.
"You'll also see efforts to force confidential disclosures within protected settings — within litigation (brought by shareholders, users and advertisers), and government regulators — that will provide some visibility," the lawyer predicted. He suggested the social media giant has a rocky road ahead, with government considering whether to regulate social media as a public utility and with increasingly dissatisfied customers and advertisers.
"I think Facebook will face continued assaults from government regulators over whether it is a common carrier, as well as from private lawsuits over breach of contract, and simply from lost revenue as advertisers re-examine the value of eyeballs," the lawyer added. Facebook is already facing dozens of lawsuits.
Conservatives concerned about censorship suggest either that social media platforms are effectively monopolies and should be considered public utilities, or that companies like Facebook are effectively violating their terms of service, a much more limited strategy that would not open up the huge can of worms that is government regulation.
The lawyer warned that "imposing government obligations on social media to monitor content will make it impossible for new market participants to get a foothold, as they will lack the resources to hire the 25,000 censors that Facebook has." As of November 2017, Facebook employed roughly 25,000 people in total. The exact size of the content-monitoring team is unknown, though the company dedicated 3,000 more staffers to the effort last March.
Whether government regulators or private lawsuits end up challenging social media companies, either avenue could help make the algorithm public. But the lawyer's final suggestion of how the algorithm will become public seems the most likely.
"You'll also see insiders leaking information," he predicted. "With 25,000 censors, it will be impossible to keep the secret sauce secret for long."
The cybersecurity lawyer suggested that at the end of the day, "Facebook's attempt to monitor content is a very heavy lift, and one that's likely to fail long term."
He suggested a new system based on zoning or reputation. "The better solution is for Facebook and anyone providing online forums to disclaim responsibility to monitor content or at most to create zoning rules that categorize content based on broad categories so that readers can avoid the zones they don't wish to enter — either by type of content or by age appropriateness."
"They should also empower individual users to block others based on whatever criteria they want to apply, including a reputation criterion," the lawyer suggested. "The failure to create a reputational system that participants value is the primary reason why social media is so unpleasant today."
He suggested that "while we can dispute how such a concept is implemented — through private contract or through government coercion — it seems obvious that reputation matters and that many of the problems on the web are caused by the fact that folks have incentives to speak who would not speak (or would speak differently) if they knew their reputation would be affected."
Were some sort of reputation system established, it would severely curtail the threat of "trolling" and cyberbullying. Such a system must be as hands-off as possible, however.
"It's also true that we err by forgetting the average consumer is far smarter than we give them credit for ... and if they aren't smart, there's no cure for stupidity like letting them make a mistake and suffer the consequences," the lawyer suggested. "We are trying too hard to make the entire Internet a safe space."
Indeed, it seems tech companies are working overtime to make the Internet a "safe space," but not a safe space for conservatives. Last week, Facebook took down posts sharing mainstream conservative articles from authors Salena Zito and Jenna Lynn Ellis, saying they "look like spam."
Facebook has repeatedly suspended Christian scholar Robert Gagnon for "hate speech" when he posts about the Bible's vision of marriage and sexuality. In April, the platform suspended a German history professor for saying that Islam is not a part of German history. Facebook also "shadow banned" the conservative video nonprofit Prager University, preventing at least nine PragerU posts from reaching any of its 3 million followers and deleting PragerU's videos.
Tech companies are notorious for their liberal culture. A survey early this year found that conservative employees in Silicon Valley tech companies live in fear that their political beliefs will be found out. James Damore, a senior software engineer fired from Google, said conservatives at Google are "in the closet" and that Google executives are digging through a secret mailing list in order to out them.
Worse, tech companies like Facebook, Google, Amazon, and Twitter have relied on the Southern Poverty Law Center (SPLC), a far-left smear factory that brands conservative and Christian organizations "hate groups," listing them along with the Ku Klux Klan. In June, the SPLC had to pay $3.375 million to settle a defamation lawsuit against a Muslim reformer. About 60 organizations are considering separate defamation lawsuits against the group.
Amazon.com has used the SPLC "hate group" list to deny access to its charity program, Amazon Smile. The company exiled D. James Kennedy Ministries and Alliance Defending Freedom (ADF), a Christian legal nonprofit that has won nine Supreme Court cases in seven years. D. James Kennedy Ministries is suing Amazon and the SPLC over this action.
More recently, the SPLC-designated "hate group" Jihad Watch and its founder Robert Spencer were de-platformed by Patreon, and then the crowdfunding site GoFundMe effectively stole thousands from Spencer.
When Trump tweeted about the possibility that Google's algorithm is biased against conservatives, CNN's Chris Cillizza dismissed this idea as a "conspiracy theory." If PragerU, James Damore, Salena Zito, D. James Kennedy Ministries, and the SPLC did not exist, he'd have a point. As it stands, the evidence warrants further investigation.
If the cybersecurity lawyer who spoke with PJ Media is correct, Facebook's algorithm may be headed for the light of day. (Let's hope government intrusion isn't the way it happens...) This would help clear up exactly how the social media giant filters information, and whether or not this "conspiracy theory" is in fact the true story.
Follow the author of this article on Twitter at @Tyler2ONeil.