News & Politics

Silicon Valley's Futile Search for Utopia Via the 'Perfect Algorithm'

Picture: Zuckerberg and his bodyguard of B-Class voting shares. (AP Photo/J. Scott Applewhite)

I find myself thinking quite a bit about algorithms these days. The word itself derives from the Latinized name of the Persian scholar Muhammad ibn Musa al-Khwarizmi, reputedly the inventor of algebra, who flourished during the Abbasid Caliphate in ninth-century Baghdad. Twelve hundred years later, with the advent of the computer, the age of programming and machine learning dawned, refining and applying al-Khwarizmi's scheme of numeration in conceptual regions he could never have imagined. According to the standard definition, an algorithm is a set of rules determining the nature and order of computer calculations; it is the major component of the search engines that canvass databases cued by keywords.
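To make that standard definition concrete, here is a minimal sketch of the kind of keyword search it describes: a fixed set of rules applied in a fixed order. The documents and the query are invented for illustration; no real search engine is this simple.

```python
# A toy illustration of the textbook definition: an algorithm as a fixed
# set of rules applied in a fixed order. The "documents" and the query
# below are invented examples, not any actual search engine's method.

def keyword_search(documents, keyword):
    """Return documents containing the keyword, ranked by how often it appears."""
    keyword = keyword.lower()
    scored = []
    for doc in documents:              # rule 1: examine every document
        count = doc.lower().split().count(keyword)
        if count > 0:                  # rule 2: keep only the matches
            scored.append((count, doc))
    scored.sort(reverse=True)          # rule 3: order by keyword frequency
    return [doc for _, doc in scored]

docs = [
    "the algorithm sorts the data",
    "algebra predates the algorithm by centuries",
    "an algorithm is a set of rules; the algorithm halts",
]
print(keyword_search(docs, "algorithm"))
```

The point of the sketch is the author's definition itself: nothing here is intelligent; the output is entirely determined by the rules and the order in which they are applied.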

In The Master Algorithm, Pedro Domingos writes: “A programmer—someone who creates algorithms and codes them up — is a minor god, creating universes at will. You could even say that the God of Genesis himself is a programmer.” The serpent in Algorithm Eden is complexity — of space, time and human limitation — creating a world that grows “increasingly fragile.” Data is turned into information and information is transformed into knowledge.

But the situation grows increasingly tangled when we realize that algorithms can reflect a programmer’s ignorance or prejudice or explicit design, and that algorithms can also learn to rewrite themselves, that is, they can also be self-programming, introducing a degree of uncertainty into the original parameters. Knowledge may be skewed, infected by error, and even prey to delusions — a tree whose fruit should not be plucked and eaten.

Similarly, Frank Pasquale in The Black Box Society: The Secret Algorithms That Control Money and Information alerts us to the troubling fact that algorithms, often opaque even to their own programmers, can serve to reinforce social taboos, prejudices and prior assumptions that reflect the unconscious attitudes of the programmers. But these attitudes may also be quite conscious, introducing a propagandistic element into the algorithm. Pasquale writes: “The proprietary algorithms… are immune from scrutiny,” rendering us vulnerable to surveillance, censorship, and coercion masquerading as persuasion, and so “undermining the openness of our society.” He points out that mining data from social media in the hunt for potential malefactors “comes with a high risk of false positives.” We should be aware, too, that it comes with a growing certainty of false negatives. Indeed, given the monopolistic power of the major social media networks, practically all promoting progressivist memes and left-wing politics, such contamination is inevitable.

The principal social networks — Facebook, Google, Twitter, YouTube, Patreon, etc. — all rely on secret algorithms derived from human emotional and ideological input. As Jim Treacher writes at PJ Media, “Tech companies are notorious for their liberal culture. … Worse, tech companies like Facebook, Google, Amazon, and Twitter have relied on the Southern Poverty Law Center (SPLC), a far-left smear factory that brands Conservative and Christian organizations ‘hate groups.’” Clearly, these networks are not simply digital common carriers but a species of political cabal.

Niall Ferguson’s study of network theory in his recently released The Square and the Tower shows us how we are often the victims of the “discrepancy between the ideal and reality,” so that in “mak[ing] the world more connected” — Facebook’s mission statement — these networks may actually have made the world more susceptible to manipulation. As noted, they filter out content deemed “hateful” — that is, unpalatable to the controllers, who routinely censor posts and messages of a conservative stripe on the pretext that they “look like spam” or constitute “hate speech” — and strive to foster “community governance” on a global scale, which is to say, mass control from the top down. In effect, these algorithmic conglomerates have become both the sign and the driver, as Ferguson says, of a world “falling apart.”

In my wackier moments, I like to fantasize that the confusion and thought-programming from which we suffer in the contemporary West is the product of a jihadist conspiracy, orchestrated by a Muslim mastermind named Mohammed al-Gorithm, who has convinced us that Islam is a religion of peace and that those who object are guilty of Islamophobia. Or that a nefarious character whose real name is Al-Gorithm has managed to convince us that the globe is incinerating due to exponential carbon buildup. Of course, the databases relied upon here are subject to prior engineering, the information extracted is corrupted, and the knowledge acquired is filled with error about the world we live in.

On a more serious plane, we need to recognize that our beliefs and actions are increasingly predicated on falsehood. Islam is peaceful, the planet is growing warmer, the seas are rising and polar bears are on the verge of extinction. Ultimately, taking such calculations to their logical extreme, we can assert that gender is a social construction — there are 32 or more sexual morphisms with which we can identify — or that our universities are places of tolerance and free debate or that masculinity is toxic or that multicultural diversity makes us stronger or that socialism is the solution to all our political and economic problems. We create an alternate reality with no relation to actual social, political and physical reality, which, to quote the philosopher Ludwig Wittgenstein in the Tractatus Logico-Philosophicus, is “all that is the case.”

Rather, the algorithms currently operating in the social and political spheres have us chiefly believing and promoting all that is not the case. They have us behaving in ways that must infallibly lead to our demise as rational beings as we become human bots seeking an ever more fraudulent epistemology.

The German philosopher Hans Vaihinger, in his book The Philosophy of ‘As If’ (Die Philosophie des Als Ob), argued that “fictions” that are functionally serviceable, imaginary constructs that work and are “fruitful,” may be regarded as true. Thus we might say there exists an algorithm unlike any of the others, a super algorithm that is, so to speak, plugged into the historical computer. Those who are programming us toward the abyss are not themselves responsible for the catastrophe that awaits. The progs, the feminists, the SJWs, the fetid creatures of the Swamp and the Deep State, a “liberation theologian” anti-pope, media moguls and university officials, billionaire socialists roiling the masses — they are merely elves and puppets, doing the bidding of the Master Programmer. They are the Great Manipulator’s unwitting retainers, busy processing their culturally inspired subroutines that serve a supervening purpose, unaware they are involved in an eschatological mission.

Naturally, they are preoccupied by their own personal, local and temporal concerns, but it is “as if” they are themselves programmed by the super algorithm to bring the civilization of which they are a part to its terminal moment. They do not provide us with algorithms to live by, as Brian Christian and Tom Griffiths suggest in their book of that title, algorithms that “people can borrow for their own lives [to] better understand the errors that we make.” These are algorithms to die by.

As for the Master Programmer whose super algorithm is ineluctable, none can say who or what it is, except to posit a quasi-mystical presence, one of whose names is Entropy, and whose algorithm governs the decline and death of civilizations like ours. Barring Divine intervention, there is no way around this algorithm. There is nothing we can do about it. The rules and commands are fixed. The ostensible villains delivering our ruin are merely its little helpers, nothing more.

Just as human beings are unavoidably prone to cellular programming, so civilizations are subject to the gradual silencing and shutting down of their historical trajectories. Their accounts are eventually blocked. I have no antidote to suggest that can frustrate the algorithmic teleology at work. All we can do is struggle episodically against the Master Programmer’s baneful deputies, if only to defer the inevitable. It’s better to go down tomorrow than today.