Politicians and Technological Complexity

During the 20th century, human institutions operated mostly within a set of givens. Surrounded by what was, in comparison to the 21st, a relatively low level of technology, political leaders acted out their ambitions on a stage little changed in many respects from ancient times. But we are entering the ‘golden age of technology’, according to the NYT, and that calculus has changed. Jennifer Doudna, known primarily for her work on Crispr, the gene-editing Swiss Army knife that has been called “a word processor” for the human genome, says: “I think we’re at an extraordinary time of accelerating discoveries.” Paradoxically, the public mood is pessimistic, not eagerly anticipatory. The NYT puts it down to bad vibes: “The pandemic has exhausted many Americans of medicine, and it has become common to process the last few years as a saga of defeat and failure.” Why is that?

Part of the reason is that order, including the global world order of the late 20th century, is more fragile than disorder in times of rapid change, and the public senses it. The outcomes predicted by models constructed in simpler times fail to eventuate. To paraphrase David Beatty at Jutland, ‘something is wrong with our bloody theories today.’ For example, the ‘rising east, declining west’ scenario is not materializing. Russia fails to conquer Ukraine; sanctions don’t shut down Russia; printing money falls flat; vaccines go disappointingly pfft. What happened to our crystal ball?

“China Slips Into Deflation in Warning Sign for World Economy,” writes the WSJ. “The lifting of Covid-19 pandemic curbs has been followed by an unusual bout of falling consumer prices instead of a surge. … A drop in exports is accelerating, youth unemployment has hit record highs and the housing market is mired in a protracted downturn. China’s predicament stands in contrast to those of the U.S. and other Western countries, where soaring inflation prompted central banks, including the Federal Reserve, to raise interest rates in an effort to cool growth without triggering a recession.”

Washington isn’t doing too great either. “Congratulations, of a perverse sort, to President Biden and his Congressional comrades. The latest budget figures show that they are breaking peacetime, non-crisis records for spending and deficits… The biggest increase in outlays so far this year has been net interest on the soaring federal debt: a rise of $146 billion to $572 billion, or 34%. That interest total is nearly double all corporate tax revenue so far this year of $319 billion.” Like a patient whose tremors keep increasing even as the doses go up, Washington is left with the feeling that the old pills don’t work anymore.

Are we facing a crisis of complexity? With technology creating ever more emergent possibilities while governance simultaneously becomes ever more inept, the ratio of unintended to intended consequences, U/I, rises inexorably, slowly at first, then ever more rapidly. If we let -T represent the negative potential of technology, then as U rises relative to I, the quantity -UT/I becomes progressively more dangerous. Unintended effects begin to dominate calculated intent, and when amplified by puissant technology the result is worse still. This can be called the “crisis of complexity.”
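As an illustration only, the -UT/I metric can be put into a toy model. The growth rates below are invented for the sketch, not measured; the point is only that when unintended consequences (U) and technological power (T) compound faster than intended outcomes (I), the danger index climbs steeply.

```python
# Illustrative toy model of the "crisis of complexity" metric -UT/I:
# danger grows when unintended consequences (U) and technological
# power (T) rise faster than intended, well-understood outcomes (I).
# All numbers here are assumptions for the sketch, not data.

def danger_index(U, T, I):
    """Magnitude of the -UT/I term: unintended effects amplified by tech."""
    return U * T / I

# Hypothetical eras: T compounds each step, U grows faster than I.
eras = [("stone age", 1, 1, 10), ("industrial", 4, 8, 12), ("today", 20, 64, 15)]
for name, U, T, I in eras:
    print(f"{name:>10}: -UT/I magnitude = {danger_index(U, T, I):.1f}")
```

Even in this crude sketch, the index is dominated by the U×T product: modest growth in unintended consequences, multiplied by rapidly growing technological power, swamps the slow improvement in what we actually intend and understand.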

Why does U/I tend to rise with complexity? Two bits of entropy have 4 possible outcomes; the odds of guessing the right value are 1 in 4. But 20 bits of entropy have 2^20, roughly a million, possible values, and guessing has about a one-in-a-million chance of being right. The space of error grows faster than the solution space. Tolstoy knew this; it’s called the Anna Karenina principle. Aristotle’s version of the same idea is: “it is possible to fail in many ways, while to succeed is possible only in one way”.
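The arithmetic behind that intuition fits in a few lines, a toy calculation assuming a single ‘right’ outcome per state space:

```python
# Toy calculation: as entropy (bits) grows linearly, the number of
# possible outcomes grows exponentially, while the number of correct
# outcomes stays fixed at one. The "space of error" is everything else.
for bits in (2, 10, 20, 30):
    states = 2 ** bits       # total possible outcomes
    wrong = states - 1       # all but one are failures
    print(f"{bits:>2} bits: {states:>13,} outcomes, "
          f"{wrong:,} ways to fail, 1 way to succeed")
```

Linear growth in complexity, exponential growth in ways to be wrong: that asymmetry is the whole argument.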

Jim Carrey once dramatized the absurdity of regarding such long odds as “a chance” in the movie Dumb and Dumber. The trouble is, old-school politicians still think they can wager at this table.

When you have only stone-age technology, outcomes are linear and predictable, and -UT/I is a small negative. But as T advances and politicians remain convinced of their omnipotence, things become overleveraged with respect to risk. Today we have the power to dim the sun, change the chemistry of the oceans, control the genetic destiny of humanity, create artificial intelligence and anticipate disease X, possibly because some government lab is already making it. And we will, because those opportunities are too tempting to pass up. A spectre is haunting the global world: the spectre of U/I.

If T must monotonically increase, then the ratio of unintended to intended consequences (U/I) is inevitably magnified. To best manage U/I it is important to: 1) privatize costs; and 2) ensure maximum information transparency. Government involvement in biotechnology, AI, geoengineering, etc., is particularly risky because it 1) subsidizes costs; and 2) throws a blanket of state power and secrecy over untried technologies. Governments will overreach, and that will ultimately lead to disaster.

The psychological root of the problem arises from the subjective perception of risk formed during a simpler era. Even then the danger was not totally absent. Scientists during the Manhattan Project reckoned there was a small chance the Trinity bomb would ignite the atmosphere; the risk was deemed small enough to accept. The fact that you haven’t yet been hit by a car while jaywalking across a freeway doesn’t mean you won’t be. There are many more technological cars coming on today than in 1945.
