The iconic image of the “self-righteous” religious believer is a cliché of postmodernism. Dogmatic clerics in the Middle East smile with satisfaction while they denounce the infidels. Cable TV preachers turn ecstatic eyes to heaven, praising God all the way.
The famously self-righteous Jerry Falwell declared with smirking conviction that the 9/11 atrocities were “divine retribution” for America’s sins.
It’s a notable mien among true believers of any faith: a self-contented look of homemade rapture that reeks suspiciously of George Clooney’s Southparkian smugness.
But now I get it. Smug actors and smiling preachers are brainsoothing.
In a new book called God’s Brain, anthropologist Lionel Tiger and co-author Michael McGuire provide the evolutionary and neurochemical narrative that may explain something of Rev. Falwell’s bliss.
The story goes like this: About one hundred and fifty thousand years ago, some unwashed human sat at the mouth of a cave on a moonless night and peered for a long time into the uncertain darkness. Listening intently, and no doubt worn from the mortal losses of everyday living, he (or perhaps she) made a profound realization.
Death — that took your mother, your father, your children, your mate, and everyone else in the whole hard world — would someday come for you, too.
The thought unfolded in a wave of neurochemical changes passing through the posterior medial frontal cortex, the region where the human brain manages decision-making and cognitive uncertainty.
A depressing thought, to be sure. And, as this was the first person ever to consider it, it made him quite lonely and sad.
In fact, the thought might have killed him much sooner than if he had never had it in the first place. If it lingered, such a hopeless and depressing idea might actually cause harm. The neurotransmitter norepinephrine would be affected. Brain serotonin levels would drop. There is a good likelihood that a concomitant loss of social status would reinforce the depressive state.
It’s easy to imagine how this sort of phenomenological realization might actually lead to personal extinction. The dimmed prospects for reproduction alone suggest nature would select against such a cognitive insight.
But, as the authors point out, the brain is unhappy with ambiguities and uncertainties. It likes to know what lurks in the tall grass. And also beneath the still waters of death.
“The prospect for complete nothingness after death appears to be bewildering and unendurable to many people,” write the authors. “So an antidote arises.”
The antidote is religion: a comprehensible human narrative meant to “fill in the incompleteness of experience.”
For the authors, religion — or more precisely, the reinforcing triad of religious belief, socialization, and ritual — is the cure to the disease of cold enlightenment.
“At least 80% of the adult world professes religious affiliation and a large proportion of these people actually engage in observable religious behavior,” they write, citing behaviors like prayer, church attendance, wearing a cross or other religious insignia, declaring membership in a church or mosque, or any time spent visualizing the afterlife.
Here are the facts: about 2.1 billion human beings are self-identified Christians, 1.5 billion Muslims, and billions more adherents to the other 4,200 cataloged religious groups on Earth. From art and architecture to governance and war, religion has been a driver of human behavior across all societies throughout history, right up to our current era of tottering postmodernism.
The authors warn early that we “need both a zoom lens and a microscope to see religion,” and then proceed to reduce human behavior, and the mind itself, to a consequence of biochemistry.
Leaving aside any direct criticism of religion, which the authors are scrupulous to avoid, it is nevertheless clear that Tiger and McGuire come down firmly on the side of the secular, even to the point of its defense: the human brain “imagines and believes things for which there is no hard evidence. What else could produce such astonishing ideas as the existence of life in other galaxies, gods, a designer of life on earth, animals with human motivations and personalities, an afterlife, hell, heaven, witches, demons, angels, and the certified sin of pride?”
But here is where the authors commit their own certified sin of reductionism. Should readers believe the premise that religiosity is entirely reducible to an evolutionary survival trick? By this standard, virtually any human institution — including science — must also be reducible to the level of a neurochemical event.
Which takes us away from anthropology and into the realm of physics.
The Greek philosopher Leucippus is widely credited with developing the first theory of atomism. In his vision, there were two essential states: solid, indivisible atoms and the empty void between them. As it turns out, his idea, for which there could be no scientific proof for another 2,500 years, was eerily close to the truth.
Leucippus contemplated atoms, while Moses contemplated God. If atoms can be discovered first in the mind of a man, perhaps God can be as well.
But the authors offer no such succor to the symbol-starved bipeds of the postmodern world. God’s Brain is over-populated with trees, but contains no discernible forest.
God’s Brain is a fun read. Tiger and McGuire have done a fine job of presenting the materialist facts of neurochemistry, and the curious case of religiosity as a survival technique. I recommend the book for its powerful microscope, but don’t expect to find a telescope here.