Our grandfathers ran around as children playing cowboys and Indians. Our fathers played cops and robbers. In the digital age, we have video game iterations of the same dichotomy like Counter-Strike, a classic and frequently remade title featuring frantic objective-based gunplay between terrorists and the counterforces employed to stop them.
A mainstay of masculine entertainment, the terrorist stands in place of the generic black-hatted villain of yesteryear, all but tying damsels to railroad tracks. As antagonists go, terrorists come readymade, requiring little to no explanation for their menace. They hail from somewhere exotic, believe something bizarre, and destroy as a means to their chosen end. Often, we don’t even care what fuels their violence so long as we get to shoot back. Thinking back on terrorist films I’ve watched multiple times, like True Lies or Air Force One, I couldn’t tell you exactly why the bad guys were bad or what they hoped to accomplish. It didn’t really matter. They were there to rally our hate and earn a satisfying death at the hands of our hero.
For a moment, the reality of September 11, 2001 and the subsequent ‘War on Terror’ paused all of that. Suddenly, terrorists weren’t to be taken lightly as make-believe villains. What fueled their violence became a matter of grave consequence. No matter our political perspective, how we thought of terrorists changed dramatically.
For the Left, certainly during the Bush years, the terrorist became the pitiable personification of American imperialism, the sins of a nation come home to roost. For the neoconservative faction of the Right, as institutionalized by the Bush administration and its supporting organizations, the terrorist became the next advent of the Evil Empire, a virulent boogeyman lurking around every corner much like the Cold War spy before him.
Check out Walter’s previous articles in this ongoing series, published Thursday mornings at PJ Lifestyle, exploring video games, cultural villains, and American values. From May 2: “Beating Back the Nazi ‘Sickness’” and last week: What Zombies Teach Us About Human Nature. And also see Walter’s A Reason For Faith series, reprinted last week here. In these four articles Walter begins to formalize his task of synthesizing the Judeo-Christian tradition with Ayn Rand’s Objectivism and Tea Party activism. - DMS
In one of the most vivid dreams I can recall, I witnessed the landing of a plainly alien spaceship. It came lucidly, dancing on the edge of wakefulness, informed by enough of my rousing consciousness that it felt particularly real. I remember the feeling that my feet were glued to the ground, that I couldn’t move if I wanted to, not on account of some external force, but due to an overwhelming sense of awe and anticipation. The one thought dominating my mind: everything is about to change.
Though it was only a dream, I retain the memory as vividly as though it were of an actual experience and believe I will respond similarly if ever confronted by a true interplanetary delegation. Something about that kind of moment, when the veil lifts upon an existential mystery, produces an irresistible thrill. Perhaps that tops the list of reasons why our popular culture remains ever fascinated by the prospect of extraterrestrial life.
Aliens have become such a prolific device in our entertainment that we sometimes take them for granted. Like a modern deus ex machina, aliens can be relied upon to suspend disbelief in an otherwise inconceivable scenario. (How does Superman fly? Simple, he’s an alien!) Extraterrestrials rank alongside Nazis, zombies, and generic terrorists as the most common villains found in video games. Unlike those others, however, aliens may also be allies. Nothing inherent to extraterrestrial life demands it be villainous. Beings from other worlds often act as mirrors for examining the human condition, when not merely lurking among shadow and neon strobe.
It’s probably no coincidence that the advent of ufology, which is an actual word in the dictionary meaning the study of unidentified flying objects, coincides with the initial proliferation of aviation and the early years of the space age. We began to look up into the sky right about the time we realized there was nothing left to find over the horizon. In times past, when the known world was still defined by the flickering edge of torchlight, we imagined unspeakable monsters much closer to home. Spirits, ghosts, goblins, ghouls, fairies, vampires, all were the alien invaders and abductors of their time. As we have come to dismiss them as implausible and childish, our imagination turns to the stars, where the realm of possibility remains seemingly infinite.
Certainly, we can see how aliens have stepped in to fill the role of menacing ghoul. Ridley Scott’s original Alien was essentially a horror film, a science fiction creature feature. While the execution was masterful, the formula proved well-established and has been revisited ever since.
In the coming years my friend Walter Hudson is going to emerge as one of his generation’s most effective, engaging voices fighting on behalf of freedom and American values. It’s been a great joy to work with Walter and see him continue to explore a variety of different subjects and styles. He’s proven himself as one of my most reliable regular writers, turning in polished, well-thought-out pieces each week that challenge and entertain. I’m convinced that someday everyone else will come to the conclusion that I have: he’s his generation’s equivalent of Dennis Prager — a welcoming, accessible, but still challenging, honest voice, capable of changing hearts and minds simultaneously. And he’s a Tea Party activist out in the grassroots doing work in his own state and community.
I’ll highlight some of Walter’s most engaging articles in several free miniature e-book collections here at PJ Lifestyle in the future. So far, I plan to bring together some of his writings on video games, race, Good and Evil, popular culture, and the joys of capitalism. But first, I would like to begin showcasing Walter’s talent with this collection of four articles he wrote during February on a mission that he and I pursue together: the attempt to reconcile two warring philosophies and their activist movements, the Judeo-Christian tradition and Ayn Rand’s Objectivism. The battle between secular radicals and religious fundamentalists is a false one. We can be both Bible-based people of faith and reason-minded science enthusiasts. Walter makes the case in an invigorating, compelling way, and I invite everyone to dive into his engaging arguments.
Below you can click to see the original articles and the spirited debate they produced or jump to the articles in this collection. The pieces in this compilation feature new editorial afterwords by me.
- David Swindle, PJ Lifestyle Editor
First published February 7, 2013:
Objectivist philosopher Andrew Bernstein debates Judeo-Christian apologist Dinesh D’Souza.
First published February 14, 2013:
Objectivist philosopher Andrew Bernstein accused Christianity of rejecting reason in his recent debate with apologist Dinesh D’Souza.
First published February 21, 2013:
As a dialogue begins between advocates of Ayn Rand’s objectivist philosophy and professing Christians, it’s vitally important to clarify terms.
First published February 28, 2013:
Adherents of Ayn Rand and followers of Jesus Christ must set aside differences to secure individual rights.
Last week’s article: Beating Back the Nazi “Sickness”
Zombies are all the rage these days. AMC’s The Walking Dead reigns as the top-watched drama on basic cable. Films like Warm Bodies, Zombieland, and I Am Legend stand out among recent entries in an enduring horror subgenre. None other than Brad Pitt will headline this year’s World War Z, which looks to amp up its action well beyond the shuffling flesh-eaters of yesteryear.
That’s to say nothing of video games, where the undead continue to suck cash from willing gamers anxious to live out an apocalyptic fantasy. Whether it’s Resident Evil, Left 4 Dead, or downloadable add-ons to Call of Duty, zombie hordes batter down the doors of our collective consciousness. What exactly makes them so popular?
Like the Nazis we considered last week, zombies provide guilt-free slaughter. No one feels bad about shooting something that’s already dead. Plus, because zombies were once living human beings, they provide a cathartic release for that deeply suppressed homicidal impulse none of us want to admit to harboring.
Zombies are amoral. They have no agenda, no emotional motivation, no plan. They simply menace. So putting them down presents no moral dilemma. What would be murder were they living becomes a wholly defensible act of survival. The very nature of a zombie marks it for destruction. Since it has no feelings and endures no torment, the acceptable methods for disposing of a zombie are bound only by the imagination of the killer. So zombies enable creative guilt-free violence on a scale limited only by their numbers.
Zombies also serve an adaptive narrative purpose in storytelling. While they more often than not simply lurk around the corner as boogeymen, the nature of a zombie can be tweaked to represent certain themes. In George Romero’s 1968 classic Night of the Living Dead, the film which birthed the modern undead flesh-eater, zombies were implied to be the fulfillment of biblical revelation. Writing for The Washington Post, commentator Christopher Moreman expounds:
The zombie apocalypse is often equated with the wrath of God and biblical end times. Though the origins of zombie outbreaks usually remain indeterminate in the genre, most zombie narratives indicate that we brought this upon ourselves. Whether corporations, the government, or the military are to blame, the average person also bears fault for participating in a corrupt system, just as the people of Sodom and Gomorrah were collectively responsible for God’s wrath.
Romero’s 1978 Dawn of the Dead took the theme a step further, assigning a decidedly anti-capitalist overtone to the narrative. The undead converged upon a shopping mall, retracing the routines of their former lives.
Before the Call of Duty franchise took on the subtitle Modern Warfare, it arguably reigned as the pinnacle of the World War II genre. While other first-person shooter games like those in the popular Tom Clancy series — including hit franchises like Ghost Recon and Rainbow Six — offered players the ability to engage in simulated modern warfare, for much of video game history the default setting for a run-and-gun, first-person shooter was World War II.
Many factors contributed to the period’s popularity as a setting for video-game violence. Chief among them march the jackbooted villains of the era, the Nazis. No one feels bad after shooting a Nazi. In fact, their evil proves so incontestable and absolute that killing them fulfills a profound sense of justice. No doubt that moral certitude contributed to their proliferation throughout gaming. Killing Nazis invites no controversy, leaving game developers with one less thing to worry about.
While the nature of Nazi evil may seem self-evident, the recent 20th anniversary of the Holocaust Memorial Museum in Washington, D.C., provided an occasion to demonstrate that even former United States presidents can miss the mark. The local CBS affiliate reports:
Washington has many monuments and memorials that offer something special for visitors from around the world, “but the Holocaust memorial will be our conscience,” [President] Clinton said.
Since the museum opened 20 years ago, the world has made huge scientific discoveries, including the sequencing of the human genome, which proved humans are 99.5 percent genetically the same, Clinton said.
“Every non age-related difference … is contained in one half of 1 percent of our genetic makeup, but every one of us spends too much time on that half a percent,” Clinton said. “That makes us vulnerable to the fever, the sickness that the Nazis gave to the Germans. That sickness is very alive across the world today.”
The report does not include any specific examples of what Clinton diagnoses as the Nazi “sickness.” However, we may fairly assume he was referring to any intolerance of human diversity.
Gun control emerged as the primary political battlefront in the wake of the horrific Sandy Hook murders. While the battle to retain our Second Amendment rights remains a superior consideration, statist nannies push on other fronts as well.
A former writer for the Huffington Post, Peter Brown Hoffmeister, claims to have broken ties with the publication after its refusal to publish a piece he submitted regarding the influence of violent video games on troubled teenage males. Self-publishing on his personal blog with the provocative title “On School Shooters – The Huffington Post Doesn’t Want You To Read This,” Hoffmeister reveals his own troubled past while building a case against certain games.
As a teacher, I’ve spent a lot of time this past week [December 27, 2012] thinking about the Newtown shooting, school shootings in general, their causes and possible preventions.
It’s scary now to think that I ever had anything in common with school shooters. I don’t enjoy admitting that. But I did have a lot in common with them. I was angry, had access to guns, felt ostracized, and didn’t make friends easily. I engaged in violence and wrote about killing people in my notes to peers.
But there is one significant difference between me at 16 and 17 years of age and most high school shooters: I didn’t play violent video games.
But Jeff Weise did. He played thousands of first-person shooter hours before he shot and killed nine people at and near his Red Lake, Minn., school, before killing himself.
And according to neighbors and friends, Clackamas shooter Jacob Tyler Roberts played a lot of video games before he armed himself with a semi-automatic AR-15 and went on a rampage at the Clackamas Town Center in Portland, Oregon last week.
Also, by now, it is common knowledge that Adam Lanza, who murdered 20 children and six women in video-game style, spent many, many hours playing “Call of Duty.” In essence, Lanza – and all of these shooters – practiced on-screen to prepare for shooting in real-life.
Hoffmeister ends his retrospective with a call for government action. He encourages readers to “support the bill introduced… by U.S. Senator Jay Rockefeller, directing the National Academy of Sciences to examine whether violent games and programs lead children to act aggressively.”
Some years ago, while working as a contract security professional for a company I will not name in a Midwest town I will not specify, I was taken aback upon learning that a particular client site was, in effect, a time bomb. The industrial facility lay at the heart of an urban center, unprotected by so much as a fence, within a short stroll of the nearest residence. On the premises were a number of chemical storage tanks, the contents of which I was told were so volatile that a properly configured explosion could result in devastation across state lines. Yet there it sat in the open, protected far more by its inconspicuousness than by any active security effort.
Such vulnerabilities are legion, cloaked in a shroud of public ignorance, protected by the fact that few know they exist or precisely how to exploit them. It is only when someone finally does the unthinkable that a particular vulnerability rises in profile and is taken more seriously. In retrospect, should not all cockpit doors always have been locked? It seems a sensible precaution, yet it took the attacks of September 11, 2001, to prompt the policy.
So it always is in the realm of security. While we may be tempted to blame policy makers or responders rather than the perpetrators of criminal or terrorist acts, we must first pause to recognize that security precautions are not guarantees, but exercises in risk mitigation. Like insurance, security measures serve to minimize potentially costly probabilities. And like insurance, the more risks we mitigate, the more expensive and inconvenient the premiums we pay.
Another client from my contract security past told me that his company went years without employing professional protective services until a late-night fire nearly destroyed their facility, inflicting a cost running into seven figures in property damage and lost productivity. The cost of hiring an overnight guard to monitor the facility was a drop in the bucket by comparison. Yet the company only perceived the value of a guard after the fire.
Spoiler Warning: Bioshock Infinite cannot be properly analyzed without revealing the details of its plot. If you plan to play it, or haven’t finished it, consider whether you wish to read further.
This may seem an odd way to start an analysis of a video game. But bear with me.
I was not always a Christian. There was a period of my life during which I searched for truth, trying to discern medicine from snake oil. One of the most compelling observations which led to the development of my Christian faith was the unique economy of sin presented in the Bible.
While many people believe that human beings are inherently good, an honest assessment of one’s own thoughts, along with cursory observation of even the youngest child, reveals that human beings are actually quite wicked. Not only are we bad, we like ourselves that way. Indeed, the notion that we are inherently good lowers the moral bar to the status quo, as if this life lived this way with all its horrors and violations were some kind of ideal.
Christianity stands unique among worldviews in not only acknowledging our congenital moral defect, but also in explaining how we contracted it while offering a cure. Other faiths tend to regard sin as some form of moral debit which can be offset by good deeds. Becoming a Christian requires acknowledging that the debt accrued through sin can never be paid by the sinner. Instead, the believer trusts in the atoning death of Christ, pointing to Him as the settler of accounts. Such faith proves difficult, both because we tend to deny our own wickedness and because we prefer to think we can overcome deficiencies on our own.
Surprisingly, this economy of sin proves quite relevant to an analysis of Irrational Games’ hot new shooter set in the skies above 1912 America, Bioshock Infinite. Redemption runs as a prominent theme throughout the experience, presented in various forms which tend to prove false. Protagonist Booker DeWitt, a former Pinkerton man and player avatar, seeks the seemingly simple redemption of a financial debt to a dangerous creditor. Antagonist Zachary Comstock, head prophet of a xenophobic cult, offers his followers redemption from “the Sodom below” within the floating city of Columbia. Daisy Fitzroy, leader of the leftist Vox Populi, offers her followers redemption from the tyranny of Comstock through militant revolution. Player companion and surprisingly able damsel Elizabeth begins as an innocent who comes to realize her own peculiar need for a second chance.
Political activists have a saying: when you’re explaining, you’re losing. The same could be said of business. When you have to explain to prospective customers why they need your latest innovation, when the product does not sell itself through mere presentation, you probably have a dud.
Such may be the case with the latest iteration of home console hardware from Nintendo, the Wii U. iDigitalTimes reports:
Wii U sales are bad now, but it’s not the end of the world, according to Shigeru Miyamoto, who hopes that people will just give the Wii U some time to breathe before coming to a final conclusion about its worth. The console launched in November 2012, to huge initial sales and a quick decline, followed by slow and modest sales thereafter and predictions of doom and gloom from every quarter. Nintendo would leave the hardware business. It would go out of business altogether. It would go handheld only. Miyamoto thinks that’s all nonsense. We just need to give Wii U some time.
Miyamoto, a legend in the industry responsible for the creation of Nintendo’s hallmark Mario and Zelda franchises, goes on to explain how the Wii U represents an incredible innovation in gaming much like the handheld Nintendo DS did before it. Whether gamers at large come to realize they’ve been cheated all these years by the limitation of a single gaming screen, time will tell. Meanwhile, here are 6 horrible choices dragging down Nintendo.
New boss, same as the old boss. So gamers may come to regard Disney since its acquisition of the Lucasfilm family of companies, including video game developer LucasArts. Sitting on a rich catalog of intellectual properties including Star Wars and Indiana Jones, LucasArts should be at the forefront of the gaming community. At times, they have been. But recent years have left much to be desired.
The pairing of Disney’s acquisition with the looming transition to a new generation of gaming consoles presents an ideal opportunity to reinvigorate the brand. In a way, the lull in development from LucasArts in the past several years sets the stage for an all-the-more-impressive breakout. Here are 5 Star Wars games which need to get made already:
5) Remastered X-Wing Series
Steam, a project pioneered by game developer Valve, led the way in abandoning discs in favor of digital distribution. Now an established marketplace for titles from a variety of developers, Steam welcomes players with the latest new releases and a catalog of retro titles, many of which can no longer be played through conventional means.
As one example, Steam offers a large collection from LucasArts, including the Jedi Knight series, some classic Indiana Jones adventures, and the first and second Knights of the Old Republic role-playing epics. However, one franchise is conspicuously missing from the developer’s catalog, the X-Wing series of space combat simulators.
X-Wing, Tie Fighter, X-Wing vs. Tie Fighter, and X-Wing Alliance were once sold as a collection on CD-ROM. Each entry offered a compelling combat experience more akin to a flight simulator than an arcade game. Players had full control over the minutiae of their spacecraft, able to direct energy between shields, weapons, and engines, all while targeting enemy subsystems and approaching missions creatively. The series was enormously popular, inspiring a major expansion to the Star Wars Galaxies online experience which offered similar gameplay.
For each passing day that the X-Wing series remains unavailable on Steam, a LucasArts executive should be fired. Releasing these games as digital downloads is an absolute no-brainer. Practically effortless aside from some paper pushing among lawyers, the move would provide LucasArts (and parent company Disney) with profit-bearing revenue on day one. That said, the opportunity exists to remaster these classic titles with updated graphics and modern network capabilities. There’s an entire generation of gamers who have never had the pleasure of experiencing X-Wing. Updated versions of these bar-setting titles would fly off the virtual shelf.
Life does not come with a reset button. That truth struck me whenever I glimpsed the face of my Nintendo Entertainment System. Reset was always there, lurking next to Power, ready to erase both my sins and the virtual world in which they had been committed. A fresh start, another try, Reset offered them free.
Moments like that, moments where some shadow of philosophical truth peeked through the veil of this childish pastime, came often over the years. The most recent occurred while I was playing Fable II on my Xbox 360. Set in a fantasy world with swords, sorcery, and muskets, the Fable series contains many game mechanics above and beyond the traditional hack-and-slash quest. Among them is the ability to purchase real estate and manage rental property, which maintains a steady stream of gold for upgrading weapons and other items. As I purchased one property and saved up to invest in another and yet another, I quickly realized I was mimicking a truly productive task. Why can’t I do this in real life? Oh yeah, I don’t have any money to start.
The experience of the game inspired me to revisit methods for creating wealth and fostering upward mobility. I won’t go so far as to say Fable II changed my life. After all, I’ve yet to buy that first investment property. However, it did plant a seed which may someday germinate.
Other games have offered real life lessons in ways both subtle and overt. Here are 7 for your consideration.
Bioshock Infinite releases next Tuesday, March 26. A highly anticipated prequel to one of the most widely acclaimed video games in history, the title stands poised to awe not only with inspiring visuals and thrilling gameplay, but with a controversial critique of American Exceptionalism.
Film critic Roger Ebert earned the ire of gamers a few years ago when he ruled declaratively that video games can never be art. Emerging from the resulting swarm of agitated youth, Ebert later relented slightly, if only to admit that he really ought to experience video games before banishing them from the realm of artistic consideration.
An intriguing debate regarding what makes a thing art is woven through both of Ebert’s pieces linked above. However, the argument may be moot. It seems fair to say that when a craft begins to express complex ideas regarding the human condition, when it begins to stimulate thought and debate on matters of genuine import in the real world, when it can affect how you think about issues and what you believe about your world, it achieves the status of art.
By that standard, the video game industry has produced a bounty of artistic titles amidst a sea of thoughtless cookie-cutter fare. Of course, this makes video games no different than any creative medium. There exist far more vulgar scratches on bathroom stalls than masterpieces hung in museums, far more trashy romance novels than genuine epics, and certainly more popcorn flicks and action movies than truly inspirational films.
Like any medium, games can evoke powerful emotions and make compelling philosophical statements. The element of interactivity can heighten such moments beyond the experience of a novel, painting, or film. You are no longer a mere observer; what happens in a game happens to you. The world of the game and the characters which inhabit it change, live, and die according to the choices you make.
The inherent power of the medium proves all the more reason to treat it seriously as an influential artistic form. Therefore, as Bioshock Infinite makes its case against the notion of American Exceptionalism, we do well to pay attention and respond.
Last week brought to light a likely Democratic challenge to Senate Minority Leader Mitch McConnell from actress-turned-politico Ashley Judd. The Daily Caller responded to the news with a jab at Judd’s character, pointing out how often she has been nude throughout her film career. This triggered a firestorm of indignation on the Left, with writers from The Raw Story, Salon, and Mother Jones among others lambasting conservative prudery.
While the Left’s objection appears to be informed by sexual licentiousness and a general obligation to feign offense at any suggestion of modesty as virtue, a legitimate critique can be made of the attempt to marginalize Judd’s candidacy. In several ways worth noting, making an issue of Judd’s on-screen nudity is a mistake.
First, let us concede that we live in the year 2013 amidst a generation separated from past chastity by a great cultural and technological divide. Naked women are not as shocking as they used to be, assuming they ever actually were. Granted, a higher-than-average standard ought to be applied to candidates for public office, and certainly to candidates for U.S. Senate. However, context matters. Judd acted in mainstream films. It’s not as though she made her career in pornography.
Activists on the Right ought to hold greater concern for the circumstances which make Judd’s potential candidacy viable. We live in a political culture where celebrity proves increasingly valuable. One of the greatest hurdles facing campaigns at any level is name recognition. If voters don’t know who a candidate is, they aren’t as inclined to vote for them. The campus paper for Vanderbilt University in Nashville, Tennessee, sources the work of political scientists in that area:
[Cindy] Kam and [Elizabeth] Zechmeister have shown, in a paper currently under consideration for publication, that brief exposure to a candidate’s name increases voter support by 13 percent, if voters know nothing else about the candidates.
No one should be shocked to learn that campaigns grow more expensive each cycle.
My conservatism caught me by surprise.
Raised in the peculiar isolation of Jehovah’s Witnesses by a white mother and a black father, I found politics as elusive as birthday celebrations and gifts on Christmas morning (prohibited by JW theology). In elementary school, as other children would cover their hearts and recite the Pledge of Allegiance, I stood silent with my hands at my side. Participation in the political system of men was a betrayal of the kingdom of God, or so I had been taught. I therefore had little frame of reference for, or interest in, the political discourse.
I thus came into middle school ripe for indoctrination. My first impression of the major political parties was imprinted by a social studies teacher who explained as a matter of fact that Republicans were the party of the rich and powerful while Democrats were the party of the little guy. That settled it. Lacking in wealth and power as I was, if I was ever to be political, I was clearly to be a Democrat. Thus guided, I dutifully cast my ballot in the mock election of 1992 for the well-coifed champion of we little people – Bill Clinton.
In the years that followed, something happened which my teachers did not intend. I enrolled in my state’s postsecondary enrollment options program, and came to spend half the day at a local community college. My schedule was such that I drove between my high school and the college right when a certain talk radio personality took to the air. In a way, listening to Rush Limbaugh proved a form of youthful rebellion. My curiosity was aroused by leftist characterizations of the man as a bigoted hate-monger. Surely, listening to the rantings of a modern-day Klansman would prove entertaining.
You can fill in the rest of the story. What Limbaugh had to say on those daily drives to college proved more enlightening than what I was offered in class. I was not converted so much as matched with the ideology I implicitly held.
As I came of age politically, the reality of being a black conservative was no more isolating than being a Jehovah’s Witness. I had grown used to being a minority within a minority, the odd guy out, and having to routinely explain myself to others. While I eventually dropped the religion, I maintained its contentment with abnormality. As a result, I did not endure quite the same trials which many other black conservatives do when they reveal their values to a community enthralled by liberation theology.
Nevertheless, life as a black conservative has granted me insight into the plight facing those who stand up for what they believe in. Here are 5 tips for coming out as a black conservative.
Previous articles in this series:
- 5 Common Accusations Leveled at Christianity
- A Reason for Faith: Christianity on Trial
- A Reason for Faith: 6 Fatal Misconceptions
When Abraham Lincoln needed to rally the nation toward unity, he referenced Matthew 12:25:
But Jesus knew their thoughts, and said to them: “Every kingdom divided against itself is brought to desolation, and every city or house divided against itself will not stand…”
That principle proves timeless. Divide and conquer remains an effective tactic. Perhaps that informs the many writers on the Left who have strived to drive a wedge between followers of Jesus Christ and adherents to the philosophy of Ayn Rand.
Consider Boston University professor of religion Stephen Prothero, who once wrote that “marrying Ayn Rand to Jesus Christ is like trying to interest Lady Gaga in Donny Osmond.” He cautioned Republican readers against conflating them:
Rand’s trinity is “I me mine.” Christianity’s is the Father, the Son, and the Holy Spirit. So take your pick. Or say no to both. It’s a free country. Just don’t tell me you are both a card-carrying Objectivist and a Bible-believing Christian. Even Rand knew that just wasn’t possible.
Truthfully, one cannot be both a Christian and an Objectivist. As covered throughout this series, Objectivist epistemology does not allow for any acknowledgement of the supernatural. However, one can be a Christian and recognize many of the objective truths which Ayn Rand articulated. After all, Christians do not deny objective reality. We merely recognize an eternal context. Worldviews need not align to overlap.
Prothero employs the typical objection to any alliance between Christians and objectivists:
Real conservatism is also about sacrifice, as is authentic Christianity. President Kennedy was liberal in many ways, but, “Ask not what your country can do for you — ask what you can do for your country” was classic conservatism. Rand, however, will brook no such sacrifice. Serve yourself, she tells us, and save yourself as well. There is no higher good than individual self-satisfaction.
Here, both Christianity and Objectivism are misrepresented. True, Rand deplored Kennedy’s classic inaugural exhortation, perceiving it to subordinate the individual to the collective (although it could be argued Kennedy intended the opposite). However, she never presented “individual self-satisfaction” as the standard of value. One can be fully satisfied in any given moment without serving one’s rational long-term self-interest.
The title of the talk, “Capitalism: The Only Moral Social System,” was irresistible to a newborn activist bred from the Tea Party. As a lifelong conservative, I had always felt as though capitalism was morally superior to any alternative, but had not encountered a claim as bold as this. The speaker was Craig Biddle, editor of The Objective Standard. His thesis was not that capitalism was the best social system, or the most efficient, or the most tolerable among acceptable choices. His claim was that capitalism is the one true good, the only way to go, and that any other system proves profoundly bad.
Biddle’s argument was compelling, built upon observation of reality and application of reason. He took us through the mind’s eye to a far-flung island where we were marooned alone without a single piece of technology. He asked us how such a castaway would survive. What would have to be done? Through what means would it be done? What could prevent it?
In order to survive and thrive, human beings must act rationally to obtain and keep values. A castaway requires food, shelter, sanitation, recreation, and a means to escape or attract rescue. To obtain these things, the castaway cannot rely upon instinct like an animal. Rather, he must apply his mind to the task at hand. He must discern what can be safely eaten, how to fashion tools, how to construct shelter, how to trap and kill animals, how to effectively use the raw materials around him to effect his survival. Ultimately, the only thing which could prevent the castaway from doing these things, aside from his willingness and ability, is brute force from another human being.
Therein lies the objectivist ethic. What human beings need in order to survive and thrive is not provision, but the liberty to act upon their own judgment. Put another way, liberty is life. To deprive a man of his liberty is to deprive him of his life, to drain or contain him. Therefore, the recognition and protection of individual rights are essential.
Hearing this for the first time, I felt as though I had found the Holy Grail of conservative apologetics. While natural law evoked a Creator which secular leftists could simply deny, this objectivist argument stood firmly upon reason and the incontestable facts of reality. How is it that this was not being echoed across conservative media, I asked myself. Then I got my answer.
Last week’s article: 5 Common Accusations Leveled at Christianity
Christianity is profoundly bad. So argued philosophy professor Dr. Andrew Bernstein in a recent debate sponsored by The Objective Standard and the University of Texas Objectivism Society. Countering Bernstein was Christian apologist Dinesh D’Souza. They discussed whether Christianity is “good or bad for mankind.”
They spent a majority of their time debating more fundamental philosophical questions. What is the nature of reality? Does God exist? What is the proper source of morality? While many attendees commenting during the livestream chat saw these questions as diversions from the advertised topic, they were actually the crux of the matter. In order to discern whether Christianity is good or bad for mankind, “good” must first be defined.
Bernstein primarily accused Christianity of being irrational. To be irrational is to be immoral according to Objectivism, a philosophy advocated by Bernstein and best articulated by Ayn Rand in her magnum opus Atlas Shrugged. As Rand saw it, a proper morality arises only from the application of reason. Rand saw any assertion of faith as a rejection of reason. By parsing through Bernstein’s points, we examine not only whether Christianity is a fool’s errand, but whether faith of any kind is profoundly bad.
We begin at the foundation by first asking what we know and how we know it. Those questions are answered in the branch of philosophy known as epistemology. Objectivism holds that reason is the only means toward acquiring knowledge. In her essay Philosophy: Who Needs It? Rand argues:
Reason is the faculty which… identifies and integrates the material provided by man’s senses. Reason integrates man’s perceptions by means of forming abstractions or conceptions, thus raising man’s knowledge from the perceptual level, which he shares with animals, to the conceptual level, which he alone can reach. The method which reason employs in this process is logic—and logic is the art of non-contradictory identification.
Objectivist author William R. Thomas explains further:
The basis of our knowledge is the awareness we have through our physical senses. We see reality, hear it, taste it, smell it, feel it through touch. As babies, we discover the world through our senses. As our mental abilities develop, we become able to recall memories and we can form images in our minds.
Strict adherence to this means of acquiring knowledge precludes entertaining the supernatural. Like all religion, Christianity is a faith-based belief system which Objectivism rejects as nonsense.
How may Christians answer this view of knowledge? If the object of philosophy is to understand reality and access the whole truth of existence, then objectivist epistemology has an obvious limitation. Surely, applying logic to our perceptions is a solid method for discerning what is true. However, the amount of truth we can know through that process is capped by our perception.
Depending upon whom you ask, Christianity either withers under constant assault from a secular humanist conspiracy or flourishes as a virulent social tumor threatening intellectual and moral progress. This Friday, two leading intellectuals will take up the question of whether Christianity is “Good or Bad for Mankind.” Prolific writer, scholar, and filmmaker Dinesh D’Souza will trade arguments with professor of philosophy Dr. Andrew Bernstein. The debate will take place on February 8th at the University of Texas – Austin’s Hogg Auditorium beginning at 7pm CST, sponsored by The Objective Standard and the UT Objectivism Society. It will also be broadcast live over an internet stream. [Updated: see part 1 of Walter's analysis of the debate here.]
This intellectual confrontation “is guaranteed to set a new standard on the subject” according to The Objective Standard. That promise will be fulfilled. The arguments offered will differ from previous high-profile debates regarding Christian morality. While atheists whom D’Souza has engaged before have come from a position of skepticism or secular moral relativism, Bernstein’s body of work previews a fresh approach.
Bernstein will channel Ayn Rand and her philosophy of Objectivism, which not only rejects the Christian worldview, but emphatically indicts Christianity as a profound moral evil. While that may sound familiar and evoke recollections of Richard Dawkins, Christopher Hitchens, or the like, Bernstein’s argument will differ in that it will not merely cite alleged evils perpetrated in the name of Christianity but drill down to the root of what makes a thing good and assert that Christianity is the opposite.
Readers who have followed my recent work at PJ Media may have noticed two things. First, that I frequently evoke the work of Ayn Rand in support of my moral and political views. Second, that I am a professing Christian eager to contend for the faith. These two aspects of my person no doubt meet with frustration, confusion, or condemnation from both Christian and Objectivist readers who perceive their respective worldviews as irreconcilable. I dare to contend that, while there are certainly profound differences in these worldviews, they are not as wholly irreconcilable as either contingent thinks.
Let’s preview some of the arguments sure to be made in Austin. Next week, we’ll respond to these points along with any others which arise and consider just how incompatible Christianity and Objectivism truly are. Here are 5 accusations sure to be leveled against Christianity by Andrew Bernstein in his debate with Dinesh D’Souza.
It was like that moment in The Wizard of Oz when Dorothy emerges from the grey remains of her dislocated home into an exotic world of color. That was how I felt at twelve years of age upon my arrival in Minnesota.
Home up to that point had been the dank flat malaise of inner-ring suburban Detroit. In many ways, the Motor City evoked Dorothy’s Kansas. Everything was built on the grid system, many right angles, old houses of stone and brick. It was tangibly dull, colors muted by wear and grime. Winters were especially bleak. An amalgam of overcast, endless concrete and dirt-ridden snow drowned the world in grey. By comparison, the big skies and rolling hills of the Mississippi valley seemed a storybook paradise.
That first trip to Minnesota was made in order to spend time with my father. He had been maintaining an apartment in the Twin Cities while starting a new position with Northwest Airlines. We were to scout out potential homes in anticipation of transplanting the rest of the family, my mother and two sisters. It was perhaps the most visceral manifestation of upward mobility in our family’s history, chasing opportunity across the country.
It was the culmination of my father’s economic journey, which had its beginnings in poverty. Unfortunately, I don’t know much about my father’s childhood aside from the scraps I’ve managed to glean from remarks thrown here and there. I know enough, however, to understand that my father’s rise to the middle class beat the odds — which were stacked against him from the start.
Many years later, I continue to benefit from the choices Dad made. Now the father of my own young family, I stand atop his shoulders looking to grab the next rung. From that position, I realize that some of the essential concepts my father applied are still relevant to me today. As I seek to renew the momentum my father achieved, I reflect upon where he began and how he got to where he did. There are valuable lessons there.
First, it’s important to understand the goal. When we consider the quest for upward mobility, what is our measure of success? In a 2011 piece for Time magazine, assistant managing editor Rana Foroohar makes a crucial distinction:
You can argue about what kind of mobility really matters. Many conservatives, for example, would be inclined to focus on absolute mobility, which means the extent to which people are better off than their parents were at the same age. That’s a measure that focuses mostly on how much economic growth has occurred, and by that measure, the U.S. does fine. Two-thirds of 40-year-old Americans live in households with larger incomes, adjusted for inflation, than their parents had at the same age (though the gains are smaller than they were in the previous generation).
But just as we don’t feel grateful to have indoor plumbing or multichannel digital cable television, we don’t necessarily feel grateful that we earn more than our parents did. That’s because we don’t peg ourselves to our parents; we peg ourselves to the Joneses. Behavioral economics tells us that our sense of well-being is tied not to the past but to how we are doing compared with our peers. Relative mobility matters. By that standard, we aren’t doing very well at all. Having the right parents increases your chances of ending up middle to upper middle class by a factor of three or four.
It’s a mistake to take for granted the notion that “relative mobility matters” without asking why. As we consider some ideas for rising from poverty to the middle class, it will become apparent that improving our individual quality of life is a superior consideration to how our wealth compares with that of others.
What do we mean when we say, “You cannot legislate morality”?
Surely, legislation should not be ambivalent to right and wrong. Law builds upon the concept of justice. Is not justice derived from morality?
Sometimes, people simply mean that government cannot force us to be good. In other contexts, the statement signals a distinction between what is objectively wrong, like killing someone, and what is subjectively wrong, like swearing in public.
Yet much of the time it can be hard to discern exactly what someone means when they say morality cannot be legislated. The term is used on both the Right and the Left, by social conservatives and social liberals, by people on opposite sides of the same issue. On the one hand, you might have a conservative who uses the term to argue against redistribution of wealth while standing opposed to gay marriage and abortion. On the other hand, you might find a leftist who uses the term to argue in favor of gay marriage and abortion while seeking to seize money which they did not earn.
What gives? Does the term prove completely subjective? Does any given person simply want their sense of morality enforced while the other guy’s sits ignored?
It shouldn’t surprise us to find confusion whenever morality is invoked. People’s sense of right and wrong certainly varies and will affect their public policies. Perhaps recognition of that fact fuels the notion that morality ought not be legislated. Perhaps we think, “In a free country, we have the right to decide right and wrong for ourselves.”
Of course, that sentiment fails upon its first application. A murderer might think he is right, as might a thief or a rapist. Hitler thought he was right. Perhaps then, morality by whim is not a pillar of true freedom.
Upon acknowledging that some kind of morality must inform legislation, a most uncomfortable question arises. Whose? Should the morality informing legislation be dictated by the church? Should it be a consensus of “experts”? Should it be put to a purely democratic vote? Who has the right, and by what authority, to tell another what they may or may not do?
Historically, governments have derived their authority and their sense of morality through entirely subjective and arbitrary means. The king is so ordained by God. Better men should govern lesser ones. The majority should get their way. These approaches are united in their disregard for individual rights.
Note: This article from Walter Hudson was first published last year on July 17 here.
It shouldn’t matter that I, an author with the audacity to select such a title, am black. The arguments presented should stand or fall on their objective merit. Nevertheless, I declare my racial identity at the outset to defuse any prejudice readers may bring regarding the motivation behind this piece. Indeed, it is in part because I am black that the following must be said.
All things considered, blacks and the civil rights culture surrounding them are the most open and prolific purveyors of racism in America. This is an ironic travesty which spits upon the graves of history’s abolitionists and offends all who are committed to a dream of equality under the law and goodwill among men.
Surely, such a claim is provocative. Unfortunately, it is also demonstrable.
In a recent interview with National Public Radio host Michel Martin, the Oscar-winning black actor Morgan Freeman made the odd declaration that President Barack Obama is not America’s first black president. NPR reports:
“First thing that always pops into my head regarding our president is that all of the people who are setting up this barrier for him … they just conveniently forget that Barack had a mama, and she was white — very white American, Kansas, middle of America,” Freeman said. “There was no argument about who he is or what he is. America’s first black president hasn’t arisen yet. He’s not America’s first black president — he’s America’s first mixed-race president.”
This is a new take on Obama’s racial identity from Freeman, who has previously cited Obama’s blackness as the chief motivation behind political opposition from both Republicans in Congress and the Tea Party movement. From an interview with CNN’s Piers Morgan:
… Morgan asked the actor, “Has Obama helped the process of eradicating racism or has it, in a strange way, made it worse?”
“Made it worse. Made it worse,” Freeman replied. “The tea partiers who are controlling the Republican party … their stated policy, publicly stated, is to do whatever it takes to see to it that Obama only serves one term. What underlines that? Screw the country. We’re going to do whatever we can to get this black man out of here.”
Apparently, Obama is black enough to trigger baseless charges of racism, but not black enough to qualify as the first black president. If that makes your brain hurt, you might be rational.
Freeman’s comments are not anomalies. He channels long-held, broadly accepted ideas regarding what it means to be black, the relevance of race, and the claim of blacks upon the rest of society. These ideas are horrifically racist, yet uniquely tolerated.
The tolerance of racist ideas openly expressed by blacks and the larger civil rights establishment is informed by sloppy thinking regarding both race and the role of government in society. True reconciliation requires confronting these ideas with reason. Here are eight ways in which blacks are perpetuating racism, and the one true way to effectively thwart it.
A pastor visiting our church shared a story from when his children were young. The oldest was four years old and the younger three when their mother served them grapes on the vine. As they plucked the sweet fruit, the younger child asked of the older, “How does Mommy get the grapes on there?”
Summoning elder gravitas, the firstborn replied, “Mommy doesn’t put the grapes on there.
“The store does.”
Children have a wonderful way of modeling our deficiencies. While it is easy to laugh at the reasoning of a child, we ought to consider how silly our reasoning might prove if the whole truth were known. Indeed, if we cannot point to an idea or two which we have reconsidered in light of new evidence, it cannot be said we have grown.
One idea which I used to hold, which made perfect sense to me at the time and still makes perfect sense to most of my Christian brethren, is the notion that man cannot rationally demonstrate an absolute morality in a world without God. My reasoning echoed that of Jeff Jacoby in a 2010 piece for Townhall. He wrote:
For in a world without God, there is no obvious difference between good and evil. There is no way to prove that murder is wrong if there is no Creator who decrees “Thou shalt not murder.” It certainly cannot be proved wrong by reason alone. One might reason instead — as Lenin and Stalin and Mao reasoned — that there is nothing wrong with murdering human beings by the millions if doing so advances the Marxist cause. Or one might reason from observing nature that the way of the world is for the strong to devour the weak — or that natural selection favors the survival of the fittest by any means necessary, including the killing of the less fit.
Reason is not enough. Only if there is a God who forbids murder is murder definitively evil. Otherwise its wrongfulness is a matter of opinion. Mao and Seneca approved of murder; we disapprove. What makes us think we’re right?
This perspective contrasts with that typically offered by atheists and agnostics, who assert that right and wrong can be discerned without reference to the supernatural. As a Christian, it is tempting to respond to such skeptics as PJ Lifestyle contributor John Hawkins did while affirming Jacoby.
Obstructionist. Intransigent. Obstinate.
These words among others, used in reference to the Tea Party and fiscally conservative members of Congress, bark past teeth bared in animosity. Critics of the Tea Party lament its uncompromising stance against proposals like the recent fiscal cliff deal. Content to tolerate mere rhetoric, these critics draw the line at standing on principle when it actually counts. NPR’s Alan Greenblatt places the Tea Party at a crossroads:
In the coming year, the returning [Tea Party Republican] members [in Congress] will have to decide whether they want to continue practicing a politics of purity, advocating strong and unyielding positions, or accept that governance generally requires a good deal of compromise.
Compromise sounds reasonable on its face. Absent any context, the term invites a sense of begrudging contentment. Certainly, compromise permeates our everyday lives. Every relationship we engage in requires compromises subtle and plain. It remains true that gestures of goodwill go a long way toward fostering mutually beneficial arrangements. However, that assumes both parties act in good faith. It also assumes that a given compromise serves a profitable long-term goal.
Opponents of the Tea Party have no such qualifications in mind. They advocate compromise as an end in itself. The notion springs from a fundamental reverence in our culture for sacrifice. Misinterpretation and misapplication of Judeo-Christian tenets have fostered an irrational sense of nobility for giving up something of value in exchange for a lesser value or even nothing at all. Such counter-productive sacrifice is demanded from Tea Party-backed members of Congress by folks like International Business Times commentator Joseph Lazzaro. Contemplating the immediate economic repercussions of allowing the country to fall off the fiscal cliff, and writing before the deal’s passage in the House, he explains:
Now, the typical, moderate, independent American, assessing the damage that a long-term failure to reach a budget deal would cause, will no doubt reasonably argue that surely the Tea Party faction will compromise – for the good of the nation. I.E. that the Tea Party will approve the current tax/budget bill.
Unfortunately, however, if that independent American is thinking reasonably, i.e. views a compromise as a rational, prudent stance, he/she is not thinking like a Tea Party member of Congress. Pressured by their extremist supporters, Tea Party members of Congress have shown no inclination to compromise and agree to a fair deal, no matter how much damage that obstruction and intransigence causes to the credit markets and the U.S. and global economies. Obstruction, driven by an extremist conservative ideology – no matter how much financial and economic destruction it triggers – has been the Tea Party’s preferred strategy, if the alternative is a compromise that leads to increased income taxes and an agreement that includes support for the liberal social safety net.
What can your smartphone teach you about gratitude? A great deal.
Not many years ago, I despised the idea of a cell phone. I value my autonomy, which to my mind includes the ability to remain deliberately unavailable. The notion of carrying around a phone in my pocket sounded a lot like putting a leash around my neck.
The issue was forced one Christmas when my in-laws purchased phones for my wife and me, even paying the subsequent bill for a year. Later came the advent of smartphones. I stood unimpressed. Phones make calls. They don’t need to sing and dance. Nevertheless, a new device caught my wife’s eye during an opportunity to upgrade our cellular contract. The price seemed reasonable and I reluctantly traded up.
It was my exploration of that device which prompted a dramatic change in my attitude toward mobile technology. As I pilfered apps and discovered capabilities, I quickly realized that this tiny gadget was becoming the most used and essential tool in my navigation of life. It came to serve as my administrative assistant, my calendar, my GPS, my library, and my gateway to news, information, and entertainment. It grew into an extension of my civilized being. Like my wallet or keys, it stays with me at all times and remains jealously guarded.
No longer pulled reluctantly into the future, I recently became the puller, convincing my wife that it was time to switch providers and upgrade to the Samsung Galaxy S III. Our old phones barely qualified as “smart” and were woefully inadequate to fulfill our new demands.
Consider that transformation in attitude. How could I go from not knowing I had a need to eagerly fulfilling it? Behold the magic of the market!
The critic of consumer culture might suggest that I was right to perceive no need for something like a smartphone. After all, people got by fine without them for millennia, and much of the world still does. Then again, people got by without electricity and automobiles too. If you regard the function of the market as meeting only known demand and current needs, then it becomes easy to dismiss an innovation like the smartphone as somehow decadent.
However, the magic of the market is that it does not stop at known demand or current needs. It anticipates demand for products which do not yet exist. Specifically, individuals apply their minds to dream up new ways to deliver value. Strangely, more individuals seem to dream up new products and methods when they are politically free with their rights protected. Something called profit motive, they say.