Check out Walter’s previous articles in this ongoing series exploring video games, cultural villains, and American values, published Thursday mornings at PJ Lifestyle. From May 2: “Beating Back the Nazi ‘Sickness’,” and last week: “What Zombies Teach Us About Human Nature.” And also see Walter’s A Reason For Faith series, reprinted here last week. In these four articles Walter begins to formalize his task of synthesizing the Judeo-Christian tradition with Ayn Rand’s Objectivism and Tea Party activism. - DMS
In one of the most vivid dreams I can recall, I witnessed the landing of a plainly alien spaceship. It came lucidly, dancing on the edge of wakefulness, informed by enough of my rousing consciousness that it felt particularly real. I remember the feeling that my feet were glued to the ground, that I couldn’t move if I wanted to, not on account of some external force, but due to an overwhelming sense of awe and anticipation. The one thought dominating my mind: everything is about to change.
Though it was only a dream, I retain the memory as vividly as though it were of an actual experience and believe I will respond similarly if ever confronted by a true interplanetary delegation. Something about that kind of moment, when the veil lifts upon an existential mystery, produces an irresistible thrill. Perhaps that tops the list of reasons why our popular culture remains ever fascinated by the prospect of extraterrestrial life.
Aliens have become such a prolific device in our entertainment that we sometimes take them for granted. Like a modern deus ex machina, aliens can be relied upon to suspend disbelief in an otherwise inconceivable scenario. (How does Superman fly? Simple, he’s an alien!) Extraterrestrials rank alongside Nazis, zombies, and generic terrorists as the most common villains found in video games. Unlike those others, however, aliens may also be allies. Nothing inherent to extraterrestrial life demands it be villainous. Beings from other worlds often act as mirrors for examining the human condition, when not merely lurking among shadow and neon strobe.
It’s probably no coincidence that the advent of ufology, which is an actual word in the dictionary meaning the study of unidentified flying objects, coincides with the initial proliferation of aviation and the early years of the space age. We began to look up into the sky right about the time we realized there was nothing left to find over the horizon. In times past, when the known world was still defined by the flickering edge of torchlight, we imagined unspeakable monsters much closer to home. Spirits, ghosts, goblins, ghouls, fairies, vampires, all were the alien invaders and abductors of their time. As we have come to dismiss them as infeasible and childish, our imagination turns to the stars, where the realm of possibility remains seemingly infinite.
Certainly, we can see how aliens have stepped in to fill the role of menacing ghoul. Ridley Scott’s original Alien was essentially a horror film, a science fiction creature feature. While the execution was masterful, the formula proved well-established and has been revisited ever since.
Last week’s article: Beating Back the Nazi “Sickness”
Zombies are all the rage these days. AMC’s The Walking Dead reigns as the top-watched drama on basic cable. Films like Warm Bodies, Zombieland, and I Am Legend stand out among recent entries in an enduring horror subgenre. None other than Brad Pitt will headline this year’s World War Z, which looks to amp up its action well beyond the shuffling flesh-eaters of yesteryear.
That’s to say nothing of video games, where the undead continue to suck cash from willing gamers anxious to live out an apocalyptic fantasy. Whether it’s Resident Evil, Left 4 Dead, or downloadable add-ons to Call of Duty, zombie hordes batter down the doors of our collective consciousness. What exactly makes them so popular?
Like the Nazis we considered last week, zombies provide guilt-free slaughter. No one feels bad about shooting something that’s already dead. Plus, because zombies were once living human beings, they provide a cathartic release for that deeply suppressed homicidal impulse none of us wants to admit to harboring.
Zombies are amoral. They have no agenda, no emotional motivation, no plan. They simply menace. So putting them down presents no moral dilemma. What would be murder were they living becomes a wholly defensible act of survival. The very nature of a zombie marks it for destruction. Since it has no feelings and endures no torment, the acceptable methods for disposing of a zombie are bound only by the imagination of the killer. So zombies enable creative guilt-free violence on a scale limited only by their numbers.
Zombies also serve an adaptive narrative purpose in storytelling. While they more often than not simply lurk around the corner as boogeymen, the nature of a zombie can be tweaked to represent certain themes. In George Romero’s 1968 classic Night of the Living Dead, the film which birthed the modern undead flesh-eater, zombies were implied to be the fulfillment of biblical revelation. Writing for The Washington Post, commentator Christopher Moreman expounds:
The zombie apocalypse is often equated with the wrath of God and biblical end times. Though the origins of zombie outbreaks usually remain indeterminate in the genre, most zombie narratives indicate that we brought this upon ourselves. Whether corporations, the government, or the military are to blame, the average person also bears fault for participating in a corrupt system, just as the people of Sodom and Gomorrah were collectively responsible for God’s wrath.
Romero’s 1978 Dawn of the Dead took the theme a step further, assigning a decidedly anti-capitalist overtone to the narrative. The undead converged upon a shopping mall, retracing the routines of their former lives.
There’s nothing that makes Hollywood more nervous than portraying Islamist terror. As far back as 1994, James Cameron’s True Lies was denounced as racially insensitive for imagining a chillingly plausible Islamist terror threat involving nuclear weapons. Cameron, anticipating accusations of unfairly linking terrorism with Islam and Arabs, took care to try for “balance” by placing an Arab-American character on the good guys’ side (the actor who played him, Grant Heslov, this year won an Oscar as one of the producers of Argo). Yet the advocacy group the Council on American-Islamic Relations (CAIR) slammed the film anyway. The hysterical 1998 movie The Siege imagined that, in an overreaction to a terrorist attack, Brooklyn would be placed under martial law and all young Muslim men would be interned in Yankee Stadium. Ridiculous.
Since 2001, of course, Hollywood has almost completely avoided showing any Muslim involved in terror, changing the bad guys in 2002’s The Sum of All Fears from Palestinians to neo-Nazis. The 2005 Jodie Foster movie Flightplan, about an abduction on an airplane, used a hint that Arabs might be responsible as a red herring. The actual villain: an all-American air marshal played by Peter Sarsgaard. Several Middle East themed movies like Ridley Scott’s Body of Lies essentially saw a moral equivalence between the U.S. and the Islamists, saying both sides were up to comparably nasty stuff in the War on Terror.
Before the Call of Duty franchise took on the subtitle Modern Warfare, it arguably reigned as the pinnacle of the World War II genre. While other first-person shooter games, like those in the popular Tom Clancy series (including hit franchises like Ghost Recon and Rainbow Six), offered players the ability to engage in simulated modern warfare, for much of video game history the default setting for a run-and-gun, first-person shooter was World War II.
Many factors contributed to the period’s popularity as a setting for video-game violence. Chief among them march the jackbooted villains of the era, the Nazis. No one feels bad after shooting a Nazi. In fact, their evil proves so incontestable and absolute that killing them fulfills a profound sense of justice. No doubt that moral certitude contributed to their proliferation throughout gaming. Killing Nazis invites no controversy, leaving game developers with one less thing to worry about.
While the nature of Nazi evil may seem self-evident, the recent 20th anniversary of the Holocaust Memorial Museum in Washington, D.C., provided an occasion to demonstrate that even former United States presidents can miss the mark. The local CBS affiliate reports:
Washington has many monuments and memorials that offer something special for visitors from around the world, “but the Holocaust memorial will be our conscience,” [President] Clinton said.
Since the museum opened 20 years ago, the world has made huge scientific discoveries, including the sequencing of the human genome, which proved humans are 99.5 percent genetically the same, Clinton said.
“Every non age-related difference … is contained in one half of 1 percent of our genetic makeup, but every one of us spends too much time on that half a percent,” Clinton said. “That makes us vulnerable to the fever, the sickness that the Nazis gave to the Germans. That sickness is very alive across the world today.”
The report does not include any specific examples of what Clinton diagnoses as the Nazi “sickness.” However, we may fairly assume he was referring to any intolerance of human diversity.
Political activists have a saying: when you’re explaining, you’re losing. The same could be said of business. When you have to explain to prospective customers why they need your latest innovation, when the product does not sell itself through mere presentation, you probably have a dud.
Such may be the case with the latest iteration of home console hardware from Nintendo, the Wii U. iDigitalTimes reports:
Wii U sales are bad now, but it’s not the end of the world, according to Shigeru Miyamoto, who hopes that people will just give the Wii U some time to breathe before coming to a final conclusion about its worth. The console launched in November 2012, to huge initial sales and a quick decline, followed by slow and modest sales thereafter and predictions of doom and gloom from every quarter. Nintendo would leave the hardware business. It would go out of business altogether. It would go handheld only. Miyamoto thinks that’s all nonsense. We just need to give Wii U some time.
Miyamoto, a legend in the industry responsible for creating Nintendo’s hallmark Mario and Zelda franchises, goes on to explain how the Wii U represents an incredible innovation in gaming, much like the handheld Nintendo DS before it. Time will tell whether gamers at large come to realize they’ve been cheated all these years by the limitation of a single gaming screen. Meanwhile, here are 6 horrible choices dragging down Nintendo.
Life does not come with a reset button. That truth struck me whenever I glimpsed the face of my Nintendo Entertainment System. Reset was always there, lurking next to Power, ready to erase both my sins and the virtual world in which they had been committed. A fresh start, another try: Reset offered them free of charge.
Moments like that, moments when some shadow of philosophical truth peeked through the veil of this childish pastime, came often over the years. The most recent occurred while I was playing Fable II on my Xbox 360. Set in a fantasy world with swords, sorcery, and muskets, the Fable series contains many game mechanics above and beyond the traditional hack-and-slash quest. Among them is the ability to purchase real estate and manage rental property, which maintains a steady stream of gold for upgrading weapons and other items. As I purchased one property and saved up to invest in another and yet another, I quickly realized I was mimicking a truly productive task. Why can’t I do this in real life? Oh yeah, I don’t have any money to start.
The experience of the game inspired me to revisit methods for creating wealth and fostering upward mobility. I won’t go so far as to say Fable II changed my life. After all, I’ve yet to buy that first investment property. However, it did plant a seed which may someday germinate.
Other games have offered real life lessons in ways both subtle and overt. Here are 7 for your consideration.
Part 1 of a 4-part series, Deconstructing Family Guy
When Seth MacFarlane sang about boobs at the Oscars, I’m pretty sure he was referring to his own fans.
It is generally taken for granted that we recognize the latent moronic nature of most television programming today.
Then again, do we?
If we agreed as a culture that television programming like Family Guy is so moronic, why would a collective cheer rise up at the sight of another Emmy win? Would we be told by media commentary royalty to worship Seth MacFarlane, the show’s creator, as fascinating? Not only does the guy have mega street cred in the pop culture universe, the primetime establishment he’s so wholeheartedly mocked is singing his praises. In fact, it could be said that Family Guy’s seemingly counterculture humor has been legitimized by the mainstream.
What’s more, like a bad addiction, Family Guy is the drug that has turned a generation of Boob-Tube addicts into junkies. So, what are the signs, Doctor? How do you know when a co-worker, a friend, even a loved one has become a total Boob? Let’s play MediaMD as we examine the 5 most common side effects of watching Family Guy.
If Chuck Norris gets a pedicure so that his toes will feel more comfortable when he kicks people in the face, do you think he is a wimp? No. If R. Lee Ermey wants to drink a Cosmopolitan because he feels that it will keep his throat perfectly primed to yell at people, he can get away with it. If UFC light heavyweight champion Jon “Bones” Jones likes to unwind by watching Twilight after choking someone unconscious in a cage fight, who are we to argue?
Still, there are some things that even the manliest of masculine manly men can’t get away with on their most masculinely manly days without having their man card permanently pulled. For example:
1) Geeking out on children’s entertainment
It’s one thing for a man to listen to the awful music of Justin Bieber and think, “Wow, that’s not the worst thing I’ve ever heard.” It’s quite another to actually go to one of his concerts for the fun of it or, worse yet, refer to himself as a “Belieber.” Wanna go to a comic-book convention? Ok, but if you’re a dude who dresses up like Thor and starts speculating about whether you can defeat the Hulk in a fight, you have a “man problem” you need to address. Don’t even get me started on being a damn brony and walking around in public talking about My Little Pony. Are you a five-year-old girl? If the answer to that question is “no,” then you don’t have any business being a fan of a show aimed at five-year-old girls.
Through the years Neverhood fans have asked for another game, and I’m partnering with my EWJ and Neverhood buddies Mike Dietz and Ed Schofield to make a full sized, PC and Mac point and click adventure game in clay and puppet animation. New characters, but in my usual style.
TenNapel’s “usual style” is mind-blowing. The Neverhood debuted on the DreamWorks Interactive label in 1996. It was a point-and-click adventure built entirely in clay and animated via stop-motion. Here’s a taste, and keep in mind that he did this in 1996, on PCs that can’t even compete with today’s smartphones for processing power.
The bazillionth episode in the tomb-raiding life of Lara Croft hit Tuesday. Most of the previous episodes have not been good. Many came with flaws that rendered them nearly unplayable in spots. Unlike most of the previous installments, and especially the most recent ones, reviews for the 2013 installment have not been mixed. Lara is scoring about a 9.25 across the board on video game-review sites. But is this hype just fanboys falling in love with a game babe, or a reflection of a strong game that may bring a storied but troubled franchise back from the dead?
I spent about an hour with the new Tomb Raider, so while I don’t yet have a comprehensive view of the game’s full story arc, I do have some strong first impressions.
Tomb Raider 2013 is an origin story, picking Lara up on an expedition to find a lost civilization off the coast of Japan. A nineteen-year-old on her first adventure, Lara isn’t yet the boss chick who greeted the gaming world in 1996. She’s young but determined, and convinced that if the expedition changes course, it will find the lost civilization it’s looking for. Changing course also risks entering the Dragon’s Triangle, an allegedly dangerous region of the Pacific similar to the Bermuda Triangle off Florida.
Things go about as you’d expect when a game amps up a threat — the expedition suffers a shipwreck and Lara finds herself stranded and alone. A knock on the head later, and she’s in a creepy, gory cave filled with bones and hanging corpses. It’s environments like this, and Lara’s tendency to lean on a couple of swears when she reacts to threats, that earn the game its M rating. No longer a cartoon, Tomb Raider is a cinematic beast.
The younger Lara is vulnerable. She picks up knocks and wounds. She scavenges and improves weapons as she goes. She gets hungry and has to hunt, which turns this Tomb Raider into more of an open world than any previous episode. She learns skills and, based on the dialogue, learns to overcome her fears. She thinks.
This Lara develops as the story goes, and is far more interesting and more realistically rendered than in any previous episode. She also eats meat, so she is more Duck Dynasty than Morrissey.
The story of Tomb Raider works extremely well, at least in the early going of the game that I’ve played.
On Wednesday Sony announced its next-gen gaming console, the PlayStation 4. Sony expects the new console to be available by this year’s Christmas season and is being coy about the price. When the PS3 arrived, it carried a hefty price tag of about $600, scaring some gamers off for a few months. Rumor has it the new console will come in at around $450, but that’s just speculation at this point. The price is one of the mysteries surrounding the new box; more about the other mystery later in the article.
The PS4 will not just be another console with beefier hardware. It will have that, with powerful new graphics processors capable of taking the visuals to another level of realism, while not presenting a quantum leap over the current hardware. But it will truly be a next-gen console in the sense that it comes with capabilities that up to now have mainly been available on game streaming sites like OnLive (which I reviewed, here). In fact, the PS4 may kill off the ailing OnLive service.
That’s because the PS4 is a social gaming console right out of the box. One of OnLive’s chief fun features is its ability to let gamers watch and interact with other gamers without being in the game themselves. Gamers can spectate in the Arena, picking up tips and tricks, jeering and cheering, and generally checking out games before either buying them or downloading demos. The PS4 allows spectating and, with the push of a button on its new controller, sharing and uploading action clips. Some games currently allow this, but the new hardware makes sharing a universal feature. It also allows demos to be played the instant a gamer chooses them, putting it on par with another of OnLive’s great features. Along with that will come features that already exist, such as Amazon Video, Netflix, and Hulu apps and Plex serving, which turn the PS4 into a full home entertainment system. PS3 users can also already control their consoles when surfing YouTube via iPhones and iPods. Expect Sony to build on that capability as well.
The PS4 also builds on a feature currently found on the PS3 and the Wii U: remote play. Currently the PS3 can be controlled via a handheld PS Vita, while the Wii U can act as a server, with gameplay actually taking place on the screen in the controller, so it doesn’t really need a TV screen. The PS4 allows games hosted on its hardware to be played on the PS Vita. So like the Wii U, the PS4 can free up your TV while still delivering a top-level gaming experience.
The PS4 controller, the Dualshock 4, also builds on the current competition, adding Move capabilities, the aforementioned social gaming capabilities, and a new touchpad in the middle.
So, there’s the controller. But where’s the actual PS4? In its entire demo Wednesday, Sony never showed the PlayStation 4 itself. That has sparked a debate:
There are two rather polarized angles being tossed about this week as the Sony show (or no-show) of the PlayStation 4 was let loose. One side says it’s terrible that Sony made a 2+ hour presentation for the PlayStation 4 without actually showing the hardware, relying instead on the controller and a variety of promises from software developers to do all the talking. The other side says awesome! We know the PlayStation 4 is coming now, and we’ve got confirmation from some of the biggest-name developers that they’re on board, so we’re happy!
My own take is that Sony wants a second bite at the buzz apple, so they’re withholding images of the console for a later date, maybe E3 in June or SIGGRAPH in August. If they do that, they get to have another big moment, and may announce the price along with giving us a look at the beast. Sony usually goes the route of making their consoles dark and artistic (or odd, in the case of the PS3s that look like BBQ grills). I would expect something smaller and sleeker than the PS3.
The bottom line is that we now have concrete specs on the next-gen system, a catalog of major titles it will debut with, including new material from heavyweights like Blizzard and Sony’s own in-house Killzone and inFAMOUS series, and solid information about the new things it will be able to do.

And then there are the things it won’t do, which brings me to the “bad” part of this article. Sony says that, as things stand now, backward compatibility is not built into the PS4. Gamers will not be able to play legacy games on the new system, which may impact some of this year’s bigger releases like the Tomb Raider reboot. Sony says it’s working on it. The company may be setting up to sell multiple versions of the PS4, some that include backward compatibility for a price and some that don’t. Backward compatibility can be worked around via streaming games, but that requires hefty bandwidth most American households still don’t have, or via downloads, which will eat up valuable hard drive space and may create other issues. We’ll see. But the failure to provide backward compatibility from the get-go is an ominous sign that Sony may be looking to roll out its new box at one stated price, which is not the actual price gamers will end up paying if they want to keep playing their old Call of Duty titles on their shiny new systems.
The Dallas Sci-Fi Expo wrapped up on Sunday, February 10. We snapped photos of some of the best, most creative and most disturbing costumes of the show. Click on a thumbnail below to view photo galleries. They’re divided into Girls, Groups, and Guys.
You can see more costumes from the Dallas Sci-Fi Expo here.
We interviewed Battlestar Galactica’s Tricia Helfer, here.
And ran into MickeyDeadMau5Trooper here.
This thing caused a stir this weekend at the Dallas Sci-Fi Expo. I call it…MickeyDeadMau5Trooper. I saw it coming up the escalator and had to grab some video before it got away.
Today I’m at the Dallas Sci-Fi Expo (which is actually taking place in Irving). Kevin Sorbo and Morena Baccarin will be here today and tomorrow, along with stars from Back to the Future, Battlestar Galactica, Tron, comic book artists, and of course, just about every superhero and villain imaginable.
Let’s walk the exhibition floor and see who turns up.
I don’t know what they’re selling, but they had a lot of buyers.
Even a Sith.
During the Thanksgiving holiday The Wife proposed a Harry Potter movie marathon. I’ve never considered myself much of a Potter fan. During the books’ popularity over the last 15 years, I resisted reading them. And while I saw six of the eight movies during my film critic days — and appreciated them individually — the franchise as a whole never inspired devotion at the level of the pop culture cults of my childhood and teen years, Star Wars and Star Trek.
So I welcomed the chance to give the series a second look, fueled by The Wife’s enthusiasm. She read all the books and knows the arcane details backwards and forwards. The Potter books arrived when April, a few years my junior, was a receptive older child, and when I was an angsty teenager looking for “mature” books.
Last Wednesday night, after wrapping up the day’s editing, I made a run to the library to pick up the four titles we didn’t already own (The Half-Blood Prince) or have recorded on the DVR (Prisoner of Azkaban and both Deathly Hallows). And so our epic Thanksgiving Potterfest began that night with The Sorcerer’s Stone; we carried on at a pace of three films each on Thursday and Friday before concluding on Saturday morning.
My conclusion: young geeks nowadays have much better options than previous generations did. Compare the eight Harry Potter films with the six Star Wars and eleven Star Trek films. By any “objective” measure (box office, percentage of positive reviews, or number of award-winning actors featured in the films), Harry Potter wins. And does any Jedi or Trekkie want to argue that by the “subjective” measure (just sitting down and watching all the films in the series), Harry fails to triumph over Luke, Han, Kirk, and Spock?
“Angry Video Game Nerd: The Movie” is a passion project by independent filmmakers James Rolfe and Kevin Finn, based on the popular web series. The film is being produced outside the studio system, entirely funded by fan donations. Principal photography took place in the spring of 2012 in the Los Angeles area, with Jason Brewer as the DP. Additional filming is taking place on the East Coast. Editing is in its early stages.
The film is inspired by the famous Atari video game burial of 1982. Atari produced a game based on the biggest blockbuster movie of that year, E.T., and rushed it to meet the deadline for the Christmas shopping season. It was a commercial failure and millions of unsold game cartridges were buried in a desert landfill in Alamogordo, New Mexico. Coincidentally, it’s not too far from Roswell, the landing site of a different kind of E.T.
The trailer features music by Bear McCreary (Battlestar Galactica, The Walking Dead). The track is called “Maverick Regeneration” and can be downloaded as part of the Play for Japan album. All proceeds go to help earthquake victims in Japan.
The film is expected to be completed in the summer of 2013, but only time will tell; independent films take a long time to finish, and this one is no exception. It’s eventually expected to be released on DVD and/or Blu-ray and to be available around the world, after showing to some live audiences in theater venues. Digital download is also an option. The immediate goal is to finish the film first.
See updates on James Rolfe’s personal site
Related on video games at PJ Lifestyle:
The next logical step after the huge Disney/Lucasfilm purchase announcement is deciding who will direct and who will star in Star Wars VII. Speculation has been running wild since last week, but a few tidbits are slowly leaking out. For instance, Harrison Ford is apparently up for returning to his iconic role even though he really wanted Han Solo to die. This is what we get for hiring a smuggler.
The news comes via Entertainment Weekly. According to a “highly-placed source,” Ford is “open to the idea of doing the movie and he’s upbeat about it, all three of them are.” The source was including his Star Wars co-stars Mark Hamill and Carrie Fisher.
But Ford, who has admitted in the past to not particularly liking Han as a character, said this in a 2010 interview: “I thought he should have died in the last one to give it some bottom…George [Lucas] didn’t think there was any future in dead Han toys.” And in fact that was the case: Han was meant to die in an earlier draft of Return of the Jedi. So would he insist on going out in a blaze of glory should he return?
The quality of discourse for women today is poor. The many and varied reasons for this will make a post for another day, but for the moment, note that the Mommy Wars and hookup culture discussions might be heartfelt but rarely resolve anything.
Notable recent examples of unproductive chattering: Naomi Wolf has created a new range of vagina puns with her anecdotal account of her technicolor orgasms in her latest book Vagina. The Life of Julia is a left-looking faceless cartoon claiming that women need government to take care of them. (I linked to Iowahawk’s parody because the original is too depressing.) Hanna Rosin seeks to convince us that replacing domineering men with domineering women amounts to positive progress. And a fan fiction author addicted to “shouty capitals,” E.L. James, captured the imagination of women across the English-speaking world with a poor specimen of a bondage novel that has since spun off a line of sex toys with little Fifty Shades of Grey logo tags. (British comment threads are always informative. Why pay for trademarked logo pleasure balls when limes work just as well?)
Missing has been someone to show how absurd this all is. We, the most privileged and independent women in history, find those discussions compelling? Sure, the Right has been pointing out the absurdities in such discussions for a while, but we are written off as the bigoted and biased Other. Feminist thought needs some honest criticism from the inside.
Re-enter Camille Paglia, the “pro-sex, pro-porn, pro-art, pro-beauty, pro-pop” sixties feminist and heavily published art and culture critic, quiet for the past few years while writing her latest book due out on October 16th, Glittering Images: A Journey Through Art from Egypt to Star Wars. Our debates suffered from her absence.
Why did we all root for Luke Skywalker and the Rebellion, cheering as one when the Death Star burst into a ball of flame? Why do we unanimously detest Panem’s Capitol, sharing a surge of joy when District 11 erupts after Rue’s senseless murder in The Hunger Games? What accounts for our universal loathing of Lady Catherine de Bourgh, Jane Austen’s most refined dictator who, insisting Mr. Darcy marry her insipid daughter, rivals the Emperor and President Snow in her own Georgian way?
Would it really have been so awful had the Empire ruled the Galaxy? Nobody appeared to be starving. It’s true the citizens of Panem were hungry, but at least they were safe from “war, terrible war.” The demise of Darcy and Elizabeth Bennet’s proud and prejudiced love would have cost them their status as the most beloved couple ever to live and breathe papyrus and yet, Darcy and Anne de Bourgh would have been rich — lacking neither the company of polite society nor polished silver.
A deep anguish probably stirred within your heart at these proposals. This malaise would turn to raw anger if we replaced these light-hearted examples of tyranny with darker ones, the true shadows of history whose malice brought real and lasting ruin and misery. Unanimous indignation meets the suggestion that since basic necessities of life were often provided by Stalin, Hitler, or Mao, totalitarianism is a viable living condition. Why?
We instinctively know, as human beings, we need more than food, shelter, and the absence of violence to be happy. This consuming hunger for joy is so important that Aristotle, the Definer himself, designates happiness as the final end for which we are created. To insist people, whether flesh and blood or birthed by quill, content themselves with crusts of bread or caviar instead of true human happiness violates our deepest sense of what it means to be human.
So what necessary ingredient of bliss was missing in the Emperor’s Galaxy, in The Hunger Games‘ haunted Panem, and in Austen’s corset-string-strangled English countryside? The essential right to self-determination. Nothing is more human than this internal principle of self-direction: the ability to freely select for ourselves from among the near-infinity of goals and the means to attain personally defined success. Without it, we are not human but animals. This freedom is the condition for our joy, and this is why, confronted with all forms of invasive denial of freedom, we rebel.
In Cults: The Mind Enslaved Parts I and II, we considered the normal and cultic human intellectual processes. It seemed that nothing could be worse than surrendering a mind to the shared Gnostic Brain of a cult. Understanding now the primary importance of human freedom for happiness, we consider how cults damage this even more fundamental faculty, the free will.
Psy of “Gangnam Style” fame returned to South Korea earlier this week, but his impact is still being felt on American TV and pop culture. A funny office bit aired on “Chelsea Lately” and “Today” ran a segment on September 27 about Psy’s triumphant return to his homeland. On Friday, Psy tweeted that “Gangnam Style” has topped 300M views on YouTube. It’s all great news for the unlikely rapper who continues to take America and the world by storm.
Psy returned to Korea as a conquering hero. Back on his home turf, he immediately hit the ground running and performed “Gangnam Style” to a packed college campus. In a press conference, Psy said that America was very good to him. “Even if they didn’t know the lyrics, they just danced,” he explained on Thursday. Matt Lauer stated in a “Today” segment, “We’re looking forward to having Psy back on the Plaza at his earliest convenience.”
Gangnam is the most coveted address in Korea, but less than two generations ago it was little more than some forlorn homes surrounded by flat farmland and drainage ditches.
The district of Gangnam, which literally means “south of the river,” is about half the size of Manhattan. About 1 percent of Seoul’s population lives there, but many of its residents are very rich. The average Gangnam apartment costs about $716,000, a sum that would take an average South Korean household 18 years to earn.
As the price of high-rise apartments skyrocketed during a real estate investment frenzy in the early 2000s, landowners and speculators became wealthy practically overnight.
The notion that Gangnam residents have risen not by following the traditional South Korean virtues of hard work and sacrifice, but simply by living on a coveted piece of geography, irks many. The neighborhood’s residents are seen by some as monopolizing the country’s best education opportunities, the best cultural offerings and the best infrastructure, while spending big on foreign luxury goods to highlight their wealth.
A girl who looks quiet but plays when she plays
A girl who puts her hair down when the right time comes
A girl who covers herself but is more sexy than a girl who bares it all
A sensible girl like that
I’m a guy
A guy who seems calm but plays when he plays
A guy who goes completely crazy when the right time comes
A guy who has bulging ideas rather than muscles
That kind of guy
Hat tip: The Mary Jane