- The shocking news last week of Andrew Breitbart’s passing.
- The tone of the left when the news broke.
- A follow-up on Drew’s PJM post back in November, in which he asked, “Why don’t we make more attempts to seize the mainstream back from the dishonest left?”
- How the conservative novelist can construct a moral universe.
- Shows about nothing, and Hollywood’s love of the nihilistic universe.
- “The Ten Hardest Movies To Turn Off Once You Start Watching Them,” and what they say about the future of the movie industry.
I’ve admired Andrew’s work ever since seeing True Crime with Clint Eastwood in a San Jose theater back in 1999, and have featured his PJTV material numerous times on PJM’s Sirius-XM show during its run. But I had never spoken with Andrew before the National Review Cruise this past November. So it was great to get his thoughts on new media, Hollywood, conservatism, and the future of the movie industry.
Click below to listen to our interview:
If your browser/Internet connection balks at the Flash player above and/or downloading the audio, click on the player below, or click here to be taken to YouTube, for an audio-only YouTube clip. Between one of those versions, you should find a format that plays on your system.
For the rest of podcasts at the PJM Lifestyle blog, start here and keep scrolling.
I still find it hard to believe that Davy Jones, the teen idol star of TV’s The Monkees, died today at age 66 of a heart attack. Given that Mick Jagger is still going strong, and that Keith Richards (who appears to have morphed into Treebeard at some point over the last decade) recently concluded a tour to promote his best-selling autobiography, 66 isn’t that old for today’s geriatric rock stars – particularly if Jones had stuck with the milk diet implied by the Monkees’ first sponsor, Kellogg’s Cereal.
I wouldn’t go as far as Kathy Shaidle’s claim that they were “better than The Beatles,” but certainly the latter group’s prefab imitators had their moments. In their early days, with Don Kirshner leading their sessions, they had the pick of New York’s Brill Building songwriters, such as Neil Diamond, Carole King, and Gerry Goffin. In their second season, after they fired Kirshner, the hits slowed down, but their quirky attempts at psychedelia were some of their most fascinating songs, along with Mike Nesmith’s proto-country rock experiments, which anticipated ‘70s groups like The Eagles by a good five to ten years. (Nesmith’s experiments in music video in the following decade would be dubbed by some as a direct precursor to ‘80s phenomenon MTV.)
You could make a case that 1966 was a seminal year in boomer pop culture. A young person could turn on the TV and flip through the dial to find:
- Star Trek
- Mission: Impossible
- The Green Hornet
- I Spy
- The Avengers
- The Wild, Wild West
- And of course, The Monkees
Those shows would be the backbone of syndicated rerun packages for the next quarter century, and most would also be developed into at least one motion picture, and for the first three, entire franchises that continue to this day.
The Monkees’ own picture would arrive first, their infamous 1968 movie Head, featuring a co-writing credit to Jack Nicholson, of all people. As Peter Biskind explored in Easy Riders, Raging Bulls, his seminal history of the “New Hollywood” of the late ’60s and seventies, Nicholson was on the staff of RayBert, the company formed by Monkees producers Bert Schneider and Bob Rafelson. Schneider and Rafelson would use their profits from the TV show they co-created to produce a number of seminal early-‘70s films such as Easy Rider and Five Easy Pieces. Schneider would also later be at the center of arguably the nadir of the Academy Awards, when he and co-producer Peter Davis, at the 1975 ceremony, read aloud to a standing ovation a congratulatory telegram from North Vietnam on their anti-Vietnam War documentary, Hearts and Minds, just three weeks before South Vietnam’s final surrender.
If that’s a long way from “Hey, Hey, We’re the Monkees,” I doubt very much that Jones knew he was signing on for a show that would be the spearhead of a cultural revolution in Hollywood, but hey, hey, that’s how it all worked out. For a group dismissed as trite bubblegum, for better or worse, that’s some legacy.
While building a home theater stocked with a variety of electronic components is lots of fun, unfortunately, going the do-it-yourself route often ends with, well, not quite the proverbial Tower of Babel but perhaps worse from your significant other’s point of view – the dreaded Coffee Table of Babel. Those remote controls for the TV, A/V receiver, DVD or Blu-Ray player, cable or satellite set-top box, and other electronic equipment all begin to pile up, making for an ugly mess, and making the home theater appear more complex to operate than it otherwise is.
Back in 2004, Logitech acquired Easy Zapper, a Canadian startup specializing in universal remote controls, giving a firm best known for computer accessories such as replacement keyboards and mice a foothold in the home theater industry.
Under their Harmony division’s moniker, Logitech now produces a full range of remotes at a variety of retail price points from $29 to $349. While their most advanced remote is arguably the tablet-shaped Harmony 1100, after reading a variety of reviews, I decided to avoid the tablet shape and go with the model directly below it, Logitech’s Harmony 900, which, as of the time of this review, sells for $240.99 at Amazon.com.
This is a remote geared towards someone who knows his way around his home theater and, to some extent, his PC as well, and who’s prepared to tinker a bit to set up the remote. In other words, expect a bit of set-up time, but once complete, it does make for a rather powerful remote.
Programming the Remote
After installing the supplied software on your PC, the first step is to gather all of your existing remotes, and to write down the brand and model numbers of all of your home theater components. Logitech maintains a database of approximately 5,000 brands and 225,000 devices, which the Harmony 900’s PC interface will search in order to set up your remote. If you have a component that’s not on there, don’t fret – as long as you have its remote, you should be able to manually program its codes into the Harmony 900 while it’s plugged into your computer via its supplied USB cable.
It’s also possible to tweak the remote to add functions not included in the database. For example, since I do just about all of my TV watching with my A/V receiver on for surround sound, I ended up programming the A/V receiver’s volume and mute controls into the various devices controlled by the remote. Depending upon the amount of equipment you own, and the level of control you’re aiming for, early on you may have to do a fair amount of tweaking to customize the remote to your preferences.
While the Harmony 900 allows control over individual components, its first emphasis is on what it calls (on the remote’s GUI) “Activities.” These typically include watching TV, watching a movie, playing a CD, etc. The Harmony 900 will group together tasks so that pressing one button on the remote will automatically do things such as:
- Turn on your A/V receiver.
- Switch it to the TV input.
- Turn on your TV.
- Make sure it’s switched to whichever input the satellite TV is on.
- Turn on the satellite TV digital set-top box.
And so on. A similar activity can be programmed for watching a movie, which switches everything on to watch a DVD. For those with a few pieces of home theater gear that need to work together in harmony (if you’ll pardon the pun), this is a pretty convenient way to begin a few hours of television watching.
The Harmony 900 also supports individual components of course, which it calls “Devices.” The remote’s GUI can be toggled back and forth between devices and activities.
While the PC has quickly become the de facto home entertainment center for many, there are still moments – such as the Super Bowl or when it’s time to view Lawrence of Arabia or Star Wars on the big, big (home) screen – when sitting down, leaning back, and spacing out in front of a big-screen TV is a welcome change of pace.
LG’s model number 55LK520 55-Inch LCD HDTV produces a knockout 1080p picture. With three HDMI inputs, it’s possible to connect a satellite or digital set-top box, a Blu-Ray player, and an Internet device such as a Roku box. For the home theater industry’s equivalent of “legacy devices,” there are also component and composite inputs. (There’s no S-video connection, curiously. This may be the first video product I’ve purchased in 25 years without one.)
The LG 55LK520 lacks 3D, but I can’t say I’m enamored with that concept, particularly since it involves wearing ’50s-style 3D glasses over my own. And it lacks an Internet hook-up, but that’s OK as well. I’d rather plug in a device of my own to connect to the Web. (Besides, my DirecTV receiver, Blu-Ray player, and Roku box all have various Web capabilities.)
The unit shares the same IR codes as the LG BD670 Blu-Ray player we reviewed last month; that unit’s remote is capable of performing the basic functions of this TV, though not vice-versa. It’s sort of academic though, as likely most will use some sort of universal remote, such as Logitech’s Harmony 900 or a similar device.
Initially, I was surprised by how “processed” some DirecTV HD programming looked on the 55LK520. Movies that were clearly shot on 35mm had an almost “live TV” sort of look, with little or no film grain visible. But you quickly become used to it. When I mentioned in my review of the Blu-Ray player last month that you can read the Winston logo printed on the band of Martin Sheen’s cigarettes in Apocalypse Now, or praised the details of a vintage Pimm’s Cup bottle label in the Blu-Ray edition of Boardwalk Empire, this was the TV I was viewing them on.
I had purchased the LG 55LK520 to replace an eight-year-old JVC rear-projection HD set, and immediately found that there was one feature on the older unit that I missed — the ability to zoom a 4X3 image to fill the screen. In contrast, unless I missed an option, the 55LK520 was only capable of displaying a 4X3 image with black bars around it. If you watch a lot of older movies, or non-HD programming on cable or satellite, this might be something to keep in mind.
Also, those who wish to place the LG 55LK520 on a tabletop (as I did, placing the unit on the stand in the middle of my home theater cabinets where my older and much heavier rear-projection set once sat) may find that the base the 55LK520 rests on feels a little on the flimsy side. It can do the job, but I wish it had been built with a more robust feel. Also, for those who placed their older rear-projection sets with the screen flush with the edge of their supporting cabinet, the base causes the LG TV to be recessed about five inches in, which may require some adjustments if you’re planning to place the unit inside a home theater cabinet. For those who wish to mount the LG 55LK520 on their wall, the rear of the set contains the usual VESA mount.
One of the handiest features on the back is a Toslink digital audio output. For those with limited digital audio inputs on their home theater receivers, the LG 55LK520 will output the audio of whatever device is currently displaying on the screen, thus simplifying use of the set with an A/V receiver, and reducing the number of digital audio inputs the A/V receiver needs for your various components. This also makes it easier to use the LG 55LK520 as a switcher for HDMI inputs, which is particularly useful if your A/V receiver has a few years on it, and lacks these connections.
Incidentally, this is as good a place as any for a friendly reminder, which may be old hat for some, but if not: if you’re doing your own installation, invest in a Brother P-Touch labeler or similar device and label your cables, marking each end of a cable with the name of the component it connects to at the opposite end. Once you start building up a home theater with, say, an A/V receiver, Blu-Ray player, Roku box, and legacy equipment like a VCR, tape deck, or CD player, you risk finding yourself in a bewildering labyrinth of cables when you go to update your gear, or pull a device to send it to the repair shop. Having used masking tape, index file labels, and Crutchfield’s pre-printed cable labels, I’ve found the tough vinyl P-Touch labels are so far the only ones that don’t become brittle and risk falling off over time, but any label is better than none.
Space: 1999 “is poised for a comeback,” according to the Hollywood Reporter:
ITV Studios America and HDFILMS announced plans for a reimagining of Gerry and Sylvia Anderson’s famed franchise of the 1970s, then called Space: 1999. The news comes months after Fox and producer Seth MacFarlane announced they would be reviving Cosmos: A Space Time Odyssey, a 1980s miniseries from Carl Sagan.
“Science fiction is a powerful format capable of visualizing the human condition in thought-provoking ways,” said HDFilms president Jace Hall, who will spearhead the effort and serve as an executive producer. The project is in the development phase and has yet to be shopped to networks.
Be afraid. Be very afraid:
The original Space: 1999 was first conceived as a sequel to Gerry and Sylvia Anderson’s wildly uneven, but occasionally pretty cool UFO series from 1970. Apparently, the scenes set on UFO’s moonbase had the best test results from focus groups, and as a result, the Andersons decided to set their next TV series entirely on the moon — but floating freely in space so that it could visit other planets, a la Star Trek’s USS Enterprise. Never mind the physics of a moon that moved faster than light so that it could arrive at a new planet each week, yet slow enough so that it could launch its exploratory “Eagle” spacecraft, and never get permanently caught in the gravity field of the planet of the week. The result was a show with a not-bad theme song, nice 2001-inspired production design, and pretty good special effects for the pre-Star Wars era, all of it completely undermined by a ridiculously overloaded premise. According to Wikipedia, “Gerry and Sylvia Anderson were surprised and disappointed that the public (and critics) never granted them the suspension of disbelief given to other science-fiction programmes.”
There aren’t ropes strong enough to suspend that amount of disbelief. No wonder the show sank.
Star Trek: The Next Generation and its spin-offs proved that it was possible to successfully update an older sci-fi TV series, and the re-imagined Battlestar Galactica certainly had its fans. But a reworked Space: 1999 might be going to the gravity well once too often.
Were you a fan of the old series? And would you tune in for a remake?
What connects seemingly disparate works such as The Silence of the Lambs, Cape Fear, Mad Men, and Seinfeld? It is the philosophy of nihilism, first popularized by Friedrich Nietzsche in the late 19th century. But in the last few decades, how did it become the dominant worldview of Hollywood? In 1999, Dr. Thomas S. Hibbs, currently the Distinguished Professor of Ethics & Culture and Dean of the Honors College at Baylor University, wrote the original version of Shows About Nothing: Nihilism in Popular Culture. Last month, Baylor University released an updated version of the book, which explores shows and films that have debuted since Hibbs’ original work was published. In this half-hour interview, Hibbs discusses:
- How post-WWII Hollywood originally explicitly rejected Nietzsche and nihilism, before ultimately embracing him with open arms.
- Why horror movies eventually eradicated God for charming nihilists who fashion their morality as “beyond good and evil,” such as Dr. Hannibal Lecter.
- Seinfeld: the sunny side of nihilism.
- How man successfully threw off the encumbrances of authority and tradition only to find himself subject to new, more devious, and more intractable forms of tyranny.
- How aesthetics came to usurp morality.
- Mad Men’s Don Draper: the man in the gray nihilistic suit.
- Can Hollywood move beyond nihilism?
Click below to listen to our interview:
If your browser/Internet connection balks at the Flash player above and/or downloading the audio, click on the player below, or click here to be taken to YouTube, for an audio-only YouTube clip. Between one of those versions, you should find a format that plays on your system.
For the rest of podcasts at the PJM Lifestyle blog, start here and keep scrolling.
To give you a sense of how far video technology has advanced, and how far prices have plummeted, let’s first go back to the mid-1990s. Back then, Pioneer Elite’s CLD-97 laser video disc player was one of the finest video playback systems a consumer could buy. Selling at about $2500, it weighed 37 pounds and its exterior case featured a sleek, rich piano black finish with rosewood side panels. With the right source material, it was capable – for its time – of a stunning picture, and can be seen as one of the last steps in the 12-inch laser disc’s evolution before the 4.7-inch DVD came along in the US back in 1997.
But that’s all Jurassic-era history. Currently selling for $124.77 on Amazon, the LG BD670 3D Wireless Network Blu-ray Disc Player with Smart TV leaves the $2500 CLD-97’s picture quality in the dust. And unlike the home theater technology of the 1990s, it’ll talk to your home’s local area network, too.
Amongst the formats it supports, the LG BD670 is capable of playing high-definition Blu-Ray discs, which output up to a 1920×1080 picture, plus 3D Blu-Ray discs, conventional DVDs, compact audio discs (CDs), and WMA and MP3 files. We’ll get to those last two in just a minute.
The LG BD670 does a very good job of upconverting most DVDs before outputting them to an HD television. I wrote my recent review of Boardwalk Empire based on standard definition DVDs played through the LG BD670 on a 55-inch LCD TV and thought, man, this picture looks great. Of course, when the Blu-Ray review copy finally arrived from HBO, I was blown away by how sharp it was; you could discern the weave in Nucky’s proto-zoot suit. Or read the text on the bottles of Pimm’s No. 1 he procures for a politician he’s bribing. Watching Apocalypse Now in Blu-Ray, it was possible to read the “Winston” script on the band of Martin Sheen’s cigarette while he was taking a drag. On some films, this can lend dramatic differences in perception. The pace of 1968’s 2001: A Space Odyssey, a film I’ve seen dozens and dozens of times over the past decades, on pan & scan VHS, a couple of different letterboxed laser discs, DVD, and on a few rare occasions in revival theaters, seemed noticeably faster. The difference was that I could make out the myriad fine details embedded into every shot as eye candy. And I could watch Keir Dullea – almost always photographed in long and medium shots to frame him in his environment – act. It was a potent reminder of how much is lost, even on high-quality playback systems such as anamorphic standard definition DVD.
Speaking of which, the results can vary in quality when watching a standard definition DVD on the LG BD670. I already mentioned the anamorphic standard-definition DVD version of Boardwalk Empire. But plenty of DVDs have been released in TV’s traditional 4X3 format. My DVDs of the legendary early-1970s Thames TV series The World at War probably looked their very best on the LG BD670, but there’s only so much its electronics can do for a series consisting of alternating WWII newsreel footage and 16mm interviews. The worst offender I’ve seen so far was my first generation DVD of the 1989 Michael Douglas/Ridley Scott potboiler Black Rain, which Paramount issued in letterboxed non-anamorphic format shortly after the DVD format debuted. All of the smoke and diffusion in the cinematography made for a muddy, pixelated image after so many lines of resolution were lost in the letterboxing format. (Fortunately, it’s now out on Blu-Ray.)
(Disclosure: my LCD TV doesn’t have 3D, and I’m not a fan of any format that requires me to wear extra glasses over my own glasses, so I did not test any 3D discs.)
When did America start having emotional meltdowns over sports? A pair of recent events during the run-up to the Super Bowl highlight a disturbing trend among sports fans.
Most recently, as Peter King writes at Sports Illustrated, fans of the San Francisco 49ers aren’t handling Sunday’s defeat in the NFC Championship game very well:
Nice crowd the 49ers have on Twitter. One of their “fans” tweeted to Williams (@KyleWilliams_10): “Jim Harbaugh, please give @KyleWilliams_10 the game ball. And make sure it explodes when he gets in his car.”
It’s only sports, people. Only sports. Around here, the fog will come up tomorrow.
I know Jim Harbaugh has tried to transform his formerly finesse-oriented team into tough blue collar-style bruisers, but who knew he’d also turn San Francisco’s formerly wine and sushi-enjoying crowd into snarling Oakland Raiders-style fans?
Similarly, assuming it’s not play-acting to deliberately create a viral video (and it wouldn’t be the first time, if that turns out to be the case), this clip is a fascinating look at the mindset of a crazed sports fan, crestfallen that the Green Bay Packers lost in the playoffs:
Vince Lombardi built the Packers of the 1960s into a tough, Spartan football team, and the Packers fans of that era were similarly flinty and cool. (Pardon the frozen tundra-inspired pun.) Looking down from NFL Valhalla, what would Lombardi think of the above video?
I love the magical thinking implicit in blaming her sparkly nail polish (!) for the Packers’ loss. The solipsistic belief that she alone displeased the Football Gods so badly they caused the Pack to lose to the Giants on January 15th.
Then there’s the polypropylene cheesehead and Packers jersey she’s wearing. Hulu, the streaming video site, has a section devoted to the NFL, where you can watch NFL Films’ Lost Treasures series, which looks back at the founding of the league’s film division in the early to mid-1960s. Watching those episodes, you’ll quickly notice that prior to the 1970s, there was little in the way of NFL merchandise for adults to wear. If you watch newsreel footage of the 1958 NFL championship, when the Baltimore Colts beat the New York Giants, the majority of men in the stands wore sober business suits, top coats, and fedoras. This past Christmas, I watched an NFL Channel presentation on “The Longest Game Ever Played,” the double-overtime playoff battle between the Miami Dolphins and the Kansas City Chiefs, the last game played in Municipal Stadium, the predecessor to Arrowhead Stadium, the Chiefs’ current home. As late as Christmas Day, 1971, there were still several men wearing suits, ties, and fedoras to games.
Back in the 1990s, when the World Wide Web was still new and shiny, and all things seemed possible, television ads promised us a future where every movie ever made would one day be available for streaming on the Internet. (At least if I’m remembering the ads I saw around ’97 or ’98 or so correctly.) The Roku set-top box is a big down payment on that promise. And if I were the cable or DBS companies, I’d be a little scared.
While lots of people will keep watching good ol’ network TV, the ability to cut the cable is now within sight. After seeing numerous links at Instapundit.com, typically with comments from readers about how much they enjoyed their Roku set-top boxes, I decided to give one a try.
Once out of the box, while a few people have complained in comments at Amazon about interconnectivity issues, for me, hooking up the Roku XS couldn’t have been simpler. Plug in a LAN cable, plug the Roku’s A/C adaptor into an outlet, pop a pair of AA batteries into the remote, and then follow the instructions on its GUI, and let it do its thing. Within a few moments, it was happily talking to the server back at Roku HQ, and was good to go.
The whole design philosophy of the Roku seems to be “strip everything down to its basics, and keep the interface as clean and minimal as possible.” The remote control bundled with the Roku XS only contains 10 buttons, and an up, down, left, right controller. The onscreen GUI is similarly minimal. But then, this is a unit designed primarily to do one thing: get streaming content off the Web and onto your TV screen.
One element of the Roku is too minimal, in my opinion. I was surprised that the only hook-up options are an HDMI cable to connect to most of today’s HDTVs, and an all-in-one analog output, with a mini-plug-sized jack on one end for the Roku box, and RCA connections for video and analog audio on the other. I would have liked to have seen a separate digital audio output, whether RCA or Toslink, to plug the audio into an A/V receiver for surround sound. Fortunately, my LG HDTV has its own Toslink audio output, and I was able to snake a cable back to my A/V receiver as a workaround. Currently, only the Roku XS model has a port for hardwired 10/100 Mbps Ethernet, and a slot for a microSD card.
Where No Set-Top Box Has Gone Before
So how is the picture? Pretty damn good, I must say. All of the Roku units output a minimum of 720p HD; the Roku XD and XS up the picture quality to 1080p. Of course, picture quality is dependent upon the source material the unit outputs, which can vary widely. But I watched the remastered version of “Where No Man Has Gone Before,” the second pilot for the original Star Trek on Netflix, and this was the sharpest I had ever seen the original show. (Which sometimes didn’t work in its favor: the picture was so sharp, you could see where Leonard Nimoy’s makeup was applied. And the crude appearance of Gary Lockwood’s reflective silver contact lenses.)
In the first place, I would like to observe that the older generation had certainly pretty well ruined this world before passing it on to us. They give us this Thing, knocked to pieces, leaky, red-hot, threatening to blow up; and then they are surprised that we don’t accept it with the same attitude of pretty, decorous enthusiasm with which they received it…
– John F. Carter, “’These Wild Young People’ by One of Them,” in the Atlantic Monthly, 1920.
Stop me if you’ve heard this one before. In the first years of a new decade filled with technological wonders, American troops are returning home from an overseas war that was promoted as saving democracy – democracy as it was currently understood – abroad. Concurrently, self-styled progressives, hoping to transform the world into a utopian vision of Heaven on Earth, wake up each day thinking, “What can we ban today?” The wealthiest one percent create enclaves in which the laws that they force upon everyone else don’t apply to them. And a corrupt if charismatic politician seeks to find ways, via his cronies, to exploit this enormous rift in what is thought by the masses to be a free market.
America today? No, America in 1920, as prohibition begins to sink its ugly claws into the decade.
It’s easy to see how Boardwalk Empire was green-lighted at HBO. The Sopranos, focusing on a ruthless, albeit relatively minor wannabe-Godfather, was a huge hit a decade ago. AMC’s Mad Men, which was created by one of the Sopranos’ producers, is a cult favorite and hit with the critics. Why not hire another Sopranos producer and create another crime show set in New Jersey, but with the same sort of boomer-tinged historical triumphalism that fuels Mad Men?
In the opening titles of Boardwalk Empire, set to menacing, vaguely surf-rock sounding electric guitars, an infinite number of Canadian whisky bottles wash ashore while Steve Buscemi as Enoch “Nucky” Thompson, the show’s answer to Tony Soprano, scans the horizon. If there are any messages in these bottles, it’s a reminder that the past is a foreign country, its people increasingly incapable, in the eyes of the Boomers, of having gotten anything right.
To be fair though, Atlantic City hit the skids long before the 21st century. When I was a kid living in South Jersey, the prospect of an hour and a half car ride to Atlantic City always left me with a feeling of melancholy. A long car ride down Route 295 and then 45 minutes on the Atlantic City Expressway ended with a drive past numerous clapped-out seaside homes on the way to the Boardwalk itself, just before casino gambling was legalized by New Jersey and slightly revitalized the area. Slightly.
But Atlantic City in the 1920s, at least as imagined by HBO, is a sight to behold, with rich swells intermingling with down and out immigrants, and an endless variety of storefronts, fortune tellers, and carnival barkers. I’m happy to see the first season of Boardwalk released on DVD and Blu-Ray from HBO. I got hooked on the show in November, when it seemed to be on a continuous loop on HBO while I was back in a very different New Jersey than the one depicted in Boardwalk.
Or maybe not so different; there’s a reason why John Fund wrote an essay for the Wall Street Journal in 2004 titled “Louisiana North” calling New Jersey a “pit of corruption.” Whatever reforms current governor and GOP superstar Chris Christie is capable of, he’s got his work cut out for him.
But then, that’s long been true. In TV’s Boardwalk Empire, the mayor is a figurehead. The man who makes the resort town go is its treasurer, the character played by Buscemi, and based on a real-life figure, Enoch L. Johnson, who lived from 1883 to 1968 – and who looked nothing like the actor playing him.
I’m pretty sure there will be grave repercussions from the Emperor over this hateful rebel incident:
Michael Cole, 28, of Orlando, was arrested on felony charges of resisting arrest and battery on an officer.
According to the FHP, a construction worker informed the trooper around 2:45 a.m. of an intoxicated man wearing a Darth Vader mask who was walking in the middle of a road near Summerlin Avenue and Anderson Street.
The trooper approached the masked man, later identified as Cole, and repeatedly asked him to get out of the road, the FHP said. Cole instead cursed at the trooper and laid in the roadway, authorities said.
The trooper then told Cole to get up, but he attempted to punch and kick the trooper, who deployed his Taser, according to the FHP. Officials said Cole’s thick jacket prevented the Taser from working, so the trooper used pepper spray to subdue him.
Local 6 News captured video showing the man screaming while being placed onto a stretcher.
Click over for the video. Reports that a local supermarket manager has been reported missing could not be confirmed.
Hear me now and believe me later, when I tell you that what you’re about to witness is simply the greatest commercial advertisement in the history of television. Around the world:
No, seriously. Imagine it’s the early 1970s, and this ad appears on Japanese television. If you don’t speak much English, and see Charles Bronson at a swank nightclub, then racing through town in his huge Cadillac Coupe de Ville, and finally, stopping off at his swinging pad to light a pipe and douse himself in…Mandom!…you’d probably think that…
Wait, who am I kidding? This had to look as camp to early ‘70s viewers in Japan as it does to our 21st century American eyes. I can only imagine that the director did several different takes of Bronson with varying expressions and gestures for each shot, and cut together nothing but the wildest shots of Bronson he had, sort of a miniature version of what Stanley Kubrick did to George C. Scott in Dr. Strangelove. Fortunately for all of us, at no point did General Turgidson fling his shirt into the air and pirouette around the War Room.
If you’ve ever seen Sofia Coppola’s brilliant 2003 film Lost in Translation, you can work out the whole backstory for why this ad was made. Bronson was a huge star in Europe and presumably Japan as well, but hadn’t quite yet made it to superstardom in Hollywood, a plateau he would only reach at age 50 when he starred in 1974’s epochal film Death Wish. (Ironically, he was, in one sense, badly miscast – Brian Garfield, the writer of the novel, wanted Jack Lemmon in the lead role. Lemmon would have been infinitely more believable as a commercial architect at the start of the movie. And audiences would have been much more shocked at Hollywood’s quintessential milquetoast liberal blasting away at New York muggers at the low point of the city’s existence than they were when the uber-macho Bronson laid waste to the city’s scum.) For Bronson, like Bill Murray’s character in Lost in Translation, shooting an ad while in Japan was an easy paycheck, along with plenty of ego-boosting glad-handing from the local ad agency, eager to work with an American star. And the comfort of knowing that no one back home was going to see the finished ad, no matter how embarrassing it was.
Until YouTube came along.
What’s also fun about this ad in retrospect is that it combines multiple elements that political correctness has all but bulldozed over: pipe smoking, the heroic cowboy figure (into whom Bronson briefly transforms with just a splash of Mandom) and Bronson himself. While Death Wish made him a superstar, it also typecast him into that role almost entirely. As Bronson was quoted at the Internet Movie Database, “Someday I’d like a part where I can lean my elbow against a mantelpiece and have a cocktail.” This ad was probably as close as he got during the last 30 years or so of his career. (And even then, gunplay was involved.)
Oh, and pay no attention to the fact that no woman seems to be sharing Bronson’s swinging pad, even after he bathes in the World’s Most Ultimate Aftershave. Hopefully she arrived just after the commercial fades out, or whoever Japan’s equivalent of Fredric Wertham was at the time would have pondered endlessly the subtext of the ad. Perhaps even more so than future generations will. Which is a reminder that getting lost in translation isn’t always just a cross-cultural communication breakdown, but can be temporal as well.
(H/T: Jim Treacher.)
It’s dueling Top Ten Lists today. First up, USA Today film critic Claudia Puig lists her Top Ten Favorite movies of 2011. Congratulations to our own Roger L. Simon for A Better Life (now out on DVD, Blu-Ray and Amazon Instant Video) making the cut:
- The Artist
- A Better Life
- The Descendants
- Like Crazy
- A Separation
- The Tree of Life
Unfortunately though, quality product was a rarity for Hollywood in 2011, which helps to explain this recent breathless headline in The Hollywood Reporter: “Box Office Shocker: Movie Attendance Falls to Lowest Level in 16 Years.”
Considering similar articles were written throughout the “naughts” (QED), that news shouldn’t come as a “shocker” to anyone, least of all The Hollywood Reporter. To help right the ship before it slips completely beneath the waves, Big Hollywood’s John Nolte lists the “Top 10 Ways Hollywood Can Win Its Audience Back:”
Here’s a sample:
You can trace most of Hollywood’s problems back to the death of the movie star. At first, the industry was thrilled with this development. No movie star meant no big payday, no ego, and none of the baggage too many stahs carry with them. The industry also found that, at least for a while, they could get away with this. Audiences were still packing theatres to see pre-packaged brands developed from high concepts, comic books, novels, and television shows. Sequels, remakes, and prequels were still sure-fire. Who needs to pay Tom Cruise $30 million to run around with CGI’d dinosaurs when just as many people will pay to see Jeff Goldblum do the same?
This was all well and good until the “brands” ran out. Now Hollywood is down to “The Green Lantern” and board games like “Battleship.”
Movie stars, on the other hand, are the most reliable brands out there. People come to see them and if you have enough of them and if you keep developing them, the inventory is limitless. From the 1920s straight through to right around 1990, if you built it with movie stars, audiences would come. Hollywood didn’t need to rely on “brands” because they built pictures around their stars.
Today we’re down to Sandra Bullock, Will Smith, and Denzel Washington — the only three people I know who can still draw a crowd based solely on their name.
Given the Christmastime success of the latest Mission: Impossible sequel, I’d add Tom Cruise to that list — his recent implosions seem not to have fatally harmed his brand. But beyond that, it’s a shorter and shorter list, as Arnold checked out of Hollywood when he became governor, Mel self-destructed, and Bruce Willis’ star power seems to have diminished. And there aren’t a whole lot of younger actors coming up who have the sort of name and good will where moviegoers will say, “Let’s go see the latest [INSERT STAR NAME HERE] movie,” the way that Mel, Arnold, Sly, Harrison and Tom had in the 1980s and ’90s.
The items that immediately follow on John’s list help to explain why.
(Thumbnail on Lifestyle blog’s homepage by Shutterstock.com)
Dandy Don and Howard Cosell have each retired to the big broadcasting booth in the sky, and Hank Williams, Jr. has been fired from his Monday Night Football gig. So for some NFL pizazz, there’s only one place left to turn.
Yes, we’re left with Taiwan’s crack team of digital animators, who bring us a preview of Sunday night’s Cowboys/Giants game as only they can:
What’s your take on the final week of the NFL? Who will go deepest in the playoffs? Feel free to reply via digital 3D animation, or simply in the comments below.
(Via the PJ Tatler.)
I think you could make a case for Airplane! — with or without the exclamation mark on the end of the title — as being the funniest movie comedy ever made. If it’s not number one (and feel free to hash it out in the comments below), then certainly in the top ten. For better or worse, it’s also the comedy that ushered in the modern ironic age. (See also headline above.)
Many of Airplane’s (Airplane!’s?) fans know that it was based on a mid-1950s film called Zero Hour!, adapted from a novel by Arthur Hailey, who would go on to write Airport in 1968, itself adapted into the mother of all ’70s disaster movies by Universal two years later. Zero Hour! starred Dana Andrews as a washed-up fighter pilot named Ted Stryker, Sterling Hayden in the control tower, and then-L.A. Rams star running back Elroy “Crazylegs” Hirsch as Kareem Abdul-Jabbar, and was played deadly, earnestly straight. So straight that when the Zucker Brothers, coming off their debut, The Kentucky Fried Movie, taped this film off a late-night movie show in the late 1970s, they knew they had the basis for a comedy. But to get a sense of how closely the Zuckers stuck to the script of the original film (which they bought the rights to, so as to avoid getting sued), check out this YouTube clip, which cross-cuts between the original and, in this case, its far superior imitation — the one substitute you should definitely accept.
And stop calling me Shirley. (Sorry, it was inevitable.)
To be honest, I’m not sure if Cincinnati Bengals founder Paul Brown or John Facenda, NFL Films’ original Voice of God, would approve of the circus-like acrobatics. But Jerome Simpson of the Bengals obtains NFL immortality, as this clip will be shown endlessly over the coming years on ESPN and the NFL Channel, let alone YouTube:
So how has your team done this year? Are you ready for — dare I say the word — the playoffs?
First the good news — the colorization process looks better — and tighter — than the horrible blotchy early efforts of Ted Turner in the mid-1980s:
A reader sent this, a clip from the new HD colorization. He writes, “Every single frame looks like a Rockwell painting.”
It might, but that’s not the way the film was meant to be seen. Technicolor was invented in 1916 and came of age in the late twenties and thirties. If filmmakers wanted to make their films in color, they could have. Sure, sometimes the cost was prohibitive, but when a film was produced for black and white, the lighting, shadows, clothes and make-up were crafted and created deliberately around that reality. Nothing about any black and white film is appropriate for color. Nothing.
Jimmy Stewart himself was so incensed by colorization (his look at what was done to “It’s a Wonderful Life” was likely the last straw) he personally testified before Congress against it in 1988.
For a time, when Ted Turner was really going to town, you couldn’t even buy black and white VHS copies of some of these classics. You had to turn the color off on your television.
For the life of me, I can’t imagine why such a thing would enhance anyone’s enjoyment of a film.
It’s a Wonderful Life is in quasi-public domain, so all sorts of versions of it are available. That’s the film’s blessing and curse, making it both easy for it to be colorization fodder, and easy — at least for now — to find the original version.
Of course, for better or worse — likely worse — it’s only a matter of time before Jimmy Stewart’s career begins again, reborn as a digital thespian, starring in all sorts of new productions.
So how was your Christmas? And what did you watch this weekend?
When I first caught the film bug in college, I got more than a little obsessive rifling through the shelves of the school library for books and magazine articles on Stanley Kubrick and his films. (If you’re a student with tendencies towards OCD, discovering Stanley was like discovering a kindred spirit made good. I shudder to think what would have happened had Taschen’s massive Stanley Kubrick Archives, published several years after Kubrick’s death, been available at the time — though I think Stanley would have loved the book himself.) I was determined to crack the mysteries of 2001: A Space Odyssey, and explore his other films as well. For 2001, Kubrick removed narration and an original score by veteran film composer Alex North to create a visceral nonlinear experience. Given the MoMA-approved film that emerged, and the hundreds of thousands of words that it generated, in a way, it illustrates — so to speak — Tom Wolfe’s dictum from The Painted Word that “Modern Art has become completely literary: the paintings and other works exist only to illustrate the text.”
If you can find a used copy, Carolyn Geduld’s Film Guide to 2001: A Space Odyssey from 1973 does a thorough job building a roadmap to take you through “The Ultimate Trip” and back. And McLuhan acolyte Jerome Agel’s The Making of Kubrick’s 2001 from 1970 has extensive behind-the-scenes photos of the film, as well as being a witty (and very McLuhan-esque) non-linear time capsule of the late 1960s in its own right.
One of the best books on Kubrick, which was updated in 2003 to include chapters on Full Metal Jacket and Eyes Wide Shut, his final two films, was Michel Ciment’s Kubrick: The Definitive Edition. It featured exclusive interviews with Kubrick and several of his closest collaborators, including his brilliant cinematographer in the 1970s, John Alcott.
Part of Kubrick’s cult of personality was that, in an industry dominated by publicity hounds, after 2001′s release in 1968, and particularly after the controversies surrounding A Clockwork Orange, Kubrick became the Garbo of producers — which of course, only added to his mystique. Because Kubrick rarely did print interviews, and never television, I had never heard his voice before the early days of the World Wide Web, when the clip of his acceptance speech for the D.W. Griffith Award in 1997 went online, two years before Kubrick passed away at age 70. Someone has uploaded the audio of 11 and a half minutes of Ciment’s interviews with Kubrick over the years. There are plenty of “ums and you knows,” which were invariably cut out of Kubrick’s print interviews – not surprising, since many were published under quid pro quo orders that Kubrick be allowed to proof the interview before it ran and make changes and revisions to his quotes. But you can also hear Kubrick’s sharp mind and Bronx dialect (the inspiration for the voice of President Muffley, as portrayed by Peter Sellers in Dr. Strangelove) at work. And a twinge of sadness knowing that there will never be another director like him — or a medium as vibrant as it was in its heyday, when he was at his peak.
This is one of those reviews that will appeal to a very limited audience — those who practice what Tom Wolfe once referred to as “the Secret Vice.” And I have to confess: I consider myself a (junior) member of that club. I like getting dressed up. I like suits, braces, cufflinks, ties, patterned socks, captoed shoes, and dinner jackets. And I like learning about their history.
Mind you, I don’t get especially dressed up every day: I usually wear jeans and a buttondown shirt when blogging, as opposed to PJM’s original namesake garb. But when I go out for dinner, particularly on the weekend or during holidays, I like to look good.
There, I said it. Still with me?
If you’re not, I can understand. Ever since the 1970s, after the era depicted in Mad Men concluded, being well dressed has often been seen as a slightly strange affectation for a man. And yet, to get through life (including job interviews, office work, family gatherings, weddings, upscale restaurants, and other events), there are certain sartorial skills that a man must have.
Fortunately, they’re easily acquired.
At the height of the Silicon Valley boom in the late 1990s, several friends of mine, all in their 40s or 50s, who hadn’t gone on job interviews in ages, each asked me what to wear to them. And in each case, I simply handed them my copy of Alan Flusser’s 1985 book, Clothes and the Man and said, “read this.”
The Long Polyester Hibernation
Confession number two: I wasn’t always much interested in clothes. I became aware of Clothes and the Man in the mid-1980s, when I was in college, having graduated from a 13-year K through 12 hitch at St. Mary’s Hall (now known as Doane Academy) in New Jersey, a private college prep school where I wore a blue blazer, blue buttondown shirt, striped tie and gray trousers every weekday.
Not surprisingly, I left St. Mary’s more than a little confused about what to wear next, especially since simultaneously, menswear was coming out of its long polyester hibernation and into a brief moment of style (Wall Street “power suits,” Miami Vice pastels, suits worn by rock stars in MTV videos, etc.). Of course, with the possible exception of those who were very careful buying their power suits, most ’80s fashion dated very badly, leaving lots of men — including myself — with more than a few momentarily stylish skeletons in their closets. Clothes and the Man helped me avoid many further mistakes: the suits and sports jackets I bought prior to buying Flusser’s book around 1987 have long since been given to Goodwill. (Though I still have the psychedelic Bill Cosby sweater I bought from Boyds in Philadelphia in 1986, just to remind myself of the era.) Some of the clothes I’ve bought post-Flusser, I still wear from time to time, even after a quarter century of ownership.
Appropriate Styles That Will Last
That’s the whole point of Flusser’s most recent book, Dressing the Man: Mastering the Art of Permanent Fashion, which was first published in 2002: finding appropriate styles that flatter a man, and will last. Flusser’s book is copiously illustrated, with a combination of vintage photographs of the usual suspects (Cary Grant, Fred Astaire, the Duke of Windsor, Adolphe Menjou, Lucius Beebe, etc.), newly photographed men in a plethora of styles, and classic illustrations from the golden era of publications such as Apparel Arts, the beautiful 1930s-through-1950s forerunner of both GQ and Esquire, which I talked to Michael Anton about back in October.
I don’t want to give the impression that Flusser’s book is merely a photo and illustration-heavy coffee table book without substance. Like his previous books (and frankly, if you own Clothes and the Man, you might want to thumb through Dressing the Man before buying it, unless you get obsessive over this stuff like I do), Flusser has lots of practical advice on his subject.
If you’ve got a film buff or a friend with an interest in graphic design on your Christmas list, Saul Bass: A Life in Film & Design is a giant, heavily illustrated 428-page coffee table book with an enormous “wow” factor – and not coincidentally, a fair amount of heft at seven pounds, with dimensions of 11.7 x 10.6 x 1.7 inches. It was designed by Saul Bass’s daughter Jennifer, and written by design historian Pat Kirkham, who knew Bass personally, with an introduction from longtime Bass admirer Martin Scorsese. It’s published by Laurence King Publishers.
Saul Bass (1920 to 1996) began his career designing the film poster for 1954’s Carmen Jones, and the title sequence the following year for The Man with the Golden Arm, both produced by Otto Preminger. He would go on to design groundbreaking title sequences for Hitchcock’s Vertigo, North By Northwest and Psycho, Stanley Kubrick’s Spartacus, and John Frankenheimer’s Seconds and Grand Prix. Along with all of his film work, Bass eventually became a respected corporate graphic designer for such businesses as AT&T, The Bell System, United Airlines, Dixie Cups, Minolta, Lawry’s Foods, Warner Brothers, and Quaker Oats. For many years, his film career and corporate design work overlapped, until his career as a title designer appeared to slow in pace in the 1980s, only to see it revive with such high profile Martin Scorsese films as Goodfellas (which marked the beginning of a career resurgence for Scorsese as well), Cape Fear, The Age of Innocence, and the last title sequence designed by Bass, Casino.
There are two audiences for this book (with plenty of overlap of course). The first are film lovers and film historians who have thoroughly enjoyed Bass’s title sequences and his contributions to films such as Psycho, including storyboarding shot for shot its legendary shower sequence, which this new book discusses at length. The second are students of graphic design. Much of the work that Bass created would be infinitely easier to render with today’s technology, such as Adobe Photoshop, Illustrator, and After Effects. And yet, Bass created his iconic still images and what we now refer to as “motion graphics” decades before such computer technology existed. As with the soundscapes that George Martin created for the Beatles 20 years before digital synthesizers and samplers, these pioneering analog efforts led the way and helped to shape the digital technology we enjoy today.
Bass is perhaps best remembered for elevating the movie title sequence into art, but fortunately, Saul Bass: A Life in Film & Design doesn’t overlook his work as a corporate designer. While Bass was an extremely talented and endlessly creative corporate designer, because of the simple modernist elements he typically worked with, what began as art with Bass was quickly boiled down into formula by other, lesser designers. The result was a corporate sameness by the early 1970s, which was brilliantly – if entirely unintentionally – summed up in the best-known moment of the design and typography-related documentary, Helvetica:
In that sense, as a corporate designer, Bass’s influence was similar to that of Mies van der Rohe. While Mies was an extremely talented and inventive architect, too many lesser architects (cough — Philip Johnson — cough — Gordon Bunshaft) who followed his lead saw only the plate glass and black I-beams, and could never imitate Mies’ sense of proportion and his willingness not to be bound by the rules of Miesianism.
Which is a useful lesson for anyone considering a similar career in corporate design work. But then, despite going off to the great artists’ garret in the sky 15 years ago, there are all sorts of lessons still to be learned from Saul Bass.
You gotta start somewhere, and here’s what moviegoers were told in the very early 1930s about a technological breakthrough soon to appear in their homes, with a steep, steep learning curve.
The rotary phone:
Twenty years and a World War later, television went national, as the first transcontinental coaxial cable was run, as Terry Teachout of the Wall Street Journal writes:
In present-tense culture, golden anniversaries tend to get swept away by the whirlwind of current events. Here’s an example: Network television as we know it came into being on Sept. 4, 1951, when AT&T threw the switch on the first transcontinental coaxial cable. Up to that time, TV had been an essentially regional phenomenon. The most important network shows were all performed live in New York, and the only way for West Coast viewers to see them was for fuzzy-looking film copies called “kinescopes” to be shipped to Los Angeles and broadcast a week later. The coaxial cable changed that by making it possible to transmit live video signals from coast to coast–in both directions. Within a matter of months, Hollywood had become a major center of TV production.
Don’t be embarrassed if you didn’t know any of this. So far as I know, no one has taken note of the golden anniversary of the coaxial cable, or celebrated the fiftieth birthdays of three influential series that the cable made possible. But if you owned a TV set in 1951, you might well remember these Truman-era debuts:
* * * * *
Nov. 18, 1951: “See It Now,” the first TV newsmagazine, whose first episode opened with a shot of two control-room monitors. One showed a live picture of the Statue of Liberty, the other a live picture of the Golden Gate Bridge. Edward R. Murrow, the host, was visibly impressed: “For the first time, man has been able to sit at home and look at two oceans at the same time.” It may sound quaint now, but 60 years ago that image took people’s breaths away.
Today, we take smart phones, video conferencing, and — to coin a phrase — a World Wide Web of information for granted. But it took plenty of experimentation with analog technology to build the knowledge base for today’s technology. Assuming our betters in Washington and academia allow us to keep it.
The death on Monday of Bert Schneider, the man who, along with his business partner Bob Rafelson, brought you both the Monkees and Easy Rider, brings to a close one chapter in the life and death of New Hollywood. As Mark Steyn wrote on Wednesday:
Bert Schneider was an obscure figure by the time of his death, but back in “New Hollywood” – that interlude between the end of the studio system and the dawn of the Jaws/Star Wars era – he was briefly a significant figure. He started in TV in the mid-Sixties, helped create “The Monkees” and then took them to the big screen in the feature film Head. That flopped, but the next film he produced, Easy Rider, cost less than 400 grand and within three years had made $60 million. There followed Five Easy Pieces and The Last Picture Show.
But, as much as I like the latter, I prefer to remember the late Mr Schneider for his contribution to the gaiety of 1970s Oscar nights. Truly, that was the golden age of Academy Awards ceremonies. On April 8th 1975, Bert Schneider’s film Hearts And Minds won the Oscar for Best Documentary. Instead of an acceptance speech, he read out a telegram conveying fraternal greetings to the American people from Dinh Ba Thi of the Vietnamese Provisional Revolutionary Government. Offstage, Bob Hope was mad, and scribbled some lines for his co-host Frank Sinatra. So Frank came out and said that the Academy wished to disassociate itself from the preceding. Then a furious Shirley MacLaine yelled at Frank that she was a member of the Academy and no one had asked her if she wanted to disassociate herself from the Vietnamese Provisional Revolutionary Government. Then John Wayne said aw, the Schneider guy was a pain in the ass.
The rise of New Hollywood is a story that’s been told countless times, but one of the very best tellings is Peter Biskind’s Easy Riders, Raging Bulls, originally published in 1998, but finally released in a Kindle version this week — entirely coincidentally, the day after Bert Schneider died. Biskind managed to interview many of the original players, and wrote a compelling narrative of the collapse of postwar Hollywood and the retirement of the last of the great moguls who built the industry, and the rise of the young turks who would be, for a time, their successors. And then their own usurpation, both through drug and alcohol-induced dissipation, and because Hollywood executives, with a little help from Steven Spielberg and George Lucas, rediscovered how to connect with mass audiences.
By the late 1960s, the Hollywood studio system was in ruins. There were multiple reasons — Michael Medved has blamed the demise of Hollywood’s self-enforced production code and its replacement with the G/PG/R/X rating system as alienating a big chunk of traditional moviegoers in the late 1960s. Concurrently, the urban “youth” market of the 1960s felt alienated by an industry still churning out formula clones of the last big film by “Old Hollywood,” The Sound of Music. The failure of so many of those films that came in its wake, including Doctor Dolittle, Hello, Dolly!, Star! and other expensive, out of control musicals and family-oriented movies, nearly drove 20th Century Fox to financial ruin, and ultimately caused the once-mighty MGM to effectively close up shop as a functioning studio.
During the late 1960s, age had caught up with the industry as well. In an era whose slogan amongst the left was “Don’t trust anyone over 30,” most Hollywood crews were manned by people double that age, who had broken in around the time of World War II or immediately afterwards, and weren’t planning to leave anytime soon. As Steven Spielberg told Biskind:
“It was not like the older generation volunteered the baton,” says Spielberg. “The younger generation had to wrest it away from them. There was a great deal of prejudice if you were a kid and ambitious. When I made my first professional TV show, Night Gallery, I had everybody on the set against me. The average age of the crew was sixty years old. When they saw me walk on the stage, looking younger than I really was, like a baby, everybody turned their backs on me, just walked away. I got the sense that I represented this threat to everyone’s job.”
Ultimately he was just such a threat — including, ironically enough, to many of the young turks in Biskind’s book. But prior to Spielberg’s rise as an industry unto himself, as Biskind tells it in Easy Riders, there were two milestones in the birth of New Hollywood in the late 1960s. The first was Bonnie & Clyde; the second was Easy Rider. As left-wing author Rick Perlstein told Reason magazine in 2008 while promoting his then-recent book Nixonland:
My theory is that Bonnie and Clyde was the most important text of the New Left, much more important than anything written by Paul Goodman or C. Wright Mills or Regis Debray. It made an argument about vitality and virtue vs. staidness and morality that was completely new, that resonated with young people in a way that made no sense to old people. Just the idea that the outlaws were the good guys and the bourgeois householders were the bad guys—you cannot underestimate how strange and fresh that was.
But along with Bonnie & Clyde’s subversive script (written by Robert Benton and David Newman, who got their start at Esquire magazine, then at the peak of its journalistic style and influence), at least the film had a known star in Warren Beatty, a ravishing-looking Faye Dunaway, whose career was still in its ascendancy, and a veteran director in Arthur Penn.
That’s not my headline; it was in the subject line of the email sent to me by Tablet magazine’s PR contact — and minus the question mark, to boot. Liel Leibovitz writes at the Jewish-themed magazine that “Steven Spielberg’s Schindler’s List is both a moral and an aesthetic disaster, an embodiment of much that is wrong with American-Jewish life” — and seems ‘surprised’ (likely not) that such fighting words have stirred up plenty of controversy:
Last week, Tablet Magazine published our list of the 100 greatest Jewish films of all time. At the very bottom was Schindler’s List. In a brief blurb, I called it an “astoundingly stupid” movie, which, in turn, inspired some of our readers to call me a “piece of shit” and a “neo-Nazi”—all for casting an aspersion on what, if they are to be believed, is everyone’s favorite Holocaust movie.
Which makes perfect sense: More than just a regrettable film, Schindler’s List neatly reflects the Manichean mindset of many American Jews, for whom mythology trumps memory and nothing lies beyond good and evil. Those who howled at me weren’t expressing a mere aesthetic judgment; they were defending a worldview.
To understand this worldview, we need only look at Schindler’s List. The film’s two main characters are Liam Neeson’s Oskar Schindler and Ralph Fiennes’ Nazi officer, Amon Goeth. The first is a philandering and greedy German who sees a little girl in a red coat and has a nearly instantaneous epiphany, realizing that life is precious and that Jews should be saved. The other is a monster; it’s no coincidence that the American Film Institute ranked Goeth at number 15 in its list of the 100 greatest villains of all time, just one spot below the slimy creature who terrorized Sigourney Weaver in Ridley Scott’s Alien. Goeth, too, is an otherworldly sort. He is not, like the real-life murderer on whom he is based, merely a hateful, opportunistic, and cruel young man who relished the chance to play god. He is impenetrable, predatory, inhuman. We have little reason to fear him more than we fear, say, the Nazis in Spielberg’s Raiders of the Lost Ark or the shark from Jaws; all are terrifying, but all are the sort of baddies we’ll only ever see on-screen, not the kind of ordinary and crooked and all-too-human scum living quietly next door and waiting for a stab at power.
There’s no doubt Spielberg’s sense of World War II history can be off-putting once you get beyond his powerful sense of composition, fluid camera motion, and John Williams’ score. As Mark Steyn noted 15 years ago, there’s plenty of nihilism and moral equivalence at work in Saving Private Ryan, the next WWII-themed movie Spielberg directed after Schindler:
Purporting to be a recreation of the US landings on Omaha Beach, Private Ryan is actually an elite commando raid by Hollywood and the Hamptons to seize the past. After the spectacular D-Day prologue, the film settles down, Tom Hanks and his men are dispatched to rescue Matt Damon (the elusive Private Ryan) and Spielberg finds himself in need of the odd line of dialogue. Endeavouring to justify their mission to his unit, Hanks’s sergeant muses that, in years to come when they look back on the war, they’ll figure that `maybe saving Private Ryan was the one decent thing we managed to pull out of this whole godawful mess’. Once upon a time, defeating Hitler and his Axis hordes bent on world domination would have been considered `one decent thing’. Even soppy liberals figured that keeping a few million more Jews from going to the gas chambers was `one decent thing’. When fashions in victim groups changed, ending the Nazi persecution of pink-triangled gays was still `one decent thing’. But, for Spielberg, the one decent thing is getting one GI joe back to his picturesque farmhouse in Iowa.
And Ryan would be far from the only — or the worst — example of a nihilistic WWII film from a Hollywood that during that period definitely took Leibovitz’s advice and moved far beyond good and evil. But is Schindler a “moral and aesthetic disaster,” as Leibovitz claims above? That seems more like an attempt to deliberately gin up controversy for its own sake. As always, please discuss in the comments below.
(Oh, and for what it’s worth, here’s my choice for the very worst Holocaust-related movie from Hollywood. So far.)