
PJ Lifestyle

Ed Driscoll

Blogging since 2002, affiliated with PJM since 2005, where he is currently a columnist, San Jose Editor, and founder of PJM's Lifestyle blog. Over the past 15 years, Ed has contributed articles to National Review Online, the Weekly Standard.com, Right Wing News, the New Individualist, Blogcritics, Modernism, Videomaker, Servo, Audio/Video Interiors, Electronic House, PC World, Computer Music, Vintage Guitar, and Guitar World.

‘This Never Happened. It Will Shock You How Much It Never Happened’

Wednesday, October 2nd, 2013 - by Ed Driscoll


One of the recurring themes of Mad Men’s early seasons is the postmodern belief that the past is fungible. George Orwell’s 1984 explored the concept on a mass scale, with Winston Smith toiling away in the bowels of the Ministry of Truth to manipulate the past, Soviet-style, to suit the current whims of his political masters. Mad Men looked at the concept from the individual point of view.

As every fan of the show already knows, Don Draper, Mad Men’s hero (or rather, anti-hero), is of course secretly Dick Whitman, a supremely ambitious social climber who heavily airbrushed a past spent growing up dirt poor in a Depression-era whorehouse and deserting his Army service in Korea. He accomplished the latter by switching dog tags with the commanding officer he accidentally killed — thus assuming his name and, as he later discovered, his wife — to become Don Draper. A decade later, at the climax of the show’s first season, a rival threatens to out Don’s past to his boss, whose classic response is contained within the scene that defined Mad Men’s solipsistic philosophy:

The climax of the first season of Mad Men, set at the dawn of the 1960s at a Madison Avenue advertising agency, is actually a brilliant anticlimax—a revelation swiftly followed by a re-veiling. Pete Campbell (Vincent Kartheiser), a clumsy striver at Sterling Cooper, attempts to topple the resident alpha dog, Don Draper (Jon Hamm), with what looks to be a career-ending disclosure: Draper, the firm’s dazzling creative director, is living under an assumed name; he’s a fraud, likely a Korean War deserter, and possibly worse. Campbell blurts it all out to the avuncular overlord, Bertram Cooper [Robert Morse], while Draper stands by silently, poker-faced, hands steady enough to light yet another cigarette. The elder statesman Cooper considers, waits an agonizingly long beat, and makes a purely utilitarian reply.

“Mr. Campbell, who cares?” Cooper asks calmly, his voice burring with pity and disdain for the youngster’s naive theatrics. “This country was built and run by men with worse stories than whatever you’ve imagined here.”

“The Japanese have a saying,” Cooper continues. “‘A man is whatever room he is in’ — and right now, Donald Draper is in this room.”

Perhaps the ultimate example of this philosophy occurs in the following season. Peggy, Don’s young protégée, a secretary turned advertising copywriter, gives up her baby for the sake of her career. Don shows up at the hospital shortly after she’s given birth and surrendered the child for adoption, and tells her, “Peggy, listen to me, get out of here and move forward. This never happened. It will shock you how much it never happened.”


Second Chance for John Frankenheimer’s Seconds on Blu-Ray

Sunday, September 1st, 2013 - by Ed Driscoll


Last week, the mailman delivered an Amazon box containing the Criterion Blu-Ray edition of the 1966 John Frankenheimer movie Seconds, starring Rock Hudson. Its arrival meant I could finally retire my 1997 laser disc edition of the film, one of the last 12-inch silver discs I purchased before switching to DVDs. But first, it meant a late-night viewing of one of the strangest and most unsettling movies produced by mid-‘60s Hollywood.

Forget Dr. Strangelove’s obsession with fluoridation — something strange had gotten in the water in the mid-1960s. Maybe it was a collective premonition that the overreach of the Johnson Administration’s Great Society would very likely cause it to fail, as it attempted to fight the war on poverty, the war on racism, the space race, the Cold War, and the hot war in Vietnam, all simultaneously.

Perhaps it was the cognitive dissonance of the left, unable to process the fact that Johnson was only in office because President Kennedy was shot by “some silly little Communist,” as the newly widowed Jackie Kennedy muttered upon hearing about the motivations of the man who shot her husband. Instead of understanding that the Cold War had claimed her husband, Jackie, like most of the American left, couldn’t make the connection. The ideology of Kennedy’s assassin “robs his death of any meaning,” she added.

But giving meaning to life didn’t really interest the American left at the height of the Cold War. In the early days of the 20th century, pioneering, self-described “Progressives” championed better working conditions for the common man. Once America’s postwar economic boom meant that many men had them, and were moving to the suburbs as a result, the left decided this was a bad thing. Hence the 1956 film The Man in the Gray Flannel Suit (to which both Seconds and today’s Mad Men owe much), and, by the time of the Kennedy era, Malvina Reynolds and Pete Seeger’s “Little Boxes,” with its references to suburban houses “all made out of ticky-tacky,” dubbed “the most sanctimonious song ever written” by fellow leftwing songwriter Tom Lehrer.

But this trend went into overdrive by the mid-‘60s, becoming a hatred of all things suburban that burns to this day, one of many poisoned leftwing wells from which our current president has drunk deeply. In 1966, director Frank Perry shot the film version of The Swimmer, the 1964 short story by the New Yorker’s John Cheever, which starred a buff-looking Burt Lancaster trapped in a 95-minute-long metaphor of a movie. As the title implies, Lancaster swam from pool to pool, chatting wistfully with his neighbors in their wealthy Connecticut suburb about missed opportunities, middle age, and social conformity.

The Swimmer, which is available in high-def streaming video from Amazon, wasn’t released by Columbia Pictures until 1968, perhaps because another, much darker film with a somewhat similar theme had bombed badly at the box office. In the early to mid-1960s, director John Frankenheimer had made a career of Cold War paranoid thrillers, first releasing The Manchurian Candidate in 1962, starring Frank Sinatra and Laurence Harvey. In 1964, Frankenheimer helmed Seven Days in May, with Kirk Douglas and Burt Lancaster starring in a film about American generals attempting a coup against a dovish liberal president. In 1966, Frankenheimer directed Seconds.

SPOILER ALERT: DON’T READ ANY FURTHER IF YOU HAVEN’T SEEN THE FILM YET, BUT MIGHT WANT TO. I’M ABOUT TO GIVE AWAY WIDE SWATHS OF THE PLOT. YOU WERE WARNED.


Paddy Chayefsky’s 1976 Film Network: Big Media’s How-To Guide for the Obama Era

Thursday, August 22nd, 2013 - by Ed Driscoll
Network’s theatrical release poster.

The recent corporate transfer of Alec Baldwin from permanent NBC Saturday Night Live guest host and star of the recently cancelled, low-rated NBC series 30 Rock to his upcoming gig as a raving anchor at MSNBC sounds like something out of Paddy Chayefsky’s 1976 film Network, not least because Baldwin’s in-house transfer was preceded by an outrageous homophobic slur, which old media — and not just NBC — worked very hard to bury. But then, there’s very little about television news that Network didn’t anticipate.

Because of the length of time needed for a movie to be green-lighted and then produced, few cinematic satires arrive at the apogee of their subject’s power. When Kubrick released Dr. Strangelove in 1964, the Air Force had already begun to move away from nuclear-equipped B-52 bombers to a missile-based attack system. By the time Robert Altman shot M*A*S*H in 1970, President Nixon was beginning to wind down American involvement in the Vietnam War. In the 1980s, films such as Red Dawn and 2010: The Year We Make Contact depicted America involved in future military conflicts with the Soviet Union, even as the latter was imploding. (Thank you, President Reagan.)

But when Network hit movie theaters in 1976, the original big three television networks were at the apex of their uncontested power; newspapers were losing readership, the World Wide Web was nearly 20 years off, and even CNN wouldn’t begin broadcasting until 1980. More importantly, talk radio, Fox News and the Blogosphere wouldn’t come into play for another 15 to 25 years. There was nothing to stop television’s untrammeled power, and seemingly no way for the individual to fight back.

“It’s Not Satire — It’s Sheer Reportage”

In his 2005 interview with Robert Osborne on Turner Classic Movies, Sidney Lumet, Network’s director, recalled that when he and Chayefsky were making their initial rounds on the interview circuit to promote the movie, “Paddy and I, whenever we’d be asked something about ‘this brilliant satire,’ we’d keep saying, ‘It’s not a satire — it’s sheer reportage!’ The only thing that hasn’t happened yet is nobody’s been shot on the air!”


Death Wish: Mr. Bronson’s Planet

Wednesday, August 7th, 2013 - by Ed Driscoll


Is it possible for a veteran actor to star in a motion picture that makes him a legend, assures his cinematic immortality, and ensures that while he’s still alive, he’ll always find work, and yet be completely miscast? Actually, it’s happened at least twice. In the late 1970s, Stanley Kubrick cast Jack Nicholson as Jack Torrance in The Shining. The film made Nicholson a legend, but in a way, he’s very badly miscast — Nicholson’s character seems pretty darn bonkers right from the start of the film, long before his encounters with the demons lurking within the bowels of the Overlook Hotel.

But arguably, a far worse case of miscasting is Charles Bronson in Michael Winner’s 1974 film Death Wish. When novelist Brian Garfield wrote the 1972 book that inspired the movie, he was hoping that if Hollywood ever adapted his novel to the big screen, a milquetoast actor such as Jack Lemmon would star. And Lemmon would actually have been perfect, since his character’s transformation from bleeding-heart liberal white-collar professional to crazed vigilante would have been all the more shocking. Instead, we all know it’s only a matter of time before Charles Bronson reveals his legendary tough-guy persona on the screen. Back around 2000, I remember reading Garfield’s notes on his book’s Amazon page, which said something along the lines of, “Would you want to mess with Charles Bronson?”

Currently, the cinematic adaptation of Death Wish is available for home viewing in standard definition on DVD and in high definition via Amazon’s Instant Video format. And while the latter version is in sharp 1080p HD, the film could use a restoration from Paramount before it’s issued on Blu-Ray. The Amazon version has its share of scratches and dust on its print, though it’s certainly cleaner than the Manhattan it depicts on screen. I watched the Amazon HD version the other night, and I was reminded that Bronson’s casting dispenses with the film’s credibility almost as explosively as Bronson himself dispatches assailants onscreen. There are eight million stories in the naked city, and apparently, in 1974, almost as many muggers stupid enough to go up against Charles Bronson.

But otherwise, the timing of the film was absolutely perfect. As Power Line’s Steve Hayward noted in The Age of Reagan: The Fall of the Old Liberal Order: 1964-1980, film critic Richard Grenier dubbed Clint Eastwood’s 1971 film Dirty Harry, “the first popular film to talk back to liberalism,” a movie made during the period that then-Governor Ronald Reagan “liked to joke that a liberal’s idea of being tough on crime was to give longer suspended sentences,” Hayward added.

Which helped set the stage not just for Death Wish, but for the era of moral collapse in which it was filmed, and in which it too became a hit by talking back to liberalism.

Peter Biskind’s 1998 book Easy Riders, Raging Bulls documented Hollywood’s near-complete takeover by the left beginning in the late 1960s, but there were a few holdouts during that era: John Wayne was still making movies, Eastwood’s long career was beginning its ascendancy, and British director Michael Winner was himself a conservative.

But on the East Coast, in the early 1970s, New York had essentially collapsed. Saul Bellow was one of the first novelists to document the moral and increasingly physical carnage. As Myron Magnet of City Journal wrote in the spring of 2008, “Fear was a New Yorker’s constant companion in the 1970s and ’80s. … So to read Saul Bellow’s Mr. Sammler’s Planet when it came out in 1970 was like a jolt of electricity”:

The book was true, prophetically so. And now that we live in New York’s second golden age — the age of reborn neighborhoods in every borough, of safe streets bustling with tourists, of $40 million apartments, of filled-to-overflowing private schools and colleges, of urban glamour; the age when the New York Times runs stories that explain how once upon a time there was the age of the mugger and that ask, is New York losing its street smarts? — it’s important to recall that today’s peace and prosperity mustn’t be taken for granted. Hip young residents of the revived Lower East Side or Williamsburg need to know that it’s possible to kill a city, that the streets they walk daily were once no-go zones, that within living memory residents and companies were fleeing Gotham, that newsweeklies heralded the rotting of the Big Apple and movies like Taxi Driver and Midnight Cowboy plausibly depicted New York as a nightmare peopled by freaks. That’s why it’s worth looking back at Mr. Sammler to understand why that decline occurred: we need to make sure it doesn’t happen again.

That was the milieu in which Bronson’s Paul Kersey character resided at the start of Death Wish. Shortly after Kersey and his wife (played by veteran actress Hope Lange) fly back to New York from a relaxing Hawaiian vacation, his wife is murdered and his daughter raped by home invaders led by a young Jeff Goldblum at the start of his acting career. (Near the end of the film, a pre-Spinal Tap Christopher Guest plays a nervous rookie NYPD cop.) On a business trip out to Tucson, both to take his mind off the horrors that had befallen his family and to get a real estate development project back on track, Bronson’s Kersey discovers that it’s possible to defend yourself against crime.

The businessman that Kersey meets during the film’s Tucson scenes, played by character actor Stuart Margolin, is a staunch Second Amendment supporter who invites Kersey to a gun range and asks him, “Paul, which war was yours?” That was a common question among middle-aged men during the latter half of the 20th century. Kersey admits he was a “C.O. in a M*A*S*H unit” in Korea.

“Oh, Commanding Officer, eh?” Margolin’s Good Ol’ Businessman approvingly asks.

“Conscientious Objector,” Bronson’s Kersey drolly replies as Margolin rolls his eyes in disgust.

Kersey explains that he became one as a teenager, after his father was shot and killed in a hunting accident, quickly fleshing out his character’s backstory. Evidently, Kersey’s own skills as a hunter haven’t degraded much over the years, since he then aims and fires the pistol that Margolin’s character had handed him, splitting the paper target at the gun range dead center.

And away we go.


Pioneer Elite SC-75 Home Theater Receiver Review

Wednesday, August 7th, 2013 - by Ed Driscoll


I distinctly remember two mile markers on the road to my obsession with home theater. The first was a 1987 article in Billboard magazine exploring the continuing popularity, against all odds, of the 12-inch laser disc format with movie collectors. The article mentioned an obscure California firm called “The Criterion Collection” that was selling Blade Runner and 2001: A Space Odyssey in something called a “letterboxed” format, which allowed viewers to see the entire frame of those magnificently photographed widescreen movies on a home television set, instead of the “panned and scanned” version, which cut off the sides of the frame. It also mentioned the superiority of the laser disc format compared to fuzzy VHS tapes, and that laser disc allowed for such ancillary items as directors’ commentaries on auxiliary audio channels, behind-the-scenes still photos, trailers, deleted scenes, and other fun pieces of memorabilia.

This sounded pretty darn awesome. Shortly thereafter, I purchased my first laser disc player, the predecessor to the DVD, which wouldn’t arrive in stores for another decade. At its best, the picture and sound quality of laser discs blew VHS away, and I was quickly hooked. Particularly when I stumbled upon a nearby video store that rented laser discs.

The second mile marker arrived two years later. That’s when I walked out of the B. Dalton Bookstore in New Jersey’s Burlington Mall holding a copy of the second issue of Audio/Video Interiors, the magazine that put the words “home theater” on the map. It was essentially Architectural Digest meets Stereo Review, with photo layout after photo layout filled with sophisticated audio and video components beautifully photographed in stunning home settings, including some of the first dedicated home theaters designed to look like the classic movie palaces of the 1930s, such as Theo Kalomirakis’ Roxy Theater, a knockout design built in the basement of his Brooklyn brownstone. (Kalomirakis would go on to make a living as a home theater designer, producing works for extremely well-heeled clients that make his initial Roxy look positively modest by comparison.)

I gravitated more to the media rooms the magazine featured than the dedicated home theaters. Media rooms were rooms designed for a variety of media consumption (music, TV, movies, concert videos), with the electronics tastefully combined into some sort of attractive cabinetry or hidden in the wall. Maybe that’s because my father had a custom built-in unit installed in his basement in 1969 to house his stereo equipment and a small portion of his enormous (3000+) collection of jazz and big band records. Adding video and surround sound to that concept seemed like a natural to me.

I’ve kept most of the issues from Audio/Video Interiors’ initial run; I was immensely proud to have written a few articles for the magazine in the late 1990s. In retrospect, it’s fun to look back at the first issues of AVI and realize how much technology has progressed since then. HD video replaced “Never Twice the Same Color” low-def analog TV. VHS is all but extinct. Dolby Pro-Logic, the first consumer surround sound format, has been upgraded first to Dolby 5.1 and now to Dolby 7.1 and beyond. CDs have been rendered increasingly obsolete, particularly for casual listening, by MP3s. The laser disc was replaced 15 years ago by the DVD, which has now been supplanted by Blu-Ray and, increasingly, by streaming high-definition movies, such as those offered by Netflix and Amazon Instant Video.

Welcome to 2013

Apologies for burying the lede, but this brings us to one of Pioneer’s newest A/V receivers, the Pioneer Elite SC-75. The first Pioneer receiver I owned was the classic Pioneer VSX-D1S of 1990, one of the first receivers designed with what we now call home theater in mind, with as much emphasis on controlling video components as the CD player, the tape deck and the record player. Since then, Pioneer has been upgrading the electronics of its units to keep pace with the changing world of home theater technology. I purchased the SC-75 to replace my Pioneer Elite VSX-72TXV, which was built in 2006. With six years passed, and the proliferation of streaming video set-top boxes such as the Roku (which we reviewed last year), the addition of LAN and wireless networking technology to many Blu-Ray players, the popularization of Androids and iPods/iPads as music and video devices, and the standardization of the HDMI format for connecting video components, the SC-75 is a very different beast compared to the previous generation of Pioneer Elite receivers.

The differences aren’t apparent at first glance; the only thing that initially sets the unit apart from its predecessors is its brushed metal finish, instead of the smooth piano-black styling of older Pioneer Elite models. (Pioneer has apparently also permanently retired the beautiful rosewood-veneered side panels of the first generation of Elite models, which is a pity; on the other hand, perhaps they simply don’t want to be raided by the lumber fascists, à la Gibson.)


Peter Robinson Interviews Tom Wolfe, Documentarian of the New Antiquity

Wednesday, July 24th, 2013 - by Ed Driscoll

Ricochet’s Peter Robinson begins his new interview with Wolfe by reading from the central leitmotif of Wolfe’s latest book, Back to Blood:

A phrase pops into his head from out of nowhere. “Everybody… all of them… it’s back to blood! Religion is dying… but everybody still has to believe in something. It would be intolerable — you couldn’t stand it — to finally have to say to yourself, ‘Why keep pretending? I’m nothing but a random atom inside a supercollider known as the universe.’ But believing in by definition means blindly, irrationally, doesn’t it. So, my people, that leaves only our blood, the bloodlines that course through our very bodies, to unite us. ‘La Raza!’ as the Puerto Ricans cry out. ‘The Race!’ cries the whole world. All people, all people everywhere, have but one last thing on their minds — Back to blood!” All people, everywhere, you have no choice but — Back to blood!

Beyond Wolfe’s own introduction, I thought John O’Sullivan had one of the best takes on Wolfe’s new book (hidden behind the National Review subscriber paywall, alas), describing it as a battlefront report on the warring “Tribes of Post-America”:

Wolfe’s vision of the America emerging from the chaos of modernity is eerily similar to the Rome of Antiquity before Constantine. Where that antiquity was pre-Christian, this New Antiquity is post-Christian. Its original brand of Protestant Christianity no longer influences the politics, institutions, and laws of the nation it once shaped. The WASP elites, for whom Protestantism was long a mark of respectability and soundness, no longer even pretend to believe. It is a genuine religious faith for only a tiny number of people. Its secular expressions, “American exceptionalism” and “the American Creed,” are in only slightly better shape. The former provoked President Obama into an embarrassed meandering as he sought to reconcile his cosmopolitan disdain for it with its popularity among the rubes; the latter has been redefined into its opposite, an umbrella term covering a multitude of tribes and their different customs, namely multiculturalism.

This transformation from the Great Republic to the New Antiquity has happened in large measure in order to accommodate the growing number of immigrant groups forcing their way into the metropolis. It is a colder and crueler world: Inside the cultural ghettos, the new tribes of post-America retain much of their old affections and loyalties; outside them, they treat others with wariness and distrust. And they are slow to develop a common attachment to their new “home.”

“As mainline WASP Christianity shrivels,” O’Sullivan writes, “other cults flourish in its place: the ethnicity cult, of course; the arts cult for the very rich; the sex cult for the young; the celebrity cult for professionals; the psychology cult for billionaire clients; a religion cult (non-traditional religion, of course) for the perplexed; and the cult of wealth for everyone. Only the Gods of the Copybook Headings are missing from this teeming agora through which Wolfe’s characters pursue their fantasies and flee from their anxieties.”

Robinson asks Wolfe about this passage. Also in the interview, he asks Wolfe about his writing process (Wolfe cranks out his prose on a manual typewriter, or even in pencil, considering how difficult it is for him to keep a typewriter in proper repair), and the importance of taking notes and jotting down ideas as quickly as possible while reporting in the field, before the brain’s memory decay erases the tiny details of a scene that add verisimilitude to his writing.

Needless to say, watch the whole thing.

******

Cross-posted from EdDriscoll.Com


Church Without God?

Sunday, June 30th, 2013 - by Ed Driscoll

CNN stumbles into “a church without one big player: God”:

Sunday’s congregation in Cambridge is a meeting of the Humanist Community at Harvard University and the brainchild of Greg Epstein, the school’s Humanist chaplain.

A longtime advocate for community building, Epstein and his group of atheists have begun to build their Cambridge community and solemnize its Sunday meetings to resemble a traditional religious service.

To Epstein, religion is not all bad, and there is no reason to reject its helpful aspects.

“My point to my fellow atheists is, why do we need to paint things with such a broad brush? We can learn from the positive while learning how to get rid of the negative,” he said.

Godless congregations

For Epstein, who started community-building at Harvard nearly 10 years ago, the idea of a godless congregation is not an oxymoron.

“We decided recently that we want to use the word congregation more and more often because that is a word that strongly evokes a certain kind of community – a really close knit, strong community that can make strong change happen in the world,” he said.

“It doesn’t require and it doesn’t even imply a specific set of beliefs about anything.”

Epstein is not alone in his endeavor. Jerry DeWitt, who became an atheist and left his job as an evangelical minister, is using his pastoral experience to build an atheist church in Baton Rouge, Louisiana.

CNN seems to think this is news, when the idea of a Godless church has been woven deep into the firmware of “Progressivism” ever since Friedrich Nietzsche and the men whom historian Martin E. Marty dubbed “The Bearded God-killers” (Nietzsche, Marx, Darwin and Freud) had their heyday in the late 19th century. However, man is hardwired to believe in something, and the following century could be viewed as one long attempt at finding an alternative God, via totalitarian regimes (Stalin, Hitler, Mao, Pol Pot, etc.) and the nature-worship of radical environmentalism. Not to mention drugs — recall Tom Wolfe exploring the religious motivations of ’60s drug users in his seminal mid-1970s essay “The ‘Me’ Decade and the Third Great Awakening,” which, as the second half of its title implies, is a lengthy treatise on all sorts of ways to build alternative religions that replace God with the self.

Oh — and then there’s the Obama cult of 2008, which involved no small amount of building BHO into an alternative God — including by a lot of people who should have known better. But then, as Umberto Eco wrote in 2005:

G K Chesterton is often credited with observing: “When a man ceases to believe in God, he doesn’t believe in nothing. He believes in anything.” Whoever said it — he was right. We are supposed to live in a sceptical age. In fact, we live in an age of outrageous credulity.

Found via Maggie’s Farm, which quips, “Atheist Churches…In my view, there is nothing wrong with social clubs.”

Related: Aaron Clarey on “When Atheists Aren’t Really Atheists.”

*****

Cross-posted from EdDriscoll.Com


Paula Deen Vs Quentin Tarantino

Tuesday, June 25th, 2013 - by Ed Driscoll


I’m not sure if Sorbo created it or is simply relinking to it, but in any case, that’s a trenchant juxtaposition spotted on actor Kevin Sorbo’s Facebook page by Twitchy, along with Sorbo noting, “And this is why I love the hypocrisy of America…..Oh, and thank God none of us have EVER been racist. Only this woman. Bad woman. Bad.”

Like Tarantino, as the Daily Caller notes, “‘N-word’ user Paula Deen is an Obama supporter:”

Television chef Paula Deen, who lost her gig Friday with the Food Network after admitting and apologizing for having used racial epithets, including the “N-word,” is a supporter of President Barack Obama and campaigned on his behalf in 2008.

The Southern-bred television personality also invited Michelle Obama onto one of her television programs and praised the first lady’s healthy eating agenda.

Deen, who has previously faced criticism for using excessive quantities of fat and salt in her dishes, is under fire after it was revealed in a lawsuit that she admitted using the N-word and other racial slurs in front of her employees.

Deen is also accused of sexual harassment, and was found in contempt of court for refusing to turn over a video in which she allegedly simulates a sex act on a chocolate eclair.

As Andrew Klavan noted today, watching Bill Maher’s Time-Warner-CNN-HBO cable show so you didn’t have to:

Maher incited the ire of the self-righteous left-wing panel on his HBO show Real Time with Bill Maher by asking, quite reasonably, “If you’re 66 years old, and you were raised in Georgia, and you were a child before the civil rights movement, do you get a bit of a pass?” He further commented: “I… think people shouldn’t have to lose their shows and go away when they do something bad. It’s just a word, it’s a wrong word, she’s wrong to use it, but do we always have to make people go away?”

“It tells you something about the state of debate in our country that Bill Maher has become a voice of reason,” Drew adds.

Her last step before being completely airbrushed down the cable TV memory hole is likely a tearful public auto-da-fé* via Oprah, another wildly enthusiastic 2008 Obama supporter whose career has also peaked dramatically in the years since. Or as Canadian columnist Rick Murphy dubbed her last year, “Oprah Winfrey, the Obama supporter fame left behind.”

That does seem to be a rapidly growing class of celebrity, doesn’t it? Or as Glenn Reynolds once quipped, “Everything Obama touches . . . .”


* It’s what you oughtn’t to do, but you do anyway.

*****

Cross-posted from EdDriscoll.Com


RIP I Am Legend Author Richard Matheson

Tuesday, June 25th, 2013 - by Ed Driscoll


Variety reports that Matheson was 87:

Richard Matheson, the sci-fi and fantasy novelist and screenwriter who influenced modern genre writers and directors and wrote numerous stories and books that were adapted as films, including “I Am Legend,” died on Sunday at his home in Calabasas, Calif., according to his publisher. He was 87 and had been ill for some time.

As well as creating source material for films including “What Dreams May Come,” “A Stir of Echoes” and “The Shrinking Man,” Matheson was a prolific film and TV scribe, responsible for some of the most popular “Twilight Zone” episodes as well as writing for nearly every other anthology series of the 1960s and ’70s, with credits including “Lawman,” “The Alfred Hitchcock Hour,” “Rod Serling’s Night Gallery,” “The Martian Chronicles,” “Amazing Stories” and the “Star Trek” episode “The Enemy Within.”

For “Twilight Zone,” Matheson wrote the classic William Shatner episode “Nightmare at 20,000 Feet.” The Hugh Jackman film “Real Steel” was adapted from his “Twilight Zone” episode “Steel.”

Between novels and screenplays, the IMDB lists Matheson as contributing to an astonishing 80 film and TV titles — a number which will keep growing, as spin-offs and reboots of his earlier work, such as the 1971 Charlton Heston vehicle The Omega Man being reworked into Will Smith’s I Am Legend, will likely continue for years to come.

******

Cross-posted from EdDriscoll.Com


Uh-oh: Kim Kardashian and Kanye West’s Daughter’s Name…

Friday, June 21st, 2013 - by Ed Driscoll


“It’s North West! Kim Kardashian and Kanye West’s daughter’s name is confirmed as birth certificate is leaked,” according to the London Daily Mail:

TMZ witnessed a signed birth certificate with the name North West signed by Kim, 32, and Kanye, 36, at Cedars-Sinai hospital.

People magazine confirmed the news with a source close to the Kardashian family.

And a source told UsWeekly that the couple have already given their daughter a nickname — she will be ‘Nori for short’.

They added that the child has no middle name.

The name North had previously been suggested as a possibility for the child, although many Kardashian fans had dismissed the idea as a joke.

Cary Grant and Alfred Hitchcock could not be reached for comment.

Incidentally, since I’ve been meaning to post a link to this all week, this is as good a place as any to post a reminder that when it comes to pop “music,” Time magazine sure can pick ‘em: In August of 2005, a Time magazine cover story dubbed West (Kanye, not Adam or Leslie) “Hip-Hop’s Class Act…Why he’s the smartest man in pop music.” The following month, West would prove he was neither of those things, as he dynamited an NBC fundraiser for Hurricane Katrina victims with his racist rant against then-President George W. Bush, despite the fact that, as DNC chairwoman Donna Brazile recently noted on CNN, “Bush came through on Katrina.”

Three years later, Time would run a story titled “Lil Wayne: The Best Rapper Alive.” This week, as Breitbart News noted, “Lil’ Wayne Stomps on U.S. Flag, Calls Country ‘Godless America.’”

We should have seen it coming, though perhaps we can predict future pop culture self-immolations. Who else has Time dubbed the best/smartest/classiest in their pop culture genres? Their implosion is likely just a matter of time.

*****

Cross-posted from EdDriscoll.com


RIP James Gandolfini, TV’s Tony Soprano

Thursday, June 20th, 2013 - by Ed Driscoll
James Gandolfini and his wife Deborah Lin at the LA premiere of Zero Dark Thirty on December 10, 2012. Gandolfini portrayed Leon Panetta. Photo by Featureflash / Shutterstock.com.

Multiple sources, including the New York Times and the New York Daily News, are reporting the death of James Gandolfini, who achieved iconic status as America’s favorite television mobster in HBO’s popular series The Sopranos, which aired from 1999 through 2007. The Daily News reports that the 51-year-old New Jersey-born actor “died following a massive heart attack in Italy,” according to a source close to the actor.

Deadline Hollywood adds:

Overweight, balding, rough around the edges with a thick New Jersey accent, Gandolfini was the opposite of a marquee leading man, destined to be a character actor, and yet he proved through his masterful acting that he could make Tony Soprano sexy and smart, towering and powerful. His portrayal was one of TV’s largest-looming anti-heroes — the schlub we loved, the cruel monster we hated, the anxiety-ridden husband and father we wanted to hug when he bemoaned, “I’m afraid I’m going to lose my family. Like I lost the ducks.” In the most maddening series finale in recent history — an episode chock full of references to mortality (life, death, a William Butler Yeats reference to the apocalypse, a bathroom reference to a “Godfather” bloodbath) — his was the show’s last image, seen just as the words “Don’t stop” were being sung on the jukebox. It generated such extreme reaction that the series’ fans crashed HBO’s website for a time that night trying to register their outrage that it ended with a black screen, leaving them not knowing whether Tony Soprano had been whacked. In large part due to Gandolfini’s charisma (“Jimmy was the spiritual core of our Sopranos family,” Chris Albright noted today), Season 5 of The Sopranos in 2004 remains the most watched series in HBO history, with 14.4 million viewers on average.

As the New York Times reports, Gandolfini “had an impressive list of character-acting credits but he was largely unknown to the general public when David Chase cast him in ‘The Sopranos’ in 1999:”

“I thought it was a wonderful script,” Mr. Gandolfini told Newsweek in 2001, recalling his audition. “I thought, ‘I can do this.’ But I thought they would hire someone a little more debonair, shall we say. A little more appealing to the eye.”

Fortunately for both HBO and us, Gandolfini got the part, and will live on in Hollywood mobster immortality, an apt successor to the gangster roles that made legends of Edward G. Robinson, Brando, Pacino and the ensemble cast of Goodfellas, which inspired The Sopranos.

******

Cross-posted from EdDriscoll.Com


Hollywood ‘Completely Broke.’ But That’s Good News, Right?

Monday, June 17th, 2013 - by Ed Driscoll


Big Hollywood links to a Salon article by Lynda Obst, the producer of Contact, Sleepless in Seattle, and TV’s Hot in Cleveland (among many other projects), setting up her quotes by first noting that “For consumers, the decline of the DVD market has meant switching over to both Blu-ray and, more recently, streaming options for their viewing pleasure. The end of the DVD format’s dominance meant something much more, and far worse, for Hollywood.”

In Salon, Obst writes:

“The DVD business represented fifty percent of their profits,” [20th Century Fox executive Peter Chernin] went on. “Fifty percent. The decline of that business means their entire profit could come down between forty and fifty percent for new movies.”

For those of you like me who are not good at math, let me make Peter’s statement even simpler. If a studio’s margin of profit was only 10 percent in the Old Abnormal, now with the collapsing DVD market that profit margin was hovering around 6 percent. The loss of profit on those little silver discs had nearly halved our profit margin.

This was, literally, a Great Contraction. Something drastic had happened to our industry, and this was it. Surely there were other factors: Young males were disappearing into video games; there were hundreds of home entertainment choices available for nesting families; the Net. But slicing a huge chunk of reliable profits right out of the bottom line forever?

This was mind-boggling to me, and I’ve been in the business for thirty years. Peter continued as I absorbed the depths and roots of what I was starting to think of as the Great Contraction. “Which means if nothing else changed, they would all be losing money. That’s how serious the DVD downturn is. At best, it could cut their profit in half for new movies.”

* * * * *

“When did the collapse begin?”

“The bad news started in 2008,” he said. “Bad 2009. Bad 2010. Bad 2011.”

It was as if he were scolding those years. They were bad, very bad. I wouldn’t want to be those years.

“The international market will still grow,” he said, “but the DVD sell-through business is not coming back again. Consumers will buy their movies on Netflix, iTunes, Amazon et al. before they will purchase a DVD.” What had been our profit margin has gone the way of the old media.

But it was in 2010 that James Cameron told the Washington Post that DVDs were bad for Gaia and other living things, and needed to be eliminated (while simultaneously having multiple versions of Avatar coming out that same year on DVD):

It’s a consumer product like any consumer product. I think ultimately we’re going to bypass a physical medium and go directly to a download model and then it’s just bits moving in the system. And then the only impact to the environment is the power it takes to run the computers, run the devices. I think that we’re not there yet, but we’re moving that direction. Twentieth Century Fox has made a commitment to be carbon neutral by the end of 2010. Because of some of these practices that can’t be changed, the only way to do that is to buy carbon offsets. You know, which again, these are interim solutions. But at least it shows that there’s a consciousness that we have to be dealing with carbon pollution and sustainability. …

And the following year, many in Hollywood went all-in with Occupy Wall Street, which was obsessed with the “obscene” profits made by gigantic multinational corporations. You know, like movie studios.

Presumably, losing the cushion of DVD sales is part of the reason why Steven Spielberg recently told a USC audience that, as the Hollywood Reporter paraphrased, “an ‘implosion’ in the film industry is inevitable, whereby a half dozen or so $250 million movies flop at the box office and alter the industry forever.”

But it’s not like Hollywood has much respect for the audience who buys the tickets to see those $250 million products during their initial run in theaters. Obst’s article on the collapse of her industry appears in Salon, which isn’t exactly sympathetic to Hollywood’s core audience in flyover country, given that its editor at large has a new book titled What’s the Matter with White People?: Finding Our Way in the Next America.

Similarly, in 2008, the late Nora Ephron, who in the previous decade had written and directed the Obst-produced Sleepless in Seattle, wrote in the Huffington Post, “This is an election about whether the people of Pennsylvania hate blacks more than they hate women. And when I say people, I don’t mean people, I mean white men.” Incidentally, those people in Pennsylvania whom Ephron was writing off as troglodytic racists were her fellow Democrats, who were about to decide between Obama and Hillary in the PA Democrat primary — the same primary voters that Obama wrote off at the time as bitter, gun-and-God-obsessed clingers.


More Orgasms = Longer Life Expectancy?

Tuesday, June 4th, 2013 - by Ed Driscoll

Sex can extend a man’s life, Men’s Journal claims:

“For men, the more the better,” he says. “The typical man who has 350 orgasms a year, versus the national average of around a quarter of that, lives about four years longer.” And more than those extra four years, Roizen says, the men will feel eight years younger than their contemporaries. Is there an optimal number of orgasms for the average man? Roizen suggests, with a straight face, that 700 a year could add up to eight years to your life. This is an ambitious prescription: The average American adult male has sex just 81 times a year.

Roizen’s formula may be new, but the benefits of sex and orgasms have been tracked for years, and there’s some compelling hard evidence to back Roizen’s claims. A Swedish study done in the ’80s found that 70-year-olds who made it to 75 were the ones still having sex, and a Duke University study that followed 252 people over 25 years concluded that “frequency of intercourse was a significant predictor of longevity.”

But the big kahuna of longevity studies was completed just 10 years ago in Wales. British scientists interviewed nearly 1,000 men in six small villages about their sexual frequency, then arranged for all death records to be forwarded so the scientists could record their life spans. Ten years later they determined that men who had two or more orgasms a week had died at a rate half that of the men who had orgasms less than once a month. “Sexual activity seems to have a protective effect on men’s health,” the researchers concluded.

…Unless you’re Michael Douglas, apparently:

Michael Douglas – the star of Basic Instinct and Fatal Attraction – has revealed that his throat cancer was apparently caused by performing oral sex.

In a surprisingly frank interview with the Guardian, the actor, now winning plaudits in the Liberace biopic Behind the Candelabra, explained the background to a condition that was thought to be nearly fatal when diagnosed three years ago. Asked whether he now regretted his years of smoking and drinking, usually thought to be the cause of the disease, Douglas replied: “No. Because without wanting to get too specific, this particular cancer is caused by HPV [human papillomavirus], which actually comes about from cunnilingus.”

Umm, ooooooooooooooohhhhhkaaaaaaaay. But if it’s actually true, Joe Jackson didn’t know the half of it when he wrote in 1982 that everything gives you cancer.

****

Cross-posted from Ed Driscoll’s Blog, where originally titled “Sexual Healing — Or the Lack Thereof”.


The Return of the Primitive

Friday, May 24th, 2013 - by Ed Driscoll

In his introduction to The Return of the Primitive: The Anti-Industrial Revolution, the 1999 update of Ayn Rand’s early-1970s anthology originally entitled The New Left, Peter Schwartz, the editor of the new edition, wrote:

Primitive, according to the Oxford English Dictionary, means: “Of or belonging to the first age, period or stage; pertaining to early times …” With respect to human development, primitivism is a pre-rational stage. It is a stage in which man lives in fearful awe of a universe he cannot understand. The primitive man does not grasp the law of causality. He does not comprehend the fact that the world is governed by natural laws and that nature can be ruled by any man who discovers those laws. To a primitive, there is only a mysterious supernatural. Sunshine, darkness, rainfall, drought, the clap of thunder, the hooting of a spotted owl — all are inexplicable, portentous, and sacrosanct to him. To this non-conceptual mentality, man is metaphysically subordinate to nature, which is never to be commanded, only meekly obeyed.

This is the state of mind to which the environmentalists want us to revert.

If primitive man regards the world as unknowable, how does he decide what to believe and how to act? Since such knowledge is not innate, where does primitive man turn for guidance? To his tribe. It is membership in a collective that infuses such a person with his sole sense of identity. The tribe’s edicts thus become his unquestioned absolutes, and the tribe’s welfare becomes his fundamental value.

This is the state of mind to which the multiculturalists want us to revert. They hold that the basic unit of existence is the tribe, which they define by the crudest, most primitive, most anti-conceptual criteria (such as skin color). They consequently reject the view that the achievements of Western — i.e., individualistic — civilization represent a way of life superior to that of savage tribalism.

Both environmentalism and multiculturalism wish to destroy the values of a rational, industrial age. Both are scions of the New Left, zealously carrying on its campaign of sacrificing progress to primitivism.

In addition to the shocking Islamic terrorist attack yesterday in London, a troika of pop culture-related stories making the rounds today reminds us that reprimitivization is well on its way.

First up, “Movement to Normalize Pedophilia Finds Its Poster Girl,” Stacy McCain writes in the American Spectator:

In January, Rush Limbaugh warned that there was “an effort under way to normalize pedophilia,” and was ridiculed by liberals (including CNN’s Soledad O’Brien) for saying so. But now liberals have joined a crusade that, if successful, would effectively legalize sex with 14-year-olds in Florida.

The case involves Kaitlyn Ashley Hunt, an 18-year-old in Sebastian, Florida, who was arrested in February after admitting that she had a lesbian affair with a 14-year-old high-school freshman. (Click here to read the affidavit in Hunt’s arrest.) It is a felony in Florida to have sex with 14-year-olds. Hunt was expelled from Sebastian High School — where she and the younger girl had sex in a restroom stall — and charged with two counts of “felony lewd and lascivious battery on a child.” The charges could put Hunt in prison for up to 15 years. Prosecutors have offered Hunt a plea bargain that would spare her jail time, but her supporters have organized an online crusade to have her let off scot-free — in effect, nullifying Florida’s law, which sets the age of consent at 16.

Using the slogan “Stop the Hate, Free Kate” (the Twitter hashtag is #FreeKate), this social-media campaign has attracted the support of liberals including Chris Hayes of MSNBC, Daily Kos, Think Progress and the gay-rights group Equality Florida. Undoubtedly, part of the appeal of the case is that Hunt is a petite, attractive, green-eyed blonde. One critic wondered on Twitter how long activists have “been waiting for a properly photogenic poster child of the correct gender to come along?”

Portraying Hunt as the victim of prejudice, her supporters claim she was only prosecuted because she is homosexual and because the parents of the unnamed 14-year-old are “bigoted religious zealots,” as Hunt’s mother said in a poorly written Facebook post. The apparent public-relations strategy was described by Matthew Philbin of Newsbusters: “If you can play the gay card, you immediately trigger knee-jerk support from the liberal media and homosexual activists anxious to topple any and all rules regarding sex.”

Meanwhile, giant cable television conglomerate Viacom must be especially proud of MTV today: “Trashy Former Pop Star Drinks Her Own Urine on MTV in Ratings Stunt,” Ace writes:

If you had questions about whether Ke$ha was a classy lady — questions that really ought not to persist, given that she really spells her name that way, “Ke$ha” — consider them now resolved.

Some are using this provocation as a justification for renewing the calls for a-la-carte cable subscriptions. “Some” are, in this case, correct.

Anyone who now has cable pays for MTV. Cable companies negotiate a flat payment to a station for carrying it. MTV also collects revenues from advertising, but a major source of its revenue is the automatic “tax” MTV imposes on your cable bill every month. You have no way to avoid paying for MTV — except for cancelling the service altogether.

Monopolies are generally not permitted to “bundle” services together. And local cable companies are usually monopolies, or, at best, have but one competitor — and as all of them have instituted this bundling practice and will not stop the practice no matter how much the public clamors for it, the monopolies (or duopolies) at least appear to be in collusion on this point.

And finally, while Robert Redford’s boyish shock of tousled hair and studio system hauteur hides a multitude of sins, his own primitivist mindset is lurking just under the surface, easily found:

Robert Redford today accused the US of losing its way in the years since the second world war, speaking at the press conference for his new film All Is Lost at the Cannes film festival.

“Certain things have got lost,” said Redford. “Our belief system had holes punched in it by scandals that occurred, whether it was Watergate, the quiz show scandal, or Iran-Contra; it’s still going on…Beneath all the propaganda is a big grey area, another America that doesn’t get any attention; I decided to make that the subject of my films.”

Redford, now 76, also had critical words for the US’s never-ending drive for economic and technological development, which he considers has been a damaging force.

“We are in a dire situation; the planet is speaking with a very loud voice. In the US we call it Manifest Destiny, where we keep pushing and developing, never mind what you destroy in your wake, whether it’s Native American culture or the natural environment.

“I’ve also seen the relentless pace of technological increase. It’s getting faster and faster; and it fascinates me to ask: how long will it go on before it burns out.”

Gee Bob, who gets to decide that technological progress will now officially be concluded? As Virginia Postrel told C-Span’s Brian Lamb in 1999 when promoting The Future and its Enemies:

The Khmer Rouge sought to start over at year zero, and to sort of create the kind of society that very civilized, humane greens write about as though it were an ideal. I mean, people who would never consider genocide*. But I argue that if you want to know what that would take, look at Cambodia: to empty the cities and turn everyone into peasants again. Even in a less developed country, let alone in someplace like the United States, that these sort of static utopian fantasies are just that.

Incidentally, that fawning profile of Redford appeared (but of course!) in the UK Guardian under the headline, “Robert Redford on America: ‘Certain things have got lost.’” Well, that can happen when elderly Hollywood multimillionaires make films condoning terrorism, which are in turn approved by a former presidential aide, on the morning show that’s aired nationwide on a TV network owned by the Disney Corporation.

In his 2006 book Our Culture, What’s Left Of It, Theodore Dalrymple wrote:

Having spent a considerable proportion of my professional career in Third World countries in which the implementation of abstract ideas and ideals has made bad situations incomparably worse, and the rest of my career among the very extensive British underclass, whose disastrous notions about how to live derive ultimately from the unrealistic, self-indulgent, and often fatuous ideas of social critics, I have come to regard intellectual and artistic life as being of incalculable practical importance and effect. John Maynard Keynes wrote, in a famous passage in The Economic Consequences of the Peace, that practical men might not have much time for theoretical considerations, but in fact the world is governed by little else than the outdated or defunct ideas of economists and social philosophers. I agree: except that I would now add novelists, playwrights, film directors, journalists, artists, and even pop singers. They are the unacknowledged legislators of the world, and we ought to pay close attention to what they say and how they say it.

Especially when the first thought is to turn away from the daily horrors our pop culture seems to bring forth in ever-greater numbers.
****

Cross-posted from Ed Driscoll’s Blog


RIP, Roger Ebert

Thursday, April 4th, 2013 - by Ed Driscoll

“Roger Ebert dies at 70 after battle with cancer,” reports the Chicago Sun-Times, the paper where he made his home for more than four decades:

For a film with a daring director, a talented cast, a captivating plot or, ideally, all three, there could be no better advocate than Roger Ebert, who passionately celebrated and promoted excellence in film while deflating the awful, the derivative, or the merely mediocre with an observant eye, a sharp wit and a depth of knowledge that delighted his millions of readers and viewers.

“No good film is too long,” he once wrote, a sentiment he felt strongly enough about to have engraved on pens. “No bad movie is short enough.”

Ebert, 70, who reviewed movies for the Chicago Sun-Times for 46 years and on TV for 31 years, and who was without question the nation’s most prominent and influential film critic, died Thursday in Chicago. He had been in poor health over the past decade, battling cancers of the thyroid and salivary gland.

He lost part of his lower jaw in 2006, and with it the ability to speak or eat, a calamity that would have driven other men from the public eye. But Ebert refused to hide, instead forging what became a new chapter in his career, an extraordinary chronicle of his devastating illness that won him a new generation of admirers. “No point in denying it,” he wrote, analyzing his medical struggles with characteristic courage, candor and wit, a view that was never tinged with bitterness or self-pity.

Always technically savvy — he was an early investor in Google — Ebert let the Internet be his voice. His rogerebert.com had millions of fans, and he received a special achievement award as the 2010 “Person of the Year” from the Webby Awards, which noted that “his online journal has raised the bar for the level of poignancy, thoughtfulness and critique one can achieve on the Web.” His Twitter feeds had 827,000 followers.

Unfortunately, Twitter revealed the intense far-left biases and raging misanthropy inside Ebert, which did much to tarnish the family-friendly middlebrow tone of his previous movie criticism. Ebert’s embrace of the unfiltered medium erased much of the good will he had developed through his years of co-hosting his weekly TV series At the Movies with Gene Siskel, his fellow Chicago-based critic, who had passed away in 1999.

Ironically, both men warned of the dangers of political correctness in the early 1990s:

GENE SISKEL: You have to summon up the courage to say what you honestly feel. And it’s not easy. There’s a whole new world called political correctness that’s going on, and that is death to a critic to participate in that.

EBERT: Political correctness is the fascism of the ‘90s. It’s kind of this rigid feeling that you have to keep your ideas and your ways of looking at things within very narrow boundaries, or you’ll offend someone. Certainly one of the purposes of journalism is to challenge just that kind of thinking. And certainly one of the purposes of criticism is to break boundaries; it’s also one of the purposes of art. So that if a young journalist, 18, 19, 20, 21, an undergraduate tries to write politically correctly, what they’re really doing is ventriloquism.

Ironically, I suspect that is the Ebert posterity will remember: the critic he was before he allowed his opinions to be consumed by what he had correctly dubbed “the fascism of the ’90s” — and beyond.

(Clicking on the Drudge Report, where I first saw the news of Ebert’s death, I also hope the horrific photo of Ebert after his cancer surgery, with much of his jaw removed, will somehow be retired from circulation. Alas, our less-than-middlebrow culture won’t allow that to happen.)

Update: At the Breitbart.com Conversation, John Sexton quotes a beautiful passage from Ebert, recorded for the commentary track on the Dark City DVD (the thinking man’s Matrix) before PC consumed his journalism.

More: Before Ebert’s middlebrow movie-critic phase and his final days as an archliberal polemicist, he was a screenwriter for Russ Meyer’s late-’60s and early-’70s sexploitation movies, including Beyond the Valley of the Dolls. Ebert wrote the camp-classic line, “This is my happening and it freaks me out!”, which would be spoofed by Mike Myers in the first Austin Powers movie — a spoof Ebert himself mentioned in his review.

Kathy Shaidle has that phase of Ebert’s career covered in a post with quotes and videos — plus a great catch: a remarkably thoughtless gaffe by the Chicago Sun-Times in Ebert’s obit.

Read bullet | Comments »

Far from Complete: Great Books Missing in the Kindle Format

Saturday, January 26th, 2013 - by Ed Driscoll

I was a slow convert to the idea of ebooks. My wife bought one of the first Kindles, and I couldn’t get past the off-putting appearance of the text on the screen in the Kindle’s first iteration. But then I tried the Kindle app for Windows. And the Kindle app for my Android Tablet. And slowly began to fall in love. I could read anywhere. I could free up space on my overflowing and limited physical bookshelves. I could easily quote what I had just read in a blog post. The idea of being able to carry my entire library with me and having it accessible in locations as diverse as the treadmill at the gym or a seat on an airplane became increasingly irresistible.

But not my entire library, alas. There are numerous examples of books that I’d repurchase in a second to read on my Kindle that simply aren’t there yet. Nor are they available on Barnes & Noble’s Nook e-reader; I’ve searched.

Off the top of my head, here’s what, in an ideal world, I’d like to see in the Kindle format. Amazon links are included if you’d like to get started reading any of these titles now in good ol’ dead-tree format — which might be a good idea, as I suspect the wait for some of these could be glacial.

Alvin Toffler’s Back Catalog: Toffler’s Future Shock was a huge bestseller when it was first published in 1970. A decade later, The Third Wave, the sequel to Future Shock, would be name-checked by Newt Gingrich during the heady days of the “Republican Revolution” in 1995, shortly after he became speaker of the House — which gives a sense of how well the book’s predictions had held up over the intervening 15 years. Toffler’s War and Anti-War applied the principles of the Third Wave to warfare; Powershift applied them to business. Given that The Third Wave was a pretty accurate prediction of how the Internet reshaped society in the 1990s, if any book deserves to be available in electronic format, it’s this one. Where is it? (For my interviews with Toffler, click here and here.)

Profiles of the Future, by Arthur C. Clarke: A quarter century before Star Trek: The Next Generation displayed its first replicator onscreen, Clarke was writing about them in Profiles, along with plenty of other futuristic technology — some we now take for granted (such as the Internet and the Kindle), and some still on the drawing board. Again, why isn’t such a forward-thinking book an ebook as well?

Filmguide to 2001: A Space Odyssey, by Carolyn Geduld. When Stanley Kubrick’s enigmatic 2001: A Space Odyssey left so many audiences baffled in the late 1960s, co-screenwriter Arthur C. Clarke was fond of saying, “Read the book, see the movie, repeat the dosage.” Right idea — but while Clarke’s novelization of 2001 is available on Kindle, it’s not necessarily the best book for cracking the film’s mysteries. If I had to hand one baffled 2001 viewer the Cliff’s Notes to the movie, it would be Geduld’s book from 1973, which thoroughly charts the film’s plot and leitmotifs.

The flat-panel news and information devices the astronauts read while eating dinner in 2001 directly inspired the iPad and Kindle. Now that technology has finally caught up with Kubrick’s 1968 vision, shouldn’t the book that places them into context be accessible on those devices as well?

The Death of the Grown-Up, by Diana West. The subhead of West’s book is “How America’s Arrested Development Is Bringing Down Western Civilization.” As Michelle Malkin noted in 2007 when she interviewed West about her book, others have written about society’s increasing child-like naiveté, but West was perhaps the first to explain how it has hamstrung our fight in what was once called the Global War on Terror. That we had (have?) a war named after a tactic rather than the enemy we’re fighting is because the GWOT received its name largely through a process of elimination; as West noted in her book and the articles that preceded it, political correctness allowed few other choices.

Read bullet | Comments »

Interview: The History of Epiphone Guitars

Sunday, January 13th, 2013 - by Ed Driscoll

For years, Walter Carter was the in-house historian at Gibson Guitars before serving a similar function for well-known vintage-guitar dealer George Gruhn. He has a new book out this month from Backbeat Books, The Epiphone Guitar Book: A Complete History of Epiphone Guitars. Its slick, glossy 160 pages are heavily illustrated, with many photos in color.

With a legacy dating back to the 1870s and Greek luthier Anastasios Stathopoulos, the Epiphone brand takes its name from two components — the nickname of Anastasios’ son, Epaminondas, and the word “phone,” which, in the 1920s when the brand was launched, competed with the word “radio” to symbolize high tech and modernity. (See also: Gramophone, the Radio Flyer, etc.)

Epiphone has had several twists and turns in its history. Until the mid-1950s, it competed neck and neck (pardon the pun) with Gibson for sales of arch-top jazz guitars. Ted McCarty, who built Gibson into a musical instrument powerhouse in the mid-20th century, said that “when I came to Gibson, the biggest competition we had was Epiphone.” But the death of Epi in 1943, followed by squabbles among the surviving Stathopoulos family during the following decade, caused the value of the business to plummet. McCarty acquired Epiphone for Gibson’s parent company at a bargain rate, and production of Epiphone guitars moved in-house to Gibson’s Kalamazoo, MI, plant during the 1960s. The new brand name gave Gibson certain advantages: it could protect the exclusive arrangements its dealers had with Gibson, yet sell Epiphone to nearby music dealers, positioning it as a slightly lower-tier brand — the Buick or Oldsmobile to Gibson’s Cadillac.

In the mid-1960s, Epiphone models were played by a little-known cult act called the Beatles — “Everybody but Ringo,” as Carter told me. McCartney played an Epiphone Texan acoustic on “Yesterday,” George Harrison played his Epiphone Casino on Sgt. Pepper, and John Lennon played his own Casino on the rooftop of Apple Records during their legendary last concert at the conclusion of Let It Be.

In the early 1970s, Gibson moved production of Epiphone guitars overseas. Today it exists, in part, as an entry-level brand for new guitarists (as such, there are likely more Epiphones in circulation than Gibsons), and there’s some controversy between those who own traditional made-in-America Gibson guitars such as the Les Paul and those who own Les Pauls and other models sold under the Epiphone name.

Carter discusses all that and much more in our 21-minute interview. Click here to listen:

Audio clip: Adobe Flash Player (version 9 or above) is required to play this audio clip. Download the latest version here. You also need to have JavaScript enabled in your browser.

(21:23 minutes long; 19.5 MB file size. Want to download instead of streaming? Right click here to download this show to your hard drive. Or right click here to download the 6MB lo-fi edition.)

If the above Flash audio player is not compatible with your browser, click on the YouTube player below, or click here to be taken directly to YouTube, for an audio-only YouTube clip. Among those options, you should find a format that plays on your system.

Read bullet | Comments »

Finally: The Amazon Music Cloud Arrives on Roku

Thursday, December 13th, 2012 - by Ed Driscoll

It magically appeared there sometime early this morning — a pleasant surprise after months of waiting; CNET adds:

Amazon’s cloud music service is now available on Roku and Samsung Smart TVs, offering the ability to stream your own digital music tracks without needing to keep a separate computer running. For Roku, it’s a solid response to Apple’s iTunes Match service, which offers cloud storage and streaming for $25 per year.

While Amazon Cloud Player started off as a largely free service, it now requires a similar fee as iTunes Match: $25 per year for up to 250,000 uploaded songs. That’s a ton of digital music, although the competing Google Play Music allows you to store up to 20,000 tracks for free and is available on Google TV devices.

The release comes on the same day Amazon added an Amazon Instant Video app to the iPhone and iPod Touch as well.

For our original review of the Roku box from January, click here.

Read bullet | Comments »

Mies van der Rohe: Creating the Architectural Language of 20th Century America

Wednesday, November 28th, 2012 - by Ed Driscoll

Television’s Mad Men would have you believe that America was a monolithic bastion of Puritanism, untrammeled by European or socialist influences (despite the rise of Woodrow Wilson and FDR!), until the Beatles touched down at JFK Airport in 1964. The reality, though, as Allan Bloom memorably wrote in The Closing of the American Mind, was that almost immediately upon winning World War II, America began slowly — often unwittingly — to become an unofficial enclave of Germany’s Weimar Republic.

Take architecture. As Tom Wolfe noted in From Bauhaus to Our House, his classic debunking of modernism’s excesses, because America’s intellectuals tend to think of themselves as an artistic colony in thrall to Europe, when the leaders of the Weimar-era German Bauhaus of the 1920s were evicted by the Nazis, they were welcomed by Depression-era American universities as “The White Gods! Come from the skies at last!”

[Walter Gropius, the founder of the Bauhaus] was made head of the school of architecture at Harvard, and Breuer joined him there. Moholy-Nagy opened the New Bauhaus, which evolved into the Chicago Institute of Design. Albers opened a rural Bauhaus in the hills of North Carolina, at Black Mountain College. [Ludwig Mies van der Rohe, its last director when the Nazis shuttered its doors in 1933] was installed as dean of architecture at the Armour Institute in Chicago. And not just dean; master builder also. He was given a campus to create, twenty-one buildings in all, as the Armour Institute merged with the Lewis Institute to form the Illinois Institute of Technology. Twenty-one large buildings, in the middle of the Depression, at a time when building had come almost to a halt in the United States — for an architect who had completed only seventeen buildings in his career—

O white gods.

Mies van der Rohe (1886-1969) is the titular subject of the newly published biography by architectural historian Franz Schulze and architect Edward Windhorst (who studied his craft under a protégé of Mies). They’ve collaborated on an extensively — very extensively — revised version of the biography of Mies that Schulze first published in 1986, the centennial of Mies’s birth.

Mies van der Rohe’s 1929 Barcelona Pavilion, May 2000. Photo © Ed Driscoll.

While he was America’s most influential postwar modern architect and teacher, Mies never quite became a household name on the same order as Frank Lloyd Wright (despite a prominent Life magazine feature in 1957). But he’s been the subject of numerous biographies and book-length profiles, beginning with his prominent role in The International Style, the pioneering 1932 Museum of Modern Art exhibition by Philip Johnson and Henry-Russell Hitchcock, which first put modern architecture on the map in America.

Even as Mies was associated with several prominent buildings deserving of respect after World War II, perhaps his greatest accomplishment was to singlehandedly invent the language of postwar American architecture. We take tall steel-and-glass office buildings and apartments for granted, but it was Mies who created their look, beginning with 1951’s Farnsworth House (which would also provide the inspiration for Philip Johnson’s own Glass House) and, from that same year, the 860-880 Lake Shore Drive apartment complex.

Read bullet | Comments »

RIP, Larry Hagman, 81

Friday, November 23rd, 2012 - by Ed Driscoll

The Dallas Morning News reports that J.R. Ewing has retired to the Texas-sized ranch in the sky:

Larry Hagman, who played the conniving and mischievous J.R. Ewing on the TV show Dallas, died Friday at Medical City in Dallas, of complications from his recent battle with cancer, his family said.

He was 81.

“Larry was back in his beloved Dallas re-enacting the iconic role he loved most,” his family said in a written statement. “Larry’s family and close friends had joined him in Dallas for the Thanksgiving holiday. When he passed, he was surrounded by loved ones. It was a peaceful passing, just as he had wished for. The family requests privacy at this time.”

The role of J.R. transformed Mr. Hagman’s life. He rocketed from being a merely well-known TV actor on I Dream of Jeannie and the son of Broadway legend Mary Martin, to the kind of international fame known only by the likes of the Beatles and Muhammad Ali.

Mr. Hagman made his home in California with his wife of 59 years, the former Maj Axelsson. Despite obvious physical frailty, he gamely returned to Dallas to film season one and part of season two of TNT’s Dallas reboot.

Reuters’ obit adds that Hagman “had suffered from cancer and cirrhosis of the liver in the 1990s after decades of drinking.” According to Wikipedia, “In August 1995, Hagman underwent a life-saving liver transplant after admitting he had been a heavy drinker. Numerous reports state he was drinking four bottles of champagne a day while on the set of Dallas. He was also a heavy smoker as a young man, but the cancer scare was the catalyst for him to quit.”

RIP.

 

Read bullet | Comments »

Mastering the Music Domain

Saturday, October 6th, 2012 - by Ed Driscoll

For those who enjoy recording their own music or podcasts at home, mastering is one of the lesser-known aspects of the process. Most people are aware of overdubbing, editing, and mixing, but comparatively few understand how critical mastering can be in adding the final sparkle to a mix — how it can transform a pretty good mix into something amazing, or (sometimes, with a little luck) a poor mix into something tolerable.

In the professional world, mastering is usually done using lots of very expensive outboard gear, as the final step before a master copy of a CD is sent to be duplicated into millions of consumer discs, or an album of MP3s is uploaded to iTunes and Amazon.

In the not necessarily professional world of home recording, mastering can be done with a plug-in effect.

For over ten years, iZotope Inc., located near Boston, has been producing a high-end mastering plug-in for recording programs called Ozone. Now in its fifth iteration, it’s available for most PC- and Mac-based recording programs, as well as for Pro Tools, the most popular professional recording system.

When I interviewed Jeremy Todd, the company’s chief technology officer (and a musician himself — he was trained as a classical pianist), for a Blogcritics article on an earlier iteration of Ozone back in 2004, he told me:

Mastering in general is tough to put your finger on; I guess it depends on who you’re talking to. But for the purposes of Ozone, we talk about everything that you do once you’ve got a stereo mixdown, to when you actually have a master and you say, “OK, this is the audio, this is it, we’re not touching it anymore.”

With Ozone, we try to include everything that someone would need, so that — while it’s not always the case — in theory they wouldn’t need another plug-in; they could do it all in one.

How was mastering done before the days of computers and hard disk recording? Todd says:

There were trends established way back when that are still present today. We’re still seeing examples of these standalone hardware devices. Things were much more isolated; you wouldn’t see as much all-in-one gear, and you’d have these big, honking pieces of equipment that were just an equalizer — and a two- or three-band equalizer at that — or just a finalizer, a loudness maximizer.

Obviously, if you go back far enough, mastering was dominated by analog equipment. So with Ozone, we’re trying to capture some of the flavor that people liked, which was a big challenge when it came to designing the DSP. It’s very difficult for people to explain why they like their two-band analog equipment. So it boiled down to a lot of listening tests, and asking people a lot of questions.

We tried to keep a little of the analog flavor in the sound, in our previous versions of Ozone. [Beginning] in Ozone 3, the analog modeling was firmly established, but people have been saying that in some cases, they want something cleaner; they don’t want any flavor, they want to be more surgical with the tool. So we added a digital component to the equalizer and the multi-band crossover.
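For readers who think in code, here’s a minimal, hypothetical sketch of the chain Todd describes: take a finished stereo mixdown, add a touch of EQ, then push the level up against a ceiling. It assumes Python with numpy and scipy; the function names and parameters are my own illustrations, and nothing here is iZotope’s actual DSP — a real maximizer uses lookahead limiting and smooth gain envelopes rather than the hard clip shown below.

```python
# A minimal, hypothetical sketch of a mastering chain: gentle EQ, then a
# loudness maximizer, applied to a finished stereo mixdown. Illustration
# only -- not iZotope's DSP.
import numpy as np
from scipy import signal

def high_shelf(audio, sr, freq=8000.0, gain_db=2.0):
    """Gentle high-shelf EQ: add 'air' above `freq` by blending the
    original signal with a high-passed copy of itself."""
    b, a = signal.butter(1, freq / (sr / 2), btype="high")
    highs = signal.lfilter(b, a, audio, axis=0)
    lift = 10 ** (gain_db / 20.0) - 1.0
    return audio + lift * highs

def loudness_maximizer(audio, ceiling_db=-0.3, drive_db=6.0):
    """Crude 'maximizer': apply makeup gain, then hard-clip to the ceiling.
    Real limiters use lookahead and smoothed gain reduction instead."""
    ceiling = 10 ** (ceiling_db / 20.0)
    driven = audio * 10 ** (drive_db / 20.0)
    return np.clip(driven, -ceiling, ceiling)

def master(mixdown, sr):
    """Stereo mixdown in, 'mastered' stereo out -- the whole chain."""
    return loudness_maximizer(high_shelf(mixdown, sr))

if __name__ == "__main__":
    sr = 44100
    t = np.linspace(0, 1.0, sr, endpoint=False)
    # Stand-in for a stereo mixdown: a quiet 220 Hz tone on both channels.
    mix = 0.25 * np.stack([np.sin(2 * np.pi * 220 * t)] * 2, axis=1)
    out = master(mix, sr)
    print(f"peak before: {np.abs(mix).max():.3f}, after: {np.abs(out).max():.3f}")
```

Crude as it is, the sketch shows why mastering lives at the end of the signal chain: every decision operates on the finished stereo mix, not on the individual tracks.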

With Home Recording, Mastering More Important Than Ever

Let’s take a moment to discuss how the mixing and mastering process has changed over the past 30 years for the average home recordist.

Back in the 1980s, when I first began to record demos of songs for my local rock group on a four-track, mixing was relatively easy…because there were only four tracks. (That’s actually a bit of a simplification — I used a fair amount of virtual tracks and outboard gear.) But I did all the mixes in real time and hoped for the best. For their time, they weren’t terrible demos — but certainly nobody would confuse them with a properly mixed and mastered track on a CD.

By the late 1990s, it was possible to replicate the process on a personal computer — and with infinitely more control over the individual tracks and the overall sound.

Read bullet | Comments »

The Home Recording Handbook

Sunday, September 23rd, 2012 - by Ed Driscoll

A decade ago, in one of my earliest reviews of a software-based recording program, I dubbed it “Abbey Road in a Box.” That may seem slightly hyperbolic at first, but today’s digital audio workstations (or DAWs for short) are incredibly sophisticated programs, combining the ability to record music digitally, then add built-in and aftermarket effects, and layer in a variety of software synthesizers and prerecorded loops as well. In short, they leave the stone knives and bearskins-level technology the Beatles had available to them in the 1960s in the dust.

But a DAW can seem as overwhelming at first as walking into a physical recording studio. As producer Brian Eno said of an actual mixing board 35 years ago:

Most people see a large mixer, and they’re completely bewildered because there are something like 800 or 900 knobs on it. Actually it’s not so complex as it looks – it’s the same thing repeated many times. Since you’re dealing with 24 tracks, everything has to be multiplied by 24; it’s not a very complex system. Each track from the tape recorder plays back on one channel of the mixer. Each individual channel has a whole set of controls that duplicate the other channels; that’s all.
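For the programming-minded, Eno’s observation amounts to a statement about data structures: a console is one channel strip’s worth of controls, instantiated once per tape track. A tiny sketch in Python (with illustrative names — no DAW actually exposes this API) makes the point:

```python
# Eno's point, restated as a data structure: a mixing console is one
# channel strip's worth of controls, repeated once per tape track.
from dataclasses import dataclass, field

@dataclass
class ChannelStrip:
    gain_db: float = 0.0      # input trim
    eq_low_db: float = 0.0    # simple three-band EQ
    eq_mid_db: float = 0.0
    eq_high_db: float = 0.0
    pan: float = 0.0          # -1.0 (left) .. +1.0 (right)
    fader_db: float = 0.0     # channel fader
    mute: bool = False

@dataclass
class Console:
    strips: list = field(default_factory=lambda: [ChannelStrip() for _ in range(24)])

desk = Console()
print(len(desk.strips), "identical channel strips")  # 24 -- '800 knobs', one idea
```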

But what do those knobs do — and more importantly — what can you do with them?

In other words, Abbey Road is just a series of acoustically treated rooms and electronic gear without the skill of the engineers and producers who know how to make it work. Paul White’s The Producer’s Manual, published by British electronic music house Sample Magic and written by the editor of Britain’s long-running Sound on Sound magazine, won’t by itself turn you into the second coming of George Martin. But at over 350 full-color, heavily illustrated pages, with a glossary defining all of its jargon, it’s an excellent guide to unlocking the power of the recording software and equipment you may already own — and to what to look for when shopping for your next piece of kit.

If you already own a DAW, it may well have many of the digital tools that White describes in The Producer’s Manual. But how do you make the most of them? What physical equipment do you need? What sort of sound card? How do you choose which microphone for which application? Which speakers will ensure your mixes still sound the same beyond your basement? Are the acoustics in your recording room up to snuff?

And then there are the actual recording techniques — which are what you’ve assembled all this gear for in the first place. What if you need to record an acoustic guitar? A chorus of background singers? How do you mic up a drum kit? Or heck, what if Christina Hendricks drops by and wants you to record her accordion playing?

OK, White doesn’t specifically mention Christina Hendricks — but he does go into how to record an accordion, along with all sorts of other instruments. And then how to edit, assemble, and master their parts — and how to salvage things afterwards if a session goes haywire. These are but a few of the topics that White explores. Beginners will learn much — I sure wish this book had been around a decade ago when I first made the leap to digital music recording after a decade toiling with cassette four-tracks. But those with plenty of experience in the brave new world of DAWs will find much to learn in this highly recommended book as well.

Read bullet | Comments »

‘A Mole, a Tree, and a Weasel:’ RIP Steve Sabol of NFL Films

Wednesday, September 19th, 2012 - by Ed Driscoll

Steve Sabol, the scion of the founder of NFL Films, passed away yesterday of a brain tumor at 69 — an age that’s far too young to die these days. I grew up about 20 minutes from the NFL Films offices in Mt. Laurel, NJ,* and in 2003 took a tour of their ultra-high-tech facilities — which make the bridge of the Starship Enterprise seem laughably antediluvian in comparison — as part of research that wound up doing double duty at the start of the following year for articles in Videomaker magazine and Tech Central Station. The other half of my prep work for those two articles involved interviewing Sabol on the phone. As he told me at the start of our conversation:

Steve Sabol: There’s an old Indian proverb that I’ve always believed in, and that’s ‘tell me a fact, and I’ll learn. Tell me a truth, and I’ll believe. Tell me a story, and it will live in my heart forever’.

And that’s been one of our mottos: telling a story. And the storytelling is basically done through the editing. It’s the cameraman’s job to come back with as much material — storytelling shots, action shots — as he possibly can. Then it’s up to the editor to tame and shape the raw vision of the cameraman.

I started out as an editor, and then became a cameraman. But that’s really the job of the editor. It’s so critical, and it’s one of the most overlooked art forms or disciplines in filmmaking. Most people don’t understand editing; they understand writing, they understand music, they understand cinematography. But when it comes to editing and the selection and order of the shots, that’s the key to storytelling.

Driscoll: Did being an editor first influence you when you became a cameraman?

Sabol: When I started out as an editor and tried to tell stories, I realized that there were certain gaps; that you couldn’t tell a story with just action shots. You needed shots that showed the passage of time, the sun shining through the portals of the stadium. You needed close-ups to show the reaction of the players to the game. You needed shots of the audience and the fans. You needed locator shots, as we call them, that set the scene. What’s the stadium look like? Is it a full stadium? Is it an empty stadium? And you need shots that can move the story along. It might be a pair of bloody hands. It could be cleat marks in the mud. It could be a crushed water bottle on the sidelines. It could be a flag whipping in the wind. These were all things that were important.

I was an art major in college, and Paul Cézanne, the famous French impressionist painter, once said that “all art is selected detail.” And I felt that the one thing that was missing in sports films was the details. And when I began as a cameraman, that was all I shot: the details. I filmed the first 15 Super Bowls, and never saw a play. But I could tell you what kind of hat Tom Landry was wearing, how Vince Lombardi was standing in the fourth quarter, if Bob Lilly had a cut on the bridge of his nose. Those were the things that I remember from the Super Bowl. I don’t remember any of the plays. I was just what we call a weasel.

Driscoll: What is a weasel?

Sabol: Well, we have three types of cameramen: we have a tree, a mole, and a weasel. A tree is the top camera. He’s on a tripod rooted into a position on the 50 yard line, and he doesn’t move. A mole is a handheld, mobile, ground cameraman, with a 12 to 240 lens, and he moves all around the field, and he gives you the eyeball-to-eyeball perspective. A weasel is the cameraman who pops up in unexpected places, to get you the telling storytelling shot—the bench, the crowd, all the details.

So those are the three elements. When you blend them together you get the NFL Films visual signature—when you blend together a mole, a tree and a weasel.

There was infinitely more than that, of course — NFL Films revolutionized how sports are covered on film and television, and transformed the National Football League into America’s leading sport. And as Sabol told the AP when his father was inducted into the Pro Football Hall of Fame, “We see the game as art as much as sport. That helped us nurture not only the game’s traditions but to develop its mythology: America’s Team, The Catch, The Frozen Tundra:”

When Ed Sabol founded NFL Films, his son was there working beside him as a cinematographer right from the start in 1964. They introduced a series of innovations taken for granted today, from super slow-motion replays to blooper reels to sticking microphones on coaches and players. And they hired the “Voice of God,” John Facenda, to read lyrical descriptions in solemn tones.

Until he landed the rights to chronicle the 1962 NFL championship game, Ed Sabol’s only experience filming sports was recording the action at Steve’s high school football games in Philadelphia.

* * * * *

He was the perfect fit for the job: an all-Rocky Mountain Conference running back at Colorado College majoring in art history. It was Sabol who later wrote of the Raiders, “The autumn wind is a pirate, blustering in from sea,” words immortalized by Facenda.

The Sabols’ advances included everything from reverse angle replays to filming pregame locker room speeches to setting highlights to pop music.

“Today of course those techniques are so common it’s hard to imagine just how radical they once were,” Steve told the AP last year. “Believe me, it wasn’t always easy getting people to accept them, but I think it was worth the effort.”

Indeed it was. RIP, Steve Sabol.

* But then, all of South Jersey is 20 minutes away from the rest of South Jersey.

(Cross-posted at Ed Driscoll.com.)

 

Read bullet | Comments »

The Dark Knight of the Sowell

Thursday, August 23rd, 2012 - by Ed Driscoll

On Sunday, Nina and I finally caught The Dark Knight Rises. We both enjoyed it*, but with a nearly three-hour running time, I felt sort of numb afterwards — and found newfound respect for the terse, minimalist, Jack Webb police-procedural feel of the half-hour Adam West Batman series from the 1960s.

OK, just kidding. But still, two hours and 44 minutes is way too long for anything that wasn’t directed by David Lean.

Speaking of which, at the Corner, Michael Walsh, linking to Andrew Klavan’s review in the Wall Street Journal, sees a Dr. Zhivago-esque subtext to the movie, which is obsessed with the dangers of revolution:

[I]f insanity is defined as doing the same thing over and over again and expecting a different result, what are we to make of every murderous Regressive movement from the French Revolution to the October Revolution to Mao and Pol Pot? All of them began in resentment and ended in oceans of blood. In fact, one of the worst things about being a Regressive is having to ride the tiger that eventually eats all of them. In Dr. Zhivago, the idealistic Pasha becomes the feared zealot Strelnikov who in turn becomes another of Stalin’s statistics. In this Batman installment, Bane’s raging Id and his secret controller’s lust for revenge are both defeated by heroes who understand where the truth lies.

In a spoiler-filled round-up at Big Hollywood, Ben Shapiro dubs The Dark Knight Rises, “Magnificent … And Most Conservative Film Ever.”

Most conservative film ever? Well…

Read bullet | Comments »