MSNBC hired Baldwin knowing he was already guilty of hurling anti-gay hate speech at a reporter. Back in June, Baldwin not only used his Twitter account to hurl anti-gay slurs, he threatened the reporter and called on his Twitter followers to “straighten out this f**king little bitch.”
Almost all of the mainstream media gave Baldwin a total pass for his reprehensible behavior last June. In fact, afterwards he won a weekly show on MSNBC (which is currently collapsing in the ratings).
Maybe the media was exhausted after destroying cooking guru Paula Deen over something she said thirty years ago.
Nolte refers to Baldwin as an “actor,” but as I mentioned back in August, when the idea for Baldwin’s show was initially floated by NBC, if the former actor is going to host a talk show for NBC, then he’s now in the same profession as the photographers he loves to threaten and hurl abuse at — they’re both working journalists attempting to gather news and information and present them to the public. And why NBC would sanction any of their employees to behave in such a fashion is a curious double-standard — almost as big a double-standard as contractually allowing Al Sharpton to remain an activist while also playing the role of NBC anchorman.
Between Al Sharpton’s history of rabble-rousing, fabulism, and anti-Semitism, Baldwin’s hair-trigger homophobia, and the number of microphones Ed Schultz must ruin each week with his spittle-flecked invective, keeping a safe distance from all of NBC’s toxic anchors seems like rather sound advice these days — perhaps for members of both parties.
To my shame, and against my principles, I have occasionally agreed to appear on television, though even less frequently than I have been asked. I have found those who work for TV broadcasting companies to be the most disagreeable people that I have ever encountered. I far preferred the criminals whom I encountered in my work as a prison doctor, who were more honest and upright than TV people.
— Theodore Dalrymple, “Television is an Evil,” November 3rd, 2013.
According to Oxford University research psychologist Kevin Dutton, TV/media ranks among the top three professions with the most psychopathic personalities employed.
When you hear the word “psychopath,” you likely think of Norman Bates, Patrick Bateman or horror films, but it has a real definition: “Psychopathy is a personality disorder that has been variously described as characterized by shallow emotions (in particular reduced fear), stress tolerance, lacking empathy, coldheartedness, lacking guilt, egocentricity, superficial character, manipulativeness, irresponsibility, impulsivity, and antisocial behaviors such as parasitic lifestyle and criminality.”
With that in mind, TV/media apparently ranks #3 among Dutton’s top 10 professions containing the most psychopathic personalities:
— “TV/Media Ranks Among Top 3 Professions with Most Psychopathic Personalities,” headline, Mediaite, November 8th.
If you’re thinking about buying Season Six of Mad Men on DVD or Blu-Ray, and haven’t made your purchase, let my mistake weigh into your decision. I ordered my copy from Amazon without checking the bonus features, simply assuming that there would be commentary from producer Matthew Weiner, and others from the show, from both behind and in front of the cameras, because that had been the case on the DVDs for all five of the show’s previous seasons. And considering how chaotic this season has been, separate and apart from this past year being set in the chaos and horror of America in 1968, I was definitely looking forward to hearing what Weiner and others on the show were thinking with some of their creative choices.
As you may have surmised by now, no such luck. This is the first season of Mad Men to be issued on disc with no commentaries whatsoever, and only a couple of meager bonus features. A disc review site called We Got This Covered adds:
AMC and Lionsgate, in this week’s Blu-Ray release of the show’s sixth season, seem to have inexplicably given in to the diminishing levels of hype. Past Mad Men Blu-Ray and DVD releases were events, arriving with creative packaging (remember the cigarette-lighter case from Season One?), packed to the gills with extras (including one or more audio commentaries per episode), and delivering some of the very best audio and visual presentations available on the Blu-Ray format, for television or for film. Season Six, on the other hand, lands on Blu-Ray with an abject whimper, maintaining the same level of A/V perfection, but without any of the fantastic extra content that previously made these home video packages such an enticing proposition. There are no audio commentaries whatsoever. Nowhere on the set can viewers hear from a single member of the cast, nor from creator Matthew Weiner, nor from any of the show’s other writers and directors. At most, there is a conversation with the Production Designers and Art Directors, but that piece happens to be shockingly poorly produced.
In short, where Mad Men itself keeps upping its creative game year in and year out, AMC and Lionsgate have dumped it on to Blu-Ray with only a modicum of effort. As a result, this is the first time a Mad Men season has not been an immediate must-buy on home video, not because of the quality of the show itself, but because AMC and Lionsgate have chosen to do wrong by the fans.
Perhaps AMC and Lionsgate can make amends by issuing free downloadable commentaries, Rifftrax-style, to accompany the discs. As the above review notes, the picture quality is excellent, arguably sharper than the shows looked when aired on AMC HD on DirecTV. But given that they’ll likely be issued in streaming format on Netflix in a couple of months, I definitely feel cheated buying these discs without commentary tracks. If you were looking forward to hearing these as well, buyer beware.
Now is the time when we juxtapose, Small Dead Animals-style:
Upon learning in 1928 of T. S. Eliot’s conversion to Christianity, Virginia Woolf wrote to her sister:
I have had a most shameful and distressing interview with poor dear Tom Eliot, who may be called dead to us all from this day forward. He has become an Anglo-Catholic, believes in God and immortality, and goes to church. I was really shocked. A corpse would seem to me more credible than he is. I mean, there’s something obscene in a living person sitting by the fire and believing in God.
Flash-forward to the present day:
Orson Scott Card is monstrously homophobic; he’s racist; he advocates violence and lobbies against fundamental human rights and equates criticism of those stances with his own hate speech.
I would never, ever suggest that a student seek out his advice. I will not pay to see Ender’s Game; I will never buy another copy.
…Card is a monster who helped me learn to write, an author of hateful screed whose novels taught lonely, angry kids compassion and gave them their first sense of home. None of those things makes the others go away.
—Rachel Edidin, “Orson Scott Card: Mentor, Friend, Bigot,” Wired magazine.
“Lou Reed, Velvet Underground Leader and Rock Pioneer, Dead at 71,” Rolling Stone reports:
Lou Reed, a massively influential songwriter and guitarist who helped shape nearly fifty years of rock music, died today. The cause of his death has not yet been released, but Reed underwent a liver transplant in May.
With the Velvet Underground in the late Sixties, Reed fused street-level urgency with elements of European avant-garde music, marrying beauty and noise, while bringing a whole new lyrical honesty to rock & roll poetry. As a restlessly inventive solo artist, from the Seventies into the 2010s, he was chameleonic, thorny and unpredictable, challenging his fans at every turn. Glam, punk and alternative rock are all unthinkable without his revelatory example. “One chord is fine,” he once said, alluding to his bare-bones guitar style. “Two chords are pushing it. Three chords and you’re into jazz.”
On the other hand, Lou was smart enough to hire Dick Wagner and Steve Hunter to play on the 1973 tour that led to his brilliant Rock ’n’ Roll Animal live album, and Wagner and Hunter could play. Their opening to Lou’s Velvet-era “Sweet Jane” is a brilliant piece of musicianship:
“Barack Obama is a Fabian socialist. I should know; I was raised by one,” Jerry Bowyer wrote at Forbes in a piece that was published on the day before the presidential election in 2008:
My Grandfather worked as a union machinist for Ingersoll Rand during the day. In the evenings he tended bar and read books. After his funeral, I went back home and started working my way through his library, starting with T.W. Arnold’s The Folklore of Capitalism. This was my introduction to the Fabian socialists.
Fabians believed in gradual nationalization of the economy through manipulation of the democratic process. Breaking away from the violent revolutionary socialists of their day, they thought that the only real way to effect “fundamental change” and “social justice” was through a mass movement of the working classes presided over by intellectual and cultural elites. Before TV it was stage plays, written by George Bernard Shaw and thousands of inferior “realist” playwrights dedicated to social change. John Cusack’s character in Woody Allen’s “Bullets Over Broadway” captures the movement rather well.
Arnold taught me to question everyone–my president, my priest and my parents. Well, almost everyone. I wasn’t supposed to question the Fabian intellectuals themselves. That’s the Fabian MO, relentless cultural and journalistic attacks on everything that is, and then a hard pitch for the hope of what might be.
That’s Obama’s world.
He’s telling the truth when he says that he doesn’t agree with Bill Ayers’ violent bombing tactics, but it’s a tactical disagreement. Why use dynamite when mass media and community organizing work so much better? Who needs Molotov when you’ve got Saul Alinsky?
So here is the playbook: The left will identify, freeze, personalize and polarize an industry, probably health care. It will attempt to nationalize one-fifth of the U.S. economy through legislative action. They will focus, as Lenin did, on the “commanding heights” of the economy, not the little guy.
Read the whole thing; it was astonishingly prescient stuff. “You’ve heard of the bully pulpit, right? Well, then get ready, because you’re about to see the bully part,” Bowyer concluded.
But then, as Bowyer noted at the beginning of his article, we had seen it all before.
Past performance is no guarantee of future results:
“I think that no matter what you would propose they would go against it because their determination was to destroy this person,” Redford said of the “minority faction” in Washington versus President Obama.
“Well, I think whatever idea I would have had to make things work just wouldn’t have been accepted by this minority faction,” Redford responded when asked by CNN’s Nischelle Turner for his “advice” for Democrats and Republicans to work together. “They wanted, if it meant destroying the government, anything to keep him [Obama] from succeeding.”
— Robert Redford today on CNN.
George Stephanopoulos was so enthusiastic towards Robert Redford and his sympathetic new film about an ex-1960s radical that the actor enthused, “You ought to get on the marketing team!” The aging actor/director appeared on Tuesday’s Good Morning America and endorsed the violent actions of protest groups. Reminiscing on his own past, the liberal Hollywood star recounted, “When I was younger, I was very much aware of the movement. I was more than sympathetic, I was probably empathetic because I believed it was time for a change.”
After Stephanopoulos wondered, “Even when you read about bombings,” Redford responded, “All of it. I knew that it was extreme and I guess movements have to be extreme to some degree.”
— Robert Redford in April, promoting his recent pro-terrorism film The Company You Keep, with ex-Bill Clinton aide George Stephanopoulos on Good Morning America.
Robert Redford was in Havana last month, not to score cigars but to screen his The Motorcycle Diaries for Cuban President Fidel Castro. The Motorcycle Diaries, which Redford produced, is based on the diaries Guevara wrote on a nine-month motorcycle trip through South America in 1952. Directed by Brazilian Walter Salles, it stars Gael Garcia Bernal (who moviegoers will remember from Y Tu Mama Tambien).
Guevara’s widow, Aleida March, attended the screening along with Guevara’s son and two daughters. The movie had its premiere at the Sundance Film Festival in January, where it received a standing ovation.
— The Baltimore Sun, March 7, 2004.
We think of William Randolph Hearst and the fictional Charles Foster Kane as media tycoons encasing themselves in living mausoleums as old men, but Johnny Carson was basically entombed the minute he was hired by NBC to replace Jack Paar as the host of the Tonight Show, except that we were invited to tune in and watch every night. As an audience, particularly during the blow-dried bell-bottom polyester lacuna of the 1970s, we were lucky Johnny was as cool as he was, a byproduct of the early 1960s Sinatra, JFK, Miles, Steve McQueen definition of cool, not the Brando/Fonzie primitive angry young greaser definition of the word. When Marshall McLuhan defined television as a cool medium in the mid-1960s, Johnny personified it – both cool and television. Especially the latter half of the equation.
Or as Kenneth Tynan wrote in his epic 22,000-word(!) 1978 New Yorker profile of Johnny Carson, “I once asked a bright young Manhattan journalist whether he could define in a single word what made television different from theatre or cinema. ‘For good or ill,’ he said, ‘Carson.’”
But all transactions involve tradeoffs. While Johnny’s net worth soared as the most popular man on the most popular medium of the mid-20th century, he paid a terrible personal price.
In her post yesterday on the new biography of Carson by Henry “Bombastic” Bushkin, his former business advisor and close friend, Kathy Shaidle notes that “Carson’s cool-warmth — that charming-yet-menacing mien — was always obvious to me, and I say that as an admirer of his abilities.”
Kathy mentions Carson and Bob Crane as defining the “cool-warm” personality, but wasn’t the grandfather of “cool-warmth” Bing Crosby? Crosby displayed amiable warmth on the big screen and adopted a style of singing that let the microphone do the work, a much cooler style — though the word hadn’t been invented yet — than that of any other singer during the 1920s or ’30s, and in the process became an international superstar who would go on to master live performing, records, radio, movies, and later, television, both as an actor and producer. (Bob Crane became famous on Hogan’s Heroes, a Bing Crosby production.) While not a macho figure, or a suave sophisticate like Cary Grant, Crosby lived out the cliche that “women wanted him and men wanted to be like him” — heaven knows my dad did — and yet, offscreen, Crosby was, according to his sons, the male equivalent of Joan Crawford in Mommie Dearest.
“Society will develop a new kind of servitude which covers the surface of society with a network of complicated rules, through which the most original minds and the most energetic characters cannot penetrate. It does not tyrannise but it compresses, enervates, extinguishes, and stupefies a people, till each nation is reduced to nothing better than a flock of timid and industrious animals, of which the government is the shepherd.”
― Alexis de Tocqueville (1805-1859).
“And I fear we never shall,” writes Neo-Neocon.
I received several nice compliments via email regarding my recent piece on why 1958’s A Night to Remember continues to resonate, including a link from Neo-Neocon to her post late last year on World War I:
When I was in school, World War I was hardly touched on in my history classes, so eager were the teachers to get to World War II before the year was over. It was only through reading a review of the Paul Fussell book The Great War and Modern Memory when it first came out in 1975, and then being intrigued enough to read the book, that I first learned what a cataclysmic event the First World War was, both in terms of death rates and in its psychological and even spiritual, as well as cultural, effects.
The first hint was this quote by Henry James, from a letter he wrote to a friend the day after Britain entered the war:
The plunge of civilization into this abyss of blood and darkness… is a thing that so gives away the whole long age during which we have supposed the world to be, with whatever abatement, gradually bettering, that to have to take it all now for what the treacherous years were all the while really making for and meaning is too tragic for any words.
If you hack through James’ typically convoluted syntax, you’ll see a perfect encapsulation of the effect of the war: blood and darkness, giving the lie to what people of that age thought “civilization” had meant. The war caused people to look back at all the years of seeming progress and regard them as a cruel, tantalizing, misleading illusion, a sort of trick played on naive people who now looked back at the history they themselves had lived through, tearing off their previous rose-colored glasses and now seeing a stark and terrible vision.
We have been stuck with that vision ever since.
World War I gave birth to all the horrors of the twentieth century. A host of banshees were let loose upon the western world, shattering old dogmas of religion, democracy, capitalism, monarchy, and mankind’s rule in the world. The war fueled widespread hatred, suspicion and paranoia toward elites and established institutions. For belligerents on both sides, economic planning lent political and intellectual credibility to state-directed war socialism. And of course, it led to the enthronement of revolutionaries throughout Europe: Lenin in Russia, Mussolini in Italy, and Hitler in Germany.
It took a while for the modern vision of World War I as a hopeless, futile meat grinder to take hold, though. (And apologies for largely repeating content from an earlier post here.) In his 2011 book, The New Vichy Syndrome: Why European Intellectuals Surrender to Barbarism, Theodore Dalrymple explored how the meaning of World War One morphed among European intellectuals from the late 1910s to the 1920s:
At least to the victors, the war did not seem self-evidently senseless, and disillusionment was not immediate. The war memorials to be found everywhere in France are tributes to loss, but not to meaninglessness. The soldiers really did die for France, or so almost everyone supposed; in Britain, my next-door neighbor, who collects coins and medals, showed me some First World War service medals for those who survived the war, with an athletic (and naked) young man upon a horse, wielding a sword as if he were a latter-day St. George about to slay a dragon. One of the medals bore the inscription “The War to Save Civilization.” I doubt that these medals were greeted solely by hollow laughter; for one thing, they would hardly have been preserved so carefully if they had been. And browsing in a bookshop recently, I found a book published in 1918 with the title The Romance of War Inventions. It was an attempt to interest boys in science by explaining how shells, mortars, tanks, and so forth had been developed and how they worked. By the time of its publication, millions had already been killed, and surely no one in Britain could by that time not have known someone who had been killed or at least someone whose child or brother or parent had been killed. It seems to me unlikely that such a publication would have seen the light of day in an atmosphere of generalized cynicism about the war.
“The version of the First World War that is now almost universally accepted as ‘true’ is that of the disillusioned writers, male and female, of the late 1920s and 1930s. The war, according to this version, was about nothing at all and was caused by blundering politicians, prolonged by stupid generals and lauded by patriotic fools,” Dalrymple adds.
Many people have wondered where I do most of my blogging. Wonder no more:
And when I’m not at the computer, I’m relaxing in my sweet home theater:
(Both clips uploaded to YouTube by Matt Novak of the Paleo-Future blog, from a March 1967 episode of the CBS show The 21st Century, hosted by Walter Cronkite. Between delivering speeches calling for “one-world government” and believing that Karl Rove had Osama bin Laden on ice in Area 51 during the 2004 election, Cronkite’s actual decade spent in the 21st century before passing away in 2009 was much more chaotic.)
Nobody can say that the Journal News of Westchester New York doesn’t cover the Big Stories of Our Times:
Willie’s got his armadillo back.
The mounted armadillo that was stolen from Willie Nelson’s road crew after a Sept. 19 concert at the Capitol Theatre was returned unharmed Friday morning, the theater happily reported.
“The artists who play at the Capitol Theatre, we try to treat them like family, and no one messes with our family,” Tom Bailey, the Capitol’s manager, said.
Shortly before noon, someone (Bailey did not describe the person, but said it was not the suspected thief) walked up to the theater and handed a sealed box to the attendant working at the box office. Inside — his signature blue-glitter hat still fixed to his leathery head — was Ol’ Dillo.
“Everyone is relieved and delighted to be able to get the thing back to Mr. Nelson,” Bailey said.
No word yet when Ol’ Dillo will be appearing in pro-ObamaCare PSAs.
One of the recurring themes of Mad Men’s early seasons is the postmodern belief that the past is fungible. George Orwell’s 1984 explored the concept on a mass scale, with Winston Smith toiling away in the bowels of the Ministry of Truth to manipulate the past, Soviet-style, to suit the current whims of his political masters. Mad Men looked at the concept from the individual point of view.
As every fan of the show already knows, Don Draper, Mad Men’s hero (or rather anti-hero), is secretly Dick Whitman. Whitman is a supremely ambitious social climber who heavily airbrushed his past: growing up dirt poor in a Depression-era whorehouse, and deserting his Army service in Korea. He accomplished the latter by switching dog tags with the commanding officer he accidentally killed — thus assuming his name and, as he later discovered, his wife — to become Don Draper. A decade later, at the apex of the show’s first season, after a rival threatens to out Don’s past to his boss, their employer’s classic response is contained within the scene that defined Mad Men’s solipsistic philosophy:
The climax of the first season of Mad Men, set at the dawn of the 1960s at a Madison Avenue advertising agency, is actually a brilliant anticlimax—a revelation swiftly followed by a re-veiling. Pete Campbell (Vincent Kartheiser), a clumsy striver at Sterling Cooper, attempts to topple the resident alpha dog, Don Draper (Jon Hamm), with what looks to be a career-ending disclosure: Draper, the firm’s dazzling creative director, is living under an assumed name; he’s a fraud, likely a Korean War deserter, and possibly worse. Campbell blurts it all out to the avuncular overlord, Bertram Cooper [Robert Morse], while Draper stands by silently, poker-faced, hands steady enough to light yet another cigarette. The elder statesman Cooper considers, waits an agonizing long beat, and makes a purely utilitarian reply.
“Mr. Campbell, who cares?” Cooper asks calmly, his voice burring with pity and disdain for the youngster’s naive theatrics. “This country was built and run by men with worse stories than whatever you’ve imagined here.”
“The Japanese have a saying,” Cooper continues. “‘A man is whatever room he is in’ — and right now, Donald Draper is in this room.”
Perhaps the ultimate example of this philosophy occurred in the following season. Peggy, Don’s young protégée, a secretary turned advertising copywriter, gives up her baby for the sake of her career. Don shows up in the hospital shortly after she’s given birth, and given up the child for adoption, and tells her, “Peggy, listen to me, get out of here and move forward. This never happened. It will shock you how much it never happened.”
Last week, the mailman delivered an Amazon box containing the Criterion Blu-Ray edition of the 1966 John Frankenheimer movie Seconds, starring Rock Hudson. Its arrival meant I could finally retire my 1997 laser disc edition of the film, one of the last 12-inch silver discs I purchased before switching to DVDs. But first, it meant a late night viewing of one of the strangest and most unsettling movies produced by mid-‘60s Hollywood.
Forget Dr. Strangelove’s obsession with fluoridation — something strange had gotten in the water in the mid-1960s. Maybe it was a collective premonition that the overreach of the Johnson Administration’s Great Society would very likely cause it to fail, as it attempted to fight the war on poverty, the war on racism, the space race, the Cold War, and the hot war in Vietnam, all simultaneously.
Perhaps it was the cognitive dissonance of the left, unable to process the fact that Johnson was only in office because President Kennedy was shot by “some silly little Communist,” as newly-widowed Jackie Kennedy muttered upon hearing the news about the motivations of the man who shot her husband. Instead of understanding that the Cold War had claimed her husband, Jackie, like most of the American left, couldn’t make the connection. The ideology of Kennedy’s assassin “robs his death of any meaning,” she added.
But giving meaning to life didn’t really interest the American left at the height of the Cold War. In the early days of the 20th century, pioneering, self-described “Progressives” championed better working conditions for the common man. Now that America’s postwar economic boom meant that many men had achieved them, and were moving to the suburbs as a result, the left decided this was a bad thing. Hence the 1956 film The Man in the Gray Flannel Suit (to which both Seconds and today’s Mad Men owe much), and by the time of the Kennedy era, Malvina Reynolds and Pete Seeger’s “Little Boxes,” with its references to suburban houses “all made out of ticky-tacky,” dubbed “the most sanctimonious song ever written” by fellow leftwing songwriter Tom Lehrer.
But this trend went into overdrive by the mid-‘60s, a hatred of all things suburbia that burns to this day, one of many poisoned leftwing wells from which our current president has drunk deeply. In 1966, director Frank Perry shot the film version of The Swimmer, the 1964 short story written by the New Yorker’s John Cheever, which starred a buff-looking Burt Lancaster, trapped in a 95-minute-long metaphor of a movie. As the title implies, Lancaster swam from pool to pool, chatting wistfully with his neighbors in their wealthy Connecticut suburb about missed opportunities, middle age, and social conformity.
The Swimmer, which is available in high def streaming video from Amazon, wasn’t released by Columbia Pictures until 1968, perhaps because another, much darker film with a somewhat similar theme had bombed badly at the box office. In the early to mid-1960s, director John Frankenheimer had made a career of Cold War paranoid thrillers, releasing first The Manchurian Candidate in 1962, starring Frank Sinatra and Laurence Harvey. In 1964, Frankenheimer next helmed Seven Days in May, with Kirk Douglas and Burt Lancaster starring in a film about American generals attempting a coup against a dovish liberal president. In 1966, Frankenheimer directed Seconds.
SPOILER ALERT: DON’T READ ANY FURTHER IF YOU HAVEN’T SEEN THE FILM YET, BUT MIGHT WANT TO. I’M ABOUT TO GIVE AWAY WIDE SWATCHES OF THE PLOT. YOU WERE WARNED.
The recent corporate transfer of Alec Baldwin from permanent NBC Saturday Night Live guest host and star of the recently cancelled low-rated NBC series 30 Rock to his upcoming gig as a raving anchor at MSNBC sounds like something out of Paddy Chayefsky’s 1976 film Network, not least because Baldwin’s in-house transfer was preceded by an outrageous homophobic slur, which old media — and not just NBC — worked very hard to bury. But then, there’s very little about television news that Network didn’t anticipate.
Because of the length of time needed for a movie to be both green-lighted, and then produced, few cinematic satires arrive at the apogee of their subject’s power. When Kubrick released Dr. Strangelove in 1964, the Air Force had begun to move away from nuclear-equipped B-52 bombers to a missile-based attack system. By the time Robert Altman had shot M*A*S*H in 1970, President Nixon was beginning to wind down American involvement in the Vietnam War. In the 1980s, films such as Red Dawn and 2010: The Year We Make Contact depicted America involved in future military conflicts with the Soviet Union, even as the latter was imploding. (Thank you, President Reagan.)
But when Network hit movie theaters in 1976, the original big three television networks were at the apex of their uncontested power; newspapers were losing readership, the World Wide Web was nearly 20 years off, and even CNN wouldn’t begin broadcasting until 1980. More importantly, talk radio, Fox News and the Blogosphere wouldn’t come into play for another 15 to 25 years respectively. There was nothing to stop television’s untrammeled power, and seemingly no way for the individual to fight back.
“It’s Not Satire — It’s Sheer Reportage”
In his 2005 interview with Robert Osborne on Turner Classic Movies, Sidney Lumet, Network’s director, told Osborne that when he and Chayefsky were making their initial rounds on the interview circuit to promote the movie, “Paddy and I, whenever we’d be asked something about ‘this brilliant satire,’ we’d keep saying, ‘It’s not a satire — it’s sheer reportage!’ The only thing that hasn’t happened yet is nobody’s been shot on the air!”
Is it possible for a veteran actor to star in a motion picture that makes him a legend, assures his cinematic immortality, and ensures that while he’s still alive, he’ll always find work, and yet be completely miscast? Actually, it’s happened at least twice. In the late 1970s, Stanley Kubrick cast Jack Nicholson as Jack Torrance in The Shining. The film made Nicholson a legend, but in a way, he’s very badly miscast — Nicholson’s character seems pretty darn bonkers right from the start of the film, long before his encounters with the demons lurking within the bowels of the Overlook Hotel.
But arguably, a far worse case of miscasting is Charles Bronson in Michael Winner’s 1974 film Death Wish. When novelist Brian Garfield wrote the 1972 book that inspired the movie, he was hoping that if Hollywood ever adapted his novel to the big screen, a milquetoast actor such as Jack Lemmon would star. And Lemmon would actually have been perfect, since his character’s transformation from bleeding heart liberal white collar professional to crazed vigilante would have been all the more shocking. Instead, we all know it’s only a matter of time before Charles Bronson reveals his legendary tough guy persona on the screen. Back around 2000, I remember reading Garfield’s notes on his book’s Amazon page, which said something along the lines of, “Would you want to mess with Charles Bronson?”
Currently the cinematic adaptation of Death Wish is available for home viewing in standard definition on DVD, and in high definition, via Amazon’s Instant Video format. And while the latter version is in sharp 1080p HD, the film could use a restoration from Paramount before it’s issued onto a Blu-Ray disc. The Amazon version has its share of scratches and dust on its print, though it’s certainly cleaner than the Manhattan it depicts on screen. I watched the Amazon HD version the other night, and I was reminded that Bronson’s casting dispenses with the film’s credibility almost as explosively as Bronson himself dispatches assailants onscreen. There are eight million stories in the naked city, and apparently, in 1974, almost as many muggers stupid enough to go up against Charles Bronson.
But otherwise, the timing of the film was absolutely perfect. As Power Line’s Steve Hayward noted in The Age of Reagan: The Fall of the Old Liberal Order: 1964-1980, film critic Richard Grenier dubbed Clint Eastwood’s 1971 film Dirty Harry, “the first popular film to talk back to liberalism,” a movie made during the period that then-Governor Ronald Reagan “liked to joke that a liberal’s idea of being tough on crime was to give longer suspended sentences,” Hayward added.
Which helped set the stage not just for Death Wish, but for the era of moral collapse in which it was filmed, and in which it too became a hit by talking back to liberalism.
Peter Biskind’s 1998 book Easy Riders, Raging Bulls documented Hollywood’s near-complete takeover by the left beginning in the late 1960s, but there were a few holdouts during that era: John Wayne was still making movies, Eastwood’s long career was beginning its ascendancy, and British director Michael Winner was himself a conservative.
But on the East Coast, in the early 1970s, New York had essentially collapsed. Saul Bellow was one of the first novelists to document the moral and increasingly physical carnage. As Myron Magnet of City Journal wrote in the spring of 2008, “Fear was a New Yorker’s constant companion in the 1970s and ’80s. … So to read Saul Bellow’s Mr. Sammler’s Planet when it came out in 1970 was like a jolt of electricity”:
The book was true, prophetically so. And now that we live in New York’s second golden age — the age of reborn neighborhoods in every borough, of safe streets bustling with tourists, of $40 million apartments, of filled-to-overflowing private schools and colleges, of urban glamour; the age when the New York Times runs stories that explain how once upon a time there was the age of the mugger and that ask, is New York losing its street smarts? — it’s important to recall that today’s peace and prosperity mustn’t be taken for granted. Hip young residents of the revived Lower East Side or Williamsburg need to know that it’s possible to kill a city, that the streets they walk daily were once no-go zones, that within living memory residents and companies were fleeing Gotham, that newsweeklies heralded the rotting of the Big Apple and movies like Taxi Driver and Midnight Cowboy plausibly depicted New York as a nightmare peopled by freaks. That’s why it’s worth looking back at Mr. Sammler to understand why that decline occurred: we need to make sure it doesn’t happen again.
That was the milieu in which Bronson’s Paul Kersey character resided at the start of Death Wish. Shortly after Kersey and his wife (played by veteran actress Hope Lange) fly back to New York from a relaxing Hawaiian vacation, his wife is murdered and his daughter raped by home invaders led by a young Jeff Goldblum at the start of his acting career. (Near the end of the film, a pre-Spinal Tap Christopher Guest plays a nervous rookie NYPD cop.) On a business trip out to Tucson, both to take his mind off the horrors that had befallen his family and to get a real estate development project back on track, Bronson’s Kersey discovers that it’s possible to defend yourself against crime.
The businessman that Kersey meets during the film’s Tucson scenes, played by character actor Stuart Margolin, is a staunch Second Amendment supporter who invites Kersey to a gun range, and asks him, “Paul, which war was yours?” That was a common question among middle-aged men during the latter half of the 20th century. Kersey admits he was a “C.O. in a M*A*S*H unit” in Korea.
“Oh, Commanding Officer, eh?” Margolin’s Good Ol’ Businessman approvingly asks.
“Conscientious Objector,” Bronson’s Kersey drolly replies as Margolin rolls his eyes in disgust.
Kersey explains that he became one as a teenager, after his father was shot and killed in a hunting accident, quickly fleshing out his character’s backstory. Evidently, Kersey’s own skills as a hunter haven’t degraded much over the years, since he then aims and fires the pistol that Margolin’s character had handed him, splitting the paper target at the gun range dead center.
And away we go.
I distinctly remember two mile markers on the road to my obsession with home theater. The first was a 1987 article in Billboard magazine exploring the continuing popularity, against all odds, of the 12-inch laser disc format with movie collectors. The article mentioned an obscure California firm called “The Criterion Collection” that was selling Blade Runner and 2001: A Space Odyssey in something called a “letterboxed” format, which made it possible to see the entire frame of those magnificently photographed widescreen movies on a home television set, instead of the “panned and scanned” version, which cut off the sides of the frame. It also mentioned the superiority of the laser disc format compared to fuzzy VHS tapes, and that laser disc allowed for such ancillary items as directors’ commentaries on auxiliary audio channels, behind-the-scenes still photos, trailers, deleted scenes, and other fun pieces of memorabilia.
This sounded pretty darn awesome. Shortly thereafter, I purchased my first laser disc player, the predecessor to the DVD, which wouldn’t arrive in stores for another decade. At its best, the picture and sound quality of laser discs blew VHS away, and I was quickly hooked. Particularly when I stumbled across a nearby video store that rented laser discs.
The second mile marker arrived two years later. That’s when I walked out of the B. Dalton Bookstore in New Jersey’s Burlington Mall holding a copy of the second issue of Audio/Video Interiors, the magazine that put the words “home theater” on the map. It was essentially Architectural Digest meets Stereo Review, with photo layout after photo layout filled with sophisticated audio and video components beautifully photographed in stunning home settings, including some of the first dedicated home theaters that were designed to look like the classic movie palaces of the 1930s, such as Theo Kalomirakis’ Roxy Theater, a knockout design built in the basement of his Brooklyn brownstone. (Kalomirakis would go on to make a living as a home theater designer, producing works for extremely well-heeled clients that make his initial Roxy look positively modest by comparison.)
I gravitated more to the media rooms the magazine featured than the dedicated home theaters. Media rooms were rooms designed for a variety of media consumption — music, TV, movies, concert videos — with the electronics tastefully combined into some sort of attractive cabinetry or hidden in the wall. Maybe that’s because my father had a custom built-in unit installed in his basement in 1969 to house his stereo equipment and a small portion of his enormous (3000+) collection of jazz and big band records. Adding video and surround sound to that concept seemed like a natural to me.
I’ve kept most of the issues from Audio/Video Interiors’ initial run; I was immensely proud to have written a few articles for the magazine in the late 1990s. In retrospect, it’s fun to look back at the first issues of AVI and realize how much technology has progressed since then. HD video replaced “Never Twice the Same Color” low-def analog TV. VHS is all but extinct. Dolby Pro-Logic, the first consumer surround sound format, has been upgraded first to Dolby Digital 5.1, and now to Dolby 7.1 and beyond. CDs have been rendered increasingly obsolete, particularly for casual listening, by MP3s. The laser disc was replaced 15 years ago by the DVD, which has now been supplanted by Blu-Ray, and increasingly, by streaming high-definition movies, such as those offered by Netflix and Amazon Instant Video.
Welcome to 2013
Apologies for burying the lede, but this brings us to one of Pioneer’s newest A/V receivers, the Pioneer Elite SC-75. The first Pioneer receiver I owned was the classic Pioneer VSX-D1S of 1990, one of the first receivers designed with what we now call home theater in mind, with as much emphasis on controlling video components as on the CD player, the tape deck and the record player. Since then, Pioneer has been upgrading the electronics of their units to keep pace with the changing world of home theater technology. I purchased the SC-75 to replace my Pioneer Elite VSX-72TXV, which was built in 2006. With six years passed, and the proliferation of streaming video set-top boxes such as the Roku (which we reviewed last year), the addition of LAN and wireless networking technology to many Blu-Ray players, the popularization of Androids and iPods/iPads as music and video devices, and the standardization of the HDMI format to connect video components, the SC-75 is a very different beast compared to the previous generation of Pioneer Elite receivers.
The differences aren’t apparent at first glance; the only thing that initially sets the unit apart from its predecessors is its brushed metal finish, instead of the smooth piano black styling of older Pioneer Elite models. (Pioneer has apparently also permanently retired the beautiful rosewood-veneered side panels of the first generation of Elite models, which is a pity; on the other hand, perhaps they simply don’t want to be raided by the lumber fascists, à la Gibson.)
A phrase pops into his head from out of nowhere. “Everybody… all of them… it’s back to blood! Religion is dying… but everybody still has to believe in something. It would be intolerable— you couldn’t stand it— to finally have to say to yourself, ‘Why keep pretending? I’m nothing but a random atom inside a supercollider known as the universe.’ But believing in by definition means blindly, irrationally, doesn’t it. So, my people, that leaves only our blood, the bloodlines that course through our very bodies, to unite us. ‘La Raza!’ as the Puerto Ricans cry out. ‘The Race!’ cries the whole world. All people, all people everywhere, have but one last thing on their minds— Back to blood!” All people, everywhere, you have no choice but— Back to blood!
Beyond Wolfe’s own introduction, I thought John O’Sullivan had one of the best takes on Wolfe’s new book (hidden behind the National Review subscriber paywall, alas), describing it as a battlefront report on the warring “Tribes of Post-America”:
Wolfe’s vision of the America emerging from the chaos of modernity is eerily similar to the Rome of Antiquity before Constantine. Where that antiquity was pre-Christian, this New Antiquity is post-Christian. Its original brand of Protestant Christianity no longer influences the politics, institutions, and laws of the nation it once shaped. The WASP elites, for whom Protestantism was long a mark of respectability and soundness, no longer even pretend to believe. It is a genuine religious faith for only a tiny number of people. Its secular expressions, “American exceptionalism” and “the American Creed,” are in only slightly better shape. The former provoked President Obama into an embarrassed meandering as he sought to reconcile his cosmopolitan disdain for it with its popularity among the rubes; the latter has been redefined into its opposite, an umbrella term covering a multitude of tribes and their different customs, namely multiculturalism.
This transformation from the Great Republic to the New Antiquity has happened in large measure in order to accommodate the growing number of immigrant groups forcing their way into the metropolis. It is a colder and crueler world: Inside the cultural ghettos, the new tribes of post-America retain much of their old affections and loyalties; outside them, they treat others with wariness and distrust. And they are slow to develop a common attachment to their new “home.”
“As mainline WASP Christianity shrivels,” O’Sullivan writes, “other cults flourish in its place: the ethnicity cult, of course; the arts cult for the very rich; the sex cult for the young; the celebrity cult for professionals; the psychology cult for billionaire clients; a religion cult (non-traditional religion, of course) for the perplexed; and the cult of wealth for everyone. Only the Gods of the Copybook Headings are missing from this teeming agora through which Wolfe’s characters pursue their fantasies and flee from their anxieties.”
Robinson asks Wolfe about this passage. Also in the interview, he asks Wolfe about his process as a writer (Wolfe cranks out his prose on a manual typewriter, or even in pencil, considering how difficult it is for him to keep a typewriter in proper repair), and about the importance of taking notes and jotting down ideas as quickly as possible while reporting in the field, before the brain’s memory decay erases the tiny details of a scene that add verisimilitude to his writing.
Needless to say, watch the whole thing.
CNN stumbles into “a church without one big player: God:”
Sunday’s congregation in Cambridge is a meeting of the Humanist Community at Harvard University and the brainchild of Greg Epstein, the school’s Humanist chaplain.
A longtime advocate for community building, Epstein and his group of atheists have begun to build their Cambridge community and solemnize its Sunday meetings to resemble a traditional religious service.
To Epstein, religion is not all bad, and there is no reason to reject its helpful aspects.
“My point to my fellow atheists is, why do we need to paint things with such a broad brush? We can learn from the positive while learning how to get rid of the negative,” he said.
For Epstein, who started community-building at Harvard nearly 10 years ago, the idea of a godless congregation is not an oxymoron.
“We decided recently that we want to use the word congregation more and more often because that is a word that strongly evokes a certain kind of community – a really close knit, strong community that can make strong change happen in the world,” he said.
“It doesn’t require and it doesn’t even imply a specific set of beliefs about anything.”
Epstein is not alone in his endeavor. Jerry DeWitt, who became an atheist and left his job as an evangelical minister, is using his pastoral experience to build an atheist church in Baton Rouge, Louisiana.
CNN seems to think this is news, when the idea of a Godless church has been woven deep into the firmware of “Progressivism” ever since Friedrich Nietzsche and the men whom historian Martin E. Marty dubbed “The Bearded God-killers” (Nietzsche, Marx, Darwin and Freud) had their heyday in the late 19th century. However, man is hardwired to believe in something, and the following century could be viewed as one long attempt at finding an alternative God via totalitarian regimes (Stalin, Hitler, Mao, Pol Pot, etc.), and the nature-worship of radical environmentalism. Not to mention drugs — recall Tom Wolfe exploring the religious motivations of ’60s drug users in his seminal mid-1970s essay “The ‘Me’ Decade and the Third Great Awakening,” which, as the second half of its title implies, is a lengthy treatise on all sorts of ways to build alternative religions that replace God with the self.
Oh — and then there’s the Obama cult of 2008, which involved no small amount of building BHO into an alternative God — aided by a lot of people who should have known better. But then, as Umberto Eco wrote in 2005:
G K Chesterton is often credited with observing: “When a man ceases to believe in God, he doesn’t believe in nothing. He believes in anything.” Whoever said it — he was right. We are supposed to live in a sceptical age. In fact, we live in an age of outrageous credulity.
Found via Maggie’s Farm, which quips, “Atheist Churches…In my view, there is nothing wrong with social clubs.”
Related: Aaron Clarey on “When Atheists Aren’t Really Atheists.”
I’m not sure if he created it or is simply relinking to it, but in any case, that’s a trenchant juxtaposition spotted on actor Kevin Sorbo’s Facebook page by Twitchy, along with Sorbo noting “And this is why I love the hypocrisy of America…..Oh, and thank God none of us have EVER been racist. Only this woman. Bad woman. Bad.”
Like Tarantino, as the Daily Caller notes, “‘N-word’ user Paula Deen is an Obama supporter:”
Television chef Paula Deen, who lost her gig Friday with the Food Network after admitting and apologizing for having used racial epithets, including the “N-word,” is a supporter of President Barack Obama and campaigned on his behalf in 2008.
The Southern-bred television personality also invited Michelle Obama onto one of her television programs and praised the first lady’s healthy eating agenda.
Deen, who has previously faced criticism for using excessive quantities of fat and salt in her dishes, is under fire after it was revealed in a lawsuit that she admitted using the N-word and other racial slurs in front of her employees.
Deen is also accused of sexual harassment, and was found in contempt of court for refusing to turn over a video in which she allegedly simulates a sex act on a chocolate eclair.
As Andrew Klavan noted today, watching Bill Maher’s Time-Warner-CNN-HBO cable show so you didn’t have to:
Maher incited the ire of the self-righteous left-wing panel on his HBO show Real Time with Bill Maher by asking, quite reasonably, “If you’re 66 years old, and you were raised in Georgia, and you were a child before the civil rights movement, do you get a bit of a pass?” He further commented: “I… think people shouldn’t have to lose their shows and go away when they do something bad. It’s just a word, it’s a wrong word, she’s wrong to use it, but do we always have to make people go away?”
“It tells you something about the state of debate in our country that Bill Maher has become a voice of reason,” Drew adds.
Her last step before being completely airbrushed down the cable TV memory hole is likely a tearful public auto-da-fé* via Oprah, another wildly enthusiastic 2008 Obama supporter whose career has also faded dramatically in the years since. Or as Canadian columnist Rick Murphy dubbed her last year, “Oprah Winfrey, the Obama supporter fame left behind.”
That does seem to be a rapidly growing class of celebrity, doesn’t it? Or as Glenn Reynolds once quipped, “Everything Obama touches . . . .”
* It’s what you oughtn’t to do, but you do anyway.
Variety reports that Matheson was 87:
Richard Matheson, the sci-fi and fantasy novelist and screenwriter who influenced modern genre writers and directors and wrote numerous stories and books that were adapted as films including “I Am Legend,” died on Sunday at his home in Calabasas, Calif., according to his publisher. He was 87 and had been ill for some time.
As well as creating source material for films including “What Dreams May Come,” “A Stir of Echoes” and “The Shrinking Man,” Matheson was a prolific film and TV scribe and responsible for some of the most popular “Twilight Zone” episodes as well as writing for nearly every other anthology series of the 1960s and 70s with credits including “Lawman,” “The Alfred Hitchcock Hour,” “Rod Serling’s Night Gallery,” “The Martian Chronicles,” “Amazing Stories” and “Star Trek” episode “The Enemy Within.”
For “Twilight Zone,” Matheson wrote the classic William Shatner episode “Nightmare at 20,000 Feet.” The Hugh Jackman film “Real Steel” was adapted from his “Twilight Zone” episode “Steel.”
Between novels and screenplays, the IMDB lists Matheson as contributing to an astonishing 80 film and TV titles — a number which will keep growing, as spin-offs and reboots of his earlier work, such as the 1971 Charlton Heston vehicle The Omega Man being reworked into Will Smith’s I Am Legend, will likely continue for years to come.
TMZ witnessed a signed birth certificate with the name North West signed by Kim, 32, and Kanye, 36, at Cedars-Sinai hospital.
People magazine confirmed the news with a source close to the Kardashian family.
And a source told UsWeekly that the couple have already given their daughter a nickname — she will be ‘Nori for short’.
They added that the child has no middle name.
The name North had previously been suggested as a possibility for the child, although many Kardashian fans had dismissed the idea as a joke.
Cary Grant and Alfred Hitchcock could not be reached for comment.
Incidentally, since I’ve been meaning to post a link to this all week, this is as good a place as any to post a reminder that when it comes to pop “music,” Time magazine sure can pick ‘em: In August of 2005, a Time magazine cover story dubbed West (Kanye, not Adam or Leslie) “Hip-Hop’s Class Act…Why he’s the smartest man in pop music.” The following month, West would prove he was neither of those things, as he dynamited an NBC fundraiser for Hurricane Katrina victims with his racist rant against then-President George W. Bush, despite the fact that, as DNC chairwoman Donna Brazile recently noted on CNN, “Bush came through on Katrina.”
We should have seen it coming; though perhaps we can predict future pop culture self-immolations. Who else has Time dubbed the best/smartest/classiest in their pop culture genres? Their implosion is likely just a matter of time.
Multiple sources, including the New York Times and the New York Daily News, are reporting the death of James Gandolfini, who achieved iconic status as America’s favorite television mobster in HBO’s popular series The Sopranos, which aired from 1999 through 2007. The Daily News reports that the 51-year-old New Jersey-born actor “died following a massive heart attack in Italy,” according to a source close to the actor.
Deadline Hollywood adds:
Overweight, balding, rough around the edges with a thick New Jersey accent, Gandolfini was the opposite of a marquee leading man, destined to be a character actor, and yet he proved through his masterful acting that he could make Tony Soprano sexy and smart, towering and powerful. His portrayal was one of TV’s largest-looming anti-heroes — the schlub we loved, the cruel monster we hated, the anxiety-ridden husband and father we wanted to hug when he bemoaned, “I’m afraid I’m going to lose my family. Like I lost the ducks.” In the most maddening series finale in recent history – an episode chock full of references to mortality (life, death, a William Butler Yeats reference to the apocalypse, a bathroom reference to a “Godfather” bloodbath) — his was the show’s last image, seen just as the words “Don’t stop” were being sung on the jukebox. It generated such extreme reaction that the series’ fans crashed HBO’s website for a time that night trying to register their outrage that it ended with a black screen, leaving them not knowing whether Tony Soprano had been whacked. In large part due to Gandolfini’s charisma (“Jimmy was the spiritual core of our Sopranos family,” Chris Albrecht noted today), Season 5 of The Sopranos in 2004 remains the most watched series in HBO history with 14.4 million viewers on average.
“I thought it was a wonderful script,” Mr. Gandolfini told Newsweek in 2001, recalling his audition. “I thought, ‘I can do this.’ But I thought they would hire someone a little more debonair, shall we say. A little more appealing to the eye.”
Fortunately for both HBO and us, Gandolfini got the part, and will live on in Hollywood mobster immortality, an apt successor to the gangster roles that made legends of Edward G. Robinson, Brando, Pacino and the ensemble cast of Goodfellas, which inspired The Sopranos.
Big Hollywood links to an article by Lynda Obst, the producer of Contact, Sleepless in Seattle, and TV’s Hot in Cleveland (among many other projects) in Salon, setting up her quotes by first noting that “For consumers, the decline of the DVD market has meant switching over to both Blu-ray and, more recently, streaming options for their viewing pleasure. The end of the DVD format’s dominance meant something much more, and far worse, for Hollywood.”
In Salon, Obst writes:
“The DVD business represented fifty percent of their profits,” [20th Century Fox executive Peter Chernin] went on. “Fifty percent. The decline of that business means their entire profit could come down between forty and fifty percent for new movies.”
For those of you like me who are not good at math, let me make Peter’s statement even simpler. If a studio’s margin of profit was only 10 percent in the Old Abnormal, now with the collapsing DVD market that profit margin was hovering around 6 percent. The loss of profit on those little silver discs had nearly halved our profit margin.
This was, literally, a Great Contraction. Something drastic had happened to our industry, and this was it. Surely there were other factors: Young males were disappearing into video games; there were hundreds of home entertainment choices available for nesting families; the Net. But slicing a huge chunk of reliable profits right out of the bottom line forever?
This was mind-boggling to me, and I’ve been in the business for thirty years. Peter continued as I absorbed the depths and roots of what I was starting to think of as the Great Contraction. “Which means if nothing else changed, they would all be losing money. That’s how serious the DVD downturn is. At best, it could cut their profit in half for new movies.”
* * * * *
“When did the collapse begin?”
“The bad news started in 2008,” he said. “Bad 2009. Bad 2010. Bad 2011.”
It was as if he were scolding those years. They were bad, very bad. I wouldn’t want to be those years.
“The international market will still grow,” he said, “but the DVD sell-through business is not coming back again. Consumers will buy their movies on Netflix, iTunes, Amazon et al. before they will purchase a DVD.” What had been our profit margin has gone the way of the old media.
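The margin arithmetic Obst walks through above can be sketched in a few lines. The numbers below are purely illustrative: the DVD profit share and the 40-to-50-percent profit decline come from the quoted passage, but the specific collapse fraction is a hypothetical assumption chosen only to land inside the range Chernin describes.

```python
# Back-of-the-envelope sketch of the "Great Contraction" margin math.
# All figures are illustrative; dvd_collapse is an assumed value, not
# a number from the source.
old_margin = 0.10     # studio profit margin in the "Old Abnormal"
dvd_share = 0.50      # share of total profit that came from DVD sales
dvd_collapse = 0.90   # assumed fraction of DVD profit that evaporated

profit_lost = dvd_share * dvd_collapse        # ~45% of total profit gone
new_margin = old_margin * (1 - profit_lost)   # 10% margin shrinks to ~5.5%
print(round(new_margin, 3))
```

Under those assumptions, a 10 percent margin shrinks to roughly 5.5 percent, which is the "hovering around 6 percent" figure Obst cites: losing half your profit nearly halves your margin, whatever the exact collapse fraction.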
But it was in 2010 that James Cameron told the Washington Post that DVDs were bad for Gaia and other living things, and needed to be eliminated (while simultaneously having multiple versions of Avatar coming out that same year on DVD):
It’s a consumer product like any consumer product. I think ultimately we’re going to bypass a physical medium and go directly to a download model and then it’s just bits moving in the system. And then the only impact to the environment is the power it takes to run the computers, run the devices. I think that we’re not there yet, but we’re moving that direction. Twentieth Century Fox has made a commitment to be carbon neutral by the end of 2010. Because of some of these practices that can’t be changed, the only way to do that is to buy carbon offsets. You know, which again, these are interim solutions. But at least it shows that there’s a consciousness that we have to be dealing with carbon pollution and sustainability. …
And the following year, many in Hollywood went all-in with Occupy Wall Street, which was obsessed with the “obscene” profits made by gigantic multinational corporations. You know, like movie studios.
Presumably, losing the cushion of DVD sales is part of the reason why Steven Spielberg recently told a USC audience that, as the Hollywood Reporter paraphrased, “an ‘implosion’ in the film industry is inevitable, whereby a half dozen or so $250 million movies flop at the box office and alter the industry forever.”
But it’s not like Hollywood has much respect for the audience that pays for the tickets to see those $250 million products during their initial run in theaters. Obst’s article on the collapse of her industry appears in Salon, which isn’t exactly sympathetic to Hollywood’s core audience in flyover country: its editor-at-large has a new book titled What’s the Matter with White People?: Finding Our Way in the Next America.
Similarly, in 2008, the late Nora Ephron, who in the previous decade had written and directed the Obst-produced Sleepless in Seattle, wrote in the Huffington Post, “This is an election about whether the people of Pennsylvania hate blacks more than they hate women. And when I say people, I don’t mean people, I mean white men.” Incidentally, those people in Pennsylvania that Ephron was writing off as troglodytic racists were her fellow Democrats, who were about to decide between Obama and Hillary in the PA Democrat primary — the same primary voters that Obama wrote off at the time as bitter, gun- and God-obsessed clingers.