
Ed Driscoll

Capitalism, the Unknown Ideal

Past performance is no guarantee of future results. In the wake of the January 2011 shooting of Gabrielle Giffords and over a dozen others, which the MSM immediately and erroneously blamed on Sarah Palin’s clip art, the press rushed in lockstep to condemn violent rhetoric and demanded that both politicians and the media censor themselves. One contributor to the left-leaning publication National Journal insisted that violent rhetoric should be treated in the same fashion “that we’ve stopped using certain epithets like the ‘n’-word in public forums:”

National Journal’s Michael Hirsh wants to raise the bar on decorum to an entirely new level. On Thursday’s MSNBC airing of “Hardball,” Hirsh told host Chris Matthews certain “gun” terms should be stricken from political discourse… His proposal? Make such language inappropriate in the same way racial slurs are inappropriate.

“That’s the kind of language I think we got to have a hard think about now,” Hirsh said. “Do we really want to continue to use that kind of language at these levels? Or, should there be kind of a social sanction, not a legal one, but a moral sanction in the way that we’ve stopped using certain epithets like the ‘n’-word in public forums. Stop using that kind of language, those kinds of metaphors.”

Certainly, many would view comparing someone to a Holocaust denier as a slur that’s in the same league as violent, eliminationist rhetoric. Which makes this passage highly problematic, in a new National Journal piece written by a young socialist justice warrior posing as a journalist and titled “Scientists Tell Smithsonian to Ditch Koch Money.” (Link safe, goes to Twitchy):

The push arrives amid revelations that Smithsonian scientist and climate-denier Wei-Hock Soon raked in roughly $1.2 million dollars from the fossil-fuel industry while failing to disclose a conflict of interest. One of the funders of Soon’s research was the Charles G. Koch Charitable Foundation.

Does Mr. Soon deny that the climate exists? Now that would be news! In the interim, we’ll wait for the layers and layers of editors and fact checkers at National Journal to condemn the use of a metaphor freighted with such a violent subtext. But we won’t hold our breath.

Related: “Reporters Explain Why Balance Isn’t Needed On Global Warming.”

Since the MSM long ago exited the profession of journalism in order to become Democrat operatives with bylines, are there any topics left that the MSM still wishes to discuss in a fair and balanced (to coin a slogan) manner?

Starbucks Dials Back Racialist Campaign

March 22nd, 2015 - 2:46 pm

“Starbucks CEO Howard Schultz Imposes His Racial Hang-ups On America,” John Nolte wrote yesterday at Big Government, and as Tom Blumer added at NewsBusters, USA Today was eager to use their newspaper as a vehicle to promote Schultz’s racialism, bundling the above multipage folder into their papers yesterday. Fortunately, as Reuters (no stranger to leftwing evangelicalism themselves) reports today, Schultz or someone wiser at his company realized that having 20-something clerks lecture their customers on race was a staggeringly stupid idea for all concerned:

Starbucks Corp head Howard Schultz told employees on Sunday they will no longer be encouraged to write “Race Together” on drinks cups, but the company’s effort to promote discussion of racial issues “is far from over”.

The world’s biggest coffee chain kicked off a U.S. race relations campaign last week when it published full-page ads in major U.S. newspapers with the words “Shall We Overcome?” at centre page and “RaceTogether” and the Starbucks logo near the bottom. Employees behind the counter were also given the option of writing “Race Together” on customers’ cups.

The campaign was met with skepticism on social media, with many complaining the company was overstepping its boundaries with a campaign on sensitive cultural topics that had no place in the coffee shop’s lines.

Well, yes:

Mollie Hemingway of the Federalist had a slightly different take on Schultz’s mad scheme; in her opinion, he doesn’t view himself as a far-left college headmaster, but as a fundamentalist preacher eager to use his underlings as missionaries to spread the socialist gospel, a la Father Coughlin or Rev. Wright. As Hemingway wrote, “With Race Together, Starbucks Is Using Worst Of Evangelical Practices:”

The whole campaign reminded me so much of this story from 2004, when an American Airlines pilot got on the loudspeaker and asked passengers who were Christian to raise their hands. Then he suggested to the ones who raised their hands that they spend the remainder of the flight trying to convert those who hadn’t. The passengers were so confused by the request that they wondered if the pilot was a terrorist.

Listen, I love few things more than sharing the good news that Jesus has triumphed over sin, death and Satan with others and I hate racism. But there’s a reason why the American Airlines pilot and the Starbucks approaches freak people out! Yes, part of it is that there’s a time and place to share the Gospel of Jesus Christ and discuss difficult social problems. But also, these things are highly ineffective when done outside of a personal relationship.

Both of these approaches also exhibit extreme vocational confusion.

* * * * * * *

Simply flying a plane to the best of your ability and bringing hundreds of passengers safely from one point to another is a great way to serve your neighbor. You don’t need to hand out cross pins or get on the loudspeaker and introduce people to Jesus to make it a good work.

It’s curious; for leftwing consumers, simply knowing that a CEO disagrees with this week’s stance on gay marriage or Obamacare is enough to get him fired or have his chain boycotted. But a leftwing CEO feels perfectly entitled to proselytize the Gleichschaltung to his customers.

Exit question: How long would Starbucks permit a “barista” to enthusiastically preach the real gospel to his customers as an aid to racial healing?

By the way, speaking of the Gleichschaltung, I stopped going to Starbucks on a regular basis two or three years ago when they bowed to Obamacare-related laws and began printing calorie counts on their menu boards. It wasn’t so much paying $5.00 for a cup of coffee that’s essentially a warm milkshake that was problematic, but being hit in the face with the fact that I now had an additional 500 calories or more to burn off at the gym that night. Now with their CEO having dropped the mask and gone the full Bulworth-meets-Eric Holder on his customers, I realize I was simply ahead of the curve in avoiding their product.

But hey, as they say at MSNBC’s parent network:

“Starbucks’s new campaign is yet another sign of the relentless politicization of American culture,” Jonah Goldberg writes today:

It’s ironic. The Obama years were supposed to usher in an era of racial harmony. That didn’t happen — which presumably is why Schultz feels the need to help mend our racial wounds. What has happened, however, is that hordes of college graduates, unable to find jobs suitable to their degrees, have ended up toiling away at places like Starbucks.

It’s kind of ingenious. Since sociology majors can’t find relevant jobs, Schultz is making the jobs they have relevant to their majors. If this becomes a trend, maybe my dog walkers will start reciting Proust in French on their perambulations.

As a business decision, I find the whole thing bizarre. If I don’t have my coffee in the morning, I get a headache that feels like a Hell’s Angel is trying to press his meaty thumb through my forehead. This is not the most propitious moment to engage me in a conversation about my “race journey.” Worse, Starbucks lines are already long. How much longer will they get when the barista takes 20 minutes out of his or her job to debate the Moynihan Report with a customer?

On Red Eye last night, Rob Long made a great observation — there’s no Starbucks in Ferguson. If Howard Schultz, Starbucks’ CEO, really wants to start a dialogue about race, perhaps he needs to put his company’s money where his mouth is, and start there.

Meanwhile, David P. Goldman, writing in his “Spengler” column at PJM wants to start a dialogue about the truly important issues that vex us all: It’s “Time for a National Conversation About Why Starbucks Coffee Is Disgusting.”

By the way, as with all “Progressive” concepts, Schultz’s thinking is stuck in America’s distant past — it’s always Alabama in 1963 for the left, unless it’s 1933 and they’re searching for their next Roosevelt. But in the real 21st century America, my local town in Northern California has a widely diverse racial culture, filled with Asians from Japan, China, and Korea; people of Spanish and Mexican descent; and people of pallor like myself. And by and large, they seem to get on pretty darn well. Oh, to be a fly on the wall when a white barista lectures an Asian or Latino customer on being more harmonious in their race relations.

Lest you wind up in one of Hillary’s “Fun Camps” for political reeducation.

Exit Question: “Why Is There No Starbucks Coffee House in Selma?”

Unexpectedly!

March 16th, 2015 - 1:40 pm

We’ve seen this movie before, haven’t we?

 

As for the automated dining service, Jazz Shaw writes, “It’s a terribly impersonal service as compared to a bartender or waitress who stops to chat with you, but it gets the job done:”

I ran into one of these setups at the Philadelphia airport this winter and they work surprisingly well. If you plan to pay by credit or debit card (which is the only option in some cases) you barely interact with a human at all. You browse the drinks and food on the touch screen, place your order, swipe your card, and a short while later somebody strolls up with your food and beverage, says hello and drops them off. It’s a terribly impersonal service as compared to a bartender or waitress who stops to chat with you, but it gets the job done.

Of course, that last phrase is the big issue here, isn’t it? It gets the job done. That job used to be done by a person. Now it’s essentially a robot. So those workers are no longer on the payroll, but hopefully they’ll catch on someplace else. Unfortunately, as Seattle is finding out, employers who run single outlets and don’t have the backing and buffer range of a major chain often won’t be able to make the shift in technological infrastructure required to cut back on staffing while staying open. Those folks will shut down, and it’s apparently already beginning in Washington state.

You know… if only somebody had tried to warn them.

Unexpectedly.

Update: Jonah Goldberg explored the racialist origins of the minimum wage in his 2008 book, Liberal Fascism:

[Early “Progressive” stalwart Edward Alsworth Ross] was a showman, but his ideas fit squarely within the worldview of progressive economics, on both sides of the Atlantic. Consider the debate over the minimum wage. The controversy centered on what to do about what Sidney Webb called the “unemployable class.” It was Webb’s belief, shared by many of the progressive economists affiliated with the American Economic Association, that establishing a minimum wage above the value of the unemployables’ worth would lock them out of the market, accelerating their elimination as a class. This is essentially the modern conservative argument against the minimum wage, and even today, when conservatives make it, they are accused of—you guessed it—social Darwinism. But for the progressives at the dawn of the fascist moment, this was an argument for it. “Of all ways of dealing with these unfortunate parasites,” Webb observed, “the most ruinous to the community is to allow them unrestrainedly to compete as wage earners.”

Ross put it succinctly: “The Coolie cannot outdo the American, but he can underlive him.” Since the inferior races were content to live closer to a filthy state of nature than the Nordic man, the savages did not require a civilized wage. Hence if you raised minimum wages to a civilized level, employers wouldn’t hire such miscreants in preference to “fitter” specimens, making them less likely to reproduce and, if necessary, easier targets for forced sterilization.

And of course, even beyond its racialist origins, a high minimum wage also makes it that much more difficult for a small business to succeed in general, which “Progressives” then and now consider a feature, and not a bug.

Ferguson Home Values Plummeting

March 16th, 2015 - 12:06 pm

Fusion, a Website that’s an, err, fusion between Univision and ABC/Disney, is shocked that Ferguson real estate prices are “Down nearly 50 percent since Michael Brown’s death.” There’s more than a hint of bias in that subhead, as the cause wasn’t Brown’s death after he slugged a convenience store clerk and attempted to steal a police officer’s gun, but the riots and looting that followed — which were another kind of media fusion, ginned up via the minicams of CNN and fueled further by NBC anchorman Al Sharpton’s corrosive presence:

[John] Zisser, 55, has owned and operated Zisser’s Tires in this city since 1987. He says the still-visible damage from the November protests that followed a grand jury’s decision not to indict Ferguson officer Darren Wilson for the shooting death of teenager Michael Brown is hurting property owners. His store’s insurance is in the process of being cancelled after it was twice vandalized during the unrest, he says.

“If I sold this place today, I could probably get $300,000 for it, if anyone is crazy enough to buy. Last year, the county said this lot was worth almost a million,” he says. “The value here is all going down. There’s about nine burnt-out buildings this way,” he says, pointing. “And about four more behind me.”

Zisser is one of many Ferguson residents feeling a financial toll from the months of protests, media attention, and now another high-profile shooting. They’re worried not just about their own situations, but about the city coffers, too. The future of Ferguson, they say, is anyone’s guess.

“How much money are we going to lose?” Zisser asks. “How much money is the city and the county going to lose in taxes because of this? And how much is the school district going to lose here? They’re the biggest losers.”

Not at all “unexpectedly,” of course, as Fred Siegel warned in August of last year at City Journal:

Riots bring but one certainty—enormous economic and social costs. Businesses flee, taking jobs and tax revenues with them. Home values decline for all races, but particularly for blacks. Insurance costs rise and civic morale collapses. The black and white middle classes move out. Despite its busy port and enormous geographic assets, Newark, New Jersey has never fully recovered from its 1967 riot. This year, Newark elected as its mayor Ras Baraka, the son and political heir of Amiri Baraka—the intellectual inspiration for the 1967 unrest.

The story is similar in Detroit, which lost half its residents between 1967 and 2000. Civic authority was never restored after the late 1960s riots, which never really ended; they just continued in slow motion. “It got decided a long time ago in Detroit,” explained Adolph Mongo, advisor to the jailed former “hip-hop mayor,” Kwame Kilpatrick, that “the city belongs to the black man. The white man was a convenient target until there were no white men left in Detroit.” The upshot, explained Sam Riddle, an advisor to current congressman John Conyers, first elected in 1965, is that “the only difference between Detroit and the Third World in terms of corruption is that Detroit don’t have no goats in the streets.”

“No doubt little will be learned from Ferguson. No doubt there will be more Fergusons,” Siegel concluded. We’ve seen Ferguson’s possible future. And it’s not at all pretty:

Moviegoing Heads for the Exit

March 13th, 2015 - 12:26 pm

John Podhoretz, writing in the Weekly Standard, asks, “Will anyone go to the movies 25 years from now?”

Will there even be movie theaters 25 years from now? These are not idle questions. New research from the Motion Picture Association of America shows how the moviegoing audience of those between the ages of 25 and 39 has contracted precipitously—dropping almost 25 percent over the past four years.

Moviegoing is like any habit: Break it, and you’re not likely to go back to it. The habit is being broken. The business relies on those who go to theaters at least once a month. Such people are responsible for more than half the tickets sold in any given year. They now make up a mere 11 percent of the overall audience, and they’re getting older. Ticket sales to Americans over 40 are rising. Ticket sales to Americans between the ages of 21 and 40 are falling.

If this trend is not reversed, and it’s hard to see how it will be, two things will happen. The importance of frequent moviegoers will rise for the cinema’s bottom line because the number of people who go rarely or don’t go at all will rise. But those frequent moviegoers will begin to recede in numbers over time because they will (alas) begin, literally, to die out.

And then there’s the money it costs to build and maintain the infrastructure that delivers Hollywood’s product, Podhoretz adds:

It costs more to advertise them because it’s harder to make people aware that they even exist in the cable/Internet universe. Also, theaters are built on real estate that grows more valuable over time, luring developers because of the size of their footprint. Money has to be spent on theater upkeep or the seats will grow uncomfortable and the bathrooms skeevy. And if they grow less valuable because fewer people use them, and they don’t generate the profits at the concession stands that really support them, those theaters will be sold or will close.

In the 1970s, Nixon Derangement Syndrome drove many of the decisions made by the “New Hollywood” that replaced the studio system that created the industry’s golden era, which ran from the 1930s through the mid-1960s. (John Gregory Dunne’s book The Studio is an excellent profile of 20th Century Fox in 1967, just as the lights were about to go out on old Hollywood.) But at least the young Turks who replaced the grizzled old founders of Hollywood had fresh ideas, worked with smaller budgets, and had much more room to experiment, before Spielberg and Lucas showed the industry how to make money once again.

In the aftermath of 9/11, a near-monolithically left Hollywood worked very hard at alienating the American middle class, and by and large succeeded. The preening anti-war statements, all the way to siding with the terrorists in Guantanamo Bay. The rabid hatred of President Bush. The SJW sucker punches. The two-tiered, dumbed-down industry reduced to cranking out two types of movies: zillion-dollar CGI blockbusters in the form of mindless, formulaic epic quests and cartoon/sci-fi spectacles, and equally mindless anti-Iraq movies.

But the biggest trend that greatly changed Hollywood was the loss of its stars. A certain amount of this was out of the industry’s control. Schwarzenegger decided to become governor. Tom Cruise and Mel Gibson had very public freakouts. Clint and Harrison Ford aged, the latter making increasingly bad film choices along the way. (Anybody remember Hollywood Homicide or Firewall from the mid-naughts?)

Stars were what made an industry where “nobody knows anything” about a film’s chances, as screenwriter William Goldman famously said, somewhat predictable. In the old days, you could know nothing about a movie other than above its title was a name like John Wayne, Cary Grant, Fred Astaire, or Humphrey Bogart, and know that you were going to have a pretty good two hours ahead of you. As late as the 1990s, Hollywood in the summertime delivered up a steady stream of Schwarzenegger, Stallone, Clint, Cruise, Gibson and Harrison, and you knew that you could watch two hours of a cool guy blowing stuff up. Casablanca it wasn’t, but it was still dependable.

Today, unless you’re a teenager who wants to see comic book stars and spaceships, Hollywood no longer wants you at the box office. As Podhoretz wrote in an earlier column, American Sniper “turned out an audience of people who haven’t been to a movie theater in years.” But as with the surprise massive success of Gibson’s The Passion, Hollywood has signaled loud and clear that we’re not a crowd that they wish to cater to at the box office.

To borrow from Tony Hendra’s classic doubletalk in Spinal Tap, the industry worked very hard to make its audience more “selective.” They shouldn’t be surprised to watch it thin out even further.

We’ll get to the above 1972 video of Walter Cronkite in just a moment, but first, let’s set the stage. Return with us now to the end of the 1960s and the dawning of the craptacular ’70s. As Power Line’s Steve Hayward wrote in the first volume of The Age of Reagan, environmentalism — then simply called “ecology” — became an obsession of the left shortly after President Nixon took office, eclipsing both anti-Vietnam war and pro-civil rights protests:

Writing in Science magazine, Amitai Etzioni of Columbia University dismissed ecology as a “fad,” and thought that “the newly found environmental dangers are being vastly exaggerated.” Even if not exaggerated, Etzioni thought the environment was the wrong priority: “Fighting hunger, malnutrition, and rats should be given priority over saving wildlife, and improving our schools over constructing waste disposal systems.”

This criticism was mild compared to the blasts that came from black civil rights leaders. The most bitter attack came from Richard Hatcher, the black mayor of Gary, Indiana: “The nation’s concern for the environment has done what George Wallace was unable to do—distract the nation from the human problems of black and brown Americans.” Whitney Young of the National Urban League was equally distressed: “The war on pollution is one that should be waged after the war on poverty is won. Common sense calls for reasonable national priorities and not for inventing new causes whose main appeal seems to be in their potential for copping out and ignoring the most dangerous and pressing of our problems.”

And being a good doctrinaire liberal, CBS’s Walter Cronkite was quick to move with the times and ride the fad. As left-leaning historian Douglas Brinkley noted in his 2012 biography of Cronkite:

A CBS Reports segment in September 1962 had Eric Sevareid famously interviewing the literary biologist Rachel Carson about the perils of the insecticide DDT at her home in Silver Spring, Maryland. Cronkite, at the time, had been focused on the Earth-orbiting flight of the second Mercury launch. But now that Neil Armstrong had walked on the Moon, Cronkite sensed that ecology would soon replace space exploration as the national obsession. CBS News producer Ron Bonn recalled precisely when Cronkite put the network on the front line of the fight. “It was New Year’s Day, 1970, and Walter walked into the Broadcast Center and said, ‘God damn it, we’ve got to get on this environmental story,’” Bonn recalled. “When Walter said ‘God damn it,’ things happened.”

What could go wrong?

Cronkite pulled Bonn from nearly all other CBS duties for eight weeks so he could investigate environmental degradation. He wanted a whole new regular series on the CBS Evening News—inspired by Silent Spring, the philosophy of René Dubos, and those amazing photos of Earth taken by the Apollo 8 astronauts. The CBS Evening News segments were to be called “Can the World Be Saved?” “We wanted to grapple first with air pollution, the unbreathable air,” Bonn recalled. “But then we wanted to deal with the primary underlying problem, which was overpopulation.”

So, eugenics, then. And then a quick detour into global cooling. As Julia Seymour writes today at NewsBusters, “And That’s the Way It Was: In 1972, Cronkite Warned of ‘New Ice Age:’”

The late Cronkite is considered a “legendary journalist” and a pioneer in the field, which is why Marc Morano, publisher of Climate Depot, said this footage was so important. Morano is a former staff member of U.S. Senate Environment & Public Works Committee and producer of the upcoming global warming documentary Climate Hustle, set for release later in 2015.

“Global warming activists have claimed for years that the 1970s global cooling scare never existed. They have tried to erase the inconvenient history which ironically blamed extreme weather like tornadoes, droughts, record cold and blizzards on global cooling,” said Morano.

Morano told MRC Business, “But now — unearthed from bowels of media archives — comes none other than Walter Cronkite reporting on fears of a coming ice age in 1972. Having Cronkite’s image and face discussing global cooling fears reveals the fickleness of the climate change claims.”

“Climate fear promoters switched effortlessly from global cooling fears in the 1970s to global warming fears in the 1980s. In the present day, the phrase ‘global warming’ has lost favor in favor of ‘climate change’ or ‘global climate disruption’ or even ‘global weirding,’” Morano added. “‘Settled science’ has never seemed so unsettled.”

By the way, let’s end with this inadvertently telling paragraph from Brinkley (his book, meant to celebrate Cronkite, raised many questions about the man who spent much of his career posing as Mr. Objective):

In January 1970, the promise of a new environmentalism brought about the end of [Cronkite’s future-themed series] The Twenty-First Century (which had succeeded The Twentieth Century in June 1967). No longer would Cronkite tolerate Union Carbide (a major polluter) as a sponsor. The Texas-based Fortune 500 company was the enemy of “Earthrise,” he told Bonn. At Cronkite’s insistence, CBS canceled The Twenty-First Century to coincide with the debut of the “Can the World Be Saved?” segments.

Yes, the crank science of the 1970s brought an end to the heroic phase of Kennedy and Johnson’s space program and its dalliance with embracing the 21st century a few decades early. And along with the collapse of the Great Society, which disillusioned the left when it tried to be all things to all voters, the optimism of the postwar 1950s and the first half of the 1960s would fade away, replaced by a grim nihilistic permanent malaise.

Exit question: Scott Pelley, the current incarnation of Cronkite on CBS, has publicly likened those who question the “settled science” of global warming to Holocaust deniers, asking, “If I do an interview with Elie Wiesel, am I required as a journalist to find a Holocaust denier?”

What would he say if he ran into the 1972 iteration of Walter Cronkite?

‘Preparing for China’s Collapse’

March 3rd, 2015 - 12:58 pm

P.J. O’Rourke was interviewed by Peter Robinson of Ricochet and the Hoover Institution last month, and near the end of their wide-ranging conversation, Robinson asked O’Rourke about those of us in California who’ve given up on trying to reform the sclerotic dinosaur that is Sacramento. O’Rourke began his reply with this great anecdote referencing an even bigger and more dangerous ancient socialist government:

I remember going around China with a friend of mine who owned some steel foundries and a pelletized iron ore plant. He’s an American, but he lives in Hong Kong. Anyway, we’re wandering around mainland China, and I remember saying that I hadn’t heard any political discussions. Is it because people are afraid to talk about politics? He said, “No, they’re not afraid to [talk about politics]. You get ‘em started, and they’ll go on. But you’ve got to understand the fundamental Chinese attitude toward government is ‘shhhhhhh….don’t wake it up when it’s sleeping.’” And I think our Millennials have a little bit of that same attitude. Fortunately, what they would wake up would not be as terrifying as [China’s Cultural Revolution].

But what happens when China’s government does wake from its slumber? Steve Green writes today that the results won’t be pretty:

If a collapse should come, there is something we need to think about very seriously, whether or not Washington ever heeds Mattis’s advice: the huge economic disruptions. China does in manufacturing today what America used to do, which is to move fast and scale up even faster. China moves workers and material in amounts and at speeds which are a legal and regulatory impossibility in 21st Century America. Between worker regs and the EPA, it simply isn’t possible for the US to replace China’s manufacturing ability — and there’s no other country besides us big enough and skilled enough to even try.

China’s collapse would cut a whole leg off of the global economy, with no anesthetic and no way to stop the bleeding. The loss of physical capital and manufacturing know-how would make a second Great Depression all but certain.

We need to have a plan in place to lift an awful lot of regulations, immediately, so that American business can go back to doing the kinds of things it used to do — and could do again if Big Fat Washington weren’t sitting on its chest.

In the early days of WWII, FDR asked for the impossible — that American industry build 50,000 warplanes in the first year, and 50,000 more every year after that. Nothing like it had ever been tried. But American business saw the profit potential, and FDR (for once) mostly got Washington out of the way. Sure enough, he got his airplanes.

We could do this, and avoid a global depression. The only thing stopping us is us.

I’m tempted to say “insert the Pogo quote here,” but given its origins during the rise of the American environmentalist left in the early 1970s, it was designed to stop us as well.

Related Exit Quote, via William F. Buckley: “Every ten years I quote the same adage from the late Austrian analyst Willi Schlamm, and I hope that ten years from now someone will remember to quote it in my memory. It goes, ‘The trouble with socialism is socialism. The trouble with capitalism is capitalists.’”

Snowfalls Are Now Just a Thing of the Past

March 2nd, 2015 - 10:37 am


Past performance is no guarantee of future results:

Good Morning America anchors and reporters effusively lauded Al Gore on Friday after he won the Nobel Peace Prize for his work on global warming. Diane Sawyer opened the program by breathlessly declaring, “Former Vice President Al Gore wins the Nobel Peace Prize for helping awaken the world to global warming. Now is it time to run for president again?” In her introduction to a piece on the subject, Sawyer gushed that the ex-VP is receiving the award for “educating the world.”

“ABC Gushes Over Al Gore Nobel Win; He’s ‘Educating the World,’” the Media Research Center, October 15, 2007.

Good Morning America news reader Amy Robach on Friday mocked Republican James Inhofe as “bizarre” for a global warming speech he gave on the Senate floor. Robach described, “And a bizarre scene in Washington. One senator used the recent snow to bolster his argument about climate change.”

Inhofe held up a snowball to note the unusually cold February that the East Coast has suffered through. Tossing the snowball, he joked, “Here, Mr. President. Catch this.” ABC has a history of condescending coverage on this issue. On April 23, 2012, reporter Bill Blakemore derided climate change skeptics as “denialists” and called for more alarmist advocacy.

“ABC Hits Senator Inhofe’s Climate Speech as ‘Bizarre,’” NewsBusters, February 27, 2015.

(Headline via the London Independent in 2000. The New York Times was running similar headlines as recently as last year; anti-vaccine crank Bobby Kennedy Jr. was specifically warning of no more snow in DC in 2008.)

Update: “Continue to Remind the Alarmists that It’s Cold Out. They Deserve It,” Sonny Bunch of the Washington Free Beacon writes, in-between digging his car out from seven degree weather. We’re doing our part!

Another day, another hit piece on Walker, this time from Philip Rucker of the Washington Post. (Link safe; goes to Hot Air; I’m not rewarding attack articles with extra traffic):

Walker responded by ticking through his recent itinerary of face time with foreign policy luminaries: a breakfast with Henry Kissinger, a huddle with George P. Shultz and tutorials at the American Enterprise Institute and the Hoover Institution.

But then Walker suggested that didn’t much matter.

“I think foreign policy is something that’s not just about having a PhD or talking to PhD’s,” he said. “It’s about leadership.”

Walker contended that “the most significant foreign policy decision of my lifetime” was then-President Ronald Reagan’s move to bust a 1981 strike of air traffic controllers, firing some 11,000 of them.

“It sent a message not only across America, it sent a message around the world,” Walker said. America’s allies and foes alike became convinced that Reagan was serious enough to take action and that “we weren’t to be messed with,” he said.

According to Politico, Rucker was the guy who whined, “What about your gaaaaaaaffffffes!!!!!!” to Mitt Romney in 2012; but what about Rucker’s own gaffes, specifically his ignorance of history that likely happened before the young Democrat operative with a byline was even born? Rucker’s article is headlined “Scott Walker calls Reagan’s bust of air traffic controller strike ‘most significant foreign policy decision,’” but that’s not a bad summation of how those events played out.

Return with us now to the early 1980s. In his 2009 book The Age of Reagan: The Conservative Counterrevolution: 1980-1989, Steve Hayward of Power Line wrote:

Smashing the air traffic controllers union has loomed large in populist lore ever since as a “signal” to private sector management that it was now okay to squeeze unions, but this is too simple. (If Reagan had really wanted to send an anti-union message, he would have proposed privatizing air traffic control.) Generally polls showed that public esteem for organized labor was at an all-time low by the time of PATCO’s ill-considered gambit. Labor was getting the message. A Wall Street Journal headline a month later told the story: “Economic Gloom Cuts Labor Union Demands for Big 1982 Contracts.” Fed chairman Paul Volcker later said that Reagan’s firing of the PATCO strikers was the single most important anti-inflationary step Reagan took.

There was one unanticipated audience that paid close attention to Reagan’s manhandling of the strike: the Soviet Politburo. Since taking office the administration had been looking for an opportunity to demonstrate in some concrete ways its toughness toward the Soviet Union. As is often the case, the most effective opportunity came in an unexpected way and from an unlooked-for place. The White House realized it had gotten Moscow’s attention when the Soviet news agency TASS decried Reagan’s “brutal repression” of the air traffic controllers.

For the American news media, Reagan’s handling of the strike became the opening for a new line of criticism. During the budget fight, the dominant line of criticism was that while Reagan’s policies might be cruel and uncaring, he himself was a kindly man. Having wondered whether Reagan was too “nice,” Haynes Johnson now wrote: “A glimmer of a harsher Reagan emerges…. For the first time as president, he has displayed another, less attractive side. Firmness is fine in a president; indeed, it is desirable. But something else came through last week—a harsh, unyielding, almost vengeful and mean-spirited air of crushing opponents. It makes you wonder how he will respond if faced with a direct, and dangerous, foreign challenge, one requiring the most delicate and skillful combination of strength and diplomacy.”

Gee, ask Secretary Gorbachev how that worked out.

In her 2003 book about Reagan, Peggy Noonan quoted the Gipper’s Secretary of State George Shultz, who called it:

“One of the most fortuitous foreign relations moves he ever made.” It was in no way a popular move with the American public, but it showed European heads of state and diplomatic personnel that he was tough and meant what he said.

Yesterday, Noonan added at the Wall Street Journal:

What Reagan did not speak about was an aspect of the story that had big foreign-policy implications.

Air traffic controllers in effect controlled the skies, and American AWACS planes were patrolling those skies every day. Drew Lewis: “The issue was not only that it was an illegal strike. . . . It was also that a strike had real national-security implications—the AWACS couldn’t have gone up.” It is likely that even though the public and the press didn’t fully know of this aspect of the strike’s effects, the heads of the union did. That’s why they thought Reagan would back down. “This hasn’t come up,” said Lewis, “but the Soviets and others in the world understood the implications of the strike.”

Foreign governments, from friends and allies to adversaries and competitors, saw that the new president could make tough decisions, pay the price, and win the battle. The Soviets watched like everybody else. They observed how the new president handled a national-security challenge. They saw that his rhetorical toughness would be echoed in tough actions. They hadn’t known that until this point. They knew it now.

However, I’m not at all surprised that the newspaper whose then-subsidiary magazine declared “We Are All Socialists Now” upon Mr. Obama’s inauguration in 2009 would not be all that familiar with the history of the final years of the Cold War.

And speaking of Reagan:

Exit quote:


The pile continues to grow.

Update: “Arrogant Media Elites Mock Middle America,” Salena Zito writes today at Real Clear Politics:

As consumers of news, most Americans want an honest look at the potential presidential candidates and where they stand on serious issues.

Reporters mock those news-consumers when they mock candidates who aren’t like the reporters themselves — but who are very much like normal Americans.

It is unforgivably arrogant for anyone in the media to think that the rest of the country thinks like they do.

“A reporter’s job is to report the news, not to drive it or to create it. A reporter’s audience is not just an echo chamber, not just D.C. friends, rivals, partisans and followers on social media. (Remember: Only 8 percent of Americans get their news through Twitter.),” Zito writes.

Don’t think of the DC media as reporters, as Glenn Reynolds recently noted:

The press sees itself first and foremost as political allies of Democrat-dominated institutions, which most emphatically includes universities, a major source of funding, foot-soldiers, and ideological support for Democrats. When outsiders want information that might hurt Democrat-dominated institutions — see, e.g., ClimateGate — they are always portrayed by the press as partisans, malcontents, and evil. That is because the press today functions largely as a collection of Democratic operatives with bylines.

And the successful pushback against government unions by Walker — like Reagan before him — explains much of the subtext driving Rucker’s ahistoric ruckus.


Good news! “KFC Now Has A Coffee Cup You Can Eat,” reports BuzzFeed (who better to break this story?). Bad news — it’s only available in England right now:

This edible coffee cup was invented in a partnership with food scientists at The Robin Collective to coincide with the launch of KFC’s Seattle’s Best Coffee across its UK branches. The cup itself is made of biscuit, which has been wrapped in sugar paper and then lined with a layer of white chocolate, which melts over time, softening the biscuit enough to melt in your mouth.

On top of that delicious blend, a spokesperson for The Robin Collective told the Telegraph that the cups are also infused with a selection of “mood improving aromas,” like ‘coconut sun cream,’ ‘freshly cut grass’ and ‘wild flowers,’ which “evoke the positive memories we associate with warm weather, sunshine and summer holidays.”

Of course it does. The only charitable explanation, given the involvement of companies named Kentucky Fried Chicken and Seattle’s Best Coffee, is that perhaps they’re merely working the product’s kinks out of town before it debuts in America. I will be charitable and assume that’s the case.

Because America is waiting, as David Byrne and Brian Eno would say.

‘Rahm-a-Lama-Ding-Dong’

February 25th, 2015 - 11:49 am

Rahm Emanuel, “Ex-Obama Aide Forced into Chicago Runoff,” John Fund reports. Couldn’t happen to a nicer party hack:

Illinois, the nation’s fiscal basket case, has been full of political surprises lately. Yesterday, Chicago, the home of the political machine that nurtured Barack Obama’s career, saw Mayor Rahm Emanuel forced into an April runoff against Councilman Jesus “Chuy” Garcia.

Emanuel, Obama’s first White House chief of staff, had every advantage in the race: $7 million in TV ads, a personal visit from his former boss, and the backing of a business community that’s been able to make special side deals with “da Mayor.” But Garcia showed the muscle of two powerful forces in the city’s politics: its growing Hispanic population and the Chicago Teachers Union, furious at Emanuel’s closing of 50 public schools. Two years ago, Emanuel retreated after a brief teacher’s strike and signed a new, generous contract with the union hoping to buy peace. That never works, and now the union is out to get him.

At Bloomberg, Dave Weigel explains why his fellow ‘Progressives’ “Celebrate Rahm Emanuel’s Surprise Setback in Chicago:”

Why did progressives–why do progressives–want to humble Emanuel? The answer’s been blaring from magazines like In These Times and Rolling Stone and the Nation for months. In the election-month cover story of In These Times, for example, progressive historian Rick Perlstein explained why the deal Emanuel cut with a company to remake the city’s transit cards never stopped hurting him.

The transit cards can double as debit cards, you see, promoted as a boon for Chicago’s un- and under-banked. But dig the customer fees hidden in the 1,000-page contract the city signed with Cubic: $1.50 every time customers withdraw cash from an ATM, $2.95 every time they add money to their online debit account with a personal credit card, $2 for every call with a service representative and an “account research fee” of $10 an hour for further inquiries, $2 for a paper copy of their account information, and, if you decide you’ve had enough, a $6 “balance refund fee.” [Gee, wait'll they discover ObamaCare -- Ed] This all makes mincemeat of the pro-privatization argument that “the marketplace” is more transparent than a government bureaucracy. The city might have been able to anticipate this before inking the deal had they paid attention to the fact that Money Network, the payment processing company partnering with Cubic, had received the lowest possible grade from the Better Business Bureau, and that another partner, MetaBank, was fined $5.2 million by federal regulators for a scheme to issue debit cards funded by tax refund loans at interest rates of up to 650 percent.

Emanuel was elected in the nadir of the first Obama term. While the White House adapted to Democratic politics, and while economic progressives took back a leading intellectual role in the party, Emanuel governed as a neoliberal. He’s still got plenty of advantages over Garcia, but he’s the first Chicago mayor to be forced into a runoff since the runoff system was created. Progressives wanted not just to humble Emanuel but to make a point about what sort of politics could no longer define the Democratic Party. And they’ve done that.

So Rahm has been transformed into the local government equivalent of Joe Lieberman, whom the Kos Kiddies hung out to dry as a loyalty test in 2006? That was also Hillary’s fate in 2007 — and possibly yet again if Elizabeth Warren is serious about running.

It’s a mindset that’s catching on the other side of the aisle: at Red State today, Leon Wolf has some thoughts “On the Value of Shooting Cowards.”

(Headline via NRO’s Twitter account. Note the photo atop it, which will get loads of play should Emanuel lose his runoff against Garcia.)


Yes, you never know when a once-trusted financial advisor can turn an investment or insurance plan you thought was running on autopilot into dust, all the while promising you repeatedly…

Much more at Twitchy; presumably, this is a set-up for further shakedowns from the trial lawyers, one of the semi-retired president’s favorite constituencies, or for additional regulations. Or both.

“De Blasio and Obama’s lack of experience” is explored by Michael Goodwin in the New York Post, beginning with a great quip in the opening:

A liberal friend pained over the rampant failures of liberal government makes an observation: “Barack Obama and Bill de Blasio are both learning to shave on our whiskers.”

The image is priceless, the insight wise. Their lack of experience in how the world works, and their ignorance of such basic knowledge as meeting a payroll, are proving to be fatal flaws for our president and mayor.

* * * * * * * *

Another telling example is Obama’s latest outrage — his effort to draw a moral equivalency between Christianity and Islamic State barbarism.

“Lest we get on our high horse and think this is unique to some other place, remember that during the Crusades and the Inquisition, people committed terrible deeds in the name of Christ,” he said at the National Prayer Breakfast. “In our home country, slavery and Jim Crow all too often was justified in the name of Christ.”

The statement reflects his abiding disapproval of Western history and his fetish that Americans are always this close to mass Islamophobia — two topics where he’s far, far outside the mainstream. Instead of building a bridge, he throws a stick of dynamite.

Having failed over six years to convince the nation that Islam has nothing to do with terrorism, he drags up ancient Christian history to silence his critics. He’s moving the goal posts, which is another way of saying, “Shut up, I’m right.”

When Obama gave that shocking utterance at the National Prayer Breakfast last week, it was made all the more disgusting by ISIS having released, the day before, their latest snuff film of a man being burned alive in a cage. And as the New York Times reports — and I think we can trust them on this one — Obama’s PC hectoring of three quarters of America wasn’t boilerplate written by his often bumbling speechwriters; the president says he wrote those words himself:

President Obama personally added a reference to the Crusades in his speech this week at the National Prayer Breakfast, aides said, hoping to add context and nuance to his condemnation of Islamic terrorists by noting that people also “committed terrible deeds in the name of Christ.”

But by purposely drawing the fraught historical comparison on Thursday, Mr. Obama ignited a firestorm on television and social media about the validity of his observations and the roots of religious conflicts that raged more than 800 years ago.

And one entirely of his own making. But then hey, this is the guy who said, “I think I’m a better speechwriter than my speechwriters. I know more about policies on any particular issue than my policy directors. And I’ll tell you right now that I’m gonna think I’m a better political director than my political director.”

Well, you walked right into this latest disaster all by yourself then, champ.

As Jonah Goldberg noted in his latest G-File, “The Islamic State is crucifying people right now. Romans crucified people over 2,000 years ago. Does this mean that Italians can’t criticize them? How is it that the sins of Christianity are eternal but the sins of Muslim fanatics right now aren’t even Muslim? The Islamic State is enslaving people right now. America had slaves 150 years ago:”

Forget the Inquisition and the Crusades for a moment. Take slavery. It was an evil institution. It will always remain a stain on America’s honor.

But here’s the thing. America put an end to it at an enormous price. Moreover, slavery was a constant on every continent for thousands of years. Looking at America in the context of the great tide of human events, the remarkable thing isn’t that we had slaves, it’s that we ended slavery. We ended slavery because deep in the founding principles of this country were deeply Christian — or, if you prefer, Judeo-Christian — principles that eventually couldn’t be reconciled with slavery.

Obviously, the better example is Britain. The British had slaves, as did countless other societies and civilizations stretching off to the dawn of man. What is remarkable is that, thanks to a Christian renaissance, they decided to not only abolish slavery in their own lands, but to impose their values on others. The British got on a very high horse, thank God, and they had the courage to act on their sense of moral superiority.

As should we. It’s entirely fair to argue that we shouldn’t get on a high horse with regard to how the French or the Canadians do things, no matter how much fun it may be. But the Islamic State? The Mullahs of Iran? Boko Haram? Please, we’re so much better than them by any objective moral or intellectual standard, it’s insulting to be asked to make the case. That doesn’t mean we don’t have faults, but it does mean our faults are entirely irrelevant and one should not bring up such irrelevancies for fear that reasonable people will hear false equivalencies.

Unless, of course, you’re the kind of person who isn’t comfortable with the idea that America or the West can be wholly, completely, unapologetically on the right side of a major question of human affairs, particularly when that conviction gives you license to kill evil people. Such confidence makes some people very uncomfortable, and so they start scanning the horizon for a topic they can drag into their comfort zone. “Enough about how bad they are,” they seem to be saying. “Can’t we get back to how bad we are? Where’s Joe McCarthy when we need him!?”

The Horse Equivocator

One last thing about this high horse. There’s a kind of Escher drawing pas de deux of asininity here because Obama is telling people not to get on a high horse from the saddle of a much higher horse. I mean is there a man in public life who preaches from a higher equine altitude than this guy? This is the guy who explained that Hillary Clinton’s supporters in the Democratic primary in Pennsylvania were backward yokels bitterly clinging to their sky god and boom sticks.

What offends Obama isn’t sanctimony, judgmentalism, or arrogance; it’s competition. What rankles him is when people refuse to genuflect to the trite pieties he unspools as if they were spun from gold.

In his 1944 State of the Union address, FDR, looking towards both how the postwar era would shape up and his own upcoming reelection bid that November, smeared millions of small-government laissez-faire-minded Americans when he thundered, “if history were to repeat itself and we were to return to the so-called ‘normalcy’ of the 1920’s—then it is certain that even though we shall have conquered our enemies on the battlefields abroad, we shall have yielded to the spirit of Fascism here at home.”

Just to review: FDR considered those who opposed socialism in America to be National Socialists themselves. George Orwell, call your office.

Similarly, the man whom Time magazine anointed as the next FDR in 2008, knowing that the clock is ticking on his administration, smears three quarters of the country as crypto-terrorist slave-owners. And the man hired a decade ago by the New York Times to be their token conservative tells NBC’s Meet the Press today that he’s “totally pro-Obama on this.” Hey, those pants won’t crease themselves.

As Goldberg writes in his G-File:

What Obama shares with the collective authors of the liberal narrative is a deep and abiding suspicion that the American people are bigots, that they don’t understand their self-interest as well as liberal elites do, that America/Americans has/have no right to judge others given our own sins, and that we should never overreact to anything that makes liberals feel uncomfortable. Oh, you can overreact as much as you want to whatever liberals are overreacting to. In fact, that is encouraged. But if you get excited about something the folks at MSNBC think is weird or scary or could lead to the McCarthy poltergeist will-o’-the-wisping through the Upper West Side of Manhattan or Park Slope, then it’s a scary time here in America.

No wonder Obama is much more concerned with waging war against his domestic opponents than doing anything meaningful to fight ISIS.

On the other hand, a journalist at The Week squares the circle:

“Trickle-down economics is a Leftist lie,” British Conservative politician Daniel Hannan writes:

In a 2012 paper for the Hoover Institution, the brilliant American writer Thomas Sowell showed that the phrase was first used by FDR’s speech writer, Samuel Rosenman, who attacked “the philosophy that had prevailed in Washington since 1921, that the object of government was to provide prosperity for those who lived and worked at the top of the economic pyramid, in the belief that prosperity would trickle down to the bottom of the heap and benefit all.”

* * * * * * *

What free-marketeers in fact advocate is not trickle-down, but trickle-up. The way to become rich, in a competitive economy, is to offer a service to the broad mass of consumers. I am typing these words using software that I bought from Bill Gates. The transaction enriched him – adding fractionally to his net wealth – but it also enriched me, making my life more convenient. Bill Gates became wealthy, in other words, by persuading a great many poorer people to buy something from him. In doing so, he made us considerably better off, too. Trickle-up, you see.

Trickle-down, by contrast, would represent the precise opposite of an open market system. It would involve handing wads of cash to the undeserving rich in the hope that their affluence would somehow transfer itself to the rest of us. Now such transfers do occasionally happen. The bank bailouts were the most notorious example: they shifted a great deal of money, through coercive taxation, from people on low and medium incomes to wealthy bankers and bondholders. The Common Agricultural Policy is another instance: its cost falls disproportionately on the poor, who spend a relatively high percentage of their income on food, and its benefits go overwhelmingly to big landowners. Likewise the alternative energy boondoggles that force the general population to subsidise those same landowners through higher fuel bills.

In other words, the whole mindset of crony socialism that marked the early days of the Obama administration, which can be summed up in a single, damning word: Solyndra. (Or Tesla. Or Government Motors, but that’s two words.) In the early 1920s, the American left began to use the traditionally conservative word “liberalism” to separate themselves from the failed “Progressive” (read: totalitarian) policies of Woodrow Wilson. Over the next 15 years they would go on (as FDR and Truman both did) to Orwellianly denounce Coolidge’s laissez-faire worldview as “fascist.” Similarly, the next GOP presidential candidate could have lots of fun driving his interviewers absolutely insane by calling for a permanent end to the trickle-down economics of Barack Obama and his fellow leftists.

The left steals bases all the time. Why can’t we?

Related: In his weekly USA Today column, Glenn Reynolds offers a troika of additional ways “to ‘do something’ about poverty.”

The Biden 2016 campaign is certainly steaming along nicely, no?

At an event this morning, Vice President Joe Biden told Democrats that, “To state the obvious, the past six years have been really, really hard for this country.”

“And they’ve been really tough for our party. Just ask [former DCCC chair] Steve [Israel]. They’ve been really tough for our party. And together we made some really, really tough decisions — decisions that weren’t at all popular, hard to explain,” said Biden.

At the end of the 2000 election, Slate noted, “In the wake of a successful centrist presidency and the best economy in memory, Gore adopted an angry populism as the tone of his campaign. Michael Kinsley aptly characterized this stance as ‘You’ve never had it so good, and I’m mad as hell about it.’”

Biden has reversed this formula: “To state the obvious, the past six years have been really, really hard for this country. And they’ve been really tough for our party.” So vote for me for four more years!

As for where things stand with the suddenly Romney-less GOP field, Tom Blumer has you covered over at the PJM homepage, along with one of my Photoshops for “George Stephanopoulos, Democrat Sniper.”

Related: Tanned, rested, and ready!

FDR had breadlines for as long as the eye could see. Bill de Blasio has…

Entirely related!

And of course, as with FDR and the Depression, de Blasio and Cuomo are doing everything they can to make a bad situation worse, because, power:

A few years ago, when New York was pounded by many inches of global warming (despite the Times predicting in 2000 that snowfalls would be a thing of the past), Victor Davis Hanson warned of “The Bloomberg Syndrome:”

It is a human trait to focus on cheap and lofty rhetoric rather than costly, earthy reality. It is a bureaucratic characteristic to rail against the trifling misdemeanor rather than address the often-dangerous felony. And it is political habit to mask one’s own failures by lecturing others on their supposed shortcomings. Ambitious elected officials often manage to do all three.

The result in these hard times is that our elected sheriffs, mayors, and governors are loudly weighing in on national and global challenges that are quite often out of their own jurisdiction, while ignoring or failing to solve the very problems that they were elected to address.

Quite simply, the next time your elected local or state official holds a press conference about global warming, the Middle East, or the national political climate, expect to experience poor county law enforcement, bad municipal services, or regional insolvency.

The names of the players have changed — and going from Bloomberg to de Blasio, the players themselves have gotten worse. But the political disease lingers on.

Update: One Twitter user squares the circle:

That’s what Glenn Reynolds argues, linking to Megan McArdle’s article at Bloomberg News on Obama’s trolling State of the Union address. “This is a win-win: He gets the blame, or he vetoes it,” Glenn writes.

The problem though is that Obama might not get the blame.

As with Democrats talking George H.W. Bush into raising taxes in 1990, one huge danger to this sort of game is that Democrats will play along in 2015 and then run ads like the one above the following year, aimed at the individual GOP senators and congressmen who raised taxes:

They would also receive a very painful, albeit well-deserved reminder from one of Senator Blutarsky’s colleagues.

“Anyone reading this knows where he was on September 11, 2001. A diminishing number remember where they were on January 30, 1965—the day we said farewell to Winston Churchill. (He died fifty years ago, January 24, 1965.),” Richard Langworth writes at the Weekly Standard:

For me it was a life-changing experience. Suddenly, unforgettably, on my flickering, black and white TV screen in New York City, the huge void of Westminster Abbey filled with The Battle Hymn of the Republic. He was, we were reminded, half-American, an honorary citizen by Act of Congress.

That day was the start of my 50-year career in search of Churchill—of what his greatest biographer, Sir Martin Gilbert, calls, “labouring in the vineyard.”

After the funeral I picked up The Gathering Storm, the first volume of his World War II memoirs. I was snared by what Robert Pilpel called his “roast beef and pewter phrases.” It’s biased, as he admitted—“This is not history; this is my case.” But it is so ordered as to put you at his side for the “great climacterics” that made us what we are today.

Churchill’s life spanned sixty years of prominence, unmatched in recent history. Of course, he insisted, “nothing surpasses 1940.” That was the year Britain and the Commonwealth—“the old lion with her lion cubs,” as he put it, “stood alone against hunters who are armed with deadly weapons” until “those who hitherto had been half blind were half ready.”

But I soon learned there was more to Churchill than 1940. Martin Gilbert wrote: “As I open file after file of Churchill’s archive, from his entry into Government in 1905 to his retirement in 1955, I am continually surprised by the truth of his assertions, the modernity of his thought, the originality of his mind, the constructiveness of his proposals, his humanity, and, most remarkable of all, his foresight.”

Sadly, England as a whole lacked Churchill’s foresight; at MercatorNet, Alun Wyburn-Powell explains “How Winston Churchill lost the 1945 election:”

Among the excuses the Conservatives offered after their defeat was that the Army Bureau of Current Affairs had indoctrinated service personnel to vote Labour. This excuse was at least plausible in principle, but it was pretty flimsy stuff.

There were some more obvious reasons for Churchill’s humiliation. Ultimately, the Conservatives had simply lost the electoral “ground war”.

In contrast to the other parties, the Conservatives had stuck rigidly to the spirit and the letter of the wartime electoral truce, only holding one party conference during the war and putting little effort into policy development and constituency organisation. The result was that the party machine was in a terrible state, with a greatly depleted band of agents and volunteers.

The party was also still carrying the blame for the appeasement of Hitler in the 1930s, for which it had been excoriated by the 1940 book Guilty Men.

Public memory was also against the Tories for another reason: the travails of David Lloyd George, who died in 1945. While Lloyd George was still credited as the man who won World War I, his record as prime minister after the war was dismal, marked by broken promises, unemployment, industrial unrest and threats to start another war. His dire tenure created a popular consensus that good war leaders do not necessarily make good peacetime leaders.

Meanwhile, British society had changed during the war. Voters had become less class-bound; the evacuation of urban children to rural areas, service of all classes in the armed forces, and civilians sharing bomb shelters with strangers, had facilitated social mixing on an entirely new scale.

That in turn helped create a whole new political atmosphere. After World War I, many people had wanted a return to life as it had been – but after World War II, most people wanted a complete break with the past. In that climate, Labour’s forward-looking election slogan, “Let us face the future”, was far more appealing than the Conservatives’ plea to let Churchill “Finish the job”.

Everyone should watch the 26-part World at War series released in 1973 by Thames Television, available on DVD from Amazon, and pretty easily found in streaming format on the Web. As I’ve written before, it was produced at exactly the right moment — when television was technically sophisticated enough to undertake a project of this scope, while many of the major players were still alive (and many of them still relatively young), and while Laurence Olivier was alive to narrate the series with the gravitas it deserved.

But perhaps most importantly, it was produced before the excoriating impact of political correctness began to tarnish how we view World War II, a process which, unless we really have reached what Robert Tracinski of the Federalist calls “Peak Leftism,” will likely only get worse in coming decades. Political correctness is a disease that advanced slowly before fully metastasizing; its roots were already present among 1930s British leftwing elites, who vowed they would “in no circumstances fight for king and country,” and who feared Churchill more than they feared Hitler (plus ça change). And as the 15th episode of The World at War, titled “Home Fires,” notes, even as England was on the verge of defeating National Socialism in Germany, it was about to institute an ever-increasing amount of peacetime nationalization and socialism at home:

That’s an excerpt from that episode; watch the whole thing here.

As to how Labour would radically reshape the people who inhabited postwar Britain, Peter Hitchens, the Tory-leaning brother of the late leftist Christopher Hitchens, does a remarkable job of highlighting the transformation of his country in his 2000 book, The Abolition of Britain: From Winston Churchill to Princess Diana. (Please, somebody release this book in Kindle format.) As the book’s title suggests, Hitchens begins by comparing the British people who turned out with stiff upper lips for the 1965 funeral of the Man Who Won World War II with those who, some 30 years later, ululated en masse over the demise of Princess Diana, who was largely famous for being famous, and for being a wannabe pop star and fashion icon. In other words, for purely aesthetic reasons.

“Wouldn’t it be simpler,” socialist playwright Bertolt Brecht famously wrote, “if the government dissolved the people and elected another?”

It took a few decades, with a timeout of sorts during the Thatcher years, but mission accomplished in postwar, post-Churchill England.

Speaking of political correctness, the transformation of a people, and Margaret Thatcher, Mr. Obama couldn’t be bothered to attend her funeral in 2013. Presumably, he wouldn’t have made time for Churchill’s either, right?

Update: At Power Line, Steve Hayward is more optimistic about the West’s future than I am, dubbing Churchill “Not the Last Lion:”

Manchester wrote in 1983 (in National Review, surprisingly enough) that “If there is a high office in the United States to which Winston Churchill could be elected today, it is unknown to me.”

The irony is that pre-war Churchill thought very much the same thing: see his remarkable essay from around 1930 entitled “Mass Effects in Modern Life,” which is in the must-have collection, Thoughts and Adventures. “Modern conditions do not lend themselves to the production of the heroic or superdominant type,” he wrote.  This was, Harry Jaffa pointed out in a splendid essay entitled “Can There Be Another Churchill?,” an instance of Churchill being wrong:

In 1939, Winston Churchill did not think so. But, as so often in his life, he was mistaken. Let us take comfort in that.

And in response to my post, Kathy Shaidle proffers excellent advice:

“In-Flight Catalog SkyMall Files for Bankruptcy,” the Wall Street Journal reports:

“With the increased use of electronic devices on planes, fewer people browsed the SkyMall in-flight catalog,” Mr. Wiley said.

The increase in the number of airlines providing Internet access “resulted in additional competition from e-commerce retailers and additional competition for the attention of passengers, all of which further negatively impacted SkyMall’s catalog sales,” he added.

The SkyMall business had revenue of about $33.7 million in 2013, but only $15.8 million for the nine months ended September 28, 2014.

SkyMall filed to preserve their assets by seeking “to achieve a sale of their assets and complete an orderly wind-down of their affairs,” said Mr. Wiley.

Is nothing sacred? 2015 is certainly starting off on a consolidating note, as first Radio Shack and now SkyMall fall before the mighty buzzsaw of Amazon, which continues to devour the rest of the retail world. I’m not sure if I can face living in a world without SkyMall, but how will Barney Stinson survive?

And where will the rest of us get our backyard-enhancing products made from “quality designer resin,” eh?