
Ed Driscoll

From Bauhaus To Our House

Lights! Camera! Weimar!

March 7th, 2013 - 7:31 pm

“Hollywood’s German Influence” is explored by A.J. Goldmann in the Wall Street Journal:

They don’t make them like they used to. But while the most recent Berlin Film Festival was filled with largely mediocre competition fare, luckily one could seek refuge in the festival’s sidebar retrospective, “The Weimar Touch.” Co-curated by the Deutsche Kinemathek and the Museum of Modern Art—it will be shown in New York with slight modifications from April 3 through May 6—the series used the works of such celebrated directors as Fritz Lang, Ernst Lubitsch and Billy Wilder to make a compelling case for the lasting influence of Weimar-era cinema on Hollywood and beyond.

It is hard to overstate the amount of technical innovation and sheer talent that characterized the German film industry between 1918 and 1933. Today this golden age is best remembered for Expressionist masterpieces such as Robert Wiene’s “The Cabinet of Dr. Caligari” and F.W. Murnau’s “Nosferatu.” But Weimar filmmaking was much more than just sinister lighting and jagged camera angles. The industry also produced comedies, musicals, melodramas and thrillers that were among the most popular and inventive movies of the era.

When the Nazis took power in 1933, many of Germany’s most gifted directors, actors, cinematographers, production designers and composers were forced out of its film industry. All told, the country lost more than 2,000 professionals, many of Jewish descent. And Germany’s loss was Hollywood’s gain. The themes, techniques and sensibilities that a generation of émigré filmmakers carried with them to Hollywood brought a new degree of sophistication and know-how to American cinema.

But what corner of American life wasn’t shaped by the Weimar Republic? Last year, I did a Silicon Graffiti video titled “Weimar? Because We Reich You,” which focused on how ubiquitously Weimar-era ideas and concepts seeped into American culture (both high and low), politics, and science over the course of the 20th century.

In 1987’s The Closing of the American Mind, the late Allan Bloom wrote that by the middle of the 20th century, American universities had essentially become enclaves of German philosophy:

This popularization of German philosophy in the United States is of peculiar interest to me because I have watched it occur during my own intellectual lifetime, and I feel a little like someone who knew Napoleon when he was six. I have seen value relativism and its concomitants grow greater in the land than anyone imagined. Who in 1920 would have believed that Max Weber’s technical sociological terminology would someday be the everyday language of the United States, the land of the Philistines, itself in the meantime become the most powerful nation in the world? The self-understanding of hippies, yippies, yuppies, panthers, prelates and presidents has unconsciously been formed by German thought of a half-century earlier; Herbert Marcuse’s accent has been turned into a Middle Western twang; the echt Deutsch label has been replaced by a Made in America label; and the new American life-style has become a Disneyland version of the Weimar Republic for the whole family.

As I added in the conclusion of my script:

Which isn’t to say that German influences on America were all bad, or that this was some sort of sinister plot. Sigmund Freud’s efforts have been left in the dust by modern neuroscience, but research into the hidden caverns of the human brain had to start somewhere. Albert Einstein’s theories led to the splitting of the atom, which both won World War II and provided the basis of nuclear power, which has been remarkably safe in America and Europe. After the war, America’s jet aviation – first the Air Force, then commercial airlines – benefitted enormously from brilliant German engineering work even as America was, thankfully, destroying Nazi Germany’s ability to implement these designs. Similarly, America landed a man on the moon thanks to the efforts of Wernher von Braun and other German émigrés.

On Park Avenue in 1966, a businessman could have lunch at the Four Seasons, a restaurant designed by Philip Johnson (who dug both Weimar and its successor culture…) in a building designed by Mies van der Rohe. He could then walk over to the Pan Am building, designed by Walter Gropius, to catch a helicopter to JFK Airport, and on the way, read about Wernher von Braun’s latest efforts to land a man on the moon. If he was worried that von Braun’s missiles could be used to deliver payloads designed by Albert Einstein and Edward Teller – well, his Freudian analyst would soon set him at ease. At least until he saw the April 8, 1966 cover of Time magazine, which echoed the words of Friedrich Nietzsche nearly a century earlier. And all the while, he likely never thought of where these additions to American life originated. The following decade, our businessman would struggle with Weimar-style risqué sexual mores, hyperinflation, and what the then-president’s administration ultimately termed a malaise and a crisis of confidence amongst his fellow liberal elites.

But to respond to the query by Thomas Friedman last year in the New York Times, ‘Can Greeks Become Germans?’

Well, 50 years ago, we did, didn’t we?

No reason why Hollywood should have been exempted from Weimar’s influence as well.

(Thumbnail on PJM homepage based on a modified Shutterstock.com image.)

‘I’m Going to Miss Shopping’

January 23rd, 2013 - 8:50 pm

I love the Internet and shopping online at Amazon, and while I’d prefer the aesthetics of the Mad Men era, technology-wise I’m very happy to be living in the 21st century.

But.

On Monday, James Lileks visited his local Best Buy to purchase a low-end portable computer. Or as he put it, “My wife said she needed a laptop so she could work at home in the evening without bringing the office laptop. The clerk asked if he could help, and I said I was looking for a cheap virus-magnet laptop loaded with crapware.”

Afterwards, Lileks visited a competing chain, Micro Center:

Drove on to MicroCenter, or whatever it’s called — a junky computer store in an old grocery-store building, aisles of geek detritus. Always packed. Good prices. A hard drive or a legacy cable. The internet won’t kill it because there are many of us who need these things now and like to go to a place where these things are.

The Internet couldn’t kill it, but the craptacular California economy could. From about 1998 or so until last year, there was a Micro Center in Santa Clara, about 10 miles from my house. Its primary competition wasn’t really with Best Buy, but with Fry’s Electronics. Fry’s is a regional chain with Silicon Valley origins that has a huge selection of electronic products and giant big-box stores with the overall room tone of a pachinko palace — and is notorious for the worst customer service around. If you’re a geek and really know what you’re doing, Fry’s is probably great – in the old days, you could pick up a new CPU, motherboard, computer case, hard drive, sticks of RAM, a case of Diet Coke, a box of Twinkies, the skin mag of your choice, and you were good to go for the weekend.

But, heaven forfend, as Lileks would say, if you need to return something. Fry’s salesmen seem like a cross between the Soup Nazi and ex-members of Saddam’s Republican Guard. Its customer service department was the likely inspiration for the TSA. You’ll need your driver’s license, social security card, high school diploma, and every scrap of paper that came with the product you purchased. If your router came with a subscription card for PC World, it had better be there, or NO REFUND FOR YOU. NEXT! (My wife’s theory, and I think she’s onto something, is that the worse the customer service, the better the bargain the customer thinks he’s receiving. Or as a 1997 Forbes article noted in its headline, “The customer is always right? Not at Fry’s.”)

In contrast, Micro Center’s clerks seemed surprisingly friendly and interactive; they knew their products, and their return policies were remarkably liberal, in the old sense of the word. That encouraged a certain amount of experimentation: if you weren’t sure which product you needed, this was the place to go. And as a result, our LAN hubs and cables, routers, RAM sticks, power filters, and all of the nuts-and-bolts products we needed to run two businesses out of our house and set up our home theater came from Micro Center. (Not to mention, the store stocked a number of the electronics and gadget magazines I wrote for during my pre-Blogosphere days.)

The local Micro Center was always crowded whenever I went in there, but evidently that wasn’t enough to sustain it. As the (former) store’s webpage notes, it is “very difficult to announce that our Santa Clara store closed for normal business on Monday, July 23rd [2012]. We are very disappointed with the unsuccessful attempts to negotiate an economically viable extension of our store lease, and we deeply regret not being able to fully serve you going forward.”

If you can’t sustain a computer store in the heart of Silicon Valley, where can you sustain it? As Lileks adds in his post, regarding Best Buy:

I’m going to miss shopping and looking and touching, I really am. I hope my daughter remembers running up and down the aisles, looking at movies and Nintendo games and marveling at the volume of things to see and hear and play. You never get that sense from internet shopping. On the internet everything is presented individually, a page at a time, like a clerk is bringing out a shoe for Madame to consider.

Whatever the future for America’s retail economy holds, it likely won’t resemble the one envisioned by the architect whose daughter’s documentary Lileks links to later in his post — an unbelievably depressing look at a guy who apparently thought that From Bauhaus to Our House was a how-to guide and not a warning, who missed every lesson about the organic nature of civilization, the culture, and the importance of the street, the sidewalk, and the stoop taught by Jane Jacobs, and who believed it was his mission to hit the CTRL-ALT-DEL keys on the way mankind lives. What could go wrong, this time?

Meanwhile, Back in Chicago

January 21st, 2013 - 1:56 pm

“Chicago’s $22.5 Million Payout To A Gang Rape Victim Is Probably A Bargain.” Doug Johnson of the Wizbang blog notes:

Chicago Alderman Edward Burke says the city could have lost $80 million or more had a jury learned of all that Eilman went through and the nine desperate calls her mother made to police in the 27 hours. She had begged officers not to release her daughter because her daughter was bipolar and having a breakdown, but to no avail.

…Despite doing things like babbling incoherently and smearing menstrual blood on the holding cell walls, and after her parents’ frantic calls, Eilman was released to fend for herself in a high-crime area. She ended up in a nearby public housing building, where a man raped her at knifepoint before she fell from a seventh-story window.

Read the whole thing; follow the link to this Chicago Tribune report from 2010 for the horrific backstory.

And speaking of Chicago, is a once-great town with some of America’s best architecture (both Frank Lloyd Wright and Mies van der Rohe worked there over the years, as did both architects’ talented acolytes) now on the road to Detroit? In a typically well-crafted essay at the Weekly Standard, Andrew Ferguson tries to make sense of the 21st-century Second City: “Whose Kind of Town?”

Notes from Atlantis

January 13th, 2013 - 11:59 pm

Last week, we linked to Sarah Hoyt’s essay on the resilience of cultural memes from the immediate aftermath of WWI that have stayed permanently entrenched in the left’s collective thinking. Since she wrote that she’s currently fascinated by the British interwar era, I sent her a link to the episode of the early-1970s Thames television series The World At War that focused on the British home front during WWII. As I mentioned in a post from the day after Christmas, if you want to see Britain’s welfare state being formed, and its healthcare nationalized and socialized, even as England was busy pulverizing a giant welfare state with plenty of national socialism of its own, it’s an interesting segment — probably more so to an American viewer than a British one.

As I’ve written before, The World At War was made at the perfect time — television documentary techniques were sufficiently developed by 1969, when production on the series began, to tell the story properly, and since it was only a quarter century after WWII concluded, enough survivors were still around, still sharp, and able to appear on camera. But of equal importance is that it was made before political correctness had sapped the cultural confidence of the West. If the BBC or Thames’ successor network were to remake The World at War today, it would have a very different tone to it, probably far closer to Oliver Stone’s “Springtime for Hitler and Stalin” Showtime series than the BBC would care to admit.

Also, the interviews and the contemporary non-newsreel footage were shot in color. We take that entirely for granted now, but when the show first went into production, color TV was still a new phenomenon to many English viewers; BBC2 had only begun broadcasting in color in 1967, and BBC1 not until 1969. It’s tough to conceive of something like Monty Python’s Flying Circus as being shot in black and white, but as late as 1967, its immediate predecessor, a show with the classic title of At Last, the 1948 Show, was a monochrome production.

Another influential British documentary series from that era, which may well have influenced the style and quality of The World at War, would also have a very different tone were it made today. In fact, it probably couldn’t be made today. To help promote the BBC’s embrace of color television, in 1968 the network commissioned a 13-part documentary series titled Civilisation: A Personal View by Kenneth Clark — or simply Civilisation, as it’s almost universally called.

Civilisation debuted on February 23, 1969; to further advance the acceptance of color TV, each episode featured luscious cinema-quality photography of globe-hopping historical locations and numerous key pieces of art and sculpture, with all sorts of stately camera moves, all shot on 35mm film rather than the cheaper-looking 16mm format or videotape. (Many, perhaps all, of the episodes are currently available in full-length form at YouTube, but the series is also available on Blu-Ray, and in terms of cinematography, it’s worth it.)

It’s fascinating, in 2013 — witnessing the ongoing collapse of our own culture, and in particular the complete collapse, decades ago, of what was once called “middlebrow culture” — to watch a show titled Civilisation that is itself the product of a civilization that effectively no longer exists. At the very least, the network that created the series no longer exists in the same form (QED).


I’ve written several posts over the years noting that modern art — at least the “shocking the bourgeois” brand of modern art — is a genre permanently trapped in the 1920s. Modern architecture often seems similarly trapped in the same decade, endlessly recycling the forms and styles created by Mies van der Rohe and Le Corbusier. Sarah Hoyt has an interesting post this weekend that describes much of today’s bourgeois intellectual life as permanently trapped in that decade as well, as a byproduct of WWI and its aftermath:

Like a child shocked by WWI and having both externalized the blame – Listen to a six year old, sometime “I didn’t break the vase.  It was the cat.”  Same thing “I didn’t cause death and carnage.  It was capitalism and old white men.” – and misattributed it – states looking for resources and expanding their power through bureaucratic means was more important than competition for raw materials, whatever you heard in school – the idiot child that is Western civ continues rampaging through her room, tearing everything that made it comfortable and useful and a good place, and throwing it out the window.



It’s a Wonderful Fountainhead

December 24th, 2012 - 12:20 am

Joe Carter of the Catholic Education Resource Center explores “The Fountainhead of Bedford Falls.” As he writes, Ayn Rand’s Howard Roark and Frank Capra’s George Bailey aren’t often discussed in the same breath, but the two fictitious characters, immortalized by Hollywood via Gary Cooper and Jimmy Stewart, two legendary mid-century leading men, have a surprising amount in common.

“To anyone familiar with both works, it would seem the two characters could not be more different,” Carter notes. “Unexpected similarities emerge, however, when one considers that Roark and Bailey are variations on a common archetype that has captured the American imagination for decades:”

Howard Roark, the protagonist of Rand’s book, is an idealistic young architect who chooses to struggle in obscurity rather than compromise his artistic and personal vision by conforming to the needs and demands of the community. In contrast, George Bailey, the hero of Capra’s film, is an idealistic young would-be architect who struggles in obscurity because he has chosen to conform to the needs and demands of the community rather than fulfill his artistic and personal vision. Howard Roark is essentially what George Bailey might have become had he left for college rather than stayed in his hometown of Bedford Falls.

Rand portrays Roark as a demigod-like hero who refuses to subordinate his self-centered ego to the demands of the community. Capra, in stark contrast, portrays Bailey as an amiable but flawed man who becomes a hero precisely because he chooses to subordinate his self-centered ego for the greater good of the community.

Read the whole thing, found via Kathy Shaidle, who has her own thoughts on the comparison.

And for my video interview with Jennifer Burns, the historian and author of Goddess of the Market: Ayn Rand and the American Right, in which we discuss The Fountainhead, along with other aspects of Rand in postwar America, just click here.

Incidentally, say what you will about Rand and Capra, Roark and Bailey, and Cooper and Stewart; the Hollywood of World War II and its immediate aftermath was undoubtedly made of sterner stuff than its current iteration.

Related: Since this is a movie-related post, I might as well hang this here: a movie Easter egg so cool, it goes to 11.

(Originally posted December 9, 2010.)

Update: Welcome Instapundit readers! When you’re done here, check out The View from Alexandria, which has some thoughts on “Politics without Foundations,” and why one Yale history professor believes that “there’s no liberal Ayn Rand.”

‘Blood, Poo, Sacrilege, and Porn’

November 30th, 2012 - 12:51 pm

Now that we have your attention, the above headline (wait, where else do headlines go?) comes from an article on the Washington Post’s Slate Website*, which focuses on eight theories on “Why the Art World Is So Loathsome.” The source of our headline is number two on their list, appropriately enough:

Old-school ’70s punk shock tactics are so widespread in today’s art world that they have lost any resonance. As a result, twee paintings like Gainsborough’s Blue Boy and Constable’s Hay Wain now appear mesmerizing, mysterious, and wildly transgressive. And, as Camille Paglia brilliantly argues in her must-read new book, Glittering Images, this torrent of penises, elephant dung, and smut has not served the broader interests of art. By providing fuel for the Rush Limbaugh-ish prejudice that the art world is full of people who are shoving yams up their bums and doing horrid things to the Virgin Mary, art has, quoting Camille again, “allowed itself to be defined in the public eye as an arrogant, insular fraternity with frivolous tastes and debased standards.” As a result, the funding of school and civic arts programs has screeched to a halt and “American schoolchildren are paying the price for the art world’s delusional sense of entitlement.” Thanks a bunch, Karen Finley, Chris Ofili, Andres Serrano, Damien Hirst, and the rest of you naughty pranksters!

But the first sentence of the above paragraph — “Old-school ’70s punk shock tactics are so widespread in today’s art world that they have lost any resonance” — is actually chronologically off by about five decades, and gets the source of the art world’s postmodern/reprimitivized games-playing exactly backwards. Don’t blame the ’70s punks**; blame René Magritte, Marcel Duchamp, and the Dada of the 1920s, which pioneered exactly the sorts of stunts you see in a Piss Christ or the Virgin Mary depicted in elephant dung, even down to the reminders of bodily waste. Like Ikea’s furniture, today’s examples of modern “art” are merely cheap knock-offs of much older forms of modernism.

As with modern architecture remaining forever trapped in the 1920s, and much of culture in general trapped in the 1970s, if we’re going to remain in what Mark Steyn calls “present-tense culture” for the first decades of the 21st century, why must it remain so permanently freeze-dried?

* I had originally planned to insert a crack at the start of this post about how Slate allows the Post to go further left than its main namesake publication ordinarily can. But I’m not sure if there’s any territory further to the left remaining for the Post to go, post-JournoList. And speaking of both freeze-dried culture and the Washington Post’s leftward lurch, on PJM’s homepage, Bob Owens reminds us that for the remarkably undiversified Post editorial board, it’s 1963 — and it always will be.

** Who could shock by being quite conservative — and occasionally still are.

From Bauhaus to Ed’s House

November 28th, 2012 - 1:25 pm

Looking for a lengthy review of Mies van der Rohe: A Critical Biography, New and Revised Edition by Franz Schulze and Edward Windhorst? Of course you are!

Which is why, in addition to all of the usual political stuff here, I have just such a review over at the PJ Lifestyle blog. And yes, in case you’re wondering (and aren’t you astute for even pondering this topic!) there are plenty of comparisons to the original 1986 edition of the book.

(And yes, I am making up for my late 1980s obsession with modern art. I have to put those pretensions to work somewhere, right?)

Or: From Sea-Land to Our House.

In 1923, when modernists were obsessed with the machine (decades before their current rage against it), Le Corbusier put himself on the architectural map with his aphorism, “The house is a machine for living in.” And as the years went on, plenty of other modern architects took this phrase to heart (to pacemaker?). In From Bauhaus to Our House, Tom Wolfe memorably described the ever-cheapening effect of modernism on architecture:

Eventually, everyone gave up and learned, like the haute bourgeoisie above him, to take it like a man.

They even learned to accept the Mieslings’ two great pieces of circular reasoning. To those philistines who were still so gauche as to say that the new architecture lacked the richness of detail of the old Beaux-Arts architecture, the plasterwork, the metalwork, the masonry, and so on, the Mieslings would say with considerable condescension: “Fine. You produce the craftsmen who can do that kind of work, and then we’ll talk to you about it. They don’t exist anymore.” True enough. But why? Henry Hope Reed tells of riding across West Fifty-third Street in New York in the 1940s in a car with some employees of E. F. Caldwell & Co., a firm that specialized in bronze work and electrical fixtures. As the car passed the Museum of Modern Art building, the men began shaking their fists at it and shouting: “That goddamn place is destroying us! Those bastards are killing us!” In the palmy days of Beaux-Arts architecture, Caldwell had employed a thousand bronzeurs, marble workers, model makers, and designers. Now the company was sliding into insolvency, along with many similar firms. It was not that craftsmanship was dying. Rather, the International Style was finishing off the demand for it, particularly in commercial construction. By the same token, to those who complained that International Style buildings were cramped, had flimsy walls inside as well as out, and, in general, looked cheap, the knowing response was: “These days it’s too expensive to build in any other style.” But it was not too expensive, merely more expensive. The critical point was what people would or would not put up with aesthetically. It was possible to build in styles even cheaper than the International Style. For example, England began to experiment with schools and public housing constructed like airplane hangars, out of corrugated metal tethered by guy wires. Their architects also said: “These days it’s too expensive to build in any other style.” Perhaps one day soon everyone (tout le monde) would learn to take this, too, like a man.

And according to ABC News, they’re taking this like a man in the Motor City: “Shipping Containers to Become Condos in Detroit.” And of course, ABC describes it as “Exceptional Green Living on Rosa Parks, Detroit.” Exceptional!

This 20-unit, four-story condo complex consisting of 93 stacked cargo containers – the first U.S. multi-family residence to be built from these discarded vessels – has been in the works for four years. Tabled when the national real estate market shattered, the project is now scheduled to break ground early next year in midtown Detroit. The units will come rigged with ductless heating and air systems, tankless water heaters and other energy-saving systems. “We’re putting money into these energy efficiencies so that the tenant has reduced energy costs,” says Leslie Horn, CEO of Three Squared, the project’s developer. “And we can build in less than half the time.”

Wow, 30 years ago, when Chrissie Hynde sang about driving “Past corrugated tin shacks holed up with kids and man I don’t mean a Hampstead nursery,” she had no idea what was coming.

Incidentally, note that even more “exceptional” cases of “green living” have also been proposed for Detroit.

Whither the Arts?

October 8th, 2012 - 12:17 am

At Ricochet, Dave Carter links to Camille Paglia’s essay in the Wall Street Journal on the decline of the art world with a reminder of the wonders of the 700-year-old Cologne Cathedral, and writes:

To venture inside and see The Shrine of the Three Holy Kings (purported to hold the crowned skulls of the Three Wise Men), or the Gero Cross which dates back to 976, or the legions of statues, is to become virtually intoxicated with the divine devotion that conceived and constructed such a solemn place.

Where is there anything in modernity to compare?  Camille Paglia poses just such a question, asking (and answering) the question of why so much of our fine arts have devolved into a “wasteland.”  “Painting was the prestige genre in the fine arts from the Renaissance on.  But painting was dethroned by the brash multimedia revolution of the 1960s and ’70s,” writes Paglia, who then zeros in on a central point:  “What do contemporary artists have to say, and to whom are they saying it?  Unfortunately, too many artists have lost touch with the general audience and have retreated to an airless echo chamber.”

It’s a chamber where the avant-garde first yielded to iconoclasm, which in turn has yielded to unimaginative and vulgar conformity.  One need look no further than the artist who submerses a crucifix in urine, and then congratulates himself for bravely giving the finger to orthodoxy, all while carefully avoiding a cartoon of Mohammed so as to avoid getting his head chopped off.  So much for breaking new ground.

If I’m remembering the history of modernism correctly, as early as the 1960s, modernists were looking back nostalgically at Mondrian, Monet, and other pioneering modernists as a Heroic Era long since passed. In architecture, Frank Lloyd Wright, Corbusier, Walter Gropius, and Mies van der Rohe would all be dead by the end of the decade. In pop culture, by the end of the 1960s, the “Easy Riders/Raging Bulls” crowd of Young Turks would slam the door shut on Hollywood’s golden era, cheered on by critics such as Pauline Kael.

About that last development: in 2008, Robert Fulford wrote in Canada’s National Post, in a description applicable to much of what was going on in the rest of pop culture in the late 1960s, as liberals spontaneously declared the postwar Middlebrow era dead:

[Kael] announced no less than a revolution in taste that she sensed in the air. Movie audiences, she said, were going beyond “good taste,” moving into a period of greater freedom and openness. Was it a violent film?

Well, Bonnie and Clyde needed violence. “Violence is its meaning.”

She hated earnest liberalism and critical snobbery. She liked the raw energy in the work of adventurous directors such as Robert Altman, Francis Ford Coppola, Steven Spielberg, George Lucas and Martin Scorsese. She trusted her visceral reactions to movies.

When hired as a regular New Yorker movie critic, she took that doctrine to an audience that proved enthusiastic and loyal. She became the great star among New Yorker critics, then the most influential figure among critics in any field. Books of her reviews, bearing titles such as I Lost It at the Movies, Kiss Kiss Bang Bang and When the Lights Go Down, sold in impressive numbers. Critics across the continent became her followers. Through the 1970s and ’80s, no one in films, except the actual moviemakers, was more often discussed.

It was only in the late stages of her New Yorker career (from which she retired in 1991) that some of her admirers began saying she had sold her point of view too effectively. A year after her death (in 2001) one formerly enthusiastic reader, Paul Schrader, a screenwriter of films such as Raging Bull and Taxi Driver, wrote: “Cultural history has not been kind to Pauline.”

Kael assumed she was safe to defend the choices of mass audiences because the old standards of taste would always be there. They were, after all, built into the culture. But those standards were swiftly eroding. Schrader argued that she and her admirers won the battle but lost the war. Acceptable taste became mass-audience taste, box-office receipts the ultimate measure of a film’s worth, sometimes the only measure. Traditional, well-written movies without violence or special effects were pushed to the margins. “It was fun watching the applecart being upset,” Schrader said, “but now where do we go for apples?”

And that’s the question modernists in general need to ask themselves. The other question, which often remains unexplored, is how comparatively ancient self-described “modern” art really is.


“Is There a Limit to How Tall Buildings Can Get?”, the Atlantic asks.

With or without a space elevator on top?, we query in return.

Now is the time at Ed Driscoll.com when we juxtapose, Small Dead Animals-style:

– Headline, TV Newser.com, June 21st.

– Ann Althouse, today, who adds that “The clip is hilarious.”

Unexpectedly! — to coin another Internet meme, as the network that brought you The Wright-Free Zone now brings its Saudi Arabian viewers a surfeit of Seurat.

Oh those crazy New Puritans — it’s enough to make you wish that the Impressionists had been dentists.

“The history of philosophy can be divided into two different periods. During the first, philosophers sought the truth; during the second, they fought against it.”

– Jean-François Revel (1924-2006), French journalist and philosopher, as quoted in The Fortunes of Permanence.

Roger Kimball, my fellow PJ Media columnist and publisher of the New Criterion magazine and Encounter Books, stopped by recently to discuss his latest book, The Fortunes of Permanence: Culture and Anarchy in an Age of Amnesia. During our interview, Roger expounds on:

  • Present-Tense Culture: What happens to a culture that has not only submerged its past, but is doing its damndest to bury it permanently?
  • The left-wing etymology of the word “ideology.”
  • How postwar-American modern architecture was able to minimize its prewar socialist past.
  • How Rudyard Kipling and James Burnham became historical unpersons.
  • The progressive paradox of colonialism.
  • The future of socialism: Over 60 years ago, George Orwell responded to the horrors of socialism by asking, “where is the omelet?” Since England, Greece and California are all in dire fiscal straits, and we know that there’s no omelet being cooked up, where does America go from here?

And much more. Click here to listen:


(23 minutes long; 22 MB file size. Want to download instead of streaming? Right click here to download to your hard drive. Or right click here to download the 6.9 MB lo-fi edition.)

Since in the past a few people have complained of difficulties with the Flash player above and/or downloading the audio, use the video player below, or click here to be taken to YouTube, for an audio-only YouTube clip. Among those versions, you should find a format that plays on your system.

For the rest of our podcasts, click here and just keep scrolling.

And for more from Roger, don’t miss his video interview with Glenn Reynolds on his new book.

Beyond the Theory of Moral Relativity

July 4th, 2012 - 12:03 am

On this Fourth of July, to understand how America — and much of the world — began to go off the rails in the 20th century, it’s worth flashing back to the tremendous opening shot of Paul Johnson’s opus Modern Times:

At the beginning of the 1920s the belief began to circulate, for the first time at a popular level, that there were no longer any absolutes: of time and space, of good and evil, of knowledge, above all of value. Mistakenly but perhaps inevitably, relativity became confused with relativism.

No one was more distressed than Einstein by this public misapprehension. He was bewildered by the relentless publicity and error which his work seemed to promote. He wrote to his colleague Max Born on 9 September 1920: ‘Like the man in the fairy-tale who turned everything he touched into gold, so with me everything turns into a fuss in the newspapers.’ Einstein was not a practicing Jew, but he acknowledged a God. He believed passionately in absolute standards of right and wrong.

He lived to see moral relativism, to him a disease, become a social pandemic, just as he lived to see his fatal equation bring into existence nuclear warfare. There were times, he said at the end of his life, when he wished he had been a simple watchmaker.

The public response to relativity was one of the principal formative influences on the course of twentieth-century history. It formed a knife, inadvertently wielded by its author, to help cut society adrift from its traditional moorings in the faith and morals of Judeo-Christian culture.

Last week, while searching for that quote, I came across a 2010 comment on Johnson’s thesis by econo-blogger Bryan Caplan, and a post at the long-running British libertarian blog Samizdata, both of which referenced it with some gentle criticism. As Johnathan Pearce wrote at the latter blog:

Like Caplan, I am not entirely sure that moral relativism captures the full nature of what went wrong in terms of the 20th Century, although I think Johnson does capture quite a lot of the problem with that concept. For me, the ultimate disaster of that century was the idea of the omniscient State and of the associated idea that governments, run by all-knowing officials, could solve many of the real or supposed problems of the age. The 20th Century was not unique in witnessing the growth of government, but it was an age when government had, like never before, the technology at its disposal to be immensely powerful, probably more so than at any time since the Romans (and even the writ of Rome had its limits). We are still, alas, in the grip of that delusion that government can and should fix problems, although there is perhaps, hopefully, a bit more cynicism about it than say, during the late 1940s when the likes of Attlee were in Downing Street.

Johnson is right, however, to point out that in a world where there is no stated respect for the idea of impartial rules and law, no respect for reason and for the idea of objective truth – or at least that it is noble to pursue truth – that terrible consequences follow; every irrationality, might-is-right worldview, will fill the vacuum. However, unlike Johnson, I do not think that morality requires the anchor of belief in a Supreme Being, and he tends to make the mistake, like a lot of devoutly religious folk, of assuming that atheists, for example, cannot arrive at a moral code, which seems to rather overlook the role of people such as Aristotle, who had a huge impact on views about ethics, and from whom other religions have borrowed (think of the Thomist tradition in Catholic thought, for instance).

I think he’s right. Part of the problem is that “moral relativity” and moral relativism sound at first glance like a swingin’ night on the town in Manhattan during the Beame era – that squalid perigee of the 1970s when the city birthed Death Wish, Taxi Driver, and, heck, Saturday Night Fever, a hopelessly nihilistic period that, ironically, a surprising number of liberal New Yorkers bored with Mayor Bloomberg’s current great clean-up of the human soul would be happy to return to.

But at the risk of going to the well once too often, I’d say the real cause of the woes of the 20th century was this:


Whenever a revolutionary movement took shape, it effectively banished the past. But it wasn’t just history that vanished – Nietzsche killed God, and millennia of Judeo-Christian religion. Marx paved the way for systems of government where freedom of choice and economic knowledge accumulated over centuries of trial and error could be junked for a top-down centrally-planned command and control economy.

Progressives began to argue that man himself could be reengineered – as Tom Courtenay’s Pasha/Strelnikov character says to Julie Christie’s Lara near the start of David Lean’s version of Dr. Zhivago shortly before Hell descends, “It’s the system, Lara. People will be different after the Revolution.” And if they weren’t, they could be engineered to be different. H.G. Wells and other late 19th and early 20th century “progressives” believed this concept implicitly, Fred Siegel wrote in a 2009 article on Wells in City Journal magazine:

In A Modern Utopia, written in 1905, Wells updated John Stuart Mill’s culturally individualist liberalism in light of the horizons opened by Darwin and Francis Galton, the founder of eugenics. Biologically, argues the book’s narrator, the “species is the accumulation of the experiments of all its successful individuals since the beginning.” That means, he says, that the “people of exceptional quality must be ascendant.” Further, “the better sort of people, so far as they can be distinguished, must have the fullest freedom of public service.”

What provides the possibility for such freedom is eugenics. Wells has no use for the iron laws of Marxism, but he replaces them with the iron laws of Malthus and Darwin. “From the view of human comfort and happiness, the increase of population that occurs at each advance in human security is the greatest evil of life,” he writes. “The extravagant swarm of new births” that created the masses was “the essential disaster of the 19th century.” Man’s propensity to reproduce will always outstrip his productive capacity, even in an age of machinery. Worse, the “base and servile types,” who are little more than the “leaping, glittering confusion of shoaling mackerel on a sunlit afternoon,” are the most fecund.

In Anticipations, Wells had already argued horrifyingly that the “nation that most resolutely picks over, educates, sterilizes, or poisons its People of the Abyss” would be ascendant. For the base and servile types, death would mean merely “the end of the bitterness of failure.” It was “their portion to die out and disappear.” The New Republicans would have “little pity and less benevolence” for the untermenschen, “born of unrestrained lusts . . . and multiplying through sheer incontinence and stupidity.”

In A Modern Utopia, Wells, stung by criticism of Anticipations, backed off, but only partway. “Idiots,” “drunkards,” “criminals,” “lunatics,” “congenital invalids,” and the “diseased” would “spoil the world for others,” Wells again argued. But their depredations required “social surgery,” not total extermination. That meant preventing people below a set income and intelligence from reproducing, as well as isolating the “failures” on an island so that better folk could live unfettered by government intrusion. Remove the unfit, and there will be no need for jails or prisons, which are places “of torture by restraint.” Illiberalism enables liberalism.

In practice, the notion that groups of men deemed “inferior” could be eradicated did not begin with, nor was it exclusive to, the Nazis. Stalin used famine as a weapon to reorder early Soviet society; the German obsession with eugenics preceded the Nazis by decades. It was certainly very much in the intellectual atmosphere of the Weimar Republic in the 1920s while the Nazis gathered strength and plotted their own version of Hell.


From Bauhaus to Ed’s House

June 24th, 2012 - 11:21 pm

Mies van der Rohe’s Barcelona Pavilion, May 2000

(We take a break from the usual day-to-day political and media bias stuff for a long, rambling discussion on modern architecture and aesthetics written in the first-person voice. As with our earlier explorations of the topic, we’ll understand if you bail on this one. And yes, that’s my last use of the royal we. At least for this post.)

I’m not sure what initially attracted me to the aesthetics of modernism. I do remember studying Art of Western Civilization in college, which, as with Western Civilization itself, largely concluded with the arrival of the 20th century. But modern art fascinated me — unlike traditional aesthetics, cracking modernism, whether it was architecture or artists such as Mondrian, was a bit like deciphering a puzzle box. Of course, that complexity was considered a feature, not a bug, by the men who founded the movement. Reviewing C.P. Snow’s 1959 book, The Two Cultures and the Scientific Revolution, Orrin Judd of The Brothers Judd book review site and blog wrote:

As Snow notes, as late as say the 1850s, any reasonably well-educated, well-read, inquisitive man could speak knowledgeably about both science and the arts.  Man knew little enough that it was still possible for one to know nearly everything that was known and to have been exposed to all the religion, art, history–culture in general–that mattered.  But then with the pure science revolution of which Snow spoke–in biology and chemistry, but most of all in physics–suddenly a great deal of specialized training and education was necessary before one could be knowledgeable in each field.  Like priests of some ancient cult, scientists were separated out from the mass of men, elevated above them by their access to secret knowledge.  Even more annoying was the fact that even though they had moved beyond what the rest of us could readily understand, they could still listen to Bach or read Shakespeare and discuss it intelligently.  The reaction of their peers in the arts, or those who had been their peers, was to make their own fields of expertise as obscure as possible.  If Picasso couldn’t understand particle physics, he sure as hell wasn’t going to paint anything comprehensible, and if Joyce couldn’t pick up a scientific journal and read it, then no one was going to be able to read his books either.  And so grew the two cultures, the one real, the other manufactured, but both with elaborate and often counterintuitive theories, requiring years of study.

Or at the very least, a crash course for an enthusiastic autodidact to pick up the basics. I began by taking out books on modern art and New York’s Museum of Modern Art from my college library and my local public library. Eventually, I came across Henry-Russell Hitchcock and Philip Johnson’s early-1930s book The International Style, which put modernism on the map in America, and Peter Blake’s mid-‘60s book The Master Builders: Le Corbusier, Mies van der Rohe, and Frank Lloyd Wright, both of which have been perennially in print and are still available from the gift shop at NY MoMA. And given that I had loved The Right Stuff, The Purple Decade and The Bonfire of the Vanities, I also read Tom Wolfe’s From Bauhaus to Our House.

Oddly enough, reading From Bauhaus to Our House, I found myself loving the satire, but also strangely fascinated by the images, in spite of Wolfe’s best efforts to take the mickey out of them. Reading Blake’s Master Builders and other books on modern architecture, I initially admired Corbusier’s works, particularly his pre-WWII buildings, but found myself increasingly put off by his postwar efforts, which replaced the white stucco of the homes he designed for his earliest, wealthiest patrons with massive forms built largely out of raw concrete. Corbu’s postwar style was dubbed Béton Brut, or the New Brutalism, and brutal it was indeed. (Even Blake, the former editor in chief of Architectural Forum magazine, would have second thoughts.)

Georg Kolbe’s statue, “Dawn,” in the Pavilion.

But Mies van der Rohe had worked out an architectural language that was logical (or at least seemed logical), and at its best a sort of industrial poetry. It was also the vocabulary of postwar American cities. As Wolfe wrote in From Bauhaus to Our House, Mies, the Bauhaus’s last director, and Walter Gropius, its founder, both settled in America after fleeing the Nazis in the 1930s, and both were welcomed by academia, as Wolfe famously wrote, as…The White Gods!

Gropius had the healthy self-esteem of any ambitious man, but he was a gentleman above all else, a gentleman of the old school, a man who was always concerned about a sense of proportion, in life as well as in design. As a refugee from a blighted land, he would have been content with a friendly welcome, a place to lay his head, two or three meals a day until he could get on his own feet, a smile every once in a while, and a chance to work, if anybody needed him. And instead—

The reception of Gropius and his confreres was like a certain stock scene from the jungle movies of that period. Bruce Cabot and Myrna Loy make a crash landing in the jungle and crawl out of the wreckage in their Abercrombie & Fitch white safari blouses and tan gabardine jodhpurs and stagger into a clearing. They are surrounded by savages with bones through their noses—who immediately bow down and prostrate themselves and commence a strange moaning chant.

The White Gods!

Come from the skies at last!

Mies in particular created a sort of systems-based design philosophy, which he taught to his students at the Illinois Institute of Technology, essentially his private educational fiefdom in the 1940s and ‘50s. By the 1960s, it became common to say that Mies’s architecture was the easiest architectural language to teach, as Blake himself writes in The Master Builders. But as Chicago-area architectural historian Franz Schulze, Mies’s best biographer, would write in 1985, “Indeed it was not at all, and may have been among the least teachable. The acres of stillborn design in the Miesian manner that transformed the American cityscape in the 1950s and 1960s are a palpable indication of this.”


Cultural Question Answered

June 19th, 2012 - 2:27 pm

Now is the time when we juxtapose, Small Dead Animals-style:

– Headline, Houston Culture Map, January 27, 2011.

– Headline, My Fox Houston, yesterday.

What’s the problem? I thought major museums considered graffiti to be “aerosol art” — though they tend to act rather “unexpectedly” indignant once their own property is threatened.

A Century of Anti-Individualism

June 17th, 2012 - 4:02 pm

Do a search on the word “individual” in the Kindle edition of Jonah Goldberg’s 2008 book Liberal Fascism, and you’ll quickly find a slew of quotes from early 20th century “progressives” who thought that the idea of the sovereign individual was just so much primitive bunkum:

[Herbert Croly, the founder of the New Republic magazine] was an unabashed nationalist who craved a “national reformer…in the guise of St. Michael, armed with a flaming sword and winged for flight,” to redeem a decadent America. This secular “imitator of Christ” would bring an end to “devil-take-the-hindmost” individualism in precisely the same manner that the real Jesus closed the Old Testament chapter of human history. “An individual,” Croly wrote, sounding very much like Wilson, “has no meaning apart from the society in which his individuality has been formed.” Echoing both Wilson and Theodore Roosevelt, Croly argued that “national life” should be like a “school,” and good schooling frequently demands “severe coercive measures.”

* * * * * *

Croly constructed this worldview out of what he deemed vital necessity. Industrialization, economic upheaval, social “disintegration,” materialistic decadence, and worship of money were tearing America apart, or so he—and the vast majority of progressives—believed. The remedy for the “chaotic individualism of our political and economic organization” was a “regeneration” led by a hero-saint who could overthrow the tired doctrines of liberal democracy in favor of a restored and heroic nation. The similarities with conventional fascist theory should be obvious.

* * * * * *

We should not forget how the demands of war fed the arguments for socialism. [John] Dewey was giddy that the war might force Americans “to give up much of our economic freedom…We shall have to lay by our good-natured individualism and march in step.” If the war went well, it would constrain “the individualistic tradition” and convince Americans of “the supremacy of public need over private possessions.” Another progressive put it more succinctly: “Laissez-faire is dead. Long live social control.”

* * * * * *

[Walter] Lippmann, as he argued later, believed that most citizens were “mentally children or barbarians” and therefore needed to be directed by experts like himself. Individual liberty, while nice, needed to be subordinated to, among other things, “order.”

* * * * * *

For the most part, the progressives looked upon what they had created and said, “This is good.” The “great European war…is striking down individualism and building up collectivism,” rejoiced the Progressive financier and J. P. Morgan partner George Perkins. Grosvenor Clarkson saw things similarly. The [World War I] war effort “is a story of the conversion of a hundred million combatively individualistic people into a vast cooperative effort in which the good of the unit was sacrificed to the good of the whole.” The regimentation of society, the social worker Felix Adler believed, was bringing us closer to creating the “perfect man…a fairer and more beautiful and more righteous type than any…that has yet existed.” The Washington Post was more modest. “In spite of excesses such as lynching,” it editorialized, “it is a healthful and wholesome awakening in the interior of the country.”

And so on. Occasionally, these quotes would rebound rather ironically upon the utterer. In the early years of the 1920s, Mies van der Rohe, the pioneering modernist architect and last director of the Bauhaus, the Weimar-era German school for modern artists, would write, “The individual is losing significance; his destiny is no longer what interests us.” This in the midst of earning a living designing houses for wealthy individuals, and only a decade before the Nazis, who disliked individualism far more than Mies did, came to power, shuttered the Bauhaus, and eventually forced Mies to flee Germany to teach and practice his profession in Chicago.

A century that revolved around the horrors of collectivism in all its forms hasn’t stopped similar talk today. Scientific American reviews a new book titled The Self Illusion: How the Social Brain Creates Identity, and queries:

Although Hood believes the self may be the greatest trick our brain has ever played on us, he concludes that believing in it makes life more fulfilling. The illusion is difficult–if not impossible–to dispel. Even if we could, why deny an experience that enables empathy, storytelling and love?

“To justify fascism,” Smitty, Stacy McCain’s co-blogger, tersely responds:

If you want to factor out any theistic concept of a soul or notion of free will in one fell swoop, this sort of materialistic reduction is the way to go.

And it’s cool, too: once we’ve got life reduced to measurable bits of matter, and have nuked the idea of a ‘self’, we can set about the elimination of the individual and manage society through a series of spreadsheets. The molecules made us do it–how could there be a Devil?

This is not an evangelical pleading, though. My secular answer to this discussion is that the self, and freewill, have got to be taken as an assumption. That is, barring clear genetic-level defects like Downs, free moral agency has got to be the default position for the individual. Otherwise, we remain a societal collection of infants, forced into heroin addiction and inter-species romance because we’re, you know, victims.

Or to use the preferred word of Mayor Bloomberg, Jerry Brown and Barack Obama — constituents.

Our latest Silicon Graffiti video was inspired by one of the key themes in the late Allan Bloom’s 1987 book, The Closing of the American Mind. Bloom wrote that by the middle of the 20th century, American universities had essentially become enclaves of German philosophy. As a result, “the new American life-style has become a Disneyland version of the Weimar Republic for the whole family,” according to Bloom. Last year in the New York Times, Thomas Friedman famously asked, ‘Can Greeks Become Germans?’

Why not? If we could, any nation can. This video looks at how and why that happened, and the results — or at least scratches the surface of those concepts, inasmuch as any six-minute video can.

And when you’re done watching, check out David P. Goldman at his “Spengler” column (and that nom de blog dovetails remarkably well with our theme, doesn’t it?) on “Philistinism and Failure,” and follow David’s link to Fred Siegel’s brilliant article in the April issue of Commentary, “How Highbrows Killed Culture,” for much more on this theme.

A handy, portable, easily embeddable YouTube format of the video is available here. And click here for three years’ worth of earlier editions of Silicon Graffiti. The script of this week’s show, with plenty of hyperlinks to the books and blog posts that inspired it, follows on the next page.


In Andrew Klavan’s latest post, on “The Left’s Con Man Logic,” one of the cons he notes is the left’s selective utterance of the phrase, “You can’t put the toothpaste back in the tube.” As Andrew writes:

The Wall Street Journal this weekend had two writers of opposing opinions address the question: Has the sexual revolution been good for women? The feminist who answered yes began her argument with this masterpiece of disingenuousness: “Here’s the thing about revolutions — you can’t take them back….If you feel that the sexual revolution destroyed the American family by giving women power over their reproductive choices, and that power turned daughters and wives… into a bunch of wanton hussies, well, stew over your feelings all you want, but you might as well give up thinking that it is possible to herd us up and drive us back into the kitchen….”

Do lefties really fall for garbage like that? Why? Everything about that argument is meant to make you stop thinking. I need hardly point out that the relative chastity of the Victorian era in Britain followed the relative promiscuity of the Restoration period and was in turn followed by the roaring twenties which were followed by the fifties — so that, while, yes, there’s no going back, one can always go forward in a new direction. Nor need I point out that some of us who feel the Sexual Revolution hurt women may have our fellow creatures’ good at heart. The only thing you really need to know is that the writer is trying to obscure, not illuminate, the situation. That alone should make you start asking questions.

Like this one: Are you stupid…  or what?

Beyond the example of promiscuity waxing and waning that Andrew mentions, hasn’t the entire mission of the Left been one attempt after another to either “take back” a revolution, or put the toothpaste of civilization back into the tube — or both? In 1882, Friedrich Nietzsche declared that God is dead — and Time magazine would take its own whack at Him 84 years later, just for good measure. During the same period that Nietzsche was upending religion, Karl Marx rifled through the millennia of experimentation and accumulated wisdom that made up commerce and the marketplace, called it “capitalism,” and declared it similarly dead. (It certainly would be, wherever Marx’s ideas were implemented.) In the 1920s, the Bauhaus in Germany and Le Corbusier in France decided that millennia of accumulated wisdom in architecture could be swept aside to “Start From Zero” — and Corbusier believed Paris as a whole could be swept aside to Start From Zero; a decade later, Albert Speer and his chief patron entertained similar notions about Berlin. Likewise, in the 1950s, American urban planners, explicitly following Corbusier’s lead, would bulldoze whole neighborhoods in the name of “urban renewal,” which proved ultimately disastrous. In the 1930s, FDR and the New Dealers thought that the American Revolution, which gave birth to the most laissez-faire federal government ever known to man, could be yoked under an endless alphabet soup of agencies and stifling regulations. Martin Luther King’s Civil Rights Revolution of the 1960s, which sought to judge a man by the content of his character rather than the color of his skin, has been upended by the left into tribalization based on skin color, and in academia, a de facto return to Separate But Equal.

What is environmentalism (which Andrew addresses earlier in his post) but staring down the freedom that the Industrial Revolution brought to the American middle class, including comfortable homes in suburbia, electric light, cars and planes to transport them everywhere, and endless information and entertainment at the press of a button, and taking it all back in the form of higher energy prices, even more regulation, less reliable energy generation systems, and the overall ennui that the true believers of global warming want to foist upon all of us?

No wonder the MSM and the left (but I repeat myself) wadded their panties into such a tight bunch when the Tea Party emerged — they know better than anyone that, while it’s not easy, it is entirely possible to reshape society, and how fragile their own hold on power could ultimately be.

Related: At the Tatler, Robert Wargas spots a writer at the far-left New York Review of Books railing against the failures of the American education system in a similar — if screedier — fashion to Woody Allen shouting about New York City’s downhill slide in the ’60s and ’70s, without stopping to consider that in both cases, it’s his ideological brethren who control the terrain. As David Solway adds, “The decline of education, which means also the fading out of historical memory and the dimming of literate curiosity, has been the case for some considerable time now. The insistent question is: how does one go about trying to rescue a culture in the throes of custodial dissolution?”

Update: Related thoughts from Kathy Shaidle.

More: From the comments, “As people living in today’s progressive utopias like Cuba or North Korea might ask, ‘What’s toothpaste?’”

(Thumbnail on PJM homepage based on a modified Shutterstock.com image.)

Break Out the Billy Beer, Boys

March 18th, 2012 - 9:11 pm

Have you heard the news? There’s good rockin’ at midnight — of civilization, James Lileks writes:

Levitated Mass makes it to LA. I wrote about this idiocy for the National Review, one of them-there philistine-type arkticles what don’t understand the subtel-tees of modern art. Philistone would be more like it, perhaps. Hah! That’s a joke I said that’s a joke son. The rock in question is a 340-ton boulder dragged from the desert to a museum installation, where it will rest over a deep concrete-lined trench. I am unimpressed by the idea of putting a massive stone over a trench. Logistically, it’s fascinating; getting the rock from its natural habitat to the installation required a huge vee-hicle with 900 tires, or something, and it took forever, since the rock weighed slightly more than the pretense of the entire conception, and the truck only moved five miles an hour.

The WSJ had a piece about its arrival in LA. 200,000 people supposedly showed up to watch it pass. I don’t know how many came to see the immense truck, or how many came to see the Big Rock; if more came for the latter I’d be depressed. It’s just a rock. It’s a large rock, but . . . it’s a large rock. The usual explainers told us that it summed up the rich long history of Monument Moving, and while I suppose that’s true – Easter Island with its attendant ecological despoliation comes to mind – it also reminds us that this “monument” is not only unfinished, it has no intention of being finished. That would ruin the essence of the rock, I guess.

Time was, a sculptor looked at a big slab of stone and saw the figure within that he would liberate with hammer and chisel; time was, people gathered to see a monolith pass because it was a gift from Egypt, and stood for the power of another culture your culture had managed to subdue. Plus, it was cool; it was exotic. Time was, you valued something for what you could make of it, not the fact that you could just drag it somewhere else and say “now walk under it, and think things about big rocks.” Feh.

Just perfect. Not only has Jimmy Carter returned in the form of Barack Obama, but super-sized Pet Rocks now adorn museums. At least the seventies had Star Wars and Led Zeppelin to salvage the decade.