In his introduction to The Return of the Primitive: The Anti-Industrial Revolution, the 1999 update of Ayn Rand’s early 1970s anthology originally entitled The New Left, Peter Schwartz, the editor of the new edition, wrote:
Primitive, according to the Oxford English Dictionary, means: “Of or belonging to the first age, period or stage; pertaining to early times …” With respect to human development, primitivism is a pre-rational stage. It is a stage in which man lives in fearful awe of a universe he cannot understand. The primitive man does not grasp the law of causality. He does not comprehend the fact that the world is governed by natural laws and that nature can be ruled by any man who discovers those laws. To a primitive, there is only a mysterious supernatural. Sunshine, darkness, rainfall, drought, the clap of thunder, the hooting of a spotted owl— all are inexplicable, portentous, and sacrosanct to him. To this non-conceptual mentality, man is metaphysically subordinate to nature, which is never to be commanded, only meekly obeyed.
This is the state of mind to which the environmentalists want us to revert.
If primitive man regards the world as unknowable, how does he decide what to believe and how to act? Since such knowledge is not innate, where does primitive man turn for guidance? To his tribe. It is membership in a collective that infuses such a person with his sole sense of identity. The tribe’s edicts thus become his unquestioned absolutes, and the tribe’s welfare becomes his fundamental value.
This is the state of mind to which the multiculturalists want us to revert. They hold that the basic unit of existence is the tribe, which they define by the crudest, most primitive, most anti-conceptual criteria (such as skin color). They consequently reject the view that the achievements of Western— i.e., individualistic— civilization represent a way of life superior to that of savage tribalism.
Both environmentalism and multiculturalism wish to destroy the values of a rational, industrial age. Both are scions of the New Left, zealously carrying on its campaign of sacrificing progress to primitivism.
In addition to the shocking Islamic terrorist attack yesterday in London, a troika of pop culture-related stories making the rounds today reminds us that reprimitivization is well on its way.
First up, “Movement to Normalize Pedophilia Finds Its Poster Girl,” Stacy McCain writes in the American Spectator:
In January, Rush Limbaugh warned that there was “an effort under way to normalize pedophilia,” and was ridiculed by liberals (including CNN’s Soledad O’Brien) for saying so. But now liberals have joined a crusade that, if successful, would effectively legalize sex with 14-year-olds in Florida.
The case involves Kaitlyn Ashley Hunt, an 18-year-old in Sebastian, Florida, who was arrested in February after admitting that she had a lesbian affair with a 14-year-old high-school freshman. (Click here to read the affidavit in Hunt’s arrest.) It is a felony in Florida to have sex with 14-year-olds. Hunt was expelled from Sebastian High School — where she and the younger girl had sex in a restroom stall — and charged with two counts of “felony lewd and lascivious battery on a child.” The charges could put Hunt in prison for up to 15 years. Prosecutors have offered Hunt a plea bargain that would spare her jail time, but her supporters have organized an online crusade to have her let off scot-free — in effect, nullifying Florida’s law, which sets the age of consent at 16.
Using the slogan “Stop the Hate, Free Kate” (the Twitter hashtag is #FreeKate), this social-media campaign has attracted the support of liberals including Chris Hayes of MSNBC, Daily Kos, Think Progress and the gay-rights group Equality Florida. Undoubtedly, part of the appeal of the case is that Hunt is a petite, attractive, green-eyed blonde. One critic wondered on Twitter how long activists have “been waiting for a properly photogenic poster child of the correct gender to come along?”
Portraying Hunt as the victim of prejudice, her supporters claim she was only prosecuted because she is homosexual and because the parents of the unnamed 14-year-old are “bigoted religious zealots,” as Hunt’s mother said in a poorly written Facebook post. The apparent public-relations strategy was described by Matthew Philbin of Newsbusters: “If you can play the gay card, you immediately trigger knee-jerk support from the liberal media and homosexual activists anxious to topple any and all rules regarding sex.”
Meanwhile, giant cable television conglomerate Viacom must be especially proud of MTV today: “Trashy Former Pop Star Drinks Her Own Urine on MTV in Ratings Stunt,” Ace writes:
If you had questions about whether Ke$ha was a classy lady– questions that really ought not to persist, given that she really spells her name that way, “Ke$ha” — consider them now resolved.
Some are using this provocation as a justification for renewing the calls for a-la-carte cable subscriptions. “Some” are, in this case, correct.
Anyone who now has cable pays for MTV. Cable companies negotiate a flat payment to a station for carrying it. MTV also collects revenues from advertising, but a major source of its revenue is the automatic “tax” MTV imposes on your cable bill every month. You have no way to avoid paying for MTV– except for cancelling the service altogether.
Monopolies are generally not permitted to “bundle” services together. And local cable companies are usually monopolies, or, at best, have but one competitor– and as all of them have instituted this bundling practice and will not stop the practice no matter how much the public clamors for it, the monopolies (or duopolies) at least appear to be in collusion on this point.
And finally, while Robert Redford’s boyish shock of tousled hair and studio system hauteur hides a multitude of sins, his own primitivist mindset is lurking just under the surface, easily found:
Robert Redford today accused the US of losing its way in the years since the second world war, speaking at the press conference for his new film All Is Lost at the Cannes film festival.
“Certain things have got lost,” said Redford. “Our belief system had holes punched in it by scandals that occurred, whether it was Watergate, the quiz show scandal, or Iran-Contra; it’s still going on…Beneath all the propaganda is a big grey area, another America that doesn’t get any attention; I decided to make that the subject of my films.”
Redford, now 76, also had critical words for the US’s never-ending drive for economic and technological development, which he considers has been a damaging force.
“We are in a dire situation; the planet is speaking with a very loud voice. In the US we call it Manifest Destiny, where we keep pushing and developing, never mind what you destroy in your wake, whether it’s Native American culture or the natural environment.
“I’ve also seen the relentless pace of technological increase. It’s getting faster and faster; and it fascinates me to ask: how long will it go on before it burns out?”
The Khmer Rouge sought to start over at year zero, and to sort of create the kind of society that very civilized, humane greens write about as though it were an ideal. I mean, people who would never consider genocide*. But I argue that if you want to know what that would take, look at Cambodia: to empty the cities and turn everyone into peasants again. Even in a less developed country, let alone in someplace like the United States, these sorts of static utopian fantasies are just that.
Incidentally, that fawning profile of Redford appeared (but of course!) in the UK Guardian under the headline, “Robert Redford on America: ‘Certain things have got lost.’” Well, that can happen when elderly Hollywood multimillionaires make films condoning terrorism, which are in turn approved by a former presidential aide, on the morning show that’s aired nationwide on a TV network owned by the Disney Corporation.
In his 2006 book Our Culture, What’s Left Of It, Theodore Dalrymple wrote:
Having spent a considerable proportion of my professional career in Third World countries in which the implementation of abstract ideas and ideals has made bad situations incomparably worse, and the rest of my career among the very extensive British underclass, whose disastrous notions about how to live derive ultimately from the unrealistic, self-indulgent, and often fatuous ideas of social critics, I have come to regard intellectual and artistic life as being of incalculable practical importance and effect. John Maynard Keynes wrote, in a famous passage in The Economic Consequences of the Peace, that practical men might not have much time for theoretical considerations, but in fact the world is governed by little else than the outdated or defunct ideas of economists and social philosophers. I agree: except that I would now add novelists, playwrights, film directors, journalists, artists, and even pop singers. They are the unacknowledged legislators of the world, and we ought to pay close attention to what they say and how they say it.
Especially when the first thought is to turn away from the daily horrors our pop culture seems to bring forth in ever-greater numbers.
“Roger Ebert dies at 70 after battle with cancer,” reports the Chicago Sun-Times, the paper where he made his home for three decades:
For a film with a daring director, a talented cast, a captivating plot or, ideally, all three, there could be no better advocate than Roger Ebert, who passionately celebrated and promoted excellence in film while deflating the awful, the derivative, or the merely mediocre with an observant eye, a sharp wit and a depth of knowledge that delighted his millions of readers and viewers.
“No good film is too long,” he once wrote, a sentiment he felt strongly enough about to have engraved on pens. “No bad movie is short enough.”
Ebert, 70, who reviewed movies for the Chicago Sun-Times for 46 years and on TV for 31 years, and who was without question the nation’s most prominent and influential film critic, died Thursday in Chicago. He had been in poor health over the past decade, battling cancers of the thyroid and salivary gland.
He lost part of his lower jaw in 2006, and with it the ability to speak or eat, a calamity that would have driven other men from the public eye. But Ebert refused to hide, instead forging what became a new chapter in his career, an extraordinary chronicle of his devastating illness that won him a new generation of admirers. “No point in denying it,” he wrote, analyzing his medical struggles with characteristic courage, candor and wit, a view that was never tinged with bitterness or self-pity.
Always technically savvy — he was an early investor in Google — Ebert let the Internet be his voice. His rogerebert.com had millions of fans, and he received a special achievement award as the 2010 “Person of the Year” from the Webby Awards, which noted that “his online journal has raised the bar for the level of poignancy, thoughtfulness and critique one can achieve on the Web.” His Twitter feeds had 827,000 followers.
Unfortunately, Twitter revealed the intense far-left biases and raging misanthropy inside Ebert, which did much to tarnish the family-friendly middlebrow tone of his earlier movie criticism. Ebert’s embrace of the unfiltered medium erased much of the goodwill he had developed through his years of co-hosting his weekly TV series At the Movies with Gene Siskel, his fellow Chicago-based critic, who had himself passed away in 1999.
Ironically, both men warned of the dangers of political correctness in the early 1990s:
GENE SISKEL: You have to summon up the courage to say what you honestly feel. And it’s not easy. There’s a whole new world called political correctness that’s going on, and that is death to a critic to participate in that.
EBERT: Political correctness is the fascism of the ‘90s. It’s kind of this rigid feeling that you have to keep your ideas and your ways of looking at things within very narrow boundaries, or you’ll offend someone. Certainly one of the purposes of journalism is to challenge just that kind of thinking. And certainly one of the purposes of criticism is to break boundaries; it’s also one of the purposes of art. So that if a young journalist, 18, 19, 20, 21, an undergraduate tries to write politically correctly, what they’re really doing is ventriloquism.
Ironically, I suspect the Ebert who will be remembered by posterity is the earlier one, before he allowed his opinions to be consumed by what he had correctly dubbed “the fascism of the ’90s,” a fascism that extended well beyond that decade.
(Clicking on the Drudge Report, where I first saw news of Ebert’s death, I also hope the horrific photo of Ebert after his cancer surgery, with much of his jaw removed, will somehow be taken out of circulation. But alas, our less-than-middlebrow culture won’t allow that to happen.)
Update: At the Breitbart.com Conversation, John Sexton quotes this beautiful passage from Ebert, recorded for the commentary on the DVD of Dark City (the thinking man’s Matrix) before PC consumed Ebert’s journalism:
More: Before Ebert’s middlebrow movie critic phase and his final days as an archliberal polemicist, he was a screenwriter for Russ Meyer’s late ’60s and early ’70s sexploitation movies, including Beyond the Valley of the Dolls. Ebert wrote the camp classic line, “This is my happening and it freaks me out!”, which would be spoofed by Mike Myers in the first Austin Powers movie — a homage Ebert himself mentioned in his review.
Kathy Shaidle has that phase of Ebert’s career covered, in a post with quotes and videos. Plus a great catch: a remarkably thoughtless gaffe by the Chicago Sun-Times in Ebert’s obit.
I was a slow convert to the idea of ebooks. My wife bought one of the first Kindles, and I couldn’t get past the off-putting appearance of the text on the screen in the Kindle’s first iteration. But then I tried the Kindle app for Windows, and the Kindle app for my Android tablet, and slowly began to fall in love. I could read anywhere. I could free up space on my overflowing and limited physical bookshelves. I could easily quote what I had just read in a blog post. The idea of being able to carry my entire library with me and having it accessible in locations as diverse as the treadmill at the gym or a seat on an airplane became increasingly irresistible.
But not my entire library, alas. There are numerous examples of books that I’d repurchase in a second to read on my Kindle that simply aren’t there yet. Nor are they available on Barnes & Noble’s Nook e-reader; I’ve searched.
Off the top of my head, in an ideal world here’s what I’d like to see in the Kindle format. Amazon links are included, if you’d like to get started reading any of these titles now in good ol’ dead tree format — which might be a good idea, as I suspect the wait for some of these might be glacial.
■ Alvin Toffler’s Back Catalog: Toffler’s Future Shock was a huge bestseller when it was first published in 1970. A decade later, The Third Wave, the sequel to Future Shock, would be name-checked by Newt Gingrich during the heady days of the “Republican Revolution” in 1995, shortly after he became speaker of the House, which gives a sense of how the book’s predictions held up in the interim 15 years. Toffler’s War and Anti-War applied the principles of the Third Wave to warfare; Powershift applied them to business. Given that The Third Wave was a pretty accurate prediction of how the Internet reshaped society in the 1990s, if any book deserves to be available in electronic format, it’s this one. Where is it? (For my interviews with Toffler, click here and here.)
■ Profiles of the Future, by Arthur C. Clarke: A quarter century before Star Trek: The Next Generation displayed its first replicator onscreen, Clarke was writing about them in Profiles, along with plenty of other futuristic technology; some we now take for granted (such as the Internet and the Kindle), and others that are still on the drawing board. Again, why isn’t such a forward-thinking book available as an ebook as well?
■ Filmguide to 2001: A Space Odyssey, by Carolyn Geduld. When Stanley Kubrick’s enigmatic 2001: A Space Odyssey left so many audiences baffled in the late 1960s, co-screenwriter Arthur C. Clarke was fond of saying, “Read the book, see the movie, repeat the dosage.” Right idea, but while Clarke’s novelization of 2001 is available on Kindle, it’s not necessarily the best book for cracking the film’s mysteries. If I had to hand one baffled 2001 viewer the Cliff’s Notes to the movie, it would be Geduld’s book from 1973, which thoroughly charts the film’s plot and leitmotifs.
The flat-panel news and information devices the astronauts read while eating dinner in 2001 directly inspired the iPad and Kindle. Now that technology has finally caught up with Kubrick’s 1968 vision, shouldn’t the book that places those devices into context be accessible on them as well?
■ The Death of the Grown-Up, by Diana West. The subhead of West’s book is “How America’s Arrested Development Is Bringing Down Western Civilization.” As Michelle Malkin noted in 2007 when she interviewed West about her book, others have written about the increasing child-like naiveté of society, but West was perhaps the first to explain how it has hamstrung our fight in what was once called the Global War on Terror. That we had (have?) a war named after a tactic rather than the enemy we’re fighting is due to the GWOT receiving its name largely through a process of elimination; as West noted in her book and the articles that preceded it, political correctness allows few other choices.
For years, Walter Carter was the in-house historian at Gibson Guitars before serving a similar function for well-known vintage guitar dealer George Gruhn. He has a new book out this month from Backbeat Books, called The Epiphone Guitar Book: A Complete History of Epiphone Guitars. Its slick, glossy 160 pages are heavily illustrated, with many photos in color.
With a legacy dating back to the 1870s and Greek luthier Anastasios Stathopoulos, the Epiphone brand takes its name from two components — the nickname of Anastasios’ son, Epaminondas, and the word “phone,” which, in the 1920s when the brand was launched, competed with the word “radio” to symbolize high tech and modernity. (See also: Gramophone, the Radio Flyer, etc.)
Epiphone has had several twists and turns in its history. Until the mid-1950s, it competed neck and neck (pardon the pun) with Gibson for sales of arch-top jazz guitars. Ted McCarty, who built up Gibson as a music instrument powerhouse in the mid-20th century, said that “when I came to Gibson, the biggest competition we had was Epiphone.” But the death of Epi in 1943, followed by squabbles among the surviving Stathopoulos family during the following decade, caused the value of their business to plummet. McCarty acquired Epiphone for Gibson’s parent company at a bargain rate, and production of Epiphone guitars switched in-house to Gibson’s Kalamazoo, MI plant during the 1960s. The new brand name gave Gibson certain advantages: they could protect the exclusive arrangements their dealers had with Gibson, but sell Epiphone to nearby music dealers, positioning it as a slightly lower brand — the Buick or Oldsmobile to Gibson’s Cadillac.
In the mid-1960s, Epiphone models were played by a little-known cult act called the Beatles — “Everybody but Ringo,” as Carter told me. McCartney played an Epiphone Texan acoustic on “Yesterday,” George Harrison played his Epiphone Casino on Sgt. Pepper, and John Lennon played his own Casino on the rooftop of Apple Records during their legendary last concert at the conclusion of Let It Be.
In the early 1970s, Gibson sent production of Epiphone guitars overseas. Today, the marque exists, in part, as an entry-level brand for new guitarists (and as such, there are likely more Epiphones in circulation than Gibsons), and there’s some rivalry between those who own traditional made-in-America Gibson guitars such as the Les Paul and those who own Les Pauls and other models sold under the Epiphone name.
Carter discusses all that and much more in our 21-minute interview. Click here to listen:
If the above Flash audio player is not compatible with your browser, click on the YouTube player below, or click here to be taken directly to YouTube, for an audio-only YouTube clip. With one of those versions, you should find a format that plays on your system.
Amazon’s cloud music service is now available on Roku and Samsung Smart TVs, offering the ability to stream your own digital music tracks without needing to keep a separate computer running. For Roku, it’s a solid response to Apple’s iTunes Match service, which offers cloud storage and streaming for $25 per year.
While Amazon Cloud Player started off as a largely free service, it now requires a fee similar to iTunes Match’s: $25 per year for up to 250,000 uploaded songs. That’s a ton of digital music, although the competing Google Play Music allows you to store up to 20,000 tracks for free and is available on Google TV devices.
The release comes on the same day Amazon added an Amazon Instant Video app to the iPhone and iPod Touch as well.
For our original review of the Roku box from January, click here.
Television’s Mad Men would have you believe that America was a monolithic bastion of Puritanism, untrammeled by European or socialist influences (despite the rise of Woodrow Wilson and FDR!) until the Beatles touched down at JFK Airport in 1964. The reality though, as Allan Bloom memorably wrote in The Closing of the American Mind, was that almost immediately upon the US winning World War II, America began to slowly — often unwittingly — become an unofficial enclave of Germany’s Weimar Republic.
Take architecture. As Tom Wolfe noted in From Bauhaus to Our House, his classic debunking of modernism’s excesses, because America’s intellectuals tend to think of themselves as an artistic colony in thrall to Europe, when the leaders of the Weimar-era German Bauhaus of the 1920s were evicted by the Nazis, they were welcomed by Depression-era American universities as “The White Gods! Come from the skies at last!”
[Walter Gropius, the founder of the Bauhaus] was made head of the school of architecture at Harvard, and Breuer joined him there. Moholy-Nagy opened the New Bauhaus, which evolved into the Chicago Institute of Design. Albers opened a rural Bauhaus in the hills of North Carolina, at Black Mountain College. [Ludwig Mies van der Rohe, its last director, when the Nazis shuttered its doors in 1933] was installed as dean of architecture at the Armour Institute in Chicago. And not just dean; master builder also. He was given a campus to create, twenty-one buildings in all, as the Armour Institute merged with the Lewis Institute to form the Illinois Institute of Technology. Twenty-one large buildings, in the middle of the Depression, at a time when building had come almost to a halt in the United States— for an architect who had completed only seventeen buildings in his career—
O white gods.
Mies van der Rohe (1886-1969) is the titular subject of the newly published biography by architectural historian Franz Schulze and architect Edward Windhorst (who studied his craft under a protégé of Mies). They’ve collaborated on an extensively — very extensively — revised version of the biography of Mies that Schulze first published in 1986, the centennial of Mies’s birth.
While he was America’s most influential postwar modern architect and teacher, Mies never quite became a household name on the same order as Frank Lloyd Wright, despite a prominent Life magazine feature in 1957. But he has been the subject of numerous biographies and book-length profiles, beginning with his prominent role in The International Style, the pioneering Museum of Modern Art exhibition by Philip Johnson and Henry-Russell Hitchcock, which first put modern architecture on the map in America back in 1932.
Even as Mies was associated with several prominent buildings deserving of respect after World War II, perhaps his greatest accomplishment was to singlehandedly invent the language of postwar American architecture. We take tall steel-and-glass office buildings and apartments for granted, but it was Mies who created their look, beginning with 1951’s Farnsworth House (which would also provide the inspiration for Philip Johnson’s own Glass House) and, from that same year, the 860-880 Lake Shore Drive apartment complex.
The Dallas Morning News reports that J.R. Ewing has retired to the Texas-sized ranch in the sky:
Larry Hagman, who played the conniving and mischievous J.R. Ewing on the TV show Dallas, died Friday at Medical City in Dallas, of complications from his recent battle with cancer, his family said.
He was 81.
“Larry was back in his beloved Dallas re-enacting the iconic role he loved most,” his family said in a written statement. “Larry’s family and close friends had joined him in Dallas for the Thanksgiving holiday. When he passed, he was surrounded by loved ones. It was a peaceful passing, just as he had wished for. The family requests privacy at this time.”
The role of J.R. transformed Mr. Hagman’s life. He rocketed from being a merely well-known TV actor on I Dream of Jeannie and the son of Broadway legend Mary Martin to the kind of international fame known only by the likes of the Beatles and Muhammad Ali.
Mr. Hagman made his home in California with his wife of 59 years, the former Maj Axelsson. Despite obvious physical frailty, he gamely returned to Dallas to film season one and part of season two of TNT’s Dallas reboot.
Reuters’ obit adds that Hagman “had suffered from cancer and cirrhosis of the liver in the 1990s after decades of drinking.” According to Wikipedia, “In August 1995, Hagman underwent a life-saving liver transplant after admitting he had been a heavy drinker. Numerous reports state he was drinking four bottles of champagne a day while on the set of Dallas. He was also a heavy smoker as a young man, but the cancer scare was the catalyst for him to quit.”
For those who enjoy recording their own music or podcasts at home, mastering is one of the lesser-known aspects of the process. Most people are aware of overdubbing, editing and mixing, but comparatively few understand how critical mastering can be in adding the final sparkle to a mix, how it can transform a pretty good mix into something amazing, or (sometimes, with a little luck) a poor mix into something tolerable.
In the professional world, mastering is usually done using lots of very expensive outboard gear, as the final step before a master copy of a CD is sent to be duplicated into millions of consumer discs, or an album of MP3s is uploaded to iTunes and Amazon.
In the not necessarily professional world of home recording, mastering can be done with a plug-in effect.
For over ten years, iZotope Inc., located near Boston, has been producing a high-end mastering plug-in for recording programs called Ozone. Now in its fifth iteration, Ozone is available for most PC- and Mac-based recording programs, as well as for Pro Tools, the most popular professional recording system.
When I interviewed Jeremy Todd, the company’s chief technology officer (and a musician himself; he was trained as a classical pianist), for a Blogcritics article on an earlier iteration of Ozone back in 2004, he told me:
Mastering in general is tough to put your finger on; I guess it depends on who you’re talking to. But for the purposes of Ozone, we talk about everything that you do once you’ve got a stereo mixdown, to when you actually have a master and you say, “OK, this is the audio, this is it, we’re not touching it anymore.”
With Ozone, we try to include everything that someone would need, so that, while it’s not always the case, in theory they could do it all in one, without using another plug-in.
How was mastering done before the days of computers and hard disk recording? Todd says:
There were trends established way back when that are still present today; we’re still seeing examples of these standalone hardware devices. Things were much more isolated: you wouldn’t see as much all-in-one gear, and you’d have these big, honking pieces of equipment that were just an equalizer (and a two- or three-band equalizer at that), or just a finalizer, a loudness maximizer.
Obviously, if you go back far enough, mastering was dominated by analog equipment. So with Ozone, we’re trying to capture some of the flavor that people liked, which was a big challenge when it came to designing the DSP. It’s very difficult for people to explain why they like their two-band analog equipment. So it boiled down to a lot of listening tests, and asking people a lot of questions.
We tried to keep a little of the analog flavor in the sound, in our previous versions of Ozone. [Beginning] in Ozone 3, the analog modeling was firmly established, but people have been saying that in some cases, they want something cleaner; they don’t want any flavor, they want to be more surgical with the tool. So we added a digital component to the equalizer and the multi-band crossover.
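The signal chain Todd describes, an equalizer feeding a loudness maximizer, can be illustrated with a toy sketch in Python. This is purely a conceptual illustration with made-up names and parameters, not iZotope’s actual DSP, which uses lookahead limiting and far more sophisticated gain smoothing:

```python
import math

def simple_master(samples, makeup_gain_db=12.0, ceiling=0.98):
    """Toy mastering stage: makeup gain followed by a brick-wall limiter.

    Real loudness maximizers use lookahead and smooth gain reduction;
    this only sketches the idea of pushing level up to a fixed ceiling.
    """
    gain = 10 ** (makeup_gain_db / 20)   # convert decibels to a linear factor
    return [max(-ceiling, min(ceiling, s * gain)) for s in samples]

# A quiet 440 Hz test tone: one second at a 44.1 kHz sample rate
mix = [0.3 * math.sin(2 * math.pi * 440 * n / 44100) for n in range(44100)]
mastered = simple_master(mix)
peak = max(abs(s) for s in mastered)   # the limiter pins this at the ceiling
```

Boosting a mix that peaks at 0.3 by 12 dB would push it to roughly 1.19, well past full scale; the limiter instead pins the loudest samples at the 0.98 ceiling, which is the crude essence of what a “finalizer” does.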
With Home Recording, Mastering More Important Than Ever
Let’s take a moment to discuss how the mixing and mastering process has changed over the past 30 years for the average home recordist.
Back in the 1980s, when I first began to record demos of songs for my local rock group on a four-track, mixing was relatively easy…because there were only four tracks (that’s actually a bit of a simplification — I used a fair amount of virtual tracks and outboard gear). But I did all the mixes in real time and hoped for the best. For their time, they weren’t terrible demos — but certainly nobody would confuse them with properly mixed and mastered tracks on a CD.
By the late 1990s, it was possible to replicate the process on a personal computer — and with infinitely more control over the individual tracks and the overall sound.
A decade ago, in one of my earliest reviews of a software-based recording program, I dubbed it “Abbey Road in a Box.” That may seem slightly hyperbolic at first, but today’s digital audio workstations (or DAWs for short) are incredibly sophisticated programs, combining the ability to record music digitally, then add built-in and aftermarket effects, and layer in a variety of software synthesizers and prerecorded loops as well. In short, they leave the stone knives and bearskins-level technology the Beatles had available to them in the 1960s in the dust.
But a DAW can seem as overwhelming at first as walking into a physical recording studio. As producer Brian Eno said of an actual mixing board 35 years ago:
Most people see a large mixer, and they’re completely bewildered because there are something like 800 or 900 knobs on it. Actually it’s not so complex as it looks – it’s the same thing repeated many times. Since you’re dealing with 24 tracks, everything has to be multiplied by 24; it’s not a very complex system. Each track from the tape recorder plays back on one channel of the mixer. Each individual channel has a whole set of controls that duplicate the other channels; that’s all.
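Eno’s observation (one set of controls duplicated over and over) maps directly onto a simple data structure. Here is a hypothetical sketch in Python; the control names and defaults are illustrative, not any real console’s spec:

```python
from dataclasses import dataclass, field

@dataclass
class ChannelStrip:
    """One channel of a mixing console; every channel carries the same controls."""
    fader_db: float = 0.0   # channel volume, in dB
    pan: float = 0.0        # -1.0 (hard left) to 1.0 (hard right)
    eq_bands_db: list = field(default_factory=lambda: [0.0, 0.0, 0.0])  # low/mid/high

# A 24-track console is just 24 identical strips, one per tape track
console = [ChannelStrip() for _ in range(24)]
console[0].fader_db = -6.0   # adjusting channel 1 leaves the other 23 untouched
```

Seen this way, the apparent complexity of “800 or 900 knobs” collapses to a few dozen controls, repeated once per channel.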
But what do those knobs do — and more importantly — what can you do with them?
In other words, Abbey Road is just a series of acoustically treated rooms and electronic gear without the skill of the engineers and producers who know how to make it work. Paul White’s The Producer’s Manual, published by British electronic music house Sample Magic and written by the editor of Britain’s long-running Sound on Sound magazine, won’t by itself turn you into the second coming of George Martin. But at over 350 full-color, heavily illustrated pages, with a glossary defining all of its jargon, it’s an excellent guide to unlocking the power of the recording software and equipment you may already own, and to what to look for when shopping for your next piece of kit.
If you already own a DAW, it may well have many of the digital tools that White describes in The Producer’s Manual. But how do you make the most of them? What physical equipment do you need? What sort of sound card? How do you choose which microphone for which application? Which speakers will ensure your mixes still sound the same beyond your basement? Are the acoustics in your recording room up to snuff?
And then there are the actual recording techniques — which is what you’ve assembled all this gear for in the first place. What if you need to record an acoustic guitar? A chorus of background singers? How do you mic up a drum kit? Or heck, what if Christina Hendricks drops by and wants you to record her accordion playing?
OK, White doesn’t specifically mention Christina Hendricks — but he does go into how to record an accordion, along with all sorts of other instruments. And then how to edit, assemble, and master their parts — and how to salvage things afterwards if a session goes haywire. These are but a few of the topics that White explores. Beginners will learn much — I sure wish this book had been around a decade ago when I first made the leap to digital music recording after a decade toiling with cassette four-tracks. But those with plenty of experience in the brave new world of DAWs will find much to learn in this highly recommended book as well.
Steve Sabol, the scion of the founder of NFL Films, passed away yesterday of a brain tumor at 69 — an age that’s far too young to die these days. I grew up about 20 minutes from the NFL Films offices in Mt. Laurel, NJ,* and in 2003 took a tour of their ultra-high-tech facilities — which make the bridge of the Starship Enterprise seem laughably antediluvian in comparison — as part of research that wound up doing double duty at the start of the following year for articles in Videomaker magazine and Tech Central Station. The other half of my prep work for those two articles involved interviewing Sabol on the phone. As he told me at the start of our conversation:
Steve Sabol: There’s an old Indian proverb that I’ve always believed in, and that’s ‘tell me a fact, and I’ll learn. Tell me a truth, and I’ll believe. Tell me a story, and it will live in my heart forever’.
And that’s been one of our mottos, is telling a story. And the story telling is basically done through the editing. It’s the cameraman’s job to come back with as much material—story telling shots, action shots—as he possibly can. Then it’s up to the editor to tame and to shape the raw vision of the cameraman.
I started out as an editor, and then became a cameraman. But that’s really the job of the editor. It’s so critical, and it’s one of the most overlooked art forms or disciplines in filmmaking. Most people don’t understand editing; they understand writing, they understand music, they understand cinematography. But when it comes to editing and the selection and order of the shots, that’s the key to storytelling.
Driscoll: Did being an editor first influence you when you became a cameraman?
Sabol: When I started out as an editor, and tried to tell stories, I realized that there were certain gaps; that you couldn’t tell a story with just action shots. You needed shots that showed the passage of time, the sun shining through the portals of the stadium. You needed close-ups to show the reaction of the players to the game. You needed shots of the audience and the fans. You needed locator shots, as we call them, that set the scene. What’s the stadium look like? Is it a full stadium? Is it an empty stadium? And you need shots that can move the story along. It might be a pair of bloody hands. It could be cleat marks in the mud. It could be a crushed water bottle on the sidelines. It could be a flag whipping in the wind. These were all things that were important.
I was an art major in college, and Paul Cézanne, the famous French impressionistic painter, once said that “all art is selected detail.” And I felt that the one thing that was missing in sports films was the details. And when I began as a cameraman, that was all I shot, was the details. I filmed the first 15 Super Bowls, and never saw a play. But I could tell you what kind of hat Tom Landry was wearing, how Vince Lombardi was standing in the fourth quarter, if Bob Lilly had a cut on the bridge of his nose. Those were the things that I remember in the Super Bowl. I don’t remember any of the plays. I was just what we call a weasel.
Driscoll: What is a weasel?
Sabol: Well, we have three types of cameramen: we have a tree, a mole, and a weasel. A tree is the top camera. He’s on a tripod rooted into a position on the 50 yard line, and he doesn’t move. A mole is a handheld, mobile, ground cameraman, with a 12 to 240 lens, and he moves all around the field, and he gives you the eyeball-to-eyeball perspective. A weasel is the cameraman who pops up in unexpected places, to get you the telling storytelling shot—the bench, the crowd, all the details.
So those are the three elements. When you blend them together you get the NFL Films visual signature—when you blend together a mole, a tree and a weasel.
You have infinitely more than that of course – NFL Films revolutionized how sports are covered by film and television, and transformed the National Football League into America’s leading sport. And as Sabol told AP when his father was inducted into the NFL Hall of Fame, “We see the game as art as much as sport. That helped us nurture not only the game’s traditions but to develop its mythology: America’s Team, The Catch, The Frozen Tundra:”
When Ed Sabol founded NFL Films, his son was there working beside him as a cinematographer right from the start in 1964. They introduced a series of innovations taken for granted today, from super slow-motion replays to blooper reels to sticking microphones on coaches and players. And they hired the “Voice of God,” John Facenda, to read lyrical descriptions in solemn tones.
Until he landed the rights to chronicle the 1962 NFL championship game, Ed Sabol’s only experience filming sports was recording the action at Steve’s high school football games in Philadelphia.
* * * * *
He was the perfect fit for the job: an all-Rocky Mountain Conference running back at Colorado College majoring in art history. It was Sabol who later wrote of the Raiders, “The autumn wind is a pirate, blustering in from sea,” words immortalized by Facenda.
The Sabols’ advances included everything from reverse angle replays to filming pregame locker room speeches to setting highlights to pop music.
“Today of course those techniques are so common it’s hard to imagine just how radical they once were,” Steve told the AP last year. “Believe me, it wasn’t always easy getting people to accept them, but I think it was worth the effort.”
Indeed it was. RIP, Steve Sabol.
* But then, all of South Jersey is 20 minutes away from the rest of South Jersey.
(Cross-posted at Ed Driscoll.com.)
On Sunday, Nina and I finally caught The Dark Knight Rises. We both enjoyed it*, but with a nearly three-hour running time, I felt sort of numb afterwards, finding newfound respect for the terse minimalist Jack Webb police procedural-like feel of the half-hour Adam West Batman series from the 1960s.
OK, just kidding. But still, two hours and 44 minutes is way too long for anything that wasn’t directed by David Lean.
Speaking of which, at the Corner, Michael Walsh, linking to Andrew Klavan’s review in the Wall Street Journal, sees a Dr. Zhivago-esque subtext to the movie, which is obsessed with the dangers of revolution:
[I]f insanity is defined as doing the same thing over and over again and expecting a different result, what are we to make of every murderous Regressive movement from the French Revolution to the October Revolution to Mao and Pol Pot? All of them began in resentment and ended in oceans of blood. In fact, one of the worst things about being a Regressive is having to ride the tiger that eventually eats all of them. In Dr. Zhivago, the idealistic Pasha becomes the feared zealot Strelnikov who in turn becomes another of Stalin’s statistics. In this Batman installment, Bane’s raging Id and his secret controller’s lust for revenge are both defeated by heroes who understand where the truth lies.
In a spoiler-filled round-up at Big Hollywood, Ben Shapiro dubs The Dark Knight Rises, “Magnificent … And Most Conservative Film Ever.”
Most conservative film ever? Well…
Scott was the younger brother of Ridley Scott; in addition to Top Gun, he directed Beverly Hills Cop II, Enemy of the State, and the remake of The Taking of Pelham 123, among other projects in both film and TV. The Contra Costa Times has the early details of his apparent suicide:
British film director Tony Scott, known for such Hollywood blockbusters as “Top Gun,” “Days of Thunder,” “Beverly Hills Cop II” and “The Taking of Pelham 123,” jumped to his death Sunday from the Vincent Thomas Bridge spanning San Pedro and Terminal Island, according to Los Angeles County coroner’s officials.
Scott, 68, climbed a fence on the south side of the bridge’s apex and leapt off “without hesitation” around 12:30 p.m., according to the Coroner’s Department and port police.
A suicide note was found inside Scott’s black Toyota Prius, which was parked on one of the eastbound lanes of the bridge, said U.S. Coast Guard Lt. Jennifer Osburn.
More as it comes in.
Haskell Wexler’s Medium Cool, reviewed in the above clip by Larry Karaszewski, the writer of Ed Wood and The People Vs. Larry Flynt, via the Trailers From Hell Website, is invariably trotted out by film buffs for its blending of fiction and non-fiction. Or as Karaszewski enthuses in his narration:
The best thing about Medium Cool is the way it mixes narrative and documentary forms. Fact and fiction. There’s no other movie like it. You have a fictional story, but inside that, there’s documentary footage. But then there’s also faked documentary footage, and on top of that, there are fictional characters in real documentary footage. It’s mind blowing!
Beyond the loose and proto-postmodern “fake but accurate” (but ultimately fake) feel of the film, and Wexler’s cinematography, which is often stunning, Medium Cool is also one of the touchstones of the late 1960s as the beginning of the nadir of America. The film was shot around the violence the New Left inflicted upon the aging remnants of the old New Deal-era Left at the 1968 Democratic National Convention. (About which Mayor Daley was quoted as uttering one of the great and telling malapropisms of all time: “Gentlemen, let’s get this thing straight, once and for all. The policeman is not here to create disorder. The policeman is here to preserve disorder.” Which neatly, if unintentionally, sums up Chicago, then and now.)
Only a year before helming Medium Cool, Wexler was the cinematographer on The Thomas Crown Affair, which was an exercise in pure style — Steve McQueen looking ice cool in his three piece suits and skinny Don Draper-esque ties, Faye Dunaway looking ravishing — the last gasp of the Cary Grant/Grace Kelly/Alfred Hitchcock-style suspense film.
But by then, the moral rot was already seeping in — Thomas Crown, who robs banks for kicks, is a sort of millionaire playboy version of Clyde Barrow, as portrayed the year prior by Warren Beatty in Bonnie and Clyde, the hugely influential film also co-starring Dunaway — and the style of the Hitchcock era would effectively be dead. Hollywood would then enter a thoroughly confused, often audience-alienating, and hence money-losing phase that didn’t end until two young whiz kids named Spielberg and Lucas saved the industry. Medium Cool remains a fascinating time capsule of the late 1960s — along with the exhaustion of both its film industry and the liberals who helmed it.
Incidentally, after hearing David Gelernter on Hugh Hewitt’s show, I’m finally going through Gelernter’s new book, America-Lite: How Imperial Academia Dismantled Our Culture (and Ushered In the Obamacrats), and I thoroughly recommend it. As its title implies, the primary focus of America-Lite is how America arrived in 2012, with an academy, media, and president all with — although Gelernter doesn’t use the word in his book — a raging case of Oikophobia. Naturally, that story can’t be told without focusing on the rise of the New Left in the 1960s. Gelernter describes a touchstone moment here:
During the summer of 1967, the New York Review of Books published on its cover a diagram showing how to make the flame grenade called a Molotov cocktail—the message being that left-liberals who wanted to remake American society should take to the streets and throw bombs. Make them, throw them, and the hell with it. If people burn, they burn. What’s important anyway, mere human beings or the Big Idea? The Movement? The Revolution? Also sprach the left. Back in the 1930s, the malevolent ravings of left-wing intellectuals had been unimportant to American culture at large. But now, times were different. The colleges were listening (although even the radical left-liberal college students of the late 1960s rarely resorted to Molotov cocktails—despite being grateful, no doubt, for the Review’s helpful advice).
With its focus on the violence surrounding the 1968 Democratic Convention, in a sense, Medium Cool functions as a sort of visual styleguide to Gelernter’s book. America-Lite describes how the cultural breakdown occurred, and how it continues to impact us today. But to get a visual sense of the late 1960s as it was seen by a director who was (and is) very much a man of the far left, this is the film that will do the job. And like the decade itself and the tragic descent of the ideology that propelled it, it’s not a pretty story, whether told in fact or fictional form.
Debuting in the mid-1970s, largely thanks to Japan’s Roland Corporation, guitar synthesizers have long had their share of headaches, until Roland launched their VG-system in the mid-1990s. Instead of concentrating on synthesizing strings and trumpets, suddenly here was a unit armed with loads of great guitar-oriented sounds and effects, which tracked flawlessly. The original VG-8 debuted in 1995. Roland’s VG-88 lasted from 2000 until 2007, and used versions of the VG-88 can be found on eBay these days for $150 to $500. Its successor, still in production, was released in 2007, and dubbed the VG-99. It replaced the black stealth bomber doorstop floor box shape of the VG-88 and original VG-8 with a sleek silver table-top unit, which could also be rack mounted, or placed on an optional music stand, for manipulation during performance.
Roland’s VG-99 (which streets for about $1600 with the required pickup for your electric guitar, or $100 less without it) builds on their long-running line of VG-8 and VG-88 guitar modeling systems, but now in a tabletop rather than floor design, with three internal processors inside delivering some high-powered computing muscle.
I wrote up one of the earliest reviews of the VG-99 for Blogcritics in 2007. I’m cribbing from that text, though with revisions to bring that material up to date.
The VG-99 requires a guitar equipped with an aftermarket Roland hexaphonic pickup (pictured at left mounted on a Gibson Les Paul) and 13-pin cable to connect the pickup to the VG-99, or a guitar equipped with a compatible factory-installed hexaphonic pickup, such as those made by Godin, or Fender’s Roland-Ready Stratocaster, which I used to test the unit. Some sources claim that Roland’s hexaphonic pickup sounds better on many of these patches than the piezo pickups used on the Godin units; check out the archives at the VGuitar Forum to see the pros and cons of this argument.
Like the predecessor VG-88, it’s also possible to plug an electric guitar with a conventional quarter-inch jack into the VG-99. Most of the more extreme modeling elements won’t trigger, but it’s a great way to make use of a trusty old Les Paul, Tele or any other non-hex-equipped instrument and drive basic amp sounds.
Speaking of amps, expect to find all sorts of simulated Marshalls, Fenders, Voxes, Mesas, Hi-Gains and Roland’s own JC-120. There are also a variety of modeled guitars, including Les Pauls, ES-335s, Fender Strats and Teles, steel and nylon-strung guitars, 12-strings, Jazz and P-Basses, and more exotic instruments such as Dobros, mandolins, and even violins. The two control buttons on my Roland-Ready Strat correspond with the treble/center/lead pickup switch on the Les Paul and the five-way switch on a (traditional) Fender Stratocaster; a nice touch.
It’s also possible to model a guitar completely from scratch, even with physical parameters impossible on a real instrument. While the parameters on the screen of the VG-99 are reasonably easy to tweak, a much more intuitive computer GUI allows adjusting them via a PC and USB connection.
And there are all sorts of effects as well, plus the ability to manipulate wah-wah, volume and pitch (from dive bombs to B-Bender-style licks) via either Roland’s long-running EV-5 foot pedal, or more complex (and more expensive) FC-300 pedal board, and the controllers on the top of the VG-99 itself.
Taken from Roland’s keyboard synthesizers, these include a finger-sliding “ribbon” controller, which can be switched to control the pitch and filter settings of most patches. Perhaps more intriguingly, there’s also Roland’s “D-Beam”, which can also control many patches by waving a hand over the VG-99, or even a guitar neck. The D-Beam could provide the opportunity for some flashy stage gestures, vaguely reminiscent of Jimmy Page and his Theremin.
Also, it’s possible to manipulate many of the effects in the patches via the knobs on the unit, and in many of the preset patches, go from open to standard tuning and back again at the press of a button. For those who like to play rhythm guitar in Open-G tuning ala Keith Richards, but drop back to standard tuning for the solo, that’s easily accomplished with the VG-99.
I first began playing guitar around November of 1982; I remember vividly driving back from the Moorestown Mall having purchased (in the now defunct B. Dalton bookstore chain) The Guitar Handbook by Ralph Denyer. Covering everything from the author’s favorite guitar heroes, to what to look for when buying a guitar, to an extensive and well-written main core of the book devoted to music theory, Denyer’s book certainly lives up to its name. I remember instantly thinking as I thumbed through it, “This is it! It’s all here!” Of course, what wasn’t there was much of an insight into rock guitar licks, but still, it was a book I referred to endlessly when I first began playing, to the point where I basically wore my copy out, using black electrical tape to keep its binding together. While Denyer released an updated version of the book in 1992, a few years ago, I bought a used copy of the original 1982 edition, just to remind myself of where things started.
And they really did start from there. Shortly afterward, I bought my first electric guitar, a Hondo (Korean- or Japanese-made) clone of a 1959 Les Paul. In March of this year, after my mom had passed away and we cleaned out her house in preparation for putting it on the market, I found the old Hondo in the basement and picked it up — as was typical of Les Pauls of the early 1980s, both by Gibson and those selling knock-offs, it weighed a ton!
While I counted Jimmy Page, Keith Richards, Eric Clapton and Jimi Hendrix as my early guitar heroes, at the time my biggest musical inspiration was Pete Townshend. And as journalist J.R. Taylor wrote a few years ago, with both The Who’s popularity and his own as a solo artist at their apogee, the early 1980s “was a good time to be a Pete Townshend fan.” Certainly in my case that was true.
In 1983, Townshend released the first of his Scoop series of albums. These were the demo recordings of songs that would be recorded by The Who or professionally re-recorded by Townshend for his solo albums. In the liner notes, Townshend explained that he didn’t write his songs on staff paper; he recorded them on tape recorders, overdubbing a drum track — either real drums or a drum machine — then guitar, then bass, then vocals.
Concurrent with the release of Scoop, the first cassette four-track recorders began to appear in music stores, building on punk rock’s DIY ethos, and I was quickly off and running. A cassette four-track isn’t one of those old eight-track machines that Homer Simpson had in his car as a teenager. It uses ordinary cassettes, but instead of flipping the tape over to play the other side, the four-track recorder plays in only one direction, allowing for overdubbing up to four tracks of music — perfect for cutting a demo, as mentioned above, with a drum machine (also a new development in the early 1980s), bass, guitar and vocals, one instrument per track.
While I was not very artistic as a teenager prior to picking up an instrument, once I realized I could write and produce my own music, I thought: what else can I do? Which led to studying radio production, video production, and eventually, a certificate in filmmaking from NYU.
But it all began with guitar playing. And one of the elements that ties together so many early bloggers is DIY music. As Glenn Reynolds (who was producing his own MP3s before launching Instapundit) told C-Span’s Brian Lamb in 2006, paraphrasing the 2003 Dave Clarke song “Disgraceland” along the way, to him blogging was “like the old punk rock ethos. You know, ‘they were terrible; I wanted to be terrible too!’ But it wasn’t terrible. And that was actually what was really striking about [Mickey Kaus’s Kausfiles in 2001.] There were lots of sort of amateurish, not very good Web sites out there in 1996, or whenever this was, but this looked good and it read well and it was really interesting, and I just thought it was really cool.”
More or less concurrent with my own nascent blogging efforts beginning in early 2002, I returned to my eighties-era hobby of recording my own music. Only this time around, using a personal computer, Cakewalk’s Sonar multitrack recording program, and eventually, a couple of incarnations of the Roland Corporation’s guitar modeling rigs, which allow a guitarist to dial through an enormous variety of preset sounds in much the same way a keyboard synthesizer player is able to. (You can scroll through my articles at Blogcritics over the years; I’ve written all sorts of posts there on the topic of home recording.)
When I started producing PJM’s Sirius-XM radio show, which lasted from September of 2007 through the end of 2010, and my ongoing Silicon Graffiti video series, which began in earnest in January of 2008, my guitar playing went by the wayside a bit. I still picked it up almost every day to noodle, but rarely plugged it into an amplifier. And cranking out a weekly 55-minute MP3 filled with interviews and music — occasionally my own — and uploading it to the Sirius-XM server filled my home recording jones in spades.
But this past weekend, I dusted off my “Roland-Ready Strat,” a Fender Stratocaster electric equipped with a special pickup designed to plug into Roland’s guitar synthesizers and plugged it in my Roland VG-99 guitar modeling box. Just dialing through the presets, and playing electric guitar, acoustic guitar, electric sitar, and guitar synthesizer was a reminder of all of the possibilities inherent in the seemingly simple instrument that is the guitar.
And also a reminder of how comparatively easy it now is to both learn how to play guitar, and to get a decent sound out of it. Once you’ve learned a few basic chord shapes and the bare bones rudiments of musical theory and you’d like to learn to play a hit song, there’s likely tablature available for free on the Internet to learn its riffs and chord changes. With the fundamentals now so easy to learn, we should be hearing hours of fantastic new music on the radio every week, right?
No, of course not. Which brings us to the second part of this essay, starting on the next page.
(We take a break from the usual day to day political and media bias stuff for a long rambling discussion on modern architecture and aesthetics written in the first person voice. As with our earlier explorations of the topic, we’ll understand if you bail on this one. And yes, that’s my use of the royal we. At least for this post.)
I’m not sure what initially attracted me to the aesthetics of modernism. I do remember studying Art of Western Civilization in college, which, as with Western Civilization itself, largely concluded with the arrival of the 20th century. But modern art fascinated me — unlike traditional aesthetics, cracking modernism, whether it was architecture, or artists such as Mondrian, was a bit like deciphering a puzzle box. Of course, that complexity was considered a feature, not a bug, by the men who founded the movement. Reviewing C.P. Snow’s 1959 book, The Two Cultures and the Scientific Revolution, Orrin Judd of The Brothers Judd book review site and blog wrote:
As Snow notes, as late as say the 1850s, any reasonably well-educated, well-read, inquisitive man could speak knowledgeably about both science and the arts. Man knew little enough that it was still possible for one to know nearly everything that was known and to have been exposed to all the religion, art, history–culture in general–that mattered. But then with the pure science revolution of which Snow spoke–in biology and chemistry, but most of all in physics–suddenly a great deal of specialized training and education was necessary before one could be knowledgeable in each field. Like priests of some ancient cult, scientists were separated out from the mass of men, elevated above them by their access to secret knowledge. Even more annoying was the fact that even though they had moved beyond what the rest of us could readily understand, they could still listen to Bach or read Shakespeare and discuss it intelligently. The reaction of their peers in the arts, or those who had been their peers, was to make their own fields of expertise as obscure as possible. If Picasso couldn’t understand particle physics, he sure as hell wasn’t going to paint anything comprehensible, and if Joyce couldn’t pick up a scientific journal and read it, then no one was going to be able to read his books either. And so grew the two cultures, the one real, the other manufactured, but both with elaborate and often counterintuitive theories, requiring years of study.
Or at the very least, a crash course for an enthusiastic autodidact to pick up the basics. I began by taking out books on modern art and New York’s Museum of Modern Art from my college library and my local public library. Eventually, I came across Henry-Russell Hitchcock and Philip Johnson’s early-1930s book, The International Style, which put modernism on the map in America, and Peter Blake’s mid-‘60s book The Master Builders: Le Corbusier, Mies van der Rohe, and Frank Lloyd Wright, both of which have been perennially in print and are still available from the gift shop at NY MoMA. And given that I had loved The Right Stuff, The Purple Decade and The Bonfire of the Vanities, I also read Tom Wolfe’s From Bauhaus to Our House.
Oddly enough, reading From Bauhaus to Our House, I found myself loving the satire, but also strangely fascinated by the images, in spite of Wolfe’s best efforts to take the mickey out of them. Reading Blake’s Master Builders and other books on modern architecture, I initially admired Corbusier’s works, particularly his pre-WWII buildings, but found myself increasingly put off by his post-war efforts, which replaced the white stucco of the homes he designed for his earliest, wealthiest patrons with massive forms built largely out of raw concrete. Corbu’s postwar style was dubbed Béton Brut, and the New Brutalism, and brutal it was indeed. (Even Blake, the former editor in chief of Architectural Forum magazine, would have second thoughts.)
But Mies van der Rohe had worked out an architectural language that was logical (or at least seemed logical), and at its best a sort of industrial poetry. It was also the vocabulary of post-war American cities. As Wolfe wrote in From Bauhaus to Our House, Mies, the Bauhaus’s last director, and Walter Gropius, its founder, both settled in America after fleeing the Nazis in the 1930s, and both were welcomed by academia, as Wolfe famously wrote, as…The White Gods!
Gropius had the healthy self-esteem of any ambitious man, but he was a gentleman above all else, a gentleman of the old school, a man who was always concerned about a sense of proportion, in life as well as in design. As a refugee from a blighted land, he would have been content with a friendly welcome, a place to lay his head, two or three meals a day until he could get on his own feet, a smile every once in a while, and a chance to work, if anybody needed him. And instead—
The reception of Gropius and his confreres was like a certain stock scene from the jungle movies of that period. Bruce Cabot and Myrna Loy make a crash landing in the jungle and crawl out of the wreckage in their Abercrombie & Fitch white safari blouses and tan gabardine jodhpurs and stagger into a clearing. They are surrounded by savages with bones through their noses—who immediately bow down and prostrate themselves and commence a strange moaning chant.
The White Gods!
Come from the skies at last!
Mies in particular created a sort of systems-based design philosophy, which he taught to his students at the Illinois Institute of Technology, which was essentially his private educational fiefdom in the 1940s and ‘50s. By the 1960s, it became common to say that Mies’s architecture was the easiest architectural language to teach, as Blake himself writes in The Master Builders. But as Chicago-area architectural historian Franz Schulze, Mies’s best biographer, would write in 1985, “Indeed it was not at all, and may have been among the least teachable. The acres of stillborn design in the Miesian manner that transformed the American cityscape in the 1950s and 1960s are a palpable indication of this.”
Last week, when I linked to the video from McDonald’s Canadian division that explained why food almost always looks better — and typically bigger — in a photograph than in person, YouTube suggested the above video, titled “The Photoshop Effect” as a recommended choice at the end of the McDonald’s clip. It’s from 2008, but it’s still a relevant topic, especially considering how much more powerful Photoshop has gotten in the years since, including its new CS6 edition.
But the argument as to whether it’s “fair” that supermodels and A-list Hollywood actresses have teams of skilled Photoshoppers making their already well-toned bodies and well-defined facial features look even better seems rather specious. Celebrities want to look their best when they’ve got a new film to hawk, Sports Illustrated wants its swimsuit edition to jump off grocery checkout lines, etc. Does it promote a false ideal for women, as the young woman in the above video asks? Well, no more than the physical fitness of models and actresses, who have hours blocked out of their day to spend at the gym with expensive personal trainers.
Funny, though, that no one complains that when Bruce Willis jumps off a 100-story skyscraper or fist-fights his way through a thousand heavily armed terrorists, what we’re really seeing is a stuntman and plenty of CGI. But even if they did, in a way, that complaint and the ones heard in the above video are somewhat akin to the arguments floated when massive amounts of overdubbing first took off in popular music in the mid-1960s. The early Beatles, at their best, were a tight little rock group, as can be heard on their first album. I believe all of those backing tracks were cut live, with only minimal overdubbing done to patch up their vocals. But by the time of the Sgt. Pepper era, the Beatles were bringing in session musicians skilled in unusual instruments, and whole orchestras, and hiring outside arrangers, while their producer George Martin was developing new recording effects and increasingly complex strategies to push the equipment inside EMI’s Abbey Road studios to the very limit of 1966- and ’67-era recording technology. That the Beatles were a cash cow for EMI made it all possible.
Twenty years later, during the height of the MTV era, Paul McCartney would release a stripped-down, relatively low-budget video shot in the London Underground to accompany his song “Press,” and justify it during interviews by complaining about the many up-and-coming groups who would simply hire the trappings of success — expensive cars, flashy clothes, dancing girls, and exotic locales — for a day or two’s worth of video shooting, to make themselves look more successful and wealthier than they really were.
(Photoshopped into the PJ Lifestyle blog from Ed Driscoll.com.)
Last night I installed Fandango onto my Roku. (Now there’s a sentence that would have been meaningless a few years ago.) It’s an HD movie trailer channel set up by the folks who sell online movie tickets. Opening next month at a theater near you is a new film starring Dwayne Johnson (“The Rock”), Bruce Willis, and Jonathan Pryce, the star of Terry Gilliam’s Brazil and co-star of Larry Gelbart’s HBO production of Barbarians at the Gate. The three team up for a gritty new high-tech action movie about the president (played by Pryce) being replaced by an evil twin — sort of The Manchurian Candidate meets Kevin Kline’s early-1990s movie Dave.
Produced by Hasbro:
I don’t begrudge any man in Hollywood his $20 million paycheck, but…dude. Incidentally, I love the bit where Johnson describes Willis’s character as “the reason why we call ourselves Joes” — he’s the original GI Joe. But isn’t that rather obvious, given that Willis’s character is 12 inches tall in the movie and all the other actors are only three and a half inches tall?
According to TMZ, Donna Summer has passed away at age 63:
9:27 AM PST- TMZ has learned … Donna died from lung cancer. Several sources are telling us Donna believed she contracted it by inhaling toxic particles after the 9/11 attack in New York City.
9:35 AM PST- Donna’s family just released a statement, claiming, they “are at peace celebrating her extraordinary life and her continued legacy.”
Donna Summer — the Queen of Disco — died this morning after a battle with cancer … TMZ has learned.
We’re told Summer was in Florida at the time of her death. She was 63 years old.
The reference to 9/11 sounds like her survivors are preparing some sort of lawsuit against the City of New York, but in the meantime, RIP to the late-’70s-era icon.
(Cross-posted at Ed Driscoll.com)
Ever since 1990’s Metropolitan, writer-director Whit Stillman has been documenting the foibles and mores of elite WASPs — the “urban haute bourgeoisie.” His latest film, Damsels in Distress, is set at the fictional Seven Oaks College, and explores the efforts of Greta Gerwig as Violet Wister, Analeigh Tipton as Lily, and Megalyn Echikunwoke as the posh, London-accented Rose to reform the slovenly boys of the school’s frat house. Along the way, they team up to create the Sambola, the dance craze of 2012.
In this ten-minute interview, Stillman discusses:
- Why it’s been 14 years since his previous film, The Last Days of Disco.
- How the independent film market has changed since the 1990s.
- How Damsels references both Metropolitan and Last Days of Disco.
- When we can expect to see 1994’s Barcelona on Blu-ray and/or in the Criterion Collection.
- When we can expect Stillman’s next film.
And much more. Click below to listen to our interview:
If your browser or Internet connection balks at the Flash player above and/or at downloading the audio, click on the player below, or click here to be taken to YouTube for an audio-only clip. Among those versions, you should find a format that plays on your system.
For the rest of podcasts at the PJM Lifestyle blog, start here and keep scrolling.
Judy Gelman and Peter Zheutlin are the co-authors of The Unofficial Mad Men Cookbook: Inside the Kitchens, Bars, and Restaurants of Mad Men. Knowing how much I love the show (at least in its early seasons — get back to me when this season is over…), my wife gave me a copy of the book for Christmas, and I was surprised at how thorough and accurate the authors’ research was, both of Mad Men itself and of early-’60s drinking and dining in general. If you’re planning a Mad Men-themed party, or simply want to make the same kind of Old Fashioned that Don drinks, Roger’s favorite Oysters Rockefeller recipe, or heck, Pat Nixon’s Date Nut Bread, this is your book.
Among the topics we discussed are:
- While aesthetics in general may have arguably gone downhill since the swank suits and skinny ties of the early 1960s, food has actually gotten much more varied. How basic were the dining and drinking choices in Don Draper’s day? The answer may surprise you.
- How did the authors compile a list of all of the food and drink shown on the show?
- How cooperative in working with the authors were the restaurants that are mentioned in the show and still around?
- Have they met any of the cast members since writing the book?
The interview runs 13 minutes; click here to listen:
If your browser is unhappy with our MP3 player, an audio-only YouTube version is also available:
For our previous podcasts at the Lifestyle blog, click here and keep scrolling.
Just over the wire from MSNBC:
Levon Helm, singer and drummer for the Band, died on April 19 in New York of throat cancer. He was 71.
“He passed away peacefully at 1:30 this afternoon surrounded by his friends and bandmates,” Helm’s longtime guitarist Larry Campbell tells Rolling Stone. “All his friends were there, and it seemed like Levon was waiting for them. Ten minutes after they left we sat there and he just faded away. He did it with dignity. It was even two days ago they thought it would happen within hours, but he held on. It seems like he was Levon up to the end, doing it the way he wanted to do it. He loved us, we loved him.”
In addition to his musicianship with The Band, Helm was also an accomplished actor, with supporting roles as Sissy Spacek’s father in Coal Miner’s Daughter and as the wingman to Sam Shepard’s Chuck Yeager in The Right Stuff, a stick of Beemans (the official gum of test pilots) always at the ready.
Incidentally, as someone who wasn’t a fan of The Band and its mythology in the 1970s, what’s the deal with “The Night They Drove Old Dixie Down”? Its pro-Confederate lyrics are the very definition of politically incorrect. Is it granted a pass by the left because of The Band’s association with Dylan? Does it help that it’s describing the end of the Confederacy? Or are fans simply listening to the melody and the dynamics of the song and not paying attention to the lyrics? (When I saw the Funk Brothers, the Motown house band, play at a Northern California winery five or six years ago, I got a chuckle out of a couple of thousand Bobos in Paradise, few of whom are likely NRA members, shouting every word of Junior Walker’s “Shotgun;” presumably Robbie Robertson’s song gets a pass as well.)
From all accounts, Jim Marshall was a truly nice man, a gentleman in the British sense of the word, who happened to design the amp that caused all our ears to bleed. (Even before they went to 11.) The first guitar amplifier I ever owned was a small Marshall, and back in 2003, Vintage Guitar magazine asked me to write a two-part profile on the history of the Marshall amp for that august musical equipment manufacturer’s 40th anniversary; here are a couple of excerpts from those articles:
In July of 1960, Jim Marshall, having developed his reputation as a regularly gigging drummer, and drum teacher, opened a musical equipment store at 76 Uxbridge Road in the Hanwell section of West London, which would come to be frequented by some of England’s top guitarists. Most of them felt at the time, Marshall says, that the Fender Bassman was the amplifier to beat—but it wasn’t perfect. “Players like Pete Townshend, Ritchie Blackmore and ‘Big’ Jim Sullivan (a hugely talented player who was one of the most respected and busiest session guitarists in England during the ‘60s and ‘70s) pointed out to me, that although they used the Fender, it didn’t produce the actual sound they wanted. So, they described the sound they were looking for to me and that’s how the JTM 45 came to be.”
That the sound of the Marshall amp would come out of the Bassman isn’t all that surprising, as it’s not too difficult to compare Jim Marshall to Leo Fender. Neither man was a guitarist, but each made his career as an entrepreneur who was willing to listen very, very carefully to his guitar-playing customers and give them what they wanted.
Marshall says, “I liked the sound of the Fender, in fact it was my favorite guitar amplifier at that time without any doubt, but it wasn’t the sound the boys described to me…it wasn’t the sound I heard in my head.”
Getting the sound Marshall heard in his head required a considerable amount of experimentation. “My repairman, Ken Bran, had a young assistant named Dudley Craven and he was the chap who managed to put what I was hearing in my head into an amplifier,” Marshall says. “Dudley was a brilliant engineer who used to work as an apprentice for EMI and I more than doubled his wages so he’d help us build our first rock and roll amplifier. Dudley made five amps for me, one after the other, and I turned them all down because they didn’t have the sound I was after. Then he made number six and that was the one that did it—that’s the one that had the sound I had in my mind that the players had put to me. The players must’ve agreed too because when we put ‘number six’ in the store in September 1962, we sold 23 that very first day!”
“Number six” was a 35-watt head whose circuitry closely resembled the Fender Bassman’s. The difference in sound was “the harmonics of the valves—or the tubes as you call them in America—when they’re driven in a certain, special way…along with certain things we do within our amplifiers that we do not discuss!” For those who wish to compare the differences between the first Marshall amp and the Fender Bassman, Michael Doyle’s invaluable book, The History of Marshall, compares the circuitry of each amp design in depth.
* * * * *
Nick Bowcott, who is now Marshall’s product manager for their American distributor, Korg USA, Inc., may be prejudiced, but what he told me recently is something that most players can relate to. “This might sound somewhat strange, but when I was in my teens, and my band first started venturing out of our hometown, I truly didn’t feel like I was someone who could be taken seriously, until I got my first Marshall. It was almost like a status symbol, it was like I was saying to the audience, ‘OK, I’ve arrived, I’m serious.’ In my mind, to this very day, there’s nothing like seeing a band you’ve never seen before, and the first thing that hits you is a wall of Marshalls. That’s always been synonymous with the sort of music I like, and great tone. It spoke volumes without a single note being played because it’s such a powerful visual statement.”
And it speaks with even more volume once it’s switched on, as the clip below — with Eric Clapton playing a late-1950s-era Les Paul plugged into an early Marshall amp — demonstrates. RIP, Jim Marshall:
Otto von Bismarck, the father of the welfare state, is often credited — apparently erroneously — with saying that “Laws are like sausages — it is best not to see them being made.” Often, that’s also the case with books about show business: the quality of the finished product is frequently inversely proportional to the decency of the artists who produced it.
For as Woody Allen — of all people — once told his biographer, about five minutes before he became synonymous with the name Soon-Yi:
“Talent is absolutely luck,” he said one day while talking about his early fear of performing. “And no question that the most important thing in the world is courage. People worship talent and it’s so ridiculous. Talent is something you’re born with, like Kareem [Abdul-Jabbar] is born tall. That’s why so many talented people are shitheels.”
And there were plenty of artists who inhabited the original edition of Saturday Night Live who fit both halves of that equation, combining overlapping degrees of talent and schmuckiness. Which is why the book Saturday Night: A Backstage History of Saturday Night Live by Doug Hill and Jeff Weingrad, first published in 1986 and recently made available on the Kindle (and selling for under six bucks as of the time of this article), is sometimes reminiscent of Woody’s and Otto’s warnings. In a way, Hill and Weingrad’s book works on a similar level as movies like The Godfather, Scarface, or Goodfellas. In modern-era gangster movies, as long as the cameras keep the audience within the point of view of the mobsters, they seem sleek and cool. It’s only when you consider the damage done to the innocent people just off-screen that you begin to appreciate the level of brutality the mob inflicts.
Knowing what we now know of the culture wars that began in the mid-sixties, there’s a sense of that in A Backstage History of Saturday Night Live, though it’s sometimes only tacitly referenced in this otherwise extremely well-researched book. It’s an excellent read — as the Associated Press noted in a blurb on the book’s original edition, “It reads like a thriller and may be the best book ever written about television” — and based on the quality of the writing here and the research and interviews that went into it, that’s not exactly hyperbole.
Don’t Trust Any Boom Operator Over 30
To understand how SNL changed television, it helps to understand the era before its debut. Hill and Weingrad explore that extensively from the point of view of late-sixties and early-seventies underground comedy. But as far as the TV industry itself, the best source is likely Ben Shapiro’s 2011 book Primetime Propaganda, which has a lengthy section that charts the history of the growing leftward tilt of the television industry in the 1960s and early 1970s.
It’s safe to say that by the mid-1970s, there probably weren’t a whole lot of Republicans left at NBC, and certainly not in the more prominent roles at the network. We know that much of the on-air talent on its various shows, such as Johnny Carson, James Garner, and news readers such as Tom Brokaw and Bryant Gumbel, were liberals to one degree or another. The union crew members who built the sets, manned the cameras and aimed the lighting rigs were likely majority Democrat as well. But they were of the old-school middlebrow left, where a classy and polished product was still the goal. And as Hill and Weingrad demonstrate, there was a hard culture clash between the old school liberals who worked at NBC in the mid-1970s, and the young radicals who made up the production staff and on-air talent at Saturday Night Live.