Technical Hooey from the White House

There have been a couple of real plum stories at the intersection of political reporting and technology in the last week: first, the Washington Post story on the incoming Obama administration’s shock and horror at the computer environment they found at the White House (“Staff Finds White House in the Technological Dark Ages”), then, closely following, rather more technical stories on how the new whitehouse.gov website was much more “open” than the Bush administration’s version.

Sadly, in both cases, the stories demonstrate a lot more knowledge of politics and attention to the “nasty, stupid Bush administration” theme than they show technical knowledge.

Let’s look at them in order. In the first story, Washington Post reporter Anne Kornblut describes the “technological Dark Ages” at the White House. It makes for some great sound bites, like “it is kind of like going from an Xbox to an Atari.” But when we look at the actual complaints, what do we find? That the Obama staff prefers Macintosh computers, while the White House uses Windows, and worse, “six-year-old versions” of Windows.

In other words, the White House staff was using — horrors! — Windows XP.

What’s more, they discovered it was hard to reach their Facebook pages and their external email accounts.

My guess is that it’s inauguration amnesia that causes reporter Kornblut to forget the scandal around Governor Palin’s use of an external Yahoo email account, or the extended complaints about the White House using RNC email accounts for political correspondence. Or the fact that the Clinton administration several times found itself in at least technical violation of the Hatch Act for using White House facilities for political purposes. And I’m sure that a professional staff writer at the Washington Post would never consciously write a story intending to slant it against the outgoing administration, but it’s clear that she wasn’t sufficiently sophisticated to understand the reality behind the spin she was being fed.

The second, more technical story this week appears to have started with a post at Jason Kottke’s blog, along with others at TheNextWeb.com, at Codeulate.com, and by Cory Doctorow at BoingBoing. The Kottke post was helped along by a tweet from Tim O’Reilly, which was then retweeted many times.

The gist of the story is the notion that the Obama administration had made the whitehouse.gov site much more “open,” because the robots.txt file was dramatically shorter. TheNextWeb.com said it was “hopefully a sign of greater transparency”; Tim O’Reilly, who really should know better, put it as “Transparent gov FTW” (in English, “transparent government ‘for the win’”); from the Twitter world, “saranovotny” said “amazing geek metric of the openness of the obama [sic] administration.”

For people who don’t eat and breathe websites, the robots.txt file is a suggestion to “spidering” programs, like the ones Google uses to index the web for later searching. Many people don’t realize that web search engines are built on programs that visit potentially every webpage on the Internet and copy the contents back to the search host. (This is why Google can provide cached copies of webpages that have been deleted.) The robots.txt file is defined, by convention, to tell the spidering programs which pages should not be copied or indexed. So, at first glance, it makes sense that a shorter robots.txt means “more openness”; after all, it means fewer pages are being blocked.
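For readers who want to see the mechanics, here is a minimal sketch of how a well-behaved crawler consults robots.txt before copying a page, using Python’s standard urllib.robotparser module. The file contents and paths below are purely illustrative assumptions, not the actual whitehouse.gov rules.

```python
import urllib.robotparser

# A hypothetical robots.txt of the kind a webmaster might publish.
# Each "Disallow" line asks well-behaved crawlers to skip matching paths.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /includes/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A polite spider checks each URL before fetching it for indexing.
for path in ("/blog/some-post", "/cgi-bin/lookup"):
    verdict = "fetch" if parser.can_fetch("*", path) else "skip"
    print(f"{path}: {verdict}")
```

Note that nothing in the file is enforced; it is a convention that works only because reputable crawlers choose to honor it.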

The problem is, as Declan McCullagh pointed out at CNET, there are a lot of good reasons why a competent webmaster would block pages from search engines. In fact:

If anything, Obama’s robots.txt file is too short. It doesn’t currently block search pages, meaning they’ll show up on search engines — something that most site operators don’t want and which runs afoul of Google’s webmaster guidelines. Those guidelines say: “Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.”

In other words, the new “openness” of the White House website was actually poor search engine optimization compared to the Bush administration’s site.
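Concretely, the rule Google’s guideline describes is only a line or two in robots.txt. A sketch of what such an entry might look like (the paths here are illustrative, not taken from either administration’s actual file):

```
User-agent: *
Disallow: /search/
Disallow: /query/
```

Leaving those auto-generated search pages crawlable doesn’t make a site any more transparent; it just clutters search results.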

Now, I’m the first person to recognize that no one can be an expert at everything, and expecting even a seasoned staff writer at the Washington Post to notice that what’s really being described to her is another round of the Windows-versus-Macintosh religious war might be a bit much. But, honestly, would it be too much for her to ask some questions before running with a quote like “technological Dark Ages”? Since it’s a geeky subject, sites like TheNextWeb and technical authorities like Cory Doctorow and Tim O’Reilly are a little more culpable, though in fairness their posts were published quickly. (It wouldn’t seem out of line to expect some of those websites to publish a correction.)

What it does tell us, though, is that readers who want to be well informed can’t afford to let down their guard. Clearly, the legacy media and even technical experts are perfectly capable of being led astray, and more than willing to be, as long as it fits the “dumb Bush administration” narrative.
