
Ed Driscoll

The Electronic Cottage

The other day, I linked to John Derbyshire’s post at the Corner on Britain’s state of “anarcho-tyranny:”

The late Sam Francis gave us the invaluable term “anarcho-tyranny” to describe that state of society in which “we refuse to control real criminals (that’s the anarchy) so we control the innocent (that’s the tyranny).”

Britain is far gone in anarcho-tyranny. Among my Christmas mail were a card and letter from a relative we barely communicate with the rest of the year. To make up for her side of the delinquency, she sends us a nice chatty summary of all that’s happened to her large and bustling family in the previous twelve months.

As I noted, Derbyshire concluded, “Anarcho-tyranny: coming soon to a jurisdiction near you” — which we like to call around these parts, “California.” Victor Davis Hanson has extensively discussed anarcho-tyranny in the once-Golden State, but at NR today, in a post titled “I Am Meredith Graves,” Kevin Williamson explores how it works in the Big Blue Redoubt on the other coast:

I fully expect that Meredith Graves will do time for having had the bad sense to attempt to exercise certain God-given and unalienable rights at a place in which they were famously attacked. While I have enjoyed the back-and-forth between Robert VerBruggen and the others on the legal and constitutional questions of interstate concealed-carry protocols, I enjoy them the way I enjoy watching a tennis match: The skill involved in the volleys is impressive, but it is only a game. Our Second Amendment jurisprudence, like our First Amendment jurisprudence, seems to me to be simply an unprincipled political fight. We should heed the wisdom of Roy Cohn: “Don’t tell me what the law is, tell me who the judge is.”

Rather than weigh in on the legal questions, let me offer another view: I am Meredith Graves. At least, I have found myself in a very similar situation, but had a very different outcome.

Read the whole thing.

A 21st Century Recessional

January 6th, 2011 - 10:57 am

Back in late 2005, Mark Steyn’s best-selling America Alone was preceded by his magnum opus of an article, “It’s the Demography, Stupid,” in the New Criterion (quickly republished in the Wall Street Journal because, if I’m remembering correctly, the New Criterion’s server was blown offline by the perfect storm of an Insta-Drudge-NRO-Freeper-lanche).

This month in the New Criterion, Mark explores “Dependence Day—On the erosion of personal liberty”:

After Big Government, after global retreat, after the loss of liberty, there is only remorseless civic disintegration. The statistics speak for themselves. The number of indictable offences per thousand people was 2.4 in 1900, climbed gradually to 9.7 in 1954, and then rocketed to 109.4 by 1992. And that official increase understates the reality: Many crimes have been decriminalized (shoplifting, for example), and most crime goes unreported, and most reported crime goes uninvestigated, and most investigated crime goes unsolved, and almost all solved crime merits derisory punishment. Yet the law-breaking is merely a symptom of a larger rupture. At a gathering like this one, John O’Sullivan, recalling his own hometown, said that when his grandmother ran a pub in the Liverpool docklands in the years around the First World War, there was only one occasion when someone swore in her presence. And he subsequently apologized.

“The past is a foreign country: they do things differently there.” But viewed from 2010 England the day before yesterday is an alternative universe—or a lost civilization. Last year, the “Secretary of State for Children” (both an Orwellian and Huxleyite office) announced that 20,000 “problem families” would be put under twenty-four-hour CCTV supervision in their homes. As the Daily Express reported, “They will be monitored to ensure that children attend school, go to bed on time and eat proper meals.” Orwell’s government “telescreen” in every home is close to being a reality, although even he would have dismissed as too obviously absurd a nanny state that literally polices your bedtime.

For its worshippers, Big Government becomes a kind of religion: the state as church. After the London Tube bombings, Gordon Brown began mulling over the creation of what he called a “British equivalent of the U.S. Fourth of July,” a new national holiday to bolster British identity. The Labour Party think-tank, the Fabian Society, proposed that the new “British Day” should be July 5th, the day the National Health Service was created. Because the essence of contemporary British identity is waiting two years for a hip operation. A national holiday every July 5th: They can call it Dependence Day.

* * *

In our time, to be born a citizen of the United States is to win first prize in the lottery of life, and, as Britons did, too many Americans assume it will always be so. Do you think the laws of God will be suspended in favor of America because you were born in it? Great convulsions lie ahead, and at the end of it we may be in a post-Anglosphere world.

Do I even need to say, read the whole thing?

Roberts invented the Altair 8800, pictured above, which was the PC I cut my teeth on, when a math teacher at St. Mary’s purchased one, hooked it up to a used teletypewriter as its I/O device and started my school’s first computer club in the mid-1970s. Oh, the epic battles of Wumpus, Star Trek, and Lunar Lander we fought, my friends.

AP reports:

Dr. Henry Edward Roberts, a developer of an early personal computer that inspired Bill Gates to found Microsoft, died Thursday in Georgia. He was 68.

Roberts, whose build-it-yourself kit concentrated thousands of dollars worth of computer capability in an affordable package, inspired Bill Gates and his childhood friend Paul Allen to come up with Microsoft in 1975 after they saw an article about the MITS Altair 8800 in Popular Electronics.

Roberts, an ex-military man, later went on to careers as a farmer and a physician, but continued to keep up with computer advances: He recently told Gates he hoped to work with new, nanotechnology-enhanced machines, according to son David Roberts.

“He did think it was pretty neat, some of the stuff they’re doing with the processors,” said David Roberts, who confirmed Gates rushed to Georgia Friday to be with his mentor.

Roberts died in a Macon hospital after a long bout with pneumonia, according to his family.

“Ed was willing to take a chance on us — two young guys interested in computers long before they were commonplace — and we have always been grateful to him,” Gates and Allen said in a joint statement released Thursday. “The day our first untested software worked on his Altair was the start of a lot of great things. We will always have many fond memories of working with Ed.”

The man often credited with kickstarting the modern computer era never intended to lead a revolution.

Born in Miami in 1941, Roberts spent time in the U.S. Air Force and earned an electrical engineering degree from Oklahoma State University in 1968, according to information provided by his family.

He later parlayed his interest in technology into a business making calculators; when large firms like Texas Instruments began cornering the business, Roberts soon found himself in debt, David Roberts said.

Meanwhile, he was gaining an interest in computers — at the time, hulking machines available almost exclusively at universities.

“He came up with the idea that you could have one of these computers on your own,” said David Roberts, adding his father expected to sell a few units. “Basically, he did it to try to get out of debt.”

Roberts himself would later describe the effort as an “almost megalomaniac kind of scheme” that he pursued out of youthful ambition.

“But at that time you know we just lacked the, eh, the benefits of age and experience,” Roberts said on a program called “Triumph of the Nerds” that aired on PBS in 1996. “We didn’t know we couldn’t do it.”

As Al Ries and Jack Trout noted in one of their marketing books, a big reason why Roberts isn’t as well remembered today as he should be is that, while he got there before them, Woz and Jobs did a far better job of branding. Unlike Altair (the story goes it was taken from an episode of Star Trek that Roberts’ daughter was watching one week), the name Apple is simplicity itself. More importantly, the sleek white plastic case of the Apple II also promised ease of use, in sharp contrast to the scary and confusing looking switches and LEDs on the front of the Altair.

Today, the Altair is mostly found in computer museums, including the Bay Area’s Computer Museum History Center, which I described back in 2001 for National Review Online.

In his brilliant and frequently-updated non-fiction book, Profiles of the Future, Arthur C. Clarke famously quotes a remark attributed to William Henry Preece, the chief engineer of the British Post Office, when he was told in 1877 that the Americans had stumbled across a then-bleeding edge communications technology:

“The Americans have need of the Telephone — but we do not. We have plenty of messenger boys.”

In contrast, Clarke had infinitely more foresight. In 1967, he gave a speech in which he said:

Newspapers will, I think, receive their final body blow from these new communications techniques. I take a dim view of staggering home every Sunday with five pounds of wood pulp on my arm, when what I really want is information, not wastepaper. How I look forward to the day when I can press a button and get any type of news, editorials, book and theater reviews, etc., merely by dialing the right channel.

Electronic “mail” delivery is another exciting prospect of the very near future. Letters, typed or written on special forms like wartime V-mail, will be automatically read and flashed from continent to continent and reproduced at receiving stations within a few minutes of transmission.

And as the first of Clarke’s Three Laws goes, “When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.”

So with all of that background, how did Newsweek do in 1995, just as the World Wide Web, the still-new graphical interface running atop an Internet that had been around since 1969, was taking off?

Just came across this article from Newsweek in 1995. It lists all the reasons the internet will fail. My two favorite parts:

The truth is no online database will replace your daily newspaper, no CD-ROM can take the place of a competent teacher and no computer network will change the way government works.

* * *

Yet Nicholas Negroponte, director of the MIT Media Lab, predicts that we’ll soon buy books and newspapers straight over the Internet. Uh, sure.

If Newsweek is as good at maintaining the journalism industry as they are at fortune telling, they should be around for a long time.

Heh. Although, maybe not at this burn rate.

In the meantime though, Newsweek is determined not to get caught flat-footed once again. Via Tim Blair, a photograph smuggled out of Newsweek HQ just moments ago shows a highly-paid, smartly dressed consultant explaining the latest in technologies to the magazine’s crack staff, who stare wide-eyed at the wonders of the future to come:

The Debt Star Has Cleared The Planet

July 30th, 2009 - 4:57 pm

Betsy Newmark writes that flexibility and Obama are mutually exclusive terms — he definitely stays on target no matter what damage his plans may cause; arguably because collateral damage to the economy is a feature, not a bug, in his mind (reference his “bankrupting the coal industry” rhetoric):

Nina Easton makes the point that Obama was masterful during the campaign at staying on message. He didn’t change that message no matter what.

Barack Obama promised universal health care and a mass conversion to green energy when he launched his presidential campaign. On that frigid February day in 2007, the economy was growing at a 2.8% clip. Obama stuck to the same promises a year later when he won Iowa, as the housing market was slumping into recession. And energy and health care were the twin pillars of his acceptance speech in Denver, 18 days before Lehman Brothers collapsed.

But now the question is whether his audacious plans, which might have been less questionable when the economy was doing well, would actually sink any hope of recovery. And the answer seems clear that piling up massive future debts as well as putting new burdens on business are lousy moves to make during a deep economic decline.

As one of the most disciplined, on-message politicians of our time, President Obama hasn’t wavered from his audacious plans to remake entire business sectors. But when wavering is what the U.S. economy seems to do best these days, the President confronts a new question: Does his own agenda threaten to choke off the economic recovery that he also promises — and that will define much of his legacy? Both of his legislative campaigns for the fall, health-care reform and the cap-and-trade plan to curb carbon emissions, could put new burdens on a weak economy.

Well, duh! But what’s a bit of common sense about how to avoid hurting the economy when the President has to stay on message and rack up big legislative scalps to hang on his belt?

In contrast, his subordinates are not quite as cool and calm under the harsh pressure of reality:

“They are the villains in this,” Pelosi said of private insurers. “They have been part of the problem in a major way. They are doing everything in their power to stop a public option from happening. And the public has to know that. They can disguise their arguments any way they want, but the fact is that they don’t want the competition.”…

“It’s almost immoral what they are doing,” added Pelosi, who stood outside her office long after her press conference ended to continue speaking to reporters, even as aides tried in vain to usher her inside. “Of course they’ve been immoral all along in how they have treated the people that they insure with pre-existing conditions, you know, the litany of it all.”

Emphasis above via Allahpundit, who adds:

Take one of the most unpopular politicians in America, have her go off half-cocked in a crude attempt to satisfy Democratic demands for a villain to demagogue in selling ObamaCare, then wait for the backlash. Bonus points for using the same Orwellian rhetorical device Paul Ryan called Katrina Vanden Heuvel on last night, namely, exploiting the language of competition to push one of the most anti-competitive domestic measures in American history.

Not to mention the notion that this is coming from someone who no doubt sees the idea of “villains” and “morals” as dated rhetoric in the postmodern world, where one man’s terrorist is another man’s freedom fighter, and the like.

Update: Michelle Malkin adds, “opposition to Hope and Change™ is starting to take its toll. Nancy Pelosi is starting to act like somebody spiked her Botox with traces of arsenic, and the descent into paranoid madness is somewhat reminiscent of Captain Queeg on the witness stand in The Caine Mutiny.”

Then: “The Strawberries.” Now: “Palomino!”

Getting Webbie With It

June 23rd, 2009 - 3:14 pm



The weekend prior to my recent trip to Alaska, I had planned to purchase a Flip Mino HD or Creative Labs Vado HD after reading the review of the two tiny video cameras by Skye at her Midnight Blue Weblog. However, since Best Buy was out of both cameras but had a Sony MHS-PM1 “Webbie” in stock, I figured, what the heck.

About the size of a pack of cigarettes (to borrow a common measuring term now apparently verboten), the Webbie is, for the most part, certainly intuitive enough; rotating its tiny lens up from its protective cover turns the camera on, and the buttons below the monitor screen marked PHOTO and MOVIE do exactly what they say.

But there are several aspects of the camera that are less than intuitive. Clicking the movie button once lights up a small re-creation of a typical video camera’s tally light, to let you know the camera is recording. But clicking it again generates a note that says “RECORDING.” That’s the camera’s way of letting you know it’s writing the just-captured footage to the unit’s Memory Stick card, but it takes a couple of tries to figure out just when the unit is actually, you know, recording. (Also, you’ll need to purchase the Memory Stick card separately, which bumps the total price of the unit up slightly, as the Webbie’s onboard 12MB is pretty useless except for recording a handful of still shots.)

Right out of the box, the Webbie’s default mode is 720p, which is perfect for uploading videos to YouTube’s recently adopted widescreen format. The above video, documenting my train ride from Anchorage (where my plane got in) to Seward (where we picked up our cruise ship), was shot in 720p, mainly because I wasn’t sure how to switch the camera into 1080p without first flipping through the instruction manual. The button on the right-hand side of the camera marked MENU brings up some commands, but the button to its right, which also doubles as the button for deleting unwanted shots, is what changes video modes. (VGA is also available as an option, for those who prefer standard def.)

As you can see from the above video, the picture quality is pretty darn good for such a tiny camera. But perhaps the most frustrating aspect of the Webbie is the lack of a smooth zoom control. Obviously, because of its tiny lens, the unit uses electronics to generate its zoomed images rather than adjusting the actual lens itself, a money- and space-saving feature common on lots of low-end consumer camcorders.
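
(For the curious, that sort of electronic zoom is simple enough to sketch: the camera essentially crops the center of the frame and scales it back up, which is why the picture gets softer the further you push it. Here’s a minimal illustration using Python’s Pillow imaging library; the file names are hypothetical, and this is just the general idea, not Sony’s actual processing.)

```python
from PIL import Image

def digital_zoom(frame: Image.Image, factor: float) -> Image.Image:
    """Approximate a camcorder-style digital zoom: center-crop the frame
    by `factor`, then upscale back to the original size. No new detail
    is created, which is why heavily zoomed footage looks soft."""
    w, h = frame.size
    crop_w, crop_h = int(w / factor), int(h / factor)
    left, top = (w - crop_w) // 2, (h - crop_h) // 2
    cropped = frame.crop((left, top, left + crop_w, top + crop_h))
    return cropped.resize((w, h), Image.BICUBIC)

# Hypothetical example: a 2x "zoom" on a single captured frame.
frame = Image.open("train_to_seward.jpg")
digital_zoom(frame, 2.0).save("train_to_seward_2x.jpg")
```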

But most camcorders have a fluid zoom effect. In contrast, the Webbie ratchets between positions in its zoom; you’ll want to compose your shots first and then hit record, or be prepared to discard the material shot while the camera zooms. And fluidly focusing on an object before zooming back (see Kubrick’s Barry Lyndon, where this technique is repeatedly used as a leitmotif) is impossible.

While the unit does have a tripod mount, it lacks a separate microphone input, so you’ll be relying on the Webbie’s built-in mic, which may be fine for some occasions, and very frustrating for others.

But then, that’s the Webbie in a nutshell, isn’t it? Filling the gap between a cellphone camera and a decent consumer camcorder, the Webbie is great for quick and dirty video blogging, as a handy second camera for shooting B-roll footage, and certainly for home movies. But with a few additions and modifications, it could have been a much more useful little tool.

More on the Webbie from CNET, which also includes a video of the unit in action:




Update: The Blogfather links to more reviews of tiny camcorders.

Happy Birthday, YouTube

April 24th, 2009 - 12:42 pm

Found via Greg Pollowitz: the Penn Station of video hosts turns four years old today. Here’s my early “Army of Davids”-style look at the benefits of the site at Tech Central Station from February of 2006. As for its downside, here’s my firsthand report on the drawbacks of relying upon the site as your sole video host.

And from Videomaker magazine in late 2007, here’s my look at some of the other video hosts out there, complete with quotes from my interview with Liz Stephans and Scott Baker of Breitbart.tv.

(For some uber-wonky video talk, my latest Videomaker article compares and contrasts CMOS and CCD sensors in video cameras, with a cameo appearance by Hahn Choi, one of the many hard-working behind-the-scenes people at PJTV.)

All The News That Fits In Your Pocket

April 1st, 2009 - 11:39 pm

Sensing the future is just around the corner, the New York Times experiments diligently with a smaller, more portable, pocket-sized edition of its newspaper. But I’m not sure if they have the technology quite worked out just yet….

(And yes, it’s still April 1st on the West Coast for another 20 minutes as I write this.)

Only eight? Slackers.

Update: “The Dangers of Living Online.” Meanwhile, “OMG, OnStar May Soon Let You Twitter From Your Car!”

(That last item found here, naturally enough. Though some in the comments are declaring it an early April Fools’ prank. If so, sorry for the unintended mellow enharshening.)

Hey, it’s survived me being on there, so I’m guessing it’s somewhat bulletproof. But still, Nova Spivack has some interesting questions about Twitter’s future. In my “All You Need Is Tweet” video, I compared Twitter to a combination police scanner and Internet chatroom; Spivack has an equally viable analogy:

Twitter reminds me of CB radio — and that is a double-edged blessing. In Twitter the “radio frequencies” are people and hashtags. If you post to your Twitter account, or do an @reply to someone else, you are broadcasting to all the followers of that account. Similarly, if you tweet something and add hashtags to it, you are broadcasting that to everyone who follows those hashtags.

This reminds me of something I found out about in New York City a few years back. If you have ever been in a taxi in NYC you may have noticed that your driver was chatting on the radio with other drivers — not the taxi dispatch radio, but a second radio that many of them have in their cabs. It turns out the taxi drivers were tuned into a short range radio frequency for chatting with each other — essentially a pirate CB radio channel.

This channel was full of taxi driver banter in various languages and seemed to be quite active. But there was a problem. Every five minutes or so, the normal taxi chatter would be punctuated by someone shouting insults at all the taxi drivers.

When I asked my driver about this he said, “Yes, that is very annoying. Some guy has a high powered radio somewhere in Manhattan and he sits there all day on this channel and just shouts insults at us.” This is the problem that Twitter may soon face. Open channels are great because they are open. They also can become awful, because they are open.

Which also dovetails nicely with CompuServe’s groundbreaking CB chat application, which, in the early 1980s, along with regional BBSs, was one of my very first online experiences. (Yes, I had to walk a mile home from school barefoot in the snow to get to my computer, and we needed tin cans, strings, and stone knives and bearskins–or at least TRS-80s–to connect. In those primitive days, 1,000,000 years B.C. (Before Cable-modem), life online was a constant struggle to survive–and pay the connection fees.)

I don’t think Twitter is as susceptible as CompuServe’s CB was to a low signal-to-noise ratio, simply because it’s possible to filter much of the spam and noise out of a conversation. But Spivack has some additional suggestions that might help to ameliorate the impact of a sudden rush of new tweeters.
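
(To give a rough idea of what that kind of filtering looks like under the hood, here’s a minimal sketch in Python. The blocklist phrases and sample messages are hypothetical, and this is not how Twitter itself works; it’s just an illustration of simple keyword-based noise filtering on a message stream.)

```python
# A minimal sketch of keyword-based noise filtering on a message stream.
# The blocklist phrases and sample tweets below are hypothetical.
BLOCKLIST = {"buy now", "free ipod", "click here"}

def is_noise(tweet: str) -> bool:
    """Flag a message as noise if it contains any blocklisted phrase."""
    text = tweet.lower()
    return any(phrase in text for phrase in BLOCKLIST)

def filter_stream(tweets):
    """Yield only the messages that pass the noise filter."""
    for tweet in tweets:
        if not is_noise(tweet):
            yield tweet

sample = [
    "Riding the rails from Anchorage to Seward",
    "FREE IPOD!!! click here now",
    "Anyone else remember CompuServe's CB Simulator?",
]
print(list(filter_stream(sample)))  # drops the spammy middle message
```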



After all of the recent Silicon Graffiti videos on politics and media bias, I wanted to do something in a lighter vein, so here’s (hopefully) a fun overview of Twitter. No doubt, hard-core power Tweeters (yes, it’s supposed to sound silly) will chide me for leaving out whatever this week’s killer app of the century is, but I’ve tried to make something enjoyable for both newcomers and veteran users of Twitter.

From a Twittering Barack Obama to Hugh Hewitt and all points in between, we go deep inside your computer and try to make sense of Twitter.

Featuring:

Click here to follow us on Twitter, and here to check out our 26 or so previous editions of Silicon Graffiti.

Your Cat Wants Steak

January 21st, 2009 - 1:39 pm

I’m sure somebody on Fark has already said it about this Japanese device, which, PC World notes, “Aims to Translate Cat Talk.”

Via the Professor, more pet gadgetry here.

Dialing For Sushi

October 15th, 2007 - 4:53 pm

Two quick technology updates:

Found via Steve Green: I hadn’t planned to buy an Apple iPhone, but I’m starting to change my mind…

And while I often have sushi while sitting in front of my PC’s twin LCD monitors, apparently the in-thing amongst the really hip members of the digerati is preparing the sushi right on them. That sounds good to me, but aren’t they worried that the wasabi will melt the plastic?

The Future Of Videogames

September 26th, 2007 - 9:20 pm

Allahpundit explores the boffo box office–which a different kind of PC industry, politically correct Hollywood, would kill for–of Microsoft’s Halo 3, which ties in with an apt comment Glenn Reynolds made a while back:

It occurs to me that the media sectors that are doing badly — movies, music, newspapers, TV women’s shows — seem to be the most highly politicized, while the sectors that are doing well, like games, aren’t. I’d be interested to see more analysis on that subject.

Meanwhile, James Lileks has online video of the haves and have-nots of the videogame world as Halo 3’s launch approached.

Ahh, but what sort of space would qualify as the perfect rec room in which to play such an awesomely awesome game? There can be only one choice:

This.

News From 1980

August 27th, 2007 - 7:17 pm

ABC reports, “The Future of the Workplace: No Office, Headquarters in Cyberspace–Some Companies Don’t Care Where Workers Are as Long as They Get the Job Done”.

Geez, Toffler wrote about telecommuting in The Third Wave in 1980. Numerous businesses (not the least of which is Pajamas) rely heavily on it. Wall Street firms used telecommuting to stay afloat immediately after 9/11. Why such a breathless headline from ABC?

Strange Tribal Rituals Observed

June 29th, 2007 - 10:16 pm

10,000 geeks will look at this video clip and think: “Man, I’m glad we Windows / Star Wars / Star Trek / furgasm fans aren’t as crazed as these guys”:


(Triumph could have had a field day in this line, incidentally.)

Huh. Off the top of my head, I can’t think of anyone in the Blogosphere who would enjoy this.

The Laptop From 2015

June 15th, 2007 - 10:44 pm

SciFi.com gives us a sneak preview of what the laptop of the future will look like. As to what it will have inside, see my recent CE Pro article on 64-bit computing.

Of course, this is all contingent on the UN’s forecast of the world coming to an end in 2015 not coming true, but somehow, I think we’ll muddle through…

Fill My Eyes With That Double Vision

May 27th, 2007 - 4:36 pm

From what I’ve heard, once you go dual, you never go back. I’ll let you know–I’m experimenting with dual 19-inch LCD monitors. Surprisingly, it was a PITA to install, because my PC’s ATI video card, which is designed to simultaneously pump out both VGA and DVI video–and hence drive two monitors–apparently had a defective DVI output. But now that I’ve replaced the card and have both monitors working, it seems like it should improve workflow with recording programs such as Cakewalk Sonar, and video programs like Adobe Premiere Pro. Not to mention experimenting with rotating a monitor 90 degrees for Word documents.

Besides, it looks bitchin’ cool to boot. Maybe I’ll add a third!

During the late 1990s, as the new millennium approached and before the Blogosphere arrived, I was largely toiling away for various home automation magazines (something I still do quite often, actually), where I wrote my share of “Welcome To The Home Of The Future!” articles. Here’s one that featured quotes from my interview with Star Trek veteran David Gerrold, and it’s a representative (though heavily edited, as I recall) sample of the genre.

But my sci-fi forecasting had nothing on the Minneapolis Strib’s apocalyptic vision of the future domus. Roger L. Simon writes that many of us are having the same reaction to Al Gore’s low-budget PowerPoint presentation agitpropumentary Academy Award-winning blockbuster film:

After viewing the movie I was less troubled with the global warming issue and more troubled by Gore’s narcissism – not exactly the result intended. In fact, the reverse. And evidently, from the poll results, I am not alone.

Oh yeah? Well, heed the Goracle now maaaan, or pay up in the future!
