Just for variety, today’s science column is about something I actually have some professional qualifications to write about.
No, that really hasn’t ever slowed me down, I just wanted to note it.
In fact, my master’s thesis, “A Software Performance Engineering Environment,” was about tools that let software engineers develop performance models of a system alongside the software itself. I spent some years in IBM’s and Sun’s consulting practices, usually dealing in one way or another with web-based businesses. I had a very popular talk, “Capacity Planning on a Cocktail Napkin,” which I later wrote up as an article for SmartBear Software.
There really is only one explanation for the meltdown of the Obamacare exchanges since 1 October, and that’s utter incompetence.
Now, lemme ‘splain.
Back in the old days, at the Very Beginning Of The Web, all a web server could do was deliver a static piece of text. It was a brilliant hack by Tim Berners-Lee, who realized that he could build a little editor for a simple markup language and add one little change: a special tag that could address another page of text in the same markup language. To make it work, he needed a program that could return those files and a simple way for the editor to ask for the files it wanted. The markup language was a subset of a commonly available commercial standard, SGML, called the Hyper Text Markup Language, HTML; the server program spoke a very simple text-based protocol called the Hyper Text Transfer Protocol, HTTP; and the rest, as they say, was ….
Yes, class, that’s right, “history.”
From this simple hack the whole World Wide Web was made.
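Just to give you an idea how simple that text-based protocol really is, here’s a little Python sketch of the entire conversation a browser has with a web server to fetch one page. (The host name is just a stand-in for illustration; any plain HTTP server would answer the same way.)

    import socket

    # Open an ordinary TCP connection to a web server and speak HTTP/1.0 at it.
    # "example.com" is only a placeholder host for illustration.
    sock = socket.create_connection(("example.com", 80))
    sock.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")

    # Read back whatever the server sends: a few lines of headers, then the HTML.
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk
    sock.close()

    print(response.decode("utf-8", errors="replace"))

The request is one short line of plain text plus a couple of headers, and the reply is just more text with some HTML in it. That was the whole trick.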
Now, with the same foresight that led me to buy Borland stock over Microsoft when they both went public, I thought at the time that it was a mildly amusing notion, but I didn’t see much future in it. I was head-down in category theory working on my dissertation.
In the meantime, though, people realized that if you could serve a file, you could also write a program that would generate HTML output and send it. Pretty quickly, people started building basic web-commerce sorts of sites, where every time you wanted to, say, log in, the web server would run a special program to generate the pages. This worked, but it had some limits: basically, on any computer, it takes a fairly long time (in computer terms) to start a program, and there is a pretty strict limit to how many programs can be running at the same time. Say, just to be definite, that the limit is 100. And let’s say that each person’s interaction with the web site lasts for 5 minutes, 1/12th of an hour. That means, roughly, that your web site will be able to handle around 1,200 customers an hour maximum, or something like 28,800 customer visits a day.
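Here’s that cocktail-napkin arithmetic written out in Python, using the same illustrative assumptions as above (100 simultaneous programs, 5-minute visits); none of this is measured, it’s just the back-of-the-envelope model.

    # Cocktail-napkin capacity model for a circa-1996 CGI-style site.
    # All numbers are the illustrative assumptions from the text, not measurements.
    max_concurrent_programs = 100      # how many copies the server can run at once
    visit_length_minutes = 5           # how long one customer ties up a program

    visits_per_program_per_hour = 60 // visit_length_minutes                  # 12
    visits_per_hour = max_concurrent_programs * visits_per_program_per_hour   # 1,200
    visits_per_day = visits_per_hour * 24                                     # 28,800

    print(visits_per_hour, "an hour,", visits_per_day, "a day")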
Remember that number: 28,800 a day. Just by way of comparison, an obscure blog by a Tennessee law professor was getting about 19,000 visits a day in 2002.
This was the status of web commerce in about 1996. By 1998 it was clear this was not going to stand up, and a lot of people — including me, by this point I’d concluded that maybe this World Wide Web thing was going to catch on — were working with customers to build new web sites that could handle a much bigger load. Customers, in turn, had had their minds focused wonderfully by a succession of what we called “CNN Moments” — websites that were announced with great fanfare, only to fall over in the first few minutes because of the load, leading to CNN covering the failure in prime time news.
Companies hated CNN moments, and not a few companies went bust because of them.
We devoted a lot of time and effort to figuring out what to do, and to cut this fascinating history short (contain your disappointment), really big web sites now handle millions, or even tens or hundreds of millions, of visitors a day. A relatively small site like PJMedia.com (sorry, guys, we’re a big news site but nothing compared to sites like eBay) still serves a million pages a day or more on big days.
So now, let’s compare that to the Obamacare exchanges. California, after reporting some fairly ridiculous numbers, had a total of 645,000 unique visitors the first day. Connecticut had 26,000 visitors.
So here’s a comparison for you. California had two-thirds the traffic of PJMedia, but at least managed to mostly stay up. Connecticut had 26,000 visitors, of whom only 167 managed to actually complete a purchase.
Now, I’m writing a technical article about building web-service sites using Python, so I just tried my little internal test site, making it handle 26,000 “visits.” I’m doing this on my desktop Mac Mini; there’s nothing special about the setup. In fact, here’s the actual webserver code:
python -m SimpleHTTPServer
That web server handled 26,000 visits in 3:40.1, or roughly 220 seconds; Connecticut’s system took a whole day to choke on the same number. In other words, my desktop machine, using the dumbest possible one-line program, was able to handle nearly 400 times the load at which Connecticut’s system was falling over.
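If you want to repeat the experiment, the driver doesn’t need to be any smarter than the server. Here’s a rough sketch of one way to generate that kind of load, assuming the one-line server above is running locally on its default port, 8000. (This sketch is written for Python 3, where the server line is spelled python -m http.server; the idea is identical.)

    # Rough load-driver sketch: replay N sequential requests against the
    # one-line static server, assumed to be listening on localhost:8000.
    # Timings will obviously vary with the machine; this is illustrative only.
    import time
    import urllib.request

    N = 26000
    start = time.time()
    for _ in range(N):
        with urllib.request.urlopen("http://localhost:8000/") as resp:
            resp.read()
    elapsed = time.time() - start

    print("%d requests in %.1f seconds (%.0f per second)" % (N, elapsed, N / elapsed))

Even with no concurrency at all, one request at a time, a desktop machine keeps up comfortably.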
Now, admittedly, an O-care web site has other issues that make life much more complex. But you’re going to hear a lot of people in the next few days talking about how big the load was.
Don’t buy it. It’s not the load that was the problem.