Cold Hard Facts and the 'Big-Boned Climate' Theory

Global warming alarmists love to tell us that the debate is over, that "climate change" -- their new term for a catchall theory that allows them to stick their noses into literally everything -- is established fact. "Do not believe your lying senses!" we are commanded, because the unusually cold winter we are enjoying here in North America isn't really cold; it's, uh, just a lull! This is much like the family of the overweight child telling everyone that their son isn't fat but big-boned; everyone knows the kid is obese, but the parents refuse to admit it. Well, anthropogenic global warming is just big-boned.

Global warming stopped in 1997 and we have had no statistically significant warming since 1995, according to Richard Lindzen; meanwhile the Gang Green, the alarmists with such big-boned ideas, twist themselves into pretzels to argue otherwise. (Of course, it doesn't help to have Al Gore jetting around the world, bringing temperature drops that make Dante's Cocytus feel sultry.) Sea levels refuse to increase their rate of rise; some glaciers and sea ice have melted, but more ice has been added elsewhere to counterbalance this. Of course, keeping track of all that ice is tricky; the National Snow and Ice Data Center inconveniently lost 193,000 square miles of it the other day and was quite fortunate to find it -- doubtless hidden by the girth of our big-boned theory. For that matter, Roy Spencer has spun out some calculations suggesting that a good portion of the CO2 increase we've seen in the Earth's atmosphere may be natural. The predictions of doom have foundered on the iceberg of reality for the global warming crowd. Mostly, it's been just plain cold!

But our friends haven't given up; at least, not yet.

The ace in the hole for global warming alarmists of every shape and size is that the Earth warmed (by under one degree Fahrenheit) during the 20th century, and solar irradiance does not seem to account adequately for the temperature increase.

Solar irradiance is measured in watts per square meter (W/m2). The solar constant is estimated at 1,366 W/m2 -- an average based on measurements taken at the top of the atmosphere, covering the entire electromagnetic spectrum, not just visible light. It is an average because the value varies with the time of year, depending on the Earth's position and orientation relative to the sun.
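The seasonal spread around that average comes mostly from the Earth's elliptical orbit. A back-of-the-envelope sketch under the inverse-square law -- where the perihelion and aphelion distances are standard astronomical values, not figures from this article -- looks like this:

```python
# Sketch: how top-of-atmosphere irradiance varies with Earth-sun distance.
# Assumes the 1,366 W/m^2 solar constant at 1 AU and approximate
# perihelion/aphelion distances (standard values, not from the article).

S0 = 1366.0              # solar constant at 1 AU, W/m^2
PERIHELION_AU = 0.9833   # early January: Earth closest to the sun
APHELION_AU = 1.0167     # early July: Earth farthest from the sun

def irradiance(distance_au: float, s0: float = S0) -> float:
    """Top-of-atmosphere irradiance at a given Earth-sun distance (inverse-square law)."""
    return s0 / distance_au ** 2

s_jan = irradiance(PERIHELION_AU)   # ~1,413 W/m^2
s_jul = irradiance(APHELION_AU)     # ~1,321 W/m^2
print(f"January: {s_jan:.0f} W/m^2, July: {s_jul:.0f} W/m^2")
print(f"Annual swing: {100 * (s_jan - s_jul) / S0:.1f}% of the mean")
```

The roughly 90 W/m2 annual swing is the reason a single "constant" only makes sense when quoted as a yearly average.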

During the first half of the 20th century, scientists, most notably Charles Greeley Abbot, labored to obtain accurate measurements of total solar irradiance (TSI). Abbot corrected the work of Samuel Pierpont Langley, who had estimated the solar constant at 2,903 W/m2 from high-altitude studies; working largely from surface observations (his career predated routine high-altitude flight), Abbot determined values between 1,318 and 1,548 W/m2. Since 1978, satellite data have given us a good picture of solar irradiance, although we still have much to learn about its direct impact. For example, it seems we haven't fully understood how solar flares contribute to the overall energy balance.

Our knowledge of solar irradiance prior to 1978 rests on radiosonde (balloon) and other high-altitude measurements, and our knowledge prior to high-altitude research rests entirely on proxy data -- that is, on inference. We cannot state with confidence that solar irradiance was "thus"; it is educated guesswork.

But the GW crowd will boldly claim that prior to the 20th century it was indeed "thus," and will then show charts suggesting that something has changed -- the Earth warming despite no matching increase in solar output. Yes, they will admit that the sun grew a bit more energetic during parts of the 20th century, but not enough, they claim, to account for the rise in temperature.
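To see why the size of any solar increase matters to this argument, one can sketch the textbook zero-dimensional energy-balance estimate. The albedo figure and the Stefan-Boltzmann approach are standard classroom assumptions, not taken from this article:

```python
# Sketch: sensitivity of the Earth's effective radiating temperature to
# changes in total solar irradiance, via the textbook zero-dimensional
# energy-balance model. Albedo is an assumed standard value.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
ALBEDO = 0.30      # fraction of sunlight reflected (assumed textbook value)

def equilibrium_temp(tsi: float) -> float:
    """Effective radiating temperature (K) where absorbed solar = emitted thermal."""
    absorbed = tsi * (1 - ALBEDO) / 4   # incoming flux averaged over the sphere
    return (absorbed / SIGMA) ** 0.25

t_base = equilibrium_temp(1366.0)           # ~255 K, the classic result
t_up = equilibrium_temp(1366.0 * 1.001)     # effect of a 0.1% TSI increase
print(f"Baseline: {t_base:.1f} K, warming from +0.1% TSI: {t_up - t_base:.3f} K")
```

On this crude estimate, a 0.1% change in TSI shifts the effective temperature by only a few hundredths of a degree -- which is why the dispute turns on how large the sun's variations actually were, precisely the quantity we cannot measure directly before the instrumental era.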