The familiar phrase was spoken by Marcellus in Shakespeare’s Hamlet — first performed around 1600, at the start of the Little Ice Age. “Something is rotten in the state of Denmark” is the exact quote. It recognizes that fish rots from the head down, and it means that all is not well at the top of the political hierarchy. Shakespeare proved to be Nostradamus. Four centuries later — at the start of what could be a new Little Ice Age — the rotting fish is Copenhagen.
The smell in the air may be from the leftover caviar at the banquet tables, or perhaps from the exhaust of 140 private jets and 1200 limousines commissioned by the attendees when they discovered there was to be no global warming evident in Copenhagen. (In fact, the cold will deepen and give way to snow before they leave, an extension of the Gore Effect.)
But the metaphorical stench comes from the well-financed bad science and bad policy, promulgated by the UN, and the complicity of the so-called world leaders, thinking of themselves as modern-day King Canutes (the Viking king of Denmark, England, and Norway — who ironically ruled during the Medieval Warm Period this very group has tried to deny). His flatterers thought his powers “so great, he could command the tides of the sea to go back.”
Unlike the warmists and the compliant media, Canute knew otherwise, and indeed the tide kept rising. Nature will do what nature always did — change.
It’s the data, stupid
If we torture the data long enough, it will confess. (Ronald Coase, Nobel Prize for Economic Sciences, 1991)
The Climategate whistleblower proved what those of us who have dealt with data for decades knew to be the case — namely, that data was being manipulated. The IPCC and the scientists it supports have worked to remove the pesky Medieval Warm Period, the Little Ice Age, and the period emailer Tom Wigley referred to as the “warm 1940s blip,” and to pump up the recent warm cycle.
Attention has focused on the emails dealing with Michael Mann’s hockey stick and other proxy attempts, most notably those of Keith Briffa. Briffa was conflicted in this whole process, noting he “[tried] hard to balance the needs of the IPCC with science, which were not always the same,” and that he knew “ … there is pressure to present a nice tidy story as regards ‘apparent unprecedented warming in a thousand years or more in the proxy data.’”
As Steve McIntyre has blogged:
Much recent attention has been paid to the email about the “trick” and the effort to “hide the decline.” Climate scientists have complained that this email has been taken “out of context.” In this case, I’m not sure that it’s in their interests that this email be placed in context because the context leads right back to … the role of IPCC itself in “hiding the decline” in the Briffa reconstruction.
In the area of data, I am more concerned about the coordinated effort to manipulate instrumental data (that was appended onto the proxy data truncated in 1960 when the trees showed a decline — the so called “divergence problem”) to produce an exaggerated warming that would point to man’s influence. I will be the first to admit that man does have some climate effect — but the effect is localized. Up to half the warming since 1900 is due to land use changes and urbanization, confirmed most recently by Georgia Tech’s Brian Stone (2009), Anthony Watts (2009), Roger Pielke Sr., and many others. The rest of the warming is also man-made — but the men are at the CRU, at NOAA’s NCDC, and NASA’s GISS, the grant-fed universities and computer labs.
Programmer Ian “Harry” Harris, in the Harry_Read_Me.txt file, commented about:
[The] hopeless state of their (CRU) data base. … No uniform data integrity, it’s just a catalogue of issues that continues to grow as they’re found. … I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO and one with, usually overlapping and with the same station name and very similar coordinates. I know it could be old and new stations, but why such large overlaps if that’s the case? Aarrggghhh! There truly is no end in sight. …
This whole project is SUCH A MESS. No wonder I needed therapy!!
I am seriously close to giving up, again. The history of this is so complex that I can’t get far enough into it before my head hurts and I have to stop. Each parameter has a tortuous history of manual and semi-automated interventions that I simply cannot just go back to early versions and run the updateprog. I could be throwing away all kinds of corrections – to lat/lons, to WMOs (yes!), and more. So what the hell can I do about all these duplicate stations?
Climategate has sparked a flurry of examinations of the global data sets — not only at CRU, but in nations worldwide and at the global data centers at NOAA and NASA. Though the Hadley Centre implied their data was in agreement with other data sets and thus trustworthy, the truth is other data centers are complicit in the data manipulation fraud.
The New Zealand Climate Coalition had long solicited data from New Zealand’s National Institute of Water & Atmospheric Research (NIWA), which is responsible for New Zealand’s National Climate Database. For years the data was not released, despite many requests to NIWA’s Dr. Jim Salinger — who came from CRU. With Dr. Salinger’s departure from NIWA, the data was released and told quite a different story from the manipulated data. The raw data showed a warming of just 0.06C per century since records started in 1850. This compared to a warming of 0.92C per century in NIWA’s (CRU’s) adjusted data.
Willis Eschenbach, in a guest post on Anthony Watts’ blog, found a smoking gun at Darwin station in Australia. Raw data from NOAA (from its GHCN, the Global Historical Climatology Network, which compiles the data that NASA and Hadley work with) showed a cooling of 0.7C. After NOAA “homogenized” the data for Darwin, that changed dramatically. In Willis’ words:
YIKES! Before getting homogenized, temperatures in Darwin were falling at 0.7 Celsius per century … but after the homogenization, they were warming at 1.2 Celsius per century. And the adjustment that they made was over two degrees per century … when those guys “adjust,” they don’t mess around.
He found similar discrepancies in the Nordic countries. And that same kind of unexplainable NOAA GHCN adjustment was made to U.S. stations.
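The Darwin arithmetic above can be sketched numerically. The following is an illustrative toy only — it uses synthetic data, not the actual Darwin record — showing how a per-century linear trend is computed from a yearly series, and how a step adjustment applied to the early part of a record can flip the sign of the fitted trend:

```python
import numpy as np

# Synthetic yearly series with a built-in cooling of about -0.7 C/century
# (hypothetical values chosen to mirror the raw Darwin figure in the text).
rng = np.random.default_rng(0)
years = np.arange(1900, 2000)
raw = 27.0 - 0.007 * (years - 1900) + rng.normal(0, 0.3, years.size)

def trend_per_century(years, temps):
    """Least-squares slope of temperature vs. year, expressed per 100 years."""
    slope = np.polyfit(years, temps, 1)[0]
    return slope * 100

# A hypothetical step "homogenization": shift the pre-1940 record down 1 C.
# Lowering the early years tilts the fitted line upward.
adjusted = raw.copy()
adjusted[years < 1940] -= 1.0

print(trend_per_century(years, raw))       # negative (cooling)
print(trend_per_century(years, adjusted))  # positive (apparent warming)
```

The point of the sketch is only that a fitted trend is sensitive to level shifts applied to part of a record; whether any particular homogenization step is justified is a separate question.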
In this story, see how Central Park data was manipulated in inconsistent ways. The original U.S. Historical Climatology Network (USHCN) data showed a cooling to adjust for the urban heat island effect — but the global version of Central Park (NOAA GHCN again) inexplicably warmed Central Park by 4F. The difference between the two databases — U.S. adjusted and globally adjusted, both produced by NOAA NCDC — reached an unbelievable 11F for Julys and 7F annually! Gradually and without notice, NOAA began backing off the urban heat island adjustment in the USHCN data in 1999 and eliminated it entirely in 2007.
Anthony Watts, in his surfacestations.org volunteer project “Is the U.S. Surface Temperature Record Reliable?”, found that of the 1000-plus temperature recording stations he had surveyed (a 1221-station network), 89% rated poor to very poor — according to the government’s own criteria for siting the stations.
Perhaps one of the biggest issues with the global data is station dropout after 1990. Over 6000 stations were active in the mid-1990s. Just over 1000 are in use today. The stations that dropped out were mainly rural and at higher latitudes and altitudes — all cooler stations. This alone should account for part of the assessed warming. China had 100 stations in 1950, over 400 in 1960, then only 25 by 1990. This changing distribution makes any accurate assessment of change impossible.
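The sampling effect described above can be illustrated with a toy calculation. The station counts and temperatures below are made up for the sketch; the point is only that if the stations leaving a network are systematically cooler, a naive mean of absolute temperatures rises even when no individual station warms:

```python
import numpy as np

# Hypothetical network: 500 warm (low-latitude/urban) stations at 15 C
# and 500 cool (high-latitude/rural) stations at 5 C.
warm_stations = np.full(500, 15.0)
cool_stations = np.full(500, 5.0)

full_network = np.concatenate([warm_stations, cool_stations])

# Suppose 80% of the cool stations drop out, as with rural closures.
after_dropout = np.concatenate([warm_stations, cool_stations[:100]])

print(full_network.mean())    # 10.0 C
print(after_dropout.mean())   # about 13.3 C -- apparent "warming" from sampling alone
```

Working in anomalies rather than absolute temperatures is the standard defense against this effect; how completely it removes the bias in practice is the point in dispute here.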
No urbanization adjustment is made for either NOAA or CRU’s global data, based on flawed papers by Wang (1990), Jones (1990), and Peterson (2003). The Jones and Wang papers were shown to be based on fabricated China data. Ironically, in 2008 Jones found that contamination by urbanization in China was a very non-trivial 1C per century — but that did not cause the data centers to begin adjusting, as that would have eliminated global warming.
Continent after continent, researchers are seeing no warming in the unprocessed data (see one thorough analysis here).
Just as the Medieval Warm Period made it difficult to explain why the last century of warming could not be natural (which the hockey stick team attempted to make go away), so did the warm spike in the 1930s and 1940s. In each of the databases, the land data from that period was adjusted down. And Wigley suggested that sea surface temperatures could likewise be “corrected” down by 0.15C, making the results look warmer while still remaining plausible.
Wigley also noted:
Land warming since 1980 has been twice the ocean warming — and skeptics might claim that this proves that urban warming is real and important.
NOAA complied in July 2009 — removing the satellite input from the global sea surface temperature assessment (the most complete data in terms of coverage), which resulted in an instant jump of 0.24C in ocean temperatures.
Is NASA in the clear? No. They work with the same base GHCN data, plus data from Antarctica (SCAR). To their credit, they attempt to consider urbanization — though Steve McIntyre showed they have poor population data and adjust cities warmer as often as they do colder. They also constantly fiddle with the data. John Goetz showed that 20% of the historical record was modified 16 times in the 2 1/2 years ending in 2007.
When you hear press releases from NOAA, NASA, or Hadley claiming a month, year, or decade ranks among the warmest ever recorded, keep in mind: they have tortured the data, and it has confessed.