Climategate: NOAA and NASA Complicit in Data Manipulation

Recent revelations from the Climategate emails, originating from the Climatic Research Unit at the University of East Anglia, showed how all the data centers — most notably NOAA and NASA — conspired in the manipulation of global temperature records to suggest that temperatures in the 20th century rose faster than they actually did.

This has inspired climate researchers worldwide to take a hard look at the data on offer, comparing it to the original records and to other data sources. An in-depth report that I co-authored with Anthony Watts for the Science and Public Policy Institute (SPPI) compiles some of the initial, alarming findings, along with case studies from scientists around the world.

We don’t dispute that there has been some cyclical warming in recent decades, most notably from 1979 to 1998, but cooling took place from the 1940s to the late 1970s, again after 1998, and especially after 2001, all while CO2 rose. This fact alone calls into question the primary role in climate change that the IPCC, environmental groups, and others attribute to CO2.

However, the global surface station data is seriously compromised.

A major station dropout, along with an increase in missing data from the remaining stations, occurred suddenly around 1990, just about the time the global warming issue was being elevated to importance in political and environmental circles.

A clear bias was found toward removing higher-elevation, higher-latitude, and rural stations, the cooler stations, during this culling process, yet those stations were not also removed from the base periods from which the “averages,” and then the anomalies, were computed.
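To illustrate the concern, here is a minimal sketch with invented numbers and a deliberately oversimplified, unweighted station average (the actual GHCN processing is more involved), showing how an anomaly computed against a base period that still includes since-dropped cooler stations can register warming even when no individual station warmed:

```python
# Invented numbers; a deliberately simplified, unweighted station average.
base_period_means = {          # mean temperature over the base period (deg C)
    "rural_highland": 4.0,
    "rural_valley": 9.0,
    "urban_airport": 12.0,
}
current_means = {              # same climate, but the cooler stations were dropped
    "urban_airport": 12.0,
}

def average(values):
    return sum(values) / len(values)

baseline = average(base_period_means.values())   # ~8.33 C, cool stations included
current = average(current_means.values())        # 12.0 C, warm station only
anomaly = current - baseline
print(f"Apparent anomaly: {anomaly:+.2f} C")     # about +3.67 C with no real warming
```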

The data also suffers contamination by urbanization and other local factors, such as land-use/land-cover changes and improper siting.

There are also uncertainties in ocean temperatures. This is no small issue, as oceans cover 71% of Earth’s surface.

These factors all lead to significant uncertainty and a tendency to overestimate century-scale temperature trends. The conclusion from these findings is that the global databases are seriously flawed and can no longer be trusted to assess climate trends or rankings, or to validate model forecasts. Consequently, such surface data should be ignored in political decision-making.

Prior to the release of this paper, KUSI’s John Coleman, founder of The Weather Channel, aired a one-hour prime-time special, Global Warming: The Other Side. The special was so well received that KUSI will be doing another special in February.

NOAA has already responded, through the Yale Climate Forum, to the preliminary paper supporting John Coleman’s special:

The accuracy of the surface temperature record can be independently validated against satellite records. Over the period from 1979 to present where satellite lower-tropospheric temperature data is available, satellite and surface temperatures track quite well.

Actually, Klotzbach et al. (2009) found that when the satellites were first launched, their temperature readings were in relatively good agreement with the surface station data. There has been increasing divergence over time (now exceeding 0.4°C), but the divergence does not arise from satellite errors. Further, they found that the divergence between surface and lower-tropospheric measurements, which has probably continued, is consistent with evidence of a warm bias in the surface temperature record.

Land temperature trends were 0.31°C for NOAA, versus 0.16°C for the satellite measurements from the University of Alabama-Huntsville (UAH).
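As an illustration of the kind of comparison involved, here is a minimal sketch of fitting linear trends to two temperature series and measuring their divergence. The series below are synthetic placeholders, not the actual NOAA or UAH data, and the units and period are assumed for the example:

```python
# Synthetic example: compare linear trends in two made-up temperature series.
import numpy as np

years = np.arange(1979, 2009)
rng = np.random.default_rng(0)
surface = 0.031 * (years - 1979) + rng.normal(0.0, 0.1, years.size)    # deg C
satellite = 0.016 * (years - 1979) + rng.normal(0.0, 0.1, years.size)  # deg C

# Least-squares slope (deg C per year), scaled to deg C per decade.
surface_trend = np.polyfit(years, surface, 1)[0] * 10
satellite_trend = np.polyfit(years, satellite, 1)[0] * 10
print(f"surface trend:   {surface_trend:+.2f} C/decade")
print(f"satellite trend: {satellite_trend:+.2f} C/decade")
print(f"divergence:      {surface_trend - satellite_trend:+.2f} C/decade")
```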

NOAA announced that June 2009 was the second-warmest June. In sharp contrast to this, GISS and the UAH satellite assessment had June virtually at the long-term average (+0.001°C, the 15th coldest in 31 years). Remote Sensing Systems (RSS, the other satellite measurement database) had June at +0.075°C, the 14th coldest in 31 years.
NOAA proclaimed June 2008 to be the eighth-warmest June for the globe in 129 years. Meanwhile, NASA showed it was the ninth-coldest June in the 30 years of its record, falling just short of 2005.
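For readers curious how such rankings are derived, here is a minimal sketch, using invented June anomaly values rather than any agency's actual data, of ranking one June against the other Junes in a record:

```python
# Invented June temperature anomalies (deg C) for a 31-year record, 1979-2009.
june_anomalies = {1979 + i: a for i, a in enumerate([
    -0.20, -0.15, 0.05, 0.10, -0.02, 0.00, 0.07, -0.11, 0.03, 0.18,
     0.21, 0.12, -0.08, -0.30, -0.25, 0.09, 0.14, 0.02, 0.30, 0.45,
     0.10, 0.16, 0.22, 0.28, 0.26, 0.31, 0.33, 0.29, 0.27, 0.24, 0.001])}

target = 2009
# Rank among the coldest: 1 + the number of years with a lower anomaly.
rank_coldest = 1 + sum(a < june_anomalies[target] for a in june_anomalies.values())
print(f"June {target}: number {rank_coldest} coldest of {len(june_anomalies)} Junes")
```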

Satellites were positioned by NOAA to be the future of temperature monitoring, yet amazingly they are never mentioned in the NOAA or NASA monthly reports.

With regard to the station dropout, NOAA noted:

Thomas Peterson and Russell Vose, the researchers who assembled much of GHCN, have explained:

The reasons why the number of stations in GHCN drop off in recent years are because some of GHCN’s source datasets are retroactive data compilations (e.g., World Weather Records) and other data sources were created or exchanged years ago. Only three data sources are available in near-real time.

It’s common to think of temperature stations as modern Internet-linked operations that instantly report temperature readings to readily accessible databases, but that is not particularly accurate for stations outside of the United States and Western Europe. For many of the world’s stations, observations are still taken and recorded by hand, and assembling and digitizing records from thousands of stations worldwide is burdensome.

During that spike in station counts in the 1970s, those stations were not actively reporting to some central repository. Rather, those records were collected years and decades later through painstaking work by researchers. It is quite likely that, a decade or two from now, the number of stations available for the 1990s and 2000s will exceed the 6,000-station peak reached in the 1970s.

So rest assured: we can make trillion-dollar decisions today based on imperfect, tainted data … and correct it in two decades, when we painstakingly gather the missing data for the 1990s through the 2010s and fix it.

Our government at work.
