Land temperature trends were 0.31 °C for NOAA, versus 0.16 °C for the satellite measurements from the University of Alabama in Huntsville (UAH).
NOAA announced that June 2009 was the second-warmest June. In sharp contrast, GISS and the UAH satellite assessments had June virtually at the long-term average (+0.001 °C, or the 15th coldest in 31 years). Remote Sensing Systems (RSS, the other satellite measurement database) had June at +0.075 °C, the 14th coldest in 31 years.
NOAA proclaimed June 2008 to be the eighth-warmest June for the globe in 129 years. Meanwhile, NASA showed it was the ninth-coldest June in the 30 years of its record, falling just short of 2005.
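The rankings quoted above come from sorting one dataset's June anomalies across its period of record and locating a given year in that ordering. A minimal sketch of that ranking arithmetic, using made-up anomaly values rather than any agency's actual data:

```python
# Rank one year's June temperature anomaly within a multi-year record.
# The anomaly values below are illustrative placeholders, NOT real
# NOAA/GISS/UAH/RSS data.

def rank_coldest(anomalies, year):
    """Return the 1-based rank of `year`, counting from the coldest."""
    ordered = sorted(anomalies, key=anomalies.get)  # coldest first
    return ordered.index(year) + 1

# Hypothetical five-year record of June anomalies (deg C):
june = {2005: -0.10, 2006: 0.20, 2007: 0.35, 2008: -0.05, 2009: 0.001}

print(rank_coldest(june, 2009))  # 2009 ranks 3rd coldest of these 5
```

Which dataset you feed in, and how long its record is, obviously changes where a given June lands, which is exactly why the agencies' rankings diverge so sharply.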
NOAA positioned satellites as the future of temperature monitoring, yet, amazingly, they are never mentioned in the NOAA or NASA monthly reports.
With regard to the station dropout, NOAA noted:
Thomas Peterson and Russell Vose, the researchers who assembled much of GHCN, have explained:
The reason the number of stations in GHCN drops off in recent years is that some of GHCN's source datasets are retroactive data compilations (e.g., World Weather Records) and other data sources were created or exchanged years ago. Only three data sources are available in near-real time.
It’s common to think of temperature stations as modern Internet-linked operations that instantly report temperature readings to readily accessible databases, but that is not particularly accurate for stations outside of the United States and Western Europe. For many of the world’s stations, observations are still taken and recorded by hand, and assembling and digitizing records from thousands of stations worldwide is burdensome.
During that spike in station counts in the 1970s, those stations were not actively reporting to some central repository. Rather, those records were collected years and decades later through painstaking work by researchers. It is quite likely that, a decade or two from now, the number of stations available for the 1990s and 2000s will exceed the 6,000-station peak reached in the 1970s.
So rest assured: we can make trillion-dollar decisions today based on imperfect, tainted data … and correct them in two decades, when we painstakingly gather the missing data for the 1990s through 2010s and fix it.
Our government at work.