“It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.”
This famous quote from the fictional Sherlock Holmes neatly summarizes the search for truth in criminology, but it applies equally well to today’s global warming debate.
The inconvenient truth about climate change is that we lack the data to properly understand what weather was like over most of the planet, even in the recent past. Without a good understanding of past weather conditions, we have no way of knowing the history of the planet’s average condition — the climate. Despite the confident pronouncements of politicians and climate activists, we cannot compare today’s climate with the past. Meaningful forecasts of future climate change are therefore impossible.
For example, consider the touchstone of the global warming movement: the “HadCRUT4” global average temperature history for the past 167 years. This record was produced by the Climatic Research Unit at the University of East Anglia, and the Hadley Centre (UK Met Office), both based in the United Kingdom:
Figure 1: The “touchstone” of the global warming movement.
The graph above displays the monthly instrumental temperature record formed by combining sea surface temperature records with land surface air temperature records. The baseline is the average over the period 1961–1990.
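As background for readers unfamiliar with the term, the values plotted in such a graph are “anomalies”: each monthly reading minus the average for that calendar month over the baseline period. The sketch below illustrates the idea with invented numbers; it is not the actual HadCRUT4 processing code, and the function name and data are hypothetical.

```python
# Illustrative only: computing monthly temperature anomalies against a
# 1961-1990 baseline, the convention used in series such as HadCRUT4.
# All readings below are invented for demonstration.

def monthly_anomalies(readings, baseline_years=(1961, 1990)):
    """readings: dict mapping (year, month) -> temperature in deg C."""
    start, end = baseline_years
    # Mean temperature for each calendar month over the baseline period
    baseline = {}
    for month in range(1, 13):
        vals = [t for (y, m), t in readings.items()
                if start <= y <= end and m == month]
        if vals:
            baseline[month] = sum(vals) / len(vals)
    # Anomaly = observed value minus the baseline mean for that month
    return {(y, m): t - baseline[m] for (y, m), t in readings.items()}

# Toy example: two Januaries inside the baseline window, one outside it.
# Baseline January mean = (4.0 + 5.0) / 2 = 4.5, so the year-2000
# anomaly is 6.0 - 4.5 = 1.5.
readings = {(1961, 1): 4.0, (1990, 1): 5.0, (2000, 1): 6.0}
print(monthly_anomalies(readings)[(2000, 1)])  # 1.5
```

Note that an anomaly series says nothing by itself about the accuracy of the underlying readings, which is the point taken up next.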
Until the 1960s, temperature data was collected using mercury thermometers at weather stations situated mostly in the United States, Japan, the UK, and eastern Australia. Most of the rest of the planet had very few temperature-sensing stations. And none of the Earth’s oceans, which constitute 70% of the planet’s surface area, had more than the occasional station, separated from its neighbors by thousands of kilometers. The data collected at the weather stations in this sparse grid had, at best, an accuracy of ±0.5 °C. In most cases, the real-world accuracy was no better than ±1 °C.
Averaging such poor data in an attempt to determine global conditions cannot yield anything meaningful.
Displaying average temperature to tenths of a degree, as is done in the “HadCRUT4” global average temperature history above, clearly defies common sense.
Modern weather station surface temperature data is now collected using precision thermocouples. However, starting in the 1970s, less and less ground surface temperature data was used for plots such as “HadCRUT4.” This was done initially because governments believed that satellite monitoring could take over from most of the ground surface data collection. But the satellites did not show the warming forecast by computer models. So bureaucrats simply closed most of the colder rural surface temperature sensing stations, thereby yielding the warming desired for political purposes.
Sherlock Holmes would have condemned this twisting of the facts to suit the theory.
Today, there is virtually no data for approximately 85% of the Earth’s surface. Indeed, there are fewer weather stations in operation now than there were in 1960. The surface temperature computations displayed on the “HadCRUT4” graph after about 1980 are meaningless.
Combine this with the problems in the early data, and the conclusion is unavoidable: it is not possible to know how the Earth’s so-called average temperature has varied over the past century and a half. This useless data is the input for the computer models which form the basis of policy recommendations produced by the United Nations Intergovernmental Panel on Climate Change (IPCC).
The lack of adequate surface data is only the start of the problem.
The computer models on which the climate scare is based are mathematical constructions that require the input of data above the surface, as well. As shown in Figure 2, the models divide the atmosphere into cubes piled on top of each other, ideally with wind, humidity, cloud cover, and temperature conditions known for different altitudes.
But we currently have even less data above the surface than at it, and there is essentially no historical data at altitude:
Figure 2: Models require data above the surface as well. We hardly have any.
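The grid structure described above can be sketched in a few lines. This is a toy illustration, not any real model’s code: the cell variables follow the list in the text (wind, humidity, cloud cover, temperature), and the grid dimensions and placeholder values are invented.

```python
# A minimal sketch (not any real climate model's code) of the grid
# structure described above: the atmosphere divided into stacked cells,
# each holding the variables a model would need at that location.
from dataclasses import dataclass

@dataclass
class Cell:
    wind_u: float       # east-west wind component, m/s
    wind_v: float       # north-south wind component, m/s
    humidity: float     # relative humidity, fraction 0-1
    cloud_cover: float  # cloud fraction, 0-1
    temperature: float  # deg C

# A very coarse grid: 4 latitude bands x 8 longitude bands x 3 altitude
# levels, filled with placeholder values. A real model needs observed
# (or initialized) values for every cell at every time step -- which is
# exactly the data the article argues is missing at altitude.
N_LAT, N_LON, N_LEV = 4, 8, 3
grid = [[[Cell(0.0, 0.0, 0.5, 0.5, 15.0)
          for _ in range(N_LEV)]
         for _ in range(N_LON)]
        for _ in range(N_LAT)]

print(len(grid) * len(grid[0]) * len(grid[0][0]))  # 96 cells
```

Even this toy grid of 96 cells requires five values per cell; real models use millions of cells, which gives a sense of how much observational data would be needed to constrain them.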
Many people think the planet is now adequately covered by satellites gathering 24/7 global data far more accurate than anything determined at surface weather stations. But the satellites are unable to collect data from the North and South Poles, regions that are touted as critical to understanding global warming.
Also, space-based temperature data collection did not start until 1979, and 30 years of weather data is required to generate a single data point on a climate graph. The satellite record is far too short for reaching useful conclusions about climate change. In fact, there are insufficient data of any kind — temperature, land and sea ice, glaciers, sea level, extreme weather, ocean pH, etc. — to be able to determine how today’s climate differs from the past.
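The 30-year convention mentioned above can be made concrete: one point on a climate graph is a mean over a 30-year window of weather data. The sketch below uses invented annual values; the function name is hypothetical.

```python
# Illustration of the 30-year averaging convention: a single "climate"
# data point is the mean over 30 years of weather data. Annual values
# here are invented placeholders.

def climate_points(annual_means, window=30):
    """Return (end_year, mean) pairs for each complete window,
    from a dict mapping year -> annual mean temperature."""
    years = sorted(annual_means)
    out = []
    for i in range(window - 1, len(years)):
        span = years[i - window + 1 : i + 1]
        out.append((years[i], sum(annual_means[y] for y in span) / window))
    return out

# With satellite records starting in 1979, the first complete 30-year
# window only closes in 2008 -- yielding a single climate data point.
annual = {year: 14.0 for year in range(1979, 2009)}  # 30 invented values
print(climate_points(annual))  # [(2008, 14.0)]
```

This is why a record beginning in 1979 supports so few climate-scale data points, whatever one concludes from them.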
Lacking such fundamental data, the IPCC’s climate forecasts have no connection with the real world.
Professor Hubert Lamb, often identified as the founder of modern climatology, recognized the problem while trying to improve weather forecasting. He theorized that better forecasting required a better understanding of past weather and climate, and that such understanding could come only from accurate weather records dating from times long before we properly understood much about the weather at all. Lamb examined the archives of the United Kingdom Meteorological Office and demonstrated that the long-term records showed weather conditions far different from today’s, even in the recent past.
In 1972, Lamb set up the Climatic Research Unit at the University of East Anglia to collect data and to reconstruct as much climate history as possible. His comprehensive treatise, Climate: Present, Past and Future, is one of the most remarkable and informative surveys of the extent and type of data available. Lamb clearly shows that we simply do not have the vast amount of accurate weather data that would make it possible to understand climate change.
Lamb also complained that funding for improving the weather database was being dwarfed by the money spent on computer models and theorizing. Because of this wrongheaded approach, Lamb warned, we would start to see wild and unsubstantiated theories appearing. Further, predictions would fail to improve over time.
That is precisely what has happened. Every prediction made by the computer models cited by the IPCC has turned out to be incorrect.
The first predictions the IPCC made, which appeared in its 1990 Assessment Report, were so wrong that the IPCC later started to refer to them as “projections” rather than “predictions.” The IPCC later offered low-, medium-, and high-range “projections” for the future, seemingly concluding that a broad enough range of forecasts was bound to include the correct one.
Even that approach proved too optimistic. All three ranges predicted by the IPCC have since turned out to be wrong as well.
A simple definition of science is the ability to predict. If the prediction is wrong, the science is wrong. American theoretical physicist and Nobel Prize winner Richard P. Feynman summed up the situation well:
It doesn’t matter how beautiful your theory is … If it doesn’t agree with experiment, it’s wrong.
This year’s UN Climate Change Conference (COP23) kicked off on Monday in Bonn, Germany, and runs through November 17. The conference is being led by Fiji’s Prime Minister Frank Bainimarama, a strong supporter of the Paris Agreement. In his address before the UN General Assembly on September 20, Bainimarama called for “an absolute dedication to meet the 1.5-degree target.”
In support of Fiji’s position, the COP23/Fiji site repeatedly cites the IPCC’s scary, though groundless, forecasts. It states, for example:
The IPCC recently reported that temperatures will significantly increase in the Sahel and Southern African regions, rainfall will significantly decrease, and tropical storms will become more frequent and intense, with a projected 20 per cent increase in cyclone activity.
The IPCC is not practicing science when it issues these forecasts; this is pseudo-science being used to back a political agenda. But the press, politicians, and activists love it, because scaremongering drives media sales, changes votes, and bolsters the bottom line for environmental groups.
Batten down the hatches. A global warming propaganda tsunami has just been triggered this week.