What Is a Normal Temperature?
Last week, I wrote about how the record cold temperatures of the polar vortex and impenetrable ice in the Antarctic are being cited as evidence in favor of global warming. That makes global warming an "unfalsifiable" theory in the sense that any evidence against it will be reinterpreted ad hoc as evidence for it.
After I published that, my friend Jack Wakeland sent me a link to an even more brazen reinterpretation of reality: the claim that the 15-year pause in global warming doesn't exist—so long as we use a new and totally different measure of global temperatures.
This is an obvious shifting of the goalposts. The measure of warming that they all thought was fine and dandy when temperatures seemed to be rising now doesn't show rising temperatures. So they have to reinterpret the data to get the result that fits their theory.
When someone posted my article on Facebook, commenter Jordan Phillips named the basic pattern of global warming arguments: "In a real scientific theory, you have to make some kind of blind prediction that makes you vulnerable and accountable, so that if later observations contradict the prediction then your theory has no squirm room to avoid its fate. But catastrophic manmade global warming theory is the opposite: it's based on waiting for a weather event to happen, then rationalizing how that event was caused by global warming."
So global warming science is not just ad hoc but post hoc.
This raises a wider question that I think is fundamental to the science of this whole debate. Do we even have an accurate measure of global temperature? Do we know what global temperature is normal, and what variation in temperature is normal?
The problem is that we have less than 150 years of thermometer measurements, and much less than that from other sources, such as satellites or ocean measurements. And the early years of this record show a rapid increase in temperatures before industrialization could have made a difference, which suggests that, with such a short record, it is very hard to distinguish natural variation from changes outside the normal range.
In fact, last year I linked to an article arguing that a proper statistical analysis of thermometer measurements is more consistent with random variation than with a warming trend. (See a simpler explanation here.)
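Here is a minimal sketch in Python of that statistical point. It is not the analysis from the linked article, and all of the numbers are illustrative assumptions: a "random walk," in which each year's temperature is just the previous year's plus random noise, has no underlying trend at all, yet fitting a straight line to a 150-year stretch of one will routinely report a "warming trend."

```python
# Illustrative sketch (invented numbers, not the linked article's analysis):
# how often does a trendless random walk look like a warming trend?
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(150)  # roughly the length of the thermometer record

slopes = []
for _ in range(1000):
    # A trendless random walk: each year = last year + random noise.
    walk = np.cumsum(rng.normal(0.0, 0.1, size=years.size))
    slope = np.polyfit(years, walk, 1)[0]  # naive least-squares trend fit
    slopes.append(slope)

# Fraction of purely random series whose fitted "trend" exceeds
# half a degree per century (0.005 degrees per year).
print(np.mean(np.abs(np.array(slopes)) > 0.005))
```

The point is not that the real temperature record is a random walk, only that a record this short can't easily tell the difference.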
The warmthers try to overcome this deficiency by developing longer-term measurements of temperature, over periods of 2,000 or 10,000 years. Since they don't have thermometer measurements going back that far, they use "proxies," such as the rings of ancient, slow-growing trees, whose widths tend to vary with temperature. But these measurements also vary with other factors, such as rainfall, so the data has to be "smoothed" in an attempt to factor out short-term fluctuations. This means that proxy measurements are only meaningful over very long periods of time, which causes problems when you try to combine them with the shorter-term thermometer record.
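To see why, consider this minimal sketch, with invented numbers standing in for real proxy data: a sharp, decade-long warm spike is obvious in the raw annual series, but a 50-year moving average, the kind of smoothing used to tame proxy noise, averages it almost entirely away.

```python
# Illustrative sketch (invented data): smoothing that tames proxy noise
# also erases anything happening on a decadal time scale.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(2000)  # 2,000 years of annual "proxy" values
series = rng.normal(0.0, 0.3, size=years.size)  # noisy proxy record
series[1900:1910] += 1.0  # a sharp, decade-long warm spike

window = 50  # 50-year moving average
kernel = np.ones(window) / window
smoothed = np.convolve(series, kernel, mode="same")

print(series[1900:1910].mean())    # spike is plain in the raw data (~ +1.0)
print(smoothed[1900:1910].mean())  # mostly averaged away (~ +0.2)
```

A proxy series smoothed this way simply cannot register anything that happens on the scale of a decade or two, which is exactly the scale of the modern warming claim.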
But this is precisely what the warmthers do. In the Climategate e-mails, for example, this is what all the fuss was about: using "Mike's Nature trick...to hide the decline." This was the technique of switching suddenly from proxy measurements to thermometer measurements for more recent dates, in some cases covering up a decline in temperatures shown by the proxies.
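For illustration only, here is a schematic of the kind of splice being criticized, with invented numbers standing in for the real series:

```python
# Schematic only (invented numbers): a proxy series that declines in its
# final decades is cut off, and the instrumental (thermometer) record is
# grafted on in its place, so the combined curve never shows the decline.
import numpy as np

proxy = np.concatenate([np.zeros(80), np.linspace(0.0, -0.4, 20)])  # proxies decline at the end
instrumental = np.linspace(0.0, 0.6, 20)  # thermometers rise over those same years

spliced = np.concatenate([proxy[:80], instrumental])  # drop the declining tail, graft on thermometers

print(proxy[-1])    # -0.4: what the proxies actually show at the end
print(spliced[-1])  #  0.6: what the spliced curve shows instead
```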
This is the sort of thing that came crashing down last year when a scientist of the warmther school created a new "hockey stick" graph showing low temperatures over 11,000 years that suddenly shoot up in the last century. It turned out that the "smoothing" techniques used in this study made its recent results statistically meaningless. This is the basic tradeoff the warmthers can't get around: the methods they use to derive temperature records over thousands of years don't give you any idea what is happening on a time scale of mere decades.
So the warmthers don't ask whether we can actually figure out what a "normal" temperature is because they are afraid that the answer is, "We can't."
Or perhaps they are concerned that we can, and that the answer isn't what they want. There is evidence, for example, that we are headed into a low point in solar activity that may cause a replay of the "Maunder Minimum," a global cold snap that hit in the 17th century.
Or if we step back and look at global temperatures over a much longer time scale—a truly relevant geological time scale—we get a much more sobering answer. Look at this graph of global temperatures over the past 350,000 years. It shows long steady declines in global temperatures over 50,000 to 100,000 years as the Earth slides into an ice age, followed by sudden, rapid warming as we pop back up into a relatively warm "interglacial" period that lasts 10,000 to 20,000 years.
We have the good fortune of being in the middle of a nice balmy interglacial right now, and that's no accident. Human civilization begins its rise within a few thousand years after the end of the last ice age. It seems that we flourish in warmer weather.
So what does a "normal" global temperature look like on this time scale? It's five to ten degrees colder and half of North America is covered in glaciers.
That's the real big picture on "climate change." It's not the fear that the weather might get a few tenths of a degree warmer. It's the certainty that massive global cooling and a new ice age are somewhere in humanity's future.