My first two posts on Global Warming, I and II, dealt with the First Law of Thermodynamics, otherwise known as the Conservation of Energy, and with carbon dioxide, an important greenhouse gas. By carefully accounting for energy flows and balances, I showed how researchers establish that increases in carbon dioxide must produce global warming. While the fundamentals, which are all I'm dealing with here, are simple, the magnitudes are complicated and turn on details. This is why there are professionals in the business. The professionals are of the view that, when all the details are considered, humans are warming the climate and the warming will be significant.
In this post, I'm going to show some of the temperature change data, as I showed it to my students. Of course, the purpose is not to help them become climate scientists. The title of the book I use is Physics and Technology for Future Presidents, by UC Berkeley physics professor Richard Muller. Thus what I have in mind for these climate change posts is to help future presidents, and citizens, think about this issue, and about scientific claims in general. Also, to tell some stories.
In the matter of temperatures, you will be gratified to see that the scientists always show 0 on their vertical axes. Alas, this is not 0 K, or 0 C, or 0 F. Indeed, if they were to plot their data starting from 0 K, the real physicists' zero temperature, they'd never be able to see a bit of what they are interested in, which is small changes in average temperature that occur over many years. Instead, the scientists plot the temperature anomaly, which is the temperature minus the temperature averaged over some base period of years. For example, they might take the temperature averaged over the years 1950 to 1960 and subtract that value from their data before they plot it. Thus, for 1980, they would plot the difference between the measured (or calculated) temperature, averaged say from 1970 to 1980, and the average temperature from 1950 to 1960. They usually plot this anomaly in degrees C, which are the same size as degrees K, and sometimes in degrees F, which are 5/9 of a degree C or K.
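The anomaly calculation is simple enough to sketch in a few lines. Here is a minimal Python illustration; the years and temperatures below are made-up numbers for the example, not real climate data:

```python
def anomaly(series, base_start, base_end):
    """Return the series with the mean over the base period
    [base_start, base_end] subtracted from every value.
    `series` is a dict mapping year -> temperature in degrees C."""
    base = [t for y, t in series.items() if base_start <= y <= base_end]
    baseline = sum(base) / len(base)
    return {y: t - baseline for y, t in series.items()}

# Invented numbers, for illustration only.
temps = {1950: 14.8, 1955: 14.9, 1960: 15.0, 1980: 15.3}
anoms = anomaly(temps, 1950, 1960)  # baseline over 1950-1960 is 14.9 C
# anoms[1980] is about +0.4 C; anoms[1950] is about -0.1 C
```

Note that to convert an anomaly from degrees C to degrees F you just multiply by 9/5; the usual offset of 32 drops out, because an anomaly is a difference of two temperatures.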
Here’s the famous hockey stick graph.
This is one value, plotted going back a thousand years, to show the behavior of the temperature. But the temperature varies from night to day, from one place to another, and with altitude, and it is just one part of climate. Thus the scientists have averaged over a lot. The 1961-to-1990 average that they subtracted is about 60 F, which is about 15 C. The fuzz represents the scientists' estimate of the possible error of their measurements. You can also see the year-by-year data, which jiggles around, and an average over, probably, ten years, which is a smoother curve.
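The smoother curve is typically a running mean. Here is a minimal Python sketch; the ten-year window and the noisy input series are my assumptions for illustration, not the paper's actual smoothing method or data:

```python
def running_mean(values, window=10):
    """Average each run of `window` consecutive values.
    The result is shorter than the input and much smoother."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

# A jiggly made-up series: a slow upward trend plus alternating noise.
series = [0.1 * i + (0.5 if i % 2 else -0.5) for i in range(20)]
smooth = running_mean(series, window=10)
# The alternating noise cancels within each window, leaving the trend.
```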
Thermometers came into use in the 18th century, more or less, and it wasn't until later that people began recording daily air temperatures, and still later that they began recording temperatures at widely spaced locations on land and sea. Thus thermometer data is available for these purposes only for the last 100 to 150 years, and even that data involves a lot of extrapolation and interpolation.
All the data from earlier times, that is, most of this data, comes from so-called proxies. You can see that these include tree rings, corals, ice cores, and historical records. My opinion is that this is remarkable and painstaking analytical work, not least in trying to get the values from the different proxies to match each other, and to match the thermometer data. The struggle to make all of this data consistent shows up in the error estimates.
Students of historical climate and temperature, and historians, identify a Medieval Warm Period in the North Atlantic region, from the 10th to the 12th or 13th century, and a Little Ice Age (not really an ice age, but a cooler time) from the 16th century to the mid-19th century. Can you see them in this data? Maybe. But those phenomena, which may have had major effects on history, everything from the settlement of Greenland, to wine grapes in England, to the destruction of Napoleon's army in Russia, were likely localized to the North Atlantic, and they don't show up much in what is supposed to represent surface temperatures over the whole northern hemisphere.
Of course, the entire fuss has to do with the rapid jump upward at the end that shows no sign of easing up. You can see it jumps completely above the entire range of variation for the past 1000 years.
The climate, of course, consists of much more than just this single, greatly abstracted, thoroughly averaged time series. As I read about this, I'm impressed by the breadth of evidence and phenomena, gathered by researchers in many fields, consistent with global warming. Arctic ice cover is vanishing. Flowers are blossoming earlier. And so on. Much more than I wish to blog about.
But I have an interesting story I told my students about the author of their textbook, Prof. Muller. He wrote the book in 2008, and it was published in 2009. Chapter 10 deals with climate change. Muller says that the climate is warming and that humans are the major cause, but he carefully attends to the statements of the IPCC, in which they try to assign "error bars" to their conclusions. As I read the chapter, I had the impression that Muller was actually some kind of skeptic about climate change. He expresses a lot of concern about the supposed harms caused by those, such as Al Gore, who exaggerate the threats of warming. He seems unconcerned about the harms, if any, caused by the nut-case science deniers.
It turns out that Muller really was skeptical of the climate scientists' analysis of the thermometric temperature record. After the book appeared, maybe in 2012 or so, he formed the Berkeley Earth Surface Temperature project and raised funds from, it turns out, climate change deniers such as the Koch brothers' foundations. His plan was to carry out his own, "independent", analysis of the raw temperature data. This data is often criticized by deniers, who claim that the scientists are fooled by jet engine blast at airports, urban heat islands, and so on. Since the professionals make their raw data available, anyone can do their own analysis. And analysis is necessary. Thermometers are moved. They are replaced with new models. Ocean sampling from ships at sea changes. In addition to thermometers, there is data from satellites and so on, all of which has to be calibrated, adjusted, and aligned. It's very complicated, detail-oriented, and massive.
From the climate change blogs I read, climate scientists were bemused, or irritated, that this guy thought he could correctly do what they had, according to him, been doing wrong for decades.
Here’s his result:
Looks like the climate scientists had done OK, as Prof. Muller said in scientific papers and in the New York Times. NASA and NOAA are our space agency and our weather bureau, and Hadley refers to the British data sets and analysis.
The climate scientists were confident that Muller would confirm their results, if he did his work correctly, because they had been working on the issue for decades, tracking down errors and inconsistencies. I have one more story about this that I told my students, because it illuminates an important aspect of the scientific method: a discrepancy between direct surface temperature measurements and atmospheric measurements from satellites. That will be in Global Warming Part IV.