San José State University
Thayer Watkins
Silicon Valley
& Tornado Alley

Statistical Characteristics of
the Record of Global Average Temperatures

The accepted record of average global temperatures is given below:

The temperature plotted on the vertical axis is the deviation of each year's temperature in degrees Celsius from the average temperature for the period. This is called the temperature anomaly. The average temperature for the period is 14°C.

A perplexing aspect of the global temperature data is that no measure of accuracy is attached to each datum. Surely the earlier years, with their fewer weather stations and less accurate instruments, have less accurate values than the later years. However, a systematic but constant bias in the measurements is not really an issue. The concern is not with the level of the temperature but with the change in that level, and a systematic bias, so long as it does not change, will not affect the changes in temperature. Thus improper placement of the measuring stations results in a bias, but as long as that bias is constant it is unimportant. Any change in the number and location of measuring stations, however, could create the appearance of a spurious trend. The shutting down of hundreds of high-latitude weather stations by Russia in the 1990s for budgetary reasons is therefore cause for concern about any apparent trends in the temperature data.

There do appear to be trends. From 1855 to about 1870 there is an upward trend, then from 1870 to 1910 a downward trend. From 1910, without any obvious explanation, there is an upward trend that continues until about 1945. After 1945 the trend is downward until about 1975, and since 1975 the trend has been upward. As the climatologist Patrick J. Michaels has pointed out, the slopes of the upward trends from 1910 to 1945 and from 1975 onward are about the same. Moreover, the slopes of the downward trends from 1870 to 1910 and from 1945 to 1975 are also about the same. The initial upward trend from 1855 to 1870 could be perceived as having about the same slope as the two later upward trends.

Since variables which are the cumulative sums of random disturbances appear to have trends even when the disturbances have an expected value of zero, it is unwise and unsound to extrapolate any apparent trends for such variables. The temperature of the Earth's surface is thermodynamically the cumulative sum of the net heat inflow to it. The question is whether or not that net heat inflow is a random (stochastic) variable. This can be judged by looking at the changes in temperature from year to year. These changes are shown below.
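The point about cumulative sums can be sketched with a short simulation (illustrative only, not the actual temperature record): even when every disturbance has expected value zero, the running sum wanders far from zero and appears to follow trends.

```python
import random

random.seed(0)

# Zero-mean random disturbances (analogous to annual net heat inflows).
steps = [random.gauss(0.0, 1.0) for _ in range(200)]

# Their cumulative sum (analogous to the temperature level itself).
walk = []
total = 0.0
for s in steps:
    total += s
    walk.append(total)

mean_step = sum(steps) / len(steps)
print(f"mean disturbance: {mean_step:.3f}")  # near zero
print(f"range of the cumulative sum: {min(walk):.1f} to {max(walk):.1f}")
```

The mean disturbance is close to zero, yet the cumulative sum ranges over many times the size of a single disturbance, which is exactly what makes its apparent trends untrustworthy for extrapolation.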

The data viewed in this form do not show any obvious trends. A regression line fitted to the changes is barely perceptible because it lies so close to the horizontal axis. The t-ratio (the regression coefficient divided by its standard deviation) for the regression slope is a minuscule 0.01, definitely not significantly different from zero at the 95 percent level of confidence.
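A minimal sketch of how such a slope t-ratio is computed from first principles; the series below is an illustrative one with a built-in trend, not the actual temperature data.

```python
import random

def trend_t_ratio(y):
    """OLS slope b and its t-ratio for the regression y = a + b*t."""
    n = len(y)
    t = list(range(n))
    tbar = sum(t) / n
    ybar = sum(y) / n
    sxx = sum((ti - tbar) ** 2 for ti in t)
    sxy = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
    b = sxy / sxx
    a = ybar - b * tbar
    resid = [yi - (a + b * ti) for ti, yi in zip(t, y)]
    s2 = sum(r * r for r in resid) / (n - 2)  # residual variance
    se_b = (s2 / sxx) ** 0.5                  # standard error of the slope
    return b, b / se_b

# Illustrative series with a known trend of 0.1 per step plus noise.
random.seed(2)
series = [0.1 * i + random.gauss(0.0, 0.1) for i in range(50)]
slope, t_ratio = trend_t_ratio(series)
print(f"slope {slope:.3f}, t-ratio {t_ratio:.1f}")
```

For a genuinely trending series like this one the t-ratio is large; a t-ratio of 0.01, as reported above for the temperature changes, means the estimated slope is tiny relative to its own uncertainty.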

Another way of examining the temperature change data is to construct a frequency distribution (histogram). Here the temperature changes are grouped into temperature change intervals of 0.05°C width.
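The grouping into 0.05°C-wide intervals can be sketched as follows; the sample changes here are made-up illustrative numbers, not the actual record.

```python
from collections import Counter
import math

def bin_changes(changes, width=0.05):
    """Group temperature changes (°C) into intervals of the given width."""
    counts = Counter(math.floor(c / width) for c in changes)
    # Return (interval lower bound, count) pairs in ascending order.
    return [(k * width, counts[k]) for k in sorted(counts)]

# Hypothetical year-to-year changes in °C, for illustration only.
changes = [0.02, -0.03, 0.07, 0.01, -0.06, 0.04, 0.12]
for lo, n in bin_changes(changes):
    print(f"[{lo:+.2f}, {lo + 0.05:+.2f}): {n}")
```

Plotting the counts against the interval lower bounds gives the histogram described in the text.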


The average temperature change is 0.0055°C per year, which is equivalent to 0.55°C per century. The t-ratio for that average is 0.53, not significantly different from zero at the 95 percent level of confidence. It is notable that the distribution looks more or less like a normal distribution. This is as would be expected from the Central Limit Theorem, which says that a quantity which is the sum of a large number of independent random influences will have a frequency distribution that is closer to a normal distribution the larger the number of influences. This lends credence to the notion that the year-to-year temperature changes are stochastic (random).
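The Central Limit Theorem effect can be illustrated with a small simulation: sums of many independent uniform (decidedly non-normal) disturbances come out approximately normally distributed, with near-zero skewness.

```python
import random

random.seed(1)

# Each observation is the sum of 30 independent uniform disturbances.
sums = [sum(random.uniform(-1.0, 1.0) for _ in range(30)) for _ in range(2000)]

n = len(sums)
mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n
# Skewness is near zero for a normal distribution.
skew = sum((s - mean) ** 3 for s in sums) / (n * var ** 1.5)

print(f"mean {mean:.2f}, variance {var:.2f}, skewness {skew:.2f}")
```

The theoretical variance here is 30 × (1/3) = 10, and the sample variance and near-zero skewness match a normal distribution closely even though each underlying disturbance is uniform.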

Since there is the concern that matters have changed over time, it is of interest to look at the frequency distribution over different intervals. First the interval from 1856 to 1997 is split into two.



There is seemingly a sharp difference between the distributions. The average change for 1856 to 1926 is only 0.22°C per century, whereas for 1927 to 1997 the average change is 0.93°C per century. However, in both cases the t-ratios indicate that the averages are not significantly different from zero at the 95 percent level of confidence. In both cases the distributions could be considered to come from normal distributions, although the distribution for 1927 to 1997 looks as though it might be bimodal; i.e., one that has two peaks. It is notable that although the standard deviation of the temperature changes increased from 0.116°C/century to 0.129°C/century between the first seventy years and the second seventy years, this is only an 11 percent change and not significantly different from no change.
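A sketch of how a subperiod's average change and its t-ratio would be computed; the numbers below are hypothetical stand-ins, not the actual 1856–1926 or 1927–1997 data.

```python
def mean_t_ratio(changes):
    """Mean annual change and the t-ratio testing whether it differs from zero."""
    n = len(changes)
    mean = sum(changes) / n
    var = sum((c - mean) ** 2 for c in changes) / (n - 1)
    se = (var / n) ** 0.5  # standard error of the mean
    return mean, mean / se

# Hypothetical subperiod of annual changes in °C (illustrative only).
sub = [0.02, 0.01, 0.03, 0.00, 0.02, 0.01, 0.02, 0.03]
m, t = mean_t_ratio(sub)
print(f"mean change {m:.4f} °C/yr ({100 * m:.2f} °C/century), t-ratio {t:.2f}")
```

Multiplying the mean annual change by 100 gives the per-century figure quoted in the text; the t-ratio then tells whether that figure is distinguishable from zero given the year-to-year scatter.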

To keep things in perspective a frequency distribution for a middle interval from 1880 to 1910 was created, as shown below, and it shows an average temperature change of −0.84°C per century.


Again the distribution is roughly that of a normal distribution. If this trend of −0.84°C per century had been extrapolated, the Earth's temperature would have been expected to decline by about 0.8°C between 1910 and 2007; continued for a millennium, a decline at that rate would amount to roughly 8°C, equivalent to another ice age. This points up the danger of making projections from limited ranges of variables. It is particularly true for variables which display autocorrelation: such variables appear to follow trends even when there are no real trends in the system; in other words, they display spurious trends. The autocorrelation structure of the global temperature changes is shown below.
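A minimal sketch of how a sample autocorrelation at a given lag is computed; the alternating series used here is illustrative only, chosen because its lag-1 autocorrelation is known to be strongly negative.

```python
def autocorrelation(x, lag):
    """Sample autocorrelation of the series x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((xi - mean) ** 2 for xi in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

# A strictly alternating series is strongly negatively autocorrelated at lag 1.
alternating = [1.0, -1.0] * 10
r1 = autocorrelation(alternating, 1)
print(f"lag-1 autocorrelation: {r1:.2f}")  # -0.95 for this 20-point series
```

Applying this function to the temperature changes at successive lags produces the autocorrelation structure referred to above; values near zero at all lags would be consistent with the changes being independent random disturbances.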
