San José State University
applet-magic.com
Thayer Watkins, Silicon Valley & Tornado Alley, USA
Meteorological weather forecasting models are tested and refined against the stringent criterion of the usefulness of their forecasts. As a result, weather forecasting models are of high quality and meteorology is a hard science, at least as hard a science as the subject matter allows. Weather systems exhibit extreme sensitivity to initial conditions, so it is mathematically and physically impossible to make useful weather forecasts beyond a week or so. No matter how much public funding is poured into the endeavor, useful weather forecasts cannot be extended beyond that short horizon.
Climate models are not generally subjected to criteria of validity as stringent as those applied to weather models. By an act of faith the model builders run them to obtain projections decades, even centuries, into the future. It is felt that the climate models cannot be tested because it would take too long to wait for a verification of their projections. Their projections are not even in principle forecasts, because the modelers have no way of knowing what exogenous events, such as major volcanic eruptions, might occur.
The Intergovernmental Panel on Climate Change (IPCC) considers validation of its 15 climate models to consist of reproducing the characteristics of the current climate. The models do reasonably well on some characteristics, such as the latitudinal profile of zonal temperature means, but fail miserably on the characteristic of cloudiness. In fact the models do not do all that well even for zonal temperatures until exogenous fudge factors, called flux adjustments, are included. For more on these validity tests see IPCC model validation.
The IPCC does not consider the more appropriate validation criterion of projecting the recent rates of increase of climate characteristics. For an analysis of how well the climate models do at computing the rate of change of temperature by latitude for the period from 1970 to 2001, see Polar versus Equatorial Warming. The results show that the climate models have some success in explaining the qualitative characteristics of the data, but they also have qualitative failures. On a quantitative level the errors are so large as to make projections over a century useless. The errors in the projection models are cumulative, so if there is a 200 percent error over a thirty-year period, the error over a hundred-year period will be on the order of 600 percent.
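As a rough check of that arithmetic, here is a minimal sketch in Python, assuming the percentage error in a projected temperature change grows in proportion to the length of the projection interval (an illustrative assumption; the text does not specify a growth law):

```python
# Scale a percentage error in a projected temperature change from one
# projection interval to another, assuming the error accumulates
# linearly with the length of the interval (illustrative assumption).

def scaled_error(error_pct: float, years_known: float, years_projected: float) -> float:
    """Expected percentage error over years_projected, given error_pct over years_known."""
    return error_pct * years_projected / years_known

# A 200 percent error over 30 years, extended to a 100-year projection:
print(scaled_error(200.0, 30.0, 100.0))  # about 667, i.e. on the order of 600 percent
```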
However, it is not true that the accuracy of the climate models cannot be tested without waiting 50 years or so. The climate models can be run backwards just as well as forward. Instead of a forecast they then give a backcast of the climate characteristics of the past. Another term coined for this process is retrodiction, by analogy with prediction. Patrick J. Michaels, in his book Meltdown, gives the backcasting of two climate models from about 1993 back to 1905. One is the first Coupled Global Climate Model (CGCM1) from the Canadian Centre for Climate Modeling and Analysis, and the second is British, from the Hadley Centre for Climate Prediction and Research. The data were scaled from Michaels' graph.
Here is the backcast of global temperature from the Canadian model compared with the observations.
Although the model gets the shape generally right, the timing is off, and that shape had to have come from inputting the sulfate aerosol estimates, which may not have been independent estimates but values chosen on the basis of the known observations of global temperature. Nevertheless, the backcast change in temperature was 96 percent higher than the observed change over the eighty-eight-year period. That is nearly a 100 percent error per century.
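For concreteness, here is a minimal sketch of that comparison; the numerical values are hypothetical stand-ins chosen only to reproduce the 96 percent figure, since the actual series were scaled from a printed graph:

```python
# Compare a model's backcast temperature change with the observed change.
# These values are hypothetical stand-ins, not the actual scaled data.

obs_change = 0.50    # observed 1905-1993 change, degrees C (assumed)
model_change = 0.98  # CGCM1 backcast change over the same period (assumed)

excess_pct = (model_change - obs_change) / obs_change * 100
print(f"Backcast change exceeds observed change by {excess_pct:.0f} percent")  # 96
```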
There is an anomaly in the graph. The CGCM1 backcast should start, at the end of the period, at the same level as the observations. If the CGCM1 figures are shifted down to coincide with the 1993 observations, here is what the graph looks like.
The performance of the CGCM1 climate model looks even less impressive in this corrected version. The shift, however, does not alter the performance of the model in terms of the errors in computing temperature changes. Generally the CGCM1 drastically overestimates the temperature change. On the basis of this performance it is clear that the CGCM1 can be tossed out. One of the exasperating aspects of the IPCC, and there are many, is that it pretends that climatology is a hard science, yet it cannot differentiate among its 15 climate models as to which ones are best.
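A minimal sketch of the end-point alignment described above, assuming the two series are held as equal-length lists of annual values (an assumed data layout; the article works from a scanned graph):

```python
# Shift a backcast series so its final (1993) value coincides with the
# final observed value. Only the level changes; year-to-year temperature
# changes, and hence the errors in those changes, are unaffected.

def align_endpoints(backcast: list[float], observed: list[float]) -> list[float]:
    offset = backcast[-1] - observed[-1]
    return [value - offset for value in backcast]
```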
Now for the British model.
The general shape of the model value line is "sort of" the same as the observation line, but it misses the global warming that occurred from about 1915 to about 1938 and the global cooling from 1940 to 1960. In terms of the temperature change over the entire period the model is fairly good, being only 12 percent too small. However, the two curves cross at six points, and it appears to be just a coincidence that one of the crossings was close to the end of the backcasting interval. At about 1955 the error would have been about 200 percent. This indicates that the measure of performance should be some average error over the backcasting period. Clearly, though, the Hadley Centre model is superior to the model of the Canadian Centre for Climate Modeling and Analysis. Even so, it is not certain whether the Hadley Centre model is sufficiently accurate to have any relevance for economic policy decisions. In particular, it does not seem accurate enough to be the basis for the developed countries imposing a trillion-dollar-a-year cost on their economies. And that trillion-dollar-a-year cost may end up being a trillion-dollar-a-year transfer to the governments of developing countries for the purchase of their carbon emissions quotas.
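The suggested average-error measure could be, for example, a mean absolute error taken over every year of the backcast rather than only the end points; here is a minimal sketch under that assumption (the article does not commit to a particular averaging):

```python
# Average error of a backcast over the whole period, rather than judging
# the model by the single end-to-end temperature change, whose small
# error here appears to be a coincidence of where the curves cross.

def mean_abs_error(backcast: list[float], observed: list[float]) -> float:
    assert len(backcast) == len(observed)
    return sum(abs(b - o) for b, o in zip(backcast, observed)) / len(observed)
```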
The obvious thing to do is to use regression analysis to calibrate the climate models' forecasts. This would not only improve the accuracy of the climate model forecasts but would also provide a basis for computing the uncertainty that applies to each forecast. See Climate Model Uncertainty. Regressing actual temperature on the temperatures computed by all 15 climate models would give a weight for each climate model. The models whose computed temperatures are more closely correlated with actual temperatures would get higher weights.
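A minimal sketch of that calibration as an ordinary least-squares regression, with the model outputs arranged as columns of a matrix; the data here are random placeholders, and the layout and variable names are assumptions for illustration:

```python
import numpy as np

# Rows are years of the observed record; columns are the 15 climate models.
# model_temps[t, k] is model k's computed temperature for year t, and
# actual_temps[t] is the observed temperature (placeholder data below).
rng = np.random.default_rng(0)
model_temps = rng.normal(size=(100, 15))
actual_temps = model_temps @ rng.uniform(size=15) + rng.normal(scale=0.1, size=100)

# Least-squares weights: models whose computed temperatures track the
# observations closely receive larger weights in the calibrated forecast.
weights, *_ = np.linalg.lstsq(model_temps, actual_temps, rcond=None)

calibrated = model_temps @ weights  # weighted combination of the models
residuals = actual_temps - calibrated
print("weights:", np.round(weights, 2))
print("residual standard error:", residuals.std(ddof=15))  # basis for forecast uncertainty
```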
The global temperature change over the past century has been about 0.7°C, some of which has been due to increases in solar intensity. No climatic disasters have occurred as a result of that temperature change. The projected increase in global temperatures over the next century is on the order of 1.2°C. There is no basis for expecting climatic difficulties from that increase other than non-validated and invalidated computer climate models. That is not to say there will not be serious weather. There will be serious weather problems, just as there will be earthquakes and tsunamis, but no amount of government regulation will change that. It used to be that tribal leaders, i.e., governments, would throw a few virgins into the volcanoes to placate the gods and prevent eruptions. Now governments in developed countries want to throw their economies into the cauldron in repentance for their prosperity. It all stems from some cultural notion that if one makes a sacrifice then surely one will get something in return. For millennia religions have been organizing sacrifices to placate the gods, and every one of those sacrifices has been a complete and total waste. The names have changed but the game is the same.