The majority of climate scientists think that global temperatures have risen over the past century mostly because of human activity. However, there are some climate scientists who think that the small changes we have seen in global temperature are mostly the result of natural variations that exist independently of people. Others simply say we don’t have enough information to know how much human activity has played a role in the process. Add to that the unreliability of much of the early data regarding global temperatures, and you end up with a picture that is far more murky than what most media outlets and politicians want you to see.
A recently published study might help to eventually shed some light on how much human activity affects global temperatures. It comes from four climate scientists in China who are affiliated with The Climate Center of the Zhejiang Meteorologic Bureau, the Earth Science School of Zhejiang University, and the Shanghai Climate Center. They are convinced that the vast majority of the changes we have seen in global temperatures are due to natural variations, and that those variations are buffered by the oceans. As a result, they have tried to analyze global temperatures from that perspective.
Since global temperature data sets don’t really agree with one another, they first had to choose which global temperatures they would actually use. They chose the Global Land Surface Temperature Anomaly Index (GLST) as compiled by NOAA. They then tried to find correlations between those data and the Sea Surface Temperatures (SST) as compiled by the Hadley Climate Center. The correlations they found led them to develop a mathematical equation that would reproduce the GLST data. While finding a single equation that fits all the GLST data might seem like an impossible task, it is not. One phrase I often hear from my nuclear chemistry colleagues is, “It only takes four parameters to fit an elephant.” In other words, if you have enough parameters in your equation, you can fit just about anything.
Of course, for something as complex as global temperatures, it takes more than four parameters. In fact, their paper indicates that it took 20. However, with their 20-parameter equation, they were able to reasonably reproduce the global temperature data that they were analyzing. The results can be seen in the image at the top of the post. The jagged, grey line indicates the data, and the smoother, black line indicates the results of their equation. As you can see, it does a pretty good job of fitting the known data.
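The elephant aphorism is easy to demonstrate for yourself. Here is a minimal sketch of the general idea, not the authors’ actual equation or data: it fits fabricated “anomaly” numbers with a 20-parameter curve (a degree-19 polynomial has 20 coefficients) and shows that the curve hugs the data it was fit to, even though it encodes no physics at all.

```python
import numpy as np
from numpy.polynomial import Polynomial

# Fabricated "temperature anomaly" data, for illustration only:
# a gentle trend, a cycle, and random noise. This is NOT the GLST
# data set, and the fit below is NOT the authors' equation.
rng = np.random.default_rng(0)
years = np.linspace(1900, 2020, 121)
anomaly = (0.008 * (years - 1900)
           + 0.2 * np.sin((years - 1900) / 10.0)
           + rng.normal(0.0, 0.1, years.size))

# A 20-parameter fit: degree-19 polynomial, 20 free coefficients.
fit = Polynomial.fit(years, anomaly, deg=19)

# The fit reproduces the known data closely...
in_sample_rmse = float(np.sqrt(np.mean((fit(years) - anomaly) ** 2)))
print(f"in-sample RMSE: {in_sample_rmse:.3f}")

# ...but extrapolating the same curve 30 years past the data it was
# fit to says nothing reliable: a flexible fit is not a physical model.
print(f"'predicted' anomaly in 2050: {fit(2050.0):.2f}")
```

The point of the sketch is the first number: with enough parameters, a small in-sample error is almost guaranteed, which is exactly why a good fit, by itself, proves nothing about the mechanism behind the data.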
Does that mean their equation is a good explanation of global temperatures? Not at all. It is simply an equation that has been forced to fit the data. What I find interesting, however, are the temperatures it predicts for the future. According to the equation, the earth has hit its maximum temperature for a while, and over the next 100+ years, the average temperature of the planet will cool. Do I think that prediction is correct? There is no way I can adequately judge that. There are simply too many unknowns in climate science for anyone to make a reliable prediction about what is going to happen in the future. Perhaps we will eventually learn enough about climate science to change that, but right now, the uncertainties simply preclude reasonable predictions.
However, here’s what I will say about this very interesting study: The authors assume that the vast majority of the temperature variations we have seen are the result of natural processes. If, over the next 30 years, the data continue to fall in line with the predictions of their equation, that will lend more credence to their assumption. If not, that will indicate either that their assumption is wrong, or that some of the natural variations which cause global temperature changes are too long-term to show up in a century’s worth of unreliable temperature data.
Regardless of the outcome, I do think that this paper, while simple in its approach, is a valuable addition to climate science.