Climate Science's Dirtiest Secret

With the climate science party-line case for global warming rapidly unwinding, there is growing interest among researchers from outside the climate change community in applying advanced statistical techniques to climate data. It has long been recognized that statistical acumen is lacking among mainstream climate scientists. This dirty little secret was first publicly disclosed during the Congressional hearings surrounding the 2006 Wegman Report. Newer analyses have revealed that many of the predictions made in the IPCC reports and by other global warming boosters are wrong, often because inappropriate statistical techniques were applied.

The Wegman Report was the result of an ad hoc committee of independent statisticians who were asked by a congressional committee to assess the statistical information presented in the Michael Mann “Hockey Stick” papers. Dr. Edward Wegman, a prominent statistics professor at George Mason University and chairman of the National Academy of Sciences’ (NAS) Committee on Applied and Theoretical Statistics, headed the panel of experts who examined the use of statistics in climate science. They found the climate science community was far too insular and did not consult with experts in statistics outside of their own field.

This self-imposed isolation led, in the opinion of the committee, to misuse of statistics and a peer review process that included only the close-knit circle of researchers at the center of the global warming controversy. Wegman stated in testimony before the House Energy and Commerce Committee: “I am baffled by the claim that the incorrect method doesn't matter because the answer is correct anyway. Method Wrong + Answer Correct = Bad Science.” More on the Wegman Report can be found in Chapter 13 of The Resilient Earth.

More recently, Tom Siegfried, editor in chief of Science News, wrote an essay entitled “Odds Are, It's Wrong.” In it he addressed the general problem of the misuse of statistics in science and medical research. While he was not writing specifically about climate science, here is Siegfried's take on the problem:

It’s science’s dirtiest secret: The “scientific method” of testing hypotheses by statistical analysis stands on a flimsy foundation. Statistical tests are supposed to guide scientists in judging whether an experimental result reflects some real effect or is merely a random fluke, but the standard methods mix mutually inconsistent philosophies and offer no meaningful basis for making such decisions. Even when performed correctly, statistical tests are widely misunderstood and frequently misinterpreted. As a result, countless conclusions in the scientific literature are erroneous...

Nowhere is this “dirty secret” more prevalent than in climate science. I have already reviewed two recent statistical analyses of climate change that arrive at very different answers and draw radically different conclusions from those of mainstream climate science. This post is the third in a series of articles on statistical analysis and global warming. In this article I will discuss another data analysis technique that is used to deal with non-stationary data of the type often associated with Earth's climate system: empirical mode decomposition.

Empirical Mode Decomposition

I first came across the use of non-stationary data analysis techniques in a climate-related setting while researching our book, The Resilient Earth. The technique was called Empirical Mode Decomposition (EMD), and it was applied to the variation in yearly flooding of the Nile river and its relationship to variation in the output of the Sun. According to Alexander Ruzmaikin, Joan Feynman and Yuk Yung, water levels in the Nile are linked to solar variability on both an 88 year and a 200 year cycle. For more information see Chapter 10 of TRE or the paper “Does the Nile reflect solar variability?”

EMD has been applied to problems in a number of disciplines. It can be used to extract underlying trends present in a set of time series data, which can then be used to “detrend” the data. After detrending, more familiar analysis techniques, such as multiple regression, can then be applied to the data set. The team of computer science researchers I work with has used EMD to analyze workload and performance data for large complex data grids.
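As a rough illustration of that workflow, here is a minimal Python sketch of EMD-based detrending. It assumes the third-party PyEMD package (distributed on PyPI as EMD-signal), and the temperature-like series it decomposes is synthetic and purely hypothetical, not real data.

    # Minimal EMD detrending sketch. Assumes the third-party PyEMD package
    # (pip install EMD-signal); the series below is synthetic, not real data.
    import numpy as np
    from PyEMD import EMD

    rng = np.random.default_rng(42)
    years = np.arange(1850, 2011)

    # Hypothetical signal: slow warming + ~65-year oscillation + white noise
    series = (0.005 * (years - years[0])
              + 0.10 * np.sin(2 * np.pi * (years - years[0]) / 65.0)
              + 0.10 * rng.standard_normal(years.size))

    # Decompose into intrinsic mode functions (IMFs) plus a residual trend
    emd = EMD()
    emd.emd(series)
    imfs, residue = emd.get_imfs_and_residue()

    # The residue is the adaptive trend; subtracting it detrends the data,
    # after which conventional tools (e.g., regression) can be applied.
    detrended = series - residue
    slope = np.polyfit(years, residue, 1)[0]
    print(f"{imfs.shape[0]} IMFs; linear fit to adaptive trend: {slope:.4f} degC/yr")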

Those familiar with Fourier analysis or other similar decomposition techniques know that a complex signal can be represented by a combination of simpler functions that form what mathematicians call an orthonormal basis, or basis set. In the case of a Fourier series the basis set is composed of sines and cosines, and for a function f(x) with period 2π the expansion can be expressed in the following equation:
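    f(x) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left( a_n \cos(nx) + b_n \sin(nx) \right)

where the coefficients are

    a_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos(nx)\, dx, \qquad
    b_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin(nx)\, dx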

Many of the phenomena studied in engineering and science are periodic in nature. Fourier analysis breaks these periodic functions into their constituent components (fundamentals and harmonics). Consider the example of a simple step function, shown below. As more and more trigonometric terms are added to the series, the sum converges on the target signal. For a good overview of the mathematics behind Fourier series see the S.O.S. Math web site.


Fourier Series representation of a step function. From S.O.S. Math.
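To see this convergence numerically, here is a short, purely illustrative Python sketch that sums the first few terms of the well-known square-wave series, f(x) = (4/π) Σ sin(kx)/k over odd k, and prints how the RMS error falls as terms are added (the overshoot near the jump, the Gibbs phenomenon, never entirely disappears).

    # Partial sums of the Fourier series of a +/-1 square wave (step function).
    # Illustrative only; the RMS error shrinks as more terms are included.
    import numpy as np

    x = np.linspace(-np.pi, np.pi, 2001)
    target = np.sign(np.sin(x))            # the +/-1 square wave

    def partial_sum(n_terms):
        """Sum the first n_terms odd harmonics of the square-wave series."""
        s = np.zeros_like(x)
        for k in range(1, 2 * n_terms, 2):  # odd harmonics 1, 3, 5, ...
            s += (4.0 / np.pi) * np.sin(k * x) / k
        return s

    for n in (1, 3, 10, 50):
        rms = np.sqrt(np.mean((partial_sum(n) - target) ** 2))
        print(f"{n:3d} terms: RMS error = {rms:.3f}")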

Similarly, EMD breaks a complex signal down into simpler components. Where it differs from other decomposition methods is that EMD is empirical, direct, and adaptive, without requiring any predetermined basis functions. The decomposition is designed to seek the different intrinsic modes of oscillations, called intrinsic mode functions (IMFs), in any data based on the principle of local scale separation. The big advantage of EMD is that it can deal with aperiodic, non-stationary data from non-linear processes, something Fourier analysis cannot do.
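To make the empirical, adaptive nature of the method concrete, the Python sketch below performs a single sifting pass of the kind EMD repeats to extract each IMF: locate the local extrema, build upper and lower cubic-spline envelopes, and subtract their mean. It is a bare-bones illustration with no stopping criteria or end-point handling, not a full EMD implementation.

    # One sifting pass, the basic building block of EMD: remove the mean of
    # the cubic-spline envelopes through the local maxima and minima.
    # Bare-bones sketch: no stopping rule and no boundary treatment.
    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.signal import argrelextrema

    def sift_once(t, x):
        maxima = argrelextrema(x, np.greater)[0]
        minima = argrelextrema(x, np.less)[0]
        if len(maxima) < 4 or len(minima) < 4:
            return None  # too few extrema to build sensible envelopes
        upper = CubicSpline(t[maxima], x[maxima])(t)
        lower = CubicSpline(t[minima], x[minima])(t)
        return x - 0.5 * (upper + lower)  # candidate for the fastest IMF

    t = np.linspace(0.0, 10.0, 1000)
    x = np.sin(2 * np.pi * t) + 0.5 * np.sin(0.6 * np.pi * t) + 0.1 * t
    h = sift_once(t, x)  # EMD would iterate, subtract the IMF, and repeat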

In 2007, Zhaohua Wu et al. published a paper using EMD to determine the intrinsic trend and natural variability in climate data (see the PNAS paper “On the Trend, Detrending, and Variability of Nonlinear and Nonstationary Time Series”). In the paper they explain the complexities of analyzing climate-related data:

The most commonly seen trend is the simple trend, which is a straight line fitted to the data, and the most common detrending process usually consists of removing a straight line best fit, yielding a zero-mean residue. Such a trend may suit well in a purely linear and stationary world. However, the approach may be illogical and physically meaningless for real-world applications such as in climatic data analyses. In these studies, the trend often is the most important quantity sought, and the linearly fitted trend makes little sense, for the underlying mechanism is likely to be nonlinear and nonstationary.

In many papers you will find reference to just this type of detrending, whether its use is appropriate or not. Unfortunately, statistics and data analysis in climate science are often a matter of applying the tools and techniques the researchers happen to be familiar with, not the ones best suited to the job. The paper's explanation of the problem continues:

Another commonly used trend is the one taken as the result of a moving mean of the data. A moving mean requires a predetermined time scale so as to carry out the mean operation. The predetermined time scale has little rational basis, for in nonstationary processes the local time scale is unknown a priori. More complicated trend extraction methods, such as regression analysis or Fourier-based filtering, also are often based on stationarity and linearity assumptions; therefore, one will face a similar difficulty in justifying their usage. Even in the case in which the trend calculated from a nonlinear regression happens to fit the data well fortuitously, there still is no justification in selecting a time-independent regression formula and applying it globally for nonstationary processes. In general, various curve fits with a priori determined functional forms are subjective, and there is no foundation to support any contention that the underlying mechanisms should follow the selected simplistic, or even sophistic, functional forms, except for the cases in which physical processes are completely known.

The assertion that simplistic analysis (a very relative term here) will not work unless the underlying physical processes are completely understood is a warning to all those who attempt to relate trends in climate with trends in other physical phenomena. Wu et al. extracted linear and multidecadal trends from a set of temperature data using EMD. The data used were the annual global surface temperature anomalies analyzed by Jones et al. and posted at the web site of the Climatic Research Unit, University of East Anglia, UK (I'm not going to comment on the data, as the paper was pre-Climategate). The resulting EMD IMFs are shown below.


Means (black lines) and standard deviations (gray lines) of IMFs of 10 different siftings corresponding to S numbers from 4 to 13. Wu et al.

While the first four IMFs were not distinguishable from IMFs generated by pure white noise, the higher ones did pass tests for statistical significance. The authors found that the linear trend gives a warming value of 0.005°C per year, while the overall adaptive trend showed essentially no warming in the mid-19th century and is ≈0.008°C per year currently. The rate of change of the multidecadal trend (the sum of C5 and C6 in the figure above) was higher during the 1860s, 1930s, and 1980s, with those warming intervals separated by periods of temperature decrease.


Rates of change (temporal derivative of a trend, in kelvin per year) for the overall trend (solid red line) and the multidecadal trend (dashed line). Wu et al.
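For reference, the rate-of-change curves in a figure like this are simply the numerical time derivative of an extracted trend. A minimal sketch, assuming `trend` holds trend values at annual spacing (the series below is a placeholder shape, not the CRU data):

    import numpy as np

    # Hypothetical inputs: annual years and an extracted trend component
    # (e.g., the EMD residue or the sum of the slowest IMFs).
    years = np.arange(1850, 2011, dtype=float)
    trend = 0.4 * (1 + np.tanh((years - 1950) / 60.0))  # placeholder shape

    rate = np.gradient(trend, years)  # degrees C per year
    print(f"current rate: {rate[-1]:.4f} degC/yr (~{100 * rate[-1]:.2f} degC/century)")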

Breaking down the underlying signals in the temperature data yields an interesting statistical explanation for the anomalous temperature increases in the 1990s: “The extreme temperature records in the 1990s stand out mainly because the general global warming trend over the whole data length coincides with the warming phase of the 65-year cycle.” Here are the major conclusions from the paper:

  • The linear trend gives a warming value of 0.5°C per century.
  • The overall adaptive trend was close to no warming in the mid-19th century and is ≈0.8°C per century currently.
  • The multidecadal trend rate of change is much larger than the overall adaptive trend but oscillates between warming and cooling.
  • The higher frequency part of the record in recent years is not more variable than that in the 1800s.

From these analyses the predicted temperature rise by 2100, assuming the current trends continue, would only be 0.8°C, significantly less than even the low-end IPCC predictions. This is in fairly close agreement with the statistical break analysis previously reported. From the figure above it would appear that the long-term trend's rate of increase has flattened out while the multidecadal trend, which peaked in 1998, is headed downward and is expected to drive the overall temperature trend into cooling. If the current trend is maintained, the peak of the cooling period will be around 2030, followed by a warming trend that will peak around 2060. As with all non-stationary data, the underlying system may change and the future trends vary in unpredictable ways.

One of the most interesting things they found was that the time scale of the multidecadal trend was not constant, but varied from 50 to 80 years with a mean value slightly higher than 65 years. “Significantly, other than the familiar overall global warming trend, the 65-year cycle really stands out,” state the authors. “The origin of this 65-year time scale is not completely clear because there is no known external force that varies with such a time scale.” Remember, all of this comes strictly from analyzing historical data. It is in no way calculated from the underlying mechanisms that cause Earth's climate to change, as GCMs attempt to do.
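A common way to attach a characteristic time scale to an IMF is to count its zero crossings, roughly two per cycle. A minimal sketch, assuming `imf` holds a single mode sampled annually:

    import numpy as np

    def mean_period_years(imf, dt_years=1.0):
        """Rough characteristic period of an IMF from its zero-crossing count."""
        crossings = np.count_nonzero(np.diff(np.sign(imf)) != 0)
        if crossings == 0:
            return np.inf  # no oscillation resolved in this record
        # An oscillation crosses zero about twice per cycle.
        return 2.0 * len(imf) * dt_years / crossings

    # Illustrative check on a synthetic 65-year oscillation sampled annually
    years = np.arange(160)
    print(mean_period_years(np.sin(2 * np.pi * years / 65.0)))  # ~64 for this short record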

The Predictions Just Keep On Coming

NASA's James Hansen and his posse at GISS have stated that there has been no reduction in the global warming trend since the late 1970s. In a recent paper, they also say that 2010 will likely set a new record high global surface temperature, unless El Niño rapidly weakens by mid-2010 and gets replaced by La Niña conditions. Reportedly, many models do continue to show a steady weakening of El Niño by the summer. Here is what NOAA's National Weather Service Climate Prediction Center has to say about the current El Niño and this coming summer's weather outlook:

Nearly all models predict decreasing SST anomalies in the Niño-3.4 region through 2010, with the model spread increasing at longer lead times (see figure). The majority of models predict the 3-month Niño-3.4 SST anomaly will drop below +0.5°C by May-June-July 2010, indicating a transition to ENSO-neutral conditions near the onset of Northern Hemisphere summer. However, several models suggest the potential of continued weak El Niño conditions through 2010, while others predict the development of La Niña conditions later in the year. Predicting when El Niño will dissipate and what may follow remains highly uncertain.

As those who have been following the recent articles on this web site have seen, three different analyses of climate time series data have yielded predictions much different from the IPCC “projections.” The first focused on testing for structural breaks using the Chow Test. This statistical approach predicts no temperature rise between now and 2050, with only a slight increase of 0.2°C between 2050 and 2100. Details can be found in “Stat Model Predicts Flat Temperatures Through 2050.”
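For readers curious what a structural-break test looks like mechanically, here is a minimal Chow test sketch for a single, pre-specified break in a linear trend fit. The data are synthetic and the break year is an input, not something the snippet discovers; the analysis in the earlier post is considerably more involved.

    # Minimal Chow test for one known break point in a linear trend model
    # y = a + b*t. Synthetic data; real work should use a proper statistics
    # package and handle unknown break dates.
    import numpy as np
    from scipy import stats

    def rss_linear(t, y):
        """Residual sum of squares of an ordinary least-squares line fit."""
        coeffs = np.polyfit(t, y, 1)
        return float(np.sum((y - np.polyval(coeffs, t)) ** 2))

    def chow_test(t, y, break_index, k=2):
        s_pooled = rss_linear(t, y)
        s_split = (rss_linear(t[:break_index], y[:break_index])
                   + rss_linear(t[break_index:], y[break_index:]))
        n = len(y)
        f_stat = ((s_pooled - s_split) / k) / (s_split / (n - 2 * k))
        p_value = stats.f.sf(f_stat, k, n - 2 * k)  # small p => evidence of a break
        return f_stat, p_value

    # Hypothetical series whose trend slope changes at index 60
    rng = np.random.default_rng(1)
    t = np.arange(100, dtype=float)
    y = np.where(t < 60, 0.01 * t, 0.6 + 0.03 * (t - 60)) + 0.05 * rng.standard_normal(100)
    print(chow_test(t, y, break_index=60))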

The second post focused on polynomial cointegration, a term sure to intimidate the non-statistician. In a draft of an unpublished paper that found its way onto the Internet, two economic statisticians claim to have proven that CO2 and other greenhouse gas levels do not correlate with global temperature change in the 20th century. Moreover, they flatly claim that their work “refutes AGW.” Strong stuff. For more information see my post “Econometrics vs Climate Science.”

This third example, EMD, indicates a very modest long-term warming trend that has flattened out in recent years. That trend is swamped by a multidecadal signal that may drive a cooling trend for the next 30 years. What physical mechanism is responsible for the dominant 65-year cycle is not known, but the cointegration analysis paper we previously examined says it is not greenhouse gas levels: CO2 is not driving Earth's temperature change. Which prediction is correct no one knows, and that most definitely includes the world's climate scientists.

As a result of this analytical chaos, some climate scientists are even jumping on the multidecadal cooling trend bandwagon. Dr. Mojib Latif, a professor at the Leibniz Institute at Germany's Kiel University and an author of the UN's IPCC report, believes we're in for 30 years of cooler temperatures—a mini ice age, he calls it, basing his theory on an analysis of natural cycles in water temperatures in the world's oceans. Professor Latif said: “A significant share of the warming we saw from 1980 to 2000 and at earlier periods in the 20th century was due to these cycles—as much as 50 per cent.” Others are sticking to the familiar predictions of doom and disaster. Was there not a “consensus” among the world's scientists? Wasn't this all supposed to be “settled” science?

It could be that Hansen's GISS gang, seeing that ENSO is currently in an El Niño phase, is betting that the current trend in the Pacific will last through the summer months. That could bring a spell of hot weather, allowing them to crow about the return of global warming. What happened to all of the stern warnings that weather isn't climate and that a single hot year or cold year is not statistically important? Indeed, such events may be part of the white noise described in the EMD paper. It could also be that some climate scientists are simply hedging their bets with this cooling trend talk. Given that the misapplication and misinterpretation of statistics are climate science's dirtiest secret, little credence can be given to any predictions about future climatic conditions. They may as well be casting horoscopes and reading the entrails of sheep.

Be safe, enjoy the interglacial and stay skeptical.

IPCC Says Climate Prediction Impossible

The statement below appeared in the Executive Summary of Chapter 14 of the report produced by Working Group 1.

    The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.

For more look here.

WMO El Niño/La Niña Update

This is the latest from the World Meteorological Organization:

The ongoing El Niño event continues to have significant and widespread impacts. The signature of this event, which started in June 2009, is seen in basin-wide Pacific Ocean conditions, and in many of the climate patterns around the world that are typically impacted by an El Niño event. The most likely outcome by mid-year 2010 is for the El Niño event to have decayed and near-neutral conditions to be re-established across the tropical Pacific. However, this time of year (March-June) is a particularly difficult period for forecasting developments in the tropical Pacific, and forecasters cannot rule out persistence of El Niño or the possible early stages of La Niña by mid-year. Even during the decaying phase of the El Niño expected over the next few months, the conditions associated with it will continue to influence climate patterns at least through the second quarter of the year, and this information will be contained in the available national and regional seasonal climate forecast assessments.

In summary:
  • A basin-wide El Niño event continues. Its strength peaked in November-December 2009 at a moderate level. So far, declines in strength have only been modest;
  • Further decline in strength is expected over the next few months, but considerable uncertainty remains regarding the timing and rate of decay. Nonetheless, the most likely outcome by mid-year is considered to be near-neutral;
  • The period March-June is a particularly difficult time of the year for forecasting tropical Pacific developments, and while near-neutral is considered the most likely outcome by mid-year, it is still considered possible for El Niño to persist or for the early stages of a La Niña to be present by mid-year;
  • It is however important to recognize that impacts of the current El Niño are expected to continue to be felt in many parts of the world through at least the second quarter of 2010. This is because impacts on many climate patterns both close to and remote from the Pacific, can occur even during the decay phase of an El Niño event.
For more see: http://www.wmo.int/pages/prog/wcp/wcasp/enso_update_latest.html

11-year cycle?

Is it my imagination, or does the "C3" IMF show an 11-year period? I am not a statistician; I confess to being impressed with EMD if it can deduce the sunspot cycle from the historical data with no a priori model.

11 year cycle

Well, you may very well be right. It certainly looks like an 11 year cycle is an included component. My guess (yes, what follows is mere opinion!) is that the double-sunspot 22 year cycle, which has been implicated in affecting cosmic-ray-generated cloud formation, is an even better match than the 11 year cycle by itself. What may be happening is that the natural turnover frequency of Earth's ocean thermal reservoir is somewhere close (but not exactly equal) to three times the 22 year sunspot cycle. We get a resonance between the varying cosmic ray induced cloud cover (caused by the solar cycle) and the approximately 66 year ocean cycle. We also get (since the match is not perfect) an occasional jump, or beat frequency, that takes place when the mismatch between sun and ocean cycles becomes too great.

Speaking of the ocean thermal oscillation cycle, it may also be that it is a hemispherical swap, one that gives us a warmer Northern Hemisphere (and a cooler Southern Hemisphere) for approximately 30+ years and then reverses for the next 30+ years. Darned interesting stuff! Unfortunately the current so-called "value added data" makes it impossible to speak more certainly.

Just an opinion. I am not a climate scientist and do not play one on TV.

Logging in

We have debated the merits of restricting posting to members only and decided against it, as doing so might turn away individuals who would comment meaningfully in passing but are unwilling to take the time to join yet another website (at least one of the TRE authors often runs into this situation on other sites). Those who do take the time to join the TRE website can stay logged in and avoid the repeated captcha challenges that afflict those who remain anonymous, so membership has its advantages. We want to make participation as easy as possible without throwing the doors open to a flood of malicious spam and commercial messages (hence the hated but unfortunately necessary captcha). We do appreciate your membership and your participation.

Solar cycles and cloud formation

Hey Suricat, regarding the 22 year vs 11 year solar cycles, yes, I think that there is indeed a difference. Pardon me for not finding a link to the info on it, but my memory (fallible though it is!) is that the solar wind of a sunspot-active sun does help shield the Earth from cosmic rays -- rays which promote cloud formation and thus cooling of the surface. The alignment of the solar magnetic field with the Earth's magnetic field can either weaken or strengthen the effect.

If you are interested in the effects of cosmic rays, clouds and climate, please check http://cdsweb.cern.ch/record/1181073
and watch the video there. CERN has a good reputation and I tend to give their info credence. The video is rather long, but if you find the subject interesting I believe you will be astonished by the info and by how much sense it makes.

I'll try to track down some info on the 22 yr vs 11 yr thing.

Not really "Anonymous,"
Best regards,
Jason Calley

website about statistics

A website with a statistical emphasis that I highly recommend is http://www.numberwatch.co.uk/ by John Brignell.
If you want a window into the politics of the UK and exposure of the statistical absurdities of epidemiologists, presented with a good dose of humor (or humour), this is the place to go.