The Science of Deception

Much was made in the media of a new report claiming that modern-day temperatures are the highest in 5,000 years. Moreover, the investigators assert that this century's temperature rise is “unprecedented,” echoing the assertions of climate change alarmists over the past 30 years. Various news outlets seized upon this report as final proof that the world is headed for a hot, steamy demise because of human greenhouse gas (GHG) emissions. There are, however, a number of problems with that assertion. First among them is the methodology used to generate the global temperature history and the comparison of proxy data with instrument data from recent times. This may be science, but it is being used to deceive the public into believing that anthropogenic global warming (AGW) is a crisis on an unprecedented scale.

Appearing in the journal Science—a publication with a notably biased stance regarding the theory of AGW—the report of a new study of historical global temperatures has reignited global warming fever in the news media and blogosphere:

  • Past Century's Global Temperature Change Is Fastest On Record – Christopher Joyce on America's National Public Radio.

  • The world is hottest it has been since the end of the ice age - and the temperature's still rising – the UK's Independent.

  • Global warming is epic, long-term study says – Ben Brumfield on CNN.

  • Global Warming Has Already Caused Unprecedented Change – John Light on

  • Global warming escalation: Alarming new study shows rapid increase of hell on Earth – the Prairie Dog Press on

Aside from mostly getting the story wrong—the report did not claim that the present is the hottest time since the end of the ice age—the news mavens were beside themselves with glee. Nothing like pending disaster to help increase ratings and readership. The level of scientific ignorance among the news media “experts” who cover the environment really is amazing. For them, climate change is the gift that keeps on giving.

The study itself, titled “A Reconstruction of Regional and Global Temperature for the Past 11,300 Years,” was authored by Shaun A. Marcott, Jeremy D. Shakun, Peter U. Clark, and Alan C. Mix. Beginning with a bow to climate science convention, the report's abstract is not as bombastic as the media reporting but clearly does obeisance to global warming dogma.

Surface temperature reconstructions of the past 1500 years suggest that recent warming is unprecedented in that time. Here we provide a broader perspective by reconstructing regional and global temperature anomalies for the past 11,300 years from 73 globally distributed records. Early Holocene (10,000 to 5000 years ago) warmth is followed by ~0.7°C cooling through the middle to late Holocene (<5000 years ago), culminating in the coolest temperatures of the Holocene during the Little Ice Age, about 200 years ago. This cooling is largely associated with ~2°C change in the North Atlantic. Current global temperatures of the past decade have not yet exceeded peak interglacial values but are warmer than during ~75% of the Holocene temperature history. Intergovernmental Panel on Climate Change model projections for 2100 exceed the full distribution of Holocene temperature under all plausible greenhouse gas emission scenarios.

No wonder the warmist media was excited. But should they be? Notice that even in the abstract, the authors admit that current temperatures have not exceeded the temperatures of more than 5,000 years ago, a time commonly known as the Holocene Climate Optimum. In fact, they reinforce the commonly held history of Holocene temperatures, including the Little Ice Age, a cool period that several previous studies have tried to erase. Even more important is how this study was conducted and the experimental limits on its results. To examine these we need to understand something about the use of foraminifera as a proxy for temperature.

The accepted climate history of the Holocene. Source: The Resilient Earth.

In 1947, Harold Urey, a nuclear chemist at the University of Chicago, discovered a way to measure ancient temperatures. The secret was in the oxygen isotopes found in fossil sea shells. The amount of different oxygen isotopes that an organism takes up from sea water varies according to the water's temperature while it was alive. Basically, the ratio of 18O to 16O serves as a proxy thermometer.
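The proxy-thermometer idea can be sketched numerically. Below is a minimal illustration, assuming a Shackleton-style calcite calibration; the exact coefficients vary between published calibrations, and the function names and standard ratio here are illustrative, not taken from the Marcott study:

```python
def delta18o(ratio_sample, ratio_standard=0.0020052):
    """Per-mil (permille) deviation of a sample's 18O/16O ratio from a
    reference standard. 0.0020052 is the VSMOW 18O/16O ratio, used here
    purely for illustration."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

def paleotemperature(delta_calcite, delta_water=0.0):
    """Estimated calcification temperature (deg C) from calcite delta-18O,
    using Shackleton (1974)-style coefficients. Heavier calcite (higher
    delta) implies colder water."""
    d = delta_calcite - delta_water
    return 16.9 - 4.38 * d + 0.10 * d ** 2
```

With these assumed coefficients, a calcite sample isotopically identical to the ambient water yields 16.9 °C, and each per-mil enrichment in 18O lowers the inferred temperature by roughly 4 °C.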

In 1955, Cesare Emiliani, a geology student from Italy working in Urey's laboratory, used this method to create a record of historical temperatures. Emiliani measured the oxygen isotopes in microscopic shells from foraminifera, a kind of ocean plankton. Tracking the shells layer by layer in long cores of clay extracted from the seabed, he constructed a record of temperature variations. This is essentially the same method used in the Marcott et al. paper.

There are a number of problems with using foraminifera sediment data to reconstruct a history of ancient temperatures. Here is a quote from chapter 3 of the Global Warming Field Guide by John Weaver, Jeff Braun, and William J. Szlemko, “Measuring Temperature in the Distant Past, The Art of Developing Temperature Proxy Data”:

The theory is that these creatures form near the surface at a certain sea temperature, then fall to the ocean floor over time, leaving behind a permanent record of historic sea temperatures – which are loosely related to air temperatures. Unfortunately, settling rates can be variable due to ocean currents and other, lesser, factors. Plus, settling rates are extremely slow for these tiny shells, so there can be mixing of many years before they actually make it all the way to the bottom. In the end, there is good news and bad news. The good news is that sediment layers contain much longer records than do ice core samples. The bad news is that it is nearly impossible to resolve the year-to-year differences that are possible with ice core data. The resolution for sediment cores is more likely on the order of hundreds of years, although the records cover several million years. In this sense, perhaps, ice core data and sediment cores sort of provide complementary information.

Scientists know that all sources of palaeoclimatic proxy data differ in their resolution, spatial coverage, and the time period to which they pertain. Several types of uncertainty lurk in the proxy method used in this study, including temporal, spatial, and measurement uncertainty. To start with, the number of samples is fairly small, only 73 scattered around the world, and they vary in quality. The Southern Hemisphere is acknowledged to be underrepresented. Then there is the resolution with which the proxy temperatures can be dated.

“The 73 globally distributed temperature records used in our analysis are based on a variety of paleotemperature proxies and have sampling resolutions ranging from 20 to 500 years, with a median resolution of 120 years,” the authors state. How did they compensate for differences among their datasets? “We account for chronologic and proxy calibration uncertainties with a Monte Carlo–based randomization scheme,” they explain.
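As a rough sketch of what such a Monte Carlo scheme can look like in principle (the uncertainty magnitudes, grid, and function names below are assumptions for illustration, not the authors' actual procedure):

```python
import random

def interp(x, xs, ys):
    """Piecewise-linear interpolation of (xs, ys) at x; xs must ascend."""
    if x <= xs[0]:
        return ys[0]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            f = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + f * (ys[i] - ys[i - 1])
    return ys[-1]

def monte_carlo_stack(ages, temps, grid,
                      age_sigma=150.0, temp_sigma=0.3,
                      n_draws=500, seed=0):
    """Jitter each proxy point's age and temperature within assumed
    1-sigma uncertainties, interpolate each realization onto a common
    time grid, and average across draws. Sigmas are invented here."""
    rng = random.Random(seed)
    sums = [0.0] * len(grid)
    for _ in range(n_draws):
        jittered = sorted(
            (a + rng.gauss(0, age_sigma), t + rng.gauss(0, temp_sigma))
            for a, t in zip(ages, temps))
        xs = [a for a, _ in jittered]
        ys = [t for _, t in jittered]
        for i, g in enumerate(grid):
            sums[i] += interp(g, xs, ys)
    return [s / n_draws for s in sums]
```

The point of such a scheme is that the ensemble mean absorbs the dating and calibration uncertainty, but at the cost of smearing any sharp features across the width of the age errors.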

“In addition to the previously mentioned averaging schemes, we also implemented the RegEM algorithm to statistically infill data gaps in records not spanning the entire Holocene, which is particularly important over the past several centuries.” It seems that their complete record of Holocene temperature contains many gaps and uncertainties that have been filled with estimates and randomness. That is far from the conclusive record of global temperatures trumpeted by the media.

In the past a number of studies based on tree rings have gotten things notoriously wrong—for example, Mann's “hockey stick” graph resulted from the improper merging of several tree ring studies. “Published reconstructions of the past millennium are largely based on tree rings and may underestimate low-frequency (multicentury-to-millennial) variability because of uncertainty in detrending ... whereas our lower-resolution records are well suited for reconstructing longer-term changes,” the authors claim. Some of that older data is provided for comparison in the figures below with descriptions by the authors.

Comparison of different methods and reconstructions of global and hemispheric temperature anomalies. (A and B) Globally stacked temperature anomalies for the 5° × 5° area-weighted mean calculation (purple line) with its 1σ uncertainty (blue band) and Mann et al.'s global CRU-EIV composite mean temperature (dark gray line) with their uncertainty (light gray band). (C and D) Global temperature anomalies stacked using several methods (Standard and Standard5x5Grid; 30x30Grid; 10-lat: Arithmetic mean calculation, area-weighted with a 5° × 5° grid, area-weighted with a 30° × 30° grid, and area-weighted using 10° latitude bins, respectively; RegEM and RegEM5x5Grid: Regularized expectation maximization algorithm-infilled arithmetic mean and 5° × 5° area-weighted). The gray shading [50% Jackknife (Jack50)] represents the 1σ envelope when randomly leaving 50% of the records out during each Monte Carlo mean calculation. Uncertainties shown are 1σ for each of the methods. (E and F) Published temperature anomaly reconstructions that have been smoothed with a 100-year centered running mean, Mann08Global, Mann08NH, Moberg05, WA07, Huange04, and plotted with our global temperature stacks [blue band as in (A)]. The temperature anomalies for all the records are referenced to the 1961–1990 instrumental mean. (G and H) Number of records used to construct the Holocene global temperature stack through time (orange line) and Mann et al.'s reconstruction (gold vertical bars). Note the y axis break at 100. The latitudinal distribution of Holocene records (gray horizontal bars) through time is shown. (I and J) Number of age control points (e.g., 14C dates) that constrain the time series through time.
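The area-weighting mentioned in the caption deserves a note: grid cells of a fixed size in degrees cover less surface area toward the poles, so records are down-weighted accordingly. A toy sketch of the idea, assuming a simple cosine-of-latitude weight per record rather than true 5° × 5° gridding (not the authors' code):

```python
import math

def area_weighted_mean(records):
    """Area-weighted global mean of (latitude_deg, value) records.
    Each record is weighted by the cosine of its latitude, since a
    one-degree cell spans less area near the poles than at the equator.
    A simplified stand-in for gridded area weighting."""
    num = den = 0.0
    for lat, value in records:
        w = math.cos(math.radians(lat))
        num += w * value
        den += w
    return num / den
```

For example, a record at 60° latitude receives half the weight of an equatorial record, so mixing one of each pulls the mean toward the equatorial value.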

Notice how they include a sharp uptick in temperature at the end of each graph, supposedly the unprecedented temperatures mentioned at the beginning of the article. The only reason modern temperatures seem so anomalous is that the older temperatures are averages, spanning at least 20 years and as many as 500, due to the inherent temporal uncertainty in the foram data. Modern temperatures, taken directly with instruments, should not be compared directly with the foram proxy derived data. To present an unbiased representation, the graphs should all have stopped a decade ago.
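The smoothing effect is easy to demonstrate: a short, sharp excursion nearly vanishes once it is averaged at proxy-like resolution. A self-contained sketch, with an invented annual series and a 120-year window standing in for the median proxy resolution:

```python
def running_mean(series, window):
    """Centered moving average; stands in for the coarse temporal
    resolution of sediment proxy records."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

# Annual anomalies: flat at 0 with a decade-long 2-degree spike.
annual = [0.0] * 400
for year in range(195, 205):
    annual[year] = 2.0

smoothed = running_mean(annual, 120)
print(max(annual), round(max(smoothed), 2))  # 2.0 0.17
```

A 2 °C decade-long spike survives as less than 0.2 °C in the smoothed record, which is why a proxy curve cannot fairly be compared against unsmoothed instrument data.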

“Because the relatively low resolution and time-uncertainty of our data sets should generally suppress higher-frequency temperature variability, an important question is whether the Holocene stack adequately represents centennial- or millennial-scale variability,” the authors confess. What this means is that sudden excursions in temperature during times past would be totally undetectable from the proxy data. To merge modern data with historical proxy data in this manner is disingenuous to say the least and, given the tone of the abstract, might even be construed as intentionally misleading.

What was the hottest year of the Holocene? What was the hottest decade? No one knows, but they almost certainly occurred during the “temperature plateau” extending from 9500 to 5500 years ago. Even by the flawed comparisons made in this paper, the averaged temperatures during the Holocene Climate Optimum were higher than today's. If the year-to-year variability could somehow be reconstructed, we would surely find many years with much higher temperature spikes during that 4,000-year period of global warmth.

While this study is interesting and useful in its results, it is clear that the authors tried to spin those results into a prop for the failed AGW theory. It certainly had the intended effect: news outlets around the world announced new conclusive proof of AGW causing an unprecedented temperature increase, none of them realizing, or perhaps caring, that the comparison was invalid. This may be science, but it is science distorted; it is the science of deception.

Be safe, enjoy the interglacial and stay skeptical.

McIntyre vs. Barton (R-TX)

McIntyre e-mailed Dr. Mann, requesting that he supply the raw data used to build the hockey stick. The feature story on McIntyre caught the attention of Rep. Joe Barton (R-TX), chair of the House Committee on Energy and Commerce. Barton later made his own request for the raw data from the hockey-stick analysis. McIntyre told Environmental Science & Technology (ES&T) that Joe Barton's congressional staff called him shortly after the WSJ article: "They wanted to know if I had spoken to the Wall Street Journal and if the article was true," McIntyre said nonchalantly, while cleaning his glasses.

In late June, Barton sent out letters to Mann and his colleagues that referenced McIntyre's study. Barton's letters were later criticized by professional societies. For example, the American Association for the Advancement of Science (AAAS) and the AGU protested Barton's "intrusion into the scientific process" with an 11-page point-by-point refutation of every issue raised by Barton.

The Depths of the Deception Revealed

The Marcott paper in Science has become the laughing stock of the climate skeptic world, with new reports of the work's bogosity coming in every hour. In a post over at WUWT, Willis Eschenbach has done an analysis of the raw data used in the Marcott paper. His first plot shows all of the proxies used. The colors are only to distinguish individual records, they have no meaning otherwise.

The second shows the end result, with all the Marcott proxies, expressed as anomalies about their most recent 2,000 years of record. The black line shows a 401-point Gaussian average.
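For readers unfamiliar with the filter, a Gaussian average weights nearby points by a bell curve rather than equally. A minimal sketch of such a filter; Eschenbach's exact kernel width is not stated here, so the sigma choice below is an assumption:

```python
import math

def gaussian_smooth(series, n_points=401, sigma=None):
    """Centered Gaussian-weighted running average over an n_points
    window, renormalizing the weights near the series edges. The
    default sigma puts ~99.7% of the kernel inside the window."""
    if sigma is None:
        sigma = n_points / 6.0
    half = n_points // 2
    offsets = list(range(-half, half + 1))
    weights = [math.exp(-0.5 * (k / sigma) ** 2) for k in offsets]
    out = []
    for i in range(len(series)):
        num = den = 0.0
        for k, w in zip(offsets, weights):
            j = i + k
            if 0 <= j < len(series):
                num += w * series[j]
                den += w
        out.append(num / den)
    return out
```

Like the running mean, this filter passes a constant series through unchanged while strongly attenuating anything shorter than the kernel width.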

Here is what Eschenbach has to say about these data and the conclusions the authors conjured from them...

A fine example of their choice of proxies can be seen in the fact that they’ve included a proxy which claims a cooling about nine degrees in the last 10,000 years … although to be fair, they’ve also included some proxies that show seven degrees of warming over the same period

I’m sorry, guys, but I’m simply not buying the claim that we can tell anything at all about the global temperatures from these proxies. We’re deep into the GIGO range here. When one proxy shows rising temperatures for ten thousand years and another shows dropping temperatures for ten thousand years, what does any kind of average of those two tell us? That the temperature was rising seven degrees while it was falling nine degrees?

It is obvious how unsupportable the paper's conclusions are, particularly when comparing recent temperatures with historical norms. The Science paper isn't just bad science, it is intentionally deceptive false science. That a major journal would publish such pap beggars the imagination. I can only conclude that the climate change zealots at the AAAS have colluded with the authors to intentionally mislead the public.

See here for the full story.

New hockey stick is bogus

According to Steve McIntyre at Climate Audit, something odd has been discovered about the provenance of the work associated with the Marcott et al. paper. It seems that the sharp uptick wasn’t in the thesis Marcott defended for his PhD, but is in the paper submitted to Science.

Author admits the new hockey-stick is bogus

In an email to Steve McIntyre, the lead author of the new hockey stick admits that the primary finding of the paper, the alleged rapid rise in temperature from 1890 to 1950, is "probably not robust." Translation: it is not statistically significant, and the hyped press releases from the authors are neither scientifically nor statistically valid. In the email Shaun Marcott is quoted as saying:

    Please note that we clearly state in paragraph 4 of the manuscript that the reconstruction over the past 60 yrs before present (the years 1890 − 1950 CE) is probably not robust because of the small number of datasets that go into the reconstruction over that time frame. Specifically, we concluded this based on several lines of evidence.
    Regarding the SH reconstruction: It is the same situation, and again we do not think the last 60 years of our Monte Carlo reconstruction are robust given the small number and resolution of the data in that interval.

In other words, the recent part of the graph is bullshit.


Extending data by using proxy variables not really useful

Attempts by alarmists to prove that anthropogenic climate change is taking place by comparing thousands of years of proxy-derived temperatures to a few years of instrument-recorded temperatures are disingenuous, as Doug explains, especially when the extended data have a temporal resolution of well over 100 years. Remember Mann’s hockey stick!

I have been watching for someone to show a correlation between instrument measured temperatures and other measured data such as CO2, irradiance, and sun spots. CO2 has been steadily rising during the last 15 years with no increase in temperature, in spite of the IPCC model projections. Researchers have indicated in the past that changes in irradiance were too small to cause any significant climate change. This blog has several posts on the impact of the sun’s activity through sun spots affecting climate; however, it is difficult to quantitatively correlate the 11-year sun spot cycles to temperatures.
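For anyone wanting to test such relationships themselves, the standard starting point is the Pearson correlation coefficient between two measured series. A minimal sketch (the input series are placeholders; a real analysis would also need to handle lags and statistical significance):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series:
    covariance divided by the product of standard deviations. Returns a
    value in [-1, 1]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

A perfectly linear relationship gives r = 1 (or −1 if inverse); a flat 15-year CO2-versus-temperature scatter of the kind described above would give r near zero.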

In a March 8, 2013 short summary paper by Carter, Soon & Briggs, graphs were plotted for the period 1850 to 2010 for the contiguous US, the Arctic, and China. The graphs plotting total irradiance against maximum temperature show a strong correlation between the two variables. The authors used the BEST reconstructed global land temperature record compiled by Professor Richard Muller (University of California, Berkeley). The authors are surprised that Professor Muller said, “CO2 controls our changing temperature.” The authors’ selection of maximum daily temperature over average daily temperature is supported by US National Center for Atmospheric Research scientists Drs. Harry van Loon and Gerald Meehl. The authors speculate that cloud cover plays a substantial role in modulating the amount of solar radiation reaching the earth’s surface.

Don Farley, Gatineau, Quebec Canada