The threat of widespread and persistent drought, ruining crops and threatening water supplies, is constantly cited as an outcome of global warming. Media talking heads, climate scientists (who should know better) and even the American President have all made this assertion—and there is nothing to back up the claim. Results presented recently at the annual assembly of the European Geosciences Union in Vienna show that forecasting drought is still beyond the reach of current climate models. Models run against historical data have either predicted periods of drought at the wrong times or missed them altogether. Yet climate alarmists continue to spread this pernicious lie, preaching damnation with the certitude of an Old Testament prophet.
For some reason a lot of people have become fixated on Antarctic ice—is it waxing or waning, accumulating or melting? Climate alarmists have striven mightily to show that ice at the poles is on the decline, melting in the face of rising global temperatures. Antarctica, with the largest store of glacial ice on the planet, is the primary focus of attention. If Antarctica’s ice sheets were to melt it would be a calamity for mankind. Unfortunately, Earth's climate system contains many cyclic trends, operating on decadal and longer periods of time. In the past, what some claim are clear trends have turned out to be only short term in nature. A new report, just published online, concludes that it is unclear if changes in atmospheric circulation over West Antarctica during the past few decades are part of a longer-term trend. In fact, ice cores reveal a significant increase in oxygen isotope ratios in precipitation over the past 50 years, but the anomaly cannot be distinguished from natural climate variability.
There was much made in the media about a new report that claims modern day temperatures are the highest in 5,000 years. Moreover, the investigators assert that this century's temperature rise is “unprecedented,” echoing the assertions of climate change alarmists over the past 30 years. Various news outlets seized upon this report as final proof that the world is headed for a hot steamy demise because of human greenhouse gas (GHG) emissions. There are, however, a number of problems with that assertion. First among them is the methodology used to generate the global temperature history and the comparison of proxy data with instrument data from recent times. This may be science but it is being used to deceive the public into believing that anthropogenic global warming (AGW) is a crisis on an unprecedented scale.
NASA says that something unexpected is happening on the Sun. This year, 2013, is supposed to be the peak of the 11-year sunspot cycle—the year of Solar Max. Yet solar activity is well below the expected level. Our somnolent star refuses to behave according to the predictions of Sun-watching scientists, leading some observers to wonder if forecasters missed the mark. The botched solar forecast not only has implications for our understanding of the physical processes inside the Sun, it has possible links to future climate change here on Earth. Scientists admit that no one knows for sure what the Sun will do next.
Instrument data from the last 160 years indicate a general warming trend during that span of time. However, when this period is examined in the light of palaeoclimate reconstructions, the recent warming appears to be a part of more systematic fluctuations. Specifically, it is an expected warming period following the 200-year “Little Ice Age” cold period. Moreover, a new study of the natural variability of past climate, as seen from available proxy information, finds a synthesis between the Milankovitch cycles and Hurst–Kolmogorov (HK) stochastic dynamics—a result that shows multi-scale climate fluctuations cannot be described adequately by classical statistics.
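The Hurst–Kolmogorov claim is, at bottom, a statement about how variance scales across time scales: in an HK process, long-range persistence makes averages converge far more slowly than classical statistics assumes. A minimal Python sketch of one standard estimator of the Hurst coefficient, the aggregated-variance method, run on synthetic data only (nothing here is taken from the study itself):

```python
import numpy as np

def hurst_aggvar(x, scales=(1, 2, 4, 8, 16, 32)):
    """Estimate the Hurst coefficient by the aggregated-variance method.

    For an HK (fractional Gaussian noise) process, the variance of block
    means over blocks of size k scales as k**(2H - 2); we fit that slope
    on a log-log plot and solve for H.
    """
    x = np.asarray(x, dtype=float)
    log_k, log_var = [], []
    for k in scales:
        n = len(x) // k
        block_means = x[:n * k].reshape(n, k).mean(axis=1)
        log_k.append(np.log(k))
        log_var.append(np.log(block_means.var()))
    slope, _ = np.polyfit(log_k, log_var, 1)
    return 1.0 + slope / 2.0

rng = np.random.default_rng(42)
white = rng.normal(size=4096)       # uncorrelated noise: expect H near 0.5
print(hurst_aggvar(white))
```

Classical (uncorrelated) noise yields H near 0.5; HK persistence shows up as H well above 0.5, which is exactly the regime where standard confidence intervals become badly overoptimistic.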
Once again the fear-mongering hordes of lay-climatologists are denouncing the importance of the Medieval Warm Period (MWP), or Medieval Climate Optimum as some refer to it. On the strength of a single new study, involving algal lipids from a lake in Svalbard, Norway, one pundit referred to the “so-called” Medieval Warm Period and impugned the significance of the Little Ice Age for good measure. Of course the writer, an “ecological journalist,” saw nothing wrong with using such a dismissive and pejorative term since it is IPCC-promoted doctrine that today things are warmer than they have ever been since the onset of the Holocene. Yet literally hundreds of other studies have shown that the MWP was as warm or warmer than today and that is the real consensus among paleoclimatologists.
A study of ancient volcanic ash found at key archaeological sites across Europe suggests that early modern humans were more resilient to climate change and natural disasters than commonly thought. The study, which appeared in PNAS, analyzed volcanic ash from a major eruption that occurred in Europe around 40,000 years ago. The volcano spewed so much ash that the event probably created winter-like conditions and a sudden colder shift in climate. Scientists have generally suggested that the decline of our cousins the Neanderthals, and the consequent spread of modern humans, was primarily due to ancient volcanic eruptions and deteriorating climate conditions, but this study shows that stone-age man rolled with the punches and shrugged off the sudden shifts in climate. This new evidence flies in the face of modern predictions that a shift of a few degrees in average yearly temperature will decimate human populations worldwide.
One of the main problems with the “theory” of anthropogenic global warming is its reliance on rising atmospheric CO2 levels to force a global rise in temperature. This is predicted by climate change proponents by running large, complex computer models that imperfectly simulate the physics of Earth's biosphere: ocean, land and atmosphere. Central to tuning these general circulation models (GCM) is a parameter called climate sensitivity, a value that purports to capture in a single number the response of global climate to a doubling of atmospheric carbon dioxide. But it has long been known that the Earth system is constantly changing—interactions shifting and factors waxing and waning—so how can a simple linear approximation capture the response of nature? The answer is, it cannot, as a new perspective article in the journal Science reports.
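To see what that “single number” actually does, here is the textbook linear relation in miniature: forcing from a CO2 change is approximated logarithmically (the Myhre et al. 1998 fit) and multiplied by a sensitivity parameter. A back-of-envelope Python sketch, with a purely illustrative sensitivity value rather than anything taken from an actual GCM:

```python
import math

def delta_t(c_new_ppmv, c_old_ppmv, sensitivity_k_per_wm2=0.8):
    """Equilibrium warming under the simple linear-sensitivity picture.

    Forcing: dF = 5.35 * ln(C / C0) W/m^2 (standard logarithmic fit);
    response: dT = lambda * dF. The 0.8 K per W/m^2 sensitivity is an
    illustrative assumption, not a measured constant.
    """
    forcing = 5.35 * math.log(c_new_ppmv / c_old_ppmv)
    return sensitivity_k_per_wm2 * forcing

# A doubling of CO2 gives dF = 5.35 * ln(2), about 3.7 W/m^2:
print(round(delta_t(560, 280), 1))  # -> 3.0 with the illustrative lambda
```

The whole debate over climate sensitivity is, in effect, a debate over the value of that one multiplier, which is precisely why critics object to compressing a shifting, nonlinear system into it.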
That large changes in solar radiation can affect Earth's climate is widely accepted. However, the hypothesis of solar-induced centennial to decadal climate changes, which suggests feedback mechanisms in the climate system amplifying even small solar variations, has not found acceptance among orthodox climate scientists. The climate change clique would rather place their money on greenhouse gases—human generated CO2 in particular. It is true that satellite-based measurements of total solar irradiance show that mean variations during solar cycles do not exceed 0.2 W m⁻² (~0.1% of the Sun's energy output). It has also been noted that relatively large variations of 5–8% in the ultraviolet (UV) frequencies can occur, though how this could change global climate remained a puzzlement—but perhaps no longer. From studying a significant climate shift 2,800 years ago, a group of scientists has concluded that large changes in solar UV radiation can, indeed, affect climate by inducing atmospheric changes.
The last interglacial period (LIG)—the Eemian—is commonly believed by scientists to have been warmer than the current Holocene interglacial. Along with that balmier climate there is evidence that sea levels were significantly higher than today. Previous studies have pegged Eemian sea levels at 4 to 6 m higher than today. Recently, a new investigation raises that estimate, reporting that ancient sea levels peaked between 6.6 and 9.4 m (~20 to 30 feet). Modern day accounts of flooding in low-lying coastal areas and tropical islands abound, with ominous suggestions of links to global warming. How high the oceans will rise is a topic of debate for IPCC members, the news media and assorted climate alarmists, but they are asking the wrong question. Instead, they should ask why sea levels are so low.
Between 15 and 20 million years (Myr) ago, Earth's climate took a pause during its long slide into the Pleistocene Ice Age for a period of real global warming. During this relatively brief time glaciers around the world retreated and there are indications that, at least around the edges of the continent, there was significant vegetation on Antarctica. Temperatures may have been as much as 11°C higher than today. Scientists say this global warm spell took place under CO2 levels in the range of 190–850 ppmv, spanning values both significantly lower and higher than today's 390 ppmv. It is hoped that studying conditions during the Miocene warming can provide constraints on the fundamental laws governing the climate system. Why? If the Pleistocene Ice Age is truly coming to an end, as some have said, this may be the climate of the future.
After decades of debunking and statements by responsible scientists that climate is not weather and individual anomalies are not an indication of climate change, the government-funded IPCC lackeys at the UK's Met Office and America's National Oceanic and Atmospheric Administration have publicly attributed recent bad weather events to man-made climate change. These irresponsible boffins' shrill claims illustrate the desperation in the anthropogenic global warming (AGW) camp in the face of declining public concern over climate change. While admitting that it is impossible to blame a single event on global warming, climate alarmists are claiming attribution is possible as long as it is framed in terms of probability. They have gone from lies, to damn lies and now, finally, to statistics.
It is accepted that the ancient Sun was considerably cooler than our local star is today, so much so that Earth a few billion years ago should have been a lifeless frozen ball. But scientists have also shown that the planet was not frozen—shallow seas warmer than any modern ocean abounded with microbial life. A recent study, detailed in the journal Nature, is a good example of the sometimes convoluted, even improbable reasoning used to get a handle on earthly climates during eons long vanished. Using the fossilized impact dimples from raindrops that fell 2.7 billion years ago, researchers have calculated new limits on the density of Earth's atmosphere. This, in turn, has implications for the development of the ancient atmosphere and what role greenhouse gases may have played in warming the young Earth.
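The physics behind the raindrop argument is simple enough to sketch: a drop's terminal velocity falls as air density rises, so the size of a fossil splash constrains how thick the ancient atmosphere could have been. The crude rigid-sphere drag model and all values below are illustrative assumptions, not figures from the Nature study:

```python
import math

def terminal_velocity(radius_m, rho_air, cd=0.5, rho_water=1000.0, g=9.81):
    """Terminal velocity of a falling drop under a crude drag model.

    Balances weight against drag: v = sqrt(2 m g / (rho_air * cd * A)),
    treating the drop as a rigid sphere (a rough approximation).
    """
    mass = rho_water * (4.0 / 3.0) * math.pi * radius_m ** 3
    area = math.pi * radius_m ** 2
    return math.sqrt(2.0 * mass * g / (rho_air * cd * area))

# The same 2 mm drop in today's air versus air twice as dense:
v_now = terminal_velocity(0.002, 1.2)
v_dense = terminal_velocity(0.002, 2.4)
print(round(v_now, 1), round(v_dense, 1))  # denser air -> slower impact
```

Since v scales as 1/sqrt(rho_air), a much denser ancient atmosphere would have produced gentler impacts and smaller dimples; working that logic backward from the fossil imprints is what lets the researchers bound the ancient air density.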
Just when it looked like the climate catastrophists had slunk back into well deserved academic obscurity, a new report in the journal Nature Geoscience has resurrected claims of Earth's impending climatic demise. A new computer climate study says to expect increases in temperature of up to 3°C by 2050, confirming or exceeding predictions made in the IPCC reports. Can this model-based report be considered any more accurate than previous attempts? Have modeling techniques suddenly improved? Or is this report's appearance in a major scientific journal the signal of a renewed round of scaremongering by eco-alarmists?
The dire results of anthropogenic global warming have become passé. Treated by the news media and climate alarmists as established scientific fact, the IPCC's vision of a dystopian future, a world ravaged by global warming, is fed to our children in school, on TV shows and in Hollywood movies. What is never mentioned is that even the IPCC's predictions encompass several ranges of possible outcomes, all predicated on a seemingly simple but mysterious factor called climate system sensitivity. A recent study, published in the journal Science, used spatially more complete paleoclimate data for the Last Glacial Maximum (LGM) in an effort to improve previous estimates of climate sensitivity. The new results have not been widely reported in the news media because, according to the researchers, “these results imply a lower probability of imminent extreme climatic change than previously thought.”