Mixed in with the swirling cloud of misinformation about global warming is also a bit of blather regarding the onset of a new Ice Age. What the promoters of a new big freeze are talking about is a glacial period, since technically we are still in the midst of an ice age—the Pleistocene. A number of scientists have warned that the planet might be headed for a new period of glacial growth based on Earth's recent history. Over the past half million or so years there has been a series of short (15-20 thousand year) warm periods sandwiched between longer (70-100 thousand year) glacial episodes. Make no mistake, global cooling is much worse than global warming, so this really matters to our descendants and the whole human race. Now, a new study in Nature says that we may be off the hook, glacially speaking. Not only that, human activity might be the reason.
We have all heard about the Ice Age, if only in cartoon movies. A time when massive ice sheets covered the planet while mammoths and saber-toothed cats roamed the frozen landscape. What is more, the cycle of interglacial-glacial-interglacial has happened over and over again during the past million or so years. During the last half million years the cycle has repeated every 130,000 years, with the warm period we are now enjoying—the Holocene—just the latest interglacial respite from the icy conditions of the Pleistocene Ice Age. What most people don't know is that there were many areas on Earth that remained unchanged, even during the height of the last glacial period. The Sahara was hot and dry, and the Amazon rainforests, though a bit smaller in area, looked much like they do today.
A lot has been written about melting ice caps and new mini-ice ages recently. It seems that science can't decide if we are going to drown in rising oceans or starve because summer will be a thing of the past. This leaves the layperson justifiably confused as to whom to believe—the climate change alarmists who back rapid global warming or those who warn of a new glacial period. There is little certainty when it comes to science, but one thing that can be counted on is our ignorance. Quite simply, scientists cannot predict with any certainty what Earth's climate will do next. If someone tries to tell you differently, they are lying.
Climate scientists have constructed models to predict what Earth's climate will look like decades, even hundreds of years in the future. Unfortunately, many major components of Earth's climate system have not been accurately monitored for very long. This makes such predictions suspect if not laughable. A case in point is the variation in ocean circulation and temperature. In the Atlantic there is a cycle of sea surface temperature variation called the Atlantic Multidecadal Oscillation (AMO). The AMO is linked with decadal-scale climate fluctuations like summer precipitation in Europe, rainfall in India, Atlantic hurricanes, and variations in global temperatures. A new study in the journal Nature reports that the AMO is again transitioning to a negative phase, meaning the vaunted “pause” in global warming may be with us for decades. In fact, scientists at the University of Southampton predict that cooling in the Atlantic Ocean could lower global temperatures by half a degree Celsius.
Marine and terrestrial proxy records suggest that there was a peak in global warming between 10,000 and 6,000 years ago, following the end of the last glacial period. Since the Holocene Thermal Maximum, Earth has undergone global cooling. The physical mechanism responsible for this global cooling has remained unknown and doesn't fit with current CO2-based climate models. Those climate models generate a robust global annual mean warming throughout the Holocene, mainly in response to rising CO2 levels and albedo changes due to retreating ice sheets. In other words, the models disagree with reality, and when models disagree with nature the models have a credibility gap. A new paper in the Proceedings of the National Academy of Sciences (PNAS) says this model-data inconsistency indicates a critical reexamination of both proxy data and models is called for.
Forty million years ago, Earth began slipping from a “hothouse” climate to an “icehouse” climate. Currently the planet is in a brief warm interlude known as an interglacial—a period of retreating ice sheets and shrinking glaciers. As the word interglacial suggests, our current comfortable climate is not permanent, but merely a pause between frigid ice age conditions. Though climate alarmists and media talking heads continue to natter on about uncontrollable rising temperatures, a more devastating climate change would be a descent into an ice age so cold and so deep that the entire globe freezes over—it has happened before. A new scientific paper reveals what researchers say is a feedback mechanism that acts as a natural thermostat and keeps Earth from cooling to the point of uninhabitability.
Over 4,000 years ago, the Harappan Civilization of the Indus Valley faded and disappeared. Never heard of the Harappans? Theirs was a Bronze-Age civilization located where Pakistan and northwest India are today. With large, well-planned cities, municipal sewage systems, and writing that has never been deciphered, they had a civilization equal to Egypt, Mesopotamia, and Greece. But the Harappans fell victim to what most view as a modern horror—climate change. Sometime around 2100 BC the monsoon cycle, vital to all of South Asia, faltered. The reliable rains stopped, and one of man's earliest civilizations fell. Now we are told that California—that progressive paradise on the Pacific—is poised on the brink of its own drought-spawned disaster. So desperate have things become that one restaurant chain has threatened to stop serving guacamole and vintners are turning to witchcraft. Can the total collapse of Californian civilization be far behind?
As 2013 draws to a close, two new Nobel laureates have spoken out about the state of science, and their conclusions are not encouraging. One describes how science is damaged by journals like Nature, Cell and Science. These premier journals distort scientific inquiry and create false competition among authors that leads to sensational or controversial papers. Add to this peer pressure and groupthink and the result is crap science like the global warming scam. The second laureate believes that the emphasis on publishing volume has created an academic climate in which no university would employ him today, because he would not be considered “productive” enough. The publish-or-perish philosophy has expanded to the point that researchers are doing fast science, not good science. If scientists were not forced to rapidly publish their results, perhaps the quality of the research would rise.
The current interglacial warm period, the Holocene, started ∼11,500 years ago. At its start, among the dramatic changes in climate was a notable increase in rainfall, triggered by summer insolation values higher than those of today. This caused what is called the African Humid Period in North Africa—a time when the Sahara was dotted with large and small lakes, savannah grasslands, and in some regions, humid tropical forests and shrubs. The African Humid Period ended abruptly ∼5000 ybp (years before present) in many locations, such as western North Africa and northern Kenya. In other places, such as the central Sahara and the southern Arabian Peninsula, change occurred more gradually, taking several millennia. Regardless of the pace of change, those areas are tracts of arid desert today, and the animals and humans who had previously thrived in those formerly verdant regions have either moved or had to adapt to much harsher conditions. This is but one example of nature at its most capricious—the tyranny of climate change.
Climate alarmists are constantly warning that Earth is going to warm up, driven they say by the level of CO2 in the atmosphere. To bolster their claims they point to the Pliocene, a time 4-5 million years ago, when the planet was 4-8°C hotter and CO2 levels were 400ppm or higher. This is the climate we are heading for, the global warming supporters say—but is that really true? Superficially it seems a plausible assertion, but as it turns out there is much more here than CO2 and temperature. It is not just the average temperature but the distribution of temperature at different latitudes, both over land and sea, that controls the climate. It is the temperature gradient that drives storms and affects weather patterns, and it was much different during the Pliocene. Moreover, climate models do not generate a Pliocene-like climate when run with higher CO2 levels, which means climate scientists are missing something important about the way Earth's climate system works.
Science is supposed to be unbiased, seeking to understand the workings of nature untainted by the personal beliefs or prejudices of its practitioners. Nature alone is the arbiter of truth—when science and nature disagree, it is science that is wrong. But science is practiced by human beings, who cannot keep their beliefs, whether engendered by religious, philosophical, or political leanings, from skewing any result that is equivocal or highly complex. Presented here are two examples taken from the pages of Nature, perhaps the world's premier scientific journal. One is a rehash of temperature history in northern latitudes with a new statistical twist, the other a report on a study regarding fracking. One shows how what scientists leave out of their studies may be more important than what they put in. The other shows how a headline can spin the results of a report even when its authors are carefully neutral in their conclusions.
Much has been done to vilify carbon dioxide in the media. Listening to the talking heads and on-air “experts” could lead one to believe that CO2 is an evil scourge that the world would be better off without. Nothing could be further from the truth. CO2 is necessary for life on Earth, forests in particular. It is not just plant food; the maligned gas also plays a role in regulating water use by the world's forests. New research has uncovered an unexpectedly strong decrease in water uptake caused by increasing CO2. Along with global increases in photosynthesis, forest growth rates, and carbon uptake, higher CO2 levels contribute to enhanced timber yields and improved water availability. Who says higher CO2 levels are a bad thing?
Scientists who study climate will tell you that today's warm temperatures and mild conditions are not normal for Earth during the past several million years. Our planet has been in a general cooling trend for 35 million years and in the grip of an Ice Age for the last 1.6 million years. What's more, this Ice Age, known as the Pleistocene, consists of relatively short periods of warmth, called interglacials, separated by much longer periods of bitter cold, referred to as glacials. The recorded history of humankind covers only the latter half of the most recent interglacial warming, though our ancient ancestors did leave messages in the form of cave art that date back to much colder times, times when the Ice Age held the world fast in its frozen embrace. Predicting the timing and duration of these periods remains a problem for scientists. Why the glacial periods should last around 100,000 years, as they have for the last million years or so, is called the 100-kyr problem. Now, a group of researchers claim they know the answer.
The threat of widespread and persistent drought, ruining crops and threatening water supplies, is constantly cited as an outcome of global warming. Media talking heads, climate scientists (who should know better) and even the American President have all made this assertion—and there is nothing to back up the claim. Results presented recently at the annual assembly of the European Geosciences Union in Vienna show that forecasting drought is still beyond the reach of current climate models. Models run against historical data have either predicted periods of drought at the wrong times or missed them altogether. Yet climate alarmists continue to spread this pernicious lie, preaching damnation with the certitude of an Old Testament prophet.
For some reason a lot of people have become fixated on Antarctic ice—is it waxing or waning, accumulating or melting? Climate alarmists have striven mightily to show that ice at the poles is on the decline, melting in the face of rising global temperatures. Antarctica, with the largest store of glacial ice on the planet, is the primary focus of attention. If Antarctica's ice sheets were to melt, it would be a calamity for mankind. Unfortunately for such claims, Earth's climate system contains many cyclic trends, operating on decadal and longer periods of time. In the past, what some claim are clear trends have turned out to be only short term in nature. A new report, just published online, concludes that it is unclear whether changes in atmospheric circulation over West Antarctica during the past few decades are part of a longer-term trend. In fact, ice cores reveal a significant increase in the oxygen isotopes from precipitation over the past 50 years, but the anomaly cannot be distinguished from natural climate variability.