A lot has been written recently about melting ice caps and new mini-ice ages. It seems that science cannot decide whether we are going to drown in rising oceans or starve because summer will be a thing of the past. This leaves the layperson justifiably confused as to whom to believe—the climate change alarmists who back rapid global warming, or those who warn of a new glacial period. There is little certainty when it comes to science, but one thing that can be counted on is our ignorance. Quite simply, scientists cannot predict with any certainty what Earth's climate will do next. If someone tries to tell you differently, they are lying.
Climate scientists have constructed models to predict what Earth's climate will look like decades, even hundreds of years in the future. Unfortunately, many major components of Earth's climate system have not been accurately monitored for very long, which makes such predictions suspect, if not laughable. A case in point is the variation in ocean circulation and temperature. In the Atlantic there is a cycle of sea surface temperature variation called the Atlantic Multidecadal Oscillation (AMO). The AMO is linked with decadal-scale climate fluctuations such as European summer precipitation, rainfall in India, Atlantic hurricanes, and variations in global temperatures. A new study in the journal Nature reports that the AMO is again transitioning to a negative phase, meaning the vaunted “pause” in global warming may be with us for decades. In fact, scientists at the University of Southampton predict that cooling in the Atlantic Ocean could lower global temperatures by half a degree Celsius.
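In practice an AMO index is derived from North Atlantic sea surface temperatures by removing the long-term trend and smoothing away year-to-year noise. The sketch below is a simplified, hypothetical version of that procedure—the function name and parameters are illustrative only, and real indices are built from area-weighted gridded SST datasets rather than a plain list of annual means:

```python
def amo_index(sst_annual_means):
    """Compute a simplified AMO-style index from annual-mean North
    Atlantic SSTs: linearly detrend, then apply a running mean.

    This is a toy illustration; operational AMO indices use
    area-weighted gridded SST products, not a bare list of values.
    """
    n = len(sst_annual_means)
    xs = list(range(n))
    mx = sum(xs) / n
    my = sum(sst_annual_means) / n
    # Least-squares linear detrend removes the century-scale trend
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, sst_annual_means))
             / sum((x - mx) ** 2 for x in xs))
    resid = [y - (my + slope * (x - mx)) for x, y in zip(xs, sst_annual_means)]
    # 11-year centered running mean isolates the multidecadal signal
    half = 5
    smooth = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        smooth.append(sum(resid[lo:hi]) / (hi - lo))
    return smooth
```

A pure warming trend fed into this function yields a flat index near zero; only multidecadal swings about the trend survive the detrend-and-smooth steps, which is what makes the index a measure of the oscillation rather than of any overall trend.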
Marine and terrestrial proxy records suggest that there was a peak in global warming between 10,000 and 6,000 years ago, following the end of the last glacial period. Since that Holocene Thermal Maximum, Earth has undergone global cooling. The physical mechanism responsible for this cooling remains unknown and does not fit with current CO2-based climate models. Those models generate robust global annual mean warming throughout the Holocene, mainly in response to rising CO2 levels and albedo changes from retreating ice sheets. In other words, the models disagree with reality, and when models disagree with nature the models have a credibility gap. A new paper in the Proceedings of the National Academy of Sciences (PNAS) says this model-data inconsistency calls for a critical reexamination of both the proxy data and the models.
One of the greatest failures of climate science has been the dismal performance of general circulation models (GCMs) in predicting Earth's future climate. For more than three decades huge predictive models, run on the biggest supercomputers available, have labored mightily and turned out garbage. Their most obvious failure was missing the now almost eighteen-year “hiatus,” the pause in temperature rise that has confounded climate alarmists and serious scientists alike. So poor has been the models' performance that some climate scientists are calling for them to be torn down and built anew, this time on different principles. They want to adopt stochastic methods—so-called Monte Carlo simulations based on probabilities and randomness—in place of today's physics-based models.
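To see what “stochastic” means in this context, consider a toy ensemble—in no way actual GCM code, and with every number below an invented illustration: instead of one deterministic trajectory, many randomized runs produce a probability distribution of outcomes, and the spread of that distribution is itself part of the forecast.

```python
import random
import statistics

def stochastic_ensemble(n_runs=1000, n_years=50, trend=0.02,
                        noise=0.1, seed=42):
    """Run an ensemble of random-walk temperature anomaly trajectories.

    Each run adds a small deterministic trend plus Gaussian noise every
    year; the ensemble mean and spread, not any single run, form the
    forecast. All parameter values are illustrative, not from any model.
    """
    rng = random.Random(seed)
    finals = []
    for _ in range(n_runs):
        anomaly = 0.0
        for _ in range(n_years):
            anomaly += trend + rng.gauss(0.0, noise)
        finals.append(anomaly)
    return statistics.mean(finals), statistics.stdev(finals)

mean, spread = stochastic_ensemble()
print(f"50-year anomaly: {mean:.2f} +/- {spread:.2f} degrees C")
```

The point of the approach is visible even in this sketch: the noise does not average away to a single answer, it accumulates into an irreducible band of uncertainty that a deterministic model simply cannot report.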
Forty million years ago, Earth began slipping from a “hothouse” climate to an “icehouse” climate. Currently the planet is in a brief warm interlude known as an interglacial—a period of retreating ice sheets and shrinking glaciers. As the word interglacial suggests, our current comfortable climate is not permanent, but merely a pause between frigid ice age conditions. Though climate alarmists and media talking heads continue to natter on about uncontrollably rising temperatures, a more devastating climate change would be a descent into an ice age so cold and so deep that the entire globe freezes over—it has happened before. A new scientific paper reveals what researchers say is a feedback mechanism that acts as a natural thermostat, keeping Earth from cooling to the point of uninhabitability.
Over 4,000 years ago, the Harappan Civilization of the Indus Valley faded and disappeared. Never heard of the Harappans? Theirs was a Bronze-Age civilization located where Pakistan and northwest India are today. With large, well-planned cities, municipal sewage systems, and writing that has never been deciphered, they had a civilization equal to Egypt, Mesopotamia, and Greece. But the Harappans fell victim to what most view as a modern horror—climate change. Sometime around 2100 BC the monsoon cycle, vital to all of South Asia, faltered. The reliable rains stopped, and some of man's earliest civilizations fell. Now we are told that California—that progressive paradise on the Pacific—is poised on the brink of its own drought-spawned disaster. So desperate have things become that one restaurant chain has threatened to stop serving guacamole and vintners are turning to witchcraft. Can the total collapse of Californian civilization be far behind?
One of the scary scenarios frequently trotted out by climate change alarmists is the possible shutdown of the ocean currents in the Atlantic Ocean. This would disrupt northern hemisphere climate, particularly in Europe. Indeed, one Hollywood disaster movie had frozen military helicopters falling from the skies in the UK and Manhattan buried under a tsunami of ice. We are told this could happen at any time if the world gets too hot from all that CO2 our species is churning out. Now a shocking new paper in the journal Science implies that the standard view of a relatively stable interglacial circulation may not hold for conditions warmer and fresher than at present. Why? Because it happened before, over 100,000 years ago, without the help of man-made global warming. Another catastrophic climate threat is shown to be entirely natural, having happened before our species began burning coal and driving SUVs.
The current interglacial warm period, the Holocene, started ∼11,500 years ago. At its start, among the dramatic changes in climate was a notable increase in rainfall, triggered by summer insolation values higher than those of today. This caused what is called the African Humid Period in North Africa—a time when the Sahara was dotted with large and small lakes, savannah grasslands, and in some regions, humid tropical forests and shrubs. The African Humid Period ended abruptly ∼5,000 ybp (years before present) in many locations, such as western North Africa and northern Kenya. In other places, such as the central Sahara and the southern Arabian Peninsula, change occurred more gradually, taking several millennia. Regardless of the pace of change, those areas are tracts of arid desert today, and the animals and humans who had previously thrived in those formerly verdant regions either moved or adapted to much harsher conditions. This is but one example of nature at its most capricious—the tyranny of climate change.
Scientists who study climate will tell you that today's warm temperatures and mild conditions are not normal for Earth during the past several million years. Our planet has been in a general cooling trend for 35 million years and in the grip of an Ice Age for the last 1.6 million years. What's more, this Ice Age, known as the Pleistocene, consists of relatively short periods of warmth, called interglacials, separated by much longer periods of bitter cold, referred to as glacials. The recorded history of humankind covers only the latter half of the most recent interglacial warming, though our ancient ancestors did leave messages in the form of cave art that date back to much colder times, times when the Ice Age held the world fast in its frozen embrace. Predicting the timing and duration of these periods remains a problem for scientists. Why the glacial periods should last around 100,000 years, as they have for the last million years or so, is called the 100-kyr problem. Now, a group of researchers claim they know the answer.
Most people have never heard of the Anthropocene era, and with good reason—it is not an officially recognized geologic time period. It is the invention of a small group of scientific busybodies who evidently have nothing better to do than try to effect a change in the official timeline of Earth's past. The International Commission on Stratigraphy, the body charged with formally designating geological time periods, has been petitioned in the past, and just recently a group of chuckle-heads attending the Society for American Archaeology meetings in Hawaii brought the idea up again. The only problem is, the proponents of the Anthropocene have fallen to arguing amongst themselves—when did the “Age of Man” really start?
Since it was recently Earth Day, a yearly day of celebration and protest by conservationists and assorted greens, it is instructive to take a look at a number of recent studies from the scientific literature. The dire predictions made by climate change alarmists are many, far too numerous to all be addressed here, so this article will examine three areas of concern: increased drought, destruction of the world's rainforests, and the die-off of ocean coral reefs. Each of these reported calamities has been linked to increasing anthropogenic greenhouse gas emissions and that supposed bane of nature, anthropogenic global warming (AGW). These threats have been repeated ad nauseam by talking heads and climate change activists, but the truth is that these predicted outcomes are not as threatening as the activists would have you believe.
It has been vilified in the press and maligned in school classrooms by the ignorant. Opposing it has become a cause célèbre in Hollywood and a litmus test for liberal politicians. Labeled man's chosen weapon for ravaging nature and laying waste to the environment, this poor, largely misunderstood gas is in fact essential to our existence. Without carbon dioxide, CO2, we would not be here, and Earth would be a frozen, lifeless chunk of rock. Not only would life never have developed on a planet without greenhouse warming, a recent scientific study says the demon gas has rescued the planet from the deep freeze at least twice during the Neoproterozoic era, roughly 750 to 635 million years ago.
Much was made in the media of a new report that claims modern-day temperatures are the highest in 5,000 years. Moreover, the investigators assert that this century's temperature rise is “unprecedented,” echoing the assertions of climate change alarmists over the past 30 years. Various news outlets seized upon this report as final proof that the world is headed for a hot, steamy demise because of human greenhouse gas (GHG) emissions. There are, however, a number of problems with that assertion. First among them is the methodology used to generate the global temperature history and the comparison of proxy data with instrument data from recent times. This may be science, but it is being used to deceive the public into believing that anthropogenic global warming (AGW) is a crisis on an unprecedented scale.
A newly released study from the Research Council of Norway has climate change alarmists abuzz. One of the things the alarmists have been pushing for is to halt warming at a 2°C increase at any cost (and they mean that literally). In the Norwegian study, much to the alarmists' dismay, researchers have arrived at an estimate of 1.9°C as the most likely level of future warming. The report also recognizes that temperatures have stabilized at 2000 levels for the past decade even though CO2 levels have continued to rise. Meanwhile, a reconstruction of the Eemian interglacial from the new NEEM ice core, published in the journal Nature, shows that in spite of a climate 8°C warmer than that of the past millennium, the ice in Northern Greenland was only a few hundred meters lower than its present level. This finding casts doubt on the projected melting of ice sheets and resulting sea-level rise.
Instrument data from the last 160 years indicate a general warming trend over that span of time. However, when this period is examined in the light of palaeoclimate reconstructions, the recent warming appears to be part of more systematic fluctuations. Specifically, it is an expected warming period following the 200-year “Little Ice Age” cold period. Moreover, a new study of the natural variability of past climate, as seen from available proxy information, finds a synthesis between the Milankovitch cycles and Hurst–Kolmogorov (HK) stochastic dynamics—a result showing that multi-scale climate fluctuations cannot be described adequately by classical statistics.
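The persistence that HK dynamics describe can be illustrated with the classical rescaled-range (R/S) estimate of the Hurst exponent H, where H ≈ 0.5 marks uncorrelated noise and H > 0.5 marks long-range persistence. This is a toy sketch for illustration, not code from the study, and the window sizes are arbitrary choices:

```python
import math
import random

def rescaled_range(series):
    """Compute the rescaled range R/S of one segment of a series."""
    n = len(series)
    mean = sum(series) / n
    devs = [x - mean for x in series]
    # Running sum of deviations from the segment mean
    cum, cums = 0.0, []
    for d in devs:
        cum += d
        cums.append(cum)
    r = max(cums) - min(cums)                       # range of the running sum
    s = math.sqrt(sum(d * d for d in devs) / n)     # segment std deviation
    return r / s if s > 0 else 0.0

def hurst_exponent(series, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent H as the slope of log(R/S) vs log(n)."""
    xs, ys = [], []
    for w in window_sizes:
        rs_vals = [rescaled_range(series[i:i + w])
                   for i in range(0, len(series) - w + 1, w)]
        xs.append(math.log(w))
        ys.append(math.log(sum(rs_vals) / len(rs_vals)))
    # Least-squares slope of log(R/S) against log(window size)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

rng = random.Random(0)
white_noise = [rng.gauss(0, 1) for _ in range(1024)]
print(f"Estimated H for white noise: {hurst_exponent(white_noise):.2f}")
```

For uncorrelated noise the estimate comes out near one half (the small-sample R/S method is known to bias it somewhat upward), while a persistent climate record would push H noticeably higher—which is exactly the regime where classical statistics understate the natural variability.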