Over 4,000 years ago, the Harappan Civilization of the Indus Valley faded and disappeared. Never heard of the Harappans? Theirs was a Bronze Age civilization located where Pakistan and northwest India are today. With large, well-planned cities, municipal sewage systems, and writing that has never been deciphered, they had a civilization equal to those of Egypt, Mesopotamia, and Greece. But the Harappans fell victim to what most view as a modern horror—climate change. Sometime around 2100 BC the monsoon cycle, vital to all of South Asia, faltered. The reliable rains stopped, and man's earliest civilizations fell. Now we are told that California—that progressive paradise on the Pacific—is poised on the brink of its own drought-spawned disaster. So desperate have things become that one restaurant chain has threatened to stop serving guacamole and vintners are turning to witchcraft. Can the total collapse of Californian civilization be far behind?
One of the scary scenarios frequently trotted out by climate change alarmists is the possible shutdown of the ocean currents in the Atlantic Ocean. This would disrupt Northern Hemisphere climate, particularly in Europe. Indeed, one Hollywood disaster movie had frozen military helicopters falling from the skies in the UK and Manhattan buried under a tsunami of ice. We are told this could happen at any time, if the world gets too hot from all that CO2 our species is churning out. Now a shocking new paper in the journal Science implies that the standard view of a relatively stable interglacial circulation may not hold for conditions warmer and fresher than at present. Why? Because it happened before, over 100,000 years ago, without the help of man-made global warming. Another catastrophic climate threat is shown to be totally natural and to have happened before our species began burning coal and driving SUVs.
The current interglacial warm period, the Holocene, started ∼11,500 years ago. At its start, among the dramatic changes in climate was a notable increase in rainfall, triggered by summer insolation values higher than those of today. This caused what is called the African Humid Period in North Africa—a time when the Sahara was dotted with large and small lakes, savannah grasslands, and in some regions, humid tropical forests and shrubs. The African Humid Period ended abruptly ∼5000 ybp (years before present) in many locations, such as western North Africa and northern Kenya. In other places, such as the central Sahara and the southern Arabian Peninsula, change occurred more gradually, taking several millennia. Regardless of the pace of change, those areas are tracts of arid desert today, and the animals and humans who had previously thrived in those formerly verdant regions have either moved or had to adapt to much harsher conditions. This is but one example of nature at its most capricious—the tyranny of climate change.
Scientists who study climate will tell you that today's warm temperatures and mild conditions are not normal for Earth during the past several million years. Our planet has been in a general cooling trend for 35 million years and in the grip of an Ice Age for the last 1.6 million years. What's more, this Ice Age, known as the Pleistocene, consists of relatively short periods of warmth, called interglacials, separated by much longer periods of bitter cold, referred to as glacials. The recorded history of humankind covers only the latter half of the most recent interglacial warming, though our ancient ancestors did leave messages in the form of cave art dating back to much colder times, times when the Ice Age held the world fast in its frozen embrace. Predicting the timing and duration of these periods remains a problem for scientists. Why the glacial periods should last around 100,000 years, as they have for the last million years or so, is called the 100-kyr problem. Now, a group of researchers claim they know the answer.
Most people have never heard of the Anthropocene era, and with good reason—it is not an officially recognized geologic time period. It is the invention of a small group of scientific busybodies who evidently have nothing better to do than try to effect a change in the official timeline of Earth's past. The International Commission on Stratigraphy, the body charged with formally designating geological time periods, has been petitioned in the past, and just recently a group of chuckle-heads attending the Society for American Archaeology meetings in Hawaii brought the idea up again. The only problem is, the proponents of the Anthropocene have fallen to arguing amongst themselves—when did the “Age of Man” really start?
Since it was recently Earth Day, a yearly day of celebration and protest by conservationists and assorted greens, it is instructive to take a look at a number of recent studies taken from the scientific literature. The dire predictions made by climate change alarmists are many, far too numerous to all be addressed here, so this article will examine three areas of concern: increased drought, destruction of the world's rainforests, and the die-off of ocean coral reefs. Each of these reported calamities has been linked to increasing anthropogenic greenhouse gas emissions and that supposed bane of nature, anthropogenic global warming (AGW). These threats have been repeated ad nauseam by talking heads and climate change activists, but the truth is that these predicted outcomes are not as threatening as the activists would have you believe.
It has been vilified in the press and maligned in school classrooms by the ignorant. Opposing it has become a cause célèbre in Hollywood and a litmus test for liberal politicians. Labeled man's chosen weapon for ravaging nature and laying waste to the environment, this poor, largely misunderstood gas is in fact essential to our existence. Without carbon dioxide, CO2, we would not be here and Earth would be a frozen, lifeless chunk of rock. Forget that life would never have developed on a planet without greenhouse warming—a recent scientific study says the demon gas has rescued the planet from the deep freeze at least twice during the Neoproterozoic era, roughly 750 to 635 million years ago.
Much was made in the media about a new report that claims modern-day temperatures are the highest in 5,000 years. Moreover, the investigators assert that this century's temperature rise is “unprecedented,” echoing the assertions of climate change alarmists over the past 30 years. Various news outlets seized upon this report as final proof that the world is headed for a hot, steamy demise because of human greenhouse gas (GHG) emissions. There are, however, a number of problems with that assertion. First among them is the methodology used to generate the global temperature history and the comparison of proxy data with instrument data from recent times. This may be science, but it is being used to deceive the public into believing that anthropogenic global warming (AGW) is a crisis on an unprecedented scale.
A newly released study from the Research Council of Norway has climate change alarmists abuzz. One of the things the alarmists have been pushing for is to halt warming at a 2°C increase at any cost (and they mean that literally). In the Norwegian study, much to the alarmists' dismay, researchers have arrived at an estimate of 1.9°C as the most likely level of future warming. The report also recognizes that temperatures have stabilized at 2000 levels for the past decade even though CO2 levels have continued to rise. Meanwhile, a reconstruction of the Eemian interglacial from the new NEEM ice core, published in the journal Nature, shows that in spite of a climate 8°C warmer than that of the past millennium, the ice in Northern Greenland was only a few hundred meters lower than its present level. This finding casts doubt on the projected melting of ice sheets and resulting sea-level rise.
Instrument data from the last 160 years indicate a general warming trend during that span of time. However, when this period is examined in the light of palaeoclimate reconstructions, the recent warming appears to be part of a more systematic pattern of fluctuations. Specifically, it is an expected warming period following the 200-year “Little Ice Age” cold period. Moreover, a new study of the natural variability of past climate, as seen from available proxy information, finds a synthesis between the Milankovitch cycles and Hurst–Kolmogorov (HK) stochastic dynamics—a result showing that multi-scale climate fluctuations cannot be described adequately by classical statistics.
Once again the fear-mongering hordes of lay-climatologists are denouncing the importance of the Medieval Warm Period (MWP), or Medieval Climate Optimum as some refer to it. On the strength of a single new study, involving algal lipids from a lake in Svalbard, Norway, one pundit referred to the “so-called” Medieval Warm Period and impugned the significance of the Little Ice Age for good measure. Of course the writer, an “ecological journalist,” saw nothing wrong with using such a dismissive and pejorative term, since it is IPCC-promoted doctrine that things today are warmer than they have ever been since the onset of the Holocene. Yet literally hundreds of other studies have shown that the MWP was as warm as or warmer than today, and that is the real consensus among paleoclimatologists.
Heralded far and wide as a harbinger of global climate change, this year's record Arctic ice melt has the uninformed climate alarmists celebrating and the more knowledgeable scratching their heads. You see, this summer's ice retreat was predicted by no computer model, and few scientists even thought it possible. While climate scientists ponder what is wrong with their theories, nature has carried on—no fuss, no muss, no drama. Circulation patterns are shifting, and living creatures from zooplankton to megafauna are taking the change in stride. What has flummoxed environmental scientists is the simple and now demonstrated fact that successful life forms have a common trait—they are adaptable, something many scientists are not.
Over the past 50 years or so, the Antarctic Peninsula, the northernmost part of the mainland of Antarctica, has experienced rapid warming and the collapse of a number of ice shelves. A new temperature record derived from an ice core drilled on James Ross Island has prompted a reassessment of what triggered the recent warming trends. This new core provides the best record of climate events on the peninsula going back at least 20,000 years, and may extend back as far as 50,000 years. From this new data, a team of researchers has constructed the most detailed history of climate on the Antarctic Peninsula known to science, and it has revealed a number of interesting things. Most important of these is the fact that this area undergoes bouts of rapid warming periodically and that things were at least as warm on the peninsula 2,000 years ago. So much for “unprecedented” warming on the Antarctic Peninsula.
There has been a wave of triumphal announcements by climate change proponents recently, almost giddy over the summer shrinkage of the Arctic ice sheet. “Lowest level ever!” they proclaim, though that is not quite true. Nonetheless, the Arctic pack ice has been receding over the last decade or so, but that is only natural. You see, there is a well-known, if poorly understood, linkage between the ice at the North Pole and the ice in and around Antarctica—and the ice around Antarctica is doing quite well. Satellite radar altimetry measurements indicate that the East Antarctic ice sheet interior increased in mass by 45±7 billion metric tons per year from 1992 to 2003. This trend continues today, reinforcing recent scientific investigations into this millennial-scale oscillation between the poles. According to studies, this is how things have been for hundreds of thousands of years.
A study of ancient volcanic ash found at key archaeological sites across Europe suggests that early modern humans were more resilient to climate change and natural disasters than commonly thought. The study, which appeared in PNAS, analyzed volcanic ash from a major eruption that occurred in Europe around 40,000 years ago. The volcano spewed so much ash that the event probably created winter-like conditions and a sudden colder shift in climate. Scientists have generally suggested that the decline of our cousins the Neanderthals, and their replacement by spreading modern humans, was primarily due to ancient volcanic eruptions and deteriorating climate conditions, but this study shows that Stone Age man rolled with the punches and shrugged off the sudden shifts in climate. This new evidence flies in the face of modern predictions that a shift of a few degrees in average yearly temperature will decimate human populations worldwide.