Most people have never heard of the Anthropocene era, and with good reason—it is not an officially recognized geologic time period. It is the invention of a small group of scientific busybodies who evidently have nothing better to do than try to effect a change in the official timeline of Earth's past. The International Commission on Stratigraphy, the body charged with formally designating geological time periods, has been petitioned in the past, and just recently a group of chuckleheads attending the Society for American Archaeology meetings in Hawaii brought the idea up again. The only problem is, the proponents of the Anthropocene have fallen to arguing amongst themselves—when did the “Age of Man” really start?
The threat of widespread and persistent drought, ruining crops and threatening water supplies, is constantly cited as an outcome of global warming. Media talking heads, climate scientists (who should know better) and even the American President have all made this assertion—and there is nothing to back up the claim. Results presented recently at the annual assembly of the European Geosciences Union in Vienna show that forecasting drought is still beyond the reach of current climate models. Models run against historical data have either predicted periods of drought at the wrong times or missed them altogether. Yet climate alarmists continue to spread this pernicious lie, preaching damnation with the certitude of an Old Testament prophet.
It has been vilified in the press and maligned in school classrooms by the ignorant. Opposing it has become a cause célèbre in Hollywood and a litmus test for liberal politicians. Labeled man's chosen weapon for ravaging nature and laying waste to the environment, this poor, largely misunderstood gas is in fact essential to our existence. Without carbon dioxide, CO2, we would not be here and Earth would be a frozen, lifeless chunk of rock. Never mind that life would never have developed on a planet without greenhouse warming; a recent scientific study says the demon gas has rescued the planet from the deep freeze at least twice during the Neoproterozoic era, roughly 750 to 635 million years ago.
Once again the fear-mongering hordes of lay-climatologists are denouncing the importance of the Medieval Warm Period (MWP), or Medieval Climate Optimum as some refer to it. On the strength of a single new study, involving algal lipids from a lake in Svalbard, Norway, one pundit referred to the “so-called” Medieval Warm Period and impugned the significance of the Little Ice Age for good measure. Of course the writer, an “ecological journalist,” saw nothing wrong with using such a dismissive and pejorative term, since it is IPCC-promoted doctrine that things today are warmer than they have ever been since the onset of the Holocene. Yet literally hundreds of other studies have shown that the MWP was as warm or warmer than today, and that is the real consensus among paleoclimatologists.
Over the past 50 years or so, the Antarctic Peninsula, the northernmost part of the mainland of Antarctica, has experienced rapid warming and the collapse of a number of ice shelves. A new temperature record derived from an ice core drilled on James Ross Island has prompted a reassessment of what triggered the recent warming trends. This new core provides the best record of climate events on the peninsula going back at least 20,000 years, and may extend back as far as 50,000 years. From these new data a team of researchers has constructed the most detailed history of climate on the Antarctic Peninsula known to science, and it has revealed a number of interesting things. Most important of these is the fact that this area undergoes bouts of rapid warming periodically, and that things on the peninsula were at least as warm 2,000 years ago as they are today. So much for “unprecedented” warming on the Antarctic Peninsula.
After decades of debunking and statements by responsible scientists that climate is not weather and individual anomalies are not an indication of climate change, the government-funded IPCC lackeys at the UK's Met Office and America's National Oceanic and Atmospheric Administration have publicly attributed recent bad weather events to man-made climate change. These irresponsible boffins' shrill claims illustrate the desperation in the anthropogenic global warming (AGW) camp in the face of declining public concern over climate change. While admitting that it is impossible to blame a single event on global warming, climate alarmists are claiming attribution is possible as long as it is framed in terms of probability. They have gone from lies, to damned lies and now, finally, to statistics.
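For readers curious what this statistical framing actually looks like, the device usually invoked is the “fraction of attributable risk” (FAR), which compares an event's probability in modeled climates with and without human influence. Here is a minimal sketch; the function name and the event probabilities are hypothetical, chosen purely for illustration, not taken from any particular attribution study.

```python
def fraction_of_attributable_risk(p_natural, p_actual):
    """FAR = 1 - P0/P1, where P0 is the modeled probability of an
    extreme event in a climate without human influence and P1 is the
    probability in the actual climate. The result is the fraction of
    the event's risk claimed to be attributable to human influence."""
    if p_actual <= 0:
        raise ValueError("p_actual must be positive")
    return 1.0 - p_natural / p_actual

# Hypothetical numbers: a heat wave with a 1-in-20 annual chance in
# model runs without human forcing, and 1-in-10 with it.
far = fraction_of_attributable_risk(0.05, 0.10)
print(f"FAR = {far:.2f}")
```

With these made-up inputs the statistic comes out to 0.50, which an attribution study would report as half the event's risk being due to man. Note that the whole exercise rests on the model-derived probabilities, which is precisely where skeptics direct their doubt.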
Carbon offsets, favored by PC Hollywood stars and green politicians alike, have been used by the rich and powerful as a way to make their energy-wasting lifestyles appear to be OK from an ecological point of view. Al Gore and Leonardo DiCaprio have both used carbon offsets to greenwash their jet-setting travel on private aircraft. Airlines even offer the unwashed masses an opportunity to purchase carbon offsets before boarding a flight—you may not be able to fly like the stars, but you can lie like the stars. Finally, a voice from the green ranks has spoken up to denounce carbon offsets for what they are: “carbon offsetting is without scientific legitimacy and is dangerously misleading.”
The Nobel Prize in Chemistry for 2011 has been awarded to Daniel Shechtman of the Technion, the Israel Institute of Technology in Haifa, for the discovery of quasicrystals. Quasicrystals are non-repeating regular patterns of atoms that were once thought to be impossible. Most people would agree that this is the type of unexpected discovery that deserves a Nobel Prize, but this is not just the story of an amazing scientific revelation. After first discovering evidence for quasicrystals in 1982, using an electron microscope in a US government lab, Shechtman was expelled from the lab and subjected to years of ridicule by other scientists. You see, quasicrystals were thought impossible by crystallographers at the time and even though Shechtman had evidence backing his claim, he was ostracized by mainstream science.
Nostradamus, the famous prognosticator, was said to have received his visions of the future by scrying—staring into a pool of ink for hours on end. The climate science equivalent of a gazing pool is the computer climate model, a huge collection of complicated computer code that supposedly cranks out frightening visions of the future. Most of the discussion about climate change has centered on the global aspects of anthropogenic global warming (AGW), but now a number of climate mystics are turning their modeling dark arts to making regional predictions. Why? The better to frighten the public and elicit funding from government coffers. Unfortunately, as climate researchers struggle to sharpen their fuzzy picture of what the future holds, one fact has been ignored—there is even more uncertainty in regional models than in the global ones.
Tracking the flow of ice in the Arctic is difficult. Reconstructing the extent and flow in times past is even more difficult. An interesting new report has turned to driftwood, embedded in the Arctic pack ice, as a way of deciphering Arctic climate conditions over the last 10,000 years. The researchers found a climate record that is in good agreement with previous histories, including such events as the Medieval Warm Period, the Little Ice Age and the Holocene Thermal Maximum. In fact, they found temperatures during the HTM to be 2°C to 4°C higher than today. They also found a complementary oscillation in sea-ice abundance between East and West that is not correctly simulated by current ice models.
Precaution is now an established tenet of environmental governance, law, and public policy at the international, national and local levels. When it comes to pollution, toxic chemicals, genetically modified organisms, endangered species and climate change, the so-called precautionary principle has become the guiding doctrine for timorous souls everywhere. But more than that, it is a codification of the idea that before anything new is allowed, it must be proven, beyond a shadow of a doubt, to cause no harm to anything in any way, under any conditions, anywhere—period. It is “look before you leap” on steroids and a major legal weapon used by environmentalists and neo-Luddites everywhere to hamstring human progress. Raising angst to an art form, progress-hating activists have managed to block needed energy and industrial expansion at a critical time in humanity's development.
For decades, climate change alarmists have generated a host of doomsday scenarios, all based on the theory of anthropogenic global warming: human CO2 emissions will force Earth's climate to warm uncontrollably, causing all manner of unpleasantness. A new study, published by the Center for the Study of Carbon Dioxide and Global Change, addresses the major predicted effects of global warming head on. Making extensive use of peer-reviewed research papers, the authors demolish the dire predictions of climate alarmists point by point. In fact, they conclude that rising atmospheric CO2 concentrations associated with the development of the Industrial Revolution have actually been good for the planet.
Carbon monoxide, CO, is a trace gas that is important in atmospheric chemistry. It indirectly influences climate and has significant effects on methane and ozone levels. CO is a byproduct of combustion—particularly the incomplete burning of fossil fuels and biomass—and conventional wisdom says that humans, with their tendency to set things on fire, should be responsible for releasing much of the gas into the atmosphere. Little is known about the abundance and sources of CO prior to the industrial age, or about the impact anthropogenic activities have had. A new study in the journal Science presents a 650-year-long record of CO atmospheric concentration using samples from Antarctic ice cores. Reconstructing past CO variability and its causes has turned up a shocking fact: CO levels are at a 2,000-year low. Apparently, humans actually prevent wildfires, reducing the release of carbon monoxide and, consequently, CO2.
For years climate alarmists have terrorized the public with frightening tales of impending disaster, a coming climate apocalypse. Because of global warming, fertile croplands will become arid and barren while desert areas will experience torrential rain and uncontrollable flooding. Tropical rainforests will wither in the heat and polar ice will melt. Coastal areas and islands will disappear beneath the ocean and the world's great cities will huddle behind massive seawalls to avoid the flood. Nature's fury will drive millions of refugees to migrate to less blighted lands, followed by plague, pestilence and war. Because of human hubris our civilization will collapse, as has happened so many times in the past. Or maybe not. A quiet revolution among anthropologists and archaeologists has overturned the scientific dogma surrounding failed ancient civilizations, with some lessons for those who currently preach climate catastrophe.
Whenever a skeptic points out a new paper or journal article refuting some claim made by the theory of anthropogenic global warming, climate change alarmists often shout “cherry picking!” Evidently, most climate change true believers do not understand how science works or how theories are tested. Scientific theories must make predictions by which they can be tested. Providing evidence that AGW has failed in its predictions is not cherry picking, it is refutation. Unfortunately, when confronted with failed predictions the standard alarmist answer is to disavow the predictions. They will say that those are not predictions at all, they are merely projections—which, if true, means AGW is not a scientific theory at all.