Recently, the journals Nature and Science reported on two experiments revealing that molecules released by trees can seed clouds. These findings run contrary to the assumption that sulphuric acid is required for a certain type of cloud formation. This further suggests that climate predictions may have underestimated the role clouds played in shaping the preindustrial climate. And since sulphuric acid is considered an anthropogenic pollutant, this means man has a smaller role in climate change than previously assumed. Almost left unmentioned in the news reports of these results is the role cosmic rays play in the formation of cloud nuclei, an idea proposed years ago and treated with derision by the mainstream climate change cabal.
The history of human progress is essentially the history of man's mastery of ever more powerful and effective sources of energy. Today, with misguided “greens” attempting to roll back the clock on energy production, it is a good time to teach our children and remind ourselves where we came from. The following is an entertaining history of humans and their energy sources by Viv Forbes—a science graduate, geologist, mineral economist, farmer and Chairman of the Carbon Sense Coalition—with illustrations by cartoonist Steve Hunter.
Once again climate scientists have put forth a scary prediction about melting ice caps, and once again they are portraying output from computer models as a reliable prediction of things to come. The normally staid and reliable journal Nature emblazoned its cover with “Rising tide” in large black print, with the subtext “A 500-year model of Antarctica's contribution to future sea-level rise” in smaller print below. A closer look reveals that this new model is a house of cards, incomplete and built on top of other climate change models that are known to be faulty. As the old saying goes: garbage in, garbage out. Yet a major science journal chose this piece of computerized legerdemain as its lead article. No wonder climate alarmists, and climate science in general, have fallen into such ill repute.
Mixed in with the swirling cloud of misinformation about global warming is also a bit of blather regarding the onset of a new Ice Age. What the promoters of a new big freeze are talking about is a glacial period, since technically we are still in the midst of an ice age—the Pleistocene. A number of scientists have warned that the planet might be headed for a new period of glacial growth based on Earth's recent history. Over the past half million or so years there has been a series of short (15-20 thousand year) warm periods sandwiched between longer (70-100 thousand year) glacial episodes. Make no mistake, global cooling is much worse than global warming, so this really matters to our descendants and the whole human race. Now, a new study in Nature says that we may be off the hook, glacially speaking. Not only that, human activity might be the reason.
To end the year 2015, climate change alarmists tried valiantly to drive home their message that the world was going into a global warming meltdown. Myopically focusing on the unseasonably warm temperatures in the eastern half of the United States was not enough; they decided to hype the occurrence of a fairly common winter weather event with the scary name “bomb cyclone.” The weather media–always up for a natural calamity–jumped on the bomb storm meme, billing the impending event as a “blowtorch” that would melt the North Pole. Such a meteorological feeding frenzy had not been seen for ages. Well, the storm has come and gone: the Arctic is intact, the catastrophe unrealized, and the media weather ghouls have moved on to the next faux disaster. But was the “bomb” real, and if so, what was it?
When most people think of climate change they are really thinking of weather. Specifically, the weather where they live. Weather is caused by Earth's climate engine moving heat about, so the two are definitely linked, but is it possible to capture climate in a single number? For years, those alarmed by the prospect of climate change have bandied about a number for Earth's average global temperature, currently given as about 61°F (16°C). But what does that mean? Perhaps this is why climate alarmists prefer to talk about the change in global temperature above some past average, starting at some arbitrary time – the real meaning of the number itself is, well, a bit vague. But if no one can interpret the meaning of Earth's average temperature, what are we to make of a change in that number? As it turns out, Earth's average temperature is a mostly meaningless number, often used to mislead people and susceptible to manipulation for nefarious purposes.
On October 14, 2015, Dr Patrick Moore delivered the Global Warming Policy Foundation annual lecture in London. An ecologist and environmentalist for more than 45 years, Moore was one of the founding members and a leader of Greenpeace. After 15 years he left the organization because he felt its mission and message had changed. “Over the years the 'peace' in Greenpeace was gradually lost and my organization, along with much of the environmental movement, drifted into a belief that humans are the enemies of the earth.” Since then, Dr. Moore has been a consistent voice for sanity in ecological matters. In his address he asserted that CO2 is not evil; rather, it is the currency of life and the most important building block for all life on Earth. Human emissions of carbon dioxide have helped save plant life on our planet. “We are not the enemy of nature but its salvation,” he proclaimed.
Acknowledging that today's supercomputers lack the computational power to successfully model Earth's climate system, a climate modeler is suggesting that climate models would benefit from running on computers whose calculations are less exact. “In designing the next generation of supercomputers, we must embrace inexactness if that allows a more efficient use of energy and thereby increases the accuracy and reliability of our simulations,” says Tim Palmer, a Royal Society research professor of climate physics and co-director of the Oxford Martin Programme on Modelling and Predicting Climate at the University of Oxford, UK. This is nothing more than grasping for excuses to explain the dismal performance of the current crop of climate model simulations. There is an old saying: a craftsman never blames his tools for a bad result. Evidently climate modelers are not even close to being craftsmen.
We have all heard about the Ice Age, if only in cartoon movies: a time when massive ice sheets covered the planet while mammoths and saber-toothed cats roamed the frozen landscape. What is more, the cycle of interglacial-glacial-interglacial has happened over and over again during the past million or so years. During the last half million years the cycle has repeated every 130,000 years, with the warm period we are now enjoying—the Holocene—just the latest interglacial respite from the icy conditions of the Pleistocene Ice Age. What most people don't know is that many areas on Earth remained unchanged, even during the height of the last glacial period. The Sahara was hot and dry, and the Amazon rainforests, though a bit smaller in area, looked much like they do today.
The term “settled science” gets tossed around in the media a lot these days. Mostly by non-scientists, who know no better, and by some errant scientists, who should. In 2002, the U.S. National Research Council Committee on Abrupt Climate Change published its findings in a book entitled Abrupt Climate Change: Inevitable Surprises. A new report in Science recaps the surprising discoveries made since then, and they are big. So big that ocean circulation models, integral parts of all climate models, do not accurately predict reality. The observed change in AMOC strength was found to lie well outside the range of interannual variability predicted by coupled atmosphere-ocean climate models. Sounds like circulation in the Atlantic Ocean is not so settled.
Climate scientists have constructed models to predict what Earth's climate will look like decades, even hundreds of years, into the future. Unfortunately, many major components of Earth's climate system have not been accurately monitored for very long. This makes such predictions suspect, if not laughable. A case in point is the variation in ocean circulation and temperature. In the Atlantic there is a cycle in sea surface temperature called the Atlantic Multidecadal Oscillation (AMO). The AMO is linked with decadal-scale climate fluctuations such as European summer precipitation, rainfall in India, Atlantic hurricanes and variations in global temperatures. A new study in the journal Nature reports that the AMO is again transitioning to a negative phase, meaning the vaunted “pause” in global warming may be with us for decades. In fact, scientists at the University of Southampton predict that cooling in the Atlantic Ocean could lower global temperatures by half a degree Celsius.
Despite what gets reported in the news media, there are good climate scientists out there, quietly laboring away at that “settled science.” They are actually trying to understand climate instead of making unsubstantiated, bombastic predictions about future global warming. As a result, several recent papers have cast additional doubt on the validity of climate models as they now stand. In one, wide variation in solar radiation at the top of the atmosphere adds to the major errors in basic physics inherent in the so-called “state of the art” climate models. Another paper, on aerosol radiative forcing, casts doubt on the fundamental assertion of climate alarmists regarding the warming power of CO2. Adding to the stink over the settled science claim, the journal Nature carried a news article that says such assertions are “absolutely not true” and pleads for a new crop of physicists to help unravel the persistent mysteries of climate. The outlook for climate science is cloudy indeed.
Recently, a PR offensive has been mounted by the minions of climate alarmism, attempting to rehabilitate the soiled reputation of climate models. Most everyone by now has heard of the 18+ year halt in global temperature increase, dubbed the “pause” by climate change advocates. This hiatus, occurring in the face of ever rising atmospheric CO2 levels, has caused even the most die-hard climate alarmists to doubt the veracity of climate science's digital oracles. The latest phrase being test marketed in the green stooge press is the claim that climate models are just “Basic Physics”, implying that they are in some way scientifically accurate. Nothing could be further from the truth.
Proving the old adage, if your first lie isn't believed lie again and make it a whopper, NASA GISS announced another study proclaiming imminent climate catastrophe. This time it's the US Southwest and the scourge is not just drought, it's Megadrought! The report predicts that decades-long droughts are likely to ravage the US Southwest and Great Plains within the next century. “This drying could be worse than any other in the past 1,000 years, including a 'megadrought' seven centuries ago that helped drive an ancient civilization to collapse,” wails Nature online. But just how did the researchers come to this conclusion and what evidence do they base their predictions on? As it turns out the whole thing is a house of cards.
As the new year begins the forces of climate alarmism find themselves in disarray. The world refuses to warm, despite contorted data manipulation aimed at squeezing out a claimed “hottest year ever” record by NOAA. One hundredth of a degree does not a warming trend create. Northern hemisphere temperatures are depressed for the second year in a row—also not a climate trend in itself but a psychological blow to the warmists trying to sell the idea that temperatures are at a record high. Moreover, there are indications that the climatologists' bane, CO2, is not ravaging tropical forests and is not being produced where the “blame the developed world first” crowd expected. In all, a good start to the new year because it is a bad start for the alarmists.