More Fictitious Hurricane Predictions
According to the AP, top researchers now agree that global warming will likely bring fewer but stronger hurricanes in the future, seeming to settle a scientific debate on the subject. But they say there is not yet enough evidence to tell whether that effect has already begun. Despite scientists' own warnings that identifying an actual trend in storm variability is impossible due to a lack of reliable historical data, a new report in Nature Geoscience is being cited as a solid prediction of future trends in tropical cyclone activity. Also left unmentioned is that this research is based on models of questionable accuracy.
The review article by Thomas R. Knutson et al., entitled “Tropical cyclones and climate change,” was published online on Sunday, February 21, 2010. In it, the authors warn that there is precious little that can be predicted from past data. But this does not stop them from blithely predicting the future based on new “high-resolution” models. Here is part of the paper's abstract:
Large amplitude fluctuations in the frequency and intensity of tropical cyclones greatly complicate both the detection of long-term trends and their attribution to rising levels of atmospheric greenhouse gases. Trend detection is further impeded by substantial limitations in the availability and quality of global historical records of tropical cyclones. Therefore, it remains uncertain whether past changes in tropical cyclone activity have exceeded the variability expected from natural causes. However, future projections based on theory and high-resolution dynamical models consistently indicate that greenhouse warming will cause the globally averaged intensity of tropical cyclones to shift towards stronger storms, with intensity increases of 2–11% by 2100.
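To put the quoted 2–11% figure in concrete terms, here is a trivial sketch applying that range to a hypothetical storm. The 150 mph baseline wind speed is an invented example value, not a figure from the paper:

```python
# Trivial illustration of the abstract's projected 2-11% increase in
# globally averaged cyclone intensity. The 150 mph baseline is a
# hypothetical example, not data from Knutson et al.
baseline_mph = 150.0

low = baseline_mph * 1.02   # +2% lower bound of the projected range
high = baseline_mph * 1.11  # +11% upper bound of the projected range

print(f"{baseline_mph} mph storm -> {low:.1f} to {high:.1f} mph by 2100")
```

Even at the top of the range, the projected change for an individual storm is modest compared with the natural spread in storm intensities.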
Once again, climate scientists are predicting future climate behavior based, not on empirical data, but on computer models. They go on to state that confidence in some of their predictions is low “owing to uncertainties in the large-scale patterns of future tropical climate change, as evident in the lack of agreement between the model projections of patterns of tropical SST changes.” Their approach is to combine a number of different models into an “ensemble,” manipulating the output until it converges on what historical observations we have. In the end they predict fewer but stronger storms because of global warming, though “the actual intensity level of these strong model cyclones varies between the models, depending on model resolution and other factors.”
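The ensemble approach described above, in its simplest form, amounts to averaging the projections of many models and treating their spread as a rough measure of uncertainty. A minimal sketch in Python follows; the per-model numbers are invented placeholders, not values from the actual paper:

```python
# Hedged sketch of a multi-model "ensemble" average.
# The per-model projections below are invented placeholder values
# (projected % increase in mean cyclone intensity), not real figures.
model_projections = {
    "model_A": 2.0,
    "model_B": 11.0,
    "model_C": 6.5,
    "model_D": 4.0,
}

# The simplest ensemble estimate is the unweighted mean across models.
ensemble_mean = sum(model_projections.values()) / len(model_projections)

# The spread between models is one crude measure of projection uncertainty.
spread = max(model_projections.values()) - min(model_projections.values())

print(f"ensemble mean: {ensemble_mean:.3f}%  spread: {spread:.1f}%")
```

Note that averaging models says nothing about whether any individual model is right; a tight ensemble can simply mean the models share the same assumptions.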
Modeling and the Search for Scientific Truth
I have repeatedly stated that models can be a useful tool in any number of fields. Understand that there are different kinds of computer models. Some are quite exact and can be used for such things as aerodynamics and structural analysis. Those types of quantitative models are based on well-understood natural laws and are relatively tractable. They give answers that engineers can use as actual guidance. But even then they are not always right. Recently Boeing had to reinforce the wing root attachment points on their new 787 airliner because the computer model simulations were not borne out in actual testing.
Moreover, not all models are blessed with such passing verisimilitude with respect to nature. Most models are approximations for the systems being modeled. They are pressed into service when the system being studied is too complex for human intuition to predict system behavior. Computer models of this kind—which includes GCM climate models—should be used to provide insight, but instead are used to make authoritative predictions of things to come. This brings us to the philosophy of Sir Karl Popper.
Karl Popper was one of the most influential philosophers of the 20th century and a tremendous influence on modern scientific thought. One TRE reader, Peter Foster, pointed out to me that the 2007 IPCC report cites Popper's 1934 book, The Logic of Scientific Discovery. Peter is the second person with a connection to Canterbury University in Christchurch, New Zealand, to contact me this month. Unsurprisingly, both mentioned Popper. In 1937, the rise of Nazism and the threat of the Anschluss led the Austrian-born Popper to emigrate to New Zealand. There he became a lecturer in philosophy at Canterbury University, where he had a strong influence that evidently persists to this day.
According to Popper “the criterion of the scientific status of a theory is its falsifiability, or refutability, or testability.” By falsifiability he did not mean that a theory was false but that there exists a way to prove the theory false (for more see Popper's essay “Science as Falsification”). A theory has to be testable. There have to be defined properties which can be predicted by theory and checked by measurement. It would appear that the IPCC authors agree, as shown in this quote from the AR4 section entitled “The Nature of Earth Science”:
Science generally advances through formulating hypotheses clearly and testing them objectively. This testing is the key to science. It is not the belief or opinion of the scientists that is important, but rather the results of this testing. Scientific theories are ways of explaining phenomena and providing insights that can be evaluated by comparison with physical reality. Each successful prediction adds to the weight of evidence supporting the theory, and any unsuccessful prediction demonstrates that the underlying theory is imperfect and requires improvement or abandonment.
Popper was among the first to state that discovering truth is the aim of scientific inquiry while acknowledging that most of the greatest scientific theories in the history of science are, strictly speaking, false. Scientists' theories represent their current understanding of nature. As that understanding improves old theories are discarded and new ones formulated. Popper's philosophy of science defined progress as the process of moving from one false theory to another, still false theory that is nonetheless closer to the truth.
Climate models are analogous to those false yet useful theories—models try to encapsulate science's understanding of how the Earth system works. This has led many, including another friend of mine from Canterbury University, to argue that the models we have may not be perfect but they are at least usable. The question becomes: how much faith are you willing to place in a model's results, starting from the acknowledgment that all such models are by definition wrong? Having spent many years modeling large, nonlinear systems, I am not willing to base potentially world-changing decisions on the output of current climate models.
This is because of how the models are constructed and how they are calibrated. In short, the models are tuned to produce a specific amount of temperature increase for a doubling of CO2 levels. It is unsurprising that the researchers then get the answers they expected. It is also unsurprising that, when faced with an unexpected response from the natural system like the recent leveling and possible decline in global temperatures, the models fail miserably. Worse than that, secondary predictions are often made based on the predictions of models or even models that use the output of other models as their starting data.
A New Hurricane Model
One of the references cited by the Nature Geoscience hurricane modeling report was a recent paper in Science that can help fill in some of the technical details of how the modeling research was performed. In it, researchers did, indeed, report that fewer but stronger hurricanes will sweep the Atlantic Basin in the 21st century. This new modeling study by US government researchers from NOAA is predicated on climate change “continuing.” As explained in an accompanying perspective by Science writer Richard A. Kerr:
What makes the new study more realistic is its sharper picture of the atmosphere. In low-resolution models such as global climate models, the fuzzy rendition of the atmosphere can't generate any hurricanes, much less the intense ones that account for most of the damage hurricanes cause. The high-resolution models used by the U.S. National Weather Service to forecast hurricane growth and movement do produce a realistic mix of both weak and strong storms, but those models can't simulate global warming.
So, as a compromise, the researchers took the output of some of those “fuzzy” GCMs and used their projections for the global environment at the end of the century as the starting point for the new “high-resolution” models.
Climate modeler Morris Bender of the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey, and his colleagues used a technique sometimes called "double-downscaling." The group started with the average of atmospheric and oceanic conditions forecast for the end of the century by 18 global climate models. They transferred those averaged conditions into a North Atlantic regional model detailed enough to generate a realistic number of hurricanes, although still too sketchy to get their intensities right. Finally, the team transferred the regional model's storms to an even higher-resolution hurricane forecast model capable of simulating which ones would develop into category 3, 4, and 5 storms.
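The “double-downscaling” chain Kerr describes—a GCM ensemble average feeding a regional model, which in turn feeds a high-resolution hurricane model—can be caricatured as a pipeline of functions, each consuming the previous stage's output. Everything below is an invented toy meant only to show the structure of models feeding models; the formulas and numbers bear no relation to the real GFDL codes:

```python
# Toy caricature of the "double-downscaling" pipeline. All functions
# and numbers are invented stand-ins, not the actual GFDL models.

def gcm_ensemble_mean(gcm_outputs):
    """Stage 1: average end-of-century conditions from many GCMs."""
    return sum(gcm_outputs) / len(gcm_outputs)

def regional_model(mean_warming):
    """Stage 2: coarse regional model -> storm count (toy formula)."""
    return round(10 + 0.5 * mean_warming)  # storms per season

def hurricane_model(storm_count):
    """Stage 3: high-resolution model -> major (cat 3-5) storms (toy)."""
    return max(0, storm_count // 3)

# The structural point: any error in stage 1 propagates unchecked
# through stages 2 and 3.
gcm_outputs = [1.5, 2.0, 2.5, 3.0]   # invented warming values (deg C)
mean_warming = gcm_ensemble_mean(gcm_outputs)
storms = regional_model(mean_warming)
majors = hurricane_model(storms)
print(mean_warming, storms, majors)
```

The design choice worth noticing is that no stage ever sees real observations; each is conditioned entirely on the upstream model's output, which is the crux of the criticism that follows.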
Naturally this has led to a number of reports in the popular media that we are to expect fewer but stronger hurricanes in the future and those hurricanes are going to be caused by global warming. It should be noted that this study actually contradicts some reports that the recent “anomalous” rise in hurricane activity is linked to climate change. No consensus here.
Model tracks for all storms that eventually reached category 4 or 5 intensity. Bender et al./Science.
Given that the model predictions for 2100 are not testable except in the fullness of time, there is no convenient way to test the new model's future accuracy. As the researchers themselves state, “these findings are dependent on the global climate models used to provide the environmental conditions for our downscaling experiments.” It is, however, possible to run the model on known data taken over the past quarter of a century. The new modeling study attempted to reproduce recent conditions, and here is what they found:
The researchers note that the new modeling offers no support for claims that global warming has already noticeably affected hurricane activity. In the real world, the number of Atlantic hurricanes observed during the past 25 years has doubled; in the model, global warming would cause a slight decline in the number over the same period. Given that the mid-resolution model used by the group duplicates the observed rising trend, it may be natural.
So the fuzzier mid-resolution model, presumably less accurate than the new one, gets the recent trend correct, which the researchers interpret as an indication that any rising trend is purely natural. The new high-resolution model doesn't correctly predict current conditions. Here we have low-resolution models known to be inexact providing the hypothetical starting point data for other models—models which fail to correctly predict trends even when given real data—yet we are asked to uncritically accept the projections for hurricanes 90 years from now. It is to be expected that, if you start with wonky data input, you end up with wonky data output, but this carries the process a step further. Believing the results of this exercise seems more an act of faith than science. Is it any wonder that I mentioned the greatest sin for a modeler: believing that the model is the thing being modeled?
Computer simulation of the most intense hurricanes shows an increase from today (top) to a warmer world at the end of the century (bottom). Adapted from Bender et al./Science.
Rather amazingly, an earlier study in Nature stated that current climate conditions resemble those that led to peak Atlantic hurricane activity about 1000 years ago. I say amazingly because this study, based on examining ocean sediments, included Pennsylvania State University meteorologist Michael Mann of hockey stick fame—a global warming true believer in anyone's book. The paper states: “The short nature of the historical record and potential issues with its reliability in earlier decades, however, has prompted an ongoing debate regarding the reality and significance of the recent rise.”
Good modelers, like all cautious scientists, always use conditional phrases and qualifiers when writing of their work. “In the absence of a detectable change, we are dependent on a combination of observational, theoretical and modeling studies to assess future climate changes in tropical cyclone activity,” concludes the review by Knutson et al. “These studies are growing progressively more credible, but still have many limitations.” We have consensus and that consensus is “we don't really know.” Unfortunately, such reservations do not make it into the news headlines.
What usually happens is more a sin of omission rather than commission. Climate modelers, and the climate science community in general, have not gone out of their way to stress the inherent unreliability of their predictions to the lay public. In the public forum, climate science has been happy to let overly excitable reporters and fringe eco-activists spin the GCM results into predictions of future catastrophe. This is disingenuous at best and can lead to the types of backlash recently visited on CRU and other research organizations over mishandling and manipulation of data. Sure, the IPCC calls them scenarios and projections, not predictions, as if that gives them deniability when the projections do not come to pass.
In 2007, the IPCC said it was “more likely than not” that man-made greenhouse gases had already altered storm activity, but the authors of the review said more recent evidence muddies the issue. “The evidence is not strong enough that we could make some kind of statement” along those lines, Knutson said. This doesn't mean the IPCC report was wrong; it was just based on science done by 2006, and recent research has shifted the picture somewhat, according to Knutson and the other researchers.
The fact is, climate scientists have continued to use models to make predictions about future climatic conditions, and by attaching those predictions to the AGW theory they have weakened the very theory they are at pains to defend. Fortunately, Popper provides us with a way to filter truth from falsehood. The IPCC and other global warming alarmists have a choice—they can either say that AGW makes no predictions and is therefore, by definition, not a scientific theory, or they can stand by their model-generated predictions and admit that their theory has been proven false time and again.
Be safe, enjoy the interglacial and stay skeptical.
The computer model says it's going to be a heck of a blow.
Post Script: This is the first of two articles that, in part, are based on the philosophy of Karl Popper. These articles were motivated by an exchange of ideas with Doug Campbell, one of the proprietors of ClimateDebateDaily and a climate change proponent. Correspondence with Doug has renewed my faith that good people can not only have honest differences of opinion but that they can discuss those differences in a civilized manner.