There's a huge demand for local and regional climate projections. Policy-makers, planners and everyday people all over the world are looking for scientists to provide "data" on the future climate of their region, their town, their coast, their water supply, in order to better inform long-term decisions.
This demand is captured by the fast-spreading concept of climate services. In the past few years, there have been many new national and international initiatives and forums, like the recent Pacific Islands Climate Services Forum, aimed at getting scientists, government agencies and the private sector to "supply" these services.
Pacific Islands Climate Services Forum (Jan 2013)
There is, however, a gap between what is being demanded and what science can actually supply. The recent RealClimate post on regional climate modelling illustrates the size of that gap. In short: a couple of recent publications, summarized in Science (Kerr, 2013), question the effectiveness of regional models based on comparisons of model output with climate observations. The RealClimate post rightly takes the articles to task, reminding everyone that no climate model, regional or global, should be expected to recreate the exact year-to-year variation in the weather. The system is too chaotic and too sensitive to the initial conditions in the model. So models can describe the frequency and magnitude of climate variability, but "these fluctuations are not synchronised with the real world."
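To make that last point concrete, here is a minimal sketch with invented numbers (not taken from the papers discussed): two synthetic temperature series share the same warming trend and the same variability statistics, yet their year-to-year fluctuations are uncorrelated, just as a free-running model simulation is not synchronised with observations.

```python
# Illustrative only: synthetic series, not real observations or model output.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1960, 2011)
trend = 0.02 * (years - years[0])                 # shared warming trend (degC)

obs   = trend + rng.normal(0, 0.15, years.size)   # stand-in for "observations"
model = trend + rng.normal(0, 0.15, years.size)   # stand-in for a "model run"

print("standard deviation (obs, model):", obs.std().round(2), model.std().round(2))
print("correlation of detrended year-to-year fluctuations:",
      np.corrcoef(obs - trend, model - trend)[0, 1].round(2))
# Comparable variability statistics, but the wiggles line up only by chance.
```

Judging a model by how well its individual years match the observed years, rather than by the statistics of its variability, sets it up to fail.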
This confusion over what models can and cannot predict is an example of the gap between what science can deliver and what people expect science to deliver, a subject Mike Hulme discusses in Why We Disagree About Climate Change.
The gap is common in climate change science, but hardly unique to it. Think of going to the doctor with a sprained ankle. You hope for a clear diagnosis and a timetable for recovery. Instead, you receive a vague answer on a simple three-point scale about the severity of the sprain, and a range in weeks for the likely recovery time.
The expected recovery time for the sprain is the medical equivalent of a multi-model ensemble prediction: we can't tell you exactly when the ankle will heal or exactly how much the climate will change, but we can tell you that, given the input data, it "should" fall within this range. It is, statistically speaking, possible that it will not fall in that range, because there is a chance that the data on which the range was based did not capture the full range of possible outcomes.
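As a hypothetical illustration (the numbers below are invented, not drawn from any actual model ensemble), an ensemble "prediction" is simply a range computed across the individual model projections:

```python
# Hypothetical projections of regional warming (degC) from seven models.
import numpy as np

projected_warming = np.array([1.8, 2.1, 2.4, 2.6, 2.9, 3.3, 3.7])

low, high = np.percentile(projected_warming, [5, 95])
print(f"Likely range: {low:.1f} to {high:.1f} degC")
# As with the recovery time for a sprain, the real outcome "should" fall in
# this range, but it is not guaranteed to: the ensemble may not sample the
# full range of possible futures.
```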
As a patient worried about being able to walk, you are quite likely to want the "expert" to do more definitive tests to improve the answer. However, a high-technology test, be it an MRI or a new climate model, is not guaranteed to radically improve the diagnosis of what has happened or the prediction of what will happen. This is simply not something we can know with 100% confidence.
Many of the potential users of climate change projections, not educated in the fine technical points of climate modelling or statistics, are looking for precise answers that scientists and our models will never be able to provide. This expectation is embedded in the very language that is used. At the Pacific Islands Climate Services Forum in January, I was told many times about the need for "data", a word I've intentionally placed in quotes in this post, for decision-making. The word "data" implies a precise measurement. Yet what scientists can provide is a "prediction", which comes with uncertainty, itself a combination of known and unknown elements.
It is clearly important to develop and properly evaluate methods for regional climate prediction. Even with the uncertainty in future predictions, some of which is irreducible, the information can still be of use in decision-making. We are, after all, able to decide whether it is safe to start running again after an ankle sprain, despite imperfect knowledge of the exact state of the ligaments, muscles and tendons.
Scientists, however, need to recognize that the core challenge is not just improving the models, but improving understanding of what can be modelled. Otherwise, scientists and decision-makers will be working at cross purposes.
If you're interested in more of these ideas, I recommend the third chapter - "The Performance of Science" - of Hulme's book. I assign it to my undergraduate students every year.