More than six months after Climategate, the fallout continues. The controversies surrounding the IPCC since the affair offer us an opportunity to think critically about the so-called “climate consensus.” While much of the MSM continues undeterred with alarmist claims and carbon finger-pointing, we owe it to rational discourse to keep asking pointed questions. We might also reflect on the manner and method by which scientists arrive at such consensuses. For in looking at the past, we see evidence of groupthink and of coalescence around big mistakes.
Consider what one might call a brief history of hysteria. Ronald Bailey has a solid piece about this history over at Reason that he closes with this passage:
So what to make of this increase in the use of the concept of “scientific consensus?” After all, several scientific consensuses before 1985 turned out to be wrong or exaggerated, e.g., saccharin, dietary fiber, fusion reactors, stratospheric ozone depletion, and even arguably acid rain and high-dose animal testing for carcinogenicity. One reasonable response might be that anthropogenic climate change is different from the cited examples because much more research has been done. And yet. One should always keep in mind that a scientific consensus crucially determines and limits the questions researchers ask. And one should always worry about to what degree supporters of any given scientific consensus risk succumbing to confirmation bias. In any case, the credibility of scientific research is not ultimately determined by how many researchers agree with it or how often it is cited by like-minded colleagues, but whether or not it conforms to reality.
Indeed.
And that reality is a moving target. First, you need a whole lot of variables to “model” reality. Then there are complex systems - reality’s toughest nuts to crack - which are infamous for eluding our best models and for exposing the limits of our ability to describe, explain, and predict nature in the first place.
“I have studied the climate models and I know what they can do,” writes physicist Freeman Dyson. “The models solve the equations of fluid dynamics, and they do a very good job of describing the fluid motions of the atmosphere and the oceans. They do a very poor job of describing the clouds, the dust, the chemistry, and the biology of fields and farms and forests.”
The fact is, people can be overconfident about what they think they know. In the face of such hubris, a few remain skeptical. Does that mean this minority is in denial? Obscuring the truth? No. It means serious questions linger. Such questions demand agnosticism (particularly in the case of climate change). When it comes to science, history reveals a legacy of getting a lot of things wrong, with only punctuated episodes of getting things right. And that goes doubly for prediction and forecast.
We may therefore do better to adapt than to anticipate. As Aaron Wildavsky wrote before his death in 1993:
In order to make a strategy of anticipation effective, it is necessary to know the quality of the adverse consequence expected, its probability, and the existence of effective remedies. The knowledge requirements and organizational capacities required to make anticipation an effective strategy—to know what will happen, when, and how to prevent it without making things worse—are large, often impossibly so.
If only we had more people like Wildavsky around today, we might be far less likely to rush to judgment in matters of science and policy.
Compounded Uncertainty, Magnified Error
In thinking more about any climate consensus, consider a thought experiment:
Suppose there were ten consensus scientists in a room down at the United Nations. Each scientist claimed to be 95% certain about his or her particular prediction about climate change. We might be tempted to say that, taken together, the scientists were 95% certain about their “consensus” view. But is this correct?
Not quite. Economists Eric Alston and Richard Stroup say:
Confidence intervals are crucial; Even if each scientist were 95% certain of his particular prediction or set of parameters, this doesn't mean we can be so certain about the agglomeration of 10 scientists' opinions, each of whom was individually 95% confident in their predictions. .95 to the tenth power is .598!
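A quick back-of-the-envelope check makes the arithmetic concrete. The sketch below (plain Python, and it assumes the ten predictions can be treated as independent, which is the simplifying premise of the thought experiment) simply raises each individual confidence level to the power of the number of predictions:

```python
# Joint confidence of several independent predictions, each held at 95%.
# Independence is assumed here - the simplifying premise of the thought experiment.

def joint_confidence(confidence: float, n_predictions: int) -> float:
    """Probability that all n independent predictions hold simultaneously."""
    return confidence ** n_predictions

if __name__ == "__main__":
    for n in (1, 5, 10, 20):
        print(f"{n:2d} predictions at 95% each -> joint confidence "
              f"{joint_confidence(0.95, n):.4f}")
    # 10 predictions at 95% each -> joint confidence 0.5987,
    # the ".95 to the tenth power" figure quoted above.
```

The more individually confident claims you stack into a single “consensus” view, the less confident you can be in the package as a whole.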
Uncertainty gets compounded. And so do our errors. In The Black Swan, Nassim Nicholas Taleb reminds us:
Simply, we are facing nonlinearities and magnifications of errors coming from the so-called butterfly effects [...] actually discovered by Lorenz using weather forecasting models. Small changes in input, coming from measurement error, can lead to massively divergent projections--and that generously assumes we have the right equations.
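Taleb’s point is easy to see even in a toy setting. The sketch below is not a climate model; it iterates the logistic map (a standard textbook example of chaotic dynamics, used here purely as an illustration) from two starting points that differ by a one-in-a-million “measurement error” and prints how quickly the trajectories diverge:

```python
# Toy illustration of sensitivity to initial conditions (not a climate model):
# iterate the logistic map from two initial values that differ by a tiny
# "measurement error" and watch the trajectories diverge.

def logistic_map(x: float, r: float = 4.0) -> float:
    """One step of the logistic map: x -> r * x * (1 - x)."""
    return r * x * (1.0 - x)

def trajectory(x0: float, steps: int) -> list[float]:
    """Iterate the map for a given number of steps, keeping every value."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic_map(xs[-1]))
    return xs

if __name__ == "__main__":
    a = trajectory(0.400000, 50)   # "true" initial condition
    b = trajectory(0.400001, 50)   # same, plus a 1e-6 measurement error
    for step in (0, 10, 20, 30, 40, 50):
        print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.6f}")
```

Within a few dozen iterations the two runs bear no resemblance to each other - and that is with the “right equations” and only one variable.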
Taleb goes on to argue a version of the precautionary principle: “We have no proof that we are not harming nature, either.” While this is a fair point, it is not the stuff of policymaking. In fact, we can predict with far greater certainty that the climate change mitigation policies currently on the table would do more harm to the economy than good for the climate (i.e., cooling the planet). If we’re going to apply the precautionary principle anywhere, we should apply it to the economy and become more resilient à la Wildavsky.
Resilience and Wealth
Here’s Wildavsky on resilience:
A strategy of resilience, on the other hand, requires reliance on experience with adverse consequences once they occur in order to develop a capacity to learn from the harm and bounce back. Resilience, therefore, requires the accumulation of large amounts of generalizable resources—such as organizational capacity, knowledge, wealth, energy, and communication—that can be used to craft solutions to problems that the people involved did not know.
Wealthier is healthier. And wealthier is more resilient. That’s why even if the consensus view of climate change turns out to be right, we’ll probably do better to adapt than to shoot ourselves in the proverbial foot.
Note: Thanks to Eric Alston and Richard Stroup for their great insights and resources, some of which I borrowed for this post.