November 23, 2009

This Weekend’s News: Climate Science Exposed!!!

The sensational stories of the weekend that really got me thinking were the countless news reports that hackers stole and released more than 1,000 emails from prominent climate scientists at the University of East Anglia. Wired’s Threat Level blog and Dot Earth (as well as the main NYT) have some of the most useful coverage, and I’ll let you go there for the details, which are continuing to unfold. The gist is that climate data and projections, how they are derived and decided upon, and the people behind them have been exposed very publicly, causing the credibility of this valuable work to again be seriously questioned. Much attention has centered on emails revealing universal emotions such as anger (gasp!) and personality disputes among the scientists involved.

One reader of the RealClimate blog posted this suggestion for better avoiding this kind of uproar in the future: “This is an argumnet for total transperancy in data and methods [sic].” Spelling aside, the more we know about sources and methods, the more useful scientific projections and quantitative analyses become. It’s hard to imagine that these hacked emails would have been able to cause this kind of brouhaha if we all knew more about how scientists go about doing their jobs.

In my own research, I try my best to constantly review and scrutinize the assumptions behind models, projections, and statistics – climate and otherwise – and it is more than a little surprising to me that more detail about how the climate science sausage is made, if you will, is causing such a major stir (as of this writing it was the top story for The Washington Post, a rare feat for a climate change story). What matters in the stats and assessments you look at obviously depends on how you plan to use them, and that varies even for us security-minded folks. I most often use data on trends such as climate change as indicators or to add detail for the sake of more clearly identifying security problems, and occasionally I look to detailed climate projections of specific regions to get a sense of how the effects of climatic changes might exacerbate or alter existing trends. For example, many of us at CNAS have been quite concerned about Yemen’s deterioration of late, and it’s worthwhile in our general research to check what climate projections suggest the future may hold for that country as we consider what U.S. policy toward it should be – particularly given that water scarcity is a major problem even today. The National Intelligence Council (NIC) also uses current scientific findings, data, and projections on climate change in its Global Trends reports, and more recently in a series of research papers and conference reports, The Impact of Climate Change to 2030, which studied potential climate change effects in six regions. I find the explanation of how they go about considering climate science in a security context particularly helpful (and brief):

  • “In the first phase, commissioned research reports explore the latest scientific findings on the impact of climate change in the specific region/country.
  • In the second phase, a workshop or conference composed of experts from outside the Intelligence Community (IC) will determine if anticipated changes from the effects of climate change will force inter- and intra-state migrations, cause economic hardship, or result in increased social tensions or state instability within the country/region.
  • In the final phase, the NIC Long-Range Analysis Unit (LRAU) will lead an IC effort to identify and summarize for the policy community the anticipated impact on US national security.”

What’s perhaps most important is that researchers like us account for what we don’t know about statistics and scientific assumptions as we use them in our research and analysis. The NIC 2030 reports state that their assessments identify “deficiencies in climate change data that would enhance the IC understanding of potential impact[s]” for regions of concern.

Ever since our climate change war game last year, we’ve also been interested in exploring how climate science is improving, and what kinds of information will be (or should be) available to policy makers in the future. When I finally finish it, I will cover this topic in a bit more depth as I review the book The Future of Everything: The Science of Prediction, which provides historical context to the scientific models and projections we use on a regular basis in our work. Jay and Will are also wrapping up a report on a recent CNAS project that explored how climate scientists and the security community collaborate and communicate, which I think might be useful to many readers in the wake of this weekend’s big news. For now, I’m hoping that the exposure of climate scientists’ emails at least helps everyone to learn a bit more about scientific processes, and that it serves as a reminder that analysts in every field need to understand the context behind the information they use.