31 January 2011

David Goldston at CIRES on Friday

Friday, February 4, 2011
David Goldston
Director of Government Affairs, U.S. Natural Resources Defense Council

Loving Science to Death:
Problems at the Intersection of Science and Policy


University of Colorado CIRES Auditorium | 4:00-5:00 p.m. (directions to CIRES)

Light reception to follow in the CIRES Atrium

Why is the use of science in policy so fraught with political and substantive danger? What can be done to improve the use of science in the policy process? Is the situation improving or getting worse? The talk will address these questions, drawing on a variety of past and current examples from environmental policy that David Goldston has been involved in on and off Capitol Hill.


About this series

The Cooperative Institute for Research in Environmental Sciences — CIRES — seeks to promote global perspectives by sponsoring distinguished speakers whose work crosses disciplinary boundaries. The Distinguished Lecture Series is designed to bring to CIRES outstanding scientists, historians of science, science policy makers, science journalists, and others who take imaginative positions on environmental issues and can establish enduring connections after their departure. Participants' interests embrace those of the University departments and programs, and the National Oceanic and Atmospheric Administration labs affiliated with CIRES.

For a current list of seminar offerings, visit: http://cires.colorado.edu/events/lectures

Many thanks to Jon and Elaine Krupnick for their generous support of CIRES' Distinguished Lecture Series.

30 January 2011

Updated 31 Jan 2011 -- Normalized Disaster Losses in Australia

Courtesy of Ryan Crompton, the figure above shows the most recent insured loss estimates from the recent Queensland flooding, based on an update from the Insurance Council of Australia (here in PDF).  The estimated cost of the flood has increased from A$1.2 billion to A$1.51 billion, shown as the bright blue bar on the far right of the figure above. That value is not normalized (the normalized value would be lower); the other data show the normalized losses from Crompton and McAneney 2008, updated through 2010 (more details here). 

Writing in the Sydney Morning Herald, Ross Gittins offers a valuable perspective on the economic magnitude of the losses from the Queensland floods.  I'll skip over the issues of domestic politics that he discusses to focus on the comments that he makes about the magnitude of the losses.  He provides a useful bit of advice, which is exactly the advice that I give to my students (emphasis added):
The wise and much-loved econocrat Austin Holmes used to say that one of the most important skills an economist needed was ''a sense of the relative magnitudes'' - the ability to see whether something was big enough to be worth worrying about.

That sense has been absent from the comments of those business and academic economists on duty over the silly season, happily supplying the media's demand for comments confirming the immensity of the floods' economic and budgetary implications.
Gittins then gets into the numbers:
If this is the most expensive natural disaster in Australian history, all it proves is the cost of earlier disasters was negligible. If you can ''rebuild Queensland'' for just $5.6 billion, it must be a pretty tin-pot place.

If $5.6 billion seems a lot, consider some ''relative magnitudes'': the economy's annual production of goods and services (gross domestic product) totals $1400 billion, and the budget's annual revenue collections total $314 billion.

Note that, though no one's thought it worthy of mention, the $5.6 billion in spending will be spread over at least three financial years, making it that much easier to fund.

We know that more than a third of the $5.6 billion will be paid out in the present financial year with, presumably, most of the rest paid in 2011-12. So just how the flood reconstruction spending could threaten the budget's promised return to surplus in 2012-13 is something no one has explained.

And if $5.6 billion isn't all that significant in the scheme of things, how much less significant is the $1.8 billion to be raised from the tax levy? The fuss economists have been making about it tells us more about their hang-ups over taxation than their powers of economic analysis.
It turns out that domestic politics are difficult to avoid!  But what about the possible GDP impacts?  Gittins explains:
Turning from the budget to the economy, Treasury's estimate is that the floods will reduce gross domestic product by about 0.5 percentage points, with the effect concentrated in the March quarter.

Thereafter, however, the rebuilding effort - private as well as public - will add to GDP and probably largely offset the initial dip. So the floods will do more to change the profile of growth over the next year or two than to reduce the level it reaches.

Most of the temporary loss of production will be incurred by the Bowen Basin coal miners. But, though it won't show up directly in GDP, their revenue losses will be offset to some extent by the higher prices they'll be getting as a consequence of the global market's reaction to the disruption to supply.

And despite all the fuss the media have been making over higher fruit and vegetable prices, Treasury's best guess is that this will cause a spike of just 0.25 percentage points in the consumer price index for the March quarter, with prices falling back in subsequent quarters.

So the floods do precious little to change the previous reality that, with unemployment down to 5 per cent and a mining investment boom on the way, the economy is close to its capacity constraint and will soon need to be restrained by higher interest rates.

Talks this Week: UBC and Portland

[UPDATE 1/31: DUE TO A WINTER STORM IN COLORADO THE UBC TALK HAS BEEN POSTPONED UNTIL LATER IN THE SEMESTER, STAY TUNED]

I'm giving two talks this week.  If you are a reader of this blog, please do say hello.  Here are the details from the two talk announcements:

First, Tuesday at the University of British Columbia in Vancouver:
The Climate Fix: What Scientists and Politicians
Won’t Tell You About Global Warming


University of British Columbia
Tuesday February 1st 2011
12 - 1pm
AERL room 120

The world’s response to climate change is deeply flawed. The conventional wisdom on how to deal with climate change has failed and it’s time to change course. To date, climate policies have been guided by targets and timetables for emissions reduction derived from various academic exercises. Such methods are both oblivious to and in violation of on-the-ground political and technological realities that serve as practical “boundary conditions” for effective policy making. Until climate policies are designed with respect for these boundary conditions, failure is certain. Using nothing more than arithmetic and logical explanation, this talk provides a comprehensive exploration of the problem and a proposal for a more effective way forward.

ROGER PIELKE, Jr., has been on the faculty of the University of Colorado since 2001 and is a Professor in the Environmental Studies Program and a Fellow of the Cooperative Institute for Research in Environmental Sciences (CIRES). At CIRES, Roger served as the Director of the Center for Science and Technology Policy Research from 2001-2007. Roger’s research focuses on the intersection of science and technology and decision making. In 2006 Roger received the Eduard Brückner Prize in Munich, Germany for outstanding achievement in interdisciplinary climate research. Before joining the University of Colorado, from 1993-2001 Roger was a Scientist at the National Center for Atmospheric Research. Roger is a Senior Fellow of the Breakthrough Institute. He is also author, co-author or co-editor of seven books, including The Honest Broker: Making Sense of Science in Policy and Politics published by Cambridge University Press in 2007. His most recent book is The Climate Fix: What Scientists and Politicians Won’t Tell you About Global Warming (September, 2010, Basic Books).
And then Thursday evening in Portland:
Fixing Climate Through Energy Innovation
Illahee Lecture Series
Thursday February 3rd
7 PM at the First Congregational Church, 1126 SW Park Avenue in Portland
Tickets here

February 3, 2011 Political scientist Roger Pielke maintains that we'll make better progress on climate if we focus on energy innovation. Pielke is Professor of Environmental Studies at University of Colorado, and has held leadership positions at NCAR, CIRES, and the Breakthrough Institute. He is also author, co-author or co-editor of seven books, including The Honest Broker: Making Sense of Science in Policy and Politics, and The Climate Fix: What Scientists and Politicians Won't Tell you About Global Warming. More about Roger Pielke here and here.

28 January 2011

Science Meets Politics


If you happen to teach science and policy, or you are a scientist interested in participating in the policy process, or you are just curious about how experts participate in the political process, then you'll find this short 10-minute video of interest.

It shows a scientist, an agronomist, testifying before the Washington State Senate Environment, Water and Energy Committee.  He gives short testimony, takes some questions from the State Senators, and the Senators then get into a debate among themselves, with one ultimately walking out.

Here are some questions for discussion after you see the video:

The scientist claims to be "objective" and speaking for science.  What might this mean?

The policy makers appear to have no interest in his science, and focus on his legitimacy and thus credibility.  What is going on here?

What might the scientist have done differently?

What might the policy makers have done differently?

What does this say about the relationship of science and policy in a highly politicized context?

Human Innovative Capacity Knows No Bounds

From Metro.co.uk:
The ultimate in fast food innovations, St Pauli have installed the model railway to serve supporters in their VIP section with freshly cooked sausages throughout the game.

The train runs every five minutes direct from the club kitchens to the VIP section - and is topped up with fresh porkers throughout each home match.

There's nothing better to wash down those dogs than a good pitcher of beer - so what could be more appropriate than individual pumps for each seat, AND a built-in flat screen TV for action replays?

How to Get to 80% "Clean Energy" by 2035

Motivated by Michael Levi at the CFR, I have put together a quick spreadsheet to allow me to do a bit of sensitivity analysis of what it would take for the US to get to 80% "clean energy" in its electricity supply by 2035, as proposed by President Obama in his State of the Union Speech earlier this week.

Here is what I did:

1. I started with the projections from the EIA to 2035 available here in XLS.
2. I then calculated the share of clean energy in 2011, assuming that natural gas gets a 50% credit for being clean.  That share is just under 44% (Nukes 21%, Renewable 13%, Gas 10%).
3. I then calculated how that share could be increased to 80% by 2035.

Here is what I found:

1. Coal pretty much has to go away.  Specifically, about 90% or more of coal energy would have to be replaced.
2. I first looked at replacing all the coal with gas, all else equal.  That gets the share of clean energy up to about 68%, a ways off of the target.
3. I then fiddled with the numbers to arrive at 80%.  One way to get there would be to increase the share of nukes to 43%, gas to 31% and renewables to 22% (Note that the EIA reference scenario -- BAU -- to 2035 has these shares at 17%, 21% and 17% respectively, for a total share of about 45%, just about the same as today.)

What would this actually mean?

Increasing nuclear power in the EIA reference scenario from a 17% to 43% share of electricity implies, in round numbers, about 300 new nuclear power plants by 2035.***  If you do not like nuclear you can substitute wind turbines or solar thermal plants (or even reductions in electricity consumption) according to the data provided in The Climate Fix, Table 4.4.  The magnitude of the task is the same, just expressed differently.

That works out to one nuclear plant's worth of carbon-free energy every 30 days between now and 2035.  This does not even consider electrification of some fraction of the vehicle fleet -- another of President Obama's goals -- which presumably would add a not-insignificant amount to electricity demand.

Thus, I'd suggest that the President's clean energy goal is much more of the aspirational variety than an actual policy target expected to be hit precisely.

***[Math: (43/17)*898 (billion kilowatthours in 2035)/815 (bkWh in 2011) *109 (nuclear plants in 2011) = 304.16]
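
For anyone who wants to check this arithmetic, here is a minimal Python sketch; all of the shares and generation figures come from the post above, and it is an illustrative reconstruction rather than the original spreadsheet.

```python
# Back-of-envelope reconstruction of the clean-energy share and nuclear plant
# arithmetic described in this post. Inputs are taken from the text above.

GAS_CREDIT = 0.5  # natural gas counted as 50% "clean"

def clean_share(nuclear, renewables, gas):
    """Clean-energy share of electricity generation, with gas at partial credit."""
    return nuclear + renewables + GAS_CREDIT * gas

# One illustrative 2035 mix that reaches ~80% (shares of total generation)
share_2035 = clean_share(nuclear=0.43, renewables=0.22, gas=0.31)
print(f"2035 clean share: {share_2035:.1%}")  # ~80%

# The *** footnote: scale 2035 reference nuclear output (898 bkWh at a 17%
# share) up to a 43% share, then divide by 2011 output per plant
# (815 bkWh from 109 plants).
bkwh_2035_ref, share_ref = 898, 0.17
bkwh_per_plant_2011 = 815 / 109
plants_needed = (0.43 / share_ref) * bkwh_2035_ref / bkwh_per_plant_2011
days = (2035 - 2011) * 365
print(f"~{plants_needed:.0f} plants, about one every "
      f"{days / plants_needed:.0f} days")  # ~304 plants, roughly one a month
```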

27 January 2011

Not a Sputnik Moment

From James Fallows' blog, Jorge and Paola Guajardo point to the image above and write:
At a time when China is making a lot of Americans nervous, what could be less threatening than its leader arriving in a Boeing-747, made in Seattle and bearing the Star Alliance logo?
Also, a commenter points to this spot-on article from The Economist, which also has the clever cartoon below.

The Problem with a "Sputnik Moment"

In the State of the Union address earlier this week President Obama invoked the notion of "our generation's Sputnik moment."  I don't think that the symbolism works for several reasons.

First, Sputnik was a thing, a technology, that everyone could see as a tiny dot of light zipping across the sky at night. One day it did not exist and the next day it did. It did not take much imagination to picture that dot of light falling to Earth with a nuclear warhead attached. Sputnik was tangible, a discrete event that embodied both the symbolic and real fears of a nuclear war with the Soviet Union. It was indeed a unique moment that transformed U.S. politics in an instant.

Today's "moment" just doesn't compare for at least several reasons. First, there is no single thing out there, no technology that we can all see, fear and develop a shared understanding about. The technologies of everyday life are not, for the most part, threats, but rather the source of information, freedom, jobs, health and other good things. When an American charges his smart phone, I seriously doubt that he worries about whether the power source is built with technologies that may originate overseas. Second, a single enemy that threatened apocalyptic annihilation would tend to focus the mind. Today it is not even clear what the nature of our competition is with other countries, as we are bound together in a globalized world. Trade imbalances, patent applications and technology transfer hardly have the same mind-focusing quality as a nuclear war.

But these are fairly wonky criticisms.  At the Washington Post, Alexandra Petri, writing from the perspective of the Millennial generation, offers a more fundamental and irreverent critique:
As far as I can understand it, [Sputnik] seems to have been something that Soviet Russia launched into space.

Apparently, thanks to the impetus that Sputnik gave us the last time, an entire generation of Americans committed to developing expertise in engineering, math, science, and technology that would enable us to convincingly fake a moon landing on a soundstage somewhere in 1969. This gave added emphasis to the Cold War. Given my advanced youth, I also missed the Cold War. I am accustomed to wars that are hot and distant, like certain men.

To people like me, the idea that there was ever just one team lined up across the field from us is a novel one. But this was the condition of Sputnik. Lyndon B. Johnson aide George Reedy exclaimed: "It really doesn't matter whether the satellite has any military value. The important thing is that the Russians have left the earth and the race for control of the universe has started."

So I couldn't help wondering: Could we ever have a Sputnik moment?

Frontiers? We live on them. In 1969, things were still analog. You didn't have to discard your devices after a few months because Steve Jobs had decided that light purple was the new purple. Now, if something is lasting, we look down on it. "The only thing that lasts these days are dead armadillos and those seasonal breads in the glass case at Starbucks," we point out. Ephemeral is the new permanent. We have the collective memory -- and persistent desire to mate with anything in sight -- of Viagra-addled mayflies.

This comes with many boons. Thanks to our insistence on living on the bubble of the present moment, our world is rife with unnatural wonders - iPhones, iPads, Clouds, memes, videos of cats in Japan stuffing themselves into boxes. When I have a sore throat, I can go online and describe my symptoms, and strangers from across the globe (or the part of the globe that follows me on Twitter, at any rate) can suggest that I drink blueberry syrup and hot toddies! This is the stuff!

Everyone admits that the world has shrunk. But this shrinkage has also closed the window for Sputnik moments.
She concludes:
But -- especially in the very fields President Obama was urging us to become competitive -- there isn't the same U. S. versus them imperative. Scientists across the world share resources, data, and equipment - applying to spend nights gathering data through radio telescopes in South America, or posting their findings online. They float together in the bowels of the International Space Station -- then post updates on Twitter. Our scientists don't innovate because "the Russians have left the earth and the race for control of the Universe has started." They innovate because our species is racing, in unison, to be faster, better, more efficient, and maybe someday it will slip the bonds of the solar system.
From a policy perspective, with its renewed focus on innovation, the Obama Administration is certainly moving in an effective direction.  However, it needs to apply a bit of innovation to the narrative that it is using to characterize what it is up to -- a "Sputnik moment" isn't it.

Quote of the Day 2

Energy Secretary Steven Chu:
[O]n Wednesday morning, Secretary of Energy Steven Chu, reiterated the belief that energy is a "nonpartisan issue," as it means creating jobs and building wealth. "It doesn't matter because this is wealth creation," said Chu. "Rather than get in a debate about climate predictions, say this is a debate about our future prosperity."

Quote of the Day

From jstults in the comments:
In strategy the longest way round is often the shortest way there; a direct approach to the object exhausts the attacker and hardens the resistance by compression, whereas an indirect approach loosens the defender's hold by upsetting his balance.

Sir Basil Henry Liddell Hart

26 January 2011

Change Comes Fast

Last night President Obama outlined a far more oblique strategy on climate than one might have countenanced even just a few months ago, proving that politicians are far more adept at pivoting when reality intercedes than are advocates and pundits.

Here are a few observers having a hard time with the new political reality:

Joe Romm:
The President could not bring himself to utter the words “climate change” or “global warming.”  These omissions were depressingly predictable
Andy Revkin:
It’s one thing to cave to a wave of naysaying climate rhetoric and build a new American energy conversation on points of agreement rather than clear ideological flash points like global warming.

It’s another to duck and cover entirely on climate, as President Obama did in his State of the Union message.
Bryan Walsh:
[T]here's no avoiding the fact that a candidate who spoke of climate change as an existential threat on the 2008 campaign trail—and whose diplomats were still promising to reduce U.S. greenhouse gas emissions 17% below 2005 levels by 2020 as recently as last month in Cancun—didn't mention the term "climate change," nor "global warming," nor "carbon."
David Roberts:
Obama said ... nothing about climate change. It didn't come up.

This is a failure on Obama's part. A moral failure, a failure of leadership, but also, I would argue, a political failure.
Buck up, guys. Sometimes you have to take an indirect path to where you want to go -- here is a blurb from John Kay's neat little book, Obliquity:
If you want to go in one direction, the best route may involve going in another. This is the concept of ‘obliquity’: paradoxical as it sounds, many goals are more likely to be achieved when pursued indirectly. Whether overcoming geographical obstacles, winning decisive battles or meeting sales targets, history shows that oblique approaches are the most successful, especially in difficult terrain.

Obliquity is necessary because we live in a world of uncertainty and complexity; the problems we encounter aren’t always clear – and we often can’t pinpoint what our goals are anyway; circumstances change; people change – and are infuriatingly hard to predict; and direct approaches are often arrogant and unimaginative.
I am amazed to see views that have been espoused by The Breakthrough Institute, in The Hartwell Paper and The Climate Fix go from being outside the mainstream perspective on climate policy to being highly consistent with the approach now being advocated by the US President.  This is good news for climate policy and politics, even if it is hard for some to accept.

25 January 2011

State of the Union Word Cloud

Updated -- Normalized Disaster Losses in Australia

[UPDATE: The data in the graph above was updated on 31 January 2011 and presented here.]

Ryan Crompton at Macquarie University has updated the Crompton and McAneney (2008) analysis of disaster losses in Australia to account for the estimated losses from the 2011 Brisbane floods, as well as losses from each of the years between their original analysis and the present.  You can see the results in the figure above.  The figure shows meteorological disasters (bushfire, flood, hailstorm, thunderstorm, tropical cyclone) for the period 1966 to present using an apples-to-apples comparison of losses across hazards.

Here is a note that Ryan sent me which explains the figure:
The figure shows the Crompton and McAneney (2008) Figure 2(b) adjusted to include the seasons 2006/07 - 2009/10 and the loss from the recent Queensland floods (the only loss included in the current 2010/11 season). The losses in the four complete seasons added to the time series have been normalised to 2005/06 values as per Crompton and McAneney (2008) so that losses have been reduced from their original values for consistency with the other data. The Insurance Council of Australia (ICA) has released updated figures (PDF) on the insurable losses in Queensland as a result of flooding, with the current estimate at $1.2 billion. With the final loss from this event not yet known and likely to increase, the original loss of $1.2 billion has been included in the above figure rather than the equivalent 2005/06 normalised value (which would be a bit lower).
You can find Crompton and McAneney (2008) here, and if you do not have access to that journal you can read a very nice summary of that work here in PDF.
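
For readers unfamiliar with loss normalization, the basic idea is to express a historical loss in reference-season (here 2005/06) terms by scaling it for growth in exposure. Below is a minimal, generic Python sketch of that idea; the actual adjustment factors used by Crompton and McAneney (2008) differ in detail, and the numbers here are purely illustrative.

```python
# Generic sketch of loss normalization: scale a historical loss by the growth
# in the number and value of exposed dwellings up to a reference season.
# Illustrative only; not the specific Crompton and McAneney (2008) factors.

def normalize_loss(original_loss, dwellings_event, dwellings_ref,
                   value_per_dwelling_event, value_per_dwelling_ref):
    """Express a historical loss in reference-season (e.g. 2005/06) terms."""
    dwelling_factor = dwellings_ref / dwellings_event
    value_factor = value_per_dwelling_ref / value_per_dwelling_event
    return original_loss * dwelling_factor * value_factor

# A hypothetical A$100m loss in a season with half as many dwellings, each
# worth half as much (in nominal terms) as in the reference season:
print(normalize_loss(100e6, 4e6, 8e6, 150e3, 300e3))  # A$400m in reference terms
```

Note that a loss occurring after the reference season scales the other way, which is why the seasons added to the time series above have normalized values below their original values.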

Writing with Kevin Roche, John McAneney discussed lessons of the recent floods in an op-ed:
The city of Brisbane, like many other towns in Queensland, is built on a floodplain.

Compared with the current disaster, there have been even bigger floods in the past: in 1841 and 1893 when flood waters topped 8.35 meters, some 3.9 m above the latest peak.

After the 1974 floods, the senior engineer with the Hydrometeorology Branch of the Bureau of Meteorology, G. Heatherwick, warned that heavier rainfalls were possible over the Bremer river catchment with higher flooding in Ipswich, independent of the flood mitigation effects of the Wivenhoe Dam.

This is not to deny that climate change is a real concern: few continue to believe that only positive outcomes will arise from the continued heating of the planet.

The latest research, however, just published in the international journal Environmental Research Letters by Ryan Crompton, Roger Pielke Jr. and John McAneney suggests that it may be centuries until we can be confident that climate change is influencing disaster losses.

If we truly wish to reduce the scale of future disasters in Australia, we need risk-informed land planning policies with risks appropriately priced by an active insurance market.

In simple terms, for flood and bushfire, this means an end to unmanaged development of flood plains or within bushlands.

In the Black Saturday fires, studies by Risk Frontiers showed that 25% of the home destruction in the most affected towns of Marysville and Kinglake took place physically within bushlands; 60% occurred within 10 m of bushland boundaries.

At these distances, chances of home survival are low and not surprisingly few resisted the flames.

At the beginning of last century, lives and property were lost as people lived and worked in the bush; today the problem remains as people choose to live in or near the bush for lifestyle reasons. The analogy to floodplains is obvious.

Extreme weather is not new to Australia with some 95% of all property losses in natural disasters since 1900 attributable to flood, hail, bushfire or tropical cyclones.

However, excluding the case of improved construction standards for residential homes in cyclone-prone areas of the country, successive Australian governments, at all levels, have failed to address shortcomings of land policy solutions to mitigate against the potential impacts of natural disasters.

In the emotive days after the Black Saturday fires of 2009, the then Australian Prime Minister, Kevin Rudd, stated that we would rebuild impacted communities "brick by brick."

There was no immediate consideration about reducing risks. Let's hope the official reaction to these floods is more measured.

Energy Efficiency Misconceptions

Over at The Breakthrough blog Harry Saunders has an interesting and provocative post up about energy efficiency and the so-called rebound effect, in which he critiques six popular misconceptions held in policy debates over energy.

The misconceptions are:
Misconception #1: End use energy consumption is all that matters
Misconception #2: Rebound must be small because we can't re-spend all energy efficiency savings on energy
Misconception #3: The Income/output component of rebound must be small
Misconception #4: Energy is being "decoupled" from the economy
Misconception #5: Macro rebound effects are discernible by considering micro effects
Misconception #6: Efficiency gains happen only in energy
The post is worth reading in full.  As is the case with many important issues, the questions surrounding efficiency gains and their impacts on energy consumption are, in Saunders' words, "both subtle and complex."  Saunders concludes:
And they create some thorny issues for policy makers that will need to be addressed. Most importantly, energy analysts must cease utilizing a far-too-simple assumption that efficiency gains yield direct and linear reductions in energy use. The complex and varied economic phenomena known collectively as "rebound effects" mean that we cannot expect that improving the energy efficiency of steel production by 30 percent, for example, will yield a simple and direct 30 percent reduction in the energy consumed by the steel sector, let alone the economy as whole. Just as economists expect that gains in labor productivity will contribute to greater employment overall, not less, gains in energy productivity (aka energy efficiency) are not likely to be taken up simply as direct reductions in energy demand overall.
Have a look and feel free to come back here to discuss.

22 January 2011

FT Column on Disasters and Climate Change

Writing in today's FT, Simon Kuper has a great essay on disasters and climate change that draws on some of my work.  Here is the bottom line:
When it comes to preventing today’s disasters, the squabble about climate change is just a distraction. The media usually has room for only one environmental argument: is climate change happening? This pits virtually all climate scientists against a band of self-taught freelance sceptics, many of whom think the “global warming hoax” is a ruse got up by 1960s radicals as a trick to bring in socialism. (I know, I get the sceptics’ e-mails.) Sometimes in this squabble, climate scientists are tempted to overstate their case, and to say that the latest disaster proves that the climate is changing. This is bad science. It also gives the sceptics something dubious to attack. Better to ignore the sceptics, and have more useful debates about disasters and climate change – which, for now, are two separate problems.
Read the whole thing.

20 January 2011

Societal Change Happens Fast

Shanghai 1990 vs. 2010.  (h/t Lowy Interpreter)

Corruption Kills

Nicholas Ambraseys and Roger Bilham (a colleague here at Colorado) have a paper in Nature this week on the relationship between an index of corruption and earthquake fatalities.  Their analysis offers some quantitative support for what has long been assumed to be the case:
In sum, there is statistical support for widespread anecdotal evidence of a correlation between corruption and loss of life in earthquakes. Haiti and Iran are extreme examples of nations where fatalities from earthquakes are excessive and where perceived levels of corruption are above average. The statistics also support last year's widely voiced opinions that the probability of earthquake-related deaths is less a function of geography and more the ability to afford earthquake-resistant construction and to enforce building codes.

Sadly, these figures have no predictive value. Moreover, even if corrupt practices were eliminated, many present-day impoverished nations will have inherited a building stock that to some degree incorporates the products of corrupt practices. The problem of what to do about these existing poorly built constructions is particularly difficult, if not economically insoluble.
Their analysis suggests that direct aid to foreign countries for the purposes of rebuilding may not be effective:
But our analyses suggest that international and national funds set aside for earthquake resistance in countries where corruption is endemic are especially prone to being siphoned off. The structural integrity of a building is no stronger than the social integrity of the builder, and each nation has a responsibility to its citizens to ensure adequate inspection. In particular, nations with a history of significant earthquakes and known corruption issues should stand reminded that an unregulated construction industry is a potential killer.
Here is the caption that goes with the figure at the top of this post:
Corruption versus the level of corruption that might be expected from per capita income. Of all earthquake fatalities attributable to building collapse in the past three decades, 82.6% occur in societies that are anomalously corrupt (left-hand corner of the plot).

Budget Scale

Over the next few weeks in my graduate seminar we'll be discussing the mathematics of government budgeting, with a focus on the US budget deficit. 

A first task will be to understand the magnitudes of large numbers.  This website has some very useful graphics.  Here is what one million dollars looks like in $100 bills:

Here is what one billion dollars looks like:
And here is one trillion dollars (the dude in the red shirt can be seen standing next to the corner in the lower left!):
The US federal budget deficit in 2010 is estimated to be $1.6 trillion in a budget of about $4 trillion (source, XLS).

19 January 2011

Hurricane Damage Risk and Predictions

Karen Clark and Co. have released an updated report (here in PDF, the table shown above comes from the report) evaluating the performance of short-term predictions of US hurricane damage from catastrophe modeling firms (h/t ClimateWire).  Clark, one of the founders of the cat modeling industry, finds that the near-term cat model predictions have come up short:
We have now completed the first five-year near term hurricane model projected period, and actual insured loss experience has been well below the level of the model predictions. Four of the past five years have had minimal insured property loss from Atlantic tropical cyclones, well below both the long term average and the (much higher) near term projections. To date, the catastrophe models have not demonstrated any skill in projecting near term hurricane losses.

The first decade of the 21st century was equal to the long term average with respect to hurricane landfall frequency and loss experience. This average decade was preceded by several decades of below average activity. This means that even the long term standard hurricane model activity rates are higher than what has been experienced since the models were first introduced in the 1980’s.
These conclusions reinforce the results of my analysis of  short-term landfall and damage projections (PDF).  The figure below comes from that paper and shows the historical record of US hurricane landfalls from 1851-2008.
There has been no upward trend in hurricane landfalls or damage over the period of available data.

Looking globally at all hurricanes and tropical cyclones, there are no upwards trends in frequency (top graph below, from R. Maue) or intensity (bottom graph below, from R. Maue) over the period of record.

To some degree the issue of short-term hurricane risk has become tangled in the debate over human-caused climate change and its long-term effects on hurricane behavior.  These issues should not be conflated for the simple reason that even taking predictions of a significant human influence on hurricane behavior as a given, it will be many, many decades before that signal can be seen in the damage record.  It is simply logical that a signal that cannot be seen for decades is not immediately relevant to judgments of near-term risk.

Make no mistake, even in the absence of a demonstrable signal from human-caused climate change there will be large losses from hurricanes in coming years, and there will be extended periods of above average losses.  In 2005 I warned of the possibility of a $500 billion hurricane by 2020.  Unfortunately, we cannot skillfully predict hurricane landfalls or damage on short time scales.  The most recent five-year period underscores this reality.

In the context of the very large variability in hurricane behavior, the more important questions are thus not about climate change, despite the moth-to-a-flame-like attraction of that topic. Rather, the issue is one of risk management in the face of uncertainty and ignorance.

Insurance and reinsurance are not supposed to operate like games of chance; rather, they are supposed to take chance out of the equation.  Rather than trying to do the impossible -- demonstrate predictive skill in hurricane losses on short timescales -- the industry might instead start thinking creatively about approaches to insurance and reinsurance that make the industry more robust, and less like a casino.

Here is the advice that I concluded my 2009 paper (PDF) with:
So what might a decision maker concerned about hurricane landfalls or damage over the next one to five years actually do?

The recommendation here is to start with the historical data as a starting point for judging the likelihood of future events and their impacts. Figure 6 shows the frequency of landfalling hurricanes per year for the period 1851–2008 (other time periods are shown in Table 2, and decision makers may wish to use a record that starts in 1900 for data quality reasons). Similarly, Figure 7 shows the same data but for running five-year periods from 1851 to 2008.

A decision maker may have reasons to hedge his or her views of these distributions in one way or another, and (s)he will certainly be able to find a scientific justification for whatever hedge (s)he prefers (see Murphy, 1978).  However, it is important to recognize that any decision to adjust expectations away from those in the historical record represents a hedge. Reasons for hedging might include risk aversion or risk-seeking behaviour, a gut feeling, trust in a subset of the expert community, a need to justify decisions made for other reasons and so on. But at present, there is no single, shared scientific justification for altering expectations away from the historical record. There are instead many scientific justifications pointing in different directions.

Starting with the historical record allows for a clear and unambiguous identification of hedging strategies and justifications for them. An ability to distinguish between judgements that can be made based on empirical analysis and those that are based on speculation or selectivity is an important factor in using science in decision making. Such a distinction can also help to identify the role that financial or other interests play in the choice of relevant science in a particular decision process.

Given that the climate system is known to be non-stationary on various timescales, there are of course good reasons to expect that uncertainties may be larger than the variability observed in the past, given that the climate system can assume modes of behaviour not observed over the past century and a half. Each decision maker should carefully evaluate how unknown unknowns might influence their judgements. In addition to decision making under conditions of uncertainty, decision makers need also to make judgements under conditions of ignorance, where uncertainties cannot be known with certainty.

Decision makers will continue to make bets on the future and, just like in a casino, some bets will prove winners and some will be losers. But over the long term those who do the best in the business of decision making related to hurricane landfalls and their impacts will be those who best match their decisions to what can and cannot be known about the uncertain future. And such wisdom starts with understanding the historical record and why the scientific community cannot produce skilful forecasts of future landfalls and damage for the foreseeable future.

Pielke, Jr., R.A. (2009), United States hurricane landfalls and damages: Can one- to five-year predictions beat climatology? Environmental Hazards 8:187-200, doi: 10.3763/ehaz.2009.0017
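
One way to make the "start with climatology" advice above concrete: tabulate the historical annual landfall counts into empirical distributions of one-year and running five-year totals, and treat any departure from those frequencies as an explicit, documented hedge. Here is a minimal Python sketch; the counts are placeholders, not the actual 1851-2008 record.

```python
# Build an empirical climatological baseline for annual and multi-year
# hurricane landfall counts. The counts below are hypothetical placeholders.
from collections import Counter

annual_landfalls = [1, 0, 3, 2, 1, 0, 2, 4, 1, 2]  # placeholder record

def empirical_pmf(counts):
    """Relative frequency of each count in the record."""
    n = len(counts)
    return {k: v / n for k, v in sorted(Counter(counts).items())}

def running_totals(counts, window=5):
    """Totals over running multi-year windows (cf. the five-year periods)."""
    return [sum(counts[i:i + window]) for i in range(len(counts) - window + 1)]

print(empirical_pmf(annual_landfalls))                  # one-year baseline
print(empirical_pmf(running_totals(annual_landfalls)))  # five-year baseline
```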

18 January 2011

Large Balls

[Image from The Australian, see the original here in PDF]

Wivenhoe Dam near Brisbane, Australia is at the center of controversy over its role in the recent flood. The dam, as is commonly the case, is expected to serve two seemingly contradictory functions.  On the one hand it is a buffer against drought, meaning that it is desirable to keep it more full in the eventuality of low precipitation.  On the other hand, the dam is a buffer against floods, meaning that it is desirable to keep it more empty in the eventuality of heavy precipitation.  Since the reservoir cannot be kept both full and empty at the same time, it is necessary to balance these objectives.  And since future precipitation is uncertain, the dam's management is thus a matter of decision making under uncertainty (where risks are known) and ignorance (where they are not).

The Queensland government has initiated a high-level investigation of the dam's management during the flood.  It will focus on the decisions related to storage and release.  According to The Australian, publicly available evidence points to the dam management as a key factor in the magnitude of the Brisbane flood:
More than 80 per cent of the flood in the Brisbane River at its peak last Thursday was the direct result of the release from Wivenhoe, the city's flood shield, of up to 30 per cent of its capacity, according to official data obtained by The Australian. The data shows that, without the unprecedented and massive release at a peak rate of 645,000 megalitres a day from the dam on Tuesday, January 11, the flooding in Brisbane would have been minimal.
Andrew Dragun, an adjunct professor in economics at the Australian Rivers Institute, Griffith University, and editor of the International Journal of Water, critiqued the management of Wivenhoe as follows:
In the days before the flood, the BOM warned of an upper level low pressure system dumping a large amount of rain over southeast Queensland. The warning came late in the first week of January and was visible on the BOM interactive weather and wave forecast maps. Ironically, the system slowed off the coast for a few days, giving operators plenty of time to make any adjustments to capacity levels. But the gates remained shut on the 100 per cent capacity.

As the low system dumped rain, the operator opened the gates, releasing about 116,000 megalitres on Friday-Saturday, with releases of 100,000ML over the next two days. Despite these releases the dam level rose to 148 per cent by Monday last week.

By Tuesday, with the dam at 176 per cent, the operator released a phenomenal 645,000ML. The result was bound to be bad. Significant flooding and tears, all while Dennien expressed his "confidence . . . that everything happened the right way".

Consequently, the flood peaked in Brisbane on Wednesday at 4.46m. However, the inflows from the catchment were surging, and the dam reached a capacity of 191 per cent.

Anna Bligh admitted that the operators nearly lost control, with water only 90cm from spillway fuse plugs. If triggered, the plugs would have released a torrent of water to the system. The results could have been catastrophic.
Why didn't the dam operators empty the reservoir further in the days and weeks before the storm?

There appear to be at least two important  reasons. One has to do with the psychology of decision making:
Retired engineer Ian Chalmers, a key project supervisor in the construction of Wivenhoe Dam between 1977-85, defended the decisions of the operators in the past week, adding they will do a better job next time.

"These questions are all valid, but put it this way - you would have to have very large balls to [significantly reduce the dam's volumes in the months after the weather warnings] after 10 years of drought, because if you had got it wrong you would be accused of wasting the water," Chalmers said.
In other words, after 10 years of drought it would have been very easy to see a reservoir full of water not so much as a flood risk but instead as a buffer against the risk of drought.  As well, imagine the outcry if the flood had not occurred and, in the next prolonged dry spell, observers pointed back to all the water that had been allowed to flow downstream.  Chalmers is suggesting that it is understandable that the dam's operators erred on the side of protecting against drought rather than floods.

Of course, this gets us back to the dual purposes of the dam, and the need to trade off risks of drought against risks of floods.  The only way to make that trade-off go away is to have enough storage capacity to serve both purposes at once.

The second reason for not emptying the dam is that, according to Professor Dragun, the dam operators did not consider the state of the ENSO cycle, which has profound impacts on the Australian climate:
What have the Wivenhoe Dam operators been doing for the past couple of months? According to SEQ Water Grid chief Barry Dennien, dam levels were managed according to the rules and strictly by the operating manual. Dennien is comfortable that "everything happened the right way".

It seems the manual and the operator do not differentiate between the weather outlook of an El Nino (dry drought) and a La Nina (rain, flooding). After the drought, Wivenhoe reached 96 per cent of its supply capacity on March 16, 2010, and has been maintained at that level or higher since.
If it is in fact the case that Wivenhoe is managed without regard to ENSO, then this would be a case of decision making under willful ignorance, rather than decision making under uncertainty, as the ENSO signal is extremely strong in Australia and has a demonstrable influence on the probabilities of extreme (high and low) precipitation.  Thus, when Wivenhoe Dam operators say that they did everything by the book, they may indeed be correct, but at the same time "the book" may have led them astray.

While the Queensland flood inquiry will focus on hydrology and the dam's management, there are deeper issues here of decision making under uncertainty and ignorance, and of how such decisions should be made in the future.

17 January 2011

How Big is China?

One of the themes of my graduate seminar this semester on quantitative methods of policy analysis is to develop tools that enable more intuitive understandings of the numbers that we encounter in policy analysis.  As the course goes along I'll be posting up examples of efforts to achieve such understandings, as I did last week.

Here is another excellent example of an effort to create an intuitive understanding of something that we hear about every day, but most of us hardly understand -- China.  The following comparison comes courtesy of Thomas Barnett via James Fallows. How big is China?  Fallows explains:
If Americans wanted to imagine what it would take to be "strong" in the way China currently is, [Barnett] said, all we'd have to do is think of moving the entire population of the Western Hemisphere into our existing borders. Every single Mexican. (Rather than enforcing the southern border, we'd require everyone to cross it, headed north.) Every Haitian, Cuban, and Jamaican. Everyone from Central America. All 190 million from Brazil. And so on. Even the Canadians. China, by the way, is just about the same size as the United States, though a larger share of its land area is desert, mountain, or otherwise nonarable.

If we did that, we'd be up to about a billion people -- and then if we also took every single person from Nigeria, and for good measure everyone in hyper-crowded Japan too, we'd finally be up to China's 1.3 billion size. At that point, like China, we'd have tremendous scale in everything. Rich people. Big businesses. A huge work force. Countless numbers of multi-million population cities. And we would also have a tremendous amount of poverty, plus pressure on resources of every kind, from water to food to living space. Just as China does now. Scale gives China some strengths. But it also creates tremendous challenges, as Americans would recognize if we thought about this prospect for even a minute. Seriously, reflect on this, and consider that it is China's reality now.
I for one now think differently about the size of China than I did before reading this comparison.  

14 January 2011

Another Deal for Joe Romm

Joe Romm shows up in the comments of an earlier thread and makes a request of me:
Now it's time for you to concede that R&D alone can't possible deliver on the massive scale in the timeframe needed to achieve the stabiliation at around 450 ppm CO2 that we both agree on.
Joe obviously hasn't done his homework, so I have, once again, offered him a deal:
While you are free to define a "wedge" however you like, you are not free to assign to me views that I do not hold.

I am happy to concede that "R&D alone" cannot result in low stabilization goals (in the same way I will concede to you that the earth is not flat, ManU is thus far undefeated this season and 2+2 = 4).

In fact, if you actually read what I have written, you'd already know that (according to Brad DeLong you review books without reading them, tsk tsk).

So let me extend an offer -- I will send you a free copy of The Climate Fix. In return you will agree to read it and write up a review which I will post here on my blog, unedited. Be as critical as you like, but don't make things up.

Until you do so, you then agree to stop mischaracterizing my views simply because you do not know what they are.

If these terms are unfair, then just explain. Deal?

13 January 2011

The Economies of US States


The Economist has this creative graph, showing how the US states compare to various countries in terms of GDP (and population).

Colorado is compared to Thailand, but it could have also been Portugal.  A big difference in comparing US states to other countries is that Colorado, like virtually every other state, has to operate under a mandated balanced budget.  So even though Portugal and Colorado have similarly sized economies, Portugal's economy matters far more in the global economy than does Colorado's.

Effective media reporting of sea level rise projections: 1989–2009

[UPDATE 1/19: Nature Climate Change selects this paper as a "Research Highlight".]

We have a new, peer-reviewed paper just out on media coverage of climate change, specifically sea level rise to 2100. We find that overall the major print media in the US and UK has done a nice job reporting on this topic.  This post describes our paper and its findings.  The image above comes from the paper and shows (a) media reports of predicted sea level rise to 2100, (b) IPCC projections of sea level rise to 2100, and (c) projections of sea level rise to 2100 found in the peer-reviewed literature.

The print media is often the subject of criticism for its coverage of climate change.  The criticism usually occurs in the context of a high-profile article that this or that person happens to disagree with.  Since there are varied agendas and perspectives on climate change it is virtually certain that someone in the climate debate is not going to like pretty much any article, leading to a steady chorus of criticism.

This has led my colleague Tom Yulsman here at the University of Colorado to comment:
[D]uring this past year, environmental journalists have been the subject of lots of criticism, often vituperative, from both sides in the climate change wars.

If you read any number of partisan climate bloggers who claim to carry the torch of scientific truth, we’re mostly stupid, we’re hopelessly biased, we’re carrying water for warmist scientists, or we’re stenographers who copy down whatever the denialists have to say because we’re too dumb to know what false balance is.

It might be tempting to conclude that since we’re catching hell from both sides, on balance we’re probably getting it about right. But I think the topic is too overwhelmingly complex, and there are too many people covering the issue in myriad ways (daily reporters, magazine writers, bloggers, documentarians, even formerly ink-stained-wretch academics like me), to make such a sweeping generalization.
Tom is right -- one can be led astray by relying on anecdotal impressions to assess the quality of reporting on any topic.  So to get a better understanding of media coverage of climate change we decided to investigate the issue quantitatively.

Led by our former post-doc Ursula Rick, I along with Max Boykoff asked a straightforward question: How well did the print media represent scientific predictions or projections of sea level rise to 2100?  We picked sea level rise to 2100 because it is so often used and it is also an objective measure.  To conduct our analysis we looked at seven major newspapers in the US and the UK (New York Times, Washington Post, Los Angeles Times, Financial Times, The Times (London), The Guardian and The Telegraph).

We found that the major print media in the US and UK, with a few exceptions, was generally successful in its reporting, and concluded in our paper:
The numbers and ranges reported suggest, in aggregate, reporting on sea level rise among the sources that we have examined has been consistent with scientific literature on the issue.

U K Rick et al 2011 Environ. Res. Lett. 6 014004 doi: 10.1088/1748-9326/6/1/014004
You can read our full analysis here. Comments welcomed!

12 January 2011

Brisbane Floods in Historical Context

The graph above shows flood peaks in Brisbane from 1841 to 2011.  The data comes from the Australian Bureau of Meteorology, which published this chart in November 2010.  I added the red bar showing the peak of the current flood at the same location -- at about 4.5 m -- found at this site at the Bureau of Meteorology.

Here is how a BoM report (PDF) following the 1974 flood described the historical context, and offered a prescient warning (emphases added):
Prior to 1900 flooding occurred quite frequently at 1 to 8 year intervals and in one year (1893) four separate floods were recorded. Since 1900 flood rainfall has been much less frequent and the interval between floods has become much longer. Furthermore, dredging and other changes to the hydraulic character of the channel, together with the effect of Somerset Dam have reduced most floods in Brisbane in recent years and have eliminated the smaller floods. . .

. . . flooding is most common in the usual wet season months of January, February and March, and floods are rare from July to December.

The earliest flood recorded was in 1841. Its exact height is uncertain but it was said to be the highest flood known at that time. In 1857 (flood peak 4.42 m) a good deal of land, now the prestige suburb of St Lucia, but then a dense vine scrub, was submerged and in 1864 (peak 4.92 m) flood waters extended from the junction of Oxley Creek and the Brisbane River to the high land at the back of Coopers Plains, a distance of about 11 km. In the 1867 flood the original wooden bridge at the site of the Victoria Bridge was destroyed, and in January 1887 (peak 4.92 m) Bowen Bridge was washed away.

Three floods occurred during February 1893. During the first (peak 9.51 m) the ship Elamang and the gunboat Paluma were carried into and left aground in the Brisbane Botanical Gardens, and the ship Natone was stranded on the Eagle Farm flats. The Indooroopilly railway bridge and the north end of the old Victoria Bridge were washed away. Nine days later a second minor flood was experienced which attained a height of only 3.29 m. However, a week after that there was another major flood (peak 9.24 m) which carried the stranded Elamang, Paluma and Natone back into the Brisbane River!

Prior to January 1974 no flood this century had exceeded 4.5 m at the Brisbane Port Office. The last river flood of any consequence occurred in 1931 (peak 4.48 m), although in recent years there have been several severe floods in the Brisbane metropolitan creeks (in June 1967 and February and April 1972).

Because of changes in the physical characteristics of the river and its catchment, it is very difficult to calculate return periods for flooding in Brisbane. However, four floods well in excess of the 1974 levels have occurred in the past 133 years and, according to the Professor of Economic Geology at the University of Queensland (Professor Sergent), there is geological evidence of water levels 5.5 m higher than the 1974 flood in the Indooroopilly area of Brisbane.

Meteorological studies suggest that rainfalls well in excess of those recorded in the floods of 1893 and 1974 are possible. Therefore it seems certain that unless major flood mitigation schemes, such as the proposed Wivenhoe Dam, are implemented, floods even greater than those of 1974 will again be experienced in Brisbane.

One-Year Anniversary of the Haiti Earthquake

The graph above comes from my colleague Roger Bilham here at the University of Colorado.  It shows historical earthquakes as a scatterplot of deaths and magnitude.  Highlighted on the graph are the 2010 Haiti and Chile earthquakes.  The Chile earthquake was more than 500 times more powerful than the Haiti quake, but the loss of life in Haiti was some 200 times greater.

Roger distilled a lesson of the Haiti quake soon after in Nature (PDF):
The future global burden of local earthquakes could be significantly reduced if minimal construction guidelines were mandated in all the world’s cities, and especially in those with a history of previous earthquakes. The projected doubling in world population means that we are constructing more buildings now than at any time in our history10,11. In recent earthquakes, buildings have acted as weapons of mass destruction. It is time to formulate plans for a new United Nations mission — teams of inspectors to ensure that people do not construct buildings designed to kill their occupants.
A more comprehensive analysis can be found in Roger's paper on "The Seismic Future of Cities" (PDF).

One year later the disaster is still unfolding.  Doctors Without Borders/Médecins Sans Frontières (MSF) reports:
One year after a devastating earthquake killed an estimated 222,000 people and left 1.5 million people homeless, Haitians continue to endure appalling living conditions amid a nationwide cholera outbreak, despite the largest humanitarian aid deployment in the world, said the international medical humanitarian organization Doctors Without Borders/Médecins Sans Frontières (MSF).

While overall access to basic healthcare has improved since the earthquake, the rapid spread of cholera across the country underscores the limits of the international aid system in responding effectively to new emergencies. International agencies must live up to the commitments made to the Haitian people and to donors by turning promises into more concrete actions, said MSF.

Urgent humanitarian needs must be met while long-term reconstruction plans are pursued. The overall health of the population and the ability to contain the risk of disease outbreaks depend on improving water and sanitation and ensuring that the one million people still living in tents have access to sufficient transitional shelter.
You can see the full MSF report here in PDF and an MSF video report from Haiti below.  And of course, you can donate in support of their good work here.

11 January 2011

Joe Romm Finally Gets His Math Right

It has taken two years, but Joe Romm finally appreciates the true mathematical scale of the energy technology challenge implied by a goal of stabilizing carbon dioxide concentrations at a low level. 

Joe writes that we need to achieve 12-14 "wedges" of carbon-free energy.  What does a "wedge" imply to Joe?
. . . to do this [one wedge] by 2050 would require adding globally, an average of 17 [nuclear] plants each year, while building an average of 9 plants a year to replace those that will be retired, for a total of one nuclear plant every two weeks for four decades — plus 10 Yucca Mountains to store the waste.
If one wedge implies a need for 26 nuclear plants per year, then 14 wedges implies 26 * 14 = 364 plants per year, or the equivalent effort of one nuclear power plant per day from now until 2050.  Obviously, different assumptions would put that number a bit higher or lower.  And the use of nuclear plants here is simply to illustrate the scale of the challenge, not to propose or suggest that this is even remotely possible or desirable.
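The arithmetic behind that is nothing more than multiplication, but it is worth laying out explicitly.  A minimal sketch in Python, using only the per-wedge figures quoted from Joe's post above:

# Scale of the "wedge" challenge, using the per-wedge figures quoted above.
new_plants_per_year = 17          # new nuclear plants per year for one wedge by 2050
replacement_plants_per_year = 9   # plants built to replace retiring ones
plants_per_wedge = new_plants_per_year + replacement_plants_per_year   # 26 per year

wedges = 14                       # upper end of the 12-14 wedge estimate
plants_per_year = plants_per_wedge * wedges    # 364 per year
plants_per_day = plants_per_year / 365

print(f"{plants_per_year} plants per year, about {plants_per_day:.2f} per day until 2050")
# 364 plants per year works out to roughly one new nuclear plant per day.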

Joe's conclusion is just about the same conclusion that you'll find on p. 116 of The Climate Fix.  Nice work, Joe!

Crompton et al. 2011

The paper that I discussed here last week on the detection of a signal of human-caused climate change in the hurricane loss record has now been published in Environmental Research Letters.  Here are direct links to the PDF and the Supplementary Information (PDF).

Here is the citation: Ryan P. Crompton et al. 2011, Environ. Res. Lett. 6, 014003, doi:10.1088/1748-9326/6/1/014003

Comments and questions welcomed -- Enjoy!

10 January 2011

More Deconstruction of “Science Integrity”: The President’s Memo Principles

[THIS IS A GUEST POST FROM A REAL LIVE US GOVERNMENT SCIENTIST, SHARON FRIEDMAN. HER VIEWS EXPRESSED HERE ARE HER OWN. SHARON BLOGS AT A NEW CENTURY OF FOREST PLANNING. THIS POST CONTINUES A DISCUSSION OF PRESIDENT OBAMA'S 2009 SCIENTIFIC INTEGRITY MEMO FIRST POSTED HERE.  

NOTE: THE WORD CLOUD ABOVE IS OF THE PRESIDENT'S MARCH, 2009 MEMO]

Some of my questions below have been clarified in the recently published guidelines; the next post in this series will be deconstructing the proposed guidelines themselves.
(7) By this memorandum, I assign to the Director of the Office of Science and Technology Policy (Director) the responsibility for ensuring the highest level of integrity in all aspects of the executive branch's involvement with scientific and technological processes.
Again, I am not sure if the term “integrity” is used in its sense as “wholeness” or its sense as “moral.” Certainly it is a word that is hard to be against, because it generally means “good.” But note that this is the “executive branch’s involvement with scientific and technological processes.”

What exactly are “scientific and technological processes”? Let’s list them. First, there is determining the science and technology budget, and what goes to which agencies for what kind of research. Then there is the process of determining what is studied, which disciplines are included, how the questions are generally framed, and running the panels to determine who gets the grants. Finally, there is checking on the validity of the scientific products through peer review and other processes.

But again, I thought the problem that initiated all this work was about how science is used in policy processes, not about how “scientific processes” themselves are run. If we are going to “let the punishment fit the crime,” it seems the first step is clarifying the crime.

The principles (8):
(a) The selection and retention of candidates for science and technology positions in the executive branch should be based on the candidate's knowledge, credentials, experience, and integrity;

In addition to my other questions in the previous post about “what kind of integrity,” it’s not clear to me what a “science and technology position” is. There are many, many positions throughout the agencies that require a technical background of some kind. Are “science and technology” positions those related to administering and conducting research only, or to application of different scientific fields to the mission of the agencies? Where do you draw the line, or do you need to?

(b) Each agency should have appropriate rules and procedures to ensure the integrity of the scientific process within the agency;
See (7).

(c) When scientific or technological information is considered in policy decisions, the information should be subject to well-established scientific processes, including peer review where appropriate, and each agency should appropriately and accurately reflect that information in complying with and applying relevant statutory standards;
This statement sounds to me vaguely like the old Data Quality Act, which also required small armies of GS-14s and 15s in D.C. to write guidelines for each agency. The purpose of the Data Quality Act was to "provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies". Here we see the word “integrity” again, but used in a different way.
In the Wikipedia entry on the DQA we find this quote:
The DQA has been criticized by the scientific community and journalists as a ploy of corporations and their supporters to suppress the release of government reports contrary to their economic interests.

"As subsequently interpreted by the Bush administration . . . the so-called Data Quality Act creates an unprecedented and cumbersome process by which government agencies must field complaints over the data, studies, and reports they release to the public. It is a science abuser's dream come true" (Chris Mooney, The Republican War on Science [New York: Basic Books, 2005], p. 103).
I remember that when we were working on the DQA responses within agencies, we also attempted to deal with the question David Bruggeman raised in his comment here:
“Much research utilized to inform policy will have been conducted with no knowledge of how to inform policy or intent to inform policy. Can we realistically expect the kind of peer review (essentially a second round of it) and QA/QC that you would like to see, as it would have to be done retroactively?”
The problem remains the same as when we were dealing with the DQA. If you have quality considerations for the use of science in policy, that would restrict the science used to research that had been designed to be used in policy. I suppose instead you could have two separate classes of scientific information, “designed to be used for policy” and “other,” with different guidelines for the use of each class.

I also question the words “appropriately” and “accurately.” I think anyone in the trenches of dealing with this recognizes that using any piece of research “appropriately” and “accurately” is a value judgment. I think it’s a great science education opportunity to have public discussion about what is appropriate and accurate for any piece of scientific information with regard to a particular piece of policy. Yet the principle as articulated here seems to assume that there is one right answer. This is the linear model of research to policy, which, of course, scientists who study science policy have shown to be incorrect. At least I think that is an “appropriate” and “accurate” reflection of the available information ;).
(d) Except for information that is properly restricted from disclosure under procedures established in accordance with statute, regulation, Executive Order, or Presidential Memorandum, each agency should make available to the public the scientific or technological findings or conclusions considered or relied on in policy decisions.
This is pretty much the fourth principle of my four science-to-policy principles found below.
(e) Each agency should have in place procedures to identify and address instances in which the scientific process or the integrity of scientific and technological information may be compromised.
What does it mean to “compromise the scientific process”? Hopefully it does not mean “reducing funding for research.” Does it mean selecting only your buddies for reviews or review panels? And again, you could argue that “the integrity of scientific and technological information” was also sought by the DQA; what is the difference? Have we learned something from agency efforts under the DQA? Also, I don’t see why each agency needs to develop these separately (unless it’s part of a larger jobs program for mid-level guideline writers ;)). Can you imagine the potential train wreck when a regulatory agency, regulating a land management agency, operates from a different worldview with different procedures? And there’s the possibility that a research agency, a regulatory agency, and a management agency could all have different approaches to determining the integrity of the research agency’s research. In my view, clarity is not achieved by asking many agencies to do something fuzzy.
(f) Each agency should adopt such additional procedures, including any appropriate whistleblower protections, as are necessary to ensure the integrity of scientific and technological information and processes on which the agency relies in its decision making or otherwise uses or prepares.
Again, does this make sense to do separately by agency? Many policy decisions use research derived from different sources and agencies. In fact, many use research developed outside the federal government. Should we expose that work to a series of quality reviews before we use it? (This was a discussion point on the DQA.) Do these outside sources need to have their whistleblower procedures in place?

Summary of Memo:

1. What if we were to apply the ideas espoused in the memo to the promulgation of the memo (as it is policy) itself? We might expect a section describing how the work of noted science policy experts was used in the development of the memo, with peer-reviewed citations. I’d expect to see Jasanoff, Sarewitz and Pielke, Jr., at least, cited.

2. Here are my four principles for improving the use of information in policy: (1) joint framing and design of research with policymakers, (2) explicit consideration of the relevance of practitioner and other forms of knowledge, (3) quality measures for scientific information (including QA/QC, data integrity, and peer and practitioner review), and (4) transparency and openness of review of any information considered and its application to policy.

3. If the DQA and the “Integrity” work are seen as the result of inchoate longings by many for an improved “science to policy” process, and if each seems to have become, instead, a weapon to slime the opposing political party, then why not establish a bipartisan commission on improving the use of scientific and technical information in policy? Science policy experts would advise the commission, and the deliberations would be transparent and open to public comment. The terrain to be explored would include my four principles above, plus consideration of involving citizens more directly in working with the relevant Congressional committees to develop federal research budgets and priorities.

07 January 2011

Time to Short Cat Bonds?

Today's NYT reports that, while still a niche market, catastrophe bonds are attracting increasing attention as investors seek to diversify their risk.
Catastrophe bonds and other insurance-related securities have no correlation to the broader markets. In 2008, the Swiss Re catastrophe bond index rose 2.3 percent, compared with a loss of 38 percent in the Standard & Poor’s 500-stock index. The catastrophe bond index returned 10.5 percent from 2007 through 2010, compared with an 11 percent fall in the S.&P.

“The fact that we demonstrated positive returns has been very helpful in getting new funds into the space,” said Paul Schultz, president of Aon Benfield’s investment banking division, which packages and sells catastrophe bonds. “We have more investors participating in this asset class than ever before. There’s a growing following and growing level of interest.”
Take a second look at the first paragraph above -- cat bonds outperformed the S&P 500 from 2007-2010.  What should we take from that?  Not much, other than the randomness of extreme events.

The graph at the top of this post shows the number of days between major US hurricane landfalls, that is, storms of Category 3 or stronger, which are responsible for more than 85% of historical normalized damage.  The data come from the ICAT Damage Estimator.  The red line shows the (non)trend in this metric over the past 111 years.  There were 42 such storms in the first half of the record (1900-1954) and 37 in the second half (1955-2010).

What is interesting is that we are presently in the midst of the third longest period since 1915 during which a major hurricane has not hit the US (over the 1900-2011 record).  It will become the second longest streak if a major storm does not strike before August 31, 2011.
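For readers who want to redo this streak accounting themselves, the computation is just date differences and a sort.  A minimal sketch in Python, with clearly hypothetical placeholder dates standing in for the actual Category 3+ landfall record (which can be pulled from the ICAT Damage Estimator):

from datetime import date

# Placeholder entries only, NOT the real record -- substitute the actual
# Category 3+ US landfall dates to reproduce the graph and the streak ranking.
landfall_dates = [
    date(2003, 9, 18),
    date(2004, 8, 13),
    date(2005, 10, 24),
]

# Days between consecutive major-hurricane landfalls (the metric plotted above).
gaps = [(later - earlier).days
        for earlier, later in zip(landfall_dates, landfall_dates[1:])]

# The current, still-open streak: days since the most recent major landfall.
current_gap = (date(2011, 1, 7) - landfall_dates[-1]).days

# Rank the open streak against all completed gaps in the record.
rank = 1 + sum(g > current_gap for g in gaps)
print(f"Current streak: {current_gap} days, ranked #{rank} among gaps in the record")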

The fact that cat bonds outperformed from 2007-2010 in all likelihood reflects the present drought in major hurricane landfalls, a streak that will have to end sometime and perhaps in spectacular fashion.  Just as the market overreacted in 2005 after a single season that saw 4 major hurricane strikes, the market today risks overreacting in the other direction.  I don't run a hedge fund, but if I did, I'd be taking a good hard look at cat bonds to see if I could gain some leverage using that tried and true formula for success -- regression to the mean.
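One way to put some numbers on "a streak that will have to end sometime": if major US landfalls are treated, very roughly, as a Poisson process at the long-run rate implied by the record above (79 Category 3+ landfalls over 1900-2010), then multi-year droughts are not especially improbable, and they say nothing about the underlying rate.  A minimal sketch under that simplifying assumption (it ignores seasonality and any clustering of storms):

import math

landfalls = 42 + 37        # Category 3+ US landfalls, 1900-1954 plus 1955-2010
years = 111                # 1900 through 2010
rate = landfalls / years   # about 0.71 major landfalls per year

# Under a Poisson model, the chance that a gap exceeds t years is exp(-rate * t).
for t in (2, 3, 5, 6):
    p = math.exp(-rate * t)
    print(f"P(gap > {t} years) = {p:.3f}")
# A several-year drought is unremarkable under the long-run rate, which is the
# regression-to-the-mean point: the recent quiet says little about future losses.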