Thursday, March 15, 2012

Nuclear reality check

The Economist muses over the impact of Fukushima and the nuclear big picture:
And if the blow is harder than the previous one, the recipient is less robust than it once was. In liberalised energy markets, building nuclear power plants is no longer a commercially feasible option: they are simply too expensive. Existing reactors can be run very profitably; their capacity can be upgraded and their lives extended. But forecast reductions in the capital costs of new reactors in America and Europe have failed to materialise and construction periods have lengthened. Nobody will now build one without some form of subsidy to finance it or a promise of a favourable deal for selling the electricity. And at the same time as the cost of new nuclear plants has become prohibitive in much of the world, worries about the dark side of nuclear power are resurgent, thanks to what is happening in Iran.
After reading Burton Richter's incredible "Beyond Smoke and Mirrors" (which I hope to formally review pretty soon) I had a rekindled enthusiasm for nuclear energy. The problems of disposal can be greatly mitigated by running fuel through plants twice, albeit at the cost of a greater threat of nuclear proliferation if the stuff goes wandering. The actual threat posed by plant malfunctions to human safety is minimal, far less than that of just the deaths caused by inhaling the smoke from burning oil or coal. Nuclear is readily scalable with many promising technologies on the horizon. But for those hoping for modular reactors or micro reactors or molten sodium reactors or thorium on a white horse, another article in the review points to an important problem, beyond a nervous public or heavy-handed regulation, a problem that I, for one, had failed to consider:
Such homogeneity in a 70-year-old high-technology enterprise is remarkable. Seven decades after the Wright brothers’ first flight there were warplanes that could travel at three times the speed of sound, rockets that could send men to the moon, airliner fleets that carried hundreds of thousands of passengers a day, helicopters that could land on top of skyscrapers. Include unmanned spacecraft, and there really were flights a billion times as long as the Wright brothers’ first and lasting for years. But aircraft were capable of diversity and evolution and could be developed cheaply by small teams of engineers. It is estimated that during the 1920s and 1930s some 100,000 types of aircraft were tried out.

Developing a nuclear reactor, on the other hand, has never been a matter for barnstorming experimentation, partly because of the risks and partly because of the links to the technologies of the bomb.
Wind and solar, not without their disadvantages compared to nuclear and other energy sources, have this great advantage: they are relatively safe and easy to tinker with. There are hundreds if not thousands of designs of solar panels and wind turbines, at every stage of development. You will never be able to build and test 100,000 different types of nuclear energy generation, not even if your regulators were raving Randian paleolibertarians. If you don't like the shape of your wind turbine you can radically redesign it and throw it up in the wind and see what it does; will you ever be able to do that with a nuclear reactor? And if not, what can we reasonably expect in the coming decades except that renewable technology will continue to advance and nuclear will be (at best, if the public's fears can be allayed) a bridge technology on the way to a low-carbon future?

Tuesday, March 13, 2012

Watching the maize

From AGWObserver's weekly roundup:

Projected temperature changes indicate significant increase in interannual variability of U.S. maize yields – Urban et al. (2012)

Abstract: “Climate change has the potential to be a source of increased variability if crops are more frequently exposed to damaging weather conditions. Yield variability could respond to a shift in the frequency of extreme events to which crops are susceptible, or if weather becomes more variable. Here we focus on the United States, which produces about 40% of the world’s maize, much of it in areas that are expected to see increased interannual variability in temperature. We combine a statistical crop model based on historical climate and yield data for 1950–2005 with temperature and precipitation projections from 15 different global circulation models. Holding current growing area constant, aggregate yields are projected to decrease by an average of 18% by 2030–2050 relative to 1980–2000 while the coefficient of variation of yield increases by an average of 47%. Projections from 13 out of 15 climate models result in an aggregate increase in national yield coefficient of variation, indicating that maize yields are likely to become more volatile in this key growing region without effective adaptation responses. Rising CO2 could partially dampen this increase in variability through improved water use efficiency in dry years, but we expect any interactions between CO2 and temperature or precipitation to have little effect on mean yield changes.”
We expect the globe to add 28% more people between now and then. While we can expect improvements in agricultural technology, it would be better to deploy them to meet the expected spike in demand than to spend them offsetting climate losses. And what happens in 2080, 2100, 2150? With twice, three times, four times as much warming as in 2030–2050?

Increased variability means in the best case more money and energy expended on long-term storage of grain reserves. In the worst case it means shortages and hunger.
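To make the stakes concrete, here is a minimal sketch of what a 47% jump in the coefficient of variation does to the frequency of bad harvests. The 18% mean decline and 47% CV increase come from the abstract above; the baseline CV of 0.10, the normal-distribution model, and the "bad year" threshold of 85% of the historical mean are illustrative assumptions of mine, not figures from the paper.

```python
import random

random.seed(0)

def shortfall_freq(mean, cv, threshold, n=100_000):
    """Fraction of simulated years with yield below the threshold,
    modeling annual yield as a normal distribution (a crude sketch)."""
    sd = cv * mean
    return sum(random.gauss(mean, sd) < threshold for _ in range(n)) / n

# Illustrative assumptions: historical mean yield normalized to 1.0,
# baseline CV of 0.10, and a "bad year" defined as < 85% of the mean.
baseline = shortfall_freq(mean=1.0, cv=0.10, threshold=0.85)

# Variability alone: CV up 47% (per the abstract), mean held constant.
more_variable = shortfall_freq(mean=1.0, cv=0.147, threshold=0.85)

# Variability plus the projected 18% decline in mean yield.
projected = shortfall_freq(mean=0.82, cv=0.147, threshold=0.85)

print(baseline, more_variable, projected)
```

Under these toy numbers, the increase in variability alone more than doubles the frequency of bad years, before the decline in mean yield is even counted; that is the gap grain reserves would have to fill.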

Monday, March 12, 2012


The inevitable Revkin headline:

The argument is one we've seen before:

Many of my conservative friends are deeply suspicious of climate change, and they hate carbon taxes and cap and trade. They’re not interested in adapting to a supposedly hypothetical future. Fair enough. Everyone is entitled to an opinion [But not their own . . .] . . . .

Guess what? Many of the things that help reduce the threats of climate change can also be good for our economy and national security, and vice versa. Many of the changes proposed to adapt to climate change are readily justifiable as approaches to shelter our wealth and well-being against the erratic forces of nature such as Hurricane Katrina and the recent floods in Australia. Why not work to boost innovation, the economy, disaster preparedness and national security, and be pleasantly surprised when greenhouse gas emissions and vulnerability to climate change go down, too? Why not approach the debate from another direction, and be happy that we find allies instead of adversaries?
I'd like to think I'm very happy to find allies wherever on the political spectrum they might reside. But this specific approach is fraught, and not just because we have watched for the last three years as our tax- and spending-cutting, moderate-appointing, terrorist-killing president has been universally vilified by the right as the second coming of Lenin. As Revkin says, if you can grab the middle you can marginalize the fringes, and we're going to have to, whatever the game plan.

But is it worth it to dilute and muddy a very clear message -- we are radically changing the climate in dangerous ways -- in order to frame action in terms of goals that haven't motivated the public to act by themselves? The benefits of mitigation and adaptation both for national security, the economy, innovation etc. have been extensively discussed and, on the right, dismissed. Fighting climate change has been discursively "coded" as a left-wing thing, and arguing that it advances right-wing and all-wing goals isn't going to change that. It's the different-values fallacy again; the idea that the right wants different things, and by offering them different things you bring them over to your side.

But the right doesn't want different things, and the appeal to the middle is the appeal to our shared values, to our shared interest in fighting climate change. We need to change, subtly, the emotional coding of the fight against climate change, not abandon what we have argued and believe is the central challenge to our 21st century civilization just so we can sell tired, nonspecific election year patter about "innovation" and "national security." The winning angle is not to make people respond to the idea of "national security," but to give them, in a form they understand, a visceral sense of the literal, specific threat to our nation and our world. It's about our common fate, not issues potpourri. As one master of the revolutionary middle ground put it:

"For, in the final analysis, our most basic common link is that we all inhabit this small planet. We all breathe the same air. We all cherish our children's future. And we are all mortal."

Wednesday, March 7, 2012

Richard Lindzen vs the aerosol forcing

GUEST POST by Fred Moolten

I thought it might be worthwhile to examine more carefully evidence related to a centerpiece of Lindzen’s claim that climate models overstate climate sensitivity by means of “fudge factors” involving aerosols. . . . 

In communicating with the public about climate change, Richard Lindzen has consistently claimed that climate scientists are overestimating the warming potential of CO2. Central to this claim is his assertion, unqualified by any caveats, that aerosol forcing is “unknown” but is “arbitrarily adjusted” in climate models to make them match observed trends. In particular, he suggests that most often the adjustments deliberately overstate the cooling effect of aerosols to bring the model trends down to the observed trends. We can therefore ask the following relevant questions: (a) Is aerosol forcing “unknown”? (b) Is there acknowledgment by modelers that they adjust the aerosol forcings for the purpose of matching observed trends? (c) If not, are the aerosol parameterizations they make justifiable on some other basis or are they “arbitrary”? (d) Is there independent evidence that can only be reasonably interpreted to mean that the adjustments are made to match observed trends? (e) If choices are made that are not clearly justified by the evidence, are they in the direction of exaggerated aerosol cooling? The answers can help us decide if what Lindzen states as fact is indeed a fact or if Lindzen’s claim in this regard is untruthful.

Before proceeding, it’s worth noting that there is no way to conclusively exclude the possibility that some model choices have on occasion been influenced, perhaps subconsciously, by an intent to match observed temperature trends. We can, however, ask whether this is likely to be true in general, and more importantly whether stating it as an established fact rather than a conjecture can be supported. I suggest that the evidence, taken in total, refutes Lindzen’s statement with high probability.

(a) Is aerosol forcing unknown? A frequent fallacy in blogosphere and some media criticism of mainstream scientific conclusions is the implication that if we don’t know everything, we know nothing. Clearly, if we knew nothing about aerosol forcing, any choices in models would necessarily be “arbitrary”. In fact, however, much is known about aerosol data in general, and in particular its incorporation into models. An example of the latter is found in Schmidt et al 2006, which includes extensive evidence based on physical principles and empirical data. Much also remains to be learned, but the evidence refutes the absolutist proposition that our ignorance is total.

(b) Is there acknowledgment by modelers that they adjust the aerosol forcings for the purpose of matching observed trends? One source on this issue is Gavin Schmidt, in both an exchange on collide-a-scape (comments 334-378) and in the details of how aerosol forcing is developed in the GISS E model described in Schmidt et al 2006 (with coauthors who include modelers Jim Hansen and Andy Lacis). It’s hard to read what Gavin Schmidt wrote without concluding that he flatly rejects any motivation designed to match trends, and that he rejects the notion that such a motivation exists as a general phenomenon among the modelers. (A similar point has been made elsewhere specifically regarding GFDL and CCSM models – see Chapter 5 in the 2008 USCCSP report). What Gavin Schmidt says about how he and other modelers incorporate aerosol forcing into models contradicts Lindzen’s claim about their motivation, unless Gavin is either lying or engaged in self-delusion. His statements of course can’t exclude the possibility of exceptions among a few modelers that Schmidt et al are unaware of.

(c) Are the aerosol parameterizations modelers make justifiable on some empirical basis or are they “arbitrary”? The empirical basis was illustrated in the Schmidt et al reference cited above.
(d) Is there independent evidence that can only be reasonably interpreted to mean that the adjustments are made to match observed trends? An important argument that there is some, perhaps unconscious, choice of aerosol parameters made with trends in mind among some modelers comes from papers by Kiehl 2007 and Knutti 2008, both of which report an inverse correlation (a weak one) between model climate sensitivity and total anthropogenic forcing in models that simulate 20th century trends fairly well – a low total forcing reflects primarily strong negative aerosol forcing. Certainly, one explanation for this might be a choice of aerosol forcings made with an eye toward matching observed trends. Since we have statements cited above that trend matching isn’t done, this creates a conflict that would be difficult to resolve if there were not other plausible explanations for the reported inverse correlation. We can explore this possibility.

At least two mechanisms might explain the correlation without invoking specific choices designed to match observed trends. The first is based on the principle that models are parameterized to match existing climatology in the absence of an imposed perturbation such as CO2-mediated forcing. This includes seasonal changes, for example, whereby temperature variation must be explained on the basis of forcings (including aerosols that affect albedo) and feedbacks (which affect climate sensitivity). It is conceivable that different modelers have made choices that permitted that matching but which varied inversely in the relative strengths of forcing and climate sensitivity, and which then carried over into the trend simulations even though that was not the reason for the choice of parameters. In fact, it is possible that a choice involving a single parameter set could affect both aerosol forcing and sensitivity. For example, Knutti points out that in the case of aerosol indirect effects, both climate sensitivity and these indirect effects depend to some extent on a common hydrology, so that parameterization in that realm could create a correlation of the type observed.
A second mechanism that might contribute to the inverse correlation independent of modeler choice is selection bias. Many models have attempted to hindcast 20th century temperature trends. Those reported by Kiehl 2007 and the subset of CMIP3 models cited by Knutti 2008 do a fairly good job in this regard, but almost certainly others do less well. If, for example, the pairing of climate sensitivity strength and total aerosol forcing in models occurred in a random manner, those that paired them in the same direction (both high or both low) would do poorly and those that paired them inversely would perform better. In preferentially citing the latter, possibly because the poor simulations were less available, these authors have ensured that this type of randomness, if it occurred, would lead to the selective citing of the models that happened to “come out right” even if all models – skillful and unskillful combined – made their pairings at random, or at least independent of observed trends. It would be incumbent on anyone claiming deliberate, non-random pairing to provide direct evidence for that claim, particularly in light of the contradictory statements (see b above) that such deliberate choices were not part of model design. Note, however, that if some models matched temperature trends accurately “by chance”, the apparent accuracy probably overstates the actual skill of the models to make future predictions unless the same compensating errors exist in future simulations.
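The selection-bias mechanism can be illustrated with a toy simulation (all numbers invented for illustration, not drawn from any actual model ensemble): draw "models" whose climate sensitivity and total forcing are paired completely at random, use sensitivity times forcing as a crude stand-in for simulated 20th-century warming, and then look only at the models whose hindcast lands near the "observed" value. The inverse correlation appears in the selected subset even though no modeler chose anything.

```python
import random

random.seed(1)

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Toy "models": sensitivity and total forcing drawn independently,
# i.e. paired at random (ranges are arbitrary illustrative units).
models = [(random.uniform(2.0, 4.5), random.uniform(1.0, 2.5))
          for _ in range(2000)]

# Crude stand-in for a hindcast: simulated warming ~ sensitivity * forcing.
observed = 3.25 * 1.75  # the "true" trend in the same arbitrary units

# Keep only the models that "come out right" on the hindcast.
good = [(s, f) for s, f in models if abs(s * f - observed) < 0.5]

print(corr(*zip(*models)))  # near zero: the full ensemble is random
print(corr(*zip(*good)))    # strongly negative in the selected subset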

(e) If choices are made that are not clearly justified by the evidence, are they in the direction of exaggerated aerosol cooling? Remember that one of the implications of Lindzen’s “arbitrary adjustments” claim is that they were needed to make the model simulations come out cool enough to match trends without requiring low climate sensitivity. However, if one looks at one of the choices that most significantly affects simulations, it was that most of these older models did not incorporate indirect aerosol effects into their negative forcing estimates. These effects are thought almost universally to be real, albeit fairly small. However, failure to include them makes the models run too warm, contrary to the implication by Lindzen that modelers are trying to overstate climate sensitivity by exaggerating aerosol cooling. Including the indirect effects cools the simulation, and so their absence in the majority of the models implies that the actual climate sensitivity might be higher than estimated from the earlier models. Whatever the practical reasons for excluding indirect aerosol effects, it is hard to see how it could have been motivated by a desire to exaggerate cooling. The omission of indirect effects is likely to be rectified in the current group of models. The absence of indirect effects in most models and their inclusion in others renders interpretation of model/observational relationships problematic. It’s not clear to me that we would see the same inverse relationship if all models had incorporated indirect aerosol effects.

Based on all the above, I find the most plausible interpretation to be the following. (1) Lindzen’s claim that modelers “arbitrarily” adjust “unknown” aerosol forcings to exaggerate the cooling effect of aerosols is unsupportable. (2) There is no convincing reason to doubt claims from modelers (e.g., Schmidt et al) that choices of aerosol parameters are based on available empirical evidence and are not designed to affect the trend simulations. However, the possibility of exceptions to this generalization among some modelers can’t be excluded. (3) The omission of indirect aerosol effects from models is a choice that would understate rather than exaggerate aerosol cooling. (4) Correlations between aerosol forcing and climate sensitivity are difficult to interpret from model simulations that include indirect effects in some cases but exclude them in others (the majority)*. (5) To the extent the inverse correlation would persist even if indirect effects were uniformly included, it can be explained at least in part without invoking deliberate choices by modelers designed to make simulated trends match observed ones. The assertion by modelers that they don’t engage in that type of “tuning” is not refuted.

*In an email conversation with Dr. Knutti, he informs me that the data from many models are inadequate to determine exactly what went into their forcings, and so categorizing the models may not be possible. Dr. Knutti repeats the inference he drew in his paper that some but not all models were guided by observed trends. My conclusion, based on the above analysis, is that at least many were not, and the possibility that some were is still unproven.


I asked Fred Moolten, whose carefully argued and exhaustively researched comments are a highlight of Climate Etc., for permission to reproduce his comment on Lindzen as a post. This makes him our first guest poster at IT. Very exciting!

Some relevant links:

Lacis, A., J. Hansen, and M. Sato (1992), Climate forcing by stratospheric aerosols, Geophys. Res. Lett., 19(15), 1607–1610, doi:10.1029/92GL01620. 

Consistency Between Satellite-Derived and Modeled Estimates of the Direct Aerosol Effect, Gunnar Myhre, Science, 10 July 2009: 325 (5937), 187–190.

From the abstract: "[T]he uncertainty, however, ranges from –0.9 to –0.1 W m−2, which is largely due to differences between estimates from global aerosol models and observation-based estimates, with the latter tending to have stronger (more negative) radiative forcing."

Another example of how models are bad, bad, bad, unless they say something a "skeptic" wants to hear.

Kaufman, Y. J., D. Tanré, and O. Boucher (2002), A satellite view of aerosols in the climate system, Nature.

Tuesday, March 6, 2012

Republicans against free-market wind power

Teh horror
They're getting scared:

Like many states, Wisconsin has a patchwork of differing local setback rules governing the distance wind developers need to leave between turbines and adjacent homes. To streamline the process, the Wisconsin legislature passed the 2009 Wind Siting Law instructing the Public Service Commission (PSC) to create one overarching state siting law for all wind turbines subject to local review. . . . 

In response, Republican representatives and ALEC members proposed their own legislation to make implementation of larger wind projects much more difficult and protracted.

In October 2011, State Senator Frank Lasee (R) introduced a bill (SB 263) that would declare a moratorium on construction of wind farms over 100 feet, saying larger turbines should not be allowed until the state PSC was in possession of a report that ensures turbines do not cause health problems.
That's not how the conventional wisdom would have it, is it? Supposedly the pro-business conservatives are fighting for profits and against regulations, while the environmentalists are tying up new energy projects in red tape.

Supposedly renewable energy is impractical and expensive, and all that's necessary to ensure its demise is not to subsidize it and not to charge fossil fuel burners the cost that their soot and sulfates and NO2 and CO2 impose on society. Privatized profits and socialized harms -- but still, non-intervention is supposed to be their guiding principle.

Wind is already price-competitive in places where the wind blows hard

Obviously these measures are no longer sufficient. Contrariwise, conservatives feel sufficiently threatened by the appeal of renewable energy on the open market that they strive to strangle it with overregulation. I call that a good sign.

Judith Curry needs living space!

There's a whole series of these posters. They're pretty great.

I kid, she didn't reference that awful rhetorical precedent, but pretty darn close:

With regards to K-12 education, there is no particular reason to teach ‘climate change’ in the K-12 curriculum.  Climate change is a topic that is more suitable [for] high school ‘science and society’ courses.  In such courses, teaching the controversy would seem to be of paramount importance.
"Teaching the controversy" being of course the rallying cry of the Creationists' drive to forbid the teaching of evolution. What a tradition for a professor of atmospheric science to allude to!

Somebody missed the sarcasm here.
I was interested in how far this particular meme had spread, so here you go:

"Teaching the controversy" "evolution": 38,700,000 hits
"Teach the controversy" & "evolution": 229,000 hits.
"Teaching the controversy" "global warming": 42,900 hits.
"Teach the controversy" & "global warming": 62,000 hits.
"Teaching the controversy" "climate change": 32,700 hits. 
"Teach the controversy" & "climate change": 62,100 hits.

I found some hits for other examples of science denial -- "moon landing" and "vaccines" as well as "Holocaust" -- but they seem to be primarily by people criticizing the denial by comparing their arguments to the anti-evolution malarkey. Only the climate denial crowd, as far as I can tell, is seriously trying to adopt this anti-evolution meme as their own.

As a tactic, this seems . . . not inspired. The denial of evolution, as an analogy for the denial of anthropogenic climate change, is perhaps a little too close for comfort in ways that are not flattering to Curry et al:

Denial of Evolution vs Denial of Global Warming

1. Flies in the face of a massive amount of scientific evidence. Check. Check.
2. Utterly rejected by the vast majority of scientists. Check. Check.
3. Driven by the discomfort of a particular ideology with the implications of the science. Check. Check.
4. In lieu of a compelling alternative hypothesis, portray the uncertainties and persistent unknowns that attend all science as huge, gaping flaws that falsify the science. Check. Check.
5. Unable to come to terms with the vast body of mutually supporting evidence from multiple fields, employ a fallacy of synecdoche: whatever point, major or minor, that they are critiquing at the moment, is treated as the cornerstone of the theory without which the whole corrupt edifice comes tumbling down. Check. Check.

I could go on, pointing out their mutual love for unreliable online lists of supposedly supportive supposed scientists (see here and here) and their common dependence on short memories and highly mobile goalposts. But you get the point. This is not a flattering comparison for either side.

The reason it is not flattering is really simple: this is a method. It is not spontaneous. It is a battle-tested set of strategies and tactics for attacking science and scientists and confusing the public. For it to work, it can't look like a method; it's supposed to be just a purer, better execution of the scientific method. That lie is hard to sell over and over again, especially when the people who pioneered these tactics have long been outed as an arm of the Christian right.

There's a reason magicians don't do the same trick for the same audience over and over again; there's a reason those Nigerian bankers always include a soonish deadline. Time for the mark to think and reflect is deadly to a con artist, who relies on distraction. Distance and perspective are the enemy of the demagogue, who relies on the emotional engagement of the audience. For all these people, showing their tactics to be at work in another cause, showing the same tired arguments and rhetorical fallacies deployed against another compelling scientific theory, is a terrible, terrible move.

Judith Curry is a scientist. She believes in the theory of evolution. There must be a part of her that understands what advocating "teaching the controversy" implies about the position she's taken and the people she's allied herself with. Is this a cry for help?

Monday, March 5, 2012

Idiot comment of the day

Gary's musings over at WUWT 2.0 have brought the oft-neglected daily feature back:
“Contraception,” “women’s reproductive health,” and “family planning”

Such wonderful euphemisms for abortion. [My only faint shred of hope for Gary is that he doesn't know what a euphemism is.] We have to kill more third world babies in the womb so they won’t use fossil fuels and mess up our imaginarily [sic] fragile climate. Oh, and we’re more than willing to open more Planned Parenthood abortion clinics in the developed world (80 percent of which we locate in “minority” neighborhoods) to show we aren’t the racist eugenicists our hero Margaret Sanger was [Who?].
The “population problem,” like the DDT problem, ozone problem and CO2 problem before it, are all just trojan horses for progressive assertion of power over the economy.
Notice the eagerness with which the peak oil hysterics embrace the barbarism of “population control,” just as their CAGW fellow travelers did before them.

Note the seamless blend of fanatic Christianist and raving paleolibertarian. Leftists murder babies with contraception and Pap smears to further their real monstrous goal of economic planning . . .

Quite the load of Santorum.

Friday, March 2, 2012

Quote for the day

The red line is trying really, really hard.

Gene Marks's ill-thought-out "If I Were a Poor Black Kid" inspired this DiA riff:

One thing I find paradoxical is that highly numerate people, people in the engineering, business and technical fields (Mr Marks writes about the tech industry), are often most reluctant to consider social problems from a statistical point of view, and prefer to consider them as individual moral or motivational stories. We have a curve composed of 150m dots that is becoming steeper and more parabolic. Go down to Occupy Wall Street, and you'll find a lot of cultural-studies majors working for environmentalist nonprofits who support changing systemic rules to flatten the slope. Go into the financial-institution office buildings that surround them, and you'll find a lot of math majors devising computer models for risk-weighting assets who think the dots on the bottom end should try harder to get into the top end. It's weird.