Friday, February 16, 2007
The significance of ice sheet behavior to policy has resulted in substantial research efforts that have been bearing fruit in the last year or two. But recent studies suggest that the IPCC's uncertainty about ice sheet behavior is justified -- and may not be resolved quickly enough to allow us to make policy based on a narrow range of estimated sea level change. So what are we to do? It's the traditional problem of scientific uncertainty rearing its ugly head -- just when we have become convinced that something needs to be done, yet still must decide what to do and how quickly to do it.
The temptation is to say as much as feasible. But that begs the cost question, which defines our sense of what is feasible. So, we need to set "technology-forcing" or what I'd prefer to call "technology-facilitating" goals and let the genius of the market find ways to meet them. And the goals need to be set not based on what we guess is the central tendency of estimated climate impacts, but based on the higher end of estimated climate impacts -- not a worst case scenario, but a moderately worse case scenario. To me, this points to James Hansen's recommendation to hold the SST increase to 1 degree by the end of the century. Now, what does that require? I don't know ... but I'd like the answer to that question.
Here's the latest from the AAAS meeting:
Clues to Sea Rise May Lie Beneath Antarctic Glaciers
A network of rapidly filling and emptying lakes lies beneath at least two of West Antarctica’s ice streams, according to new research published online today by the journal Science, at the Science Express website.
More than 100 subglacial lakes have already been discovered, but the new ones are particularly interesting because they occur below fast-moving ice. Though it’s too early to say exactly how this liquid water is affecting the rates of ice flow above, understanding the behavior of these fast-moving ice streams is essential for predicting how Antarctica may contribute to sea level rise.
Helen Fricker of the University of California San Diego’s Scripps Institution of Oceanography and colleagues analyzed elevation data from NASA’s Ice, Cloud, and land Elevation Satellite (ICESat) collected over the lower parts of the Whillans and Mercer Ice Streams. These are two of the major, fast-moving glaciers that are carrying ice from the interior of the West Antarctic Ice Sheet to the floating Ross Ice Shelf.
“We’ve found that there are substantial subglacial lakes under ice that’s moving a couple of meters per day. It’s really ripping along. It’s the fast-moving ice that determines how the ice sheet responds to climate change on a short timescale,” said Robert Bindschadler of NASA Goddard Space Flight Center, one of the study’s coauthors.
“We aren’t yet able to predict what these ice streams are going to do. We’re still learning about the controlling processes. Water is critical, because it’s essentially the grease on the wheel. But we don’t know the details yet,” he said.
Bindschadler presented the findings at a news briefing for reporters on Thursday, 15 February, at the AAAS Annual Meeting in San Francisco, California. In coordination with the briefing, NASA released satellite images of West Antarctica.
Glaciologists have known that water exists under ice streams, but the observation of a system of water storage reservoirs is unprecedented. The surprising thing about this discovery is the amount of water involved, and the pace at which it moves from one reservoir to another, according to Fricker, the lead author.
“We didn’t realize that the water under these ice streams was moving in such large quantities, and on such short time scales,” Fricker said. “We thought these changes took place over years and decades, but we are seeing large changes over months.”
The authors identified numerous spots that either rose or deflated from 2003 to 2006, likely because water flowed into or out of them. Water would be capable of this because it is highly pressurized under the weight of the overlying ice.
The three largest regions are between approximately 120 and 500 square kilometers, while the others are widely scattered and smaller. One of the large regions, referred to as Subglacial Lake Engelhardt, drained during the first 2.7 years of the ICESat mission, while another, Subglacial Lake Conway, steadily filled during the same period.
“I’m quite astonished that with this combination of satellite sensors we could sense the movement of large amounts of water like this. From 600 kilometers up in space, we were able to see small portions of the ice sheet rise and sink,” Bindschadler said.
Studies of the subglacial environment are rare because they are expensive, risky, and labor-intensive. Bindschadler explained that before the ICESat mission, researchers typically had to drill holes in the ice streams in order to study what was occurring beneath them. These holes, generally just about 4 inches in diameter, provided a much more limited view of the entire ice stream than the satellite images do.
“Until now, we’ve had just a few glimpses into what’s going on down there. This is the most complete picture to date of what’s going on beneath fast flowing ice,” Bindschadler said.
Added Fricker: “The approach used for this work provides glaciologists with a new tool to survey and monitor the nature of the subglacial water system and to link these observations to the motion of the ice sheet. We still don’t know how the subglacial water system varies on longer time-scales from decades to centuries. To do this, we need to continue monitoring the ice streams with ICESat and future follow-on missions.”
15 February 2007 3:24 pm
Thursday, February 15, 2007
Riverkeeper, Inc. v. U.S. E.P.A. (C.A.2), January 31, 2007: Clean Water - The Clean Water Act's (CWA) "best technology available" standard for cooling water intake structures precluded cost-benefit analysis.
The CWA provision mandating the use of "best technology available" (BTA) for minimizing the adverse environmental impact of point sources' cooling water intake structures did not permit the use of cost-benefit analysis in determining the BTA. This was in contrast to the predecessor standard, "best practicable control technology" or BPT. Instead, the later standard required a determination of which means would be used to reach a specified level of benefit. The issue arose on challenges to the EPA's rule implementing the provision for existing power plants.
Here's the IPCC 4th Assessment summary for policymakers regarding climate change science: IPCC4 Climate Science Summary
Here are a few of the responses (original 2/2; revised through 2/15):
Nature editorial (see below)
Worldwatch Institute (see below)
Pew Center on Global Climate Change
Real Climate on sea level change
Prior ELP Blog post
World Council of Churches (see below)
Tiempo (see below)
A couple of articles caught my eye recently. One in Nature argues that phylogenetic diversity (a measure of how distantly species are related) should be considered in addition to the number of species when identifying "hotspots" that deserve priority in conservation efforts. [see Science news report below]. Another Nature article described rapid biodiversity assessments, conflicting ideas on how to set biodiversity conservation priorities, and the utility of these assessments in priority setting [see excerpt below].
Monday, February 12, 2007
The Economist suggests that the markets are not yet taking climate change into account:
The scientific consensus in favour of man-made global warming seems clear. There is also evidence that voters are increasingly inclined to believe in the phenomenon, even in America. This might lead one to believe that politicians will be forced to take action. But, as Tim Bond, of Barclays Capital, points out, the futures markets appear to be saying something different. The forward curves for hydrocarbon fuels (such as oil and coal) are upward-sloping: higher prices are expected in future.
Some of this might be due to investment demand for commodities, which is usually channelled via the futures markets; but the bulk of this money is placed into contracts of less than 12 months’ duration. The forward curve points to higher oil prices over five or more years. Conversely, the price curves of cleaner alternatives are downward-sloping. This does not suggest investors are expecting a mass shift away from oil and towards green fuel sources such as wind and solar power. Perhaps this reflects understandable cynicism that voters will ever agree to changes that cause them real pain: higher gasoline taxes in America, for example. As one hedge fund manager recently remarked to your correspondent: "If the UK stopped all carbon emissions tomorrow, the Chinese would replace them within six months. So why should I give up driving my big car?"
Or perhaps investor opinion reflects the enormity of the task. As Barclays Capital points out, energy demand is expected to rise by 50% over the next 30 years. The industry needs to meet that target, while simultaneously reducing the 80% share of supply currently provided by hydrocarbons. The attempt to square this circle may have important repercussions for financial markets. As Tim Bond says, an enormous amount of new investment is needed to meet increased energy demand -- some $20 trillion, at 2000 prices. The energy industry did double its capital spending in nominal terms between 2000 and 2005, but thanks to infrastructure inflation (the cost of oil rigs, tankers etc.), this translated into a real spending increase of only 10-20%.
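The gap between nominal and real spending growth in that last sentence is easy to check with back-of-envelope arithmetic. In the sketch below, the doubling of nominal spending comes from the article; the infrastructure cost-inflation factor is an assumed illustrative value, since the article reports only the resulting 10-20% real range.

```python
# Back-of-envelope check: how cost inflation erodes nominal spending growth.
# nominal_growth is from the article; cost_inflation is an assumed value.
nominal_growth = 2.0    # capital spending doubled between 2000 and 2005
cost_inflation = 1.75   # assumed rise in rig/tanker/infrastructure costs

real_growth = nominal_growth / cost_inflation - 1
print(f"Real spending increase: {real_growth:.0%}")  # prints "Real spending increase: 14%"
```

With costs up 75%, a nominal doubling delivers only about 14% more real capacity, which sits inside the 10-20% range the article cites.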
The way that markets have traditionally encouraged investment spending is by ramping up prices, which helps explain why the oil price tripled within a few years (before a recent setback). But there is still a problem.
First, companies are uncertain as to how governments will act on climate change. This may cause them to postpone investment programmes until the outlook becomes clearer. Second, uncertainty will prompt them to apply a high discount rate to future projects. These discount rates may vary wildly. If, for example, it was clear that governments were pushing the energy industry away from oil, and towards, say, wind power, the discount rate for the wind industry would drop and that for the oil industry would rise. So energy prices might be both high (to encourage investment) and volatile.
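The leverage that discount rates have over long-lived energy projects can be seen in a minimal net-present-value sketch. The cash flows and rates below are illustrative assumptions, not figures from the article; the point is simply how much a policy-risk premium in the rate shrinks a project's present value.

```python
def npv(cash_flows, rate):
    """Net present value of end-of-year cash flows at a constant discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# A stylized project paying 100 a year for 30 years (e.g. a power plant).
flows = [100] * 30

clear_policy = npv(flows, 0.06)      # policy outlook settled: modest discount rate
uncertain_policy = npv(flows, 0.12)  # policy risk priced in: higher discount rate

print(f"NPV at 6%:  {clear_policy:,.0f}")     # ~1,376
print(f"NPV at 12%: {uncertain_policy:,.0f}")  # ~806
```

Under these assumptions the same cash flows are worth roughly 40% less at the higher rate, which is why an unresolved policy outlook can stall investment even when expected demand is strong.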
That is not great news for stock markets. For a start, volatility leads to uncertainty, which investors dislike. Furthermore, Barclays has found a clear inverse correlation between oil prices and the multiple that the market applies to corporate profits (see chart). In other words, the higher the oil price, the lower the market rating. Back in the 1970s, when oil prices rose consistently, the energy sector was about the only place to earn decent returns. In America, real returns from oil averaged nearly 25% a year, whereas shares managed just 1.4%.
Most investors seem to regard global warming as a long-term problem unlikely to affect equities over the next year or two. But the Barclays analysis suggests they may be wrong to be so complacent.