June 13, 2006
Enjoy this open access article by Henry Nicholls from PLOS on rewilding:
Restoring Nature's Backbone
Copyright: © 2006 Henry Nicholls. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Citation: Nicholls H (2006) Restoring Nature's Backbone. PLoS Biol 4(6): e202
A herd of bison has just made an extraordinary migration. The distance these animals travelled was huge—nearly 5,000 miles—and their means of transport was highly unorthodox: they flew. The cargo plane took some thirty of these hulking mammals from Elk Island National Park of Canada across Alaskan airspace, over the Bering Strait, and into the Republic of Yakutia. Their ultimate destination: the Lenskiye Stolby Nature Park, a 78,500-hectare reserve in northeast Siberia more commonly known as “Pleistocene Park.”
This is the latest phase of an experiment to explore the impact of large herbivores that once roamed these lands on the biodiversity and integrity of the Siberian steppe ecosystem. One key consequence of putting such creatures back is that they disrupt the snow cover during the winter, exposing the ground to the cold and preserving the permafrost. Without these herbivores, the snow insulates the earth and the permafrost melts, says Sergei Zimov, director of the Northeast Science Station in Cherskii, and the brains behind Pleistocene Park. This could allow microbes to break down vast reserves of carbon contained in the earth, thereby contributing to global warming, he says. The return of once-native flora and fauna—so-called “rewilding”—should prevent this and bring the soil much-needed fertilization. “Rewilding will increase the bioproductivity and biodiversity of the landscape,” he predicts.
“If we lose these large predators from the ecosystem, biodiversity is the ultimate loser.”
On the Offensive
This is all part of a proactive approach to nature being articulated with increasing regularity. Rather than simply trying to ring-fence what wildlife remains, conservationists need to be restoring whole ecologies to something of their former glory, says Josh Donlan, an ecologist at Cornell University (Ithaca, New York, United States). Last year, he and a long list of high-profile conservation biologists penned a controversial commentary in Nature in which they laid out the case for rewilding North America—seeding the continent with suitable stand-ins for species that went extinct thousands of years ago.
Donlan's world would see carefully chosen slivers of North America grazed by giant tortoises, horses, and camels; the stamping ground of elephants in place of five species of mammoth; and African lions in lieu of the extinct American lion that once stalked the continent.
The benefits, they argued, are obvious. It would restore ecological processes that have gone by the wayside, mend broken evolutionary relationships, create a back-up population of some of the planet's most endangered species, and raise huge awareness for the conservation cause. “The obstacles are substantial and the risks are not trivial, but we can no longer accept a hands-off approach to wilderness preservation,” they wrote of their optimistic vision.
There are several compelling illustrations of the importance of big creatures for the integrity of an ecosystem. “There's more and more evidence that large vertebrates are disproportionately important not only for maintaining biodiversity but also for generating biodiversity,” Donlan says (Box 1). It's examples like these that persuade him of the importance of restoring populations of large vertebrates. “Over the past 30 to 40 years, increasing evidence is showing that if we lose these large predators from the ecosystem, biodiversity is the ultimate loser,” he says.
Box 1. The Loss of Tooth and Claw
During the 18th and 19th centuries, overhunting decimated the population of sea otters feeding off the coast of Alaska (Figure 1). The disappearance of this predator set in motion a top-down cascade that rippled its way through the kelp forest community. Without otters, prey species—marine invertebrates like sea urchins, clams, snails, and crabs—took over, virtually destroying the kelp forests and wiping out countless ecological niches.
Figure 1. Sea Otters
Sea otters are keystone predators in the kelp forest ecosystem of Alaska, keeping invertebrate populations in check and thereby maintaining biodiversity
(Photograph: David Menke, US Fish and Wildlife Service)
Lessons could also be learned from a long-term study in Venezuela. In the 1980s, a valley was flooded as part of a hydroelectric scheme. This created Lake Guri, a 4,300-square-kilometer body of water dotted with hundreds of forested islands of various sizes. Large vertebrates, often predators, struggled to survive on small islands less than 2 hectares in size, and in their absence herbivores like leaf-cutter ants thrived (Figure 2). Conversely, islands of over 75 hectares could accommodate predators, keeping herbivory in check. By 1997, the density of saplings on small islands was only 37% of that on large islands. Over the next five years, small islands lost 46% of their trees and shrubs, compared with only 32% on large islands.
Figure 2. Leaf-Cutter Ants
Leaf-cutter ants can take over when predator pressure is removed
(Photograph: Scott Bauer, US Department of Agriculture)
The argument for rewilding is also about patching up broken evolutionary links between species. In New Zealand, for example, there are more than 50 endemic “divaricate” plants—species with thin, interwoven branches that form a tangled canopy. One explanation for this unusual structure is that it is an evolutionary adaptation to fend off the herbivorous approaches of the dozen or so species of flightless moa that went extinct with the arrival of humans in New Zealand about 1,000 years ago. Researchers have tested this hypothesis by observing the impact of emus and ostriches—surviving analogues of the extinct moa—on divaricate species where juvenile stems are tangled but adults are not (Figure 3). The birds removed 30%–70% less foliage from juvenile shoots than adult shoots. “A large section of the New Zealand woody flora is specifically adapted to ratite browsing,” says Bill Lee, a plant ecologist at Landcare Research in Dunedin, New Zealand. “We plan to use emu and ostriches in experiments in native ecosystems to examine how they modify ecosystem processes and to investigate their impact on native and introduced plants with different architectures,” he says.
Figure 3. Ostrich
The ostrich could fill a similar evolutionary niche to the extinct moas of New Zealand
(Photograph: Beth Jackson, US Fish and Wildlife Service)
The story is similar in the Mascarenes in the Indian Ocean, where Aldabran tortoises are being introduced onto a 28-hectare island nature reserve as proxies for the extinct Geochelone inepta and G. triserrata. Several native plant species appear to have evolved distinct juvenile morphological features as a defence against tortoise herbivory. The introduced tortoises are clearly avoiding these species when they are in the juvenile stage, says Vikash Tatayah, fauna manager of the Mauritian Wildlife Foundation (Vacoas, Mauritius).
There are countless other examples of severed links between species. Donlan and his colleagues cite the pronghorn, a deer-like mammal that spent more than four million years on North American grasslands trying to keep one hoof ahead of the now-extinct American cheetah. This key predator almost certainly shaped the pronghorn's astonishing speed, they wrote.
Benchmarks and Proxies
This sort of proactive vision for conservation raises some tricky questions, notably those of restoration benchmarks. In North America, conservation biologists routinely turn to the arrival of Christopher Columbus in 1492, Donlan says. “This is the default benchmark just because it is.” But, he and his colleagues argued, it would make more ecological sense to think about the arrival of humans on the continent some 13,000 years ago. This is the point at which the human-driven extinction of many large vertebrates contributed to a radical change in the continent's wildlife and a rapid loss of biodiversity, he says.
Elsewhere the appropriate benchmark may be different. David Steadman, curator of ornithology at the Florida Museum of Natural History (Gainesville, Florida, United States), has excavated on dozens of islands across the Pacific Ocean. Soon after the arrival of humans between 30,000 and 1,000 years ago depending on the island, whole swathes of endemic fauna vanish from the fossil record, Steadman says. As many as 2,000 species of bird that would probably exist today quickly wound up on the extinction scrapheap, he says.
“Large vertebrates are disproportionately important not only for maintaining biodiversity but also for generating biodiversity.”
In such places, the relatively recent arrival of humans with such dramatic effects makes a good case for setting a restoration benchmark. In places like Europe, where humans have been modifying the landscape for far longer, things are not going to be as clear-cut.
Even if there is agreement on a benchmark, there is still a debate to be had over the choice of species for restoration. When a species has disappeared completely, the idea is to use an ecological analogue or “proxy” for the extinct species. In some situations, so little choice remains that the decision is all but made. For example, if scientists ever attempt to restore flightless rails to the Pacific islands that the fossil record suggests had them, they will have only a handful of candidate species. “While it would be nice to be biogeographical purists, we don't have that luxury anymore,” Steadman says. But other settings could have many candidate proxies. It is still not clear whether the candidates should be chosen for their genetic, behavioural, or ecological similarity to the extinct species.
For those studying food webs—descriptions of who eats whom—it is the ecological services that a species performs that are crucial. Computer modelling of food webs is a good way to explore the impact of extinctions on an ecosystem (Figure 4). “You can take a species on the computer and kill it, but you can't in the wild as it's probably illegal,” says Jane Memmott, a community ecologist at the University of Bristol (Bristol, United Kingdom). “We should be conserving ecosystem services and interactions between species,” she says. “It's harder to come up with a food-web recovery plan, but we're definitely moving in that direction.”
Figure 4. Food Web from a Caribbean Reef
(Image created by software written by R. J. Williams and provided by the PEaCE Lab)
One of the more robust findings of such virtual worlds is that removing the most highly connected species causes more secondary, knock-on extinctions than does the removal of species at random. In certain webs, large vertebrates can be highly connected. “We can say they played very important roles and losing them had huge knock-on effects,” confirms Neo Martinez, director of the Pacific Ecoinformatics and Computational Ecology Lab (PEaCE; Berkeley, California, United States). But restoring one or two absent vertebrates to a habitat may do little to repair an altered food web. “The sort of conditions that allowed the vertebrates to play their role just aren't here anymore,” Martinez says. “We just don't have enough knowledge about these systems to predict what would happen. The whole history of trophically oriented biological control is not pretty.”
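The knock-on-extinction result from these virtual food webs can be sketched in a few lines of Python. This is a deliberately toy model: the miniature web, the species names, and the simple starvation rule are all invented for illustration (the PEaCE Lab's actual models are far richer), but it shows why losing a highly connected species cascades further than losing a peripheral one.

```python
from collections import Counter

def secondary_extinctions(web, removed):
    """Starvation cascade: a consumer goes extinct once all of its prey
    are extinct. `web` maps each consumer to its set of prey; species
    absent from `web` (basal producers like kelp) eat nothing and so
    can never starve."""
    extinct = {removed}
    changed = True
    while changed:
        changed = False
        for consumer, prey in web.items():
            if consumer not in extinct and prey <= extinct:
                extinct.add(consumer)   # all prey gone -> starvation
                changed = True
    return extinct - {removed}          # knock-on losses only

# An invented miniature web, loosely inspired by the kelp-forest story.
web = {
    "urchin":  {"kelp"},
    "snail":   {"kelp"},
    "crab":    {"kelp"},
    "abalone": {"kelp"},
    "otter":   {"urchin", "snail"},
    "eagle":   {"otter"},
}

# Connectivity = number of feeding links a species participates in.
deg = Counter()
for consumer, prey in web.items():
    deg[consumer] += len(prey)
    for p in prey:
        deg[p] += 1

hub = max(deg, key=deg.get)  # 'kelp', the best-connected species here
print(hub, len(secondary_extinctions(web, hub)))        # kelp 6
print("crab", len(secondary_extinctions(web, "crab")))  # crab 0
```

Removing the hub wipes out every other species in this little web, while removing a weakly connected grazer costs nothing further, which is the qualitative pattern the food-web simulations report.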
“The more extravagant rewilding suggestions presuppose that we know what we're doing.”
One way to improve on ecology's predictive power might be to construct “paleo food webs,” collating information from the fossil record to understand the prehistoric interactions between species. Martinez is one of a handful of scientists interested in this approach. There is a wealth of paleobiological evidence that can help resurrect extinct food webs, he says.
Famous fossil ecosystems such as the iconic Burgess Shale are an obvious place to start. “Even by contemporary standards of food webs, these are really good data,” Martinez says. Although much of the bizarre Burgess Shale fauna came to an evolutionary dead end during the Cambrian period, studying the food web of an entire suite of species from a long-gone era could help us to understand how ecosystems have functioned through deep time and reveal general processes that can and cannot be counted on, he notes. “It starts to put envelopes around the plausible dynamics of a system.” If a rewilding initiative were to push that envelope too far, it would be likely to fail. “The more extravagant rewilding suggestions presuppose that we know what we're doing. We don't,” Martinez says. “Not yet.”
Donlan is well aware that there are substantial biological, social, and economic hurdles to clear if rewilding is to take off. It will take time, careful planning, well-designed experiments, and three stages: first, the restoration of populations of herbivores as is already occurring in the Siberian Pleistocene Park; second, rewilding large protected areas with predators; third, the formation of one or more “ecological history parks” on, for example, vast tracts of North America's Great Plains. The costs and benefits of such proactive conservation must be carefully calculated on a case-by-case basis. “If the costs outweigh the benefits, you don't proceed,” Donlan says. But the conservation community needs to think carefully about these ideas, he says. “There are substantial risks of not doing anything.”
Competing interests. The author has declared that no competing interests exist.
- Zimov SA (2005) Pleistocene Park: Return of the mammoth's ecosystem. Science 308: 796–798.
- Donlan J, Greene HW, Berger J, Bock CE, Bock JH, et al. (2005) Re-wilding North America. Nature 436: 913–914.
- Estes JA, Danner EM, Doak DF, Konar B, Springer AM, et al. (2004) Complex trophic interactions in kelp forest ecosystems. Bull Mar Sci 74: 621–638.
- Terborgh J, Feeley K, Silman M, Nuñez P, Balukjian B (2006) Vegetation dynamics of predator-free land-bridge islands. J Ecol 94: 253–263.
- Bond WJ, Lee WG, Craine JM (2004) Plant structural defences against browsing birds: A legacy of New Zealand's extinct moas. Oikos 104: 500–508.
- Steadman DW, Martin PS (2003) The late Quaternary extinction and future resurrection of birds on Pacific Islands. Earth-Science Reviews 61: 133–147.
ABA SEER Air Quality Committee Teleconference: Residual Risk Under Clean Air Act Section 112: The Recently Proposed HON Rule and Beyond
The CAA directs the US Environmental Protection Agency (“EPA”) to assess the risk remaining (known as “residual risk”) after the application of the Maximum Achievable Control Technology (“MACT”) standards and to promulgate additional standards if required to provide an ample margin of safety to protect public health or prevent an adverse environmental effect.
The CAA also requires EPA to review and revise MACT standards, as necessary, every eight years, taking into account developments in practices, processes, and control technologies that have occurred during that time.
EPA is in the early stages of promulgating residual risk standards following its promulgation of MACT standards for virtually all source categories. In 1994, EPA issued the MACT standard rule for the synthetic organic chemical manufacturing industry—known as the Hazardous Organic NESHAP or “HON” rule. A few weeks ago EPA proposed its residual risk standard for this source category. This standard is likely to serve as a template for hundreds of residual risk standards to follow. The panelists will discuss the residual risk program in general, and then focus on both the principles that underlie the HON residual risk rule and the specifics of the HON proposal.
Faculty: Moderator: David Friedland, Beveridge & Diamond, P.C., Washington, DC
Dr. Richard Brooks, Vermont Law School, South Royalton, VT
Dave Guinnup, U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC
Leslie Hulse, American Chemistry Council, Arlington, VA
New Hurricane Model Seeks to Capture Warm Water Dynamics that Led to Hurricane Wilma's Shockingly Swift Intensification
New Aid for Storm Forecasters Computer Model to Give Greater Sense of Intensity and Size
By Peter Whoriskey
Washington Post Staff Writer
Monday, June 12, 2006
Scientists at the National Hurricane Center normally deliver findings in a just-the-facts style of prose: wind speeds, pressure readings, compass points. But their description of last year's Hurricane Wilma betrayed a sense of wonderment.
The storm, which strengthened unexpectedly and set records for intensity, was, in the words of the final report, "unprecedented," "explosive" and "incredible." More ominously, in an era in which the public has high expectations for meteorological pronouncements, Wilma had defied predictions.
"The bottom just dropped out," said Naomi Surgi, a hurricane scientist at the Environmental Modeling Center, which is part of the National Oceanic and Atmospheric Administration. "We had never in the Atlantic seen that kind of storm intensification. None of the models forecast that."
Meteorologists have made steady improvements in predicting the path of a hurricane, cutting errors roughly in half over the past 15 years or so. But they have long struggled to predict a storm's strength, a critical element because it determines who should and who will evacuate.
This hurricane season, forecasters have hopes of improving their record in predicting storm intensity.
A new computer model, developed by the Environmental Modeling Center and described as the "next generation" in tropical storm forecasting, will be at the disposal of forecasters at the National Hurricane Center here.
Built with $3 million in federal funds, according to Surgi, the Hurricane Weather Research and Forecast Model is expected to improve forecasts of hurricane intensity, size and rainfall.
In far more detail than its predecessors, the new computer model will envision the full three-dimensional hurricane, the circulation at its core and the varying winds from its bottom to its top, several miles up.
"We think it will have a more accurate physical representation of what goes on in the inner core of a hurricane," said Edward N. Rappaport, deputy director of the National Hurricane Center. "We're not sure we're going to see a monumental advance in the very first year, but this will set the framework for more accelerated improvements."
The model will also monitor and predict the waves and ocean temperatures beneath the hurricane, working in finer detail than previous attempts and seeking out "hot spots" in the ocean that might boost intensity. "It is very high-resolution," Surgi said.
Forecasters at the National Hurricane Center rely on about five to 10 models for any particular forecast, Rappaport said, with one, known as the GFDL, serving as the lead model. The new forecasting model is expected to supplant that.
Wilma eventually walloped the island of Cozumel, Mexico, last October as a Category 4 storm, then battered the Yucatan peninsula. It finally headed to South Florida, making landfall as a Category 3 and causing the largest disruption to electrical service in state history.
But what may be the lasting impression of Wilma is its rapid strengthening -- and what it reveals of forecasters' gaps in knowledge. Before striking Mexico, Wilma exploded over a 24-hour period from a tropical storm to a raging Category 5 hurricane.
It was superlative in many respects. The speed of its development is believed to have been unprecedented. It had at one point the smallest eye known to the National Hurricane Center staff -- two nautical miles in diameter -- and the central pressure at the time of peak intensity was a record low for an Atlantic hurricane.
"It is fortunate that this ultra-rapid strengthening took place over open waters, apparently void of watercraft, and not just prior to landfall," the tropical cyclone report on Wilma says in the understatement typical of such reports.
Forecasters did a relatively decent job at predicting its track, according to the report. But it noted that the errors in predicting intensity were "quite a bit larger than the average."
Predicting storm intensity is considered a critical challenge among forecasters because so many people depend upon forecasts of intensity to determine whether to flee an oncoming storm.
Emergency managers determine evacuation orders depending on the predicted intensity. Residents make their own calculations regarding whether to obey.
Coastal residents of the southeastern states have long been familiar with hurricane forecasts, and many simply ignore evacuation calls if the approaching tropical cyclone is a Category 1 or even a Category 2.
The uncertainty in the forecast intensity and track of storms forces emergency managers to make broader preparations than might be necessary, and when they prove unnecessary the inconvenience dulls the public's willingness to take precautions the next time.
"We have to base our evacuations on a worst-case scenario," said Jonathan Lord, an emergency manager in Miami-Dade County, noting that evacuation orders there assume that the storm will be one category stronger than forecast. "Look what happened to Wilma overnight."
Environment 0; Rove Won
But Rove just forwarded the letter to the appropriate officials. That couldn't influence anyone!
EPA Rule Loosened
After Oil Chief's
Letter to Rove
The White House says the executive's appeal had no role in changing a measure to protect groundwater. Critics call it a political payoff.
By Tom Hamburger and Peter Wallsten
Times Staff Writers
June 13, 2006
WASHINGTON — A rule designed by the Environmental Protection Agency to keep groundwater clean near oil drilling sites and other construction zones was loosened after White House officials rejected it amid complaints by energy companies that it was too restrictive and after a well-connected Texas oil executive appealed to White House senior advisor Karl Rove.

The new rule, which took effect Monday, came after years of intense industry pressure, including court battles and behind-the-scenes agency lobbying. But environmentalists vowed Monday that the fight was not over, distributing internal White House documents that they said portrayed the new rule as a political payoff to an industry long aligned with the Republican Party and President Bush.

In 2002, an oilman and longtime Republican activist, Ernest Angelo, wrote a letter to Rove complaining that an early version of the rule was causing many in the oil industry to "openly express doubt as to the merit of electing Republicans when we wind up with this type of stupidity."
Rove responded by forwarding the letter to top White House environmental advisors and scrawling a handwritten note directing an aide to talk to those advisors and "get a response ASAP." Rove later wrote to Angelo, assuring him that there was a "keen awareness" within the administration of addressing not only environmental issues but also the "economic, energy and small business impacts" of the rule.
Environmentalists pointed to the Rove correspondence as evidence that the Bush White House, more than others, has mixed politics with policy decisions that are traditionally left to scientists and career regulators. At the time, Rove oversaw the White House political office and was directing strategy for the 2002 midterm elections. Angelo had been mayor of Midland, Texas, when Bush ran an oil firm there. He is also a longtime hunting partner of Rove's. The two men first worked together when Angelo managed Ronald Reagan's 1980 presidential campaign in Texas.
In an interview Monday, Angelo welcomed the new groundwater rule and said his letter might have made a difference in how it was written. But he waved off environmentalists' questions about Rove's involvement. "I'm sure that his forwarding my letter to people that were in charge of it might have had some impression on them," Angelo said. "It seems to me that it was a totally proper thing to do. I can't see why anybody's upset about it, except of course that it was effective." Asked why he wrote to Rove and not the Environmental Protection Agency or to some other official more directly associated with the matter, Angelo replied: "Karl and I have been close friends for 25 years. So, why wouldn't I write to him? He's the guy I know best in the administration."
White House spokesmen said Monday that the rule was revised as part of the federal government's standard rule-making process. They said the EPA was simply directed by White House budget officials to make the rule comply with requirements laid out by Congress in a sweeping new energy law passed last year. The issue has been a focus of lobbying by the oil and gas industry for years, ever since Clinton administration regulators first announced their intent to require special EPA permits for construction sites smaller than five acres, including oil and gas drilling sites, as a way to discourage water pollution.
Energy executives, who have long complained of being stifled by federal regulations limiting drilling and exploration, sought and received a delay in that permit requirement in 2003. Eventually, Congress granted a permanent exemption that was written into the 2005 energy legislation.
The EPA rule issued Monday adds fine print to that broad exception in ways that critics, including six members of the Senate, say exceeds what Congress intended. For example, the new rule generally exempts sediment — pieces of dirt and other particles that can gum up otherwise clear streams — from regulations governing runoff that may flow from oil and gas production or construction sites.
Sen. James M. Jeffords (I-Vt.), who joined five Democrats in objecting to the rule, wrote in March that there was nothing in the energy law suggesting that such an exclusion of sediment "had even entered the mind of any member of Congress as it considered the Energy Policy Act of 2005." Moreover, Jeffords wrote, the rule violated the intentions of Congress when it passed the Clean Water Act 19 years ago. White House and administration officials disagreed.
At the EPA, Assistant Administrator Benjamin H. Grumbles said the rule responded directly to congressional action. He cited a letter from Sen. James M. Inhofe (R-Okla.), chairman of the Senate Environment and Public Works Committee, endorsing it. He added that the rule still allows states to regulate pollution, and that it continues to regulate sediment that contains "toxic" ingredients.
Lisa Miller, a spokeswoman for another senior lawmaker, Rep. Joe L. Barton (R-Texas), chairman of the House Energy and Commerce Committee, said Monday that the rule was designed to hold oil companies accountable for putting toxic substances in the soil, but not for dirt that results from storms.
"When it rains, storm water gets muddy, regardless of whether there's an oil well in the neighborhood," Miller said. "Congress told EPA to do this, and now they have. If there's oil in the water, a producer has to clean it up. If it's nature, they don't."
The change in the rule occurred last year when staffers in the White House Office of Management and Budget began editing an early version drafted by EPA technical staff. The Office of Management and Budget oversees another division, the Office of Information and Regulatory Policy, which critics complain has served as a central hub in the Bush White House for making government regulations more business-friendly. A spokesman for the White House budget office, Scott Milburn, said Monday that the White House's involvement in making rules was intended to "ensure that agencies issue regulations that follow the law."
White House spokeswoman Dana Perino rejected the suggestion that Rove was involved in the rule change. Rove frequently receives requests, she said, and he tries to reply and direct those requests to the appropriate people. She said that for environmentalists to accuse Rove of manipulating the EPA rule was a "typical overreach" by administration critics. "That is quite an overreach, when it was the United States Congress that passed the Energy Act in a bipartisan way to ask the EPA to undertake this rulemaking," she said.
In their March letter, Jeffords and his Democratic colleagues asked EPA officials whether the correspondence with Rove influenced the final rule. A response written by Grumbles did not directly address the Rove question. But the Natural Resources Defense Council and other environmental groups assert that they know the answer.
"We can't say that Karl Rove walked over to OMB and demanded these changes," said Sharon Buccino, director of the Natural Resources Defense Council's land program. "But it is clear that there was direction coming from the top of the White House, and this was a result of the thinking of the White House as opposed to environmental experts at EPA." Buccino called the rule "yet another example of the Bush administration rewarding their friends in the oil and gas industry at the expense of the environment and the public's health."
In his letter to Rove, Angelo did not hide his political feelings. He thanked Rove for "all you do," and added words of encouragement on another topic: "The president has the opposition on the run on the Iraq issues."
His letter appeared to gain notice at the highest levels of the administration. Three months after Angelo sent it, a top EPA official wrote to tell him that the agency had decided to impose the temporary delay on the construction permitting rule for oil and gas companies. The letter was copied to Rove, White House environmental advisor James L. Connaughton and then-EPA Administrator Christine Todd Whitman.
Florida Downgrades Manatee from Endangered to Threatened
Update: 17 environmental groups have filed a petition to reverse this decision. Gristmill post
The NY Times reports that Florida is downgrading the manatee from endangered to threatened as the U.S. Fish and Wildlife Service considers whether to reclassify it. The boating lobby is saying that they don't want to remove the boating protections the manatee enjoys, but they nonetheless want it downgraded. The development community also supports the downgrading of protection. Am I missing something? Why do they care unless they want to provide less protection? Like the saga of the grizzly and the wolf in the West, this situation highlights the need for measures to protect biodiversity even for species that are not, or are no longer, listed as endangered or threatened.
EU Soil Strategy Redux
SOIL strategy update 6/13/06
The EU delayed release of the soil strategy last week due to objections by the EU Enterprise Commissioner. He seeks to limit contaminated site inventories to transboundary sites and to restrict public access to the inventories. The draft had required member states to identify areas at risk of degradation within five years based on common criteria. Member states would have two years to adopt an action plan with targets to reduce risks. The draft also required national inventories of contaminated sites and remediation strategies, to be made publicly available and reviewed regularly. According to the EU consultation survey of organizations and citizens, contamination is seen as the greatest threat to soil.
original post 6/1/06
On June 7, the European Commission is scheduled to adopt the thematic strategy on soil protection. The strategy calls for creating a framework directive requiring the 25 EU member states to meet soil remediation targets. The framework directive also would require sellers of contaminated land to provide soil reports to potential buyers. The thematic strategy for soil protection includes targets for other threats to soil such as compaction, decline in organic matter, declining biodiversity, erosion, landslides, salinization, and sealing. The strategy calls on member states to create stabilization strategies and action plans. EU link
June 12, 2006
The Science of Global Warming: Alberto Begins the Hurricane Season
Tropical Storm Alberto, the first named storm of the 2006 Atlantic hurricane season, is expected to pick up strength as it heads toward the Gulf Coast of Florida, according to the U.S. National Hurricane Center. Alberto will bring up to 10 inches of rain today and tomorrow. This is the first of many, according to forecasters. And it is accompanied by a flurry of recent research on the relationship between global warming and hurricanes.
June 1 Post -- Hurricane Research Released Just Before Hurricane Season
So, back to the debate. Nature recently published a piece (online 31 May 2006; doi:10.1038/441564a) that highlights the debate:
....As the 2006 hurricane season gets under way in the Atlantic basin, few issues could be hotter than the relationship between global warming and tropical storms. Forecasters predict that things won't be as bad as 2005, which saw a record 28 named storms in the Atlantic and probably more than US$100 billion in damages. But authorities are looking to scientists to tell them whether 2005 is an example of a hurricane season that we will have to get used to.
At first glance, a link between cyclones and global warming seems to make sense. Tropical cyclones are born over the oceans, where masses of rotating air pick up ever more energy from warm surface water. Once the winds in the mass reach 33 metres per second, a tropical cyclone is born. In the northwest Pacific, it's called a typhoon; in the Atlantic and northeast Pacific, a hurricane; elsewhere, a cyclone.
But only recently have scientists come up with the data that suggest global warming makes cyclones more intense. Two major studies laid the groundwork last year. In the first, published in August, atmospheric scientist Kerry Emanuel proposed that hurricanes had grown more intense over the past 30 years, most likely because of increasing sea surface temperatures1. Emanuel, of the Massachusetts Institute of Technology, developed an index to describe how destructive a storm could be, and found that the wrecking power of storms correlated strongly with sea surface temperature.
The second paper2 came in September, soon after Hurricane Katrina had killed more than 1,800 people along the US Gulf Coast. A team led by Peter Webster, of the Georgia Institute of Technology in Atlanta, studied the occurrence of storms rated at the higher end of a strength-categorization scale called the Saffir–Simpson scale.
Hurricanes are ranked from 1 to 5 on the scale: storms with wind speeds reaching 33 metres per second are at the low end of category 1, and the threshold wind speed for a category 5 storm is 67 metres per second (or 241 kilometres per hour). Hurricane Katrina was category 5 when over the Gulf of Mexico, and had weakened to category 3 when it slammed into the Gulf Coast. Webster's team reported that there has been a rise in the number of category 4 and 5 storms in the past 35 years, in nearly all of the world's ocean basins2.
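The article's metre-per-second thresholds can be cross-checked against its kilometres-per-hour figure with a one-line unit conversion (1 m/s = 3.6 km/h). A minimal sketch; the `ms_to_kmh` helper name is my own, not from the article:

```python
# Convert the Saffir-Simpson wind-speed thresholds quoted in the article
# from metres per second to kilometres per hour (1 m/s = 3.6 km/h).
def ms_to_kmh(speed_ms):
    """Convert a wind speed in m/s to km/h."""
    return speed_ms * 3.6

category_1_floor = ms_to_kmh(33)   # ~119 km/h, the low end of category 1
category_5_floor = ms_to_kmh(67)   # ~241 km/h, matching the article's figure

print(f"Category 1 threshold: {category_1_floor:.0f} km/h")
print(f"Category 5 threshold: {category_5_floor:.0f} km/h")
```

The 67 m/s threshold rounds to the 241 kilometres per hour the article quotes.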
Together, the Emanuel and Webster papers kick-started fresh efforts in a previously obscure corner of meteorology. A veritable flood of findings has emerged; some preliminary work was presented in April at a meteorology meeting in Monterey, California3. "We've moved forward immensely since last June," says Greg Holland, a co-author on the Webster paper and a meteorologist at the National Center for Atmospheric Research in Boulder, Colorado.
Yet many research areas remain untapped. One major unknown, experts say, is how hurricanes interact with the ocean — not just form above it. "Right now, almost everyone attacks the problem as hurricanes responding passively to climate change," says Emanuel. "They are active players." Hurricanes leave a trail of cold water in their wake — which is not currently accounted for in most climate models.
Other researchers point to the need to better understand the factors affecting hurricane intensities. They hope that data from last year's 'hurricane-hunter' flights will help4; these involved forecasters flying into the heart of Atlantic hurricanes to find out what drives changes to their intensity. And mysteries still surround the issue of how hurricanes form in the first place. Indeed, one speaker, David Nolan of the University of Miami, Florida, drew a crowd in Monterey with his provocatively titled talk, 'Could hurricanes form from random convection in a warmer world?' (His answer: 'no'.)
Given that more research is obviously needed, how should scientists best direct their efforts to get useful answers as soon as possible? Many echo the adage that the past is the key to the present, and argue that far more time and money need to be spent on paleotempestology — the study of past hurricanes, as recorded in geological deposits. Studies that look back as far as several thousand years ago could help resolve the frequency with which hurricanes form, or at least make landfall in certain regions of the world. These would provide an invaluable measure of 'normal' patterns against which to weigh modern trends.
Perhaps most crucially, meteorologists say, the renewed interest in hurricanes could inspire researchers to work on improving the historical record of storms. It is possible to go back through the database and re-assess each storm with modern eyes, making sure its strength and trajectory are analysed by the same standard as more recent storms. That's what Christopher Landsea, a meteorologist at the National Hurricane Center in Miami, has been doing for the Atlantic hurricane database, which contains measurements on storms dating back to 1850.
Reanalysing past measurements is one thing. But the biggest problem, says Landsea, is in having to work with a lopsided data set. If you knew what was in sausages, you wouldn't want to eat them, he says; likewise the historical record shouldn't be trusted. For instance, Hurricane Wilma garnered headlines last summer when it was recorded to have the lowest central pressure — another measure of storm intensity — of any known hurricane in the Atlantic basin. Yet Wilma "was sampled just about every hour of its existence", says Landsea. Compare that, he says, to a tropical storm such as Carol, which moved up the US eastern seaboard for days in 1954 but was sampled only seven times over its lifetime.
The picture gets even bleaker in the world's other ocean basins. In a recent informal study, Landsea looked through satellite images of storms in the northern Indian Ocean. From these pictures, he estimated that the storms should have been rated as category 4 or 5; they were recorded as being of lower intensities at the time. If these storms are missing from the records, Landsea asks, how is it possible to conclude that hurricane intensities are increasing because of global warming? They could be, he says; it's just impossible to tell.
Webster disagrees: "Chris has found several category 4s and 5s we missed in the early 1970s; he has to find 152 for us to be wrong." And Emanuel adds that, although one can argue about the particular number of hurricanes in a particular year, his measure of hurricane intensity still correlates strongly with sea surface temperature — no matter how many storms there are in a particular year.
The disagreement echoes deep battle lines between several camps, many of which have been re-ignited by the recent studies. A flurry of critiques has appeared in Science and Nature, as well as in the blogosphere. The debate has got personal at times, and few are happy about it.
In one recent paper, longtime climate-change sceptic Patrick Michaels and colleagues argue that rising sea surface temperature no longer affects the intensity of a hurricane once its winds have reached speeds of more than 50 metres per second5.
In another, Philip Klotzbach of Colorado State University in Fort Collins writes that there is no strong correlation between hurricane energy and sea surface temperature in most of the world's ocean basins — and that Webster's and Emanuel's results are due mostly to the patchiness of data sets prior to the mid-1980s (ref. 6). And the Bulletin of the American Meteorological Society has hosted a feisty back-and-forth, spearheaded by policy expert Roger Pielke Jr of the University of Colorado. He calls links between hurricanes and global warming "premature".
For many, the stakes could not be higher. Knowing where and how often storms might strike is crucial for shaping government policies. Exploding populations in coastal zones place ever-greater numbers of people at risk — a fact noted by some policy experts, who say that the apparent increase in hurricane destructiveness seen in the past few years is down to the fact that more people are living in at-risk areas7.
Preliminary studies by other groups seem to bear out Webster's and Emanuel's conclusions. A new study of Indian Ocean hurricanes, presented at the Monterey meeting, suggests that there has indeed been an increase in category 4 and 5 storms in the region — and few Indian Ocean storms are missing from the database. And using a data set of global storms that occurred between 1958 and 2001, scientists from Purdue University in West Lafayette, Indiana, have found the same overall increase in storm destructiveness in recent years — particularly after 1985 (ref. 8).
Other scientists are turning to computer models for possible answers to questions such as: how much will sea surface temperature rise, and exactly how will that influence hurricane formation? Computer models suggest that sea surface temperatures in the Atlantic hurricane-forming region could warm by 2 °C by 2100. They also suggest that if this rise occurs, maximum wind speeds could increase by 6% (ref. 9). It may not sound like much, but damage from hurricanes rises in proportion to the cube of the wind speed.
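The cube relationship makes that 6% figure concrete. A minimal sketch of the arithmetic, with the `damage_increase` helper name my own invention:

```python
# The article states that hurricane damage rises roughly with the cube of
# wind speed, so a modest rise in wind speed compounds into a much larger
# rise in damage.
def damage_increase(wind_speed_ratio):
    """Fractional increase in damage for a given ratio of wind speeds."""
    return wind_speed_ratio ** 3 - 1.0

# A 6% rise in maximum wind speed (ratio 1.06) under the cube relationship:
increase = damage_increase(1.06)
print(f"~{increase:.0%} more damage")  # roughly 19% more damage
```

So the projected 6% increase in maximum wind speeds would translate into nearly a fifth more damage, which is why the figure is less modest than it sounds.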
So far, the world's oceans haven't seen anything close to a 2 °C warming — just a 0.5 °C rise since 1970. "The warming we've seen to date is really just the tip of the iceberg," says Thomas Knutson, a climate modeller at the Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey.
Predicting future hurricane activity will also require greater understanding of how natural climate fluctuations interact with global warming. For example, the El Niño Southern Oscillation, a pattern of temperature fluctuations in the tropical Pacific ocean, can affect the formation of hurricanes in certain regions, as can volcanic eruptions.
Goodbye Cowboys: World Opinion Favors UN Power or Balanced Regional Power
The majority of people in nine major nations (Brazil, China, France, Germany, Great Britain, India, Japan, Russia, and the United States) do not believe that a world system dominated by a single world power is the best framework for ensuring peace and stability in the world. Instead they favor multipolar systems, either led by the United Nations or by a balance of regional leaders. They also disfavor a bipolar system where power is divided between two world powers.
World Public Opinion reports:
Despite their status as the world’s sole super power today, Americans also rejected the model of a world order based on a single world power. Nor did they want to return to a world dominated by two great powers. Instead, they indicated that they would prefer an international system where power was shared among nations. A majority (52%) thought a balance of regional powers was the best framework but a third (33%) said they would like the UN to lead the world. Only ten percent favored either a system led by a single power (6%) or two powers (4%).
These results are consistent with other polls showing that Americans are uncomfortable with their country’s role as the world’s supreme power. A 2004 poll commissioned by the Chicago Council on Foreign Relations and conducted by Knowledge Networks found that 80 percent of Americans agreed the United States was “playing the role of world policeman more than it should be.” Asked to choose the statement closest to their own position, only eight percent said that the United States should “continue to be the preeminent world leader in solving international problems;” 78 percent said instead that the United States should “do its share in efforts to solve international problems together with other countries.”
Among the other eight nations, most also favored some system where power was shared among several nations. The Germans (68%) and the Chinese (51%) were the most enthusiastic about UN leadership. Pluralities also favored the UN in Great Britain (47%) and France (46%) while they supported a balance of regional powers in Brazil (45%) and India (37%). The Russians and the Japanese were more closely divided, with about a third in each country choosing the UN and a third picking a balance of regional powers. But a quarter of the Russians said they preferred a world system dominated by one or two superpowers. And more than a third of the Japanese either did not know which system to pick or chose not to answer the question.
The Blue/Green Alliance
Link: NY Times
Political organizing in the US may be catching up with the rest of the world: the Steelworkers and the Sierra Club are going to tackle global warming together in what they call the Blue/Green Alliance.
The Alliance supports stronger environmental and worker protections in trade agreements and ratification of the Kyoto Protocol. The Blue/Green Alliance will also advocate for higher fuel efficiency standards to combat global warming and air pollution, and to benefit American automobile manufacturers. "The companies that embrace the soundest environmental principles, that move to alternative and renewable forms of energy, those will be the companies that survive," said David Foster, the executive director of the alliance and the steelworkers' regional director for the Northwest. The Sierra Club and the steelworkers support a proposal, known as the Apollo Project, that aims to spur the economy and create jobs by investing $300 billion to create more energy-efficient office buildings, manufacturing techniques and modes of transportation.
Now if they can just get the Teamsters and autoworkers on board!
Ozone, sulfate, and climate: how should we factor this into NAAQS?
NASA continues to publish some interesting air quality research. Take the recent summary of research on the interactions of ozone and sulfate in air pollution and climate change by Dr. Nadine Unger and colleagues. It demonstrates the contribution of ozone to the formation of sulfates, which have severe adverse effects on air quality and moderate the effects of global warming. The take-home message needs to be that, when we model the future CO2 emission reductions required to slow climate change, we need to assume that we will effectively control sulfate emissions -- so we need to reduce CO2 emissions even more. The mistaken message that can be heard is that emitting sulfate precursors is really "good."
In two recent studies, we describe how emission of ozone precursor gases (gases which react to form ozone) can dramatically affect both air quality and climate forcing by increasing the levels of tropospheric sulfate. Like many of their precursors, ozone and sulfate are pollutants that can detrimentally affect climate, agriculture, and human health. However, they act differently on the climate, as ozone tends to warm the planet while sulfate cools it.
Ozone and sulfate aerosol are formed in the atmosphere from chemical reactions involving gases such as sulfur dioxide, carbon monoxide and methane, which are emitted by both natural and human sources, the latter including automobile traffic, power generation, industry and agriculture.
Many of the reactions and molecules involved in the formation of sulfate and ozone overlap. Sulfate is generated by the oxidation of sulfur dioxide by the hydroxyl radical or by hydrogen peroxide, both of which can be derived from ozone. Likewise, ozone production requires the presence of nitrogen oxides, which sulfate can remove by conversion to nitric acid.
In the future, man-made emissions of the precursor gases will change as more nations industrialize, other nations implement emissions control strategies, and world population grows, leading to changes in the amount of pollution that people are exposed to. We used the GISS ModelE to simulate a future Earth atmosphere based on a middle-of-the-road projection of man-made precursor emissions to simulate levels of air pollution in the future and to investigate how the interaction between sulfate and ozone might affect future climate changes.
Figure 1 shows the percentage change in annual average sulfate aerosol and ozone air pollution at the Earth's surface by 2030. There are large increases in pollution in subtropical regions, especially Asia. Over the Indian subcontinent the surface sulfate aerosol amount changes from around 400 pptv in the present day to around 2000 pptv at 2030 and the surface level ozone increases from around 35 ppbv to 60 ppbv. Such large increases in sulfate aerosol and ozone pollution could have serious social and economic impacts across the Indian subcontinent.
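The surface-level numbers quoted for the Indian subcontinent can be restated as the kind of percentage changes Figure 1 maps. A small sketch of the arithmetic; the `percent_change` helper is mine, not from the paper:

```python
# Restate the quoted present-day and projected-2030 surface concentrations
# as percentage changes, the quantity Figure 1 reports.
def percent_change(present, future):
    """Percentage change from a present-day level to a projected future level."""
    return (future - present) / present * 100

sulfate_change = percent_change(400, 2000)  # pptv over the Indian subcontinent
ozone_change = percent_change(35, 60)       # ppbv over the Indian subcontinent

print(f"sulfate: +{sulfate_change:.0f}%")  # a fivefold rise, i.e. +400%
print(f"ozone:   +{ozone_change:.0f}%")    # roughly +71%
```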
Next we calculated how much of the sulfate aerosol increase is due to the change in ozone precursor emissions alone and find the influence to be surprisingly large. Figure 2 shows the amount of the future surface sulfate aerosol that comes from changes in ozone precursor emissions only. Increases in ozone precursor emissions contribute about 10% to surface sulfate increases over the Middle East, North Africa and the most developed parts of South America, but the largest influence occurs over the Indian subcontinent, where the surface sulfate is 20% greater as a result of the future emissions-driven increases in ozone precursors.
Moreover, ozone precursor emissions also contribute 20% of the negative sulfate forcing over India, which is more than twice the direct positive forcing of ozone itself. In contrast, changes in sulfate precursor emissions do not significantly affect future ozone levels.
This new insight, that ozone precursors have a surprisingly large influence on air quality via sulfate and that their overall climate impact may be opposite to the conventional view, is of direct relevance to regulatory policy. The interconnection between ozone and sulfate can complicate environmental efforts, as a reduction of ozone precursors would improve surface air quality, but also impose additional positive forcing via sulfate reduction. Our results suggest that future regulations should address ozone and sulfate simultaneously, which they do not currently do, as well as consider both air quality and climate.
- Unger, N., D.T. Shindell, D.M. Koch, and D.G. Streets 2006. Cross influences of ozone and sulfate precursor emissions changes on air quality and climate. Proc. Natl. Acad. Sci. 103, 4377-4380, doi:10.1073/pnas.0508769103.
- Unger, N.B., D.T. Shindell, D.M. Koch, M. Amann, J. Cofala, and D.G. Streets 2006. Influences of man-made emissions and climate changes on tropospheric ozone, methane and sulfate at 2030 from a broad range of possible futures. J. Geophys. Res., in press.