RACHEL'S ENVIRONMENT & HEALTH WEEKLY

SURVEYED SCIENTISTS AGREE GLOBAL WARMING IS REAL

Ninety-seven percent of climatologists but only 64 percent of meteorologists say humans are contributing to global warming; a majority of petroleum geologists dissent.

Human-induced global warming is real, according to a recent U.S. survey based on the opinions of 3,146 scientists. However, divisions remain between climatologists and scientists from other branches of the earth sciences over the extent of human responsibility.

A survey of more than 3,000 scientists found that the vast majority believe humans cause global warming.

Against a backdrop of harsh winter weather across much of North America and Europe, the concept of rising global temperatures might seem incongruous.

However, the results of the investigation, conducted at the end of 2008, reveal that the vast majority of the Earth scientists surveyed agree that in the past 200-plus years mean global temperatures have been rising, and that human activity is a significant contributing factor.

The study, released today, was conducted by academics from the University of Illinois using a nine-question online survey. The scientists approached were listed in the 2007 edition of the American Geological Institute's Directory of Geoscience Departments.

Two questions were key: Have mean global temperatures risen compared to pre-1800s levels, and has human activity been a significant factor in changing mean global temperatures?

About 90 percent of the scientists answered yes to the first question, and 82 percent to the second.

The strongest consensus on the causes of global warming came from climatologists who are active in climate research, with 97 percent agreeing humans play a role.

Petroleum geologists and meteorologists were among the biggest doubters, with only 47 percent and 64 percent, respectively, believing in human involvement.

Copyright 2009 Cable News Network


From: Vancouver (British Columbia) Sun, Jan. 18, 2009

CLIMATE WARMING 'HIGHLY UNUSUAL' SAYS NEW STUDY

Findings counter argument that melt is part of climate cycle

By Randy Boswell, Canwest News Service

A major U.S. government report on Arctic climate, prepared with information from eight Canadian scientists, has concluded that the recent rapid warming of polar temperatures and shrinking of multi-year Arctic sea ice are "highly unusual compared to events from previous thousands of years."

The findings, released Friday, counter suggestions from skeptics that such recent events as the opening of the Northwest Passage and collapse of ice shelves in the Canadian Arctic are predictable phenomena that can be explained as part of a natural climate cycle rather than being driven by elevated carbon emissions from human activity.

A summary of the report -- described as "the first comprehensive analysis of the real data we have on past climate conditions in the Arctic," by U.S. Geological Survey director Mark Myers -- warns that "sustained warming of at least a few degrees" is probably enough "to cause the nearly complete, eventual disappearance of the Greenland ice sheet, which would raise sea level by several metres."

The study also sounds the alarm that "temperature change in the Arctic is happening at a greater rate than other places in the Northern Hemisphere, and this is expected to continue in the future. As a result, glacier and ice-sheet melting, sea-ice retreat, coastal erosion and sea-level rise can be expected to continue."

Ice cover in the Canadian Arctic and throughout the polar world has experienced record-setting melts in the past few years. The summer of 2007 saw polar ice cover shrink to its lowest extent in recorded history. Last summer's melt came close to matching that record, and recent research indicates that overall ice volume -- because of the continual replacement of thicker, multi-year ice with thinner new ice -- was lower in 2008 than 2007.

This past summer also saw further dramatic evidence of the unusual warming of the Canadian Arctic, including record-setting high temperatures in Iqaluit, Nunavut, rapid erosion and flooding of a glacial landscape on Baffin Island, the re-opening of the Northwest Passage, an unprecedented clearing of ice from the Beaufort Sea and the collapse of hundreds of square kilometres of ancient ice shelves on Ellesmere Island.

Research for the U.S. Congress-commissioned report was conducted by 37 scientists from the U.S., Germany, Canada, Britain and Denmark.

"The current rate of human-influenced Arctic warming is comparable to peak natural rates documented by reconstructions of past climates. However, some projections of future human-induced change exceed documented natural variability," the scientists conclude. "The past tells us that when thresholds in the climate system are crossed, climate change can be very large and very fast. We cannot rule out that human-induced climate change will trigger such events in the future."

Copyright (c) The Vancouver Sun


From: The Guardian (Manchester, U.K.), Jan. 18, 2009

'ONLY FOUR YEARS LEFT TO ACT ON CLIMATE CHANGE,' JIM HANSEN WARNS

"We have only four years left to act on climate change -- America has to lead."

By Robin McKie, Science Editor

Along one wall of Jim Hansen's wood-panelled office in upper Manhattan, the distinguished climatologist has pinned 10 A4-sized photographs of his three grandchildren: Sophie, Connor and Jake. They are the only personal items on display in an office otherwise dominated by stacks of manila folders, bundles of papers and cardboard boxes filled with reports on climate variations and atmospheric measurements.

The director of Nasa's Goddard Institute for Space Studies in New York is clearly a doting grandfather as well as an internationally revered climate scientist. Yet his pictures are more than mere expressions of familial love. They are reminders to the 67-year-old scientist of his duty to future generations, children whom he now believes are threatened by a global greenhouse catastrophe that is spiralling out of control because of soaring carbon dioxide emissions from industry and transport.

"I have been described as the grandfather of climate change. In fact, I am just a grandfather and I do not want my grandchildren to say that grandpa understood what was happening but didn't make it clear," Hansen said last week. Hence his warning to Barack Obama, who will be inaugurated as US president on Tuesday. His four-year administration offers the world a last chance to get things right, Hansen said. If it fails, global disaster -- melted sea caps, flooded cities, species extinctions and spreading deserts -- awaits mankind.

"We cannot now afford to put off change any longer. We have to get on a new path within this new administration. We have only four years left for Obama to set an example to the rest of the world. America must take the lead."

After eight years of opposing moves to combat climate change, thanks to the policies of President George Bush, the US had given itself no time for manoeuvre, he said. Only drastic, immediate change can save the day and those changes proposed by Hansen -- who appeared in Al Gore's An Inconvenient Truth and is a winner of the World Wildlife Fund's top conservation award -- are certainly far-reaching. In particular, the idea of continuing with "cap-and-trade" schemes, which allow countries to trade allowances and permits for emitting carbon dioxide, must now be scrapped, he insisted. Such schemes, encouraged by the Kyoto climate treaty, were simply "weak tea" and did not work. "The United States did not sign Kyoto, yet its emissions are not that different from the countries that did sign it."

Thus plans to include carbon trading schemes in talks about future climate agreements were a desperate error, he said. "It's just greenwash. I would rather the forthcoming Copenhagen climate talks fail than we agree to a bad deal," Hansen said.

Only a carbon tax, agreed by the west and then imposed on the rest of the world through political pressure and trade tariffs, would succeed in the now-desperate task of stopping the rise of emissions, he argued. This tax would be imposed on oil corporations and gas companies and would raise the prices of fuels across the globe, making their use less attractive. In addition, the mining of coal -- by far the worst emitter of carbon dioxide -- would be phased out entirely, along with coal-burning power plants, which he called "factories of death."

"Coal is responsible for as much atmospheric carbon dioxide as other fossil fuels combined and it still has far greater reserves. We must stop using it." Instead, programmes for building wind, solar and other renewable energy plants should be given major boosts, along with research programmes for new generations of nuclear reactors.

Hansen's strident calls for action stem from his special view of our changing world. He and his staff monitor temperatures relayed to the institute -- an anonymous brownstone near Columbia University -- from thousands of sites around the world, including satellites and bases in Antarctica. These have revealed that our planet has gone through a 0.6C rise in temperature since 1970, with the 10 hottest years having occurred between 1997 and 2008: unambiguous evidence, he believes, that Earth is beginning to overheat dangerously.

Last week, however, Hansen revealed his findings for 2008 which show, surprisingly, that last year was the coolest this century, although still hot by standards of the 20th century. The finding will doubtless be seized on by climate change deniers, for whom Hansen is a particular hate figure, and used as "evidence" that global warming is a hoax.

However, deniers should show caution, Hansen insisted: most of the planet was exceptionally warm last year. Only a strong La Niña -- a vast cooling of the Pacific that occurs every few years -- brought down the average temperature. La Niña would not persist, he said. "Before the end of Obama's first term, we will be seeing new record temperatures. I can promise the president that."

Hansen's uncompromising views are, in some ways, unusual. Apart from his senior Nasa post, he holds a professorship in environmental sciences at Columbia and dresses like a tweedy academic: green jumper with elbow pads, cords and check cotton shirt. Yet behind his unassuming, self-effacing manner, the former planetary scientist has shown surprising steel throughout his career. In 1988, he electrified a congressional hearing, on a particularly hot, sticky day in June, when he announced he was "99% certain" that global warming was to blame for the weather and that the planet was now in peril from rising carbon dioxide emissions. His remarks, which made headlines across the US, pushed global warming on to news agendas for the first time.

Over the years, Hansen persisted with his warnings. Then, in 2005, he gave a talk at the American Geophysical Union in which he argued that the year was the warmest on record and that industrial carbon emissions were to blame. A furious White House phoned Nasa and Hansen was banned from appearing in newspapers or on television or radio. It was a bungled attempt at censorship. Newspapers revealed that Hansen was being silenced and his story, along with his warnings about the climate, got global coverage.

Since then Hansen has continued his mission "to make clear" the dangers of climate change, sending a letter last December from himself and his wife Anniek about the urgency of the planet's climatic peril to Barack and Michelle Obama. "We decided to send it to both of them because we thought there may be a better chance she will think about this or have time for it. The difficulty of this problem [of global warming] is that its main impacts will be felt by our children and by our grandchildren. A mother tends to be concerned about such things."

Nor have his messages of imminent doom been restricted to US politicians. The heads of the governments of Britain, Germany, Japan and Australia have all received recent warnings from Hansen about their countries' behaviour. In each case, these nations' continued support for the burning of coal to generate electricity has horrified the climatologist. In Britain, he has condemned the government's plans to build a new coal plant at Kingsnorth, in Kent, for example, and even appeared in court as a defence witness for protesters who occupied the proposed new plant's site in 2007.

"On a per capita basis, Britain is responsible for more of the carbon dioxide now in the atmosphere than any other nation on Earth because it has been burning it from the dawn of the Industrial Revolution. America comes second and Germany third. The crucial point is that Britain could make a real difference if it said no to Kingsnorth. That decision would set an example to the rest of the world." These points were made clear in Hansen's letter to the prime minister, Gordon Brown, though he is still awaiting a reply.

As to the specific warnings he makes about climate change, these concentrate heavily on global warming's impact on the ice caps in Greenland and Antarctica. These are now melting at an alarming rate and threaten to increase sea levels by one or two metres over the century, enough to inundate cities and fertile land around the globe.

The issue was simple, said Hansen: would each annual addition of carbon dioxide to the atmosphere produce a simple, proportional increase in temperature, or would its heating start to accelerate?

He firmly believes the latter. As the Arctic's sea-ice cover decreases, less and less sunlight will be reflected back into space. And as tundras heat up, more and more of their carbon dioxide and methane content will be released into the atmosphere. Thus each added tonne of carbon will trigger greater rises in temperature as the years progress. The result will be massive ice cap melting and sea-level rises of several metres: enough to devastate most of the world's major cities.

"I recently lunched with Martin Rees, president of the Royal Society, and proposed a joint programme to investigate this issue as a matter of urgency, in partnership with the US National Academy of Sciences, but nothing has come of the idea, it would seem," he said.

Hansen is used to such treatment, of course, just as the world of science has got used to the fact that he is as persistent as he is respected in his work and will continue to press his cause: a coal-power moratorium and an investigation of ice-cap melting.

The world was now in "imminent peril", he insisted, and nothing would quench his resolve in spreading the message. It is the debt he owes his grandchildren, after all.

========================================================

The climate in figures

** The current level of carbon dioxide in the atmosphere is 385 parts per million. This compares with a figure of some 315ppm around 1960.

** Carbon dioxide is a greenhouse gas that can persist for hundreds of years in the atmosphere, absorbing infrared radiation and heating the atmosphere.

** The Intergovernmental Panel on Climate Change's last report states that 11 of the 12 years from 1995 to 2006 rank among the 12 warmest years on record since 1850.

** According to Jim Hansen, the nation responsible for putting the largest amount of carbon dioxide in the atmosphere is Britain, on a per capita basis -- because the Industrial Revolution started there. China is now the largest annual emitter of carbon dioxide.

** Most predictions suggest that global temperatures will rise by 2C [3.6 F.] to 4C [7.2 F.] over the century.

** The IPCC estimates that rising temperatures will melt ice and cause ocean water to heat up and increase in volume. This will produce a sea-level rise of between 18 centimetres [7 inches] and 59 centimetres [23 inches]. However, some predict a far faster rate of around one to two metres [3.3 ft. to 6.6 ft.].

** Inundations of one or two metres would make the Nile Delta and Bangladesh uninhabitable, along with much of south-east England, Holland and the east coast of the United States.


From: New Scientist (pg. 30), Jan. 21, 2009

ONE LAST CHANCE TO SAVE MANKIND

An interview with James Lovelock, originator of the Gaia hypothesis.

By Gaia Vince

With his 90th birthday in July, a trip into space scheduled for later in the year and a new book out next month, 2009 promises to be an exciting time for James Lovelock. But the originator of the Gaia theory, which describes Earth as a self-regulating planet, has a stark view of the future of humanity. He tells Gaia Vince we have one last chance to save ourselves -- and it has nothing to do with nuclear power.

GV: Your work on atmospheric chlorofluorocarbons led eventually to a global CFC ban that saved us from ozone-layer depletion. Do we have time to do a similar thing with carbon emissions to save ourselves from climate change?

JL: Not a hope in hell. Most of the "green" stuff is verging on a gigantic scam. Carbon trading, with its huge government subsidies, is just what finance and industry wanted. It's not going to do a damn thing about climate change, but it'll make a lot of money for a lot of people and postpone the moment of reckoning. I am not against renewable energy, but to spoil all the decent countryside in the UK with wind farms is driving me mad. It's absolutely unnecessary, and it takes 2500 square kilometres to produce a gigawatt -- that's an awful lot of countryside.

GV: What about work to sequester carbon dioxide?

JL: That is a waste of time. It's a crazy idea -- and dangerous. It would take so long and use so much energy that it will not be done.

GV: Do you still advocate nuclear power as a solution to climate change?

JL: It is a way for the UK to solve its energy problems, but it is not a global cure for climate change. It is too late for emissions reduction measures.

GV: So are we doomed?

JL: There is one way we could save ourselves and that is through the massive burial of charcoal. It would mean farmers turning all their agricultural waste -- which contains carbon that the plants have spent the summer sequestering -- into non-biodegradable charcoal, and burying it in the soil. Then you can start shifting really hefty quantities of carbon out of the system and pull the CO2 down quite fast.

GV: Would it make enough of a difference?

JL: Yes. The biosphere pumps out 550 gigatonnes of carbon yearly; we put in only 30 gigatonnes. Ninety-nine per cent of the carbon that is fixed by plants is released back into the atmosphere within a year or so by consumers like bacteria, nematodes and worms. What we can do is cheat those consumers by getting farmers to burn their crop waste at very low oxygen levels to turn it into charcoal, which the farmer then ploughs into the field. A little CO2 is released but the bulk of it gets converted to carbon. You get a few per cent of biofuel as a by-product of the combustion process, which the farmer can sell. This scheme would need no subsidy: the farmer would make a profit. This is the one thing we can do that will make a difference, but I bet they won't do it.
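[Editor's note: Lovelock's two round figures above imply the scale of the scheme he describes. Taking them at face value (they are his numbers, not measured data), human emissions come to about one-twentieth of the carbon plants fix each year, so diverting roughly that share of plant growth into permanently buried charcoal would balance the books. A minimal sketch in Python:]

    # Back-of-envelope check using only the figures Lovelock quotes above.
    BIOSPHERE_FLUX_GT = 550.0   # carbon fixed by plants per year (Lovelock)
    HUMAN_EMISSIONS_GT = 30.0   # carbon added by humans per year (Lovelock)

    # Share of the annually fixed carbon that would have to be locked away
    # as charcoal to offset human emissions, assuming burial is permanent:
    offset_fraction = HUMAN_EMISSIONS_GT / BIOSPHERE_FLUX_GT
    print(f"required diversion: {offset_fraction:.1%}")   # -> 5.5%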

GV: Do you think we will survive?

JL: I'm an optimistic pessimist. I think it's wrong to assume we'll survive 2 deg. C [3.6 deg. F.] of warming: there are already too many people on Earth. At 4 deg. C [7.2 deg. F.] we could not survive with even one-tenth of our current population. The reason is we would not find enough food, unless we synthesised it. Because of this, the cull during this century is going to be huge, up to 90 per cent. The number of people remaining at the end of the century will probably be a billion or less. It has happened before: between the ice ages there were bottlenecks when there were only 2000 people left. It's happening again.

I don't think humans react fast enough or are clever enough to handle what's coming up. Kyoto was 11 years ago. Virtually nothing's been done except endless talk and meetings.

GV: It's a depressing outlook.

JL: Not necessarily. I don't think 9 billion is better than 1 billion. I see humans as rather like the first photosynthesisers, which when they first appeared on the planet caused enormous damage by releasing oxygen -- a nasty, poisonous gas. It took a long time, but it turned out in the end to be of enormous benefit. I look on humans in much the same light. For the first time in its 3.5 billion years of existence, the planet has an intelligent, communicating species that can consider the whole system and even do things about it. They are not yet bright enough, they have still to evolve quite a way, but they could become a very positive contributor to planetary welfare.

GV: How much biodiversity will be left after this climatic apocalypse?

JL: We have the example of the Palaeocene-Eocene Thermal Maximum event 55 million years ago. About the same amount of CO2 was put into the atmosphere as we are putting in and temperatures rocketed by about 5 deg. C over about 20,000 years. The world became largely desert. The polar regions were tropical and most life on the planet had the time to move north and survive. When the planet cooled they moved back again. So there doesn't have to be a massive extinction. It's already moving: if you live in the countryside as I do you can see the changes, even in the UK.

GV: If you were younger, would you be fearful?

JL: No, I have been through this kind of emotional thing before. It reminds me of when I was 19 and the second world war broke out. We were very frightened but almost everyone was so much happier. We're much better equipped to deal with that kind of thing than long periods of peace. It's not all bad when things get rough. I'll be 90 in July, I'm a lot closer to death than you, but I'm not worried. I'm looking forward to being 100.

GV: Are you looking forward to your trip into space this year?

JL: Very much. I've got my camera ready!

GV: Do you have to do any special training?

JL: I have to go in the centrifuge to see if I can stand the g-forces. I don't anticipate a problem because I spent a lot of my scientific life on ships out on rough oceans and I have never been even slightly seasick so I don't think I'm likely to be space sick. They gave me an expensive thallium-201 heart test and then put me on a bicycle. My heart was performing like an average 20-year-old, they said.

GV: I bet your wife is nervous.

JL: No, she's cheering me on. And it's not because I'm heavily insured, because I'm not.

==============

James Lovelock is a British chemist, inventor and environmentalist. He is best known for formulating the controversial Gaia hypothesis in the 1970s, which states that organisms interact with and regulate Earth's surface and atmosphere. Later this year he will travel to space as Richard Branson's guest aboard Virgin Galactic's SpaceShipTwo. His latest book, The Vanishing Face of Gaia, is published by Basic Books in February.

Copyright Reed Business Information Ltd.


SLOUCHING TOWARD GOLGOTHA

[Rachel's introduction: "I can't understand why there aren't rings of young people blocking bulldozers and preventing them from constructing coal-fired power plants." -- Al Gore]

By Peter Montague

Most of my friends want to deny it, but the evidence is compelling: the U.S. and Europe are aggressively advancing the only real plan they've ever had for "solving" the global warming problem. Their plan -- their only published plan -- is to capture carbon dioxide (CO2) gas, compress it into a liquid, and pump it a mile below ground, hoping it will stay there forever. It will be the largest hazardous waste disposal program ever undertaken. Sometimes the plan is called CCS (short for "carbon capture and sequestration") but mostly it's known by its gimmicky PR name "clean coal."

On paper, the plan seems simple enough: Bury trillions of tons of hazardous CO2 in the ground. They tell us it will work even though it's never been tested. But what if they're wrong? What if it leaks? If that happens, they've got no Plan B. Sorry, kids, we used up your world.

The U.S. and Europe have painted the whole planet into a corner: by denying or ignoring global warming science for more than 20 years and refusing to take precautionary action, political "leaders" have allowed the problem to grow so large that it now threatens the future of civilization.

To be cynically frank, the CCS plan has three big things going for it:

** First, after the stuff is pumped underground, it will be out of sight and out of mind, no one will know for sure where it is, and there will be no way to get it back. Problem solved. If it starts to leak out a few miles away from the injection site and the leakage is somehow miraculously discovered, chances are that nothing can be done about it, so we might as well forget the whole thing. It's a done deal, so eat, drink, and be merry -- just as we've been doing for the past 30 years.

** Second, with CCS as our "solution," no one important has to change anything they're now doing -- the coal, oil, automobile, railroad, mining and electric power corporations can continue on their present path undisturbed -- and no doubt they will reward Congress handsomely for being so "reasonable." Everyone knows that's how the system works. No one even bothers to deny it.

** Third, CCS cannot actually be tested; it will always require a leap of faith. Even though the goal is to keep CO2 buried in the ground forever, in human terms any test will have to end on some particular day in the not-too-distant future. On that day the test will be declared a "success" -- but leakage could start the following day. So, given the goal of long-term storage, no short-term test can ever prove conclusive. CCS will always rest on a foundation of faith; and, in the absence of conclusive tests, those with the greatest persuasive powers ($$) have the upper hand.

Two weeks ago the Germans inaugurated the world's first coal-fired power plant designed to bury its CO2 in the ground as an experiment. As New Scientist magazine told us last March, "In Germany, only CCS can make sense of an energy policy that combines a large number of new coal-fired power stations with plans for a 40 per cent cut in CO2 emissions by 2020." In other words, the Germans hitched their wagon to a CCS solution long before they designed the first experiment to see if it could work. With the future of the German economy dependent on the outcome, it seems unlikely that this first little experiment will be announced as a failure. Like us, the Germans are playing Russian roulette with the future of the planet.

This week saw several new developments:

** A study published in the Proceedings of the National Academy of Sciences clarified that our past carbon emissions have already committed the world to an unavoidable temperature rise of 4.3 degrees Fahrenheit (2.4 C.) -- with the true number perhaps as low as 2.5 Fahrenheit (1.4 C.) or as high as 7.7 Fahrenheit (4.3 C.). This is global warming that is already "in the system" and cannot be reversed no matter what we do. One degree Fahrenheit (0.6 C.) of this "committed warming" has already occurred; the other 3.3 Fahrenheit (1.8 C.) will build up as the century unfolds. It's likely to be very unpleasant and very costly but it's already too late to do anything about it. Sorry, kids. Perhaps a little humor can make us feel better (this from the New York Times, June 1, 2008):

Three words from our elders: We are toast.
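[Editor's note: the paired Fahrenheit and Celsius figures in the item above are simple conversions. Temperature differences convert with a factor of 9/5 and no 32-degree offset, since the offset cancels when two temperatures are subtracted. A quick check in Python:]

    # Convert the study's Celsius warming figures to Fahrenheit.
    def dc_to_df(delta_c):
        """Convert a temperature change (not an absolute temperature)."""
        return delta_c * 9.0 / 5.0

    for dc in (2.4, 1.4, 4.3, 0.6, 1.8):
        print(f"{dc} C -> {dc_to_df(dc):.2f} F")
    # 2.4 C -> 4.32 F; 1.4 C -> 2.52 F; 4.3 C -> 7.74 F; 0.6 C -> 1.08 F.
    # The item's "3.3 Fahrenheit" for the remaining warming is 4.3 minus 1.0;
    # a direct conversion of 1.8 C gives 3.24 F, so the figures agree to
    # within rounding.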

** Another important study came out this week, this one from the American Physical Society -- the professional association for the nation's 46,000 physicists. It made a couple of really crucial points:

1. In case you had any lingering doubts, it said the physics and chemistry behind the human causes of climate change -- such as heat-trapping pollution from the burning of fossil fuels -- is "well-understood and beyond dispute."

2. It said the need for action now is "urgent." But what kind of action? Burying carbon dioxide in the ground? No.

"The bottom line is that the quickest way to do something about America's use of energy is through energy efficiency," said Burton Richter, the chairman of the study panel and a 1976 Nobel Prize winner in physics. "Energy that you don't use is free. It's not imported and it doesn't emit any greenhouse gases. Most of the things we recommend don't cost anything to the economy. The economy will save money."

** Of course Democrats in Congress joined Republicans in ignoring the advice of the nation's 46,000 physicists. Instead, they voted to end the 26-year-old ban on drilling for oil in the nation's coastal waters (both Atlantic and Pacific coasts, plus they ended a ban on oil shale drilling in the Rocky Mountain states) -- thus promising to prolong and worsen the global warming problem. Sorry, kids.

** The U.S. Department of Energy (DOE) announced this week it is offering an $8 billion subsidy for "clean coal" demonstrations. (As a sign of the appalling collapse of governmental independence, the DOE is now parroting the coal industry's loony slogan, "clean coal.") The coal companies are unwilling to put up their own money to start burying CO2 in the ground, so Uncle Sam is using our money to do it. Actually, federal money is increasingly borrowed these days, so it is actually our children's money that is funding this game of Russian roulette with the future of the planet. A double whammy. Sorry, kids.

** Next week the U.S. Environmental Protection Agency (EPA) holds the first of two public hearings on its proposed "regulation" of carbon capture and sequestration (CCS) -- in Chicago Sept. 30 and Denver Oct. 2. But what's the point? EPA has already announced that CCS is a splendid idea. The agency's CCS web site says (evidently with a straight face), "With proper site selection and management, geologic sequestration could play a major role in reducing emissions of CO2." (As we saw a couple of weeks ago, to sequester even 10% of today's CO2 would require an infrastructure of pipelines and chemical processing plants larger than the entire global petroleum industry; see the rough check at the end of this item. Who's going to "properly" manage such a kluged-together behemoth? EPA? DOE? Perhaps the wizards of Wall Street?)

Despite the absence of experiments, demonstrations or data, the EPA chief is already firmly on board the CCS Express. Stephen L. Johnson said in 2007 [2.4 Mbyte PDF], "By harnessing the power of geologic sequestration technology, we are entering a new age of clean energy where we can be both good stewards of the Earth, and good stewards of the American economy." Clearly, we cannot look to EPA for careful scrutiny of this untried technology, on which we are betting the farm.

No, the U.S. Environmental Protection Agency (EPA) has already cast aside all doubts about CCS and is prancing with pom-poms -- ready to bet the future of humankind on this untested and untestable technology. Those of us who were around -- and were even naively enthusiastic -- when EPA was created by Richard Nixon back in 1970 can only say, with genuine shame and regret, "Sorry, kids."
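[Editor's note: the parenthetical above, comparing a 10% sequestration program to the petroleum industry, can be sanity-checked with round numbers. Every input below is an assumed, 2008-vintage estimate supplied for illustration, not a figure from the article:]

    # Order-of-magnitude comparison: the mass and volume of CO2 to be moved
    # under a 10% capture scenario vs. the oil industry's annual throughput.
    GLOBAL_CO2_GT = 30.0    # assumed: fossil-fuel CO2 emissions per year, ~2008
    CAPTURE_SHARE = 0.10    # the 10% scenario in the text
    CO2_DENSITY = 0.7       # assumed: tonnes/m3 for supercritical pipeline CO2

    GLOBAL_OIL_GT = 4.0     # assumed: roughly 85 million barrels of crude/day
    OIL_DENSITY = 0.85      # assumed: tonnes/m3 for typical crude

    co2_gt = GLOBAL_CO2_GT * CAPTURE_SHARE    # 3 Gt of CO2 per year
    # Dividing gigatonnes by a density in t/m3 gives cubic kilometres.
    print(f"CO2 to bury: {co2_gt:.0f} Gt/yr = {co2_gt / CO2_DENSITY:.1f} km3/yr")
    print(f"Oil pumped:  {GLOBAL_OIL_GT:.0f} Gt/yr = {GLOBAL_OIL_GT / OIL_DENSITY:.1f} km3/yr")
    # -> about 4.3 vs 4.7 km3/yr: even the 10% scenario means handling a
    #    fluid stream comparable to the whole oil industry's annual flow.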

** This week, Al Gore once again called for civil disobedience to stop the construction of new coal plants. The New York Times reports that Gore told an audience in New York September 24, 2008:

"If you're a young person looking at the future of this planet and looking at what is being done right now, and not done, I believe we have reached the stage where it is time for civil disobedience to prevent the construction of new coal plants that do not have carbon capture and sequestration." Since no coal plants have carbon capture and sequestration, Mr. Gore was calling for an end to all coal plants, as he soon made clear:

According to the Times, "Mr. Gore said the civil disobedience should focus on 'stopping the construction of new coal plants,' which he said would add tons of carbon dioxide to the atmosphere -- despite 'half a billion dollars' worth of advertising by the coal and gas industry' claiming otherwise. He added, 'Clean coal does not exist.'"

"Clean coal does not exist." Now there's a refreshing blast of simple honesty. The phrase "clean coal" was invented as a public relations gambit by the coal industry to bamboozle regulators and legislators into approving construction of new dirty coal plants. If the deciders can be convinced that some day a fancy end-of-pipe "clean coal" filter might be tacked onto today's dirty coal plants, then imaginary "clean coal" takes on an important reality: it becomes the crucial gimmick that allows more dirty coal plants to be built today even though everyone acknowledges they are destroying our future. And if the "clean coal" filter never materializes because it turns out to be too complicated or too unreliable or too costly or too leaky? Sorry, kids.

Yes, kids, the system is rigged. The fossil corporations claim the right to burn all the fossil fuels they own, no matter the cost to the rest of us. And of course they've got the state violence apparatus on their side (judges, police, national guard). To accomplish their goal, they have paid off Congress -- Republicans and Democrats (all perfectly legal, of course, through "campaign contributions"). And yes, all of them know your future is being sacrificed, but they don't care. They simply don't care.

But the system has been rigged before. It was rigged against all people of color, against women, against workers, and against children chained to machines in "dark, satanic mills." But in each of those cases, people marched; they picketed; they demonstrated; they took to the streets in hordes; they stuck wooden shoes into the gears of the industrial machine; they flushed pocket combs down toilets to stop up the works; they stashed stinking fish in safe deposit boxes to unnerve the bankers; they sat in restaurants and public libraries and college offices and industrial workplaces and they refused to budge; they conducted strikes and walk-outs and they sat down on the job; they faced dogs and fire hoses and guns and clubs and jail; they chained themselves to fences, they prayed, they sang; in short, they got courageous and creative and obstreperous and disobedient. They pushed the system until it fell over and changed.

In sum, they refused to allow their future to be crucified on the altar of the almighty Dollar.

And now that spirit is rising again.

Item: In early July a dozen protestors shut down rush hour traffic in Richmond, Virginia to protest Dominion Virginia Power's plan for a new $1.8 billion coal plant in Wise, in southern Virginia.

Item: Protestors sat in at the headquarters of AMP-Ohio in Columbus July 8 opposing the construction of a coal plant in Meigs County. Eight people occupied the headquarters lobby while another 40 people pressed against the door to the building, obstructing the entrance. Police said they maced "about 20 people," but denied accusations that they had used tasers.

Item: In late July an estimated 500 activists gathered in Coburg, Oregon for a week-long "climate action convergence camp" aimed at "low-impact living and high-impact action" -- learning a more sustainable lifestyle and successful protest tactics, including civil disobedience. Similar convergence camps were reportedly going on this summer in New York, Virginia, England, Germany, Australia, Denmark, Russia, and New Zealand.

Item: In August, 50 protesters marched noisily through downtown Richmond, Va., on their way to the headquarters of Massey Energy -- the nation's second-largest coal corporation.

Item: In early September six Greenpeace protestors were acquitted by a jury in England, despite having caused an estimated $76,000 in damage to the Kingsnorth power station in Kent. During an eight-day trial, the six Greenpeacers argued that they were justified in shutting down coal-fired power plants because of the larger danger posed to the planet by coal emissions. The jury agreed, in a decision that rocked the system to its foundations.

In Australia, it was reported this way:

"In a decision that will send chills down corporate spines across Britain, the jury decided the dangers of global warming were so enormous that the Greenpeace campaigners were justified in trying to close down Kingsnorth power station in Kent. Jurors at Maidstone Crown Court cleared the six activists of criminal damage, accepting they had a 'lawful excuse' to damage the Kingsnorth property to try to prevent the even greater damage of climate change...."

Think of that, kids. An English jury concluded that you've got a lawful excuse to try to shut down coal plants in your role as guardians of the future.

"This verdict marks a tipping point for the climate change movement," said Ben Stewart, one of the defendants. "If jurors from the heart of Middle England say it's legitimate for a direct action group to shut down a coal-fired power station because of the harm it does to our planet, where does that leave government energy policy?"

Civil disobedience to stop coal is not an idea that Mr. Gore dreamed up yesterday. He's been recommending it for some time. A year ago he said, "I can't understand why there aren't rings of young people blocking bulldozers and preventing them from constructing coal-fired power plants." So, for some time now, Mr. Gore has been trying to tell us all something important: Our situation is dire. Our future is threatened. It's time for a new approach. It's time to act.

One last thing, kids. As a historian I can tell you that nothing having to do with justice in the United States has ever been accomplished without civil disobedience. Nothing. Not one thing. So Al Gore is right. It's necessary. It's justified. And it's time.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: Global Footprint Network, September 23, 2008

CELEBRATING (?) EARTH OVERSHOOT DAY

[Rachel's introduction: On September 23 (Tuesday last week), our demand surpassed nature's budget. For the rest of 2008, we will be in the ecological equivalent of deficit spending, drawing down our resource stocks -- in essence, borrowing from the future.]

On September 23rd this year we marked an unfortunate milestone: As of that day, humanity will have consumed all the new resources the planet will produce this year, according to Global Footprint Network calculations. For the rest of 2008, we will be in the ecological equivalent of deficit spending, drawing down our resource stocks -- in essence, borrowing from the future.

The recent bank failures in the United States have shown what happens when debt and spending get out of control. We are seeing signs of similarly disastrous consequences from our ecological overspending.

Climate change, shrinking forests, declining biodiversity and current world food shortages are all results of the fact that we are demanding more from nature than it can supply.

Humans now require the resources of 1.4 planets

Just like any company, nature has a budget -- it can only produce so many resources and absorb so much waste each year. Globally, we now demand the biological capacity of 1.4 planets, according to Global Footprint Network data. But of course, we only have one.

Earth Overshoot Day (also known as Ecological Debt Day) was a concept devised by Global Footprint Network partner NEF (New Economics Foundation). Each year, Global Footprint Network calculates humanity's Ecological Footprint (its demand on cropland, pasture, forests and fisheries), and compares this with the amount of resources the world's lands and seas generate. Our data shows us that in less than 10 months we consume what it takes the planet 12 months to produce.
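[Editor's note: the date itself falls out of a one-line calculation: scale the length of the year by the ratio of the planet's annual output to our annual demand. A minimal sketch, assuming consumption is spread evenly through the year; Global Footprint Network's own calculation uses finer-grained, unrounded data, which is why it lands on September 23 rather than the mid-September date the rounded 1.4-planet figure gives:]

    import datetime

    def overshoot_date(planets_demanded, year):
        """Day on which cumulative demand exhausts one planet's annual output."""
        day_of_year = int(365 / planets_demanded)
        return datetime.date(year, 1, 1) + datetime.timedelta(days=day_of_year - 1)

    print(overshoot_date(1.4, 2008))  # 2008-09-16, near the reported Sept. 23
    print(overshoot_date(2.0, 2050))  # 2050-07-01, matching the two-planet
                                      # scenario described further down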

Earth Overshoot Day creeps earlier every year

Humanity has been in overshoot since the mid-1980s, when the first Earth Overshoot Day fell on December 31, 1986. By 1995 it was more than a month earlier, arriving on November 21. Ten years later it had moved another six weeks earlier, to October 2, 2005.

What contributes to our increasing demand? Part of the story is that there are simply more people on the planet requiring nature's services. In some areas of the world -- most notably in high income regions like the U.S. and Europe, as well as industrializing nations like China -- per capita resource consumption has also been increasing.

In other areas of the world, however, including India and parts of Africa, per capita Ecological Footprints have actually declined, likely as a result of there being fewer resources available per person.

Carbon is also a big part of the story, as it is the greatest contributor to ecological overshoot. Humanity is emitting carbon faster than the planet can re-absorb it. Our carbon Footprint has increased more than 700 percent since 1961.

United Nations business-as-usual projections show humanity requiring the equivalent of two planets by 2050. (For details see Global Footprint Network and WWF's Living Planet Report 2006). This would put Earth Overshoot Day on July 1, and means it would take two years for the planet to regenerate what we use in one year. Reaching this level of ecological deficit spending may be physically impossible.

What Can I Do to End Overshoot?

Global Footprint Network and its international partner network is focused on solving the problem of overshoot, working with businesses and government leaders around the world to make ecological limits a central part of decision-making everywhere.

Citizens can take action to get out of overshoot in their own lives: eating less meat, driving and flying less, and using less energy in the home. They can also encourage government and business leaders to build communities with smart infrastructure planning and best-practice green technology. Use our interactive calculator to determine your own Ecological Footprint and learn what you can do to reduce it.

With international commitment to end overshoot, Earth Overshoot Day can become history instead of news.

==============

Global Footprint Network 312 Clay Street, Suite 300 Oakland CA 94610

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: New Scientist, May 16, 2008

GLOBAL BIODIVERSITY SLUMPS 27% IN 35 YEARS

[Rachel's introduction: Ground-living vertebrates have declined by 25%, with most of the slump occurring since 1980. Marine species held fairly steady until the late 1990s before falling sharply to give an overall drop of 28%. Freshwater species have decreased by 25%, primarily since the late 1980s.]

By Michael Marshall

The latest data on the global biodiversity of vertebrates shows that it has fallen by almost one-third in the last 35 years. But experts say it may still underestimate the effect humans have had on global species counts.

The Living Planet Index (LPI) follows trends in nearly 4,000 populations of 1,477 vertebrate species and is said to reflect the impact humans have on the planet. It is based on a wide range of population datasets, such as commercial data on fish stocks and projects such as the Pan-European Common Bird Monitoring scheme.
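[Editor's note: the article does not spell out how nearly 4,000 population trends are compressed into one number. The published LPI method averages the logarithms of year-to-year population changes and chains the result into an index; in effect a geometric mean, so one population doubling while another halves nets out to no change. A minimal, equal-weighted sketch in Python (the real index also averages within species and regions before combining):]

    import math

    def living_planet_index(populations):
        """Chain an index from yearly log-ratios averaged across populations.

        populations: abundance time series of equal length, one per population.
        """
        n_years = len(populations[0])
        index = [1.0]
        for t in range(1, n_years):
            # The mean of log-ratios is the log of the geometric mean ratio.
            mean_log = sum(math.log(p[t] / p[t - 1]) for p in populations) / len(populations)
            index.append(index[-1] * math.exp(mean_log))
        return index

    # Toy data: one stable population, one losing 10% of its numbers each year.
    print(living_planet_index([[100, 100, 100], [100, 90, 81]]))
    # -> roughly [1.0, 0.949, 0.9]: a 10% overall decline after two years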

New figures show that between 1970 and 2005, the global LPI has fallen by 27%. This suggests that the world will fail to meet the target of reducing the rate of biodiversity loss set by the 2002 Convention on Biological Diversity.

The results were released as part of a WWF report entitled 2010 and Beyond: Rising to the biodiversity challenge [1.5 Mbyte PDF].

"Governments have signally failed to deliver on their biodiversity commitments, and biodiversity declines are continuing," Jonathan Loh, a researcher at the Institute of Zoology and the editor of the report, told New Scientist.

Global picture

Ground-living vertebrates have declined by 25%, with most of the slump occurring since 1980. Marine species held fairly steady until the late 1990s before falling sharply to give an overall drop of 28%. Freshwater species have decreased by 25%, primarily since the late 1980s.

Loh says the most dramatic declines have been observed in the tropics. Tropical ground-living species have seen an average population drop of 46%, while their temperate cousins have shown no overall change.

Freshwater vertebrates show different trends in different regions, leading to "no obvious signal", says Loh. European and North American populations show no overall change, but Asian-Pacific populations have declined steeply since the late 1980s.

In the world's oceans, northern vertebrate populations have held fairly steady over the entire period, but may have entered a downward trend since 1990. By contrast, southern populations have fallen precipitously, although because less data is collected there the trend is less certain.

Rose-tinted view

The LPI focuses exclusively on vertebrates, which are relatively well-monitored. Loh says, "We started collecting data on invertebrates, but it's very patchy and not good enough as yet."

The survey may be "bird-biased", he adds, because their populations are well-monitored. The LPI tracks 811 bird species but just 241 fish and 302 mammals.

Fish should actually comprise the bulk of the Index. The world's 30,000 species of fish compare to just 10,000 bird species and 5,400 mammal species.

Loh says this suggests that the situation is worse than the data shows. "Birds are doing better than fish," he says, "so if anything, by biasing the survey towards them we're underestimating the global decline."

Incomplete picture

There is also a lack of good data for Latin America and Africa. Loh says that, frustratingly, "the more species there are in an area, often the less data there are on how they're doing. For instance the UK is well-monitored, but has relatively few species. It's a priority for us to find out what's happening in areas like the Amazon Basin."

The WWF report was published ahead of a worldwide conference on biodiversity, the ninth meeting of the Conference of the Parties on 19-30 May. The conference will assess what has been achieved by the Convention on Biological Diversity.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: Toronto Globe and Mail, September 23, 2008

BLACK CLOUDS ON THE HORIZON FOR BIRDS OF THE WORLD

[Rachel's introduction: "There has been a precipitous decline of more than 50 per cent in the populations of 20 of the most common North American birds over the past four decades, alarming conservationists, who say the trend is an indicator of a serious deterioration in the environment."]

By Martin Mittelstaedt, Environment Reporter

There has been a precipitous decline of more than 50 per cent in the populations of 20 of the most common North American birds over the past four decades, alarming conservationists, who say the trend is an indicator of a serious deterioration in the environment.

The figures were in the State of the World's Birds, a report released yesterday and posted on a related website. Canadian and U.S. figures showing the decline were based in part on the annual Christmas bird counts compiled by thousands of volunteers across North America, and on a separate breeding bird survey.

The species in trouble include those that breed in Canada's boreal forest, such as the evening grosbeak, greater scaup, rusty blackbird and boreal chickadee. Also, many grassland species are listed, including the eastern meadowlark, loggerhead shrike and field sparrow.

The drop has also extended to the avian marathon fliers, those birds that migrate from North America to tropical and subtropical destinations in Latin America each year. More than half of these migrating species have experienced population declines, including the Canada warbler and bobolink.

The sharp declines for many of the most common but lesser-known birds come as some iconic North American species, including bald eagles, whooping cranes and peregrine falcons, are making strong comebacks. The eagle and falcon populations are recovering due to bans on toxic pesticides, and while it isn't known exactly why all of the other species are declining, alterations in habitat are usually the prime suspect.

"Though there is much we still need to learn about what is driving the declines, loss and degradation of habitat are usually implicated," said Jon McCracken, a spokesman for Bird Studies Canada, a conservation group in Port Rowan, Ont., that took part in the global survey.

"It is particularly worrying when we find that some of our most common species are headed into trouble."

The alarming trend found in North America is also occurring elsewhere, according to the report, which was issued by BirdLife International, a global umbrella group of environmental organizations.

The report said that the status of the world's birds -- 9,856 living species is the current count -- "continues to get worse" and the "deterioration is accelerating, not slowing." Of these species, more than 1,200 are thought to be in trouble. The most threatened are albatrosses (82 per cent of species are at risk), cranes (60 per cent at risk) and parrots (27 per cent at risk).

"Alarm calls from the world's birds are becoming ever louder ...," the report said.

A total of 153 bird species are thought to have become extinct since 1500, including 18 from 1975 to 1999 and another three known or suspected to have died out since 2000. The rate of species extinctions is "exceptionally high," about 1,000 to 10,000 times what would occur in nature over these time periods.

Among the threats to populations are the replacing of natural forests with plantations of only one or two tree species, the biofuel mania that is leading to forests being converted for palm-oil production, logging, industrial agriculture and fishing, and the spread of invasive predators such as rats.

Conservationists have identified about 10,000 places in the world that offer crucial habitat for birds. Of these, about 600 are in Canada. They include hot spots such as Point Pelee National Park on Lake Erie, Toronto's Leslie Street Spit, a man-made peninsula that is one of the largest breeding areas on the Great Lakes for colonial water birds, and Boundary Bay, an area near Vancouver that is an important stopover site for migratory birds on the Pacific coast.

Only about half the important birding sites in Canada are protected by the federal, provincial or territorial governments. "We've got to make sure we have a strategy to protect all of them," said Sarah Wren, a biologist with Nature Canada, an Ottawa-based conservation group that also worked on the bird counts.

***

Birds have evolved to thrive in some of the world's most forbidding environments, but they're facing a huge challenge coping with humans. One in eight bird species around the world is at risk of extinction, with habitat loss and degradation the main reason. In North America, many common species have experienced population declines of 50 per cent since the 1960s.

Spix's Macaw from Brazil became extinct in the wild in 2000.

Hawaiian Crow became extinct in the wild in 2002.

GLOBALLY THREATENED SPECIES

Extinct in wild: 4

Critically Endangered: 190

Endangered: 363

Vulnerable: 669

IMPORTANT BIRD AREAS OF THE WORLD

NEARCTIC 732 species

PALEARCTIC 937 species

NEOTROPICAL 3,370 species

INDOMALAYAN 1,700 species

OCEANIC 187 species

AFROTROPICAL 1,950 species

AUSTRALASIAN 1,590 species

ANTARCTIC 85 species

More than 10,000 important bird areas have been identified

SOURCE: STATE OF THE WORLD'S BIRDS (BIRD LIFE INTERNATIONAL)

Copyright 2008 CTVglobemedia Publishing Inc.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: Politico.com, September 18, 2008

CHEMICAL INDUSTRY TO FIGHT NEW PROPOSAL

[Rachel's introduction: "The chemical industry lobby is gearing up to fend off a broad legislative overhaul that marks a significant shift in the way that chemicals are regulated in America."]

By Samuel Loewenberg

The chemical industry lobby is gearing up to fend off a broad legislative overhaul that marks a significant shift in the way that chemicals are regulated in America.

The push for chemical reform comes after this summer's revamping of the Consumer Product Safety Commission. Now, health advocates hope that success, which introduced the so-called precautionary principle into U.S. chemical regulation, will pave the way for an even more far-reaching overhaul next year.

The precautionary approach has faced fierce opposition from the petrochemical industry. Until now, applying it in U.S. chemical regulation had seemed out of reach.

"The environment seems to have changed in our favor," said Rep. Hilda L. Solis (D-Calif.), vice chairman of the House Environment and Hazardous Materials Subcommittee. "Consumers are expecting the federal government to do something. They don't want to hear about it at the last minute, after their children have been exposed."

The precautionary principle shifts the burden from regulators having to prove a substance is dangerous to manufacturers having to prove it is safe. It's a fundamental part of the new legislation, the proposed Kid Safe Chemical Act, and is already its most divisive aspect.

On Tuesday, the Senate Environment and Public Works Committee held an oversight hearing that will help set the stage for the bill. And its supporters are hoping to push it through next year and are preparing the groundwork with hearings, staff briefings, and lobbying visits by religious organizations, medical professionals and others.

A crucial part of their strategy is to gain the support of the retail industry, which also played an important, if at times reluctant, role in the passage of the consumer protection overhaul earlier this summer.

The Kid Safe Chemical Act was first introduced in the Senate in 2005 but gained little traction. After a series of high-profile recalls of toys and other children's products over the past two years, it became clear to both industry and consumer groups that the Consumer Product Safety Commission's enforcement was in need of an overhaul.

A last-minute protest came from the petrochemical industry, particularly ExxonMobil, which sought to strike down a provision to ban some types of phthalates that advocates say could damage children's reproductive systems. The ban, pushed by Democratic Sen. Dianne Feinstein, whose home state of California had already moved to outlaw the chemicals, passed despite fierce opposition from chemical manufacturers.

The passage of the phthalates ban was a "seismic shift" because it brought the precautionary principle into play, said Stephanie Lester, vice president for international trade for the Retail Industry Leaders Association, which represents Wal-Mart, Target and other large chains. The group has not taken a position on the Kid Safe Chemical Act.

The chemical industry opposes the precautionary principle on the grounds that it could unfairly ban chemicals and wants regulators to focus instead on the riskiest substances.

"Better safe than sorry is one thing, but then there is throwing the baby out with the bath water, too," said Marty Durbin of the American Chemistry Council, the trade group for the chemical manufacturers.

Durbin was skeptical that the children's chemical reform bill would pass in its current form, even with the success of the consumer and health groups on phthalates. The ban, he said, does not mean that "the walls have come tumbling down and now Congress is going to pass the precautionary principle."

But that is exactly what health advocacy groups are hoping for. "That's the endgame for environmental and health groups that were working on the phthalate ban," said Janet Nudelman, chief lobbyist for the Breast Cancer Fund, a major force in pushing the ban through. The ban "was a referendum on chemical policy reform," she said.

The health groups are planning to use the same techniques that paid off for them with the consumer protection agency overhaul. The Breast Cancer Fund was able to mobilize more than 100,000 mothers to write to the conferees asking for the ban on phthalates in kids' toys, because of risks that the plastic-softening chemical could leach into children's bloodstreams.

One of the biggest questions with the new bill is how retailers will respond. Especially for large stores, the pressing issue is that they be given sufficient time to introduce changes into their supply chain, said Lester of the retail leaders group, and not be forced to immediately pull items off store shelves.

"We need to be clear on the scope of products it applies to, clear on what our responsibility is," she said.

The new bill seeks to revamp the current law, the Toxic Substances Control Act, widely acknowledged to be out of date and too weak. The bill would require that chemicals used in tens of thousands of products -- from the plastic in baby bottles to the paint on rocking chairs -- be proved safe before they are allowed to be sold.

Currently, the burden is on regulators to show that a product is dangerous before they can force its removal from the shelves. Of the 80,000 chemicals used in household products, the Environmental Protection Agency has required toxicity testing of only 200, according to the bill's sponsors.

"We already have strong regulations for pesticides and pharmaceuticals -- it's common sense that we do the same for chemicals that end up in household items such as bottles and toys," said the bill's sponsor, Sen. Frank R. Lautenberg (D-N.J.).

The current legislation is modeled on a sweeping overhaul recently passed by the European Union after a long campaign against it by the U.S. chemical industry and the Bush administration.

The State Department mobilized U.S. diplomatic missions around the world in opposition. In a cable to U.S. diplomats, then-Secretary of State Colin Powell argued that the European legislation "would be significantly more burdensome to industry" than current approaches.

Details of that effort can be found on the website of the investigations division of the House Government Reform Committee, chaired by Rep. Henry A. Waxman (D-Calif.), one of the co-authors of the new legislation.

The new legislation is likely to be bolstered by the actions of the states, a few of which -- including Arkansas, California, New York, Maine and Washington -- are putting together their own precautionary laws.

Myriad state environmental groups have signed on to the new federal legislation, including ones from Republican strongholds such as Alaska, Kentucky and Texas.

Other supporters include the American Nurses Association, the National Autism Association and the Service Employees International Union.

The National Council of Churches, representing 45 million people in 100,000 churches from 35 mainline denominations, is also mobilizing its membership. It was a strategy that the group also used on the consumer protection reform, said Chloe Schwabe, the council's assistant director of environmental health.

"We are educating and engaging congregations to take action to improve our system to regulate chemicals and protect God's creation," Schwabe said. The precautionary approach is especially important, she said, because it "can protect people first, before they or their children are exposed to toxic chemicals."

Schwabe said she is hopeful that the legislation will transcend partisan boundaries, especially with the approaching Christmas shopping season.

"If these legislators are people of faith, they should be guided by their faith in making that decision to protect children and the most vulnerable among us," she said. "It shouldn't matter if you are a Democrat or a Republican."

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: The Independent (London, U.K.) .....................[This story printer-friendly]
September 21, 2008

MOBILE PHONE USE "RAISES CHILDREN'S RISK OF BRAIN CANCER FIVEFOLD"

[Rachel's introduction: Last week the European Parliament voted by 522 to 16 to urge ministers across Europe to bring in stricter limits for exposure to radiation from mobile and cordless phones, Wi-fi and other devices, partly because children are especially vulnerable to them.]

By Geoffrey Lean, Environment Editor

Children and teenagers are five times more likely to get brain cancer if they use mobile phones, startling new research indicates.

The study, experts say, raises fears that today's young people may suffer an "epidemic" of the disease in later life. At least nine out of 10 British 16-year-olds have their own handset, as do more than 40 per cent of primary schoolchildren.

Yet investigating dangers to the young has been omitted from a massive 3.1 million-pound ($5.7 million) British investigation of the risks of cancer from using mobile phones, launched this year, even though the official Mobile Telecommunications and Health Research (MTHR) Programme -- which is conducting it -- admits that the issue is of the "highest priority".

Despite recommendations of an official report that the use of mobiles by children should be "minimised", the Government has done almost nothing to discourage it.

Last week the European Parliament voted by 522 to 16 to urge ministers across Europe to bring in stricter limits for exposure to radiation from mobile and cordless phones, Wi-fi and other devices, partly because children are especially vulnerable to them. They are more at risk because their brains and nervous systems are still developing and because -- since their heads are smaller and their skulls are thinner -- the radiation penetrates deeper into their brains.

The Swedish research was reported this month at the first international conference on mobile phones and health.

It sprang from a further analysis of data from one of the biggest studies carried out into the risk that the radiation causes cancer, headed by Professor Lennart Hardell of the University Hospital in Orebro, Sweden. Professor Hardell told the conference -- held at the Royal Society by the Radiation Research Trust -- that "people who started mobile phone use before the age of 20" had a "more than five-fold increase in glioma", a cancer of the glial cells that support the central nervous system. The extra risk to young people of contracting the disease from using the cordless phones found in many homes was almost as great, at more than four times higher.

Those who started using mobiles young, he added, were also five times more likely to get acoustic neuromas, benign but often disabling tumours of the auditory nerve, which usually cause deafness.

By contrast, people who were in their twenties before using handsets were only 50 per cent more likely to contract gliomas and just twice as likely to get acoustic neuromas.

Professor Hardell told the IoS: "This is a warning sign. It is very worrying. We should be taking precautions." He believes that children under 12 should not use mobiles except in emergencies and that teenagers should use hands-free devices or headsets and concentrate on texting. At 20 the danger diminishes because then the brain is fully developed. Indeed, he admits, the hazard to children and teenagers may be greater even than his results suggest, because the results of his study do not show the effects of their using the phones for many years. Most cancers take decades to develop, longer than mobile phones have been on the market.

The research has shown that adults who have used the handsets for more than 10 years are much more likely to get gliomas and acoustic neuromas, but he said that there was not enough data to show how such relatively long-term use would increase the risk for those who had started young.

He wants more research to be done, but the risks to children will not be studied in the MTHR study, which will follow 90,000 people in Britain. Professor David Coggon, the chairman of the programme's management committee, said they had not been included because other research on young people was being done at Sweden's Karolinska Institute.

He said: "It looks frightening to see a five-fold increase in cancer among people who started use in childhood," but he said he "would be extremely surprised" if the risk was shown to be so high once all the evidence was in.

But David Carpenter, dean of the School of Public Health at the State University of New York -- who also attended the conference -- said: "Children are spending significant time on mobile phones. We may be facing a public health crisis in an epidemic of brain cancers as a result of mobile phone use."

In 2000 and 2005, two official inquiries under Sir William Stewart, a former government chief scientist, recommended the use of mobile phones by children should be "discouraged" and "minimised".

But almost nothing has been done, and their use by the young has more than doubled since the turn of the millennium.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: U.S. News & World Report ........................[This story printer-friendly]
September 15, 2008

10 WAYS GLOBAL WARMING COULD HURT YOUR HEALTH

[Rachel's introduction: Global warming is not just an abstract ecological and economic issue -- a warmer planet could harm your physical well-being]

By Sarah Baldauf

Scientists the globe over have observed changes that are impacting individuals' health and have also created models to predict where we might be headed. Here's a sampling of what we could be discussing with our doctors in the decades to come.

** Stepped-up sniffling. Allergies -- from ragweed in the fall to tree pollen in the spring -- are predicted not only to become stronger but also to enjoy lengthened seasons because of less frost and earlier blooming. Fungal spores (those outdoors and in moist basements) will most likely thrive, tickling the throats of many.

** Algae-related complaints. Cyanobacteria, or blue-green algae, thrive and bloom in the rising temperatures of bodies of water, from municipal water systems to the Great Lakes and Florida's Lake Okeechobee. The algae have been linked to digestive, neurological, liver, and dermatological diseases.

** Painful kidney stones. Because of higher temps and more dehydration, the crystallized calcifications that must be passed -- often painfully -- through the urinary tract could plague an additional 2.2 million people a year by 2050, researchers estimate. The current "kidney stone belt," which includes southern states like Florida, the Carolinas, and Arkansas, could extend up into Kentucky and northern California.

** Exotic infections. Dengue fever, malaria, and encephalitis, while not exactly household names, have seen U.S. outbreaks and upticks in incidence in recent years. Mosquitoes and plankton, which flourish in warmer water temperatures, play a key role in transmitting such diseases.

** Itchier cases of poison ivy. Poison ivy appears to become more potent as carbon dioxide levels rise, research has suggested.

** Surplus of stings. Alaska's warming has heralded a sixfold rise in severe stings reported, and the buzzing bees, wasps, and yellow jackets are showing up in spots never before seen. Alaska may be a harbinger for the rest of us, as its temperature changes have been the most significant in the United States.

** Fewer fruits available. The value of crops produced in the Yakima River Valley -- more than 6,000 square miles of orchards and farmland east of Seattle -- may drop almost a quarter as temperatures rise over the coming decades. Less water for irrigation from nearby mountain snowpack could drive down fruit availability and drive up the cost of the produce.

** Upsurge in summertime hacking and wheezing. Cool breezes coming down from Canada could diminish, driving up ozone pollution at ground level -- particularly in the Northeast and Midwest -- say some Harvard scientists. Possible result: irritated lungs, especially in people with respiratory illness.

** Deluge of heat-wave deaths. Already a risk to the very young and the very old in the summer months, strings of hot and humid days are expected to become more frequent and more severe, says the Intergovernmental Panel on Climate Change. In California, for example, such deaths could double by 2100.

** Bigger coastal storms. The flooding associated with the likes of Katrina and Ike and the physical and mental stresses that ensue are expected to occur more frequently as storms surge around the world. By 2050, a 1-foot rise in sea level is predicted, which could worsen flood damage by 36 to 58 percent.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

Rachel's Democracy & Health News highlights the connections between issues that are often considered separately or not at all.

The natural world is deteriorating and human health is declining because those who make the important decisions aren't the ones who bear the brunt. Our purpose is to connect the dots between human health, the destruction of nature, the decline of community, the rise of economic insecurity and inequalities, growing stress among workers and families, and the crippling legacies of patriarchy, intolerance, and racial injustice that allow us to be divided and therefore ruled by the few.

In a democracy, there are no more fundamental questions than, "Who gets to decide?" And, "How DO the few control the many, and what might be done about it?"

Rachel's Democracy and Health News is published as often as necessary to provide readers with up-to-date coverage of the subject.

Editors:
Peter Montague - peter@rachel.org
Tim Montague - tim@rachel.org

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

To start your own free Email subscription to Rachel's Democracy & Health News send a blank Email to: rachel-subscribe@pplist.net

In response, you will receive an Email asking you to confirm that you want to subscribe.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

Environmental Research Foundation
P.O. Box 160, New Brunswick, N.J. 08903
dhn@rachel.org

NOT A VISION OF OUR FUTURE, BUT OF OURSELVES

By Wendell Berry

Attempting to deal with an enormity so staggering as the human destruction of Earth, it is difficult to resist the temptation to write out a "vision of the future" that would offer something better. Even so, I intend to resist. I resist, not only because such visions run the risk of error, but also out of courtesy. A person of my age who dabbles in visions of the future is necessarily dabbling in a future that belongs mostly to other people.

What I would like to do, instead, if I can, is help to correct the vision we Kentuckians have of ourselves in the present. In our present vision of ourselves we seem to be a people with a history that is acceptable, even praiseworthy. A history that we are privileged to inherit uncritically and with little attempt at rectification. But by the measures that are most important to whatever future the state is to have, ours is a history of damage and of loss.

In a little more than two centuries -- a little more than three lifetimes such as mine -- we have sold cheaply or squandered or given away or merely lost much of the original wealth and health of our land. It is a history too largely told in the statistics of soil erosion, increasing pollution, waste and degradation of forests, desecration of streams, urban sprawl, impoverishment and miseducation of people, misuse of money, and, finally, the entire and permanent destruction of whole landscapes.

Eastern Kentucky, in its natural endowments of timber and minerals, is the wealthiest region of our state, and it has now experienced more than a century of intense corporate "free enterprise," with the result that it is more impoverished and has suffered more ecological damage than any other region. The worst inflictor of poverty and ecological damage has been the coal industry, which has taken from the region a wealth probably incalculable, and has imposed the highest and most burdening "costs of production" upon the land and the people. Many of these costs are, in the nature of things, not repayable. Some were paid by people now dead and beyond the reach of compensation. Some are scars on the land that will not be healed in any length of time imaginable by humans.

The only limits so far honored by this industry have been technological. What its machines have enabled it to do, it has done. And now, for the sake of the coal under them, it is destroying whole mountains with their forests, water courses and human homeplaces. The resulting rubble of soils and blasted rocks is then shoved indiscriminately into the valleys. This is a history by any measure deplorable, and a commentary sufficiently devastating upon the intelligence of our politics and our system of education. That Kentuckians and their politicians have shut their eyes to this history as it was being made is an indelible disgrace. That they now permit this history to be justified by its increase of the acreage of "flat land" in the mountains signifies an indifference virtually suicidal.

So ingrained is our state's submissiveness to its exploiters that I recently heard one of our prominent politicians defend the destructive practices of the coal companies on the ground that we need the coal to "tide us over" to better sources of energy. He thus was offering the people and the region, which he represented and was entrusted to protect, as a sacrifice to what I assume he was thinking of as "the greater good" of the United States -- and, only incidentally, of course, for the greater good of the coal corporations.

The response that is called for, it seems to me, is not a vision of "a better future," which would be easy and probably useless, but instead an increase of consciousness and critical judgment in the present. That would be harder, but it would be right. We know too well what to expect of people who do not see what is happening or who lack the means of judging what they see. What we may expect from them is what we will see if we look: devastation of the land and impoverishment of the people. And so let us ask: What might we expect of people who have consciousness and critical judgment, which is to say real presence of mind?

We might expect, first of all, that such people would take good care of what they have. They would know that the most precious things they have are the things they have been given: air, water, land, fertile soil, the plants and animals, one another -- in short, the means of life, health and joy. They would realize the value of those gifts. They would know better than to squander or destroy them for any monetary profit, however great.

Coal is undoubtedly something of value. And it is, at present, something we need -- though we must hope we will not always need it, for we will not always have it. But coal, like the other fossil fuels, is a peculiar commodity. It is valuable to us only if we burn it. Once burned, it is no longer a commodity but only a problem, a source of energy that has become a source of pollution. And the source of the coal itself is not renewable. When the coal is gone, it will be gone forever, and the coal economy will be gone with it.

The natural resources of permanent value to the so-called coalfields of Eastern Kentucky are the topsoils and the forests and the streams. These are valuable, not, like coal, on the condition of their destruction, but on the opposite condition: that they should be properly cared for. And so we need, right now, to start thinking better than we ever have before about topsoils and forests and streams. We must think about all three at once, for it is a violation of their nature to think about any one of them alone.

The mixed mesophytic forest of the Cumberland Plateau was a great wonder and a great wealth before it was almost entirely cut down in the first half of the last century. Its regrowth could become a great wonder and a great wealth again; it could become the basis of a great regional economy -- but only if it is properly cared for. Knowing that the native forest is the one permanent and abundant economic resource of the region ought to force us to see the need for proper care, and the realization of that need ought to force us to see the difference between a forest ecosystem and a coal mine. Proper care can begin only with the knowledge of that difference. A forest ecosystem, respected and preserved as such, can be used generation after generation without diminishment -- or it can be regarded merely as an economic bonanza, cut down, and used up. The difference is a little like that between using a milk cow, and her daughter and granddaughters after her, for a daily supply of milk, renewable every year -- or killing her for one year's supply of beef.

And there is yet a further difference, one that is even more important, and that is the difference in comprehensibility. A coal mine, like any other industrial-technological system, is a human product, and therefore entirely comprehensible by humans. But a forest ecosystem is a creature, not a product. It is, as part of its definition, a community of living plants and animals whose relationships with one another and with their place and climate are only partly comprehensible by humans, and, in spite of much ongoing research, they are likely to remain so. A forest ecosystem, then, is a human property only within very narrow limits, for it belongs also to the mystery that everywhere surrounds us. It comes from that mystery; we did not make it. And so proper care has to do, inescapably, with a proper humility.

But that only begins our accounting of what we are permitting the coal companies to destroy, for the forest is not a forest in and of itself. It is a forest, it can be a forest, only because it comes from, stands upon, shelters and slowly builds fertile soil. A fertile soil is not, as some people apparently suppose, an aggregate of inert materials, but it is a community of living creatures vastly more complex than that of the forest above it. In attempting to talk about the value of fertile soil, we are again dealing immediately with the unknown. Partly, as with the complexity and integrity of a forest ecosystem, this is the unknown of mystery. But partly, also, it is an unknown attributable to human indifference, for "the money and vision expended on probing the secrets of Mars... vastly exceed what has been spent exploring the earth beneath our feet." I am quoting from Yvonne Baskin's sorely needed new book, Under Ground, which is a survey of the progress so far of "soil science," which is still in its infancy. I can think of no better way to give a sense of what a fertile soil is, what it does, and what it is worth than to continue to quote from Ms. Baskin's book:

A spade of rich garden soil may harbor more species than the entire Amazon nurtures above ground... the bacteria in an acre of soil can outweigh a cow or two grazing above them.

Together [the tiny creatures living underground] form the foundation for the earth's food webs, break down organic matter, store and recycle nutrients vital to plant growth, generate soil, renew soil fertility, filter and purify water, degrade and detoxify pollutants, control plant pests and pathogens, yield up our most important antibiotics, and help determine the fate of carbon and greenhouse gases and thus, the state of the earth's atmosphere and climate.

By some estimates, more than 40 percent of the earth's plant-covered lands... have been degraded over the past half-century by direct human uses.

The process of soil formation is so slow relative to the human lifespan that it seems unrealistic to consider soil a renewable resource. By one estimate it takes 200 to 1,000 years to regenerate an inch of lost topsoil.

And so on any still-intact slope of Eastern Kentucky, we have two intricately living and interdependent natural communities: that of the forest and that of the topsoil beneath the forest. Between them, moreover, the forest and the soil are carrying on a transaction with water that, in its way, also is intricate and wonderful. The two communities, of course, cannot live without rain, but the rain does not fall upon the forest as upon a pavement; it does not just splatter down. Its fall is slowed and gentled by the canopy of the forest, which thus protects the soil. The soil, in turn, acts as a sponge that absorbs the water, stores it, releases it slowly, and in the process filters and purifies it. The streams of the watershed -- if the human dwellers downstream meet their responsibility -- thus receive a flow of water that is continuous and clean.

Thus, and not until now, it is possible to say that the people of the watersheds may themselves be a permanent economic resource, but only and precisely to the extent that they take good care of what they have. If Kentuckians, upstream and down, ever fulfill their responsibilities to the precious things they have been given -- the forests, the soils, and the streams -- they will do so because they will have accepted a truth that they are going to find hard: the forests, the soils and the streams are worth far more than the coal for which they are now being destroyed.

Before hearing the inevitable objections to that statement, I would remind the objectors that we are not talking here about the preservation of the "American way of life." We are talking about the preservation of life itself. And in this conversation, people of sense do not put secondary things ahead of primary things. That precious creatures (or resources, if you insist) that are infinitely renewable can be destroyed for the sake of a resource that to be used must be forever destroyed, is not just a freak of short-term accounting and the externalization of cost -- it is an inversion of our sense of what is good. It is madness.

And so I return to my opening theme: it is not a vision of the future that we need. We need consciousness, judgment, presence of mind. If we truly know what we have, we will change what we do.

==============

Reprinted with permission from Missing Mountains: We went to the mountaintop but it wasn't there: Kentuckians write against mountaintop removal (Nicholasville, KY: Wind Publications, 2005).

Site produced by Appalachian Voices 191 Howard St, Boone, NC 28607 phone:1-877-APP-VOICE


From: ScienceDaily

AUTISM RISK LINKED TO DISTANCE FROM POWER PLANTS

How do mercury emissions affect pregnant mothers, the unborn and toddlers? Does the level of emissions impact autism rates? Does it matter whether a mercury-emitting source is 10 miles away from families versus 20 miles? Is the risk of autism greater for children who live closer to the pollution source?

A newly published study of Texas school district data and industrial mercury-release data, conducted by researchers at The University of Texas Health Science Center at San Antonio, indeed shows a statistically significant link between pounds of industrial release of mercury and increased autism rates. It also shows -- for the first time in scientific literature -- a statistically significant association between autism risk and distance from the mercury source.

"This is not a definitive study, but just one more that furthers the association between environmental mercury and autism," said lead author Raymond F. Palmer, Ph.D., associate professor of family and community medicine at the UT Health Science Center San Antonio. The article is in the journal Health & Place.

Dr. Palmer, Stephen Blanchard, Ph.D., of Our Lady of the Lake University in San Antonio and Robert Wood of the UT Health Science Center found that community autism prevalence is reduced by 1 percent to 2 percent with each 10 miles of distance from the pollution source.

"This study was not designed to understand which individuals in the population are at risk due to mercury exposure," Dr. Palmer said. "However, it does suggest generally that there is greater autism risk closer to the polluting source."

The study should encourage further investigations designed to determine the multiple routes of mercury exposure. "The effects of persistent, low-dose exposure to mercury pollution, in addition to fish consumption, deserve attention," Dr. Palmer said. "Ultimately, we will want to know who in the general population is at greatest risk based on genetic susceptibilities such as subtle deficits in the ability to detoxify heavy metals."

The new study findings are consistent with a host of other studies that confirm higher amounts of mercury in plants, animals and humans the closer they are to the pollution source. Children may pay the highest price.

"We suspect low-dose exposures to various environmental toxicants, including mercury, that occur during critical windows of neural development among genetically susceptible children may increase the risk for developmental disorders such as autism," the authors wrote.

Study highlights

** Mercury-release data examined were from 39 coal-fired power plants and 56 industrial facilities in Texas.

** Autism rates examined were from 1,040 Texas school districts.

** For every 1,000 pounds of mercury released by all industrial sources in Texas into the environment in 1998, there was a corresponding 2.6 percent increase in autism rates in the Texas school districts in 2002.

** For every 1,000 pounds of mercury released by Texas power plants in 1998, there was a corresponding 3.7 percent increase in autism rates in Texas school districts in 2002.

** Autism prevalence diminished 1 percent to 2 percent for every 10 miles from the source (a rough numerical reading of these associations follows this list).

** Mercury exposure through fish consumption is well documented, but very little is known about exposure routes through air and ground water. There is evidence that children and other developing organisms are more susceptible to neurobiological effects of mercury.
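
The highlighted figures can be read together as a rough dose-response relation. The sketch below is only a back-of-the-envelope reading of the published percentages -- it assumes, purely for illustration, that the effects combine multiplicatively on a baseline district rate, which is not how the study's adjusted regression models were actually specified.

    # Back-of-the-envelope reading of the reported associations.
    # Assumption (for illustration only): the published percentage
    # effects combine multiplicatively on a baseline district rate.

    def adjusted_rate(baseline, mercury_lb, distance_miles):
        """Scale a baseline autism rate by +2.6% per 1,000 lb of
        mercury released and -1.5% (midpoint of the reported
        1-2%) per 10 miles of distance from the source."""
        release_factor = 1.026 ** (mercury_lb / 1000.0)
        distance_factor = 0.985 ** (distance_miles / 10.0)
        return baseline * release_factor * distance_factor

    # Example: a district at the study's 1-in-500 rate, 20 miles
    # from a source that released 2,000 pounds of mercury.
    print(adjusted_rate(1 / 500.0, 2000, 20))  # ~0.00204, about 1 in 490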

Implications

"We need to be concerned about global mercury emissions since a substantial proportion of mercury releases are spread around the world by long-range air and ocean currents," Dr. Palmer said. "Steps for controlling and eliminating mercury pollution on a worldwide basis may be advantageous. This entails greener, non-mercury-polluting technologies."

The U.S. Environmental Protection Agency (EPA) estimated environmental mercury releases at 158 tons annually nationwide in the late 1990s, the time period studied by the Texas team. Most exposures were said to come from coal-fired utility plants (33 percent of exposures), municipal/medical waste incinerators (29 percent) and commercial/industrial boilers (18 percent). Cement plants also release mercury.

With the enactment of clean air legislation and other measures, mercury deposition into the environment is decreasing slightly.

Limitations

Dr. Palmer and his colleagues pointed out the study did not reflect the true community prevalence rates of autism because children younger than school age are not counted in the Texas Education Agency data system. The 1:500 autism rates in the study are lower than the 1:150 autism rates in recent reports of the U.S. Centers for Disease Control and Prevention.

Furthermore, the authors note that distance was not calculated from individual homes to the pollution source but from central points in school districts that varied widely in area.

Data sources

Data for environmentally released mercury were from the United States Environmental Protection Agency Toxics Release Inventory. Data for releases by coal-fired power plants came from the same inventory and from the Texas Commission on Environmental Quality. Data for school district autism came from the Texas Education Agency.

Journal reference: Palmer, R.F., et al., Proximity to point sources of environmental mercury release as a predictor of autism prevalence. Health & Place (2008), doi:10.1016/j.healthplace.2008.02.001.


REASONABLE DOUBT

By Ian Fairlie

Among the many environmental concerns surrounding nuclear power plants, there is one that provokes public anxiety like no other: the fear that children living near nuclear facilities face an increased risk of cancer. Though a link has long been suspected, it has never been proven. Now that seems likely to change.

Studies in the 1980s revealed increased incidences of childhood leukaemia near nuclear installations at Windscale (now Sellafield), Burghfield and Dounreay in the UK. Later studies near German nuclear facilities found a similar effect. The official response was that the radiation doses from the nearby plants were too low to explain the increased leukaemia. The Committee on Medical Aspects of Radiation in the Environment, which is responsible for advising the UK government, finally concluded that the explanation remained unknown but was not likely to be radiation.

There the issue rested, until a recent flurry of epidemiological studies appeared. Last year, researchers at the Medical University of South Carolina in Charleston carried out a meta-analysis of 17 research papers covering 136 nuclear sites in the UK, Canada, France, the US, Germany, Japan and Spain. The incidence of leukaemia in children under 9 living close to the sites showed an increase of 14 to 21 per cent, while death rates from the disease were raised by 5 to 24 per cent, depending on their proximity to the nuclear facilities (European Journal of Cancer Care, vol 16, p 355).

This was followed by a German study which found 14 cases of leukaemia compared to an expected four cases between 1990 and 2005 in children living within 5 kilometres of the Krummel nuclear plant near Hamburg, making it the largest leukaemia cluster near a nuclear power plant anywhere in the world (Environmental Health Perspectives, vol 115, p 941).

This was upstaged by the yet more surprising KiKK studies (a German acronym for Childhood Cancer in the Vicinity of Nuclear Power Plants), whose results were published this year in the International Journal of Cancer (vol 122, p 721) and the European Journal of Cancer (vol 44, p 275). These found higher incidences of cancers and a stronger association with nuclear installations than all previous reports. The main findings were a 60 per cent increase in solid cancers and a 117 per cent increase in leukaemia among young children living near all 16 large German nuclear facilities between 1980 and 2003. The most striking finding was that those who developed cancer lived closer to nuclear power plants than randomly selected controls. Children living within 5 kilometres of the plants were more than twice as likely to contract cancer as those living further away, a finding that has been accepted by the German government.

Though the KiKK studies received scant attention elsewhere, there was a public outcry and vocal media debate in Germany. No one is sure of the cause (or causes) of the extra cancers. Coincidence has been ruled out, as has the "Kinlen hypothesis", which theorises that childhood leukaemia is caused by an unknown infectious agent introduced as a result of an influx of new people to the area concerned. Surprisingly, the most obvious explanation for this increased risk -- radioactive discharges from the nearby nuclear installations -- was also ruled out by the KiKK researchers, who asserted that the radiation doses from such sources were too low, although the evidence they base this on is not clear.

Anyone who followed the argument in the 1980s and 1990s concerning the UK leukaemia clusters will have a sense of deja vu. A report in 2004 by the Committee Examining Radiation Risks of Internal Emitters (2 Mbyte PDF), set up by the UK government (and for which I was a member of the secretariat) points out that the models used to estimate radiation doses from sources emitted from nuclear facilities are riddled with uncertainty. For example, assumptions about how radioactive material is transported through the environment or taken up and retained by local residents may be faulty.

If radiation is indeed the cause of the cancers, how might local residents have been exposed? Most of the reactors in the KiKK study were pressurised water designs notable for their high emissions of tritium, the radioactive isotope of hydrogen. Last year, the UK government published a report on tritium which concluded that its hazard risk should be doubled. Tritium is most commonly found incorporated into water molecules, a factor not fully taken into account in the report, so this could make it even more hazardous.

As we begin to pin down the likely causes, the new evidence of an association between increased cancers and proximity to nuclear facilities raises difficult questions. Should pregnant women and young children be advised to move away from them? Should local residents eat vegetables from their gardens? And, crucially, shouldn't those governments around the world who are planning to build more reactors think again?

Ian Fairlie is a London-based consultant on radiation in the environment


HAIR DYES FOUND TO INCREASE CANCER RISK

By Jeremy Laurance, Health Editor

Hairdressers and barbers are at increased risk of developing cancer -- because of their use of hair dyes. And the risks could extend to personal use of the dyes, according to international experts.

A review of the evidence by a panel of the International Agency for Research on Cancer (IARC) in Lyon, France, has found a "small but consistent risk of bladder cancer in male hairdressers and barbers".

A second review of the evidence on personal use of hair dyes found some studies suggesting a possible association with bladder cancer and with lymphoma and leukaemia.

But the panel found that the evidence was inadequate and concluded that personal use of hair dyes was "not classifiable as to its carcinogenicity to humans".

The panel was composed of 17 scientists who met last February to consider the latest evidence and update advice last issued by the agency in 1993.

Modern hair dyes are classified as permanent, semi-permanent or temporary dyes. The permanent or oxidative hair dyes represent 80 per cent of the market and consist of colourless "intermediates" and couplers that, in the presence of peroxide, form the dyes by chemical reaction.

Dark hair dyes tend to contain the highest concentration of the colouring ingredients. The use of some such colourants was discontinued in the 1970s after positive cancer tests in rats.

Dr Robert Baan of the IARC and colleagues say in The Lancet Oncology: "A small but consistent risk of bladder cancer was reported in male hairdressers and barbers. Because of few supporting findings by duration or period of exposure, the working group considered these data as limited evidence of carcinogenicity and reaffirmed occupational exposures of hairdressers and barbers as 'probably carcinogenic to humans'."

Copyright independent.co.uk


THE LINK BETWEEN TOXIC CHEMICALS AND GLOBAL WARMING

By Peter Montague

As never before, opportunity is knocking for activists. The solutions to global warming and chemical contamination are both peeking over the horizon and they look very much alike. The timing is perfect for building a global coalition to promote waste-free green chemistry (or clean production), end the rush toward coal-burning power plants, stuff the nuclear genie back into the bottle, power the future with sustainable solar energy, and advance environmental justice worldwide. From all this can flow many millions of truly green jobs.

Global warming and chemical contamination are two sides of the same coin, and solutions to both are now converging. Activists fighting coal, fighting nuclear, fighting dangerous waste technologies (landfills, incinerators and incinerators-in-disguise), fighting for the cleanup of superfund sites and brownfields and toxic emissions, fighting all the chemicals-and-health fights (childhood cancers, autism, diabetes, asthma, Parkinson's and so many other environment- linked diseases), and fighting for justice, joined with activists promoting solar, wind and biofuels, promoting tidal power and geothermal, promoting green jobs, clean production and green chemistry -- could now work together toward a common purpose. The combination would create a mountain of political power.

To curb global warming, we can transition as rapidly as possible to renewable solar, geothermal and tidal power, and to end chemical contamination we can transition as rapidly as possible to green chemistry. But to do either of these things, we first must get around one huge obstacle: the coal industry. I know that some influential people are saying that fossil fuels are over, that oil has peaked and coal is dead, but there's more to it than that.

Yes, perhaps oil has reached -- or soon will -- peak production and will be declining in volume each year, which foretells steadily rising prices (as we have seen -- a 10-fold increase since 2000). But coal is not dead. Coal executives are planning to turn coal into liquid fuels and into chemical feedstocks to replace oil. If they succeed, they will entrench coal for the next 100 years, derailing the drive toward green chemistry and clean production, and eliminating the incentives for renewable energy. This is a strategic fork in the road to the future. Which will we choose? This is a moment in history when activism can make a crucial difference.

There is still a huge (though surprisingly uncertain) amount of coal in the ground, especially in the U.S., and we would be naive to think that the people who own that coal are going to walk away from it empty-handed. Until it is all used up, they plan to (a) burn it for electricity, (b) turn it into liquid fuels (diesel, kerosene and jet fuel), or (c) turn it into chemical feedstocks (to create plastics, pesticides, solvents, etc.) -- or they will sell it to China, which will then burn it, liquify it, or turn it into chemicals. Peabody Energy of St. Louis, the largest private coal company in the world, opened an office in Beijing in 2005 because Chinese coal mines cannot keep up with demand, and U.S. coal exports are now helping fill that gap.

With coal-fired electric power now being fought to a standstill by a swarm of grass-roots activists all across the U.S., coal-to-liquids (CTL) and coal-to-chemicals are the most promising paths to salvation for the coal industry. Private chemical companies can build coal-to-chemical plants on their existing premises without getting any special licenses of the kind required for electric utilities. Eastman Chemical (formerly a part of Eastman Kodak) already derives 20% of its chemical feedstocks from coal and is thinking about pushing that up to 40%.[1] General Electric -- which sells coal gasification equipment needed for both coal-to-liquids and coal-to-chemicals -- wants to sell coal gasifiers to electric utilities. "But in the near term, turning coal to chemicals offers the most significant opportunities," says Edward C. Lowe, general manager of gasification for GE Energy.

Coal-to-liquids and coal-to-chemicals both use heat and pressure to break the molecular bonds in coal, producing gases (mostly carbon monoxide and hydrogen), which can be recombined to make various fuels and chemical feedstocks for paints, food additives, fertilizers, plastics, and all manner of other modern molecules. Germany commercialized these chemical processes before World War II, but after the war cheap oil shoved coal-gas technology to the back burner. Now oil is growing expensive and the chemical industry is paying roughly $40 billion per year for petroleum-based feedstocks, so that's a huge new market for Big Coal to penetrate. If they succeed, they're saved; if not, they're sunk. Their backs are to the wall.

Coal-to-chemicals plants will try to bury their waste carbon dioxide (CO2) in the ground, just the way coal-fired power plants say they want to do -- so coal-to-chemicals and coal-to-liquid-fuels could provide a laboratory for the untried "carbon capture and storage (CCS)" technologies needed to create so-called "clean coal." If they can convince people that CCS works -- and will keep working safely for thousands of years into the future -- they're saved; if not, they're sunk. Current U.S. energy policy is providing large taxpayer subsidies to coal-with-CCS, starving the research budget for renewable energy.

The Wall Street Journal reported late last year that western chemical companies are now flocking to China to participate in coal-to-chemicals projects, some of which have an experimental carbon burial (CCS) component. Because of cheap labor and lax regulations, such plants cost 30% to 50% less to build in China than in the U.S. Several Chinese companies already use coal to manufacture vinyl chloride monomer, the building block of PVC ("vinyl") plastic, and American and European chemical firms want some of that action. "No one's made any real commitments yet," says the editor of Chemical Week magazine, "but it's clear that this is the beginning of a wave." (Get more data about coal gasification in China and worldwide from these slides.)

Coal-to-liquids and coal-to-chemicals will not develop in the U.S. without a knock-down fight. This is where a big coalition of toxics, environmental health, energy, and green jobs and environmental justice activists could weigh in: Stopping coal-to-chemicals is essential to create space for the emergence of clean production, green chemistry, renewable energy and green jobs with justice. We are not going to have a coal-based chemical industry and a green chemical industry. We'll have one or the other, not both. And the same is true for electric power: if the public can be convinced that "clean coal" is really clean and really safe, incentives for renewables will dry up. We'll have coal or renewables, not both.

Here's the thing. The Achilles' heel of the coal-to-chemicals industry, the coal-to-liquid-fuels industry, and the coal-fired electric power industry is carbon dioxide. Compared to petroleum-based fuels, coal-based fuels produce twice as much CO2 per gallon. With Wall Street already looking askance at all coal-based technologies because of the near-certainty that carbon emissions will face expensive regulation one of these days, plans to bury CO2 in the ground take on new urgency for the coal corporations. With it, they may have a future; without it, they're sunk. To stop coal, activists just have to frighten the money.

Rarely in history have activists on such a broad range of issues been offered such a clear strategic opportunity to work together to kill a deadly, wasteful dinosaur like the coal-to-liquids, coal-to-chemicals and coal-to-electricity industries, simultaneously opening up a future of green possibilities for ourselves, for the world, and for our children.


MILLIONS OF 2-YEAR OLDS EXPOSED TO DANGEROUS LEVELS OF ROCKET FUEL

WASHINGTON -- A recent study by the US Food and Drug Administration (FDA) found that three quarters of 285 commonly consumed foods and beverages are contaminated with perchlorate, a toxic rocket fuel ingredient. According to the study, every day, the average two-year-old is exposed to more than half of the EPA 'safe' dose (RfD) of perchlorate from food alone. This is bad news for children in communities in 28 states who also are exposed to perchlorate through contaminated tap water. Very low levels of perchlorate in tap water will cause the average two-year-old to exceed EPA's safe exposure level.

Two-year-olds are particularly vulnerable because they eat and drink substantial amounts of food and water relative to their small size. An Environmental Working Group analysis of FDA data found that perchlorate levels as low as 4 parts per billion (ppb) in tap water could expose the average two-year-old to an unsafe dose of the rocket fuel contaminant every single day.
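
The arithmetic behind that 4 ppb threshold is easy to sketch. The snippet below is a minimal illustration using round numbers that do not appear in the EWG analysis itself -- a 12-kilogram toddler drinking about one litre of water a day, measured against EPA's reference dose of 0.0007 mg/kg-day -- so treat it as a plausibility check rather than the group's actual calculation.

    # Plausibility check: 4 ppb perchlorate in tap water, on top of
    # food exposure, versus EPA's reference dose (RfD).
    # Assumed round numbers, not taken from the EWG analysis:
    BODY_WEIGHT_KG = 12.0    # typical two-year-old
    WATER_L_PER_DAY = 1.0    # typical daily intake
    RFD_UG_PER_KG_DAY = 0.7  # EPA RfD, 0.0007 mg/kg-day

    def water_fraction_of_rfd(ppb):
        """Fraction of the RfD delivered by tap water at `ppb` ug/L."""
        dose = ppb * WATER_L_PER_DAY / BODY_WEIGHT_KG  # ug/kg-day
        return dose / RFD_UG_PER_KG_DAY

    food_fraction = 0.5  # "more than half" of the RfD from food alone
    print(food_fraction + water_fraction_of_rfd(4))
    # ~0.98 -- water at 4 ppb adds nearly half the RfD again, so
    # total exposure approaches or exceeds the safe dose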

FDA's finding of high food exposures for small children makes cleanup of perchlorate-contaminated water imperative. Perchlorate in tap water can be controlled through filtration and cleanup. Perchlorate in food is harder to manage because the source of contamination is not clear, although contaminated irrigation water is one known source where levels could be reduced.

'Every final or proposed water standard for perchlorate fails to provide adequate protection for children,' said Dr. Anila Jacob, MD, a senior scientist at EWG. 'An average two-year-old drinking water with 4 ppb perchlorate will exceed the EPA's safe dose. New Jersey has set a standard at 5 ppb, California is at 6 ppb, and the EPA has issued a cleanup standard of 24 ppb, nowhere near a level protective of children.'

Not only do children have higher exposures to perchlorate when compared with adults, they are also particularly susceptible to its adverse effects. Perchlorate acts by inhibiting the thyroid gland from taking up iodine from the circulation. Since iodine is the building block for thyroid hormone, perchlorate exposure can result in decreased thyroid hormone production by the thyroid gland. Adequate circulating levels of thyroid hormones are critical to maintaining normal growth and brain development during childhood.

'Pervasive perchlorate contamination of food underscores the need for a tough national drinking water standard to protect children. We need to take every action we can to minimize perchlorate exposures, and tough tap water standards are the logical first step. If we fail to act, we will needlessly expose millions of children to dangerous levels of this potent toxic compound,' Jacob said.

A recent report from the US Government Accountability Office finds that 28 states have communities in which perchlorate contaminates drinking water supplies at levels of 4 ppb or higher.

EWG is a nonprofit research organization based in Washington, DC that uses the power of information to protect human health and the environment. The group's analysis of the FDA study is available online at https://www.ewg.org/node/25875

WHY IS UNCLE SAM SO COMMITTED TO REVIVING NUCLEAR POWER?

By Peter Montague

The long-awaited and much-advertised "nuclear renaissance" actually got under way this week. NRG Energy, a New Jersey company recently emerged from bankruptcy, applied for a license to build two new nuclear power plants at an existing facility in Bay City, Texas -- the first formal application for such a license in 30 years.

NRG Energy has no experience building nuclear power plants but they are confident the U.S. government will assure their success. "The whole reason we started down this path was the benefits written into the [Energy Policy Act] of 2005," NRG's chief executive, David Crane, told the Washington Post. In other words, the whole reason NRG Energy wants to build nuclear power plants is to get bundles of free money from Uncle Sam. Who could blame them?

Other energy corporations are nuzzling up to the same trough. The federal Nuclear Regulatory Commission (NRC) is expecting to receive applications for an additional 29 nuclear power reactors at 20 sites. The NRC has already hired more than 400 new staff to deal with the expected flood of applications.

But the question remains, Can investors be fooled twice? Financially, the nuclear power industry has never stood on its own two feet without a crutch from Uncle Sam. Indeed, the nuclear power industry is entirely a creature of the federal government; it was created out of whole cloth by the feds in the 1950s. At that time, investors were enticed by offers of free money -- multi-billion-dollar subsidies, rapid write-offs, special limits on liability, and federal loan guarantees. Despite all this special help, by the 1970s the industry was in a shambles. The British magazine The Economist recently described it this way: "Billions were spent bailing out lossmaking nuclear-power companies. The industry became a byword for mendacity, secrecy and profligacy with taxpayers' money. For two decades neither governments nor bankers wanted to touch it."

Now the U.S. federal government is once again doing everything in its power to entice investors, trying to revive atomic power.

First, Uncle Sam is trying hard to remove the financial risk for investors. The Energy Policy Act, which Congress approved in 2005, provides four different kinds of subsidies for atomic power plants:

1. It grants $2 billion in insurance against regulatory delays and lawsuits to the first six reactors that get licenses and begin construction. Energy corporations borrow money to build plants and they must start paying interest on those loans immediately, even though it takes years for a plant to start generating income. The longer the licensing and construction delays, the greater the losses. Historically, lawsuits or other interventions by citizens have extended the licensing timeline, sometimes by years, costing energy corporations large sums. Now Uncle Sam will provide free insurance against any such losses.

2. The 2005 law extends the older Price-Anderson Act, which limits a utility's liability to $10 billion in the event of a nuclear accident. A serious accident at a nuclear plant near a major city could create hundreds of billions of dollars in liabilities. Uncle Sam has agreed to relieve investors of that very real fear.

3. The 2005 law provides a tax credit of 1.8 cents per kilowatt-hour for the first 6,000 megawatts generated by new plants. Free money, plain and simple; a rough sense of the scale is sketched just after this list.

4. Most important, the law offers loan guarantees to fund new atomic power reactors and other power plants using "innovative" technology. Investors need no longer fear bad loans.
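
To put a rough number on that production tax credit, the sketch below assumes -- and this is an assumption, not a figure from the Act -- that the full 6,000 megawatts runs at a 90 percent capacity factor, typical for modern nuclear plants.

    # Rough annual value of the 1.8 cent/kWh production tax credit.
    # Assumption (not from the Act): all 6,000 MW run at a 90%
    # capacity factor.
    CAPACITY_MW = 6000
    CAPACITY_FACTOR = 0.9
    HOURS_PER_YEAR = 8766          # average year, including leap years
    CREDIT_DOLLARS_PER_KWH = 0.018

    kwh_per_year = CAPACITY_MW * 1000 * HOURS_PER_YEAR * CAPACITY_FACTOR
    print(kwh_per_year * CREDIT_DOLLARS_PER_KWH)  # ~$850 million per year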

One obvious conclusion from all this is that, more than 50 years into the nuclear enterprise, atomic power still cannot attract investors and compete successfully in a "free market." This industry still requires an unprecedented level of subsidy and other government support just to survive.

An energy corporation's motives for buying into this system are clear: enormous subsidies improve the chance of substantial gain. However, the federal government's reasons for wanting to revive a moribund nuclear industry are not so clear. If the "free market" won't support the revival of nuclear power, why would the federal government want to pay billions upon billions of dollars to allay investor fears?

It certainly has little to do with global warming. At the time the 2005 energy bill was passed by a Republican-dominated Congress, the official position of the Republican leadership was that global warming was a hoax. Even now, when some Republicans have begun to acknowledge that perhaps we may have a carbon dioxide problem, science tells us that nuclear power plants are not the best way to reduce our carbon dioxide emissions. They're not even close to being the best way. (Lazy journalists are in the habit of repeating the industry mantra that nuclear power produces no greenhouse gases. This is nonsense. Read on.)

Substantial carbon dioxide emissions accompany every stage of nuclear power production, from the manufacture and eventual dismantling of nuclear plants, to the mining, processing, transport, and enrichment of uranium fuel, plus the eventual processing, transport, and burial of nuclear wastes.

A careful life-cycle analysis by the Institute for Applied Ecology in Darmstadt, Germany, concludes that a 1,250-megawatt nuclear power plant, operating 6,500 hours per year in Germany, produces greenhouse gases equivalent to 250,000 metric tonnes of carbon dioxide per year. In other (unspecified) countries besides Germany, the same power plant could produce as much as 750,000 metric tonnes of carbon dioxide equivalents, the Institute study shows. (See Figure 3, pg. 5)
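
Those totals are easier to compare with other generating options when normalized per kilowatt-hour. The short sketch below uses only the plant size, operating hours and emission totals just quoted; the per-kWh figures are derived arithmetic, not numbers stated by the Institute.

    # Normalize the Institute's totals to grams of CO2 per kWh.
    PLANT_MW = 1250
    HOURS_PER_YEAR = 6500
    annual_kwh = PLANT_MW * 1000 * HOURS_PER_YEAR  # 8.125 billion kWh

    for tonnes_co2 in (250_000, 750_000):  # German case, worst case elsewhere
        grams_co2 = tonnes_co2 * 1_000_000
        print(round(grams_co2 / annual_kwh))  # ~31 and ~92 g CO2 per kWh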

The study concludes that, in the emission of global warming gases (measured per kilowatt-hour of electricity made available), nuclear power compares unfavorably to...

** conservation through efficiency improvements

** run-of-river hydro plants (which use river water power but require no dams)

** offshore wind generators

** onshore wind generators

** power plants run by gas-fired internal combustion engines, especially plants that use both the electricity and the heat generated by the engine

** power plants run by bio-fuel-powered internal combustion engines

Of eleven ways to generate electricity (or avoid the need to generate electricity through efficiency and conservation) analyzed by the Institute, four are worse than nuclear from the viewpoint of greenhouse gas emissions, and six are better.

[For a great deal of additional solid information showing that nuclear power is no answer to global warming, check in with the Nuclear Information and Resource Service (NIRS)].

A study completed this summer by the Institute for Energy and Environmental Research (IEER) in Takoma Park, Maryland, concluded that it is feasible, within 35 to 60 years, to evolve an energy system to power the U.S. economy without the use of any nuclear power plants or any coal plants. See the IEER study, Carbon-Free and Nuclear-Free: A Roadmap for U.S. Energy Policy.

So the rationale for the U.S. government's Herculean efforts to revive a decrepit nuclear power industry cannot be based on concern for global warming or energy independence. The facts simply don't support such a rationale.

Whatever its motivation, the U.S. federal government is doing everything in its power to revive atomic power. In addition to removing most of the financial risk for investors, Uncle Sam has removed other obstacles like democratic participation in siting and licensing decisions.

Throughout the 1970s, energy corporations complained that getting a license took too long. In response, the Nuclear Regulatory Commission (NRC) has spent more than a decade "streamlining" the process for building nuclear plants. Most of the "streamlining" consists of new ways to exclude the public from information and decisions. For example, members of the public used to be able to question witnesses during licensing hearings. No more. There used to be two sets of hearings -- one for siting the plant, and another for constructing the plant. No more. These two sets of issues have been rolled into a single license and a single hearing. The purpose is to accommodate the needs of the nuclear industry, to help it survive. As attorney Tony Roisman observed recently, "The nuclear industry has come to the agency [the NRC] and said, 'If you don't make it easy for us to get a license, we are not going to apply for one.'" So the agency is making it easy.

Perhaps it is natural for NRC commissioners to harbor a strong bias in favor of keeping the nuclear industry alive, even if safety and democracy have to be compromised. After all, if there were no corporations willing to build new nuclear power plants, soon there would be no need for a Nuclear Regulatory Commission. So NRC commissioners know in their bones that their first priority must be to keep the nuclear industry alive. Every bureaucracy's first priority is self-perpetuation. Furthermore, historically a position as an NRC commissioner can lead directly to a high-paying job, often in the nuclear industry itself.

To grease the skids for a nuclear revival, the most important change the NRC has made has been to creatively redefine the meaning of the word "construction." This change was enacted in April, 2007, with lightning speed -- six months from initial proposal to final adoption.

By way of comparison, it took the NRC eleven years to adopt regulations requiring drug testing for nuclear plant operators.

"Construction" has traditionally included all the activities undertaken to build a nuclear power plant, starting with site selection, evaluation, testing and preparation, construction of peripheral facilities like cooling towers, and so on. Even the earliest stages of siting are crucially important with a facility as complex and dangerous as a nuclear power plant.

In April of this year, the NRC officially redefined "construction" to include only construction of the reactor itself -- excluding site selection, evaluation, testing and preparation, construction of peripheral facilities and all the rest. At the time, one senior environmental manager inside NRC complained in an email that NRC's redefinition of "construction" would exclude from NRC regulation "probably 90 percent of the true environmental impacts of construction." Under the new rules, by the time the NRC gets involved, a company will have invested perhaps a hundred million dollars. Will NRC commissioners have the backbone to toss that investment into the toilet if they eventually find something wrong with the site? Or will they roll over for the industry and compromise safety?

The lawyer who dreamed up the redefinition of "construction" is James Curtiss, himself a former NRC commissioner who now sits on the board of directors of the nuclear power giant, Constellation Energy Group. This revolving door pathway from NRC to industry is well-worn.

One NRC commissioner who voted in April to change the definition of construction is Jeffrey Merrifield. Before he left the NRC in July, Mr. Merrifield's last assignment as an NRC commissioner was to chair an agency task force on ways to accelerate licensing.

In April, while he was urging his colleagues at NRC to redefine "construction," Mr. Merrifield was actively seeking a top management position within the nuclear industry. In July he became senior vice president for Shaw Group, a nuclear builder that has worked on 95% of all existing U.S. nuclear plants. Mr. Merrifield's salary at NRC was $154,600. Bloomberg reports that, "In Shaw Group's industry peer group, $705,409 is the median compensation for a senior vice president."

No one in government or the industry seems the least bit embarrassed by any of this. It's just the way it is. Indeed, Mr. Merrifield points out that, while he was an NRC commissioner providing very substantial benefits to the nuclear industry by his decisions, his concurrent search for a job within the regulated industry was approved by the NRC's Office of General Counsel and its Inspector General. From this, one might conclude that Mr. Merrifield played by all the rules and did nothing wrong. Or one might conclude that venality and corruption reach into the highest levels of the NRC. Or one might conclude that after NRC commissioners have completed their assignment inside government, everyone in the agency just naturally feels they are entitled to a lifetime of lavish reward from the industry on whose behalf they have labored so diligently.

As recently as this summer, Uncle Sam was still devising new ways to revive nuclear power. In July the U.S. Senate allowed the nuclear industry to add a one-sentence provision to the energy bill, which the Senate then passed. The one sentence greatly expanded the loan guarantee provisions of the Energy Policy Act of 2005. The 2005 Act had specified that Uncle Sam could guarantee loans for new nuclear power plants up to a limit set each year by Congress. In 2007 the limit was set at a paltry $4 billion. The one-sentence revision adopted by the Senate removed all limits on loan guarantees. The nuclear industry says it needs at least $50 billion in the next two years. Michael J. Wallace, the co-chief executive of UniStar Nuclear, a partnership seeking to build nuclear reactors, and executive vice president of Constellation Energy, said: "Without loan guarantees we will not build nuclear power plants."

The Senate and the House of Representatives are presently arm-wrestling over the proposed expansion of loan guarantees. In June, the White House budget office said that the Senate's proposed changes to the loan-guarantee program could "significantly increase potential taxpayer liability" and "eliminate any incentive for due diligence by private lenders." On Wall Street this is known as a "moral hazard" -- conditions under which waste, fraud and abuse can flourish.

All these subsidies to revive a dead duck run directly counter to free-market ideals and capitalism's credo of unfettered competition. So the intriguing question remains, Why? Why wouldn't the nation go whole-hog for alternative energy sources and avoid all the problems that accompany nuclear power -- routine radioactive releases, the constant fear of a serious accident, the unsolved problem of radioactive waste that must be stored somewhere reliably for a million years, and -- the greatest danger of all -- the inevitable reality that anyone who can build a nuclear power plant can build an atomic bomb if they set their minds to it? The recent experience of Israel, India, North Korea and Pakistan in this regard is completely convincing and undeniable.

So why is Uncle Sam hell-bent on reviving nuclear power? I don't have a firm answer and can only speculate. Perhaps from the viewpoint of both Washington and Wall Street, nuclear power is preferable to renewable-energy alternatives because it is extremely capital-intensive and the people who provide the capital get to control the machine and the energy it provides. It provides a rationale for a large centralized bureaucracy and tight military and police security to thwart terrorists. This kind of central control can act as a powerful counterweight to excessive democratic tendencies in any country that buys into nuclear power. Particularly if a country signs a contract with the U.S. or one of its close allies for delivery of fuel and removal of radioactive wastes, political control becomes a powerful (though unstated) part of the bargain. If you are dependent on nuclear power for electricity and you are dependent on us for reactor fuel, you are in our pocket. On the other hand, solar, wind and other renewable energy alternatives lend themselves to small-scale, independent installations under the control of local communities or even households. Who knows where that could lead?

Then I think of the present situation in the Middle East. Saddam Hussein started down the road to nuclear power until the Israelis bombed to smithereens the Osirak nuclear plant he was building in 1981. That ended his dalliance with nuclear power and nuclear weapons -- but that didn't stop Don Rumsfeld and Dick Cheney from using Saddam's nuclear history as an excuse to invade his country and string him up.

And now something similar is unfolding in Iran. Iran wants nuclear power plants partly to show how sophisticated and capable it has become, and partly to thumb its nose at the likes of Don Rumsfeld and Dick Cheney -- and perhaps to try to draw us into another war that would indelibly mark us for the next hundred years as enemies of Islam, serving to further unite much of the Arab world against us.

Is this kind of thinking totally nuts? I don't think so. Newsweek reported in its October 1, 2007 issue that Dick Cheney has been mulling a plan to convince the Israelis to bomb the Iranian nuclear power plant at Natanz, hoping to provoke the Iranians into striking back so that the U.S. would then have an excuse to bomb Iran. I'm not making this up.

So clearly there are more important uses for nuclear power than just making electricity. Arguably, nuclear reactors have become essential tools of U.S. foreign policy -- being offered, withheld, and bargained over. They have a special appeal around the world because they have become double-edged symbols of modernity, like shiny toy guns that can be loaded with real bullets. Because of this special characteristic, they have enormous appeal and can provide enormous bargaining power. Witness North Korea. And, as we have seen, nuclear reactors can provide excuses to invade and bomb when no other excuses exist.

So perhaps Uncle Sam considers it worth investing a few hundred billion dollars of taxpayer funds to keep this all-purpose Swiss army knife of U.S. foreign policy available in our back pocket. In the past five years, we've already devoted $800 billion to splendid little wars in Afghanistan and Iraq, at least partly to secure U.S. oil supplies. Uncle Sam's desperate attempts to revive nuclear power can perhaps best be understood as part of that ongoing effort at oil recovery.

Meanwhile, investors should think twice before buying into the "nuclear renaissance" because there's another "renaissance" under way as well: a powerful anti-nuclear movement is growing again, and it will toss your billions into the toilet without hesitation. Indeed, with glee.


EARLY PUBERTY IN GIRLS TROUBLING

The trend raises the risk of breast cancer, emotional problems.

By Dorsey Griffith, Bee Medical Writer

American girls are entering puberty at earlier ages, putting them at far greater risk for breast cancer later in life and for all sorts of social and emotional problems well before they reach adulthood.

Girls as young as 8 increasingly are starting to menstruate, develop breasts and grow pubic and underarm hair -- biological milestones that only decades ago typically occurred at 13 or older. African American girls are especially prone to early puberty.

Theories abound as to what is driving the trend, but the exact cause, or causes, is not known. A new report, commissioned by the San Francisco-based Breast Cancer Fund, has gathered heretofore disparate pieces of evidence to help explain the phenomenon -- and spur efforts to help prevent it.

"This is a review of what we know -- it's absolutely superb," said Dr. Marion Kavanaugh-Lynch, an oncologist and director of the California Breast Cancer Research Program in Oakland, which directs tobacco tax proceeds to research projects. "Having something like this document put together that discusses all the factors that influence puberty will advance the science and allow us to think creatively about new areas of study."

The stakes are high: "The data indicates that if you get your first period before age 12, your risk of breast cancer is 50 percent higher than if you get it at age 16," said the report's author, biologist Sandra Steingraber, herself a cancer survivor. "For every year we could delay a girl's first menstrual period, we could prevent thousands of breast cancers."

Kavanaugh-Lynch said most breast cancer cells thrive on estrogen, and girls who menstruate early are exposed to more estrogen than normally maturing girls.

Steingraber's paper, "The Falling Age of Puberty in U.S. Girls: What We Know, What We Need to Know," examines everything from obesity and inactivity to family stress, sexual imagery in media sources and accidental exposures of girls to chemicals that can change the timing of sexual maturation.

Steingraber concludes that early puberty could best be understood as an "ecological disorder," resulting from a variety of environmental hits.

"The evidence suggests that children's hormonal systems are being altered by various stimuli, and that early puberty is the coincidental, non-adaptive outcome," she writes.

Steingraber's report is being released amid growing national interest in how the environment contributes to disease, particularly cancer.

California is at the forefront of the research movement. Among the ongoing efforts:

** The California Environmental Contaminant Biomonitoring Program, a five-year, state-funded project, will measure chemical exposures in blood and urine samples from more than 2,000 Californians.

** The Bay Area Breast Cancer and the Environment Research Center, a federally funded project run by scientists at Kaiser Permanente and the University of California, San Francisco, is studying predictors of early puberty through monitoring of environmental exposures in more than 400 Bay Area girls over several years.

For years, parents, doctors and teachers have recognized the trend in early puberty among girls, with little information to explain it.

Dr. Charles Wibbelsman, a pediatrician with Kaiser Permanente in San Francisco and a member of the American Academy of Pediatrics committee on adolescents, said he now routinely sees girls as young as 8 with breast development and girls as young as 9 who have started their periods. He said the phenomenon is most striking in African American girls.

"We don't think of third-graders as using tampons or wearing bras," he said. In fact, he said, pediatricians are having to adjust the way they do regular check-ups because the older approaches don't jibe with reality.

Steingraber acknowledges that some of the shift in girls' puberty is evolutionary, a reflection of better infectious disease control and improved nutrition, conditions that allow mammals to reproduce earlier.

But since the mid-20th century, she said, other factors seem to have "hijacked the system" that dictates the onset of puberty.

Rising childhood obesity rates clearly play a role, she said, noting that chubbier girls tend to reach puberty earlier than thinner girls. Leptin, a hormone produced by body fat, is one trigger for puberty, and leptin levels are higher in blacks than in other groups.

But obesity cannot alone be blamed for the shifts, she said. Steingraber's paper explored many other factors that likely play a role, including exposure to common household chemicals. And she cited findings that link early puberty with premature birth and low birth weight, formula feeding of infants and excessive television viewing and media use.

"My job was to put together a huge jigsaw puzzle," she said.

Steingraber also reported associations of early puberty with emotional and social problems. "The world is not a good place for early maturing girls," she said. "They are at higher risk of depression, early alcohol abuse, substance abuse, early first sexual encounter and unintended pregnancies."

The reasons for this may be related to the way these children are treated or because of the way puberty affects a child's judgment, she said.

"It's possible that developing an adult-style brain at age 10 instead of 14 makes you make decisions about your life that are not really in your best interest," she said.

Priya Batra, a women's health psychologist at Kaiser Permanente in Sacramento, said she's seen the effects on girls who "look like sexual beings before they are ready to be sexual beings," and counseled mothers worried about their daughters entering puberty too early.

"It's a stressful culture, and we have a lot of demands on children," she said. "It's hard when we add this other layer of early puberty."


From: New York Times, Sept. 6, 2007

SOME FOOD ADDITIVES RAISE HYPERACTIVITY, STUDY FINDS

By Elisabeth Rosenthal

Common food additives and colorings can increase hyperactive behavior in a broad range of children, a study being released today found.

It was the first time researchers conclusively and scientifically confirmed a link that had long been suspected by many parents. Numerous support groups for attention deficit hyperactivity disorder have for years recommended removing such ingredients from diets, although experts have continued to debate the evidence.

But the new, carefully controlled study shows that some artificial additives increase hyperactivity and decrease attention span in a wide range of children, not just those for whom overactivity has been diagnosed as a learning problem.

The new research, which was financed by Britain's Food Standards Agency and published online by the British medical journal The Lancet, presents regulators with a number of issues: Should foods containing preservatives and artificial colors carry warning labels? Should some additives be prohibited entirely? Should school cafeterias remove foods with additives?

After all, the researchers note that overactivity makes learning more difficult for children.

"A mix of additives commonly found in children's foods increases the mean level of hyperactivity," wrote the researchers, led by Jim Stevenson, a professor of psychology at the University of Southampton. "The finding lends strong support for the case that food additives exacerbate hyperactive behaviors (inattention, impulsivity and overactivity) at least into middle childhood."

Additional background reading:

P. Grandjean and P.J. Landrigan, "Developmental neurotoxicity of industrial chemicals," The Lancet Vol. 368 (December 16, 2006), pgs. 2167-2178.

In response to the study, the Food Standards Agency advised parents to monitor their children's activity and, if they noted a marked change with food containing additives, to adjust their diets accordingly, eliminating artificial colors and preservatives.

But Professor Stevenson said it was premature to go further. "We've set up an issue that needs more exploration," he said in a telephone interview.

In response to the study, some pediatricians cautioned that a diet without artificial colors and preservatives might cause other problems for children.

"Even if it shows some increase in hyperactivity, is it clinically significant and does it impact the child's life?" said Dr. Thomas Spencer, a specialist in Pediatric Psychopharmacology at Massachusetts General Hospital.

"Is it powerful enough that you want to ostracize your kid? It is very socially impacting if children can't eat the things that their friends do."

Still, Dr. Spencer called the advice of the British food agency "sensible," noting that some children may be "supersensitive to additives" just as some people are more sensitive to caffeine.

The Lancet study focused on a variety of food colorings and on sodium benzoate, a common preservative. The researchers note that removing this preservative from food could cause problems in itself by increasing spoilage. In the six-week trial, researchers gave randomly selected groups of several hundred 3-year-olds and 8- and 9-year-olds drinks with additives -- colors and sodium benzoate -- that mimicked the mix in children's drinks that are commercially available. The dose of additives consumed was equivalent to that in one or two servings of candy a day, the researchers said. The children's diets were otherwise controlled to avoid other sources of the additives.

A control group was given an additive-free placebo drink that looked and tasted the same.

All of the children were evaluated for inattention and hyperactivity by parents, teachers (for school-age children) and through a computer test. Neither the researchers nor the subjects knew which drink any of the children had consumed.

The researchers discovered that children in both age groups were significantly more hyperactive and had shorter attention spans if they had consumed the drink containing the additives. The study did not try to link specific consumption with specific behaviors. The study's authors noted that other research suggested that the hyperactivity could increase in as little as an hour after artificial additives were consumed.

The Lancet study could not determine which of the additives caused the poor performances because all the children received a mix. "This was a very complicated study, and it will take an even more complicated study to figure out which components caused the effect," Professor Stevenson said.

Copyright 2007 The New York Times Company


From: New Scientist (pg. 44), Sept. 1, 2007

TOXIC COCKTAIL

By Bijal Trivedi

Today, and every day, you can expect to be exposed to some 75,000 artificial chemicals. All day long you will be breathing them in, absorbing them through your skin and swallowing them in your food. Throughout the night they will seep out of carpets, pillows and curtains, and drift into your lungs. Living in this chemical soup is an inescapable side effect of 21st-century living. The question is: is it doing us any harm?

There are good reasons to think that it might be. Not because of the action of any one chemical but because of the way the effects of different components combine once they are inside the body. As evidence stacks up that this "cocktail effect" is real, regulators around the world are rethinking the way we measure the effects of synthetic mixtures on health.

Environmentalists have long warned of this danger, but until recently there was no solid evidence to confirm their fears -- nor any to allay them. Most toxicity testing has been done on a chemical-by-chemical basis, often by exposing rats to a range of concentrations to find the maximum dose that causes no harm. It's a long way from gauging the effects of the complex mixtures we experience in everyday life, and that could be a dangerous omission.

"When you get a prescription the doctor will ask what else you are taking, because they are concerned about drug interactions, which everyone knows can be quite devastating," says Shanna Swan, director of the Center for Reproductive Epidemiology at the University of Rochester in New York. This also happens with chemicals like pesticides and endocrine disrupters, she adds. "You have to consider their interactions, and we are just starting to do that."

To assess the risk posed by such mixtures, a small number of scientists in Europe and the US are now testing chemical brews on yeast, fish and rats. The effects could be additive, or they might be synergistic -- that is, greater than the sum of the parts. They could even cancel each other out. Finding out is important, because we don't have enough data on many compounds to anticipate how they will interact when mixed. Other researchers are probing for associations between disease in humans and past exposure to groups of chemicals.

Andreas Kortenkamp, an environmental toxicologist at the School of Pharmacy, University of London, and his colleagues developed an interest in these mixture effects after they noticed a rise in endocrine disorders, suggesting that the body's hormonal systems may have been disrupted. In men there were increases in congenital malformations like hypospadia -- in which the urethra is on the wrong side of the penis -- and cryptorchidism, a condition in which the testes fail to descend into the scrotum. There was also a rise in testicular cancer and lower sperm counts. In women there were more breast cancers and polycystic ovaries.

These increases posed a conundrum for the researchers. When they examined people who had these disorders, and their mothers, they found they had only very low levels of the chemicals that are known to cause the disorders; in the lab, only much higher concentrations of these individual compounds have been found to produce the same effects. This led Kortenkamp to suspect that mixtures were the missing link. He wondered if the effects of different chemicals, acting through the same biochemical pathway, could add up.

Kortenkamp's group focused on groups of chemicals called xenoestrogens, compounds that disrupt the activity of the hormone oestrogen and induce the development of female sexual characteristics. High levels of xenoestrogens in the environment have been shown to feminise male fish, and have even driven one species in an isolated experimental lake in Canada almost to extinction.

In 2002 Kortenkamp and his colleagues tested a mix of eight xenoestrogens on yeast. These included chemicals used as plasticisers, sunscreen ingredients and others found in cooling and insulating fluids. In the mixture, each was below the level that toxicologists call the "no-observed-effect concentration" -- the level that should be safe. Sure enough, the combination triggered unusual effects in the yeast. Kortenkamp and his colleagues dubbed the mixture effect "something from nothing" (see Diagram).

Kortenkamp and his colleagues found that if the doses of all eight chemicals were simply added together, after adjusting for the varying potencies, this new cumulative dose could be used to predict the effect -- a principle called "dose addition". "This result was to be expected, but it had never been shown with endocrine disrupters until our work," says Kortenkamp. Intuitively this makes sense, he says: "Every mixture component contributes to the effect, no matter how small."
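
In outline, "dose addition" is easy to state: scale each chemical's dose by its potency relative to a reference oestrogen, then sum the scaled doses into one cumulative dose that predicts the mixture's effect. Here is a minimal sketch in Python; the chemicals, doses and potencies are invented for illustration and are not the study's data:

    # Dose addition ("toxic equivalency") in miniature.
    # All numbers below are hypothetical, for illustration only.
    mixture = [
        # (chemical, dose, potency relative to the reference oestrogen)
        ("plasticiser A",    2.0, 0.0010),
        ("sunscreen agent",  5.0, 0.0005),
        ("coolant additive", 1.0, 0.0020),
    ]

    # Scale each dose by its relative potency, then sum: the principle of
    # dose addition says this single equivalent dose predicts the effect.
    equivalent_dose = sum(dose * potency for _, dose, potency in mixture)
    print("oestrogen-equivalent dose: %.4f" % equivalent_dose)

    # Every component contributes, however small -- which is how a mix of
    # individually "safe" doses can add up to an active one.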

Since then the effect has been shown with other species, too. Kortenkamp and his colleagues now report that mixtures of xenoestrogens feminised males to varying degrees even though the individual components should have been harmless. In July this year the team showed that a blend of anti-androgens -- chemicals that block the effect of male sex hormones -- can work in the same way. They exposed pregnant rats to two common fungicides, vinclozolin and procymidone, and the prostate cancer drug flutamide, and then screened the male offspring for reproductive deformities. At higher doses, each of these three chemicals wreaks havoc with sex hormones, and they all do it via the same mechanism: they disrupt male development by blocking androgen receptors and so prevent natural hormones from binding. The researchers found that even when the chemicals were used in doses that had no effect when given individually to pregnant rats, a mixture of them disrupted the sexual development of male fetuses.

Earl Gray, an ecotoxicologist at the reproductive toxicology division of the US Environmental Protection Agency's Health and Environmental Effects Research Laboratory (HEERL) in Research Triangle Park, North Carolina, and his team also tried exposing pregnant rats to vinclozolin and procymidone. When they exposed the animals to the compounds individually, they too saw no effect. But when they combined the two, half of the males were born with hypospadia. Gray calls this phenomenon "the new math -- zero plus zero equals something".

Gray then tried the same experiment with phthalates -- the ubiquitous compounds that are used to soften plastics and thicken lotions, and are found in everything from shampoo to vinyl flooring and flexible medical tubing. They also disrupt male development, in this case by stopping the fetus from making testosterone. The mix of two phthalates that Gray used caused many of the same effects on male rat fetuses as a mixture of vinclozolin and procymidone.

It makes sense that chemicals targeting the same pathway would have an additive effect. But what about mixtures of chemicals that work via different mechanisms? Surely the individual doses of such chemicals would not be additive in the same way.

"The mixture of different chemicals shouldn't have had any effect. But it did"In 2004, Gray and his team put this to the test by mixing procymidone with a phthalate at levels that, on their own, would produce no effect. Because the chemicals work via different routes, he expected that the combination wouldn't have any effect either. But they did. Then the team mixed seven compounds -- with four independent routes of action -- each at a level that did not produce an effect. "We expected nothing to happen, but when we give all [the compounds] together, all the animals are malformed," Gray says. "We disrupted the androgen receptor signalling pathway by several different mechanisms. It seems the tissue can't tell the difference and is responding in an additive fashion."

All of this is throwing up problems for regulatory agencies around the world. Governments generally don't take into account the additive effects of different chemicals, with the exception of dioxins -- which accumulate to dangerous levels and disrupt hormones in the body -- and some pesticides. For the most part, risk assessments are done one chemical at a time.

Even then, regulation is no simple issue. First you need to know a chemical's potency, identify which tissues it harms and determine whether a certain population might be exposed to other chemicals that might damage the same tissue. Add in the cocktail effect and it gets harder still. "It is a pretty difficult regulatory scenario," admits Gray. "At this point the science is easier than implementing the regulatory framework."

Mixed up inside

For one thing, with many mixtures it's almost impossible to work out how much we're getting. The endocrine disrupter diethyl phthalate, for example, easily escapes from plastics and is in so many different products -- from toothbrushes to toys, and packaging to cosmetics and drugs -- that it would be difficult to work out the aggregate exposure from all sources, says Gray.

This also makes it tricky to investigate possible links between chemical mixtures and disease. "Everyone has exposure to chemicals, even people living in the Arctic," says John Sumpter, an ecotoxicologist at Brunel University in London. "We can't go to a group with a mixture of nasty chemicals and then go to another who have had no exposure and compare their rate of breast cancer risk or sperm count. We are doing a scientific experiment by letting these chemicals accumulate in our bodies, blood and wildlife."

That's why some researchers are suggesting new ways to gauge the effects of chemical mixtures on the body. For example, rather than trying to identify levels of individual xenoestrogens in a patient's blood, it may be more efficient to take a serum sample and determine the "oestrogenic burden" being imposed on their body from a variety of different sources by testing the sample on oestrogen-sensitive cells in the lab. "It might work well as a screening tool to identify people with potential problems," says Linda Birnbaum, director of the experimental toxicology division at HEERL. Then, for example, you could make cocktails of foods, water and other products from the person's life to try to identify the source of the chemicals.

Nicolas Olea, a doctor and oncologist at the University of Granada, Spain, is already trying this kind of approach. He is exploring whether exposure to chemicals with oestrogenic activity leads to genital malformations like cryptorchidism and hypospadia in men, and breast cancer in women. He and his colleagues took samples from various tissues and measured the ability of the environmental contaminants in them to trigger the proliferation of lab-cultured oestrogen-sensitive cells. Because it is difficult to predict from a compound's structure whether it might have oestrogenic effects, a cell-based assay like this is a cheap way to screen potentially harmful chemicals. They found that the higher this "total effective xenoestrogen burden," the greater the chance the contaminants could disrupt oestrogen-dependent processes.

Others are cautiously optimistic about Olea's approach. "The concept is correct, I cannot comment on how well the cell effect tracks a cancer effect," says James Pirkle, deputy director of the US Centers for Disease Control's Environmental Health Laboratory in Atlanta, Georgia.

Shanna Swan is doing something similar. In a study published in 2005 she showed that boys whose mothers had had higher levels of five phthalates while their babies were in the womb had a shorter distance between the anus and genitals -- a marker of feminising activity. They also had higher rates of cryptorchidism compared to sons of mothers with lower phthalate levels. Swan devised a cumulative score to reflect exposure levels to all five phthalates and found that score was "very predictive of ano-genital distance".

The method is still expensive, and a regular "phthalate scan" isn't on the cards just yet. A potentially less costly approach, says Pirkle, is regular biomonitoring of subsets of the population to measure the levels of dangerous chemicals in blood and urine, and link particular chemicals to specific health effects. Every two years since 2001, the US Centers for Disease Control has published data on the US population's exposure to a range of potentially harmful chemicals. In 2005 the agency released data for 148 chemicals; next year it plans to release a report covering 275. While that number falls far short of the number of new chemicals entering the fray each year, Pirkle says that technology is making it ever easier to monitor new substances. The reports do not consider specific mixtures but include exposure data for each individual chemical to make it easier to calculate the likely effects of mixtures.

The European Union, meanwhile, is taking steps to control the number of chemicals being released in the first place. On 1 June its REACH (registration, evaluation, authorisation and restriction of chemical substances) regulations became law. The aim is to cut health risks associated with everyday chemicals by forcing chemical manufacturers and importers to register their compounds and provide safety information to the new European Chemicals Agency, based in Helsinki, Finland. This information must be provided before the chemicals are sold. The new law shifts the burden of responsibility for the health effects of chemicals from government to industry and is also intended to encourage the use of less harmful alternatives for the more toxic chemicals.

Not everyone is so worried about the cocktail effect. Some researchers even find it reassuring -- or at least not as bad as it could be. Kevin Crofton, a neurotoxicologist at the EPA, explored how a mixture of 18 polyhalogenated aromatic hydrocarbons found in electrical equipment, flame retardants and paints could disrupt thyroid hormone levels in rats. At the lowest doses of the mixture the effect on the levels of the thyroid T4 hormone was what you would expect from the principle of dose addition; at the highest doses the effect was twice that. "Some people would call that synergy," says Crofton, "but it is not a very big synergistic effect. It was a twofold difference."

He adds: "These results are quite reassuring because EPA's default to calculate the cumulative risk of mixtures is dose addition." Only recently, however, have scientists like Crofton been able to prove that this default is correct. "If it had been a 20-fold difference I would have said, 'Boy, the agency needs to look into how it is doing things.'"

Kortenkamp says that regulatory bodies seem to be starting to acknowledge that chemical-by-chemical risk assessment provides a false sense of security. In November last year around 100 scientists and EU policy-makers at the "Weybridge +10" workshop held in Helsinki concluded that mixture effects must be considered during risk assessment and regulation. The European Commission plans to spend more on probing the effects of environmental chemicals on human health.

For now, though, chemicals are an inescapable part of life. And while high-profile campaigns by pressure groups like WWF seek to alert us to what they see as the dangers of artificial chemicals, some toxicologists warn that they may be overstating the case. "I think you need to be careful about hyping the risk," says Crofton, referring to stories in which individuals have been screened for several hundred chemicals. "When you say I have 145 chemicals in my body, that in itself does not translate into a hazard. You have to know something about the dose, the hazard and how all these chemicals can add up." Olea, however, suggests that it is sensible to be cautious. "If you don't know it is good, assume it is bad," he says.

Like it or not, the chemicals are with us. "People can't keep phthalates [or other chemicals] out of their air, water or food," says Swan. "Most people don't have the information or money to do these things." A more productive approach might be to tell people how to limit exposure to harmful substances and request better labelling from manufacturers. "We need to put a lot of money into figuring out what these things do in real-world scenarios and take regulatory action," she says. "Just like we limited cigarette smoke exposure, we'll have to limit other exposures."

Bijal Trivedi is a freelance science writer based in Washington DC

Copyright Reed Business Information


From: The Daily Green, Sept. 4, 2007

EDITORIAL: A PREEMPTIVE STRIKE AGAINST TOXIC CHEMICALS

California Considers Schemes To Stop Pollution Before It Starts

In the United States, the use of potentially toxic chemicals has always had a leap-before-you-look regulatory approach, with the emphasis on producing innovative new products and solutions. Problem is, time and again substances thought to be harmless prove toxic, or persistent in the environment, and cleaning them up becomes a costly -- both in health and in dollars -- headache.

That's why it's good to watch California, which is considering two competing proposals that may be more complementary, according to a Los Angeles Times editorial, than they appear.

Gov. Arnold Schwarzenegger, a Republican, is pushing a "green chemistry" initiative that has the backing of business and industry in the Golden State. It would look at chemicals used and study their potential risks -- as well as potential alternatives.

Assemblyman Mike Feuer, a Democrat, has proposed a Massachusetts-style inventory of all hazardous chemicals used by California businesses, which would require companies to file regular reports about their plans to reduce or eliminate their use.

The take-home message from both proposals is this: Business as usual isn't working. We've had too many man-made chemicals that turn out to prey on the health of humans or wildlife to let more into the environment without the closest of scrutiny. After all, we're talking about chemicals developed in labs -- chemicals that the human body and the environment have not evolved to process or defend themselves against.

Copyright Reuters


From: Financial Times (London, UK), Sept. 5, 2007

CLIMATE CHANGE DEBATE NEEDS REVOLUTION

By John Aglionby in Jakarta

A revolution of society on a scale never witnessed in peacetime is needed if climate change is to be tackled successfully, the head of a major business grouping has warned.

Bjorn Stigson, the head of the Geneva-based World Business Council for Sustainable Development (WBCSD), predicted governments would be unable to reach agreement on a framework for reducing carbon emissions at either a US-sponsored meeting in Washington later this month or at a United Nations climate summit in Indonesia in December.

Climate change is also expected to be high on the agenda at this week's annual summit of Pacific leaders in Sydney.

"It will probably get worse before it gets better before governments feel they've got the political mandate to act," he told the Financial Times during a visit to Jakarta. "We're going to have to go into some sort of crisis before it's going to be resolved. I don't think people have realised the challenge. This is more serious than what people think."

The "challenge", Mr Stigson said, is for developed nations to cut carbon emission levels by 60 to 80 per cent from current levels by 2050 if global emissions are to be kept below 550 parts per million. Global emissions at that level would keep average permanent global temperature increase below 3 degrees by 2050, a level beyond which most scientists say climate change would be significantly worse.

The WBCSD reached this conclusion after studying the Stern review on climate change, the International Energy Agency's World Energy Outlook, and a recent Intergovernmental Panel on Climate Change review.

"I think it's beginning to dawn on people that we are talking about such a major change in society people are saying this is tougher than what we thought," he said. "How do you change society in a radical way in a democracy so the people you want to vote for you are also going to suffer the consequences of the policies that you put in place."

"I don't think we've seen that kind of a challenge in societal change happening peacefully. It's [only] happened in revolutions."

The 200 members of the WBCSD, which have a combined market cap of $6,000bn, are dismayed by politicians' lack of political will to address the issues, Mr Stigson said.

"We're very concerned by what we see and the lack of response from governments in grasping the responsibility they have in dealing with this issue," he said. "Our problem right now is that we...don't know what the policies are going to be beyond 2012. How do you take these issues into consideration when you build a new plant that's going to live for 30, 40 years."

The WBCSD wants rich countries to agree on global targets for themselves while committing $80bn-$100bn a year, plus technology, to developing nations to help them grow more sustainably.

"If that deal is not there, you'll be in a situation where India, China and Brazil will say, we're not going to get into any agreement," he said. "If I were betting my money now, I would bet that by 2012 the world will not have a global framework. We will have a patchwork of regional and national regulations that we have to make as compatible as possible."

Copyright The Financial Times Limited 2007


From: Courier-Post (Cherry Hill, N.J.), Aug. 13, 2007

OYSTERMEN 'IN THE FIGHT'

By Richard Pearsall

Port Norris, N.J. -- Thanks to the oyster industry, this old port at the mouth of the Maurice River once had more millionaires per capita than any other place in New Jersey.

Or so the story goes.

Those days are gone, the oyster fishery on the Delaware Bay having been decimated by not one, but two waves of parasites.

But the industry survives -- a few dozen oystermen still ply the bay -- and New Jersey is striving to ensure that the fishery at least holds its own.

"We're in the fight," said Eric Powell, director of the Haskin Shellfish Research Laboratory, the Rutgers outpost here dedicated to both research and managing the oyster beds.

Toward the latter end, the lab, in conjunction with the Army Corps of Engineers, the state Department of Environmental Protection (DEP) and others, has been dumping millions of clamshells into the bay each year for the past three years.

The idea is to encourage what's called "recruitment," building up a layer of the clean, hard surfaces that baby oysters, or "spat," love to grab hold of to get started in life.

The results so far are inconclusive.

Oystermen will be allowed to harvest 80,000 bushels from state managed oyster beds this year, slightly above the average of 70,000 bushels a year that has prevailed over the past decade.

But the number has gone up and down from year to year and is likely to continue to do so depending on the strength of the parasite that began afflicting the oyster beds in the 1990s.

"The mortality rate from disease is still too high," Powell said of dermo, which has proven to be a persistent enemy of the Eastern oyster. "This is one of the penalties of global warming."

Dermo is a warm-water disease, he explained, one that has moved up the East Coast, reaching New Jersey in the 1990s and now a problem as far north as the coast of Maine.

It also likes high salinity, so it is more of a problem in periods of drought when the salt line moves further up the bay.

"We can minimize the damage from dermo," Powell said, "but we can't eradicate it. It's going to take its toll. As a result, we're going to have to be more proactive."

The oystermen themselves seem to have found a niche market for their product and are looking more to protect and preserve than grow their industry.

"We want the industry to be stable," said Steve Fleetwood, an oystermen himself and the manager of the Hillard Bloom Packing Company in Port Norris.

"With the dermo still around, I don't know that any of us are really comfortable."

At its peak in the late 1920s, the oyster fishery sent some 500 boats and 4,000 men out on to the Delaware Bay, where they dredged up as many as two million bushels of oysters a year.

The captains of the schooners that sailed the bay got rich, as did the entrepreneurs who ran the shucking and packing houses in Bivalve and Shell Pile and the railroad barons whose cars carried the prized seafood to market.

"This was a boom town," said Megan Wren, surveying the now largely desolate area around her offices here at the Bayshore Discovery Project, a nonprofit whose mission is to preserve the history and culture of the region, but also to promote its future.

Nature turned out to be even crueler to the oyster fishery than the Great Depression, striking with two waves of parasites that, while not harmful to humans, are deadly to oysters.

In the 1950s, a parasite called MSX destroyed more than 90 percent of the harvestable oysters on the bay's bottom.

Then in the 1990s, just when the oyster beds were beginning to recover from MSX, the second parasite, dermo, struck.

Today about 70 boats, each with a license attached, dredge the Delaware Bay for oysters.

They still employ basically the same techniques as the sailing schooners that preceded their diesel-powered boats, dragging a steel cage along the bottom, pulling it up, then sorting out its contents on deck for the prized catch.

Rutgers and the DEP carefully monitor both the bay bottom and the catch, enabling them to control this year's catch and accurately establish a quota for next year.

The overall quota the state sets for the beds it manages is divided among the license holders, resulting this year in an allotment of about 1,100 bushels per boat.

"The number of licenses is limited," said Jason Hearon, fisheries biologist for the state DEP, "in part to keep the pressure for a bigger harvest down."

One problem facing the program is finding enough shell to use for seed strata.

"A few years ago this was a waste product they couldn't get rid of," Powell said of the clamshells. Now there is competition from road work and other uses.

"It's really criminal for shells to go anywhere but back into their natural environment," he concluded.

Reach Richard Pearsall at (856) 486-2465 or rpearsall@courierposton


From: Denver Post (Denver, Colo.), Sept. 3, 2007

GRASSLANDS ARE LOSING GROUND

By Katy Human Denver Post Staff Writer

On eastern Colorado's grassy rangeland, the dominant plant of the future may be one shunned even by the hungriest of cattle: fringed sage.

The unpalatable mint-green shrub increased in bulk by 40 times during climate change experiments conducted by the U.S. Department of Agriculture and Colorado State University, the scientists reported last week.

"It was a minor species at the beginning of the study, but by the end of four years, 10 percent of the aboveground cover was this species," said Jack Morgan, a USDA range scientist in Fort Collins.

"Here's a plant that may be a winner in a greenhouse future," Morgan said.

Grassland covers about 40 percent of Earth's land, Morgan and his colleagues wrote in a paper in the Proceedings of the National Academy of Sciences, and woody shrubs have been moving steadily into most of the planet's grasslands for more than a century. Scientists have attributed that livestock-unfriendly trend to many things, Morgan said, from the suppression of natural fires to overgrazing, drought and climate change.

[Photo caption: Scientists set up greenhouses on prairie 40 miles northeast of Fort Collins and then pumped carbon dioxide into some of them to see how that would affect the vegetation over four years. (Photo courtesy ARS Rangeland Resources Research Unit)]

"There's some who would debate if carbon dioxide and climate change were a factor," Morgan said. "But that's what our study shows - clearly."

Forty miles northeast of Fort Collins, on Colorado's northeastern plains, Morgan and his colleagues set up clear plastic greenhouses around plots of prairie.

They pumped extra carbon dioxide into some of the greenhouses, left ambient air in others, and followed plant communities in both -- and on normal prairie -- for four years.

Fringed sage increased its aboveground bulk by 40 times in the greenhouses with extra carbon dioxide, the team found. "That's a huge response," Morgan said. "I've not seen any plant in the literature that responds as much."

Other studies have shown that plant species react differently to climate change.

In Colorado's mountains, fields of wildflowers gave way to sage during warming experiments.

In a California grassland, species diversity dropped when scientists increased carbon dioxide levels.

On rangeland, the likely continued woody plant encroachment is not just an ecological problem. It's a financial one for ranchers, said Terry Fankhauser, executive vice president of the Arvada-based Colorado Cattlemen's Association.

"We know climate change is inevitable, and we know that means species changes," Fankhauser said.

More shrubs and fewer grasses would not be welcome changes, he said, although Colorado cattlemen have long had to deal with woody invaders.

"It's already a part of everyday ranch management," Fankhauser said. "We have to control those woody species to maintain native vegetation."

Staff writer Katy Human can be reached at 303-954-1910 or khuman@denverpost.com.


From: The New York Times (pg. A19), Aug. 24, 2007

SEEKING WILLIE HORTON

By Paul Krugman, Times columnist

So now Mitt Romney is trying to Willie Hortonize Rudy Giuliani. And thereby hangs a tale -- the tale, in fact, of American politics past and future, and the ultimate reason Karl Rove's vision of a permanent Republican majority was a foolish fantasy.

Willie Horton, for those who don't remember the 1988 election, was a convict from Massachusetts who committed armed robbery and rape after being released from prison on a weekend furlough program. He was made famous by an attack ad, featuring a menacing mugshot, that played into racial fears. Many believe that the ad played an important role in George H.W. Bush's victory over Michael Dukakis.

Now some Republicans are trying to make similar use of the recent murder of three college students in Newark, a crime in which two of the suspects are Hispanic illegal immigrants. Tom Tancredo flew into Newark to accuse the city's leaders of inviting the crime by failing to enforce immigration laws, while Newt Gingrich declared that the "war here at home" against illegal immigrants is "even more deadly than the war in Iraq and Afghanistan."

And Mr. Romney, who pretends to be whatever he thinks the G.O.P. base wants him to be, is running a radio ad denouncing New York as a "sanctuary city" for illegal immigrants, an implicit attack on Mr. Giuliani.

Strangely, nobody seems to be trying to make a national political issue out of other horrifying crimes, like the Connecticut home invasion in which two paroled convicts, both white, are accused of killing a mother and her two daughters. Oh, and by the way: over all, Hispanic immigrants appear to commit relatively few crimes -- in fact, their incarceration rate is actually lower than that of native-born non-Hispanic whites.

To appreciate what's going on here you need to understand the difference between the goals of the modern Republican Party and the strategy it uses to win elections.

The people who run the G.O.P. are concerned, above all, with making America safe for the rich. Their ultimate goal, as Grover Norquist once put it, is to get America back to the way it was "up until Teddy Roosevelt, when the socialists took over," getting rid of "the income tax, the death tax, regulation, all that."

But right-wing economic ideology has never been a vote-winner. Instead, the party's electoral strategy has depended largely on exploiting racial fear and animosity.

Ronald Reagan didn't become governor of California by preaching the wonders of free enterprise; he did it by attacking the state's fair housing law, denouncing welfare cheats and associating liberals with urban riots. Reagan didn't begin his 1980 campaign with a speech on supply-side economics; he began it -- at the urging of a young Trent Lott -- with a speech supporting states' rights delivered just outside Philadelphia, Miss., where three civil rights workers were murdered in 1964.

And if you look at the political successes of the G.O.P. since it was taken over by movement conservatives, they had very little to do with public opposition to taxes, moral values, perceived strength on national security, or any of the other explanations usually offered. To an almost embarrassing extent, they all come down to just five words: southern whites started voting Republican.

In fact, I suspect that the underlying importance of race to the Republican base is the reason Rudy Giuliani remains the front-runner for the G.O.P. nomination, despite his serial adultery and his past record as a social liberal. Never mind moral values: what really matters to the base is that Mr. Giuliani comes across as an authoritarian, willing in particular to crack down on you-know-who.

But Republicans have a problem: demographic changes are making their race-based electoral strategy decreasingly effective. Quite simply, America is becoming less white, mainly because of immigration. Hispanic and Asian voters were only 4 percent of the electorate in 1980, but they were 11 percent of voters in 2004 -- and that number will keep rising for the foreseeable future.

Those numbers are the reason Karl Rove was so eager to reach out to Hispanic voters. But the whites the G.O.P. has counted on to vote their color, not their economic interests, are having none of it. From their point of view, it's us versus them -- and everyone who looks different is one of them.

So now we have the spectacle of Republicans competing over who can be most convincingly anti-Hispanic. I know, officially they're not hostile to Hispanics in general, only to illegal immigrants, but that's a distinction neither the G.O.P. base nor Hispanic voters takes seriously.

Today's G.O.P., in short, is trapped by its history of cynicism. For decades it has exploited racial animosity to win over white voters and now, when Republican politicians need to reach out to an increasingly diverse country, the base won't let them.


TWO FRIENDS DEBATE RISK ASSESSMENT AND PRECAUTION

By Peter Montague

Recently my friend Adam Finkel -- a skilled and principled risk assessor -- won two important victories over the Occupational Safety and Health Administration (OSHA).

In 2000, Adam was appointed Regional Administrator for OSHA in charge of the Denver office. When he began to suspect that OSHA inspectors were being exposed to dangerous levels of the highly-toxic metal, beryllium, he took precautionary action -- urging OSHA to test beryllium levels in OSHA inspectors. It cost him his job.

The agency did not even want to tell its inspectors they were being exposed to beryllium, much less test them. So Adam felt he had no choice -- in 2002, he blew the whistle and took his concerns public. OSHA immediately relieved him of his duties as Regional Administrator and moved him to Washington, where they changed his title to "senior advisor" and assigned him to the National Safety Council -- a place where "they send people they don't like," he would later tell a reporter.

Adam sued OSHA under the federal whistleblower protection statute and eventually won two years' back pay, plus a substantial lump sum settlement, but he didn't stop there. In 2005, he lodged a Freedom of Information Act lawsuit against the agency, asking for all monitoring data on toxic exposures of all OSHA inspectors. Meanwhile, OSHA began testing its inspectors for beryllium, finding exactly what Adam had suspected they would find -- dangerously high levels of the toxic metal in some inspectors.

Adam is now a professor in the Department of Environmental and Occupational Health, School of Public Health, University of Medicine and Dentistry of New Jersey (UMDNJ); and a visiting scholar at the Woodrow Wilson School, Princeton University. At UMDNJ, he teaches "Environmental Risk Assessment."

Earlier this summer, Adam won his FOIA lawsuit. A judge ruled that OSHA has to hand over 2 million lab tests on 75,000 employees going back to 1979. It was a stunning victory over an entrenched bureaucracy.

Meanwhile, in 2006, the American Public Health Association selected Adam to receive the prestigious David Rall Award for advocacy in public health. You can read his acceptance speech here.

When Adam's FOIA victory was announced early in July, I sent him a note of congratulations. He sent back a note, attaching a couple of articles, one of which was a book review he had published recently of Cass Sunstein's book, Risk and Reason. Sunstein doesn't "get" the precautionary principle -- evidently, he simply sees no need for it. Of course the reason we need precaution is that the cumulative impacts of the human economy are now threatening to wreck our only home -- as Joe Guth explained last week in Rachel's News.

In any event, I responded to Adam's book review by writing "A letter to my friend, who is a risk assessor," and I invited Adam to respond, which he was kind enough to do.

So there you have it.

Do any readers want to respond to either of us? Please send responses to peter@rachel.org.

From: Journal of Industrial Ecology (pg. 243), Oct. 1, 2005

ADAM FINKEL REVIEWS CASS SUNSTEIN'S BOOK, RISK AND REASON

A review of: Risk and Reason: Safety, Law, and the Environment, by Cass R. Sunstein. Cambridge, UK: Cambridge University Press, 2002, 342 pp., ISBN 0521791995, $25.00 (Also in paperback: ISBN 0521016258 $22.99).

By Adam M. Finkel

As someone dedicated to the notion that society needs quantitative risk assessment (QRA) now more than ever to help make decisions about health, safety, and the environment, I confess that I dread the arrival of each new book that touts QRA or cost-benefit analysis as a "simple tool to promote sensible regulation." Although risk analysis has enemies aplenty, from both ends of the ideological spectrum, with "friends" such as John Graham (Harnessing Science for Environmental Regulation, 1991), Justice Stephen Breyer (Breaking the Vicious Circle, 1994), and now Cass Sunstein, practitioners have their hands full.

I believe that at its best, QRA can serve us better than a "precautionary principle" that eschews analysis in favor of crusades against particular hazards that we somehow know are needlessly harmful and can be eliminated at little or no economic or human cost. After all, this orientation has brought us increased asbestos exposure for schoolchildren and remediation workers in the name of prevention, and also justified an ongoing war with as pure a statement of the precautionary principle as we are likely to find ("we have every reason to assume the worst, and we have an urgent duty to prevent the worst from occurring," said President Bush in October 2002 about weapons of mass destruction in Iraq). More attention to benefits and costs might occasionally dampen the consistent enthusiasm of the environmental movement for prevention, and might even moderate the on-again, off-again role precaution plays in current U.S. economic and foreign policies. But "at its best" is often a distant, and even a receding, target -- for in Risk and Reason, Sunstein has managed to sketch out a brand of QRA that may actually be less scientific, and more divisive, than no analysis at all.

To set environmental standards, to set priorities among competing claims for environmental protection, or to evaluate the results of private or public actions to protect the environment, we need reliable estimates of the magnitude of the harms we hope to avert as well as of the costs of control. The very notion of eco-efficiency presupposes the ability to quantify risk and cost, lest companies either waste resources chasing "phantom risks" or declare victory while needless harms continue unabated. In a cogent chapter (Ch. 8) on the role of the U.S. judiciary in promoting analysis, Sunstein argues persuasively that regulatory agencies should at least try to make the case that the benefits of their efforts outweigh the costs, but he appears to recognize that courts are often ill-equipped to substitute their judgments for the agencies' about precisely how to quantify and to balance. He also offers a useful chapter (Ch. 10) on some creative ways agencies can transcend a traditional regulatory enforcer role, help polluters solve specific problems, and even enlist them in the cause of improving eco-efficiency up and down the supply chain. (I tried hard to innovate in these ways as director of rulemaking and as a regional administrator at the U.S. Occupational Safety and Health Administration, with, at best, benign neglect from political appointees of both parties, so Sunstein may be too sanguine about the practical appeal of these innovations.)

Would that Sunstein had started (or stopped) with this paean to analysis as a means to an end -- perhaps to be an open door inviting citizens, experts, those who would benefit from regulation, and those reined in by it to "reason together." Instead, he joins a chorus of voices promoting analysis as a way to justify conclusions already ordained, adding his own discordant note. Sunstein clearly sees QRA as a sort of national antipsychotic drug, which we need piped into our homes and offices to dispel "mass delusions" about risk. He refers to this as "educating" the public, and therein lies the most disconcerting aspect of Risk and Reason: he posits a great divide between ordinary citizens and "experts," and one that can only be reconciled by the utter submission of the former to the latter. "When ordinary people disagree with experts, it is often because ordinary people are confused," he asserts (p. 56) -- not only confused about the facts, in his view, but not even smart enough to exhibit a rational form of herd behavior! For according to Sunstein, "millions of people come to accept a certain belief [about risk] simply because of what they think other people believe" (p. 37, emphasis added).

If I thought Sunstein was trying by this to aggrandize my fellow travelers -- scientists trained in the biology of dose-response relationships and the chemistry and physics of substances in the environment, the ones who actually produce risk assessments -- I suppose I would feel inwardly flattered, if outwardly sheepish, about this unsolicited elevation above the unwashed masses. But the reader will have to look long and hard to find citations to the work of practicing risk assessors or scientists who helped pioneer these methods. Instead, when Sunstein observes that "precisely because they are experts, they are more likely to be right than ordinary people. . . brain surgeons make mistakes, but they know more than the rest of us about brain surgery" (p. 77), he has in my view a quaint idea of where to find the "brain surgeons" of environmental risk analysis.

He introduces the book with three epigraphs, which I would oversimplify thus: (1) the general public neglects certain large risks worthy of fear, instead exhibiting "paranoia" about trivial risks; (2) we maintain these skewed priorities in order to avoid taking responsibility for the (larger) risks we run voluntarily; and (3) defenders of these skewed priorities are narcissists who do not care if their policies would do more harm than good. The authors of these epigraphs have something in common beyond their worldviews: they are all economists. Does expertise in how markets work (and that concession would ignore the growing literature on the poor track record of economists in estimating compliance costs in the regulatory arena) make one a "brain surgeon" qualified to bash those with different views about, say, epidemiology or chemical carcinogenesis?

To illustrate the effects of Sunstein's continued reliance on one or two particular subspecies of "expert" throughout the rest of his book, I offer a brief analysis of Sunstein's five short paragraphs (pp. 82-83) pronouncing that the 1989 public outcry over Alar involved masses of "people [who] were much more frightened than they should have been."[1] Sunstein's readers learn the following "facts" in this example:

** Alar was a "pesticide" (actually, it regulated the growth of apples so that they would ripen at a desired time).

** "About 1 percent of Alar is composed of UDMH [unsymmetrical dimethylhydrazine], a carcinogen" (actually, this is roughly the proportion found in raw apples -- but when they are processed into apple juice, about five times this amount of UDMH is produced).

** The Natural Resources Defense Council (NRDC) performed a risk assessment claiming that "about one out of every 4,200 [preschool children] exposed to Alar will develop cancer by age six" (actually, NRDC estimated that exposures prior to age six could cause cancer with this probability sometime during the 70-year lifetimes of these children -- a huge distinction, with Sunstein's revision making NRDC appear unfamiliar with basic assumptions about cancer latency periods).

** The EPA's current risk assessment is "lower than that of the NRDC by a factor of 600" (actually, the 1/250,000 figure Sunstein cites as EPA's differs from NRDC's 1/4,200 figure by only a factor of 60 (250,000 ÷ 4,200; see the arithmetic check after this list). Besides, EPA never calculated the risk at one in 250,000. After Alar's manufacturer (Uniroyal) finished a state-of-the-art study of the carcinogenicity of UDMH in laboratory animals, EPA (Federal Register, Vol. 57, October 8, 1992, pp. 46,436-46,445) recalculated the lifetime excess risk to humans at 2.6 × 10^-5, or 1 in 38,000. And, acting on recommendations from the U.S. National Academy of Sciences, EPA has subsequently stated that it will consider an additional tenfold safety factor to account for the increased susceptibility of children under age 2, and a threefold factor for children aged 2 to 16 -- which, had they been applied to UDMH, would have made the EPA estimate almost equal to the NRDC estimate that made people "more frightened than they should have been").

** "A United Nations panel... found that Alar does not cause cancer in mice, and it is not dangerous to people" (true enough, except that Sunstein does not mention that this panel invoked a threshold model of carcinogenesis that no U.S. agency would have relied on without more and different scientific evidence: the U.N. panel simply ignored the large number of tumors at the two highest doses in Uniroyal's UDMH study and declared the third-highest dose to be "safe" because that dose produced tumors, but at a rate not significantly higher than the background rate).

** A 60 Minutes broadcast "instigated a public outcry... without the most simple checks on its reliability or documentation" (readers might be interested to know, however, that both a federal district court and a federal appeals court summarily dismissed the lawsuit over this broadcast, finding that the plaintiffs "failed to raise a genuine issue of material fact regarding the falsity of statements made during the broadcast").

** The demand for apples "plummeted" during 1989 (true enough, but Sunstein neglects to mention that within five years after the withdrawal of Alar the apple industry's revenues doubled versus the level before the controversy started).
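
Here is the arithmetic check promised in the factor-of-60 bullet above -- a minimal Python sketch. Every number in it is quoted in the bullets themselves; the script is only an illustration:

  # Check of the risk figures quoted above; all numbers come from the review.
  nrdc = 1 / 4200            # NRDC's lifetime excess risk estimate
  sunstein_epa = 1 / 250000  # the figure Sunstein attributes to EPA
  print(round(nrdc / sunstein_epa, 1))  # 59.5 -- a factor of about 60, not 600

  epa_1992 = 2.6e-5          # EPA's 1992 recalculated lifetime excess risk
  print(round(1 / epa_1992))            # 38462 -- i.e., roughly 1 in 38,000

  # The tenfold children's safety factor mentioned above would bring EPA's
  # figure close to NRDC's 1 in 4,200:
  print(round(1 / (epa_1992 * 10)))     # 3846 -- i.e., about 1 in 3,800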

Sunstein's entire source material for these scientific and other conclusions? Four footnotes from a book by political scientist Aaron Wildavsky and one quotation from an editorial in Science magazine (although the incorrect division of 250,000 by 4,200 and the mangling of the NRDC risk assessment appear to be Sunstein's own contributions). One reason the general public annoys Sunstein by disagreeing with the "experts," therefore, is that he has a very narrow view of where one might look for a gold standard against which to judge the merits of competing conclusions. Perhaps Sunstein himself has come to certain beliefs about Alar and other risks "simply because of what [he thinks] other people believe," and comforts himself that the people he agrees with must be "experts."

Similarly, Sunstein makes some insightful points about the public's tendency to assume that the risks are higher for items whose benefits they perceive as small, but he fails to notice the mountains of evidence that his preferred brand of experts tends to impute high economic costs to regulations that they perceive as having low risk-reduction benefits. He accepts as "unquestionably correct" the conclusion of Tengs and colleagues (1995) that government badly misallocates risk-reduction resources, for example, without acknowledging Heinzerling's (2002) finding that in 79 of the 90 environmental interventions Tengs and colleagues accused of most severely wasting the public's money, the agency involved never required that a dime be spent to control those hazards, probably because analysis showed such expenditures to be of questionable cost-effectiveness.

Finally, Sunstein fails to acknowledge the degree to which experts can agree with the public on broad issues, and can also disagree among themselves on the details. He cites approvingly studies by Slovic and colleagues suggesting that laypeople perform "intuitive toxicology" to shore up their beliefs, but fails to mention that in the most recent of their studies (1995), toxicologists and the general public both placed 9 of the same 10 risks at the top of 38 risks surveyed, and agreed on 6 of the 10 risks among the lowest 10 ranked. Yet when toxicologists alone were given information on the carcinogenic effects of "Chemical B" (data on bromoethane, with its identity concealed) in male and female mice and rats, only 6% of them matched the conclusions of the experts at the U.S. National Toxicology Program that there was "clear evidence" of carcinogenicity in one test group (female mice), "equivocal evidence" in two others, and "some evidence" in the fourth. "What are ordinary people thinking?" (p. 36) when they disagree with the plurality of toxicologists, Sunstein asks, without wondering what these toxicologists must have been thinking to disagree so much with each other. One simple answer is that perhaps both toxicologists and the general public, more so than others whose training leads them elsewhere, appreciate the uncertainties in the raw numbers and the room for honest divergence of opinion even when the uncertainties are small. These uncertainties become even more influential when multiple risks must be combined and compared -- as in most life-cycle assessments and most efforts to identify and promote more eco-efficient pathways -- so Sunstein's reliance on a style of expertise that regards uncertainty as an annoyance we can downplay or "average away" is particularly ill-suited to broader policy applications.

I actually do understand Sunstein's frustration with the center of gravity of public opinion in some of these areas. Having worked on health hazards in the general environment and in the nation's workplaces, I devoutly wish that more laypeople (and more experts) could muster more concern about parts per thousand in the latter arena than parts per billion of the same substances in the former. But I worry that condescension is at best a poor strategy to begin a dialogue about risk management, and hope that expertise would aspire to more than proclaiming the "right" perspective and badgering people into accepting it. Instead, emphasizing the variations in expertise and orientation among experts could actually advance Sunstein's stated goal of promoting a "cost-benefit state," as it would force those who denounce all risk and cost-benefit analysis to focus their sweeping indictments where they belong. But until those of us who believe in a humble, humane brand of risk assessment can convince the public that substandard analyses indict the assessor(s) involved, not the entire field, I suppose we deserve to have our methods hijacked by experts outside our field or supplanted by an intuitive brand of "precaution."

Adam M. Finkel, School of Public Health, University of Medicine and Dentistry of New Jersey, Piscataway, New Jersey, USA

Note

1. This is admittedly not a disinterested choice, as I was an expert witness for CBS News in its successful attempts to have the courts summarily dismiss the product disparagement suit brought against it for its 1989 broadcast about Alar. But Sunstein's summaries of other hazards (e.g., toxic waste dumps, arsenic, airborne particulate matter) could illustrate the same general point.

References

Heinzerling, L. 2002. Five hundred life-saving interventions and their misuse in the debate over regulatory reform. Risk: Health, Safety and Environment 13(Spring): 151-175.

Slovic, P., T. Malmfors, D. Krewski, C. K. Mertz, N. Neil, and S. Bartlett. 1995. Intuitive toxicology, Part II: Expert and lay judgments of chemical risks in Canada. Risk Analysis 15(6): 661-675.

Tengs, T. O., M. E. Adams, J. S. Pliskin, D. G. Safran, J. E. Siegel, M. C. Weinstein, and J. D. Graham. 1995. Five hundred life saving interventions and their cost-effectiveness. Risk Analysis 15(3): 369-390.

From: Rachel's Precaution Reporter #103, Aug. 15, 2007

A LETTER TO MY FRIEND WHO IS A RISK ASSESSOR

Dear Adam,

It was really good to hear from you. I'm delighted that you're now working in New Jersey, with joint appointments at Princeton University and at the School of Public Health at the University of Medicine and Dentistry of New Jersey (UMDNJ). Everyone in New Jersey is taking a bath in low levels of toxic chemicals, so we need all the help we can get, and you're exactly the kind of person we most need help from -- honest, knowledgeable, committed, and principled. Your knowledge of toxicology and risk assessment -- and the courage you demonstrated as a government whistle-blower within the Occupational Safety and Health Administration -- are sorely needed in the Garden State, as they are everywhere in America.

I read with real interest your review of Cass Sunstein's book, Risk and Reason. I thought you did an excellent job of showing that Sunstein "has managed to sketch out a brand of QRA [quantitative risk assessment] that may actually be less scientific, and more divisive, than no analysis at all." To me, it seems that Sunstein is more interested in picking a fight with his fellow liberals than in helping people make better decisions.

What I want to discuss in this note to you is your attack on the precautionary principle. In the second paragraph of your review, you wrote, "I believe that at its best, QRA [quantitative risk assessment] can serve us better than a 'precautionary principle' that eschews analysis in favor of crusades against particular hazards that we somehow know are needlessly harmful and can be eliminated at little or no economic or human cost. After all, this orientation has brought us increased asbestos exposure for schoolchildren and remediation workers in the name of prevention, and also justified an ongoing war with as pure a statement of the precautionary principle as we are likely to find ("we have every reason to assume the worst, and we have an urgent duty to prevent the worst from occurring," said President Bush in October 2002 about weapons of mass destruction in Iraq)."

The two examples you give -- asbestos removal and Mr. Bush's pre-emptive war in Iraq -- really aren't good examples of the precautionary principle. Let's look at the details. All versions of the precautionary principle have three basic parts:

1) If you have reasonable suspicion of harm

2) and you have scientific uncertainty

3) then you have a duty to take action to avert harm (though the kind of action to take is not spelled out in the precautionary principle).

Those are the three basic elements of the precautionary principle, found in every definition of the principle.

And here is a slightly more verbose version of the same thing (a toy code sketch of the full cycle follows the list):

1. Pay attention and heed early warnings. In a highly technological society, we need to keep paying attention because so many things can go wrong, with serious consequences.

2. When we develop reasonable suspicion that harm is occurring or is about to occur, we have a duty to take action to avert harm. The action to take is not spelled out by the precautionary principle but proponents of the principle have come up with some suggestions for action:

3. We can set goals -- and in doing this, we can make serious efforts to engage the people who will be affected by the decisions.

4. We can examine all reasonable alternatives for achieving the goal(s), again REALLY engaging the people who will be affected.

5. We can give the benefit of the doubt to nature and to public health.

6. After the decision, we can monitor, to see what happens, and again heed early warnings.

7. And we can favor decisions that are reversible, in case our monitoring reveals that things are going badly.
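
To make the cycle concrete, here is the toy code sketch promised above, in Python. All names and numbers are invented for illustration; the principle itself prescribes no particular procedure:

  # Toy model of the precautionary decision cycle in points 1-7 above.
  def choose_action(alternatives):
      # Points 4, 5 and 7: among the reasonable alternatives, give the
      # benefit of the doubt to health by preferring reversible options
      # first, then the lowest suspected harm.
      return min(alternatives,
                 key=lambda a: (not a["reversible"], a["suspected_harm"]))

  def precautionary_cycle(suspect_harm, uncertain, alternatives, warnings):
      if not (suspect_harm and uncertain):
          return None                        # trigger (points 1-2) not met
      action = choose_action(alternatives)   # points 3-5, with public input
      for warning in warnings:               # point 6: monitor what happens
          if warning == "serious":
              # point 7: a reversible choice lets us undo the decision
              return "reverse course" if action["reversible"] else "mitigate"
      return "continue, and keep monitoring"

  alternatives = [
      {"name": "ban",      "reversible": False, "suspected_harm": 1},
      {"name": "restrict", "reversible": True,  "suspected_harm": 2},
  ]
  print(choose_action(alternatives)["name"])  # "restrict" -- reversible wins
  print(precautionary_cycle(True, True, alternatives, ["minor", "serious"]))
  # -> "reverse course", because the chosen restriction can be undone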

Now, let's look at the examples you gave to see if they really represent failures of the precautionary principle.

The asbestos removal craze (1985-2000) was essentially started by one man, James J. Florio, former Congressman from New Jersey (and later Governor of New Jersey).

As we learn from this 1985 article from the Philadelphia Inquirer, gubernatorial candidate Florio was fond of bashing his opponent for ignoring the "asbestos crisis" and the "garbage crisis." In Mr. Florio's campaign for governor of New Jersey, the "asbestos crisis" was a political ploy -- and it worked. He got elected by promising to fix the asbestos crisis and the garbage crisis.

As governor, Mr. Florio's approach to the "garbage crisis" was to site a new garbage incinerator in each of New Jersey's 21 counties. At $300 to $600 million per machine, this would have set loose an unprecedented quantity of public monies sloshing around in the political system. Later it turned out that Mr. Florio's chief of staff had close ties to the incinerator industry.

I don't know whether Mr. Florio or his political cronies profited directly from the asbestos removal industry that he created almost single-handedly, but the decision-making for asbestos was similar to his approach to the "garbage crisis." He did not engage the affected public in setting goals and then examine all reasonable alternatives for achieving those goals. He did not take a precautionary approach.

In the case of garbage, Mr. Florio and his cronies decided behind closed doors that New Jersey needed to build 21 incinerators. He and his cronies then justified those incinerators using quantitative risk assessments.

Mr. Florio's approach to asbestos was the same: without asking the affected public, he decided that removal of asbestos from 100,000 of the nation's schools was the correct goal, and thus creating a new "asbestos removal" industry was the only reasonable alternative. You can read about Mr. Florio's "Asbestos Hazard Emergency Response Act of 1986" here.

So, the goal ("asbestos removal") was not set through consultation with affected parties. Perhaps if the goal had been to "Eliminate exposures of students and staff to asbestos in schools," a different set of alternatives would have seemed reasonable. Asbestos removal might not have been judged the least-harmful approach.

Even after the questionable decision was made to remove all asbestos from 100,000 school buildings nationwide, systematic monitoring was not done, or not done properly. Furthermore, the decision to remove asbestos was not easily reversible once removal had been undertaken and new hazards had been created.

So Mr. Florio's law created a new asbestos removal industry overnight: companies without asbestos removal experience rushed to make a killing on public contracts, worker training was in some cases poor, removals were carried out sloppily, monitoring was casual or non-existent, and so new hazards were created.

This was not an example of a precautionary approach. It was missing almost all the elements of a precautionary approach -- from goal setting to alternatives assessment, to monitoring, heeding early warnings, and making decisions that are reversible.

Now let's examine President Bush's pre-emptive strike against Iraq.

True enough, Mr. Bush claimed that he had strong evidence of an imminent attack on the U.S. -- perhaps even a nuclear attack. He claimed to know that Saddam Hussein was within a year of having a nuclear weapon. But we now know that this was all bogus. This was not an example of heeding an early warning, because no threat existed -- it was an example of manufacturing an excuse to carry out plans that had been made even before 9/11.

Here is the lead paragraph from a front-page story in the Washington Post April 28, 2007:

"White House and Pentagon officials, and particularly Vice President Cheney, were determined to attack Iraq from the first days of the Bush administration, long before the Sept. 11, 2001, attacks, and repeatedly stretched available intelligence to build support for the war, according to a new book by former CIA director George J. Tenet."

Fully a year before the war began, Time magazine reported on March 24, 2003, President Bush stuck his head in the door of a White House meeting between National Security Advisor Condoleezza Rice and three U.S. Senators who were discussing how to deal with Saddam Hussein through the United Nations, or perhaps in a coalition with the U.S.'s Middle East partners. "Bush wasn't interested," reported Time. "He waved his hand dismissively, recalls a participant, and neatly summed up his Iraq policy: 'F--- Saddam. We're taking him out.'" This President was not weighing alternatives in search of the least harmful.

The CIA, the State Department, members of Congress, and the National Security staff wanted to examine alternatives to war, but Mr. Bush was not interested. With Mr. Cheney, he had set the goal of war, without wide consultation. He then manufactured evidence to support his decision to "take out" Saddam without much thought for the consequences. Later he revealed that God had told him to strike Saddam. Mr. Bush believed he was a man on a mission from God. Assessing alternatives was not part of God's plan, as Mr. Bush saw it.

This was not an example of the precautionary approach. Goals were not set through a democratic process. Early warnings were not heeded -- instead, fraudulent scenarios were manufactured to justify a policy previously set (if you can call "God told me to f--- Saddam" a policy). As George Tenet's book makes clear again and again, alternatives were not thoroughly examined with the aim of selecting the least harmful. For a long time, no monitoring was done (for example, no one has been systematically counting Iraqi civilian casualties), and the decision was not reversible, as is now so painfully clear.

This was definitely not an example of the precautionary approach.

So I believe your gratuitous attack on the precautionary principle is misplaced because you bring forward examples that don't have anything to do with precautionary decision-making.

Now I want to acknowledge that the precautionary principle can lead to mistakes being made -- it is not a silver bullet that can minimize all harms. However, if we take to heart its advice to monitor and heed early warnings, combined with favoring decisions that are reversible, the principle gains a self-correcting aspect that can definitely reduce harms as time passes.

I am even more worried by the next paragraph of your review of Sunstein's book. Here you seem to disparage the central goal of public health, which is primary prevention. You write,

"More attention to benefits and costs might occasionally dampen the consistent enthusiasm of the environmental movement for prevention, and might even moderate the on-again, off-again role precaution plays in current U.S. economic and foreign policies." I'm not sure what you mean by the 'on-again, off-again role precaution plays in current U.S. economic and foreign policies." I presume you're referring here to asbestos removal and pre-emptive war in Iraq, but I believe I've shown that neither of these was an example of precautionary decision-making.

Surely you don't REALLY want environmentalists or public health practitioners to "dampen their consistent enthusiasm for prevention." Primary prevention should always be our goal, shouldn't it? But we must ask: What are we trying to prevent? And how best to achieve the goal(s)? These are always the central questions, and here I would agree with you: examining the pros and cons of every reasonable approach is the best way to go. (I don't want to use your phrase "costs and benefits" because in common parlance this phrase implies quantitative assessment of costs and benefits, usually in dollar terms. I am interested in a richer and fuller discussion of pros and cons, which can include elements and considerations that are not strictly quantifiable but are nevertheless important human considerations, like local culture, history, fairness, justice, community resilience and beauty.)

So this brings me to my fundamental criticisms of quantitative risk assessment, which I prefer to call "numerical risk assessment."

1. We are all exposed to multiple stressors all of the time and numerical risk assessment has no consistent way to evaluate this complexity. In actual practice, risk assessors assign a value of zero to most of the real-world stresses, and thus create a mathematical model of an artificial world that does not exist. They then use that make-believe model to drive decisions about the real world that does exist.

2. The timing of exposures can be critical. Indeed, a group of 200 physicians and scientists recently said they believe the main adage of toxicology -- "the dose makes the poison" -- needs to be changed to, "The timing makes the poison." Numerical risk assessment is, today, poorly prepared to deal with the timing of exposures.

3. By definition, numerical risk assessment only takes into consideration things that can be assigned a number. This means that many perspectives that people care about must be omitted from decisions based on numerical risk assessments -- things like historical knowledge, local preferences, ethical perspectives of right and wrong, and justice or injustice.

4. Numerical risk assessment is difficult for many (probably most) people to understand. Such obscure decision-making techniques run counter to the principles of an open society.

5. Politics can and does enter into numerical risk assessments. William Ruckelshaus, the first administrator of the U.S. Environmental Protection Agency, said in 1984, "We should remember that risk assessment data can be like the captured spy: If you torture it long enough, it will tell you anything you want to know."

6. The results of numerical risk assessment are not reproducible from laboratory to laboratory, so this decision-technique does not meet the basic criterion for being considered "science" or "scientific."

As the National Academy of Sciences said in 1991, "Risk assessment techniques are highly speculative, and almost all rely on multiple assumptions of fact -- some of which are entirely untestable." (Quoted in Anthony B. Miller and others, Environmental Epidemiology, Volume 1: Public Health and Hazardous Wastes [Washington, DC: National Academy of Sciences, 1991], pg. 45.)

7. By focusing attention on the "most exposed individual," numerical risk assessments have given a green light to hundreds of thousands or millions of "safe" or "acceptable" or "insignificant" discharges that have had the cumulative effect of contaminating the entire planet with industrial poisons. See Travis and Hester, 1991 and Rachel's News #831.
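
The cumulative-effect arithmetic behind points 1 and 7 is worth making explicit: if each of many independent sources carries a small excess risk r, the combined excess risk is 1 - (1 - r)^n. A minimal sketch, with invented numbers:

  # Illustration of points 1 and 7: many individually "acceptable" risks
  # can accumulate. The numbers below are invented for illustration only.
  per_source_risk = 1e-6  # a typical "insignificant" excess risk per source
  n_sources = 1000        # many separately-permitted discharges

  combined = 1 - (1 - per_source_risk) ** n_sources
  print(combined)  # ~0.001 -- about a thousand times any single source alone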

All this is not to say that numerical risk assessment has no place in decision-making. Using a precautionary approach, as we set goals and, later, as we examine the full range of alternatives, numerical risk assessments might be one technique for generating information to be used by interested parties in their deliberations. Other techniques for generating useful information might be citizen juries, a Delphi approach, or consensus conferences. (I've discussed these techniques briefly elsewhere.)

It isn't so much numerical risk assessment itself that creates problems -- it's reliance almost solely on numerical risk assessment as the basis for decisions that has gotten us into the mess we're in.

Used within a precautionary framework for decision-making, numerical risk assessment of the available alternatives in many cases may be able to give us useful new information that can contribute to better decisions. And that of course is the goal: better decisions producing fewer harms.

From: Rachel's Precaution Reporter #103, Aug. 15, 2007

RISK ASSESSMENT AND PRECAUTION: COMMON STRENGTHS AND FLAWS

By Adam Finkel

Dear Peter,

Whether we agree more than we disagree comes down to whether means or ends are more important. To the extent you share my view (and given your influence on me early in my career, I should probably say, "I share your view...") that we have a long way to go to provide a safe, healthy, and sustainable environment for the general and (especially) the occupational populations, our remaining differences are only those of strategy. Outcomes "producing fewer harms" for nature and public health are, I agree, the goal, and I assume you agree that we could also use fewer decisions for which more harm is the likely -- perhaps even the desired -- outcome of those in power.

But being on the same side with respect to our goals makes our differences about methods all the more important, because knowing where to go but not how to get there may ultimately be little better than consciously choosing to go in the wrong direction.

Your long-standing concern about quantitative risk assessment haunts me, if there's such a thing as being "haunted in a constructive way." I tell my students at the University of Medicine and Dentistry of New Jersey and at Princeton during the first class of every semester that I literally haven't gone a week in the past 10 years without wondering, thanks to you, if I am in fact "helping the state answer immoral questions" about acceptable risk and in so doing, am "essentially keeping the death camp trains running on time" (quote from Rachel's #519, November 7, 1996). I don't consider this analogy to be name-calling, because I have such respect for its source, so I hope you won't take offense if I point out that everyone who professes to care about maximizing life expectancy, human health, and the natural functioning of the planet's ecosystems ought to ask the same question of themselves. I do worry about quantitative risk assessment and its mediocre practitioners, as I will try to explain below, but I also wish that advocates of the precautionary principle would occasionally ask themselves whether more or fewer people will climb unwittingly aboard those death-camp trains if they run on a schedule dictated by "precaution."

And if the outcomes we value flourish more after an action based on quantitative risk assessment than they do after an action motivated by precaution, then a preference for the latter implies that noble means matter more than tangible ends -- which I appreciate in theory, but wonder what would be so noble about a strategy that does less good or causes more harm.

I distrust some versions of the precautionary principle for one basic reason.[1] If I re-express your first three-part definition in one sentence (on the grounds that in my experience, the fact of scientific uncertainty goes without saying), I get "if you reasonably suspect harm, you have a duty to act to avert harm, though the kind of action is up to you." Because I believe that either inaction or action can be unacceptably harmful, depending on circumstances, I worry that a principle that says "act upon suspicion of harm" can be used to justify anything. This was my point about the Iraq war, which I agree is despicable, but not only because the suspicion of harm was concocted (at least, inflated) but because the consequences of the remedy were so obviously glossed over.

Whatever principle guides decision makers, we need to ask how harmful the threat really is, and also what will or may happen if we act against it in a particular way. Otherwise, the principle degenerates into "eliminate what we oppose, and damn the consequences." I'm not suggesting that in practice, the precautionary principle does no better than this, just as I trust you wouldn't suggest that quantitative risk assessment is doomed to be no better than "human sacrifice, Version 2.0." Because I agree strongly with you (your "verbose version," point 5) that when health and dollars clash, we should err on the side of protecting the former rather than the latter, I reject some risk-versus-risk arguments, especially the ones from OMB [Office of Management and Budget] and elsewhere that regulation can kill more people than it saves by impoverishing them (see, for example, my 1995 article "A Second Opinion on an Environmental Misdiagnosis: The Risky Prescriptions of Breaking the Vicious Circle [by Judge Stephen Breyer]," NYU Environmental Law Journal, vol. 3, pp. 295-381, especially pp. 322-327).

But the asbestos and Iraq examples show the direct trade-offs that can ruin outcomes when decisions are made in a rush to prevent. Newton's laws don't quite apply to social decision-making: for every action, there may be an unequal and not-quite-opposite reaction. "Benign" options along one dimension may not be so benign when viewed holistically. When I was in charge of health regulation at OSHA, I tried to regulate perchloroethylene (the most common solvent used in dry-cleaning laundry). I had to be concerned about driving dry-cleaners into more toxic substitutes (as almost happened in another setting when we regulated methylene chloride, only to learn of an attempt -- which we ultimately helped avert -- by some chemical manufacturers to encourage customers to switch to an untested, but probably much more dangerous, brominated solvent). But encouraging or mandating "good old-fashioned wet cleaning" was not the answer either (even if it turns out to be as efficient as dry cleaning), once you consider that wet clothes are non-toxic but quite heavy -- and the ergonomic hazards of thousands of workers moving industrial-size loads from washers to dryers are the kind of "risk of action" that only very sophisticated analyses of precaution would even identify.

This is why I advocated "dampening the enthusiasm for prevention" -- meaning prevention of exposures, not prevention of disease, which I agree is the central goal of public health. That was a poor choice of words on my part, as I agree that when the link between disease and exposure is clear, preventing exposure is far preferable to treating the disease; the problem comes when exposures are eliminated but their causal connection to disease is unfounded.

To the extent that the precautionary principle -- or quantitative risk assessment, for that matter -- goes after threats that are not in fact as dire as worst-case fears suggest, or does so in a way that increases other risks disproportionately, or is blind to larger threats that can and should be addressed first, it is disappointing and dangerous. You can say that asbestos removal was not "good precaution" because private interests profited from it, and because the remediation was often done poorly, not because it was a bad idea in the first place. Similarly, you can say that ousting Saddam Hussein was not "good precaution" because the threat was overblown and it (he) could have been "reduced" (by the military equivalent of a pump-and-treat system?) rather than "banned" (with extreme prejudice). Despite the fact that in this case the invasion was justified by an explicit reference to the precautionary principle ("we have every reason to assume the worst and we have an urgent duty to prevent the worst from occurring"), I suppose you can argue further that not all actions that invoke the precautionary principle are in fact precautionary -- just as not all actions that claim to be risk-based are in fact so. But who can say whether President Bush believed, however misguidedly, that there were some signals of early warning emerging from Iraq? Your version of the precautionary principle doesn't say that "reasonable suspicion" goes away if you also happen to have a grudge against the source of the harm.

Again, in both asbestos removal and Iraq I agree that thoughtful advocates of precaution could have done much better. But how are these examples really any different from the reasonable suspicion that electromagnetic fields or irradiated food can cause cancer? Those hazards, as well as the ones Hussein may have posed, are/were largely gauged by anecdotal rather than empirical information, and as such are/were all subject to false positive bias. We could, as you suggest, seek controls that contain the hazard (reversibly) rather than eliminating it (irrevocably), while monitoring and re-evaluating, but that sounds like "minimizing" harm rather than "averting" it, and isn't that exactly the impulse you scorn as on the slippery slope to genocide when it comes from a risk assessor? And how, by the way, are we supposed to fine-tune a decision by figuring out whether our actions are making "things go badly," other than by asking "immoral questions" about whose exposures have decreased or increased, and by how much?

We could also, as you suggest, "really engage the people who will be affected," and reject alternatives that the democratic process ranks low. I agree that more participation is desirable as an end in itself, but believe we shouldn't be too sanguine about the results. I've been told, for example, that there exist people in the U.S. -- perhaps a majority, perhaps a vocal affected minority -- who believe that giving homosexual couples the civil rights conferred by marriage poses an "unacceptable risk" to the fabric of society. They apparently believe we should "avert" that harm. If I disagree, and seek to stymie their agenda, does that make me "anti-precautionary" (or immoral, if I use risk assessment to try to convince them that they have mis-estimated the likelihood or amount of harm)?

So I'm not sure that asbestos and Iraq are atypical examples of what happens when you follow the precautionary impulse to a logical degree, and I wonder if those debacles might even have been worse had those responsible followed your procedural advice for making them more true to the principle. But let's agree that they are examples of "bad precaution." The biggest challenge I have for you is a simple one: explain to me why "bad precaution" doesn't invalidate the precautionary principle, but why for 25 years you've been trashing risk assessment based on bad risk assessments! Of course there is a crevasse separating what either quantitative risk assessment or precaution could be from what they are, and it's unfair to reject either one based on their respective poor track records. You've sketched out a very attractive vision of what the precautionary principle could be; now let me answer some of your seven concerns about what quantitative risk assessment is.

(1) (quantitative risk assessment doesn't work for unproven hazards) I hate to be cryptic, but "please stay tuned." A group of risk assessors is about to make detailed recommendations to address the problem of treating incomplete data on risk as tantamount to zero risk. In the meantime, any "precautionary" action that exacerbates any of these "real-world stresses" will also be presumed incorrectly to do no harm...

(2) (quantitative risk assessment is ill-equipped to deal with vulnerable periods in the human life cycle) It's clearly the dose, the timing, and the susceptibility of the individual that act and interact to create risk. Quantitative risk assessment depends on simplifying assumptions that overestimate risk when the timing and susceptibility are favorable, and underestimate it in the converse circumstances. The track record of risk assessment has been one of slow but consistent improvement toward acknowledging the particularly vulnerable life stages and individuals (of whatever age) who are most susceptible, so that to the extent the new assumptions are wrong, they tend to over-predict. This is exactly what a system that interprets the science in a precautionary way ought to do -- and the alternative would be to say "we don't know enough about the timing of exposures, so all exposures we suspect could be a problem ought to be eliminated." This ends up either being feel-good rhetoric or leading to sweeping actions that may, by chance, do more good than harm.

(3) (quantitative risk assessment leaves out hard-to-quantify benefits) Here, as in the earlier paragraph about "pros and cons," you have confused what must be omitted with "what we let them omit sometimes." I acknowledge that most practitioners of cost-benefit analysis choose not to quantify cultural values, or aggregate individual costs and benefits so that equitable distributions of either are given special weight. But when some of us risk assessors say "the benefits outweigh the costs" we consciously and prominently include justice, individual preferences, and "non-use values" such as the existence of natural systems on the benefits side of the ledger, and we consider salutary economic effects of controls as offsetting their net costs. Again, "good precaution" may beat "bad cost-benefit analysis" every time, but we'd see a lot more "good cost-benefit analysis" if its opponents would help it along rather than pretending it can't incorporate things that matter.

(4) (quantitative risk assessment is hard to understand) The same could be said about almost any important societal activity where the precise facts matter. I don't fully understand how the Fed sets interest rates, but I expect them to do so based on quantitative evaluation of their effect on consumption and savings, and to be able to answer intelligent questions about uncertainties in their analyses. "Examining the pros and cons of every reasonable approach," which we both endorse, also requires complicated interpretation of data on exposures, health effects, control efficiencies, costs, etc., even if the ruling principle is to "avert harm." So if precaution beats quantitative risk assessment along this dimension, I worry that it does so by replacing unambiguous descriptions ("100 deaths are fewer than 1000") with subjective ones ("Option A is 'softer' than Option B").

(5) (Decision-makers can orchestrate answers they most want to hear) "Politics" also enters into defining "early warnings," setting goals, and evaluating alternatives -- this is otherwise known as the democratic process. Removing the numbers from an analysis of a problem or of alternative solutions simply shifts the "torturing" of the number into a place where it can't be recognized as such.

(6) (quantitative risk assessment is based on unscientific assumptions) It sounds here as if you're channeling the knee-jerk deregulators at the American Council for Science and Health, who regularly bash risk assessment to try to exonerate threats they deem "unproven." Quantitative risk assessment does rely on assumptions, most of which are grounded in substantial theory and evidence; the alternative would be to contradict your point #1 and wait for proof that will never come. The 1991 European Commission study you reference involved estimating the probability of an industrial accident, which is indeed a relatively uncertain area within risk assessment, but one that precautionary decision-making has to confront as well. The 1991 NAS study was a research agenda for environmental epidemiology, and as such favored analyses based on human data, which suffer from a different set of precarious assumptions and are notoriously prone to not finding effects that are in fact present.

(7) (quantitative risk assessment over-emphasizes those most exposed to each source of pollution) This is a fascinating indictment of quantitative risk assessment that I think is based on a non sequitur. Yes, multiple and overlapping sources of pollution can lead to unacceptably high risks (and to global burdens of contamination), which is precisely why EPA has begun to adopt recommendations from academia to conduct "cumulative risk analyses" rather than regulating source by source. The impulse to protect the "maximally exposed individual" (MEI) is not to blame for this problem, however; if anything, the more stringently we protect the MEI, the less likely it is that anyone's cumulative risk will be unacceptably high, and the more equitable the distribution of risk will be. Once more, this is a problem that precautionary risk assessment can allow us to recognize and solve; precaution alone can at its most ambitious illuminate one hazard at a time, but it has no special talent for making risks (as opposed to particular exposures) go away.

I note that in many ways, your list may actually be too kind given what mainstream risk assessment has achieved to date. These seven possible deficiencies pale by comparison to the systemic problems with many quantitative risk assessments, which I have written about at length (see, for example, the 1997 article "Disconnect Brain and Repeat After Me: 'Risk Assessment is Too Conservative.'" In Preventive Strategies for Living in a Chemical World, E. Bingham and D. P. Rall, eds., Annals of the New York Academy of Sciences, 837, 397-417). Risk assessment has brought us fallacious comparisons, meaningless "best estimates" that average real risks away, and arrogant pronouncements about what "rational" people should and should not fear. But these abuses indict the practitioners -- a suspicious proportion of whom profess to be trained in risk assessment but never were -- not the method itself, just as the half-baked actions taken in precaution's name should not be generalized to indict that method.

So in the end, you seem to make room for a version of the precautionary principle in which risk assessment provides crucial raw material for quantifying the pros and cons of different alternative actions. Meanwhile, I have always advocated for a version of quantitative risk assessment that emphasizes precautionary responses to uncertainty (and to human interindividual variability), so that we can take actions where the health and environmental benefits may not even exceed the expected costs of control (in other words, "give the benefit of the doubt to nature and to public health"). The reason the precautionary principle and quantitative risk assessment seem to be at odds is that despite the death-camp remark, you are more tolerant of risk assessment than the center of gravity of precaution, while I am more tolerant of precaution than the center of gravity of my field. If the majorities don't move any closer together than they are now, and continue to be hijacked by the bad apples in each camp, I guess you and I will have to agree to disagree about whether mediocre precaution or mediocre quantitative risk assessment is preferable. But I'll continue to try to convince my colleagues that since risk assessment under uncertainty must either choose to be precautionary with health, or else choose to pretend that errors that waste lives and errors that waste dollars are morally equivalent, we should embrace the first bias rather than the second. I hope you will try to convince your colleagues (and readers) that precaution without analysis is like the "revelation" to invade Iraq -- it offers no justification but sorely needs one.

[1] As you admit, there are countless variations on the basic theme of precaution. I was careful to say in my review of Sunstein's book that I prefer quantitative risk assessment to "a precautionary principle that eschews analysis," and did not mean to suggest that most or all current versions of it fit the description.

From: Oakland Tribune (Oakland, Calif.), Aug. 1, 2007

STUDY: EARLY LIFE EXPOSURES IMPACT BREAST CANCER RISK DECADES LATER

By Douglas Fischer, Staff writer

Susan Lydon, a Bay Area author and journalist, never forgot the DDT fog trucks that rumbled through the Long Island, New York, neighborhood where she grew up.

She was her block's fastest kid. The mist was cool. The trucks slow. Her speed allowed her to stay longer than any of her pals in that comforting, pesticide-laced mist the sprayers left in their wake.

Lydon died of breast cancer at age 61 in 2005, going to her deathbed certain those carefree runs decades ago sealed her fate.

Her concern, it appears now, was justified.

A breakthrough study of Oakland women suggests that exposure early in life to DDT significantly increases a woman's chances of developing breast cancer decades later, according to findings published last week in the online edition of the journal Environmental Health Perspectives. [Read Dr. Pete Myers' analysis of the new study here.]

The findings bolster the controversial notion that exposure to low doses of hormonally active compounds at critical developmental stages -- in this case, as the breast is developing -- loads the gun, so to speak, priming the body to develop cancer years later.

The study also makes clear that the final chapter of DDT's legacy is not yet written. The young girls most heavily exposed to the pesticide -- women born in the late 1950s and early 1960s, when use of the pesticide peaked in the United States -- have not yet reached age 50, let alone the age of greatest breast cancer risk, typically sometime after menopause and around age 60.

The findings further suggest society is destined to relearn the lesson of DDT many times over. Myriad synthetic chemicals in our environment today interact with our bodies, with unknown consequences. Government regulators have little power to take precautionary action against compounds that appear problematic.

Reports like this, said Barbara Brenner, executive director of San Francisco-based Breast Cancer Action, show the fallacy of that approach.

"We have to start paying very close attention to what we put in our environment," she said. "This is an example of doing something to our environment where we did not understand the long-term consequences. I don't know how many times this story has to be told."

The study probed a unique database of some 15,000 Kaiser Permanente Health Plan members who participated in a longitudinal study tracking their health over decades.

Researchers with the Berkeley-based Public Health Institute selected 129 women within that study who developed breast cancer before age 50, then analyzed their archived blood samples taken between 1959 and 1967, while they were much younger.

Every sample from a woman who developed cancer was matched with a control sample from a woman of the same age who did not.

Researchers found women who developed cancer later in life had far higher concentrations of DDT in their blood as youths.
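
The matched comparison is simple to summarize in code; the sketch below uses invented numbers (not the study's data) only to show how each case is judged against her own matched control:

  # Sketch of the matched case-control comparison described above.
  # DDT levels here are invented; the real study used archived serum samples.
  pairs = [  # (case's DDT level, matched same-age control's DDT level)
      (12.1, 4.3), (8.7, 9.0), (15.2, 6.1), (11.4, 5.5),
  ]
  case_higher = sum(case > control for case, control in pairs)
  print(f"{case_higher} of {len(pairs)} pairs: case had the higher DDT level")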

More significantly, women who were 14 years old or older in 1945, when DDT first hit the market, saw no increased breast cancer rates, suggesting exposure while the breast is developing is critical.

The study has its limits. Researchers lack data on other known risk factors that may have predisposed the women toward cancer. They don't know when the women were exposed to DDT. And the study size is small.

For those who, like Lydon, have memories of chasing DDT sprayers as children, researchers involved in the study cautioned against drawing any firm conclusions.

"I don't think it's just early life exposures," said Mary Wolff, a professor of oncology at Mount Sinai School of Medicine and a report co-author. "Most cancers are an accumulation of a lot of factors."

Even among women most at risk -- those with the so-called "breast cancer gene" -- 30 percent live to age 70 and beyond without cancer, for reasons unknown, Wolff said. "It's a complex disease even when we know one of the biggest risk factors."

DDT, or dichlorodiphenyltrichloroethane, was banned in the United States in 1972 amid concerns it was concentrating in the food chain and killing off bald eagles and other raptors.

But the report goes far beyond the pesticide. It indicts widely held ideas and common practices concerning minute amounts of chemicals ubiquitous in our environment.

"The work that needs to be done to identify whether there are environmental risk factors (with any particular compound) is very complicated," said Barbara Cohn, a senior researcher at the Health Institute and the report's lead author. "But it's very important. We need to look deeply at that."

The report suggests, for instance, that society is heading down the same path with atrazine, one of the world's most widely applied pesticides, said Breast Cancer Action's Brenner.

The most cutting-edge drugs in the fight against breast cancer are known as aromatase inhibitors. Post-menopausal women only produce estrogen in their adrenal glands, using the enzyme aromatase to convert the glands' androgen hormones to estrogen. Because estrogen stimulates some breast cancers, doctors attempt to curb cancer growth by blocking the body's production of aromatase.

Atrazine is an aromatase stimulator.

Despite this and other concerns about the pesticide's impact on wildlife, federal regulators say the science is too inconclusive to curb its use.

"We start using chemicals as if the only thing they're going to affect is the plant," Brenner said. "We have to start doing business a different way."

Equally worrisome, the authors say, is that many of the women most heavily exposed to DDT have not yet reached age 50. DDT production peaked in the United States in 1965, and while most studies to date have concluded such exposure wasn't meaningful, this new evidence suggests those assurances may be premature. The most strongly affected women -- those exposed when young -- are just now reaching age 50.

Said Cohn: "It's a caution that maybe there might be other types of evidence that need to be considered before that conclusion can be reached."

Contact Douglas Fischer at dfischer@angnewspapers.com or at (510) 208-6425.

(c) 2007 The Oakland Tribune

From: Environmental Science & Technology, Aug. 15, 2007

PBDES, CATS, AND CHILDREN

New research suggests that chronic exposure to PBDE flame retardants may be more hormone-disrupting than previously believed.

By Kellyn Betts

New ES&T research documents that house cats can have extraordinarily high concentrations of polybrominated diphenyl ether (PBDE) flame retardants in their blood. Janice Dye, a veterinary internist at the U.S. EPA's National Health and Environmental Effects Research Laboratory (NHEERL), and her colleagues say their findings suggest that "chronic [cumulative] low-dose PBDE exposure may be more endocrine-disrupting than would be predicted by most short-term or even chronic PBDE exposure studies in laboratory rodents." They contend that cats can serve as sentinels for chronic human exposure -- of both children and adults -- to the persistent, bioaccumulative, and toxic compounds.

PBDEs are known to impair thyroid functioning. They have been used since the late 1970s as flame retardants in household products, including upholstered furniture, carpet padding, and electronics. During that same time period, the incidence of a cat thyroid ailment, known as feline hyperthyroidism, has risen dramatically. "Feline hyperthyroidism... was never reported" 35 years ago, but "now it is very common," explains coauthor Linda Birnbaum, director of NHEERL's experimental toxicology division. The disease's cause has been a mystery, Dye says.

PBDE concentrations in blood serum of the 23 house cats participating in the study were 20-100 times higher than the median levels of PBDEs in people living in North America, who have been shown to have the world's highest human PBDE levels. Eleven of the cats in the study suffered from feline hyperthyroidism, and the study "points the finger at the association" between the endocrine-disrupting compounds and the disease, Dye says.

Dye and her colleagues observed that the median PBDE concentrations in their study group's young cats were on a par with the levels reported in a sampling of North American children. The paper shows that both cat food and house dust are likely sources of the cats' PBDEs. Although scant research has examined PBDE uptake in small children, studies from Australia, Norway, and the U.S. document that children younger than 4 years can have far higher levels of the compounds than adults.

Scientists hypothesize that this is because PBDEs can be found in house dust and young children are exposed to far more dust than older people. Cats' meticulous and continuous grooming habits could conceivably result in PBDE uptake similar to what toddlers are exposed to through their increased contact with floors and "mouthing" behaviors, Birnbaum says. Laboratory animals exposed to PBDEs before and after birth can have problems with brain development, including learning, memory, and behavior.

The PBDE uptake pattern of the cats in the study mirrors that of North American people, Dye points out. Both have unusually large "outlier populations" of individuals with PBDE levels that are four to seven times greater than the median concentrations.

The paper makes a convincing case that cats can be "a useful sentinel species for both [human] exposure to PBDEs and examination of endocrine disruption," notes Tom Webster, an associate professor at the Boston University School of Public Health's department of environmental health. Ake Bergman, chair of Stockholm University's environmental chemistry department, agrees, adding that the paper is noteworthy for showing that many cats harbor high quantities of the only PBDE compounds still being used in North America and Europe, which are associated with the Deca formulation used to flame retard electronics products. As of 2004, the lighter weight PBDEs associated with the Penta and Octa PBDE formulations used in polyurethane foam and other plastics were banned in Europe and discontinued in the U.S. However, these compounds are still found in older furniture and household furnishings, including upholstered furniture and carpeting.

For all of these reasons, the new research "supports the need for more studies on [PBDE] exposure to children from house dust," says Heather Stapleton, an assistant professor at Duke University's Nicholas School of the Environment.

Copyright 2007 American Chemical Society

From: The New York Times (pg. A21), Aug. 16, 2007

THE BIG MELT

By Nicholas D. Kristof

[Nicholas Kristof is a regular columnist for the New York Times.]

If we learned that Al Qaeda was secretly developing a new terrorist technique that could disrupt water supplies around the globe, force tens of millions from their homes and potentially endanger our entire planet, we would be aroused into a frenzy and deploy every possible asset to neutralize the threat.

Yet that is precisely the threat that we're creating ourselves, with our greenhouse gases. While there is still much uncertainty about the severity of the consequences, a series of new studies indicate that we're cooking our favorite planet more quickly than experts had expected.

The newly published studies haven't received much attention, because they're not in English but in Scientese and hence drier than the Sahara Desert. But they suggest that ice is melting and our seas are rising more quickly than most experts had anticipated.

The latest source of alarm is the news, as reported by my Times colleague Andrew Revkin, that sea ice in the northern polar region just set a new low -- and it still has another month of melting ahead of it. At this rate, the "permanent" north polar ice cap may disappear entirely in our lifetimes.

In case you missed the May edition of "Geophysical Research Letters," an article by five scientists provides the backdrop. They analyze the extent of Arctic sea ice each summer since 1953. The computer models anticipated a loss of ice of 2.5 percent per decade, but the actual loss was 7.8 percent per decade -- three times greater.

The article notes that the extent of summer ice melting is 30 years ahead of where the models predict.

Three other recent reports underscore that climate change seems to be occurring more quickly than computer models had anticipated:

Science magazine reported in March that Antarctica and Greenland are both losing ice overall, about 125 billion metric tons a year between the two of them -- and the amount has accelerated over the last decade. To put that in context, the West Antarctic Ice Sheet (the most unstable part of the frosty cloak over the southernmost continent) and Greenland together hold enough ice to raise global sea levels by 40 feet or so, although they would take hundreds of years to melt. We hope.

In January, Science reported that actual rises in sea level in recent years followed the uppermost limit of the range predicted by computer models of climate change -- meaning that past studies had understated the rise. As a result, the study found that the sea is likely to rise higher than most previous forecasts -- to between 50 centimeters and 1.4 meters by the year 2100 (and then continuing from there).

Science Express, the online edition of Science, reported last month that the world's several hundred thousand glaciers and small ice caps are thinning more quickly than people realized. "At the very least, our projections indicate that future sea-level rise may be larger than anticipated," the article declared.

What does all this mean?

"Over and over again, we're finding that models correctly predict the patterns of change but understate their magnitude," notes Jay Gulledge, a senior scientist at the Pew Center on Global Climate Change.

This may all sound abstract, but climate change apparently is already causing crop failures in Africa. In countries like Burundi, you can hold children who are starving and dying because of weather changes that many experts believe are driven by our carbon emissions.

There are practical steps we can take to curb carbon emissions, and I'll talk about them in a forthcoming column. But the tragedy is that the U.S. has become a big part of the problem.

"Not only is the U.S. not leading on climate change, we're holding others back," said Jessica Bailey, who works on climate issues for the Rockefeller Brothers Fund. "We're inhibiting progress on climate change globally."

I ran into Al Gore at a climate/energy conference this month, and he vibrates with passion about this issue -- recognizing that we should confront mortal threats even when they don't emanate from Al Qaeda.

"We are now treating the Earth's atmosphere as an open sewer," he said, and (perhaps because my teenage son was beside me) he encouraged young people to engage in peaceful protests to block major new carbon sources.

"I can't understand why there aren't rings of young people blocking bulldozers," Mr. Gore said, "and preventing them from constructing coal-fired power plants."

Critics scoff that the scientific debate is continuing, that the consequences are uncertain -- and they're right. There is natural variability and lots of uncertainty, especially about the magnitude and timing of climate change.

In the same way, terror experts aren't sure about the magnitude and timing of Al Qaeda's next strike. But it would be myopic to shrug that because there's uncertainty about the risks, we shouldn't act vigorously to confront them -- yet that's our national policy toward climate change, and it's a disgrace.


TWO RULES FOR DECISIONS: TRUST IN ECONOMIC GROWTH VS. PRECAUTION

By Joseph H. Guth

[Joseph H. Guth, JD, PhD, is Legal Director of the Science and Environmental Health Network. He holds a PhD in biochemistry from University of Wisconsin (Madison), and a law degree from New York University.]

Everyone knows the role of law is to control and guide the economy. From law, not economics, springs freedom from slavery, child labor and unreasonable working conditions. Law, reflecting the values we hold dear, governs our economy's infliction of damage to the environment.

Our law contains what might be called an overarching environmental decision rule that implements our social choices. The structure of this decision rule is an intensely political issue, for the people of our democracy must support its far-reaching consequences. Today we (all of us) are rethinking our current environmental decision rule, which our society adopted in the course of the Industrial Revolution.

The "trust in economic growth" environmental decision rule

Our overarching environmental decision rule (which is also prevalent in much of the rest of the world) constitutes a rarely-stated balance of social values that is hard to discern even though it pervades every aspect of our society.

This decision rule relies extensively on cost-benefit analysis and risk assessment, but the decision rule itself is even broader in scope. The foundation of the rule is the assumption that economic activity usually provides a net benefit to society, even when it causes some damage to human health and the environment. (This conception of "net benefit" refers to the effect on society as a whole, and does not trouble itself too much with the unequal distribution of costs and benefits.)

From this assumption, it follows that we should allow all economic activities, except those for which someone can prove the costs outweigh the benefits. This, then, is the prevailing environmental decision rule of our entire legal system: economic activity is presumed to provide a net social benefit even if it causes some environmental damage, and government may regulate (or a plaintiff may sue) only if it can carry its burden of proof to demonstrate that the costs outweigh the benefits in a particular case. Let's call this the "trust in economic growth" decision rule.

The "precautionary principle" environmental decision rule

The "precautionary principle" is equal in scope to the "trust in economic growth" decision rule, but incorporates a profoundly different judgment about how to balance environment and economic activity when they come into conflict. Under this principle, damage to the environment should be avoided, even if scientific uncertainties remain. This rule implements a judgment that we should presumptively avoid environmental damage, rather than presumptively accept it as we do under the "trust in economic growth" rule.

The role of "cost-benefit" analysis

"Cost-benefit" analysis is a subsidiary tool of both decision rules. However, it is used in very different contexts under the two rules. It can sometimes be employed under the "precautionary" decision rule as a way to compare and decide among alternatives. But under the "trust in economic growth" decision rule, cost-benefit analysis appears as the only major issue and often masquerades as the decision rule itself. This is because most of our laws implicitly incorporate the unstated presumption that economic activities should be allowed even if they cause environmental damage. By and large, our laws silently bypass that unstated presumption and start out at the point of instructing government to develop only regulations that pass a cost-benefit test.

Thus, the foundational presumption of the "trust in economic growth" decision rule is simply accepted as a received truth and is rarely examined or even identified as supplying our law's overarching context for cost-benefit calculations. Almost all economists probably agree with it (except those few who are concerned with the global human footprint and are trying to do full cost accounting for the economy as a whole).

The role of "sound science"

How does science, particularly "sound science," relate to all this? Science supplies a critical factual input used by governments and courts in implementing environmental decision rules. Science is employed differently by the two decision rules, but science does not constitute or supply a decision rule itself. Like cost-benefit analysis, science is a subsidiary tool of the decision rules and so cannot properly be placed in "opposition" to either decision rule. A claim that the precautionary principle, as an overarching environmental decision rule implementing a complex balancing of social values, is in "opposition" to science is a senseless claim.

The phrase "sound science" represents the proposition that a scientific fact should not be accepted by the legal system unless there is a high degree of scientific certainty about it. It is a term used by industry in regulatory and legal contexts and is not commonly used by scientists while doing scientific research. However, it resonates within much of the scientific community because it is a call to be careful and rigorous.

"Sound science" also represents a brake on the legal system's acceptance of emerging science, of science that cuts across disciplines, and of science that diverges from the established traditions and methodologies that characterize many specific disciplines of science. "Sound science" encourages scientists who are concerned about the world to remain in their silos, to avoid looking at the world from a holistic viewpoint, and to avoid professional risks.

But, why does it work for industry? The call for government and law to rely only on "sound science" when they regulate is a call for them to narrow the universe of scientific findings that they will consider to those that have a high degree of certainty.

This serves industrial interests under our prevailing "trust in economic growth" decision rule because it restricts the harms to human health and the environment that can be considered by government and law to those that are sufficiently well established to constitute "sound science."

Because the burden of proof is on government, requiring government to rely only on facts established by "sound science" reduces the scope of possible regulatory activity by making it harder for government to carry its burden to show that the benefits of regulation (avoidance of damage to health and environment) outweigh the costs to industry. Exactly the same dynamic is at play when plaintiffs try to carry their burden of proof to establish legal liability for environmental damage. Shifting the burden of proof would shift the effect of "sound science."

"Sound science" can help industrial interests under a precautionary decision rule, but it also contains a seed of disaster for them.

Precaution is triggered when a threat to the environment is identified, so that the more evidence required to identify a threat as such, the fewer triggers will be pulled. While the precautionary principle is designed to encourage environmental protection even in the face of uncertainty, those opposed to environmental protection urge that the threshold for identification of threats should require as much certainty as possible, and preferably be based only on "sound science."

The seed of disaster for industrial interests is this: the burden of proof can be switched under the precautionary principle (so that when a threat to the environment is identified the proponent of an activity must prove it is safe -- just as a pharmaceutical company must prove that a drug is safe and effective before it can be marketed). When that happens, a call for "sound science" actually would cut against such proponents rather than for them. This is because proponents of an activity would have to provide proof of safety under a "sound science" standard. In other words, the call for "sound science" creates higher burdens on those bearing the burden of proof. In fact, while debates about "sound science" masquerade as debates about the quality of science, the positions that different actors take are actually driven entirely by the underlying legal assignment of the burden of proof.

Why precaution? Because of cumulative impacts.

One of the reasons for adopting the precautionary principle, rather than the "trust in economic growth" decision rule, is "cumulative impacts."

The foundational assumption of the "trust in economic growth" rule (that economic activity is generally to the net benefit of society, even if it causes environmental damage) is further assumed to be true no matter how large our economy becomes. To implement the "trust in economic growth" rule, all we do is eliminate any activity without a net benefit, and in doing this we examine each activity independently. The surviving economic activities, and the accompanying cost-benefit-justified damage to the environment, are both thought to be able to grow forever.

Not only is there no limit to how large our economy can become, there is no limit as to how large justified environmental damage can become either. The "trust in economic growth" decision rule contains no independent constraint on the total damage we do to Earth -- indeed the core structure of this decision rule assumes that we do not need any such constraint. People who think this way see no need for the precautionary principle precisely because they see no need for the preferential avoidance of damage to the environment that it embodies.

But, as we now know, there is in fact a need for a limit to the damage we do to Earth. Unfortunately, the human enterprise has now grown so large that we are running up against the limits of the Earth -- if we are not careful, we can destroy our only home. (Examples abound: global warming, thinning of Earth's ozone shield, depletion of ocean fisheries, shortages of fresh water, accelerated loss of species, and so on.)

And it is the cumulative impact of all we are doing that creates this problem. One can liken it to the famous "straw that broke the camel's back." At some point "the last straw" is added to the camel's load, its carrying capacity exceeded. Just as it would miss the larger picture to assume that since one or a few straws do not hurt the camel, straw after straw can be piled on without concern, so the "trust in economic growth" decision rule misses the larger picture by assuming that cost-benefit-justified environmental damage can grow forever.

Thus, it is the total size of our cumulative impacts that is prompting us to revisit our prevailing decision rule. This is why we now need a decision rule that leads us to contain the damage we do. It is why we now must work preferentially to avoid damage to the Earth, even if we forego some activities that would provide a net benefit if we lived in an "open" or "empty" world whose limits were not being exceeded. We can still develop economically, but we must live within the constraints imposed by Earth itself.

Ultimately, the conclusion that we must learn to live within the capacity of a fragile Earth to provide for us, painful as it is, is thrust upon us by the best science that we have -- the science that looks at the whole biosphere, senses the deep interconnections between all its parts, places us as an element of its ecology, recognizes the time scale involved in its creation and our own evolution within it, and reveals, forever incompletely, the manifold and mounting impacts that we are having upon it and ourselves.

From: Environment News Service, Jul. 27, 2007

CHILDHOOD GROWTH STAGES DETERMINE WHAT HARM POLLUTION DOES

Geneva, Switzerland -- An increased risk of cancer, heart and lung disease in adults can result from exposures to certain environmental chemicals during childhood, the World Health Organization said today. This finding is part of the first report ever issued by the agency focusing on children's special susceptibility to harmful chemical exposures at different stages of their growth.

Air and water contaminants, pesticides in food, lead in soil, as well as many other environmental threats that alter the delicate organism of a growing child, may cause or worsen disease and induce developmental problems, said the World Health Organization, WHO, releasing the report at its Geneva headquarters.

The peer-reviewed report highlights the fact that in children, the stage in their development when exposure to a threat occurs may be just as important as the magnitude of the exposure.

"Children are not just small adults," said Dr. Terri Damstra, team leader for WHO's Interregional Research Unit. "Children are especially vulnerable and respond differently from adults when exposed to environmental factors -- and this response may differ according to the different periods of development they are going through."

"For example, their lungs are not fully developed at birth, or even at the age of eight, and lung maturation may be altered by air pollutants that induce acute respiratory effects in childhood and may be the origin of chronic respiratory disease later in life," Dr. Damstra said.

Over 30 percent of the global burden of disease in children can be attributed to environmental factors, the WHO study found.

The global health body said this report is the most comprehensive work yet undertaken on the scientific principles to be considered in assessing health risks in children.

The work was undertaken by an advisory group of 24 scientific experts, representing 18 countries. They were convened by WHO to provide insight, expertise, and guidance, and to ensure scientific accuracy and objectivity. Once the text was finalized, it was sent to more than 100 contact points throughout the world for review and comment, and also made available on WHO's International Programme of Chemical Safety website for external review and comment.

The central focus of the study is on the child from embryo through adolescence and on the need to have a good understanding of the interactions between exposure, biological susceptibility, and socioeconomic and nutritional factors at each stage of a child's development.

The scientific principles proposed in the document for evaluating environmental health risks in children will help the health sector, researchers and policy makers protect children of all ages through improved risk assessments, appropriate interventions and focused research, so that they can grow into healthy adults.

Children have different susceptibilities during different life stages, due to their dynamic growth and developmental processes, the authors said.

Health effects resulting from developmental exposures prenatally and at birth may include miscarriage, stillbirth, low birth weight and birth defects.

Young children may die or develop asthma, neurobehavioral and immune impairment. Adolescents may experience precocious or delayed puberty.

The vulnerability of children is increased in degraded and poor environments, the report confirms. Neglected and malnourished children suffer the most. These children often live in unhealthy housing, lack clean water and sanitation services, and have limited access to health care and education. For example, lead is known to be more toxic to children whose diets are deficient in calories, iron and calcium.

WHO warns, "One in five children in the poorest parts of the world will not live longer than their fifth birthday -- mainly because of environment-related diseases."

This new volume of the Environmental Health Criteria series, Principles for Evaluating Health Risks in Children Associated with Exposure to Chemicals, is available online.

Copyright Environment News Service (ENS) 2007

From: Voice of America, Aug. 8, 2007

SCIENTISTS STUDY HEALTH RISKS OF PLASTIC INGREDIENT

By Melinda Smith

Bisphenol A is everywhere. BPA, as it is also called, can be found in the plastic milk bottle used to feed your baby, the cola bottle or food container you pick up for a fast food meal, the kidney dialysis machine patients need to keep them alive, and the dental sealant used to help prevent tooth decay.

A recent study in the journal Reproductive Toxicology raises concerns about adverse effects of BPA in fetal mice, at levels even lower than U.S. government standards permit.

The scientists say widespread exposure through food and liquid containers occurs when the chemical bonds that hold bisphenol A in the plastic degrade. BPA then leaches into the containers' contents.

Frederick vom Saal, a professor of biological sciences at the University of Missouri, says he is concerned about what we absorb and then transmit to our infants. "Very low doses of this -- below the amounts that are present in humans -- particularly when exposure occurs in fetuses and newborns, you end up with those babies eventually developing prostate cancer, breast cancer -- they become hyperactive."

Kimberly Lisack is a mother who wants more information about what she feeds her baby. "I get concerned looking at a lot of the packaged baby foods. It's a lot of chemicals and things I don't know what they are."

Bisphenol A has been used to make polycarbonate plastic for decades. A statement released by the American Chemistry Council says the report is at odds with other international studies that say BPA levels pose no health risks to consumers.

From: OnEarth, Aug. 1, 2007

LOOKING DEEP, DEEP INTO YOUR GENES

By Laura Wright

Martha Herbert, a pediatric neurologist at Boston's Massachusetts General Hospital, studies brain images of children with autism. She was seeing patients one day a few years ago when a 3-year-old girl walked in with more than the usual cognitive and behavioral problems. She was lactose intolerant, and foods containing gluten always seemed to upset her stomach. Autistic children suffer profoundly, and not just in their difficulty forming emotional bonds with family members, making friends, or tolerating minor deviations from their daily routines. Herbert has seen many young children who've had a dozen or more ear infections by the time they made their way through her door, and many others -- "gut kids" -- with chronic diarrhea and other gastrointestinal problems, including severe food allergies. Such symptoms don't fit with the traditional explanation of autism as a genetic disorder rooted in the brain, and that was precisely what was on Herbert's mind that day. She's seen too many kids whose entire systems have gone haywire.

During the course of the little girl's appointment, Herbert learned that the child's father was a computer scientist -- a bioinformatist no less, someone trained to crunch biological data and pick out patterns of interest. She shared with him her belief that autism research was overly focused on examining genes that play a role in brain development and function, to the exclusion of other factors -- namely, children's susceptibility to environmental insults, such as exposure to chemicals and toxic substances. Inspired by their conversation, Herbert left the office that day with a plan: She and the girl's father, John Russo, head of computer science at the Wentworth Institute of Technology, would cobble together a team of geneticists and bioinformatists to root through the scientific literature looking for genes that might be involved in autism without necessarily being related to brain development or the nervous system. The group scanned databases of genes already known to respond to chemicals in the environment, selecting those that lie within sequences of DNA with suspected ties to autism. They came up with more than a hundred matches, reinforcing Herbert's belief that such chemicals interact with specific genes to make certain children susceptible to autism.

Although some diseases are inherited through a single genetic mutation -- cystic fibrosis and sickle cell anemia are examples -- the classic "one gene, one disease" model doesn't adequately explain the complex interplay between an individual's unique genetic code and his or her personal history of environmental exposures. That fragile web of interactions, when pulled out of alignment, is probably what causes many chronic diseases: cancer, obesity, asthma, heart disease, autism, and Alzheimer's, to name just a few. To unravel the underlying biological mechanisms of these seemingly intractable ailments requires that scientists understand the precise molecular dialogue that occurs between our genes and the environment -- where we live and work, what we eat, drink, breathe, and put on our skin. Herbert's literature scan was a nod in this direction, but actually teasing out the answers in a laboratory has been well beyond her or anyone else's reach -- until now.

Consider for a moment that humans have some 30,000 genes, which interact in any number of ways with one or more of the 85,000 synthetic, commercially produced chemicals, as well as heavy metals, foods, drugs, myriad pollutants in the air and water, and anything else our bodies absorb from the environment. The completion of the Human Genome Project in 2003 armed scientists with a basic road map of every gene in the human body, allowing them to probe more deeply into the ways our DNA controls who we are and why we get sick, in part by broadening our understanding of how genes respond to external factors. In the years leading up to the project's completion, scientists began developing powerful new tools for studying our genes. One is something called a gene chip, or DNA microarray, which came about through the marriage of molecular biology and computer science. The earliest prototype was devised about a decade ago; since then these tiny devices, as well as other molecular investigative tools, have grown exponentially in their sophistication, pushing medical science toward a new frontier.

Gene chips are small, often no larger than your typical domino or glass laboratory slide, yet they can hold many thousands of genes at a time. Human genes are synthesized and bound to the surface of the chip such that a single copy of each gene -- up to every gene in an organism's entire genome -- is affixed in a grid pattern. The DNA microarray allows scientists to take a molecular snapshot of the activity of every gene in a cell at a given moment in time.

The process works this way: Every cell in your body contains the same DNA, but DNA activity -- or expression -- is different in a liver cell, say, than it is in a lung, brain, or immune cell. Suppose a scientist wishes to analyze the effect of a particular pesticide on gene activity in liver cells. (This makes sense, since it is the liver that processes and purges many toxins from the body.) A researcher would first expose a liver cell culture in a test tube to a precise dose of the chemical. A gene's activity is observed through the action of its RNA, molecules that convey the chemical messages issued by DNA. RNA is extracted from the test tube, suspended in a solution, then poured over the gene chip. Any given RNA molecule will latch on only to the specific gene that generated it. The genes on the chip with the most RNA stuck to them are the ones that were most active in the liver cells, or most "highly expressed." The genes that don't have any RNA stuck to them are said to be "turned off" in those cells. Scientists use the microarray to compare the exposed cells to non-exposed, control cells. Those genes that show activity in the exposed cells but not in the control cells, or vice versa, are the ones that may have been most affected by the pesticide exposure.
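
To make that comparison step concrete, here is a minimal sketch -- with invented expression values, not the output of any real experiment -- of how genes might be flagged as differentially expressed between exposed and control cells:

    # Minimal sketch of the exposed-vs-control comparison described
    # above. Expression values are invented for illustration.
    exposed = {"geneA": 950.0, "geneB": 12.0, "geneC": 430.0}
    control = {"geneA": 110.0, "geneB": 10.0, "geneC": 420.0}

    def flag_changed_genes(exposed, control, fold_change=3.0):
        """Return genes whose expression rose or fell by at least fold_change."""
        flagged = []
        for gene, level in exposed.items():
            baseline = control[gene]
            ratio = level / baseline if baseline else float("inf")
            if ratio >= fold_change or ratio <= 1.0 / fold_change:
                flagged.append(gene)
        return flagged

    print(flag_changed_genes(exposed, control))  # prints ['geneA']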

DNA microarrays open the door to an entirely new way of safety-testing synthetic chemicals: Each chemical alters the pattern of gene activity in specific ways, and thus possesses a unique genetic fingerprint. If a chemical's genetic fingerprint closely matches that of another substance already known to be toxic, there is good reason to suspect that that chemical can also do us harm. Ultimately, government agencies charged with regulating chemicals and protecting our health could use this method, one aspect of a field called toxicogenomics, to wade through the thousands of untested or inadequately studied chemicals that circulate in our environment. In other words, these agencies could make our world safer by identifying -- and, one hopes, banning -- hazardous substances.
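
The article does not spell out how a fingerprint "match" would be scored. One simple possibility, sketched here under our own assumptions with invented fold-change profiles, is to correlate the per-gene changes induced by an unknown chemical with those of a known toxicant:

    # Hypothetical sketch: score the similarity of two expression
    # "fingerprints" (per-gene fold changes) by correlation.
    from statistics import correlation  # available in Python 3.10+

    known_toxicant = [3.2, 0.4, 1.1, 2.8, 0.5]    # invented fold changes
    unknown_chemical = [2.9, 0.5, 1.0, 3.1, 0.6]  # invented fold changes

    r = correlation(known_toxicant, unknown_chemical)
    if r > 0.9:  # arbitrary illustrative threshold
        print(f"fingerprints similar (r = {r:.2f}); flag for closer study")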

For such a young field, toxicogenomics has already begun to challenge some fundamental assumptions about the origins of disease and the mechanisms through which chemicals and various environmental exposures affect our bodies. Consider the case of mercury, which was identified as poisonous many centuries ago. Its potential to wreak havoc on the human nervous system was most tragically demonstrated in the mass poisoning of the Japanese fishing village of Minamata in the 1950s. More recently, scientists have begun to amass evidence suggesting that mercury also harms the immune system. In 2001, Jennifer Sass, a neurotoxicologist and senior scientist at the Natural Resources Defense Council (NRDC), who was then a postdoctoral researcher at the University of Maryland, designed an experiment that included the use of microarrays and other molecular tools to figure out how, exactly, mercury was interfering with both our nervous and immune systems. She grew cells in test tubes -- one set for mouse brain cells, another for mouse liver cells -- and exposed them to various doses of mercury so that she could see which genes were being switched on and off in the presence of the toxic metal. In the brain and the liver cells, she noticed unusual activity in the gene interleukin-6, which both responds to infection and directs the development of neurons.

"We thought we had mercury figured out," says Ellen Silbergeld, a professor of environmental health sciences at Johns Hopkins University, who collaborated with Sass on the study. Genomic tools may identify effects of other chemicals by allowing scientists to "go fishing," as Silbergeld puts it, for things they didn't know to look for.

The findings of Sass, Silbergeld, and others indicate that mercury might play a role in the development of diseases involving immune system dysfunction. These diseases perhaps include autism -- think of Herbert's patients with their inexplicable collection of infections and allergies -- but also the spate of autoimmune disorders that we can't fully explain, from Graves' disease and rheumatoid arthritis to multiple sclerosis and lupus.

"Do we need to reevaluate our fish advisories?" Silbergeld asks. "Are our regulations actually protecting the most sensitive people?" We target pregnant women and children because we've presumed that mercury's neurotoxic effects are most damaging to those whose brains are still developing. Sass and Silbergeld's findings don't contradict that assumption, but they do suggest that there might be other adults who are far more vulnerable than we'd realized -- who simply can't tolerate the more subtle effect the metal has on their immune system because of a peculiarity in their genetic makeup. Designing fish advisories for those people, whose sensitivities are coded in their DNA, is a challenge we've never tackled before. Translating new findings about how chemicals affect gene activity into something of broader public health value will require that we understand precisely the tiny genetic differences among us that make one person or group of people more vulnerable than others to certain environmental exposures. One way to do that is by slightly modifying the gene chip to allow researchers to scan up to a million common genetic variants -- alternate spellings of genes, so to speak, that differ by just a single letter -- to look for small differences that might make some people more likely to get sick from a toxic exposure.

Our attempts to identify those who are most genetically susceptible to developing a particular disease as a result of environmental exposures have already yielded important insights. Patricia Buffler, dean emerita of the School of Public Health at the University of California, Berkeley, has found that children with a certain genetic variant may be susceptible to developing leukemia in high-traffic areas, where they're likely to be exposed to benzene compounds in auto exhaust. Other studies have found that a particular genetic variation in some women who drink chlorinated municipal water leads to an increased likelihood that they'll give birth to underweight babies. Still others have found that a specific version of an immune gene, HLA-DP, renders people vulnerable to the toxic effects of the metal beryllium, which causes a chronic lung condition in the genetically sensitive population. This particular vulnerability raises some sticky workplace issues. Toxic exposure to beryllium occurs almost exclusively in industrial settings where welders and other machinists come in contact with the metal while making defense industry equipment, computers, and other electronics. Should employers test their workers for genetic variants that may put them at risk for developing a disease? Could that information be used to bar someone from a job? Such ethical considerations, and their legal and public policy ramifications, will only multiply as we learn more.

But first, a more fundamental question: Do we even understand what today's chronic diseases are? It is beginning to appear that what we call autism may in fact be many illnesses that we've lumped together because those who are afflicted seem to behave similarly. Doctors base their diagnosis on behavioral symptoms, not on what caused those symptoms. Some scientists now refer to the condition as "autisms," acknowledging that we've yet to find a single, unifying biological mechanism, despite the identification, in some studies, of a handful of genes that may confer increased vulnerability. But then, genes or environmental exposures that appear to be important causal factors in one study may not show up at all in another. This leaves scientists to wonder whether the problem isn't that the disease is so diverse in its biological origins that only a truly massive study -- involving many thousands of patients -- would have the statistical power to tease apart the various factors involved.

The same difficulty probably holds true for many chronic diseases, explains Linda Greer, a toxicologist and director of the health program at NRDC. "What we think of as asthma, for example, is probably not one condition at all. It's probably many different diseases that today we simply call asthma." Seemingly contradictory explanations for the epidemic could all turn out to be true. Until we are able to sift out what makes one asthmatic child different from the next -- how and why their respective molecular makeups differ -- treatments or preventive measures that work for one child will continue to fail for another.

At the Centers for Disease Control and Prevention, Muin Khoury, the director of the National Office of Public Health Genomics, has created theoretical models to try to figure out just how many different factors may be involved in most chronic diseases. His findings suggest that some combination of 10 to 20 genes plus a similar number of environmental influences could explain most of the complex chronic diseases that plague the population. But to analyze how even a dozen genes interact with a dozen environmental exposures across large populations requires vast studies: immense numbers of people and huge volumes of data -- everything from precise measurements of gene activity inside cells to exact recordkeeping of subjects' exposure to environmental hazards. Microarrays and other molecular tools now make such studies possible.
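
A back-of-the-envelope count -- our illustration, not Khoury's published model -- shows why. If each of a dozen genes and a dozen exposures is treated as simply present or absent, a study population can split into as many as

    2^(12+12) = 2^24 = 16,777,216

distinct gene-exposure profiles, vastly more than any cohort can fill. Even a study of 100,000 people must therefore rely on statistical modeling rather than brute-force comparison of every subgroup.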

In 2003, Columbia University and the Norwegian government together launched the Autism Birth Cohort, one of the largest autism investigations in history. The study will track 100,000 Norwegian mothers and children -- from before birth through age 18 -- collecting clinical data, blood, urine, and other biological materials. It will also collect RNA in order to analyze gene activity. Though initial results are due in 2011, it will take decades to complete this study, and RNA samples will have to be painstakingly archived while the investigators await additional funding. Although the current study is not focused on environmental health per se, researchers plan to measure a variety of biological exposures -- including infection, environmental toxins, and dietary deficiencies -- in each mother and child. As the children grow up, and as some among them develop disease, scientists will have complete records to analyze for key commonalities and differences. Which genes do the sick children have in common? Which chemical exposures were most meaningful? The answers may provide clues not only to the origins of autism, but to many other disorders, from cerebral palsy to asthma to diabetes. Other archiving projects are even more ambitious, such as the U.K. Biobank project, which has begun to enroll 500,000 people to create the world's largest resource for studying the role of the environment and genetics in health and disease.

As vital to our understanding of human disease as such studies may prove to be, a 50-year-old taking part in the U.K. Biobank project isn't likely to reap the rewards. "It will take a long time to make sense of the data," says Paul Nurse, a 2001 Nobel laureate in medicine and the president of Rockefeller University. According to Nurse, it may well be that most of the researchers starting these studies today won't see the final results -- the data will be analyzed by their children. In his estimation, that's all the more reason "to get on with it."

In response to concern that environmental exposures were affecting children's health, the Clinton administration in 2000 launched the National Children's Study, the largest such undertaking in the United States, under the auspices of the National Institutes of Health. The goal was to enroll 100,000 children; a genetic biobanking component has since been added. Investigators have not yet recruited participants, in part because of financial uncertainties. The Bush administration's 2007 budget proposal completely eliminated money for the study, though Congress reinstated funding in February.

The irony is that cutting funding for such projects may be the most expensive option of all. Even if we successfully address campaign-dominating political issues like skyrocketing medical costs and the growing ranks of the uninsured, our failure to consider the fundamental mechanisms of disease -- the interplay between our genes and the environment -- could still bankrupt us, socially if not financially. Until we're able to interrupt the slide toward disease much earlier, based on our developing knowledge of how genes and the environment interact, medicine will remain the practice of "putting people back together after they've been hit by the train," says Wayne Matson, a colleague of Martha Herbert's who studies Huntington's and other neurodegenerative diseases at the Edith Nourse Rogers Memorial Veterans Hospital in Bedford, Massachusetts. "It would be a lot better if we knew how to pull that person off the tracks in the first place."

Copyright 2007 by the Natural Resources Defense Council

From: Reuters, Aug. 9, 2007

GLOBAL WARMING WILL STEP UP AFTER 2009: SCIENTISTS

Washington (Reuters) -- Global warming is forecast to set in with a vengeance after 2009, with at least half of the five following years expected to be hotter than 1998, the warmest year on record, scientists reported on Thursday.

Climate experts have long predicted a general warming trend over the 21st century spurred by the greenhouse effect, but this new study gets more specific about what is likely to happen in the decade that started in 2005.

To make this kind of prediction, researchers at Britain's Met Office -- which deals with meteorology -- made a computer model that takes into account such natural phenomena as the El Nino pattern in the Pacific Ocean and other fluctuations in ocean circulation and heat content.

A forecast of the next decade is particularly useful, because climate could be dominated over this period by these natural changes, rather than human-caused global warming, study author Douglas Smith said by telephone.

In research published in the journal Science, Smith and his colleagues predicted that the next three or four years would show little warming despite an overall forecast that saw warming over the decade.

"There is... particular interest in the coming decade, which represents a key planning horizon for infrastructure upgrades, insurance, energy policy and business development," Smith and his co- authors noted.

The real heat will start after 2009, they said.

Until then, the natural forces will offset the expected warming caused by human activities, such as the burning of fossil fuels, which releases the greenhouse gas carbon dioxide.

"HINDCASTS" FOR THE FUTURE

"There is... particular interest in the coming decade, which represents a key planning horizon for infrastructure upgrades, insurance, energy policy and business development," Smith and his co- authors noted.

To check their models, the scientists used a series of "hindcasts" -- forecasts that look back in time -- going back to 1982, and compared what their models predicted with what actually occurred.

Factoring in the natural variability of ocean currents and temperature fluctuations yielded an accurate picture, the researchers found. This differed from other models which mainly considered human-caused climate change.
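
In outline, that verification step is simple to express. Below is a minimal sketch -- using invented temperature anomalies, not Met Office data -- of scoring retrospective forecasts against what was actually observed:

    # Minimal sketch of hindcast verification: issue retrospective
    # "forecasts" for past years, then score them against observations.
    # All values are invented for illustration.
    import math

    observed = {1982: 0.10, 1983: 0.25, 1984: 0.08}  # anomalies, deg C
    hindcast = {1982: 0.12, 1983: 0.20, 1984: 0.10}  # model given only prior data

    def rmse(predictions, observations):
        """Root-mean-square error of predictions against observations."""
        errors = [(predictions[y] - observations[y]) ** 2 for y in observations]
        return math.sqrt(sum(errors) / len(errors))

    print(f"hindcast RMSE: {rmse(hindcast, observed):.3f} deg C")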

"Over the 100-year timescale, the main change is going to come from greenhouse gases that will dominate natural variability, but in the coming 10 years the natural internal variability is comparable," Smith said.

In another climate change article in the online journal Science Express, U.S. researchers reported that soot from industry and forest fires had a dramatic impact on the Arctic climate, starting around the time of the Industrial Revolution.

Industrial pollution brought a seven-fold increase in soot -- also known as black carbon -- in Arctic snow during the late 19th and early 20th centuries, scientists at the Desert Research Institute found.

Soot, mostly from burning coal, reduces the reflectivity of snow and ice, letting Earth's surface absorb more solar energy and possibly resulting in earlier snow melts and exposure of much darker underlying soil, rock and sea ice. This in turn led to warming across much of the Arctic region.

At its height from 1906 to 1910, estimated warming from soot on Arctic snow was eight times that of the pre-industrial era, the researchers said.

Copyright 2007 Reuters Ltd

From: News & Observer (Raleigh, N.C.), Aug. 9, 2007

TREES OF LIMITED VALUE VS. WARMING

Doubt is cast on 'an attractive solution'

By Margaret Lillard, The Associated Press

RALEIGH -- A decade-long experiment led by Duke University scientists indicates that trees provide little help in offsetting increased levels of the greenhouse gas carbon dioxide.

That's because although the trees grew more, only those that got the most water and nutrients could store significant levels of carbon.

"The responses are very variable according to how available other resources are -- nutrients and water -- that are necessary for tree growth," said Heather McCarthy, a former graduate student at the private university in Durham who spent 6 1/2 years on the project. "It's really not anywhere near the magnitude that we would really need to offset emissions."

McCarthy, now a postdoctoral fellow at the University of California at Irvine, presented the findings this week at a national meeting of the Ecological Society of America in San Jose, Calif. Researchers from the U.S. Forest Service, Boston University and the University of Charleston also contributed to the report.

All helped in the Free-Air Carbon Dioxide Enrichment (FACE) experiment, in which pine trees in Duke Forest were exposed to higher-than-normal levels of carbon dioxide.

The scientists also gathered data on whether the forest could grow fast enough to help control predicted increases in the level of carbon dioxide.

The loblolly pines grew more tissue, but only those that got the most water and nutrients were able to store enough carbon to have any impact on global warming, the scientists discovered.

"These trees are storing carbon," McCarthy said Wednesday. "It's just not such a dramatic quantity more."

That means proposals to use trees to bank increasing amounts of carbon dioxide emitted by humans may depend too heavily on the weather and large-scale fertilization to be feasible.

"It would be an attractive solution, for sure," McCarthy said. "I don't know how realistic people thought it was, but I think there were certainly high hopes."

Scientists blame the worldwide buildup of carbon dioxide -- due largely to the burning of fossil fuel -- for global warming. The United States is second only to China in the level of greenhouse gas it emits as a nation.

The experiment site, funded by the U.S. Department of Energy, holds four forest plots dominated by loblolly pines in Duke Forest, in north-central North Carolina.

The trees are exposed to extra levels of carbon dioxide from computer-controlled valves that are mounted on rings of towers above the treetops. The valves can be adjusted to account for wind speed and direction, and sensors throughout the plot monitor carbon dioxide levels, McCarthy said. Four more plots received no extra gas.

Trees exposed to the gas produced about 20 percent more biomass -- wood and vegetation -- than untreated trees. But the researchers said the amounts of available water and nitrogen nutrients varied substantially between plots.

Ram Oren, the project director and a professor of ecology at Duke's Nicholas School of the Environment and Earth Sciences, said in a written statement that replicating the tree growth in the real world would be virtually impossible.

"In order to actually have an effect on the atmospheric concentration of CO2, the results suggest a future need to fertilize vast areas," Oren said. "And the impact on water quality of fertilizing large areas will be intolerable to society."

Copyright 2007, The News & Observer Publishing Company

From: International Herald Tribune, Aug. 7, 2007

EXTREME WEATHER: A GLOBAL PROBLEM

By Reuters and The Associated Press

GENEVA: Much of the world has experienced record-breaking weather events this year, from flooding in Asia to heat waves in Europe and snowfall in South Africa, the United Nations weather agency said Tuesday.

The World Meteorological Organization said global land surface temperatures in January and April were the warmest since such data began to be recorded in 1880, at more than one degree Celsius higher than average for those months.

There have also been severe monsoon floods across South Asia; abnormally heavy rains in northern Europe, China, Sudan, Mozambique and Uruguay; extreme heat waves in southeastern Europe and Russia; and unusual snowfall in South Africa and South America this year, the meteorological agency said.

"The start of the year 2007 was a very active period in terms of extreme weather events," Omar Baddour of the agency's World Climate Program said in Geneva.

While most scientists believe extreme weather events will be more frequent as heat-trapping carbon dioxide emissions cause global temperatures to rise, Baddour said it was impossible to say with certainty what the second half of 2007 would bring. "It is very difficult to make projections for the rest of the year," he said.

The Intergovernmental Panel on Climate Change, a UN group of hundreds of experts, has noted an increasing trend in extreme weather events over the past 50 years and said irregular patterns will probably intensify.

South Asia's worst monsoon flooding in recent memory has affected 30 million people in India, Bangladesh and Nepal, destroying croplands, livestock and property and raising fears of new health crises in the densely populated region.

Heavy rains also hit southern China in June, with nearly 14 million people affected by floods and landslides that killed 120 people, the World Meteorological Organization said.

England and Wales this year had their wettest May and June since records began in 1766, resulting in extensive flooding and more than $6 billion in damage, as well as at least nine deaths. Germany swung from its driest April since country-wide observations started in 1901 to its wettest May on record. And torrential rains have followed weeks of severe drought in northern Bulgaria -- officials said Tuesday that at least seven people have been killed in floods.

Mozambique suffered its worst floods in six years in February, followed by a tropical cyclone the same month. Flooding of the Nile River in June caused damage in Sudan.

In May, Uruguay had its worst flooding since 1959.

In June, the Arabian Sea had its first documented cyclone, which touched Oman and Iran.

Temperatures also strayed from the expected this year. Records were broken in southeastern Europe in June and July, and in western and central Russia in May. In many European countries, April was the warmest ever recorded.

Argentina and Chile saw unusually cold winter temperatures in July while South Africa had its first significant snowfall since 1981 in June.

The World Meteorological Organization and its 188 member states are working to set up an early warning system for extreme weather events. The agency also wants to improve monitoring of the impacts of climate change, particularly in poorer countries that are expected to bear the brunt of floods.

As exceptionally heavy rains continued to cut a wide swath of ruin across northern India, a top UN official warned Tuesday that climate change could destroy vast expanses of farmland in the country, ultimately affecting food production and adding to the problems of already desperate peasants, The New York Times reported from New Delhi.

Even a small increase in temperatures, said Jacques Diouf, head of the Food and Agriculture Organization, could push down crop yields in the world's southern regions, even as agricultural productivity goes up in the north. A greater frequency of droughts and floods, one of the hallmarks of climate change, the agency added, could be particularly bad for agriculture.

"Rain-fed agriculture in marginal areas in semi-arid and sub-humid regions is mostly at risk," Diouf said. "India could lose 125 million tons of its rain-fed cereal production -- equivalent to 18 percent of its total production." That is a sign of the steep human and economic impact of extreme weather in India.

Copyright International Herald Tribune


TOXIC LEAD IS STILL ROBBING OUR CHILDREN OF BRAIN POWER

By Peter Montague

In a front-page story June 22, the New York Times reported that a first-born child typically has a 3-point IQ advantage over any brothers or sisters born later.[1] The editors of the Times considered this information so important that they featured it in a second news story,[2] an op-ed commentary,[3] and four letters to the editor.[4]

Here is how the Times initially described the importance of a 3-point IQ advantage:

"Three points on an I.Q. test may not sound like much. But experts say it can be a tipping point for some people -- the difference between a high B average and a low A, for instance. That, in turn, can have a cumulative effect that could mean the difference between admission to an elite private liberal-arts college and a less exclusive public one."[1]

The Times did not mention it, but for some children the loss of 3 IQ points could mean the difference between a high D average and a low C, with a cumulative effect that could mean the difference between staying in school and dropping out. In other words, a 3-point loss of IQ may be crucially important in every child's life, not just those headed for the Ivy League.

The U.S. Department of Labor says 19 million jobs will be created in the next decade and 12 million of them (63%) will require education beyond high school.[5] As the globalized economy puts U.S. workers under greater competitive pressure, workers are expected to survive by retraining themselves 2 or 3 times during their working years. In this new world, every IQ point takes on new importance.

Unfortunately, the loss of 4 to 7 IQ points is far more widespread among U.S. children than anyone has so far reported, except in obscure medical journals.

One of the main causes of widespread loss of IQ is the toxic metal lead, a potent neurotoxin. This soft gray metal was widely used in paint, in leaded gasoline, in sealing "tin" cans, and in water pipes throughout most of the 20th century, and the residuals are still taking a toll today in the form of peeling paint, toxic house-dust in older homes, contaminated soil, and a measurable body burden in almost all our children.

The most common units of measurement for lead in blood are micrograms per deciLiter of blood (ug/dL). A microgram is a millionth of a gram and there are 28 grams in an ounce. A deciLiter is a tenth of a liter and a liter is roughly a quart.[6]
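
For readers who want to check the unit arithmetic, here is a minimal sketch in Python; the variable names are mine, chosen for illustration:

    # Express a blood lead level of 10 ug/dL in grams per liter,
    # using the definitions above: 1 microgram = 1e-6 grams,
    # 1 deciliter = 0.1 liters.
    level_ug_per_dl = 10
    grams_per_liter = (level_ug_per_dl * 1e-6) / 0.1
    print(grams_per_liter)  # 0.0001 -- a ten-thousandth of a gram per liter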

As lead in your blood goes up, your IQ goes down. And paradoxically the first few micrograms of lead are the most damaging.

As a child's lead rises from less than 1 ug/dL up to 10, he or she loses an average of 7 IQ points; the first 5 ug/dL alone account for about 4 of those points.[7,8,9,10] If lead continues rising from 10 to 20, another 2 IQ points get shaved off.[7,8,9,10]
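
To make the shape of that dose-response concrete, here is a minimal Python sketch that interpolates linearly between the figures just quoted; the function name and the straight-line interpolation between those anchor points are assumptions made for illustration, not part of the cited studies:

    # Anchor points from the text: (blood lead in ug/dL, average IQ
    # points lost). Straight-line interpolation between anchors is an
    # illustrative assumption.
    ANCHORS = [(0.0, 0.0), (5.0, 4.0), (10.0, 7.0), (20.0, 9.0)]

    def estimated_iq_loss(lead_ug_per_dl):
        """Rough average IQ-point loss at a given blood lead level."""
        level = min(max(lead_ug_per_dl, 0.0), 20.0)  # figures stop at 20
        for (x0, y0), (x1, y1) in zip(ANCHORS, ANCHORS[1:]):
            if level <= x1:
                return y0 + (y1 - y0) * (level - x0) / (x1 - x0)
        return ANCHORS[-1][1]

    print(estimated_iq_loss(5))   # 4.0 -- the "first 5 ug/dL"
    print(estimated_iq_loss(10))  # 7.0
    print(estimated_iq_loss(20))  # 9.0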

According to the latest available data, 26 percent of all children in the U.S. between the ages of 1 and 5 have 5 to 10 micrograms of toxic lead in each deciLiter of blood[11] -- which corresponds to a loss of 4 to 7 IQ points.[7,8,9,10] The estimate of lead in blood was published in December 2003, covering the period 1988-1994. Average levels today are probably somewhat lower because the trend for lead in children's blood is downward.

Unfortunately this 26% average for all U.S. children masks a disproportionate effect among non-whites, who tend to live in families with low income and in older homes that may have peeling paint containing toxic lead.

In the 2003 report, nearly half (47%) of non-Hispanic Black children ages 1 to 5 had blood lead levels in the range of 5 to 10 ug/dL, which corresponds to a loss of 4 to 7 IQ points. Nineteen percent of white children and 28% of Hispanic children fell in the same range.[11]

This means that exposure to toxic lead is still a huge problem in the U.S., robbing more than a million children each year of the intellectual potential they were born with.[12]

Unfortunately, there is widespread misunderstanding (and muddled reporting in the media) about this problem, due in no small part to confusing and contradictory policies set by the federal Centers for Disease Control and Prevention (CDC) and U.S. Environmental Protection Agency (EPA). State governments by and large just go along.

Prior to 1971, doctors only treated children for lead poisoning if they had more than 60 ug/dL.[13] At this level many children died, and those who survived had major permanent brain damage. Permanent damage from lead poisoning was well documented at least as early as 1943, but it wasn't until 1971 that the definition of "elevated" lead in children's blood was reduced to 40 ug/dL. By 1978, it was apparent that children were still being brain-damaged at 40 ug/dL, so the definition of "elevated" was reduced to 30. In 1985, the definition of "elevated" was reduced again, to 25, and in 1991 it was reduced again, to 10 ug/dL.[14]

In 2005, the Centers for Disease Control and Prevention (CDC) reaffirmed its 10 ug/dL "level of concern," using tortured logic. CDC first acknowledged that "there is no 'safe' threshold for blood lead levels." [15, pg. ix] In other words, CDC acknowledges that any amount of lead greater than zero causes some harm. CDC then says, "Although there is evidence of adverse health effects in children with blood lead levels below 10 ug/dL, CDC has not changed its level of concern, which remains at levels equal to or greater than 10 ug/dL.... If no threshold level exists for adverse health effects, setting a new BLL [blood lead level] of concern somewhere below 10 ug/dL would be based on an arbitrary decision," CDC says.[15, pg. ix]

In other words, since any amount of lead in blood greater than zero is harmful to children, then 10 is as good a number as any for defining where the problem begins. It's like saying automobiles are dangerous at any speed above zero, so setting the legal speed limit at 100 mph is as good as any other number.

So this is where it stands today: CDC says children are being harmed at levels below 10, yet CDC retains its official "level of concern" of 10 because picking any number below 10 (except zero) would be arbitrary.[15]

It gets worse: CDC says 10 ug/dL is the "level of concern" but finding 10 ug/dL in a child's blood still does not trigger official attention to that individual child. When a community finds 10 ug/dL in some of its children, it is supposed to take community-wide action to prevent lead exposures -- urging homeowners to wet-mop to reduce household dust, for example. Yes, this will help, but is it an adequate response?

By current CDC guidelines, a child must have 15 ug/dL before the local health department is supposed to initiate "case management," visiting the home, for example, to discuss ways to reduce exposure. If a child has 20 ug/dL or more, then serious intervention may be initiated -- forcing homeowners or landlords to remove sources of lead (such as old paint) from the home, for example.

But here's the worst news: CDC's "level of concern" is widely interpreted as a "safe" level by other government agencies. It was never intended as such. As one lead researcher has written, "Although the CDC's intervention level is not a statement concerning the level of childhood blood lead considered 'safe' or 'acceptable,' it has been interpreted as such by the general public and by federal regulatory agencies."[16] And, we should add, by state agencies as well.

For example, U.S. Environmental Protection Agency (EPA) has never set a "reference dose" for inorganic lead, as it has for several other neurotoxins about which far less information is available. EPA uses CDC's logic: it cannot find a level of exposure to lead that is "likely to be without deleterious effects during a lifetime" of exposure. So it ignores the problem by refusing to set a reference dose.[16]

As you can probably gather from this description, CDC guidelines do not flag 10 ug/dL as a serious threat to children. And that is the way it is understood across America, as a recent scan of newspapers revealed [with my comments inside square brackets]:

** The Wausau (Wisc.) Daily Herald reported May 27, 2007, that in Marathon County, Wisconsin, 1,617 children were tested "with 43 registering levels higher than 10 micrograms per deciliter of blood." [With only 43 out of 1,617 affected, the problem doesn't sound very serious, does it?]

** The Arizona Daily Star reported Feb. 4, 2007 that only 1 percent of children in Pima County have "elevated blood-lead levels." [Only 1 percent? Sounds like the problem has been solved, doesn't it?]

** The Westerly, Rhode Island, Sun reported Feb. 3, 2007 that "In 2005, about two percent of 31,669 children screened in Rhode Island, or 621 children, showed an elevated lead count in their blood..." [Only two percent -- sounds like the problem is small.]

** In Fitchburg, Massachusetts the Sentinel & Enterprise reported Nov. 6, 2006, that childhood lead poisoning has dropped from 8.2 per 1000 children in 1998 to 2.7 per 1000 in 2005 (with "lead poisoning" defined as 20 ug/dL). [Sounds like the problem is small and under control.]

** The Denver Post reported April 29, 2007, "About 38 out of every 100,000 children under the age of 6 tested in Colorado in 2003-04 showed elevated levels of lead." [Only 38 out of 100,000? Sounds like the problem has been solved.]

** The Erie (Pa.) Times-News reported Dec. 3, 2006, "... the U.S. Centers for Disease Control and Prevention estimates that 310,000 children nationwide between the ages of 1 and 5 have blood lead levels of 10 micrograms per deciliter or greater. Ten micrograms per deciliter is the federal threshold for lead poisoning in children that can result in development, learning and behavior problems." [A wonderfully clear statement of the point I'm making -- 10 ug/dL is almost universally reported as a level below which there are no real problems.]

To be fair, several of these news stories quoted one individual or another (often a community activist) saying that levels of lead below 10 can cause problems in children -- but none of the stories mentions the number of children exposed at levels below 10. It's as if levels below 10 don't really matter. All the published numerical estimates are expressed in terms of CDC's official "level of concern" -- and all the published estimates make the problem appear small.

The habit of only reporting 10 ug/dL or more comes directly from CDC itself[17] and from state health departments, many of which measure, but do not publish, data on lead in blood below 10 ug/dL. For example, here is how the New Jersey state health department presented its summary of lead in N.J. children in 2005 (the latest year for which N.J. data are available):

"While 191,788 (97.7%) children tested in New Jersey in FY 2005 had blood lead levels below the Centers for Disease Control and Prevention (CDC) threshold of 10 ug/dL, there were 4,547 (2.3%) children with a blood lead test result above this level."[18, pg. 7]

So in all of New Jersey, only 2.3% of children rise to the level of concern defined by CDC. This is very different from estimating, for example, that about 140,000 kids younger than 5 in New Jersey have lost 4 to 7 IQ points because they have 5 to 10 ug/dL lead in their blood.[19]
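
The arithmetic behind an estimate like that is easy to sketch. In the following, the 550,000 under-5 population is a round figure assumed purely for illustration; the 26% share with 5 to 10 ug/dL is the national figure quoted earlier:

    # Hypothetical reconstruction of a statewide estimate.
    nj_children_under_5 = 550_000    # assumed round figure, for illustration
    share_with_5_to_10_ug_dl = 0.26  # national figure from the text
    print(nj_children_under_5 * share_with_5_to_10_ug_dl)  # 143000.0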

Numerical data on how many children have lead levels below 10 ug/dL seem to be a closely guarded secret. A review of dozens of published reports on lead in children's blood since 1985 uncovered only one report that estimated the proportion of children in the U.S. with 5 to 10 ug/dL.[11] The federal government and many state governments collect this data -- but none of them publish it. They focus instead on the small number of children with more than 10 ug/dL, continuing the illusion that 10 or more is the only amount that matters.

How could a small amount like 5 ug/dL harm anyone?

How could such a small amount of lead -- 5 ug in each deciLiter of blood -- cause brain damage? One way to understand such a question is to ask about the environment in which our species, Homo sapiens, evolved. How much lead are humans accustomed to?

From modern studies, scientists know the relationship of lead in blood to lead in bones. So in 1992, a group of scientists measured lead in the bones of pre-industrial humans, for the purpose of estimating "natural background" (pre-industrial) levels of lead in blood. They concluded that the natural background level of lead in human blood is 0.016 ug/dL -- so 5 ug/dL represents a level 300 times as high as natural background.[20]
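
That ratio can be verified directly from the two numbers just given:

    # Today's 5 ug/dL versus the estimated pre-industrial background
    # of 0.016 ug/dL.
    print(5 / 0.016)  # 312.5 -- roughly a 300-fold increase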

A 300-fold increase in a potent nerve poison seems certain to take its toll on humans so exposed, especially if they are exposed during early childhood, when their brains are developing rapidly.

Brain damage is not the only harm caused by lead at levels below 10 ug/dL. In 2004, CDC asked a panel of experts to evaluate and summarize the current scientific literature on adverse health effects associated with blood lead levels less than 10 ug/dL. [See the Appendix in footnote 15.]

They found that intellectual impairment -- brain damage -- was number one, but they also found:

** Reduced height and head circumference as blood lead levels rise above 1 ug/dL.

** Delayed sexual maturation. Two studies observed late puberty in girls with blood lead levels in the range of 2 to 5 ug/dL. This seems to indicate that lead is interfering with the endocrine (hormone) system.

** Dental caries (popularly known as "cavities" in teeth) were more likely to develop as a child's blood lead level rose from 1 to 3 ug/dL.

And a study too recent to have been included in the Appendix has shown that a child is 4 times as likely to have attention deficit hyperactivity disorder (ADHD) when blood lead levels reach 2 ug/dL or greater, compared to children with lead at 1 ug/dL.[21] In the U.S., an estimated 4.4 million children have been diagnosed with ADHD.[22]

So the problem is large -- but the government and the media together have managed to make it appear small. Yes, we have made progress in curbing the very substantial harm done to ourselves and our children by the paint and gasoline corporations during the 20th century. But we've still got a long way to go.

To make any real progress, government agencies need to stop pretending that this problem has been solved. Publishing all the available data on lead in children's blood would be a good start. Yes, parents would find it disturbing and there might be an uproar. That's as it should be.

From: New York Times, Jul. 25, 2007

FDA SAYS NO NEW LABELING FOR NANOTECH PRODUCTS

By Reuters

CHICAGO (Reuters) -- The Food and Drug Administration said on Wednesday that cosmetics, drugs and other products made using nanotechnology, a fast-growing category, do not require special regulations or labeling.

The recommendations come as the agency looks at the oversight of products that employ the design and use of particles as small as one-billionth of a meter. Consumer groups and others fear that these tiny particles behave unpredictably, could be toxic, and may therefore have unforeseen health impacts.

A task force within the FDA concluded that although nano-sized materials may have completely different properties than their bigger counterparts, there is no evidence that they pose any major safety risks at this time.

"We believe we do not have scientific evidence about nano-sized materials posing safety questions that merit being mentioned on the label," said Dr. Randall Lutter, FDA's associate commissioner for policy and planning, during a briefing with reporters.

At least 300 consumer products, including sunscreen, toothpaste and shampoo, are now made using nanotechnology, according to a report from the Woodrow Wilson International Center for Scholars.

The technology is also being used in medicine, where scientists are developing tiny sensors that detect disease markers in the body, and in the food industry, which is using it to extend shelf life in food packaging.

The FDA now treats products made with nanotechnology the same way it handles all products -- requiring companies to prove safety and efficacy before their product can come to market.

But some product categories, such as cosmetics, foods and dietary supplements are not subject to FDA oversight before they are sold, which already worries some advocates. Producing them with nanotechnology adds another layer of concern.

The International Center for Technology Assessment, a nonprofit policy group that is suing the FDA to demand more oversight of the technology, said the recommendations lack teeth.

"Nano means more than just tiny. It means these materials can be fundamentally different, exhibiting chemical and physical properties that are drastically different," said George Kimbrell, staff attorney at the group. "The consumer is being made the guinea pig."

The group cites studies showing that certain types of the particles can cause inflammatory and immune-system responses in animals as examples of possible dangers.

The FDA said it will soon issue guidance documents for industries using nanotechnology, which include pharmaceutical companies, medical device makers and consumer products firms.

Lutter said the task force concluded that nanotechnology is not substantially different from earlier emerging technologies such as biotechnology or irradiation.

Copyright 2007 Reuters Ltd.

A VERY, VERY SMALL OPPORTUNITY

How science and society can avoid a collision over nanotechnology

By David Rejeski

Over the last few decades, scientists have developed tools that allow them to see and manipulate matter at an atomic scale, down to a nanometer (that's around one eighty-thousandth the width of a human hair). Nano is an invisible technology with big impacts that almost nobody is talking about; bring manufacturing down to a nanoscale and you have the makings of the next industrial revolution.
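
As a quick sense of scale, assuming a typical hair width of about 80 micrometers (a figure consistent with the fraction quoted above):

    # One nanometer compared with an ~80-micrometer human hair.
    hair_width_m = 80e-6
    nanometer_m = 1e-9
    print(hair_width_m / nanometer_m)  # about 80,000 -- "one eighty-thousandth"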

Government and industry are betting that nanotechnology will allow us to create new properties from old matter, making materials stronger and lighter, for instance, and even create whole new forms of matter. If you talk with investors, they will tell you that nano is the next big thing.

By 2014, nanotechnology is expected to account for over $1.4 trillion of global economic production. Like most technological revolutions, this one will have some downsides. Animal studies have shown that nanoparticles can enter the bloodstream, cross the blood-brain barrier, and damage tissue and DNA -- reasons for concern, and for more research.

Given the size of the global investment, possible risks, and what's at stake for our lives, our economy, and the environment, you might ask: "Shouldn't we be having a conversation about this technology?" Yet recent surveys have shown that 70 to 80 percent of Americans have heard "nothing" or "very little" about nanotech, despite its potentially transformative effects on medicine, agriculture, computation, defense, and energy production.

This is nothing new. When was the last time the government asked you how to spend your taxes on science? That didn't happen with nuclear power, genetics, or agricultural biotechnology. For people who lived through the biotech revolution, nanotech is a flashback: the collision of rapidly advancing technology with lagging public understanding, which could scuttle billions of dollars in public and private sector investment in nanotech and jeopardize some real breakthroughs, like better treatments for cancer and far cheaper solar energy.

It doesn't have to be this way. In nanotechnology we find an unprecedented opportunity to do things differently, to develop a social contract between the public and the scientific community that is built on openness and trust. And that begins with a conversation.

For the past two years, a number of surveys and focus groups have been conducted around public attitudes toward nanotechnology. When given some balanced background material on nanotechnology and its potential benefits and risks, few people in the U.S. want to shut down scientific progress. But most do not trust industry to self-regulate. They want effective oversight, more disclosure and transparency, premarket testing, and testing done by independent, third parties -- all rational expectations for a new science with some inherent risks. These are expectations that could form the foundation for a new social contract between society and science that helps define mechanisms for oversight, industry disclosure, better risk research, and public consultation.

Movement in this direction is starting at a community level. Berkeley, California, recently passed the world's first nanotechnology ordinance, requiring nanotech firms within city limits to detail what they are producing and what they know about its risks; Cambridge, Massachusetts, may do the same. NGOs are asking valid questions about the risks and benefits of nanotechnology, and media coverage is finally expanding beyond the science journals. If we are on the cusp of the next industrial revolution, we need a public conversation about our goals. Nano may be the small technology that creates that large opportunity.

David Rejeski is the director of the Project on Emerging Nanotechnologies at the Woodrow Wilson International Center for Scholars.

From: Environmental Science & Technology Online News, Jul. 25, 2007

DIOXINS LINKED WITH BEHAVIORAL DISORDERS

By Robert Weinhold

Two clinically significant behavioral disorders, namely learning disabilities and attention deficit disorder, have been linked with low or average blood serum concentrations of two dioxins and one furan.

Researchers say these are the first indicators of a connection between such levels of persistent organic pollutants and diagnosed behavioral problems in children in the general population. Previous work has shown a correlation between these chemicals and reductions in cognitive function indicators. The findings, by Duk-Hee Lee with Kyungpook National University School of Medicine (South Korea) and colleagues from Spain and the U.S., were published in the Journal of Epidemiology and Community Health (2007, 61, 591-596).

The team discovered the link after reviewing 1999-2000 data for seven polychlorinated compounds as well as lead and cadmium from the U.S. Centers for Disease Control and Prevention's (CDC's) National Health and Nutrition Examination Survey. In 278 children aged 12-15, those who had detectable concentrations of three of the polychlorinated compounds were about 2-3 times as likely as those without detectable concentrations to report that they had been diagnosed with a learning disability. The researchers also found that exposure to two of those three compounds was linked with reports of a diagnosis of attention deficit disorder. The affected children tended to be white and to have mothers who were younger and smoked during pregnancy.

The tested concentrations of the three implicated compounds -- 1,2,3,4,6,7,8-heptachlorodibenzo-p-dioxin (HPCDD); 1,2,3,4,6,7,8,9-octachlorodibenzo-p-dioxin (OCDD); and 1,2,3,4,6,7,8-heptachlorodibenzofuran (HPCDF) -- were in the middle or lower end of the concentration ranges in the CDC's Third National Report on Human Exposure to Environmental Chemicals.

These compounds usually are generated by certain chlorination, manufacturing, or incineration processes. Human exposures largely occur via breast milk or contaminated meat, milk, eggs, or fish.

The researchers acknowledge that limitations of the study preclude firm conclusions about the cause-effect relationships of these substances and behavioral disorders. However, they say that their research -- including the discovery that these results would not have been predicted by using accepted toxic equivalency factors -- adds to the growing knowledge and uncertainties about the neurotoxic effects of dioxins and furans.

POLLUTION-CHOLESTEROL LINK TO HEART DISEASE SEEN

The combination activates genes that can cause clogged arteries, UCLA researchers say.

By Marla Cone

Strengthening the link between air pollution and cardiovascular disease, new research suggests that people with high cholesterol are especially vulnerable to heart disease when they are exposed to diesel exhaust and other ultra-fine particles that are common pollutants in urban air.

Microscopic particles in diesel exhaust combine with cholesterol to activate genes that trigger hardening of the arteries, according to a study by UCLA scientists to be published today.

"Their combination creates a dangerous synergy that wreaks cardiovascular havoc far beyond what's caused by the diesel or cholesterol alone," said Dr. Andre Nel, chief of nanomedicine at the David Geffen School of Medicine at UCLA and a researcher at UCLA's California NanoSystems Institute. He led a team of 10 scientists who conducted the study, published in an online version of the journal Genome Biology.

Although diet, smoking and other factors contribute to the risk of cardiovascular disease -- the leading cause of death in the Western world -- scientists have long believed that air pollution, particularly tiny pieces of soot from trucks and factories, plays a major role, too.

For years, scientists around the world have reported that on days when fine-particle pollution increases, deaths from lung diseases, heart attacks and strokes rise substantially. Riverside County and the San Gabriel Valley have among the worst fine-particle pollution in the nation.

The scientists say their study, conducted on human cells as well as on mice, is the first to explain how particulates in the air activate genes that can cause heart attacks or strokes.

The researchers exposed human blood cells to a combination of diesel particles and oxidized fats, then extracted their DNA. Working together, the particles and fats switched on genes that cause inflammation of blood vessels, which leads to clogged arteries, or atherosclerosis.

The team then duplicated the findings in living animals by exposing mice to a high-fat diet and freeway exhaust in downtown Los Angeles. The same artery-clogging gene groups were activated in the mice.

The scientists reported that diesel particles may enter the body's circulatory system from the lungs, and then react with fats in the arteries to alter how genes are activated, triggering inflammation that causes heart disease. Other research has shown similar inflammatory damage in lungs exposed to fine particles. Diesel exhaust has also been linked to lung cancer, asthma attacks and DNA damage.

"Our results emphasize the importance of controlling air pollution as another tool for preventing cardiovascular disease," said Ke Wei Gong, a UCLA cardiology researcher who was one of the study's authors.

In many urban areas, including the Los Angeles region, ultra-fine particles are the most concentrated near freeways, mostly from diesel exhaust, which is spewed by trucks, buses, off-road vehicles and other vehicle engines.

For decades, California and local air-quality regulators have been ratcheting down particulate emissions from trucks and other sources, but the airborne levels in most of the Los Angeles region still frequently exceed federal health standards.

"There are a few hot spots throughout the country that compete with Los Angeles from time to time, but in general, we tend to have the highest levels here," Nel said.

Exposed in a mobile laboratory moving down the freeway, the mice breathed fine particles at a concentration of 362 micrograms per cubic meter of air. That was five times the peak level that people in the San Gabriel Valley were exposed to last year.

However, humans breathe polluted air every day for decades, whereas the mice in the study were exposed five hours per day, three days per week, for eight weeks.

"The levels were high, but they came from real freeway exhaust so they were not artificially high," Nel said. "It was almost within the realm of what we are exposed to."

Diesel particles contain free radicals, which damage tissues, and so do the fatty acids in cholesterol.

The study aimed to find out what happened when these two sources of oxidation came in contact.

More than 1,500 genes were turned on, and 759 were turned off, when diesel particles were combined with the fats. In cells exposed to just the cholesterol or just the diesel, the effects on the genes were much less pronounced.

"Now that we see this genetic footprint, we have a better understanding of how the injury occurs due to air pollution particles," Nel said.

The UCLA scientists hope to translate the gene changes into a biomarker, which experts can then use to predict which people are most susceptible to heart disease from air pollution.

The smaller the particle, the more harm it can cause. More artery-clogging genes were activated in mice exposed to the ultra-fine particles in diesel exhaust than in those exposed to larger particles in the air. Smaller particles generally come from sources of combustion -- mostly vehicles.

EDITORIAL -- A WARMING WORLD: NO TO NUKES

It's tempting to turn to nuclear plants to combat climate change, but alternatives are safer and cheaper.

Japan sees nuclear power as a solution to global warming, but it's paying a price. Last week, a magnitude 6.8 earthquake caused dozens of problems at the world's biggest nuclear plant, leading to releases of radioactive elements into the air and ocean and an indefinite shutdown. Government and company officials initially downplayed the incident and stuck to the official line that the country's nuclear plants are earthquake-proof, but they gave way in the face of overwhelming evidence to the contrary. Japan has a sordid history of serious nuclear accidents or spills followed by cover-ups.

It isn't alone. The U.S. government allows nuclear plants to operate under a level of secrecy usually reserved for the national security apparatus. Last year, for example, about nine gallons of highly enriched uranium spilled at a processing plant in Tennessee, forming a puddle a few feet from an elevator shaft. Had it dripped into the shaft, it might have formed a critical mass sufficient for a chain reaction, releasing enough radiation to kill or burn workers nearby. A report on the accident from the Nuclear Regulatory Commission was hidden from the public, and only came to light because one of the commissioners wrote a memo on it that became part of the public record.

The dream that nuclear power would turn atomic fission into a force for good rather than destruction unraveled with the Three Mile Island disaster in 1979 and the Chernobyl meltdown in 1986. No U.S. utility has ordered a new nuclear plant since 1978 (that order was later canceled), and until recently it seemed none ever would. But rising natural gas prices and worries about global warming have put the nuclear industry back on track. Many respected academics and environmentalists argue that nuclear power must be part of any solution to climate change because nuclear power plants don't release greenhouse gases.

They make a weak case. The enormous cost of building nuclear plants, the reluctance of investors to fund them, community opposition and an endless controversy over what to do with the waste ensure that ramping up the nuclear infrastructure will be a slow process -- far too slow to make a difference on global warming. That's just as well, because nuclear power is extremely risky. What's more, there are cleaner, cheaper, faster alternatives that come with none of the risks.

Glowing pains

Modern nuclear plants are much safer than the Soviet-era monstrosity at Chernobyl. But accidents can and frequently do happen. The Union of Concerned Scientists cites 51 cases at 41 U.S. nuclear plants in which reactors have been shut down for more than a year as evidence of serious and widespread safety problems.

Nuclear plants are also considered attractive terrorist targets, though that risk too has been reduced. Provisions in the 2005 energy bill required threat assessments at nuclear plants and background checks on workers. What hasn't improved much is the risk of spills or even meltdowns in the event of natural disasters such as earthquakes, making it mystifying why anyone would consider building reactors in seismically unstable places like Japan (or California, which has two, one at San Onofre and the other in Morro Bay).

Weapons proliferation is an even more serious concern. The uranium used in nuclear reactors isn't concentrated enough for anything but a dirty bomb, but the same labs that enrich uranium for nuclear fuel can be used to create weapons-grade uranium. Thus any country, such as Iran, that pursues uranium enrichment for nuclear power might also be building a bomb factory. It would be more than a little hypocritical for the U.S. to expand its own nuclear power capacity while forbidding countries it doesn't like from doing the same.

The risks increase when spent fuel is recycled. Five countries reprocess their spent nuclear fuel, and the Bush administration is pushing strongly to do the same in the U.S. Reprocessing involves separating plutonium from other materials to create new fuel. Plutonium is an excellent bomb material, and it's much easier to steal than enriched uranium. Spent fuel is so radioactive that it would burn a prospective thief to death, while plutonium could be carried out of a processing center in one's pocket. In Japan, 200 kilograms of plutonium from a waste recycling plant have gone missing; in Britain, 30 kilograms can't be accounted for. These have been officially dismissed as clerical errors, but the nuclear industry has never been noted for its truthfulness or transparency. The bomb dropped on Nagasaki contained six kilograms.

Technology might be able to solve the recycling problem, but the question of what to do with the waste defies answers. Even the recycling process leaves behind highly radioactive waste that has to be disposed of. This isn't a temporary issue: Nuclear waste remains hazardous for tens of thousands of years. The only way to get rid of it is to put it in containers and bury it deep underground -- and pray that geological shifts or excavations by future generations that have forgotten where it's buried don't unleash it on the surface.

No country in the world has yet built a permanent underground waste repository, though Finland has come the closest. In the U.S., Congress has been struggling for decades to build a dump at Yucca Mountain in Nevada but has been unable to overcome fierce local opposition. One can hardly blame the Nevadans. Not many people would want 70,000 metric tons of nuclear waste buried in their neighborhood or transported through it on the way to the dump.

The result is that nuclear waste is stored on-site at the power plants, increasing the risk of leaks and the danger to plant workers. Eventually, we'll run out of space for it.

Goin' fission?

Given the drawbacks, it's surprising that anybody would seriously consider a nuclear renaissance. But interest is surging; the NRC expects applications for up to 28 new reactors in the next two years. Even California, which has a 31-year-old ban on construction of nuclear plants, is looking into it. Last month, the state Energy Commission held a hearing on nuclear power, and a group of Fresno businessmen plans a ballot measure to assess voter interest in rescinding the state's ban.

Behind all this is a perception that nuclear power is needed to help fight climate change. But there's little chance that nuclear plants could be built quickly enough to make much difference. The existing 104 nuclear plants in the U.S., which supply roughly 20% of the nation's electricity, are old and nearing the end of their useful lives. Just to replace them would require building a new reactor every four or five months for the next 40 years. To significantly increase the nation's nuclear capacity would require far more.
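
The replacement-rate arithmetic is easy to check against the figures given:

    # Replacing 104 reactors over 40 years.
    months = 40 * 12
    print(months / 104)  # about 4.6 months between new reactors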

The average nuclear plant is estimated to cost about $4 billion. Because of the risks involved, there is scarce interest among investors in putting up the needed capital. Nor have tax incentives and subsidies been enough to lure them. In part, that's because the regulatory process for new plants is glacially slow. The newest nuclear plant in the U.S. opened in 1996, after having been ordered in 1970 -- a 26-year gap. Though a carbon tax or carbon trading might someday make the economics of nuclear power more attractive, and the NRC has taken steps to speed its assessments, community opposition remains high, and it could still take more than a decade to get a plant built.

Meanwhile, a 2006 study by the Institute for Energy and Environmental Research found that for nuclear power to play a meaningful role in cutting greenhouse gas emissions, the world would need to build a new plant every one to two weeks until mid-century. Even if that were feasible, it would overwhelm the handful of companies that make specialized parts for nuclear plants, sending costs through the roof.

The accelerating threat of global warming requires innovation and may demand risk-taking, but there are better options than nuclear power. A combination of energy-efficiency measures, renewable power like wind and solar, and decentralized power generators is already producing more energy worldwide than nuclear power plants. Their use is expanding more quickly, and the decentralized approach they represent is more attractive on several levels. One fast-growing technology allows commercial buildings or complexes, such as schools, hospitals, hotels or offices, to generate their own electricity and hot water with micro-turbines fueled by natural gas or even biofuel, much more efficiently than utilities can do it and with far lower emissions.

The potential for wind power alone is nearly limitless and, according to a May report by research firm Standard & Poor's, it's cheaper to produce than nuclear power. Further, the amount of electricity that could be generated simply by making existing non-nuclear power plants more efficient is staggering. On average, coal plants operate at 30% efficiency worldwide, but newer plants operate at 46%. If the world average could be raised to 42%, it would save the same amount of carbon as building 800 nuclear plants.
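
The per-kilowatt-hour part of that claim follows from the fact that fuel burned, and hence carbon emitted, scales inversely with a plant's thermal efficiency. Here is a minimal sketch; the 800-plant equivalence additionally depends on total world coal generation, which the article does not give, so it is not reproduced here:

    # Carbon per kWh scales as 1/efficiency for the same coal.
    old_eff, new_eff = 0.30, 0.42
    savings = 1 - old_eff / new_eff
    print(f"{savings:.1%}")  # 28.6% less carbon per kWh of coal power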

Nevertheless, the U.S. government spends more on nuclear power than it does on renewables and efficiency. Taxpayer subsidies to the nuclear industry amounted to $9 billion in 2006, according to Doug Koplow, a researcher based in Cambridge, Mass., whose Earth Track consultancy monitors energy spending. Renewable power sources, including hydropower but not ethanol, got $6 billion, and $2 billion went toward conservation.

That's out of whack. Some countries -- notably France, which gets nearly 80% of its power from nuclear plants and has never had a major accident -- have made nuclear energy work, but at a high cost. The state-owned French power monopoly is severely indebted, and although France recycles its waste, it is no closer than the U.S. to approving a permanent repository. Tax dollars are better spent on windmills than on cooling towers.

BETTER HEALTH THROUGH FAIRER WEALTH

By Brydie Ragan

Research now tells us that lower socio-economic status may be more harmful to health than risky personal habits...

I recently saw a billboard for an employment service that said, "If you think cigarette smoking is bad for your health, try a dead-end job." This warning may not just be an advertising quip: public health research now tells us that lower socio-economic status may be more harmful to health than risky personal habits, such as smoking or eating junk food.

In 1967, British epidemiologist Michael Marmot began to study the relationship between poverty and health. He showed that each step up or down the socio-economic ladder correlates with increasing or decreasing health.

Over time, research linking health and wealth became more nuanced. It turns out that "what matters in determining mortality and health in a society is less the overall wealth of that society and more how evenly wealth is distributed. The more equally wealth is distributed, the better the health of that society," according to the editors of the April 20, 1996 issue of the British Medical Journal. In that issue, American epidemiologist George Kaplan and his colleagues showed that the disparity of income in each of the individual U.S. states, rather than the average income per state, predicted the death rate.

"The People's Epidemiologists," an article in the March/April 2006 issue of Harvard Magazine, takes the analysis a step further. Fundamental social forces such as "poverty, discrimination, stressful jobs, marketing-driven global food companies, substandard housing, dangerous neighborhoods and so on" actually cause individuals to become ill, according to the studies cited in the article. Nancy Krieger, the epidemiologist featured in the article, has shown that poverty and other social determinants are as formidable as hostile microbes or personal habits when it comes to making us sick. This may seem obvious, but it is a revolutionary idea: the public generally believes that poor lifestyle choices, faulty genes, infectious agents, and poisons are the major factors that give rise to illness.

Krieger is one of many prominent researchers making connections between health and inequality. Michael Marmot recently explained in his book, The Status Syndrome, that the experience of inequality impacts health, making the perception of our place in the social hierarchy an important factor. According to Harvard's Ichiro Kawachi, the distribution of wealth in the United States has become an "important public health problem." The claims of Kawachi and his colleagues move public health firmly into the political arena, where some people don't think it belongs. But the links between socio-economic status and health are so compelling that public health researchers are beginning to suggest economic and political remedies.

Richard Wilkinson, an epidemiologist at the University of Nottingham, points out that we are not fated to live in stressful dominance hierarchies that make us sick -- we can choose to create more egalitarian societies. In his book, The Impact of Inequality, Wilkinson suggests that employee ownership may provide a path toward greater equality and consequently better health. The University of Washington's Stephen Bezruchka, another leading researcher on status and health, also reminds us that we can choose. He encourages us to participate in our democracy to effect change. In a 2003 lecture he said that "working together and organizing is our hope."

It is always true that we have choices, but some conditions embolden us to create the future while others invite powerlessness. When it comes to health care these days, Americans are reluctant to act because we are full of fear. We are afraid: afraid because we have no health care insurance, afraid of losing our health care insurance if we have it, or afraid that the insurance we have will not cover our health care expenses. But in the shadow of those fears is an even greater fear -- the fear of poverty -- which can either cause or be caused by illness.

In the United States we have all the resources we need to create a new picture: an abundance of talent, ideas, intelligence, and material wealth. We can decide to create a society that not only includes guaranteed health care but also replaces our crushing climate of fear with a creative culture of care. As Wilkinson and Bezruchka suggest, we can choose to work for better health by working for greater equality.

Brydie Ragan is an indefatigable advocate for guaranteed health care. She travels nationwide to present "Share the Health," a program that inspires Americans to envision health care for everyone.


COMMENTARY -- ADDING FLUORIDE TO DRINKING WATER: A GOOD IDEA?

By Ted Schettler MD, MPH

[Dr. Ted Schettler is science director of the Science and Environmental Health Network. He has co-authored two books on children's health, In Harm's Way, and Generations at Risk, as well as numerous articles.]

Seeking to prevent tooth decay, many U.S. communities add fluoride to public drinking water, usually in the form of hydrofluorosilicic acid, which is a waste product of the phosphate fertilizer industry.

From the beginning, the practice was controversial, but the Centers for Disease Control and Prevention (CDC) and American Dental Association (ADA) have vigorously supported it. The CDC claims that fluoridating public drinking water is one of the ten great public health achievements of the 20th century, giving it primary credit for the decline in tooth decay in the U.S. Despite their enthusiasm, abundant evidence raises serious concerns about the safety and efficacy of adding fluoride to drinking water today.

Since 1945, when the public health intervention began, much has changed with regard to dental health. Several trends are worth mentioning:

* Tooth decay has markedly declined in countries and communities that do not fluoridate drinking water as well as in those that do. Dramatic increases in the use of topically-applied fluoride-containing oral hygiene products are likely to have played a role, along with other changes.

* Today people are exposed to fluoride from bottled drinks, toothpaste, fluoride drops and treatments, pesticides, pharmaceuticals, and industrial discharges. As a result, dental fluorosis, a condition entirely attributable to excessive fluoride intake, is increasing in a substantial portion of the U.S. population.[1]

* A somewhat surprising trend that may increase risks associated with fluoride ingestion involves dietary iodine. In recent years, inadequate iodine intake has become common in the U.S. According to the CDC, the average urinary iodine level today is half what it was in 1971.[2] The agency estimates that 36% of U.S. women now have sub-optimal iodine intake. Adequate dietary iodine is essential for producing normal amounts of thyroid hormone. Excessive dietary fluoride can also lower thyroid hormone production. Excess fluoride and inadequate iodine intake combined increase risks of hypothyroidism.

Much research addresses the potential benefits and adverse impacts of fluoride ingestion. Yet, many data gaps remain. We know that:

* Tooth decay is an infectious process and its origins are multifactorial. General dietary practices, nutrition, oral hygiene, socioeconomic status, and access to dental care play direct and indirect roles. The relative contribution of each depends on the context.

* To the extent that fluoride helps to prevent tooth decay or slow its progression, the predominant advantage is from topical application rather than through ingestion.[3] Topical application includes fluoride in toothpaste, drops, mouth rinses, and fluoride treatments in a dental office, as well as from drinking fluoride-containing beverages.

* There is little disagreement that ingested fluoride has adverse effects as exposures increase beyond some amount.[4] The question is, at what level of exposure do adverse effects begin and when do they begin to outweigh any potential benefits?

* Individuals drinking water with "optimal" fluoride[5] have, on average, less than one fewer missing, decayed, or filled tooth surface than individuals whose drinking water does not have added fluoride.[6] With respect to prevention of tooth decay, therefore, the benefits of fluoride in drinking water are relatively minor. That is not to say that tooth decay has not declined during the last 50 years (it has), or that fluoride has not contributed (it has, but primarily through topical application from many sources), but rather that putting fluoride in drinking water today plays a relatively minor role when compared to other variables.

* Excessive fluoride ingestion from all sources causes dental fluorosis. This is not "just" a cosmetic effect. Dental fluorosis interferes with the integrity of tooth enamel. Many experts conclude that moderate and severe fluorosis can increase the risk of tooth decay. Severe dental fluorosis rises sharply when drinking water levels of fluoride exceed 2 ppm [parts per million].

Depending on the level of exposure, a number of adverse health effects may be linked to fluoride ingestion. In humans, they include bone cancer, bone fracture, skeletal fluorosis, arthritis, impaired thyroid hormone status, impaired neurodevelopment of children, and calcification of the pineal gland. Data are often inconsistent and important information gaps remain. In general, the threshold exposure level at which the risks of various health effects significantly increase is not well understood.

In 2006, an expert committee convened by the National Academy of Sciences issued a report reviewing the appropriateness of the U.S. Environmental Protection Agency's current maximum contaminant level for fluoride in drinking water. The NAS committee concluded:

1) "under certain conditions fluoride can weaken bone and increase the risk of fracture;"

2) "high concentrations of fluoride exposure might be associated with alterations in reproductive hormones, effects on fertility, and developmental outcomes, but [study] design limitations make those studies insufficient for risk evaluation,"

3) "the consistency of results [in a few epidemiologic studies in China] appears significant enough to warrant additional research on the effects of fluoride on intelligence"

4) "the chief endocrine effects of fluoride exposures in experimental animals and in humans include decreased thyroid function, increased calcitonin activity, increased parathyroid hormone activity, secondary hyperparathyroidism, impaired glucose tolerance, and possible effects on timing of sexual maturity. Some of these effects are associated with fluoride intake that is achievable at fluoride concentrations in drinking water of 4 mg/L [milligrams per liter] or less, especially for young children or for individuals with high water intake."

5) "the evidence on the potential of fluoride to initiate or promote cancers, particularly of the bone, is tentative and mixed. Assessing whether fluoride constitutes a risk factor for osteosarcoma is complicated by the rarity of the disease and the difficulty of characterizing biologic dose because of the ubiquity of population exposure to fluoride and the difficulty of acquiring bone samples in non-affected individuals." The committee said that a soon-to-be published study "will be an important addition to the fluoride database, because it will have exposure information on residence histories, water consumption, and assays of bone and toenails. The results of that study should help to identify what future research will be most useful in elucidating fluoride's carcinogenic potential."

That study has now been published. It reports a significant association between exposure to fluoride in drinking water in childhood and the incidence of osteosarcoma among males.[7]

Risks are not limited to humans. Fluoride added to drinking water ultimately ends up in surface water where levels can be high enough to threaten survival and reproduction of aquatic organisms, particularly near the point of discharge.[8]

One health endpoint, the potential impact of fluoride on brain development, illustrates the importance of considering the context of public health interventions:

* We know that adequate thyroid hormone levels are essential during pregnancy (fetal requirement), infancy, and childhood for normal brain development. Even relatively minor deficits in maternal thyroid hormone levels during pregnancy can have long lasting impacts on the function of children's brains.[9]

* Excessive fluoride ingestion lowers thyroid hormone levels.[10] The threshold at which that effect becomes biologically or clinically important is uncertain. But we know that it happens in areas with high naturally-occurring fluoride in drinking water, and it may also be true in areas with fluoride in drinking water in the range of 1-2 ppm, particularly when iodine intake is inadequate.

* Several studies of children in Chinese communities with fluoride drinking water levels of 2.5-4 ppm consistently show significantly lower IQ levels compared to children in communities with minimal fluoride in drinking water.[11] These studies were controlled for other contributory factors.

* Based on biomonitoring studies, the CDC estimates that 36% of women in the U.S. have inadequate iodine intake. Moreover, approximately 6-7% of women (the prevalence increases as women age) have sub-clinical hypothyroidism. Sub-clinical hypothyroidism is characterized by elevated thyroid stimulating hormone (TSH) and normal thyroxine (the thyroid hormone T4). Without blood tests, sub-clinical hypothyroidism usually goes unrecognized because it does not cause symptoms. Sub-clinical hypothyroidism during pregnancy is associated with decreased IQ in children when measured years later.[12]

* Biomonitoring studies conducted by the CDC (NHANES) and other institutions show virtually ubiquitous human exposure to other environmental contaminants that also interfere with thyroid hormone levels or function. They include polychlorinated biphenyls (PCBs), brominated flame retardants, perfluorinated compounds, and perchlorate (a common drinking water and food contaminant from rocket fuel, explosives, and imported nitrate fertilizer). In 2006 CDC scientists reported that ANY amount of perchlorate exposure significantly lowered thyroid hormone levels in women with inadequate iodine intake.[13]

* Few, if any, communities choosing to add fluoride to drinking water are likely to have looked into the iodine status of local residents as well as aggregate exposures to thyroid disrupting compounds, including fluoride, from all sources combined. Yet, collectively, these factors are undeniably relevant to brain development of children born in those communities. Regrettably, the CDC's discussion of the safety of fluoride in drinking water does not even mention potential impacts on the developing brain.[14]

With respect to current and historical perspectives, the NAS committee noted that, on average, fluoride exposure from drinking water in fluoridated communities is near or exceeds the level that raises health concerns.[15] That is, virtually no "margin of safety" exists between levels of fluoride intended to be beneficial and those that may be harmful. This is in sharp distinction from the margin of safety when essential nutrients such as iodine, vitamin D, or vitamin C are added to food. In those cases, maximum potential intake is orders of magnitude lower than exposures that may have toxic effects.

Population-wide monitoring of fluoride exposures in the U.S. is surprisingly inadequate. This is particularly disturbing because the CDC, despite vigorously recommending putting fluoride into drinking water, has failed to monitor systematically the levels of fluoride in the population -- despite steadily increasing sources of fluoride, increasing dental fluorosis, and the CDC's well-known and highly useful population-wide monitoring program (NHANES) for a number of other environmental agents. Why not fluoride? The NAS review said, "Fluoride should be included in nationwide biomonitoring surveys and nutritional studies... In particular, analysis of fluoride in blood and urine samples taken in these surveys would be valuable."

Conclusions:

Because of

a) uncertainties surrounding fluoride exposure levels from all sources,[16]

b) concurrent exposures to other environmental agents that interact with fluoride or add to the impacts of fluoride,

c) doubts about the efficacy and benefits of adding fluoride to drinking water compared with alternative interventions, and

d) potential adverse health effects at current and anticipated exposure levels,

** intentionally fluoridating community drinking water is no longer justified. Adding fluoride to drinking water for the purpose of preventing tooth decay provides virtually no population-wide margin of safety. Under current circumstances, people should not, in effect, be forced to drink fluoridated water when dental benefits can be achieved through topical application and other means.

** An immediate moratorium on the practice of adding fluoride to community drinking water is justified. Risks, benefits, efficacy, and alternatives must be fully, impartially, and transparently re-evaluated, based on current information and data gaps. Moreover, an ethical review of the practice is warranted.

Public health interventions can take many directions. Few, however, are as intrusive as intentionally putting a biologically active chemical into drinking water. Everyone in the community, without exception, is exposed without any opportunity to "opt out" based on individual circumstances. Promoters of this kind of intervention, therefore, have a special responsibility and should at least:

1) Regularly, comprehensively, and transparently re-evaluate benefits and risks of the intervention, based on current science and available alternatives,

2) Regularly monitor and disclose exposure levels in current contexts/circumstances (in humans and wildlife),

3) Ensure an adequate margin of safety, including for the most vulnerable, and

4) Consider the ethical dimensions of intentionally adding a biologically active chemical to public drinking water.

In 2006, the American Dental Association issued interim guidance advising parents not to reconstitute infant formula with fluoridated water because of the risk of causing dental fluorosis. In general, however, public health agencies and professional associations that advocate putting fluoride into drinking water have failed to provide up-to-date, regular, comprehensive, and transparent re-evaluations of the benefits and risks of fluoridation, based on the most current science and available alternatives. They have not systematically monitored fluoride levels in people and wildlife, adjusting recommendations according to their findings. Rather, they have continued to stress, and often exaggerate, the benefits of ingested fluoride while downplaying the risks. Hopefully, the NAS review will prompt an impartial re-evaluation of the justification, safety, and appropriateness of this 50-year-old practice.

THE DURBAN DECLARATION ON CARBON TRADING

As representatives of people's movements and independent organisations, we reject the claim that carbon trading will halt the climate crisis. This crisis has been caused more than anything else by the mining of fossil fuels and the release of their carbon to the oceans, air, soil and living things.

This excessive burning of fossil fuels is now jeopardising Earth's ability to maintain a liveable climate.

Governments, export credit agencies, corporations and international financial institutions continue to support and finance fossil fuel exploration, extraction and other activities that worsen global warming, such as forest degradation and destruction on a massive scale, while dedicating only token sums to renewable energy. It is particularly disturbing that the World Bank has recently defied the recommendation of its own Extractive Industries Review which calls for the phasing out of World Bank financing for coal, oil and gas extraction.

We denounce the further delays in ending fossil fuel extraction that are being caused by corporate, government and United Nations' attempts to construct a "carbon market," including a market trading in "carbon sinks".

History has seen attempts to commodify land, food, labour, forests, water, genes and ideas.

Carbon trading follows in the footsteps of this history and turns the earth's carbon-cycling capacity into property to be bought or sold in a global market.

Through this process of creating a new commodity -- carbon -- the Earth's ability and capacity to support a climate conducive to life and human societies is now passing into the same corporate hands that are destroying the climate.

People around the world need to be made aware of this commodification and privatization and actively intervene to ensure the protection of the Earth's climate.

Carbon trading will not contribute to achieving this protection of the Earth's climate. It is a false solution which entrenches and magnifies social inequalities in many ways:

** The carbon market creates transferable rights to dump carbon in the air, oceans, soil and vegetation far in excess of the capacity of these systems to hold it.

Billions of dollars worth of these rights are to be awarded free of charge to the biggest corporate emitters of greenhouse gases in the electric power, iron and steel, cement, pulp and paper, and other sectors in industrialised nations who have caused the climate crisis and already exploit these systems the most. Costs of future reductions in fossil fuel use are likely to fall disproportionately on the public sector, communities, indigenous peoples and individual taxpayers.

** The Kyoto Protocol's Clean Development Mechanism (CDM), as well as many private sector trading schemes, encourage industrialised countries and their corporations to finance or create cheap carbon dumps such as large-scale tree plantations in the South as a lucrative alternative to reducing emissions in the North.

Other CDM projects, such as hydrochlorofluorocarbon (HCFC)-reduction schemes, focus on end-of-pipe technologies and thus do nothing to reduce fossil fuel industries' impacts on local communities. In addition, these projects dwarf the tiny volume of renewable energy projects which constitute the CDM's sustainable development window-dressing.

** Impacts such as displacement, pollution, and climate change from fossil-fuel industries and other greenhouse-gas producing industries are already disproportionately felt by small island states, coastal peoples, indigenous peoples, local communities, fisherfolk, women, youth, poor people, elderly and marginalized communities. CDM projects intensify these impacts in several ways. First, they sanction continued exploration for, and extraction, refining and burning of fossil fuels. Second, by providing finance for private sector projects such as industrial tree plantations, they appropriate land, water and air that already support the lives and livelihoods of local communities, turning them into new carbon dumps for Northern industries.

** The refusal to phase out the use of coal, oil and gas, which is further entrenched by carbon trading, is also causing more and more military conflicts around the world, magnifying social and environmental injustice. This in turn diverts vast resources to military budgets which could otherwise be utilized to support economies based on renewable energies and energy efficiency.

In addition to these injustices, the internal weaknesses and contradictions of carbon trading are in fact likely to make global warming worse rather than "mitigate" it.

CDM projects, for instance, cannot be verified to be "neutralizing" any given quantity of fossil fuel extraction and burning.

Their claim to be able to do so is increasingly dangerous because it creates the illusion that consumption and production patterns, particularly in the North, can be maintained without harming the climate.

In addition, because of the verification problem, as well as a lack of credible regulation, no one in the CDM market is likely to be sure what they are buying. Without a viable commodity to trade, the CDM market and similar private sector trading schemes are a total waste of time when the world has a critical climate crisis to address.

In an absurd contradiction the World Bank facilitates these false, market-based approaches to climate change through its Prototype Carbon Fund, the BioCarbon Fund and the Community Development Carbon Fund at the same time it is promoting, on a far greater scale, the continued exploration for, and extraction and burning of fossil fuels -- many of which are to ensure increased emissions of the North.

In conclusion, 'giving carbon a price' will not prove to be any more effective, democratic, or conducive to human welfare, than giving genes, forests, biodiversity or clean rivers a price.

We reaffirm that drastic reductions in emissions from fossil fuel use are a pre-requisite if we are to avert the climate crisis. We affirm our responsibility to coming generations to seek real solutions that are viable and truly sustainable and that do not sacrifice marginalized communities.

We therefore commit ourselves to help build a global grassroots movement for climate justice, mobilize communities around the world and pledge our solidarity with people opposing carbon trading on the ground.

See www.sinkswatch.org for an up-to-date list of supporting signatories.

To sign on to this declaration, please send an email to info@fern.org or visit www.sinkswatch.org

TINY TOWN DEMANDS JUSTICE IN DIOXIN POISONING

By Adrianne Appel

BOSTON, Jul 25 (IPS) -- A U.S. health agency has made research subjects of people in tiny Mossville, Louisiana by repeatedly monitoring dangerously high levels of dioxin in their blood while doing nothing to get the community out of harm's way, residents say.

Further, the agency failed to release important test results for five years, and made it difficult for the community to obtain the actual data, say residents and their lawyers.

"The air is staggering," said resident Haki Vincent. "Come stay at my place and you will see firsthand that the air and water is repulsive."

Mossville is hemmed in by 14 chemical factories, including petroleum giant ConocoPhillips and Georgia Gulf, a vinyl products manufacturer that had revenues of 2.4 billion dollars in 2006, according to the company.

Dioxin compounds are a byproduct of petroleum processing and vinyl manufacturing and residents in Mossville say the factories are releasing amounts into the air that are making them sick.

Studies show the community suffers from high rates of cancer, upper respiratory problems and reproductive issues, and residents say dioxin pollution is the cause.

Residents want an end to the pollution and want to be moved away from the factories.

"Here in this community, people are being inundated with pollution and it is killing us," said Shirley Johnson, a Mossville resident.

The Agency for Toxic Substances and Disease Registry (ATSDR), a U.S. health agency, tested the blood of 28 Mossville residents in 1999 and found dioxin at levels two to three times higher than what is considered normal.

But the agency offered no explanation for the high dioxin levels and failed to mention the factories as a possible source.

ATSDR staff left Mossville and returned in 2001 to re-test 22 people. The agency found that average dioxin levels had dropped slightly but were still two to three times higher than normal.

That same year, a division of the U.S. Environmental Protection Agency found vinyl chloride in the air in Mossville at concentrations 100 times what is permitted by federal law, and ethylene dichloride at 20 times the permitted level.

But again ATSDR failed to consider that the local factories could be responsible for the dioxin in the blood of people in Mossville.

"The source of dioxin exposures in the Mossville residents is not known," the 2001 report says.

The ATSDR did not release the 2001 results until 2006, with no explanation.

"I'm not going to tell you it was the quickest thing we've ever done. It is what it is," Steve Dearwent, an epidemiologist who led the study, told IPS.

"This can only be called callous indifference of agencies to the fact that people in Mossville are sick and dying as a result of toxins being dumped on them," said Nathalie Walker, a lawyer with Advocates for Environmental Human Rights, an environmental group that is representing Mossville.

The historically black community, founded in the late 1700s, is unincorporated; it has had no voting rights in the state and no power to control what businesses operate within its borders. Some factories moved to within 50 feet of people's homes.

"I live in a community that is dying. Schools are gone. Most of the light and happiness of this community doesn't really exist anymore," said resident Delma Bennett. As a project, he photographs many people in the community who use breathing machines.

The ATSDR does not believe that the dioxin levels seen in people in Mossville are high enough to cause health problems, said Dearwent, who was permitted to speak with a reporter only if a U.S. agency communications expert listened in on the conversation.

Dearwent says that in Mossville, older people had the highest levels of dioxin in their blood, and that younger people had nearly normal levels. This points to previous exposures to dioxin, and a reasonable suspect is typical U.S. store-bought food, all of which is contaminated with some amount of dioxin.

"It's perceived that all the dioxin exposure is related to industry. Our interpretation is that it is related to their diet," Dearwent said. However, tests did not show high amounts of dioxin in local Mossville food, he acknowledged.

Before the health agency experts left Mossville in 2001, they advised residents to change their diets, Dearwent said.

There is no evidence that the factories are releasing dioxin that is settling on the community, he said.

"If there is an exorbitant amount of dioxin being released it would show up in the soil, the dust and the people. Especially the younger people," and ATSDR results did not show this, he said.

This interpretation differs markedly from that of independent scientist Wilma Subra, hired by the environmental organisation to do an independent analysis of any dioxin pollution in Mossville.

Subra found dioxin in nearby soil to be 2 to 230 times what the EPA considers acceptable.

Subra also compared the ATSDR data about dioxin in the blood of Mossville residents to the type of dioxin compounds actually being emitted by the five vinyl factories in the town.

The analysis found an exact match between the specific dioxin compounds being released by the factories and the compounds found in the blood, Subra said. Also, the compounds showed up in the blood in the same percentage as those being released by the factories.
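
The kind of comparison Subra describes can be sketched in a few lines of code. The example below is purely illustrative: the congener names and numbers are invented, not her data, and the method shown (normalizing each source's measurements into fractional profiles and comparing them) is an assumption about how such a match is typically checked.

    # Hypothetical sketch of congener profile matching. The congeners and
    # values below are invented for illustration; they are not Subra's data.
    def profile(measurements):
        # Convert raw congener measurements into fractions of the total.
        total = sum(measurements.values())
        return {name: value / total for name, value in measurements.items()}

    # Invented example values, in arbitrary units.
    factory_emissions = {"TCDD": 12.0, "PeCDD": 30.0, "HxCDD": 18.0}
    resident_blood = {"TCDD": 0.4, "PeCDD": 1.0, "HxCDD": 0.6}

    # Matching fractional compositions across congeners is what would
    # point to the factories as the exposure source.
    for congener, fraction in profile(factory_emissions).items():
        print(congener, round(fraction, 2),
              round(profile(resident_blood)[congener], 2))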

"This is inappropriate exposure to the community," Subra said.

Louisiana is known for its long history of gross environmental problems and the situation in Mossville reflects that history, Walker said.

"The politics have not changed. We have a lot of work to do," she said.

"What we're up against is the control of corporations in Louisiana. They have a huge lobbying body and exert a huge influence," said Monique Harden, an attorney with the environmental organisation. Some factories have increased their emissions recently, she said.

Georgia Gulf says the industries in Mossville have improved their environmental records.

"Industry in Louisiana has reduced total [reportable] emissions by more than 80 percent since 1987," Georgia Gulf spokesman Will Hinson said in a statement to IPS.

In 2005, a local Mossville environmental group filed a petition against the U.S. government with the Inter-American Commission on Human Rights of the Organisation of American States, on the grounds that Mossville's environmental human rights are being violated. The group is waiting for a response from the U.S. State Department, Walker said.

On Wednesday, Mossville residents traveled to Washington to testify before a Senate committee, to raise questions about the actions of ATSDR and the EPA and ask for help in ending pollution in Mossville.

Change is long past due, said resident David Prince. "Fourteen facilities are just spewing these poisons and nothing has been done. When will it be our turn?"

From: Florida Times-Union, Jul. 31, 2007

JEA REVIEWS SELLING ASH FOR USE IN ROAD PROJECTS

By Steve Patterson, The Times-Union

Complaints from neighbors are making Florida's environmental agency rethink a JEA campaign to sell power-plant ash as road-building material.

The gritty gray ash has been used to cover dirt roads in Baker County and Charlton County, Ga., and poured into lower layers of asphalt street projects in Duval, Nassau and St. Johns counties.

But people living near the road projects have complained about the ash, sold under the name EZBase. They say it drifts into yards, covers cars and irritates some people's breathing.

"I can't believe there's not harmful things in that," said Bob Cowell, whose neighborhood streets off Scott Mill Road in Mandarin were torn up for sewer work and are being rebuilt with EZBase.

"I just feel that we were being experimented with.... Who knows how much hazardous material was in that stuff?"

JEA says some road contractors probably used the material incorrectly but insists it is safe. The Florida Department of Environmental Protection approved using EZBase in roadwork two years ago.

"There is really nothing negative about EZBase other than the term 'ash,' " said Scott W. Schultz, the utility's director of byproduct services. JEA officials say selling ash for roadwork has cut the Jacksonville-owned utility's costs by about $8 million. v But there are negative reviews from neighbors.

A mile away from Cowell's home, Bobbie Zontini said her husband has spent a month vacuuming the bottom of their screened-in pool each day to get the grit that blocks the pool filter.

"I kept seeing all this fine silt-like stuff everywhere," said Zontini, who said her asthma got worse when roadwork started. The same stuff has to be swept up from the patio, she said.

Since May, DEP has received EZBase dust complaints from people in Mandarin, Lake Forest in Northwest Jacksonville and Glen St. Mary in Baker County.

The agency has asked JEA for more information about EZBase and how it's being used.

The stuff is made of ash from JEA's Northside Generating Station, which burns petroleum coke and coal mixed with limestone, a common road material.

EZBase becomes cement-like when wet but can dry out and turn brittle.

Now, DEP wants to see whether JEA and road contractors are following rules that were spelled out for those projects.

The agency is also reconsidering whether covering rural dirt roads with the material is a good idea.

"We do, of course, have the right to remove approval," DEP spokeswoman Jill Johnson said. "It's definitely something we're still investigating."

JEA representatives argue that people should think of EZBase much as they do limestone. Schultz said more than 90 percent of the material's weight is lime and gypsum. Gypsum from power plants is normally sold as material for drywall, but the Northside plant's low-emissions design leaves enough unburned fuel in the gypsum to make it unsuitable for that market.

Cowell, one of the people worried about the ash, notes EZBase comes with a safety sheet warning about exposure to crystalline silica, a material that can damage people's lungs over time.

That's probably not too big a danger, said Guerry H. McClellan, a University of Florida geology professor who has worked on power plant pollution control systems. He said Florida's ground is full of crystalline silica, such as quartz.

The ash also contains relatively high levels of a metal called vanadium. But a toxicologist hired by DEP concluded in 2005 it didn't pose a serious risk.

To see whether the ash produced now is any different, DEP recently asked JEA for results of chemical tests the utility is required to perform on ash every three months.

A few weeks ago, DEP employees found a hill of EZBase stored long-term in Baker County for county road projects. That wasn't allowed under rules set up in 2005, but JEA told the state agency the material will be removed soon.

JEA sells some ash to out-of-state oil refineries that take shipments by rail. But the utility sees road projects as an important way to reuse ash, which it calls byproduct, in hopes of improving public perceptions. It was used in construction on the Wonderwood Expressway and in subdivision roads and parking lots in Jacksonville and St. Johns County.

About 300,000 tons of EZBase have been sold in Florida, according to spokeswoman Gerri Boyce. That's at least 258,000 cubic yards, enough to pour a 12-inch layer over a quarter of a square mile.
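
The area claim is easy to verify. Here is a minimal sketch of the arithmetic, assuming the reported figure of 258,000 cubic yards and standard conversion factors (27 cubic feet per cubic yard; 5,280 feet per mile):

    # Check: does 258,000 cubic yards, spread 12 inches (1 foot) deep,
    # cover about a quarter of a square mile?
    CUBIC_FEET_PER_CUBIC_YARD = 27
    SQUARE_FEET_PER_SQUARE_MILE = 5280 ** 2  # 27,878,400

    volume_ft3 = 258_000 * CUBIC_FEET_PER_CUBIC_YARD  # 6,966,000 cubic feet
    area_ft2 = volume_ft3 / 1.0  # a 1-foot-deep layer

    print(area_ft2 / SQUARE_FEET_PER_SQUARE_MILE)  # ~0.25 square mile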

Selling ash saves the expense and trouble of dumping it in a landfill, Schultz said, adding that it means mining companies won't have to dig for as much new limestone.

The utility charges contractors just $1 per ton but saved about $8 million on landfill fees, Boyce said.

Even dumping ash can be a problem.

Last year, when a company that was talking with JEA proposed dumping ash at a Ware County, Ga., landfill, neighbors there filled public meetings to keep the ash out.

The U.S. Environmental Protection Agency and U.S. Department of Energy have both supported projects to recycle ash from power plants. The EPA has studied power plant ash for more than 20 years and doesn't consider it a hazardous waste, said David Goss, executive director of the American Coal Ash Association, a trade group.

But it can still annoy people if it's not handled well.

"When you're putting ash down... almost any kind of ash, you have to be cognizant of what kind of conditions you have," Goss said.

Covering dirt roads with ash isn't common, said Debra Pflughoeft-Hassett, a researcher at the University of North Dakota's Energy & Environmental Research Center. She studied Florida's handling of ash for a federal report last year and spent time talking to JEA about EZBase.

"My suspicion is they have a good product that they can probably use with some tweaking," she said.

2006 WIND INSTALLATIONS OFFSET MORE THAN 40 MILLION TONS OF CO2

WASHINGTON, D.C. -- The 15,200 megawatts of new wind turbines installed worldwide last year will generate enough clean electricity annually to offset the carbon dioxide emissions of 23 average-sized U.S. coal-fired power plants, according to a new Vital Signs Update from the Worldwatch Institute.[1] The 43 million tons of carbon dioxide displaced in 2006 is equivalent to the emissions of 7,200 megawatts of coal-fired power plants, or nearly 8 million passenger cars.

Global wind power capacity increased almost 26 percent in 2006, exceeding 74,200 megawatts by year's end. Global investment in wind power was roughly $22 billion in 2006, and in Europe and North America, the power industry added more capacity in wind than it did in coal and nuclear combined. The global market for wind equipment has risen 74 percent in the past two years, leading to long backorders for wind turbine equipment in much of the world.

"Wind power is on track to soon play a major role in reducing fossil fuel dependence and slowing the buildup of greenhouse gases in the atmosphere," according to Worldwatch Senior Researcher Janet Sawin. "Already, the 43 million tons of carbon dioxide displaced by the new wind plants installed last year equaled more than 5 percent of the year's growth in global emissions. If the wind market quadruples over the next nine years -- a highly plausible scenario -- wind power could be reducing global emissions growth by 20 percent in 2015."

Today, Germany, Spain, and the United States generate nearly 60 percent of the world's wind power. But the industry is shifting quickly from its European and North American roots to a new center of gravity in the booming energy markets of Asia.

In 2006, India was the third largest wind turbine installer and China took the fifth spot, thanks to a 170-percent increase in new wind power installations over the previous year. More than 50 nations now tap the wind to produce electricity, and 13 have more than 1,000 megawatts of wind capacity installed.

As efforts to reduce carbon dioxide emissions accelerate around the globe, dozens of countries are working to add or strengthen laws that support the development of wind power and other forms of renewable energy. Rapid growth is expected in the next few years in several countries, including Australia, Brazil, Canada, France, and Portugal.

"China and the United States will compete for leadership of the global wind industry in the years ahead," says Sawin. "Although the U.S. industry got a 20-year head start, the Chinese are gaining ground rapidly. Whichever nation wins, it is encouraging to see the world's top two coal burners fighting for the top spot in wind energy."

==============

[1] Calculations are based on U.S. data: average capacity factor for new wind power capacity (34%, from the American Wind Energy Association); average capacity factor for coal-fired power plants (72%, from the North American Electric Reliability Council -- NERC); average CO2 emissions from U.S. coal-fired power plants (0.95 kg/kWh, from the U.S. Energy Information Administration); and average coal-fired power plant capacity (318 megawatts, from NERC).
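
The parameters in note [1] are enough to reproduce the headline figures. The sketch below is a plausible reconstruction of the arithmetic from those published parameters, not Worldwatch's own worksheet:

    # Reconstruct the Worldwatch figures from the parameters in note [1].
    HOURS_PER_YEAR = 8760

    wind_mw = 15_200        # new wind capacity installed in 2006
    wind_cf = 0.34          # capacity factor for new wind capacity
    coal_cf = 0.72          # capacity factor for coal-fired plants
    co2_kg_per_kwh = 0.95   # average U.S. coal-plant CO2 emissions
    coal_plant_mw = 318     # average coal-fired plant capacity

    wind_kwh = wind_mw * 1000 * wind_cf * HOURS_PER_YEAR  # ~45.3 billion kWh
    co2_tons = wind_kwh * co2_kg_per_kwh / 1000           # ~43 million metric tons

    coal_mw_equiv = wind_kwh / (coal_cf * HOURS_PER_YEAR) / 1000  # ~7,200 MW
    plants = coal_mw_equiv / coal_plant_mw                        # ~23 plants

    print(round(co2_tons / 1e6), round(coal_mw_equiv), round(plants))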

Copyright 2007 Worldwatch Institute

PCBS ALTER BRAIN DEVELOPMENT

By Adrian Burton

Exposure to the non-coplanar polychlorinated biphenyl PCB95 during gestation and nursing causes abnormal development of the auditory cortex in rats, affecting the brain's representation of what is heard, according to new research in the 1 May 2007 issue of Proceedings of the National Academy of Sciences. Since children with autism and other developmental disorders show abnormal responses to sound, suspicions have been raised that PCB95 and similar molecules in the environment might promote this problem and perhaps other language/cognition disorders in children.

Before being banned in the late 1970s as potential carcinogens, PCBs became ubiquitous environmental pollutants, and they continue to threaten human health. Their molecular stability has kept them intact in the environment, and they have entered the food chain, accumulating in the fat of exposed organisms.

Most of the early work on PCB-associated health problems focused on the coplanar molecules, but there is now evidence that non-coplanar PCBs may cause trouble of their own (coplanar and non-coplanar refer to the chemical structure of the PCBs in question). "They are reported to prevent dopamine production in monkey brains, to alter behavior in rats, and may even alter neuropsychological functioning in children," explains first author Tal Kenet, a faculty member at Harvard Medical School and Massachusetts General Hospital. "Our research suggests they cause abnormalities in the development of the auditory part of the rat brain."

The researchers fed pregnant rats 6 mg/kg of PCB95 in corn oil daily from day 5 of pregnancy until the weaning of their pups. "We then mapped the boundary and response characteristics of the primary auditory cortex of the pups using a series of electrodes implanted in the brain," says Kenet. "Individual neurons were monitored to see which characteristic sound frequency they responded to." The auditory cortex is one of the first sensory systems to mature.

The maps of the PCB95-exposed rats were found to be oddly shaped and had "holes" in them where neurons seemed not to respond to sound. The maps also included many neurons that showed a lack of frequency selectivity, and the typical posterior-to-anterior distribution of neurons responding to ever higher frequencies was disorganized.

"This must affect how their brains interpret sound," says Kenet. "In addition, we recorded notable imbalances in inhibitory and excitatory signaling between the auditory cortex nerve cells. Without proper balancing, the correct representation of sound cannot be guaranteed. Importantly, children with autism show evidence of imbalances between excitation and inhibition in the brain, but whether it's the same type of imbalance remains to be explored."

The researchers also found the plasticity of the PCB-exposed cortices to be abnormal. Usually, if rat pups are exposed to a particular tone, the area of the cortex that deals with that frequency expands. "That did not happen in the PCB-exposed pups," says Kenet.

"Epidemiological studies have found that children with prenatal PCB exposure do more poorly on tests of verbal learning and memory," says Susan Schantz, a professor of veterinary biosciences at the University of Illinois at Urbana-Champaign College of Veterinary Medicine. "These exciting new findings suggest that underlying changes in the development or plasticity of the auditory cortex may be responsible for those effects." However, Schantz cautions that the rats in these studies received a very high dose of a very potent PCB congener. "In my opinion," she says, "it is unlikely that human infants, even those living in highly polluted areas, would be exposed to similar concentrations. We also need to keep in mind that any effects observed in humans are likely to be much more subtle than the striking changes observed in these rats."

"It would be interesting to know whether animals closer to humans develop disorders resembling autism or other cognitive problems after [environmentally relevant] PCB95 exposures," remarks Jesús Pastor, a senior researcher at the Centre for Environmental Sciences in Madrid, Spain. "That might help reveal how serious this problem could be."

Since PCBs can be passed on to human infants in breast milk, the report raises the question of whether some mothers in highly polluted areas -- perhaps those whose family history points toward a possible genetic risk of autism spectrum disorders -- should bottle-feed rather than breastfeed. However, "some research shows that breastfeeding may actually lessen the negative impact of prenatal exposure, even though children who are breastfed have higher overall body burdens of PCBs," says Schantz.

From: Los Angeles Times, Jul. 30, 2007

THE MIRAGE OF NUCLEAR POWER

By Paul Josephson

In the last two weeks, the Chinese signed a deal with Westinghouse to build four nuclear power plants; a U.S. utility joined the French national nuclear juggernaut -- with 60 reactors under its belt -- to build stations throughout the United States; and the Russians neared the launch of the first of a dozen nuclear power stations that float on water, with sales promised to Morocco and Namibia. Two sworn opponents -- environmentalists and President Bush -- tout nuclear energy as a panacea for the nation's dependence on oil and a solution to global warming. They've been joined by all the presidential candidates from both parties, with the exception of John Edwards. And none of them is talking about the recent nuclear accident in Japan caused by an earthquake.

These surprising bedfellows base their sanguine assessment of nuclear power on an underestimation of its huge financial costs, on a failure to consider unresolved problems involving all nuclear power stations and on a willingness to overlook this industry's history of offering far-fetched dreams, failing to deliver and the occasional accident.

Since the 1950s, the nuclear industry has promised energy "too cheap to meter," inherently safe reactors and immediate clean-up and storage of hazardous waste. But nuclear power is hardly cheap -- and far more dangerous than wind, solar and other forms of power generation. Recent French experience shows a reactor will top $3 billion to build. Standard construction techniques have not stemmed rising costs or shortened lead time. Industry spokespeople insist they can erect components in assembly-line fashion a la Henry Ford to hold prices down. But the one effort to achieve this end, the Russian "Atommash" reactor factory, literally collapsed into the muck.

The industry has also underestimated how expensive it will be to operate stations safely against terrorist threat and accident. New reactors will require vast exclusion zones, doubly reinforced containment structures, the employment of large armed private security forces and fail-safe electronic safeguards. How will all of these and other costs be paid and by whom?

To ensure public safety, stations must be built far from population centers and electricity demand, which means higher transmission costs than the industry admits. In the past, regulators approved the siting of reactors near major cities based on the assumption that untested evacuation plans would work. Thankfully, after public protests, Washington did not approve Consolidated Edison's 1962 request to build a reactor in Queens, N.Y., three miles from the United Nations. But it subsequently approved licensing of units within 50 miles of New York, Boston, Chicago and Washington, D.C. New Orleans had three days of warning before Hurricane Katrina hit and was not successfully evacuated. A nuclear accident may give us only 20 minutes to respond; this indicates that reactors should be built only in sparsely populated regions.

Finally, what of the spent fuel and other nuclear waste? More than 70,000 tons of spent fuel at nuclear power stations are stored temporarily in basins of water or above ground in concrete casks. The Bush administration held back release of a 2005 National Research Council study, only excerpts of which have been published, because its findings, unsympathetic to nuclear power, indicated that this fuel remains an inviting target for terrorists.

And more than 150 million Americans live within 75 miles of nuclear waste, according to the Office of Civilian Radioactive Waste Management. A storage facility that was supposed to open at Yucca Mountain, Nev., in 1989 still faces legal and scientific hurdles. And if Yucca Mountain opens, how will we transport all of the waste safely to Nevada, and through whose towns and neighborhoods?

Industry representatives, government regulators and nuclear engineers now promise to secure the nation's energy independence through inherently safe reactors. This is the same industry that gave the world nuclear aircraft and satellites -- three of the 30 satellites launched have plummeted to Earth -- and Three Mile Island, Chernobyl and a series of lesser known accidents.

Let's see them solve the problems of exorbitant capital costs, safe disposition of nuclear waste, realistic measures to deal with the threats of terror, workable evacuation plans and siting far from population centers before they build one more station. In early July, President Bush spoke glowingly about nuclear power at an Alabama reactor recently brought out of mothballs; but it has shut down several times since it reopened because of operational glitches. What clearer indication do we need that nuclear power's time has not yet come?

Paul Josephson writes about nuclear power and teaches history at Colby College.


From: New York Times, Jul. 17, 2007

CHRONIC FATIGUE NO LONGER SEEN AS 'YUPPIE FLU'

By David Tuller

For decades, people suffering from chronic fatigue syndrome have struggled to convince doctors, employers, friends and even family members that they were not imagining their debilitating symptoms. Skeptics called the illness "yuppie flu" and "shirker syndrome."

But the syndrome is now finally gaining some official respect. The Centers for Disease Control and Prevention, which in 1999 acknowledged that it had diverted millions of dollars allocated by Congress for chronic fatigue syndrome research to other programs, has released studies that linked the condition to genetic mutations and abnormalities in gene expression involved in key physiological processes. The centers have also sponsored a $6 million public awareness campaign about the illness. And last month, the C.D.C. released survey data suggesting that the prevalence of the syndrome is far higher than previously thought, although these findings have stirred controversy among patients and scientists. Some scientists and many patients remain highly critical of the C.D.C.'s record on chronic fatigue syndrome, or C.F.S. But nearly everyone now agrees that the syndrome is real.

"People with C.F.S. are as sick and as functionally impaired as someone with AIDS, with breast cancer, with chronic obstructive pulmonary disease," said Dr. William Reeves, the lead expert on the illness at the C.D.C., who helped expose the centers' misuse of chronic fatigue financing.

Chronic fatigue syndrome was first identified as a distinct entity in the 1980s. (A virtually identical illness had been identified in Britain three decades earlier and called myalgic encephalomyelitis.) The illness causes overwhelming fatigue, sleep disorders and other severe symptoms and afflicts more women than men. No consistent biomarkers have been identified and no treatments have been approved for addressing the underlying causes, although some medications provide symptomatic relief.

Patients say the word "fatigue" does not begin to describe their condition. Donna Flowers of Los Gatos, Calif., a physical therapist and former professional figure skater, said the profound exhaustion was unlike anything she had ever experienced.

"I slept for 12 to 14 hours a day but still felt sleep-deprived," said Ms. Flowers, 51, who fell ill several years ago after a bout of mononucleosis. "I had what we call 'brain fog.' I couldn't think straight, and I could barely read. I couldn't get the energy to go out of the door. I thought I was doomed. I wanted to die."

Studies have shown that people with the syndrome experience abnormalities in the central and autonomic nervous systems, the immune system, cognitive functions, the stress response pathways and other major biological functions. Researchers believe the illness will ultimately prove to have multiple causes, including genetic predisposition and exposure to microbial agents, toxins and other physical and emotional traumas. Studies have linked the onset of chronic fatigue syndrome with an acute bout of Lyme disease, Q fever, Ross River virus, parvovirus, mononucleosis and other infectious diseases.

"It's unlikely that this big cluster of people who fit the symptoms all have the same triggers," said Kimberly McCleary, president of the Chronic Fatigue and Immune Dysfunction Syndrome Association of America, the advocacy group in charge of the C.D.C.-sponsored awareness campaign. "You're looking not just at apples and oranges but pineapples, hot dogs and skateboards, too."

Under the most widely used case definition, a diagnosis of chronic fatigue syndrome requires six months of unexplained fatigue as well as four of eight other persistent symptoms: impaired memory and concentration, sore throat, tender lymph nodes, muscle pain, joint pain, headaches, disturbed sleeping patterns and post-exercise malaise.
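
Written out as a rule, that case definition amounts to a simple check. The sketch below only encodes the logic as the article states it (six months of unexplained fatigue plus at least four of the eight other symptoms); it is an illustration, not a clinical diagnostic tool.

    # Illustrative encoding of the case definition described above:
    # >= 6 months of unexplained fatigue AND >= 4 of 8 other symptoms.
    OTHER_SYMPTOMS = {
        "impaired memory and concentration", "sore throat",
        "tender lymph nodes", "muscle pain", "joint pain",
        "headaches", "disturbed sleep", "post-exercise malaise",
    }

    def meets_case_definition(months_of_fatigue, symptoms):
        persistent = OTHER_SYMPTOMS.intersection(symptoms)
        return months_of_fatigue >= 6 and len(persistent) >= 4

    # Example: eight months of fatigue with four qualifying symptoms.
    print(meets_case_definition(8, {"sore throat", "muscle pain",
                                    "headaches", "joint pain"}))  # True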

The broadness of the definition has led to varying estimates of the syndrome's prevalence. Based on previous surveys, the C.D.C. has estimated that more than a million Americans have the illness.

Last month, however, the disease control centers reported that a randomized telephone survey in Georgia, using a less restrictive methodology to identify cases, found that about 1 in 40 adults ages 18 to 59 met the diagnostic criteria -- an estimate 6 to 10 times higher than previously reported rates.

However, many patients and researchers fear that the expanded prevalence rate could complicate the search for consistent findings across patient cohorts. These critics say the new figures are greatly inflated and include many people who are likely to be suffering not from chronic fatigue syndrome but from psychiatric illnesses.

"There are many, many conditions that are psychological in nature that share symptoms with this illness but do not share much of the underlying biology," said John Herd, 55, a former medical illustrator and a C.F.S. patient for two decades.

Researchers and patient advocates have faulted other aspects of the C.D.C.'s research. Dr. Jonathan Kerr, a microbiologist and chronic fatigue expert at St. George's University of London, said the C.D.C.'s gene expression findings last year were "rather meaningless" because they were not confirmed through more advanced laboratory techniques. Kristin Loomis, executive director of the HHV-6 Foundation, a research advocacy group for a form of herpes virus that has been linked to C.F.S., said studying subsets of patients with similar profiles was more likely to generate useful findings than Dr. Reeves's population-based approach.

Dr. Reeves responded that understanding of the disease and of some newer research technologies is still in its infancy, so methodological disagreements were to be expected. He defended the population-based approach as necessary for obtaining a broad picture and replicable results. "To me, this is the usual scientific dialogue," he said.

Dr. Jose G. Montoya, a Stanford infectious disease specialist pursuing the kind of research favored by Ms. Loomis, caused a buzz last December when he reported remarkable improvement in 9 out of 12 patients given a powerful antiviral medication, valganciclovir. Dr. Montoya has just begun a randomized controlled trial of the drug, which is approved for other uses.

Dr. Montoya said some cases of the syndrome were caused when an acute infection set off a recurrence of latent infections of Epstein-Barr virus and HHV-6, two pathogens that most people are exposed to in childhood. Ms. Flowers, the former figure skater, had high levels of antibodies to both viruses and was one of Dr. Montoya's initial C.F.S. patients.

Six months after starting treatment, Ms. Flowers said, she was able to go snowboarding and take yoga and ballet classes. "Now I pace myself, but I'm probably 75 percent of normal," she said.

Many patients point to another problem with chronic fatigue syndrome: the name itself, which they say trivializes their condition and has discouraged researchers, drug companies and government agencies from taking it seriously. Many patients prefer the older British term, myalgic encephalomyelitis, which means "muscle pain with inflammation of the brain and spinal cord," or a more generic term, myalgic encephalopathy.

"You can change people's attributions of the seriousness of the illness if you have a more medical-sounding name," said Dr. Leonard Jason, a professor of community psychology at DePaul University in Chicago.

From: Scientific American, Jul. 16, 2007

LIVING NEAR HIGH TRAFFIC RAISES HEART RISKS

DALLAS (Reuters) -- Living near a busy highway may be bad for your heart.

Long-term exposure to air pollution from a nearby freeway or busy road can raise the risk of hardening of the arteries, which can lead to heart disease and stroke, German researchers reported on Monday.

"The most important finding of our study is that living close to high traffic, a major source of urban air pollution, is associated with atherosclerosis in the coronary arteries -- the blood vessels that supply the heart," Dr. Barbara Hoffmann, who led the study, said in a statement.

"This is the first study to actually show a relationship between long- term traffic exposure and coronary atherosclerosis," said Hoffmann, of the University of Duisburg-Essen in Germany.

The study is published in this week's issue of Circulation, an American Heart Association journal.

Previous studies have linked elevated levels of air pollution to an increased risk of heart problems, but this is the first to demonstrate that living near high traffic is associated with coronary atherosclerosis.

The study looked at 4,494 adults, aged 45 to 74, in three large cities in the industrialized Ruhr area of Germany.

Doctors examined the participants, looking especially for coronary artery calcification, which occurs when fatty plaques forming in the artery walls become calcified, or hardened.

Researchers found that, compared with people who lived more than 200 meters (656 feet) from major traffic, the chance of high coronary artery calcification was 63 percent greater for those living within 50 meters (164 feet).

For people living 51 to 100 meters (167 to 328 feet) away, the chance was 34 percent higher. It was 8 percent higher for those within 100 to 200 meters (328 to 656 feet) of heavy traffic.

These percentages take into account age, gender, smoking and high blood pressure.

A five-year follow-up study is set to be completed next year.

"Politicians, regulators and physicians need to be aware that living close to heavy traffic may pose an increased risk of harm to the heart. Potential harm due to proximity to heavy traffic should be considered when planning new buildings and roads," Hoffmann said.

Copyright 1996-2007 Scientific American, Inc.

From: Chemical & Engineering News (pg. 6), Jul. 16, 2007

PERSISTENT ORGANIC POLLUTANTS

More chemicals may have to be flagged as risky

By Celia Arnaud

Some persistent organic pollutants (POPs) can reach high concentrations in humans and other air-breathing animals even though they don't bioaccumulate in fish, according to a study by Canadian researchers (Science 2007, Vol. 317, pg. 236). The observation suggests that the regulatory criteria now used to flag potential POPs may need revision.

Bioaccumulative compounds are usually assumed to be hydrophobic and fat-soluble if they have an octanol-water partition coefficient (KOW) greater than 100,000. Screening of commercial chemicals to identify potentially bioaccumulative compounds is usually based on the KOW or on laboratory tests with fish. But research by environmental chemist Frank A. P. C. Gobas, grad student Barry C. Kelly, and coworkers at Simon Fraser University, in Burnaby, British Columbia, shows that such an approach may overlook a significant fraction of pollutants that pose health risks to air-breathing animals.

The Canadian researchers show that even moderately hydrophobic compounds with KOW between 100 and 100,000 can increase in concentration at each step in the food chain, a process known as biomagnification. Biomagnification of such a compound can occur in food webs that include humans and other air-breathing animals, even when it doesn't happen in food webs that are limited to fish and aquatic invertebrates. To biomagnify in air-breathing animals, a compound must have a high octanol-air partition coefficient (KOA), and it must be metabolized slowly.
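
A rough screening rule based on that description might look like the sketch below. The KOW thresholds come from the article; the KOA cutoff of 100,000 is an assumed placeholder for "high KOA" (the article gives no number), and slow metabolism is reduced to a yes/no input.

    # Illustrative two-track screen based on the criteria described above.
    def may_biomagnify(kow, koa, slowly_metabolized):
        # Traditional screen: very hydrophobic chemicals flagged via KOW.
        aquatic_risk = kow > 100_000
        # Added concern: moderately hydrophobic chemicals (KOW 100-100,000)
        # with high KOA that are metabolized slowly can still biomagnify
        # in food webs that include air-breathing animals.
        air_breather_risk = (100 <= kow <= 100_000
                             and koa > 100_000      # assumed cutoff
                             and slowly_metabolized)
        return aquatic_risk or air_breather_risk

    # A beta-HCH-like case: modest KOW, high KOA -- missed by a KOW-only
    # screen but flagged here.
    print(may_biomagnify(kow=5_000, koa=1e7, slowly_metabolized=True))  # True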

Gobas and his coworkers first hypothesized in 2001 that compounds with high KOA would biomagnify in air-breathing animals. At that time, they observed that certain substances with a relatively low KOW biomagnified significantly in the lichen-caribou-wolf food chain. They have now looked at a variety of food webs, including one with only water-respiring organisms, one with only air-breathing organisms, and one with both (including humans). All the food webs they studied are found in northern Canada.

The polychlorinated biphenyl congener 153 (PCB153) is an example of a compound with high KOW and high KOA. As expected, it biomagnifies in all three food webs. In contrast, beta-hexachlorocyclohexane (an isomer of the insecticide lindane) has a low KOW and a high KOA. It does not biomagnify in the food web that includes only water-respiring organisms, but it does biomagnify in both webs that include air-breathing animals.

Gobas hastens to point out that their model assumes that the chemicals are not metabolized. "The degree to which chemicals are transformed in organisms is difficult to predict at this point," he says.

Gobas hopes that environmental regulations will change as a result of this work. "For regulatory agencies and chemical manufacturers, it is important to recognize that chemicals with a high octanol-air partition coefficient have the potential to bioaccumulate in terrestrial and human food webs," he says.

Lynn R. Goldman, a professor of environmental health sciences at the Johns Hopkins University Bloomberg School of Public Health, says that the paper suggests "a broader range of chemicals, with high KOA but lower KOW, should perhaps be considered to be persistent organic pollutants." Goldman served as assistant administrator for the EPA's Office of Prevention, Pesticides & Toxic Substances during the Clinton Administration.

"Current risk assessments that classify a chemical as a persistent organic pollutant (POP) have based their conclusions primarily on science drawn from aquatic toxicology," says Lawrence Burkhard, an EPA research chemist. "This paper strongly suggests risk assessment methodologies need to be changed to include data on bioaccumulation in birds and mammals. If this is done, more chemicals may be classified as POPs than have been in the past."

Copyright 2007 American Chemical Society

From: New York Sun, Jul. 16, 2007

CONSERVATIVE PENNSYLVANIANS PASS LAWS DEFYING U.S. CONSTITUTION

By Channing Joseph, Staff Reporter

Nearly 220 years after America's Constitution was drafted in Pennsylvania, scores of rural Keystone State communities are declaring the document null and void.

More than 100 largely Republican municipalities have passed laws to abolish the constitutional rights of corporations, inventing what some critics are calling a "radical" new kind of environmental activism. Led by the nonprofit Community Environmental Legal Defense Fund, they are attempting to jumpstart a national movement, with Celdf chapters in at least 23 states actively promoting an agenda of "disobedient lawmaking."

"I understand that state law and federal law is supposed to pre-empt local laws, but federal law tells us we're supposed to have clean air and clean water," the mayor of Tamaqua, Pa., Christian Morrison, told The New York Sun.

More than a year ago, the Lehigh Coal and Navigation Corporation stirred an uproar in Mr. Morrison's eastern Schuylkill County borough with a proposal to use a large strip mine as a disposal site for material dredged up from the Hudson and Delaware rivers.

But in May, the mayor, 37, cast a tie-breaking council vote to enact an ordinance that bans corporate waste dumping -- making his the first community in America to do so -- and abolishes all corporate rights within his borough.

"The state and federal environmental protection agencies... support the big corporations, and they really don't look after the safety of the people that I represent," Mr. Morrison told the Sun. Representatives at Lehigh Coal and Navigation did not respond to multiple requests for comment.

The legal defense fund's director of education, training, and development, Richard Grossman, has said that today's federal and state laws too often condone corporate practices that pose ecological and public health hazards, such as strip mining and toxic waste dumping. The solution, his organization suggests, is to strip corporations of all their rights, thereby denying them the ability to engage in any potentially dangerous practices.

The legal fund's strategy is not without critics, of course. The president of the conservative Eagle Forum of Pennsylvania, Fran Bevan, said the "whole process undermines representative government" and "harkens to 'radical environmentalism.'"

With nongovernmental groups such as Celdf, "elected officials... many times unknowingly give in to their agenda because it sounds like a good solution," Ms. Bevan told the Sun. Instead, she said, "I think that we need educated elected officials and need not depend on NGOs who are accountable to no one.... We certainly have enough agencies and laws that oversee sites where businesses and industries develop."

Celdf was founded in 1995 to provide legal services to environmental groups. Since that time, it has taken on the additional mission of working with rural governments to establish "home rule," the legal notion that small communities can exercise sovereignty at the local level.

About 43 states currently provide for some level of municipal home rule. According to its Web site, Celdf assists those municipalities "that are ready to take this bold step in local self-governance" by helping to draft the necessary legislation.

But in doing so, some members of these communities are going so far as to say their local laws ought to supersede federal authority, defying the Supreme Court's long-standing view that corporate entities are legal persons entitled to due process, equal protection, free speech, and other rights. Their aim is to use one or more of the anti-corporate ordinances they have passed to establish a Supreme Court test case disputing corporate personhood.

Abolitionists in the early 19th century could "have ended up demanding a slavery protection agency -- you know, the equivalent of today's Environmental Protection Agency -- to make slaves' conditions a little less bad," Mr. Grossman said in a 2000 speech comparing corporations to slave owners. Instead, "they denounced the Constitution" -- which permitted slavery at the time -- "and openly violated federal and state laws by aiding runaway slaves."

Just as judges and juries slowly changed their minds about the slave trade, Mr. Grossman said, today's public eventually will come to see corporations in a different light.

Through decades of work, abolitionists "built a political movement ... with the clout to get their three constitutional amendments enacted," he added, referring to the 13th through 15th amendments.

But the young mayor of Tamaqua has less lofty goals.

"I'm trying to protect the community that voted me in," Mr. Morrison said. "Both my parents are riddled with cancer."

Several members of his 7,000-person community have been diagnosed with rare forms of the disease, and dumping at several nearby Superfund sites is to blame, Mr. Morrison said.

Schuylkill County is heavily Republican, and voters there strongly supported President Bush in the last two presidential contests, according to the Pennsylvania Department of State Bureau of Commissions, Elections, and Legislation.

Mr. Morrison, however, is a Democrat. "It's not too often you see a Democrat get elected to anything in Tamaqua," he told the Sun. Still, Mr. Morrison said he is willing to stand behind his community's new ordinance, and go to prison if necessary. The ordinance is "set up so it can be done civilly," he said. "If not, criminally."

The young mayor's spirit of civil disobedience extends beyond his rural community in eastern Pennsylvania. With active chapters in states from Alaska to Arizona, including New York, Celdf is spawning a nationwide movement with weekend workshops that cost between $300 and $400 and that are designed to encourage attendees to push for anti-corporate legislation in their communities. Called the Daniel Pennock Democracy Schools, the 10- to 20-person classes began in 2003 at Wilson College in Chambersburg, Pa., and are named in honor of a 17-year-old boy who died after exposure to sewage sludge.

Shireen Parsons, who has organized several of the courses in Virginia, became involved with Celdf several years ago, when members of her mostly Republican community in Christiansburg, Va., began organizing against a proposed highway project.

She received a telephone call from Celdf's executive director, Thomas Linzey, who told her, "We want to litigate this for you."

After three lawsuits and three appeals, Ms. Parsons said, the courts finally ruled in favor of the corporate highway project. It was at that point that she and other Celdf members decided to change their strategy and challenge the legal system rather than work within it, she said.

State and federal laws are "set up so that we will always fail," Ms. Parsons told the Sun. "We have been working with these regulators for 40 years, and everything is worse. So we haven't gained anything from working within the system."

Lyn Gerry, the host of "Unwelcome Guests," a weekly radio show based in Watkins Glen, N.Y., has aired excerpts of Democracy Schools on her two-hour program, which is broadcast to at least 20 stations in 12 American states, Canada, and New Zealand. According to the New York State Board of Elections, residents of Schuyler County, to which Watkins Glen belongs, voted strongly in favor of Mr. Bush in both the 2000 and 2004 elections.

Ms. Gerry said her neighbors in her village of about 2,100 people have been abuzz about the Democracy Schools after recent proposals to dig a quarry and to dispose of toxic waste on nearby farmland. She added: "We've got some issues here that lend themselves" to Celdf's project, which she refers to on her show as "disobedient lawmaking."

"What underlies the Democracy School, what makes it so powerful is that it's an organizing model," she said. "The community comes to a consensus and says, 'We have the right to decide how it is here. We're making a legal right to do it.'"

Back in Pennsylvania, not everyone is convinced.

"Our environment is the only one we have, so we need to be conscientious about our use and non-use," Ms. Bevan of the Eagle Forum said. "The question that I have concerning Celdf... is, 'Do we need them?' Do we have problems that are not being addressed, or are we creating problems that do not exist?"

From: Public Employees for Environmental Responsibility, Jul. 17, 2007

PUBLIC ACCESS TO EPA LIBRARY HOLDINGS IN JEOPARDY

Agency Refuses to Consult with Its Own Scientists; Arbitration Hearing Scheduled

Washington, DC -- The U.S. Environmental Protection Agency is finalizing procedures that may lock away a large portion of its library collections from access by the public, according to agency documents released today by Public Employees for Environmental Responsibility (PEER). Compounding the inaccessibility of physical collections, the public's ability to electronically search digitized EPA holdings is problematic as well.

Over the past 18 months, EPA has closed large parts of its library network, including regional libraries serving 23 states, as well as its Headquarters Library and several technical libraries. The holdings from these shuttered facilities have been shipped to one of three "repositories" -- located in Cincinnati, North Carolina's Research Triangle Park and D.C. How the public, and even EPA's own staff, access these growing repositories has been uncertain.

Even as Congress moves to reverse EPA's library closures, the agency is now racing to cement new procedures restricting the ability of the public to locate or read technical documents in the agency's possession. A new proposed policy circulated internally for comment on July 11, 2007 provides:

"Repository libraries are not required to provide public access to their collections..."

In the interim, public requests are funneled into a "frequent questions" web page that yields balky and incomplete answers to patrons' questions.

Meanwhile, the remaining libraries are directed to provide public access but may tailor or reduce that access depending upon resource limitations:

"Available public access choices from a member library shall be based on its capacity to provide them."

"EPA claims that its libraries are designed for the twin purposes of improving the quality of information for agency decision-making as well as raising public environmental awareness, but right now the libraries are not serving either purpose very well," stated PEER Associate Director Carol Goldberg. "Significantly, EPA is not even bothering to consult the public who paid for these collections."

In addition to the public, EPA's own scientists have not been consulted either. A union grievance filed on August 16, 2006 protested that the library closures make it harder for scientists and other specialists to do their work. EPA ignored the grievance. On Monday, February 5, 2007, the American Federation of Government Employees National Council of EPA Locals filed an unfair labor practice complaint before the Federal Labor Relations Authority (FLRA). On June 26, 2007, the FLRA upheld the complaint and ordered EPA into binding arbitration with a hearing slated for August 14, 2007.

"Not only is the public locked out of the libraries, but the agency's own scientists are having trouble getting needed information as well," Goldberg added. "EPA claims that it plans to make more information more readily available, but judging by the results so far it has failed miserably."

Contact PEER

email: info@peer.org

Copyright peer.org 2007

From: Wall Street Journal (pg. A2), Jul. 16, 2007

POTENTIAL ENERGY CRUNCH MAY BRING OTHER FUELS TO FORE

By Bhushan Bahree

World oil and gas supplies from conventional sources are unlikely to keep up with rising global demand over the next 25 years, the U.S. petroleum industry says in a draft report of a study commissioned by the government.

In the draft report, oil-industry leaders acknowledge the world will need to develop all the supplemental sources of energy it can -- ranging from biofuels to nuclear power to oil extracted by unconventional means from the oil sands of Canada -- to meet soaring demand. The surge in demand is expected to arise from rapid economic growth in such fast-developing countries as China and India, as well as mounting consumption in the U.S., the world's biggest energy market.

** Tight Times: World oil and natural-gas supplies are unlikely to keep up with rising demand over the next 25 years, the U.S. petroleum industry says in a draft report.

** Needed Alternatives: The world will need supplemental sources like biofuels and nuclear power to meet demand, the report says.

** Price Pressure: The findings suggest that, far from being temporary, high energy prices are likely for decades to come.

"It is a hard truth that the global supply of oil and natural gas from the conventional sources relied upon historically is unlikely to meet projected 50% to 60% growth in demand over the next 25 years," says the draft report, titled "Facing the Hard Truths About Energy."

"In geoeconomic terms, the biggest impact will come from increasing demand for oil and natural gas from developing countries," said the draft report, a copy of which was reviewed by The Wall Street Journal. "This demand may outpace timely development of new supply sources, thereby pressuring prices to rise."

The study, which was requested by U.S. Energy Secretary Samuel Bodman in October 2005, was conducted by the National Petroleum Council, an industry group that advises the secretary.

The conclusions appear to be the first explicit concession by the petroleum industry that it alone can't meet burgeoning global demand for oil, which may rise to as much as 120 million barrels a day by 2030 from about 84 million barrels a day currently, according to some projections.

These conclusions follow hard on the heels of a medium-term outlook by the Paris-based International Energy Agency this month, which suggested a supply squeeze will hit by 2012. The fact that the American petroleum industry is warning of a crunch could have an even greater impact on the debate over energy policy.

The draft report proposed that the U.S. work not only to increase output of oil, gas and other fuels, but to cut energy use by improving car and truck mileage standards and implementing stricter building and appliance requirements. "Whether we are effort-constrained or resource-constrained won't become clear until it is too late," said Larry Goldstein, director of the Energy Policy Research Foundation, an industry-funded, nonprofit research organization based in Washington. Policy makers must assume supply constraints, Mr. Goldstein said, declining to comment directly on the study.

The National Petroleum Council has about 175 members, picked by the energy secretary, with extensive participation from the energy industry, other industries and government officials, and with help from foreign countries and institutions. The NPC is slated to vote on adopting the draft, which runs more than 450 pages, including annexes, at a meeting Wednesday in Washington to be led by former Exxon Mobil Corp. chairman and chief executive officer Lee R. Raymond.

Some people who participated in the report declined to comment on the findings until the results were published. Besides Mr. Raymond, leaders of the study included David J. O'Reilly, chairman and chief executive of Chevron Corp.; Andrew Gould, chairman and CEO of Schlumberger Ltd.; and Daniel H. Yergin, chairman of Cambridge Energy Research Associates.

Michael Lynch, president of Strategic Energy and Economic Research, said this is perhaps the first time the NPC, which was founded at President Harry Truman's request in 1946, has taken a global overview. The conclusion seems to be "the situation is serious, but not critical," Mr. Lynch said.

Still, drastically increasing the supply of oil and gas could be difficult. "The oil industry was gutted between 1985 and 2000 because of low prices," said J. Robinson West, chairman of PFC Energy, an industry consulting concern in Washington. "It will be difficult now for it to meaningfully increase its production capacity."

"The fact is there is lots of oil in the world, but the industry and international capital can't reach it," Mr. West said, noting limits imposed on Western companies by holders of oil reserves in the Mideast and elsewhere.

Houston investment banker Matthew Simmons takes a pessimistic view. He believes the world should be preparing for sharply lower oil production. He points out the NPC study didn't squarely address one important issue raised by Mr. Bodman in requesting the study: the point at which global oil production will plateau and then begin to decline, often referred to by the shorthand term "peak oil."

"We should be preparing for a time when, in 10, 15 or 20 years, oil production is likely to be 40 million barrels a day to 60 million barrels a day, not 120 million," he said.

The NPC study noted that total global endowment of fossil fuels appears to be huge, but only a fraction of those estimated volumes can be produced, because of technical constraints. It said the Earth's underground stores of oil were estimated at 13 trillion barrels to 15 trillion barrels.

--Jeffrey Ball contributed to this article.

Write to Bhushan Bahree at bhushan.bahree@wsj.com

Copyright 2007 Dow Jones & Company, Inc.

From: Christian Science Monitor, Jul. 19, 2007

ACCIDENTS DIM HOPES FOR GREEN NUCLEAR OPTION

By Brad Knickerbocker, Staff writer

As concern about global warming has swelled in recent years, so has renewed interest in nuclear energy. The main reason: Nuclear plants produce no carbon dioxide or other greenhouse gases tied to climate change, at least not directly.

New reactor designs make plants safer than those operating in the days of the accidents at Chernobyl and Three Mile Island decades ago, advocates say. And there's no group of OPEC countries in unstable parts of the world controlling the main raw material -- uranium.

But that was before an earthquake in Japan this week rattled the Kashiwazaki nuclear power plant. The plant's operator "said it had found more than 50 problems at the plant caused by Monday's earthquake," The New York Times reported, adding:

"While most of the problems were minor, the largest included 100 drums of radioactive waste that had fallen over, causing the lids on some of the drums to open, the company said.... The company said that the earthquake also caused a small fire at the plant, the world's largest by amount of electricity produced, and the leakage of 317 gallons of water containing trace levels of radioactive materials into the nearby Sea of Japan."

Meanwhile, accidents at two German nuclear reactors last month prompted German Environment Minister Sigmar Gabriel to call for the early shutdown of all older reactors there, reports Bloomberg News.

Concern about the safety of Germany's 17 reactors has grown after a fire at Vattenfall's Kruemmel site June 28 and a network fault at its Brunsbuettel plant on the same day. Der Spiegel online adds:

"It took the fire department hours to extinguish the blaze. Even worse, the plant operator's claim that a fire in the transformer had no effect on the reactor itself proved to be a lie.

In short, the incident made clear that nuclear energy is by no means the modern, well-organized, high-tech sector portrayed until recently by politicians and industry advocates. Indeed, the frequency of problems occurring at Germany's aging reactors is on the rise. Just as old cars succumb to rust, nuclear power plants built in the 1970s and '80s are undergoing a natural aging process.

On Wednesday, the chief executive of Vattenfall Europe AG stepped down. Klaus Rauscher was the second manager to depart this week amid mounting criticism for the utility's handling of a fire at a nuclear plant in northern Germany, reports the AP.

"When it comes to security at nuclear power plants, I can only say, that when it comes to the information policy, this really has not been acceptable and therefore my sympathy for the industry is limited," Chancellor Angela Merkel said.

Merkel, a physicist by training, normally favors nuclear power, but the June 28 fire at the Kruemmel plant, near Hamburg, has put the industry in a bad light.

Still, nuclear power has won some powerful allies in the environmental community, writes E Magazine editor Jim Motavalli on the website AlterNet.

He quotes Fred Krupp, president of Environmental Defense, as saying, "We should all keep an open mind about nuclear power." Jared Diamond, best-selling author of "Collapse," adds, "To deal with our energy problems we need everything available to us, including nuclear power," which, he says, should be "done carefully, like they do in France, where there have been no accidents."

Stewart Brand, who founded The Whole Earth Catalog and Whole Earth Review, concludes, "The only technology ready to fill the gap and stop the carbon dioxide loading of the atmosphere is nuclear power."

Environmentalists continue to push for more benign sources -- wind, biomass, geothermal, and solar, as well as greater conservation -- to power the world.

But The Washington Post reports that "Most of the technologies that could reduce greenhouse gases are not only expensive but would need to be embraced on a global scale...." The article continues:

"Many projections for 2030 include as many as 1 million wind turbines worldwide; enough solar panels to cover half of New Jersey, massive reforestation; a major retooling of the global auto industry; as many as 400 power plants fitted with pricey equipment to capture carbon dioxide and store it underground; and, most controversial, perhaps 350 new nuclear plants around the world." That kind of nuclear expansion in the US seems unlikely. The country hasn't licensed a new plant in more than 30 years, and the devilish political and scientific subject of radioactive waste disposal has yet to be fully addressed.

But that hasn't stopped other countries from pushing ahead. Russia "hopes to export as many as 60 nuclear power plants in the next two decades," The Christian Science Monitor reported this week, including what would be "the first-ever floating atomic power station" at sea. But noting an estimate from the Massachusetts Institute of Technology, The Salt Lake Tribune in Utah reports that "at least 1,000 new nuclear plants would be needed worldwide in the next 50 years to make a dent in global warming."


THE DEADLIEST AIR POLLUTION ISN'T BEING REGULATED OR EVEN MEASURED

By Peter Montague

In the U.S., the deadliest air pollution is not being regulated or even measured as it kills tens of thousands of people each year.

The culprit is the smallest particles of airborne soot, known as "ultrafines," which are emitted routinely by diesel engines, automotive traffic, garbage incinerators, and power plants burning coal, oil, natural gas, and biomass.

Ultrafine particles kill in at least a half-dozen different ways, including (but not limited to) cancer; heart attack and stroke (by initiating and worsening atherosclerosis); narrowed airways (contributing to chronic obstructive pulmonary disease (COPD) and asthma); and a systemic inflammation response that initiates "oxidative stress," which drastically alters the chemistry inside cells and sets off a cascade of serious problems. Damaged lungs retain a higher proportion of ultrafines than healthy lungs do, and retaining more ultrafines causes more lung damage -- a positive feedback loop with negative consequences. For the past 20 years, new dangers from ultrafines have come to light each year, with no end in sight.

The people most endangered by ultrafines are urban dwellers, especially the elderly and anyone with a chronic illness (e.g., asthma, diabetes, COPD, heart problems) but there is evidence that commuters in their cars[16] and children at play[17] are also being harmed. Children exposed continuously to low levels of ultrafines can grow into adults with diminished lung function and shortened lives.

Furthermore, there is growing evidence that women are affected somewhat more than men, and that people of color are affected more than whites, women probably because of smaller lung volume and people of color probably because of stress piled on more stress in their lives.

Finally, there is no observable safe level -- no threshold level below which symptoms disappear. Any exposure to ultrafines seems to cause some harm. The only safe level of exposure is zero.

Research over the past 18 years has revealed again and again that the greatest damage from air pollution is coming from the very smallest airborne particles of soot, yet government regulators continue to focus attention on the larger and less harmful particles, which are easier to measure. It's like the drunk searching for his keys under a street lamp, even though he knows he lost them many blocks away. "The light's better over here," he explains.

Airborne particles are classified into three groups -- coarse, fine and ultrafine. Coarse particles measure from 10 micrometers in diameter down to 2.5 micrometers. A micrometer is a millionth of a meter, and a meter is about 39 inches. A human hair typically measures about 100 micrometers in diameter, so the largest particles are about 1/10th the thickness of a human hair. These coarse particles are usually referred to as PM10 (particulate matter 10).

Your nose and throat can trap these large particles (on sticky surfaces, for example), to prevent them from entering your lungs. After they are trapped, you eventually excrete them.

"Fines" are particles that measure between 2.5 micrometers and 0.1 micrometers; they are typically referred to as PM2.5. The smallest of these particles are small enough to get into the lower portions of your lungs. There they may be removed by several clearance mechanisms, but very slowly. Their "half-life" in the human lung is five years, meaning that a certain dose today will have diminished by half five years from now. At that rate, a dose today will stay with you for 50 years.

Some fine particles can cause serious harm before they are excreted because their surfaces are typically covered with organic chemicals and metals, which are carried into your airways. And some of these PM2.5 particles can pass directly into your blood stream, carrying their load of metals and organics with them, and distributing them throughout your body.

The federal government began setting standards for PM10 in 1987 and for PM2.5 in 1997 after several studies revealed that fines were killing an estimated 60,000 people each year in the U.S. -- far more than were being killed by traffic accidents. A much larger number of people were (and are) being made sick with lung and heart problems. Corporations challenged the 1997 rules, which were finally upheld by the U.S. Supreme Court in 2001.

But long before the government began to regulate "fines," many studies had revealed that the actual killing was really being done by the smallest particles of all, called "ultrafines," which the government has so far refused to regulate or even measure.

Ultrafines vary in size from 0.1 micrometers down to 0.001 micrometers (or 100 nanometers down to 1 nanometer in diameter; a nanometer is a billionth of a meter). The largest of these particles has a diameter 1/1000th of the width of a human hair, and the smallest has a diameter 1/100,000th of a human hair. Relatively few such particles occur in nature, so our bodies have evolved no efficient means for protecting us against them.

In typical urban air, ultrafines account for only 1 to 5 percent of all airborne particles by weight, yet a typical person breathing the air in Los Angeles will inhale 200 billion (2E11) ultrafine particles every day and retain half of those in their lungs.

As particles get smaller, their surface area gets larger in relation to their volume (a physicist might say, as their diameter decreases, their surface-to-volume ratio increases). The very smallest particles, which are present in the largest numbers, have an enormous surface area compared to larger particles. This large surface area provides a perfect place for airborne toxicants to glom on and be carried efficiently into the deepest portions of your lungs.
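
For an idealized sphere the ratio can be written down directly: surface area (pi times diameter squared) divided by volume (pi times diameter cubed over six) equals 6 divided by the diameter. A short sketch under that spherical assumption (real soot particles are irregular, so treat the numbers as order-of-magnitude):

    # Surface-to-volume ratio of a sphere: (pi * d**2) / (pi * d**3 / 6) = 6/d.
    def surface_to_volume(diameter_um):
        """Ratio, in units of 1 per micrometer, for a sphere of diameter d."""
        return 6.0 / diameter_um

    for d in (10.0, 2.5, 0.1, 0.01):  # coarse, fine, and two ultrafine sizes
        print(f"d = {d:5.2f} um -> surface/volume = {surface_to_volume(d):8.1f} per um")
    # A 0.01-um ultrafine has 1,000 times the surface area per unit volume
    # of a 10-um coarse particle -- far more room for toxicants to hitch a ride.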

Of course your lungs are in the business of transferring oxygen from the air directly into your blood stream. Unfortunately, they do the same thing with ultrafines. Healthy lungs retain about 50% of the ultrafines you breathe[26], and a substantial number of those are passed directly into your blood stream. While in your lungs, ultrafines cause an inflammation response by creating "free radicals" from oxygen, which then combine with your lung tissues in destructive ways.[13,4] Ultrafines in your blood stream provoke an immune response that can include coagulation (thickening) of the blood -- leading in some cases to heart attacks and strokes.

Current government regulations do not take into consideration the number of particles present in air -- only the total weight of the particles. So, for example, the government says it is OK for local air to contain 35 micrograms of PM2.5 in each cubic meter of air averaged over 24 hours, or 15 micrograms of PM2.5 in each cubic meter averaged over a year's time. A microgram is a millionth of a gram, and there are 28 grams in an ounce.

The assumption of this approach is that the total weight of particles is related in some consistent way to the number of particles in the air. Unfortunately, in the few cases where this assumption has been tested, it has been found to be false.[15,24,25,26] The total number of particles in the air can vary independently of the total weight of particles in the air. This means that, if you want to know the level of danger from fine particles in the air, you need to count particles, not weigh them.
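
The divergence between weight and number follows from geometry: a particle's mass scales with the cube of its diameter, so one fine particle can outweigh a million ultrafines. A rough illustration, assuming spherical particles of equal density (the density value is an assumed, soot-like number and cancels out of the comparison):

    import math

    DENSITY_G_PER_UM3 = 1.5e-12  # assumed soot-like density; it cancels out

    def particle_mass_g(diameter_um):
        """Mass of a spherical particle: density * (pi / 6) * d**3."""
        return DENSITY_G_PER_UM3 * (math.pi / 6.0) * diameter_um ** 3

    ratio = particle_mass_g(2.5) / particle_mass_g(0.025)
    print(f"one 2.5-um particle outweighs {ratio:,.0f} 25-nanometer ultrafines")
    # -> 1,000,000. Air that meets a weight standard can therefore hide an
    # enormous -- and enormously variable -- number of ultrafine particles.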

Worse, regulating the weight of particles instead of the number of particles may actually make air pollution more dangerous. Reducing the number of large particles (which is the easiest way to reduce the weight of pollution, to comply with the law) can lead to an increase in the release of ultrafine particles because larger particles serve as a "magnet" for the smallest particles. As the larger particles are removed from air emissions, ultrafine particles have fewer "magnets" to hook onto and thus can enter the ambient air in greater numbers, increasing the danger to human health.

A writer in Science magazine observed in 2005, "Controlling only mass [of airborne particles], as EPA does now, might actually be counterproductive. For example, if larger PM2.5 particle levels go down but levels of ultrafines do not, 'that could make things worse,' [Mark W.] Frampton says. That's because ultrafines tend to glom onto larger PM2.5 particles, so they don't stay in the air as long when more larger particles are around."

So the most dangerous forms of air pollution -- ultrafines -- which are killing tens of thousands of people each year, are not regulated and are not even being measured.

Worst of all, the "nanotechnology" industry is now ramping up manufacturing facilities to intentionally make ultrafines in ton quantities. Until recently, ultrafines have been created as an unwanted byproduct of combustion. But now ultrafines are being purposefully manufactured for use in tires, fuel cells, electronics, personal care products like sun screens, and many other products.

This will certainly create new occupational hazards -- and will certainly lead to release of ultrafines into the general environment, through products discarded and through spills, leaks and other glitches.

Do we know everything we need to know about the hazards of ultrafines? We do not. Do we know enough to act? We certainly do. As Jocelyn Kaiser summarized it in Science magazine in 2005, "Environmental and health groups, as well as many scientists, say that, as with tobacco smoke and lung cancer, policymakers can't wait for all the scientific answers before taking action to prevent deaths from dirty air."[8] Tens of thousands of lives stand to be saved each year by aggressive action to curb ultrafines. Prevention is possible -- and that's good news.

With 20 years of data to rely on, it is long past time for ultrafines to be strictly controlled -- and controlled by number, not merely by weight.

Citizens concerned about new power plants, new diesel engines, or new incinerators have a right to insist on detailed information about ultrafine emissions. Will new technologies make things worse by emitting larger numbers of deadly ultrafines, even as they reduce the total weight of emissions? Given the available data, it's a fair question.

Additional reading

Brauer, Michael, and others. "Air Pollution and Retained Particles in the Lung." Environmental Health Perspectives Vol. 109, No. 10 (October 2001), pgs. 1039-1043. https://tinyurl.com/2uzp2d

Hunt, Andrew. "Toxicologic and Epidemiologic Clues from the Characterization of the 1952 London Smog Fine Particulate Matter in Archival Autopsy Lung Tissues." Environmental Health Perspectives Vol. 111, No. 9 (July 2003), pgs. 1209-1214. https://tinyurl.com/3yuf27

Nemmar, Abderrahim, and others. "Ultrafine Particles Affect Experimental Thrombosis in an In Vivo Hamster Model." American Journal of Respiratory and Critical Care Medicine Vol. 166 (2002), pgs. 998-1004. https://tinyurl.com/374q2g

Renwick, L.C., and others. "Impairment of Alveolar Macrophage Phagocytosis by Ultrafine Particles." Toxicology and Applied Pharmacology Vol. 172 (2001), pgs. 119-127. https://tinyurl.com/2knz9z

Seaton, A., and M. Dennekamp. "Hypothesis: Ill health associated with low concentrations of nitrogen dioxide -- an effect of ultrafine particles?" Thorax Vol. 58 (2003), pgs. 1012-1015. https://tinyurl.com/35muq4

From: Washington Post (pg. A2), Jul. 8, 2007

RESEARCH LINKS LEAD EXPOSURE, CRIMINAL ACTIVITY

By Shankar Vedantam, Washington Post Staff Writer

Rudy Giuliani never misses an opportunity to remind people about his track record in fighting crime as mayor of New York City from 1994 to 2001.

"I began with the city that was the crime capital of America," Giuliani, now a candidate for president, recently told Fox's Chris Wallace. "When I left, it was the safest large city in America. I reduced homicides by 67 percent. I reduced overall crime by 57 percent."

Although crime did fall dramatically in New York during Giuliani's tenure, a broad range of scientific research has emerged in recent years to show that the mayor deserves only a fraction of the credit that he claims. The most compelling information has come from an economist in Fairfax who has argued in a series of little-noticed papers that the "New York miracle" was caused by local and federal efforts decades earlier to reduce lead poisoning.

The theory offered by the economist, Rick Nevin, is that lead poisoning accounts for much of the variation in violent crime in the United States. It offers a unifying new neurochemical theory for fluctuations in the crime rate, and it is based on studies linking children's exposure to lead with violent behavior later in their lives.

What makes Nevin's work persuasive is that he has shown an identical, decades-long association between lead poisoning and crime rates in nine countries.

"It is stunning how strong the association is," Nevin said in an interview. "Sixty-five to ninety percent or more of the substantial variation in violent crime in all these countries was explained by lead."

Through much of the 20th century, lead in U.S. paint and gasoline fumes poisoned toddlers as they put contaminated hands in their mouths. The consequences on crime, Nevin found, occurred when poisoning victims became adolescents. Nevin does not say that lead is the only factor behind crime, but he says it is the biggest factor.

Giuliani's presidential campaign declined to address Nevin's contention that the mayor merely was at the right place at the right time. But William Bratton, who served as Giuliani's police commissioner and who initiated many of the policing techniques credited with reducing the crime rate, dismissed Nevin's theory as absurd. Bratton and Giuliani instituted harsh measures against quality-of-life offenses, based on the "broken windows" theory of addressing minor offenses to head off more serious crimes.

Many other theories have emerged to try to explain the crime decline. In the 2005 book "Freakonomics," Steven D. Levitt and Stephen J. Dubner said the legalization of abortion in 1973 had eliminated "unwanted babies" who would have become violent criminals. Other experts credited lengthy prison terms for violent offenders, or demographic changes, socioeconomic factors, and the fall of drug epidemics. New theories have emerged as crime rates have inched up in recent years.

Most of the theories have been long on intuition and short on evidence. Nevin says his data not only explain the decline in crime in the 1990s, but the rise in crime in the 1980s and other fluctuations going back a century. His data from multiple countries, which have different abortion rates, police strategies, demographics and economic conditions, indicate that lead is the only explanation that can account for international trends.

Because the countries phased out lead at different points, they provide a rigorous test: In each instance, the violent crime rate tracks lead poisoning levels two decades earlier.

"It is startling how much mileage has been given to the theory that abortion in the early 1970s was responsible for the decline in crime" in the 1990s, Nevin said. "But they legalized abortion in Britain, and the violent crime in Britain soared in the 1990s. The difference is our gasoline lead levels peaked in the early '70s and started falling in the late '70s, and fell very sharply through the early 1980s and was virtually eliminated by 1986 or '87.

"In Britain and most of Europe, they did not have meaningful constraints [on leaded gasoline] until the mid-1980s and even early 1990s," he said. "This is the reason you are seeing the crime rate soar in Mexico and Latin America, but [it] has fallen in the United States."

Lead levels plummeted in New York in the early 1970s, driven by federal policies to eliminate lead from gasoline and local policies to reduce lead emissions from municipal incinerators. Between 1970 and 1974, the number of New York children heavily poisoned by lead fell by more than 80 percent, according to data from the New York City Department of Health.

Lead levels in New York have continued to fall. One analysis in the late 1990s found that children in New York had lower lead exposure than children in many other big U.S. cities, possibly because of a 1960 policy to replace old windows. That policy, meant to reduce deaths from falls, had an unforeseen benefit -- old windows are a source of lead poisoning, said Dave Jacobs of the National Center for Healthy Housing, an advocacy group that is publicizing Nevin's work. Nevin's research was not funded by the group.

The later drop in violent crime was dramatic. In 1990, 31 New Yorkers out of every 100,000 were murdered. In 2004, the rate was 7 per 100,000 -- lower than in most big cities. The lead theory also may explain why crime fell broadly across the United States in the 1990s, not just in New York.

The centerpiece of Nevin's research is an analysis of crime rates and lead poisoning levels across a century. The United States has had two spikes of lead poisoning: one at the turn of the 20th century, linked to lead in household paint, and one after World War II, when the use of leaded gasoline increased sharply. Both times, the violent crime rate went up and down in concert, with the violent crime peaks coming two decades after the lead poisoning peaks.

Other evidence has accumulated in recent years that lead is a neurotoxin that causes impulsivity and aggression, but these studies have also drawn little attention. In 2001, sociologist Paul B. Stretesky and criminologist Michael Lynch showed that U.S. counties with high lead levels had four times the murder rate of counties with low lead levels, after controlling for multiple environmental and socioeconomic factors.

In 2002, Herbert Needleman, a psychiatrist at the University of Pittsburgh, compared lead levels of 194 adolescents arrested in Pittsburgh with lead levels of 146 high school adolescents: The arrested youths had lead levels that were four times higher.

"Impulsivity means you ignore the consequences of what you do," said Needleman, one of the country's foremost experts on lead poisoning, explaining why Nevin's theory is plausible. Lead decreases the ability to tell yourself, "If I do this, I will go to jail."

Nevin's work has been published mainly in the peer-reviewed journal Environmental Research. Within the field of neurotoxicology, Nevin's findings are unsurprising, said Ellen Silbergeld, professor of environmental health sciences at Johns Hopkins University and the editor of Environmental Research.

"There is a strong literature on lead and sociopathic behavior among adolescents and young adults with a previous history of lead exposure," she said.

Two new studies by criminologists Richard Rosenfeld and Steven F. Messner have looked at Giuliani's policing policies. They found that the mayor's zero-tolerance approach to crime was responsible for 10 percent, maybe 20 percent, at most, of the decline in violent crime in New York City.

Nevin acknowledges that crime rates are rising in some parts of the United States after years of decline, but he points out that crime is falling in other places and is still low overall by historical measures. Also, the biggest reductions in lead poisoning took place by the mid-1980s, which may explain why reductions in crime might have tapered off by 2005. Lastly, he argues that older, recidivist offenders -- who were exposed to lead as toddlers three or four decades ago -- are increasingly accounting for much of the violent crime.

Nevin's finding may even account for phenomena he did not set out to address. His theory addresses why rates of violent crime among black adolescents from inner-city neighborhoods have declined faster than the overall crime rate -- lead amelioration programs had the biggest impact on the urban poor. Children in inner-city neighborhoods were the ones most likely to be poisoned by lead, because they were more likely to live in substandard housing that had lead paint and because public housing projects were often situated near highways.

Chicago's Robert Taylor Homes, for example, were built over the Dan Ryan Expressway, with 150,000 cars going by each day. Eighteen years after the project opened in 1962, one study found that its residents were 22 times more likely to be murderers than people living elsewhere in Chicago.

Nevin's finding implies a double tragedy for America's inner cities: Thousands of children in these neighborhoods were poisoned by lead in the first three quarters of the last century. Large numbers of them then became the targets, in the last quarter, of Giuliani-style law enforcement policies.

Copyright 2007 The Washington Post Company

From: Associated Press, Jul. 3, 2007

ASIA-PACIFIC COUNTRIES SEE EFFECTS OF CLIMATE CHANGE ON HEALTH

By Margie Mason

KUALA LUMPUR, Malaysia (AP) -- Rising temperatures are contributing to more landslides in Nepal, dengue fever cases in Indonesia and flooding in India, threatening to put an even greater strain on health systems across the Asia-Pacific region.

Health officials from more than a dozen countries, ranging from tiny Maldives to China, met Tuesday in Malaysia to outline health problems they are experiencing related to climate change. They discussed ways to work together to limit the impact in a region expected to be hit hard by flooding, drought, heat waves, and mosquito- and waterborne diseases.

The World Health Organization estimates climate change has already directly or indirectly killed more than 1 million people globally since 2000. More than half of those deaths have occurred in the Asia-Pacific, the world's most populous region. Those figures do not include deaths linked to urban air pollution, which kills about 800,000 worldwide each year, according to WHO.

"We're not going to have a magic bullet to fix climate change in the next 50 years. We need to motivate an awful lot of people to change their behavior in a lot of different ways," said Kristie Ebi of WHO's Global Environmental Change unit, a lead author of the health chapter in a report by the Intergovernmental Panel on Climate Change, a U.N. network of 2,000 scientists.

Ebi said health officials are about a decade behind other sectors, such as water and agriculture, in taking a look at what climate change could mean and how to deal with it. She said countries seeing the effects firsthand are now starting to realize that any problems with air, water or food will directly affect people's health. The poorest countries in Asia and Africa are expected to suffer the most.

Scientists have predicted droughts will lower crop yields and raise malnutrition in some areas, dust storms and wildfires will boost respiratory illnesses, and flooding from severe storms will increase deaths by drowning, injuries and diseases such as diarrhea. Rising temperatures could lead to the growth of more harmful algae that can sicken people who eat shellfish and reef fish. People living in low-lying coastal areas will also face more storms, flooding, and saltwater intrusion into fresh groundwater that is vital for drinking.

Many health systems in poor Asian countries are already overwhelmed with diseases like HIV/AIDS and tuberculosis, and officials have been under intense international pressure to combat bird flu outbreaks and prepare for a pandemic despite limited resources.

But tackling current pressing diseases, and investing more in public health systems overall, will help prepare countries for the future effects of global warming while saving money in the long run, said Dr. Shigeru Omi, head of the WHO's Western Pacific region.

"The economic impact will be seen eventually," he said, adding water scarcity could create a worst-case scenario that produces political instability. "I think it will pay off if we take action now."

Globalization, urbanization and the rapid development of many Asian countries are also fueling climate change that's already noticeable. Last month, China passed the United States to become the largest greenhouse gas emitter, according to the Netherlands Environmental Assessment Agency.

Singapore saw mean annual temperatures increase 2.7 degrees Fahrenheit between 1978 and 1998, while the number of dengue fever cases jumped 10-fold during the same period.

Malaria has recently reached Bhutan and new areas in Papua New Guinea for the first time. In the past, mosquitoes that spread the disease were unable to breed in the cooler climates there, but warmer temperatures have helped vector-borne diseases to flourish.

Melting of glaciers in the Himalayas has also created about 20 lakes in Nepal that are in danger of overflowing their banks, which could create a torrent of water and debris capable of wiping out villages and farms below.

Omi said governments can offer tax incentives to help motivate companies to become more environmentally friendly, while pushing for energy-efficient technologies and greener buildings. Promoting walking and bicycling, instead of driving, can also improve overall health while saving the environment.

The four-day workshop in Malaysia lays the groundwork for a ministerial-level meeting on the topic next month in Bangkok, Thailand.


NEW POLLUTANTS OVERTAKING 'DIRTY DOZEN' IN NORTH, SCIENTIST WARNS

The Arctic is being polluted by newer, hard-to-detect chemicals that are overtaking toxins that have been in the North for years, a Canadian researcher says.

Perfluorinated chemicals, which include compounds used to repel water and oil, and other toxins have escaped detection by environmental monitoring systems, said Scott Mabury, an environmental chemist at the University of Toronto.

"The amount of these particular pollutants, these fluorinated pollutants that are in the Arctic and certainly high up in the food chain, was a surprise because they now rival or exceed most of the other pollutants that have been around for decades," such as DDT and PCBs, Mabury told CBC News on Monday.

The newer chemicals, often found in stain-resistant carpeting, water-resistant clothing, electronics and industrial goods, have been linked to cancer and other problems affecting the immune system.

Toxins generally migrate to the North via the atmosphere or water currents. In 2004, governments restricted the use of organic pollutants known as the "dirty dozen" -- including DDT, PCBs (polychlorinated biphenyls) and dioxins -- under the Stockholm Convention on Persistent Organic Pollutants.

Mabury's research team has found perfluorinated chemicals and other modern pollutants in the Devon ice cap, as well as in the water, air and wildlife of the North, and he said those pollutants should also be restricted. They tend to stay in the environment for a long time and move up the food chain, he added.

He said researchers will have to find newer and better ways to detect toxins in the environment.

"There are a number of compounds used in industrial and consumer goods that haven't acquired much attention, some of which we think could be highly persistent and potentially of environmental importance." Birgit Braune, a research scientist with Environment Canada, agreed that newer pollutants now in the North need to be restricted under international agreements.

"I definitely feel that we should pursue this and it is being pursued even as we speak, so there is action being taken," she said.

Braune added there should also be tighter controls over chemicals in consumer and industrial goods before they go to market.

From: CPCHE, Jun. 15, 2007

FATHER'S DAY REPORT NOTES GREATER ENVIRONMENTAL RISKS TO BOYS

Report urges precaution and increased awareness

Ottawa and Toronto: In a report released for Father's Day, the Canadian Partnership for Children's Health and Environment urges greater awareness among parents, especially fathers, about environmental risks to boys.

"All children are at risk from exposure to environmental hazards, but boys appear to be at greater risk," said Dr. Lynn Marshall, with the Ontario College of Family Physicians.

The report summarizes the evidence about environmental risks to boys. "For health outcomes such as asthma, cancer, learning and behavioural problems and birth defects, the boys are faring worse than the girls," noted Loren Vanderlinden, with Toronto Public Health.

We know that the time of greatest vulnerability for children is in the womb. It appears that boys are even more vulnerable than girls during these critical developmental stages. Brain development in boys is of particular concern. "Four times more boys than girls are affected by autism and ADHD. Boys are also at increased risk for learning disabilities, Tourette's syndrome, cerebral palsy and dyslexia," noted Kathleen Cooper, with the Canadian Environmental Law Association.

The report summarizes what is known about environmental links to health outcomes in children, noting the many areas of uncertainty. Given the risks of lifelong impacts, it is better to be safe than sorry. Like CPCHE's other educational materials, the CPCHE Father's Day report seeks to raise public awareness. Fathers and all members of society can take action to reduce or prevent environmental or occupational exposures that can affect a fetus or child.

Kathleen Cooper, Senior Researcher, Canadian Environmental Law Association 705-324-1608

Loren Vanderlinden, Supervisor, Environmental Health Assessment & Policy, Environmental Protection Office, Toronto Public Health 416-338-8094

Dr. Lynn Marshall, co-chair, Environmental Health Committee, Ontario College of Family Physicians 905-845-3462

From: Environmental Science & Technology, Jun. 27, 2007

PBDES LINKED TO COMMON BIRTH DEFECT IN BOYS

By Kellyn S. Betts

Scientists have long suspected that children may be especially vulnerable to the endocrine-disrupting effects of polybrominated diphenyl ether (PBDE) flame retardants because the main route of exposure to the chemicals is through consumer products in the home. A new study by Katharina Maria Main of Rigshospitalet, part of the Copenhagen University Hospital, is the first to link elevated PBDE levels with a human birth defect.

The study, published online May 31 in Environmental Health Perspectives, associates cryptorchidism, a condition in which one or both testicles fail to descend into the scrotum, with higher concentrations of PBDEs in breast milk. The incidence of cryptorchidism is increasing rapidly in some countries, which suggests that environmental factors may be involved, according to the paper. Main and her colleagues found that PBDE concentrations in the breast milk of Danish and Finnish mothers of sons born with undescended testicles were significantly higher than those in the breast milk of mothers of sons with normal testicles.

Because testicular descent is strongly androgen-dependent, the researchers say that the new findings are in line with a 2005 study showing that PBDEs are antiandrogenic in mice. They also point out that testicular cancer is the most severe symptom of testicular dysgenesis syndrome, which also includes cryptorchidism. In 2006, Swedish researchers linked early-onset testicular cancer with higher levels of maternal PBDEs.

The new findings aren't clear-cut, because researchers saw no correlation between PBDE levels in the cord blood of infants in the study and the incidence of cryptorchidism. Why this is the case is not clear, the researchers write. They posit that the combined exposure to multiple environmental factors may be responsible for the link they observed between PBDE concentrations in mother's milk and cryptorchidism.

From: Globe and Mail (Toronto, Canada) (pg. A4), Apr. 11, 2007

THE MYSTERY OF THE MISSING BOYS

Chemical pollutants flagged in new study as possible factor in skewed sex ratio

By Martin Mittelstaedt, Environment Reporter

Where are all the missing boys?

It is a question posed by a new study that has found the proportion of boys born over the past three decades has unexpectedly dropped in both the United States and Japan. In all, more than a quarter of a million boys are missing, compared to what would have been expected had the sex ratio existing in 1970 remained unchanged.

The study also says the world's most skewed sex ratio is in Canada, in a native community surrounded by petrochemical plants in Sarnia, Ont., where the number of boys born has plunged since the mid-1990s at a rate never seen.

Although the researchers do not know why boys are taking a hit, they suspect contributing causes could include widespread exposure to hormone-mimicking pollutants by women during pregnancy and by men before they help conceive children.

"We hypothesize that the decline in sex ratio in industrial countries may be due, in part, to prenatal exposure to metalloestrogens and other endocrine disrupting chemicals," said the study, issued this week in Environmental Health Perspectives, a peer reviewed journal of the U.S. National Institute of Environmental Health Sciences.

These types of chemicals include some pesticides, dioxin and methylmercury, a pollutant from coal-fired power plants and many industrial sources that is commonly found in seafood.

The study also flagged a host of other possible factors, including rising obesity rates, older parental age, growing stress levels, and the increasing number of children being conceived using fertility aids. Other research has shown some associations between these factors and a drop in boy births.

The study was conducted by researchers in both the U.S. and Japan, and led by Devra Lee Davis, a prominent epidemiologist and director of the Center for Environmental Oncology at the University of Pittsburgh Cancer Institute.

In an interview, Dr. Davis said that although the cause of the decline isn't known, it could be linked to the increasing number of other male reproductive problems, such as falling sperm counts and rising testicular cancer rates.

She said that males during fetal development may be more sensitive to pollutants that mimic hormones, leading to increased fetal deaths and reproductive problems later for the surviving males.

The situation in Sarnia, where nearly twice as many girls as boys are being born on the Aamjiwnaang First Nation, is internationally significant, according to the study. "To our knowledge, this is a more significantly reduced sex ratio and greater rate of change than has been reported previously anywhere," it said.

The reserve is located in the heart of Sarnia's chemical valley, and the native community, along with researchers at the University of Rochester and the Occupational Health Clinics for Ontario Workers, are trying to find the cause of the unusual sex ratio.

Fewer boys than expected are being born in the non-aboriginal community downwind of the petrochemical plants in the area, but not to the same degree as on the reserve. The work force in Sarnia has not been studied, something that would shed light on whether pollutants are the cause.

Researchers in many countries have been reporting a drop in the ratio of boys to girls being born over the past few decades.

It is considered normal in a large population for the number of baby boys to slightly outnumber girls, by a proportion of about 105 males to 100 females. It is widely thought that more boy births are a way nature compensates for higher rates of male mortality.

But the ratio has not been static in industrialized countries, and researchers suspect that increasing numbers of male fetuses are being miscarried, a kind of sex-based culling in the womb.

In Japan, the sex ratio fluctuated with no trend from 1949 to 1970, but then declined steadily to 1999, the end of the study period there.

The decline in the number of boys in Japan equals 37 out of every 10,000 births.

In the U.S., the sex ratio also declined from 1970 to 2002. The drop in the number of boys equals 17 out of every 10,000 births.

The U.S. change was concentrated among whites. There was almost no change among blacks.

From: Center for Public Integrity, Apr. 26, 2007

WASTING AWAY: SUPERFUND'S TOXIC LEGACY

Massive undertaking to clean up hazardous waste sites has lost both momentum and funding

By Joaquin Sapien, with data analysis by Richard Mullins

WASHINGTON, April 26, 2007 -- Communities across America face a daunting threat from hazardous waste sites -- some near neighborhoods and schools -- 27 years after the federal government launched the landmark Superfund program to wipe out the problem, a Center for Public Integrity investigation has found.

Initiated in 1980, Superfund is desperately short of money to clean up abandoned waste sites, which has created a backlog of sites that continue to menace the environment and, quite often, the health of nearby residents.

Nearly half of the U.S. population lives within 10 miles of one of the 1,304 active and proposed Superfund sites listed by the Environmental Protection Agency, according to the Center's analysis of these sites and population data from the 2000 U.S. Census.
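
A calculation of that kind can be sketched in a few lines of Python. The sketch below is purely illustrative -- the Center has not published its method, and the inputs (lists of site coordinates and census-block centroids with populations) are hypothetical stand-ins:

    import math

    EARTH_RADIUS_MILES = 3958.8

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance between two latitude/longitude points."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
        return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

    def population_near_sites(census_blocks, sites, radius_miles=10.0):
        """Sum the population of blocks whose centroid lies within
        radius_miles of any site; inputs are (lat, lon, pop) and
        (lat, lon) tuples -- hypothetical stand-ins for the real data."""
        return sum(pop for lat, lon, pop in census_blocks
                   if any(haversine_miles(lat, lon, s_lat, s_lon) <= radius_miles
                          for s_lat, s_lon in sites))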

In its investigation, the Center reviewed data, obtained from the EPA through more than 100 Freedom of Information Act requests, and interviewed dozens of experts inside and outside the agency, which administers Superfund.

Among the findings:

Cleanup work was started at about 145 sites in the past six years, while the startup rate was nearly three times as high for the previous six years.

During the last six years, an average of 42 sites a year reached what the EPA calls "construction complete," compared with an average of 79 sites a year in the previous six years. Construction complete is reached when all the cleanup remedies have been installed at a site.

Lacking sufficient funding, EPA officials said they have had to delay needed work at some hazardous sites, use money left over from other cleanups -- which itself is dwindling -- and resort to cheap, less effective fixes.

While some companies say they have paid their fair share for cleanups, the amount of money Superfund is getting back from other companies in reimbursements for cleanups has steadily declined. The amount of money the agency recovered from those companies has fallen by half in the past six fiscal years, compared with the previous six years, 1995 through 2000. Recovered costs peaked in the fiscal years 1998 and 1999, at about $320 million each year. By fiscal 2004, collected cost recoveries had dropped well below the $100 million mark. In the last two fiscal years, 2005 and 2006, the EPA collected about $60 million each year.

The backlog of sites needing cleanup is growing while the money allocated to do the work is running out, according to former and current EPA officials familiar with Superfund.

Superfund officials keep details about the program secret, meeting behind closed doors to rank which sites are the most dangerous and in need of immediate attention. The ranking is "confidential" because the agency does not want polluters to know which sites are a priority and which ones aren't. Some EPA insiders say the secrecy is intended to avoid provoking the public into demanding a solution from Congress.

"Obviously all these problems stem from a lack of funding, and it is disturbing that EPA is keeping this a secret rather than going to Congress and trying to get more money," said Alex Fidis, an attorney who deals with Superfund issues for U.S. PIRG, a public-interest advocacy group.

Superfund sites are areas contaminated with hazardous material and left by corporate or government entities whose operations may have moved. They can be old landfills, abandoned mines or defunct military complexes.

In some cases, one company is responsible for the pollution at a site; others, like landfills, can have hundreds of "potentially responsible parties" (PRPs), making a coordinated cleanup effort difficult. A single site can take years and hundreds of millions of dollars to clean up.

By the EPA's accounting, Superfund has cleaned up only 319 sites to the point where they can be deleted from the list. Another 1,243 are active and an additional 61 are proposed, which brings the total number of sites ever involved in the Superfund program to 1,623.

Pollution continues

Sites where contamination has been blamed for deaths, cancers and the poisoning of children have existed for decades.

In Libby, Mont., where a plume of asbestos from a nearby vermiculite mine has enveloped the town, more than 200 people have died from asbestos-related diseases, according to EPA estimates. Cleanup at the site began in 2000.

In Smelterville, Idaho, where the nation's worst childhood lead-poisoning epidemic occurred, due, in part, to a 1973 fire at a nearby lead smelter, experts warn that some homes still may have high levels of lead without the owners' knowledge because the homes have not been sampled. Lead is a neurotoxin that is especially dangerous for young children, affecting their mental and physical growth.

Along the Hudson River in upstate New York, where more than a million pounds of polychlorinated biphenyls (PCBs) were dumped by General Electric Co., the New York Department of Health has linked high PCB blood levels to consumption of fish caught in the river. PCBs are considered a probable carcinogen by the EPA and the World Health Organization.

Contamination at these sites has been well documented for decades, and each of the sites has been on the National Priorities List -- a compilation of the toxic waste sites the EPA considers to be the most dangerous -- for at least five years. Cleanup has not been completed at any of them.

For the past 11 years, the EPA has convened a panel of representatives from each of its 10 regions twice a year to decide which sites deserve immediate financial attention. The rankings are based on the site's risk to the surrounding community, the environment and, to an extent, public concern, according to the agency.

But the meetings of the National Risk-Based Priority Panel are closed, and the list of sites that comes out of these sessions is an "enforcement confidential" document, meaning it is off-limits to the public.

Besides the head of the Superfund program and the members of the panel who rank the sites, no one knows which ones are on track to receive funding and get cleaned up, which are not, and why, the Center study found.

Susan Bodine, the top-ranking Superfund official, told the Center that the list is kept confidential to prevent polluters from taking advantage of the EPA's funding decisions; agency insiders, however, say the EPA wants to leave the public in the dark because the agency does not want citizens to turn to Congress for help.

In a statement to the Center, the EPA defended its record, stating that more than 1,000 of the current Superfund sites are "construction complete." The EPA defines this stage to mean that all physical cleanup systems are in place, all immediate threats have been eliminated and all long-term threats are under control.

Even after a site reaches the construction complete stage, full cleanup and deletion from the Superfund list can take years more and can involve years of EPA monitoring, reviews and evaluations.

The statement said, however, that "the term 'construction complete' does not indicate that all cleanup goals at a given site have been met. Some sites that achieve 'construction complete' status are determined to be safe for particular uses, others are not."

But the number of construction completions has also been declining: there have been half as many over the past six years, compared with 1995 through 2000. EPA data show exactly 40 construction completions for each of the past four fiscal years.

And, according to recent EPA data, at nearly 40 of these sites considered "construction complete," human exposure to dangerous substances or the migration of contaminated groundwater off site is not under control.

Love Canal legacy

The Superfund program was launched in 1980 in the wake of a national tragedy that unfolded at Love Canal, N.Y. Lois Gibbs, a housewife-turned-activist who would come to be known as the "Mother of Superfund," discovered that her family's and neighbors' sickness could be traced to toxic waste buried underneath her hometown decades earlier by Occidental Petroleum Co.

Initially, the program was funded by a tax on polluters, which fed the actual "Superfund," a pool of money used to pay for the cleanup of sites whose polluters were unknown or unable to do the work. But the tax law expired in 1995, under a Republican-controlled Congress, and the $3.8 billion that had accumulated in the fund at its peak ran dry in 2003.

The program is now funded with taxpayer dollars and money that the EPA manages to recover from polluters for work the agency has done at their sites.

But Superfund's budget has not kept up with inflation. In 1995, the program received $1.43 billion in appropriations; 12 years later, it received $1.25 billion. In inflation-adjusted dollars, funding has declined by 35 percent.
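
To see how the 35 percent figure follows from the two appropriations, here is a minimal back-of-the-envelope sketch in Python; the cumulative inflation factor of roughly 1.35 for 1995-2007 is an illustrative assumption, not a number from this article:

    # Check the "declined by 35 percent" claim from the two budget
    # figures. The ~1.35 CPI factor is an assumed, illustrative value.
    budget_1995 = 1.43e9   # 1995 appropriation, nominal dollars
    budget_2007 = 1.25e9   # 2007 appropriation, nominal dollars
    cpi_factor = 1.35      # assumed cumulative inflation, 1995 -> 2007

    real_1995 = budget_1995 * cpi_factor       # 1995 budget in 2007 dollars
    decline = 1 - budget_2007 / real_1995
    print(f"Real decline: {decline:.0%}")      # -> about 35%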

Elliott Laws, an environmental lawyer who was Bill Clinton's Superfund chief, sees that as a problem. "What you've got isn't buying as much as it once could," he said.

Financial constraints are so severe that much of the program's cleanup money is being spent on 10 to 12 large projects, according to the EPA. With less money, the EPA has also started looking at the cheapest remedies when it's paying the bill, critics say.

At an abandoned creosote factory in Pensacola, Fla., for example, plans are underway to place a giant tarp and layers of clay and soil over a nearly 600,000-cubic-yard mound of chemical waste -- a measure that many observers consider inadequate and inefficient, largely because the community's groundwater could become contaminated.

"I think funding is a very important part of what is happening at this site and all orphan sites," said Frances Dunham, an environmental activist with a grassroots nonprofit organization called Citizens Against Toxic Exposure. The EPA's public report on PRPs shows nearly 400 "orphan sites," meaning the agency hasn't found any viable parties it could force to pay for cleanup costs at those sites.

Over the past several years, funding constraints have forced sites ranked by the National Risk-Based Priority Panel to compete for money left over from cleanups completed in previous years. These "deobligated funds" make up a significant amount of the money used to clean up sites that are ready to receive funding.

According to Bill Murray, who has served on the EPA's risk panel for eight years, agency staffers have been told in recent years "to get into those cupboards and scrape together those crumbs" -- referring to the deobligated funds.

Now, even those crumbs are running out.

"This year is going to be a tough year in terms of harvesting more deobligations, and I expect next year will be, too," Murray said.

"It is like having four sick kids at a table, and you only have one aspirin," said Love Canal's Gibbs. "You can't decide which one to give it to even though they all need assistance, and, like a Superfund site, those illnesses are going to get worse and those medical costs are going to get higher the longer it takes you to address the problem."

Another panel member, John Frisco, said cleanups at numerous sites have been stretched out over longer periods of time because there isn't enough money to get them done quickly and still pay for other ongoing cleanups.

"Those kinds of budget evaluations are something you never would have heard of 10 years ago but are now quite common," Frisco said.

As a result of the funding shortages, the EPA's cleanup plans for some sites are being more closely scrutinized, and sometimes delayed on purpose, according to Bradley Campbell, former commissioner of the New Jersey Department of Environmental Protection.

"There were particular cases where I was shown internal EPA correspondence where EPA staff was directed to find faults in the cleanup plan, because funding wasn't there for a cleanup otherwise ready to begin," Campbell said.

According to Fidis, the U.S. PIRG attorney, that poses a major problem to the public. "When you have a situation where site cleanups are being postponed, delayed or not investigated in a timely manner due to financial constraints, then you leave a threat to surrounding residents," he said. "There could be an increased likelihood of groundwater contamination or potential for humans to come into contact with contaminated soil."

Superfund chief Bodine, who is assistant EPA administrator for the Office of Solid Waste and Emergency Response, acknowledged in an interview that deobligated funds have decreased in recent years.

She also told the Center that the EPA does not share the panel's list of prioritized sites with the public because the agency does not want polluters to know which sites it will be focusing on.

"Letting people know where we are planning to spend money is information we don't want responsible parties to have," Bodine said. "That is information they could use and take into account as they are negotiating settlements with us."

But some say that argument doesn't make sense. One reason these sites are ranked for Superfund attention in the first place is that the EPA has not identified a polluter capable of paying for the cleanup, say members of the panel.

"I think, in general, sites that go to the panel for ranking are submitted to the panel because there is not a viable alternative to [Superfund] funding," Frisco said. "And generally, that is because the site is truly abandoned."

Bodine said that the secrecy is necessary anyway, because a polluter with the ability to pay could be linked to a site after the EPA has already begun the cleanup.

That's unlikely for sites ranked by the panel, say those who have worked with the program. "In my experience, it's never happened," said former EPA deputy regional administrator Tom Voltaggio, who worked at the agency for more than 25 years.

Gibbs said she is troubled by this process. "The public thinks that these decisions are made based on data and threats to public health. They don't think people are sitting around a table trying to determine which site gets the scraps," she said.

One EPA official who is familiar with the panel said that some information about the panel's site rankings is available to the public -- buried on the EPA Web site -- where the agency annually reveals how many sites will receive "new construction funding" and how many will not. New construction funding goes to sites where cleanup is ready to begin as soon as money is secured.

Superfund shortfall

The EPA inspector general, the Government Accountability Office and Congress have all issued reports pointing out Superfund's funding shortfalls, and program experts have been recommending budget increases in light of the number of sites in the pipeline that will soon be ready to be funded.

But EPA officials have not requested more money. In fact, they have done the opposite.

As a staff member of the House Subcommittee on Water Resources and the Environment in 1999, seven years before her appointment to head Superfund, Bodine helped author a bill that called for a $300 million reduction in the program's budget. The bill did not make it to the House floor.

During her confirmation hearings for the EPA post, Bodine assured the Senate that she would be a "fierce advocate for Superfund funding."

A month after her confirmation in March 2006, Bodine supported a Bush administration call for a $7 million decrease, from $588.9 million to $581.5 million, in the EPA's remedial budget, which pays for site cleanups.

President Bush's latest EPA budget request for 2008 again sought to reduce Superfund's budget by $7 million.

In an interview with the Center, Bodine expressed confidence that the amount of money the program has been allocated is sufficient to get the job done.

Fewer Superfund sites listed

Overall, the number of Superfund sites listed per year has declined substantially in recent years, but that's not necessarily a good thing, depending on who is asked.

From 1995 to 2000, an average of 25 sites were added each year to the Superfund's National Priorities List. From 2001 to 2006, an average of 17 sites were added per year.

According to Superfund's Bodine, the numbers are shrinking because "the smaller sites are being addressed through the state voluntary cleanup programs so that there are fewer sites being brought forward to the EPA."

But Rena Steinzor, an environmental law professor at the University of Maryland, who co-wrote the 1986 amendments to the Superfund law as a congressional staffer, speculates that the EPA is trying to kill the program "by reducing the perception that it is needed."

Resources for the Future, a Washington-based environmental think tank, proposes this explanation: It's all about declining funding. In a 2001 book written for Congress on the subject, the group says EPA managers have been cautious about listing larger, more expensive toxic waste sites to avoid "breaking the bank.... Sites that need cleanup are not being addressed because of funding concerns." The book recommends a budget increase, which never came.

Local activists, meanwhile, continue to wait for help that they fear may never come. Gibbs said that many communities have developed a deep distrust of the program.

"They know if they get listed, it's a 10- or 20-year process to get a site cleaned up," said Gibbs, who no longer lives in Love Canal and now heads the Center for Health, Environment and Justice, a nonprofit organization that assists communities struggling with hazardous waste issues.

She said she thought that Superfund was the "perfect solution," but that the program is no longer what it used to be. "It doesn't represent the positive image for communities that it once did," she said.

From: Los Angeles Times (pg. F3), Oct. 2, 2006

'SAFE' LEVELS OF LEAD MAY NOT BE THAT SAFE AFTER ALL

By Melissa Healy, Times Staff Writer

Efforts to reduce lead exposure in the United States have been a good news-bad news affair -- and the bad-news side of the ledger just got a bit longer.

Although the removal of most lead from gasoline and paint in the United States has driven exposure levels down -- way down from levels seen 30 years ago -- new research sharply lowers the level of lead exposure that should be considered safe. And it expands the population of people who need to worry about the toxic chemical.

Concern about lead exposure has long focused on children, who can suffer mental impairment and later fertility problems at elevated levels. More recently, children with blood levels of lead long considered safe have been found more likely to suffer from attention deficit hyperactivity disorder.

Among adults, elevated levels of lead exposure have been found in recent years to raise the risk of high blood pressure and kidney disease. But now comes news that levels long considered safe for adults are linked to higher rates of death from stroke and heart attack. The latest study was published in the Sept. 26 issue of the American Heart Assn.'s journal, Circulation.

Researchers used a comprehensive national health survey of American adults to track 13,946 subjects for 12 years and looked at the relationship between blood lead levels and cause of death. They found that compared with adults with very low levels of lead in their blood, those with blood lead levels of 3.6 to 10 micrograms of lead per deciliter of blood were two and a half times more likely to die of a heart attack, 89% more likely to die of stroke and 55% more likely to die of cardiovascular disease. The higher the blood lead level, the greater the risk of death by stroke or heart attack.
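
The article's two phrasings -- "times more likely" and "percent more likely" -- both reduce to risk ratios relative to the lowest-exposure group. A minimal sketch of that conversion, with the ratios inferred from the figures above rather than taken from the underlying paper:

    # The study's comparisons expressed as risk ratios relative to
    # adults with very low blood lead. Ratios are inferred from the
    # article's phrasing, not taken from the journal article itself.
    risk_ratios = {
        "heart attack": 2.5,             # "two and a half times more likely"
        "stroke": 1.89,                  # "89% more likely"
        "cardiovascular disease": 1.55,  # "55% more likely"
    }
    for cause, ratio in risk_ratios.items():
        print(f"{cause}: {ratio - 1:.0%} higher risk of death")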

The dangers of lead held steady across all socioeconomic classes and ethnic and racial groups, and between men and women.

Study authors acknowledged that they were unsure how lead in the blood impaired cardiovascular functioning. But they surmised that it might be linked to an earlier finding: that lead exposure stresses the kidneys' ability to filter blood. Lead may also alter the delicate hormonal chemistry that keeps veins and arteries in good tone, the authors wrote.

Federal standards, however, don't reflect the new research. Although almost 4 in 10 Americans between 1999 and 2002 had blood lead levels in the newly identified danger range, the Occupational Safety and Health Administration, the federal agency that regulates toxic exposures in workplaces, considers up to 40 micrograms of lead per deciliter of blood to be safe for adults. And recommendations from the Centers for Disease Control and Prevention allow for up to 10 micrograms per deciliter for women of childbearing age.

Paul Muntner, an epidemiologist at Tulane University and one of the study's authors, says the findings suggest strongly that the federal government should revisit the limits of lead exposure it considers safe for adults. In total, about 120 occupations -- including roofing, shipbuilding, auto manufacturing and printing -- can bring workers in contact with high levels of lead.

For individuals, Muntner adds, the study underscores that every small bit of prevention is worth the trouble. Worried consumers can purchase lead-detection kits from hardware stores.

melissa.healy@latimes.com


THE RESPONSIBILITY GAP

By Steven G. Gilbert

From his Uncle Ben, Spider-Man learned that "With great power comes great responsibility." Humans now have incredible power to reshape the environment and affect human health, but we have yet to fully acknowledge the responsibility that this implies. One area in which we need to take more responsibility is the manufacture, use, and disposal of chemicals.

It is estimated that there are more than 80,000 chemicals in commerce, with 2,000 new chemicals added each year. Unfortunately, we know very little about the specific health effects of these chemicals because industry has not generated or made available the data. We do know, however, that children are more vulnerable to the effects of these chemicals and that the annual cost of childhood diseases attributable to environmental contaminants is in the range of $55 billion.[1]

Children and adults are exposed to a wide range of chemicals at home, at school, in the workplace and from the products we use. Exposure to some of these chemicals can cause significant adverse health effects such as cancer, Parkinson's disease, immunological disorders and neurobehavioral deficits, resulting in a needless loss of potential for both the individual and society.

A significant report on chemical policy, developed by Mike Wilson and others, both defined the problem and suggested a more rational approach.[2] Their report identified three gaps that contribute to the current failed chemical policy: a data gap, a safety gap and a technology gap. The data gap refers to the lack of health effects information on chemicals and the public's right to know that information. The safety gap results from the government's inability to prioritize hazardous chemicals or to obtain the information it needs. The technology gap reflects the failure of either industry or government to invest in the development of more sustainable chemical processes such as green chemistry. To these three gaps, I suggest adding a fourth: the responsibility gap.

Responsibility -- An Overview

Humans have amassed an enormous amount of power to change the physical environment and to affect human and environmental health. Aldo Leopold, America's first bioethicist, summarized our ethical responsibilities in a simple statement in 1949: "A thing is right when it tends to preserve the integrity, stability, and beauty of the biotic community. It is wrong when it tends otherwise."[3] When we expose our children to lead, mercury, or alcohol, we are robbing them of their integrity, stability, and beauty. In essence we are robbing them of their potential, reducing their ability to do well in school and to contribute to society.[4] We have the knowledge, and we must accept the responsibility, to preserve the biotic community, which in turn will preserve us and future generations. Key institutions in our society, as well as individuals, must address different aspects of a shared responsibility to ensure a sustainable biotic community.

Precautionary Principle and Responsibility

The precautionary principle is defined in the Wingspread Statement as: "When an activity raises threats of harm to human health or the environment, precautionary measures should be taken even if some cause and effect relationships are not fully established scientifically." It both acknowledges our power and implies responsibility.[5,6]

One of the central elements of the precautionary principle is that proponents of an activity or product must take responsibility for demonstrating its safety. This concept is already applied to the development of new drugs: the Food and Drug Administration requires pharmaceutical and biotech corporations to demonstrate both the efficacy and the safety of their products before they are approved for public use. This precautionary approach was adopted after several high-profile drug disasters, such as thalidomide. The same responsibility could be required of chemical manufacturers, which would result in data-driven decisions on health and would drive a shift toward sustainable and safer chemicals.

Corporate responsibility

Under current corporate rules and regulations the primary responsibility of a corporation is to make money for its shareholders. Corporate management's primary responsibility is to increase the value of the corporation for its shareholders, which is accomplished by increasing revenue or product sales and by reducing or externalizing costs.

In 1994 an array of suited white male tobacco executives stood before the U.S. Congress Subcommittee on Health and the Environment and swore that nicotine was not addictive. This was clearly false, but they were protecting the interest of their corporations and shareholders to profit at the expense of people's health. The health effects of tobacco are borne by the individual and collectively through taxes and health care costs.

The tobacco companies have a long history of externalizing the health costs of their product onto taxpayers while reaping profits for executives and shareholders. Other corporations have also externalized, or simply never accounted for, the costs of dumping chemicals into the air, water or land, which results in disease and environmental damage. For example, the Asarco smelter in Tacoma, Washington, spewed lead and arsenic across a wide area. Devra Davis brilliantly documented how industry poisoned the air and environment, sickening the people of Donora, Pennsylvania. While the U.S. has tightened pollution laws, Doe Run Peru, an affiliate of the St. Louis-based Doe Run Resources Corp., continues the practice of externalizing costs by spewing lead from its smelters, sickening children and depriving them of their innate abilities. Our own government, through the Departments of Defense and Energy, has created some of the most contaminated sites in the world, such as Hanford, Washington.

Corporations contaminate the environment because it is cost-effective and because our laws shield executives from personal responsibility. In other words, they operate this way because they can make larger profits by not investing in pollution control or adopting sustainable practices, and because they can get away with it. Of course not all corporations operate irresponsibly, but enough do, which creates problems for everyone. A new form of capitalism is needed that motivates corporate responsibility to the biotic community and the greater social good. Peter Barnes explores some of these ideas in his recent book Capitalism 3.0: A Guide to Reclaiming the Commons.[7] The thrust of the book is that we should create public trusts that are responsible for, and account for the value of, the common wealth held in the land, air, and water. Capitalism must change to account for its use of this wealth.

Government responsibility

The primary responsibility of the government is to protect and preserve the common wealth for the greater good of the people.

Government has a duty and responsibility to ensure the "integrity, stability, and beauty of the biotic community". In essence government must ensure that future generations have an environment in which they can reach and maintain their full genetic potential. The U.S. Government has made various attempts to control chemicals while the governments of many developing countries such as China are just beginning to consider the problems of uncontrolled corporate exploitation of the environment and people.

One failed effort by the U.S. Congress was the passage of the Toxic Substances Control Act of 1976 (TSCA). This law was meant to empower the Environmental Protection Agency (EPA) to control the introduction of new chemicals into the environment. Unfortunately, corporations are not required to generate or make available health effects data (thus the data gap), which impedes the government and the public from making informed decisions on the safety of products (thus the safety gap). Our representatives in government must take seriously their responsibility to protect the common wealth for the greater good of all. A first step would be to fix TSCA by requiring greater chemical testing and disclosure of the resulting information. Our legislators can take responsibility by supporting the Kids Safe Chemicals Act.[8]

Media Responsibility

The primary responsibility of the media is to create an informed and engaged public, not just to inform the already educated public. The media has an obligation to produce socially responsible material that is fair, objective, and balanced. This does not mean giving equal time to clearly marginal views, as was the case with global warming. Most importantly, the media has a responsibility to be open and transparent about sources of information and to acknowledge any potential conflicts of interest. The burden and obligation of the media to be responsible must also be shared with listeners, viewers, and readers. The media has great power to inform and influence people, and with that comes a grave responsibility.

Academic Responsibility

The academic community, particularly those engaged in issues related to public health, has a responsibility to be thoughtful public health advocates and to share its knowledge beyond narrow academic journals and conferences. Being a scientist includes the obligation to seek the truth and question the facts; it also includes an obligation and responsibility to speak out on public health issues. Scientists and educators have tremendous amounts of knowledge that can be shared with K-12 students, the media, legislators, and the general public, helping to create an informed public.

Individual responsibility

Individuals bear the greatest burden of responsibility because we must shoulder not only the professional responsibilities described above but also the responsibilities of our personal lives. We must confront, individually and collectively, the fact that we have the power and the means to reshape or even destroy the world. Individually it may seem as if we have little control over global warming, nuclear weapons, or the food imported from other countries. But we have a responsibility to consider how our individual actions combine to collectively shape the world and society around us. This extends from whom we elect to office, to what we buy in the store, to the temperature in our homes, and to the pesticides on our farms and lawns. We also have a responsibility to stay informed and to demand that the media inform us. Democracy is a participatory sport, and we must be well informed to participate. Our corporations run on, and will respond to, what we purchase. Our government and corporations will respond to our opinions and demands for a fair, just, and sustainable society. We must translate responsibility into action to create a just and sustainable world.

References

[1] P.J. Landrigan, C.B. Schechter, J.M. Lipton and others, Environmental Health Perspectives Vol. 110, No. 7 (2002), pg. 721 and following pages.

[2] Michael P. Wilson and others. Green Chemistry in California: A Framework for Leadership in Chemicals Policy and Innovation. Berkeley, Calif.: California Policy Research Center, University of California, 2006.

[3] Aldo Leopold, A Sand County Almanac, 1949.

[4] Steven G. Gilbert, "Ethical, Legal, and Social Issues: Our Children's Future," Neurotoxicology Vol. 26 (2005), pgs. 521-530.

[5] Peter Montague, The Precautionary Principle In A Nutshell. New Brunswick, N.J.: Environmental Research Foundation, 2005.

[6] Steven G. Gilbert, "Public Health and the Precautionary Principle," Northwest Public Health (Spring/Summer, 2005), pg. 4.

[7] Peter Barnes. Capitalism 3.0 -- A Guide to Reclaiming the Commons. San Francisco: Berrett-Koehler Publishers, 2006, pg. 195.

[8] Kids Safe Chemicals Act, Senate Bill 1391, 109th Congress, introduced by Senator Frank Lautenberg. Reportedly, the bill is presently undergoing significant revisions with input from a broad range of stakeholders.

Steven G. Gilbert, Ph.D., DABT, directs the Institute of Neurotoxicology & Neurological Disorders (INND), 8232 14th Ave NE, Seattle, WA 98115; phone: 206.527.0926; fax: 206.525.5102; E-mail: sgilbert@innd.org.

Web: www.asmalldoseof.org ("A Small Dose of Toxicology") Web: www.toxipedia.org -- connecting science and people

From: PR Watch, Jun. 19, 2007

BLESSED UNREST: JOHN STAUBER INTERVIEWS PAUL HAWKEN

By John Stauber

My first introduction to author Paul Hawken's work was his 1994 book The Ecology of Commerce. It is essential reading for anyone grappling with issues surrounding capitalism, social justice and ecological sustainability. Hawken is, among his plethora of accomplishments, a highly successful businessman, but The Ecology of Commerce pulled few punches in its criticism of even those companies truly trying to set and reach a higher standard of business social responsibility.

I met Paul in person the first time in early 1999 over dinner in his hometown of Berkeley, California, some months before the now-legendary "battle of Seattle" protest, a world-changing event that catalyzed his thinking and eventually led to his newest book, Blessed Unrest, subtitled, "How the Largest Movement in the World Came into Being and Why No One Saw It Coming."

The November 30, 1999, anti-corporate protests at the Seattle meeting of the World Trade Organization (WTO) became a global media moment, a highly visible but inaccurately reported coming out party for the movement Hawken documents and touts in Blessed Unrest.

"More than seven hundred groups, and between forty thousand and sixty thousand individuals, took part in protests against WTO's Third Ministerial in Seattle, constituting one of the most disruptive demonstrations in modern history and, at that time, the most prominent expression of a global citizens' movement resisting what protesters saw as a corporate-driven trade agreement. The demonstrators and activists who took part were not against trade per se.... Their frustration arose because one side held most of the cards; that side comprised heads of corporations, trade associations, government ministries, most media, stockholders and WTO. From the point of view of those on the streets, WTO was trying to put the finishing touches on a financial autobahn that would transfer income to a small proportion of the population in wealthy nations under the guise of trade liberalization."

Hawken penned an insightful and widely read first-hand account of the Seattle protest, an event typically depicted by a surprised and shocked mainstream media as a violent anarchist uprising right out of a bad Hollywood movie. "Most accounts of the Seattle demonstrations refer to them as 'riots,' even though they were 99.9 percent nonviolent," Hawken writes in Blessed Unrest. He relates how a Newsweek reporter in Seattle asked him who the leaders were of this uprising. He named names -- Vandana Shiva, Jerry Mander, Lori Wallach, Maude Barlow -- but was interrupted by the reporter, who said, "Stop, stop, I can't use these names in my article because Americans have never heard of them." Instead of the many real leaders of the Seattle protests, Newsweek placed on its cover Ted Kaczynski, better known as the Unabomber. Kaczynski of course had absolutely nothing to do with Seattle's protests, but he epitomized the defamatory image through which the corporate media chose to portray the event and the movement behind it.

The WTO protests sparked Paul Hawken's investigation to better understand the global movement he is part of, its size and depth, its leadership, its goals, and its potential for birthing fundamental political, social, economic and environmental changes to remedy the intertwined crises of social injustice and ecological collapse. His best estimate is that there exist many more than one million organizations worldwide in this movement that "has no name." Blessed Unrest is his "exploration of this movement -- its participants, its aims and its ideals." Through his own non-profit organization the Natural Capital Institute, he has launched an ambitious new website at www.WiserEarth.org to catalog the movement, give it visibility, and to better enable groups to find each other and work together online.

"Groups are intertwingling (sic) -- there are no words to exactly describe the complexity of this web of relationships. The Internet and other communications technologies have revolutionized what is possible for small groups to accomplish and are accordingly changing the loci of power." Hawken believes that this movement is the last, best hope for humankind, describing its promise as "a network of organizations that offer solutions to disentangle what appear to be insoluble dilemmas: poverty, global climate change, terrorism, ecological degradation, polarization of income, loss of culture, and many more.

... Even though the origins and purposes of the various groups comprising the movement are diverse, if you survey their principles, mission statements, or values, you find they do not conflict.... What its members do share is a basic set of fundamental understandings about the earth, how it functions, and the necessity of fairness and equity for all people dependent on the planet's life-giving systems."

Blessed Unrest is a 342-page book, but the last third is a long appendix from the www.WiserEarth.org website categorizing and describing the mind-boggling range of issues that the myriad environmental and social justice groups are addressing. That section is best accessed not in the book but online at www.WiserEarth.org.

The book is not a manifesto, and it does not attempt to define or prescribe strategies or tactics for the revolutionary structural changes necessary to solve the interwoven political, economic, and ecological crises defining the start of the 21st century. These immense problems often overwhelm one's sense of hope. Hawken has hope, and he has faith in his movement; he believes it will "prevail." He also believes that its success will be defined by "how rapidly it becomes a part of all other sectors of society. If it remains singular and isolated, it will fail. If it is absorbed and integrated into religion, education, business and government, there is a chance that humans can reverse the trends that beset the earth."

Blessed Unrest, www.WiserEarth.org and Hawken's related efforts are important contributions to furthering the movement. Perhaps he was wise not to examine too deeply the differences and divisions within the movement, or the real-world political challenges of how to reclaim democracy and build power at the grassroots, taking it away from the corporate elite -- the ultimate challenge. Congratulations to Paul Hawken for creating a place where the movement can better see itself, meet up and collaborate online. Whether the website he calls "Wiser" will succeed, and to what degree, will depend on how it benefits and is used by the movement. In any case, Hawken has taken his best shot and broken new ground in trying to help the movement forward.

I have not had a long face-to-face discussion with Paul since our 1999 dinner meeting, but I caught up with him in cyberspace for the interview below.

STAUBER: Twelve years ago when I was writing Toxic Sludge Is Good For You, I quoted your book The Ecology of Commerce. You criticized the fundamental structural problems of business and you wrote, "Rather than a management problem, we have a design problem that runs through all business."

It struck me reading Blessed Unrest that you do not address in it the issue of replacing the business corporation as the dominant engine of economic activity, with new forms of economic structure that can be more accountable to human needs, human rights and ecological sustainability.

Business corporations are structurally incapable of meeting human and ecological needs, and the largest corporations dominate global politics. Isn't the movement's meta-challenge to bring about democratic structural change in both the economic and political spheres, to make business and government accountable to people and the planet?

HAWKEN: Yes! Absolutely! There is a pervasive subtext to most of the issues NGOs focus on: the abrogation of rights, the damage to place, and the corruption of political process by business. I didn't address the idea of replacing the corporation because the book was about civil society. It is about the largest movement in the world. It is not about what I think the movement should do nor is it about what I think we should do about corporate charters. My record is pretty clear on this in previous writings. I was being more an anthropologist, trying to figure out where this movement came from and how it works.

STAUBER: Thomas Friedman uses and abuses the term "economic democracy" to describe the corporate globalization he promotes. I would like to rescue and revitalize "democracy" and make it meaningful. Do you think that "democracy" is a common theme in the movement for ecological and social justice, and how do you see the movement's relationship with democracy?

HAWKEN: My sense is that the word and concept of democracy has lost much of its meaning. We have these winner-takes-all slugfests in the US where there are truly no ethical or moral limits, and we have the audacity to call it democracy because there were voting machines. Our democracy is corrupt from the top down, and I think this movement is forming from the bottom up to correct the lack of process and governing principles that inform democratic movements. Although most of the media thinks this unnamed movement is about protest, my guess is that more than 98% of it is about solutions, and these are usually solutions to problems in regions or communities. To achieve this requires the creation of what I call handmade democracies, processes that are not win-lose, and it requires a quality of interaction, respect, and listening that is now lost in US politics.

STAUBER: On pages 64-65 of Blessed Unrest you write about the PR flack E. Bruce Harrison, whose attacks on Rachel Carson in the early 1960s on behalf of the chemical industry gave rise to greenwashing and the tactic of co-opting groups like Environmental Defense to partner with polluters. Given your examination of greenwashing, corporate PR, and front groups, I was surprised that nowhere in Blessed Unrest do you analyze the shortcomings and contradictions of these Big Green Groups that raise and spend tens or hundreds of millions of dollars annually, pay their executives six-figure salaries, partner with corporations, place corporate executives on their boards, and have no meaningful accountability to anyone except a small elite group of funders. You lump these groups in with the hundreds of thousands of smaller grassroots groups. Why did you not try to better differentiate groups that are under-funded, grassroots and voluntary from groups that are essentially large, sophisticated non-profit corporations that, while staffed by well-meaning people, often undermine and thwart fundamental change?

HAWKEN: I don't believe I am lumping. I am describing a movement that is more complex and diverse than any prior social movement. Its strength and resiliency derive from this complexity. I understand your concerns, but I was describing something, not evaluating groups in order to announce to the world which ones I think are good and which are bad. I made it clear that some of the organizations that arose from what I attribute to George Perkins Marsh's influence are wealthy and very establishment oriented, and those that are Emersonian/Thoreauvian in origin tend to be smaller and under-resourced.

We tend to be uncomfortable with contradictions; we want the world to be the way we want it to be, and we are not happy when it veers. But the world is never the way one person wants it to be. That would be hell. That is why I included Barry Lopez's quote in the beginning: "One must live in the middle of contradiction, because if all contradiction were eliminated at once life would collapse. There are simply no answers to some of the great pressing questions. You continue to live them out, making your life a worthy expression of leaning into the light." I do not like where we are, but I believe in my heart that we will have to figure this out together, and that we will not get out of the situation we are in by throwing each other overboard.

STAUBER: You often make the point that the movement is "leaderless" in terms of a King or Gandhi type leader. Yet obviously some form of leadership and accountability exists, and better leadership and coordination are necessary if the movement is going to prevail, as you believe it will. Could you elaborate on this issue of leadership in the movement, and on how more leadership might emerge that can begin to better coordinate, synergize, and cross-pollinate within the movement, and make the parts more aware of the whole?

HAWKEN: My point is that it does not have a charismatic leader in the traditional sense, that there is no one person or group of persons who speak for it, and thus it can be overlooked by the media that thrives on that kind of centrality. I believe there are thousands and thousands of leaders, stunning in their qualities, courage, and faithfulness to principle. I agree with the sense of your question, that it is time to link and connect up in more powerful ways. The movement is atomized because that is how it came into being. It now has the communication and technological tools to work far more closely and effectively.

STAUBER: I sensed in our dinner conversation in 1999 that this issue of envisioning structural change and how to bring it about might be the focus of a manifesto you were considering. Is a manifesto still in the works?

HAWKEN: I remember well speaking with you about writing down a list of principles that reflected a universal set of values coupled with clear actions that needed to be taken in order to bring about justice, eliminate poverty, and prevent ecological collapse. I think about it all the time. Sometimes when I read Murdock's 63 Universals and other anthropological literature I wonder if there is not a set of such values that inform socio-political-ecologic-economic matters.

STAUBER: Our primary mission here at CMD is exposing propaganda and revealing how the powerful use propaganda and the media to manage public perceptions, opinions and policies. It struck me how the pollution of our information environment and the nefarious role of big media corporations in serving the powerful and preventing fundamental change are not addressed in the book. Likewise, when I looked at the categories for WiserEarth, I see little regarding the pollution of our information environment and the issues of propaganda that we address here at CMD.

HAWKEN: www.WiserEarth.org is an online relational database that can be edited by its users. It is not finished but a work in progress created by its community. We are receiving excellent suggestions and there are lively discussions on how to expand the taxonomy. Your contribution here, given your experience and background, would be so welcome and hugely constructive.

From: Caribbean Net News, Jun. 14, 2007

MAIZE OF DECEPTION

How corn-based ethanol can lead to environmental disaster

By Eliana Monteforte, COHA Research Associate

As the Bush administration continues to push its alternative fuels agenda, it has become increasingly evident that corn-based ethanol could prove as much a global villain as a boon to society. Instead of improving the environment and moderating oil prices, corn-based ethanol could result in mass deforestation, strained land and water resources, increased food prices, augmented poverty and swarms of farmers uprooted from the land. While the negative effects of corn-based biofuels are obvious, Washington continues to emphasize their importance while increasing the size and number of subventions to the ethanol industry. This is being done despite the adverse ramifications that ethanol cultivation is already having at the sites where it is produced, with the situation likely to deteriorate further in the near future.

The Emergence of Ethanol

Ethanol is a substance created by the fermentation of simple sugars. In the United States, corn is the main source for ethanol production, while other countries like Brazil rely on sugar cane as well as other plants and byproducts to make alternative energy sources. Typically, ethanol is mixed into gasoline to create "gasohol," which has a higher octane rating and improved combustion and is viewed as more environmentally friendly. Currently, around 30% of gasoline in the United States contains some ethanol, and US initiatives point toward much larger concentrations in coming years.

Before corn-based ethanol became prominent in US industries, lead was added to gasoline as a performance enhancer. It was not until the 1970s and 1980s that corn-based ethanol began to replace lead -- a very toxic substance -- largely in response to the oil embargo that the Organization of the Petroleum Exporting Countries (OPEC) imposed in 1973. Amid the clamor of American voices calling out for energy independence, President Jimmy Carter gave his memorable speech on April 18, 1977, ushering in a new era in US economic history. From that point on, the US would try to meet its high energy demand from its own domestic resources. To Carter, this decision was the "moral equivalent of war" between the US and OPEC. Thirty years later, it seems that America is losing its own self-designated "war" and is likely to continue suffering unnecessary losses in this conflict unless it pursues a fundamental change in its economic policy.

An Economic Giant

Over the past few years, a combination of increasing oil prices and generous government subsidies has resulted in the continued expansion of the US ethanol industry. According to the Council on Foreign Relations, as of 2006, 110 ethanol refineries had been built in the US, with 73 more under construction. It is estimated that by the end of 2008, ethanol production will have reached 11.4 billion gallons a year. In his 2007 State of the Union address, President George W. Bush set a goal of producing over 35 billion gallons of ethanol fuel by the year 2017. He added that the US also plans to cut petroleum consumption by 20% over a ten-year span.

The tumultuous ethanol industry receives Midas-like support in the form of direct government subventions, which totaled about $8.9 billion in 2005. These include tax breaks, grants, and government loans intended to encourage production and keep ethanol economically competitive with conventional gasoline. The federal government, for example, has already established a tax credit of 51 cents for every gallon the industry produces. Although such support carries severe consequences, if it continues at this high level it is quite possible that Bush's goals could be fulfilled within the stipulated time period.
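
The scale of that credit is easy to gauge from the production figures cited above. A minimal sketch, using only numbers given in this article:

    # Rough annual value of the 51-cent-per-gallon federal tax credit
    # at the projected 2008 output of 11.4 billion gallons.
    credit_per_gallon = 0.51    # dollars per gallon produced
    gallons_per_year = 11.4e9   # projected output by end of 2008

    annual_credit = credit_per_gallon * gallons_per_year
    print(f"Credit alone: ${annual_credit / 1e9:.1f} billion per year")
    # -> about $5.8 billion, comparable in scale to the $8.9 billion
    #    in total subventions reported for 2005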

Feeding Cars and Starving the Poor

On March 29, 2007, Cuban leader Fidel Castro berated Bush's economic initiatives for ethanol production in the Cuban Communist party newspaper Granma, stating that using corn, or any food source, to produce ethanol could result in the "premature death" of upwards of three billion people. He explained that the drive to produce corn-based ethanol would drive up food prices around the world, worsening poverty in developing countries. Castro then restated his views in a second article, also published in Granma, on April 3. Although the ailing Cuban president is known for adamantly and automatically opposing US foreign policy initiatives, it would be foolhardy for the US to ignore his foreboding message on this subject.

As a result of the Washington-backed initiatives, an enormous volume of corn is being consumed for ethanol production. Consequently, the decreasing availability of corn as a food crop and as livestock feed has contributed to the rise of corn futures from $2.80 to $4.38 a bushel -- an increase of roughly 56 percent. This price hike occurred over the course of several months and is said to be the sharpest increase in the past ten years. Thus, fewer low-income consumers are able to purchase corn-based products, a serious detriment in countries where corn is a staple of the population's diet.

Mexico already has been significantly affected by the rising cost of corn. Because Mexico's 107 million people rely on corn as their main source of sustenance, the soaring price has sent shockwaves throughout the country's corn-related industries. The price of tortillas in Mexico has risen by 100%, resulting in mass protests by tens of thousands of enraged consumers last January. Recently inaugurated Mexican President Felipe Calderon stated that the price increase of corn is unjustifiable and "threatens the economy and millions of families." In response to the protests, Calderon signed an accord that limited the price of tortillas to 8.50 pesos per kilogram and increased the quota of duty-free corn products imported from the United States. Despite Calderon's efforts to regulate corn prices, the situation remains unresolved, since the accord expired in May.

The rapidly changing international corn market also has affected the prices of other produce. Because of the high demand for corn, farmers in the US are now planting more acres of the commodity. This has decreased the production of other crops, such as wheat, soy and rice, making them more expensive and less available. Beer prices also have risen as barley acreage gives way to corn. Even the prices of meat and poultry -- turkey, chicken, pork and beef -- as well as eggs and dairy products are beginning to increase because of the high cost of feeding farm animals. Fidel Castro may have a point; current US economic policy seems to indicate greater interest in fueling cars than in feeding people.

Is Ethanol Really Better For The Environment?

In May of 2007, the United Nations issued a report warning the world against the production of ethanol. The report stated that thus far, the production of ethanol has resulted in "the destruction of endangered rainforests, contamination of soil, air and water and the expulsion of rural populations from their homes." Because more acreage must be cultivated to produce the corn, sugarcane and other feedstock needed for ethanol production, farmers around the world are wantonly cutting down forests to make way for new plantations. In the long term, the Amazon rainforest, for example, will experience vast deforestation as Brazil increases sugarcane production to meet its ethanol export goals. This inevitably will result in the slow degradation of one of the Americas' most precious and fragile ecosystems.

The UN also added that "where crops are grown for energy purposes, the use of large scale cropping could lead to significant biodiversity loss, soil erosion, and nutrient leaching." Fidel Castro warned the US that corn-based ethanol production will not only damage the environment, but will also put increasing pressure on the world's already dwindling water supplies, possibly resulting in future water wars.

In a COHA interview, Boston University international relations professor Kevin P. Gallagher asserted that we have found ourselves in a "climate constrained insecure world," in which we must shift our dependence away from fossil fuels and adopt a more climate-friendly energy policy. Moreover, Gallagher stressed that "corn-based ethanol is not a panacea to solve a country's climate and security problems." He emphasized that the US currently has the opportunity to develop a more efficient energy path, but that with its present, poorly managed corn-based energy policy, the US is "taking one step forward and two steps backward."

Gallagher also pointed out that the corn, wheat and soy sectors are highly concentrated, meaning that at times "only two or three firms can control 75% to 85% of the market." This raises possible concerns that these firms are manipulating the price of their products, thereby artificially impacting the commodity market to their advantage, but not necessarily to society's benefit. Because these mega-firms face so little competition, it is relatively easy for them to drive up the price of their products in order to generate greater profits. At the present time, corn-based ethanol production is benefiting mainly the larger firms.

In Mexico, a relatively small handful of tortilla makers set the prices that, as mentioned above, have rapidly shot up. It is worth noting that tortilla prices increased somewhat faster than the price of corn in general. While the situation in Mexico is currently under investigation, the episode illustrates the importance of addressing this issue quickly.

It is evident that while ethanol, as an alternative to fossil fuel, may benefit the general population by reducing and stabilizing fuel prices, its consequences may far outweigh such advantages. As Celso Marcatto, food rights coordinator at ActionAid in Brazil, stated, "The benefits of biofuels cannot be achieved at the expense of increased food shortages, environmental degradation and poverty." Unfortunately, that is what the US is inadvertently setting itself up for.

Alternatives To Corn-Based Ethanol

The US currently uses more energy per unit of GDP than most other countries in the world, yet there are many ways it could use energy more efficiently. For example, steel mills in the US use more energy per dollar of output than their equivalents in Germany or Japan. The Japanese carmaker Toyota is using its hybrid technology to manufacture more fuel-efficient cars. Germany derives huge savings from energy-efficient light bulbs. The US should mirror these countries by adapting existing eco-technologies to its own use.

Wind and solar energy, though a function of geography, should be key components in the US's quest for energy efficiency. Professor Gallagher suggests that former President Carter's energy policy had the US "perfectly positioned to, by now, be the world leader." Yet because succeeding administrations strayed from Carter's path, the US is now far behind. "We have engineers and ingenuity but the current administration has locked itself into a specific framework and is resistant to change," says Gallagher. The US still has time to alter its course toward a more energy-efficient arrangement. Hopefully, the White House will acknowledge its current unwise economic policy and join other governments that value the use of eco-friendly energy.

COHA, The Council on Hemispheric Affairs, founded in 1975, is an independent, non-profit, non-partisan, tax-exempt research and information organization. It has been described on the Senate floor as being "one of the nation's most respected bodies of scholars and policy makers." For more information, visit www.coha.org; or email coha@coha.org

Copyright 2003-2007 Caribbean Net News

From: The Economist, Jun. 7, 2007

THE TRUTH ABOUT RECYCLING

As the importance of recycling becomes more apparent, questions about it linger. Is it worth the effort? How does it work? Is recycling waste just going into a landfill in China? Here are some answers

It is an awful lot of rubbish. Since 1960 the amount of municipal waste being collected in America has nearly tripled, reaching 245m tonnes in 2005. According to European Union statistics, the amount of municipal waste produced in western Europe increased by 23% between 1995 and 2003, to reach 577kg per person. (So much for the plan to reduce waste per person to 300kg by 2000.) As the volume of waste has increased, so have recycling efforts. In 1980 America recycled only 9.6% of its municipal rubbish; today the rate stands at 32%. A similar trend can be seen in Europe, where some countries, such as Austria and the Netherlands, now recycle 60% or more of their municipal waste. Britain's recycling rate, at 27%, is low, but it is improving fast, having nearly doubled in the past three years.
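
Those figures also imply some numbers the article does not state outright: the western-European baseline in 1995 works out to roughly 470kg per person, and America's 32% rate applied to 245m tonnes means roughly 78m tonnes recycled in 2005. A minimal sketch of the derivations, using only the figures given above:

    # Quantities implied by, but not stated in, the figures above.
    eu_2003_kg = 577      # municipal waste per person, western Europe, 2003
    eu_growth = 0.23      # increase from 1995 to 2003
    print(f"Implied 1995 level: {eu_2003_kg / (1 + eu_growth):.0f} kg/person")  # ~469

    us_waste_tonnes = 245e6   # US municipal waste, 2005
    us_recycle_rate = 0.32
    print(f"US tonnage recycled: {us_waste_tonnes * us_recycle_rate / 1e6:.0f}m tonnes")  # ~78m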

Even so, when a city introduces a kerbside recycling programme, the sight of all those recycling lorries trundling around can raise doubts about whether the collection and transportation of waste materials requires more energy than it saves. "We are constantly being asked: Is recycling worth doing on environmental grounds?" says Julian Parfitt, principal analyst at Waste & Resources Action Programme (WRAP), a non-profit British company that encourages recycling and develops markets for recycled materials.

Studies that look at the entire life cycle of a particular material can shed light on this question in a particular case, but WRAP decided to take a broader look. It asked the Technical University of Denmark and the Danish Topic Centre on Waste to conduct a review of 55 life-cycle analyses, all of which were selected because of their rigorous methodology. The researchers then looked at more than 200 scenarios, comparing the impact of recycling with that of burying or burning particular types of waste material. They found that in 83% of all scenarios that included recycling, it was indeed better for the environment.

Based on this study, WRAP calculated that Britain's recycling efforts reduce its carbon-dioxide emissions by 10m-15m tonnes per year. That is equivalent to a 10% reduction in Britain's annual carbon-dioxide emissions from transport, or roughly equivalent to taking 3.5m cars off the roads. Similarly, America's Environmental Protection Agency estimates that recycling reduced the country's carbon emissions by 49m tonnes in 2005.
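
These equivalences hang together arithmetically, as the quick Python sketch below shows. Only the savings range, the 10% share and the 3.5m-car figure come from the paragraph above; the per-car and total-transport values are derived from them rather than independently sourced.

```python
# Back-of-the-envelope check of WRAP's equivalences. Only the inputs
# come from the article; everything printed is derived from them.

savings_range_tonnes = (10e6, 15e6)   # annual CO2 savings, WRAP estimate
cars_offset = 3.5e6                   # cars WRAP says this is equivalent to

midpoint = sum(savings_range_tonnes) / 2
print(f"Implied CO2 per car: {midpoint / cars_offset:.1f} tonnes/year")
# ~3.6 tonnes per car per year -- a plausible figure for the period

# If the savings equal 10% of transport emissions, the implied total:
print(f"Implied transport emissions: {midpoint / 0.10 / 1e6:.0f}m tonnes/year")
# ~125m tonnes, of the right order for Britain's transport sector
```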

Recycling has many other benefits, too. It conserves natural resources. It also reduces the amount of waste that is buried or burnt, hardly ideal ways to get rid of the stuff. (Landfills take up valuable space and emit methane, a potent greenhouse gas; and although incinerators are not as polluting as they once were, they still produce noxious emissions, so people dislike having them around.) But perhaps the most valuable benefit of recycling is the saving in energy and the reduction in greenhouse gases and pollution that result when scrap materials are substituted for virgin feedstock. "If you can use recycled materials, you don't have to mine ores, cut trees and drill for oil as much," says Jeffrey Morris of Sound Resource Management, a consulting firm based in Olympia, Washington.

Extracting metals from ore, in particular, is extremely energy-intensive. Recycling aluminium, for example, can reduce energy consumption by as much as 95%. Savings for other materials are lower but still substantial: about 70% for plastics, 60% for steel, 40% for paper and 30% for glass. Recycling also reduces emissions of pollutants that can cause smog, acid rain and the contamination of waterways.

A brief history of recycling

The virtue of recycling has been appreciated for centuries. For thousands of years metal items have been recycled by melting and reforming them into new weapons or tools. It is said that the broken pieces of the Colossus of Rhodes, a statue deemed one of the seven wonders of the ancient world, were recycled for scrap. During the industrial revolution, recyclers began to form businesses and later trade associations, dealing in the collection, trade and processing of metals and paper. America's Institute of Scrap Recycling Industries (ISRI), a trade association with more than 1,400 member companies, traces its roots back to one such organisation founded in 1913. In the 1930s many people survived the Great Depression by peddling scraps of metal, rags and other items. In those days reuse and recycling were often economic necessities. Recycling also played an important role during the second world war, when scrap metal was turned into weapons.

As industrial societies began to produce ever-growing quantities of garbage, recycling took on a new meaning. Rather than recycling materials for purely economic reasons, communities began to think about how to reduce the waste flow to landfills and incinerators. Around 1970 the environmental movement sparked the creation of America's first kerbside collection schemes, though it was another 20 years before such programmes really took off.

In 1991 Germany made history when it passed an ordinance shifting responsibility for the entire life cycle of packaging to producers. In response, the industry created Duales System Deutschland (DSD), a company that organises a separate waste-management system that exists alongside public rubbish-collection. By charging a licensing fee for its "green dot" trademark, DSD pays for the collection, sorting and recycling of packaging materials. Although the system turned out to be expensive, it has been highly influential. Many European countries later adopted their own recycling initiatives incorporating some degree of producer responsibility.

In 1987 a rubbish-laden barge cruised up and down America's East Coast looking for a place to unload, sparking a public discussion about waste management and serving as a catalyst for the country's growing recycling movement. By the early 1990s so many American cities had established recycling programmes that the resulting glut of materials caused the market price for kerbside recyclables to fall from around $50 per ton to about $30, says Dr Morris, who has been tracking prices for recyclables in the Pacific Northwest since the mid-1980s. As with all commodities, costs for recyclables fluctuate. But the average price for kerbside materials has since slowly increased to about $90 per ton.

Even so, most kerbside recycling programmes are not financially self-sustaining. The cost of collecting, transporting and sorting materials generally exceeds the revenues generated by selling the recyclables, and is also greater than the disposal costs. Exceptions do exist, says Dr Morris, largely near ports in dense urban areas that charge high fees for landfill disposal and enjoy good market conditions for the sale of recyclables.

Sorting things out

Originally kerbside programmes asked people to put paper, glass and cans into separate bins. But now the trend is toward co-mingled or "single stream" collection. About 700 of America's 10,000 kerbside programmes now use this approach, says Kate Krebs, executive director of America's National Recycling Coalition. But the switch can make people suspicious: if there is no longer any need to separate different materials, people may conclude that the waste is simply being buried or burned. In fact, the switch towards single-stream collection is being driven by new technologies that can identify and sort the various materials with little or no human intervention. Single-stream collection makes it more convenient for householders to recycle, and means that more materials are diverted from the waste stream.

San Francisco, which changed from multi- to single-stream collection a few years ago, now boasts a recycling rate of 69% -- one of the highest in America. With the exception of garden and food waste, all the city's kerbside recyclables are sorted in a 200,000-square-foot facility that combines machines with the manpower of 155 employees. The $38m plant, next to the San Francisco Bay, opened in 2003. Operated by Norcal Waste Systems, it processes an average of 750 tons of paper, plastic, glass and metals a day.

The process begins when a truck arrives and dumps its load of recyclables at one end of the building. The materials are then piled on to large conveyer belts that transport them to a manual sorting station. There, workers sift through everything, taking out plastic bags, large pieces of cardboard and other items that could damage or obstruct the sorting machines. Plastic bags are especially troublesome as they tend to get caught in the spinning-disk screens that send weightier materials, such as bottles and cans, down in one direction and the paper up in another.

Corrugated cardboard is separated from mixed paper, both of which are then baled and sold. Plastic bottles and cartons are plucked out by hand. The most common types, PET (type 1) and HDPE (type 2), are collected separately; the rest go into a mixed-plastics bin.

Next, a magnet pulls out any ferrous metals, typically tin-plated or steel cans, while the non-ferrous metals, mostly aluminium cans, are ejected by eddy current. Eddy-current separators, in use since the early 1990s, consist of a rapidly revolving magnetic rotor inside a long, cylindrical drum that rotates at a slower speed. As the aluminium cans are carried over this drum by a conveyer belt, the magnetic field from the rotor induces circulating electric currents, called eddy currents, within them. This creates a secondary magnetic field around the cans that is repelled by the magnetic field of the rotor, literally ejecting the aluminium cans from the other waste materials.

Finally, the glass is separated by hand into clear, brown, amber and green glass. For each load, the entire sorting process from start to finish takes about an hour, says Bob Besso, Norcal's recycling-programme manager for San Francisco.
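
The stages just described amount to a fixed routing sequence, which the short sketch below makes explicit. It is purely illustrative: the stage order follows the article, but the item names and the `route_item` function are invented for this summary and do not correspond to any real control software at the plant.

```python
# Illustrative sketch of the single-stream sorting sequence described
# above; every name here is invented for the illustration.

def route_item(item: str) -> str:
    """Return the output stream an item ends up in."""
    # 1. Manual pre-sort removes items that could jam the machines
    if item in ("plastic bag", "large cardboard"):
        return "pulled out at manual pre-sort"
    # 2. Spinning-disk screens send paper up and containers down;
    #    cardboard and mixed paper are baled and sold
    if item in ("mixed paper", "corrugated cardboard"):
        return "paper line (baled)"
    # 3. A magnet pulls out ferrous metals
    if item == "steel can":
        return "ferrous metals (magnet)"
    # 4. The eddy-current separator ejects non-ferrous metals
    if item == "aluminium can":
        return "non-ferrous metals (eddy current)"
    # 5. PET and HDPE are plucked out by hand; other plastics mixed
    if item in ("PET bottle", "HDPE bottle"):
        return item.split()[0] + " bale"
    if item.endswith("plastic"):
        return "mixed-plastics bin"
    # 6. Glass is hand-sorted by colour at the end of the line
    if item.endswith("glass"):
        return "glass (sorted by colour)"
    return "residue"

for thing in ("plastic bag", "steel can", "aluminium can",
              "PET bottle", "green glass", "toothbrush"):
    print(f"{thing:13s} -> {route_item(thing)}")
```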

Although all recycling facilities still employ people, investment is increasing in optical sorting technologies that can separate different types of paper and plastic. Development of the first near-infra-red-based waste-sorting systems began in the early 1990s. At the time Elopak, a Norwegian producer of drink cartons made of plastic-laminated cardboard, worried that it would have to pay a considerable fee to meet its producer responsibilities in Germany and other European countries. To reduce the overall life-cycle costs associated with its products, Elopak set out to find a way to automate the sorting of its cartons. The company teamed up with SINTEF, a Norwegian research centre, and in 1996 sold its first unit in Germany. The technology was later spun off into a company now called TiTech.

TiTech's systems -- more than 1,000 of which are now installed worldwide -- rely on spectroscopy to identify different materials. Paper and plastic items are spread out on a conveyor belt in a single layer. When illuminated by a halogen lamp, each type of material reflects a unique combination of wavelengths in the infra-red spectrum that can be identified, much like a fingerprint. By analysing data from a sensor that detects light in both the visible and the near-infra-red spectrum, a computer is able to determine the colour, type, shape and position of each item. Air jets are then activated to push particular items from one conveyor belt to another, or into a bin. Numerous types of paper, plastic or combinations thereof can thus be sorted with up to 98% accuracy.
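
TiTech's actual algorithms are proprietary, but the fingerprint idea the article describes can be illustrated with a minimal nearest-match sketch. The four-band reference spectra, the material set and the threshold below are all invented for the illustration; a real system works from calibrated sensor data across many more wavelengths.

```python
# Minimal sketch of fingerprint-style identification: each material
# reflects a characteristic combination of near-infra-red wavelengths.
# All reference spectra here are invented.

REFERENCE = {
    "PET":   (0.82, 0.35, 0.61, 0.28),
    "HDPE":  (0.55, 0.72, 0.40, 0.66),
    "paper": (0.30, 0.31, 0.29, 0.33),
}

def distance(a, b):
    """Euclidean distance between two reflectance spectra."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(measured, threshold=0.15):
    """Nearest reference fingerprint, or None if nothing is close."""
    best = min(REFERENCE, key=lambda m: distance(measured, REFERENCE[m]))
    return best if distance(measured, REFERENCE[best]) < threshold else None

# A reading close to the PET fingerprint would trigger the PET air jet
print(classify((0.80, 0.37, 0.60, 0.30)))   # -> PET
print(classify((0.10, 0.90, 0.10, 0.90)))   # -> None (unidentified)
```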

For many materials the process of turning them back into useful raw materials is straightforward: metals are shredded into pieces, paper is reduced to pulp and glass is crushed into cullet. Metals and glass can be remelted almost indefinitely without any loss in quality, while paper can be recycled up to six times. (As it goes through the process, its fibres get shorter and the quality deteriorates.)

Plastics, which are made from fossil fuels, are somewhat different. Although they have many useful properties -- they are flexible, lightweight and can be shaped into any form -- there are many different types, most of which need to be processed separately. In 2005 less than 6% of the plastic from America's municipal waste stream was recovered. And of that small fraction, the only two types recycled in significant quantities were PET and HDPE. For PET, food-grade bottle-to-bottle recycling exists. But plastic is often "down-cycled" into other products such as plastic lumber (used in place of wood), drain pipes and carpet fibres, which tend to end up in landfills or incinerators at the end of their useful lives.

Even so, plastics are being used more and more, not just for packaging, but also in consumer goods such as cars, televisions and personal computers. Because such products are made of a variety of materials and can contain multiple types of plastic, metals (some of them toxic), and glass, they are especially difficult and expensive to dismantle and recycle.

Europe and Japan have initiated "take back" laws that require electronics manufacturers to recycle their products. But in America only a handful of states have passed such legislation. That has caused problems for companies that specialise in recycling plastics from complex waste streams and depend on take-back laws for getting the necessary feedstock. Michael Biddle, the boss of MBA Polymers, says the lack of such laws is one of the reasons why his company operates only a pilot plant in America and has its main facilities in China and Austria.

Much recyclable material can be processed locally, but ever more is being shipped to developing nations, especially China. The country has a large appetite for raw materials and that includes scrap metals, waste paper and plastics, all of which can be cheaper than virgin materials. In most cases, these waste materials are recycled into consumer goods or packaging and returned to Europe and America via container ships. With its hunger for resources and the availability of cheap labour, China has become the largest importer of recyclable materials in the world.

The China question

But the practice of shipping recyclables to China is controversial. Especially in Britain, politicians have voiced the concern that some of those exports may end up in landfills. Many experts disagree. According to Pieter van Beukering, an economist who has studied the trade of waste paper to India and waste plastics to China: "as soon as somebody is paying for the material, you bet it will be recycled."

In fact, Dr van Beukering argues that by importing waste materials, recycling firms in developing countries are able to build larger factories and achieve economies of scale, recycling materials more efficiently and at lower environmental cost. He has witnessed as much in India, he says, where dozens of inefficient, polluting paper mills near Mumbai were transformed into a smaller number of far more productive and environmentally friendly factories within a few years.

Still, compared with Western countries, factories in developing nations may be less tightly regulated, and the recycling industry is no exception. China especially has been plagued by countless illegal-waste imports, many of which are processed by poor migrants in China's coastal regions. They dismantle and recycle anything from plastic to electronic waste without any protection for themselves or the environment.

The Chinese government has banned such practices, but migrant workers have spawned a mobile cottage industry that is difficult to wipe out, says Aya Yoshida, a researcher at Japan's National Institute for Environmental Studies who has studied Chinese waste imports and recycling practices. Because this type of industry operates largely under the radar, it is difficult to assess its overall impact. But it is clear that processing plastic and electronic waste in a crude manner releases toxic chemicals, harming people and the environment -- the opposite of what recycling is supposed to achieve.

Under pressure from environmental groups, such as the Silicon Valley Toxics Coalition, some computer-makers have established rules to ensure that their products are recycled in a responsible way. Hewlett-Packard has been a leader in this and even operates its own recycling factories in California and Tennessee. Dell, which was once criticised for using prison labour to recycle its machines, now takes back its old computers for no charge. And last month Steve Jobs detailed Apple's plans to eliminate the use of toxic substances in its products.

Far less controversial is the recycling of glass -- except, that is, in places where there is no market for it. Britain, for example, is struggling with a mountain of green glass. It is the largest importer of wine in the world, bringing in more than 1 billion litres every year, much of it in green glass bottles. But with only a tiny wine industry of its own, there is little demand for the resulting glass. Instead what is needed is clear glass, which is turned into bottles for spirits, and often exported to other countries. As a result, says Andy Dawe, WRAP's glass-technology manager, Britain is in the "peculiar situation" of having more green glass than it has production capacity for.

Britain's bottle-makers already use as much recycled green glass as they can in their furnaces to produce new bottles. So some of the surplus glass is down-cycled into construction aggregates or sand for filtration systems. But WRAP's own analysis reveals that the energy savings for both appear to be "marginal or even disadvantageous". Working with industry, WRAP has started a new programme called GlassRite Wine, in an effort to right the imbalance. Instead of being bottled at source, some wine is now imported in 24,000-litre containers and then bottled in Britain. This may dismay some wine connoisseurs, but it solves two problems, says Mr Dawe: it reduces the amount of green glass that is imported and puts what is imported to good use. It can also cut shipping costs by up to 40%.

The future of recycling

This is an unusual case, however. More generally, one of the biggest barriers to more efficient recycling is that most products were not designed with recycling in mind. Remedying this problem may require a complete rethinking of industrial processes, says William McDonough, an architect and the co-author of a book published in 2002 called "Cradle to Cradle: Remaking the Way We Make Things". Along with Michael Braungart, his fellow author and a chemist, he lays out a vision for establishing "closed-loop" cycles where there is no waste. Recycling should be taken into account at the design stage, they argue, and all materials should either be able to return to the soil safely or be recycled indefinitely. This may sound like wishful thinking, but Mr McDonough has a good pedigree. Over the years he has worked with companies including Ford and Google.

An outgrowth of "Cradle to Cradle" is the Sustainable Packaging Coalition, a non-profit working group that has developed guidelines that look beyond the traditional benchmarks of packaging design to emphasise the use of renewable, recycled and non-toxic source materials, among other things. Founded in 2003 with just nine members, the group now boasts nearly 100 members, including Target, Starbucks and Estee Lauder, some of which have already begun to change the design of their packaging.

Sustainable packaging not only benefits the environment but can also cut costs. Last year Wal-Mart, the world's biggest retailer, announced that it wanted to reduce the amount of packaging it uses by 5% by 2013, which could save the company as much as $3.4 billion and reduce carbon-dioxide emissions by 667,000 tonnes. As well as trying to reduce the amount of packaging, Wal-Mart also wants to recycle more of it. Two years ago the company began to use an unusual process, called the "sandwich bale", to collect waste material at its stores and distribution centres for recycling. It involves putting a layer of cardboard at the bottom of a rubbish compactor before filling it with waste material, and then putting another layer of cardboard on top. The compactor then produces a "sandwich" which is easier to handle and transport, says Jeff Ashby of Rocky Mountain Recycling, who invented the process for Wal-Mart. As well as avoiding disposal costs for materials it previously sent to landfill, the company now makes money by selling waste at market prices.

Evidently there is plenty of scope for further innovation in recycling. New ideas and approaches will be needed, since many communities and organisations have set high targets for recycling. Europe's packaging directive requires member states to recycle 60% of their glass and paper, 50% of metals and 22.5% of plastic packaging by the end of 2008. Earlier this year the European Parliament voted to increase recycling rates by 2020 to 50% of municipal waste and 70% of industrial waste. Recycling rates can be boosted by charging households and businesses more if they produce more rubbish, and by reducing the frequency of rubbish collections while increasing that of recycling collections.

Meanwhile a number of cities and firms (including Wal-Mart, Toyota and Nike) have adopted zero-waste targets. This may be unrealistic but Matt Hale, director of the office of solid waste at America's Environmental Protection Agency, says it is a worthy goal and can help companies think about better ways to manage materials. It forces people to look at the entire life-cycle of a product, says Dr Hale, and ask questions: Can you reduce the amount of material to begin with? Can you design the product to make recycling easier?

If done right, there is no doubt that recycling saves energy and raw materials, and reduces pollution. But as well as trying to recycle more, it is also important to try to recycle better. As technologies and materials evolve, there is room for improvement and cause for optimism. In the end, says Ms Krebs, "waste is really a design flaw."

Web resources recommended by The Economist:

William McDonough's book, Cradle to Cradle.

The Environmental Protection Agency reports on waste and recycling. The Waste & Resources Action Programme has a profile of Julian Parfitt. The University of Amsterdam has a biography of Pieter van Beukering. See also William McDonough, the Danish Topic Centre on Waste and Resources, the Institute of Scrap Recycling Industries and TiTech.


From: Grist, Mar. 28, 2007

KEEP YOUR EYES ON THE SIZE

By Stacy Mitchell

With its recent flurry of green initiatives, Wal-Mart has won the embrace of several prominent environmental groups. "If they do even half what they say they want to do, it will make a huge difference for the planet," said Ashok Gupta of the Natural Resources Defense Council. Environmental Defense, meanwhile, has deemed Wal-Mart's actions momentous enough to warrant opening an office near the retailer's headquarters in Bentonville, Ark. "If [we] can nudge Wal-Mart in the right direction on the environment, we can have a huge impact," said the organization's executive vice president, David Yarnold.

Wal-Mart's eco-commitments are not without substance. The two most significant are a pledge to make its stores 20 percent more energy efficient by 2013, which will cut annual electricity use by 3.5 million megawatt-hours, and a plan to double the fuel economy of its trucks by 2015, which will save 60 million gallons of diesel fuel a year.

Acting with unusual transparency, Wal-Mart has even published a benchmark calculation of its carbon footprint. The company estimates that its U.S. operations were responsible for 15.3 million metric tons of CO2 emissions in 2005. About three-quarters of this pollution came from the electricity generated to power its stores.

This cannot be dismissed as greenwashing. It's actually far more dangerous than that. Wal-Mart's initiatives have just enough meat to have distracted much of the environmental movement, along with most journalists and many ordinary people, from the fundamental fact that, as a system of distributing goods to people, big-box retailing is as intrinsically unsustainable as clear-cut logging is as a method of harvesting trees.

Here's the key issue. Wal-Mart's carbon estimate omits a massive source of CO2 that is inherent to its operations and amounts to more than all of its other greenhouse-gas emissions combined: the CO2 produced by customers driving to its stores.

The dramatic growth of big-box retailers, including Wal-Mart, Target, and Home Depot, over the last 15 years has been mirrored by an equally dramatic rise in how many miles we travel running errands. Between 1990 and 2001 (the most recent year for which the U.S. Department of Transportation has data), the number of miles that the average American household drove each year for shopping grew by more than 40 percent.

It's not that we are going to the store more often, but rather that each trip is an average of about two miles longer. The general trend toward suburbanization is only partly to blame: shopping-related driving grew three times as fast as driving for all other purposes. The culprit is big-box retail. These companies have displaced tens of thousands of neighborhood and downtown businesses and consolidated the necessities of life into massive stores that aggregate car-borne shoppers from large areas. During the 1990s, for example, about 5,000 independent hardware stores, dispersed across almost as many neighborhoods, were replaced by just 1,500 Home Depot and Lowe's superstores, most erected on the outer fringes of our cities. The same trend is under way in virtually every retail sector. According to the market research firm Retail Forward, every time Wal-Mart converts one of its stores into a Supercenter with groceries, it leads to the closure of two existing grocery stores, leaving many residents with farther to drive for milk and bread.

Altogether, by 2001, Americans logged over 330 billion miles going to and from the store, generating more than 140 million metric tons of CO2. If we conservatively estimate that shopping-related driving over the last five years grew at only half the rate of the 1990s, that means Americans are now driving more than 365 billion miles each year and producing 154 million metric tons of CO2 in the process.
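
These figures are internally consistent, as the sketch below shows. The 2001 totals come from the article; the per-mile emission factor and the growth rates are derived or assumed as noted in the comments.

```python
# Checking the article's chain of figures. The 2001 totals come from
# the text; the per-mile factor and growth rates are derived/assumed.

miles_2001 = 330e9        # shopping-related miles driven in 2001
tonnes_2001 = 140e6       # tonnes of CO2 those miles produced

kg_per_mile = tonnes_2001 * 1000 / miles_2001
print(f"Implied factor: {kg_per_mile:.2f} kg CO2/mile")
# ~0.42 kg/mile -- roughly a 21 mpg fleet at ~8.9 kg CO2 per gallon

# 1990-2001 growth of "more than 40%" is about 3% a year compounded;
# the author assumes half that rate for the following five years
for rate in (0.015, 0.02):
    miles = miles_2001 * (1 + rate) ** 5
    print(f"{rate:.1%}/yr -> {miles / 1e9:.0f}bn miles, "
          f"{miles * kg_per_mile / 1e9:.0f}m tonnes CO2")
# The 2%/yr case roughly reproduces the article's 365bn miles
# and 154m tonnes of CO2
```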

Since Wal-Mart accounts for 10 percent of U.S. retail sales, the company's share of these emissions is at least 15.4 million metric tons -- and likely higher, because Wal-Mart has led the way in auto-oriented store formats and locations. This amounts to more than all of its other domestic CO2 output combined.

Land-use consultant Kennedy Smith notes that another way to estimate these emissions is to start with the 100 million shoppers Wal-Mart says its stores attract each week, generously assume two shoppers per car, and then multiply by the average length of a shopping trip. This produces an almost identical result: over 15 million metric tons of CO2.
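
The article does not give the average trip length behind this cross-check, so the sketch below runs the calculation backwards to find the round-trip distance the estimate implies. The per-mile factor is carried over from the national figures above; everything else is from the paragraph.

```python
# Running Kennedy Smith's cross-check backwards: the article gives the
# result (about 15m tonnes) but not the trip length, so solve for the
# round-trip distance the estimate implies.

shoppers_per_week = 100e6
shoppers_per_car = 2          # the article's "generous" assumption
kg_per_mile = 0.42            # derived above from 140m tonnes / 330bn miles

trips_per_year = shoppers_per_week / shoppers_per_car * 52
miles_per_trip = 15e6 * 1000 / (trips_per_year * kg_per_mile)
print(f"{trips_per_year / 1e9:.1f}bn car trips a year")
print(f"Implied round trip: {miles_per_trip:.0f} miles")
# ~2.6bn trips and a ~14-mile round trip yield the >15m-tonne figure
```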

Shopping-related driving has been growing so fast that even a phenomenal improvement in the fuel economy of cars would soon be eclipsed by more miles on the road. Nor is CO2 the only environmental impact of all of this driving. Tens of thousands of acres of habitat have been paved for big-box parking lots, which, during rainstorms, deliver large doses of oil and other petrochemicals deposited by cars to nearby lakes and streams.

By embracing Wal-Mart, groups like NRDC and Environmental Defense are not only absolving the company of the consequences of its business model, but implying that this method of retailing goods can, with adjustments, be made sustainable.

Worst of all, they are helping Wal-Mart expand. In the Northeast and on the West Coast, where Supercenters are relatively few and environmental sentiment runs strong, a greener image is just what Wal-Mart needs to overcome widespread public opposition to new stores.

In January alone, Wal-Mart opened 70 U.S. stores. At current growth rates, by 2015 Wal-Mart will have enlarged its domestic footprint by 20,000 acres, turning CO2-absorbing fields and forests into stores and parking lots. Big-box stores make incredibly inefficient use of land. While 200,000 square feet of retail spread over several two-story downtown buildings with shared parking takes up about four acres, a single-story Superstore of this size, with its standard 1,000 parking spaces, consumes nearly 20 acres.
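
The acreage comparison can be roughly reconstructed, though the article gives only the end points. In the sketch below the store sizes and parking count come from the text, while the per-space land allowance and the shared-parking figure are assumptions chosen purely to show how the arithmetic could work out.

```python
# Rough reconstruction of the land-use comparison. Store sizes and the
# parking count come from the article; the 650 sq ft per parking space
# (stall plus aisles, landscaping and setbacks) and the 75,000 sq ft
# of shared downtown parking are assumptions made for illustration.

SQFT_PER_ACRE = 43_560

# Downtown: 200,000 sq ft over two storeys gives a 100,000 sq ft
# footprint, plus an assumed share of common parking
downtown = (100_000 + 75_000) / SQFT_PER_ACRE
print(f"Downtown buildings: ~{downtown:.0f} acres")         # ~4 acres

# Superstore: the full 200,000 sq ft on one floor, plus 1,000 spaces
superstore = (200_000 + 1_000 * 650) / SQFT_PER_ACRE
print(f"Single-story superstore: ~{superstore:.0f} acres")  # ~20 acres
```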

Wal-Mart's new stores will use more electricity than its energy-efficiency measures will save. By making its existing outlets 20 percent more efficient, Wal-Mart says it will cut CO2 emissions by 2.5 million metric tons by 2013. But new stores built this year alone will consume enough electricity to add about 1 million metric tons of CO2 to the atmosphere.
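
The 2.5-million-tonne figure also squares with Wal-Mart's own footprint numbers quoted earlier in the piece, as a one-line check shows.

```python
# Consistency check against Wal-Mart's footprint figures quoted
# earlier: 15.3m tonnes of CO2 in 2005, about three-quarters of it
# from the electricity that powers its stores.

total_footprint = 15.3e6      # tonnes CO2, US operations, 2005
electricity_share = 0.75      # "about three-quarters"
efficiency_gain = 0.20        # stores to become 20% more efficient

saving = total_footprint * electricity_share * efficiency_gain
print(f"Implied annual saving: {saving / 1e6:.1f}m tonnes CO2")
# ~2.3m tonnes -- close to the 2.5m tonnes Wal-Mart projects by 2013
```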

It is not as though we need these stores. Between 1990 and 2005, the amount of store space per capita in this country doubled, while consumer spending grew at less than half that rate. The predictable result is that the U.S. is now home to thousands of dead malls and vacant strip shopping centers. City planners are not the only ones alarmed. "The most over-retailed country in the world hardly needs more shopping outlets of any kind," advised PricewaterhouseCoopers in a report to real-estate investors.

Yet Wal-Mart continues to build -- consuming land, inducing more driving, and, perhaps most perilous of all, destroying what remains of small-scale, locally owned businesses. Tucked close to their customers in neighborhoods and downtowns, and sized to fit sidewalks rather than regional highway systems, it is these stores that are the true building blocks of a sustainable way of distributing goods. It is they, not Wal-Mart, that deserve the admiration and support of the environmental movement.

Stacy Mitchell is a senior researcher with the Institute for Local Self-Reliance and author of Big-Box Swindle: The True Cost of Mega-Retailers and the Fight for America's Independent Businesses. Copyright 2007. Grist Magazine, Inc.

Rachel's Democracy & Health News (formerly Rachel's Environment & Health News) highlights the connections between issues that are often considered separately or not at all. The natural world is deteriorating and human health is declining because those who make the important decisions aren't the ones who bear the brunt. Our purpose is to connect the dots between human health, the destruction of nature, the decline of community, the rise of economic insecurity and inequalities, growing stress among workers and families, and the crippling legacies of patriarchy, intolerance, and racial injustice that allow us to be divided and therefore ruled by the few. In a democracy, there are no more fundamental questions than, "Who gets to decide?" And, "How do the few control the many, and what might be done about it?"


A RESPONSE TO PAUL HAWKEN'S 'TO REMAKE THE WORLD'

By Kate Davies

Hooray for Paul Hawken! His article "To Remake the World" in Rachel's News #908 and his new book "Blessed Unrest: How the Largest Movement in the World Came into Being and Why No One Saw it Coming" are extremely timely and thought-provoking.

Hawken has put his finger on a global phenomenon that has been growing since the 1999 protests against the World Trade Organization in Seattle. Largely outside the spotlight of media attention, a new social movement has been quietly gaining strength in the U.S. and internationally. In bringing it to light, Hawken has revealed a trend that is positive and hopeful at a time when these qualities are sorely needed in the world.

Although he has done an outstanding job of describing the new movement, several points call out for further exploration.

First, Hawken shies away from giving the new movement the full recognition of a name, calling it instead "this unnamed movement." This is a little strange because it has already been given several names. My favorite is "the new progressive movement," in homage to the U.S. Progressive Movement of the late 19th and early 20th centuries. The new progressive movement embraces many of the same principles as its predecessor, including beliefs in truly democratic institutions and processes, efficient government and the elimination of corruption, social and economic justice, regulation of large corporations and monopolies, and environmental protection.

He also asserts that the new movement lacks many basic attributes of previous social movements, specifically an ideology, leaders, and internal organization. Let's look at each of these in more detail.

Ideology

Hawken says the new movement does not have an ideology and its "big contribution is the absence of one big idea." He is right -- in a sense. The new movement does not impose a rigid article of faith on its members, but it is guided by one big, inspirational idea. Indeed, Hawken acknowledges as much in the article's title.

The movement's big, inspirational idea is that ordinary people, acting together, can "remake the world." Collectively, empowered citizens can do more than just succeed on individual issues, like climate change or immigration. They can do more than just win legislative victories, like banning toxic flame retardants or protecting endangered species. The new movement is motivated by the transformative idea that by working together citizens can recreate the whole of society.

This is not a new concept. It is the same one that stimulated the birth of this country. But it is an idea that most Americans seem to have forgotten of late. In today's social and political climate, the thought that ordinary people can shape society -- rather than just relying on politicians, corporate leaders and economists -- is truly radical. This may not be "ideology" in the sense that Hawken uses the word, but it is a "big idea" that motivates the entire movement.

In addition to this, there are four goals or aspirations that unite much of the movement:

** Creating an open, participatory and fully accountable democracy;

** Social and economic justice;

** Sustainability for people and the planet; and

** Health and wellbeing for all.

Most members of the new movement are committed to all these goals, even if they work on only one. Collectively, they provide an inspiring and world-changing ideology, especially when combined with the idea that empowered citizens really can remake society.

Leaders

Hawken states that the new movement has few recognizable leaders. He says: "Its leaders are farmers, zoologists, shoemakers, and poets." In short, there is no Martin Luther King Jr. or Mahatma Gandhi to look up to and venerate.

Going one step further, I would say that the un-acclaimed leaders of the new movement exemplify new types of leadership. Transcending traditional concepts of charismatic and authoritative leadership, they are extremely low key and modest. They are people who emerge in response to specific situations and then relinquish their role when circumstances change. And they are people who serve a group rather than impose their will upon it.

The new movement is not alone in embodying new types of leadership. Many organizations are now experimenting with different approaches, and innovative ways of thinking about leadership have become fashionable of late. Authors such as Ronald Heifetz, Peter Senge and Meg Wheatley have advocated ideas including:

** Seeing leadership as a process of relationship, rather than control;

** Recognizing that there are many different types of leaders;

** Thinking about leadership from a systems perspective; and

** Focusing on the adaptive challenges of long term change, rather than imposing immediate technical fixes.

They highlight that the concept of leadership itself is changing. So it should not be surprising that the leaders of the emerging movement are different from those of previous movements.

Internal Organization

Hawken asserts that the new movement does not have any internal organization, saying: "It forms, gathers and dissipates quickly," an organic process that is "dispersed, inchoate, and fiercely independent."

This is true, but the idea that the emerging movement is more of a loose network than a coherent organization is not new. In early 2004, Gideon Rosenblatt, Executive Director of ONE/Northwest, published a paper called "Movement as Network: Connecting People and Organizations in the Environmental Movement." In it, Rosenblatt made the point that the strength of the environmental movement is the countless links between people and organizations, rather than the people or the organizations themselves.

Although the "movement as network" idea espoused by Hawken and Rosenblatt has much to commend it, social movements need at least some internal organization. Without any lasting internal structures, it can be difficult to sustain the long-term political momentum needed to successfully confront the entrenched power elites.

So what types of structures would be helpful? There are many candidates including policy "think tanks" to facilitate strategic planning, national or regional groups to help local ones mobilize the public, research units to provide information, educational institutions to provide training and support, groups with expertise in communications, and last but not least, organizations with fundraising experience.

Beyond "To Remake the World" and "Blessed Unrest"

The next step beyond Paul Hawken's article and book is to ask: "How can we build the new movement?" The answer may determine not only the success of the movement itself but also whether it can truly "remake the world."

I believe that the emerging movement needs to deepen its understanding of what it takes to achieve systemic social change. This will require a greater understanding of the culture it wants to transform and a more strategic approach to advance progressive change.

Understanding Culture

Many members of the new movement are natural activists -- me included. By this, I mean we want to identify problems and solve them. We want to fix what's wrong with the world! Our strengths lie in targeting specific issues and promoting solutions.

But this emphasis on particular problems means that we pay less attention to the cultural origins of the problems we seek to correct. Developing an in-depth understanding of the fundamental economic, political and social forces that shape western culture is essential to identifying the leverage points for change. If the new movement does not have a comprehensive knowledge of the culture in which it operates, how can it hope to intervene effectively?

This is challenging because issues are usually represented separately from each other by the media and other mainstream social institutions. Unemployment is portrayed as a different issue from racism. Racism is framed independently of environmental quality. Environmental quality is described without any connection to the economy. This fragmentation makes the public perceive individual issues in isolation from one another and prevents them from seeing the common cultural origins that connect different issues.

A Strategic Approach to Progressive Change

Activists' usual emphasis on immediate solutions also means that the new movement pays less attention to strategies for long term success. As a result, it is relatively unskilled at achieving lasting, resilient change. Although the emerging movement is good at winning battles, it needs a better understanding of the strategies necessary to win the war.

Developing a strategic approach to progressive change will require knowledge of how social change actually happens. So how can the new movement acquire such knowledge?

1. One key source of information is previous movements, such as the civil rights, anti-Vietnam War, and women's movements. These and other movements have not yet been adequately studied for what they can teach the new movement about progressive social change.

2. Current thinking about the process of social change provides another source. Ideas about social constructivism will be particularly relevant.

3. A third source is adult learning theory. Much work has already been done on the relationship between learning and change that will be helpful.

In summary, the emerging movement could learn a lot about the process of progressive social change that will enable it to be more strategic.

Closing Comment

Paul Hawken's article and book make an important contribution to progressive social change. They describe what has previously been an unnoticed, but widespread, movement, and in so doing they make it much more visible.

But Hawken's work is double-edged. At the same time as he describes the new movement, he asserts that it is fundamentally indescribable, saying: "No book can explain it, no person can represent it, no words can encompass it." This remark runs the risk of being more poetic than helpful.

Indeed, on the basis of these words, Hawken's readers may question the existence of a movement at all. If it cannot be explained, is it in fact real? If it cannot be represented, does it actually exist? If it cannot be encompassed, is it really a single entity? I fear that Hawken's dualistic representation of the movement could dilute its significance and effectiveness. It also threatens to undermine his central thesis -- that there is a new global movement for progressive social change. Hawken's true gift is to help us all see just how real this movement is -- real enough "to remake the world."

Kate Davies is Core Faculty in the Center for Creative Change at Antioch University Seattle. She is currently working on a book called "Making Change: Ideas, Values and Strategies for Building the New Progressive Movement."


TO REMAKE THE WORLD

By Paul Hawken

Something earth-changing is afoot among civil society -- a significant social movement is eluding the radar of mainstream culture.

I have given nearly one thousand talks about the environment in the past fifteen years, and after every speech a smaller crowd gathered to talk, ask questions, and exchange business cards. The people offering their cards were working on the most salient issues of our day: climate change, poverty, deforestation, peace, water, hunger, conservation, human rights, and more. They were from the nonprofit and nongovernmental world, also known as civil society. They looked after rivers and bays, educated consumers about sustainable agriculture, retrofitted houses with solar panels, lobbied state legislatures about pollution, fought against corporate-weighted trade policies, worked to green inner cities, or taught children about the environment. Quite simply, they were trying to safeguard nature and ensure justice.

After being on the road for a week or two, I would return with a couple hundred cards stuffed into various pockets. I would lay them out on the table in my kitchen, read the names, look at the logos, envisage the missions, and marvel at what groups do on behalf of others. Later, I would put them into drawers or paper bags, keepsakes of the journey. I couldn't throw them away.

Over the years the cards mounted into the thousands, and whenever I glanced at the bags in my closet, I kept coming back to one question: did anyone know how many groups there were? At first, this was a matter of curiosity, but it slowly grew into a hunch that something larger was afoot, a significant social movement that was eluding the radar of mainstream culture.

I began to count. I looked at government records for different countries and, using various methods to approximate the number of environmental and social justice groups from tax census data, I initially estimated that there were thirty thousand environmental organizations strung around the globe; when I added social justice and indigenous organizations, the number exceeded one hundred thousand. I then researched past social movements to see if there were any equal in scale and scope, but I couldn't find anything.

The more I probed, the more I unearthed, and the numbers continued to climb. In trying to pick up a stone, I found the exposed tip of a geological formation. I discovered lists, indexes, and small databases specific to certain sectors or geographic areas, but no set of data came close to describing the movement's breadth. Extrapolating from the records being accessed, I realized that the initial estimate of a hundred thousand organizations was off by at least a factor of ten. I now believe there are over one million organizations working toward ecological sustainability and social justice. Maybe two.

By conventional definition, this is not a movement. Movements have leaders and ideologies. You join movements, study tracts, and identify yourself with a group. You read the biography of the founder(s) or listen to them perorate on tape or in person. Movements have followers, but this movement doesn't work that way. It is dispersed, inchoate, and fiercely independent. There is no manifesto or doctrine, no authority to check with.

I sought a name for it, but there isn't one.

Historically, social movements have arisen primarily because of injustice, inequalities, and corruption. Those woes remain legion, but a new condition exists that has no precedent: the planet has a life-threatening disease that is marked by massive ecological degradation and rapid climate change. It crossed my mind that perhaps I was seeing something organic, if not biologic. Rather than a movement in the conventional sense, is it a collective response to threat? Is it splintered for reasons that are innate to its purpose? Or is it simply disorganized? More questions followed. How does it function? How fast is it growing? How is it connected? Why is it largely ignored?

After spending years researching this phenomenon, including creating with my colleagues a global database of these organizations, I have come to these conclusions: this is the largest social movement in all of history, no one knows its scope, and how it functions is more mysterious than what meets the eye.

What does meet the eye is compelling: tens of millions of ordinary and not-so-ordinary people willing to confront despair, power, and incalculable odds in order to restore some semblance of grace, justice, and beauty to this world.

Clayton Thomas-Muller speaks to a community gathering of the Cree nation about waste sites on their native land in Northern Alberta, toxic lakes so big you can see them from outer space. Shi Lihong, founder of Wild China Films, makes documentaries with her husband on migrants displaced by construction of large dams. Rosalina Tuyuc Velásquez, a member of the Maya-Kaqchikel people, fights for full accountability for tens of thousands of people killed by death squads in Guatemala. Rodrigo Baggio retrieves discarded computers from New York, London, and Toronto and installs them in the favelas of Brazil, where he and his staff teach computer skills to poor children. Biologist Janine Benyus speaks to twelve hundred executives at a business forum in Queensland about biologically inspired industrial development. Paul Sykes, a volunteer for the National Audubon Society, completes his fifty-second Christmas Bird Count in Little Creek, Virginia, joining fifty thousand other people who tally 70 million birds on one day.

Sumita Dasgupta leads students, engineers, journalists, farmers, and Adivasis (tribal people) on a ten-day trek through Gujarat exploring the rebirth of ancient rainwater harvesting and catchment systems that bring life back to drought-prone areas of India. Silas Kpanan'Ayoung Siakor, who exposed links between the genocidal policies of former president Charles Taylor and illegal logging in Liberia, now creates certified, sustainable timber policies.

These eight, who may never meet and know one another, are part of a coalescence comprising hundreds of thousands of organizations with no center, codified beliefs, or charismatic leader. The movement grows and spreads in every city and country. Virtually every tribe, culture, language, and religion is part of it, from Mongolians to Uzbeks to Tamils. It is comprised of families in India, students in Australia, farmers in France, the landless in Brazil, the bananeras of Honduras, the "poors" of Durban, villagers in Irian Jaya, indigenous tribes of Bolivia, and housewives in Japan. Its leaders are farmers, zoologists, shoemakers, and poets.

The movement can't be divided because it is atomized -- small pieces loosely joined. It forms, gathers, and dissipates quickly. Many inside and out dismiss it as powerless, but it has been known to bring down governments, companies, and leaders through witnessing, informing, and massing.

The movement has three basic roots: the environmental and social justice movements, and indigenous cultures' resistance to globalization -- all of which are intertwining. It arises spontaneously from different economic sectors, cultures, regions, and cohorts, resulting in a global, classless, diverse, and embedded movement, spreading worldwide without exception. In a world grown too complex for constrictive ideologies, the very word movement may be too small, for it is the largest coming together of citizens in history.

There are research institutes, community development agencies, village- and citizen-based organizations, corporations, networks, faith-based groups, trusts, and foundations. They defend against corrupt politics and climate change, corporate predation and the death of the oceans, governmental indifference and pandemic poverty, industrial forestry and farming, depletion of soil and water.

Describing the breadth of the movement is like trying to hold the ocean in your hand. It is that large. When a part rises above the waterline, the iceberg beneath usually remains unseen. When Wangari Maathai won the Nobel Peace Prize, the wire service stories didn't mention the network of six thousand different women's groups in Africa planting trees. When we hear about a chemical spill in a river, it is never mentioned that more than four thousand organizations in North America have adopted a river, creek, or stream. We read that organic agriculture is the fastest-growing sector of farming in America, Japan, Mexico, and Europe, but no connection is made to the more than three thousand organizations that educate farmers, customers, and legislators about sustainable agriculture.

This is the first time in history that a large social movement is not bound together by an "ism." What binds it together is ideas, not ideologies. This unnamed movement's big contribution is the absence of one big idea; in its stead it offers thousands of practical and useful ideas. In place of isms are processes, concerns, and compassion. The movement demonstrates a pliable, resonant, and generous side of humanity.

And it is impossible to pin down. Generalities are largely inaccurate. It is nonviolent and grassroots; it has no bombs, armies, or helicopters. A charismatic male vertebrate is not in charge. The movement does not agree on everything nor will it ever, because that would be an ideology. But it shares a basic set of fundamental understandings about the Earth, how it functions, and the necessity of fairness and equity for all people partaking of the planet's life-giving systems.

The promise of this unnamed movement is to offer solutions to what appear to be insoluble dilemmas: poverty, global climate change, terrorism, ecological degradation, polarization of income, loss of culture. It is not burdened with a syndrome of trying to save the world; it is trying to remake the world.

There is fierceness here. There is no other explanation for the raw courage and heart seen over and again in the people who march, speak, create, resist, and build. It is the fierceness of what it means to know we are human and want to survive.

This movement is relentless and unafraid. It cannot be mollified, pacified, or suppressed. There can be no Berlin Wall moment, no treaty-signing, no morning to awaken when the superpowers agree to stand down. The movement will continue to take myriad forms. It will not rest. There will be no Marx, Alexander, or Kennedy. No book can explain it, no person can represent it, no words can encompass it, because the movement is the breathing, sentient testament of the living world.

And I believe it will prevail. I don't mean defeat, conquer, or cause harm to someone else. And I don't tender the claim in an oracular sense. I mean the thinking that informs the movement's goal -- to create a just society conducive to life on Earth -- will reign. It will soon suffuse and permeate most institutions. But before then, it will change a sufficient number of people so as to begin the reversal of centuries of frenzied self-destruction.

Inspiration is not garnered from litanies of what is flawed; it resides in humanity's willingness to restore, redress, reform, recover, reimagine, and reconsider. Healing the wounds of the Earth and its people does not require saintliness or a political party. It is not a liberal or conservative activity. It is a sacred act.

Paul Hawken is an entrepreneur and social activist living in California. His article in this issue is adapted from "Blessed Unrest," to be published by Viking Press and used by permission.

Orion magazine, 187 Main Street, Great Barrington, MA 01230, 888/909-6568, ($35/year for 6 issues). Subscriptions are available online: www.orionmagazine.org.

BEYOND ECO-APARTHEID

Is the Green Movement too White? Van Jones proposes a solution

By Van Jones

In 2005, Americans sat before our television sets, horrified by images of an American city underwater. In 2006, we sat in the nation's movie houses, watching Al Gore make the case for urgent action. In 2007, Americans are finally rising from our seats and demanding action to reverse global warming.

Students are planning marches and protests to push Congress to curb emissions. Consumers and investors are flocking to carbon-cutting solutions like hybrid cars, bio-diesel and solar power. Reporters and editors are moving their environmental stories from the back of the paper to Page 1A, above the fold. Corporations are stampeding each other to showcase their love of clear skies and lush forests. And both the blue Democrats and the red Republicans are suddenly waving green banners.

The climate crisis is galloping from the margins of geek science to the epicenter of our politics, culture and economics. As the new environmentalists advance, only two questions remain: whom will they take with them? And whom will they leave behind?

We know that climate activists will convince Congress to adopt market-based solutions (like "cap and trade"). This approach may help big businesses do the right thing. But will those same activists use their growing clout to push Congress to better aid survivors of Hurricane Katrina? Black and impoverished victims of our biggest eco-disaster still lack housing and the means to rebuild. Will they find any champions in the rising environmental lobby?

We know that the climate activists will fight for subsidies and supports for the booming clean energy and energy conservation markets. But will they insist that these new industries be accessible beyond the eco-elite -- creating jobs and wealth-building opportunities for low-income people and people of color? In other words, will the new environmental leaders fight for eco-equity in this "green economy" they are birthing? Or will they try to take the easy way out -- in effect, settling for an eco-apartheid?

The sad racial history of environmental activism tends to discourage high hopes among racial justice activists. And yet this new wave has the potential to be infinitely more expansive and inclusive than previous eco-upsurges.

Environmentalism's 1st Wave: Conservation

But first, the bad news: no previous wave of US environmentalism ever broke with the racism or elitism of its day. In fact, earlier environmental movements often either ignored racial inequality or exacerbated it.

For example, consider the first wave of environmentalism: the "conservation" wave.

The true original conservationists were not John Muir, Teddy Roosevelt or David Brower. They were the Native Americans. The original Americans were geniuses at living in harmonic balance with their sister and brother species. Before the Europeans arrived, the entire continent was effectively a gigantic nature preserve. A squirrel could climb a tree at the Atlantic Ocean and move branch-to-branch-to-branch until she reached the Mississippi River. So many birds flew south for the winter that their beating wings were like thunder, and their numbers blotted out the sun.

Native Americans achieved this feat of conservation on a continent that was fully populated by humans. In fact, the leading indigenous civilizations achieved world-historic heights of political statesmanship by founding the Iroquois Federation, a model for the US founders.

Unfortunately, those same founders rejected the Indians' example of environmental stewardship. Colonizers wiped out whole species to make pelts, felled forests and destroyed watersheds. Settlers almost exterminated the buffalo just for shooting sport.

The destruction of nature was so relentless, heedless and massive that some Europeans balked. They created the famed "conservation movement," a slogan for which could well have been: "Okay, but let's not pave EVERY-thing!"

Fortunately, the conservationists enjoyed some success; their worthy efforts continue to this day. But the first and best practitioners of "environmental conservation" were not white people. They were red people. And the mostly-white conservation movement still owes an incalculable debt to the physical and philosophical legacy of indigenous peoples. But it is a debt that conservation leaders apparently have no intention of ever repaying.

Case in point: today's large conservation groups together have countless members, hundreds of millions of dollars and scores of professional lobbyists. But when Native Americans fight poverty, hostile federal bureaucracies and the impact of broken treaties, these massive groups are almost always missing in action. In that regard, Indian-killing Teddy Roosevelt set the enduring pattern for most conservationists' racial politics: "Let's preserve the land we stole."

Environmentalism's 2nd Wave: Regulation

In the 1960s, the second wave of environmentalism got under way. Sparked by Rachel Carson's book, Silent Spring, this wave could be called the "regulation" wave. It challenged the worst excesses of industrial-age pollution and toxics. Among other important successes, this wave produced the Clean Air Act, the Clean Water Act, the EPA and the first Earth Day in 1970.

But this wave, too, was affluent and lily white. As a result, it developed huge blind spots to toxic pollution concentrating in communities of poor and brown-skinned people. In fact, some people of color began to wonder if white polluters and white environmentalists were unconsciously collaborating. They were effectively steering the worst polluters and foulest dumps into Black, Latino, Asian and poor neighborhoods.

Finally, people of color began speaking out. And in the 1980s, a new movement was born to combat what its leaders called "environmental racism." Those leaders said: "Regulate pollution, yes -- but do it with equity. Do it fairly. Don't make black, brown and poor children bear a disproportionate burden of asthma and cancer."

Two decades later, that so-called "environmental justice" movement continues to defend the poor and vulnerable. But it functions separately from so-called "mainstream" (white) environmentalism. That movement has never fully embraced the cause of environmentalists of color. In other words, since the 1980s, we have had an environmental movement that is segregated by race.

Given this history of racial apathy, exclusion and even hostility, is there any reason to expect much different from the latest upsurge of eco-activism?

The Third Time's the Charm: Investment

Well, in fact: there is. The reason for hope has to do with the very nature of the present wave. Simply put, this wave is qualitatively different from the previous ones.

The first wave was about preserving the natural bounty of the past. The second wave was about regulating the problems of the industrial present. But the new wave is different. It is about investing in solutions for the future: solar power, hybrid technology, bio-fuels, wind turbines, tidal power, fuel cells, energy conservation methods and more.

The green wave's new products, services and technologies could also mean something important to struggling communities: the possibility of new green-collar jobs, a chance to improve community health and opportunities to build wealth in the green economy. If the mostly-white global warming activists join forces with people of color, the United States can avoid both eco-apocalypse and eco-apartheid -- and achieve eco-equity.

Discussions of race, class and the environment today can go beyond how to atone for past hurts or distribute present harms. Today we can ask: how do we equitably carve up the benefits of a bright future?

And that kind of question gives a powerful incentive for people of color, labor leaders and low-income folks to come back to the environmental table. At the same time, for all their present momentum, the newly ascendant greens cannot meet their short-term objectives or their long-term goals -- without the support of a much broader coalition.

Green Rush = Green-Collar Jobs?

From the perspective of people of color, helping to build a bigger green tent would be worth the effort. Green is rapidly becoming the new gold. The LOHAS (lifestyles of health and sustainability) sector is growing like crazy: it was a $229 billion piece of the US economy in 2006. And it is growing on a vertical.

But unfortunately, the LOHAS sector is probably the most racially segregated part of the US economy -- in terms of its customers, owners and employees. Changing that could create better health, more jobs and increased wealth for communities that need all three.

For example, an urban youth trained to install solar panels can go on to become an electrical engineer. Imagine a young adult trained to keep buildings from leaking energy by putting in double-paned glass -- on track to becoming a glazier. Those trained to work with eco-chic bamboo or to fix hybrid engines will find good work.

We need Green Technology Training Centers in every public high school, vocational school and community college. And America needs an Energy Corps, like Americorps and the Peace Corps, to train and deploy millions of youth in the vital work of rewiring a nation.

Beyond that, people of color must also have the chance to become inventors, investors, owners, entrepreneurs and employers in the new greener world. They should also use their political power to influence the scope, scale and shape of the green economy.

It makes sense for people of color to work for a green growth agenda, as long as green partisans embrace broad opportunity and shared prosperity as key values.

Eco-Equity Is Smart Politics

For global warming activists, embracing eco-equity would be a politically brilliant move. In the short term, a more inclusive approach will prevent polluters from isolating and derailing the new movement. In the long run, it is the only strategy that will save the Earth.

In the near term, opponents of change will actively recruit everyone whom this new movement ignores, offends or excludes. California provides a cautionary tale; voters there rejected a 2006 ballot measure to fund clean energy research. A small excise tax on the state's oil extraction would have produced a huge fund, propelling California into the global lead in renewable energy. But the same message that wooed Silicon Valley and Hollywood elites flopped among regular voters.

Clean energy proponents ran abstract ads about "energy independence" and the bright eco-future. But big oil spoke directly to pocket-book issues, running ads that warned (falsely) that the tax would send gas prices through the roof. On that basis, an NAACP leader and others joined the opposition. And the measure's original sky-high support plummeted.

To avoid getting out-maneuvered politically, green economy proponents must actively pursue alliances with people of color. And they must include leaders, organizations and messages that will resonate with the working class.

The Hidden Danger of Eco-Apartheid

But the real danger lies in the long term. The United States is the world's biggest polluter. To avoid eco-apocalypse, Congress will have to do more than pass a "cap and trade" bill. And Americans will have to do more than stick in better light bulbs.

To pull off this ecological U-turn, we will have to fundamentally restructure the US economy. We will need to "green" whole cities. We will have to build thousands of wind farms, install tens of millions of solar panels and retrofit millions of buildings. We will have to retire our car, truck and bus fleets, which are based on combustion engines and oil, replacing them with plug-in hybrids and electric vehicles powered by a clean-energy grid.

Reversing global warming will require a WWII level of mobilization. It is the work of tens of millions, not hundreds of thousands. Such a shift will require massive support at the social, cultural and political levels. And in an increasingly non-white nation, that means enlisting the passionate involvement of millions of so-called "minorities" -- as consumers, inventors, entrepreneurs, investors, buzz marketers, voters and workers.

All For Green & Green For All

It is obvious that eco-chic, embraced by the eco-elite, won't save the planet. Climate change activists may be tempted to try to sidestep the issues of racial inclusion, in the name of expedience -- but the truth is that eco-apartheid is just a speed-bump on the way to eco-apocalypse. Any successful, long-term strategy will require a full and passionate embrace of the principle of eco-equity.

Beyond that, there is the moral imperative. The predicted ecological disasters will hit poor people and people of color -- first and worst. Our society has an obligation to ensure equal protection from the peril -- and equal access to the promise -- of our new, ecological age.

So now is the time for the green movement to reach out. By definition, a politics of investment is a politics of hope, optimism and opportunity. The bright promise of the green economy could soon include, inspire and energize people of all races and classes. And nowhere is the need for a politics of hope more profound than it is among America's urban and rural poor.

More importantly, climate activists can open the door to a grand historic alliance -- a political force with the power to bend history in a new direction. Let the climate activists say: "We want to build a green economy, strong enough to lift people out of poverty. We want to create green pathways out of poverty and into great careers for America's children. We want this 'green wave' to lift all boats. This country can save the polar bears and black kids, too."

Let them say: "In the wake of Katrina, we reject the idea of 'free market' evacuation plans. Families should not be left behind to drown because they lack a functioning car or a credit card. Katrina's survivors still need our help. And we need a plan to rescue everybody next time. In an age of floods, we reject the ideology that says we must let our neighbors 'sink or swim.'"

Let them say: "We want those communities that were locked out of the last century's pollution-based economy to be locked into the new, clean and green economy. We don't have any throw-away species or resources. And we don't have any throw-away children or neighborhoods either. All of creation is precious. And we are all in this together."

A Green Growth Alliance

Those words would make environmental history.

More importantly, they could begin a complete realignment of American politics. The idea of "social uplift environmentalism" could serve as the cornerstone for an unprecedented "Green Growth Alliance." Imagine a coalition that unites the best of labor, business, racial justice activists, environmentalists, intellectuals, students and more. That combination would rival the last century's New Deal and New Right coalitions.

To give the Earth and her peoples a fighting chance, we need a broad, eco-populist alliance -- one that includes every class under the sun and every color in the rainbow. By embracing eco-equity as their ultimate goal, the climate crisis activists can play a key role in birthing such a force.

Van Jones is the president of the Ella Baker Center for Human Rights, in Oakland, California (ellabakercenter.org) and a National Apollo Alliance steering committee member.

Copyright CHICAGO CONSCIOUS CHOICE 920 N. Franklin, Suite 202 Chicago, IL 60610 Phone: 312.440.4373 Fax: 312.751.3973



HOW SHOULD GOVERNMENT VIEW A LOOMING CATASTROPHE?

By Mary Christina Wood

Editor's note: Mary Christina Wood is the Philip H. Knight Professor of Law and Morse Center for Law and Politics Resident Scholar (2006-07) at the University of Oregon School of Law, where she teaches natural resources law, federal Indian law, public lands law, wildlife law, hazardous waste law and property law. She is the founding director of the school's Environmental and Natural Resources Law Program. This is a transcript of her talk to Eugene City Club May 4, 2007.

Last month Time magazine issued a special edition on climate change in which it said, "Never mind what you've heard about global warming as a slow-motion emergency that would take decades to play out. Suddenly and unexpectedly, the crisis is upon us."

United Nations reports show rapid melting of the polar ice sheets, Antarctica, Greenland and glaciers throughout the world. The oceans are heating and rising. Coral reefs are bleaching and dying. Species are on exodus from their habitats towards the poles. As a result of global warming the world now faces crop losses, food shortages, flooding, coastal loss, wildfire, drought, pests, hurricanes, heat waves, disease and extinctions. An international climate team has warned countries to prepare for as many as 50 million human environmental refugees by 2010. Scientists explain that, due to the carbon already in the atmosphere, we are locked into a temperature rise of at least 2 degrees F. This alone will have impacts for generations to come, but if we continue business as usual, they predict Earth will warm as much as 10.4 degrees F, which will leave as many as 600 million people in the world facing starvation and 3.2 billion people suffering water shortages; it will convert the Amazon rainforest into savannah, and trigger the kind of mass extinction that hasn't occurred on Earth for 55 million years.

Global heating is leagues beyond what civilization has ever faced before.

I will give only brief background here. As you know, global heating is caused largely by heat-trapping gases that we emit into our atmosphere. The more greenhouse gases we put into the atmosphere, the hotter Earth gets. It's rather like putting a greenhouse roof around the entire Earth and locking it down. Carbon dioxide has climbed to levels unknown in the past 650,000 years, and we are still pumping it out at a rate that is growing by over 2 percent per year. The U.S. produces 25 percent of the world's carbon emissions. Carbon persists in the atmosphere for up to a few centuries, so our emissions on this very day will have impacts far beyond our lifetimes. We can't turn this thermostat down. Scientists across the globe warn that we are nearing a dangerous "tipping point" that will set in motion irreversible dynamics through environmental feedback loops. After that tipping point, our subsequent carbon reductions, no matter how impressive, will not thwart long-term catastrophe. British Prime Minister Tony Blair said months ago, "This disaster is not set to happen in some science fiction future many years ahead, but in our lifetime. Unless we act now... these consequences, disastrous as they are, will be irreversible."

Let us consider the magnitude of the challenge we face.

First -- the scale of the threat. It's global. It affects every square inch of Earth.

Second, the intensity of the threat. Global warming threatens all of our basic survival mechanisms -- food, water, shelter, and health. British commentator Mark Lynas, author of High Tide, summarizes it this way: "If we go on emitting greenhouse gases at anything like the current rate, most of the surface of the globe will be rendered uninhabitable within the lifetimes of most readers of this article."

Third -- the timeframe for response. Jim Hansen, the leading climate scientist at NASA, states: "[W]e have at most 10 years -- not 10 years to decide upon action, but 10 years to alter fundamentally the trajectory of global greenhouse emissions." We have to reverse what is now still a climbing trajectory of greenhouse gas emissions and bring it down within 10 years at most, then reduce it 80 percent by 2050. You can think of these requirements as Nature's Mandate. The tipping point concept means that we are sitting on a ticking clock. If we fail to bring carbon down in the next decade, we effectively lock the doors of our heating greenhouse and throw out the keys, leaving ourselves and future generations trapped inside as disaster unfolds over the long term.
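As a rough illustration of what that mandate implies -- my own arithmetic, not Hansen's -- an 80 percent cut by 2050, starting from a hypothetical emissions peak around 2010, works out to a steady decline of roughly 4 percent per year, sustained for four decades:

    # Illustrative sketch: the steady annual decline needed to cut
    # emissions 80% between a hypothetical 2010 peak and 2050.
    peak_year, target_year = 2010, 2050
    remaining_fraction = 0.20                    # an 80% cut leaves 20%
    years = target_year - peak_year              # 40 years
    annual_decline = 1 - remaining_fraction ** (1 / years)
    print(f"required decline: {annual_decline:.1%} per year")   # ~3.9%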

Fourth -- consider the scale of response needed to meet Nature's Mandate. Nearly every aspect of human daily living results in carbon emissions. Therefore, climate response must reach into virtually every sector of society: residential, commercial, industrial, transportation -- everything.

When we consider the scale, intensity, timeframe and kind of action needed, it is plain to see that we have never faced anything remotely like global warming before. Only a national response that is swift, focused and encompassing will be sufficient to confront this threat.

Let's reflect back to when citizens across this country rose in solidarity behind a clear national purpose. The attack on Pearl Harbor galvanized America in a way that we desperately need today. Almost overnight, the private business sector began retooling and overhauling production lines. The automobile industry scaled down car sales and channeled its workers and materials into the production of defense vehicles. The financial world sold war bonds. Communities planted victory gardens to grow food locally so that the commercial food supplies could be sent to the military. Consumers made do with the bare minimum. States lowered their speed limits to conserve gas.

Individuals took initiative without being asked. Men signed up for active duty. Women took their place in the work force. Volunteers entertained the troops.

Speakers Bureaus formed in cities across the country, drawing 100,000 volunteers. These Victory Speakers, as they were called, were key to mobilizing the nation quickly. They would give five-minute speeches at theaters, club meetings, town halls, schools -- any forum they could find -- to explain the nature of the threat and the need for citizen support. Victory Speakers were not chosen for their outstanding oratory skills, but rather were the "trusted and familiar voices" in the community -- the banker, carpenter, mother or school teacher.

People did not just sit by. This was a time in our nation's history when individuals, families, businesses, schools and neighborhoods were engaged together, tapping their resources, ingenuity and energy in concerted defense of the country they loved and the future they hoped to pass to their children.

Generations later, how is this same country responding to the threat of climate crisis?

The reality today is that most Americans are too absorbed in their own routines to make time for global warming. We parents tend to be an especially busy group. We are so consumed with taking our children to soccer games and piano lessons that we don't think ahead to how our children will get food and water, and be safe from storms, disease, and all of the other life-threatening circumstances that the planet's heating will bring them. By living out the American dream, we are essentially signing our own children up for a draft for their lifetimes. But this war will be the most frightening because it has no end in sight for even their descendants, and all of Nature's survival resources will be scarce. Unfortunately, it's no consolation that we are good, devoted parents who just aren't that interested in global warming. Nature won't recognize our children as conscientious objectors to climate crisis.

To be sure, there are some Americans who are engaged and responding with small changes in their lives. They ride the bus more often, they refuse to buy bottled water, they turn off lights. This brings them comfort, thinking the problem is on its way to being solved. These people are important models, but national defense cannot be put on the backs of a few good soldiers. Most concerned citizens are doing nothing to enlist the rest of society in climate defense. There are no Victory Speakers for climate crisis.

Small progress can give us a dangerous sense of security. Overall, our society is nowhere near decarbonizing. Climate defense entails carbon math. We lose this war for countless generations to come if we can't get our total planetary carbon levels down before the tipping point. Each day that passes, the window of opportunity to avert global catastrophe closes a little more.

Looking back, Hurricane Katrina was the Pearl Harbor of climate crisis. But in World War II, new agencies and commissions sprang up overnight to amass a national defense effort. One would think that every elected body and every agency in America would be convening task forces to achieve carbon lockdown within a decade. But aside from a small handful of officials, there is no leadership at the helm. There are plenty of Hummers on the streets of America, but we have no national defense against global warming.

We simply cannot meet Nature's Carbon Mandate without leadership. Only government can provide both the regulation and the infrastructure necessary to bring carbon down within 10 years. We have thousands of agencies -- more than any other nation in the world. If every one of them made global warming a top priority, we might stand a chance of meeting Nature's Mandate head on. But government would have to start now. Tony Blair said to the world five months ago, "There is nothing more serious, more urgent, more demanding of leadership... in the global community."

Instead of defending our atmosphere, our government is driving this country towards runaway greenhouse gas emissions. County commissioners are approving trophy home subdivisions and destination resorts as if global warming didn't exist. State environmental agencies are approving air permits as if global warming didn't exist. The Forest Service is approving timber sales as if global warming didn't exist. And the electric power industry is racing to build more than 150 new coal-fired power plants across the U.S., banking on federal approval as if global warming didn't exist.

You might ask why, in the face of an unprecedented threat to the planet, our American population just sits by and allows government to act as if the problem doesn't exist. Harvard psychology professor Daniel Gilbert suggests that humans evolved to respond to immediate threats, like enemies coming over the hillside. Intelligent as we are, it's hard for us to take seriously any threat that is not immediate. In other words, we'd be better off being invaded by Martians. But I think there is even more to it than that. Global warming has been captured by the press and the public as an environmental issue. Americans are fundamentally confused about government's role towards our environment, and that confusion operates as a dead-weight against decisive action.

In the remaining time, I want to suggest why our modern environmental law inhibits a response to global warming. And then I will suggest how Americans could demand climate response through asserting their collective property rights.

Let me first explain how our atmosphere has been caught in a legal death spiral. Environmental law consists of hundreds of statutes and regulations passed since the 1970s to protect our natural resources. This is the body of law I have taught over the past 16 years. Had environmental law worked, we would not have this ecological crisis on our hands. The heart of the problem is this: While the purpose of every local, state and federal environmental law is to protect natural resources, nearly every law authorizes the agencies to permit the very pollution or damage that the statutes were designed to prevent. Of course, the permit systems were never intended to subvert the goals of environmental statutes. But most agencies today spend nearly all of their resources to permit, rather than prohibit, environmental destruction. Most officials are good, dedicated individuals, but as a group, they dread saying no to permits. Essentially, our agencies have taken the discretion in the law and have used it to destroy nature, including its atmosphere.

You can think of environmental law, with all of its statutes and regulations, as one big picture. The agencies have constructed a frame for that picture. The four sides of that frame are discretion, discretion, discretion and discretion -- to allow damage to our natural resources. All of environmental law is carried out through that frame. And so, though our statutes were passed to protect the air, water, farmland, wildlife and other resources, when the laws are carried out through the discretion frame, they are used as tools to openly legalize damage. That is why we have species extinctions, rivers running dry, dead zones in our oceans, and global warming.

Why would public servants whose salaries are funded by tax dollars use their discretion to allow destruction of resources? It is because the discretion frame never characterizes natural resources as quantified property assets. Instead, the environment is portrayed as a nebulous feature of our world. So when private parties come to agencies seeking permits to pollute or destroy resources, they almost always carry the day because their property rights are clear and tangible.

Our federal government uses this discretion frame to justify inaction in the face of climate crisis. Protecting our atmosphere is characterized as a political choice. EPA claims discretion to permit pollution by the oil, gas, coal, and automobile industries -- no matter that this legalized pollution will degrade the atmosphere so much that it will no longer support human civilization as we know it.

So how can the public engage government to immediately respond to global warming? The public has to find a new frame for viewing government's role towards Nature. As author George Lakoff says, "Reframing is changing the way the public sees the world. It is changing what counts as common sense." This new way of looking at government's role must engage all agencies and officials in climate defense as the supreme national priority.

Reframing environmental law does not mean throwing out our environmental statutes. Those statutes give us a tremendous bureaucracy that we can steer back on course. They simply have to be infused with clear principles. The reframing I suggest draws on Supreme Court jurisprudence that has been around since the beginning of this country. It characterizes all of the resources essential to human survival -- including the waters, wildlife, and air -- as being packaged together in a legal endowment which I call Nature's Trust. Our imperiled atmosphere is one of the assets in that trust. A trust is a fundamental type of ownership whereby one manages property for the benefit of another. Long ago, the Supreme Court said that government, as the only enduring institution with control over human actions, is a trustee of Nature's resources. In other words, government holds this great natural trust for all generations of citizens. We all hold a common property interest in Nature's Trust.

With every trust there is a core duty of protection. The trustee must defend the trust against injury. When we call upon government to safeguard our atmosphere, we are invoking principles that are ingrained in government itself. Back in 1892, our Supreme Court said: "The state can no more abdicate its trust over property in which the whole people are interested... than it can abdicate its police powers in the administration of government." The Nature's Trust concept is so basic to governance that it is found in many other countries today. For example, 13 years ago, the Philippines Supreme Court invoked the trust to halt logging the last of the ancient rainforest there, saying, "[E]very generation has a responsibility to the next to preserve that... harmony [of Nature]... [These principles] are assumed to exist from the inception of humankind." In contrast to the discretion frame, the trust frame forces government to protect Nature's Endowment as property for future generations to inherit. Failure to protect natural inheritance amounts to generational theft.

We can all take the very same set of environmental laws, and without changing a word of them, reframe the government's discretion to destroy Nature into an obligation to protect Nature. But this principle works in reverse as well. We can pass any new law we want, and no matter what it says, if it is pressed through the discretion frame, the government will continue to impoverish natural resources until our society can no longer sustain itself.

The trust frame can be a coalescing force to confront climate crisis, in three ways. First, it may generate a national feeling of entitlement towards Nature. The discretion frame gives no hint of environmental loss. Because air and other natural resources are not defined assets, we never imagine that they could be all spent down, all used up. We seem unbothered even when our government leads us into global environmental catastrophe. But when we portray nature as a trust rather than an ill-defined commons, we vest citizens with expectations of enduring property rights to a defined, bounded asset. We start thinking, "Hey, that's my air, even if I share it with others." Pollution of that air becomes an infringement on American property. Government is obligated to defend that property. The failure to mount a national climate defense becomes as absurd a proposition as the idea of government sitting idle during an attack on American soil.

Second, by defining Nature in familiar property terms, the trust frame reconciles private property rights with environmental protection. The discretion frame doesn't do this. It portrays environmental resources as nebulous features of the world we live in. Private property rights carry the day in our agencies simply because they draw upon a language of property that is so deeply embedded in our national culture. To confront any environmental crisis today, including global warming, we have to be clear on how public resources and private property rights fit together in the scheme of things. The trust frame is itself a property concept, so rather than pitting environment against property rights, you are fitting Nature into the system of property rights. The Nature's Trust frame is not anti-property rights. To the contrary, it affirms our collective property rights in assets, like the atmosphere, that support humanity. In securing our public property, the trust also anchors our entire system of private property rights. All private property depends on Nature's infrastructure. When that infrastructure collapses, it causes natural disasters that make property boundaries irrelevant. Remember, private property deeds didn't account for anything in the aftermath of Hurricane Katrina. And they won't account for anything along coastlines inundated by rising sea levels.

Third, the trust frame positions all nations of the world in a logical relationship towards Nature. The atmosphere is shared as property among sovereign nations of the Earth. They are sovereign co-tenant trustees of that atmosphere. They are all bound by the same duties that organize, for example, the relationship of family members who share ownership of a cabin as co-tenants. Property law has always imposed a responsibility on co-tenants to not degrade the common asset. This one concept lends definition to international climate responsibilities.

Let me conclude. Global heating dwarfs any threat we have known in the history of Humankind. Giving our government political discretion to allow further damage to our atmosphere puts the future of this nation and the rest of the world in grave danger. If Americans take the lead to reframe our government's purpose as a trust duty to safeguard the commonly held atmosphere, we may soon find every other nation in the world engaged with us, not against us, in a massive, urgent defense effort to secure the systems of life on Earth for all generations to come.


PEAK SOIL

By Alice Friedemann
From: Culture Change

"The nation that destroys its soil destroys itself." -- President Franklin D. Roosevelt

Part 1. The Dirt on Dirt.

Ethanol is an agribusiness get-rich-quick scheme that will bankrupt our topsoil.

Nineteenth-century western farmers converted their corn into whiskey to make a profit (Rorabaugh 1979). Archer Daniels Midland (ADM), a large grain processor, came up with the same scheme in the 20th century. But ethanol was a product in search of a market, so ADM spent three decades relentlessly lobbying for ethanol to be used in gasoline. Today ADM makes record profits from ethanol sales and government subsidies (Barrionuevo 2006).

The Department of Energy hopes to have biomass supply 5% of the nation's power, 20% of transportation fuels, and 25% of chemicals by 2030. These combined goals are 30% of the current petroleum consumption (DOE Biomass Plan, DOE Feedstock Roadmap).

Fuels made from biomass are a lot like the nuclear-powered airplanes the Air Force tried to build from 1946 to 1961, for billions of dollars. They never got off the ground. The idea was interesting -- atomic jets could fly for months without refueling. But the lead shielding needed to protect the crew, plus several months of food and water, made the plane too heavy to take off. The weight problem, the ease of shooting this behemoth down, and the consequences of a crash landing were so obvious, it's amazing the project was ever funded, let alone kept going for 15 years.

Biomass fuels have equally obvious and predictable reasons for failure. Odum says that time explains why renewable energy provides such low energy yields compared to non-renewable fossil fuels. The more work left to nature, the higher the energy yield, but the longer the time required. Although coal and oil took millions of years to form into dense, concentrated solar power, all we had to do was extract and transport them (Odum 1996).

With every step required to transform a fuel into energy, there is less and less energy yield. For example, to make ethanol from corn grain, which is how all U.S. ethanol is made now, corn is first grown to develop hybrid seeds, which next season are planted, harvested, delivered, stored, and preprocessed to remove dirt. Dry-mill ethanol is milled, liquefied, heated, saccharified, fermented, evaporated, centrifuged, distilled, scrubbed, dried, stored, and transported to customers (McAloon 2000).

Fertile soil will be destroyed if crops and other "wastes" are removed to make cellulosic ethanol.

"We stand, in most places on earth, only six inches from desolation, for that is the thickness of the topsoil layer upon which the entire life of the planet depends" (Sampson 1981).

Loss of topsoil has been a major factor in the fall of civilizations (Sundquist 2005 Chapter 3, Lowdermilk 1953, Perlin 1991, Ponting 1993). You end up with a country like Iraq, formerly Mesopotamia, where 75% of the farm land became a salty desert.

Fuels from biomass are not sustainable, are ecologically destructive, have a net energy loss, and there isn't enough biomass in America to make significant amounts of energy because essential inputs like water, land, fossil fuels, and phosphate ores are limited.

Soil Science 101 -- There Is No "Waste" Biomass

Long before there was "Peak Oil", there was "Peak Soil". Iowa has some of the best topsoil in the world. In the past century, half of it has been lost, with the average depth falling from 18 inches to 10 (Pate 2004, Klee 1991).

Productivity drops off sharply when topsoil reaches 6 inches or less, the average crop root zone depth (Sundquist 2005).

Crop productivity continually declines as topsoil is lost and residues are removed. (Al-Kaisi May 2001, Ball 2005, Blanco-Canqui 2006, BOA 1986, Calvino 2003, Franzleubbers 2006, Grandy 2006, Johnson 2004, Johnson 2005, Miranowski 1984, Power 1998, Sadras 2001, Troeh 2005, Wilhelm 2004).

On over half of America's best crop land, the erosion rate is 27 times the natural rate, 11,000 pounds per acre per year (NRCS 2006). The natural, geological erosion rate is about 400 pounds of soil per acre per year (Troeh 2005). Some erosion is due to farmers not being paid enough to conserve their land, but most is due to investors who farm for profit. Erosion control cuts into profits.
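The two figures are consistent with each other; a quick check of the arithmetic (my own, using the numbers cited above):

    # Consistency check on the erosion figures cited above.
    natural_rate = 400   # lb of soil per acre per year (Troeh 2005)
    multiplier = 27      # erosion rate on over half of the best crop land
    print(natural_rate * multiplier)   # 10,800 lb/acre/year -- the ~11,000 cited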

Erosion is happening ten to twenty times faster than the rate at which topsoil can be formed by natural processes (Pimentel 2006). That might make the average person concerned. But not the USDA -- they've defined "tolerable" erosion as the average soil loss that can occur without causing a decline in long-term productivity.

Troeh (2005) believes that the tolerable soil loss (T) value is set too high, because it's based only on the upper layers -- how long it takes subsoil to be converted into topsoil. T ought to be based on deeper layers -- the time for subsoil to develop from parent material, or parent material from rock. If he's right, erosion is even worse than the NRCS figures suggest.

Erosion removes the most fertile parts of the soil (USDA-ARS). When you feed the soil [with organic matter], you're not feeding plants; you're feeding the biota in the soil. Underground creatures and fungi break down fallen leaves and twigs into microscopic bits that plants can eat, and create tunnels that air and water can infiltrate. In nature there are no elves feeding (fertilizing) the wild lands. When plants die, they're recycled into basic elements and become a part of new plants. It's a closed cycle. There is no bio-waste.

Soil creatures and fungi act as an immune system for plants against diseases, weeds, and insects -- when this living community is harmed by agricultural chemicals and fertilizers, even more chemicals are needed in an increasingly vicious cycle (Wolfe 2001).

There's so much life in the soil, there can be 10 "biomass horses" underground for every horse grazing on an acre of pasture (Wardle 2004). If you dove into the soil and swam around, you'd be surrounded by miles of thin strands of mycorrhizal fungi that help plant roots absorb more nutrients and water, plus millions of creatures, most of them unknown. There'd be thousands of species in just a handful of earth -- springtails, bacteria, and worms digging airy subways. As you swam along, plant roots would tower above you like trees as you wove through underground skyscrapers.

Plants and creatures underground need to drink, eat, and breathe just as we do. An ideal soil is half rock, and a quarter each water and air. When tractors plant and harvest, they crush the life out of the soil, as underground apartments collapse 9/11 style. The tracks left by tractors in the soil are the erosion route for half of the soil that washes or blows away (Wilhelm 2004).

Corn Biofuel -- Especially Harmful

Corn biofuel (i.e., butanol, ethanol, biodiesel) is especially harmful for several reasons:

* Row crops such as corn and soy cause 50 times more soil erosion than sod crops such as hay (Sullivan 2004) or more (Al-Kaisi 2000), because the soil between the rows can wash or blow away.

* If corn is planted with last year's corn stalks left on the ground (no-till), erosion is less of a problem, but only about 20% of corn is grown no-till. Soy is usually grown no-till, but leaves insignificant residues to harvest for fuel.

* Corn uses more water, insecticide, and fertilizer than most crops (Pimentel 2003).

* Due to high corn prices, continuous corn (corn crop after corn crop) is increasing, rather than rotation of corn with nitrogen-fixing (fertilizing) and erosion-controlling sod crops.

The government has studied the effect of growing continuous corn, and found it increases eutrophication by 189%, global warming by 71%, and acidification by 6% (Powers 2005).

Farmers want to plant corn on highly erodible, water-protecting, or wildlife-sustaining Conservation Reserve Program land. Farmers are paid not to grow crops on this land. But with high corn prices, farmers are now asking the Agriculture Department to release them from these contracts so they can plant corn on these low-producing, environmentally sensitive lands (Tomson 2007).

Crop residues are essential for soil nutrition, water retention, and soil carbon. Making cellulosic ethanol from corn residues -- the parts of the plant we don't eat (stalk, roots, and leaves) -- removes water, carbon, and nutrients (Nelson 2002, McAloon 2000, Sheehan 2003). These practices lead to lower crop production and ultimately deserts. Growing plants for fuel will accelerate the already unacceptable levels of topsoil erosion, soil carbon and nutrient depletion, soil compaction, water retention, water depletion, water pollution, air pollution, eutrophication, destruction of fisheries, siltation of dams and waterways, salination, loss of biodiversity, and damage to human health (Tegtmeier 2004).

Why are soil scientists absent from the biofuels debate?

I asked 35 soil scientists why topsoil wasn't part of the biofuels debate. These are just a few of the responses from the ten who replied to my off-the-record poll (no one wanted me to quote them, mostly due to fear of losing their jobs): "I have no idea why soil scientists aren't questioning corn and cellulosic ethanol plans. Quite frankly I'm not sure that our society has had any sort of reasonable debate about this with all the facts laid out. When you see that even if all of the corn was converted to ethanol and that would not provide more than 20% of our current liquid fuel use, it certainly makes me wonder, even before considering the conversion efficiency, soil loss, water contamination, food price problems, etc."

"Biomass production is not sustainable. Only business men and women in the refinery business believe it is."

"Should we be using our best crop land to grow gasohol and contribute further to global warming? What will our children grow their food on?"

"As agricultural scientists, we are programmed to make farmers profitable, and therefore profits are at the top of the list, and not soil, family, or environmental sustainability".

"Government policy since WWII has been to encourage overproduction to keep food prices down (people with full bellies don't revolt or object too much). It's hard to make a living farming commodities when the selling price is always at or below the break even point. Farmers have had to get bigger and bigger to make ends meet since the margins keep getting thinner and thinner. We have sacrificed our family farms in the name of cheap food. When farmers stand to make few bucks (as with biofuels) agricultural scientists tend to look the other way".

"You are quite correct in your concern that soil science should be factored into decisions about biofuel production. Unfortunately, we soil scientists have missed the boat on the importance of soil management to the sustainability of biomass production, and the long- term impact for soil productivity.

This is not a new debate. Here's what scientists had to say decades ago: "Removing crop residues... would rob organic matter that is vital to the maintenance of soil fertility and tilth, leading to disastrous soil erosion levels. Not considered is the importance of plant residues as a primary source of energy for soil microbial activity. The most prudent course, clearly, is to continue to recycle most crop residues back into the soil, where they are vital in keeping organic matter levels high enough to make the soil more open to air and water, more resistant to soil erosion, and more productive" (Sampson 1981).

"...Massive alcohol production from our farms is an immoral use of our soils since it rapidly promotes their wasting away. We must save these soils for an oil-less future" (Jackson 1980).

Natural Gas in Agriculture

When you take out more nutrients and organic matter from the soil than you put back in, you are "mining" the topsoil. The organic matter is especially important, since that's what prevents erosion, improves soil structure, health, and water retention, and gives the next crop its nutrition. Modern agriculture addresses only the nutritional component, by adding fossil-fuel-based fertilizers, and because the soil is unhealthy from a lack of organic matter, it copes with insects and disease by applying oil-based pesticides.

"Fertilizer energy" is 28% of the energy used in agriculture (Heller, 2000). Fertilizer uses natural gas both as a feedstock and the source of energy to create the high temperatures and pressures necessary to coax inert nitrogen out of the air (nitrogen is often the limiting factor in crop production). This is known as the Haber-Bosch process, and it's a big part of the green revolution that made it possible for the world's population to grow from half a billion to 6.5 billion today (Smil 2000, Fisher 2001).

Our national security is at risk as we become dependent on unstable foreign states to provide us with increasingly expensive fertilizer. Between 1995 and 2005 we increased our fertilizer imports by more than 148% for anhydrous ammonia, 93% for urea (solid), and 349% for other nitrogen fertilizers (USDA ERS). Removing crop residues will require large amounts of imported fertilizer from potential cartels, potentially so expensive that farmers won't sell crops and residues for biofuels.

Improve national security and topsoil by returning residues to the land as fertilizer. We are vulnerable to high-priced fertilizer imports or "food for oil", which would greatly increase the cost of food for Americans.

Agriculture competes with homes and industry for fast depleting North American natural gas. Natural gas price increases have already caused over 280,000 job losses (Gerard 2006). Natural gas is also used for heating and cooking in over half our homes, generates 15% of electricity, and is a feedstock for thousands of products.

Return crop residues to the soil to provide organic fertilizer; don't increase the need for natural gas fertilizers by removing crop residues to make cellulosic biofuels.

Part 2. The Poop on Ethanol

Energy Returned on Energy Invested (EROEI)

To understand the concept of EROEI, imagine a magician doing a variation on the rabbit-out-of-a-hat trick. He strides onstage with a rabbit, puts it into a top hat, and then spends the next five minutes pulling 100 more rabbits out. That is a pretty good return on investment!

Oil was like that in the beginning: one barrel of oil energy was required to get 100 more out, an Energy Returned on Energy Invested of 100:1.

When the biofuel magician tries to do the same trick decades later, he puts the rabbit into the hat, and pulls out only one pooping rabbit. The excrement is known as byproduct or coproduct in the ethanol industry.

Studies that show a positive energy gain for ethanol would have a negative return if the byproduct were left out (Farrell 2006). Here's where byproduct comes from: if you made ethanol from corn in your back yard, you'd dump a bushel of corn, two gallons of water, and yeast into your contraption. Out would come 18 pounds of ethanol, 18 pounds of CO2, and 18 pounds of byproduct -- the leftover corn solids.
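A minimal sketch of that backyard mass balance, assuming the standard 56-pound bushel of shelled corn (the bushel weight is my addition, not a figure from the article):

    # Backyard ethanol mass balance, per the figures above.
    bushel_corn = 56.0                           # lb; standard bushel of shelled corn
    ethanol, co2, byproduct = 18.0, 18.0, 18.0   # lb each, out of the still
    outputs = ethanol + co2 + byproduct
    print(f"{outputs:.0f} lb of outputs from a {bushel_corn:.0f} lb bushel")
    # 54 of 56 lb accounted for, and only a third of it is liquid fuel.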

Patzek and Pimentel believe you shouldn't include the energy contained in the byproduct, because you need to return it to the soil to improve nutrition and soil structure (Patzek June 2006). Giampetro believes the byproduct should be treated as a "serious waste disposal problem and... an energy cost", because if we supplied 10% of our energy from biomass, we'd generate 37 times more livestock feed than is used (Giampetro 1997).

It's even worse than he realized -- Giampetro didn't know most of this "livestock feed" can't be fed to livestock because it's too energy and monetarily expensive to deliver -- especially heavy wet distillers byproduct, which is short-lived, succumbing to mold and fungi after 4 to 10 days. Also, byproduct is a subset of what animals eat. Cattle are fed byproduct in 20% of their diet at most. Iowa's a big hog state, but commercial swine operations feed pigs a maximum of 5 to 10% byproduct (Trenkle 2006; Shurson 2003).

Worst of all, the EROEI of ethanol is 1.2:1, or 1.2 units of energy out for every unit of energy in, a gain of ".2". The "1" in "1.2" represents the liquid ethanol. What is the ".2" then? It's the rabbit feces -- the byproduct. So you have no ethanol for your car, because the liquid "1" needs to be used to make more ethanol. That leaves you with just the ".2" -- a bucket of byproduct to feed your horse -- you do have a horse, don't you? If horses are like cattle, then you can only use your byproduct for one-fifth of his diet, so you'll need four supplemental buckets of hay from your back yard to feed him. No doubt the byproduct could be used to make other things, but that would take energy.
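A sketch of the accounting behind that ".2" (my own framing of the article's figures, not Farrell's or Patzek's models):

    # Net-energy accounting for ethanol at the commonly cited 1.2:1 EROEI.
    energy_in = 1.0          # fossil energy invested
    liquid_ethanol = 1.0     # the "1": the ethanol itself
    byproduct = 0.2          # the ".2": leftover corn solids
    energy_out = liquid_ethanol + byproduct
    print(f"EROEI {energy_out / energy_in:.1f}:1, "
          f"net surplus {energy_out - energy_in:.1f} unit(s)")
    # If the liquid "1" must be plowed back in to make the next batch,
    # the only true surplus is the 0.2 of byproduct -- no net liquid fuel.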

Byproduct could be burned, but it takes a significant amount of energy to dry it out, and requires additional handling and equipment. More money can be made selling it wet to the cattle industry, which is hurting from the high price of corn. Byproduct should be put back into the ground to improve soil nutrition and structure for future generations, not sold for short-term profit and fed to cattle who aren't biologically adapted to eating corn.

The boundaries of what is included in EROEI calculations are kept as narrow as possible to reach positive results.

Researchers who find a positive EROEI for ethanol have not accounted for all of the energy inputs. For example, Shapouri admits the "energy used in the production of... farm machinery and equipment..., and cement, steel, and stainless steel used in the construction of ethanol plants, are not included" (Shapouri 2002). Or they assign overstated values of ethanol yield from corn (Patzek Dec 2006). Many, many other inputs are left out.

Patzek and Pimentel have compelling evidence showing that about 30 percent more fossil energy is required to produce a gallon of ethanol than you get from it. Their papers are published in peer-reviewed journals where their data and methods are public, unlike many of the positive net energy results.

Infrastructure. Current EROEI figures don't take into account the delivery infrastructure that needs to be built. There are 850 million combustion engines in the world today. Just to replace half the 245 million cars and light trucks in the United States with E85 vehicles will take 12 to 15 years. It would take over $544 million of ethanol delivery infrastructure (Reynolds 2002 case B1) and $5 to $34 billion to revamp 170,000 gas stations nationwide (Heinson 2007).
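Spreading the station estimate across the 170,000 stations gives a feel for the range (my own arithmetic on the figures above):

    # Per-station cost implied by the $5-34 billion estimate (Heinson 2007).
    stations = 170_000
    low, high = 5e9, 34e9
    print(f"${low / stations:,.0f} to ${high / stations:,.0f} per station")
    # roughly $29,000 to $200,000 per station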

The EROEI of oil when we built most of the infrastructure in this country was about 100:1, and it's about 25:1 worldwide now. Even if you believe ethanol has a positive EROEI, you'd probably need an EROEI of at least 5 to maintain modern civilization (Hall 2003). A civilization based on ethanol's ".2" rabbit poop would only work for coprophagous (dung-eating) rabbits.

Of the four articles that showed a positive net energy for ethanol in Farrell's 2006 Science article, three were not peer-reviewed. The only positive peer-reviewed article (Dias De Oliveira 2005) states "The use of ethanol as a substitute for gasoline proved to be neither a sustainable nor an environmentally friendly option" and the "environmental impacts outweigh its benefits". Dias De Oliveira concluded there'd be a tremendous loss of biodiversity, and if all vehicles ran on E85 and their numbers grew by 4% per year, by 2048 the entire country, except for cities, would be covered with corn.

Part 3. Biofuel is a Grim Reaper.

The energy to remediate environmental damage is left out of EROEI calculations.

Global Warming

Soils contain twice the amount of carbon found in the atmosphere, and three times more carbon than is stored in all the Earth's vegetation (Jones 2006).

Climate change could increase soil loss by 33% to 274%, depending on the region (O'Neal 2005).

Intensive agriculture has already removed 20 to 50% of the original soil carbon, and some areas have lost 70%. To maintain soil carbon levels, no crop residues at all could be harvested under many tillage systems or on highly erodible lands, and even under no-till only a small percentage could be, depending on crop production levels (Johnson 2006).

Deforestation of temperate hardwood forests, and conversion of range and wetlands to grow energy and food crops, increases global warming. An average of 2.6 million acres of crop land were paved over or developed every year between 1982 and 2002 in the USA (NRCS 2004). The only new crop land is forest, range, or wetland.

Rainforest destruction is increasing global warming. Energy farming is playing a huge role in deforestation, reducing biodiversity, water supplies and water quality, and increasing soil erosion. Fires to clear land for palm oil plantations are destroying one of the last great remaining rainforests in Borneo, spewing so much carbon that Indonesia is third behind the United States and China in releasing greenhouse gases. Orangutans, rhinos, tigers and thousands of other species may be driven extinct (Monbiot 2005). Borneo palm oil plantation lands have grown 2,500% since 1984 (Barta 2006). Soybeans cause even more erosion than corn and suffer from all the same sustainability issues. The Amazon is being destroyed by farmers growing soybeans for food (National Geographic Jan 2007) and fuel (Olmstead 2006).

Biofuel from coal-burning biomass factories increases global warming (Farrell 2006). Driving a mile on ethanol from a coal-using biorefinery releases more CO2 than a mile on gasoline (Ward 2007). Coal in ethanol production is seen as a way to displace petroleum (Farrell 2006, Yacobucci 2006) and it's already happening (Clayton 2006).

Current and future quantities of biofuels are too minuscule to affect global warming (ScienceDaily 2007).

Surface Albedo

"How much the sun warms our climate depends on how much sunlight the land reflects (cooling us), versus how much it absorbs (heating us). A plausible 2% increase in the absorbed sunlight on a switch grass plantation could negate the climatic cooling benefit of the ethanol produced on it. We need to figure out now, not later, the full range of climatic consequences of growing cellulose crops" (Harte 2007).

Eutrophication

Farm runoff of nitrogen fertilizers has contributed to the hypoxia (low oxygen) of rivers and lakes across the country and the dead zone in the Gulf of Mexico. Yet the cost of the lost shrimp and fisheries and increased cost of water treatment are not subtracted from the EROEI of ethanol.

Soil Erosion

Corn and soybeans have higher than average erosion rates. Eroded soil pollutes air, fills up reservoirs, and shortens the time dams can store water and generate electricity. Yet the hydropower lost to siltation, the energy to remediate flood damage, and the energy to dredge dams, agricultural drainage ditches, harbors, and navigation channels aren't considered in EROEI calculations.

The majority of the best soil in the nation is rented and has the highest erosion rates. More than half the best farmland in the United States is rented: 65% in Iowa, 74% in Minnesota, 84% in Illinois, and 86% in Indiana. Owners seeking short-term profits have far less incentive than farmers who work their land to preserve soil and water. [A USDA map accompanying the original article, not reproduced here, shows that the areas with the highest erosion rates coincide with the areas with the highest percentage of rented farmland.]

Water Pollution

Soil erosion is a serious source of water pollution, since it washes sediments, nutrients, salts, and chemicals that have had no chance to decompose into streams, contributing to eutrophication. This increases water treatment costs and health costs, and kills fish with insecticides that work their way up the food chain (Troeh 2005).

Ethanol plants pollute water. They generate 13 liters of wastewater for every liter of ethanol produced (Pimentel March 2005).

Water Depletion

Biofuel factories use a huge amount of water -- four gallons for every gallon of ethanol produced. Despite 30 inches of rain per year in Iowa, there may not be enough water for corn ethanol factories as well as people and industry. Drought years will make matters worse (Cruse 2006).
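To put that four-to-one ratio in perspective, consider a hypothetical 100-million-gallon-per-year ethanol plant (the plant size is my illustration, not a figure from Cruse):

    # Water demand of a hypothetical 100-million-gallon/year ethanol plant.
    ethanol_gal_per_year = 100e6    # hypothetical plant size
    water_per_gallon = 4            # gallons of water per gallon of ethanol
    print(f"{ethanol_gal_per_year * water_per_gallon:,.0f} gallons of water per year")
    # 400,000,000 gallons per year, before any drought-year shortfall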

Fifty percent of Americans rely on groundwater (Glennon 2002), and in many states, this groundwater is being depleted by agriculture faster than it is being recharged. This is already threatening current food supplies (Giampetro 1997). In some western irrigated corn acreage, groundwater is being mined at a rate 25% faster than the natural recharge of its aquifer (Pimentel 2003).

Biodiversity

Every acre of forest and wetland converted to crop land decreases soil biota, insect, bird, reptile, and mammal biodiversity.

Part 4. Biodiesel: Can we eat enough French Fries?

The idea we could run our economy on discarded fried food grease is very amusing. For starters, you'd need to feed 7 million heavy diesel trucks getting less than 8 mpg. Seems like we're all going to need to eat a lot more French Fries, but if anyone can pull it off, it would be Americans. Spin it as a patriotic duty and you'd see people out the door before the TV ad finished, the most popular government edict ever.

Scale. Where's the Soy? Biodiesel is not ready for prime time. Although John Deere is working on fuel additives and technologies to burn more than 5% accredited biodiesel (made to ASTM D6751 specifications -- vegetable oil does not qualify), that is a long way off. 52 billion gallons of diesel fuel are consumed a year in the United States, but only 75 million gallons of biodiesel were produced -- less than two-tenths of one percent of what's needed. To get the country to the point where diesel was mixed with 5 percent biodiesel would require 64 percent of the soybean crop and 71,875 square miles of land (Borgman 2007), an area the size of the state of Washington. Soybeans cause even more erosion than corn.
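The production shortfall follows directly from the two consumption figures (a quick check of the arithmetic above):

    # Biodiesel production as a share of U.S. diesel consumption.
    diesel_gal = 52e9        # gallons of diesel consumed per year
    biodiesel_gal = 75e6     # gallons of biodiesel produced per year
    print(f"{biodiesel_gal / diesel_gal:.2%} of diesel demand")   # 0.14%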

Biodiesel shortens engine life. Currently, biodiesel concentrations higher than 5 percent can cause "water in the fuel due to storage problems, foreign material plugging filters..., fuel system seal and gasket failure, fuel gelling in cold weather, crankcase dilution, injection pump failure due to water ingestion, power loss, and, in some instances, can be detrimental to long engine life" (Borgman 2007). Biodiesel also has a short shelf life and is hard to store: it easily absorbs moisture (water is a bane to combustion engines), oxidizes, and gets contaminated with microbes. It increases engine NOx emissions (an ozone precursor) and degrades at high temperatures (John Deere 2006).

On the cusp of energy descent, we can't even run the most vital aspect of our economy, agricultural machines, on "renewable" fuels. John Deere tractors can run on no more than 5% accredited biodiesel (Borgman 2007). Perhaps this is unintentionally wise -- biofuels have yet to be proven viable, and mechanization may not be a great strategy in a world of declining energy.

Part 5. If we can't drink and drive, then burn baby burn.

Energy Crop Combustion

Wood is a crop, subject to the same issues as corn, and takes a lot longer to grow. Burning wood in your stove at home delivers far more energy than the logs would if converted to biofuels (Pimentel 2005). Wood was scarce in America when there were just 75 million people. Electricity from biomass combustion is not economic or sustainable.

Combustion pollution is expensive to control. Some biomass has absorbed heavy metals and other pollutants from sources like coal power plants, industry, and treated wood. Combustion can release chlorinated dioxins, benzofurans, polycyclic aromatic hydrocarbons, cadmium, mercury, arsenic, lead, nickel, and zinc.

Combustion contributes to global warming by adding nitrogen oxides and the carbon stored in plants back into the atmosphere, and it removes agriculturally essential nitrogen and phosphate (Reijnders 2006).

EROEI in doubt. Combustion plants need to produce, transport, prepare, dry, and burn biomass, and control toxic emissions. Collection is energy intensive, requiring some combination of bunchers, skidders, whole-tree choppers, or tub grinders, and then hauling to the biomass plant. There, the feedstock is chopped into similar sizes and placed on a conveyor belt to be fed to the plant. If biomass is co-fired with coal, it needs to be reduced in size, and the resulting fly ash may not be marketable to the concrete industry (Bain 2003). Any alkali or chlorine released in combustion gets deposited on the equipment, reducing overall plant efficiency and accelerating corrosion and erosion of plant components, which requires high replacement and maintenance energy.

Processing materials with different physical properties is energy intensive, requiring sorting, handling, drying, and chopping. It's hard to optimize the pyrolysis, gasification, and combustion processes if different combustible fuels are used. Urban waste requires a lot of sorting, since it often has material that must be removed, such as rocks, concrete and metal. The material that can be burned must also be sorted, since it varies from yard trimmings with high moisture content to chemically treated wood.

Biomass combustion competes with other industries that want this material for construction, mulch, compost, paper, and other profitable ventures, often driving the price of wood higher than a wood-burning biomass plant can afford. Much of the forest wood that could be burned is inaccessible due to a lack of roads.

Efficiency is lowered if material with a high water content is burned, like fresh wood. Different physical and chemical characteristics in fuel can lead to control problems (Badger 2002). When wet fuel is burned, so much energy goes into vaporizing the water that very little energy emerges as heat, and drying takes time and energy.

Material is limited and expensive. California couldn't use crop residues due to their low bulk density. In 2000, the viability of California's biomass enterprises was in serious doubt because the cost of producing biomass energy was so high, due to small facilities and the high cost of collecting and transporting material to the plants (Bain 2003).

Part 6. The problems with Cellulosic Ethanol could drive you to drink.

Many plants want animals to eat their seed and fruit to disperse them. Some seeds only germinate after passing through an animal's gut and coming out in ready-made fertilizer. Seeds and fruits are easy to digest compared to the rest of the plant; that's why all commercial ethanol and biodiesel is made from the yummy parts of plants -- the grain -- rather than the stalks, leaves, and roots.

But plants don't want to be entirely devoured. They've spent hundreds of millions of years perfecting structures that can't easily be eaten. Be thankful plants figured this out, or everything would be mown down to bedrock.

If we ever did figure out how to break down cellulose in our backyard stills, it wouldn't be long before the 6.5 billion people on the planet destroyed the grasslands and forests of the world to power generators and motorbikes (Huber 2006).

Don Augenstein and John Benemann, who've been researching biofuels for over 30 years, are skeptical as well. According to them, "...severe barriers remain to ethanol from lignocellulose. The barriers look as daunting as they did 30 years ago".

Benemann says the EROEI is easily determined: making cellulosic ethanol requires about five times as much energy as is contained in the ethanol, an EROEI of roughly 0.2.
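EROEI is simply energy returned divided by energy invested. A minimal sketch of that arithmetic, using Benemann's figure here and, for comparison, the Pimentel figures for switchgrass and wood cited later in this article:

    # EROEI = energy returned / energy invested; values below 1.0 are net losses.
    def eroei(energy_out, energy_in):
        return energy_out / energy_in

    print(eroei(1.0, 5.0))    # Benemann's cellulosic estimate: 0.2
    print(eroei(1.0, 1.5))    # switchgrass, 50% more energy in than out: ~0.67
    print(eroei(1.0, 1.57))   # wood, 57% more energy in than out: ~0.64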

The success of cellulosic ethanol depends on finding or engineering organisms that can tolerate extremely high concentrations of ethanol. Augenstein argues that such a creature would already exist if it were possible: organisms have had a billion years of evolutionary optimization in which to develop tolerance to high ethanol levels (Benemann 2006), and someone making beer, wine, or moonshine would have already discovered it.

The range of chemical and physical properties in biomass, even just corn stover (Ruth 2003, Sluiter 2000), is a challenge. It's hard to make cellulosic ethanol plants optimally efficient, because processes can't be tuned to such wide feedstock variation.

Where will the Billion Tons of Biomass for Cellulosic Fuels Come From?

The government believes there is a billion tons of biomass "waste" available to make cellulosic biofuels and chemicals and to generate electricity.

The United States lost 52 million acres of cropland between 1982 and 2002 (NCRS 2004). At that rate, all of the cropland will be gone in 140 years.

There isn't enough biomass to replace 30% of our petroleum use. The potential biomass energy is minuscule compared to the fossil fuel energy we consume every year -- about 105 exajoules (EJ) in the USA. If you burned every living plant and its roots, you'd have 94 EJ of energy, and we could all pretend we lived on Mars. Most of this 94 EJ of biomass is already being used for food and feed crops, and for wood for paper and homes. Sparse vegetation and the 30 EJ in root systems are economically unavailable -- leaving only a small amount of biomass unspoken for (Patzek June 2006).
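A back-of-the-envelope version of that comparison -- only the 105 EJ, 94 EJ, and 30 EJ figures come from Patzek; the leftover fraction is the point:

    # U.S. fossil energy use vs. total standing biomass (Patzek, June 2006).
    fossil_use_ej = 105.0    # annual U.S. fossil fuel consumption, exajoules
    all_biomass_ej = 94.0    # energy in every living plant, roots included
    roots_ej = 30.0          # root systems, economically unharvestable

    above_ground_ej = all_biomass_ej - roots_ej
    print(above_ground_ej)                   # 64 EJ at the absolute maximum
    print(above_ground_ej / fossil_use_ej)   # ~0.61 of one year's fossil use --
                                             # and most of it already feeds and houses us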

Over 25% of the "waste" biomass is expected to come from 280 million tons of corn stover. Stover is what's left after the corn grain is harvested. Another 120 million tons will come from soy and cereal straw (DOE Feedstock Roadmap, DOE Biomass Plan).

There isn't enough no-till corn stover to harvest. The success of cellulosic biofuels depends on corn residues, yet about 80% of farmers disk corn stover into the land after harvest. That renders it useless for fuel -- the crop residue is buried in mud and decomposes rapidly.

Only the 20 percent of farmers who farm no-till will have stover to sell. Yet the DOE Billion Ton vision assumes that all farmers are no-till, that 75% of residues will be harvested, and that corn and wheat yields will reach levels 50% higher than today's (DOE Billion Ton Vision 2005).

Many tons will never be available because farmers won't sell any of their residue, or not much of it (certainly not 75%).

Many more tons will be lost due to drought, rain, or loss in storage.

Sustainable harvesting of plants captures only a tiny fraction of the solar energy they receive -- 1/200th at best. Plants can fix only a small part of solar energy into plant matter every year: about one-tenth to one-half of one percent new growth in temperate climates.

To prevent erosion, you could only harvest 51 million tons of corn and wheat residues, not 400 million tons (Nelson 2002). Other factors, like soil structure, soil compaction, water depletion, and environmental damage, weren't even considered. Fifty-one million tons of residue could make about 3.8 billion gallons of ethanol -- less than 1% of our energy needs.
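To see how little that is, here is the arithmetic under stated assumptions -- the tonnage and gallon figures are from Nelson as cited above, while the ~76,000 Btu per gallon heating value of ethanol and the ~100 quadrillion Btu of annual U.S. energy use are round numbers I'm supplying:

    # Energy share of 3.8 billion gallons of ethanol (round-number constants assumed).
    ethanol_gal = 3.8e9
    btu_per_gal = 76_000       # approximate lower heating value of ethanol
    us_energy_btu = 100e15     # approximate annual U.S. energy use, ~100 quads

    share = ethanol_gal * btu_per_gal / us_energy_btu
    print(f"{share:.2%}")      # ~0.29% -- well under 1%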

Using corn stover is a problem, because corn, soy, and other row crops cause 50 times more soil erosion than sod crops (Sullivan 2004), or even more (Al-Kaisi 2000), and corn also uses more water, insecticides, and fertilizers than most crops (Pimentel 2003).

The amount of soy and cereal straw (wheat, oats, etc.) is insignificant. It would be best to use cereal grain straw, because grains use far less water and cause far less erosion than row crops like corn and soybeans. But that isn't going to happen: the green revolution fed billions more people by shortening grain height so that plant energy went into the edible seed, leaving little straw for biofuels. Although 90% of soybean and cereal acreage is often farmed no-till, the amount of cereal straw is insignificant, and the soybean residues must remain on the field to prevent erosion.

Limitations on Energy Crops

Poor, erodible land. There aren't enough acres of land to grow significant amounts of energy crops. Potential energy crop land is usually poor quality or highly erodible land that shouldn't be harvested. Farmers are often paid not to farm this unproductive land. Many acres in switchgrass are being used for wildlife and recreation.

Few suitable bio-factory sites. Biorefineries can't be built just anywhere -- very few sites could be found for switchgrass plants in all of South Dakota (Wu 1998). Much of the state didn't have enough water or adequate drainage to build an ethanol factory. The sites had to be on main roads, near railroad and natural gas lines, out of floodplains, on parcels of at least 40 acres to provide storage for the residues, with electric power, and with enough biomass nearby to supply the plant year round.

No energy crop farmers or investors. Farmers won't grow switchgrass until there's a switchgrass plant. Machines to harvest and transport switchgrass efficiently don't exist yet (Barrionuevo 2006). The capital to build switchgrass plants won't materialize until there are switchgrass farmers. Since "ethanol production using switchgrass required 50% more fossil energy than the ethanol fuel produced" (Pimentel 2005), investors for these plants will be hard to find.

Energy crops are subject to Liebig's law of the minimum too. Switchgrass may grow on marginal land, but it hasn't escaped the need for minerals and water. Studies have shown that the more rainfall, the more switchgrass you get, and if you remove the switchgrass, you'll need to fertilize the land to replace the lost biomass or get ever-lower yields with each harvest (Vogel 2002). Sugar cane has been touted as an "all you need is sunshine" plant, but according to the FAO, the nitrogen, phosphate, and potassium requirements of sugar cane are roughly similar to those of maize (FAO 2004).

Bioinvasive Potential. These fast-growing disease-resistant plants are potentially bioinvasive, another kudzu. Bioinvasion costs our country billions of dollars a year (Bright, 1998). Johnson grass was introduced as a forage grass and it's now an invasive weed in many states. Another fast-growing grass, Miscanthus, is now being proposed as a biofuel. It's been described as "Johnson grass on steroids" (Raghu 2006).

Sugar cane: too little to import. Brazil uses oil for 90% of its energy, and 17 times less oil than the U.S. (Jordan 2006). Brazilian ethanol production in 2003 was 3.3 billion gallons, about the same as the USA in 2004, or 1% of our transportation energy. Brazil uses 85% of its cane ethanol domestically, leaving only 15% for export.

Sugar cane: can't grow it here. Although, thanks to the sugar lobby, we grow some sugar cane in Florida despite tremendous environmental damage (WWF), we're too far north to grow a significant amount of sugar cane or other fast-growing C4 plants.

Wood ethanol is an energy sink. Ethanol production using wood biomass required 57% more fossil energy than the ethanol fuel produced (Pimentel 2005).

Wood is a nonrenewable resource. Old-growth forests had very dense wood, with a high energy content, but wood from fast-growing plantations is so low-density and low calorie it's not even good to burn in a fireplace. These plantations require energy to plant, fertilize, weed, thin, cut, and deliver. The trees are finally available for use after 20 to 90 years -- too long for them to be considered a renewable fuel (Odum 1996). Nor do secondary forests always come back with the vigor of the preceding forest due to soil erosion, soil nutrition depletion, and mycorrhizae destruction (Luoma 1999).

There's not enough wood to fuel a civilization of 300 million people. Over half of North America was deforested by 1900, at a time when there were only 75 million people (Williams 2003). Most of this was from home use. In the 18th century the average Northeastern family used 10 to 20 cords per year. At least one acre of woods is required to sustainably harvest one cord of wood (Whitney 1994).

Energy crop limits. Energy crops may not be sustainable due to water, fertilizer, and harvesting impacts on the soil (DOE Biomass Roadmap 2005). Like all other monoculture crops, ultimately yields of energy crops will be reduced due to "pest problems, diseases, and soil degradation" (Giampetro, 1997).

Energy crop monoculture. The "physical and chemical characteristics of feedstocks vary by source, by year, and by season, increasing processing costs" (DOE Feedstock Roadmap). That will encourage the development of genetically engineered biomass to minimize variation. Harvesting economies of scale will mean these crops will be grown in monoculture, just as food crops are. That's the wrong direction -- to farm with less energy there'll need to be a return to rotation of diverse crops, and composted residues for soil nutrition, pest, and disease resistance.

A way around this would be to spend more on researching how cellulose-digesting microbes tackle different herbaceous and woody biomass. The ideal energy crop would be a perennial, tall-grass prairie / herbivore ecosystem (Tilman 2006).

Farmers aren't stupid: They won't sell their residues. Farmers are some of the smartest people on earth or they'd soon go out of business. They have to know everything from soil science to commodity futures.

Crop production is reduced when residues are removed from the soil. Why would farmers want to sell their residues?

Erosion, water, compaction, nutrition. Harvesting stover on the scale needed to fuel a cellulosic industry won't happen, because farmers aren't stupid, especially the ones who work their own land. Although there is a wide range of opinion about how much residue can be harvested safely without causing erosion and loss of soil nutrition and soil structure, many farmers will want to be on the safe side and stick with the studies showing that at most 20% (Nelson 2002) to 30% (McAloon et al. 2000; Sheehan 2003) can be harvested, not the 75% agribusiness claims is possible. Farmers also care about water quality (Lal 1998, Mann et al. 2002). And farmers will decide that permanent soil compaction is not worth any price (Wilhelm 2004). As fertilizer prices inexorably rise with natural gas depletion, it will be cheaper to return residues to the soil than to buy fertilizer.

Residues are a headache. The further the farmer is from the biorefinery or railroad, the slimmer the profit, and the less likely a farmer will want the extra headache and cost of hiring and scheduling many different harvesting, collection, baling, and transportation contractors for corn stover.

Residues are used by other industries. Farm managers working for distant owners are more likely to sell crop residues since they're paid to generate profits, not preserve land. But even they will sell to the highest bidder, which might be the livestock or dairy industries, furfural factories, hydromulching companies, biocomposite manufacturers, pulp mills, or city dwellers faced with skyrocketing utility bills, since the high heating value of residue has twice the energy of the converted ethanol.

Investors aren't stupid either. If farmers can't supply enough crop residues to fuel the large biorefinery in their region, who will put up the capital to build one?

Can the biomass be harvested, baled, stored, and transported economically?

Harvesting. Sixteen-ton tractors harvest corn and spit out stover. Many of these harvesters are contracted and will continue to collect corn in the limited harvest time, not stover. If tractors are still available, the land isn't wet, snow doesn't fall, and the stover is dry, three additional tractor runs will mow, rake, and bale the stover (Wilhelm 2004). This will triple the compaction damage to the soil (Troeh 2005), create more erosion-prone tire tracks, increase CO2 emissions, add to labor costs, and put unwanted foreign matter into the bale (soil, rocks, baling wire, etc.).

So biomass roadmaps call for a new type of tractor or attachment to harvest both corn and stover in one pass. But then the tractor would need to be much larger and heavier, which could cause decades-long or even permanent soil compaction. Farmers worry that mixing corn and stover might harm the quality of the grain. And on the cusp of energy descent, is it a good idea to build an even larger and more complex machine?

If the stover is harvested, the soil is now vulnerable to erosion if it rains, because there's no vegetation to protect the soil from the impact of falling raindrops. Rain also compacts the surface of the soil so that less water can enter, forcing more to run off, increasing erosion. Water landing on dense vegetation soaks into the soil, increasing plant growth and recharging underground aquifers. The more stover left on the land, the better.

Baling. The current technology for harvesting residues is to put them into bales, as is done with hay. Hay is a dangerous commodity -- it can spontaneously combust, and once on fire, can't be extinguished, leading to fire losses and increased fire-insurance costs. Somehow the bales have to be kept from combusting during the several months it takes to dry them from 50 to 15 percent moisture. A large, well-drained, covered area is needed to vent fumes and dissipate heat. If the bales get wet, they will compost (Atchison 2004).

Baling was developed for hay and has been adapted to corn stover with limited success. Biorefineries need at least half a million tons of biomass on hand to smooth supply bumps -- far more than any bale system has been designed for. Pelletization is not an option; it's too expensive. Other options need to be found (DOE Feedstock Roadmap).

To get around the problems of exploding hay bales, wet stover could be collected. The moisture content needs to be around 60 percent, which means a lot of water will be transported, adding significantly to the delivery cost.

Storage. Stover needs to be stored with a moisture content of 15% or less, but it's typically 35-50%, and rain or snow during harvest will raise these levels even higher (DOE Feedstock Roadmap). If it's harvested wet anyhow, there'll be high or complete losses of biomass in storage (Atchison 2004).

Residues could be stored wet, as is done with ensilage, but a great deal of R&D is needed to see whether there are disease, pest, emission, runoff, groundwater contamination, dust, mold, or odor control problems. The amount of water required is unknown. The transit time must be short, or aerobic microbial activity will damage the biomass. At the storage site, the wet biomass must be immediately washed, shredded, and transported to a drainage pad under a roof for storage, instead of being baled when drier and left at the farm. The wet residues are heavy, making transportation costlier than for dry residues, perhaps uneconomical. They can freeze in the winter, making them hard to handle. If the moisture is too low, air gets in, making aerobic fermentation possible and resulting in molds and spoilage.

Transportation. Although a 6,000 dry-ton-per-day biorefinery would have 33% lower unit costs than one processing 2,000 dry tons per day, the price of gas and diesel limits the distance the biofuel refinery can be from farms, since the bales are large in volume but low in density, which limits how many can be loaded onto a truck and transported economically.

So the "economy of scale" achieved by a very large refinery has to be reduced to a 2,000 dry ton per day biorefinery. Even this smaller refinery would require 200 trucks per hour delivering biomass during harvest season (7 x 24), or 100 trucks per day if satellite sites for storage are used. This plant would need 90% of the no-till crop residues from the surrounding 7,000 square miles with half the farmers participating. Yet less than 20% of farmers practice no-till corn and not all of the farmland is planted in corn. When this biomass is delivered to the biorefinery, it will take up at least 100 acres of land stacked 25 feet high.

The average stover haul to the biorefinery would be 43 miles one way if these rosy assumptions all came true (Perlack 2002). If less than 30% of the stover is available, the average one-way trip becomes 100 miles and the biorefinery is economically impossible.

There is also a shortage of truck drivers, the rail system can't handle any new capacity, and trains are designed to operate between hubs, not intermodally (truck to train to truck). The existing transportation system has not changed much in 30 years, yet this congested, inadequate infrastructure somehow has to be used to transport huge amounts of ethanol, biomass, and byproducts (Haney 2006).

Cellulosic Biorefineries (see Appendix for more barriers)

Section III of the DOE "Roadmap for Agriculture Biomass Feedstock Supply in the United States" (DOE Feedstock Roadmap 2003) lists over 60 barriers to be overcome in making cellulosic ethanol. For example: "Enzyme Biochemistry. Enzymes that exhibit high thermostability and substantial resistance to sugar end-product inhibition will be essential to fully realize enzyme-based sugar platform technology. The ability to develop such enzymes and consequently very low cost enzymatic hydrolysis technology requires increasing our understanding of the fundamental mechanisms underlying the biochemistry of enzymatic cellulose hydrolysis, including the impact of biomass structure on enzymatic cellulose decrystallization. Additional efforts aimed at understanding the role of cellulases and their interaction not only with cellulose but also the process environment is needed to affect further reductions in cellulase cost through improved production". No wonder many of the issues with cellulosic ethanol aren't discussed -- there's no way to express the problems in a sound bite.

It may not be possible to reduce the complex cellulose digesting strategies of bacteria and fungi into microorganisms or enzymes that can convert cellulose into ethanol in giant steel vats, especially given the huge physical and chemical variations in feedstock. The field of metagenomics is trying to create a chimera from snips of genetic material of cellulose-digesting bacteria and fungi. That would be the ultimate Swiss Army-knife microbe, able to convert cellulose to sugar and then sugar to ethanol.

There's also research to replicate termite gut cellulose breakdown. Termites depend on fascinating creatures called protists in their guts to digest wood. The protists in turn outsource the work to multiple kinds of bacteria living inside of them. This is done with energy (ATP) and architecture (membranes) in a system that evolved over millions of years. If the termite could fire the protists and work directly with the bacteria, that probably would have happened 50 million years ago. This process involves many kinds of bacteria, waste products, and other complexities that may not be reducible to an enzyme or a bacteria.

Finally, ethanol must be delivered. A motivation to develop cellulosic ethanol is the high delivery cost of corn grain ethanol from the Midwest to the coasts, since ethanol can't be delivered cheaply through pipelines, but must be transported by truck, rail, or barge (Yacobucci 2003).

The whole cellulosic ethanol enterprise falls apart if the energy returned is less than the energy invested, or if even one of the major stumbling blocks can't be overcome. If there isn't enough biomass; if the residues can't be stored without combusting or composting; if the oil to transport low-density residues to biorefineries or to deliver the final product costs too much; if no cheap enzymes or microbes are found to break down lignocellulose in wildly varying feedstocks; if the energy to clean up toxic byproducts is too expensive; if organisms capable of tolerating high ethanol concentrations aren't found; or if the barriers in the Appendix can't be overcome -- then cellulosic fuels are not going to happen.

If the obstacles can be overcome, but we lose topsoil, deplete aquifers, poison the land, air, and water -- what kind of Faustian bargain is that?

Scientists have been trying to solve these issues for over thirty years now.

Nevertheless, cellulosic ethanol is worthy of research money -- just not public funds for commercial refineries until the issues above have been solved. It is the best hope we have for replacing the half million products made from and with fossil fuels, and for producing liquid transportation fuels when population falls to pre-coal levels.

Part 7. Where do we go from here?

Subsidies and Politics

How come there are over 116 ethanol plants with 79 under construction and 200 more planned? The answer: subsidies and tax breaks.

Federal and state ethanol subsidies add up to 79 cents per liter (McCain 2003), with most of that going to agribusiness, not farmers. There is also a tax break of 5.3 cents per gallon for ethanol (Wall Street Journal 2002). An additional 51 cents per gallon goes mainly to the oil industry to get them to blend ethanol with gasoline.
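For comparison with the per-gallon figures above, the 79 cents per liter converts as follows (3.785 liters per gallon is the standard factor):

    # Convert the McCain 2003 subsidy estimate from per-liter to per-gallon.
    subsidy_per_liter = 0.79     # dollars per liter
    liters_per_gallon = 3.785

    print(f"${subsidy_per_liter * liters_per_gallon:.2f} per gallon")   # ~$2.99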

In addition to the $8.4 billion per year in subsidies for corn and ethanol production, the consumer pays more for any product containing corn (Pollan 2005) -- beef, milk, and eggs -- because corn diverted to ethanol raises the price of corn for the livestock industry.

Worst of all, the subsidies may never end, because Iowa plays a leading role in selecting the next president. John McCain has softened his stand on ethanol (Birger 2006). All four senators from California and New York have pointed out that "ethanol subsidies are nothing but a way to funnel money to agribusiness and corn states at the expense of the rest of the country" (Washington Post 2002).

"Once we have a corn-based technology up and running the political system will protect it," said Lawrence J. Goldstein, a board member at the Energy Policy Research Foundation. "We cannot afford to have 15 billion gallons of corn-based ethanol in 2015, and that's exactly where we are headed" (Barrionuevo 2007).

Conclusion

Soil is the bedrock of civilization (Perlin 1991, Ponting 1993). Biofuels are not sustainable or renewable. Why would we destroy our topsoil, increase global warming, deplete and pollute groundwater, destroy fisheries, and use more energy than what's gained to make ethanol? Why would we do this to our children and grandchildren?

Perhaps it's a combination of pork barrel politics, an uninformed public, short-sighted greedy agribusiness corporations, jobs for the Midwest, politicians getting too large a percent of their campaign money from agribusiness (Lavelle 2007), elected leaders without science degrees, and desperation to provide liquid transportation fuels (Bucknell 1981, Hirsch 2005).

But this madness puts our national security at risk. Destruction of topsoil and collateral damage to water, fisheries, and food production will mean less food to eat or to sell for petroleum and natural gas imports. Diverting precious, dwindling energy and money to impossible solutions is a threat to our nation's future.

Recommendations

Fix the unsustainable and destructive aspects of industrial agriculture. At least some good would come out of the ethanol fiasco if more attention were paid to how we grow our food. The effects of soil erosion on crop production have been hidden by mechanization and intensive use of fossil fuel fertilizers and chemicals on crops bred to tolerate them. As energy declines, crop yields will decline as well.

Jobs. Since part of what's driving the ethanol insanity is job creation, divert the subsidies and pork barrel money to erosion control and sustainable agriculture. Maybe Iowa will emerge from its makeover looking like Provence, France, and volunteers won't be needed to hand out free coffee at rest areas along I-80.

Continue to fund cellulosic ethanol research, focusing on how to make the 500,000 fossil-fuel-based products (medicines, chemicals, plastics, etc.) and fuel for when population declines to pre-fossil-fuel carrying capacity. The feedstock should come from a perennial, tall-grass prairie herbivore ecosystem, not food crops. But don't waste taxpayer money building demonstration or commercial plants until most of the research and sustainability barriers have been solved.

California should not adopt the E10 ethanol blend for global warming bill AB 32. Biofuels are at best neutral and at worst contribute to global warming. A better early action item would be to favor low-emission vehicle sales and require all new cars to have energy-efficient tires.

Take away the E85 loophole that allows Detroit automakers to ignore CAFE standards and get away with selling even more gas guzzling vehicles (Consumer Reports 2006). Raise the CAFE standards higher immediately.

There are better, easier ways to stretch out petroleum than adding ethanol to it. Just keeping tires inflated properly would save more energy than all the ethanol produced today. Reducing the maximum speed limit to 55, consumer driving tips, truck stop electrification, and many other measures can save far more fuel in a shorter time than biofuels ever will, far less destructively. Better yet, Americans can bike or walk, which will save energy used in the health care system.

Let's stop the subsidies and see if ethanol can fly.

Reform our non-sustainable agricultural system

Give integrated pest management and organic agriculture research more funding.

The Natural Resources Conservation Service (NRCS) and other conservation agencies have done a superb job of lowering the erosion rate since the Dust Bowl of the 1930s. Give these agencies a larger budget to further the effort.

To promote land stewardship, change taxes and zoning laws to favor small family farms. This will make possible the "social, economic, and environmental diversity necessary for agricultural and ecosystem stability" (Opie 2000).

Make the land grant universities follow the directive of the Hatch Act of 1887 to improve the lives of family farmers. Stop funding agricultural mechanization and petrochemical research and start funding how to fight pests and disease with diverse crops, crop rotations, and so on (Hightower 1978).

Don't allow construction of homes and businesses on prime farm land.

Integrate livestock into the crop rotation.

Teach family farmers and suburban homeowners how to maximize food production in limited space with Rodale and Biointensive techniques.

Since less than 1 percent of our elected leaders and their staff have scientific backgrounds, educate them in systems ecology, population ecology, soil, and climate science. So many of the important issues that face us need scientific understanding and judgment.

Divert funding from new airports, roads, and other future senseless infrastructure towards research in solar, wind, and cellulosic products. We're at the peak of scientific knowledge and our economic system hasn't been knocked flat yet by energy shortages -- if we don't do the research now, it may never happen.

It's not unreasonable to expect farmers to conserve the soil, since the fate of civilization lies in their hands. But we need to pay farmers for far more than the cost of growing food so they can afford to conserve the land. In an oil-less future, healthy topsoil will be our most important resource.

Responsible politicians need to tell Americans why their love affair with the car can't continue. Leaders need to make the public understand that there are limits to growth, and that an increasing population leads to the "Tragedy of the Commons" -- even if it means they won't be re-elected. Arguing this amid the prevailing church of development is like walking into a Bible-belt church and telling the congregation God doesn't exist, but it must be done.

We are betting the farm on making cellulosic fuels work at a time when our energy and financial resources are diminishing. No matter how desperately we want to believe that human ingenuity will invent liquid or combustible fuels despite the laws of thermodynamics and how ecological systems actually work, the possibility of failure needs to be contemplated.

Living in the moment might be enlightenment for individuals, but for a nation, it's disastrous. Is there a Plan B if biofuels don't work? Coal is not an option. CO2 levels over 1,000 ppm could lead to the extinction of 95% of life on the planet (Lynas 2007, Ward 2006, Benton 2003).

Here we are, on the cusp of energy descent, with mechanized petrochemical farms. We import more farm products now than we sell abroad (Rohter 2004). Suburban sprawl destroys millions of acres of prime farm land as population grows every year. We've gone from 7 million family farms to 2 million much larger farms and destroyed a deeply satisfying rural way of life.

There need to be plans for de-mechanizing the farm economy if liquid fuels aren't found. There are fewer than four million horses, donkeys, and mules in America today. According to Bucknell, if the farm economy were de-mechanized, you'd need at least 31 million farm workers and 61 million horses (Bucknell 1981).

The population of the United States has grown over 25 percent since Bucknell published Energy and the National Defense. To de-mechanize now, we'd need 39 million farm workers and 76 million horses -- and the horsepower of farm tractors alone is equal to 400 million horses. It's time to start increasing horse and oxen numbers, which will leave even less biomass for biorefineries.
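Those updated numbers are just Bucknell's figures scaled by 25 percent population growth:

    # Scale Bucknell's 1981 de-mechanization estimates by ~25% population growth.
    workers_1981 = 31e6
    horses_1981 = 61e6
    growth = 1.25

    print(workers_1981 * growth / 1e6)   # ~39 million farm workers
    print(horses_1981 * growth / 1e6)    # ~76 million horses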

We need to transition from petroleum power to muscle power gracefully if we want to preserve democracy. Paul Roberts wonders whether the coming change will be "peaceful and orderly or chaotic and violent because we waited too long to begin planning for it" (Roberts 2004).

What is the carrying capacity of the nation? Is it 100 million (Pimentel 1991) or 250 million (Smil 2000)? Whatever carrying capacity is decided upon, pass legislation to drastically lower immigration and encourage one child families until America reaches this number. Or we can let resource wars, hunger, disease, extreme weather, rising oceans, and social chaos legislate the outcome.

Do you want to eat or drive? Even without growing food for biofuels, crop production per capita is going to go down as population keeps increasing, fossil fuel energy decreases, topsoil loss continues, and aquifers deplete, especially the Ogallala (Opie 2000). Where will the money come from to buy imported oil and natural gas if we don't have food to export?

There is no such thing as "waste" biomass. As we go down the energy ladder, plants will increasingly be needed to stabilize climate, provide food, medicine, shelter, furniture, heat, light, cooking fuel, clothing, etc.

Biofuels are a threat to the long-term national security of our nation. Is Dr. Strangelove in charge, with a plan to solve defense worries by creating a country that's such a salty polluted desert, no one would want to invade us? Why is Dr. Strangelove spending the last bits of energy in Uncle Sam's pocket on moonshine? Perhaps he's thinking that we're all going to need it, and the way things are going, he's probably right.

Appendix: Department of Energy Biofuel Roadmap Barriers

This is a partial summary of biofuel barriers from the Department of Energy. Unless otherwise footnoted, the problems with biomass fuel production are from the Multi Year Program Plan (DOE Biomass Plan) or the Roadmap for Agriculture Biomass Feedstock Supply in the United States (DOE Feedstock Roadmap).

Resource and Sustainability Barriers

1) Biomass feedstock will ultimately be limited by finite amounts of land and water.

2) Biomass production may not be sustainable because of impacts on soil compaction, erosion, carbon, and nutrition.

3) Nor is it clear that perennial energy crops are sustainable, since not enough is known about their water and fertilizer needs, harvesting impacts on the soil, etc.

4) Farmers are concerned about the long-term effects on soil, crop productivity, and the return on investment when collecting residues.

5) The effects of biomass feedstock production on water flows and water quality are unknown.

6) The risks of impact on biodiversity and public lands haven't been assessed.

Economic Barriers (or Investors Aren't Stupid)

1) Biomass can't compete economically with fossil fuels in transportation, chemicals, or electrical generation.

2) There aren't any credible data on price, location, quality and quantity of biomass.

3) Genetically-modified energy crops worry investors because they may create risks to native populations of related species and affect the value of the grain.

4) Biomass refineries are inherently more expensive than fossil fuel refineries because:

a) Biomass is of such low density that it can't be transported over large distances economically. Yet analysis has shown that biorefineries need to be large to be economically attractive, so it will be difficult to find enough biomass close enough to the refinery for affordable delivery.

b) Biomass feedstock amounts are unpredictable, since unknown quantities will be lost to extreme weather, sold to non-biofuel businesses, rot or combust in storage, or be used by farmers to improve their soil.

c) Ethanol can't be delivered in pipelines due to likely water contamination, so delivery by truck, barge, and rail is more expensive. Ethanol is also a hazardous commodity, which adds to its transportation and handling costs.

d) Biomass varies so widely in physical and chemical composition, size, shape, moisture levels, and density that it's difficult and expensive to supply, store, and process.

e) The capital and operating costs to bale, stack, palletize, and transport residues are high.

f) Biomass is more geographically dispersed, and in much more ecologically sensitive areas than fossil resources.

g) The synthesis gas produced has potentially higher levels of tars and particulates than fossil fuels.

h) Biomass plants can't benefit from the same large-scale cost savings of oil refineries because biomass is too dispersed and of low density.

5) Consumers won't buy ethanol because it costs more than gasoline and contains 34% less energy per gallon. Consumer Reports wrote that it got the lowest fuel mileage in recent years from ethanol due to its low energy content compared to gasoline, effectively making ethanol $3.99 per gallon (see the sketch below). Worse yet, automakers are getting fuel-economy credits for every E85-burning vehicle they sell, which lowers the overall mileage of auto fleets, increases the amount of oil used, and lessens energy independence. (Consumer Reports)
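The "effective price" follows from the energy penalty: divide the pump price by the fraction of gasoline's energy the fuel actually delivers. A minimal sketch inverting the $3.99 figure, taking the text's 34 percent deficit at face value (the exact blend Consumer Reports tested and its pump price aren't given here):

    # Effective price = pump price / (fraction of gasoline's energy delivered).
    # Inverted: what pump price corresponds to an effective $3.99 per gallon?
    effective_price = 3.99       # dollars per gasoline-equivalent gallon
    energy_fraction = 1 - 0.34   # ethanol delivers ~66% of gasoline's energy per gallon

    pump_price = effective_price * energy_fraction
    print(f"${pump_price:.2f} at the pump")   # ~$2.63 -- cheap-looking, but not cheap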

Equipment and Storage Barriers

1) There are no harvesting machines to harvest the wide range of residue from different crops, or to selectively harvest components of corn stover.

2) Current biomass harvesting and collection methods can't handle the many millions of tons of biomass that need to be collected.

3) How to store huge amounts of dry biomass hasn't been figured out.

4) No one knows how to store and handle vast quantities of different kinds of wet biomass. You can lose it all, since it's prone to spoiling, rotting, and spontaneous combustion.

Preprocessing Barriers

1) We don't even know what the optimum properties of biomass for producing biofuels are, let alone have instruments to measure those unknown qualities.

2) Incoming biomass has impurities that have to be removed before grinding, compacting, and blending, or they may damage equipment and foul chemical and biological processes downstream.

3) Harvest season for crops can be so short that it will be difficult to find the time to harvest cellulosic biomass and pre-process and store a year of feedstock stably.

4) Cellulosic biomass needs to be pretreated so that it's easier for enzymes to break down. Biomass has evolved for hundreds of millions of years to resist chemical and biological degradation. How to overcome this resistance isn't yet well enough understood to design efficient and cost-effective pretreatments.

5) Pretreatment reactors are made of expensive materials to resist acid and alkalis at high temperatures for long periods. Cheaper reactors or low acid/alkali biomass is needed.

6) To create value added products, ways to biologically, chemically, and mechanically split components off (fractionate) need to be figured out.

7) Corn mash needs to be thoroughly sterilized before microorganisms are added, or a bad batch may ensue. Bad batches pollute waterways if improperly disposed of. (Patzek Dec 2006).

Cellulosic Ethanol Showstoppers

1) The enzymes used in cellulosic biomass production are too expensive.

2) An enzyme that breaks down cellulose must be found that isn't disabled by high heat, ethanol, or other end-products, and other low-cost enzymes are needed for specific tasks in other processes.

3) If these enzymes are found, then cheap methods to remove the impurities generated are needed. Impurities like acids, phenols, alkalis, and salts inhibit fermentation and can poison chemical catalysts.

4) Catalysts for hydrogenation, hydrogenolysis, dehydration, upgrading pyrolysis oils, and oxidation steps are essential to producing chemicals, materials, and transportation fuels. These catalysts must be cheap, long-lasting, able to work in fouled environments, and 90% selective.

5) Ethanol production needs major improvements in finding robust organisms that utilize all sugars efficiently in impure environments.

6) Key to making the process economic are cheap, efficient fermentation organisms that can produce chemicals and materials. Wald writes that the bacteria scientists are trying to tame come from the guts of termites, and they're much harder to domesticate than yeast was. Nor have we yet convinced "them to multiply inside the unfamiliar confines of a 2,000-gallon stainless-steel tank" or "control their activity in the industrial-scale quantities needed" (Wald 2007).

7) Efficient aerobic fermentation organisms are needed to lower capital fermentation costs.

8) Fermentation organisms are needed that can make 95% pure fermentation products.

9) Cheap ways of removing impurities generated in fermentation and other steps are essential since the costs now are far too high.


HUMAN IGNORANCE IS GROWING

By Peter Montague

Some scientists may not like to admit it, but we humans are pretty much flying blind when we intrude into natural systems. Our understanding of the natural world is rudimentary at best. As a result, many of our technologies end up scrambling the natural world, replacing the natural order with disorder.

In this issue of Rachel's News we learn about three new problems --

** The mysterious disappearance of millions upon millions of bees, whose pollination services support $14 billion in farm production each year. At this point, the cause is a complete mystery, but almost certainly humans have a hand in it.

** A new virus has appeared in the Great Lakes during the past few years, and it is spreading westward through the lakes, killing large numbers of fish and thus endangering a $4 billion fishing industry. The main suspect is ships arriving from foreign ports and discharging their ballast water into the Lakes.

** The development of herbicide-resistant weeds that are creating major headaches (and costs) for cotton farmers. Monsanto's genetically-engineered cotton was created to withstand heavy application of Monsanto's most profitable weed-killer, glyphosate (sold widely under the trade name Roundup). When Monsanto announced "Roundup-Ready" cotton, everyone knew it was only a matter of time before Roundup-resistant weeds would develop, because that's how nature works. When a weed-killer is applied, a few hardy weeds survive; they multiply while the others die. Soon the hardy weeds dominate -- and farmers find themselves without an easy or affordable way to manage the new weed problem. Presumably Monsanto's business plan was to stay one step ahead of nature, always having a new chemical ready to sell to farmers, to help them overcome the problems created by yesterday's chemical. Unfortunately, it hasn't worked out that way and the farmers are hurting.

Our Ignorance is Expanding

As time passes, we should expect a continuing (perhaps even accelerating) stream of bad news about human intrusions into natural systems. In a very real sense the systems we are trying to study are growing more complicated as we scramble them, so understanding them is becoming more difficult.

Take the problem of disappearing amphibians (frogs, toads and salamanders). In 1989, scientists began noticing frogs were disappearing around the globe. They identified many causes:

** loss of wetland habitat (rice paddies turning into golf courses, for example, and swamps turning into condominiums);

** increased ultraviolet radiation arriving at the surface of the earth because DuPont's chlorofluorocarbons (CFCs) have depleted the Earth's ozone shield;

** stocking streams with edible sport fish (e.g., large-mouth bass) that eat tadpoles;

** acid rain and acid snow caused by combustion of fossil fuels;

** increasing droughts and floods brought on by global warming, which are taking their toll on amphibians;

** pollutants that mimic the female sex hormone, estrogen, may be interfering with the reproductive cycle of amphibians, as is known to be happening with fish;

** amphibians may have started falling prey to bacteria and viruses with which they have co-existed for 200 million years -- indicating, perhaps, that some combination of environmental insults has weakened amphibian immune systems.

The truth is, no one knows what combination of these (and other, perhaps yet-unrecognized) changes in natural systems has contributed to the disappearance of frogs, toads, and salamanders all across the planet.

One thing is sure: every time we introduce a new chemical into commerce, it enters natural systems and makes the job of scientists more difficult because the system they are studying is now more complex than it was yesterday. In the U.S., we introduce roughly 1800 new chemicals into commerce each year.

As our technology expands, our ability to understand what is going on in nature declines, and we are flying blinder and blinder.

Until we take a precautionary approach, give the benefit of the doubt to natural systems, and do our level best to understand our actions before we act, we are in for an endless parade of unpleasant and expensive surprises. Yes, a precautionary approach would mean that the pace of technological innovation would slow down (compared to today's frenetic pace) -- but it would help avoid expensive problems like the loss of bees, the invasion of new viruses into the Great Lakes, and creation of Super Weeds. It might also give humans a better chance of surviving as a species.

FISH-KILLING VIRUS SPREADING IN THE GREAT LAKES

By Susan Saulny

CHICAGO, April 20 -- A virus that has already killed tens of thousands of fish in the eastern Great Lakes is spreading, scientists said, and now threatens almost two dozen aquatic species over a wide swath of the lakes and nearby waterways.

The virus, a mutated pathogen not native to North America that causes hemorrhaging and organ failure, is not harmful to humans, even if they eat contaminated fish. But it is devastating to the ecosystem and so unfamiliar, experts said, that its full biological impact might not be clear for years. It is also having a significant impact on the lakes' $4 billion fishing industry.

There is no known treatment for the virus. As a result, scientists are focusing on managing its spread to uncontaminated water -- quite a challenge since the Great Lakes are linked and fall under the jurisdiction of several states and provinces in Canada.

"Updates over the winter suggest it has spread further than we thought, even last year," said John Dettmers, a senior fisheries biologist for the Great Lakes Fisheries Commission in Ann Arbor, Mich.

"It's really early," Mr. Dettmers said. "As much as I'd like to say we know exactly what's going on, we don't. We're all sitting here on the edge of our chairs waiting to see how bad it's going to be this year."

When it was first detected about two years ago, the virus had affected only two species in a limited amount of water. But it has aggressively spread to other areas and other fish and is now being confirmed in Lake Huron after infecting Lakes Ontario and Erie, Lake St. Clair, the St. Lawrence River and the Niagara River. It is suspected in Lake Michigan as well, although there is no official confirmation.

Last year, the virus, called viral hemorrhagic septicemia and known as V.H.S., caused untold thousands of dead fish to wash up in places like the eastern shoreline of Lake Ontario, a warning sign that scientists said could just be the tip of the iceberg in terms of what is going on underwater.

The five Great Lakes -- Superior, Erie, Huron, Michigan and Ontario -- hold 20 percent of the world's fresh surface water.

"We anticipate that this will continue and get worse over the next few years," said Dr. Jim Casey, associate professor of virology at Cornell University. "We fear there may be more widespread presence of the virus."

One of Dr. Casey's colleagues researching the virus, Dr. Paul Bowser, a professor of aquatic animal medicine, added, "This is a new pathogen and for the first number of years -- 4, 5 or 10 years -- things are going to be pretty rough, then the animals will become more immune and resistant and the mortalities will decline."

No one is sure where the virus came from or how it got to the Great Lakes. In the late 1980s, scientists saw a version of V.H.S. in salmon in the Pacific Northwest, which was the first sighting anywhere in North America. V.H.S. is also present in the Atlantic Ocean. But the genesis of a new, highly aggressive mutated strain concentrating on the Great Lakes is a biological mystery.

"We really don't know how it got there," said Jill Roland, a fish pathologist and assistant director for aquaculture at the U.S. Department of Agriculture. "People's awareness of V.H.S. in the lakes was unknown until 2005. But archived samples showed the virus was there as early as 2003."

Scientists pointed to likely suspects, mainly oceangoing vessels that dump ballast water from around the world into the Great Lakes. (Ships carry ballast water to help provide stability, but it is often contaminated and provides a home for foreign species. The water is loaded and discharged as needed for balance.)

Fish migrate naturally, but also move with people as they cast nets for sport, for instance, or move contaminated water on pleasure boats from lake to lake.

The United States Department of Agriculture issued an emergency order in October to prohibit the movement of live fish that are susceptible to the virus out of the Great Lakes or bordering states. The order was later amended to allow limited movement of fish that tested negative for the virus.

"Getting rid of it is extremely hard to foresee," said Henry Henderson, director of the Natural Resources Defense Council's Midwest office in Chicago. "These species spread, and reproduce. It is a living pollution."


THE DEATH OF RECYCLING

By Paul Palmer, Ph.D

[Paul Palmer holds a Ph.D. in Physical Chemistry from Yale. He is interested in hearing from readers who may want to join him in starting a new organization focused on zero waste. Contact him at paulp@sonic.net.]

A little more than a year ago, an article entitled The Death of Environmentalism by Shellenberger and Nordhaus made a splash when it claimed that environmentalists had become complacent, relying on their time-honored methods of banning behaviors that they found objectionable through political and judicial activism, rather than through engaging the moral base of the American public. The critique was applied to the looming crisis of global warming and seemed to portend a gigantic failure if environmentalists did not embrace a new awareness of public concern and participation and stop relying on public policy correctness and technical fixes.

Shellenberger and Nordhaus noted the lack of inspiring visions to energize the public. They went on to lament the smugness of officials of environmental organizations who have become used to rich rewards in salaries, grants, dues and acclaim from growing membership lists.

There is a movement for resource recapture that suffers from the same defects. It has come to be called Recycling. Latterly it too has become lazy, relying on yesterday's methods and advancing no new ideas to inspire the public. The practitioners, while not profiting from dues or grants nearly as much as the defenders of wildlife, nevertheless have their own stultifying income source: they have become used to income derived from the low-grade collection of garbage. Their method is to pick away at garbage streams, recapturing small amounts of smashed-up, low-grade materials. Alternatively they profit by exacting garbage dumping surcharges, resembling guilt taxes, from the dumpers. They have formed close alliances with the garbage industry, the two often being indistinguishable. Since no approach to conservation that relies on harvesting garbage can ever threaten the garbage paradigm, they have no way to inspire the public. They do promote themselves mendaciously as being fundamentally opposed to garbage, but that ideology is merely a holdover from a time when recycling was young. The contradiction is disturbing to even casual observers.

What would you think about a gigantic piece of the environmental movement, involving trillions of dollars worth of resources annually in this country alone, that environmentalists ignore? The way in which resources are used to create products is exactly such an item. After working in this field for thirty years, I have seen that environmentalists are afraid to deal with industrial production because they don't understand it. It seems like a technical subject that they have no hope of getting a handle on. If a single resource is badly harvested, like old growth trees, they will organize. If the process produces an obvious pollution, they will demand regulation to correct it. But there it stops. The way in which products are designed specifically for waste is simply not on their screen.

In the United States, recycling as a theory of resource management arose in the nineteen seventies. Since that time, no new theory or even interpretation has been put forward until today. Three major developments should be noted.

** First, the garbage industry realized that it could take over the movement for recycling, turning it into a division of garbage management, finally paying recyclers a surcharge to co-opt them.

** Second, the recyclers accepted the pre-eminence of the garbage industry and dropped any notion of replacing or closing dumps.

** Third, a few progressive individuals and organizations began to discuss a new resource management plan to which they gave the name zero waste.

At some point, the recyclers, now working for garbage management, saw that zero waste could become a slogan that appeared to the public to be a higher theory of resources. Because of their immersion in the recycling paradigm as an ultimate theory, they were actually unable to put flesh on the bones of the zero waste approach, but they began to spread the bare slogan. On the ground, nothing changed. The recyclers went to a number of jurisdictions (several California cities especially) and urged the cities to join them in putting forward the facade of a new resource management plan, which a number of cities did. In actuality, the new plans were zero waste in name only. They proposed nothing more than additional recycling.

How Does Zero Waste Differ From Recycling?

What should have been in such plans, that would have revealed a truly new theory of resource conservation? The essence of the new synthesis can be summed up in one pregnant phrase: redesign for reuse. But what kind of redesign for what kind of reuse? That is where the new theory flexes its muscles.

The basic problem that has always plagued recycling is that it accepts garbage creation as fundamental. Zero waste strategies reject garbage creation as a failure, actually an abomination that threatens the planet, an historical accident, a politically motivated defect in the design of our industrial-commercial system of production. Zero waste actually goes deeper in that it rejects waste of every kind at every stage of production. Zero waste demands that all products be redesigned so that they produce no waste at all and furthermore, that the production processes (a kind of product in themselves because they too are carefully designed) also produce no waste. Zero waste at no point interfaces with garbage but rather simply looks beyond it. In the theory of zero waste, once all waste is eliminated, there will be no garbage, no need for any garbage collection, no garbage industry and no dumps. All that superstructure of garbage management will fade away as simply irrelevant.

The currently operative theory of recycling is entirely different. It contemplates the continual, even perpetual collection of garbage and then attempts to find innovative ways to reuse the maximum part of that garbage. In the current jargon, recycling is an end-of-pipe theory. Zero waste is a redesign theory. Because end-of-pipe approaches are necessarily inefficient and difficult (since products were never designed for reuse), the best that recycling is able to hold out for in most cases is destruction of products after one use (through smashing, chopping, grinding, etc.) and the laborious recapture of only the bare materials. Thus the common recycling obsession with steel, aluminum, paper, glass and plastic, ignoring the fifty thousand additional common chemicals, plastics, metals, glasses, minerals and so on. It is no exaggeration to say that recycling has no comment on the vast majority of products, processes and materials, while zero waste has solutions or improved practices to offer for every single product, production process, material and (current) waste. In addition, zero waste offers a compelling spirituality, as it elevates the conservation of our one precious planet to the level of a holy creed and demands that our design for resource usage reflect that creed.

Recyclers also try to find last-minute ways of reuse, such as is done by thrift shops, by turning junk into artwork or by construction reuse yards which resell doors, windows, sinks and more. While a single piece reused is indeed a victory, these are again end-of-pipe enterprises which probably account for less than 1% of all discards. Zero waste seeks to elevate reuse into an integral part of the design of 100% of all of these products.

In the new zero waste theory, products are designed from the start to be reused over and over. After many uses, including repairs, rebuilding, remanufacturing and so on, disassembly into materials may become necessary for a step that resembles recycling, but even at this last stage, the reuse of materials has been carefully designed into the original product, planned for in many critical ways. Thus even when zero waste comes down to the reuse of component materials, it does so in a way that is sharply different from an end-of-pipe method. For example, zero waste principles strongly recommend against the lamination or joining of different materials in an indissoluble bond unless the lamination can be reused as a lamination, or disassembled. All parts must be well identified by markings and history, not guessed at from inadequate symbols (like the recycling labels on plastic) of such generality that they convey little information of any use. Extensive information about every part, every piece, every material will be key, using every tool of modern information tracking such as radio-frequency identification (RFID) tags, bar codes, public specifications and the internet. Recyclers, by contrast, have no response to difficult items like laminations except to toss them into a dump as non-recyclable.
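To make the idea concrete, here is a minimal sketch, in Python, of the kind of per-part record such a tracking system might keep. Every field name here is a hypothetical illustration, not an existing standard and not the author's specification:

    # A sketch of a per-part record for design-for-reuse tracking.
    # All field names are hypothetical illustrations, not a real standard.
    from dataclasses import dataclass, field

    @dataclass
    class PartRecord:
        rfid_tag: str        # unique identifier readable at disassembly time
        material: str        # exact material specification, not a vague symbol
        spec_url: str        # public specification published on the internet
        separable: bool = True                       # False flags an indissoluble bond
        history: list = field(default_factory=list)  # repairs, rebuilds, reuses

    shell = PartRecord(
        rfid_tag="A1B2-C3D4",
        material="ABS, injection grade, uncolored",
        spec_url="https://example.org/specs/abs-shell",  # placeholder URL
    )
    shell.history.append("remanufactured; shell reused intact")

The point is only that the information needed for disassembly and reuse travels with the part, instead of being guessed at from a molded-in symbol.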

If zero waste thinking is new to you, you may be wondering how all of this can be done. In my book, Getting To Zero Waste, I detail many practical applications that are simple and straightforward even in this world of production dedicated to waste. As one simple example, repair has all but faded away. Repair of electronic and other technical instruments has been replaced with discard followed by purchase of a new, cheap product from China. But why did repair become so economically disfavored? For electronics, four major reasons were these:

** Circuit diagrams were generally unobtainable, necessitating a constant series of educated guesses.

** When circuit diagrams were obtained, they were filled with arcane, uninterpretable proprietary symbols. Even a simple resistor could be unrecognizable.

** Parts, including simple mechanical ones, were not grouped into standard, interchangeable assemblies such as a standard circuit board or a tape loading mechanism.

** Lastly, parts themselves became unavailable, sometimes after a few months or at most after a few years.

Look at this list! The needed changes leap off the page at you. Begin by demanding, under pain of not being allowed to sell product, that every single circuit diagram must be published openly on the web, for all to see. Then demand that all symbols used on the diagrams must be publicly understandable and explained. Insist that repair shops be established, or certified, for every group of products or by every manufacturer. Require long-term availability of parts. This is only the beginning, yet it shows the narrow end of a funnel opening up to a revolution in reparability.

Even products that the recyclers have no clue how to reuse or even think about are commonplace for zero waste strategizing. I worked successfully and easily in chemical reuse for thirty years. The recyclers have never had any ideas to offer except fear, bans and urging users to discard chemicals into dumps. The software industry depends critically on reuse of its intellectual creations, yet the recyclers have only the trivial focus on the paper or discs that are used for storage of software.

Why have the designers been able to design waste so cavalierly into their products? A large part of the answer is the ready availability of a subsidized dump. As we get further into a zero waste society, dumps will not only become unnecessary but as soon as any zero waste solution can be applied, the dump can be legally put off limits. When there is no eternally welcoming dump for a product, there will be no alternative to designing for perpetual reuse.

Necessarily, this is only a hint of a long discussion. Are there too many products and designs for you to contemplate? Simple: establish Zero Waste Divisions in the Engineering Departments of every university. Obviously this author, or a hundred like him, are not going to be able to subject every product to a deep analysis with solutions. That is the function of research. Let us give employment to thousands of industrial redesigners, chemical engineers, biologists, fermentation technicians and every other kind of professional. The kinds of jobs found in the garbage industry are not worth hanging on to, compared to the brand-new jobs needed for innovative, intelligent and responsible design of products for perpetual reuse. Design for responsibility should create a flood of new patents, protecting designs which can then spread worldwide as brand-new businesses carry the message around the globe that the Age of Garbage has ended.

By now the reader can see that zero waste differs from recycling approaches in an important way: its intellectual roots. Recycling is a simple notion, hardly more complicated than the dumping of garbage into a hole in the ground. Simply find some component of the garbage being collected and divert it into a (usually existing) alternate process as a raw material. True, recycling encounters many political problems needing to be solved, collection and diversion systems to be designed, as well as the difficulties of introducing mixed or contaminated materials to a processing system accustomed to completely refined and "clean" raw materials. But engineering and scientific professionals play almost no role. Zero waste, on the other hand, essentially requires high-level redesign. Every product being made needs to be designed under a brand-new constraint -- the disappearance of easy discard. Chemical products will require chemists and chemical engineers. Other technical products will likewise require help from other professionals. Contamination will be fundamentally unacceptable.

Hopefully the reader begins to see the outlines of a new paradigm which will make garbage creation obsolete. Yet it is only common sense applied to production. We have come back to a tenet of The Death of Environmentalism -- the one that laments the lack of inspiration. Is it not an inspiring vision to demand that rational design be applied to reuse? Is the complete elimination of garbage and dumping not a vision that ordinary people can seize and insist on? In another hundred years, people will read with disbelief that in the twenty-first century, industry actually designed products for a single use, then to be smashed and buried underground. That will seem to be a story about Neanderthal behavior.

I have had to digress to explain zero waste so that the reader will be able to understand the most recent developments.

All design work takes place in a universe of broadly shared assumptions. These include the availability and cost of materials, the price of robots or labor, consumer acceptance, etc. Sometimes the intended use controls everything, such as with high end research equipment or racing yachts. But there is one assumption that always pervades the entire design process -- waste is to be expected and it costs practically nothing.

Eliminate this one assumption of wasting and the whole design process will be turned on its head. Time-honored methods of designing for cheap assembly and quick obsolescence will themselves need to be discarded. Instead, quality components, expertly assembled, will be the norm, and the design will necessarily become one for perpetual repair, refurbishment, upgrading, and reuse of every part and every function.

The recyclers also like to talk about an end to dumps. So how does this vision differ from theirs?

One can reasonably say that recycling and reuse have always been with us. We wash our clothes hundreds of times; we do not throw them out after one use. We repair our automobiles endlessly. Home Depot counts on the fact that we will fix up our houses. We patch roads. Since airplanes can never be allowed to fall out of the sky due to obsolescence, the airplane industry maintains a kind of zero waste attitude, constantly repairing and downgrading for decades. Yet in spite of these conservative attitudes, garbage dumping exploded in the last century and is still growing. Various studies claim that Americans generate many tons of garbage for every pound of product they buy. The recycling approach has clearly failed to stanch this torrent of garbage.

More troubling is the development over the last thirty years of a close, symbiotic relationship between the methods of the garbage industry and the recycling movement. When recyclers seek inputs of materials, they primarily employ collection methods based on discard. Classically, they simply task the garbage collector to set out one more green or blue or red container next to his garbage can. The result is predictable -- the public frames recycling as tantamount to garbage collection and treats it with disdain. Households have no idea which container to use for what, and everything gets mixed up. If there is any doubt, it is understood that recycling is just garbage anyway, so what difference can it make which can is used? The recyclers themselves go along with the garbage companies' pleas and accept the nonsensical notion that everything can be mixed together (i.e., making garbage) and then sorted out later.

The public acceptance of waste comes from two sources. First, the unconcerned public have come to accept the canard that garbage is natural. They support the whole superstructure of subsidized dumps and profitable garbage collection. We hear that "you have to put it somewhere"; "just get rid of it" and we treat garbage as a social "service". Second, the only claim to a popular alternative is the recycling one, which in turn supports garbage to the hilt. The developing crisis in planetary resources will force the abandonment of both of these defective notions.

Recyclers have recently begun to create analyses claiming to be based on zero waste. Many of these claim to be aware that zero waste is not just more recycling. However, despite the good words, not one of them presents any programs, projects or ideas which go beyond mere recycling or challenge the primacy of garbage. This is not an accident. The close relationship to garbage methods contaminates the analysis. These erroneous writings are easily available on the web under the name of various cities and counties, especially in California, that have adopted putative zero waste resolutions. These include Palo Alto, San Francisco, Oakland and Nevada County. It is essential that newcomers not accept every program that calls itself "zero waste" as part of the new paradigm.

Even without its crippling association with the garbage industry, recycling suffers from a severe constriction of goals. At its best, the ideology of recycling has always been limited to an enervating focus on the dump! Because it has never transcended its early ideology, which was forged in the 1960's and early '70's, recycling has never claimed to do more than target the elimination of dumps, yet even this modest goal is unattainable by recycling. Even if recycling were amazingly effective, recovering 90% of some material that was heading to the dump (no project comes close to this effectiveness), ten percent would still go into the dump on every cycle. After eight such cycles, the majority of the original material would be sitting at the bottom of a dump, its place in circulation taken by new, virgin material. In the case of aluminum cans, the project that recyclers like to point at with pride, about fifty percent of the aluminum ends up in the dump on each cycle, and the typical cycle is about three months long. At the end of a year, just about the whole load of aluminum is found in the dump, and all the cans in circulation are made of new material which will likewise soon be residing in the dump. No wonder the garbage industry is hardly shaking in its shoes over the success of recycling.

The deficiencies of recycling are even worse than this. As I said earlier, recycling entirely overlooks the processes that call for the materials it is concerned with, so those processes can continue to be as wasteful as a waste-oriented society can make them. Instead of tightly designed processes, we find them designed in a lazy way to create, for example, chemical excesses for which recyclers can find no use. No problem: our society reserves portions of soil, water and air, by regulation, that are good for nothing but being polluted. So long as the regulations are followed, pollution is accepted. But who is to question why unusable excesses are produced in the first place? Recycling makes no objection, while zero waste thinking demands that cheap disposal eventually be eliminated and that wasteful practices be redesigned to function without the benefit of the welcoming dump.
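Since the argument is purely geometric, it is easy to check. Here is a minimal sketch in Python, using the illustrative recovery rates from the paragraph above rather than measured data:

    # Fraction of an original load still circulating after n collect-and-
    # reprocess cycles, if a fraction r survives each cycle: r ** n.
    def still_circulating(recovery_rate, cycles):
        return recovery_rate ** cycles

    # Even at an unheard-of 90% recovery rate, most of the load is in
    # the dump after eight cycles:
    print(still_circulating(0.90, 8))   # ~0.43, i.e. ~57% dumped
    # Aluminum cans at ~50% recovery, with about four cycles a year:
    print(still_circulating(0.50, 4))   # ~0.06, i.e. ~94% dumped in a year

The point of the sketch is that any recovery rate below 100% only delays the trip to the dump; it never prevents it.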

Consider now the enormous waste of designing products to be fragile, breakable, trashy, lightweight and with signature, critically weak parts inside. This practice is part of the strategy called "planned obsolescence". When the pieces immediately break, the recyclers may be standing by to snatch some of the materials, but how does that compare to a product that is so well designed for reuse that only a tenth as much raw material ends up passing through the industrial meatgrinder? Only a fraction as much energy has to be used. Only a fraction as much soil exhaustion is caused in extracting the natural resources that go into the product. And remember that among those natural resources is food for the humans working in the factory as well as fuel for their transportation and the resources for their education, entertainment and all the rest of life. That can all be minimized by repairing and refurbishing the products endlessly. The recycler, by contrast, accepts this wasting as natural, so long as a portion of the bare materials are captured for reuse at the last moment.

The conceptual analysis which ties up all the loose ends is functional reuse. This means the reuse of the highest function of every product, not the lowest materials. For example, the unfortunately classical method of recycling a glass bottle is to destroy its function. As a container, its function is to contain. The fact that it is made from a nearly valueless glass material is of virtually no interest. Yet the recycler will gleefully abandon the valuable function for the valueless material and crow about his success. This is a serious failure of design. The common-sense way that zero waste approaches this reuse is by using the containment function -- by refilling the bottle. All of the value is recaptured and there is no reason to transport broken glass across the country, remelt it, fill it in a distant factory and ship it back to where it started.

Recycling claims to save energy, but this is by and large an empty claim. Recycling actually is a way to ensure that energy is wasted for no reason. Zero waste already shows the way to recapture almost 100% of the energy, by refilling, so why are we still smashing bottles? Only because garbage fleets demand methods which make use of their core capability -- hauling heavy loads around the country, no matter whether to a dump or a recycling facility.

Functional reuse is a broad general principle that applies to every single product made anywhere. Not to ten or twenty percent of the contaminated materials in a garbage can, but to everything. It is only from working with inherent functions that new patents and new worldwide businesses can emerge.

One estimate says that industry produces seventy-one times as much garbage as households while producing the products we want. A theory that ignores more than 98% of a problem no longer commands respect.

The conclusion is inescapable. Recycling has had its day and is now moribund. Those of us concerned about the destruction of the earth need to adopt a new, healthier understanding of the real world. That new synthesis is Zero Waste.


ELIMINATING AND PREVENTING HEALTH DISPARITIES

By Peter Montague

Human health is a prime indicator of environmental quality, and of the success of any society. If the health of residents is poor, then look for problems in the environment. If some groups are thriving and others are not, it's a sure sign of injustice.

Such differences are called "health disparities" (or, sometimes, "health inequities"[1]) and the U.S. has set a goal of eliminating them.

There's a long way to go to meet that goal.

A recent study of longevity in the U.S. found that white "Middle Americans" live five years longer, on average, than black "Middle Americans" -- 77.9 years vs. 72.9 years. And this does not include "high-risk urban black men," who live, on average, only 71.1 years. The differences between "Middle American" racial groups are not explained by health insurance coverage or by frequency of medical appointments -- so access to medical care is not the primary driver of these disparities.

Hispanics are about twice as likely as non-Hispanic whites to have diabetes, or to get cancer of the stomach, liver, gall bladder, or cervix. Hispanic women develop heart disease about a decade earlier than non-Hispanic white women.

Black women are roughly twice as likely as white women to die in childbirth. The nation's cancer death rate is 35 percent higher in black men and 18 percent higher in black women than in white men and women, according to a new report from the American Cancer Society. These facts barely scratch the surface.

How are 'health disparities' defined?

The National Association of County and City Health Officials (NACCHO) has defined "health disparities" as "differences in populations' health status that are avoidable and can be changed. These differences can result from social and/or economic conditions, as well as public policy. Examples include situations whereby hazardous waste sites are located in poor communities, there is a lack of affordable housing, and there is limited or no access to transportation. These and other factors adversely affect population health."

On paper at least, the U.S. is committed to eliminating health disparities.

* The federal Department of Health and Human Services has established an Office of Minority Health with a "National action agenda to eliminate health disparities for racial and ethnic minority populations."

* The federal program, Healthy People 2010, has two goals: "(1) Increase quality and years of healthy life; and (2) Eliminate health disparities, including differences that occur by gender, race or ethnicity, education or income, disability, geographic location, or sexual orientation."

* The American Public Health Association has called for action to eliminate health disparities.

* NACCHO -- The National Association of County and City Health Officials -- has passed a strong resolution advocating for programs and policies that minimize health inequities;

* NACCHO has published standards that every local health department (LHD) should be able to meet: "These standards describe the responsibilities that every person, regardless of where they live, should reasonably expect their LHD to fulfill." The NACCHO standards say, "...all LHDs (local health departments) exist for the common good and are responsible for demonstrating strong leadership in the promotion of physical, behavioral, environmental, social, and economic conditions that improve health and well-being; prevent illness, disease, injury, and premature death; and eliminate health disparities."

"The national Society for Public Health Education in 2002 passed a "Resolution for Eliminating Racial and Ethnic Health Disparities."

Health disparities can be eliminated by prevention

Health disparities are difficult (or impossible) to remedy once they have been allowed to develop. However, they can be prevented.

A recent report from the Prevention Institute in Oakland, California describes steps that communities can take to eliminate health disparities through prevention.

The "main premise" of the report is that, "reducing disparities can only be achieved if attention is paid to eliminating and minimizing diseases and injuries before the need for treatment, therapy, and disease management, and this can only be done by changing fundamental conditions of the environment that arise from racial and economic injustice." And: "Eliminating racial and ethnic health disparities is imperative both as a matter of fairness and economic common sense."

The Prevention Institute has identified 13 "community factors" that "play a pivotal role in determining health and disparities." (pg. iii)

The report then offers 10 disparity-reducing strategies that public health practitioners, advocates and decision-makers can focus on. This is good stuff -- grounded in the scientific and medical literature, yet very practical. The Prevention Institute even has a "community assessment tool" -- called Thrive -- that everyone can apply in their own community to begin to prevent health disparities.

The 13 "community factors" are organized into 3 clusters:

1. The Equitable Opportunity Cluster: the equitable distribution of opportunity and resources

Availability of jobs paying living wages -- individual income alone has been shown to account for nearly one-third of health risks among blacks. The remainder may be explained by residential segregation, which locks people into poor housing, neighborhoods without recreational opportunity or access to nutritious food, and so on.

Education -- high school dropout rates correlate closely with poor health. Lower education levels are associated with smoking, overweight, and low levels of physical activity.

Racial justice -- racial and ethnic discrimination are harmful to health. Economic inequity, racism, and oppression can maintain or widen gaps in socioeconomic status, and increase stress.

2. The People-Factors Cluster

Social networks and trust: "strong social networks and connections correspond with significant increases in physical and mental health, academic achievement, and local economic development, as well as lower rates of homicide, suicide, and alcohol and drug abuse." (pg. 9)

Participation and willingness to act for the social good: "Social connections also contribute to community willingness to take action for the common good which is associated with lower rates of violence, improved food access, and anecdotally with such issues as school improvement, environmental quality, improved local services, local design and zoning decisions, and increasing economic opportunity. Changes that benefit the community are more likely to succeed and more likely to last when those who benefit are involved in the process; therefore, active participation by people in the community is important."

The behavioral norms within a community "may structure and influence health behaviors and one's motivation and ability to change those behaviors." Norms contribute to many preventable social problems such as substance abuse, tobacco use, levels of violence, and levels of physical activity. For example, traditional beliefs about manhood are associated with a variety of poor health behaviors, including drinking, drug use, and high-risk sexual activity.

3. The Place-Factors Cluster

This refers to the physical environment in which people live, work, play, pray and go to school. This includes:

What's sold and how it's promoted. The presence of a supermarket in a neighborhood increases the consumption of fruits and vegetables by more than 30%. The presence of many liquor stores is associated with greater liquor consumption which in turn is linked to increased violence. If large portions of high-fat, high-sugar junk food are aggressively marketed, that's what people tend to eat.

Neighborhood look and feel, and safety. "The physical space of communities influences patterns of life. The distances between home and work, the look and feel of a streetscape, the presence or lack of retail stores and parks influence whether people drive, walk, or bike and how they spend their leisure time. All too often, residents in low-income communities cope with inadequate sidewalks, inadequate access to public transportation, absence of bike lanes for cyclists, absence of walking and biking trails and absence or ill maintenance of parks, along with inaccessible recreational facilities and crime. Safety is a dominant concern leading parents to drive their children to school, rather than letting them walk, and to prohibit outdoor play." (pg. 12)

Parks and open space. Physical activity levels -- and positive interactions between neighbors -- are influenced by enjoyable scenery, greenery, access to parks, convenient transportation, and the design of streets and neighborhoods.

Getting around. "A well-utilized public transit system contributes to improved environmental quality, lower motor vehicle crashes and pedestrian injury, less stress, decreased social isolation, increased access to economic opportunities, such as jobs, increased access to needed services such as health and mental health services, and access to food, since low-income households are less likely than more affluent households to have a car."

Housing. Poor housing contributes to health problems in low-income communities and communities of color and is associated with increased risk for injury, violence, exposure to toxins, molds, viruses, and pests, and psychological stress.

Air, water, and soil. "Low-income communities and communities of color are more likely to have poor air quality and toxic brownfield sites. Poor air quality prevents individuals from engaging in physical activity, especially if they have asthma or other respiratory illnesses. Contaminated empty lots, which could serve as badly needed parks and open space, frequently require large sums of money for sufficient clean-up.... Cancer, asthma, birth defects, developmental disabilities, infertility, and Parkinson's disease are on the rise, and they are linked to chemical exposures from air, water and soil, and products and practices used in our schools, homes, neighborhoods, and workplaces. Low-income people and people of color are typically the most affected by environmental health concerns."

Arts and culture. "The presence of art and other cultural institutions contributes to an environment that is conducive to health and safety. Artistic outlets, such as gardens, murals, and music, promote a healing environment. This has been demonstrated in hospitals and other health care facilities, where the incorporation of arts into the building's spaces has reduced patient recovery time and assisted in relief for the disabled, infirm, or their caregivers. The visual and creative arts enable people at all developmental stages to appropriately express their emotions and to experience risk taking in a safe environment. For those who have witnessed violence, art can serve as a healing mechanism. More broadly, art can mobilize a community while reflecting and validating its cultural values and beliefs, including those about violence. Also, artistic expression can encourage physical activity, as in the case of dance. A report commissioned by the Ottawa City Hall states that culture "provides benefits in terms of...social cohesion, community empowerment... health and well being and economic benefit."

So there you have it -- 13 community factors that strongly determine health disparities. Next week we'll discuss 10 disparity-reducing strategies that communities can use.

Meanwhile, here's one final thought. Health disparities go to the heart of "environmental health." Eliminating health disparities will take us in some new directions. As the Prevention Institute report says:

"This approach to improving health outcomes necessarily requires that the public health sector and health advocates approach health in a new way. It requires a new way of thinking and a new way of doing business. This is not an approach that identifies a medical condition and asks,"How do we treat this?" Rather, it requires understanding how the fundamental root causes of health disparities play out in the community in a way that affects health and asking, "Who do we need to engage and what do we need to do in order to prevent people from getting sick and injured?"


SOME IDEAS FOR A COMMON AGENDA

By Peter Montague and Carolyn Raffensperger

We could benefit if we had a few common ideas to guide our work. To provoke discussion about the elements of a common agenda, we have put together these initial thoughts. This draft is centered on the U.S. because it is the place we know best. The picture we sketch here contains elements that, to some, may seem remote from the traditional work of environmental protection, environment and health, or even public health. Perhaps few will want to engage all aspects of this picture; nevertheless, we hope there is some value in painting with broad strokes on a large canvas. Everything really is connected. Furthermore, we believe that people of good will, sharing a few common ideas and goals -- and willing to form surprising alliances -- can create a successful web of transformation.

First, we want to acknowledge some of our assumptions:

The Golden Rule

The wellspring of these ideas is a simple, universal ethic -- every culture and every religion endorses the Golden Rule [1, 2], which says, "Treat others the way you want to be treated." This tells us, first, to alleviate suffering. This, in turn, leads directly to human rights -- we all have a basic right to a life free of suffering, to the extent possible. The elements of such a life were laid out in the Universal Declaration of Human Rights, which the U.S. endorsed Dec. 10, 1948.

From the Golden Rule and the Universal Declaration: Justice

For us, the Golden Rule and the Universal Declaration together define justice. Justice is action that tends to manifest the Golden Rule and the Universal Declaration; injustice is action in another direction. It is unjust, unfair and therefore unacceptable to impose suffering on others or to stand by and allow suffering to go unnoticed or unchecked. It is unjust, unfair and unacceptable to deprive anyone of any human right as spelled out in 1948. Justice is not passive. Justice demands action, sometimes aggressive action, conflict, and struggle. Without justice, there can be no peace. We stand with Gandhi, advocating non-violent action.

In recent years, science has confirmed what people have always known: community is essential for human well-being. We humans evolved as social creatures[3, 4] who cannot thrive when separated from our circle of family, friends, acquaintances, and animal companions. Social isolation makes us sick and leads to an early death. This is one reason why racism and white privilege are profoundly wrong. At a minimum, they create social isolation, which leads to illness and suffering, and so they are unjust and unacceptable.

Furthermore, when we damage nature we diminish our own -- and everyone's -- possibilities for a life as free as possible of suffering. When we create havoc via global warming or damage to the Earth's protective ozone layer, or when we pave over fertile farmland, or exterminate the fish of the sea or the birds of the air, we diminish everyone's possibilities for securing life, liberty and the pursuit of happiness (to quote the Declaration of Independence of 1776). This is unjust and unacceptable.

As Jeremy Bentham told us in 1789, animals too have a right to live a life free of suffering to the extent possible. As Bentham said, the question is not whether they can reason, or whether they can talk. Their right to live free from torment hinges on the question, can they suffer? Their suffering stands on a moral plane with ours.

However, we want to emphasize that humans are dependent upon all creatures, not just those that are sentient. Science now confirms the wisdom of indigenous peoples, that we are all interdependent, all humans, all species. We humans are part of, and are supported by, a biological platform of enormous complexity, which we cannot understand, but which we know with absolute certainty nourishes and sustains us. Even a child can see that, without it, we are lost.

Because rights and justice cannot be secured if our biological platform is shredded, we all have a right to intact natural and social environments -- environments that enable us to provide for ourselves the essentials of air, water, food, shelter and community, which we all require to prevent suffering.

The earth is our home and we have to take care of it, for the reason that we absolutely depend on it. To preserve our home without understanding all its billions of inter-related parts, we can aim to preserve every part of it. No part of creation can be presumed dispensable. We can say we know what's dispensable, but what if we're wrong? In recent years we humans came close to making the surface of the earth uninhabitable for humans because we failed to understand how CFC chemicals were damaging the ozone layer. It was a close call. Our ignorance is vast. As Albert Einstein reportedly said, "We still do not know one-thousandth of one percent of what nature has revealed to us."

Because the biological platform, upon which we all depend, cannot be secured unless we are free to take action to protect it, human rights and justice are essential requirements for human survival.

Good health is a fundamental right

What is health? What conditions are necessary for health?

Aldo Leopold defined health as the capacity for self-renewal. The preamble to the constitution of the World Health Organization (WHO, July 22, 1946), defines health as "a state of complete well-being, physical, social, and mental, and not merely the absence of disease or infirmity." The WHO's Ottawa Charter says, "The fundamental conditions and resources for health are: peace, shelter, education, food, income, a stable eco-system, sustainable resources, social justice, and equity."

The WHO constitution also defines health as a basic human right: "The enjoyment of the highest standard of health is one of the fundamental rights of every human being without distinction of race, religion, political belief, economic or social condition." This is consistent with Article 25 of the Universal Declaration of Human Rights of 1948, which says,

"Everyone has the right to a standard of living adequate for the health and well-being of himself and his/her family, including food, clothing, housing, and medical care."

The right to health is crucial to all other human rights

Enjoyment of the human right to health is vital to all aspects of a person's life and well-being, and is crucial to the realization of all other fundamental human rights and freedoms.

Our health depends upon three environments:

1) The natural environment (air, water, soil, flora and fauna)

2) The built environment (roads, power plants, suburban sprawl, chemicals, etc.)

3) The all-important social environment (relationships of trust, mutual respect, and friendship, but also poverty, racism and white privilege, sexism, homophobia, insecurity, the sense that life is out of control, and so on). The social environment creates what the United Nations calls "the social determinants of health." There is a very large body of literature indicating the importance of these determinants to a person's resilience in the face of stress.

All three environments are always intertwined in all "environmental" work and especially so in all "environment and health" work.

The basis of community and the economy is sharing the commons

The commons includes all the things that we share together and that none of us owns or controls individually. The commons has been described as a river with three forks:

1. Nature, which includes air, water, DNA, photosynthesis, seeds, topsoil, airwaves, minerals, animals, plants, antibiotics, oceans, fisheries, aquifers, quiet, wetlands, forests, rivers, lakes, solar energy, wind energy... and so on;

2. Community: streets, playgrounds, the calendar, holidays, universities, libraries, museums, social insurance [e.g., social security], law, money, accounting standards, capital markets, political institutions, farmers' markets, flea markets, craigslist... etc.;

3. Culture: language, philosophy, religion, physics, chemistry, musical instruments, classical music, jazz, ballet, hip-hop, astronomy, electronics, the Internet, broadcast spectrum, medicine, biology, mathematics, open-source software... and so forth. Even the collective enterprise we call the "private sector" depends for its success upon the roads, the bridges, the water systems, the currency, the mercantile exchanges, the laws, the language, the knowledge, the understanding and the trust that all of us, and our ancestors, have built in common.

Government has three main purposes, which cannot be separated from each other

1) to guarantee the rights of the individual, as outlined in the Universal Declaration of 1948;

2) to ensure justice, and

3) to protect and restore the commons, holding them in trust for this generation and for those to come.

Prevention is essential

The 20th century has left us with an intractable legacy -- toxic and radioactive wastes, proliferating weapons, global warming, nearly two million citizens imprisoned, rising rates of childhood disease and chronic illness (e.g., asthma, attention deficits, autism, diabetes, Alzheimer's). We have learned the hard way that managing large problems such as these is prohibitively expensive. Therefore, our best hope is to create a culture of prevention -- to develop a habit of always doing our best to prevent problems before they occur, rather than paying to manage them afterward. This is the precautionary approach, and it lies at the heart of traditional public health practice.

Just as our great-grandparents made slavery unthinkable, our challenge is to make it unthinkable to finalize any large decision without examining the alternatives we face, asking who bears the burden of proof, and anticipating ways to prevent and minimize harm.

Our goal together can be to permanently alter the culture

We can aim to permanently alter the culture, not merely its laws, though laws can play an important part in both provoking and institutionalizing cultural transformation. Just as our forebears made slavery unthinkable, our goal together can be to make unsustainable ways of life unthinkable.

Historically, in the U.S. and Europe, culture has been changed by social movements -- the first of them being the anti-slavery movement in England, 1787-1838. [See Adam Hochschild, Bury the Chains, ISBN 0618619070.] Therefore we believe that our goal of changing the culture can only succeed if it encourages, appeals to, and engages large numbers of people.

Accordingly, we believe a common agenda could be constructed from among the following ideas (plus others that we have not yet learned about):

I. Build a Multi-Issue, Multi-Racial, Multi-Ethnic Movement

1) We can make our work explicitly anti-racist. Because of European and U.S. history, it is essential that we take a strong position against racism and white privilege. This entails a relentless, ongoing effort to change the culture of the U.S. In addition to being a matter of simple justice, opposing racism is crucial politically because the New Deal coalition that governed the U.S. from 1940 to 1980 was divided and conquered using race as the wedge issue, beginning with Senator Goldwater's presidential platform opposing civil rights laws in 1964.[7] If we ever hope to become politically influential in the U.S., we will need to build a multi-racial, multi-ethnic, multi-issue coalition. Understanding and confronting white privilege will be essential in any such effort.

It is worth pointing out that the various movements for health and justice in the U.S., taken together, make up a numerical majority in the U.S. by at least two to one, and on many issues by far more than that. Therefore, the only way our adversaries can prevail is by dividing us. Race (and to a lesser extent class, ethnicity, national origin, religion, gender, and sexual orientation) has been the dividing wedge that our adversaries have used most effectively. (What are some other issues that our adversaries use to divide us? This seems worthy of considerable discussion.)

II. Reform the system for choosing candidates for public office

2) We can get private money out of elections. In principle, our republican democracy rests on the bedrock of "one person, one vote," not "one dollar, one vote." In the modern day, this means getting the mountains of private money out of elections, which in turn requires that elections be publicly financed so that every qualified individual is eligible to become a candidate for office, regardless of his or her personal wealth. (Various eligibility requirements have been proposed, such as the requirement that prospective candidates must gather a certain number of signatures to qualify as a candidate deserving of public financing.)

3) We can adopt the election system called instant runoff voting (IRV). In this system each voter ranks the candidates 1, 2, 3, and so on. If one candidate gets a clear majority of first-rank votes, he or she is declared the winner. However, if no candidate gets a clear majority of first-place votes, then the candidate with the fewest first-place votes (let's call this the "least popular candidate") is eliminated and his or her votes are redistributed to the remaining candidates in the following way:

Each ballot that ranked the "least popular candidate" as No. 1 is transferred to the candidate ranked next on that ballot. This process of elimination goes on until there is a clear winner holding a majority of ballots.
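Because the counting rule is completely mechanical, it can be expressed in a few lines of code. Here is a minimal sketch in Python, assuming every ballot ranks all candidates and setting aside the tie-breaking rules, which vary by jurisdiction:

    # Instant runoff voting: each ballot is a list of candidates,
    # ranked from most preferred to least preferred.
    from collections import Counter

    def instant_runoff(ballots):
        ballots = [list(b) for b in ballots]       # work on copies
        while True:
            # Count the current first choice on every remaining ballot.
            tally = Counter(b[0] for b in ballots if b)
            total = sum(tally.values())
            leader, votes = tally.most_common(1)[0]
            if votes * 2 > total:                  # clear majority
                return leader
            # Eliminate the least popular candidate and transfer each
            # of that candidate's ballots to its next-ranked choice.
            loser = min(tally, key=tally.get)
            for b in ballots:
                if loser in b:
                    b.remove(loser)

    ballots = [["A", "B", "C"], ["B", "A", "C"], ["B", "C", "A"],
               ["C", "B", "A"], ["C", "B", "A"]]
    print(instant_runoff(ballots))   # "B", after "A" is eliminated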

The system has many advantages over the current system and it is catching on across the U.S.

III. Protect the Commons

4) We can explicitly give all individuals the right to a safe and healthy environment. However, this alone will not suffice. We can also give these rights far higher priority than they have under current law, where they are presently trumped (for example) by the right to use one's property as one chooses for economic gain.

5) We can designate or elect guardians ad litem for future generations.

6) We can conduct annual audits of the commons (using consistent measures) with public reports supported by action plans for preservation, restoration, and prevention of harm.

7) We can establish a public interest research agenda that has as its first priority protecting and restoring the commons.

IV. Develop an Economy Whose Footprint is Not Growing, or is Even Shrinking

8) We can create an economy whose ecological footprint is not growing, or is even shrinking. Sustainability cannot be achieved without this bedrock idea, so it needs some elaboration and discussion.

A sustainable society is one in which the human economy provides the basics of a "good life" for everyone but the "footprint" of the economy never grows so large as to overwhelm the planet's natural ability to renew itself. As we will see, a sustainable society is also one in which justice and equity are continuously pursued.

There are two parts to the "footprint" -- the number of people and their individual demands on the ecosystem.

Our current way of thinking and being in the world -- premised on perpetual growth of the human footprint -- is not sustainable. At present, the human footprint is simply too large for the planet to sustain, and the evidence is emerging all around us -- global warming; destruction of the ozone layer; decimation of marine fisheries; industrial poisons in breast milk; increasing rates of chronic disease (attention deficits; asthma, diabetes, some childhood cancers); accelerating extinction of species; and so on.

Because the total human footprint is unsustainably large, both human population and individual consumption must shrink. However, the only proven way to curb human population is to (a) achieve economic growth to escape the chains of poverty and (b) achieve freedom and opportunity for women. The evidence is that, once the chains of poverty are broken, and children are no longer the only available old-age insurance, then most women with prospects choose not to bear large numbers of children.

This implies that all societies need sufficient economic growth to escape poverty -- which implies the need for more roads, power plants, ports, and so on. However, given that the global human footprint is already unsustainably large, the need for growth in the global south requires footprint shrinkage in the global north. This, then, is the goal of forward-looking (precautionary) decision-making: to make choices that can shrink the footprint of the global North, to make room for growth in the global South, to end poverty and liberate women so they can choose small families.
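The arithmetic behind this trade-off follows from the footprint identity: total footprint = population x per-person demand, summed over regions. Here is a minimal sketch in Python, with numbers invented purely for illustration:

    # Hypothetical figures in arbitrary "footprint units" per person.
    north_pop, north_demand = 1.0e9, 6.0
    south_pop, south_demand = 5.5e9, 1.0
    total = north_pop * north_demand + south_pop * south_demand

    # Suppose the South's per-person demand must double to escape poverty.
    south_demand_new = 2.0
    # Holding the global total constant forces the North's demand down:
    north_demand_new = (total - south_pop * south_demand_new) / north_pop
    print(north_demand_new)   # 0.5 -- a twelve-fold cut in the North

The exact numbers are invented; the structure of the conclusion is not: if the total cannot grow and the South must grow, the North must shrink.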

An end to growth-as-we-know-it immediately raises the issue of a just distribution of available goods (and bads). In the traditional way of thinking (at least in the U.S.), poverty will be alleviated by economic growth -- the poor are promised that, as the pie grows, even their small piece of the pie will grow apace. They need only be patient. But if the size of the pie is going to remain constant, or perhaps even shrink in some dimensions, growth can no longer serve as the safety-valve for "solving" poverty. Now we must begin to ask, "What's a fair distribution of the pie?" Thus a sustainable society not only has a sustainable footprint, but it also will never abandon the active pursuit of justice and equity.

Therefore, we need an economy that can grow (in places where growth is needed today to eliminate poverty, for example in Africa) but is not required to grow as the present economy is required to do (so that "developed" nations can achieve a constant or shrinking footprint). By "growth" we mean growth in capital stock, or growth in "throughput of materials" ("stuff"). A steady-state economy will still be dynamic and innovative. What is needed is a constant (or shrinking) "footprint" for the human economy -- but within that footprint, technical and ethical innovation can be boundless.

One proposal envisions an economy based on competitive markets plus public ownership of productive facilities (factories, farms), renting them to producer co-ops, with investment capital raised by a flat tax on productive assets and distributed each year to all regions of the nation on a per-capita basis. No doubt there are other ways to achieve the steady-state economy -- all we know is that a steady-state economy (or an economy with a steady-state footprint) is essential. Perpetual growth on a finite planet is a certain recipe for a failed future.

Perhaps the nub of this issue is money lent at interest -- usury in the original meaning of the word. It is payment of interest on borrowed funds that creates the requirement for economic growth. So the question could be framed as, "How can a society provide interest-free investment funds to replace and modernize infrastructure as it decays?"
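To see why interest builds in a growth imperative, consider the compound-interest arithmetic (the figures below are illustrative only):

    # A loan of 100 units at 5% compound interest over 30 years:
    principal, rate, years = 100.0, 0.05, 30
    owed = principal * (1 + rate) ** years
    print(round(owed, 2))   # 432.19 -- repayment requires more than four
                            # times the original sum, so the borrower's
                            # output must grow to keep pace

An economy whose debts compound in this way must grow its output or default, which is one way of stating the problem a steady-state economy has to solve.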

9) We can organize our economy around the concept of zero waste. The present one-way torrent of materials out of the ground for single-use (or nearly so), followed rapidly by re-entry into the ground, will be recognized as an unsustainable absurdity. In a sustainable economy, every product will be designed for repeated re-use and the cost of its reprocessing for re-use will be included in the original sale price.

10) We can guarantee full employment with decent wages to end poverty. The Humphrey-Hawkins Full Employment Act of 1978 [P.L. 95-523] can serve as a model, with the added proviso that the federal government can serve as the employer of last resort. Everyone who wants to work has a right to a place in the shared enterprise.

V. Prevent Illness, Eliminate Health Disparities, Provide Universal Health Care

11) We can create a single-payer universal health care program, modeled on Canada's. This will be a health care program that seeks first to prevent illness and relies on "cures" only as secondary measures.

A central goal of any health system will be the elimination of health disparities -- including disparities based on race and ethnicity, gender, and geography.

Health disparities are a human rights violation because they indicate that someone has been deprived of their right to health; therefore health disparities are unacceptable and must be eliminated and prevented.

NACCHO (National Association of County and City Health Officials) has defined "health disparities" as "differences in populations' health status that are avoidable and can be changed. These differences can result from social and/or economic conditions, as well as public policy. Examples include situations whereby hazardous waste sites are located in poor communities, there is a lack of affordable housing, and there is limited or no access to transportation. These and other factors adversely affect population health."

VI. Make Decisions to Prevent Harm

12) We can adopt the precautionary principle to avoid trouble and prevent harm, rather than clean up messes.

VII. Expressing an Anti-Racist Intention

13) From its early beginnings, European society has been based on racist assumptions that have produced unacknowledged systems of white privilege. Racist ideology predates capitalism and has been fundamental to the creation of much of the modern world. The U.S. has been caught up in this mindset to an even greater degree than most European societies. The first step is to openly acknowledge the problem in its many dimensions.

14) Expressions of an explicit anti-racist intention are needed throughout the culture to counteract hundreds of years of silent violence against people of color. Anti-racism can be expressly practiced in the courts, the schools, our elections, the media, the churches, NGOs, in our funding priorities, our public health goals and practice, and on and on. Racism is not limited to individual acts of meanness, as much of the culture would have us believe. Racism is a largely invisible, embedded system of privilege that gives white people unearned assets that they can count on cashing in each day, but about which they remain largely oblivious. As Peggy McIntosh has described it, "White privilege is like an invisible weightless knapsack of special provisions, maps, passports, codebooks, visas, clothes, tools, and blank checks." White privilege is difficult for some people to acknowledge because its pervasive nature means we do not, in fact, live in a meritocracy. The deck is stacked at birth by skin color.

VIII. Restoring Justice: A vision of the courts for the 21st century

15) We can develop a vision of the courts for the 21st century. Elements of this could include:

a. Eliminating racist outcomes from court proceedings, with a goal of vastly reducing the number of people in prison;

b. Ending the status of corporations as persons entitled to the same rights as individuals under the Constitution, to restore individual responsibility and accountability. The term "person" in the 14th amendment should not include corporations. The goal here is democratic control of the nature and behavior of corporations, as was the norm in the U.S. at an earlier time.

c. Reversing the burden of proof, giving the benefit of the doubt to ecosystems, to future generations, to the luckless and the downtrodden;

d. Taking seriously our commitment to future generations, to pass along to them undamaged the world we inherited from our forebears, and establishing our priorities in the courts to allow this to happen;

IX. Free Education for All

16) We can provide free education from pre-school through college. Investment in education -- whether Head Start or the GI Bill of Rights -- is an investment that demonstrably pays enormous dividends, generation after generation.

X. A Foreign Policy free of Imperialism or Colonialism

17) We can adopt a foreign policy that brings an end to imperialism and colonialism. The Bretton Woods institutions can be abolished and new institutions of international finance invented. The aim of military dominance of the planet and of outer space can be discarded.

XI. Organize Society to Provide Time for Democratic Engagement

18) Society can be organized to give everyone time to participate in democratic decision-making. This will require a work-week shorter than 40 hours, living near the workplace to minimize travel time, and partners sharing child-rearing and household tasks.

19) Gender equity can be made a priority as a matter of fairness and justice, and because, for many people, having time for democratic participation depends on sharing child-rearing and household tasks with a partner. Furthermore, worldwide, gender equity, accompanied by opportunity and education, is the only proven formula for limiting human population, as discussed above.

20) We can establish as a goal that everyone can walk to work, with incentives for city planners, urban developers and local decision-makers who meet this goal. This is important for personal health, for gender equity (couples must work near home if they are to share child-rearing and household tasks), and for democratic participation (time not wasted on commuting can become available for community engagement).

XII. The Future of These Ideas for a Common Agenda

21) This is just a beginning. Please contribute your ideas. Together all of us can be wiser and more successful than any of us alone.


CANCER: HOW DANGEROUS ARE OUR COSMETICS?

By Devra Davis

We know that children are not simply little adults. With their quick heartbeats, fast-growing organs and enviable metabolism, the young absorb proportionally more pollutants than those who are older. Exposures to minute amounts of hormones, environmental tobacco smoke or pollutants early in the life of an animal or human embryo can deform reproductive tracts, lower birth weight and increase the chance of developing cancer. And yet results from an independent chemical testing laboratory released last week found a probable human carcinogen, 1,4-dioxane (also known as para-dioxane), in some common children's shampoos at levels higher than those recommended by the U.S. Food and Drug Administration. The Environmental Working Group, a research and advocacy organization that ran the study, estimates that more than a quarter of all personal-care products sold in the United States may contain this cancer-causing agent.

The presence of a carcinogenic agent at levels above those suggested by the FDA is disturbing enough. The idea that such a compound exists at any amount in products that can be in regular contact with babies' skin is even more disconcerting. Scientists have long known that certain chemicals like para-dioxane can cause cancer. (The World Health Organization considers para-dioxane a probable human carcinogen because it is proven to cause cancer in male and female mice and rats.) Now we're beginning to realize that the sum total of a person's exposure to all the little amounts of carcinogenic agents in the environment may be just as harmful as big doses of a few well-known carcinogens. Over a lifetime, cigarettes deliver massive quantities of carcinogens that increase the risk of lung and other cancers. Our chances of getting cancer reflect the full gamut of carcinogens we're exposed to each day -- in air, water and food pollution and in carcinogenic ingredients or contaminants in household cleaners, clothing, furniture and the dozens of personal-care products many of us use daily.

Of the many cancer risks we face, shampoos and bubble baths should not be among them. The risks of para-dioxane in American baby soaps, for instance, could be completely eliminated through simple manufacturing changes -- as they are in Europe. To remove such carcinogens, however, would require intervention by the federal government, but the federal Food, Drug and Cosmetic Act allows the industry to police itself. Europe has banned the use of para-dioxane in all personal-care products and recently initiated a recall of any contaminated products. There's a problem with the way the United States and other countries look at toxicity in commercial agents. Regulators nowadays often won't take action until enough people have already complained of harm. This makes little sense. Scientists can seldom discern how the myriad substances, both good and bad, that we encounter in our lives precisely affect our health. We need to be smarter about using experimental evidence to predict and therefore prevent harm from happening. A few decades ago, people accepted the fact that cigarette smoking was harmful, even though no scientist could explain precisely how this happened in any particular cancer patient. If we had insisted on having perfect proof of how smoking damaged the lungs before acting to discourage this unhealthy practice, we would still be questioning what to do. By the same token, we now have to get used to the idea that scientists are unlikely to be able to say with certainty that a trace chemical in shampoo accounts for a specific disease in a given child. But if we're to reduce our cancer risk, we need to lower our exposures to those agents that can be avoided and find safer substitutes for those that can't.

Scientists don't experiment on humans, for obvious reasons, but we have found some clues from lab and wildlife studies. Medical researchers have demonstrated that trace amounts of some widely used synthetic organic materials can damage cultured human tissue. The effects don't just accumulate; they mushroom. UC Berkeley Professor Tyrone Hayes has shown that very low levels of pesticide residues in Nebraska cornfields can combine to create male frogs with female features that are vulnerable to infection and can't reproduce.

Should we wait for these same things to happen to baby boys before acting to lower exposures? There's plenty of solid human evidence that combined pollutants can cause more harm together than they do alone. We are not surprised to hear that people who smoke, drink and work as painters have much higher risks of kidney cancer than those who only engage in one of these known cancer-causing practices. We also understand that women who use hormone-replacement therapy and drink more than two glasses of wine daily have higher risks of breast cancer than those who engage in only one of these practices. This tells us that other combinations of chemicals in the environment can also lead to other cancers. One in five women with lung cancer today -- a disease that kills more women than ovarian, breast and uterine cancer combined -- has no known history of active or passive smoking exposure. Rates of non-Hodgkin's lymphoma and other cancers not tied to aging or improved screening have also increased in many industrial countries. New cases of testicular cancer continue to rise in most industrial countries. While still rare, childhood cancer is more common today than in the past, and most cases occur in children with no known inherited risk of the disease.

The problem, from a scientific standpoint, is that resolving the effects of minuscule levels of chemicals we encounter throughout our lives is part of a complicated puzzle for which many pieces are missing. What scientists need is data -- lots of it. Manufacturers, however, tend to hold the precise formulations of products as trade secrets, and the law allows them to withhold much information about carcinogens even if they are known to be present. Of course, we should continue to collect information to advance our ability to prevent cancer and other chronic diseases. But when a chemical causes cancers in both sexes of two different species of animals, we shouldn't arrogantly presume we will escape a similar fate. Recent work on the human and animal genomes shows us that humans differ from frogs and mice by fewer than 10 percent of genes. We should not let the absence of specific information on the health consequences for our infants and toddlers of single cancer-causing contaminants like para-dioxane become a reason to delay getting rid of such hazards.

The goal of public-health policy is to prevent harm, not to prove that it's already happened. The Center for Environmental Oncology at the University of Pittsburgh Cancer Institute advises that personal-care products that contain hormones may, in part, account for the continuing and unexplained patterns of breast cancer in African-Americans under age 40, and also may explain why more girls are developing breasts at younger ages. The Centers for Disease Control and Prevention found generally higher residues of some plastic metabolites in African-American women, with children ages 6 to 11 having twice the levels of whites. Dr. Chandra Tiwary, a recently retired military chief of pediatric endocrinology at Brooks Air Force Base, found that African-American baby girls as young as 1 year old developed breasts after their parents applied creams -- which, unknown to them, contained estrogen -- to the girls' scalps. When the creams were no longer used, these infant breasts went away. Other work, published last week by the National Institute of Environmental Health Sciences, shows similar effects in young boys who had been washed with some hormone-mimicking soaps or oils. After their parents stopped applying these products, their breasts also receded.

In light of the growing number of young girls developing breasts, the Lawson Wilkins Pediatric Endocrine Society, the certifying board for pediatric endocrinology, in 1999 changed its recommendation of what counts as normal development. We believe this was a dangerous move. If we say that it's now normal for African-American and white young girls to develop breasts at ages 6 and 7, respectively, we will fail to pick up serious diseases that could account for this. We will also lose the chance to learn whether widely used agents in the environment, like those found in personal-care products today or others that may enter the food supply, lie behind some of these patterns.

It should not be left to individual scientists, public-spirited leaders or environmental groups to find out what contaminants or ingredients may be affecting the delicate endocrine systems of our children and grandchildren. (The tests that found para-dioxane in shampoo were funded privately by environmental journalist and activist David Steinman, author of "Safe Trip to Eden.") Manufacturers have known for years how para-dioxane forms as a by-product of manufacturing and how to get rid of it. Until now, they just haven't needed to do so. People have a right to know whether products they use on themselves and their children contain compounds that increase their risk of disease. They also have a right to expect that government will prevent companies from selling products that are harmful to children. To do otherwise is to treat our children like lab rats in a vast, uncontrolled experiment.

Devra Davis is director of the Center for Environmental Oncology at the University of Pittsburgh Cancer Institute and is a professor of epidemiology at the University of Pittsburgh's Graduate School of Public Health. A National Book Award finalist for "When Smoke Ran Like Water," she is completing "The Secret History of the War on Cancer," from which this work is adapted, expected in October from Basic Books.

Copyright 2006 Newsweek, Inc.


STEPPING BACK FROM THE BRINK OF GLOBAL WARMING

By Tim Montague and Peter Montague

Scientists now agree that we have to make huge reductions in greenhouse gas emissions for the next twenty-five years and beyond if we want to limit, then reverse, global warming. This means burning fewer fossil fuels (oil, natural gas and coal). It also means acknowledging that global warming is not mainly a technical problem, but a social and political problem -- primarily an issue of global justice.

Humans, worldwide, currently use a steady 13 trillion watts of power. By 2050, an additional 30 trillion will be needed, according to U.S. government energy experts. Even if these projections are exaggerated because they fail to account for conservation opportunities, the challenge of our era is to provide sufficient power without creating a climate catastrophe -- or covering the planet with nuclear power plants, which create opportunities for nuclear weapons. Unfortunately, so far, the people most affected by global warming have generally been excluded from the discussions of how to solve it.

Global warming will increase the average global temperature, but it will also cause the sea level to rise, change agricultural patterns, and increase the size and frequency of "natural" disasters such as floods, droughts, intense storms, and epidemics of disease like malaria and dengue fever.

As Hurricane Katrina showed us, the impacts of these changes will disproportionately affect women, youth, coastal peoples, local communities, indigenous peoples, fisher folk, small island states, poor people and the elderly. Hardest hit will be people in the global south and in the southern part of the global north -- the people who are least responsible for creating the problem. As a result, a people's movement for "climate justice" is now growing worldwide.

Time is getting short for devising solutions. In September 2006, NASA scientist James Hansen warned, "I think we have a very brief window of opportunity to deal with climate change... no longer than a decade, at the most," to limit the increase in global temperatures to 1 degree Celsius (1.8 degrees Fahrenheit).

The main global warming gas is carbon dioxide (CO2). CO2 levels in the atmosphere are now 382 ppm (parts per million), up from pre-industrial levels of 280 ppm -- a 36% increase. If we pursue business as usual, CO2 levels are expected to pass 500 ppm by mid-century, which could cause warming of 2 to 5 degrees Celsius (3.6 to 9 degrees Fahrenheit). If this happens, many parts of the earth will become unpleasant -- even deadly -- places to live.
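
A quick check of the arithmetic behind these figures, as a minimal Python sketch; the 280 ppm and 382 ppm values come from the paragraph above, and the rest is ordinary arithmetic:

    # Verify the cited increase in atmospheric CO2 concentration.
    pre_industrial_ppm = 280.0   # pre-industrial level, from the text
    current_ppm = 382.0          # level cited above, from the text

    increase = (current_ppm - pre_industrial_ppm) / pre_industrial_ppm
    print(f"Increase over pre-industrial: {increase:.0%}")   # prints 36%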

We in the U.S. have a special role to play -- we are 5% of the global population producing 25% of all global warming gases. To make room for needed economic growth in other parts of the world, we will need to shrink our carbon footprint drastically. Right now the United States is a global outlaw -- we have refused to endorse the Kyoto Protocol, the United Nations (U.N.) strategy to reduce greenhouse gas emissions through improvements in energy efficiency, public transit, and clean energy production. As China and India industrialize and seek the good life -- a new coal-burning power plant goes online each week -- we could be reducing our energy use accordingly.
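
Those two percentages imply a large per-capita gap. A minimal sketch of that comparison (the 5% and 25% figures come from the text; the calculation simply divides emission share by population share):

    # Per-capita emissions implied by "5% of the people, 25% of the gases".
    us_pop_share, us_emission_share = 0.05, 0.25

    us_per_capita = us_emission_share / us_pop_share                # vs. world average
    rest_per_capita = (1 - us_emission_share) / (1 - us_pop_share)  # everyone else

    print(f"U.S. resident vs. world average: {us_per_capita:.1f}x")                    # 5.0x
    print(f"U.S. resident vs. rest of world: {us_per_capita / rest_per_capita:.1f}x")  # ~6.3x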

Despite the absence of leadership at the federal level, some locales in the U.S. are starting to take action:

** From Boston to Seattle, city governments are pursuing plans to meet and beat the Kyoto Protocol.

** Boulder, Colorado recently adopted a 'climate tax' -- an extra fee on electricity use (also called a carbon tax, because most of our energy produces carbon dioxide). Seattle has imposed a new parking tax, and the mayor hopes to charge tolls on major roads to discourage driving (if you want to enter the city in a car, you pay a toll). Boston recently passed a green building ordinance requiring all new buildings over 50,000 sq. feet to meet strict energy efficiency standards.

We can create a climate safe economy through organized collective action at all levels of society, from the neighborhood and community to the national and international policy level.

Global warming is the result of unsustainable economic practices. To make our economy sustainable, our per capita consumption -- energy, forest products, metals, plastics, etc. -- has to shrink roughly five-fold. Those changes will only happen when organized residents send a resounding message to their elected officials. The size of the needed commitment is similar to the effort the U.S. made in World War II. The changes in energy, transportation and food systems we discuss below are changes that will require national solidarity and collective sacrifice.

Energy efficiency

According to the World Resources Institute, energy accounts for 65% of U.S. greenhouse gas emissions (this includes electric utilities, transportation, industrial, residential and commercial uses). Much of this is used for heating, cooling and lighting buildings, which together make up 12% of U.S. greenhouse gas emissions. Switching to energy-efficient fluorescent lighting, which uses 75 percent less energy (and lasts ten times as long as incandescent bulbs), is a first step consumers and businesses can take immediately. Much greater savings are achieved by upgrading a building's thermal insulation, machinery and appliances -- applying what Amory Lovins of the Rocky Mountain Institute calls 'whole-system' design -- whereby buildings can be made 75 to 90 percent more energy efficient. Lovins reports that energy-efficient homes in Davis, California cost $1,800 less to build and $1,600 less to maintain over their lifetimes than conventional homes of the same size. Energy-efficient building designs allow us to build homes and offices without a furnace or other traditional heating system, even in cold climates (and likewise without air conditioning in warm climates). There are high-tech solutions, like thick layers of polystyrene insulation and triple-layered windows, and low-tech solutions, like building strong, beautiful homes out of straw bales.
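
To make the lighting arithmetic concrete, here is a minimal Python sketch comparing one incandescent bulb with a fluorescent replacement over ten years. Only the "75 percent less energy" and "lasts ten times as long" figures come from the text; the wattages, bulb lives, prices, usage hours, and electricity rate are illustrative assumptions.

    # Ten-year cost of lighting one socket. Assumed numbers are marked.
    HOURS_PER_YEAR = 1000        # assumed: roughly 3 hours of use per day
    RATE = 0.10                  # assumed electricity price, $/kWh

    def ten_year_cost(watts, bulb_life_hours, bulb_price):
        hours = HOURS_PER_YEAR * 10
        energy_cost = watts * hours / 1000.0 * RATE      # kWh x $/kWh
        replacement_cost = hours / bulb_life_hours * bulb_price
        return energy_cost + replacement_cost

    # Fluorescent: 75% less energy and 10x the life, per the text.
    incandescent = ten_year_cost(watts=60, bulb_life_hours=1000, bulb_price=0.50)
    fluorescent = ten_year_cost(watts=15, bulb_life_hours=10000, bulb_price=3.00)
    print(f"Incandescent: ${incandescent:.2f}, fluorescent: ${fluorescent:.2f}")
    # Under these assumptions: $65.00 vs. $18.00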

Switching to energy-efficient buildings and consumer devices will not only reduce greenhouse gas emissions, but will also create huge dollar savings. Industry, which accounts for 17% of U.S. emissions, is already cashing in on advances in energy efficiency by installing energy-efficient motors and other machinery. According to Lovins, DuPont has reduced its heat-trapping emissions by 72 percent over the last decade, saving more than $2 billion so far. All told, we could easily reduce greenhouse gas emissions by at least 50% in the residential, commercial and industrial sectors by applying today's know-how.

Transportation

Transportation generates a third of our nation's greenhouse gas emissions. The Union of Concerned Scientists reports that the fuel efficiency of the U.S. auto fleet improved by 70 percent between 1975 and 1988, saving American consumers $92 billion. Federal corporate average fuel economy (CAFE) standards have since stagnated. Doubling the fuel efficiency of our cars and trucks is doable and would reduce total heat-trapping emissions by 10 percent. That improvement could be made today if everyone drove a vehicle getting 55 miles per gallon, as some cars can now do.

Simply driving less is a cheaper and healthier alternative. We could reduce our greenhouse gas emissions by another 10 percent by driving half our normal annual distance of 10,000 miles and taking public transit, biking and walking instead. (Public transit accounts for less than 1 percent of Americans' miles traveled, according to the Pew Center on Global Climate Change.) Emissions from U.S. light trucks and cars are the fifth largest global source of greenhouse gases -- more than many large countries -- according to the Union of Concerned Scientists.
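
The two ten-percent claims above can be sanity-checked with a short sketch. The one-third transportation share comes from the text; the share of transportation emissions attributable to personal cars and light trucks is an illustrative assumption.

    # Why halving car fuel use cuts total U.S. emissions by roughly 10%.
    transport_share = 1 / 3       # of U.S. emissions, from the text
    light_vehicle_share = 0.6     # assumed share of transport emissions from cars/light trucks

    # Doubling miles-per-gallon halves fuel burned per mile; so does
    # halving the miles driven. Either way, light-vehicle emissions fall 50%.
    cut = transport_share * light_vehicle_share * 0.5
    print(f"Reduction in total U.S. emissions: {cut:.0%}")   # prints 10%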

Even when carbon-neutral hydrogen-powered vehicles become available, living on less land with better public transit and pedestrian-friendly designs is the obvious way to go. And don't get swept up by the biofuel craze (ethanol from corn and biodiesel from soy). Brian Tokar of the Institute for Social Ecology reports, "The entire corn and soy harvest combined would only satisfy 5.3% of our current [fuel] needs." Converting precious food crops to fuel for hugely inefficient cars and trucks just doesn't make economic or ecological sense. Conservation is by far the cheapest, easiest way to reduce our contribution to global warming.

The food we eat

When asked what's wrong with our food system today, Berkeley professor Michael Pollan answers, "I'd have to say the most serious problem with the food system is its contribution to global warming." Our food production system is unsustainable for many reasons. How we grow our food, how far it travels and what kind of food we eat all impact the environment. Agriculture (not counting transportation costs) generates 8% of our greenhouse gas emissions.

The Worldwatch Institute reports that our food regularly travels 1,500 to 2,500 miles from farm to plate. Growing our own food and buying it from local farmers and community supported agriculture (CSA) farms are first steps to reducing our food footprint. Cuba is a good example of a country that has developed urban agriculture and alternative energy for food production.

The Crossroads Resource Center reports that in Minnesota, if consumers purchased just 15 percent of their food from local farmers, it would generate as much income for farming communities as two-thirds of farm subsidies. Community supported agriculture (CSA) programs that offer consumers a variety of fresh, locally grown fruits and vegetables are a great way to support the local food economy. The CSA Learning Center on Chicago's south side educates youth and residents about local farming and helps low-income families get healthy food.

Meat is a major contributor to greenhouse gas emissions too. Methane is 21 times more powerful a greenhouse gas than CO2, and livestock are a major source -- about one sixth -- of atmospheric methane. According to the United Nations Food and Agriculture Organization, livestock's contribution to global warming is greater than that of transportation on a global basis. Global meat production is expected to double in the next 45 years. So it's clear that reducing the meat in our diet, along with localizing our food sources, is a key step.
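
A rough CO2-equivalent conversion shows the scale involved. The 21-fold potency and the one-sixth livestock share come from the text; the global methane total is an illustrative assumption.

    # Convert livestock methane into CO2-equivalent terms.
    METHANE_GWP = 21              # from the text: 21x CO2, tonne for tonne
    global_methane_mt = 600       # assumed global methane emissions, million tonnes/year
    livestock_share = 1 / 6       # from the text

    livestock_mt = global_methane_mt * livestock_share
    co2e_mt = livestock_mt * METHANE_GWP
    print(f"Livestock methane: {livestock_mt:.0f} Mt/yr, "
          f"about {co2e_mt:,.0f} Mt CO2-equivalent/yr")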

Clean, renewable energy

Energy from wind and sunlight is abundant and relatively clean, and it reduces global warming. Wind energy is the fastest growing source of electricity in the world today. According to the Institute for Energy and Environmental Research (IEER), we could meet all our electricity needs if we harnessed the wind resources of just three states -- North Dakota, Texas and Kansas.

Solar photovoltaic cells have considerable potential too. More solar energy hits the earth in 60 minutes than all of humanity consumes in a year. However, harvesting that energy will not be simple or easy. We could meet 10% of our electricity needs by putting solar cells on existing rooftops, according to a report in Science magazine. State programs like the California Solar Initiative aim to make electricity from solar cells cost-competitive with coal and nuclear -- industries that have collected more than $500 billion in subsidies over the past fifty years, according to Environment Illinois. In the next ten years California is expected to more than triple its solar power capacity to 3,000 megawatts, but even this is still a drop in the bucket.
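
The "more solar energy in 60 minutes" claim can be checked against the 13-trillion-watt consumption figure cited earlier. In this sketch the sunlight figure is an outside estimate, labeled as an assumption:

    # Order-of-magnitude check of the one-hour-of-sunlight claim.
    HUMAN_POWER_W = 13e12         # 13 trillion watts, from the article
    SOLAR_SURFACE_W = 89e15       # assumed: ~89 petawatts of sunlight reaching Earth's surface

    seconds_per_year = 365.25 * 24 * 3600
    annual_energy_j = HUMAN_POWER_W * seconds_per_year

    minutes = annual_energy_j / SOLAR_SURFACE_W / 60
    print(f"Sunlight matching a year of human energy use: ~{minutes:.0f} minutes")
    # ~77 minutes under these assumptions -- the claim is the right order of magnitude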

If we converted our economy to 100% wind and solar power -- which we might be able to do in 30 to 50 years if we made it a national emergency goal -- we could avoid at least 50% of today's greenhouse gas emissions and vastly reduce fine-particle air pollution, which is killing 60,000 Americans per year. As Coop America's report on making the transition to a climate safe economy suggests, this requires an immediate moratorium on all new coal and nuclear power plants, something corporations are not prepared to accept without enormous pressure from community groups.

Luckily, community groups are on the case. The international climate justice movement has established a set of principles to guide its work, and the movement is growing.


TROUBLE WITH THE PRECAUTIONARY PRINCIPLE

By Peter Montague

The goal of the precautionary principle is simply to prevent harm. (Look before you leap. A stitch in time saves nine.) However, if you want to prevent harm, you need a pretty good idea of where the harm is coming from. So you start looking for root causes. If you don't know the root causes of a problem, how can you take effective action to prevent it? (Putting a Band-Aid on a cancer may make you feel better for a short while, but if you don't confront the cancer you'll find yourself in real trouble. And if we never ask what's causing the rise in cancer rates, the trouble just multiplies.)

To me, this is the most important aspect of the precautionary principle. It gets us searching for root causes of harm. Even though this is a good thing -- and necessary -- it can still get you into trouble.

Precaution defines a sustainable society -- one that is always doing its best to look ahead, to avoid trouble. Taking a precautionary approach does not guarantee that a civilization can avoid collapse. But the alternative approach, which has dominated our thinking from 1850 to now -- "Shoot first and ask questions later," or "Damn the torpedoes, full speed ahead!" -- has damaged the natural environment and human health so badly that the Millennium Ecosystem Assessment (the result of a six-year study by 1,366 scientists in 95 countries) concluded last year, "At the heart of this assessment is a stark warning. Human activity is putting such strain on the natural functions of Earth that the ability of the planet's ecosystems to sustain future generations can no longer be taken for granted." A stark warning indeed.

Precaution is an ancient technique for survival, developed long before humans arrived on the scene. Animals have always taken a precautionary approach to life. Crows, woodchucks, monkeys -- all have lookouts who scan the horizon (and the neighborhood), calling out at the first sign of trouble. But, more than this, each member of the animal clan takes on the role of self-appointed guardian, attentive to threats. This precautionary approach has allowed animals to sustain themselves for millions of years in a world that is constantly changing and always uncertain. Survival under these conditions is the very definition of sustainability.

We humans seem to have lost this precautionary perspective. We have come to believe that we can manufacture the conditions for our own survival, regardless of conditions in the world around us. In the early 1990s, a group of scientists actually constructed an artificial ecosystem and tried to live in it; they called it Biosphere II (the earth itself being Biosphere I). The whole thing was a colossal failure; the ants took over and the humans were clueless.

During the 20th century, our novel approach -- behaving as if we are in charge of nature -- brought us multiple disasters, several of which are still unfolding today:

** The nuclear industry has covered the planet with radioactive pots of poison that no one will ever clean up -- radioactive wastes dumped into the oceans; radioactive canyons in New Mexico where the bomb-makers buried their mistakes in unmarked graves; mountainous heaps of radioactive uranium mine wastes blowing on the wind; radioactive residues from factories making products with radium and thorium; swaths of radioactive fallout worldwide. Now nuclear power plants -- often the precursors for nuclear weapons -- are proliferating across the globe. The list of unmanageable problems unleashed by nuclear boy-toys continues to grow at an accelerating pace. This technology alone should teach us that our 20th-century ways are unsustainable.

** But we have a second set of experiments to learn from. The petrochemical industry has littered the planet with a staggeringly large number of toxic waste sites, buried in the ground or simply strewn across the surface. For example, after 25 years of cleanup efforts, New Jersey still lists 16,000 contaminated sites, with 200 to 300 new contaminated sites still being discovered each month. Around the world, the petrochemical industry is, daily, creating hundreds more that will remain to plague our children's children's children. The size of this problem is too large to even catalog. And 750 new chemicals are still being put into commercial channels each year.

The regulatory system set up to oversee nuclear and petrochemical technologies has always given the benefit of the doubt to rapid innovation for economic growth, rather than to public health. This may have made sense when capital was scarce and nature was abundant. But now that capital is abundant and nature is scarce, the regulatory system's priorities are causing more harm than good. The world is fundamentally different from the world of 100 or even 50 years ago, and our legal system needs to adapt to these new conditions.

Now the same corporations that created the nuclear and petrochemical messes have been rushing pell-mell to deploy a new generation of far more powerful inventions -- biotechnology, nanotechnology, and synthetic biology (the creation of entirely new forms of life that have never existed before).

With these new technologies, no one is even pretending that regulation can stanch the flood of ill-considered innovations, or the harms they seem certain to bring.

So we have to try something new. The best hope, it seems to me, is for all of us to try to change the culture, to make a precautionary approach standard procedure. (If we do that, it will quickly become apparent that many of our existing laws and institutions, such as freewheeling corporations larger than many nations, no longer make sense and need to be rethought.)

Just as our great-grandparents managed to make slavery unthinkable, now we can make it unthinkable to take any big decision without doing our best to anticipate the consequences, to examine our options, and to choose the least harmful way. (Publicly-held corporations, as they are structured today, cannot aim to minimize harm; as a matter of law, they can only do what is profitable for their shareholders.)

Using a precautionary approach, we would still make painful mistakes. But maybe we could avoid the extinction that threatens to snuff us out if we continue on our present path.

So it seems important to search for root causes of our troubles. This is what the precautionary principle would have us do. This gets us asking questions that are not usually asked in polite company.

Q: Why did we develop corporations?

A: To mobilize investment capital to advance economic growth to make more stuff and accrue more capital.

Q: At one time this made sense, but now that there is more than enough stuff to go around -- and nature is sinking under the weight of it all -- why do we need more economic growth?

A: Because growth is what produces return on capital investment.

Q: Since we are already spending huge sums to convince people to buy stuff they don't need, just to produce return on capital investment -- why do we need even more return on investment?

You see what I mean? Searching for root causes of our accelerating train wreck gets us asking questions that some people may not want asked. That's how the precautionary principle can get you into trouble. But delicious trouble it is, I confess.


THE WORLD IS NEW

By Peter Montague

In the past 50 years, corporations have grown almost unimaginably influential. Originally invented as a way for entrepreneurs to raise capital from strangers, publicly-traded corporations have proven to be extraordinarily successful and they have grown steadily, year by year. In many cases, growing bigger has become their main purpose.

In the past 50 years -- between 1955 and 2004 -- large corporations came to thoroughly dominate the U.S. economy. In 1955, sales of the Fortune 500 corporations accounted for one-third of gross domestic product (GDP). By 2004, sales of the Fortune 500 amounted to two-thirds of GDP, a major consolidation of wealth and power. [1, pg. 22]

Peter Barnes -- co-founder of the Working Assets Long Distance phone company -- describes some additional changes that have occurred during the past 50 years. In his must-read new book, Capitalism 3.0: A Guide to Reclaiming the Commons, Barnes points out that 50 years ago capitalism entered a new phase. Up to that time, people had wanted more goods than the economy could supply. After 1950, there was essentially no limit to what corporations could produce. Their new problem was finding buyers.

Others have remarked on this shift as well. In 1967, in The New Industrial State, John Kenneth Galbraith observed that large corporations require stability and so they must control both supply and demand. To control demand, they manufacture wants. In 1950, everyone's basic physical needs could be met, so to promote growth, corporations had to learn to manufacture desire. Physical wants are limited but, properly stimulated, desires can become infinite.

You might ask, given that the economy can now satisfy everyone's physical needs, providing the basics of a good life, why do we need to manufacture desire to stimulate growth? Because growth is what provides return on investment.

The amount of money available for profitable investment expands exponentially year after year. Therefore, it is essential to keep demand (desire) growing apace -- to create new opportunities for investors to earn a decent rate of return year after year. The U.S. spent $263 billion on advertising in 2004, largely to stimulate desire. Despite this, production continues to outpace effective demand.
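
The logic here is compound growth. A minimal sketch of why an expanding pool of capital demands ever-growing markets (the 7 percent return is an illustrative assumption, not a figure from the text):

    import math

    # Capital seeking a steady return compounds, so the demand needed
    # to absorb it profitably must keep growing too.
    rate = 0.07   # assumed annual return on investment

    doubling_years = math.log(2) / math.log(1 + rate)
    print(f"At {rate:.0%} per year, the pool of capital doubles every "
          f"{doubling_years:.0f} years")   # about 10 years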

In his book, The Return of Depression Economics (1999), Princeton economist Paul Krugman pointed out that inadequate demand (the flip side of overproduction) is now a worldwide problem. He wrote, "What does it mean to say that depression economics has returned? Essentially it means that for the first time in two generations [50 years], failures on the demand side of the economy -- insufficient private spending to make use of available productive capacity -- have become the clear and present limitation on prosperity for a large part of the world." (pg. 155) Overcapacity is chronic.

In the U.S., there have been two major responses to declining opportunities for a decent return on investment. One solution has been to invent new ways of manipulating money. As Peter Barnes points out, today "the world is awash with capital, most of it devoted to speculation."

If we take Barnes's word "speculation" to mean, loosely, the manipulation of money itself for profit, then speculators have indeed grown more important in the U.S. economy in the last 50 years. Corporate profits of the financial industry in the U.S. in 1959 were 15% of total corporate profits; by 2004 the financial industry's profits represented 36% of total U.S. corporate profits. In round numbers, manipulating money now accounts for 40% of all corporate profits.

The second major response to limited investment opportunities has been "globalization" -- creation of a new set of rules that essentially erase national borders, so that materials and capital are now free to flow to wherever costs are lowest. Now if investors see an opportunity to gain a decent return by, say, manufacturing toothpicks by cutting down Indonesian rain forests, they are free to move their money there instantaneously to take advantage of the opportunity. Within the U.S., this has worked out well for investors, but it has not been quite so beneficial for the working class or the middle class. As an editorial writer for the New York Times pointed out in 2002, "Globalization has been good for the United States, but even in this country, the gains go disproportionately to the wealthy and to big business." Globalization has been one of the factors that has consolidated wealth in fewer and fewer hands in recent years.[2]

Globalization also helps explain another important feature of the new world -- the expanding U.S. military. As New York Times columnist Thomas Friedman pointed out in 1998 in an article about the global spread of electronic inventions, "The hidden hand of the global market would never work without the hidden fist. And the hidden fist that keeps the world safe for Silicon Valley's technologies to flourish is called the United States Army, Air Force, Navy and Marine Corps..." The current U.S. military budget of $450 billion -- equal to the military budgets of all other nations combined -- is another aspect of the need to keep growth going, to create opportunities for investors.

As a result of these trends in the past 50 years, 5% of the U.S. population now owns more private wealth than the other 95%.

Naturally, this 5% has gained outsized power to go with its outsized wealth. No one begrudges the fortunate their fortunes (almost all of us think it is better to be rich than not rich), but democracy assumes that everyone has approximately equal standing. Our system of governance is legitimized by the premise of one person, one vote, not one dollar, one vote. Since money talks -- or, in the case of the top 5%, money screams -- we can no longer say we have even the pretense of a democracy. Instead, we have a plutocracy -- rule by wealth -- and one wholly devoted to economic growth.

As economist Herman Daly observed not long ago, we now have a "religious commitment to growth as the central organizing principle of society. Even as growth becomes uneconomic we think we must continue with it because it is the central myth, the social glue that holds our society together."

So even though economic growth is shredding the biosphere, causing more harm than good (which is what Daly means when he says growth has become "uneconomic"), it is heresy to try to imagine a different way of being on the planet.

This is a uniquely modern puzzle -- we have a deep religious commitment to an idea that was once true, but is now false, and which is destroying the future.

[1] Peter Barnes, Capitalism 3.0: A Guide to Reclaiming the Commons (San Francisco: Berrett-Koehler, 2006), pg. 22.

[2] There is nothing wrong with international trade, and it can bring substantial benefits. As the New York Times points out, "China, Chile and other nations show that under the right conditions, globalization can lift the poor out of misery. Hundreds of millions of poor people will never be helped by globalization, but hundreds of millions more could be benefiting now, if the rules had not been rigged to help the rich and follow abstract orthodoxies. Globalization can begin to work for the vast majority of the world's population only if it ceases to be viewed as an end in itself, and instead is treated as a tool in service of development: a way to provide food, health, housing and education to the wretched of the earth."
