RACHEL'S ENVIRONMENT & HEALTH WEEKLY

March 16, 2000

THE MAJOR CAUSE OF CANCER

When Wilhelm Roentgen first discovered X-rays, in 1895, "doctors and physicians saw the practical potential of X-rays at once, and rushed to experiment with them."[1,pg.7] Many physicians built their own X-ray equipment, with mixed results: some home-brew X-ray machines produced no radiation whatsoever, while others produced enough to irradiate everyone in the next room.

The ability to see inside the human body for the first time was a marvelous, mysterious and deeply provocative discovery. Roentgen trained X-rays on his wife's hand for 15 minutes, producing a macabre image of the bones of her hand adorned by her wedding ring. Roentgen's biographer, Otto Glasser, wrote that Mrs. Roentgen "could hardly believe that this bony hand was her own and shuddered at the thought that she was seeing her skeleton. To Mrs. Roentgen, as to many others later, this experience gave a vague premonition of death."[1,pg.4]

Within a year, by 1896, physicians were using X-rays for diagnosis and as a new way of gathering evidence to protect themselves against malpractice suits. Almost immediately -- during 1895-96 -- it also became clear that X-rays could cause serious medical problems. Some physicians received burns that wouldn't heal, requiring amputation of their fingers. Others developed fatal cancers.

At that time, antibiotics had not yet been discovered, so physicians had only a small number of treatments they could offer their patients; X-rays gave them a range of new procedures that were very "high tech" -- bordering on the miraculous -- and which seemed to hold out promise to the sick. Thus the medical world embraced these mysterious, invisible rays with great enthusiasm. Understandably, physicians at the time often thought they observed therapeutic benefits where controlled experiments today find none.

At that time -- just prior to 1920 -- the editor of AMERICAN X-RAY JOURNAL said "there are about 100 named diseases that yield favorably to X-ray treatment." In her informative history of the technology, MULTIPLE EXPOSURES: CHRONICLES OF THE RADIATION AGE, Catherine Caufield (see REHW #200, #201, #202) comments on this period: "Radiation treatment for benign [non-cancer] diseases became a medical craze that lasted for 40 or more years."[1,pg.15] "...[L]arge groups of people [were] needlessly irradiated for such minor problems as ringworm and acne.... Many women had their ovaries irradiated as a treatment for depression."[1,pg.15] Such uses of X-rays would today be viewed as quackery, but many of them were accepted medical practice into the 1950s. Physicians weren't the only ones enthusiastic about X-ray therapies. If you get a large enough dose of X-rays your hair falls out, so "beauty shops installed X-ray equipment to remove their customers' unwanted facial and body hair," Catherine Caufield reports.[1,pg.15]

Roentgen's discovery of X-rays in 1895 led directly to Henri Becquerel's discovery of the radioactivity of uranium in 1896 and then to the discovery of radium by Marie Curie and her husband Pierre in 1898, for which Becquerel and the Curies were jointly awarded the Nobel prize in 1903. (Marie Curie herself would die in 1934 of aplastic anemia, a blood disease generally attributed to her decades of radiation exposure.)

Soon radioactive radium was being prescribed by physicians alongside X-rays. Radium treatments were prescribed for heart trouble, impotence, ulcers, depression, arthritis, cancer, high blood pressure, blindness and tuberculosis, among other ailments. Before long, radioactive toothpaste was being marketed, then radioactive skin cream. In Germany, chocolate bars containing radium were sold as a "rejuvenator."[1,pg.28] In the U.S., hundreds of thousands of people began drinking bottled water laced with radium, as a general elixir known popularly as "liquid sunshine." As recently as 1952 LIFE magazine wrote about the beneficial effects of inhaling radioactive radon gas in deep mines. Even today the Merry Widow Health Mine near Butte, Montana, and the Sunshine Radon Health Mine nearby advertise that visitors to the mines report multiple benefits from inhaling radioactive radon,[2] even though numerous studies now indicate that the only demonstrable health effect of radon gas is lung cancer.

Thus the medical world and popular culture together embraced X-rays (and other radioactive emanations) as miraculous remedies, gifts to humanity from the foremost geniuses of an inventive age.

In the popular imagination, these technologies suffered a serious setback when atomic bombs were detonated over Japan in 1945. Even though the A-bombs arguably shortened WW II and saved American lives, John Hersey's description of the human devastation in HIROSHIMA forever imprinted the mushroom cloud in the popular mind as an omen of unutterable ruin. Despite substantial efforts to cast The Bomb in a positive light, radiation technology would never recover the luster it had gained before WW II.

Eight years after A-bombs were used in war, in 1953, Dwight Eisenhower set the U.S. government on a new course, intended to show the world that nuclear weapons, radioactivity and radiation were not harbingers of death but were in fact powerful, benign servants offering almost-limitless benefits to humankind. The "Atoms for Peace" program was born, explicitly aimed at convincing Americans and the world that these new technologies were full of hope, and that nuclear power reactors should be developed with tax dollars to generate electricity. The promise of this newest technical advance seemed too good to be true -- electricity "too cheap to meter."[3]

The Atomic Energy Act of 1946 created the civilian Atomic Energy Commission, but as a practical matter the nation's top military commanders maintained close control over the development of all nuclear technologies.[4]

Thus by a series of historical accidents, all of the major sources of ionizing radiation fell under the purview of people and institutions who had no reason to want to explore the early knowledge that radiation was harmful. In 1927, Hermann J. Muller had demonstrated that X-rays caused inheritable genetic damage, work for which he would eventually receive a Nobel prize in 1946. However, he had performed his experiments on fruit flies and it was easy, or at least convenient, to dismiss his findings as irrelevant to humans.

In sum, to physicians, radiation seemed a promising new therapy for treating nearly every ailment under the sun; for the military and the Joint Committee on Atomic Energy in Congress it unleashed hundreds of billions of dollars, a veritable flood of taxpayer funds, most of which came with almost no oversight because of official secrecy surrounding weapons development; and for private-sector government contractors like Union Carbide, Monsanto Chemical Co., General Electric, Bechtel Corporation, DuPont, Martin Marietta and others -- it meant an opportunity to join the elite "military-industrial complex" whose growing political power President Eisenhower warned against in his farewell address in 1961.

Throughout the 1950s the military detonated A-bombs above-ground at the Nevada Test Site, showering downwind civilian populations with radioactivity.[5] At the Hanford Reservation in Washington state, technicians intentionally released huge clouds of radioactivity to see what would happen to the human populations thus exposed. In one Hanford experiment 500,000 Curies of radioactive iodine were released; iodine collects in the human thyroid gland. The victims of this experiment, mostly Native Americans, were not told about it for 45 years.[6,pg.96] American sailors on ships and soldiers on the ground were exposed to large doses of radioactivity just to see what would happen to them. The military brass insisted that being showered with radiation was harmless. In his autobiography, Karl Z. Morgan, who served as radiation safety director at the Oak Ridge National Laboratory (Oak Ridge, Tennessee) from 1944 to 1971, recalls that "The Veterans Administration seems always on the defensive to make sure the victims are not compensated."[6,pg.101] Morgan recounts the story of John D. Smitherman, a Navy man who received large doses of radiation during A-bomb experiments on Bikini Atoll in 1946. Morgan writes, "The Veterans Administration denied any connection to radiation exposure until 1988, when it awarded his widow benefits. By the time of his death, Smitherman's body was almost consumed by cancers of the lung, bronchial lymph nodes, diaphragm, spleen, pancreas, intestines, stomach, liver, and adrenal glands. In 1989, a year after it had awarded the benefits, the VA revoked them from Smitherman's widow."[6,pg.101]

Starting in the 1940s and continuing into the 1960s, thousands of uranium miners were told that breathing radon gas in the uranium mines of New Mexico was perfectly safe. Only now are the radon-caused lung cancers being tallied up, as the truth leaks out 50 years too late.

In retrospect, a kind of nuclear mania swept the industrial world. What biotechnology and high-tech computers are today, atomic technology was in the 1950s and early 1960s. Government contractors spent billions to develop a nuclear-powered airplane -- even though simple engineering calculations told them early in the project that such a plane would be too heavy to carry a useful cargo.[4,pg.204] Monsanto Research Corporation proposed a plutonium-powered coffee pot that would boil water for 100 years without refueling.[4,pg.227] A Boston company proposed cufflinks made of radioactive uranium for the simple reason that uranium is heavier than lead and "the unusual weight prevents cuffs from riding up."[4,pg.227]

In 1957, the Atomic Energy Commission established its Plowshare Division -- named of course for the Biblical "swords into plowshares" phrasing in Isaiah (2:4).[4,pg.231] Our government and its industrial partners were determined to show the world that this technology was benign, no matter what the facts might be. On July 14, 1958, Dr. Edward Teller, the father of the H-bomb, arrived in Alaska to announce Project Chariot, a plan to carve a new harbor out of the Alaska coast by detonating up to six H-bombs. After a tremendous political fight -- documented in Dan O'Neill's book, THE FIRECRACKER BOYS[7] -- the plan was shelved. Another plan was developed to blast a new canal across Central America with atomic bombs, simply to give the U.S. some leverage in negotiating with Panama over control of the Panama Canal. That plan, too, was scrapped. In 1967, an A-bomb was detonated underground in New Mexico, to release natural gas trapped in shale rock formations. Trapped gas was in fact released, but -- as the project's engineers should have been able to predict -- the gas turned out to be radioactive so the hole in the ground was plugged and a bronze plaque in the desert is all that remains visible of Project Gasbuggy.[4,pg.236]

In sum, according to NEW YORK TIMES columnist H. Peter Metzger, the Atomic Energy Commission wasted billions of dollars on "crackpot schemes," all for the purpose of proving that nuclear technology is beneficial and not in any way harmful.[4,pg.237]

The Plowshare Division may have been a complete failure, but one lasting result emerged from all these efforts: a powerful culture of denial sank deep roots into the heart of scientific and industrial America.


March 9, 2000

HIDDEN COSTS OF ANIMAL FACTORIES

As the U.S. discards its family farms and in their place erects factory farms, we might consider the costs. Here we will consider only one cost: the harm to human health from increased use of antibiotics in confined livestock operations, sometimes known as animal factories.

As most people know, modern animal factories in the U.S. now raise tens of thousands of chickens, cattle and pigs in the smallest possible space. The animals are physically close to each other -- jammed together might be a better description -- so an outbreak of disease can pass readily from animal to animal. To prevent this from happening -- and to promote rapid growth -- the animals are regularly treated with antibiotics.

The Institute of Medicine, a division of the National Academy of Sciences, began to question this practice in 1989.[1] The Institute identified a hazard to human health: the creation of antibiotic-resistant bacteria which can cause serious human diseases.

Resistance is a well-understood phenomenon. Not all bacteria are affected equally by antibiotics -- some bacteria are genetically able to resist the killing effects of an antibiotic. As a result, when a group of bacteria is dosed with an antibiotic, some hardy bacteria survive. These resistant bacteria reproduce, and the next time the population is dosed with the same antibiotic, an even larger proportion survives. Eventually, the only surviving bacteria are those that can withstand that particular antibiotic. They have developed "resistance," and that antibiotic has lost its effectiveness against them. As time passes, some bacteria can develop resistance to multiple antibiotics; these are referred to as "multi-drug-resistant strains." Such multi-drug-resistant bacteria are a serious medical concern because they may cause diseases that are difficult or impossible to cure, the Institute of Medicine said in 1992.[2,pg.92]
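This selection process is simple enough to simulate. The short Python sketch below is illustrative only (the population size, kill rate and starting resistant fraction are assumptions, not figures from the Institute of Medicine), but it shows how repeated dosing lets a one-in-a-million resistant minority take over an entire bacterial population within a few rounds:

  # A minimal sketch of selection for antibiotic resistance.
  # All numbers are illustrative assumptions, not data from any study.

  def dose_and_regrow(susceptible, resistant, kill_rate=0.999,
                      carrying_capacity=1_000_000):
      """One round: the drug kills most susceptible cells, spares the
      resistant ones, then the survivors regrow to the carrying
      capacity in proportion to their share of the survivors."""
      susceptible *= (1 - kill_rate)  # drug kills 99.9% of susceptible cells
      total = susceptible + resistant
      scale = carrying_capacity / total  # regrowth preserves post-dose ratio
      return susceptible * scale, resistant * scale

  s, r = 999_999.0, 1.0  # start with one resistant cell per million
  for dose in range(1, 5):
      s, r = dose_and_regrow(s, r)
      print(f"after dose {dose}: resistant fraction = {r / (s + r):.3f}")

Run as written, the resistant fraction climbs from 0.001 after the first dose to 0.500 after the second and 0.999 after the third; by the fourth dose the antibiotic is, for practical purposes, useless against this population.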

Some of the costs of antibiotic-resistant bacteria were summarized by the Institute of Medicine:

"An increasingly important contributor to the emergence of microbial threats to health is drug [antibiotic] resistance. Microbes that once were easily controlled by antimicrobial drugs are, more and more often, causing infections that no longer respond to treatment with these drugs."[2,pg.92]

The Institute went on to outline the human costs of antibiotic-resistant germs: "Treating resistant infections requires the use of more expensive or more toxic alternative drugs and longer hospital stays; in addition, it frequently means a higher risk of death for the patient harboring a resistant pathogen. Estimates of the cost of antibiotic resistance in the United States annually range as high as $30 billion. Even with the continuing development of new drugs, resistance to antibiotics is an increasingly important problem with certain bacterial pathogens."[2,pg.93]

The Institute laid the problem squarely on the doorstep of animal factories: "New agricultural procedures can also have unanticipated microbiological effects. For example, the introduction of feedlots and large-scale poultry rearing and processing facilities has been implicated in the increasing incidence of human pathogens, such as SALMONELLA, in domestic animals over the past 30 years. The use of antibiotics to enhance the growth of and prevent illness in domestic animals has been questioned because of its potential role in the development and dissemination of antibiotic resistance. Approximately half the tonnage of antibiotics produced in the U.S. is used in the raising of animals for human consumption. Thus, concerns about the selection of antibiotic-resistant strains of bacteria and their passage into the human population as a result of this excessive use of antibiotics are realistic."[2,pg.64]

Throughout the 1990s, awareness of this problem grew.

In May 1998, the federal Centers for Disease Control and Prevention reported in the NEW ENGLAND JOURNAL OF MEDICINE that a strain of salmonella bacteria resistant to 5 different antibiotics had emerged in the U.S. during the previous 5 years.[3] Called typhimurium DT 104, this rapidly emerging bacterium is responsible for an estimated 68,000 to 340,000 illnesses each year in the U.S. The proportion of salmonella infections caused by typhimurium DT 104 increased 30-fold in the U.S. between 1980 and 1996.

The Centers for Disease Control blamed the rapid emergence of this infectious agent on the use of antibiotics in livestock, summarizing its recommendations this way: "More prudent use of antimicrobial agents [antibiotics] in farm animals and more effective disease prevention on farms are necessary to reduce the dissemination of multi-drug-resistant typhimurium DT 104 and to slow the emergence of resistance to additional agents in this and other strains of salmonella."[3]

In March 1999 the FDA began a multi-year process to regulate the use of antibiotics in farm animals. Here is how the NEW YORK TIMES reported the FDA's action in a front-page story on March 8:

"Faced with mounting evidence that the routine use of antibiotics in livestock may diminish the drugs' power to cure infections in people, the Food and Drug Administra- tion has begun a major revision of its guidelines for approving new antibiotics for animals and for monitoring the effects of old ones.

"The goal of the revision is to minimize the emergence of bacterial strains that are resistant to antibiotics, which makes them difficult or even impossible to kill. Drug-resis- tant infections, some fatal, have been increasing in people in the United States, and many scientists attribute the prob- lem to the misuse of antibiotics in both humans and ani- mals.

"Of particular concern to scientists are recent studies showing bacteria in chickens that are resistant to fluoroqui- nolones, the most recently approved class of antibiotics and one that scientists had been hoping would remain effective for a long time."[4]

The NEW YORK TIMES then described[4] the May 1998 study by the federal Centers for Disease Control,[3] adding new information from an interview with Dr. Fred Angulo, one of the authors of the study:

"Last May, a team from the centers reported in the New England Journal of Medicine that the prevalence of a salmonella strain resistant to five different antibiotics increased from 0.6 percent of all specimens from around the country tested by the centers in 1980 to 34 percent in 1996.

"Similarly, drug resistance in campylobacter bacteria rose from zero in 1991 to 13 percent in 1997 and 14 percent in 1998, said Dr. Fred Angulo, an epidemiologist in the food- borne and diarrheal disease branch at the centers. He said epidemiologists had been alarmed by the campylobacter figures, because the resistance was to fluoroquinolones, the very drugs the F.D.A. was trying hardest to preserve.

"Dr. Angulo said that he and his colleagues had attribut- ed much of the increase in fluoroquinolone resistance to the drug agency's approval of the drugs to treat a respiratory infection in chickens in 1995. It was an approval that the disease control centers opposed, because it would lead to tens of thousands of the birds being treated at one time.

"Dr. Angulo said he thought the rising levels of resis- tance in bacteria taken from sick people had been caused by the heavy use of antibiotics in livestock. 'Public health is united in the conclusion,' he said. 'There is no controversy about where antibiotic resis- tance in food-borne pathogens comes from.'"[4]

Two months later, in May 1999, a report by the Minnesota Health Department, published in the NEW ENGLAND JOURNAL OF MEDICINE, found that infections by antibiotic-resistant bacteria increased nearly 8-fold between 1992 and 1997. Part of the increase was linked to foreign travel, and part to the use of antibiotics in chickens. Even the increase due to foreign travel may have been caused by the use of antibiotics in chickens in countries such as Mexico, where the use of antibiotics in poultry has quadrupled in recent years, the report said.[5] The study's lead author, Dr. Kirk E. Smith, told the Associated Press, "There is definitely a public health problem with using quinolone [antibiotic] in poultry, and we need to take a hard look at that."[6]

In November 1999 a new report appeared in the NEW ENGLAND JOURNAL OF MEDICINE linking an outbreak of fatal salmonella in Denmark to the use of antibiotics in pigs.[7] Here is how the NEW YORK TIMES reported the story:

"An outbreak of severe, drug-resistant salmonella infections in 27 people in Denmark, traced to meat from infected pigs, is being described by American scientists as a warning on what can happen in the United States unless steps are taken to limit the use of antibiotics in farm animals.

"The episode in Denmark, in which 11 people were hospitalized and 2 of them died, is especially worrisome because the bacteria had made them partly resistant to a class of antibiotics called fluoroquinolones that doctors had considered one of their most powerful weapons against severe cases of salmonella and other bacteria that infect the intestinal tract. If those bacteria invade the bloodstream, which occurs in 3 percent to 10 percent of salmonella cases, the illness can be fatal.

"'Fluoroquinolones become a drug of last resort for some of these infections,' said Dr. Stuart Levy, director of the Center for Adaptation Genetics and Drug Resistance at Tufts University. 'If we're beginning to lose these drugs, where do we go from here?'

"Fluoroquinolones are the most recently approved class of antibiotics; nothing comparable is expected to become available for several years," the Times said.[8]

Deaths due to infectious diseases have been increasing in the U.S. in recent years. In the 1950s and '60s, public health specialists were predicting that infectious diseases would disappear as a problem, but that prediction was entirely wrong. According to a 1996 report in the JOURNAL OF THE AMERICAN MEDICAL ASSOCIATION, between 1980 and 1992 the death rate with infectious diseases as the underlying cause increased 58%, from 41 to 65 deaths per 100,000 population in the U.S. (See REHW 528.) Some of this was due to an increase in AIDS during the period, but AIDS is typically a disease of young people; among those aged 65 and over, deaths due to infectious diseases increased 25% during the period 1980-1992 (from 271 deaths per 100,000 to 338 deaths per 100,000). Thus there seems to have been a real and substantial increase in deaths due to infectious diseases in the U.S. during the past 20 years.[9]
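Those percentages follow directly from the quoted rates. Here is a quick check in Python, using only the figures cited above:

  # Checking the percentage increases quoted from the 1996 JAMA report.
  def pct_increase(old_rate, new_rate):
      """Percent change between two death rates (deaths per 100,000)."""
      return (new_rate - old_rate) / old_rate * 100

  print(f"all ages: {pct_increase(41, 65):.1f}%")    # 58.5%, reported as 58%
  print(f"ages 65+: {pct_increase(271, 338):.1f}%")  # 24.7%, reported as 25%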

In sum, serious infectious diseases are enjoying a resurgence in the U.S. Our national policy of replacing family farms with animal factories in the name of "economic efficiency" is one of the key reasons.
