
Who Watches the Watchers?

STUDIES SHOW: HOW RESEARCHERS CAN MANIPULATE STATISTICS AND HOW THE MEDIA GOBBLES IT UP UNCRITICALLY
The magnificent Great Blue Heron. Could measured use of DDT have saved not only the heron, but also millions of human lives?

Editor’s Note: Any issue where science and public policy collide can fall prey to some combination of political opportunism and scientific corruption. Even when motives are pure, there is still potential for well-intentioned researchers to go down paths that are later revealed to be completely off track. When powerful vested interests and deeply rooted emotions intersect, the truth is only one card in the deck, hard to find, and relatively easy to stack.

The following report by veteran EcoWorld science correspondent Edward Wheeler identifies the “seven deadly sins” of epidemiological studies, and shows how many of these flawed studies pass from the laboratory press release into the uncritical hands of journalists and, before you know it, are enshrined in new legislation or regulations. But far too often these studies are not nearly as conclusive as they are made to appear, and the consequent actions we take are not rational.

The point of all this goes beyond just epidemiology, to the relationship between scientific inquiry, media reporting, popular sentiment and public policy. Scientists who indulge in dramatic proclamations, becoming rich and famous in the process, need ongoing critical review. Today one has to ask: Is scientific peer review a way to challenge conventional wisdom and expose conclusions that aren’t clearly indicated by the underlying data, or has peer review become precisely the opposite – a way to exclude contrarian notions? Have certain scientifically developed hypotheses prematurely assumed the mantle of truth beyond debate?

Who will watch the watchers, when the watchers are our scientists, whose currency of reason is so arcane, so specialized and diverse, that nobody, not even among the scientists themselves, has sufficient credentials to question the conventional wisdom? The first step is to remember the fallibility of studies, to restore the innate and vital skepticism of journalists, and to remind the public that debate is the crucible of truth. To that end, read on. – Ed “Redwood” Ring

Studies Show – How researchers can manipulate statistics and how the media gobbles it all up uncritically.
by Edward Wheeler, April 29, 2008
California’s magnificent Central Coast, home to the elusive California Condor. Saving this precious species is one of environmentalism’s finest achievements.

We should all be scared, VERY scared. It seems as if every day a new “study” is reported somewhere in the national media showing a statistical association between diet, lifestyle, or environmental chemicals and some disease or disorder.

Do you eat the “wrong” foods such as red meat, hot dogs, french fries, coffee, alcohol, grilled meats, too much fat, artificial sweeteners, preservatives, or NOT eat enough vegetables?

Are you overweight, and don’t exercise enough? Do you use deodorants, mouthwash, nail polish, electric razors or blankets, cell phones? Do you live near power lines, use birth control pills or take hormone treatments, have some radon in your basement, breathe polluted air or second hand smoke?

Do you worry and fret about all these things after reading the terrifying results of some new study? Then surely you will die from some form of cancer or heart disease sometime next week, probably from the stress and lost sleep of worrying so much!

All these studies are called epidemiological studies, which seek to find statistical correlations, mostly quite subtle, between diet, lifestyle, or environmental factors and disease. Real sciences, like chemistry and physics, seek to find cause and effect. Epidemiological studies supply only statistical links between this or that risk factor and some disease. Such studies almost never prove cause and effect, and they are subject to researcher bias and political agendas, poor design, confounding variables, bad data gathering and more. Unfortunately, most reporters who write articles on these studies are scientifically ignorant and simply parrot whatever the study authors say.

Author Mark Twain popularized the saying, “there are three kinds of lies: lies, damned lies, and statistics”. An even better quote comes from the renowned epidemiologist Alvan Feinstein of Yale University, who said that “statistics are like a bikini bathing suit: what is revealed is interesting, but what is concealed is crucial.” Let’s look at the history of this field of study.

Epidemiology: “The study of the distribution of diseases in populations and of factors that influence the occurrence of disease.” Classical epidemiology was fathered by an Italian physician named Ramazzini. Around 1700, he started looking into the possibility that various diseases in his patients might be connected to their occupations. For example, miners and chemical workers might have some lung disease because they are exposed to dust, various chemicals, or toxic metals over the course of their careers. Years later a London surgeon, Percivall Pott, noted that virtually everyone he treated for cancer of the scrotum was a chimney sweep. Hmmm, he must have thought to himself: it’s a non-communicable disease, so I wonder if all that soot and coal tar they breathe and get all over them every day might be the cause. This was a monumental proof of concept!

Classical epidemiology is like police work. If there is an outbreak of some kind of stomach ailment in a number of people in a city who all seek medical treatment (they were up all night throwing up and sitting on the pot, maybe some even died), public health investigators would seek to determine what history all these sick people might have in common. If it turns out 95% of them ate at Joe’s diner within the last few days, odds are Joe was serving E. coli burgers or maybe Salmonella oysters. A simple test would confirm it.

A recent example is a disease identified in 1976, later named Legionnaires’ disease, which is a form of pneumonia unknown before then. Hundreds of men were affected, and 32 died. It was found that all of them had attended an American Legion convention in Philadelphia. Voila, investigators identified a bacterium living in the ventilation system of the hotel where the convention took place. That is classical epidemiology. Now let’s discuss a branch of epidemiology that uses “clinical trials” to try to find the facts about disease, cause, and prevention.

Probably the first “study” we now might loosely call a clinical trial occurred in 1747. Scurvy was a common illness among sailors at the time. James Lind, a surgeon in the British Royal Navy, wondered if perhaps it had something to do with the fact that sailors on long voyages ate almost no fresh fruits and vegetables. We now know that scurvy is caused by a deficiency of vitamin C, but at the time the necessity of vitamins to our health was unknown. He tested his hypothesis by dividing a number of scurvy-stricken sailors into two groups, one of which was given fresh fruit (which we now know contains vitamin C) to eat, while the other group continued eating hardtack and rum. ALL the sailors sucking citrus got over their illness, while ALL the sailors in what we now would call the no-veggie “control” group still had scurvy. Eureka! From then on British sailors sucked on limes and stayed healthy, while those poor French and Spanish sailors stayed sick and lost lots of sea battles to the British.

Another classic example of an early clinical trial was conducted by Walter Reed, a U.S. Army medical officer stationed in Cuba around 1900. Yellow fever was rampant at the time, and he wondered why it was only prevalent in tropical climates. His trial could never be done today for ethical reasons. He suspected mosquitoes might somehow be spreading the disease through their bites. He recruited a small number of healthy volunteers, half of whom were deliberately bitten by mosquitoes, while the other half were not bitten. Most of the poor guys with the bites came down with yellow fever, and one of them died! None of the bite-free guys got the fever. That was definitive, whereas today’s various studies and trials are rarely so (with one famous exception that I discussed in my EcoWorld article entitled “Chemophobia”).

The following is a perfect example of how an epidemiological study should be conducted in order to give definitive results, with NO question about the findings, even though it wasn’t planned that way and would be considered unethical and far too expensive to conduct if it were. AND, this was a really BIG study.

Back in the 1960s, this study enrolled tens of millions of people (the test group) who volunteered to inhale huge amounts of suspected carcinogens every day of their lives for at least 20 years, AT THEIR OWN EXPENSE! The same number of people who did not inhale the suspected carcinogens (the control group) was compared with the test group after 20-30 years to determine the rates of various cancers in the two groups. Absolutely unequivocal results showed that people in the test group had an increased incidence of various cancers and heart disease over the control group, and the most striking result was that people in the test group were about 15 times more likely to get lung cancer than people in the control group!

Thus we now know for sure that smoking can cause lung cancer and various other health problems. Now THAT was a really good epidemiological study! It is, however, not even conceivable to design and carry out such a clinical trial today, for ethical reasons; in addition, the time and expense would be prohibitive. So let’s look at how those “study” authors do things today. The reader may have already figured out that I perceive most of these “studies” to be what is often called “junk science.” I do not, however, believe real science is involved at all in most statistical studies, so I call them “bad (pretend) science”, or BS for short. I won’t go into the statistical details and methods, but I will show many wonderful examples of famous BS. You can get the underlying methods by reading Steven Milloy’s “Science Without Sense” and “Junk Science Judo.”

Here are the “seven deadly sins” of epidemiology (epiBS from now on) as practiced today:

1) Have a political, health, or moral agenda and design (rig) your study in order to get the results you want. This applies to all sides of the political spectrum and official government agencies. Real scientific method is: put forth a hypothesis, then gather data to determine whether your hypothesis is correct or not. EpiBS method is: Have a mandated or acceptable conclusion in mind, then go select only the data that appear to support your already reached conclusion (see famous example below).

2) Assume that a statistical correlation that you found in your latest study between some disease or disorder and some exposure to some perceived risk factor is proof of a cause and effect relationship. EVEN if there is no apparent biological reason to think so, you can still think of some improbable rationalization for your results!

3) Data dredging: Don’t bother with any hypothesis prior to gathering your data; just ask a large group of people lots of questions about lifestyle, diet, drinking habits, etc., over a period of time. Feed the data into your computer statistics program and see if something correlates to something. Who knows what you might find?

4) Don’t bother to verify any data you gather through questionnaires. Just assume nobody ever misremembers or lies about their lifestyle, diet, shoe size, or anything else you might have thought to ask about. Ask subjects how much alcohol they drink per day, and they will understate the amount by a factor of 3 or 4 at least. It’s like a wife asking her husband how much money he lost playing the slots at the casino.

5) Design studies that are fatally flawed from the beginning. Because you don’t know anything about the biochemistry involved (after all, you are either a medical guy or a statistician), you have no clue why you got the associations you did, but you believe them and publish anyway.

6) If your study doesn’t find any association between, say, radon exposure and lung cancer, perform a meta-analysis, combining the weak, statistically insignificant results of numerous studies by other researchers with your own; that way you don’t even have to do a study of your own at all. It doesn’t matter how good or bad or even how similar in design all those studies were (the old apples-and-oranges comparison); combining them just might give a statistically significant correlation (a toy sketch of how this pooling works appears right after this list).

7) ALWAYS call the news media immediately after finding an association between, say, exposure to some hot issue chemical and some disease state. The reporters know absolutely nothing about how these studies are done and will uncritically report whatever you say. Your study will make big headlines tomorrow, and you will be quoted as saying, “the results are important, but more research is needed”. That translates into, “I need more grant money to continue to do BS.”
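
To see what sin #6 looks like mechanically, here is a minimal Python sketch of a fixed-effect (inverse-variance) meta-analysis. Every number is invented for illustration: eight hypothetical studies, none statistically significant on its own, pool into a “significant” combined relative risk (RR).

    # Toy fixed-effect (inverse-variance) meta-analysis, invented numbers.
    from math import log, exp, sqrt

    # (relative risk, standard error of ln(RR)); each individual 95% CI
    # straddles RR = 1.0, i.e. no single study shows a significant effect
    studies = [(1.15, 0.15), (1.25, 0.15), (1.10, 0.15), (1.30, 0.15),
               (1.20, 0.15), (1.18, 0.15), (1.22, 0.15), (1.12, 0.15)]

    for rr, se in studies:
        lo, hi = exp(log(rr) - 1.96 * se), exp(log(rr) + 1.96 * se)
        print(f"study: RR {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")

    # pool the studies with inverse-variance weights
    den = sum(1.0 / se ** 2 for _, se in studies)
    pooled = sum(log(rr) / se ** 2 for rr, se in studies) / den
    pooled_se = sqrt(1.0 / den)

    lo, hi = exp(pooled - 1.96 * pooled_se), exp(pooled + 1.96 * pooled_se)
    print(f"pooled: RR {exp(pooled):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
    # pooled: RR 1.19, 95% CI (1.07, 1.32) -- now "statistically significant"

The pooling machinery itself is legitimate; the sin lies in which studies you choose to feed it, as the EPA example below demonstrates.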

The most egregious example I will give of epiBS combines deadly sins #1 and #6, and its results have had enormous implications for nanny-state public policy. The U.S. Environmental Protection Agency’s (EPA) original mission was to establish rules and regulations meant to protect the environment, for example from air and water pollution. Over the years, however, mission creep occurred, and the agency now exists to protect public health. This gives it vastly more power to institute regulations of a kind it was never originally intended to issue.

In 1993, the EPA conducted a now infamous study that kicked off the anti-smoking crusade that continues today. At the time, more than 30 epidemiological studies from around the world had been conducted to see if the spouses of smokers were more likely to get lung cancer than spouses of non-smokers. None of them was definitive; at most they showed a very weak correlation. Some of those studies actually suggested (also weakly) that exposure to second hand smoke, or environmental tobacco smoke (ETS), might actually protect the spouse against getting lung cancer. This is a plausible biological process called “hormesis”, i.e., very low levels of exposure to a toxin can protect a person against high levels of exposure later.

The EPA has even admitted that the average annual exposure to ETS particles for a non-smoker is less than that from actively smoking one cigarette. Anyway, the EPA ignored those studies and selected only 11 studies to combine in a meta-analysis that they hoped would establish a statistically significant correlation between ETS and lung cancer in spouses of smokers. They also chose not to include in their meta-analysis any of some 30 available studies that were designed to determine if ETS in the workplace, as opposed to ETS at home from a smoking spouse, could be responsible for an increased risk of lung cancer in the non-smokers so exposed. A large majority of those workplace studies found no statistically significant association between workplace ETS exposure and lung cancer risk. Is that why they did not include any of those studies in their meta-analysis?

The meta-analysis used by the EPA to analyze the effects of environmental tobacco smoke (ETS, or second hand smoke) committed two of the cardinal sins of epidemiology. First, they selected only those studies that might show that ETS causes lung cancer, and they designed the analysis as a one-tailed test. That means you assume a priori that the test substance can only be bad, so you don’t allow for results showing the opposite of what you expect (or want) to see. Including all the studies in the meta-analysis, even those that may indicate that ETS could possibly be beneficial, and testing in both directions would have made it a statistically acceptable two-tailed test.

The second cardinal sin they committed is that they used a 90% confidence level, instead of the gold-standard 95%, in order to get a statistically significant result. What does that mean, a statistics-ignorant person might ask? At the 95% level, a result declared statistically significant still has a 1 in 20 probability of being due to pure chance (1/20 means p = .05 in stat language). All good epidemiological studies use the .05 significance level. The EPA chose .1 (a one in 10 chance of the result being a fluke instead of a 1 in 20 chance) because they knew beforehand that their results would not be significant otherwise. In other words, they rigged the “study” to get the result they wanted, epiBS in its most flagrant form. They knew that smoking is bad for the health of smokers, but they couldn’t regulate smoking unless they could claim ETS could cause disease in innocent bystanders. This is perverting science: they believe that, for a worthy cause, the end justifies the means.
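
To make the 90%-versus-95% trick concrete, here is a minimal Python sketch. The relative risk of 1.19 matches the figure commonly cited for the EPA’s pooled ETS estimate; the standard error is invented purely for illustration, chosen to land in the borderline zone where the choice of confidence level decides the headline.

    # How the choice of confidence level can flip a borderline result.
    # RR = 1.19 is the commonly cited EPA pooled estimate; the standard
    # error is hypothetical, picked to illustrate the borderline case.
    from math import log, exp

    rr, se = 1.19, 0.10  # relative risk and (invented) std. error of ln(RR)

    for level, z in [("90%", 1.645), ("95%", 1.960)]:
        lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
        verdict = "significant" if lo > 1.0 else "NOT significant"
        print(f"{level} CI: ({lo:.2f}, {hi:.2f}) -> {verdict}")

    # 90% CI: (1.01, 1.40) -> significant
    # 95% CI: (0.98, 1.45) -> NOT significant

Same data, same point estimate; only the evidentiary bar moved.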

What, you still think ETS causes lung cancer in non-smokers? The EPA epiBS meta-analysis study was done in 1993. A study sponsored by the World Health Organization in 1998, which covered seven countries over seven years, showed no increase in cancer risk for spouses and co-workers of smokers. It was, however, another meta-analysis. I don’t like meta-analyses in general, even when there may be no political agenda involved as in the EPA study. So has there been one huge study done right, no meta-analysis BS? YES!

In 2003, a study published in the British Medical Journal found no relationship between exposure to passive smoke and mortality. It was a HUGE, very believable study. It spanned 39 years and included over 35,000 Californians. So why was such a really good study ignored by the media and the epiBS community? POLITICAL CORRECTNESS?

Disclaimer: I DON’T SMOKE, AND I AM VERY ANNOYED BY ETS, so don’t accuse me of loving tobacco companies. I agree with laws banning smoking in public buildings, transportation, and enclosed areas where one must go to do business.

About the Author: Dr. Wheeler earned a Ph.D. in chemistry from the University of California, Berkeley in 1970. As a research scientist for the U.S. Department of Agriculture in Berkeley, he did pioneering research on how nutritional status and cancer are interrelated, and on how our immune systems handle foodborne carcinogens. He published 25 research papers in peer-reviewed scientific journals and gave numerous talks (and listened to many, many more) at various scientific meetings. He left the USDA to work for Nabisco in New Jersey as head of the food science research unit. Now retired, he writes brilliant articles for EcoWorld pro bono. He is the resident contrarian for ecoworld.com.


Chemophobia

Shouldn’t the Dose Make the Poison?
Are organics always less toxic than synthetics?

Editor’s Note: Why should a contrarian essay entitled “Chemophobia” be broadcast here? Because open debate is essential, and the author has many of his facts straight. The main point, that toxicology’s foundation is “the dose makes the poison” and that this foundation is often ignored, is valid. Actuarial arguments, framed in actuarial terms, are not callous attempts to “use comparison to deprecate the risk”; they are essential to setting any rational strategy. A deadly poison that you would have to eat three barrels a day of for thirty years to have a 50% elevated risk of some disease is not hazardous whatsoever if you merely eat three meals a week of it for thirty years.

If the risk of chemicals were put in perspective, Americans would have just gone in and cleaned up the Superfund sites, instead of spending hundreds of billions on courtroom fees and bureaucrats’ salaries while accomplishing almost nothing. This should make anyone angry.

A distinguished chief scientist once told me his company was developing organic pesticides because that’s where it was easier to get public grants, the approvals were streamlined, and the marketing more effective. He asserted there are many extremely dangerous organic pesticides that persist longer and are far more toxic than available synthetic pesticides. Would DDT be safer and cheaper than current alternatives if it were properly applied? Dose definitely made the poison in the case of DDT, which was applied in doses 100 to 1,000 times higher than necessary to be effective.

Polemical, indignant, provocative, diametrically opposed points of view are difficult to take, but conventional wisdom should always be challenged. Are we making a mistake in not using DDT again? Wasn’t overuse, not toxicity, the issue with DDT? Isn’t it true that you shouldn’t apply the precautionary principle to everything? Exposing the absurdities that underlie anyone’s self-serving rhetoric helps us to distill what is valid. Are all GMOs really bad? What about vitamin A enriched rice that saved the eyesight of ten million children?

The dose makes the poison, not the label. Are we drowning in carcinogens, or are we just suffering from chemophobia? Read on for hard facts, and you decide. – Ed “Redwood” Ring

Paracelsus, a 16th century alchemist and physician, invented the science of toxicology.

Today, if you are a graduate student in toxicology at a major university, you need to answer only one question correctly during your final oral exam to get your PhD. The professor asks: what did Paracelsus have to say about potentially toxic chemicals? You answer, “umm, the dose makes the poison??” Huzzah! Here’s your PhD! “The dose makes the poison” means that of the thousands of various chemicals we ingest from breathing and eating, i.e. living, practically all are toxic if ingested at a high enough dose. For example, virtually all the vitamins and minerals we need in order to survive can be toxic if taken in excess. But “excess” may mean amounts far higher than one could possibly ingest on a daily basis in a lifetime, no matter how hard one tried.

America's War on Carcinogens (American Council on Science & Health)

This brings us to the topic of chemical carcinogens. In 1958, Congress inserted the now infamous Delaney Clause into the Food, Drug and Cosmetic Act. It prohibited the presence in foods of any synthetic chemical (pesticide, food additive, etc.) in any amount if that chemical had been found to cause cancer in laboratory animals (1). Notice the law said synthetic chemical (more about that later). This law is still in effect (loosely, because even EPA regulators understand that zero is impossible), even though today’s analytical techniques enable chemists to detect any chemical of interest in food or water at levels a billion to a trillion times lower than was possible in 1958. Back then, if you put a gram of DDT into your backyard wading pool and sampled the water, you could just detect the DDT using the analytical techniques of the day. Today, that same gram of DDT could be detected in a water sample from Lake Michigan!!

It was in 1959 that we had our first national cancer panic in the U.S. Traces of a synthetic herbicide that was a carcinogen in rodents were detected in cranberries, so nobody ate cranberries that Thanksgiving, and the industry suffered mightily. This was OK with me because I never liked cranberries anyway. It was pointed out at the time that one would need to eat 15,000 pounds of cranberries every day of one’s life to match the dose the rodents were given, but no one seemed to care. There have been many more such scares since then, inspired by the media and special interest groups (including scientists who love grant money): dioxin everywhere, nitrites in bacon and sausage, Alar in apples, cell phones causing brain cancer, etc., etc. When I was a kid in the 50s, sitting too close to those newly available “television sets” was widely believed to be a cancer risk because of exposure to the T.V. tube “rays”. Hey, cancer is scary, and very little was known at that time about the biochemical mechanisms involved in cancer etiology, and even less about how our immune system defends us against it.

So by 1959, the major tenet of toxicology, “the dose makes the poison,” was tossed out the window from 100 stories up, crushed like roadkill on the Jersey Turnpike.

After that, the man who got the carcinogen ball really rolling was the noted U.C. Berkeley chemist Bruce Ames. He invented a quick, easy, and cheap test (strangely, now called the “Ames test”) to determine if any chemical of interest can cause mutations in the DNA of bacteria in vitro. If mutations were observed, then that particular chemical was considered likely to be a carcinogen in lab animals (usually it is, but not always). Dr. Ames became a campaigner for environmental groups wanting to ban various pesticides and herbicides. Today, he has totally changed his position, but that’s another story for next time.

By the mid 1960s, rodents bred to be cancer prone (inbred strains at first, GMO rats and mice today) became commercially available for carcinogen testing (it is very difficult to induce cancer in normal rats and mice).

Here’s how testing a chemical is done, then and now:

1) Do the Ames test on some pesticide, food additive, preservative, or whatever, and find it to be mutagenic. This means a gene in the bacterium’s DNA is altered in some way or other.

2) Determine what is called the “maximum tolerated dose” (mtd) of this mutagenic chemical in your rats or mice. The mtd is the amount of the chemical that almost kills the rodents in a single dose. It is also a dose that, depending on the particular chemical, can be thousands to millions of times higher than a human could ever eat in a lifetime. Next, feed the rodents just 10% less than that dose daily for their entire lifetime, usually between one and two years. If you really care whether your research might be relevant to reality (and if you have enough grant money, because these tests are very expensive), you can also include groups of animals fed lower doses, say half the mtd, though this is rarely done. Oops, I forgot to mention: if the test chemical is so noxious that the rodents won’t eat their food, use gavage, i.e., force-feed the chemical directly into their stomachs through a tube every day. This technique obviously mimics human exposure to pesticides, right?

3) After a year or two, sacrifice the animals and count up all the various tumors they might have in various organs. Most of the rodents in the control group, fed a normal diet, will have various tumors anyway because they have been bred to be cancer prone. So, if the test group of rodents fed some noxious chemical at the highest dose has an average of, say, four tumors per animal in a particular organ, and the control group has an average of only one tumor per animal, then the chemical being tested has quadrupled cancer incidence, to 400% of the control rate!! Call the media! (The arithmetic is spelled out in the sketch below.)
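
Here is that arithmetic in a few lines of Python, using the made-up tumor counts from step 3, since the relative-versus-absolute framing is exactly what press releases love to blur.

    # Made-up tumor counts from step 3: mean tumors per animal per group.
    test_mean, control_mean = 4.0, 1.0

    fold = test_mean / control_mean      # 4x the control rate
    pct_of_control = fold * 100          # 400% of the control rate
    pct_increase = (fold - 1.0) * 100    # but only a 300% increase
    print(f"{fold:.0f}x control = {pct_of_control:.0f}% of the control rate "
          f"= a {pct_increase:.0f}% increase")

A fourfold rate is a 300% increase; a headline writer who calls it a “400% increase” has already inflated the finding before the first word of caveats.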


Next, the Food and Drug Administration (FDA) will classify this test chemical as a possible human carcinogen, as if rodents were nothing more than miniature humans, and establish acceptable levels of the chemical in foods using a HUGE margin of safety based on a very faulty mathematical model. Sometimes a test chemical will induce cancer in male rats but not in females, or vice versa. It doesn’t matter. Even if the test chemical caused cancer at the maximum tolerated dose (mtd) but did NOT result in ANY excess cancer at HALF the mtd (remember, the dose makes the poison?), the chemical will still be classified as a possible human carcinogen subject to government regulation. Remember, the maximum tolerated dose is defined as an amount which will almost kill the animal with ONE exposure! Now various “consumer safety groups” and professional fear mongers will launch scare attacks in all the “we love to report this kind of stuff” news media. The EPA will decide what levels are acceptable in the air, water, soil, etc., based on politics more than science, and the FDA will decide acceptable levels in foods based on politics and its faulty mathematical model.

Now let’s consider known human carcinogens. There are very few of these, but I will give you an example of how they came to be known, and it’s not because of rodent testing. About 60 years ago, a huge experiment was started in which millions of human volunteers, at their own expense, were exposed to really high daily doses of a suspected carcinogen over a period of at least 25 years. At the end of the test period, their cancer rates were compared to the rates found in a group of millions of people not exposed to that suspected carcinogen. It turns out that the exposure group had a lung cancer incidence at least 10-15 times higher than that of the non-smoking group! Other types of cancers were also significantly increased. Oops, I forgot to say it was cigarette smoke that was the suspected carcinogen! Obviously, such controlled experiments using some chemical cannot ethically be conducted on humans in a laboratory setting. Because rodents go crazy if forced to breathe noxious stuff like cigarette smoke, it has never been shown that rats can get cancer from breathing it. So we have a situation where we know cigarette smoking causes cancer in humans, but we can’t be sure it does so in rats. Is there some irony here?

The “Junk Science” website attacks environmentalist groupthink.

There are only a few dozen known human carcinogens, and it takes long-term exposure to them to increase cancer risks. Unarguable statistics (NOT rodent testing) have shown that the coal tar that chimney sweeps are exposed to daily can cause cancer of the scrotum. If inhaled over a long period, asbestos fibers increase the risk of lung cancer and lung disease. Ditto for uranium miners who inhaled lots of silica dust and radon down in the uranium mines in the 1950s and 60s. Mustard gas can cause cancer in doughboys (WW I) and Saddam’s victims, if the dose doesn’t kill them first. But the absolutely most dangerous, for sure, human carcinogen of all is something that each and every one of us is exposed to almost every day of our lives (unless you live in Seattle anytime, or San Francisco in the summer). It’s called sunlight. You want some type of skin cancer? Hang out in the sun as much as possible all your life, don’t use sunblock, and look really tanned and beautiful.

But let’s put this stuff in perspective. Asbestos fibers and silica dust need to be inhaled often, over a period of many years, to increase cancer risk. Asbestos sitting in your attic as fire-retardant insulation is no danger to anyone unless you insist on stirring it up and breathing it every day. Silica is sand, so are you afraid to go to the beach? You don’t inhale sand, but if you’re drilling down in a mine and stir up lots of silica dust and inhale it, you definitely increase lung cancer risk over the years. Even though long-time smokers have a risk for lung cancer 10-15 times higher than non-smokers, 85 to 90% of those smokers never get lung cancer, although they may not be able to climb a flight of stairs without panting. And consider sunlight (the ultraviolet portion of it). If the EPA could regulate our exposure to UV light using the same criteria it does for all the various pesticides and food preservatives that are carcinogens in rodents, we would all be mandated, like vampires, to stay indoors during daylight hours. We could not go out in daylight without using SPF 200 sunblock, while wearing head-to-toe clothing and big floppy (government approved) hats! Hawaii would be off limits to humans! I’m not making this up, folks! So the next time you visit the tanning parlor or lie out on the beach, be absolutely sure to avoid eating any snacks containing those evil preservatives!


Are you still not convinced that the hundred and fifty or so chemicals found to be carcinogenic in rodents really aren’t all that worrisome for us humans at real-life levels of exposure?

Remember how Congress mandated, in the Delaney Clause (1958), that the amount of any synthetic chemical in foods must be zero if it caused cancer in rodents? It was not known then that virtually all the carcinogens in our environment are natural, and that many (probably all) of the foods we eat contain thousands of different chemicals, some of which are rodent carcinogens. Apples, bananas, basil, cabbage, citrus fruits, mushrooms, turnips, and so many more foods we eat all contain chemicals that cause cancer in laboratory rodents at huge lifelong doses. Broccoli, for example, is known to be protective against cancer in humans when eaten at realistic levels over your lifetime, probably because it contains high levels of various natural antioxidants. It also contains at least four different possible human carcinogens according to the wonderful rodent testing I described above. If you ate 10 or 20 pounds of broccoli every day of your life, you just might increase your risk of some cancer or other. At an American Chemical Society meeting, where I chaired a symposium (and gave a great talk myself, of course debunking the whole rat model), I had lunch with two FDA guys. As we ate our carcinogen-loaded broccoli, I asked them: “What if broccoli were a new food that no one had ever eaten before, and you guys had to review it for approval as a food additive?” They admitted that it could not be approved under the present FDA rules for food use! I’m not making this up, as Dave Barry would say.

Plants have evolved defense mechanisms against attack by various bug, animal, and bacterial predators. These include “natural” toxic chemicals and pesticides. For example, there are many mushrooms you do NOT want to eat. I know what some of you are thinking, because I’ve been through this with various friends and enemies who all refuse to listen to anything that goes against their green religion. You think that these “natural” chemicals in our foods are safe because us humans have developed an immunity, over hundreds of thousands of years, to these natural chemical toxins and carcinogens, but not to the relatively new synthetic food additives, herbicides, and pesticides. That argument is unequivocally, without any doubt, completely wrong. “Artificial” food additives (preservatives) are chemically very similar to the natural antioxidants and bioflavonoids found in vegetables, and everyone knows those chemicals are good for you. I know this because I am a brilliant biochemist who helped elucidate just how us humans metabolize the chemicals that in the 1970s and ’80s we called “xenobiotics”, i.e., chemicals imbibed from eating, drinking, and breathing that are not native to our bodies.


In the 1960s, very little was known about how all those thousands of xenobiotic chemicals we absorb into our bodies after we eat are metabolized. Now we know exactly how it’s done. All the foods we eat are broken down in the digestive tract into constituent chemicals that are then absorbed from the small intestine and dumped into our wondrous liver (of all our organs, only the brain is more wondrous). If our liver decides we can use these chemicals (e.g., amino acids from proteins, sugars from starches, vitamins, etc.), they go into the bloodstream and travel to whatever part of the body needs them. The thousands of other chemicals that our liver decides we don’t need, such as drugs and all the chemicals responsible for the flavors and odors of foods, are subjected to enzymatic activity that readies them for elimination from the bloodstream through the kidneys into the urine. If you feed a rodent or a human any chemical, “natural” or “artificial”, it will be metabolized in the same way. You can detect the metabolites in the urine, and that’s that.

We will finish this essay with two examples of carcinogen BS. They will either leave you thinking that I am a right-wing extremist moron because you yourself are a left-wing extremist moron, or your possession of normal common sense will convince you that I am absolutely right and that you were really misinformed all your life about these matters:

Benzo[a]pyrene: This chemical is a potent carcinogen in rats. It is created in the very tasty browned exterior of meats during roasting and grilling. If you like BBQ and roasted meats of all kinds, you will ingest lots of the rat carcinogen BaP over your lifetime (often while sitting in the carcinogenic sunlight at an outdoor BBQ without any sunblock lotion). Should you never eat roasted meats again? Here is how research I did in the late 1970s helped make me feel the way I do today about all these so-called carcinogens in the food supply and environment.

If you conduct an experiment in which rats are fed a daily diet containing BaP at the huge levels talked about above, then cancer rates in those rats will be higher than in a comparable group of rats fed a normal rat diet. But here’s the rub. Unknown to biochemists in the 1960s, a family of remarkable enzymes in the rodent liver (and in this case rat livers and human livers are similar) chemically alters the BaP so that it is easily excreted from the bloodstream into the urine, and out it goes harmlessly. One big problem for the poor lab rats, however, is that there is WAY too much of the altered BaP to be excreted all at once. So it circulates around in the blood and goes through the liver a second time. It is chemically changed again; some goes out harmlessly, but we still have overload. The un-excreted remainder goes through the liver a third time and is chemically changed again, and then a fourth (still harmless) time! None of this would happen in a lifetime of eating your beloved BBQ, with its enormously lower levels of BaP. Once through and out is the story for BaP in our livers at real-life exposures! But finally, for the overdosed rat, on the fifth pass through its overtaxed liver, a VERY potent carcinogen is created that can react with DNA and initiate the cancer process by mutating a susceptible gene in some organ or other. If you want to know, it is a diol epoxide (us chemists have to use some jargon sometimes to sound intelligent). So, does common sense say anything about “the dose makes the poison”?

Finally, consider dioxin, the poster chemical of environmental contamination. The EPA once called dioxin the most potent carcinogen ever! It gained fame after the Vietnam war because it was a contaminant in the herbicide “Agent Orange”, widely sprayed for defoliation. Since then, there have been zillions of claims from veterans of that war (which I didn’t like at all, and I even marched against it as a good Berkeley hippie!) that exposure to it caused increased risk of various diseases, including cancer. Subsequent studies have found no correlation between dioxin exposure and any disease, but many special interest groups still believe it is total environmental evil. What we didn’t know then, but know now, is that dioxin is a natural, ubiquitous chemical in the environment. Every time wood is burned in your fireplace, every time there is a forest fire, dioxin is created and spreads all over the place. If I specifically look for dioxin in virtually any food we eat, using today’s analytical techniques, I will find some. And it will be there at higher levels than the EPA considers safe. This is a really big “so what”, because the levels the EPA thinks are safe for humans are about a gazillion times lower than the levels that might actually be dangerous, except to male rats. Remember, those EPA “safe” levels are based on rodent studies and include a huge margin of safety.

Here is a real-life example: In 1976, a catastrophic explosion occurred at a chemical plant in Seveso, Italy. Literally tons of noxious chemicals, including huge amounts of dioxin, were spewed into the air, only to settle into the soil and into people’s bodies. Those workers who weren’t killed by the explosion but were heavily doused with dioxin developed a severe form of acne that lasted for several weeks or months and then healed. That’s it! This was when various environmentalists were saying that a gram of dioxin could kill millions of people! Twenty-five years later, according to the most recent review of cancer mortality among Seveso residents, there have been no significant increases in overall cancers in the general population. In fact, it looks as if dioxin protects against breast cancer in women. That is probably just a statistical fluke, but who knows when you are dealing with statistics? Ask Viktor Yushchenko, the new leader of Ukraine, how toxic dioxin is. Before the recent democratic elections in his country, his enemies tried to poison him by giving him a huge dose of dioxin. They believed the “conventional wisdom” of the world’s “green” groups that dioxin is really, really toxic. They could easily have killed him with traditional poisons such as arsenic, cyanide, or ricin (a truly potent “natural” poison from castor beans). Instead, they gave him a really bad case of acne, from which he will recover, never needing to worry about any future cancer, and be able to lead his nation on to greatness. There will be a part two of this carcinogen discussion in the future. Just keep visiting EcoWorld!

About the Author: Edward Wheeler, Ph.D., is a very old biochemist who actually conducted pioneering cancer/nutrition research in the 1970s and ’80s for the U.S. Dept. of Agriculture. He authored some really, really good research papers in journals such as Cancer Research and others. So he really, really knows what he’s talking about!!! Wheeler’s earlier essays from the “Monsters in the Closet” series are Bring Back DDT? and GMOs, Salvation or Monsters?


Free Market Environmental Protection

Futuristic artwork by Tim Cantor

MATCHING THE PROBLEM TO EACH LEVEL OF GOVERNMENT


When the twenty-fifth anniversary of the first Earth Day arrived this past spring, there was much to feel good about. Significant strides have been made in protecting the earth’s biological envelope, and environmental awareness has surely been heightened. But there was also a sense of foreboding. News comments on the occasion included expressions of concern that pending legislative reforms, such as compensation for regulatory takings, the imposition of limited cost-benefit and risk analyses, and revisions of the Clean Water Act, would roll back 25 years of progress. Some even suggested that were it not for federal laws, there would be no environmental protection at all, implying that communities nationwide would stand idly by as their home territories became open sewers.

Though unreasonably costly and at times misguided, the threatened federal regulation has nonetheless provided a massive mechanism for protecting environmental quality. Yes, the air is cleaner than it was 25 years ago, more miles of rivers are swimmable, and lots of contaminated soil has been hauled from Superfund sites. But change is in the works. Even EPA has admitted that the priorities for action forced on the agency do not match the real needs for reducing environmental risks and improving air and water quality. Meanwhile memberships in national environmental groups seem to be peaking. By contrast, grass roots activities are burgeoning.

Surely there is room for improvement. We know that no other industrialized country has a Superfund program as costly and ineffective as ours. No other advanced economy relies as heavily on centralized command and control regulation, and practically every other industrialized nation is light years ahead in organizing river basin associations and similar decentralized organizations for managing water quality.

Today, the U.S. alone seems trapped in a regulatory pit that emphasizes uniform command-and-control regulation, penalties and confrontation, while paying scant attention to outcomes. In short, we rely on an antiquated, inflexible, top-down regulatory structure that was designed for a 1970 smokestack economy. Our command-and-control system was once justified by its ability to assemble and apply costly and scarce information. Yet we live in a new global age where microchips, low-cost information transmission and decentralized decision making rule the day.

Think of the changes that have transpired worldwide in the last quarter century while the basic regulatory blueprints hardly changed at all. When our basic environmental statutes were being designed, there were no fax machines. There were no personal computers. There was no Internet or e-mail to give instant information on polluter behavior. There was no CNN to inform the world of environmental calamities. Indeed, there was no way to monitor stack emissions, so that one could write emission trading contracts. There was no low-cost technology for reading tailpipe emissions from automobiles or for scanning the multiple dimensions of water quality. Smog was a mystery. And the notion of continuous monitoring of environmental use by satellites was just a gleam in some scientist’s eye. In the early 1970s there was no International Standards Organization responding to marketplace demands for a higher quality workplace and giving private sector certification of environmental control. Today, all this and more is as routine as self-rising flour in grocery stores.

Now consider some of what we have learned about environmental hazards from 25 years experience in dealing with pollution. We now know that industrial discharge is not the source of most carcinogens and other environmental hazards. Indeed, we know that industry today ranks well below government operations and nonpoint sources as the major source of pollution. We have learned that the threat of acid rain is not critically associated with sulfur dioxide emissions, that life expectancies are increasing and that forests are flourishing.

Over the course of the last two decades, we have learned that the demand for environmental quality that comes with rising incomes is a powerful force for delivering environmental improvements, one that will not be denied. Factories, farms, municipal treatment plants, and federal government facilities cannot get by with haphazard treatment of the environment. CNN, the Internet, e-mail and desktop publishers will tell. And ordinary people, the ones we see each day in the market, will bring actions to protect their environment. In other words, there is no danger of seeing an unconstrained “race to the bottom.” Now is the time, I believe, to replace costly smokestack regulation with the beneficial forces of the market.

FREE MARKET ENVIRONMENTAL PRINCIPLES

Just what are these beneficial market forces? And what steps can we take to build strong linkages between environmental protection and the actions of millions of uncoordinated people? Is it possible for people to do the right thing without even thinking about environmental quality? First off, the market process reflects social norms and rules of just conduct embodied in law and custom. Market action is driven by prices that emerge when some people own things that others want. For example, when investors have to give up something they value to obtain a site for building a factory, they are suddenly struck with a conservation ethic. Because they must pay, they carefully consider just how much land they really need. They shop. They search. They become creative. Markets and prices provide incentives. And if factory builders must pay the holders of environmental rights before building or polluting, or suffer the consequences, the investors will work even harder to avoid environmental harms. When environmental rights are protected, environmental protection becomes consistent with profit maximization.

The definition and protection of property rights form the keystone of the market process. If factories and municipal treatment plants are told that they can pollute within certain limits if they have an EPA permit, then each time we get another certified plant we will get more polluted air and rivers. And if regulators tell industry officials how to design plants and give their stamp of approval only when the specified technology is in place, then industry has no incentive to search for lower cost technologies and less damaging sites. If owners of downstream property lose their right to sue and stop polluters who damage their property, as our statutes provide in some cases, industry incentives are blunted even further.

If cities and states are able to use political powers to locate incinerators and landfills without compensating neighboring property owners for damages, then environmental justice becomes an issue. If, in the pursuit of protecting endangered species or sensitive wetlands, government can take property rights without paying, then property owners will have incentives to destroy endangered species and to plow under wetlands before they are observed. On the other hand, if property owners are paid for harboring endangered species and for converting land to public use, then wetlands and endangered species will be more secure. The incentives generated by markets, prices, and property rights can replace a large number of environmental policemen.

MODIFYING THE STATUTORY FRAMEWORK

How might we apply these market principles to the existing statutory framework? First, we might attempt to modify the rules. With few exceptions, the legislative blueprints for regulating air and water pollution require command-and-control regulation that results in end-of-pipe, technology-based standards. Generally speaking, regulations focus on inputs, not outcomes. There is a fixation on point sources — individual machines and in-plant processes. Hardly any attention is directed toward ecological systems, like river basins, complete aquifers and watersheds. The rules leave little room for innovation and can actually penalize firms that devote attention to unidentified but important sources of pollution while overlooking smaller but specified sources that are listed in the rules.

Put in the simplest terms, air and water pollution control from stationary sources is based on national or large geographic area standards that call for uniform approaches to be taken by broad categories of polluters. There are new-source/old-source standards that penalize expansion of plants and protect existing firms from competitive entry. There are stricter standards for particular regions. And there is little focus on overall outcomes. Indeed, after 25 years, we are just now becoming serious about the provision of extensive monitoring data that inform us about relative improvement.

How might we modify the existing blueprint? Allow me to offer a few suggestions.

1) Ditch command-and-control regulation in favor of performance standards. Identify specific goals to be achieved and allow managers to figure out how to meet them. Let flexibility and the desire to minimize cost become the guiding principles. Let the regulators focus on outcomes.

2) Eliminate new-source/old-source biases. Let competition work at all margins.

3) Establish a complete national emissions monitoring system and, by statute, provide a detailed annual report card that gives reliable data on the environment, by pollutant, river, major lake and stream.

4) Expand emissions and effluent trading to encompass all criteria pollutants. To facilitate emission trading between and among facilities with diverse ownership, provide baseline data and systems analysis so that all contracts can be written and enforced. (A toy cost illustration follows this list.)

5) Require experiments with river and airshed management systems where all rules for plants and other individual sources are relaxed and the river or airshed is given a measurable environmental goal to be achieved. Include point and nonpoint sources of pollution in the basinwide approach.
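
To show why the trading in suggestion 4 lowers costs, here is a minimal Python sketch with invented numbers: three hypothetical firms face the same total cap, but their cleanup costs differ, so letting the cheap abaters sell cuts to the expensive ones hits the target for far less money.

    # Toy cap-and-trade arithmetic with invented numbers: three firms,
    # each emitting 100 tons, must together cut 150 tons.
    firms = {"A": 10.0, "B": 40.0, "C": 100.0}  # marginal abatement cost, $/ton
    emissions = {f: 100.0 for f in firms}
    target_cut = 150.0

    # Command-and-control: every firm cuts the same 50 tons.
    cac_cost = sum(50.0 * cost for cost in firms.values())

    # Trading: the cheapest abaters cut first until the cap is met.
    cuts, remaining = {}, target_cut
    for f, cost in sorted(firms.items(), key=lambda kv: kv[1]):
        cuts[f] = min(emissions[f], remaining)
        remaining -= cuts[f]
    trade_cost = sum(cuts[f] * firms[f] for f in firms)

    print(f"command-and-control: ${cac_cost:,.0f}")   # $7,500
    print(f"with trading:       ${trade_cost:,.0f}")  # $3,000

Same 150 tons removed either way; trading simply assigns the cuts to whoever can make them cheapest, which is the whole case for focusing regulation on outcomes rather than inputs.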

For hazardous waste, better known as Superfund:

1) Make Superfund a public works project with, for example, a 75/25 percent local/federal sharing of cost, with the proportion based on the expected share of benefits.

2) Eliminate strict joint and several liability as the basis for recovery of costs. Replace it with joint and several liability only for polluters who violated the laws in effect at the time waste was created at a site.

3) Involve states and communities in establishing a triage system that, based on intended use, identifies sites that should be cleaned, those that should be paved or fenced and guarded, and those that should be monitored for potential cleanup.

4) Eliminate drinking water as the standard to be met from a cleaned Superfund site. Replace with ambient standards accepted by a population for their rivers and lakes.

SUGGESTIONS FOR A CLEAN START

So much for tinkering at the margins of smokestack regulation. Starting with a clean slate calls for careful examination of the fundamental role of government. We must ask 1) what is the purpose of government in this context? and 2) what level of government should act? Under a market approach, government has the responsibility of protecting citizens from the harm of others, protecting property rights, and reducing transaction costs among parties who seek to solve environmental problems. These duties relate to governing at all levels — the community, the state, the nation, and issues that cross national boundaries. As to the level of government, we should consider the dimensions of the environmental problem being addressed, which is to say the extent of the harm that might be generated by a polluter or group of polluters.

A classification of environmental problems — activities that impose harm on other people, their property or things they prize — will surely yield a matching of some problems with each level of government. For example, a hazardous waste site that does not pose a risk to an aquifer is surely a local problem. One that contaminates a multistate aquifer will require another level of government. Air pollution that imposes costs within a state’s boundaries is one thing; pollution that imposes costs across a region or across a national boundary is something else. Making such a list will carry us some distance in identifying the appropriate level of government for protecting property and reducing transaction costs among parties who seek to resolve pollution problems.

Decentralizing the problem breaks the national monopoly, generates a multitude of experiments, and allows citizens to vote with their feet. All evidence suggests that the costs of environmental improvements will fall, that those who have the greatest incentive to address environmental concerns will have a larger voice in determining outcomes, and that those who reap the benefits will bear the costs.

But what about the mechanisms that might be used? A review of the nation’s history and consideration of the experience of other countries inform us of alternatives. For several hundred years, environmental rights were protected by the common law of nuisance and trespass. Cases involving water, air, odor and hazardous waste pollution were adjudicated in common law courts. Where many receivers of pollution were involved, citizens turned to public prosecutors and the public nuisance law. Where pollution crossed state lines, a state attorney general brought suit on behalf of citizens. Just prior to the enactment of the federal statutes, a federal common law was emerging for protecting environmental rights among the states. To a large extent, that fertile field of control was snuffed out by legislation.

We cannot know how common law would have evolved in the absence of monopolized regulation. Perhaps specialized courts would have developed, with special masters dealing with highly technical issues. Like statute law, common law did not work perfectly, but the remedies were tough — injunction and damages. Of special importance, only those who could demonstrate harm or potential harm could bring action. A passerby had no standing to bring suit against a polluter whose activities were legitimate in the eyes of the courts and community members.

The common law process has something else to recommend it. It is impossible for a special interest to lobby all common law judges and obtain uniform rulings across the entire country. Put differently, it is very costly to seek rents through the courtroom.

But does common law really work? Today, practically all of the freshwater fisheries in the United Kingdom are owned by private parties, and have been for generations. The property of the angling clubs is protected by common law. If a polluter, be it a city, an industry, or a government enterprise, damages a fishery, the angling clubs bring suit. In the last 20 years, some 2,000 suits have been brought. Five have been lost. The result: the rivers of England are clean, in some cases cleaner than drinking water standards require. Market forces can protect water quality.

Prior to the formation of the EPA, multistate compacts and commissions were at work dealing with water and air pollution. Ordinary people with good sense recognized that the dimensions of environmental problems did not necessarily coincide with the dimensions of existing political bodies. The Ohio River basin commission, which included the states of New York, Pennsylvania, West Virginia and Ohio, was one of the better known operating systems. Through the joint effort of several states, the Ohio River was cleaned, continuously monitored and managed. At the time of the EPA’s founding, there were discussions underway to expand the Ohio River system to include states that bordered the Tennessee River. Quite possibly, river basin associations and multistate compacts would have eventually encompassed all major rivers. Federal legislation ended all that.

Today, in France every major river, its tributaries and its coastal zones are managed as six basin systems with independent governing bodies that manage discharges to and withdrawals from all bodies of water. Major rivers in Germany, Scotland and Australia are similarly managed. Efforts are underway in Mexico, Ecuador, Brazil and Canada to develop similar approaches. There is no command-and-control. Water users are given performance standards to meet. In some cases, discharge and withdrawal fees are imposed. In Australia, pollution rights are traded to minimize the cost of controlling salt infusion.

One such experiment is now operating in the U.S., just one. It encompasses North Carolina’s Tar River, where point and nonpoint dischargers are working together in a river basin association. EPA’s command-and-control regulations are relaxed. As a result, the Pamlico Sound, the receiver of the Tar’s waters, is recovering. Costs are reduced dramatically for members of the association.

A review of history and current experience suggests the following:

1) Levels of governmental control and assistance should match the dimensions of the environmental problem being addressed. In some cases, new governing bodies will need to be developed. Regional compacts will be needed. Enabling legislation that forms river basin associations may be required. The national government should focus its attention on environmental problems that are truly national in scope. In all such cases, the national government should focus only on setting performance standards, enforcing those standards, and reporting on progress. There should be no command-and-control regulation for stationary pollution sources.

2) States should be empowered to manage environmental quality within their boundaries with the means and instruments not specified. Multistate participation and river basin associations should be assisted by the national government. States should be left free to invigorate common law remedies or any other control instrument they desire. The emphasis should be on outcomes, not inputs.

3) When the federal government regulates, as in the case of air pollution that affects several states or in the case of mobile sources, the regulatory agency should be required to justify all actions on the basis of scientifically based risk assessment and cost effectiveness. The regulatory agency should be required to report annually on the status of each action taken, its justification, and measured effects.

4) The national government should strengthen its research and development activities and make its expertise available to other governing bodies on a fee basis. The national government should strengthen its role in monitoring environmental quality and regularly provide scientific evidence on the condition of the environment.

There is a role for government in protecting environmental rights and in protecting people from environmental harm. As described here, it is a role that supports property rights, markets and competition. Until now, too much attention has been focused on procedures, process, inputs and on criminalizing the innocent behavior of citizens. As a people, we have chosen a high cost, low result route, and we have learned a lot. Indeed, we should know more about environmental regulation than any other people on the face of the earth. Now is the time to rethink what we are doing, to learn from our own experience and that of others, and to make the 21st century a time when we can truly take the environment seriously.

