
Canadian Scientists Claim 'Muzzling'

OTTAWA, Sept. 13 (UPI) — Canadian scientists say the country’s government is trying to muzzle them, controlling what they say and who they talk to.

Scientists with Natural Resources Canada say they were told this spring they would need “pre-approval” from the office of Minister of Natural Resources Christian Paradis to speak with national and international journalists, Postmedia News reported.

Documents show the rules apply not only to contentious issues but benign subjects, such as floods that occurred 13,000 years ago, Postmedia said.

Under the rules, critics say, Canadians are being cut off from scientists whose work is financed by taxpayers and is often of significant public interest on issues like fish stocks, genetically modified crops or mercury pollution in the Athabasca River.

“We have new media interview procedures that require pre-approval of certain types of interview requests by the minister’s office,” Judy Samoil, NRCan’s communications manager, wrote in an e-mail to colleagues.

The policy applies to “high-profile” issues such as “climate change, oilsands” and when “the reporter is with an international or national media organization,” she wrote.

The ministry defended the new rules.

“The minister is the primary spokesperson for Natural Resources Canada. As such, he needs to be made aware of issues in the media which involve the department so he can effectively fulfill his role,” a statement from the minister’s office said.

“It’s Orwellian,” said Andrew Weaver, a climatologist at the University of Victoria.

“The sad reality is that these guys in Ottawa think federal scientists work for them. They don’t; they work for the people of Canada,” he said.

“This is science funded by Canada for the public good. It is not science funded to produce briefing notes for ministers so they can get elected in the next federal campaign.”


UPI NewsTrack Health and Science News

Broadband used in ocean images

FALMOUTH, Mass., April 2 (UPI) — Two advanced broadband acoustic systems will help oceanographers pinpoint tiny zooplankton even in rough seas, scientists in Massachusetts said.

In terms of acoustics, the systems are equivalent to changing from black-and-white television to high-definition color television, researchers at Woods Hole Oceanographic Institution said in a release Thursday.

The systems showed highly promising results when tested recently off the U.S. East Coast, researchers Tim Stanton and Andone Lavery wrote in the Journal of Marine Science.

The new systems measure sound scattering over a continuous range of frequencies, rather than just a few frequencies.

The advanced broadband ability will aid oceanographers in classifying fish and zooplankton, no matter the turbulence of the water, Stanton and Lavery said. Previously, it was impossible to know whether sound waves were being scattered by high turbulence or by the objects being studied.

Stone Age Scandinavians lactose intolerant

STOCKHOLM, Sweden, April 2 (UPI) — Stone Age hunter-gatherers who lived along the southern coast of Scandinavia 4,000 years ago were unable to digest milk, researchers said.

The findings support a widely held theory that modern Scandinavians descended from people who arrived in the area after the Stone Age population.

Unlike modern Scandinavians, the DNA of the hunter-gatherers shows they were lactose intolerant, said researchers at Stockholm University and Uppsala University.

The ability to digest milk is closely associated with the transition from hunter-gatherer populations to agricultural societies, said researcher Anders Gotherstrom.

Gotherstrom is the coordinator of a European Union-funded project focusing on the significance of milk in European prehistory.

“The findings are indicative of what we call ‘gene flow,’ in other words, migration to the region at some later time of some new group of people, with whom we are genetically similar,” Gotherstrom said.

Worm gene could offer clues to human aging

BIRMINGHAM, England, April 2 (UPI) — A gene greatly involved in determining the life span of a laboratory worm could offer clues to aging in people, researchers in Britain said.

The gene DAF-16 helps determine the rate of aging and the average life span of Caenorhabditis elegans and its close evolutionary cousins, researchers at the University of Birmingham said.

The gene is found in many other animals, including humans, said researcher Robin May, who led the study.

May’s team compared longevity, stress resistance and immunity in four related species of worms. In general, high levels of DAF-16 activity correlated with longer life, increased stress resistance and greater immunity against some infections.

Future studies will examine how DAF-16 coordinates a complex network of genes to balance the differing needs of an individual’s immune system over time, May said in Thursday’s edition of PLoS ONE.

Appeals court sides with scientist

LONDON, April 2 (UPI) — A British court says science writer Simon Singh can use what is known as “defense of fair comment” in a libel case brought against him by chiropractors.

The landmark ruling by the Court of Appeal could exempt scientific criticism from claims of defamation by companies or organizations, The Times of London reported Friday.

Singh is being sued by the British Chiropractic Association for a piece he wrote in April 2008 suggesting there was a lack of evidence that chiropractors had successfully treated colic and asthma in children.

The appeals court Thursday overturned a High Court ruling that held Singh’s comments were factual assertions, not expressions of opinion that would allow him to use a “defense of fair comment” to defend himself.

The appeals court said the original ruling threatened to silence scientists and science journalists. The chiropractic association said it may appeal Thursday’s ruling to the Supreme Court.


End of Freezing in Many States by 2099?

PRINCETON, N.J., March 31 (UPI) — A U.S. organization called Climate Central says its preliminary figures show average future U.S. March temperatures will be higher because of climate change.

The organization’s scientists say they have just published an interactive map showing what various states might expect during the third month of years to come.

Climate Central said its map uses special high-resolution projections covering the lower 48 states to show where average March temperatures are expected to be above or below freezing each decade this century. The group’s maps also compare projections under a reduced-carbon-pollution scenario versus a high one that extends current trends.

The scientists say their findings show a substantial or complete loss of below-freezing zones by the end of the century in every state analyzed.

“Minnesota, Montana and North Dakota would lose the most total below-freezing area, while seven other states — from Arizona to Wisconsin — are projected to lose all they currently have,” the scientists said in a statement.

“These maps imply future changes the research community is only beginning to appreciate,” said Climate Central scientist Ben Strauss.

Climate Central says it is an organization of scientists and journalists “dedicated to communicating the best and latest climate science.”

The maps are available at http://www.climatecentral.org/.


Oil Has Major Impact on Russia's 2010 Budget

WASHINGTON, Sept. 23 (UPI) — The Russian government has approved draft legislation for the 2010 federal budget and projected provisional revenue and expenditures for 2011-2012.

RIA Novosti reported Wednesday that Deputy Prime Minister and Finance Minister Aleksei Kudrin told journalists, “The budget has been approved and will be submitted to the State Duma by October 1.”

The dramatic slump in global energy prices has severely impacted the economy of Russia, the world’s second-largest oil exporter. Under the draft law, budget revenue in 2010 will total $232.1 billion, or 16.1 percent of GDP. According to the draft budget, oil and gas revenue is to grow 14.2 percent to $106.6 billion, while non-oil and gas revenue is projected to decline 4.1 percent to $125.4 billion.

According to Kudrin, in 2010 overall government expenditures will increase over 2009 levels despite a 20 percent reduction in spending on state management. The draft budget is based on conservative forecasts for Russia’s 2010 socioeconomic development and the projected 2011-2012 time frame.

The budget uses a baseline prediction of Russian Siberian Urals oil trading on the global market at $58 per barrel in 2010, $59 in 2011 and $60 in 2012.


Urban Cold Islands

In a story today in the Los Angeles Times entitled “To slow global warming, install white roofs,” author Margot Roosevelt reports on a recent study that concludes, if you take it at face value, that all we have to do is paint all of our urban rooftops and pavements white and “the global cooling effect would be massive.”

“According to Hashem Akbari, a physicist with the Lawrence Berkeley National Laboratory, a 1,000-square-foot roof — the average size on an American home — offsets 10 metric tons of planet-heating carbon dioxide emissions in the atmosphere if dark-colored shingles or coatings are replaced with white material… Globally, roofs account for 25% of the surface of most cities, and pavement accounts for about 35%. If all were switched to reflective material in 100 major urban areas, it would offset 44 metric gigatons of greenhouse gases.”

It would be interesting to understand exactly what Akbari means by this. Is this 44 metric gigatons per year? If not, over what period of time would reflective roofs collectively offset these 44 gigatons? Considering all the nations of the world combined are still emitting somewhat less than 30 gigatons per year, this is a very impressive statistic, no matter how you slice it.
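The arithmetic is worth checking. Here is a quick back-of-the-envelope sketch in Python, using only the numbers quoted above; the interpretation, per year or in total, is exactly what the article leaves open:

```python
# Back-of-the-envelope check of the figures quoted above. All inputs are
# numbers from the article; the open question is the time period.

OFFSET_PER_ROOF_TONS = 10.0           # metric tons CO2 per 1,000 sq ft white roof
CLAIMED_OFFSET_GT = 44.0              # gigatons claimed for 100 major urban areas
GLOBAL_EMISSIONS_GT_PER_YEAR = 30.0   # rough annual global emissions cited above

# How many 1,000 sq ft roof-equivalents does the claim imply?
roof_equivalents = CLAIMED_OFFSET_GT * 1e9 / OFFSET_PER_ROOF_TONS
print(f"Roof-equivalents implied: {roof_equivalents:.1e}")   # 4.4e+09

# The claim expressed in years of total world emissions:
print(f"Years of global emissions: {CLAIMED_OFFSET_GT / GLOBAL_EMISSIONS_GT_PER_YEAR:.1f}")
```

Either way you slice it, the claimed offset equals roughly a year and a half of everything humanity emits, which is why the time period matters so much.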

Dr. Akbari’s study isn’t what one should necessarily question, however. We have always insisted the role of land use changes is greatly underestimated when assessing regional climate trends, from tropical deforestation to enhanced thermal absorption due to aquifer depletion to urban heat islands. More pertinent is this question: why has the urban heat island effect, long dismissed in the press as a Crichtonian fabrication, never had credibility, yet suddenly the global cooling potential of urban cold islands is cause to legislate? Read the IPCC’s 4th Summary for Policymakers (watch out, it’s 3.6 MB) and you will note the role of land use in causing global warming is minimized, and the role of urban heat islands is treated as negligible. Can you have it both ways?

Another interesting paradox is the report’s statement that “Globally, roofs account for 25% of the surface of most cities, and pavement accounts for about 35%.” Well, maybe if we let people have yards again instead of cramming them into cluster homes, there would be enough land for people to plant trees and create an urban canopy. And if our cars are all soon to run on wind and solar power, maybe we should quit trying to force people out of them and into government-operated light rail, buses, and “jitneys.”

The real take-away here is, once again, that there is very little certainty regarding the causes, the severity, or even the direction of climate change. The rhetoric and the conventional wisdom are well behind the latest science and observational data. The policymakers and pundits who have ridiculed the notion of an urban heat island are the same people who are now uncritically reporting we must make every road and roof reflective to mitigate this heat island. There’s nothing wrong with making rooftops reflective to save energy, but does every sensible green product have to make avoiding doomsday part of its marketing and lobbying strategy?

Climate change is not a trivial issue, and concern about climate change is nothing to be mocked. But consider the people who condone this alarm because they like the side effects: bigger government, more funds for environmental groups, nonprofits and academia, more taxes so the public sector can avoid fiscal reform, more subsidies and regulations so large corporations can crush emerging competitors, and greater energy independence (the only good side effect on that list). Remove them from the alarmist coalition and you aren’t left with much. At the least, journalists and scientists should recover their innate skepticism, the lifeblood of their professions, and not abdicate their responsibility to point out this contradiction: the IPCC dismisses the heat island effect, yet today’s latest scientific study claims that if we made our cities reflective “the global cooling effect would be massive.”


Unionizing Silicon Valley?

Does that get your attention? It should, because when that happens, Silicon Valley will become the sister city of Detroit, the only difference being that Detroit gave way to union power fifty years earlier and is still paying the price. Silicon Valley is a meritocracy, and as long as it stays that way it has a chance to maintain its high-tech dominance.

Unions in 21st Century America are not nearly the same creature they were fifty years ago. Back then unions legitimately fought for rights and benefits that have now largely become institutionalized – safe workplaces, reasonable work hours, competitive pay. Back then American heavy industry enjoyed nearly a monopoly position, and as a result businesses such as the Detroit automakers could afford to grant generous concessions to unions – including pension benefits whose financial sustainability relied on the assumption Detroit’s factories would always be hiring more people than they were retiring. When the world caught up with America, and the big three automakers no longer could experience annual growth in revenue and employees, the cost of these pensions became burdens that crippled them.

The only true monopoly left in America today, however, is the government. In the public sector there is no global competition, and there is no problem growing revenues, since raising taxes and fees raises revenue. As a result, unions that had ruinously wrung every dime out of America’s automakers, nearly killing them in the process, and that saw no percentage in trying to organize, say, Walmart employees, have taken over the public sector. And the pension debt now carried by public entities is the biggest liability, by far, most of them will ever face. Any politician who questions this reality is crushed by public sector unions, which collect mostly mandatory dues from millions of public employees and deploy this money to exercise nearly absolute control over elections at the state and local level.

At the same time as union power has shifted from private industry to the public sector – because, sadly, it is easier for unions to control the public sector – the financial epicenter of union power has become the pension funds that manage all the wealth confiscated from taxpayers (who have to retire on Social Security in their 60s) to provide for unionized public sector workers who retire in their 50s. These public employee pensions have become so generous that, in most cases, a private worker would have to save over a million dollars in a personal retirement fund for the annual interest during retirement to match the pension of even the lowest echelon of workers in the public sector. Since unionized public employees make far more in base pay than globalized private sector workers, amassing that million or more is problematic for most taxpayers, to put it mildly.
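The “over a million dollars” claim is easy to sanity-check. A minimal Python sketch, assuming a 4% sustainable withdrawal rate and a hypothetical $45,000 annual pension; both inputs are our illustrative assumptions, not figures from this article:

```python
# Illustration of the savings-vs-pension comparison above. Both inputs
# are assumptions for illustration, not figures from the article.

withdrawal_rate = 0.04      # assumed sustainable annual draw on private savings
pension_per_year = 45_000   # hypothetical pension for a lower-echelon public retiree

savings_needed = pension_per_year / withdrawal_rate
print(f"Savings needed to match a ${pension_per_year:,}/yr pension: ${savings_needed:,.0f}")
# -> $1,125,000, consistent with the "over a million dollars" claim
```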

It is in this context that CalPERS, the retirement fund that manages pensions for California’s public employees, and AFSCME, the American Federation of State, County and Municipal Employees, have launched a public relations assault on Larry Ellison, CEO of Oracle. Unlike ordinary workers in the public sector, Ellison didn’t simply show up four days per week for 25-30 years so he could retire a millionaire. Ellison’s billions were earned because he rose to the top in the business ecosystem of Silicon Valley, a place where merit still counts for something. A place where if an employee is productive they are rewarded, and if they are incompetent, they’re fired. A place wholly dissimilar to the unionized public workplace, where about the only way you can get fired is by being politically incorrect, and job security is furthered if you never solve problems.

CalPERS, along with other pension funds fueled by our taxes, has taken huge stakes in companies like Ellison’s Oracle and, as reported yesterday in the Los Angeles Times, has suggested it is time for Oracle’s board to adopt a “say on pay” plan, wherein Ellison, a hero who has helped keep America competitive and created jobs for over 84,000 people, will have to periodically justify his compensation package.

This is not a moral crusade, or even a genuine initiative; it is a public relations stunt, part of an ongoing attempt to keep voters focused on the ultra-high pay of a handful of extremely successful private sector executives, instead of on the obscenely inflated pay and benefits of literally tens of millions of unionized public employees. This sort of propaganda is infantile, playing to emotions of resentment and envy, and relying on the utter financial ignorance of most journalists as well as the general population. The venal reality is that public sector workers usually make 2-4x what private sector workers make, their retirement benefits are totally unsustainable, and instead of merging funds like CalPERS with Social Security, which would benefit the U.S. economy and protect the interests of ALL workers, they point the finger at people like Larry Ellison to cloud the issue.

Ultimately, public employee unions and their pension fund managers such as CalPERS will hopefully be smart enough to leave Oracle alone. After all, by investing in the globalized private sector, public employee pensions have a better chance (still remote) of remaining solvent. But voters and investors should understand that public employee pension fund influence on the boards of major corporations represents union influence – and once they control the boards, they will control the company. Instead of being organized from the bottom up, Silicon Valley risks becoming unionized from the top down.

For more read posts in our Public Sector Reform section. Also recommended is the website Pension Tsunami.


Green Car Components

There are dozens of credible companies rolling out next generation cars. From the GM Volt, now barely two years away, to the start-up Tesla Roadster, the list of companies aspiring to deliver the next generation car is growing almost as fast as the ranks of newly minted green journalists rushing to cover their progress. But what about the components?

Three interesting companies provide an encouraging glimpse into progress occurring upstream of the finished vehicle, all of them working on ways to dramatically improve the performance of the internal combustion engine.

In Camarillo, California, Transonic Combustion is developing technology that allows “operating conventional reciprocating piston gasoline engines at ultra-high compression ratios.” Through a combination of innovations (advanced combustion chamber geometries, “revolutionary” thermal management, precise ignition timing, and electronic valves), Transonic is designing an engine that will deliver extremely high fuel efficiency. Also significant, if they are successful, will be the ability of their engine to operate on fuel blends, including biofuels, at efficiencies superior to what those fuels deliver in conventional engines.

Meanwhile, in Michigan, the automotive capital of America, EcoMotors International is developing an engine that also aspires to deliver extremely high fuel efficiency. On its website’s home page, EcoMotors has a fascinating animation that shows the engine in action. In this design, the engine cylinders lie horizontally, and each cylinder essentially has two pistons, one moving backward and one moving forward. These horizontal cylinders are constructed in pairs, so that when one of them is in an expansion stroke, the other is in a compression stroke. The crankshaft is placed between the cylinders, and four sets of connecting rods turn the engine, one from the back of each piston, and one from the side of each piston that faces the crankshaft. Because every motion generated by the power stroke inside these cylinders is offset by a countermotion in this perfectly symmetrical design, far more of the power generated by the fuel combustion is passed on to the crankshaft, and far less material is necessary to construct the engine block. EcoMotors is a very interesting company.

Image: The Zajac engine, using conventional pistons with an external combustion chamber.

Returning to California, this time to San Jose, Zajac Motors is pioneering what is perhaps the most interesting twist yet on the internal combustion engine, if they can pull it off. The Zajac engine removes the combustion chamber from the cylinders altogether, relying on an external chamber to burn the fuel, then releasing the hot gas, through a complex set of electronic valves, into cylinders dedicated to the expansion stroke, while a set of smaller valves dedicated to the compression stroke supplies the external chamber with oxygen. If the Zajac engine works, it too will provide a leapfrog improvement in fuel efficiency, while the external combustion chamber will allow far cleaner burning.

The automotive world is being transformed today at a pace not seen since the dawn of personal transportation over 100 years ago.


Cellulosic Ethanol

WHAT IS IT, CAN WE MAKE IT COST EFFECTIVELY, AND WHEN?
Image: The pace car for the 2008 Indianapolis 500 ran on E85; the race cars burned 100% ethanol fuel.

Last month, for the first time in history, the cars racing in the Indianapolis 500 were fueled by pure ethanol. This should put to rest any concerns about ethanol lacking sufficient energy density to function as a motor fuel.

While the absolute amount of energy contained in ethanol is somewhat lower than gasoline – about 76,000 BTUs per gallon for ethanol compared to about 116,000 BTUs per gallon of gasoline – ethanol has higher octane, generally speaking 110 or more vs. 90 or less, allowing ethanol to run in higher compression, higher efficiency engines. A car optimized to run on ethanol can get comparable mileage to a car optimized to run on gasoline.
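As a sanity check, here is the mileage-parity arithmetic in a short Python sketch. The BTU and octane figures are the ones quoted above; the derived efficiency target is our own back-of-the-envelope reading, not a figure from the article:

```python
# Mileage-parity arithmetic from the figures above. The efficiency target
# is derived from them, not taken from the article.

ETHANOL_BTU_PER_GAL = 76_000
GASOLINE_BTU_PER_GAL = 116_000

energy_ratio = ETHANOL_BTU_PER_GAL / GASOLINE_BTU_PER_GAL
print(f"Ethanol carries {energy_ratio:.0%} of gasoline's energy per gallon")  # ~66%

# To match miles per gallon, an ethanol-optimized engine must be roughly
# 1 / 0.66 ~ 1.5x as thermally efficient; higher compression ratios,
# enabled by ethanol's ~110 octane, are how engines buy that back.
print(f"Relative efficiency needed for parity: {1 / energy_ratio:.2f}x")
```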

There are other concerns about ethanol: the notion that it takes more energy to manufacture ethanol than the energy value of the fuel itself, the suggestion that it isn’t “carbon neutral” after all, and the whopper, the accusation that ethanol production has diverted cropland from food production. All of these concerns have some validity, but they are shrouded in complexities that defy simple characterizations or easy conclusions. Yet simple characterizations are exactly what we have gotten. A few years ago, biofuel in general, and ethanol in particular, could do no wrong. Today the situation is reversed, and around the world the powerful media and environmentalist communities have, for the most part, turned on biofuel.

In many respects this awakening is healthy – when mandatory carbon offset trading in the European Community was subsidizing rainforest destruction in Southeast Asia to make way for oil palm plantations, something was clearly out of whack. But corn ethanol in the USA has drawn the most visible criticism. California’s Air Resources Board, struggling to implement a low-carbon fuel standard, has recently determined, perhaps correctly, that hauling tank cars by rail over the Rocky Mountains from Iowa to the west coast probably eliminates any carbon neutrality ethanol may have otherwise enjoyed. In Washington D.C., the political backlash continues to build against the subsidies corn ethanol receives, with increasing urgency due to the global food shortages that are allegedly exacerbated by dedicating so much acreage to corn for ethanol.

Image: In the USA, 10 billion gallons of corn ethanol will be produced annually within a few years.

There are many responses to these concerns, however. When producing ethanol from Brazilian sugar cane, for example, the energy payback can go as high as 8 to 1. In the case of corn ethanol, most analysts put the payback around 1.5 to 1, and at a margin that thin, there is plenty of room for interpretation. But the analyses that claim corn ethanol’s energy payback is insufficient to justify its use as a fuel ignore the caloric value of the distiller’s grain, a byproduct of corn ethanol production.

Critics of corn ethanol subsidies ignore the value of keeping these dollars in the U.S. to reduce the trade deficit. Those environmentalists concerned about the growing “dead zone” caused by agricultural runoff, presumably destined to grow even faster as we turn more acreage to biofuel, are certainly justified. But it is disingenuous to suggest that because we are distilling corn instead of harvesting grain there is somehow a more urgent problem than before. The dead zone in the Gulf of Mexico needs to be cleaned up. Agricultural runoff is an environmental challenge that awaits cost effective solutions – with or without the reality of biofuel.

The most problematic challenge to corn ethanol undoubtedly comes from those who are concerned it is causing rising food prices. But here again there are many significant factors that in aggregate eclipse the impact of corn ethanol, possibly by orders of magnitude. Rising per capita income in Asia and elsewhere has caused increased consumption of meat products, and livestock require grain. Estimates vary, but for every calorie of meat consumed, about eight calories of grain have to be grown and fed to the livestock. This phenomenon has caused global demand for grain to grow far faster than it would already be growing due to increasing human population. At the same time, there have been temporary but severe setbacks to global grain output – a drought in Australia, flooding in the American Midwest. If that weren’t enough, commodities speculators have hedged themselves against devaluing dollars and falling asset values in stocks and real estate by purchasing commodities futures – driving prices up more than the forces of normal supply and demand already have.

Ethanol proponents have answered the critics in a variety of ways. The “25×25 Alliance,” an industry group committed to the goal of the USA producing 25% of its energy from renewable sources by 2025, has issued “sustainability principles” for biofuel production. The National Corn Growers Association has compiled a great deal of data in an attempt to debunk the position that corn ethanol is the primary cause of worldwide food shortages and commodity price increases. Automakers are caught in the middle – a powerful environmental lobby demands cars capable of being fueled with alternatives to gasoline, then savagely turns on corn ethanol, despite the fact it is the only motor fuel alternative we’ve got that we can produce in meaningful quantities today.

In any event, corn ethanol isn’t the ultimate solution to biofuel supplies, it is only a transitional fuel. This crucial point is often lost amid the controversy surrounding corn ethanol. It is cellulosic ethanol that has the potential to completely replace petroleum based fuel, and when cellulosic ethanol begins to arrive in high volume, a preexisting ethanol infrastructure – cars that run on ethanol, fueling stations that sell ethanol, and a transportation network to deliver ethanol to retailers – will need to be in place. Corn ethanol is priming the pump for the arrival of cellulosic ethanol.

Within the next few years corn ethanol production in the United States is predicted to top 10 billion gallons. This is not a trivial amount of fuel, given the entire light vehicle fleet in the USA consumes only 15 times that amount. Corn ethanol has already reduced the demand for foreign oil for light vehicle use by about 6.5%. Nonetheless, critics who claim corn ethanol production cannot possibly increase enough to replace petroleum are correct. The math of these critics is elegant – 10 billion gallons of corn ethanol, at 2.8 gallons per bushel and 155 bushels per acre equates to 23 million acres, about 7% of America’s active farm acreage. If you use corn ethanol to service 100% of America’s fuel requirements for light vehicles, you use 100% of America’s farmland.
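The critics’ arithmetic reproduces easily. Here is a minimal Python sketch using only figures quoted in this article (the 317 million acre farmland number appears later in this piece):

```python
# The critics' acreage math, using only figures quoted in this article.

ethanol_gallons = 10e9        # projected annual US corn ethanol output
gal_per_bushel = 2.8
bushels_per_acre = 155
us_farm_acres = 317e6         # active US farmland (figure cited later in this piece)
fleet_multiple = 15           # light vehicle fleet burns ~15x current ethanol output

acres = ethanol_gallons / gal_per_bushel / bushels_per_acre
print(f"Acres for 10B gallons: {acres / 1e6:.0f} million ({acres / us_farm_acres:.0%} of farmland)")

# Scale to the entire light vehicle fleet:
print(f"Acres for the whole fleet: {acres * fleet_multiple / 1e6:.0f} million")  # ~ all farmland
```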

Once again, however, this math is missing the point. Corn ethanol, distilled from corn mash, is not the end of biofuel, it is just the beginning of biofuel. Even the impressive global production of ethanol from sugar cane is easily eclipsed by the potential of cellulosic extraction. So what is cellulosic ethanol, where does it come from, how can it be produced, and how long will it be before meaningful quantities of this fuel arrive at the corner filling station?

One of the most visible and visionary proponents of biofuel is the noted venture capitalist Vinod Khosla, who early in his career was one of the four co-founders of Sun Microsystems and has parlayed that spectacular victory into an impressive portfolio of investments in private sector companies. Over the past few years Khosla Ventures has invested in dozens of clean technology and sustainable energy companies, including several top-tier biofuel ventures, among them Coskata and Mascoma, both discussed later in this report. In a recent research paper entitled “Where will Biofuels and Biomass Feedstocks Come From,” Khosla identifies and quantifies the many potential sources of cellulosic feedstock for ethanol fuel. Some of the information on the table below borrows from Khosla’s research, but changes some of the assumptions; other data comes from the U.S. Dept. of Energy.

HOW MUCH ETHANOL FEEDSTOCK IS THERE IN THE USA?
Chart: At least 1.0 billion tons of ethanol feedstock can be sustainably harvested each year in the United States.

The figures on this table are arguably realistic, not optimistic, based on the following assumptions for each feedstock:

Dedicated land use refers to cellulosic crops, such as miscanthus or switchgrass, planted on 5% of American farmland (total US farmland is estimated currently at 317 million acres), less than is currently planted for corn ethanol production. At a yield of 15 tons of cellulosic feedstock per acre and 100 gallons of ethanol per ton of feedstock, nearly 24 billion gallons of ethanol can be produced each year. While 15 tons of feedstock per acre is more than can currently be grown, it is considerably lower than forecasts of yields expected within the next couple of decades, which range as high as 25 tons per acre.

Winter cover crops would not displace existing farmland, and if they were profitable to grow they could become an additional source of income for farmers on 25% of the land already under summer cultivation. At a yield of 3 tons per acre (projections go as high as 5 tons per acre), another nearly 24 billion gallons of ethanol can be produced each year.

Image: California’s redwoods. Forest thinning could help prevent catastrophic fires, reduce infestations, and provide hundreds of millions of tons of cellulose.

Excess forest biomass is a difficult number to calculate, but when one considers there are about 750 million acres of forest in the USA (ref. Forest Resources of the United States), as well as the fact that nearly all of them have become dangerously overgrown (a major factor in more catastrophic fires and beetle infestations, ref. Restoration Forestry), the figure we’ve used of 226 million tons per year is probably quite low. It would suggest a growth in forest mass of less than one-third of a ton per acre per year. And in our estimate, even the figure of 226 million tons is only assumed to be 70% utilized. Forest thinning is a form of stewardship long overdue; it will return America’s forests to their healthier historical densities, and their excess mass will power our engines instead of burning in forest fires.

Construction debris and municipal solid waste are obvious candidates for cellulosic harvesting, and even the non-cellulosic materials can be used as fuel for the extraction of syngas (which is converted into ethanol), or reclaimed as building materials. According to the Dept. of Energy, 325 million tons of these waste resources are produced each year. We have assumed 90% utilization, and only 75 gallons of ethanol per ton, a yield that is below most projections.

Other waste resources are deliberately understated – just our industrial emissions are probably sufficient to deliver 100 million tons of feedstock. Also not included in this analysis anywhere else are crop residue, a huge source of feedstocks, some percentage of which can certainly be allocated sustainably to ethanol production without sacrificing soil health.
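Tallying these assumptions is straightforward; here is a minimal Python sketch. The tonnages, utilization rates, and the 100 and 75 gallon-per-ton yields come from the paragraphs above; where a category’s per-ton yield is not stated (forest biomass, other waste), 100 gallons per ton is our assumption:

```python
# Tally of the feedstock assumptions described above. Tonnages and
# utilization rates are the article's; yields marked "assumed" are ours
# where the text does not state one.

feedstocks = {
    # name: (tons per year, utilization, gallons of ethanol per ton)
    "dedicated crops (5% of 317M acres @ 15 t/acre)":  (0.05 * 317e6 * 15, 1.00, 100),
    "winter cover crops (25% of farmland @ 3 t/acre)": (0.25 * 317e6 * 3, 1.00, 100),
    "excess forest biomass":                           (226e6, 0.70, 100),  # yield assumed
    "construction debris & municipal solid waste":     (325e6, 0.90, 75),
    "other waste resources (understated)":             (100e6, 1.00, 100),  # yield assumed
}

total_tons = sum(tons * used for tons, used, _ in feedstocks.values())
total_gallons = sum(tons * used * gal for tons, used, gal in feedstocks.values())

print(f"Usable feedstock: {total_tons / 1e9:.2f} billion tons/yr")     # ~1.03
print(f"Ethanol potential: {total_gallons / 1e9:.0f} billion gal/yr")  # ~95
```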

It isn’t easy to estimate just how much cellulosic feedstock could be sustainably harvested each year in the USA, but two things are clear from this analysis. (1) When cellulosic ethanol extraction becomes a commercially competitive process, and the industrial capacity is in place to produce high volumes of ethanol from cellulosic materials, there will be plenty of feedstock – at least 1.0 billion tons per year, possibly twice that – and cellulosic ethanol definitely has the potential to become a significant source of transportation fuel. (2) Khosla’s contention that land use dedicated to ethanol production in the USA might actually decrease when cellulosic processing takes over is completely plausible. In the example above, no corn ethanol was produced, and the dedicated acreage committed to cellulosic ethanol was assumed to be 5% of America’s farmland, whereas today corn for ethanol is grown on about 7% of America’s farmland.

So how will we convert cellulosic material into ethanol? There are hundreds of companies around the world working on ways to accomplish this, using a variety of technological approaches. Last month, while on a General Motors sponsored tour for automotive journalists, I had the opportunity to visit two companies who are pursuing promising, and very different, solutions to the cellulosic ethanol puzzle.

Our trip began in Chicago on the morning of May 21st, where about a dozen journalists assembled to drive a convoy of GM vehicles, all equipped to run on E85 ethanol. In a completely unexpected turn of events, I found myself behind the wheel of a high-riding Chevy Silverado, painted with GM colors that announced to the world the truck’s status as an ethanol fueled vehicle, with an extended cab and a monstrous bed. Although I was unaccustomed to piloting such a behemoth, there was excellent road visibility from the cab, and GM’s OnStar tracked my position and provided constant audio directions, so I swung into downtown Chicago traffic and joined the late morning rush out of town. At one point it was clear we needed to move across a couple of lanes to catch our exit, and to make sure we would safely execute this maneuver amidst the 18-wheelers and such, I found it appropriate to smash the gas pedal to the floor and hold it there. The tactic was brilliantly successful, as this gigantic truck leapt forward with impressive acceleration and increased our speed from 45 to 75 in a matter of seconds. Safely in our place on the correct route, I let off the accelerator and knew the power of corn.

Image: Coskata CEO Bill Roe and General Motors Chairman Richard Wagoner seal the deal as early Coskata investor Vinod Khosla looks on.

About 40 miles west of Chicago, in Warrenville, Illinois, are the labs of Coskata, a company that is contending to be the first to commercialize production of cellulosic ethanol.

In February 2008 General Motors invested an undisclosed sum in this three year old private company, whose CEO, Bill Roe, stated “we do not believe we have any remaining technological hurdles.” Coskata is betting on this with a pilot plant they are building in Madison, Pennsylvania, near Pittsburgh. They expect to have this plant operating early in 2009, producing 40,000 gallons of fuel per year. GM intends to use the fuel to test their growing fleet of E85 flexfuel vehicles.

Coskata’s technology for extracting ethanol from cellulose is elaborate, but apparently closer to commercialization than competing processes. Whether or not Coskata’s technology ultimately dominates is harder to assess, but according to Roe, the variable cost to produce a gallon of ethanol using their technology is expected to be under $1.00. Here’s how Coskata intends to produce ethanol:

In the diagram below, “Coskata’s Manufacturing Process,” there are three primary steps. First the feedstock is shredded and dried, and fed into the gasifier, where it is reduced to syngas at a temperature of 5,000 degrees. Some of the syngas is used to provide the energy for the conversion process, but about 85% of the syngas is converted into ethanol in step two. A recent study by Argonne National Labs estimates Coskata’s process yields an energy payback of about 8 to 1.

The second step is to feed the syngas into a bioreactor, where microbes eat the syngas and excrete ethanol. These microbes are anaerobic, meaning they can’t survive exposed to the atmosphere, and they are the result of careful selective breeding: they are now 100 times more efficient at converting syngas into ethanol than they were when the program began a few years ago. “We know our microbes can convert syngas to ethanol at commercial quantities, cost effectively,” said Roe.

The final step in the process is to feed the ethanol and water out of the bioreactor into a recovery tank, where the ethanol is extracted and the water is recycled back into the bioreactor.
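To make the three steps concrete, here is a toy Python sketch of the mass and energy bookkeeping as described above. Only the roughly 85% syngas-to-ethanol split comes from the article; the 100 gallon-per-ton gross yield is an assumption carried over from the feedstock discussion earlier, so treat the numbers as illustrative:

```python
# Toy bookkeeping of Coskata's three-step process as described above.
# Step 1: gasify shredded, dried feedstock into syngas at high temperature.
# Step 2: anaerobic microbes convert ~85% of the syngas to ethanol;
#         the remaining ~15% powers the conversion process itself.
# Step 3: separate ethanol from water; recycle the water to the bioreactor.

def coskata_sketch(feedstock_tons: float,
                   gal_if_all_converted: float = 100.0,  # gross yield per ton (assumed)
                   syngas_to_ethanol: float = 0.85):     # share converted (article)
    ethanol_gallons = feedstock_tons * gal_if_all_converted * syngas_to_ethanol
    process_energy_share = 1.0 - syngas_to_ethanol
    return ethanol_gallons, process_energy_share

gallons, self_powered = coskata_sketch(1_000)
print(f"~{gallons:,.0f} gallons of ethanol per 1,000 tons of feedstock")
print(f"~{self_powered:.0%} of the syngas retained as process energy")
```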

From the look of things during our visit to Coskata’s lab in Warrenville, about the only bugs left in their process are the bugs in the bioreactor. According to Wes Bolson, Coskata’s Chief Marketing Officer, the company is actively seeking partners among companies that have access to huge quantities of cellulosic feedstock and currently have no use for it. These candidates include timber companies, sugar cane refiners, pulp and paper mills, and waste management companies. Coskata can also partner with companies that already generate syngas but haven’t got the bioreactor technology.

COSKATA’S MANUFACTURING PROCESS
Diagram: Coskata executives believe their technology is ready today.

After spending a half-day at Coskata, our corn-fueled convoy got back on the highway and headed south to Indianapolis, driving most of the way on southbound Interstate 65. And as our expedition hurtled through America’s heartland on this beautiful afternoon, as far as the eye could see, the rain-watered, endlessly fertile fields of Indiana sprouted new shoots of spring corn.

If you are within blocks, long blocks, of the Indianapolis Motor Speedway, during the last full week in May, you will likely hear the roar of the engines. And as we neared the track on the morning of May 22nd, we too heard and felt the sound as the drivers did qualifying laps in advance of the 92nd running of the Indianapolis 500. In a thankfully soundproof auditorium on the massive infield of the racetrack, we attended an ethanol summit co-sponsored by GM, where I had an opportunity to meet Dr. Mike Ladisch, Chief Technical Officer of Mascoma. This company, like Coskata, is hot on the trail of commercializing cellulosic ethanol production, but they are pursuing a solution that will not rely on high temperature gasification. Instead, Mascoma is developing a biochemical method to convert cellulose into ethanol. Ladisch, a genial scientist who has taken a leave of absence from Purdue to serve as CTO at Mascoma, was understandably guarded about his company’s technology, but characterized it in the following way:

“The work at Mascoma is based on organisms and processes designed to rapidly break down the components of biomass, convert a range of sugars and polymers of sugars to ethanol, and thrive in a manufacturing environment.”

Mascoma intends to do this in one step using genetically engineered microbes that are capable of performing both processes. This is known as consolidated bioprocessing, or CBP, and perhaps represents the ultimate technology to extract ethanol from cellulose.

Another informed opinion on Mascoma (and cellulosic technology in general) was obtained via email from Dr. Lee Lynd, a professor at Dartmouth who, along with Ladisch, is one of the leading scientists in the world pursuing advanced cellulosic technologies. Here is what he wrote:

“Mascoma has the largest and most focused effort worldwide on consolidated bioprocessing, which I consider to be the ultimate low-cost conversion strategy. If Mascoma is able to continue this aggressive effort, I believe that they will succeed and that they will have the lowest cost technology for converting herbaceous and woody angiosperms (e.g. grass and hardwoods) to ethanol and other biofuels. It is less clear that the Mascoma approach will be best for gymnosperms (softwoods), and this could be a long-term niche for thermochemical processing along with processing residues from biological processing. Mascoma’s business strategy features a ‘staircase’ of process configurations, starting with options that can be commercially implemented very soon and progressing ultimately to CBP.”

How soon will Mascoma and others deploy these technologies? Although Mascoma’s website has an excellent description of the various cellulosic technologies (ref. Consolidated Bioprocessing), exactly when they expect their technology to be ready for commercialization appears to be a closely guarded secret. Other observers, off the record, have stated commercially viable enzymatic processing is 5-10 years away. But advances in biotechnology are happening at a staggering pace, and unforeseen breakthroughs are not something to bet against. On the other hand, even if Coskata, Mascoma, and countless other credible contenders to deliver commercially competitive cellulosic ethanol technologies were all ready tomorrow, it will still take years to build the new refineries and transform America’s light vehicle fleet.

In the meantime, corn carries the weight of being the primary source of ethanol in the USA as the rest of the infrastructure falls into place. There are already 1,600 ethanol stations in the U.S. – about 1% of all gasoline retailers – and with UL certification imminent, the big-box chains are going to begin offering ethanol fuel, greatly increasing access. General Motors now offers 15 flexfuel models and is producing over 1.0 million flexfuel vehicles per year. Other automakers are following suit. All over the world, governments are determining what percentages of ethanol – along with other biofuels, biodiesel in particular – to blend into their transportation fuels.

How long can corn carry the weight of this growth, serving as the transitional feedstock? How soon can hybrids and extended range electric vehicles level off or even reduce the demand for transportation fuel? There is little doubt ethanol is a viable fuel for light vehicles, and there is little doubt cellulosic ethanol feedstocks exist in sufficient sustainable abundance to greatly offset petroleum consumption. Finally, there is little doubt that money and support for cellulosic ethanol commercialization is ongoing; from Washington DC to Detroit to the Silicon Valley, everyone is on board. The uncertainty lies in whether or not the new technologies to extract ethanol from cellulose will emerge in months or decades, and in how fast we can build large scale industrial capacity to exploit these new technologies. Look to pilot plants in Madison, Pennsylvania, and elsewhere, for early indications of what may come, and when.


Who Watches the Watchers?

STUDIES SHOW: HOW RESEARCHERS CAN MANIPULATE STATISTICS AND HOW THE MEDIA GOBBLES IT UP UNCRITICALLY
Image: The magnificent Great Blue Heron. Could measured use of DDT have saved the heron while also saving millions of human lives?

Editor’s Note: Any issue where science and public policy collide can fall prey to some combination of political opportunism and scientific corruption. Even when motives are pure, there is still potential for well intentioned researchers to go down paths that are later revealed to be completely off the track. When powerful vested interests and deeply rooted emotions intersect, the truth is only one card in the deck, hard to find, and relatively easy to stack.

The following report by veteran EcoWorld science correspondent Edward Wheeler identifies the “seven deadly sins” of epidemiological studies, and shows how many of these flawed studies pass from the laboratory press release into the uncritical hands of journalists and, before you know it, are enshrined in new legislation or regulations. But far too often these studies are not nearly as conclusive as they are made to appear, and the consequent actions we take are not rational.

The point of all this goes beyond just epidemiology, to the relationship between scientific inquiry, media reporting, popular sentiment and public policy. Scientists who indulge in dramatic proclamations, becoming rich and famous in the process, need ongoing critical review. Today one has to ask: Is scientific peer review a way to challenge conventional wisdom and expose conclusions that aren’t clearly indicated by the underlying data, or has peer review become precisely the opposite – a way to exclude contrarian notions? Have certain scientifically developed hypotheses prematurely assumed the mantle of truth beyond debate?

Who will watch the watchers, when the watchers are our scientists, whose currency of reason is so arcane, so specialized and diverse, that nobody, not even among the scientists themselves, has sufficient credentials to question the conventional wisdom? The first step is to remember the fallibility of studies, to restore the innate and vital skepticism of journalists, and to remind the public that debate is the crucible of truth. To that end, read on. – Ed “Redwood” Ring

Studies Show – How researchers can manipulate statistics and how the media gobbles it all up uncritically.
by Edward Wheeler, April 29, 2008
Image: California’s magnificent Central Coast, home to the elusive North American condor. Saving this precious species is one of environmentalism’s finest achievements.

We should all be scared, VERY scared. It seems as if every day a new “study” is reported somewhere in the national media showing a statistical association between diet, lifestyle, or environmental chemicals and some disease or disorder.

Do you eat the “wrong” foods such as red meat, hot dogs, french fries, coffee, alcohol, grilled meats, too much fat, artificial sweeteners, preservatives, or NOT eat enough vegetables?

Are you overweight, and don’t exercise enough? Do you use deodorants, mouthwash, nail polish, electric razors or blankets, cell phones? Do you live near power lines, use birth control pills or take hormone treatments, have some radon in your basement, breathe polluted air or second hand smoke?

Do you worry and fret about all these things after reading the terrifying results of some new study? Then surely you will die from some form of cancer or heart disease sometime next week, probably from the stress and lost sleep of worrying so much!

All these studies are called epidemiological studies, which seek to find statistical correlations, mostly quite subtle, between diet, lifestyle, or environmental factors and disease. Real sciences, like chemistry and physics, seek to find cause and effect. Epidemiological studies supply only statistical links between this or that risk factor and some disease. Such studies almost never prove cause and effect, and they are subject to researcher bias and political agendas, poor design, confounding variables, bad data gathering and more. Unfortunately, most reporters who write articles on these studies are scientifically ignorant and simply parrot whatever the study authors say.

Author Mark Twain popularized the saying, “there are three kinds of lies: lies, damned lies, and statistics.” An even better quote comes from the renowned epidemiologist Alvan Feinstein of Yale University: “statistics are like a bikini bathing suit: what is revealed is interesting, but what is concealed is crucial.” Let’s look at the history of this field of study.

Epidemiology: “The study of the distribution of diseases in populations and of factors that influence the occurrence of disease.” Classical epidemiology was fathered by an Italian physician named Ramazzini. Around 1700, he started looking into the possibility that various diseases in patients might be connected to their occupations. For example, miners and chemical workers might have some lung disease because they are exposed to dust, various chemicals, or toxic metals over the course of their careers. Years later a London surgeon, Percivall Pott, noted that virtually everyone he treated for cancer of the scrotum was a chimney sweep. Hummm, he must have thought to himself. It’s a non-communicable disease, so I wonder if all that soot and coal tar they breathe and get all over them every day might be the cause. This was a monumental proof of concept!

Classical epidemiology is like police work. If there is an outbreak of some kind of stomach ailment in a number of people in a city who all seek medical treatment (they were up all night throwing up and sitting on the pot, maybe some even died), public health investigators would seek to determine what history all these sick people might have in common. If it turns out 95% of them ate at Joe’s diner within the last few days, odds are Joe was serving E. coli burgers or maybe Salmonella oysters. A simple test would confirm it.

A recent example is a disease identified in 1976, later named Legionnaires’ disease, which is a form of pneumonia unknown before then. Hundreds of men were affected, and 32 died. It was found that all of them had attended an American Legion convention in Philadelphia. Voila: investigators identified a bacterium living in the ventilation system of the hotel where the convention took place. That is classical epidemiology. Now let’s discuss a branch of epidemiology that uses “clinical trials” to try to find the facts about disease, cause, and prevention.

Probably the first “study” we now might loosely call a clinical trial occurred in 1753. Scurvy was a common illness among sailors at the time. James Lind, a surgeon in the British Royal Navy, wondered if perhaps it had something to do with the fact that sailors on long voyages ate almost no fresh fruits and vegetables. We now know that scurvy is caused by a deficiency of vitamin C, but at the time the necessity of vitamins to our health was unknown. He tested his hypothesis by dividing a number of scurvy-stricken sailors into two groups, one of which was given fresh fruit (which we now know contains vitamin C) to eat, while the other group continued eating hardtack and rum. ALL the sailors sucking limes got over their illness, while ALL the sailors in what we now would call the no-veggie “control” group still had scurvy. Eureka! From then on British sailors sucked on limes and stayed healthy, while those poor French and Spanish sailors stayed sick and lost lots of sea battles to the British.

Another classic example of an early clinical trial was conducted by Walter Reed, a U.S. Army medical officer stationed in Cuba around 1900. Yellow fever was rampant at the time, and he wondered why it was only prevalent in tropical climates. His trial could never be done today for ethical reasons. He suspected mosquitoes might somehow be spreading the disease through their bites. He recruited a small number of healthy volunteers, half of whom were deliberately bitten by mosquitoes, while the other half were not bitten. Most of the poor guys with the bites came down with yellow fever, and one of them died! None of the bite-free guys got the fever. That was definitive, whereas today various studies and trials are rarely so (with one famous exception that I discussed in my EcoWorld article entitled “Chemophobia”).

The following is a perfect example of how an epidemiological study should be conducted in order to give definitive results, with NO question about the results, even though it wasn’t planned that way and would be considered unethical and far too expensive to conduct if it were proposed today. AND, this was a really BIG study.

Back in the 1960s, this study enrolled tens of millions of people (the test group) who volunteered to inhale huge amounts of suspected carcinogens every day of their lives for at least 20 years, AT THEIR OWN EXPENSE! The same number of people who did not inhale the suspected carcinogens (the control group) was compared with the test group after 20-30 years to determine the rates of various cancers in the two groups. Absolutely unequivocal results showed that people in the test group had an increased incidence of various cancers and heart disease over the control group, and the most striking result was that people in the test group were about 15 times more likely to get lung cancer than people in the control group!
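For readers unfamiliar with the statistic, that “15 times more likely” figure is a relative risk, computed from an exposure/outcome table. A minimal Python sketch, with invented counts chosen only so the ratio matches the text:

```python
# Relative risk from a 2x2 exposure/outcome table. The counts below are
# invented for illustration; only the ~15x ratio reflects the text.

exposed_cases, exposed_total = 1_500, 1_000_000      # smokers with lung cancer / all smokers
unexposed_cases, unexposed_total = 100, 1_000_000    # non-smokers with cancer / all non-smokers

risk_exposed = exposed_cases / exposed_total
risk_unexposed = unexposed_cases / unexposed_total
print(f"Relative risk: {risk_exposed / risk_unexposed:.0f}x")  # -> 15x
```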

Thus we now know for sure that smoking can cause lung cancer and various other health problems. Now THAT was a really good epidemiological study! It is, however, not even conceivable to design and carry out such a clinical trial deliberately, for ethical reasons; in addition, the time and expense would be prohibitive. So let’s look at how those “study” authors do things today. The reader may have already figured out that I perceive most of these “studies” to be what is often called “junk science.” I do not, however, believe real science is involved at all in most statistical studies, so I call them “bad (pretend) science,” or BS for short. I won’t go into the statistical details and methods, but I will show many wonderful examples of famous BS. You can get the underlying methods by reading Steven Milloy’s “Science Without Sense” and “Junk Science Judo.”

Here are the “seven deadly sins” of epidemiology (epiBS from now on) as practiced today:

1) Have a political, health, or moral agenda, and design (rig) your study to get the results you want. This applies to all sides of the political spectrum, and to official government agencies. The real scientific method is: put forth a hypothesis, then gather data to determine whether the hypothesis is correct. The epiBS method is: have a mandated or acceptable conclusion in mind, then select only the data that appear to support the conclusion you have already reached (see the famous example below).

2) Assume that a statistical correlation found in your latest study, between some disease or disorder and exposure to some perceived risk factor, is proof of a cause-and-effect relationship. EVEN if there is no apparent biological reason to think so, you can always dream up some improbable rationalization for your results!

3) Data dredging: Don't bother with any hypothesis before gathering your data; just ask a large group of people lots of questions about lifestyle, diet, drinking habits, etc., over a period of time. Feed the data into your computer statistics program and see if something correlates with something. Who knows what you might find? (A sketch of how easily this manufactures "findings" appears right after this list.)

4) Don't bother to verify any data you gather through questionnaires. Just assume nobody ever misremembers or lies about their lifestyle, diet, shoe size, or anything else you might have thought to ask about. Ask subjects how much alcohol they drink per day, and they may understate the amount by a factor of three or four. It's like a wife asking her husband how much money he lost playing the slots at the casino.

5) Design studies that are fatally flawed from the beginning. Because you don't know anything about the biochemistry involved (after all, you are either a medical guy or a statistician), you have no clue why you got the associations you did, but you believe them and publish anyway.

6) If your study doesn't find any association between, say, radon exposure and lung cancer, perform a meta-analysis that combines the weak, statistically insignificant results of numerous studies by other researchers with your own (in fact, you don't even need a study of your own at all). It doesn't matter how good or bad, or even how similar in design, all those studies were (the old apples-and-oranges problem); combining them just might produce a statistically significant correlation.

7) ALWAYS call the news media immediately after finding an association between, say, exposure to some hot-button chemical and some disease state. The reporters know absolutely nothing about how these studies are done and will uncritically report whatever you say. Your study will make big headlines tomorrow, and you will be quoted as saying, "The results are important, but more research is needed." That translates to, "I need more grant money to continue doing BS."
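
To see how easily sin #3 manufactures "findings," here is a minimal sketch in Python; every number and variable name in it is invented for illustration, and all the data are simulated noise, so any "discovery" it prints is spurious by construction:

    # Data dredging in miniature: test enough random variables against a
    # random outcome and some will "correlate" by pure chance.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n_subjects, n_questions = 1000, 50

    # 50 made-up "lifestyle" answers and one "disease" outcome, all noise.
    lifestyle = rng.normal(size=(n_subjects, n_questions))
    disease = rng.normal(size=n_subjects)

    hits = 0
    for q in range(n_questions):
        r, p = stats.pearsonr(lifestyle[:, q], disease)
        if p < 0.05:
            hits += 1
            print(f"question {q}: r = {r:.3f}, p = {p:.3f}  <-- 'publishable'")

    print(f"{hits} of {n_questions} random questions 'correlate' at p < 0.05")

With 50 questions tested at the 0.05 level, you should expect roughly 50 x 0.05 = 2 or 3 false positives every single time, out of pure noise.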

The most egregious example of epiBS I will give combines deadly sins #1 and #6, and its results have had enormous implications for nanny-state public policy. The U.S. Environmental Protection Agency's (EPA) original mission was to establish rules and regulations meant to protect the environment, for example from air and water pollution. Over the years, however, mission creep set in, and the agency now exists to protect public health. This gives it vastly more power to institute regulations than it was ever originally intended to have.

In 1993, the EPA released a now infamous study that kicked off the anti-smoking crusade that continues today. At the time, more than 30 epidemiological studies from around the world had examined whether the spouses of smokers were more likely to get lung cancer than the spouses of non-smokers. None of them was definitive; at most they showed a very weak correlation. Some of those studies actually suggested (also weakly) that exposure to secondhand smoke, or environmental tobacco smoke (ETS), might protect the spouse against getting lung cancer. There is a plausible biological process for this, called "hormesis": very low levels of exposure to a toxin can protect a person against higher levels of exposure later.

The EPA has even admitted that a non-smoker's average annual exposure to ETS particles is less than that from actively smoking one cigarette. In any case, the EPA ignored those studies and selected only 11 studies to combine in a meta-analysis that it hoped would establish a statistically significant correlation between ETS and lung cancer in the spouses of smokers. The agency also chose not to include any of the roughly 30 available studies designed to determine whether ETS in the workplace, as opposed to in the home, could be responsible for an increased risk of lung cancer in exposed non-smokers. A large majority of those workplace studies found no statistically significant association between workplace ETS exposure and lung cancer risk. Is that why none of them made it into the meta-analysis?
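
For readers curious about the mechanics, here is a sketch, with invented numbers rather than the EPA's actual data, of why the choice of which studies go into a meta-analysis matters so much: inverse-variance pooling shrinks the error bars, so eleven individually non-significant relative risks can combine into one "significant" pooled estimate. The arithmetic itself is legitimate; whether the result means anything depends entirely on what you let into the pool:

    # Fixed-effect (inverse-variance) pooling with invented numbers:
    # eleven studies, each RR = 1.15 with wide error bars, none of them
    # significant on its own.
    import numpy as np
    from scipy import stats

    log_rr = np.full(11, np.log(1.15))  # each study's log relative risk
    se = np.full(11, 0.20)              # each study's standard error

    # Any single study alone: z = 0.14/0.20 = 0.70, p ~ 0.48.
    z_single = log_rr[0] / se[0]

    # Pool by weighting each study by 1/SE^2.
    w = 1.0 / se**2
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    z_pooled = pooled / pooled_se
    p_pooled = 2 * stats.norm.sf(abs(z_pooled))  # ~0.02: "significant"

    print(f"single study: z = {z_single:.2f}")
    print(f"pooled: RR = {np.exp(pooled):.2f}, z = {z_pooled:.2f}, p = {p_pooled:.3f}")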

The meta-analysis the EPA used to analyze the effects of ETS (secondhand smoke) committed two cardinal statistical sins. First, the agency selected only those studies that might show ETS causes lung cancer, making the analysis a one-tailed test: you assume a priori that the substance can only be harmful, so you include no data that might show the opposite of what you expect (or want) to see. Including all the studies, even those suggesting ETS could possibly be beneficial, would have made it a statistically acceptable two-tailed test.
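
The practical effect of the one-tailed choice is easy to demonstrate. In this sketch the test statistic is invented purely for illustration; the point is that the very same borderline result clears p < 0.05 one-tailed but fails it two-tailed:

    # One-tailed vs. two-tailed p-values on the same borderline result.
    from scipy import stats

    z = 1.75  # illustrative test statistic for a weak positive association

    p_one = stats.norm.sf(z)      # assumes the exposure can only be harmful
    p_two = 2 * stats.norm.sf(z)  # allows harm OR benefit (e.g., hormesis)

    print(f"one-tailed: p = {p_one:.3f}")  # ~0.040 -> "significant"
    print(f"two-tailed: p = {p_two:.3f}")  # ~0.080 -> not significant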

The second cardinal sin is that the EPA used a 90% confidence interval (CI), instead of the gold-standard 95%, in order to get a statistically significant result. What does that mean, a statistics-ignorant reader might ask? At the 95% level, a result is declared statistically significant only if it had no more than a 1-in-20 probability of arising from pure chance (p = 0.05 in stat language). Good epidemiological studies use p = 0.05. The EPA chose the 90% level (p = 0.1, a 1-in-10 chance of a false result instead of 1 in 20) because it knew beforehand that its result would not be significant otherwise. In other words, the agency rigged the "study" to get the result it wanted: epiBS in its most flagrant form. The EPA knew that smoking is bad for the health of smokers, but it couldn't regulate smoking unless it could claim that ETS causes disease in innocent bystanders. This is perverting science: they believe that, for a worthy cause, the ends justify the means.
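
Here is the confidence-level trick in miniature, again with invented numbers rather than the EPA's actual figures: the same estimated relative risk is "significant" at the 90% level (the interval excludes 1.0) but not at the 95% level (the interval includes 1.0):

    # 90% vs. 95% confidence intervals around the same relative risk.
    import numpy as np
    from scipy import stats

    log_rr, se = np.log(1.15), 0.08  # hypothetical estimate and its error

    for level in (0.90, 0.95):
        z = stats.norm.ppf(1 - (1 - level) / 2)  # 1.645 or 1.960
        lo = np.exp(log_rr - z * se)
        hi = np.exp(log_rr + z * se)
        verdict = "excludes 1.0" if lo > 1.0 else "includes 1.0"
        print(f"{level:.0%} CI: ({lo:.2f}, {hi:.2f})  {verdict}")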

What, you still think ETS causes lung cancer in non-smokers? The EPA's epiBS meta-analysis was done in 1993. A study sponsored by the World Health Organization in 1998, covering seven countries over seven years, showed no increase in cancer risk for the spouses and co-workers of smokers. It was, however, another meta-analysis, and I don't like meta-analyses in general, even when no political agenda is involved as there was in the EPA study. So has there been one huge study done right, with no meta-analysis BS? YES!

In 2003, a study published in the British Medical Journal found no relationship between exposure to passive smoke and mortality. It was a HUGE, very believable study: it spanned 39 years and included more than 35,000 Californians. So why has such a good study been ignored by the media and the epiBS community? POLITICAL CORRECTNESS?

Disclaimer: I DON’T SMOKE, AND I AM VERY ANNOYED BY ETS, so don’t accuse me of loving tobacco companies. I agree with laws banning smoking in public buildings, transportation, and enclosed areas where one must go to do business.

About the Author: Dr. Wheeler earned a Ph.D. in chemistry from the University of California, Berkeley in 1970. As a research scientist for the U.S. Department of Agriculture in Berkeley, he did pioneering research on how nutritional status and cancer are interrelated, and on how our immune systems handle food-borne carcinogens. He published 25 research papers in peer-reviewed scientific journals and gave numerous talks (and listened to many, many more) at various scientific meetings. He left the USDA to head the food science research unit at Nabisco in New Jersey. Now retired, he writes brilliant articles for EcoWorld pro bono. He is the resident contrarian for ecoworld.com.


Posted in Buildings, Carcinogens, Careers, Causes, Chemicals, Coal, Effects Of Air Pollution, History, Journalists, Other, Policy, Law, & Government, Smoking, Transportation, Water Pollution | 0 Comments
