Archive | Literature

Scientists Concerned Over 1,100 Pounds of Dead Octopus Washing Ashore in Portugal

VILA NOVA DE GAIA, Portugal, Jan. 6 (UPI) — More than 1,200 pounds of dead octopus have washed ashore on a 1.8-mile stretch of beach in Vila Nova de Gaia, Portugal, concerned scientists say.

“It’s very strange that so many should be killed, and in such a confined area,” said Nuno Oliveira, head of the Gaia Biological Park. “There’s nothing in the scientific literature for this kind of mass mortality among octopus.”

An estimated 1,100 pounds of octopus of all ages and sizes washed up dead Saturday and another 110 pounds Sunday, said Mike Weber, head of the Aguda coastal station in Gaia.

“That suggests that it wiped out the entire local population,” Weber told Time magazine in a story published Wednesday.

Firefighters gathered up the dead animals, with many sent for testing at a laboratory in Lisbon. Biologists have ruled out pollution as the cause of death because no other species were affected. The cause is likely some form of disease-causing parasite or bacteria, Oliveira said.

Copyright 2010 by United Press International

Posted in Aquatic Life, Literature

Climate Science

Has global warming alarm become the goal rather than the result of scientific research? Is climate science really designed to answer questions?

Editor’s Note: When the history of the early 21st century is written, it may be that the financial health of the global economy was rescued by a new currency, carbon. This new asset class, fungible and tradeable, reinflated the balance sheets of governments and international financial institutions alike, and pulled humanity back from the brink of a worldwide depression. That is the hopeful scenario, and not one to be lightly dismissed.

The other outcome that may be our legacy, however, will be that just when technology and capitalism were about to deliver prosperity and security to an unprecedented number of people everywhere, and just at the time when what our financial systems needed was to embark on new investment in cost-effective energy and water infrastructure, we instead committed the wealth of humanity to deploying immature energy technologies, and arcane projects of no use and stupefying expense – such as blasting CO2 gas into underground caverns.

In either case, what historians will definitely wonder about in future centuries is how deeply flawed logic, obscured by shrewd and unrelenting propaganda, actually enabled a coalition of powerful special interests to convince nearly everyone in the world that CO2 from human industry was a dangerous, planet destroying toxin. This could be remembered as the greatest mass delusion in the history of the world – that CO2, the life of plants, was considered for a time to be a deadly poison.

In this recently presented paper by Dr. Richard Lindzen, published here in its entirety, he describes the origins of global warming alarm, the political agenda of the alarmists, their intimidation tactics, and the reasons for their success. Also, in painstaking detail, he debunks their key scientific claims and counterclaims. Dr. Lindzen is not alone – he is one of the most prominent of the thousands of reputable scientists who have come forward to dispute the theory that anthropogenic CO2 is the prevailing threat to global climate. Anyone who firmly believes anthropogenic CO2 emissions must be dramatically reduced in order to protect our planet should read this paper by Dr. Lindzen, and other work by reputable skeptics. There is simply too much at stake, and too many sweeping political changes being justified because of CO2 alarm, for any responsible activist or policymaker, media influencer or ordinary voter, not to take a second look.
- Ed Ring

Climate Science: Is It Currently Designed To Answer Questions?
by Richard Lindzen, October 30, 2008
[Image: global warming simulation (Source: NASA)]
“We have the new paradigm where simulation and programs have replaced theory and observation.” – Richard Lindzen

For a variety of inter-related cultural, organizational, and political reasons, progress in climate science and the actual solution of scientific problems in this field have moved at a much slower rate than would normally be possible. Not all these factors are unique to climate science, but the heavy influence of politics has served to amplify the role of the other factors.

By cultural factors, I primarily refer to the change in the scientific paradigm from a dialectic opposition between theory and observation to an emphasis on simulation and observational programs. The latter serves to almost eliminate the dialectical focus of the former. Whereas the former had the potential for convergence, the latter is much less effective.

The institutional factor has many components. One is the inordinate growth of administration in universities and the consequent increase in importance of grant overhead. This leads to an emphasis on large programs that never end. Another is the hierarchical nature of formal scientific organizations whereby a small executive council can speak on behalf of thousands of scientists as well as govern the distribution of ‘carrots and sticks’ whereby reputations are made and broken.

The above factors are all amplified by the need for government funding. When an issue becomes a vital part of a political agenda, as is the case with climate, then the politically desired position becomes a goal rather than a consequence of scientific research.

This paper will deal with the origin of the cultural changes and with specific examples of the operation and interaction of these factors. In particular, we will show how political bodies act to control scientific institutions, how scientists adjust both data and even theory to accommodate politically correct positions, and how opposition to these positions is disposed of.


Although the focus of this paper is on climate science, some of the problems pertain to science more generally. Science has traditionally been held to involve the creative opposition of theory and observation wherein each tests the other in such a manner as to converge on a better understanding of the natural world. Success was rewarded by recognition, though the degree of recognition was weighted according to both the practical consequences of the success and the philosophical and aesthetic power of the success. As science undertook more ambitious problems, and the cost and scale of operations increased, the need for funds undoubtedly shifted emphasis to practical relevance though numerous examples from the past assured a strong base level of confidence in the utility of science. Moreover, the many success stories established ‘science’ as a source of authority and integrity. Thus, almost all modern movements claimed scientific foundations for their aims. Early on, this fostered a profound misuse of science, since science is primarily a successful mode of inquiry rather than a source of authority.

Until the post World War II period, little in the way of structure existed for the formal support of science by government (at least in the US which is where my own observations are most relevant). In the aftermath of the Second World War, the major contributions of science to the war effort (radar, the A-bomb), to health (penicillin), etc. were evident. Vannevar Bush (in his report, Science: The Endless Frontier, 1945) noted the many practical roles that validated the importance of science to the nation, and argued that the government need only adequately support basic science in order for further benefits to emerge. The scientific community felt this paradigm to be an entirely appropriate response by a grateful nation. The next 20 years witnessed truly impressive scientific productivity which firmly established the United States as the creative center of the scientific world. The Bush paradigm seemed amply justified. (This period and its follow-up are also discussed by Miller, 2007, with special but not total emphasis on the NIH (National Institutes of Health).) However, something changed in the late 60’s. In a variety of fields it has been suggested that the rate of new discoveries and achievements slowed appreciably (despite increasing publications)2, and it is being suggested that either the Bush paradigm ceased to be valid or that it may never have been valid in the first place.

(2 At some level, this is obvious. Theoretical physics is still dealing with the standard model though there is an active search for something better. Molecular biology is still working off of the discovery of DNA. Many of the basic laws of physics resulted from individual efforts in the 17th-19th Centuries. The profound advances in technology should not disguise the fact that the bulk of the underlying science is more than 40 years old. This is certainly the case in the atmospheric and oceanic sciences. That said, it should not be forgotten that sometimes progress slows because the problem is difficult. Sometimes, it slows because the existing results are simply correct as is the case with DNA. Structural problems are not always the only factor involved.)

I believe that the former is correct. What then happened in the 1960’s to produce this change? It is my impression that by the end of the 60’s scientists, themselves, came to feel that the real basis for support was not gratitude (and the associated trust that support would bring further benefit) but fear: fear of the Soviet Union, fear of cancer, etc. Many will conclude that this was merely an awakening of a naive scientific community to reality, and they may well be right. However, between the perceptions of gratitude and fear as the basis for support lies a world of difference in incentive structure. If one thinks the basis is gratitude, then one obviously will respond by contributions that will elicit more gratitude. The perpetuation of fear, on the other hand, militates against solving problems. This change in perception proceeded largely without comment. However, the end of the cold war, by eliminating a large part of the fear base, forced a reassessment of the situation. Most thinking has since been devoted to emphasizing other sources of fear: competitiveness, health, resource depletion and the environment.

What may have caused this change in perception is unclear, because so many separate but potentially relevant things occurred almost simultaneously. For one thing, the space race reinstituted the model of large scale focused efforts such as the moon landing program. For another, the 60’s saw the first major postwar funding cuts for science in the US. The budgetary pressures of the Vietnam War may have demanded savings someplace, but the fact that science was regarded as, to some extent, dispensable came as a shock to many scientists. So did the massive increase in management structures and bureaucracy which took control of science out of the hands of working scientists. All of this may be related to the demographic pressures resulting from the baby boomers entering the workforce and the post-sputnik emphasis on science. Sorting this out goes well beyond my present aim, which is merely to consider the consequences of fear as a perceived basis of support.

Fear has several advantages over gratitude. Gratitude is intrinsically limited, if only by the finite creative capacity of the scientific community. Moreover, as pointed out by a colleague at MIT, appealing to people’s gratitude and trust is usually less effective than pulling a gun. In other words, fear can motivate greater generosity. Sputnik provided a notable example in this regard; though it did not immediately alter the perceptions of most scientists, it did lead to a great increase in the number of scientists, which contributed to the previously mentioned demographic pressure. Science since the sixties has been characterized by the large programs that this generosity encourages. Moreover, the fact that fear provides little incentive for scientists to do anything more than perpetuate problems, significantly reduces the dependence of the scientific enterprise on unique skills and talents. The combination of increased scale and diminished emphasis on unique talent is, from a certain point of view, a devastating combination which greatly increases the potential for the political direction of science, and the creation of dependent constituencies. With these new constituencies, such obvious controls as peer review and detailed accountability begin to fail and even serve to perpetuate the defects of the system. Miller (2007) specifically addresses how the system especially favors dogmatism and conformity.

The creation of the government bureaucracy, and the increasing body of regulations accompanying government funding, called, in turn, for a massive increase in the administrative staff at universities and research centers. The support for this staff comes from the overhead on government grants, and, in turn, produces an active pressure for the solicitation of more and larger grants.3

(3 It is sometimes thought that government involvement automatically implies large bureaucracies, and lengthy regulations. This was not exactly the case in the 20 years following the second world war. Much of the support in the physical sciences came from the armed forces for which science support remained a relatively negligible portion of their budgets. For example, meteorology at MIT was supported by the Air Force. Group grants were made for five year periods and renewed on the basis of a site visit. When the National Science Foundation was created, it functioned with a small permanent staff supplemented by ‘rotators’ who served on leave from universities for a few years. Unfortunately, during the Vietnam War, the US Senate banned the military from supporting non-military research (Mansfield Amendment). This shifted support to agencies whose sole function was to support science. That said, today all agencies supporting science have large ‘supporting’ bureaucracies.)

One result of the above appears to have been the deemphasis of theory because of its intrinsic difficulty and small scale, the encouragement of simulation instead (with its call for large capital investment in computation), and the encouragement of large programs unconstrained by specific goals.4

(4 In fairness, such programs should be distinguished from team efforts which are sometimes appropriate and successful: classification of groups in mathematics, human genome project, etc.)

In brief, we have the new paradigm where simulation and programs have replaced theory and observation, where government largely determines the nature of scientific activity, and where the primary role of professional societies is the lobbying of the government for special advantage.

This new paradigm for science and its dependence on fear-based support may not constitute corruption per se, but it does serve to make the system particularly vulnerable to corruption. Much of the remainder of this paper will illustrate the exploitation of this vulnerability in the area of climate research. The situation is particularly acute for a small weak field like climatology. As a field, it has traditionally been a subfield within such disciplines as meteorology, oceanography, geography, geochemistry, etc. These fields, themselves, are small and immature. At the same time, these fields can be trivially associated with natural disasters. Finally, climate science has been targeted by a major political movement, environmentalism, as the focus of its efforts, wherein the natural disasters of the earth system have come to be identified with man’s activities – engendering fear as well as an agenda for societal reform and control. The remainder of this paper will briefly describe how this has been playing out with respect to the climate issue.

[Image: Abraham Lincoln]
“You can fool some of the people all of the time, and all of the people some of the time, but you cannot fool all of the people all of the time…” – Lincoln (prior to the global warming scare)


The above described changes in scientific culture were both the cause and effect of the growth of ‘big science,’ and the concomitant rise in importance of large organizations. However, all such organizations – whether professional societies, research laboratories, advisory bodies (such as the national academies), government departments and agencies (including NASA, NOAA, EPA, NSF, etc.), or even universities – are hierarchical structures where positions and policies are determined by small executive councils or even single individuals. This greatly facilitates any conscious effort to politicize science via influence in such bodies, where a handful of individuals (often not even scientists) speak on behalf of organizations that include thousands of scientists, and even enforce specific scientific positions and agendas. The temptation to politicize science is overwhelming and longstanding. Public trust in science has always been high, and political organizations have long sought to improve their own credibility by associating their goals with ‘science’ – even if this involves misrepresenting the science.

Professional societies represent a somewhat special case. Originally created to provide a means for communication within professions – organizing meetings and publishing journals – they also provided, in some instances, professional certification, and public outreach. The central offices of such societies were scattered throughout the US, and rarely located in Washington. Increasingly, however, such societies require impressive presences in Washington where they engage in interactions with the federal government. Of course, the nominal interaction involves lobbying for special advantage, but increasingly, the interaction consists in issuing policy and scientific statements on behalf of the society. Such statements, however, hardly represent independent representation of membership positions. For example, the primary spokesman for the American Meteorological Society in Washington is Anthony Socci who is neither an elected official of the AMS nor a contributor to climate science. Rather, he is a former staffer for Al Gore.

Returning to the matter of scientific organizations, we find a variety of patterns of influence. The most obvious to recognize (though frequently kept from public view) consists in prominent individuals within the environmental movement simultaneously holding and using influential positions within the scientific organization. Thus, John Firor long served as administrative director of the National Center for Atmospheric Research in Boulder, Colorado. This position was purely administrative, and Firor did not claim any scientific credentials in the atmospheric sciences at the time I was on the staff of NCAR. However, I noticed that beginning in the 1980s, Firor was frequently speaking on the dangers of global warming as an expert from NCAR. When Firor died last November, his obituary noted that he had also been Board Chairman at Environmental Defense – a major environmental advocacy group – from 1975-1980. 5

(5 A personal memoir from Al Grable sent to Sherwood Idso in 1993 is interesting in this regard. Grable served as a Department of Agriculture observer to the National Research Council’s National Climate Board. Such observers are generally posted by agencies to boards that they are funding. In any event, Grable describes a motion presented at a Board meeting in 1980 by Walter Orr Roberts, the director of the National Center for Atmospheric Research, and by Joseph Smagorinsky, director of NOAA’s Geophysical Fluid Dynamics Laboratory at Princeton, to censure Sherwood Idso for criticizing climate models with high sensitivities due to water vapor feedbacks (in the models), because of their inadequate handling of cooling due to surface evaporation. A member of that board, Sylvan Wittwer, noted that it was not the role of such boards to censure specific scientific positions since the appropriate procedure would be to let science decide in the fullness of time, and the matter was dropped. In point of fact, there is evidence that models do significantly understate the increase of evaporative cooling with temperature (Held and Soden, 2006). Moreover, this memoir makes clear that the water vapor feedback was considered central to the whole global warming issue from the very beginning.)

The UK Meteorological Office also has a board, and its chairman, Robert Napier, was previously the Chief Executive for World Wildlife Fund – UK. Bill Hare, a lawyer and Campaign Director for Greenpeace, frequently speaks as a ‘scientist’ representing the Potsdam Institute, Germany’s main global warming research center. John Holdren, who currently directs the Woods Hole Research Center (an environmental advocacy center not to be confused with the far better known Woods Hole Oceanographic Institution, a research center), is also a professor in Harvard’s Kennedy School of Government, and has served as president of the American Association for the Advancement of Science among numerous other positions. He was the Clinton-Gore Administration spokesman on global warming. The making of academic appointments to global warming alarmists is hardly a unique occurrence. The case of Michael Oppenheimer is noteworthy in this regard. With few contributions to climate science (his postdoctoral research was in astro-chemistry), and none to the physics of climate, Oppenheimer became the Barbra Streisand Scientist at Environmental Defense. 6

(6 It should be acknowledged that Oppenheimer has quite a few papers with climate in the title – especially in the last two years. However, these are largely papers concerned with policy and advocacy, assuming significant warming. Such articles probably constitute the bulk of articles on climate. It may be fair to say that such articles contribute little if anything to understanding the phenomenon.)

He was subsequently appointed to a professorship at Princeton University, and is now regularly referred to as a prominent climate scientist by Oprah (a popular television hostess), NPR (National Public Radio), etc. To be sure, Oppenheimer did coauthor an early absurdly alarmist volume (Oppenheimer and Robert Boyle, 1990: Dead Heat: The Race Against the Greenhouse Effect), and he has served as a lead author with the IPCC (Intergovernmental Panel on Climate Change).7

(7 Certain names come up repeatedly in this paper. This is hardly an accident. In 1989, following the public debut of the issue in the US in Tim Wirth’s and Al Gore’s famous Senate hearing featuring Jim Hansen associating the warm summer of 1988 with global warming, the Climate Action Network was created. This organization of over 280 ENGOs has been at the center of the climate debates since then. The Climate Action Network is an umbrella NGO that coordinates the advocacy efforts of its members, particularly in relation to the UN negotiations. Organized around seven regional nodes in North and Latin America, Western and Eastern Europe, South and Southeast Asia, and Africa, CAN represents the majority of environmental groups advocating on climate change, and it has embodied the voice of the environmental community in the climate negotiations since it was established. The founding of the Climate Action Network can be traced back to the early involvement of scientists from the research ENGO community. These individuals, including Michael Oppenheimer from Environmental Defense, Gordon Goodman of the Stockholm Environmental Institute (formerly the Beijer Institute), and George Woodwell of the Woods Hole Research Center, were instrumental in organizing the scientific workshops in Villach and Bellagio on ‘Developing Policy Responses to Climate Change’ in 1987 as well as the Toronto Conference on the Changing Atmosphere in June 1988. It should be noted that the current director of the Woods Hole Research Center is John Holdren. In 1989, several months after the Toronto Conference, the emerging group of climate scientists and activists from the US, Europe, and developing countries were brought together at a meeting in Germany, with funding from Environmental Defense and the German Marshall Fund. The German Marshall Fund is still funding NGO activity in Europe (Pulver, 2004).)

One could go on at some length with such examples, but a more common form of infiltration consists in simply getting a couple of seats on the council of an organization (or on the advisory panels of government agencies). This is sufficient to veto any statements or decisions that they are opposed to. Eventually, this enables the production of statements supporting their position – if only as a quid pro quo for permitting other business to get done. Sometimes, as in the production of the 1993 report of the NAS, Policy Implications of Global Warming, the environmental activists, having largely gotten their way in the preparation of the report where they were strongly represented as ‘stake holders,’ decided, nonetheless, to issue a minority statement suggesting that the NAS report had not gone ‘far enough.’ The influence of the environmental movement has effectively made support for global warming, not only a core element of political correctness, but also a requirement for the numerous prizes and awards given to scientists. That said, when it comes to professional societies, there is often no need at all for overt infiltration since issues like global warming have become a part of both political correctness and (in the US) partisan politics, and there will usually be council members who are committed in this manner.

The situation with America’s National Academy of Science is somewhat more complicated. The Academy is divided into many disciplinary sections whose primary task is the nomination of candidates for membership in the Academy. 8 Typically, support by more than 85% of the membership of any section is needed for nomination. However, once a candidate is elected, the candidate is free to affiliate with any section. The vetting procedure is generally rigorous, but for over 20 years, there was a Temporary Nominating Group for the Global Environment to provide a back door for the election of candidates who were environmental activists, bypassing the conventional vetting procedure. Members, so elected, proceeded to join existing sections where they hold a veto power over the election of any scientists unsympathetic to their position. Moreover, they are almost immediately appointed to positions on the executive council, and other influential bodies within the Academy. One of the members elected via the Temporary Nominating Group, Ralph Cicerone, is now president of the National Academy. Prior to that, he was on the nominating committee for the presidency. It should be added that there is generally only a single candidate for president. Others elected to the NAS via this route include Paul Ehrlich, James Hansen, Steven Schneider, John Holdren and Susan Solomon.

(8 The reports attributed to the National Academy are not, to any major extent, the work of Academy Members. Rather, they are the product of the National Research Council, which consists in a staff of over 1000 who are paid largely by the organizations soliciting the reports. The committees that prepare the reports are mostly scientists who are not Academy Members, and who serve without pay.)

It is, of course, possible to corrupt science without specifically corrupting institutions. For example, the environmental movement often cloaks its propaganda in scientific garb without the aid of any existing scientific body. One technique is simply to give a name to an environmental advocacy group that will suggest to the public, that the group is a scientific rather than an environmental group. Two obvious examples are the Union of Concerned Scientists and the Woods Hole Research Center. 9

(9 One might reasonably add the Pew Charitable Trust to this list. Although they advertise themselves as a neutral body, they have merged with the National Environmental Trust, whose director, Philip Clapp, became deputy managing director of the combined body. Clapp (the head of the legislative practice of a large Washington law firm, and a consultant on mergers and acquisitions to investment banking firms), according to his recent obituary, was ‘an early and vocal advocate on climate change issues and a promoter of the international agreement concluded in 1997 in Kyoto, Japan. Mr. Clapp continued to attend subsequent global warming talks even after the US Congress did not ratify the Kyoto accord.’)

The former conducted an intensive advertising campaign about ten years ago in which they urged people to look to them for authoritative information on global warming. This campaign did not get very far – if only because the Union of Concerned Scientists had little or no scientific expertise in climate. A possibly more effective attempt along these lines occurred in the wake of Michael Crichton’s best-selling adventure, State of Fear, which pointed out the questionable nature of the global warming issue, as well as the dangers to society arising from the exploitation of this issue. Environmental Media Services (a project of Fenton Communications, a large public relations firm serving left-wing and environmental causes; they were responsible for the Alar scare as well as Cindy Sheehan’s anti-war campaign) created a website as an ‘authoritative’ source for the ‘truth’ about climate.

This time, real scientists who were also environmental activists were recruited to organize this web site and ‘discredit’ any science or scientist that questioned catastrophic anthropogenic global warming. The web site serves primarily as a support group for believers in catastrophe, constantly reassuring them that there is no reason to reduce their worrying. Of course, even the above represents potentially unnecessary complexity compared to the longstanding technique of simply publicly claiming that all scientists agree with whatever catastrophe is being promoted. Newsweek already made such a claim in 1988. Such a claim serves at least two purposes. First, the bulk of the educated public is unable to follow scientific arguments; ‘knowing’ that all scientists agree relieves them of any need to do so. Second, such a claim serves as a warning to scientists that the topic at issue is a bit of a minefield that they would do well to avoid.

The myth of scientific consensus is also perpetuated on Wikipedia, where climate articles are vetted by William Connolley, who regularly runs for office in England as a Green Party candidate. No deviation from the politically correct line is permitted.

Perhaps the most impressive exploitation of climate science for political purposes has been the creation of the Intergovernmental Panel on Climate Change (IPCC) by two UN agencies, UNEP (United Nations Environmental Program) and WMO (World Meteorological Organization), and the agreement of all major countries at the 1992 Rio Conference to accept the IPCC as authoritative. Formally, the IPCC summarizes the peer-reviewed literature on climate every five years. On the face of it, this is an innocent and straightforward task. One might reasonably wonder why it takes hundreds of scientists five years of constant travelling throughout the world in order to perform this task. The charge to the IPCC is not simply to summarize, but rather to provide the science with which to support the negotiating process whose aim is to control greenhouse gas levels. This is a political rather than a scientific charge. That said, the participating scientists have some leeway in which to reasonably describe matters, since the primary document that the public associates with the IPCC is not the extensive report prepared by the scientists, but rather the Summary for Policymakers, which is written by an assemblage of representatives from governments and NGOs, with only a small scientific representation. 10, 11

(10 Appendix 1 is the invitation to the planning session for the 5th assessment. It clearly emphasizes strengthening rather than checking the IPCC position. Appendix 2 reproduces a commentary by Stephen McIntyre on the recent OfCom findings concerning a British TV program opposing global warming alarmism. The response of the IPCC officials makes it eminently clear that the IPCC is fundamentally a political body. If further evidence were needed, one simply has to observe the fact that the IPCC Summary for Policymakers will selectively cite results to emphasize negative consequences. Thus the summary for Working Group II observes that global warming will result in “Hundreds of millions of people exposed to increased water stress.” This, however, is based on work (Arnell, 2004) which actually shows that by the 2080s the net global population at risk declines by up to 2.1 billion people (depending on which scenario one wants to emphasize)! The IPCC further ignores the capacity to build reservoirs to alleviate those areas they project as subject to drought (I am indebted to Indur Goklany for noting this example.))

(11 Appendix 3 is a recent op-ed from the Boston Globe, written by the aforementioned John Holdren. What is interesting about this piece is that what little science it invokes is overtly incorrect. Rather, it points to the success of the above process of taking over scientific institutions as evidence of the correctness of global warming alarmism. The three atmospheric scientists who are explicitly mentioned are chemists with no particular expertise in climate itself. While Holdren makes much of the importance of expertise, he fails to note that he himself is hardly a contributor to the science of climate. Holdren and Paul Ehrlich (of Population Bomb fame; in that work he predicted famine and food riots for the US in the 1980s) are responsible for the I=PAT formula. Holdren somewhat disingenuously claims that this is merely a mathematical identity, where I is environmental impact, P is population, A is GDP/P and T is I/GDP. However, in popular usage, A has become affluence and T has become technology (viz Schneider, 1997).)
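For reference, the identity character of I=PAT is easy to see once Holdren's definitions are substituted (a worked illustration of the footnote's point, not taken from the cited sources):

```latex
I \;=\; P \times A \times T
  \;=\; P \times \frac{\mathrm{GDP}}{P} \times \frac{I}{\mathrm{GDP}}
  \;=\; I .
```

Only when A is reinterpreted as ‘affluence’ and T as ‘technology’ does the formula appear to carry independent content.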


Given the above, it would not be surprising if working scientists were making special efforts to support the global warming hypothesis. There is ample evidence that this is happening on a large scale. A few examples will illustrate the situation. Data that challenge the hypothesis are simply changed. In some instances, data that were thought to support the hypothesis are found not to, and are then changed. The changes are sometimes quite blatant, but more often are somewhat more subtle. The crucial point is that geophysical data are almost always at least somewhat uncertain, and methodological errors are constantly being discovered. Bias can be introduced by simply considering only those errors that change answers in the desired direction. The desired direction in the case of climate is to bring the data into agreement with models that attempt to account for the observations by means of greenhouse forcing, even though such models have displayed minimal skill in explaining or predicting climate. Model projections, it should be recalled, are the basis for our greenhouse concerns. That corrections to climate data should be called for is not at all surprising, but that such corrections should almost always be in the ‘needed’ direction is exceedingly unlikely. Although the situation suggests overt dishonesty, it is entirely possible, in today’s scientific environment, that many scientists feel it is the role of science to vindicate the greenhouse paradigm for climate change as well as the credibility of models. Comparisons of models with data are, for example, referred to as model validation studies rather than model tests.

The first two examples involve paleoclimate simulations and reconstructions. Here, the purpose has been to show that both the models and the greenhouse paradigm can explain past climate regimes, thus lending confidence to the use of both to anticipate future changes. In both cases (the Eocene about 50 million years ago, and the Last Glacial Maximum about 18 thousand years ago), the original data were in conflict with the greenhouse paradigm as implemented in current models, and in both cases, lengthy efforts were made to bring the data into agreement with the models.

In the first example, the original data analysis for the Eocene (Shackleton and Boersma, 1981) showed the polar regions to have been so much warmer than the present that a type of alligator existed on Spitzbergen, as did flora and fauna in Minnesota that could not have survived frosts. At the same time, however, equatorial temperatures were found to be about 4K colder than at present. Barron (1987), in the first attempt to simulate the Eocene, assumed that the warming was due to high levels of CO2 and, using a climate GCM (General Circulation Model), obtained relatively uniform warming at all latitudes, with the meridional gradients remaining much as they are today. This behavior continues to be the case with current GCMs (Huber, 2008). As a result, paleoclimatologists have devoted much effort to ‘correcting’ their data, but, until very recently, they were unable to bring equatorial temperatures above today’s (Schrag, 1999; Pearson et al, 2000). However, the latest paper (Huber, 2008) suggests that the data no longer constrain equatorial temperatures at all, and any values may have existed. All of this is quite remarkable since there is now evidence that current meridional distributions of temperature depend critically on the presence of ice, and that the model behavior results from improper tuning wherein present distributions remain even when ice is absent.

The second example begins with the results of a major attempt to observationally reconstruct the global climate of the last glacial maximum (CLIMAP, 1976). Here it was found that although extratropical temperatures were much colder, equatorial temperatures were little different from today’s. There were immediate attempts to simulate this climate with GCMs and reduced levels of CO2. Once again the result was lower temperatures at all latitudes (Bush and Philander, 1998a,b), and once again, numerous efforts were made to ‘correct’ the data. After much argument, the current position appears to be that tropical temperatures may have been a couple of degrees cooler than today’s. However, papers appeared claiming far lower temperatures (Crowley, 2000). We will deal further with this issue in the next section, where we describe papers showing that the climate associated with ice ages is well described by the Milankovitch hypothesis, which does not call for any role for CO2.

Both of the above examples probably involved legitimate corrections, but only corrections that sought to bring observations into agreement with models were initially considered, thus avoiding the creative conflict between theory and data that has characterized the past successes of science. To be sure, however, the case of the Last Glacial Maximum shows that climate science still retains a capacity for self-correction.

The next example has achieved a much higher degree of notoriety than the previous two. In the first IPCC assessment (IPCC, 1990), the traditional picture of the climate of the past 1100 years was presented. In this picture, there was a medieval warm period that was somewhat warmer than the present, as well as a little ice age that was cooler. The presence of a period warmer than the present in the absence of any anthropogenic greenhouse gases was deemed an embarrassment for those holding that present warming could only be accounted for by the activities of man. Not surprisingly, efforts were made to get rid of the medieval warm period (according to Deming, 2005, Jonathan Overpeck remarked in an email that one had to get rid of the medieval warm period; Overpeck is one of the signatories in Appendix 1). The most infamous effort was that due to Mann et al (1998, 1999) 12, which used primarily a few handfuls of tree ring records to obtain a reconstruction of Northern Hemisphere temperature, eventually going back a thousand years, that no longer showed a medieval warm period.

(12 The 1998 paper actually only goes back to 1400 CE, and acknowledges that there is no useful resolution of spatial patterns of variability going further back. It is the 1999 paper that then goes back 1000 years.)

Indeed, it showed a slight cooling for almost a thousand years, culminating in a sharp warming beginning in the nineteenth century. The curve came to be known as the hockey stick, and featured prominently in the next IPCC report, where it was suggested that the present warming was unprecedented in the past 1000 years. The study immediately encountered severe questions concerning both the proxy data and its statistical analysis (interestingly, the most penetrating critiques came from outside the field: McIntyre and McKitrick, 2003, 2005a,b). This led to two independent assessments of the hockey stick (Wegman, 2006; North, 2006), both of which found the statistics inadequate for the claims. The story is given in detail in Holland (2007). Since the existence of a medieval warm period is amply documented in historical accounts for the North Atlantic region (Soon et al, 2003), Mann et al countered that the warming had to be regional but not characteristic of the whole northern hemisphere. Given that an underlying assumption of their analysis was that the geographic pattern of warming had to have remained constant, this would have invalidated the analysis ab initio, without reference to the specifics of the statistics. Indeed, the 4th IPCC assessment (IPCC, 2007) no longer featured the hockey stick, but the claim that current warming is unprecedented remains, and Mann et al’s reconstruction is still shown in Chapter 6 of that assessment, buried among other reconstructions. Here too, we will return to this matter briefly in the next section.

The fourth example is perhaps the strangest. For many years, the global mean temperature record showed cooling from about 1940 until the early 1970s. This, in fact, led to the concern over global cooling during the 1970s. The IPCC regularly, through the 4th assessment, boasted of the ability of models to simulate this cooling (while failing to emphasize that each model required a different specification of completely undetermined aerosol cooling in order to achieve this simulation (Kiehl, 2007)). Improvements in our understanding of aerosols are increasingly making such arbitrary tuning somewhat embarrassing, and, no longer surprisingly, the data have been ‘corrected’ to get rid of the mid 20th century cooling (Thompson et al, 2008). This may, in fact, be a legitimate correction; the embarrassment may lie in the continuous claims of modelers to have simulated the allegedly incorrect data.

The fifth example deals with the fingerprint of warming. It has long been noted that greenhouse warming is primarily centered in the upper troposphere (Lindzen, 1999) and, indeed, models show that the maximum rate of warming is found in the upper tropical troposphere (Lee et al, 2007). Lindzen (2007) noted that temperature data from both satellites and balloons failed to show such a maximum. This, in turn, permitted one to bound the greenhouse contribution to surface warming, and led to an estimate of climate sensitivity appreciably less than that found in current models. Once the implications of the observations were clearly identified, it was only a matter of time before the data were ‘corrected.’ The first attempt came quickly (Vinnikov et al, 2006), wherein the satellite data were reworked to show large warming in the upper troposphere, but the methodology was too blatant for the paper to be commonly cited. 13 There followed an attempt wherein the temperature data were rejected, and temperature trends were instead inferred from wind data (Allen and Sherwood, 2008). Over sufficiently long periods, there is a balance between vertical wind shear and meridional temperature gradients (the thermal wind balance), and, with various assumptions concerning boundary conditions, one can, indeed, infer temperature trends, but the process involves a more complex, indirect, and uncertain procedure than is involved in directly measuring temperature. Moreover, as Pielke et al (2008) have noted, the results display a variety of inconsistencies. They are nonetheless held to resolve the discrepancy with models.
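For readers unfamiliar with the thermal wind balance exploited by Allen and Sherwood, a standard textbook form (in pressure coordinates, for the geostrophic zonal wind; a schematic sketch, not their specific formulation) is:

```latex
\frac{\partial u_g}{\partial p} \;=\; \frac{R}{f\,p}\,\frac{\partial T}{\partial y},
```

where $R$ is the gas constant for air and $f$ the Coriolis parameter. Trends in the vertical shear of the wind, combined with assumed boundary conditions, can thus be converted into trends in the meridional temperature gradient and thence into temperature trends; each step of the inversion adds uncertainty relative to measuring temperature directly.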

(13 Of course, Vinnikov et al did mention it. When I gave a lecture at Rutgers University in October 2007, Alan Robock, a professor at Rutgers and a coauthor of Vinnikov et al declared that the ‘latest data’ resolved the discrepancy wherein the model fingerprint could not be found in the data.)

The sixth example takes us into astrophysics. Since the 1970s, considerable attention has been given to something known as the Early Faint Sun Paradox. This paradox was first publicized by Sagan and Mullen (1972). They noted that the standard model for the sun robustly requires that the sun brighten with time, so that 2-3 billion years ago it was about 30% dimmer than it is today (recall that a doubling of CO2 corresponds to only about a 2% change in the radiative budget). One would have expected the earth to have been frozen over, but the geological evidence suggests that the ocean was unfrozen. Attempts were made to account for this by an enhanced greenhouse effect. Sagan and Mullen (1972) suggested an ammonia rich atmosphere might work. Others suggested an atmosphere with as much as several bars of CO2 (recall that currently CO2 is about 380 parts per million of a 1 bar atmosphere). Finally, Kasting and colleagues tried to resolve the paradox with large amounts of methane. For a variety of reasons, all these efforts were deemed inadequate 14 (Haqq-Misra et al, 2008). There followed a remarkable attempt to get rid of the standard model of the sun (Sackmann and Boothroyd, 2003). This is not exactly the same as altering the data, but the spirit is the same. The paper claimed to have gotten rid of the paradox. However, in fact, the altered model still calls for substantial brightening, and, moreover, does not seem to have found much acceptance among solar modelers.
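The scale of the paradox can be made concrete with rough numbers (a back-of-envelope illustration using standard values, not figures from the cited papers). With today's solar constant $S_0 \approx 1366\ \mathrm{W\,m^{-2}}$ and planetary albedo $\alpha \approx 0.3$, the absorbed flux is about $239\ \mathrm{W\,m^{-2}}$, so a 30% dimmer sun removes roughly

```latex
\Delta F_{\odot} \;\approx\; 0.30 \times \frac{S_0\,(1-\alpha)}{4}
             \;\approx\; 0.30 \times 239\ \mathrm{W\,m^{-2}}
             \;\approx\; 72\ \mathrm{W\,m^{-2}},
```

whereas a doubling of CO2 contributes only about $4\ \mathrm{W\,m^{-2}}$ (the roughly 2% of the budget noted above). Closing a deficit of this size would require the equivalent of well over a dozen CO2 doublings, which is why atmospheres of several bars of CO2 were contemplated.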

(14 Haqq-Misra, a graduate student at the Pennsylvania State University, is apparently still seeking greenhouse solutions to the paradox.)

My last specific example involves the social sciences. Given that it has been maintained since at least 1988 that all scientists agree about alarming global warming, it is embarrassing to have scientists objecting to the alarm. To ‘settle’ the matter, a certain Naomi Oreskes published a paper in Science (Oreskes, 2004) purporting to have surveyed the literature without finding a single paper questioning the alarm (Al Gore offers this study as proof of his own correctness in “An Inconvenient Truth”). Both Benny Peiser (a British sociologist) and Dennis Bray (an historian of science) noted obvious methodological errors, but Science refused to publish these rebuttals, without regard for the technical merits of the criticisms presented. 15

(15 The refusal was not altogether surprising. The editor of Science at the time was Donald Kennedy, a biologist (and colleague of Paul Ehrlich and Stephen Schneider, both also members of Stanford’s biology department), who had served as president of Stanford University. His term as president ended with his involvement in fiscal irregularities, such as charging to research overhead such expenses as the maintenance of the presidential yacht and the provision of flowers for his daughter’s wedding – offering peculiar evidence for the importance of grant overhead to administrators. Kennedy had editorially declared that the debate concerning global warming was over and that skeptical articles would not be considered. More recently, he published a relatively pure example of Orwellian double-speak (Kennedy, 2008) wherein he called for better media coverage of global warming, where by ‘better’ he meant more carefully ignoring any questions about global warming alarm. As one might expect, Kennedy made extensive use of Oreskes’ paper. He also made the remarkably dishonest claim that the IPCC Summary for Policymakers was much more conservative than the scientific text.)

Moreover, Oreskes was a featured speaker at the celebration of Spencer Weart’s thirty years as head of the American Institute of Physics’ Center for History of Physics. Weart himself had written a history of the global warming issue (Weart, 2003) in which he repeated, without checking, the slander taken from a screed by Ross Gelbspan (The Heat is On) 16 in which I was accused of being a tool of the fossil fuel industry. Weart also writes with glowing approval of Gore’s An Inconvenient Truth. As far as Oreskes’ claim goes, it is clearly absurd 17. A more carefully done study revealed a very different picture (Schulte, 2007).

The above examples do not include the most convenient means whereby nominal scientists can support global warming alarm: namely, the matter of impacts. Here, scientists who generally have no knowledge of climate physics at all are supported to assume the worst projections of global warming and imaginatively suggest the implications of such warming for whatever field they happen to be working in. This has led to bizarre claims that global warming will contribute to kidney stones, obesity, cockroaches, noxious weeds, sexual imbalance in fish, etc. The scientists who participate in such exercises are, quite naturally, supportive of the catastrophic global warming hypothesis despite their ignorance of the underlying science. 18

(16 For reasons already mentioned, the claim of consensus is of great propagandistic value to the global warming movement. Naturally, the existence of a substantial number of legitimate scientists who oppose alarmist assertions is embarrassing to those claiming consensus. A primary approach to such scientists has been to claim that they are lying because they have been paid off by the oil industry. The claim was, in general, both untrue and irrelevant. In the early 1990s, Ted Koppel devoted half of a Nightline show to excoriating then Vice President Gore for asking him to dig up dirt on opponents of global warming alarm. However, Gelbspan cheerfully stepped into the breach.)

(17 Oreskes, apart from overt errors, merely considered support to consist in agreement that there had been some warming, and that anthropogenic CO2 contributed part of the warming. Such innocent conclusions have essentially nothing to do with catastrophic projections that involve dozens if not hundreds of ill-predicted and loosely connected variables. Moreover, most of the papers she looked at didn’t even address these issues; they simply didn’t question these conclusions.)

(18 Perhaps unsurprisingly, The Potsdam Institute, home of Greenpeace’s Bill Hare, now has a Potsdam Institute for Climate Impact Research.)


It is often argued that in science the truth must eventually emerge. This may well be true but, so far, those attempting to deal objectively with the science of climate change have largely been forced to conceal such truths as might call global warming alarmism into question (even if only implicitly). The usual vehicle is peer review, and the changes imposed were often made in order to get a given paper published.

Publication is, of course, essential for funding, promotion, etc. The following examples are but a few from cases that I am personally familiar with. These almost certainly barely scratch the surface. What is generally involved is simply the inclusion of an irrelevant comment supporting the global warming accepted wisdom. When the substance of the paper is described, it is generally claimed that the added comment represents the ‘true’ intent of the paper. In addition to the following examples, Appendix 2 offers excellent examples of ‘spin control.’

As mentioned in the previous section, one of the reports assessing the Mann et al Hockey Stick was prepared by a committee of the US National Research Council (a branch of the National Academy) chaired by Gerald North (North, 2006). The report concluded that the analysis used was totally unreliable for periods more than about 400 years ago. In point of fact, the only basis for the 400 year choice was that this brought one to the midst of the Little Ice Age, and there is essentially nothing surprising about a conclusion that we are now warmer. Still, without any basis at all, the report also concluded that despite the inadequacy of the Mann et al analysis, the conclusion might still be correct. It was this baseless conjecture that received most of the publicity surrounding the report.

In a recent paper, Roe (2006) showed that orbital variations in high latitude summer insolation correlate excellently with changes in glaciation – once one properly relates the insolation to the rate of change of glaciation rather than to the glaciation itself. This provided excellent support for the Milankovitch hypothesis. Nothing in the brief paper suggested the need for any other mechanism. Nonetheless, Roe apparently felt compelled to include an irrelevant caveat stating that the paper had no intention of ruling out a role for CO2.
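Roe's point can be stated compactly (a schematic restatement, not his notation): if $V(t)$ is global ice volume and $F(t)$ the high latitude summer insolation anomaly, the strong correlation is not between $V$ and $-F$, but rather

```latex
\frac{dV}{dt} \;\propto\; -\,F(t),
```

which makes physical sense: insolation controls how fast ice melts or accumulates, not how much ice happens to exist at any given moment.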

Choi and Ho (2006, 2008) published interesting papers on the optical properties of high tropical cirrus that largely confirmed earlier results by Lindzen, Chou and Hou (2001) on an important negative feedback (the iris effect – something that we will describe later in this section) that would greatly reduce the sensitivity of climate to increasing greenhouse gases. A proper comparison required that the results be normalized by a measure of total convective activity, and, indeed, such a comparison was made in the original version of Choi and Ho’s paper. However, reviewers insisted that the normalization be removed from the final version of the paper which left the relationship to the earlier paper unclear.

Horvath and Soden (2008) found observational confirmation of many aspects of the iris effect, but accompanied these results with a repetition of criticisms of the iris effect that were irrelevant and even contradictory to their own paper. The point, apparently, was to suggest that despite their findings, there might be other reasons to discard the iris effect. Later in this section, I will return to these criticisms. However, the situation is far from unique. I have received preprints of papers wherein support for the iris was found, but where this was omitted in the published version of the papers.

In another example, I had originally submitted a paper mentioned in the previous section (Lindzen, 2007) to American Scientist, the periodical of Sigma Xi, the scientific honorary society in the US, at the recommendation of a former officer of that society. There followed a year of discussions, with an editor, David Schneider, insisting that I find a coauthor who would illustrate why my paper was wrong. He argued that publishing something that contradicted the IPCC was equivalent to publishing a paper claiming that ‘Einstein’s general theory of relativity is bunk.’ I suggested that it would be more appropriate for American Scientist to solicit a separate paper taking a view opposed to mine. This was unacceptable to Schneider, so I ended up publishing the paper elsewhere. Needless to add, Schneider has no background in climate physics. At the same time, a committee consisting almost entirely of environmental activists, led by Peter Raven, the ubiquitous John Holdren, Richard Moss, Michael MacCracken, and Rosina Bierbaum, was issuing a joint Sigma Xi – United Nations Foundation (the latter headed by former Senator and former Undersecretary of State Tim Wirth 19 and founded by Ted Turner) report endorsing global warming alarm to a degree going far beyond the latest IPCC report. I should add that simple disagreement with conclusions of the IPCC has become a common basis for rejecting papers for publication in professional journals – as long as the disagreement suggests reduced alarm. An example will be presented later in this section.

(19 Tim Wirth chaired the hearing where Jim Hansen rolled out the alleged global warming relation to the hot summer of 1988 (viz Section 2).

He is noted for having arranged for the hearing room to have open windows to let in the heat, so that Hansen would be seen sweating for the television cameras. Wirth is also frequently quoted as having said “We’ve got to ride the global warming issue. Even if the theory of global warming is wrong, we will be doing the right thing — in terms of economic policy and environmental policy.”)

Despite all the posturing about global warming, more and more people are becoming aware of the fact that global mean temperatures have not increased statistically significantly since 1995. One need only look at the temperature records posted on the web by the Hadley Centre. The way this is acknowledged in the literature forms a good example of the spin that is currently required to maintain global warming alarm. Recall that the major claim of the IPCC 4th Assessment was that there was a 90% certainty that most of the warming of the preceding 50 years was due to man (whatever that might mean). This required the assumption that what is known as natural internal variability (i.e., the variability that exists without any external forcing, and reflects the fact that the climate system is never in equilibrium) is adequately handled by the existing climate models. The absence of any net global warming over the last dozen years or so suggests that this assumption may be wrong. Smith et al (2007) (Smith is with the Hadley Centre in the UK) acknowledged this by pointing out that initial conditions had to reflect the disequilibrium at some starting date, and that when these conditions were ‘correctly’ chosen, it was possible to better replicate the period without warming. This acknowledgment of error was accompanied by the totally unjustified assertion that global warming would resume with a vengeance in 2009. 20

(20 When I referred to the Smith et al paper at a hearing of the European Parliament, Professor Schellnhuber of the Potsdam Institute (which I mentioned in the previous section with respect to its connection to Greenpeace) loudly protested that I was being ‘dishonest’ by not emphasizing what he referred to as the main point in Smith et al: namely, that global warming would return with a vengeance.)

As 2009 approaches and the vengeful warming seems less likely to occur, a new paper has come out (this time from the Max Planck Institute: Keenlyside et al, 2008) moving the date for the anticipated resumption of warming to 2015. It is indeed a remarkable step backwards for science to consider models that have failed to predict the observed behavior of the climate to have, nonetheless, the same validity as the data. 21

(21 The matter of ‘spin control’ warrants a paper by itself. In connection with the absence of warming over the past 13 years, the common response is that 7 of the last 10 warmest years in the record occurred during the past decade. This is actually to be expected, given that we are in a warm period, and the temperature is always fluctuating. However, it has nothing to do with trends.

Tim Palmer, a prominent atmospheric scientist at the European Centre for Medium-Range Weather Forecasts, is quoted by Fred Pearce (Pearce, 2008) in the New Scientist as follows: “Politicians seem to think that the science is a done deal,” says Tim Palmer. “I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.” Pearce, however, continues “Palmer … does not doubt that the Intergovernmental Panel on Climate Change (IPCC) has done a good job alerting the world to the problem of global climate change. But he and his fellow climate scientists are acutely aware that the IPCC’s predictions of how the global change will affect local climates are little more than guesswork. They fear that if the IPCC’s predictions turn out to be wrong, it will provoke a crisis in confidence that undermines the whole climate change debate. On top of this, some climate scientists believe that even the IPCC’s global forecasts leave much to be desired. …” Normally, one would think that undermining the credibility of something that is wrong is appropriate.)

Even in the present unhealthy state of science, papers that are overtly contradictory to the catastrophic warming scenario do get published (though generally not without being substantially watered down during the review process). They are then often subject to the remarkable process of ‘discreditation.’ This process consists of immediately soliciting attack papers that are published quickly as independent articles rather than as comments. The importance of this procedure is as follows. Normally, such criticisms are published as comments, and the original authors are able to respond immediately following the comment, with both the comment and the reply appearing together. When the criticism is instead published as an independent article, the original authors’ reply is relegated to correspondence, which is usually delayed by several months, and the critics are permitted an immediate counter-reply. As a rule, the reply of the original authors is ignored in subsequent references.

In 2001, I published a paper (Lindzen, Chou and Hou) that used geostationary satellite data to suggest the existence of a strong negative feedback that we referred to as the Iris Effect. The gist of the feedback is that upper level stratiform clouds in the tropics arise by detrainment from cumulonimbus towers, that the radiative impact of the stratiform clouds is primarily in the infrared where they serve as powerful greenhouse components, and that the extent of the detrainment decreases markedly with increased surface temperature. The negative feedback resulted from the fact that the greenhouse warming due to the stratiform clouds diminished as the surface temperature increased, and increased as the surface temperature decreased – resisting the changes in surface temperature. The impact of the observed effect was sufficient to greatly reduce the model sensitivities to increasing CO2, and it was, moreover, shown that models failed to display the observed cloud behavior. The paper received an unusually intense review from four reviewers.
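In conventional feedback notation, the implication of such a negative feedback for climate sensitivity can be sketched as follows (standard textbook feedback algebra, not the specific analysis of Lindzen, Chou and Hou):

```latex
\Delta T \;=\; \frac{\Delta T_0}{1 - f}, \qquad f < 0 \;\Rightarrow\; \Delta T < \Delta T_0 ,
```

where $\Delta T_0$ is the no-feedback response to a given forcing and $f$ is the net feedback factor. For example, $f = -0.5$ would cut the response to two-thirds of its no-feedback value, which is the sense in which a strong negative feedback greatly reduces model sensitivities to increasing CO2.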

Once the paper appeared, the peer review editor of the Bulletin of the American Meteorological Society, Irwin Abrams, was replaced by a new editor, Jeffrey Rosenfeld (holding the newly created position of Editor in Chief), and the new editor almost immediately accepted a paper criticizing our paper (Hartmann and Michelsen, 2002), publishing it as a separate paper rather than a response to our paper (which would have been the usual and appropriate procedure). In the usual procedure, the original authors are permitted to respond in the same issue. In the present case, the response was delayed by several months. Moreover, the new editor chose to label the criticism as follows: “Careful analysis of data reveals no shrinkage of tropical cloud anvil area with increasing SST.” In fact, this criticism was easily dismissed. The claim of Hartmann and Michelsen was that the effect we observed was due to the intrusion of midlatitude non-convective clouds into the tropics. If this were true, then the effect should have diminished as one restricted observations more closely to the equator, but as we showed (Lindzen, Chou and Hou, 2002), exactly the opposite was found.

There were also separately published papers (again violating the normal protocols allowing for immediate response) by Lin et al, 2002 and Fu, Baker and Hartmann, 2002, that criticized our paper by claiming that, since the instruments on the geostationary satellite could not see the thin stratiform clouds that formed the tails of the clouds we could see, we were not entitled to assume that the tails existed. Without the tails, the radiative impact of the clouds would be primarily in the visible, where the behavior we observed would lead to a positive feedback; with the tails, the effect is a negative feedback. The tails had long been observed, and the notion that they abruptly disappeared when not observed by an insufficiently sensitive sensor was absurd on the face of it; the use of better instruments by Choi and Ho (2006, 2008) confirmed the robustness of the tails and the strong dominance of the infrared impact. However, as we have already seen, virtually any mention of the iris effect tends to be accompanied by a reference to the criticisms, a claim that the theory is ‘discredited,’ and absolutely no mention of the responses. This is even required of papers that actually support the iris effect.

Vincent Courtillot et al (2007) encountered a similar problem. (Courtillot, it should be noted, is the director of the Institute for the Study of the Globe at the University of Paris.) They found that time series for magnetic field variations appeared to correlate well with temperature measurements, suggesting a possible non-anthropogenic source of forcing. This was immediately criticized by Bard and Delaygue (2008), and Courtillot et al were given the conventional right to reply, which they exercised in a reasonably convincing manner. What followed, however, was highly unusual. Raymond Pierrehumbert (a professor of meteorology at the University of Chicago and a fanatical environmentalist) posted a blog entry supporting Bard and Delaygue and accusing Courtillot et al of fraud, and worse. Alan Robock (a coauthor of Vinnikov et al, mentioned in the preceding section) perpetuated the slander in a letter circulated to all officers of the American Geophysical Union. The matter was then taken up (in December of 2007) by major French newspapers (Le Monde, Libération, and Le Figaro) that treated Pierrehumbert’s defamation as fact. As in the previous case, all references to the work of Courtillot et al describe it as ‘discredited,’ and no mention is made of their response. Moreover, a major argument against the position of Courtillot et al is that it contradicted the claim of the IPCC.

In 2005, I was invited by Ernesto Zedillo to give a paper at a symposium he was organizing at his Center for Sustainability Studies at Yale. The stated topic of the symposium was Global Warming Policy After 2012, and the proceedings were to appear in a book to be entitled Global Warming: Looking Beyond Kyoto. Only two papers dealing with global warming science were presented: mine and one by Stefan Rahmstorf of the Potsdam Institute. The remaining papers all essentially assumed an alarming scenario and proceeded to discuss economics, impacts, and policy. Rahmstorf and I took opposing positions, but there was no exchange at the meeting, and Rahmstorf had to run off to another meeting. As agreed, I submitted the manuscript of my talk, but publication was interminably delayed, perhaps because of the presence of my paper. In any event, the Brookings Institution (a centrist Democratic Party think tank) agreed to publish the volume. When the volume finally appeared (Zedillo, 2008), I was somewhat shocked to see that Rahmstorf’s paper had been modified from what he presented, and had been turned into an attack not only on my paper but on me personally. 22

(22 The strange identification of the CO2 caused global warming paradigm with general relativity theory, mentioned earlier in this section, is repeated by Rahmstorf. This repetition of odd claims may be a consequence of the networking described in footnote 7.)

I had received no warning of this; nor was I given any opportunity to reply. Inquiries to the editor and the publisher went unanswered. Moreover, the Rahmstorf paper was moved so that it immediately followed my paper. The reader is welcome to get a copy of the exchange, including my response, on my web site (Lindzen-Rahmstorf Exchange, 2008), and judge the exchange for himself.

One of the more bizarre tools of global warming revisionism is the posthumous alteration of skeptical positions.

Thus, the recent deaths of two active and professionally prominent skeptics, Robert Jastrow (the founding director of NASA’s Goddard Institute for Space Studies, now headed by James Hansen), and Reid Bryson (a well known climatologist at the University of Wisconsin) were accompanied by obituaries suggesting deathbed conversions to global warming alarm.

The death of another active and prominent skeptic, William Nierenberg (former director of the Scripps Institution of Oceanography), led to the creation of a Nierenberg Prize that is annually awarded to an environmental activist. The most recent recipient was James Hansen, whom Nierenberg detested.

Perhaps the most extraordinary example of this phenomenon involves a paper by Singer, Starr, and Revelle (1991). In this paper, it was concluded that we knew too little about climate to implement any drastic measures. Revelle, it may be recalled, was the professor whom Gore credits with introducing him to the horrors of CO2 induced warming. There followed an intense effort, led by a research associate at Harvard, Justin Lancaster, in coordination with Gore staffers, to have Revelle’s name posthumously removed from the published paper. It was claimed that Singer had pressured an old and incompetent man to allow his name to be used. In fact, everyone who knew Revelle felt that he had been alert until his death. There followed a lawsuit by Singer, in which the court found in Singer’s favor. The matter is described in detail in Singer (2003).

Occasionally, prominent individual scientists do publicly express skepticism. The means for silencing them are fairly straightforward.

Will Happer, director of research at the Department of Energy (and a professor of physics at Princeton University) was simply fired from his government position after expressing doubts about environmental issues in general. His case is described in Happer (2003).

Michael Griffin, NASA’s administrator, publicly expressed reservations concerning global warming alarm in 2007. This was followed by a barrage of ad hominem attacks from individuals including James Hansen and Michael Oppenheimer. Griffin has since stopped making any public statements on this matter.

Freeman Dyson, an acknowledged great in theoretical physics, managed to publish a piece in the New York Review of Books (Dyson, 2008) in which, in the course of reviewing books by Nordhaus and Zedillo (the latter having been referred to earlier), he expressed cautious support for the existence of substantial doubt concerning global warming. This was followed by a series of angry letters, as well as condemnation on the web site, including ad hominem attacks. Given that Dyson is retired, however, there seems little more that global warming enthusiasts can do. However, we may hear of a deathbed conversion in the future.


This paper has attempted to show how changes in the structure of scientific activity over the past half century have led to extreme vulnerability to political manipulation. In the case of climate change, these vulnerabilities have been exploited to a remarkable extent. The dangers that this situation poses for both science and society are too numerous to be discussed adequately in this paper. It should be stressed that the climate change issue itself constitutes a major example of the dangers intrinsic to the structural changes in science.

As concerns the specific dangers pertaining to the climate change issue, we are already seeing that the tentative policy moves associated with ‘climate mitigation’ are contributing to deforestation, food riots, potential trade wars, inflation, energy speculation and overt corruption, as in the case of ENRON (one of the leading lobbyists for Kyoto prior to its collapse). There is little question that global warming has been exploited by many governments and corporations (and not just by ENRON; Lehman Brothers, for example, was also heavily promoting global warming alarm, and relying on the advice of James Hansen, etc.) for their own purposes, but it is unclear to what extent such exploitation has played an initiating role in the issue. The developing world has come to realize that the proposed measures endanger its legitimate hopes to escape poverty and, in the case of India, this has, encouragingly, led to an assessment of climate issues independent of the ‘official’ wisdom (Government of India, 2008). 23

(23 A curious aspect of the profoundly unalarming Indian report is the prominent involvement in the preparation of the report by Dr. Rajendra Pachauri (an economist and long term UN bureaucrat) who heads the IPCC. Dr. Pachauri has recently been urging westerners to reduce meat consumption in order to save the earth from destruction by global warming.)

For purposes of this paper, however, I simply want to note briefly the specific implications for science and its interaction with society. Although society is undoubtedly aware of the imperfections of science, it has rarely encountered a situation such as the current global warming hysteria, where institutional science has so thoroughly committed itself to policies that call for massive sacrifices in well-being worldwide. Past scientific errors did not lead the public to discard the view that science on the whole was a valuable effort. However, the extraordinarily shallow basis for the commitment to climate catastrophe, and the widespread tendency of scientists to use unscientific means to arouse the public’s concerns, are becoming increasingly evident, and the result could be a reversal of the trust that arose from the triumphs of science and technology during the World War II period. Further, the reliance of the scientific community on fear as a basis for support may indeed have severely degraded the ability of science to usefully address problems that need addressing.

It should also be noted that not all the lessons of the World War II period have been positive. Massive crash programs such as the Manhattan Project are not appropriate to all scientific problems. In particular, such programs are unlikely to be effective in fields where the basic science is not yet in place. Rather, they are best suited to problems where the needs are primarily in the realm of engineering.

Although the change in scientific culture has played an important role in making science more vulnerable to exploitation by politics, the resolution of specific issues may be possible without explicitly addressing the structural problems in science. In the US, where global warming has become enmeshed in partisan politics, there is a natural opposition to exploitation which is not specifically based on science itself. However, the restoration of the traditional scientific paradigm will call for more serious efforts. Such changes are unlikely to come from any fiat. Nor are they likely to be implemented by the large science bureaucracies that helped create the problem in the first place.

A potentially effective approach would be to change the incentive structure of science. The current support mechanism for science is one where the solution of a scientific problem is rewarded by ending support. This hardly encourages the solution of problems or the search for actual answers. Nor does it encourage meaningfully testing hypotheses. The alternative calls for a measure of societal trust, patience, and commitment to elitism that hardly seems consonant with contemporary attitudes. It may, however, be possible to make a significant beginning by carefully reducing the funding for science. Many scientists would be willing to accept a lower level of funding in return for greater freedom and stability. Other scientists may find the trade-off unacceptable and drop out of the enterprise. The result, over a period of time, could be a gradual restoration of a better incentive structure. One ought not to underestimate the institutional resistance to such changes, but the alternatives are proving to be much worse. Some years ago, I described some of what I have discussed here at a meeting in Erice (Lindzen, 2005). Richard Garwin (whom some regard as the inventor of the H-bomb) rose indignantly to state that he did not want to hear such things. Quite frankly, I also don’t want to hear such things. However, I fear that ignoring such things will hardly constitute a solution, and a solution may be necessary for the sake of the scientific enterprise.

Acknowledgments: The author wishes to thank Dennis Ambler, Willie Soon, Lubos Motl and Nigel Lawson for useful comments and assistance.

Appendix 1

July 11, 2008

On behalf of the organizing committee, and workshop co-sponsors IPCC, WCRP, IGBP, the US National Science Foundation, and Climate Central, we take great pleasure in inviting you to attend a “Joint IPCC-WCRP-IGBP Workshop: New Science Directions and Activities Relevant to the IPCC AR5” to be held March 3—6, 2009. The Workshop will be hosted by the International Pacific Research Center (IPRC) at the University of Hawaii in Honolulu, Hawaii. The workshop is open to WG1 LAs and CLAs from all four assessments. The proceedings will be made available to IPCC.

This workshop has several major goals:
1) New science results and research directions relevant for the upcoming IPCC Fifth Assessment Report (AR5) will be discussed, with a view to the manner in which new observations and models can ensure their fullest possible consideration in the upcoming AR5. These could include, but are not limited to, ice sheet instability, land use parameterizations, aerosols and their effects on clouds and climate, new attribution results beyond temperature, and improved ENSO projections.

2) Subsequent to the AR4, an international planning process has begun to perform a coordinated set of climate model experiments with AOGCMs as well as emerging Earth System Models (ESMs, including new aspects of climate-vegetation and carbon cycle feedbacks) to quantify time-evolving regional climate change using mitigation/adaptation scenarios. These experiments will address key feedbacks in climate system response to increasing greenhouse gases. For example, carbon cycle feedback was identified as one of the main uncertainties for the upper end of future climate projections in the AR4. An international process to produce a set of mitigation scenarios for use in WG1, termed Representative Concentration Pathways (RCPs), will culminate in the fall of 2008 when the scenarios will be turned over to the WG1 modeling groups. The ingredients in these scenarios (emissions and concentrations of various constituents) will be reviewed at the workshop to ensure they are compatible with what is required by the new Earth System Models. It is essential that scientists gathered at the workshop examine and discuss them in detail to ensure compatibility and consistency with the new ESMs, particularly with regard to land use/land cover and emissions, which will also be a central topic at the workshop. Additionally, output requirements for the model simulations and a strategy for extension of long-term simulations to 2300 will be discussed.

3) Decadal climate prediction has recently emerged as a research activity that combines aspects of seasonal/interannual predictions and longer term emission scenario-driven climate change. Recent research results, as well as plans for coordinated experiments to address science problems associated with the decadal prediction, will be discussed at the workshop.

For planning purposes, please register for the workshop at before September 1, 2008. Hotel information is available on that web site, and participants are encouraged to make their hotel reservations as soon as possible, because reservations for the various hotel options are on a first-come, first-served basis. Since there are large numbers of potential participants, we will need to know by that early date (September 1) whether or not you plan on attending so we can make appropriate logistical arrangements. A $100 registration fee per attendee will be collected at the workshop. Attendees will be largely self-funded, similar to the IPCC model analysis workshop held in Hawaii in March 2005.

We look forward to this opportunity to have WG1 LAs and CLAs from all four assessments gather as a group for a science meeting for the first time in the history of the IPCC. The outcomes from this unique workshop will provide important scientific direction as input to the early planning stages for the IPCC AR5.

Best regards from the organizing committee,

Gerald Meehl, Jonathan Overpeck, Susan Solomon, Thomas Stocker, and Ron Stouffer

Appendix 2

Last year, a TV program opposing global warming alarmism, The Great Global Warming Swindle, was aired by Channel 4 in Britain. The IPCC brought a complaint against the producers of the program to the British Office of Communications (Ofcom). Ofcom held that the producers did not give the IPCC sufficient time to respond (they were given about a week), but that the program did not materially mislead the public. Steven McIntyre, on his web site, analyzes the decision as well as the dishonest responses of the IPCC officials to the Ofcom findings. It is a lovely example of self-refutation. That is to say, the IPCC officials demonstrated that they were acting in a political capacity in the very process of denying this.

Ofcom: The IPCC Complaint

By Steve McIntyre

Ofcom’s disposition of the IPCC Complaint is here (page 43). There are many interesting aspects to this decision that are distinct from any of the others. Ofcom’s actual finding is extremely narrow. It rejected 2 of 6 complaints. On 3 of 6, it determined that the producers had provided notice to IPCC but that the notice on Feb 27, 2007 did not leave IPCC with “reasonable time” to respond prior to the airing on March 8, 2007 (though Ofcom itself states that “three working days” is a “reasonable time” for the parties to file an appeal of the present decision). They also determined that the producers failed to give IPCC adequate notice that someone in the production would say that it was “politically driven”. Had the producers sent their email of Feb 27, 2007 on (say) Feb 20, 2007, including a mention in the email that one of the contributors stated that IPCC was “politically driven”, then the Swindle producers would appear to have been immune from the present findings. Little things do matter.

The two rejected claims are themselves rather interesting and make you scratch your head. As discussed below, Swindle contributors were said to have claimed that IPCC had predicted climate disaster and the northward migration of malaria as a result of global warming. IPCC denied ever making such claims and apparently felt that its reputation was sullied by being associated with such claims. These two matters were decided on other grounds, but many readers will be interested to read more about IPCC disassociating itself from claims that global warming would cause northward migration of malaria or predictions of climate disaster.

In addition, in its complaint, IPCC made grandiose claims about its “open and transparent process” and the role of review editors, describing the process as being in the public domain and by its nature designed to avoid “undue influence” of any reviewer. This will come as somewhat of a surprise to CA readers, who are familiar with the avoidance of IPCC procedures by Ammann and Briffa and the seemingly casual performance of review editor Mitchell and who have been following the relentless stonewalling by IPCC and IPCC officials of requests for specific information pertaining to this allegedly “open and transparent process”.

Two Rejected Complaints
They discarded two parts of the complaint entirely.

IPCC denied that it had claimed that malaria “will” spread as a result of global warming (as stated by Channel 4) and said that it was unfair for Channel 4 to have broadcast this claim without their having an adequate opportunity to respond. The claim was decided on other grounds (that the allegation by Paul Reiter did not specifically mention IPCC). However, many readers will be surprised and interested to know that IPCC considers that its reputation is diminished by attributing to it the view that malaria will spread as a result of global warming.

IPCC complained that the programme falsely claimed that its FAR (1990) predicted “climatic disaster as a result of global warming”, without an opportunity to defend itself against the indignity of being accused of making such a claim. It’s a relief to the rest of us to know that not only is the IPCC not predicting climatic disaster, but it considers being associated with such a claim to be an insult. Ofcom considered some interesting contemporary evidence, including a speech by Margaret Thatcher, the scientific content of which was approved by Houghton, and came to the view that this was not an unreasonable characterization. Their decision on this issue stated:

“the Committee considered that the comment that described the FAR (1990) as predicting “climatic disaster as a result of global warming” was not an allegation against the IPCC and was not unfair to it. It was not, therefore, incumbent on the programme makers to have offered the IPCC an appropriate and timely opportunity to respond to this particular comment.”

The most interesting part of these two issues were the IPCC defenses.

Three Issues where the notice was insufficiently timely

On three parts of the Complaint (Reiter’s criticism of the malaria section of the IPCC report, Reiter’s criticism of how IPCC made up its author lists, Seitz’ criticism of the SAR-Santer fiasco), Ofcom found that Swindle had provided notice to IPCC within the requirements, but had failed to provide IPCC with enough time to respond.

What would be a reasonable amount of time? Ofcom says in their Guidelines for the handling of standards complaints and cases (in programmes and sponsorship) that three working days is a “reasonable time” for an appeal, 5 working days for broadcasters to deliver any requested material and 10 working days to deliver certain sorts of detailed written submissions.

While the producers had preliminary contact with IPCC in October 2006 (as a result of which they were referred to a website), the first notice to IPCC that they would be presenting the Reiter and Seitz allegations came on Feb 26, 2007 (a Monday), to which there was no response. A follow-up email was sent three days later on March 1, 2007, again with no response. At the time of the show’s first airing on March 8, 2007, ten days (8 working days) after the first notice letter, IPCC had still sent no response. Nor did it send one prior to the second airing. Ofcom noted:

“the IPCC is a large organisation with considerable resources at its disposal and that it employs a dedicated Information and Communications Officer. On the face of it, these factors might be taken to suggest the IPCC should have been in a position to respond to the programme makers’ emails (subject to being provided with sufficient information about the allegations that would be made in the programme)”

On the other hand, Ofcom noted that the producers had failed to properly inform IPCC of the deadlines:

As mentioned above, it was significant that the programme maker’s email of 26 February 2007 gave the IPCC no indication of when its response was required and the follow-up email of 1 March 2007 (sent at 7.33pm) subsequently gave a deadline of the following day. Neither of these emails indicated the date of broadcast.

Taking into account all the above factors, the Committee considered that it was unreasonable for the programme makers to have expected the IPCC to understand that its response was required in a matter of days, and that it was not reasonable to expect the IPCC to be able to provide a response within the one day of being advised of the deadline. The Committee therefore found that the opportunity to respond had not been offered in a timely way.

On these particular findings, there’s a process lesson about the need for clear and unequivocal notice. In this particular case, it seems highly unlikely that IPCC was going to bother responding in any event. So the producers could easily have avoided this particular problem merely by giving clearer and somewhat more informative notice. For example, had they sent out the email on Feb 20, 2007 instead of Feb 27, 2007, notifying the IPCC of their deadline, then it’s hard to see how these parts of the IPCC complaint could have even got as far as they did.

I note that it appears that IPCC itself did not even file the “IPCC Complaint”. It appears to be another concoction by Rado and associates. Their website says that:

“Sir John Houghton … co-authorised our Fairness complaint on behalf of the IPCC…. Dr Pachauri co-authorised our Fairness complaint on behalf of the IPCC. …Martin Parry also co-authorised our Fairness complaint on behalf of the IPCC… Professor [Robert] Watson co-authorised our Fairness complaint on behalf of the IPCC.”

I take this as evidence that IPCC itself did not author the complaint. Normally, in order to be heard by Ofcom, a “fairness” complaint has to be made by the person directly affected. There are situations in which a third party can be authorized to make the complaint; I haven’t examined whether those situations apply here.

However, the form of IPCC “authorization” seems highly curious. John Houghton supposedly “co-authorised our Fairness complaint on behalf of the IPCC”. While Houghton has obviously been an important figure in the IPCC movement, he is not listed at the IPCC website as one of its present officers and would not appear to have sufficient current authority to “authorize” the complaint. Robert Watson’s appearance on this list is also interesting. Watson is likewise not listed as a current IPCC officer; Rado’s website states that Watson is currently DEFRA’s Chief Scientific Adviser. That a DEFRA employee should perceive himself as having the authority to authorize the commencement of an action in the U.K. on behalf of IPCC, which, under other circumstances, asserts its immunity rights as an international organization, is intriguing to say the least.

A “Political” Organization

The last “issue” in play was the statement by Philip Stott that IPCC was a “politically driven” organization.

Dr Philip Stott: “The IPCC, like any UN body, is political. The final conclusions are politically driven.”

This matter differed somewhat from the three considered under the previous head in that no notice was given to the IPCC in the producers’ Feb 26, 2007 email that the programme would describe it as “political”.

In its defence, Channel 4 said

“the programme contributor, Dr Philip Stott, was merely making a statement of fact. Channel 4 said the programme made the important and valid point that the IPCC is political as well as scientific. Channel 4 said the IPCC chairmen and authors are nominated by governments and the reports are viewed by government officials prior to publication. Further, Channel 4 said the IPCC had been criticised on a number of occasions for being hampered by political interference. Channel 4 therefore maintained it was entirely fair for Professor Stott to state that the IPCC is ‘politically driven.’”

The IPCC response will be particularly intriguing to Climate Audit readers who have followed IPCC’s refusal to provide a complete archive of its Review Comments and Responses (in direct breach of their own formal procedures), a refusal abetted by corresponding refusals of national FOI requests. Ofcom summarizes their response:

In relation to the IPCC being “politically driven”, the IPCC said that the requirement for openness and transparency in its processes ensured that it was impossible for any undue interference to take place or any undue pressure to be applied by any reviewer (government or otherwise).

The IPCC said the government expert reviewer is free to ask any lead author to reconsider what they have written, but based solely on scientific content. The lead author will then consider the comment or request for change. If the lead author then wishes to make the change, he/she has to account for the decision to his/her review editor, who will make the final decision. Such changes must then be documented and the results made public.

The IPCC said that, given the IPCC’s own procedures, Channel 4’s arguments in relation to this head of complaint were either ill-informed or disingenuous.

Huh? This is not a true description of the process that I’ve experienced or that has been documented here. “Disingenuous” – they must be taking etiquette lessons from Michael Mann.

In terms of my own personal experience, we know that Ammann evaded the formal “open and transparent” process by sending review comments about our work outside the properly instituted process and that the parties have subsequently refused to produce the presumably adverse comments. Did these exchanges result in “undue interference” or “undue pressure” by a reviewer? The purpose of the “open and transparent” process is to do what IPCC represented to Ofcom that it did. Too bad that it’s not a true description.

Similarly with the role of the Review Editors. IPCC testified to Ofcom that the “review editor” made the final decision. But Review Editor Mitchell has said that these decisions were up to Briffa and the chapter authors. Although IPCC says here that this process is “public”, IPCC has refused to provide Mitchell’s comments and Mitchell has concocted absurd and untrue reasons to avoid producing the comments (even claiming that he acted as an IPCC review editor in a “personal” capacity and that he has destroyed all his IPCC correspondence).

Here’s how Ofcom decided this matter:

In the Committee’s opinion, viewers would have understood from the full section (quoted above) that the IPCC was not a purely scientific body and that its ‘scientific’ conclusions were significantly tainted by political interests.

The Committee considered that such an impression went to the core of the IPCC’s function and reputation: in this regard it noted that the IPCC was set up following international governmental accord with the aim of producing objective scientific assessments to inform policy and decision making worldwide. The Committee considered that “politically driven” was a strong and potentially damaging allegation which, within the context of this part of the programme, suggested direct political influence and was clearly intended to call into question the credibility of the IPCC….

… In the circumstances, the Committee concluded that the IPCC was not afforded a timely or appropriate opportunity to respond to the significant allegation that the conclusions of the IPCC were “politically driven”. This resulted in unfairness to the IPCC in the programme as broadcast.


So what exactly did IPCC win? Ofcom said that the producers should have given them more adequate notice time for Reiter’s allegations about the review of the malaria section and the listing of authors and for Seitz’ allegations about SAR and for the assertion that they would say that IPCC was “politically driven”.
Did Ofcom opine on whether IPCC was giving good or bad reports? Nope. It stuck to its knitting and rendered carefully reasoned decisions on whether the producers gave adequate notice to someone being criticized, as required under the Broadcasting Code.


Now look at the crowing about this decision by IPCC officials.
Pachauri: Climate Science:

We are pleased to note that Ofcom has vindicated the IPCC’s claim against Channel Four in spirit and in substance, and upheld most of the formal complaints made by those who respect the IPCC process. It is heartening to see that the review process of the IPCC, and the credibility of the publications of the IPCC were upheld, as was the claim that Channel Four did not give the Panel adequate time to respond to most of their allegations. The IPCC is an organization that brings together the best experts from all over the world committed to working on an objective assessment of all aspects of climate change. The relevance and integrity of its work cannot be belittled by misleading or irresponsible reporting. We express our appreciation of the Fairness Committee at Ofcom, and are satisfied with their rulings on this matter.

Some of this is simply untrue. Ofcom did not “uphold” the review process of the IPCC or the credibility of IPCC publications. Neither did it trash them. It simply did not consider them. Pachauri is totally misrepresenting the decision.


The ruling today from Ofcom regarding the Great Global Warming Swindle programme has exposed the misleading and false information regarding the Intergovernmental Panel on Climate Change (IPCC) that was contained in that programme and that has been widely disseminated by the climate denying community. The integrity of the IPCC’s reports has therefore been confirmed as has their value as a source of accurate and reliable information about climate change.

Again, all completely untrue. The Ofcom decision did not “expose the misleading and false information” regarding the IPCC, nor did it “confirm the integrity of the IPCC reports”. Nor did it endorse the programme or trash the integrity of the reports. It made no decision on them one way or the other. It simply said that the producers failed to give the IPCC enough notice to respond.

Robert Watson:

I am pleased that Ofcom recognized the serious inaccuracies in the Global Warming Swindle and has helped set the record straight.

Again untrue. Ofcom did nothing of the sort. It made no attempt whatever to sort out the scientific disputes.

Martin Parry:

This is excellent news. People and policymakers need to have confidence in the science of climate change. The reputation of the IPCC as the source of dependable and high quality information has been fully upheld by this Ofcom ruling. Channel 4’s Great Global Warming Swindle was itself a disreputable attempt to swindle the public of the confidence it needs in scientific advice.

Again completely untrue. The Ofcom ruling did not “uphold” the “reputation of the IPCC as the source of dependable and high quality information”. Nor did it disparage its reputation. It simply said that IPCC didn’t get enough time to respond.

Appendix 3

From the Boston Globe
Convincing the climate-change skeptics
By John P. Holdren | August 4, 2008

THE FEW climate-change “skeptics” with any sort of scientific credentials continue to receive attention in the media out of all proportion to their numbers, their qualifications, or the merit of their arguments. And this muddying of the waters of public discourse is being magnified by the parroting of these arguments by a larger population of amateur skeptics with no scientific credentials at all. Long-time observers of public debates about environmental threats know that skeptics about such matters tend to move, over time, through three stages. First, they tell you you’re wrong and they can prove it. (In this case, “Climate isn’t changing in unusual ways or, if it is, human activities are not the cause.”) Then they tell you you’re right but it doesn’t matter. (“OK, it’s changing and humans are playing a role, but it won’t do much harm.”) Finally, they tell you it matters but it’s too late to do anything about it. (“Yes, climate disruption is going to do some real damage, but it’s too late, too difficult, or too costly to avoid that, so we’ll just have to hunker down and suffer.”)

All three positions are represented among the climate-change skeptics who infest talk shows, Internet blogs, letters to the editor, op-ed pieces, and cocktail-party conversations. The few with credentials in climate-change science have nearly all shifted in the past few years from the first category to the second, however, and jumps from the second to the third are becoming more frequent. All three factions are wrong, but the first is the worst. Their arguments, such as they are, suffer from two huge deficiencies.

First, they have not come up with any plausible alternative culprit for the disruption of global climate that is being observed, that is, a culprit other than the greenhouse-gas buildups in the atmosphere that have been measured and tied beyond doubt to human activities. (The argument that variations in the sun’s output might be responsible fails a number of elementary scientific tests.)

Second, having not succeeded in finding an alternative, they haven’t even tried to do what would be logically necessary if they had one, which is to explain how it can be that everything modern science tells us about the interactions of greenhouse gases with energy flow in the atmosphere is wrong.
Members of the public who are tempted to be swayed by the denier fringe should ask themselves how it is possible, if human-caused climate change is just a hoax, that the leaderships of the national academies of sciences of the United States, United Kingdom, France, Italy, Germany, Japan, Russia, China, and India, among others, are on record saying that global climate change is real, caused mainly by humans, and reason for early, concerted action. This is also the overwhelming majority view among the faculty members of the earth sciences departments at every first-rank university in the world.

All three holders of the one Nobel prize in science that has been awarded for studies of the atmosphere (the 1995 chemistry prize to Paul Crutzen, Sherwood Rowland, and Mario Molina, for figuring out what was happening to stratospheric ozone) are leaders in the climate-change scientific mainstream.

US polls indicate that most of the amateur skeptics are Republicans. These Republican skeptics should wonder how presidential candidate John McCain could have been taken in. He has castigated the Bush administration for wasting eight years in inaction on climate change, and the policies he says he would implement as president include early and deep cuts in US greenhouse-gas emissions. (Senator Barack Obama’s position is similar.)

The extent of unfounded skepticism about the disruption of global climate by human-produced greenhouse gases is not just regrettable, it is dangerous. It has delayed – and continues to delay – the development of the political consensus that will be needed if society is to embrace remedies commensurate with the challenge. The science of climate change is telling us that we need to get going. Those who still think this is all a mistake or a hoax need to think again.

John P. Holdren is a professor in the Kennedy School of Government and the Department of Earth and Planetary Sciences at Harvard and the director of the Woods Hole Research Center.




About the author: Richard S. Lindzen is the Alfred P. Sloan Professor of Atmospheric Science at the Massachusetts Institute of Technology. This paper was prepared for a meeting sponsored by Euresis (Associazione per la promozione e la diffusione della cultura e del lavoro scientifico) and the Templeton Foundation on Creativity and Creative Inspiration in Mathematics, Science, and Engineering: Developing a Vision for the Future. The meeting was held in San Marino from 29-31 August 2008. Its proceedings are expected to be published in 2009. Reprinted here with permission from the author.



The Global Warming Alarm Industry

Newsweek Magazine’s cover story of August 13, 2007 entitled, “The Truth About Denial” contains very little that could actually be considered balanced, objective or fair by journalistic standards.

The one-sided editorial, masquerading as a “news article,” was written by Sharon Begley with Eve Conant, Sam Stein, Eleanor Clift, and Matthew Philips, and purports to examine how a “well-coordinated, well-funded campaign by contrarian scientists, free-market think tanks and industry has created a paralyzing fog of doubt around climate change.”

Environmentalists don’t have to be global warming alarmists.

The only problem is that Newsweek knew better. Reporter Eve Conant, who interviewed Senator James Inhofe (R-Okla.), the Ranking Member of the Environment & Public Works Committee, was given all the latest data proving conclusively that it is the proponents of man-made global warming fears who enjoy a monumental funding advantage over the skeptics. (A whopping $50 BILLION versus a paltry $19 MILLION and some change for skeptics – yes, that is BILLION versus MILLION – see below.)

This week’s “news article” in Newsweek follows the magazine’s October 23, 2006 article, which admitted the error of its ways in the 1970s, when it predicted dire global cooling. (See: Senator Inhofe Credited For Prompting Newsweek Admission of Error on ’70s Predictions of Coming Ice Age)

Use of Word ‘Denier’

First, let’s take a look at Newsweek’s use of the word “denier” when describing a scientist who views with skepticism the unproven computer models predicting future climate doom. The use of this terminology has drawn the ire of Roger Pielke, Jr. of the University of Colorado’s Center for Science and Technology Policy Research. “The phrase ‘climate change denier’ is meant to be evocative of the phrase ‘holocaust denier,’” Pielke, Jr. wrote on October 9, 2006.

“Let’s be blunt. This allusion is an affront to those who suffered and died in the Holocaust. This allusion has no place in the discourse on climate change. I say this as someone fully convinced of a significant human role in the behavior of the climate system,” Pielke, Jr. explained.

Newsweek Fails Basic Arithmetic

Newsweek reporter Eve Conant was given the documentation showing that proponents of man-made global warming have been funded to the tune of $50 BILLION in the last decade or so, but the magazine chose instead to focus on how skeptics have reportedly received a paltry $19 MILLION from ExxonMobil over the last two decades. Paleoclimate scientist Bob Carter, who has testified before the Senate Environment & Public Works Committee, explained how much money has been spent researching and promoting climate fears and so-called solutions. “In one of the more expensive ironies of history, the expenditure of more than $US50 billion on research into global warming since 1990 has failed to demonstrate any human-caused climate trend, let alone a dangerous one,” Carter wrote on June 18, 2007. (See: “High Price for Load of Hot Air”)

The U.S. alone has spent $30 billion on federal programs directly or indirectly related to global warming in just the last six years, according to one estimate. (Climate Change’s Carnival Atmosphere) ($5.79 billion in 2006 alone) Adding to this total is funding from the UN, foundations, universities, foreign governments, etc. Huge sums of money continue to flow toward addressing climate fears. In August, a State Treasurer in California “proposed a $5 billion bond measure to combat global warming,” according to the Sacramento Bee. Even if you factor in former Vice President Al Gore’s unsubstantiated August 7, 2007 assertion that $10 million dollars a year from the fossil fuel industry flows into skeptical organizations, any funding comparison between skeptics and warming proponents utterly fails.

Update: Gore to launch $100 million a year multimedia global warming fear campaign. Gore alone will now be spending $90 million more per year than he alleges the entire fossil fuel industry spends, according to an August 26, 2007 article in Advertising Age. (Global Warming Ads From Al Gore Coming Soon)
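The funding comparisons above are simple arithmetic, and can be checked with a quick back-of-the-envelope sketch. The figures below are only the numbers quoted in this article, not independently verified amounts:

```python
# Back-of-the-envelope check of the funding figures cited in this article.
# All amounts are as quoted above, not independently verified.
warming_research_funding = 50e9   # ~$50 billion since 1990, per Bob Carter's figure
skeptic_funding = 19e6            # ~$19 million from ExxonMobil over two decades

ratio = warming_research_funding / skeptic_funding
print(f"Cited funding ratio: about {ratio:,.0f} to 1")  # about 2,632 to 1

# Gore's planned ad campaign vs. his own estimate of industry funding of skeptics
gore_campaign = 100e6             # $100 million per year, per Advertising Age
alleged_industry = 10e6           # $10 million per year, per Gore's assertion
excess = gore_campaign - alleged_industry
print(f"Gore campaign exceeds alleged industry funding by ${excess:,.0f} per year")
```

On these cited figures, the gap is roughly three orders of magnitude, which is the article's point; the sketch makes no claim about whether the underlying numbers are accurate.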

Capitalists can love nature & environmentalists can love profit.
(Photo: Mark Jackson)

Global Warming ‘A Big Cash Grab’

Meteorologist Dr. Roy W. Spencer, formerly a senior scientist for climate studies at NASA’s Marshall Space Flight Center and currently principal research scientist at the University of Alabama in Huntsville, called the Newsweek article part of a “coordinated assault” on skeptics. “[Newsweek] alleges that a few scientists were offered $10,000 (!) by Big Oil to research and publish evidence against the theory of manmade global warming. Of course, the vast majority of mainstream climate researchers receive between $100,000 to $200,000 from the federal government to do the same, but in support of manmade global warming,” Spencer wrote in an August 15, 2007 blog post. (A Report from the Global Warming Battlefield)

James Spann, a meteorologist certified by the American Meteorological Society, suggests scientific objectivity is being compromised by the massive money flow to proponents of man-made climate fears. “Billions of dollars of grant money is flowing into the pockets of those on the man-made global warming bandwagon. No man-made global warming, the money dries up. This is big money, make no mistake about it. Always follow the money trail and it tells a story,” Spann wrote on January 18, 2007. (AMS Certified Weatherman Strikes Back At Weather Channel Call for Decertification)

“Nothing wrong with making money at all, but when money becomes the motivation for a scientific conclusion, then we have a problem. For many, global warming is a big cash grab,” Spann added.

‘An Entrenched Interest’

Atmospheric physicist Dr. Fred Singer, co-author of the book “Unstoppable Global Warming: Every 1500 Years,” also detailed the extensive financing machine the proponents of man-made global warming enjoy. “Tens of thousands of interested persons benefit directly from the global warming scare—at the expense of the ordinary consumer. Environmental organizations globally, such as Greenpeace, the Sierra Club, and the Environmental Defense Fund, have raked in billions of dollars. Multi-billion-dollar government subsidies for useless mitigation schemes are large and growing. Emission trading programs will soon reach the $100 billion a year level, with large fees paid to brokers and those who operate the scams,” Singer explained on June 30, 2007. (The Constitution and American Sovereignty)

“In other words, many people have discovered they can benefit from climate scares and have formed an entrenched interest. Of course, there are also many sincere believers in an impending global warming catastrophe, spurred on in their fears by the growing number of one-sided books, movies, and media coverage,” Singer added. For a detailed breakdown of how much money flows to promoters of climate fear, see a January 17, 2007 EPW blog post: (Weather Channel Climate Expert Calls for Decertifying Global Warming Skeptics)

“The [climate] alarmists also enjoy a huge financial advantage over the skeptics with numerous foundations funding climate research, University research money and the United Nations endless promotion of the cause. Just how much money do the climate alarmists have at their disposal? There was a $3 billion donation to the global warming cause from Virgin Air’s Richard Branson alone. The well-heeled environmental lobbying groups have massive operating budgets compared to groups that express global warming skepticism. The Sierra Club Foundation 2004 budget was $91 million and the Natural Resources Defense Council had a $57 million budget for the same year. Compare that to the often media-derided Competitive Enterprise Institute’s small $3.6 million annual budget. In addition, if a climate skeptic receives any money from industry, the media immediately labels them and attempts to discredit their work. The same media completely ignore the money flow from the environmental lobby to climate alarmists like James Hansen and Michael Oppenheimer. (i.e., Hansen received $250,000 from the Heinz Foundation and Oppenheimer is a paid partisan of the Environmental Defense Fund.) The alarmists have all of these advantages, yet they still feel the need to resort to desperation tactics to silence the skeptics. (EPA Chief Vows to Probe E-mail Threatening to ‘Destroy’ Career of Climate Skeptic)

Could it be that the alarmists realize that the American public is increasingly rejecting their proposition that the family SUV is destroying the earth and rejecting their shrill calls for ‘action’ to combat their computer model predictions of a ‘climate emergency?’” (See EPW Blog for full article)

As Senator Inhofe further explained in a September 25, 2006 Senate floor speech: “The fact remains that political campaign funding by environmental groups to promote climate and environmental alarmism dwarfs spending by the fossil fuel industry by a three-to-one ratio. Environmental special interests, through their 527s, spent over $19 million compared to the $7 million that Oil and Gas spent through PACs in the 2004 election cycle.” (Hot & Cold Media Spin: A Challenge To Journalists Who Cover Global Warming)

Senator Inhofe further explained: “I am reminded of a question the media often asks me about how much I have received in campaign contributions from the fossil fuel industry. My unapologetic answer is ‘Not Enough,’ — especially when you consider the millions partisan environmental groups pour into political campaigns.”

Now contrast all of the above with how much money the “well funded” skeptics allegedly receive.

The Paltry Funding of Skeptics (by comparison)

The most repeated accusation is that organizations skeptical of man-made climate fears have received $19 million from an oil corporation over the past two decades. This was the subject of a letter by two U.S. Senators in 2006. (See the Senators’ letter of October 30, 2006, noting the $19 million from Exxon-Mobil to groups skeptical of man-made global warming – Rockefeller and Snowe Demand that Exxon Mobil End Funding of Campaign that Denies Global Climate Change)

To put this $19 million over two decades into perspective, consider: one 2007 U.S. Department of Agriculture (USDA) grant of $20 million to study how “farm odors” contribute to global warming exceeded all of the money that skeptics reportedly received from an oil giant in the past two decades. To repeat: one USDA grant to study the role of “farm odors” in global warming exceeded ALL the money skeptics have been accused of receiving from an oil giant over the past two decades. (Excerpt from the article: “The United States Department of Agriculture has released reports stating that when you smell cow manure, you’re also smelling greenhouse gas emissions.” See: Scientists sniffing out methane emissions from cows sniffing at the wrong end)

As erroneous and embarrassingly one-sided as Newsweek’s article is, the magazine sank deeper into journalistic irrelevance when it noted that skeptical climatologist Patrick Michaels had reportedly received industry funding, without revealing to readers the full funding picture. The magazine article mentions NASA’s James Hansen as some sort of example of a scientist untainted by funding issues. But what Newsweek was derelict in reporting is that Hansen had received a $250,000 award in 2001 from the Heinz Foundation, run by Senator John Kerry’s wife Teresa, and then subsequently endorsed Kerry for President in 2004. (Hot & Cold Media Spin: A Challenge To Journalists Who Cover Global Warming)

MIT climate scientist Richard Lindzen has noted how proponents of man-made climate fears enjoy huge funding advantages. “Alarm rather than genuine scientific curiosity, it appears, is essential to maintaining funding,” Lindzen wrote in an April 12, 2006 op-ed in the Wall Street Journal. (Climate of Fear)

“Scientists who dissent from the alarmism have seen their grant funds disappear, their work derided, and themselves libeled as industry stooges, scientific hacks or worse. Consequently, lies about climate change gain credence even when they fly in the face of the science that supposedly is their basis,” Lindzen added. (For more on the vilification of climate skeptics see EPA Chief Vows to Probe E-mail Threatening to ‘Destroy’ Career of Climate Skeptic)

Loving wilderness and wildlife is universal.

Science Vindicating Skeptics

Finally, Newsweek’s editorial rant attempts to make it appear as though the science is getting stronger in somehow proving mankind is driving a climate catastrophe. There are, however, major problems with that assertion. Scientists are speaking up around the globe to denounce Gore, the UN and the media-driven “consensus” on global warming. Just recently, an EPW report detailed a sampling of scientists who were once believers in man-made global warming and who now are skeptical. [See May 15, 2007 report: Climate Momentum Shifting: Prominent Scientists Reverse Belief in Man-made Global Warming – Now Skeptics: Growing Number of Scientists Convert to Skeptics After Reviewing New Research.]

Mathematician & engineer Dr. David Evans, who did carbon accounting for the Australian government, detailed how he left the global warming funding "gravy train" and became a skeptic. "By the late 1990's, lots of jobs depended on the idea that carbon emissions caused global warming. Many of them were bureaucratic, but there were a lot of science jobs created too. I was on that gravy train, making a high wage in a science job that would not have existed if we didn't believe carbon emissions caused global warming," Evans explained. "But starting in about 2000, the last three of the four pieces of evidence outlined above fell away or reversed," Evans wrote. "The pre-2000 ice core data was the central evidence for believing that atmospheric carbon caused temperature increases. The new ice core data shows that past warmings were *not* initially caused by rises in atmospheric carbon, and says nothing about the strength of any amplification. This piece of evidence casts reasonable doubt that atmospheric carbon had any role in past warmings, while still allowing the possibility that it had a supporting role," he added.

In addition, just last week, three new scientific studies further strengthened the skeptics’ views on climate change. (LINK) Further, a recent analysis of peer-reviewed literature thoroughly debunks any fears of Greenland melting and a frightening sea level rise. [See July 30, 2007 – Latest Scientific Studies Refute Fears of Greenland Melt]

Newsweek: A Media Dinosaur

The question remains: Is Newsweek even a news outlet worth taking the time to respond to in posts like this? Does Newsweek, a quirky alternative news outlet, even have an impact on public policy anymore? Journalism students across the world can read this week's cover story to learn how reporting should not be done. Hopefully, that will be Newsweek's legacy -- serving as a shining example of the failure of modern journalism to adhere to balance, objectivity and fairness. Anyone who fails to see this inconvenient truth is truly (to borrow Newsweek's vernacular) a "denier." Background of recent climate science developments: Even the alarmist UN has cut sea level rise estimates dramatically since 2001 and has reduced man's estimated impact on the climate by 25%. Meanwhile, a separate 2006 UN report found that cow emissions are more damaging to the planet than all of the CO2 emissions from cars and trucks.

The New York Times is now debunking aspects of climate alarmism. An April 23, 2006 article in the New York Times by Andrew Revkin stated: "few scientists agree with the idea that the recent spate of potent hurricanes, European heat waves, African drought and other weather extremes are, in essence, our fault (a result of man-made emissions). There is more than enough natural variability in nature to mask a direct connection, [scientists] say.” The New York Times is essentially conceding that no recent weather events are outside of natural climate variability. So all the climate doomsayers have to back up their claims of climate fears are unproven computer models of the future. Of course, you can’t prove a prediction of the climate in 2100 wrong today. It’s simply not possible.

Climate Computer Models Not So Reliable

Recently, a top UN scientist publicly conceded that climate computer model predictions are not so reliable after all. Dr. Jim Renwick, a lead author of the IPCC 4th Assessment Report, admitted to the New Zealand Herald in June 2007, “Half of the variability in the climate system is not predictable, so we don’t expect to do terrifically well.”

A leading scientific skeptic of global warming fears, Dr. Hendrik Tennekes, former CEO and director of research for the Netherlands’ Royal National Meteorological Institute, took the critique of climate models that predict future doom a step further. Tennekes wrote on February 28, 2007, “I am of the opinion that most scientists engaged in the design, development, and tuning of climate models are in fact software engineers. They are unlicensed, hence unqualified to sell their products to society.”

Ivy League geologist Dr. Robert Giegengack of the University of Pennsylvania noted that “for most of Earth’s history, the globe has been warmer than it has been for the last 200 years. It has rarely been cooler,” according to a February 2007 article in Philadelphia Magazine (“Al Gore Is a Greenhouse Gasbag”).

The article continued, “[Giegengack] says carbon dioxide doesn’t control global temperature, and certainly not in a direct linear way.” Climatologist Dr. Timothy Ball explained that one of the reasons climate models fail is because they overestimate the warming effect of CO2 in the atmosphere. Ball described how CO2’s warming impact diminishes: “Even if CO2 concentration doubles or triples, the effect on temperature would be minimal. The relationship between temperature and CO2 is like painting a window black to block sunlight. The first coat blocks most of the light. Second and third coats reduce very little more. Current CO2 levels are like the first coat of black paint,” Ball explained in a June 6, 2007 article in Canada Free Press. (PM’s summit speech shows Parliamentarians have come full circle on climate change rhetoric)
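For readers who want to see the arithmetic behind the window-paint analogy: it reflects the standard logarithmic relationship between CO2 concentration and radiative forcing, under which each doubling of concentration adds roughly the same forcing increment rather than a proportional one. (How large a temperature change that increment produces is precisely the point in dispute.) The sketch below uses the commonly cited simplified expression ΔF ≈ 5.35 ln(C/C0) W/m², a formula from the scientific literature rather than from this article, purely for illustration:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified radiative forcing relative to a pre-industrial
    baseline: dF = 5.35 * ln(C/C0) W/m^2 (an illustrative formula,
    not taken from the article itself)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each successive doubling of CO2 adds the same ~3.7 W/m^2 increment,
# so the forcing grows logarithmically, not linearly, with concentration.
for c in (280, 560, 1120, 2240):
    print(f"{c:>5} ppm -> {co2_forcing(c):6.2f} W/m^2")
```

The logarithmic shape is itself uncontroversial; the analogy's quantitative conclusion about how small the resulting warming would be is Ball's own claim.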

New data is revealing what may perhaps be the ultimate inconvenient truth for climate doomsayers: Global warming stopped in 1998. Dr. Nigel Calder, co-author with physicist Henrik Svensmark of the 2007 book “The Chilling Stars: A New Theory on Climate Change,” explained in July 2007:

“In reality, global temperatures have stopped rising. Data for both the surface and the lower air show no warming since 1999. That makes no sense by the hypothesis of global warming driven mainly by CO2, because the amount of CO2 in the air has gone on increasing. But the fact that the Sun is beginning to neglect its climatic duty – of battling away the cosmic rays that come from ‘the chilling stars’ – fits beautifully with this apparent end of global warming.” Perhaps the conversion of many scientists from believers in man-made global warming to skeptics, along with the new peer-reviewed research, is why so many proponents of climatic doom have resorted to threats and intimidation in attempting to silence skeptics.

It is ironic that Newsweek quoted former Colorado Senator Tim Wirth to promote climate fears and vilify skeptics. After all, it was Wirth who reportedly said in 1990: “We’ve got to ride the global warming issue. Even if the theory of global warming is wrong, we will be doing the right thing — in terms of economic policy and environmental policy.” (Newsweek Hides Real Global Warming Agenda) It seems that the science underlying the claims of climate fears may not be as important to the global warming activists as their proposed tax and regulatory “solutions.”

Related Links:

  • New Peer-Reviewed Scientific Studies Chill Global Warming Fears
  • Senator Inhofe declares climate momentum shifting away from Gore (The Politico op ed)
  • Scientific Smackdown: Skeptics Voted The Clear Winners Against Global Warming Believers in Heated NYC Debate
  • Global Warming on Mars & Cosmic Ray Research Are Shattering Media Driven “Consensus”
  • Global Warming: The Momentum has Shifted to Climate Skeptics
  • Prominent French Scientist Reverses Belief in Global Warming – Now a Skeptic
  • Top Israeli Astrophysicist Recants His Belief in Manmade Global Warming – Now Says Sun Biggest Factor in Warming
  • Warming On Jupiter, Mars, Pluto, Neptune’s Moon & Earth Linked to Increased Solar Activity, Scientists Say
  • Panel of Broadcast Meteorologists Reject Man-Made Global Warming Fears- Claim 95% of Weathermen Skeptical
  • MIT Climate Scientist Calls Fears of Global Warming ‘Silly’ – Equates Concerns to ‘Little Kids’ Attempting to “Scare Each Other”
  • Weather Channel TV Host Goes ‘Political’ – Stars in Global Warming Film Accusing U.S. Government of ‘Criminal Neglect’
  • Weather Channel Climate Expert Calls for Decertifying Global Warming Skeptics
  • The Weather Channel Climate Expert Refuses to Retract Call for Decertification for Global Warming Skeptics
  • Senator Inhofe Announces Public Release Of “Skeptic’s Guide To Debunking Global Warming”

This was originally published on August 5, 2007 on the Inhofe Environment and Public Works (EPW) Blog, entitled “Newsweek’s Climate Editorial Screed Violates Basic Standards of Journalism,” and is republished with permission.

Posted in Business & Economics, Drought, History, Literature, Organizations, Other, Science, Space, & Technology0 Comments

Climate Conference for Global Warming Skeptics Challenges Claims of Consensus & Alleges Suppression of Findings Skeptical of Alarmist Predictions

Trail in Woods
“It is my belief that the strident claims of catastrophes
caused by man-made global warming are stated with a
degree of confidence not warranted by the data.”
Dr. William M. Briggs

Editor’s Note: We have been publishing more material than ever on the subject of climate change, for a very simple reason: The debate is not over as to the cause, the eventual severity, nor the remedies for climate change. The debate never was over, and for the mainstream press to have ever acceded to the notion that debate was over, or to condone marginalizing anyone who continued to debate, is one of the most egregious examples of media bias in history.

One would think that given what is at stake – the reorganization of our entire political and economic systems – debate would be welcomed. One would think those who are calling for debate and discussion would be heralded as voices of moderation and reason, instead of branded as ideological fanatics and corporate shills. The fact that debate is supposedly “over” regarding something for which the remedy is so fundamentally and abruptly transformative should concern anyone who claims to care about human rights, individual freedoms, free enterprise, and an open society. The idea that anyone who questions global warming alarmism is freely demonized should concern any student of history. The solution – government control over virtually anything that emits a gas – including CO2, which plants and trees require for their very survival – and huge new taxes (perhaps disguised in the form of Wall Street friendly “cap and trade” mechanisms, but the consumer still pays the freight), should concern anyone who cares about representative government and values the concept of private property.

Perhaps it is common sense that is endangered here. If the earth is indeed warming because of anthropogenic CO2, what can be done? Shall we sequester 20-30 gigatons of CO2 every year, when for the amount of money that would cost, we could clean up every river, stop overfishing the oceans, eliminate every criteria air pollutant, and fight malaria to a standstill? Even accepting conclusions of climate models – problematic concoctions that constitute the “scientific” imperative behind AGW alarmism and consequent policy – isn’t it true that we would have to sequester literally 80% of the CO2 currently attributed to human activity? Isn’t that impossible? Why not reforest the planet? Why not restore the mangrove forests that used to stop tidal surges throughout the coastal tropics, and why not stop blowing up coral reefs to flush the cash crop of fish, so they could regenerate and again stop seasonal storms from inundating tropical islands? There is a legitimate environmental agenda completely apart from global warming alarm – and there are many skeptics who nonetheless care deeply about the environment.

Common sense would suggest we question the agenda of the global warming alarmists who rely on fear and questionable science, not that of the skeptics. If there is a “denial industry,” who would benefit? A handful of underfunded think tanks? If there is a hidden agenda, it is more likely coming from the “alarm industry.” Government agencies get more tax revenue, the United Nations gets a revenue stream, insurance companies collect higher premiums, attorneys file more lawsuits, Wall Street gets a new source of commissions and fees, corporations get more subsidies, various environmentalist nonprofits, activists, academics, politicians and professional consultants acquire a game-changing new source of funding and influence, and small businesses get destroyed because they can’t afford to comply with the new regulations, and families everywhere pay punitive prices for energy, water and land. Is this the future we want? Maybe if all of this AGW alarmism were true!

The 2008 International Conference on Climate Change described in this feature, held earlier this month in New York City, featured some of the most credible (and credentialed) people ever to convene at a climate-related event. Their presentations on climate science were diverse, as befits a scientific conference, and most of the participants were “skeptics,” something that also befits a scientific conference. But despite the report’s description of this event as well covered by the media, in reality it was not; for the most part, the media ignored it. The valid comparison is not whether a few of the mainstream media outlets reported on this event, but whether it received the level of attention that the most recent IPCC press release garnered – and by that measure coverage was nonexistent. In too many cases, coverage was slanted to present a smarmy, derisive characterization of the event as a final gasp of the “flat earthers.”

Many conscientious people, relatively free of biases, simply feel climate science is beyond them; they should recognize that this inhibition hasn’t stopped the people reporting the news, or those in the political and entertainment communities whose pronouncements they have relied upon. And for those who believe in AGW alarmism because it fits preexisting biases, or furthers a political or economic agenda, know this: Science – stripping away the corruption and opportunism that has infected much of the scientific community when it comes to the question of alleged global warming – has no ideology, no ulterior purpose; it is utterly dispassionate. Science relies on skepticism, and ultimately rests on truth. – Ed “Redwood” Ring

The Debate Resumes – Climate conference for global warming skeptics challenges claims of consensus, alleges suppression of findings skeptical of alarmist predictions.
by Marc Morano, March 2008
“Most extremist views about climate change have little
or no scientific basis. The rational basis for extremist
views on global warming may be desire for political action.”
Dr. Gerd-Rainer Weber

Scientists skeptical of man-made climate fears meeting at the 2008 International Conference on Climate Change in New York City described the “absolute horror stories” about how some scientific journals have engaged in “outrageous and unethical behavior” in attempting to suppress them from publishing their work in peer-reviewed journals.

The March 2-4 groundbreaking conference, which featured about 100 speakers with over 500 people attending, presented the report of a team of international scientists who formed a group to counter the UN IPCC. [Note: The author of this report attended and participated in the conference.]

The event, which garnered significant international and U.S. media attention, featured many current and former UN IPCC scientists from around the world. (See “Nature, Not Human Activity, Rules the Climate” and see climate declaration signed by the scientists at the conference here).

The conference occurred just months after the release of a blockbuster U.S. Senate Minority Report featuring over 400 prominent scientists who recently disputed man-made global warming claims. The more than 400 scientists featured in the report thoroughly debunk the assertion that “all scientists agree” about man-made global warming. But as New York Times environmental reporter Andrew Revkin noted on March 6, science is ultimately not a numbers game. “As we all know, climate science is not a numbers game (there are heaps of signed statements by folks with advanced degrees on all sides of this issue),” Revkin wrote. Furthermore, a Canadian survey of scientists released on March 6, 2008 offered even more evidence that the alleged “consensus” is non-existent. A canvass of more than 51,000 scientists with the Association of Professional Engineers, Geologists and Geophysicists of Alberta (APEGGA) found 68% of them disagree with the statement that “the debate on the scientific causes of recent climate change is settled.” According to the survey, only 26% of scientists attributed global warming to “human activity like burning fossil fuels.” APEGGA’s executive director Neil Windsor said, “We’re not surprised at all. There is no clear consensus of scientists that we know of.”

During the conference, scientists revealed the lack of tolerance science journals and institutions have exhibited for skeptical climate views. “We [fellow skeptical scientists] talked mostly of work and upcoming papers and went through the standard ritual of griping about journal editors and the ridiculous hoops we sometimes have to jump through to get papers published. But some of the guys had absolute horror stories of what happened to them when they tried getting papers published that explored non-‘consensus’ views. Really outrageous and unethical behavior on the parts of some editors. I was shocked,” wrote conference participant Dr. William M. Briggs, a climate statistician who serves on the American Meteorological Society’s Probability and Statistics Committee and is an Associate Editor of Monthly Weather Review, on his blog on March 4.

Prominent Hungarian physicist Dr. Miklós Zágoni, a former global warming activist who recently reversed his views about man-made climate fears and is now a skeptic, presented scientific findings at the conference refuting rising CO2 fears. Zágoni’s scientific mentor, Hungarian atmospheric physicist Dr. Ferenc Miskolczi, resigned from his post working with NASA because he was disgusted with the agency’s lack of scientific freedom. Miskolczi, who also presented his peer-reviewed findings at the conference, said he wanted to release his new research showing that “runaway greenhouse theories contradict energy balance equations,” but he claims NASA refused to allow him. “Unfortunately, my working relationship with my NASA supervisors eroded to a level that I am not able to tolerate. My idea of the freedom of science cannot coexist with the recent NASA practice of handling new climate change related scientific results,” Miskolczi said according to a March 6 Daily Tech article.

American River
“Global warming has been tremendously over-hyped;
most of the climate change we’ve seen is natural.
I think we are brainwashing our children terribly.”
Dr. William Gray

Meteorologist Joseph D’Aleo, the first Director of Meteorology at The Weather Channel and former chairman of the American Meteorological Society’s (AMS) Committee on Weather Analysis and Forecasting, noted that many of his scientific colleagues did not attend the conference because they “feared their attendance might affect their employment.” D’Aleo described the fear of retribution many skeptics face as a “sad state of affairs.” But D’Aleo noted that he believes there is “very likely a silent majority of scientists in climatology, meteorology, and allied sciences who do not endorse what is said to be the ‘consensus’ position.”

Other scientists have echoed these claims. Atmospheric scientist Dr. Nathan Paldor, Professor of Dynamical Meteorology and Physical Oceanography at the Hebrew University of Jerusalem, author of almost 70 peer-reviewed studies, asserted in December 2007 that skeptics have a much harder time publishing in peer-reviewed literature. “Many of my colleagues with whom I spoke share these views and report on their inability to publish their skepticism in the scientific or public media,” Paldor, who was not in attendance at the New York conference, wrote in December. In February 2008, Canadian environmentalist David Suzuki publicly called for politicians skeptical of a man-made climate “crisis” to be thrown “into jail because what they’re doing is a criminal act.”  – See also July 2007 comprehensive report detailing how skeptical scientists have faced threats and intimidation.

Many prominent scientists participating and attending were very impressed by the New York City climate conference. Hurricane researcher and meteorologist Stanley B. Goldenberg of NOAA (National Oceanic and Atmospheric Administration) in Miami praised the Heartland Institute-sponsored conference. “The fact is that this conference is evidence that there are numerous respected, established and in many cases world-renowned scientists who have done careful research in various areas of ‘climate change’ that sharply differ with the [UN] IPCC results,” Goldenberg told the New York Times. Meteorologist D’Aleo had nothing but praise for the conference. “It was the best climate conference I have attended in my 30 years in the professional societies. The two-day meeting featured over 100 excellent presentations made by scientists from Australia, Canada, England, France, Hungary, New Zealand, Poland, Russia, Sweden, and of course the United States,” D’Aleo wrote on his website IceCap.US on March 4. The oft-repeated notion of “hundreds” or even “thousands” of scientists affiliated with the UN agreeing to a single “consensus” does not hold up to scrutiny. Out of all the scientists affiliated with the UN, only 52 participated in the UN IPCC Summary for Policymakers, which had to adhere to the wishes of the UN political leaders and delegates in a process described as more closely resembling a political party’s convention platform battle than a scientific process. (See: Inhofe Debunks So-Called ‘Consensus’ On Global Warming, and the Inhofe EPW Blog.)

Many current and former UN scientists disagree with the IPCC Summary for Policymakers and many of them attended the skeptical climate conference in New York. In addition, the so-called “consensus” statements by scientific groups like the National Academy of Sciences, the American Meteorological Society, and the American Geophysical Union are only voted on by two dozen or so governing board members with no direct vote cast by rank-and-file scientists.

D’Aleo addressed the complaints of some mainstream media reporters who noted that the climate conference did not produce a focused scientific message, but instead posited multiple explanations of climate changes. “There was a variety of opinions as there should be in science and all were tolerated. There was no group think or stagnant thinking as we find at other so called Climate Conferences,” D’Aleo wrote. Why would the media ever expect a uniform scientific message at a large climate conference? It appears that reporters need to be reminded that the UN IPCC (after all, it is the InterGOVERNMENTAL Panel on Climate Change) is the unusual event, not the international climate conference in New York City this past week. It is true that the skeptical conference presented an array of scientific views, but reporters should not be surprised by this diversity. Instead, the question for reporters should be: Why do UN IPCC climate events have such conformity and a lack of dissent? Many reporters are simply accustomed to attending virtually scripted UN IPCC Summary for Policymakers meetings, which reach the predetermined “consensus” that mankind is driving a climate crisis. (To read more about how the IPCC Summary for Policymakers has been tainted by politics, please see this article.)

Media Coverage of Conference

The climate conference garnered the attention of many media outlets including:

  • New York Times
  • BBC
  • Washington Post
  • ABC News
  • Associated Press
  • Reuters
  • China Post
  • CNN
  • New York Sun
  • Fox News
  • Times of India
  • The Czech Republic’s Ceske Noviny
  • Investor’s Business Daily
  • Canada’s Financial Post
  • United Press International
  • Wall Street Journal

Some of the mainstream media coverage reached bottom quickly. See: CNN’s Miles O’Brien accuses scientists at conference of being “Flat Earthers.”

Some of the mainstream media coverage, including several articles in the New York Times, presented fair coverage:

  • Global-Warming Skeptics Convene in N.Y.
  • Lessons from the Skeptics’ Conference
  • Skeptics on Human Climate Impact Seize on Cold Spell

Despite many mainstream media outlets’ efforts to mock the gathering, it was a semi-victory for the conference that reporters like Miles O’Brien of CNN and Bill Blakemore of ABC News even showed up. For info on O’Brien’s past climate reporting, see here. For info on Blakemore’s past climate reporting, see here. The Business and Media Institute (BMI) also released during the conference a comprehensive study of how the news media report on global warming. The report, titled “Global Warming Censored,” found that network TV news stifles debate and relies on “politicians, rock stars and men-on-the-street for science” reporting. BMI also critiqued the news media coverage of the International Conference on Climate Change in a piece titled “Mainstream media’s mockery.” American Thinker weighed in with a very comprehensive report from the conference. [Note: For a comprehensive sampling of the media coverage of the conference, see part two of this report here.]

"There is no evidence that CO2 has ever driven or will
ever drive world temperatures and climate change.
Consequently, worrying about CO2 is irrelevant."
Astrophysicist Piers Corbyn

Funding Myths Exposed

One of the most incisive articles about the conference came from John Tierney of the New York Times. Tierney exposed the erroneous notion that "industry" funding fuels climate skepticism. "Do the critics really think there's more money and glory to be won by doubting global warming than by going along with the majority? I ask this question not because I doubt the integrity or competence of the researchers and environmental groups who are getting billions of dollars from government agencies, corporations, foundations and private donors concerned about climate change," Tierney wrote on March 6. An August 2007 report detailed how proponents of man-made global warming fears enjoy a monumental funding advantage over skeptical scientists. Tierney quoted Joseph Bast, Heartland's president, stating: "Donations from energy companies have never amounted to more than 5 percent of our budget in any year, and there is no corporate sponsor underwriting any of this conference." Tierney also presented the case that so-called global warming "solutions" are money makers for many. "A cap-and-trade system for curbing carbon emissions (the kind criticized at this week's conference) is popular in Washington in no small part because of corporate lobbyists who see a chance to make money from the carbon credits," he wrote. "And there's lots of money to be doled out to researchers studying climate change and new energy technologies," he added.

Dissenters of Climate Fears Growing in Number

The New York City conference of dissenting scientists comes after many declared 2007 the "tipping point" for climate alarmism and referred to it as the year man-made global warming fears "bit the dust" as an abundance of peer-reviewed studies countered rising CO2 fears. Many of the scientists featured in the December 2007 U.S. Senate Minority Report of over 400 scientists attended the conference. The skeptical scientists at the conference presented diverse views on climate change, but generally they rallied around several key points: 1) The Earth is currently well within natural climate variability. 2) Almost all climate fear is generated by unproven computer model predictions. 3) An abundance of peer-reviewed studies continue to debunk rising CO2 fears. And 4) "Consensus" has been manufactured for political, not scientific, purposes.

In such nations as:

  • Germany
  • Brazil
  • Netherlands
  • Russia
  • Argentina
  • New Zealand
  • Portugal
  • France

Groups of scientists have recently spoken out to oppose and debunk man-made climate fears. In addition, many scientists who are also progressive environmentalists believe climate fear promotion has "co-opted" and "hijacked" the green movement.

Former Vice President Al Gore's film An Inconvenient Truth and the UN IPCC (Intergovernmental Panel on Climate Change) reports have prompted many skeptical scientists to speak out and join the growing resistance. "Al Gore brought me back to the battle and prompted me to do renewed research in the field of climatology. And because of all the misinformation that Gore and his army have been spreading about climate change I have decided that 'real' climatologists should try to help the public understand the nature of the problem," Climatologist Robert Durrenberger, past president of the American Association of State Climatologists, declared in May 2007. Since the release of the December 20 Senate minority report detailing the hundreds of skeptics, a steady stream of scientists from around the world has continued to declare themselves dissenters of the alleged "climate crisis." Just days before the international climate conference began, the first woman in the world to receive a PhD in meteorology, Dr. Joanna Simpson, declared she was "skeptical" of catastrophic man-made warming. "Since I am no longer affiliated with any organization nor receiving any funding, I can speak quite frankly," Simpson, formerly of NASA, who has authored more than 190 studies, wrote in a public letter on February 27. Simpson was described by former Colorado State Climatologist Roger Pielke, Sr. as "among the most preeminent scientists of the last 100 years." "The main basis of the claim that man's release of greenhouse gases is the cause of the warming is based almost entirely upon climate models. We all know the frailty of models concerning the air-surface system. We only need to watch the weather forecasts," Simpson explained. "But as a scientist I remain skeptical," she added.

Creek Bank
"If we are facing a crisis at all, I think it is that we
are preparing for warming when it is looking like we
are cooling. We are preparing for the wrong thing."
Dr. Timothy Ball

Number of Skeptical Scientists Continues to Grow

Also last week, Geologist William F. McClenney, a California Licensed Professional Geologist and former Certified Environmental Auditor in Victoria, Australia, announced that he had reversed his views about man-made global warming. McClenney now says he has done "the math and realized that you just can't get to global warming with CO2." See February 28, 2008 full statement here. McClenney joins other scientists who recently converted from believer to skeptic of man-made climate fears, see full article here. Geologist Dr. Don Easterbrook, an emeritus professor of geology at Western Washington University who has authored eight books and 150 journal publications, announced earlier this week that he was putting his "reputation on the line" by predicting global cooling. "The average of the four main temperature measuring methods is slightly cooler since 2002 (except for a brief el Niño interruption) and record breaking cooling this winter. The argument that this is too short a time period to be meaningful would be valid were it not for the fact that this cooling exactly fits the pattern of timing of warm/cool cycles over the past 400 years," Easterbrook wrote on March 1, 2008. Atmospheric Scientist Dr. Art V. Douglas, the recently retired Chair of the Atmospheric Sciences Department at Creighton University in Omaha, Nebraska, and author of numerous papers for peer-reviewed publications, also publicly announced his dissent from man-made climate fears in February 2008. "Whatever the weather," Douglas said, "it's not being caused by global warming." Atmospheric Physicist James A. Peden, formerly of the Space Research and Coordination Center in Pittsburgh and a founding member of the American Society for Mass Spectrometry, also announced his skepticism on February 18, 2008. "Sorry folks, but we're not exactly buying into the Global Hysteria just yet. We know a great deal about atmospheric physics, and from the onset, many of the claims were just plain fishy," Peden wrote.
In January 2008, environmental scientist Professor Delgado Domingos of Portugal, the founder and director of the Numerical Weather Forecast group, announced publicly that he considered CO2-related climate fears to be "dangerous nonsense." Domingos, who retired in 2006, has more than 150 published articles in the research fields of Thermodynamics, Numerical Methods in Fluid Mechanics and Meteorological Forecast. "There are measurable climate changes but there is also an enormous manipulation in reducing everything to CO2 and equivalents. The main gas producing the greenhouse effect is water vapor. The present alarm on climate change is an instrument of social control, a pretext for major businesses and political battle. It became an ideology, which is concerning," Domingos said. Physics professor Dr. Frederick Wolf of Keene State College in New Hampshire declared himself a dissenter of man-made climate fears in January 2008. Wolf has taught meteorology and climatology courses for the past 25 years and will be undertaking a sabbatical project on global warming. "Several things have contributed to my skepticism about global warming being due to human causes. We all know that the atmosphere is a very complicated system," Wolf said. "I am impressed by the number of scientific colleagues who are naturally skeptical about the conclusion of human induced warming," he added.

Lamenting Use of Term "Consensus"

The number of scientists now publicly dissenting from Gore's and the UN's view of climate change has become so overwhelming that promoters of man-made climate fears now lament the use, or "overuse," of the term "consensus" in the public discussion of global warming. "I do think the scientific community, the progressive community, environmentalists and media are making a serious mistake by using the word 'consensus' to describe climate change impacts," wrote Joseph Romm of Climate Progress in a February 27, 2008 commentary. [Note: Despite the growing scientific dissent and the increasing number of peer-reviewed studies which debunk rising CO2 fears, Romm now advocates that the term "consensus" be dropped in favor of a stronger term to promote man-made climate fears.] In addition, at least one scientist publicly pondered reconsidering his view of man-made climate fears after the Senate report of 400 scientists was released in December. "It (the Senate 400 scientists report) got me thinking: I'm an environmental scientist, but I've never had time to review the 'evidence' for the anthropogenic causes of global warming," wrote environmental scientist Professor Rami Zurayk of the American University in Beirut on December 27, 2007. "When I said, in my opening speech for the launch of UNEP's (United Nations Environment Program) Global Environment Outlook-4 in Beirut: 'There is now irrevocable evidence that climate change is taking place...' I was reading from a statement prepared by UNEP. Faith-based science it may be, but who has time to review all the evidence? I'll continue to act on the basis of anthropogenic climate change, but I really need to put some more time into this," Zurayk wrote.

Nature, Not Human Activity, Rules the Climate

The Heartland Institute's international climate conference built on the momentum of the growing number of skeptics as it showcased a new report by a team of international scientists who formed a group to counter the UN IPCC, called the "Nongovernmental International Panel on Climate Change" (NIPCC). The skeptical scientists' report was titled "Nature, Not Human Activity, Rules the Climate." Key findings of the NIPCC's report: 1) Most of climate change is caused by natural forces, 2) The human contribution is not significant, 3) Solar-activity changes are the main cause of climate change. Climate Scientist Dr. S. Fred Singer, former director of the US Weather Satellite Service and past vice chairman of the U.S. National Advisory Committee on Oceans and Atmosphere, told the conference that the IPCC "chose to ignore these facts, because they conflicted with the conclusion that global warming is anthropogenic (man-made)."

Not a global crisis

The International Climate Conference in New York also featured hundreds of climate experts from around the world, who issued a March 4 "Manhattan Declaration" on man-made global warming, stating in part: "1) There is no convincing evidence that CO2 emissions from modern industrial activity has in the past, is now, or will in the future cause catastrophic climate change, 2) Attempts by governments to inflict taxes and costly regulations on industry and individual citizens with the aim of reducing emissions of CO2 will pointlessly curtail the prosperity of the West and progress of developing nations without affecting climate, and 3) Human-caused climate change is not a global crisis." The declaration resolved that "scientific questions should be evaluated solely by the scientific method."

"Warming Island" Not So New

Former Virginia State Climatologist Dr. Patrick Michaels addressed the conference and debunked fears of unprecedented melting in Greenland. Michaels noted the recent media hype about the "discovery" of a "new" island in Greenland, dubbed "Warming Island," said to have been exposed by melting glaciers. (See April 2007 article titled: "An island made by global warming.") But Michaels ridiculed the claim that the island was "new" by citing a 1957 book called "Arctic Riviera" by Swiss explorer Ernst Hofer, which featured an illustration clearly depicting the same island in the early 1950s. Michaels noted that Greenland temperatures were as warm or warmer in the 1930s and 1940s than today's temperatures. [See July 30, 2007 Report - Latest Scientific Studies Refute Fears of Greenland Melt]

“There are numerous respected, established and in many cases world-renowned scientists who have done careful research in various areas of ‘climate change’ that sharply differ with the IPCC results.”
– Stanley B. Goldenberg

Sampling of Key Quotes from Scientists Participating in the 2008 International Conference on Climate Change:

Former UN Scientist Dr. Paul Reiter of the Pasteur Institute in Paris (who resigned from UN IPCC in protest): “As far as the science being ‘settled,’ I think that is an obscenity. The fact is the science is being distorted by people who are not scientists.”

UN IPCC scientist Vincent Gray of New Zealand: “This conference demonstrates that the [scientific] debate is not over. The climate is not being influenced by carbon dioxide.”

Canadian Climatologist Dr. Timothy Ball: “If we are facing [a crisis] at all, I think it is that we are preparing for warming when it is looking like we are cooling. We are preparing for the wrong thing.”

Climate researcher Dr. Craig Loehle, formerly of the Department of Energy Laboratories and currently with the National Council for Air and Stream Improvements, who has published more than 100 peer-reviewed scientific papers: “The 2000-year [temperature] trend is not flat, so a warming period is not unprecedented. [...] 1500-year [temperature] cycle as proposed by [Atmospheric physicist Fred] Singer and [Dennis] Avery is consistent with Loehle climate reconstruction. [...] 1500-year cycle implies that recent warming is part of natural trend.”

Hurricane expert and Meteorologist Dr. William Gray: “There are lots of skeptics out there, all over the U.S. and the rest of the world. [Global warming] has been over-hyped tremendously; most of the climate change we have seen is largely natural. I think we are brainwashing our children terribly.”

UK Astrophysicist Piers Corbyn: “There is no evidence that CO2 has ever driven or will ever drive world temperatures and climate change. The consequence of that is that worrying about CO2 is irrelevant. Our prediction is world temperatures will continue to decline until 2014 and probably continue to decline after that.”

Weather Channel founder and meteorologist John Coleman: “Serious scientists and serious students of global warming have concluded after a lot of effort that there is little basis for the thought that we are going to have catastrophic global warming.”

Dr. Benny Peiser of the Faculty of Science of Liverpool John Moores University in UK: “[Global warming cap-and-trade bills have] caused so much trouble in Europe. It’s not working, it’s never going to work. It won’t have any effect on the climate, but only that there will be more unemployed in Europe. If that helps the climate, perhaps that is a solution.”

Atmospheric physicist Ferenc Miskolczi, formerly with NASA’s Langley Research Center: “The runaway greenhouse effect is physically impossible. [...] The observed global warming has nothing to do directly with the greenhouse effect; it must be related to changes in the total absorbed solar radiation or dissipated heat from other natural or anthropogenic sources of thermal energy.”

Meteorologist Art Horn: “There are thousands of scientists around the world who believe that this issue is not settled. The climate is not being influenced by carbon dioxide.”

German Meteorologist Dr. Gerd-Rainer Weber: “Most of the extremist views about climate change have little or no scientific basis. The rational basis for extremist views about global warming may be a desire to push for political action on global warming.”

Physics Professor Emeritus Dr. Howard Hayden of the University of Connecticut: “The fluctuations in Earth’s temperature are caused by astronomical phenomena. The combined effects of all “greenhouse gases,” albedo changes, and other Earthly changes account for no more than about 3 degrees C of the changes during transitions between ice ages and interglacials.”

Climate statistician Dr. William M. Briggs, who serves on the American Meteorological Society’s Probability and Statistics Committee and is an Associate Editor of Monthly Weather Review: “It is my belief that the strident and frequent claims of catastrophes caused by man-made global warming are stated with a degree of confidence not warranted by the data. [...] Too many people are too confident about too many things. That was the simple message of the Heartland conference, and one that I hope sinks in.”

Weblink to part two of Marc Morano’s report on the 2008 International Conference on Climate Change

About the Author: Marc Morano is communications director for the Republicans on the U.S. Senate Committee on Environment and Public Works. Morano commenced work with the committee under Senator James Inhofe, who was majority chairman of the committee until January 2007 and is now minority ranking member. In December 2006 Morano launched a blog on the committee’s website that largely promotes the views of climate change skeptics. This article is reprinted here with permission from the author.

Additional EcoWorld features on Global Warming:

  • A Case Against Climate Alarmism, Dr. Richard Lindzen
  • 35 Inconvenient Truths, Lord Christopher Monckton
  • Interview with Roger Pielke, Sr., EcoWorld Exclusive
  • Glacial Acceleration, Paul Brown
  • Global Warming Priorities, Dr. Edward Wheeler
  • Rebuttal to Inconvenient Truth, Marlo Lewis
  • Inconvenient Skeptics, D. James Guzy
  • Global Warming Facts, Dr. Richard Lindzen
  • Is There a Basis for Global Warming Alarm?, Dr. Richard Lindzen
  • Global Warming Alarm?, Dr. Edward Wheeler
  • Global Warming Posts, EcoWorld Editor’s Blog
EcoWorld - Nature and Technology in Harmony

Posted in Art, Causes, Effects Of Air Pollution, Energy, Fish, Global Warming & Climate Change, History, Literature, Other, Radiation, Solar, Tidal | 3 Comments

35 Inconvenient Truths: The Errors in Al Gore's Movie

Polar Bear
Is climate change endangering the Polar Bear?

Editor’s Note: When you strip away the ideology, the truth still matters; not just for balance but for integrity, we continue to post features like this. The denial industry is not going to go away until the truth is known, and truth can withstand skepticism. And what if the skeptics are right?

In October 2007, a British judge ruled that the movie “An Inconvenient Truth” contained nine inaccuracies. Shortly thereafter, in reference to this movie, another Briton, Christopher Monckton, wrote “35 Inconvenient Truths,” republished here by EcoWorld with permission. Not nine, but 35 inaccuracies. In reading this compilation you have to wonder whether we aren’t getting carried away. How many sweeping political and economic mandates will come of this? How many civil suits? How many regulations, subsidies, taxes, and trades?

Reading this feature – which certainly makes several valid points – raises the question: what else? Is every weather event imbued with inflections of doom and guilt, its numerical or factual basis unquestioned, the inflections informed by emotion instead of due diligence? Is climate-change alarm influencing reporting on business and politics? Should someone believe in anthropogenic global warming, “AGW,” simply because all the collective action we may take on its behalf is good? Maybe yes, and maybe not.

Because even if AGW is real, would unleashing the power of free enterprise to adapt to changing climate realities be a better use of resources than trying to eliminate combustion through massive new transfers of wealth from the private sector to the public sector? In our view, $100 per barrel of oil is a sufficient incentive for alternative energy to have a chance. Further, eliminating subsidies for fossil fuel should come before new taxes and subsidies to develop alternative energy. Reforming the public sector should come before any new taxes.

One of Monckton’s points, #30, deserves highlighting – like many of us, he rejects the position that CO2 is pollution. Without CO2, plants could not perform photosynthesis, which is necessary for plants to grow and generates oxygen for humans to breathe. Plants cannot breathe without CO2. For such a fundamental misconception to enter into law via the U.S. Supreme Court ought to alert anyone to the fact that something is wrong here. Let the gardens of private land and the gardens of public discourse adapt and benefit from this truth: CO2 is life, and airborne toxic molecules and particulates are something else altogether. In that spirit, on with the story. – Ed “Redwood” Ring

35 Inconvenient Truths – The Errors in Al Gore’s Movie
by Christopher Monckton, December 15, 2007
Al Gore Delivering Climate Change Presentation
Al Gore delivering his famous presentation.
But are his arguments really beyond debate?

In October 2007 the High Court in London identified nine “errors” in the movie An Inconvenient Truth. The judge had stated that, if the UK Government had not agreed to send to every secondary school in England a corrected guidance note, he would have made a finding that the Government’s distribution of the film to all English secondary schools had been an unlawful contravention of an Act of Parliament prohibiting the political indoctrination of children.

A Gore spokesperson and “environment advisor,” Ms. Kalee Kreider, has issued a questionable response to this news. She begins by saying that the film presented “thousands and thousands of facts.” It did not: just 2,000 “facts” in 93 minutes would have been one fact every three seconds. The film contained only a few dozen points, most of which will be seen to have been substantially inaccurate. The judge concentrated only on nine points which even the UK Government, to which Gore is a climate-change advisor, had to admit did not represent mainstream scientific opinion.

Ms. Kreider then states, incorrectly, that the judge himself had never used the term “errors.” In fact, the judge used the term “errors,” in inverted commas, throughout his judgment.

Next, Ms. Kreider makes some unjustifiable ad hominem attacks on Mr. Stewart Dimmock, the lorry driver, school governor and father of two school-age children who was the plaintiff in the case. This memorandum, however, will eschew any ad hominem response, and will concentrate exclusively on the 35 scientific inaccuracies and exaggerations in Gore’s movie.

Ms. Kreider then says, “The process of creating a 90-minute documentary from the original peer-reviewed science for an audience of moviegoers in the U.S. and around the world is complex.” However, the single web-page entitled “The Science” on the movie’s official website contains only two references to articles in the peer-reviewed scientific journals. There is also a reference to a document of the IPCC, but its documents are not independently peer-reviewed in the usual understanding of the term.

Ms. Kreider then says, “The judge stated clearly that he was not attempting to perform an analysis of the scientific questions in his ruling.” He did not need to. Each of the nine “errors” which he identified had been admitted by the UK Government to be inconsistent with the mainstream of scientific opinion.

Ms. Kreider says the IPCC’s results are sometimes “conservative,” and continues: “Vice President Gore tried to convey in good faith those threats that he views as the most serious.” Readers of the long list of errors described in this memorandum will decide for themselves whether Mr. Gore was acting in good faith. However, in this connection it is significant that each of the 35 errors listed below misstates the conclusions of the scientific literature or states that there is a threat where there is none or exaggerates the threat where there may be one. All of the errors point in one direction – towards undue alarmism. Not one of the errors falls in the direction of underestimating the degree of concern in the scientific community. The likelihood that all 35 of the errors listed below could have fallen in one direction purely by inadvertence is less than 1 in 34 billion.
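The 1-in-34-billion figure is simple binomial arithmetic: treating each of the 35 errors as an independent coin flip, equally likely to fall toward or away from alarm, the chance that all 35 fall the same way is 0.5 to the 35th power, i.e. 1 in 2^35. A minimal check (our own sketch of the calculation, with the independence assumption stated explicitly):

```python
# Sketch of the binomial reasoning behind the "1 in 34 billion" figure:
# assume each of the 35 errors independently had a 50:50 chance of
# falling in either direction; then the probability that all 35 fall
# the same (alarmist) way is 0.5**35.
p_all_one_way = 0.5 ** 35
odds_against = 1 / p_all_one_way   # 2**35 = 34,359,738,368

print(f"about 1 in {odds_against:,.0f}")  # about 1 in 34,359,738,368
```

Since 2^35 is slightly above 34 billion, the probability is indeed "less than 1 in 34 billion" as stated.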

We now itemize 35 of the scientific errors and exaggerations in Al Gore’s movie. The first nine were listed by the judge in the High Court in London in October 2007 as being “errors.” The remaining 26 errors are just as inaccurate or exaggerated as the nine spelt out by the judge, who made it plain during the proceedings that the Court had not had time to consider more than these few errors. The judge found these errors serious enough to require the UK Government to pay substantial costs to the plaintiff.

#1 – Sea Level Rising Six Meters:

Gore says that a sea-level rise of up to 6 m (20 ft) will be caused by melting of either West Antarctica or Greenland. Though Gore does not say that the sea-level rise will occur in the near future, the judge found that, in the context, it was clear that this is what he had meant, since he showed expensive graphical representations of the effect of his imagined 6 m (20 ft) sea-level rise on existing populations, and he quantified the numbers who would be displaced by the sea-level rise.

The IPCC says sea-level increases up to 7 m (23 ft) above today’s levels have happened naturally in the past climate, and would only be likely to happen again after several millennia. In the next 100 years, according to calculations based on figures in the IPCC’s 2007 report, these two ice sheets between them will add a little over 6 cm (2.5 inches) to sea level, not 6 m (this figure of 6 cm is 15% of the IPCC’s total central estimate of a 43 cm or 1 ft 5 in sea-level rise over the next century). Gore has accordingly exaggerated the official sea-level estimate by nearly 10,000 percent.
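The "nearly 10,000 percent" figure follows directly from the two numbers in the paragraph: 6 m versus a little over 6 cm is roughly a 100-fold difference, which expressed as a percentage is about 10,000%. A quick arithmetic check (variable names are ours):

```python
gore_rise_cm = 600      # Gore's 6 m (20 ft) sea-level rise, in centimetres
ipcc_rise_cm = 6        # the article's figure for the two ice sheets' contribution
ipcc_total_cm = 43      # IPCC 2007 central estimate for total rise by 2100

exaggeration_pct = gore_rise_cm / ipcc_rise_cm * 100   # 100-fold = 10,000 percent
ice_sheet_share = ipcc_rise_cm / ipcc_total_cm * 100   # ~14%, roughly the "15%" cited

print(exaggeration_pct, round(ice_sheet_share))  # 10000.0 14
```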

Ms. Kreider says the IPCC estimates a sea-level rise of “59 cm” by 2100. She fails to point out that this amounts to less than 2 ft, not the 20 ft imagined by Gore. She also fails to point out that this is the IPCC’s upper estimate, on its most extreme scenario. And she fails to state that the IPCC, faced with a stream of peer-reviewed articles stating that sea-level rise is not a threat, has reduced this upper estimate from 3 ft in 2001 to less than 2 ft (i.e. half the mean centennial sea-level rise that has occurred since the end of the last Ice Age 10,000 years ago) in 2007.

Ms. Kreider says the IPCC’s 2007 sea-level calculations excluded contributions from Greenland and West Antarctica because they could not be quantified. However, Table SPM1 of the 2007 report quantifies the contributions of these two ice-sheets to sea-level rise as representing about 15% of the total change.

Graph of Sea Level and Carbon Used

The report also mentions the possibility that there may be an unquantified further contribution in future from these two ice sheets arising from “dynamical ice flow.” However, the Greenland ice sheet rests in a depression in the bedrock created by its own weight, wherefore “dynamical ice flow” is impossible, and the IPCC says that temperature would have to be sustained at more than 5.5 degrees C above its present level for several millennia before half the Greenland ice sheet could melt, causing sea level to rise by some 3 m (10 ft).

Finally, the IPCC’s 2007 report estimates that the likelihood that humankind is having any influence on sea level at all is little better than 50:50.

The judge was accordingly correct in finding that Gore’s presentation of the imagined imminent threat of a 6 m (20 ft) sea-level rise, with his account of the supposed impact on the present-day populations of Manhattan, the Netherlands, Bangladesh, etc., etc, was not a correct statement of the mainstream science on this question.

#2 – Pacific islands “drowning”

Gore says low-lying inhabited Pacific coral atolls are already being inundated because of anthropogenic global warming, leading to the evacuation of several island populations to New Zealand. However, the atolls are not being inundated, except where dynamiting of reefs or over-extraction of fresh water by local populations has caused damage.

Furthermore, corals can grow at ten times the predicted rate of increase in sea level. It is not by some accident or coincidence that so many atolls reach just a few feet above the ocean surface.

Ms. Kreider says, “The IPCC estimates that 150 million environmental refugees could exist by the year 2050, due mainly to the effects of coastal flooding, shoreline erosion and agricultural disruption.” However, the IPCC cannot be basing its estimate on sea-level rise, since even its maximum projected rise of just 30 cm (1 ft) by 2050 would not cause significant coastal flooding or shoreline erosion. There are several coastlines (the east coast of England, for instance) where the land is sinking as a consequence of post-ice-age isostatic recovery, or where (as in Bangladesh) tectonic subduction is similarly causing the land to sink. But such natural causes owe nothing to sea-level rise.

There have been no mass evacuations of populations of islanders as suggested by Gore, though some residents of Tuvalu have asked to be moved to New Zealand, even though the tide-gauges maintained until recently by the National Tidal Facility of Australia show a mean annual sea-level rise over the past half-century equivalent to the thickness of a human hair. The problem with the Carteret Islands, mentioned by Ms. Kreider, arose not because of rising sea levels but because of imprudent dynamiting of the reefs by local fishermen.

Black and White Tree

In the Maldives, a detailed recent study showed that sea levels were unchanged today compared with 1250 years ago, though they have been higher in much of the intervening period, and have very seldom been lower.

A well-established tree very close to the Maldivian shoreline and only inches above sea level was recently uprooted by Australian environmentalists anxious to destroy this visible proof that sea level cannot have risen very far.

#3 – Thermohaline circulation “stopping”

Gore says “global warming” may shut down the thermohaline circulation in the oceans, which he calls the “ocean conveyor,” plunging Europe into an ice age. It will not. A paper published in 2006 says: “Analyses of ocean observations and model simulations suggest that changes in the thermohaline circulation during the last century are likely the result of natural multidecadal climate variability. Indications of a sustained thermohaline circulation weakening are not seen during the last few decades. Instead, a strengthening since the 1980s is observed.”

Ms. Kreider, for Mr. Gore, says that “multiple scientists” have claimed that we cannot exclude the possibility of the disruption or shutdown of the Conveyor. Disruption, perhaps: shutdown, no. It is now near-universally accepted that the thermohaline circulation cannot be and will not be shut down by “global warming,” and the film should have been corrected to reflect the consensus.

#4 – CO2 “driving temperature”

Gore says that in each of the last four interglacial warm periods it was changes in carbon dioxide concentration that caused changes in temperature. It was the other way about. Changes in temperature preceded changes in CO2 concentration by between 800 and 2,800 years, as scientific papers, including the paper on which Gore’s film relied, had made clear.

Ms. Kreider says it is true that “greenhouse gas levels and temperature changes in the ice signals have a complicated relationship but they do fit.” This does not address Gore’s error at all. The judge found that Gore had very clearly implied that it was changes in carbon dioxide concentration that had led to changes in temperature in the palaeoclimate, when the scientific literature is unanimous (save only for a single paper by James Hansen, whom Gore trusts) to the effect that the relationship was in fact the other way about, with a carbon dioxide feedback contributing only a comparatively insignificant further increase to temperature after the temperature change had itself initiated a change in carbon dioxide concentration.

The significance of this error was explained during the court proceedings, and was accepted by the judge. Gore says that the 100 ppmv difference between carbon dioxide concentrations during ice-age temperature minima and interglacial temperature maxima represents “the difference between a nice day and a mile of ice above your head.” This would imply a CO2 effect on temperature about 10 times greater than that regarded as plausible by the consensus of mainstream scientific opinion (see Error 10).

Ms. Kreider refers readers to a “more complete description” available at a website maintained by, among others, two of the three authors of the now-discredited “hockey stick” graph that falsely attempted to abolish the Mediaeval Warm Period. The National Academy of Sciences in the US had found that graph to have “a validation skill not significantly different from zero” – i.e., the graph was useless.

#5 – Snows of Kilimanjaro “melting”

Gore says “global warming” has been melting the snows of Mount Kilimanjaro in Africa. It has not.

The melting of the Furtwangler Glacier at the summit of the mountain began 125 years ago. More of the glacier had melted before Hemingway wrote The Snows of Kilimanjaro in 1936 than afterward.

Temperature at the summit never rises above freezing, averaging −7 °C. The cause of the melting is long-term climate shifts exacerbated by imprudent regional deforestation, and has nothing to do with “global warming.”

Ms. Kreider says, “Every tropical glacier for which we have documented evidence shows that glaciers are retreating.” However, a recent survey of the glaciers in the tropical Andes shows that they were largely ice-free in the past 10,000 years, except on the very highest peaks. The mere fact of warming or melting, therefore, tells us nothing of the cause.

Ms. Kreider says, “Global warming exacerbates the stresses that ecosystems (and humans) are already experiencing.” However, since the temperature at the summit of Kilimanjaro remains below freezing and has not risen in 30 years, “global warming” is not “exacerbating the stresses” at the summit of Kilimanjaro.

#6 – Lake Chad “drying up”

Gore says “global warming” dried up Lake Chad in Africa. It did not. Over-extraction of water and changing agricultural patterns dried the lake, which was also dry in 8500BC, 5500BC, 1000BC and 100BC. Ms. Kreider says, “There are multiple stresses upon Lake Chad.” However, the scientific consensus is that at present those “stresses” do not include “global warming.”

#7 – Hurricane Katrina “man made”

Gore says Hurricane Katrina, which devastated New Orleans in 2005, was caused by “global warming.” It was not. It was caused by the failure of Gore’s party, in the administration of New Orleans, to heed 30 years of warnings by the Corps of Engineers that the levees – the dams that kept New Orleans dry – could not withstand a direct hit by a hurricane. Katrina was only Category 3 when it struck the levees. They failed, as the Engineers had said they would. Gore’s party, not “global warming,” was to blame for the consequent death and destruction.

Graph of the Number of Hurricanes that Make Landfall from 1900 to 2000

Ms. Kreider says, “Mr. Gore has never addressed the issue of climate change and hurricane frequency.” What Gore actually says, however, addresses the frequency not only of hurricanes but also of typhoons and tornadoes:

“We have seen in the last couple of years, a lot of big hurricanes. Hurricanes Jean, Francis and Ivan were among them. In the same year we had that string of big hurricanes; we also set an all time record for tornadoes in the United States. Japan again didn’t get as much attention in our news media, but they set an all time record for typhoons. The previous record was seven. Here are all ten of the ones they had in 2004.”

Graph of Hurricane Wind Speed and Number of Hurricanes that Make Landfall

For the record, however, the number of Atlantic hurricanes shows no trend over the past half century; the number of typhoons has fallen throughout the past 30 years; the number of tornadoes has risen only because of better detection systems for smaller tornadoes; but the number of larger tornadoes in the US has fallen.

#8 – Polar bear “dying”

Gore says a scientific study shows that polar bears are being killed swimming long distances to find ice that has melted away because of “global warming.” They are not. The study, by Monnett & Gleason (2005), mentioned just four dead bears. They had died in an exceptional storm, with high winds and waves in the Beaufort Sea. The amount of sea ice in the Beaufort Sea has grown over the past 30 years. A report for the World Wide Fund for Nature shows that polar bears, which are warm-blooded, have grown in numbers where temperature has increased, and have become fewer where temperature has fallen. Polar bears evolved from brown bears 200,000 years ago, and survived the last interglacial period, when global temperature was 5 degrees Celsius warmer than the present and there was probably no Arctic ice-cap at all. The real threat to polar bears is not “global warming” but hunting. In 1940, there were just 5,000 polar bears worldwide. Now that hunting is controlled, there are 25,000.

Ms. Kreider says sea-ice “was the lowest ever measured for minimum extent in 2007.” She does not say that the measurements, which are done by satellite, go back only 29 years. She does not say that the North-West Passage, a good proxy for Arctic sea-ice extent, was open to shipping in 1945, or that Amundsen passed through in a sailing vessel in 1903.

#9 – Coral reefs “bleaching”

Gore says coral reefs are “bleaching” because of “global warming.” They are not. There was some bleaching in 1998, but this was caused by the exceptional El Nino Southern Oscillation that year. Two similarly severe El Ninos over the past 250 years also caused extensive bleaching. “Global warming” was nothing to do with it.

Ms. Kreider says, “The IPCC and other scientific bodies have long identified increases in ocean temperatures with the bleaching of coral reefs.” So they have: but the bleaching in 1998 occurred as a result not of “global warming” but of a rare, though not unique, severe El Nino Southern Oscillation.

#10 – 100 ppmv of CO2 “melting mile-thick ice”

Gore implies that the difference of just 100 parts per million by volume in CO2 concentration between an interglacial temperature maximum and an ice-age temperature minimum causes “the difference between a nice day and having a mile of ice above your head.” It does not. Gore’s implication has the effect of overstating the mainstream consensus estimate of the effect of CO2 on temperature at least tenfold.

Temperature changes by up to 12 degrees C between glacial minima and interglacial maxima, but CO2 concentration changes by no more than 100 ppmv. Gore is accordingly implying that 100 ppmv can cause a temperature increase of up to 12 degrees C. However, the consensus as expressed by the IPCC is that a 100 ppmv increase in CO2 concentration, from 180 to 280 ppmv, would increase radiant energy flux in the atmosphere by 2.33 watts per square meter, producing less than 1.2 degrees Celsius of warming even when the effect of temperature feedbacks is included.
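The 2.33 W/m² value can be checked against the logarithmic forcing approximation commonly used in IPCC-era literature, ΔF ≈ 5.35 ln(C/C₀) W/m². The coefficient 5.35 and the sensitivity parameter below are standard illustrative values we supply, not figures taken from Monckton's text:

```python
import math

def co2_forcing_w_m2(c_ppmv: float, c0_ppmv: float, alpha: float = 5.35) -> float:
    """Radiative forcing from a CO2 change, using dF = alpha * ln(C/C0)."""
    return alpha * math.log(c_ppmv / c0_ppmv)

dF = co2_forcing_w_m2(280, 180)   # glacial-to-interglacial CO2 step
print(round(dF, 2))               # 2.36 -- close to the 2.33 W/m^2 quoted

# Converting forcing to warming requires a climate-sensitivity parameter
# lambda (K per W/m^2), which is the genuinely contested quantity; a
# lambda of about 0.5 reproduces the article's "less than 1.2 C" figure.
print(round(0.5 * dF, 2))         # 1.18
```

The small gap between 2.36 and the quoted 2.33 W/m² presumably reflects a slightly different coefficient or rounding in the original calculation.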

#11 – Hurricane Caterina “manmade”

Gore says that Hurricane Caterina, the only hurricane ever to strike the coast of Brazil, was caused by “global warming.” It was not. In 2004, Brazil’s summer sea surface temperatures were cooler than normal, not warmer. But air temperatures were the coldest in 25 years. The air was so much colder than the water that it caused a heat flux from the water to the air similar to that which fuels hurricanes in warm seas.

#12 – Japanese typhoons “a new record”

Gore says that 2004 set a new record for the number of typhoons striking Japan. It did not. The trend in the number of typhoons, and of tropical cyclones, has fallen throughout the past 50 years. The trend in rainfall from cyclones has also fallen, and there has been no trend in monsoon rainfall.

#13 – Hurricanes “getting stronger”

Galveston Hurricane of 1900

Gore says scientists have been warning that hurricanes will get stronger because of “global warming.” They will not. Over the past 60 years there has been no change in the strength of hurricanes, even though hydrocarbon use went up six-fold over the same period. Research by Dr. Kerry Emanuel, cited by Ms. Kreider, has been discredited by more recent findings that wind-shear effects tend to nullify the amplification of hurricane strength which he had suggested, and, of course, by the observed failure of hurricanes to gain strength during the past 60 years of “global warming.”

#14 – Big-storm insurance losses “increasing”

Hurricane Chart

Gore says insurance losses arising from large storms and other extreme-weather events are increasing, by implication because of “global warming.” They are not. Insured losses, as a percentage of the population of coastal areas in the path of hurricanes, were lower even in 2005 than they had been in 1925. In 2006, a very quiet hurricane season, Lloyd’s of London posted its biggest-ever profit: £3.6 billion.

#15 – Mumbai “flooding”

Monthly Total Rainfall Over Santa Cruz

Gore says flooding in Mumbai is increasing, by implication because of “global warming.” It is not. Rainfall trends at the two major weather stations in Mumbai show no increase in heavy rainfall over the past 48 years.

#16 – Severe tornadoes “more frequent”

Graph of the Number of Severe Tornadoes in the United States

Gore says that 2004 set an all-time record for tornadoes in the US. More tornadoes are being reported because detection systems are better than they were. But the number of severe tornadoes has been falling for more than 50 years.

#17 – The sun “heats the Arctic ocean”

Gore says that ice-melt allows the Sun to heat the Arctic Ocean, and a diagram shows the Sun’s rays heating it directly. It does not. The ocean emits radiant energy at the moment of absorption, and would freeze if there were no atmosphere. It is the atmosphere, not the Sun, that warms the ocean. Also, Gore’s diagram confuses the tropopause with the ionosphere, and he makes a number of other errors indicating that he does not understand the elementary physics of radiative transfer.

#18 – Arctic “warming fastest”

Ice Breakers

Gore says the Arctic has been warming faster than the rest of the planet. It has not. While it is in general true that during periods of warming (whether natural or anthropogenic) the Arctic will warm faster than other regions, Gore does not mention that the Arctic has been cooling over the past 60 years, and is now one degree Celsius cooler than it was in the 1940s. There was a record amount of snow cover in the Northern Hemisphere in 2001. Several vessels were icebound in the Arctic in the spring of 2007, but few newspapers reported this. The newspapers reported that the North-West Passage was free of ice in 2007, and said that this was for the first time since records began: but the records, taken by satellites, had only begun 29 years previously. The North-West Passage had also been open for shipping in 1945, and, in 1903, the great Norwegian explorer Amundsen had passed through it in a sailing ship.

#19 – Greenland ice sheet “unstable”

Greenland Change in Ice Sheet Elevation
Colors indicate ice-sheet elevation change rate in cm/year, based on satellite altimeter data, 1992-2003. The spatially averaged increase is 5.4 ± 0.2 cm/year.

Gore says “global warming” is making the Greenland ice sheet unstable. It is not. Greenland ice grows 2 inches a year. The Greenland ice sheet survived each of the previous three interglacial periods, each of which was 5 degrees Celsius warmer than the present. It survived atmospheric CO2 concentrations of up to 1000 ppmv (compared with today’s 400 ppmv). It last melted 850,000 years ago, when humankind did not exist and could not have caused the melting. There is a close correlation between variations in Solar activity and temperature anomalies in Greenland, but there is no correlation between variations in CO2 concentration and temperature changes in Greenland. The IPCC (2001) says that to melt even half the Greenland ice sheet would require temperature to rise by 5.5 degrees C and remain that high for several thousand years.

#20 – Himalayan glacial melt waters “failing”

Gore says 40% of the world’s population get their water supply from Himalayan glacial melt waters that are failing because of “global warming.” They don’t and they are not. The water comes almost entirely from snow-melt, not from ice-melt. Over the past 40 years there has been no decline in the amount of snow-melt in Eurasia.

#21 – Peruvian glaciers “disappearing”

Gore says that a Peruvian glacier is less extensive now than it was in the 1940s, implying that “global warming” is the cause. It is not. Except for the very highest peaks, the normal state of the Peruvian cordilleras has been ice-free throughout most of the past 10,000 years.

#22 – Mountain glaciers worldwide “disappearing”

Graph of Glacier Length and Metric Tons of Carbon Used

Gore says that “the ice has a story to tell, and it is worldwide.” He shows several before-and-after pictures of glaciers disappearing. However, the glacial melt began in the 1820s, long before humankind could have had any effect, and has continued at a uniform rate since, showing no acceleration since humankind began increasing the quantity of CO2 in the atmosphere. Total ice volumes in three of the last four Ice Ages were lower than they are today, and “global warming” had nothing to do with that.

#23 – Sahara desert “drying”

Sand Dune

Gore says terrible tragedies are occurring in the southern Sahara because of drought which he blames on “global warming.” There is no drought caused by “global warming.” In 2007 there were record rains across the whole of the southern Sahara. In the past 25 years the Sahara has shrunk by some 300,000 square kilometers because of additional rainfall. Some scientists think “global warming” may actually mitigate pre-existing droughts because there will be more water vapor in the atmosphere. Before 1200 AD there were frequent, prolonged and severe droughts in the Great Plains. Since 1200 AD, there has been more rainfall. Likewise, the US has had more rainfall since the 1950s than it had in the earlier part of the 20th Century, when the great droughts which were then common were described by John Steinbeck in The Grapes of Wrath. South African rainfall was also more stable in the second half of the 20th Century, when human effect on climate is said to have become significant, than in the first half.

#24 – West Antarctic ice sheet “unstable”

Antarctic Ice Sheet

Gore says disturbing changes have been measured under the West Antarctic ice sheet, implicitly because of “global warming.” Yet most of the recession in this ice sheet over the past 10,000 years has occurred in the absence of any sea-level or temperature forcing. In most of Antarctica, the ice is in fact growing thicker. Mean Antarctic temperature has actually fallen throughout the past half-century. In some Antarctic valleys, environmental damage has been caused by temperature decreases of up to 2 degrees Celsius. Antarctic sea-ice spread to a 30-year record extent in late 2007.

#25 – Antarctic Peninsula ice shelves “breaking up”

Gore says half a dozen ice shelves each “larger than Rhode Island” have broken up and vanished from the Antarctic Peninsula recently, implicitly because of “global warming.” Global warming is unlikely to have been the cause. Gore does not explain that the ice shelves have melted before, as studies of seabed sediments have shown. The Antarctic Peninsula accounts for about 2% of the continent, in most of which the ice is growing thicker. All the recently-melted shelves, added together, amount to an area less than one-fifty-fifth the size of Texas.

#26 – Larsen B Ice Shelf “broke up because of ‘global warming’”

Gore focuses on the Larsen B ice shelf, saying that it completely disappeared in 35 days. Yet there has been extensive ice-shelf break-up throughout the past 10,000 years, and the maximum ice-shelf extent may have been in the Little Ice Age in the late 15th century.

#27 – Mosquitoes “climbing to higher altitudes”

Gore says that, because of “global warming”, mosquitoes are climbing to higher altitudes. They are not. Most recent outbreaks have been at lower altitudes than those of a century or more ago. He says that Nairobi was founded 1,000 m above sea level so as to be above the mosquito line. It was not. In the period before anthropogenic warming could have had any significant effect, there were ten malaria outbreaks in Nairobi, one of which reached as far as Eldoret, almost 3,000 m above sea level. Malaria is not a tropical disease. Mosquitoes do not need tropical temperatures: they need no more than 15 degrees Celsius to breed. The largest malaria outbreak of modern times was in Siberia in the 1920s and 1930s, when 13 million were infected and 600,000 died, 30,000 of them as far north as Arkhangelsk, on the Arctic Circle. There is no reason to suppose that malaria will spread even if the climate continues to warm.

#28 – Many tropical diseases “spread through ‘global warming’”

Gore says that, as well as malaria, “global warming” is spreading dengue fever, Lyme disease, West Nile virus, arenavirus, avian flu, Ebola virus, E. coli O157:H7, hantavirus, legionella, leptospirosis, multi-drug-resistant TB, Nipah virus, SARS and Vibrio cholerae O139. It is doing no such thing. Only the first four diseases are vector-borne, but none is tropical. Of the other diseases named by Gore either in his film or in the accompanying book, not one is sensitive to increasing temperature. They are spread not by warmer weather but by rats, chickens, primates, pigs, poor hygiene, ill-maintained air conditioning, or cold weather.

#29 – West Nile virus in the US “spread through ‘global warming’”

Gore says that West Nile virus spread throughout the US in just two years, implicitly because of “global warming.” It did not. The climate in the US ranges from some of the world’s hottest deserts to some of its iciest tundra. West Nile virus flourishes in any climate. Warming of the climate, however caused, does not affect its incidence or prevalence.

#30 – Carbon dioxide is “pollution”

Gore describes carbon dioxide as “global warming pollution.” It is not. It is food for plants and trees. Tests have shown that even the most delicate plants flourish at concentrations 30 times those of the present day. Well-managed forests, such as those of the United States, are growing at record rates because the extra carbon dioxide in the atmosphere is feeding the trees. On a geological timescale, carbon dioxide is at a very low concentration at present. Half a billion years ago it was at 7000 parts per million by volume, about 18 times today’s concentration.

#31 – The European heat wave of 2003 “killed 35,000”

Gore says, “A couple of years ago in Europe they had that heat wave that killed 35,000.” Though some scientists agree with Gore, the scientific consensus is that extreme warm anomalies more unusual than the 2003 heat wave occur regularly; that extreme cold anomalies also occur regularly; that El Niño and volcanism appear to be of much greater importance than any general warming trend; and that there is little evidence that regional heat or cold waves are significantly increasing or decreasing with time. In general, warm is better than cold, which is why the largest number of life-forms are found in the tropics and the fewest at the poles. A cold snap in the winter following the European heat wave killed 20,000 in the UK alone. Though the IPCC says 150,000 people a year are being killed worldwide by “global warming,” it reaches this figure only by deliberately excluding the number of people who are not being killed because there is less cold weather. In the US alone, it has been estimated that 174,000 fewer people are being killed each year because there are fewer episodes of extreme cold.

#32 – Pied flycatchers “cannot feed their young”

Gore says “The peak arrival date for migratory birds 25 years ago was April 25. Their chicks hatched on June 3, just at the time when the caterpillars were coming out: Nature’s plan. But 20 years of warming later the caterpillars peaked two weeks earlier. The chicks tried to catch up with it, but they couldn’t. So they are in trouble.” Yet adaptation is easy for the flycatchers: they merely fly a few tens of kilometers further north and they will find caterpillars hatching at the appropriate time. Besides, though Gore does not say so, what is bad news for the pied flycatchers is good news for the caterpillars, and for the butterflies they will become.

#33 – Gore’s bogus pictures and film footage

In the book accompanying Gore’s film, the story of the pied flycatchers and the caterpillars is accompanied by a picture of a bird feeding her hungry chicks. However, closer inspection shows that the bird is not a pied flycatcher but a black tern, and that she is carrying not a caterpillar in her beak but a small fish. Gore similarly misuses spectacular footage of a glacier apparently calving enormous slabs of ice into the sea, footage that is often shown on television to accompany stories about “global warming.” However, the glacier in question is one that is known to be advancing, and to be doing so more rapidly and more often than previously. It is in southern Argentina, where its snout crosses, and eventually dams, Lake Argentino. Water builds up behind the ice dam and eventually bursts it, causing the spectacular collapse of ice into the lake that is so misleadingly used as the iconic image of the effect of “global warming” on glaciers. The breaking of the ice dam used to occur every eight years or so; now, however, it occurs every five years, not because of “global warming” but because of the regional cooling of the southern Atlantic.

#34 – The Thames Barrier “closing more frequently”

Gore says that rising sea levels are compelling the operators of the Thames Barrier to close it more frequently than when it was first built. They are not. The barrier is indeed closed more frequently than when it was built, but the reason has nothing to do with “global warming” or rising sea levels. The reason is a change of policy by which the barrier is closed during exceptionally low tides, so as to retain water in the tidal Thames rather than keeping it out. Yet even the present leader of the official Opposition in the UK Parliament recently used a major speech as the opportunity to mention today’s more frequent closing of the Thames Barrier as though it were a matter of grave concern.

#35 – “No fact…in dispute by anybody.”

Gore says that his prediction that the atmospheric concentration of carbon dioxide will rise to more than 600 parts per million by volume as soon as 2050 is “not controversial in any way or in dispute by anybody.” However, not one of the half-dozen official projections of growth in CO2 concentration made by the IPCC shows as much as 600 parts per million by 2050.
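A back-of-the-envelope extrapolation shows why 600 ppmv by 2050 sits outside the official projections; the 2007-era baseline of about 384 ppmv and the growth rate of about 2 ppmv per year are assumed round numbers for illustration, not figures from the text.

```python
# Linear extrapolation of atmospheric CO2 from an assumed 2007-era baseline.
baseline_year, baseline_ppmv = 2007, 384.0
growth_ppmv_per_year = 2.0  # assumed approximate recent annual increase

projected_2050 = baseline_ppmv + (2050 - baseline_year) * growth_ppmv_per_year
print(f"2050 projection: {projected_2050:.0f} ppmv")  # 470 ppmv

# Annual growth that would actually be needed to reach 600 ppmv by 2050:
required = (600.0 - baseline_ppmv) / (2050 - baseline_year)
print(f"required growth: {required:.1f} ppmv/year")   # 5.0 ppmv/year
```

Even allowing for accelerating emissions, reaching 600 ppmv by 2050 would require the annual increment to jump to roughly two and a half times the recent rate and stay there, which is consistent with the point that none of the half-dozen official projections shows it.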

Christopher Monckton

About the Author: Christopher Walter Monckton, 3rd Viscount Monckton of Brenchley (born 14 February 1952) is a retired British international business consultant, policy advisor, writer, and inventor. He served as an advisor to Margaret Thatcher and has attracted controversy for his public opposition to the mainstream scientific consensus on global warming and climate change. This story was originally released in October 2007 on the website of the Science & Public Policy Institute, among other places, and is republished here with permission.

EcoWorld - Nature and Technology in Harmony

Posted in Birds, Causes, Drought, Effects Of Air Pollution, Energy, Fish, Global Warming & Climate Change, Literature, Other, Regional, Shipping, Solar, Tidal, Wind | 6 Comments

Interview with Roger Pielke, Sr.

Roger Pielke Portrait
Dr. Roger Pielke, Sr.
“Scientific rigor has been sacrificed,
and poor policy and political decisions
will inevitably follow.”

Roger Pielke Sr. is a retired professor of atmospheric science at Colorado State University, Ft. Collins, and a senior research scientist at the University of Colorado, Boulder. Since July 2005 he has written and maintained Climate Science, a blog that serves as a scientific forum for dialogue and commentary on climate issues. With William R. Cotton, he is the co-author of Human Impacts on Weather and Climate (Cambridge University Press, 2007). Over the past summer he co-hosted a conference entitled “Land Use and Climate Change” in Boulder, Colorado. While Dr. Pielke rejects being characterized as a “global warming skeptic,” his work is unwaveringly critical of the current conventional wisdom regarding climate change and what to do about it. EcoWorld Editor Ed Ring recently caught up with Dr. Pielke, who had the following to say on the topic:

EcoWorld: How would you say that current conventional wisdom regarding climate change has gotten it wrong?

Pielke: In terms of climate change and variability on the regional and local scale, the Intergovernmental Panel on Climate Change (IPCC) reports, the Climate Change Science Program (CCSP) report on surface and tropospheric temperature trends, and the U.S. National Assessment [of Climate Change] have overstated the role of the radiative effect of the anthropogenic increase of carbon dioxide (CO2) in relation to a diversity of other human climate-forcing mechanisms. Indeed, many research studies incorrectly oversimplify climate change by characterizing it as being dominated by the radiative effect of human-added CO2. But while prudence suggests that we work to minimize our disturbance of the climate system (since we don’t fully understand it), by focusing on just one subset of forcing mechanisms, we end up seriously misleading policymakers as to the most effective way of dealing with our social and environmental vulnerability in the context of the entire spectrum of environmental risks and other threats we face today.

EcoWorld: What about experts’ predictions of rising sea levels, extreme weather, melting polar ice caps, and so on?

Pielke: Global and regional climate models have not demonstrated themselves to be skillful predictors of regional and local climate change and variability over multidecadal time scales. For example, in the case of sea ice, the models are consistent with the decrease in Arctic sea ice in recent years, but they cannot explain the multiyear increase in Antarctic sea ice (including a record level this year). With respect to extreme weather, a much more important issue than how greenhouse gases are altering our climate is society’s greatly increased vulnerability to extreme weather events – a direct result not of changes in weather but of increased settlement by expanding human populations into low-lying coastal regions, floodplains, and marginal arid land.

EcoWorld: But what about the northern icecap shrinking this September to possibly its smallest size in history (exposing more than 1 million square miles of open water), or the comments of Robert Correll, chairman of the Arctic Climate Impact Assessment, regarding recent observations in Greenland (“We have seen a massive acceleration of the speed with which these glaciers are moving into the sea”)? Is something new and alarming happening?

Pielke: These examples represent selected observations that promote the view that human-input carbon dioxide is dominating climate change. However, the climate is – and always will be – changing. Thus, although human activity certainly affects the way in which climate varies and changes, actual global observations present a much more complex picture than that represented by the two examples listed above. For example, Antarctic sea ice reached a record maximum coverage in 2007, and the globally averaged lower atmosphere has not warmed in the last nine years (and, in fact, is cooler than it was in 1998). In addition, there are regions of the world where glaciers are advancing (such as New Zealand, parts of the Himalayas, and Norway). However, this information – which conflicts with the projections of the multi-decadal global climate models and the 2007 IPCC report – has been almost completely ignored by policymakers and the media.

Human Impacts on Weather and Climate Change Book Cover
Human Impacts on Weather and
Climate, by Roger Pielke, Sr.,
and William R. Cotton
Cambridge University Press

EcoWorld: What role have alterations in land use played in climate change?

Pielke: Changes in land use by humans and the resulting alterations in weather and hydrology are major drivers of long-term regional and global climate patterns – yet the 2007 IPCC Statement for Policymakers largely ignores their importance (despite extensive documentation in research literature). Along with the diverse influences of aerosols on climate, land use effects (caused, for example, by deforestation, desertification, and conversion of land to farming) may be at least as important in altering the weather as the changes in climate patterns associated with the radiative effect of carbon dioxide and other well-mixed greenhouse gases. Moreover, land use and land cover changes will continue to exert an important influence on the Earth’s climate for the next century.

The reason for this is that even if the globally averaged surface temperature change over time ends up being close to zero in response to land use and land cover change and variability, the regional changes in surface temperature, precipitation, and other climate metrics could be as large as or larger than those that result from the anthropogenic increase of greenhouse gases. Moreover, people and ecosystems experience the effects of environmental change regionally, not as global averaged values. Thus, the issue of a “discernable human influence on global climate” misses the obvious, in that we have been altering climate by land use and land cover change ever since humans began large-scale alterations of the land surface.

EcoWorld: What were the main conclusions to come out of your recent conference focusing on the land use changes that affect the Earth’s climate?

Pielke: This meeting reconfirmed the first order role of land management as a climate forcing mechanism. These findings supported the conclusions of the 2005 National Research Council report “Radiative Forcing of Climate Change: Expanding the Concept and Addressing Uncertainties,” which identified land use change as having a major effect on climate. Unfortunately, the role of land surface processes was underreported in the body of the IPCC report and was essentially ignored in the IPCC Statement for Policymakers.

EcoWorld: Sticking with land use changes: Do you think that tropical forests create a thermostatic effect that moderates extreme weather? And following on that, do you think tropical deforestation could be as significant a driver in climate change as anthropogenic CO2?

Pielke: Tropical deforestation clearly has an effect on both regional and global climate that is at least as important as the radiative effect of adding CO2. When forests are removed, not only does the climate system lose the biodiversity and other benefits of that environment, the vegetation loses its ability to dynamically respond in ways that reduce extreme weather fluctuations. For example, when trees access deeper water through their roots, the resulting transpiration of water vapor into the atmosphere (making rain more likely) can help ameliorate dry conditions when the large-scale weather pattern is one of drought.

EcoWorld: What is your criticism of the IPCC?

Pielke: Mainly the fact that the same individuals who are doing primary research into humans’ impact on the climate system are being permitted to lead the assessment of that research. Suppose a group of scientists introduced a drug they claimed could save many lives: There were side effects, of course, but the scientists claimed the drug’s benefits far outweighed its risks. If the government then asked these same scientists to form an assessment committee to evaluate their claim (and the committee consisted of colleagues of the scientists who made the original claim as well as the drug’s developers), an uproar would occur, and there would be protests. It would represent a clear conflict of interest. Yet this is what has happened with the IPCC process. To date, either few people recognize this conflict, or those that do choose to ignore it because the recommendations of the IPCC fit their policy and political agenda. In either case, scientific rigor has been sacrificed, and poor policy and political decisions will inevitably follow.

EcoWorld: How effective are current climate computer models in helping us understand global climate trends?

Pielke: Using global climate models to improve our understanding of how the system works represents a valuable application of such tools, but the term sensitivity study should be used to characterize these assessments. In sensitivity studies, a subset of the forcings and/or feedback of the climate system are perturbed to examine their response. Since the computer model of the climate system is incomplete (meaning it doesn’t include all of the important feedbacks and forcings), what the IPCC is really doing is conducting a sensitivity study.

The IPCC reports, however, inaccurately present their assessment as a “projection” – one that’s widely interpreted by policymakers and others as being able to skillfully forecast the future state of the climate system. But even one of the IPCC’s leading authors, Kevin Trenberth, has gone on record reminding people of the limitations of the models used in its projections. Says Trenberth, “There are no predictions by IPCC … and there never have been.” He further states, “None of the models used by IPCC are initialized to the observed state, and none of the climate states in the models correspond even remotely to the current observed climate.”

Indeed, says Trenberth, “The current projection method works to the extent it does because it utilizes differences from one time to another, and the main model bias and systematic errors are thereby subtracted out. This assumes linearity. It works for global forced variations, but it cannot work for many aspects of climate, especially those related to the water cycle.”
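Trenberth’s remark about differencing can be illustrated with a toy sketch, assuming a purely constant additive model bias: expressing the projection as a change from the model’s own baseline cancels the bias, even though the raw model state never matches observations. All numbers below are illustrative.

```python
# Toy illustration of anomaly differencing: a constant model bias cancels
# when a projection is expressed as a change from the model's own baseline.
true_baseline = 14.0   # illustrative "observed" global mean temperature, degC
model_bias    = 2.0    # constant offset between model and observations
true_change   = 0.75   # the forced signal the model is assumed to capture

model_baseline = true_baseline + model_bias
model_future   = true_baseline + true_change + model_bias

projection = model_future - model_baseline
print(projection)                      # 0.75 -- the bias has cancelled

print(model_baseline - true_baseline)  # 2.0 -- yet the raw state is off
```

The cancellation works only because the bias is additive and constant, which is exactly the linearity assumption Trenberth flags; for quantities such as precipitation, where model errors are not simply additive, the subtraction does not rescue the projection.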

Thus, as clarified even by one of the key IPCC contributors (who has a vested interest in the acceptance of the 2007 IPCC report), current climate models clearly cannot accurately model observed real-world changes in climate. Global model results projected out decades into the future should never be interpreted as skillful forecasts. Instead, they should be interpreted as sensitivity studies on limited variables. When authors of research papers use definitive words (such as “will occur”) and display model output with specific time periods in the future, they are misleading policymakers and other people who use this information.

EcoWorld: What policies should be considered to deal with climate change? Is reducing CO2 emissions part of the solution?

Pielke: Reduction of greenhouse gas emissions can only serve as a useful “environmental currency” as long as it provides the benefits needed to reduce the risk to critical environmental and social resources. As such, it needs to be part of a win-win strategy that provides a diversity of benefits. With energy efficiency and energy independence, for example, everyone benefits. As the “currency” for these benefits, however, greenhouse gas emission reduction represents an unnecessarily blunt instrument if there are more effective ways to reduce the risks to societal and environmental resources. Moreover, greenhouse gas policies can produce serious unintended negative consequences such as an increase in carcinogenic emissions when biodiesel is used, or reductions in biodiversity and alterations in climate when land management practices convert large areas to biofuels.

Greenhouse gas emission reductions, relative to other environmental currencies, should be evaluated with respect to their ability to reduce risk to essential social and environmental resources. In this framework, greenhouse emission reductions are only useful if they provide real benefit to those resources. Thus, if a policy made for other reasons also happens to reduce greenhouse gas emissions, you clearly have a win-win situation. The current focus on using reductions in CO2 emissions as the primary currency for achieving benefits to society and the environment, however, clearly represents a very flawed approach.


Posted in Atmospheric Science, Biodiversity, Drought, Effects Of Air Pollution, Energy, Energy Efficiency, Global Warming & Climate Change, History, Literature, Other, Policies & Solutions, Regional | 2 Comments

Is there a Basis for Global Warming Alarm & Climate Catastrophe?

Brayton Point Powerplant in Fall River, Massachusetts
Brayton Point powerplant in Fall River, Massachusetts
(Photo: Alexey Sergeev)

Editor’s Note: There are several reasons we have decided to jump into the global warming debate. First of all, it apparently is no longer a debate – everything’s been decided – and that doesn’t feel right. Around the world, politicians and big businesses are now enthusiastically embracing the need to fight global warming. In order to curb CO2 emissions, we are on the verge of enacting sweeping regulations that will affect every industrial sector on earth.

There’s nothing wrong with many of the side benefits. Hopefully while we regulate CO2, we won’t forget to also regulate carbon monoxide, lead, ozone, particulate matter, nitrogen dioxide, and sulfur dioxide. Hopefully while we regulate CO2, we will be spurred towards faster adoption of totally clean alternative energy, and hopefully as well, we will all achieve energy independence.

Still, it feels like there hasn’t been enough willingness to carefully deliberate the science behind the scare. We have already poked at some of the assumptions underlying global warming theories, and found them wanting. Why aren’t deforestation, changed land use, and the urban heat island effect also held accountable for global warming? Why can’t reforestation absorb and reduce levels of carbon dioxide? After all, we’ve lost 40% of the world’s forests – of course atmospheric carbon dioxide levels are higher.

We have looked at truly alarming articles regarding rising sea levels in our posts Antarctic Ice and Greenland’s Ice Cap, and while they present some sobering possibilities, upon scrutiny there wasn’t anything nearly as alarming as the mainstream media is telling us, over and over. And what about those intensifying storms, such as Hurricane Katrina? Our author here claims that as the earth warms, the temperature differential between the tropics and the polar regions diminishes, and storms decrease in intensity. So we have a lot of questions.

Dr. Richard Lindzen, the Alfred P. Sloan Professor of Atmospheric Science at the Massachusetts Institute of Technology, in this presentation delivered less than a year ago at Yale University, and not previously published, has outlined in technical detail the reasons for his skepticism regarding global warming hysteria. Even Dr. Lindzen’s detractors acknowledge he is one of the most eminent atmospheric scientists in the world. In spite of being at times excoriated, Dr. Lindzen is sticking to his guns. We prefer to think this is courage, not opportunism. In any case, it is the arguments, not the individual, which must determine if we accept or reject a theory. Read on, and make up your own minds. – Ed “Redwood” Ring

For the sensitive reader or listener, the language used in connection with the issue of Global Warming must frequently sound strange.

Weather and climate catastrophes of all sorts are claimed to be what one expects from global warming, and global warming is uniquely associated with man’s activities. The reality of the threat of global warming is frequently attested to by reference to a scientific consensus:

Tony Blair (1): “The overwhelming view of experts is that climate change, to a greater or lesser extent, is man-made, and, without action, will get worse.”

Elizabeth Kolbert in the New Yorker (2): “All that the theory of global warming says is that if you increase the concentration of greenhouse gases in the atmosphere, you will also increase the earth’s average temperature. It’s indisputable that we have increased greenhouse-gas concentrations in the air as a result of human activity, and it’s also indisputable that over the last few decades average global temperatures have gone up.”

[Photo: Al Gore, a passionate advocate of curbing CO2 emissions]

Given the alarm that surrounds the issue, such statements seem peculiarly inconclusive and irrelevant to the catastrophes cited. To be sure, these references are one-sided. They fail to note that there are many sources of climate change, and that profound climate change occurred many times both before and after man appeared on earth; given the ubiquity of climate change, it is implausible that all change is for the worse. Moreover, the coincidence of increasing CO2 and the small warming over the past century hardly establishes causality. Nevertheless, for the most part I do not personally disagree with the Consensus (though the absence of any quantitative considerations should be disturbing). Indeed, I know of no serious split, and suspect that the claim that there is opposition to this consensus amounts to no more than setting up a straw man to scoff at. However, I believe that people are being led astray by the suggestion that this agreement constitutes support for alarm.

Let us view the components that comprise this consensus a little more precisely while recognizing that there is, indeed, some legitimate controversy connected with specific aspects of even these items.

  1. The global mean surface temperature is always changing. Over the past 60 years, it has both decreased and increased. For the past century, it has probably increased by about 0.6 ±0.15 degrees Centigrade (C). That is to say, we have had some global mean warming.
  2. CO2 is a greenhouse gas and its increase should contribute to warming. It is, in fact, increasing, and a doubling would increase the greenhouse effect (mainly due to water vapor and clouds) by about 2%.
  3. There is good evidence that man has been responsible for the recent increase in CO2, though climate itself (as well as other natural phenomena) can also cause changes in CO2.

In some respects, these three pillars of consensus are relatively trivial. Remaining completely open is the question of whether there is any reason to consider this basic agreement as being alarming. Relatedly, is there any objective basis for considering the approximate 0.6C increase in global mean surface temperature to be large or small regardless of its cause? The answer to both questions depends on whether 0.6C is larger or smaller than what we might have expected on the basis of models which have led to our concern. These models are generally called General Circulation Models (GCMs). We may, therefore, seek to determine how the current level of man made climate forcing compares with what we would have were CO2 to be doubled (a common reference level for GCM calculations).

[Book cover: An Inconvenient Truth, by Al Gore. The most influential book in history?]

In terms of climate forcing, greenhouse gases added to the atmosphere through man's activities since the late 19th Century have already produced three-quarters of the radiative forcing that we expect from a doubling of CO2 (3).

The main reasons for this are:

  1. CO2 is not the only anthropogenic greenhouse gas – others like methane also contribute; and
  2. The impact of CO2 is nonlinear in the sense that each added unit contributes less than its predecessor. For example, if increasing CO2 from its late 19th Century value (about 290 parts per million by volume, or ppmv) to double that value (580 ppmv) causes a 2% increase in radiative forcing (4), then to obtain a further 2% increase in radiative forcing we must add another 580 ppmv of CO2 rather than another 290 ppmv. At present, the concentration of CO2 is about 380 ppmv. The easiest way to understand this is to consider adding thin layers of paint to a pane of glass. The first layer cuts out much of the light, the next layer cuts out more, but subsequent layers do less and less because the painted pane is already essentially opaque.

It should be stressed that we are interested in climate forcing, and not simply levels of CO2; the two are most certainly not linearly proportional.
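
The diminishing-returns behavior described above can be illustrated with the simplified logarithmic forcing expression of Myhre et al. (1998), cited in note 3: dF ≈ 5.35 ln(C/C0) W/m². The sketch below treats that formula as given; its use here is an illustration, not part of the author's argument.

```python
import math

def co2_forcing(c_ppmv, c0_ppmv=290.0):
    """Simplified radiative forcing (W/m^2) from a CO2 change, after
    Myhre et al. (1998): dF = 5.35 * ln(C/C0). Illustrative only."""
    return 5.35 * math.log(c_ppmv / c0_ppmv)

# Doubling from the late-19th-century value...
first_doubling = co2_forcing(580.0)            # 290 -> 580 ppmv
second_doubling = co2_forcing(1160.0, 580.0)   # 580 -> 1160 ppmv
# ...yields the same forcing increment each time: the response is
# logarithmic, not linear, in concentration.
assert abs(first_doubling - second_doubling) < 1e-12

# Present-day CO2 (~380 ppmv) supplies well under half of the forcing
# of a full doubling; other anthropogenic greenhouse gases make up
# the remainder of the three-quarters figure in the text.
fraction = co2_forcing(380.0) / co2_forcing(580.0)
print(f"doubling forcing: {first_doubling:.2f} W/m^2")
print(f"CO2-only fraction at 380 ppmv: {fraction:.2f}")
```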

Essential to alarm is the fact that most current climate models predict a response to a doubling of CO2 of about 4C (which is much larger than what one expects the simple doubling of CO2 to produce: ie, about 1C). The reason for this is that in these models, the most important greenhouse substances, water vapor and clouds, act in such a way as to greatly amplify the response to anthropogenic greenhouse gases alone (ie, they act as what are called large positive feedbacks). However, as all assessments of the Intergovernmental Panel on Climate Change (IPCC) have stated (at least in the text – though not in the Summaries for Policymakers), the models simply fail to get clouds right. We know this because in official model intercomparisons, all models fail miserably to replicate observed distributions of cloud cover. Thus, the model predictions are critically dependent on features that we know must be wrong. In Figure 1 we see that treatment of clouds involves errors an order of magnitude greater than the forcing from a doubling of CO2 (5). While the IPCC allows for the possibility that the models get water vapor right, the intimate relation of water vapor to clouds makes such a conclusion implausible.
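
The gap between the ~1C no-feedback response and the ~4C model response can be expressed with the conventional feedback relation dT = dT0/(1 - f). This bookkeeping is a standard textbook device, not something the author derives, and the round numbers are those quoted in the text:

```python
def required_feedback(dT_no_feedback, dT_model):
    """Feedback factor f implied by dT = dT0 / (1 - f)."""
    return 1.0 - dT_no_feedback / dT_model

# ~1C from doubled CO2 alone vs ~4C in typical models (text's figures):
f = required_feedback(1.0, 4.0)
print(f"implied net feedback factor: {f:.2f}")  # 0.75: strongly positive
# That amplification is carried almost entirely by modeled water vapor
# and clouds -- the very components the intercomparisons get wrong.
```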

[Figure 1. Each thin gray line shows an individual model's hindcast of percentage cloud cover averaged by latitude; the black line shows the observed cloud cover.]

Let me summarize the main points thus far:

  1. It is NOT the level of CO2 that is important, but rather the impact of man made greenhouse gases on climate.
  2. Although we are far from the benchmark of doubled CO2, climate forcing is already about 3/4 of what we expect from such a doubling.
  3. Even if we attribute all warming over the past century to man made greenhouse gases (which we have no basis for doing), the observed warming is only about 1/3-1/6 of what models project.
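
The 1/3-1/6 figure in point 3 can be checked with simple equilibrium scaling: with forcing at ~3/4 of a doubling, a model of sensitivity S would "expect" roughly 0.75 × S degrees of warming. A minimal sketch of that arithmetic, using the sensitivities quoted in the text (2.5C for the Hadley runs, ~4C for typical models) and ignoring any ocean delay:

```python
observed_warming = 0.6   # deg C over the past century (text, point 1)
forcing_fraction = 0.75  # of a CO2 doubling (text, point 2)

# Sensitivities to doubled CO2 quoted in the text:
for sensitivity in (2.5, 4.0):
    expected = forcing_fraction * sensitivity
    ratio = observed_warming / expected
    print(f"sensitivity {sensitivity}C -> expected {expected:.2f}C, "
          f"observed/expected = {ratio:.2f}")
# Both ratios fall between 1/6 and 1/3, as the text claims.
```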

We are logically led to two possibilities:

  1. Our models are greatly overestimating the sensitivity of climate to man made greenhouse gases, or
  2. The models are correct, but there is some unknown process that has cancelled most of the warming.

Note that calling the unknown process “aerosols” does not change this statement since aerosols and their impact are unknown to a factor of ten or more; indeed, even the sign is in doubt.

In arguing for climate alarmism, we are choosing the second possibility. Moreover, we are assuming that the unknown cancellation will soon cease. How is the second possibility supported given that it involves so many more assumptions than the first possibility?

[Figure 2. Simulations of global mean temperature with various combinations of 'forcing.']

The IPCC Third Assessment Report (TAR) made use of a peculiar exercise in curve fitting using results from the Hadley Centre. It consists of three plots, which are reproduced in Figure 2. In the first panel, we are shown an observed temperature record (without error bars), and the outputs of four model runs (using their coupled ocean-atmosphere model) with so-called natural forcing for the period 1860-2000. There is a small spread in the model runs (which presumably displays model uncertainty – it most assuredly does not represent natural internal variability). In any event, the models look roughly like the observations until the last 30 years. We are then shown a second diagram where the observed curve is reproduced and the four models are run with anthropogenic forcing. Here we see rough agreement over the last 30 years, and poorer agreement in the earlier period. Finally, we are shown the observations and the model runs with both natural and anthropogenic forcing, and, voila, there is rough agreement over the whole record. It should be noted that the models used had a relatively low sensitivity to a doubling of CO2 of about 2.5C.

In order to know what to make of this exercise, one must know exactly what was done. The natural forcing consisted of volcanoes and solar variability. Prior to the Pinatubo eruption in 1991, the radiative impact of volcanoes was not well measured, and estimates vary by about a factor of 3. Solar forcing is essentially unknown. Thus, natural forcing is, in essence, adjustable. Anthropogenic forcing includes not only anthropogenic greenhouse gases, but also aerosols that act to cancel warming (in the Hadley Centre outputs, aerosols and other factors cancelled two thirds of the greenhouse forcing). Unfortunately, the properties of aerosols are largely unknown. In the present instance, therefore, aerosols constitute simply another adjustable parameter (indeed, both its magnitude and its time history are adjustable, and even its sign is in question).

This was remarked upon in a recent paper in Science (Anderson et al., 2003 (6)), wherein it was noted that the uncertainty was so great that estimating aerosol properties by tuning them to optimize agreement between models and observations (referred to as an inverse method) was probably as good as any other method, but that the use of such estimates to then test the models constituted a circular procedure. This is as strong a criticism of model procedures as is likely to be found in Science. The authors are all prominent in aerosol work. The first author is the most junior, and when it was pointed out that the article reflected negatively on model outputs, he vehemently denied any such intent. In the present example, the choice of models with relatively low sensitivity allowed adjustments that were not so extreme.

New uncertainties are always entering the aerosol picture. Some are quite bizarre. A recent article in Science (Jaenicke, 2005 (7)) even proposed a significant role for airborne dandruff. Other articles have suggested that the primary impact of aerosols is actually warming (Jacobson, 2001 (8); Chen and Penner, 2005 (9)). Of course, this is the beauty of the global warming issue for many scientists. The issue deals with such small climate forcing and such small temperature changes that it permits scientists to argue that anything and everything is important for climate.

In brief, the defense of the models starts by assuming the model is correct. One then attributes differences between the model behavior in the absence of external forcing, and observed changes in ‘global mean temperature’ to external forcing. Next one introduces ‘natural’ forcing and tries to obtain a ‘best fit’ to observations. If, finally, one is able to remove remaining discrepancies by introducing ‘anthropogenic’ forcing, we assert that the attribution of part of the observed change to the greenhouse component of ‘anthropogenic’ forcing must be correct.

Of course, model internal variability is not correct, and ‘anthropogenic’ forcing includes not only CO2 but also aerosols, and the latter are unknown to a factor of 10-20 (and perhaps even sign). Finally, we have little quantitative knowledge of ‘natural’ forcing so this too is adjustable. Recall that the Hadley Centre acknowledges that the “aerosols” canceled most of the forcing from CO2.

[Photo: industrial smokestacks and smog (US EPA). Something like 100,000 quadrillion BTUs of energy are locked in remaining fossil fuels; there is no shortage, but is it safe to burn?]

Yet, the ‘argument’ I have just presented is the basis for all popular claims that scientists now ‘believe’ that man is responsible for much of the observed warming!

It would appear that the current role of the scientist in the global warming issue is simply to defend the ‘possibility’ of ominous predictions so as to justify his ‘belief.’

To be fair to the authors of Chapter 12 of the IPCC Third Scientific Assessment, here is what they provided for the draft statement of the Policymakers Summary: "From the body of evidence since IPCC (1996), we conclude that there has been a discernible human influence on global climate. Studies are beginning to separate the contributions to observed climate change attributable to individual external influences, both anthropogenic and natural. This work suggests that anthropogenic greenhouse gases are a substantial contributor to the observed warming, especially over the past 30 years. However, the accuracy of these estimates continues to be limited by uncertainties in estimates of internal variability, natural and anthropogenic forcing, and the climate response to external forcing."

This statement is not too bad – especially the last sentence. To be sure, the model dependence of the results is not emphasized, but the statement is vastly more honest than what the Summary for Policymakers in the IPCC’s Third Assessment Report ultimately presented:

In the light of new evidence and taking into account the remaining uncertainties, most of the observed warming over the last 50 years is likely to have been due to the increase in greenhouse gas concentrations.

In point of fact, the impact of man remains indiscernible simply because the signal is too small compared to the natural noise. Claims that the current temperatures are ‘record breaking’ or ‘unprecedented’, however questionable or misleading, simply serve to obscure the fact that the observed warming is too small compared to what models suggest. Even the fact that the oceans’ heat capacity leads to a delay in the response of the surface does not alter this conclusion (especially since the Hadley Centre results are obtained with a coupled model).

Moreover, the fact that we already have three quarters of the climate forcing expected from a doubling of CO2 means that if one truly believes the models, then we have long since passed the point where mitigation is a viable strategy. What remains is to maximize our ability to adapt. That the promotion of alarm does not follow from the science is clearly illustrated by the following example:

According to any textbook on dynamic meteorology, one may reasonably conclude that in a warmer world, extratropical storminess and weather variability will actually decrease. The reasoning is as follows. Judging by historical climate change, changes are greater in high latitudes than in the tropics. Thus, in a warmer world, we would expect that the temperature difference between high and low latitudes would diminish. However, it is precisely this difference that gives rise to extratropical large-scale weather disturbances. Moreover, when in Boston on a winter day we experience unusual warmth, it is because the wind is blowing from the south. Similarly, when we experience unusual cold, it is generally because the wind is blowing from the north. The possible extent of these extremes is, not surprisingly, determined by how warm low latitudes are and how cold high latitudes are. Given that we expect that high latitudes will warm much more than low latitudes in a warmer climate, the difference is expected to diminish, leading to less variance.
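
The chain of reasoning above (smaller equator-to-pole contrast, hence weaker thermal-wind shear, hence weaker baroclinic disturbances) can be sketched with the textbook Eady growth-rate scaling, sigma ≈ 0.31 (f/N) dU/dz, with the shear set by thermal-wind balance. This is a generic dynamic-meteorology formula, not the author's calculation, and every number below is an illustrative placeholder:

```python
# Eady growth-rate scaling: sigma ~ 0.31 * (f/N) * dU/dz, with the
# vertical shear from thermal-wind balance:
#   dU/dz = (g / (f * T0)) * |dT/dy|.
# All values are illustrative, not observations.
G = 9.81        # gravity, m/s^2
N = 1.0e-2      # buoyancy frequency, 1/s (typical troposphere)
T0 = 260.0      # reference temperature, K
F_COR = 1.0e-4  # midlatitude Coriolis parameter, 1/s

def eady_growth_rate(dT_dy):
    """Max baroclinic growth rate (1/s) for a meridional T gradient (K/m)."""
    shear = G / (F_COR * T0) * abs(dT_dy)   # thermal-wind shear
    return 0.31 * (F_COR / N) * shear

# A 30 K equator-to-pole contrast over ~5000 km today, shrinking to
# 25 K in a warmer world where high latitudes warm more:
today = eady_growth_rate(30.0 / 5.0e6)
warmer = eady_growth_rate(25.0 / 5.0e6)
# Growth rate scales linearly with the gradient: a smaller contrast
# means weaker extratropical disturbances, as the text argues.
print(f"growth-rate ratio (warmer/today): {warmer / today:.3f}")
```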

Nevertheless, we are told by advocates and the media that exactly the opposite is the case, and that, moreover, the models predict this (which, to their credit, they do not) and that the basic agreement discussed earlier signifies scientific agreement on this matter as well. Clearly more storms and greater extremes are regarded as more alarming than the opposite. Thus, the opposite of our current understanding is invoked in order to promote public concern. The crucial point here is that once the principle of consensus is accepted, agreement on anything is taken to infer agreement on everything advocates wish to claim.

The reader may have noticed that I focused on extratropical storms in the above example. However, given the relatively heavy hurricane season we have had, the emphasis of late has been on tropical storms. Recent papers suggesting that in a warmer world such storms may become more powerful (10) have been seized upon with alacrity by political activists. Needless to add, the articles seized upon have been extremely controversial; but, more to the point, no such relation was uncovered for storms reaching land – only for those over water.

At this point, it is doubtful that we are even dealing with a serious problem. If this is correct, then there is no policy addressing this non-problem that would be cost-effective. Even if we believe the problem to be serious, we have already reached the levels of climate forcing that have been claimed to be serious. However, when it comes to Kyoto, the situation is even worse. Here, there is widespread and even rigorous scientific agreement that complete adherence to the Kyoto Agreement would have no discernible impact on climate regardless of what one believes about climate. Thus, the theme of this meeting is, at least on this count, appropriate.

What about the first possibility: namely, that the models on which we are basing our alarm are much too sensitive? Not only is this the possibility that scientists would normally have preferred on the basis of Occam's famous razor, but it is also a possibility for which there is substantial support (11). I will focus on one line of this evidence: tropical warming in the 1990s has been associated with much greater outgoing long wave radiation than models produce. This discrepancy points toward the absence of a strong negative feedback in current models.

The discrepancy has been confirmed by at least four independent groups: at NASA’s Goddard Institute for Space Studies (Chen et al, 2002, DelGenio and Kovari, 2002 (12)), at NASA Langley (Wielicki et al, 2002, Lin et al, 2004 (13)), at SUNY Stony Brook (Cess and Udelhofen, 2003 (14)), and at the University of Miami (Clement and Soden, 2005 (15)).

This discrepancy would normally have pointed to exaggerated model sensitivity. However, the preceding papers attempted to attribute the discrepancy either to circulation changes or to 'unknown' cloud properties – except for the last paper. Clement and Soden (2005) showed that changes in dynamics could not produce changes averaged over the tropics. They showed this using four separate models, but it had been shown theoretically by Chou and Lindzen (2004) (16). Clement and Soden also showed that the discrepancy could be resolved by allowing convective precipitation efficiency to increase with surface temperature. Such a dependence is at the heart of the iris effect, which was first found by Lindzen, Chou and Hou (2001) (17), and was theoretically predicted by Sun and Lindzen (1993) (18). In LCH, we attempted to examine how tropical clouds responded to changing surface temperature, and found that existing satellite data were only marginally capable of dealing with this issue. The results, however, suggested a strong negative feedback, counter to what models suggest. It was, moreover, easy to show that models in no way replicated the observed cloud behavior.

It may turn out that the rigorous measurement of precipitation can be done with ground based radar. Ground based radar allows the almost continuous measurement of precipitation and the separation of convective precipitation from stratiform precipitation (albeit with remaining questions of accuracy). In the tropics, both types of precipitation originate in condensation within cumulus towers. However, condensation that does not form precipitation is carried aloft as ice which is detrained to form cirrus from which the condensate eventually falls as stratiform precipitation. Precipitation efficiency is given by the relation:

  e = (convective precipitation) / (convective precipitation + stratiform precipitation)

Using data from Kwajalein Atoll in the western Pacific, we were able to study how e varies with sea surface temperature. In addition, the Kwajalein radar allows one to explicitly look at the area of stratiform rain per unit of convective mass flux.

[Figure 3. Left: precipitation efficiency vs. surface temperature; right: cirrus area per unit convective activity vs. surface temperature.]

We see from Figure 3 that e increases by about 7.1% per degree C increase in SST (compared with the 7.5% estimated by Sun and Lindzen, 1993), and that this increase is associated with a decrease in normalized stratiform area of about 25% per degree C (somewhat larger than what was estimated from space observations by Lindzen, Chou and Hou, 2001). If correct, this basically confirms the iris effect, and indicates that models have greatly exaggerated climate sensitivity because, in contrast to models, nature itself acts to limit rather than exaggerate the influence of added greenhouse gases.
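
Taken at face value, the quoted rates imply a simple exponential dependence on SST. The sketch below only illustrates that arithmetic; the exponential form and unit baselines are assumptions, with the 7.1% and 25% rates taken from the text:

```python
def precip_efficiency(sst_anomaly, e0=1.0, rate=0.071):
    """Relative precipitation efficiency after an SST change (deg C),
    using the ~7.1%/C increase quoted from the Kwajalein data."""
    return e0 * (1.0 + rate) ** sst_anomaly

def stratiform_area(sst_anomaly, a0=1.0, rate=0.25):
    """Relative normalized stratiform (cirrus) area, decreasing
    ~25%/C as quoted."""
    return a0 * (1.0 - rate) ** sst_anomaly

# One degree of surface warming: efficiency up ~7%, detrained cirrus
# area down ~25% -- less high cloud, i.e. a negative (iris) feedback
# rather than the positive one in the models.
print(f"efficiency x{precip_efficiency(1.0):.3f}, "
      f"stratiform area x{stratiform_area(1.0):.2f}")
```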

What would be the implication of these simple results?

The primary implication would be that for over 25 years, we have based not only our worst case scenarios but even our best case scenarios on model exaggeration. This was already suggested by previous results, but the present result has the virtue of specifically identifying a basic and crucially relevant error. Under the circumstances, the main question we will be confronting is how long the momentum generated by this issue will prevent us from seeing that it has been an illusion based on model error. However, I am not altogether optimistic about this.

[Photo: industrial smokestacks fitted with GE Reuter-Stokes Stak-Tracker gas analyzers (NASA)]

The public discourse on global warming has little in common with the standards of scientific discourse. Rather, it is part of political discourse where comments are made to secure the political base and frighten the opposition rather than to illuminate issues.

In political discourse, information is to be "spun" to reinforce pre-existing beliefs, and to discourage opposition. The chief example of the latter is the perpetual claim of universal scientific agreement. This claim was part of the media treatment of global cooling (in the 1970s) and has been part of the treatment of global warming since 1988 (well before most climate change institutes were created). The consensus preceded the research.

That media discourse on climate change is political rather than scientific should, in fact, come as no surprise. However, even scientific literature and institutions have become politicized. Some scientists issue meaningless remarks in what I believe to be the full expectation that the media and the environmental movement will provide the 'spin.' Since the societal response to alarm has, so far, been to increase scientific funding, there has been little reason for scientists to complain. Should scientists feel any guilt, it is assuaged by two irresistible factors: the advocates define public virtue, and administrators are delighted with the growing grant overhead. The situation has been recognized since time immemorial. In Federalist Paper No. 79, Alexander Hamilton brooded about abuses that might arise from legislative tampering with judges' salaries. "In the general course of human nature," he wrote, "a power over a man's subsistence amounts to a power over his will." An indication of such an attitude occurred when, in 2003, the draft of the US National Climate Plan urged high priority for improving our knowledge of climate sensitivity (ie, for finding the answer). It appears that an NRC review panel was critical of this prioritization, urging instead broad support for numerous groups to study the impacts of the putative warming. One is tempted to suggest that the NRC panel was more interested in spreading the wealth than in finding an answer.

A second aspect of politicization of discourse specifically involves scientific literature. Articles challenging the claim of alarming response to anthropogenic greenhouse gases are met with unusually quick rebuttals. These rebuttals are usually published as independent papers rather than as correspondence concerning the original articles, the latter being the usual practice. When the usual practice is used, then the response of the original author(s) is published side by side with the critique. However, in the present situation, such responses are delayed by as much as a year. In my experience, criticisms do not reflect a good understanding of the original work. When the original authors’ responses finally appear, they are accompanied by another rebuttal that generally ignores the responses but repeats the criticism. This is clearly not a process conducive to scientific progress, but it is not clear that progress is what is desired. Rather, the mere existence of criticism entitles the environmental press to refer to the original result as ‘discredited,’ while the long delay of the response by the original authors permits these responses to be totally ignored.

A final aspect of politicization is the explicit intimidation of scientists. Intimidation has mostly, but not exclusively, been directed at those questioning alarmism. Victims of such intimidation generally remain silent. Congressional hearings have been used to pressure scientists who question the 'consensus'. Scientists whose views question alarm are pitted against carefully selected opponents. The clear intent is to discredit the 'skeptical' scientist, from whom a 'recantation' is sought.

The news media is frequently used as an instrument for this intimidation. A notable example in the early 1990s was Ted Koppel's Nightline program. Koppel announced that Vice President Gore had asked him to find connections between scientists questioning global warming alarm and unsavory interests. After editorializing on the inappropriateness of the request, Koppel proceeded to present a balanced exposure of the debate. Newspaper and magazine articles routinely proclaimed that scientists who differ with the consensus view are stooges of the fossil fuel industry. All of this would be bad enough, but the real source of intimidation was the fact that neither the American Meteorological Society nor the American Geophysical Union saw fit to object to any of this.

These are not isolated examples. Before 1991, some of Europe's most prominent climate experts were voicing significant doubts about climate alarm. Note that the issue has always concerned the basis for alarm rather than the question of whether there was warming (however small) or not. Only the most cynical propagandist could have anticipated that sentient human beings could be driven into panic by the mere existence of some warming. In any event, among these questioners were such distinguished individuals as Sir John Mason, former head of the UK Meteorological Office and Secretary of the Royal Society; Prof. Hubert Lamb, Europe's foremost climatologist and founder of the Climatic Research Unit at the University of East Anglia; Dr. Henk Tennekes, Director of Research at the Royal Dutch Meteorological Institute; and Professor Aksel Wiin-Nielsen of the University of Copenhagen, former Director of the European Centre for Medium Range Weather Forecasting and Secretary General of the World Meteorological Organization. All of these figures except Tennekes have disappeared from the public discourse. Lamb is now dead. Tennekes was dismissed from his position, and Wiin-Nielsen was tarred by Bert Bolin (the first head of the IPCC) as a tool of the coal industry. In Russia, a number of internationally recognized pioneers of climate science, like K. Kondratyev (died in 2006) and Y. Izrael, continue to vocally oppose climate alarm, but Russian scientists eager for connections with the rest of Europe are much more reluctant to express such views.

Not all such situations ended badly. When a senior Energy Department official, William Happer, was dismissed in 1993 after expressing questions about the scientific basis of global warming claims, the physics community was generally supportive and sympathetic (19). In another, more bizarre case, an attempt was made to remove the name of Roger Revelle from an already published paper he had coauthored with S. Fred Singer and Chauncey Starr, by claiming that Singer had cajoled an allegedly senile Revelle into permitting himself to be so used. This paper discouraged hasty action on ill-understood warming. It should be noted that Revelle was the professor whom Al Gore frequently cites as having introduced him to the horrors of global warming. In any event, Singer took the issue to court and won. His description of the case makes interesting reading (20).

More recent is a controversy over a 1,000 year reconstruction of mean temperature purporting to show that the half degree (Centigrade) rise of the past century was unprecedented (21). Because of the extensive use of this work in the politics of global warming, Congressman Joe Barton demanded the analytical detail since the research was supported by US funds. Both the American Meteorological Society and the American Geophysical Union protested Barton’s request. One need not go into the merits of this controversy to see that this difference in the response of professional organizations sends a rather chilling message. Only the defenders of the orthodoxy will be defended against intimidation.

I want to emphasize that the basic agreement frequently described as representing a global warming 'consensus' is entirely consistent with there being virtually no problem. Actual observations suggest that the sensitivity of the real climate is much less than that found in computer models, whose sensitivity depends on processes that are clearly misrepresented. Attempts to assess climate sensitivity by direct observation of cloud processes, and by other means, point to the conclusion that a doubling of CO2 would lead to about 0.5C warming or less.

Unfortunately, a significant part of the scientific community appears committed to the maintenance of the notion that alarm may be warranted. Alarm is felt to be essential to the maintenance of funding. The argument is no longer over whether the models are correct (they are not), but rather whether their results are at all possible. Alas, it is impossible to prove something is impossible. The global warming issue parts company with normative science at an early stage. A good indicator of this disconnect is widespread and rigorous scientific agreement that the Kyoto Agreement would have no discernible impact on climate. This clearly is of no importance to the thousands of negotiators, diplomats, regulators, general purpose bureaucrats and advocates whose livelihood is tied to climate alarmism.

A rarely asked but important question is whether promoting alarmism is good for science. The situation may not be so remote from the impact of Lysenkoism on Soviet genetics. However, personally, I think the future will view the response of contemporary society to 'global warming' as simply another example of the appropriateness of the fable of the Emperor's New Clothes. For the sake of the science, I hope that future arrives soon. In the meantime, we can continue to play our parts in this modern version of The Emperor's New Clothes. Let us hope that our descendants will be amused rather than horrified.


1 Economist, December 24, 2004 (back)

2 New Yorker, April 25, 2005 (back)

3 Myhre et al. (1998) Geophys. Res. Ltrs., 25, 2715-2718; Hansen, J., and M. Sato 2004. Greenhouse gas growth rates. Proc. Natl. Acad. Sci. 101, 16109-16114. (back)

4 The term, forcing, in this paper, refers to the imbalance in radiative energy flux that would be produced by the addition of greenhouse gases. We will generally describe such forcing as either a percentage increase in the greenhouse effect, or as a flux with units of Watts per square meter. Such a flux acts to warm the earth. (back)

5 Gates, W. L., J. Boyle, C. Covey, C. Dease, C. Doutriaux, R. Drach, M. Fiorino, P. Gleckler, J. Hnilo, S. Marlais, T. Phillips, G. Potter, B.D. Santer, K.R. Sperber, K. Taylor and D. Williams, 1999: An overview of the Atmospheric Model Intercomparison Project (AMIP). Bulletin of the American Meteorological Society, 80, 29-55. (back)

6 Anderson, T.L., R.J. Charlson, S.E. Schwartz, R. Knutti, O. Boucher, H. Rodhe, and J. Heintzenberg (2003) Climate forcing by aerosols – a hazy picture. Science, 300, 1103-1104. (back)

7 R. Jaenicke (2005) Abundance of cellular material and proteins in the atmosphere. Science, 308, 73. (back)

8 Jacobson, Mark. Z., 2001. Strong radiative heating due to the mixing state of black carbon in atmospheric aerosols. Nature Vol 409, No 6821, pp. 695-7, February 8, 2001 (back)

9 Chen, Yang, and Joyce E. Penner. 2005. Uncertainty analysis for estimates of the first indirect aerosol effect. Atmospheric Chemistry and Physics, 5, 2935-2948, online (back)

10 Emanuel, Kerry, 2005. Increasing destructiveness of tropical cyclones over the past 30 years, Nature, 436, 686-688; Webster, P.J., G. J. Holland, J. A. Curry, and H.-R. Chang, 2005: Changes in Tropical Cyclone Number, Duration, and Intensity in a Warming Environment, Science, 309, 1844-1846. (back)

11 One line of inquiry involves looking at the temporal response to identifiable perturbations like volcanoes or so-called regime changes. It turns out that rapid responses correspond to low sensitivity while slow responses would imply higher sensitivity. Such inquiries invariably show rapid responses. Some examples are R.S. Lindzen and C. Giannitsis (1998) On the climatic implications of volcanic cooling. J. Geophys. Res., 103, 5929-5941; Lindzen, R.S. and C. Giannitsis (2002) Reconciling observations of global temperature change. Geophys. Res. Ltrs. 29, (26 June) 10.1029/2001GL014074; Douglass, D.H., and R.S. Knox (2005) Climate forcing by the volcanic eruption of Mount Pinatubo. Geophys. Res. Letters, 32, L05710, doi:10.1029/2004GL022119. (back)

12 Chen, J., B.E. Carlson, and A.D. Del Genio, 2002: Evidence for strengthening of the tropical general circulation in the 1990s. Science, 295, 838-841; Del Genio, A. D., and W. Kovari, 2002: Climatic properties of tropical precipitating convection under varying environmental conditions. J. Climate, 15, 2597-2615. (back)

13 Wielicki, B.A., T. Wong, et al, 2002: Evidence for large decadal variability in the tropical mean radiative energy budget. Science, 295, 841-844; Lin, B., T. Wong, B. Wielicki, and Y. Hu, 2004: Examination of the decadal tropical mean ERBS nonscanner radiation data for the iris hypothesis. J. Climate, 17, 1239-1246. (back)

14 Cess, R.D. and P.M. Udelhofen, 2003: Climate change during 1985-1999: Cloud interactions determined from satellite measurements. Geophys. Res. Ltrs., 30, No. 1, 1019, doi:10.1029/2002GL016128. (back)

15 Clement, A.C. and B. Soden (2005) The sensitivity of the tropical-mean radiation budget. J. Clim., 18, 3189-3203. (back)

16 Chou, M.-D. and R.S. Lindzen (2004) Comments on “Examination of the Decadal Tropical Mean ERBS Nonscanner Radiation Data for the Iris Hypothesis”. J. Clim. 18, 2123-2127. (back)

17 R.S. Lindzen, M.-D. Chou, and A.Y. Hou (2001) Does the Earth have an adaptive infrared iris? Bull. Amer. Met. Soc. 82, 417-432. (back)

18 Sun, D-Z. and R.S. Lindzen (1993) Distribution of tropical tropospheric water vapor. J. Atmos. Sci., 50, 1643-1660. (back)

19 This situation is described in W. Happer (2003) Harmful politicization of science. In Politicizing Science, M. Gough, editor, Hoover Institution Press, Stanford, CA 313 pp. (back)

20 S. Fred Singer (2003) The Revelle-Gore Story: Attempted political suppression of science. In Politicizing Science, M. Gough, editor, Hoover Institution Press, Stanford, CA 313 pp. (back)

21 Mann, M.E., Bradley, R.S. and Hughes, M.K., Northern Hemisphere Temperatures During the Past Millennium: Inferences, Uncertainties, and Limitations, Geophysical Research Letters, 26, 759-762, 1999. (back)

About the Author: Richard S. Lindzen is the Alfred P. Sloan Professor of Atmospheric Science at the Massachusetts Institute of Technology. This paper was presented at the Yale Center for the Study of Globalization on October 21, 2005, and will appear in the published proceedings of that meeting. Reprinted with permission from the author.

EcoWorld - Nature and Technology in Harmony


Bioethanol vs. Biodiesel

Sugar Cane Field below Mountains
Sugar Cane
Probably today’s top bioethanol crop

Editor’s Note: The biofuel sector is in its infancy, yet bioethanol production is already equal to one-half of one percent of total petroleum production in the world. Biodiesel lags far behind, contributing only one-twentieth of one percent of the world’s total petroleum fuel production.

In this cautionary, comprehensive assessment of biofuels, it is clear that under proper conditions they are economically viable today, and that worldwide biofuel production is poised to make a quantum leap. But when comparing the two principal biofuels, bioethanol versus biodiesel, the result is inconclusive. Complicating any attempt to assess the potential of biofuels are claims that “secondary treatment of cellulosic waste” can yield quantities of bioethanol equal to the initial extraction of ethanol or diesel. But this secondary extraction of ethanol from cellulose is not yet a cost-effective process.

Biofuels, like all fuels, can be analysed using a net energy balance analysis, or a life cycle cost analysis, which emphasizes economic factors. While both analyses are valuable, the relationship between them is essentially this: the better the life cycle cost result, the closer to 1.0 the net energy balance can fall while still yielding a viable biofuel.

How well biofuels perform depends heavily on location and feedstock. Most research is either highly proprietary, or just getting underway, or both. Biofuel converts solar energy at a rate of, at best, only around 0.15%. So while biofuels such as ethanol from sugar cane in Brazil and diesel from oil palms in West Africa are undoubtedly economically viable, they cannot replace petroleum. There isn’t enough land. Genetically engineered algae may change all of that, of course.

According to this authoritative review by EcoWorld Contributing Editor Louis Strydom, the commercial sector is only now taking a serious interest in biofuel refining and biofuel crops. It is important to appreciate that innovative crops and technologies may already be developed to a scientifically or even commercially reliable stage, yet much data on these promising crops is not yet available to the public at large.
- Ed “Redwood” Ring

The biofuel component of the bioenergy sector is certainly an important one.

The International Energy Agency has analyzed and projected oil consumption, measured in million tons of oil equivalent (Mtoe), from 1971 to 2030. Whereas in 1971 transport accounted for roughly 50% of total oil consumption, by 2030 transport will account for around two thirds of it. More concerning, total oil consumption is predicted to increase to 5,000 Mtoe, more than double the total consumption of 1971 (1). A significant portion of this increase is driven by the developing economies and is therefore very difficult to reduce, given their rapid economic growth. Both the need to cut CO2 emissions from this rising consumption and the desire to reduce dependency on oil as the primary product mean that demand for viable biofuels will increase in the years to come. From the same analysis, bioethanol production was estimated at around 30 million tons per year in 2004, and biodiesel at only an estimated 2.5 million tons per year as of 2004.
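These tonnage figures can be related back to the share-of-petroleum estimates quoted for the sector with simple arithmetic. A minimal sketch in Python, where the total oil consumption figure and the per-ton energy contents are rough outside assumptions, not numbers from this article:

```python
# Back-of-envelope share of world oil consumption supplied by biofuels.
# The 30 and 2.5 million-ton figures come from the text above; everything
# else is an assumed round number for illustration only.
TOTAL_OIL_MTOE = 3_700        # assumed world oil consumption c. 2004, in Mtoe
TOE_PER_TON_ETHANOL = 0.64    # ethanol carries roughly 64% of oil's energy per ton
TOE_PER_TON_BIODIESEL = 0.88  # biodiesel is closer to oil's energy density

ethanol_share = 30 * TOE_PER_TON_ETHANOL / TOTAL_OIL_MTOE
biodiesel_share = 2.5 * TOE_PER_TON_BIODIESEL / TOTAL_OIL_MTOE

print(f"bioethanol: {ethanol_share:.2%}")   # roughly half a percent
print(f"biodiesel:  {biodiesel_share:.2%}")  # a few hundredths of a percent
```

On these assumptions the shares come out near the one-half of one percent for bioethanol and one-twentieth of one percent for biodiesel cited in the Editor’s Note.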

The demand is thus clearly there for a substitute for oil, and would be even more pronounced if this substitute could be “greener.” As regards biofuels, however, the key imperative would seemingly remain the economic viability of the substitute. As Nobel prizewinner Sydney Brenner once noted, “the only ‘omics’ that really counts in biotechnology is economics.” (2)

How then do you consider the viability of a project, whether it be biodiesel or bioethanol?

It should be noted from the outset that there is a perplexing myriad of country-, location-, and project-specific data that cannot possibly be covered in a short article such as this. Further, a specific project may well have factors particular to it that completely erode the assumptions of this article. This is exacerbated by the lack of published scientific data available. That is not in any way to say that a huge amount of scientific research has not been done on biofuels, but simply that a) the biofuel field is in itself a huge field of study, b) new technological advances have presented themselves (or are in the pipeline) that affect current assumptions, and c) some advances are driven by economics and will only present themselves by way of company performances in the future.

This article focuses on an environmental analysis of the biodiesel and bioethanol industries, mainly by way of a PEEST analysis (Political, Economic, Environmental, Social, and Technical); the social and technical factors are covered briefly. The analysis is not intended to be exhaustive; it is intended only to highlight some of the salient points pertinent to such an analysis and to reflect some of the current thinking around these factors. Strategists might argue that a number of other environmental analytical techniques should be applied to an environmental scan of the industry; again, such an analysis would be so extensive that it falls outside the scope of this article. Further, the focus here is mainly on the external environment, as it is far too complex a subject to cover the internal environmental factors affecting a project, and even the external environment is approached only generically so as not to create undue complexity.


European Union Flag
The environmental benefits of
biofuel are especially important
in the European Union

The main political driver for the biofuel industry in Europe is environmental: lowering carbon emissions as much as possible. As a secondary objective, the Europeans aim to reduce their dependency on petro-energy (3). This perspective is closely aligned with the objectives and structure of the Kyoto Protocol, which the EU has endorsed. Interestingly, in terms of concrete actions in the EU, the focus seems to be on developing biofuels within the member states to meet these objectives (4), which include a 5.75% bio-to-petrofuel blend envisaged by the European Union by 2010.

Americans view biofuel
as a way to help them achieve
energy independence

In contrast, in the US the primary objective is to reduce dependency on imported oil. More specifically, the Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE) invests in research to achieve the following goals:

- Dramatically reduce, or even end, dependence on foreign oil.

- Spur the creation of a domestic bioindustry. (5)

The two main political drivers, therefore, are energy independence and environmentally friendly energy. It should be noted that on both sides of the Atlantic these drivers concentrate on internal markets and projects. It follows that development of these industries is aimed at addressing local concerns, and feedstock and technology choices follow accordingly.

Indian Flag
Rapidly industrializing nations
such as India see biofuel as
a source of jobs and wealth

In the developing world it is worthwhile to note that an additional driver is the potential of job creation for the local economies, particularly where labor intensive feedstocks can be utilized such as harvestable nuts and crops for biofuels.

In short, as countries focus on developing their own biofuel industries, the direction of biofuel research and development will be driven by the countries, and the corresponding corporations and financial markets, that invest the largest amounts of capital in these endeavors. Thus, for example, the primary development in terms of economically viable research and studies in biodiesel and bioethanol centers on the first world countries. This significantly hampers the assessment of projects falling outside this scope, as far less research is available, and the published research that does exist often centers more on developmental than commercial factors, in contrast to similar research in the developed world.


In the particular case of biofuels, economic and environmental analyses are often intertwined. Under this heading, the main measurement criteria to be covered are Life-Cycle Cost Analysis (LCA) and Net Energy Balance (NEB):


Life cycle cost analysis covers the costs incurred “from cradle to grave” of the project. (6) It focuses mainly on environmental impact, but the generic methodology could readily be applied to the economic cost-benefit of a project (although this is not done frequently). LCA can often be complemented by other measurements such as external-cost analysis, and it does prove a beneficial tool for analysing projects. In terms of biofuels, the main focus seems to be on energy consumption – using LCA as a decision-making tool to compare the owning and operating costs of energy-using systems. There is still relatively limited research available in this field, but most studies apply the LCA to automobiles, comparing different fuels as inputs. These studies are, however, subject to academic limitations, and a number of factors have the potential to change their outcomes, such as the price of fuel (and which fuel is the baseline – petroleum or diesel), the efficiency of the engine, the cost of the vehicle, manufacturing costs, service costs, fixed costs, end-of-life salvageability of the automobile, etc.
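As a minimal sketch of the idea, assuming entirely hypothetical figures (none of the numbers or cost categories below come from the studies cited), a life-cycle cost comparison reduces to summing the ownership-phase costs listed above and netting off end-of-life salvage:

```python
# Life-cycle ("cradle to grave") cost: sum of phase costs minus whatever
# the asset is worth at end of life. All figures below are hypothetical.
def life_cycle_cost(phase_costs, salvage=0.0):
    return sum(phase_costs.values()) - salvage

diesel = {"vehicle purchase": 25_000, "lifetime fuel": 18_000,
          "service and fixed costs": 7_000}
biodiesel = {"vehicle purchase": 25_000, "lifetime fuel": 22_000,
             "service and fixed costs": 7_500}

for name, costs in (("diesel", diesel), ("biodiesel", biodiesel)):
    print(f"{name}: {life_cycle_cost(costs, salvage=3_000):,}")
```

Shifting any single input, the fuel price, the salvage value, the service costs, can reverse the ranking, which is precisely why the studies discussed here are so sensitive to their underlying assumptions.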

An example of this variance in research output due to varying underlying factors is an LCA conducted in 2000 on alternative biofuels (7). Essentially, the article concluded that, given the underlying factors at the time, biofuels were not cost-effective due to their high fuel price. Excluding subsidies, biodiesel from soybeans did prove about 20% more cost-effective than ethanol (C2H5OH). Ethanol from corn was determined to be marginally more cost-effective than ethanol from herbaceous and woody biomass, respectively. Similarly, a research paper presented in 1998 utilising both an LCA and an external cost analysis noted that: “The LCA analysis shows that the benefits in terms of greenhouse gas emissions are being compensated by higher environmental impacts, especially for eutrophication. The External Cost Analysis, shows that external costs of biodiesel and fossil diesel are in the same range and are dominated by the impacts of the use phase.” (8)

Oil Palms
Oil Palms in the Niger Delta
Probably the top biodiesel crop, but no study
can fail to take into account local conditions.

The main problem is that most research done using LCA is fundamentally tied to the economics of the country and region in which it is produced. Therefore, performing a study in the US and attempting to replicate it in Europe could well lead to different results. These variations can be attributed to, but are not limited to, factors such as the cost of input stock, the type of input stock, the planting, harvesting and processing of input stock, economic factors, subsidies, etc. In a nutshell, although an LCA is highly relevant to a specific market or project, it is not necessarily possible to extrapolate its data to other countries or projects.

It is important to note that the country conducting the research will typically focus on the crops viable in its own location. A study in the UK, for example, would not consider a plantation of oil palm as input stock: although it is far superior in yield to crops available in Europe, it is not suited to the climate and would hence be disregarded from that specific country’s perspective.

For these reasons, comparisons of biodiesel vs. bioethanol should be made with extreme caution, and the scientific principles behind such a study must be defensible. Furthermore, LCA studies come with the additional caution that they are time sensitive: the studies referred to above would quite likely have different outcomes in the year 2006, given changing oil prices and technological advances in the agriculture of the input crops, in refining, and in the end use products (such as automobiles).


The other factor predominant in literature is the Net Energy Balance (NEB). This represents an overview of production and consumption of primary and secondary biofuels for a specific project, area, country or region. Energy balances should cover all the primary and secondary energy sources, showing clearly the non-energy use of such sources. (9)

NEB is an interesting counterpoint to LCA analyses in that NEB purely focuses on energy input versus energy output of the production cycle and therefore does not take cost variables into consideration. This is helpful in that it focuses on efficiency throughout the energy crop to biofuel cycle. In this sense the main impact on the reliability of the NEB analyses is the development of technology both from an agricultural and refining perspective.

A report by the US Department of Energy estimates the NEB of corn ethanol at around 1.38, but notes that cellulosic bioethanol (from plant mass) can reach 2.62, with a potential of around 5.0 anticipated from further research on cellulosic ethanol production. The same report notes that biodiesel from soybeans can reach an NEB of around 3.2. (10)
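In sketch form, NEB is simply useful fuel energy out divided by total (mainly fossil) energy in across the production chain. The input categories and megajoule figures below are illustrative assumptions, not data from the DOE report:

```python
# Net Energy Balance: energy delivered as fuel / energy consumed making it.
def net_energy_balance(energy_out_mj, energy_inputs_mj):
    return energy_out_mj / sum(energy_inputs_mj.values())

# Hypothetical one-hectare-year corn-ethanol operation, energy in MJ.
inputs_mj = {
    "farming (fuel, fertilizer)": 20_000,
    "feedstock transport": 3_000,
    "refining and distillation": 35_000,
}
fuel_energy_out_mj = 80_000

neb = net_energy_balance(fuel_energy_out_mj, inputs_mj)
print(f"NEB = {neb:.2f}")  # comes out near the 1.38 corn-ethanol figure cited
```

An NEB above 1.0 means the fuel returns more energy than went into making it; cellulosic processes improve the ratio partly by cutting the refining term, since process heat can come from the plant waste itself.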

Jatropha Lamp
Low-tech biofuel for lamps and
stoves works now, as refining
methods and markets evolve.


A study in the Netherlands noted that it is far more efficient to utilise feedstock for electricity generation (bioenergy) than for the production of biofuels. Further, this study indicated that, given the input feedstocks compared (sugar beet for bioethanol and rapeseed oil for biodiesel) and the country (the Netherlands), biodiesel was more energy efficient and therefore had the more positive energy balance (11). A similar study, focusing on rapeseed oil for biodiesel and wheat for bioethanol production, also indicated that biodiesel was the superior fuel in terms of NEB; both crops, however, were found to be NEB positive (12). Bioethanol in particular has often been held to have a negative NEB, although recent studies do seem to indicate it is possible to produce bioethanol with a positive NEB (13).

The essence, however, is that with current feedstock outputs biodiesel appears consistently to have a higher NEB (14). Of course, depending on the amount of research and development invested in bioethanol, this may change as the technology, processes, and crops improve. Furthermore, the matter is highly sensitive to location and feedstock: Brazil, for example, with its very large sugar cane industry, may prove much higher NEBs if the ethanol is consumed in country. Transport costs and the energy consumed moving feedstock play a significant role, and the high NEB of a crop in, say, a developing country producing sugar or palm oil may be eroded if the stocks are transported to a refinery in a developed country. Again, if research is not conducted on a sound scientific basis, then comparing outside of the scientific parameters can result in incorrect assumptions about the merits of a biodiesel or bioethanol project.

It is reasonable, then, to expect the NEBs of crops to improve in the future, for the following reasons. First, only existing commercial crops are used; their yields could be improved with further genetic engineering, crop selection, etc. Second, commercial activity has not yet been undertaken or materially developed for specialist biofuel crops such as Jatropha curcas, Pongamia, etc. Large scale commercial activity will likely yield more energy efficient production of feedstocks, which will in turn improve the NEB. Third, current production concentrates primarily on annual crops, particularly in Europe and America, where the crops have to be replanted each year to be harvested. If plantations are used, less energy is expended to produce the crops, because the plantation is harvested annually without having to replant. Palm oil, and to a lesser extent sugar, do cater for this requirement and are therefore already at a higher NEB level.

There are of course a host of other measurement factors that should be considered in analyzing the economic and environmental factors, however the above two have been covered in some detail as they are often encountered in literature and do provide a sound basis for analysis.


A pertinent social factor affecting biofuels is that, particularly in the developed world, society has become increasingly environmentally conscious, and it has therefore become more important to the man on the street to consume greener energy. More importantly, this increased public interest in greener energy has translated into political interest, and “Green” parties have increasingly come to the fore in some European countries.

Chinese Children
Faces of the future: Will biofuel become a
vital part of the eventual energy solutions
for huge emerging nations such as China?

In contrast to the already fully industrialized world, many countries are experiencing population booms as well as growing economies. In India and China specifically, large populations coupled with strong growth are increasing the demand for energy and fuels. In the developing world, the demand for energy combined with the pressures of unemployment has made bioethanol and biodiesel popular areas of interest. This interest is especially strong for biodiesel crops such as palm oil and Jatropha, where manual labor is often utilized for harvesting, so that developing these industries assists in job creation as well as energy independence.


Technically, both actual crop yields and refining technologies have improved significantly in the past few years. For refining, one only has to look at the technological improvements companies such as Lurgi (15) have made over the past decade to appreciate this. On the crop side, research has improved yields for plants such as oil palm – see for example the research done by the Palm Oil Research Institute of Malaysia (16) and academic institutions (17), both on crop development and on increased utilization of crop waste (18).

Again, as noted under the economic and environmental factors, the research remains context specific, and it is important not to extrapolate data without a sound scientific basis and due consideration of all factors influencing the research data. For example, a study was conducted for the UK Department for Transport using 2002 as a base year, comparing international resource costs for biodiesel and bioethanol, with extrapolations to 2020. The study was limited in scope to Europe, the US and Brazil, and concentrated on wood, straw, wheat and corn for bioethanol and oil seeds (soybeans and rapeseed oil) for biodiesel. For 2002 it indicated that, given the feedstocks and environment, biodiesel produced more energy per investment than bioethanol as regards Europe and the UK; however, when the US and Brazil were included, bioethanol proved more effective. In its 2020 projections, accounting for technological improvements and using the same input factors, the study anticipated that the most cost-effective solution would be biodiesel produced from biomass (wood and straw), given the technologies anticipated by that time (19).

The importance of the UK study lies not in its specific predictions, but in the fact that it is realistic to expect significant technological improvements in both the agricultural and the processing/refining productivity of biofuels. To this extent, it is recommended that biofuel projects be approached on a case by case basis to determine viability, and that a definitive position not be taken as regards the viability of biodiesel versus bioethanol. There are a number of current views and comparisons of the different feedstock crops for biodiesel and bioethanol; see for example a recent article on this site (20).


Current published scientific data on technological advances, both in terms of crops (for example, yields per hectare of different crops) and in processing and refining, are hard to come by. The main reason for this is that the biofuels field has only received serious attention from commercial investors in the last few years, and to this extent it is seen as an exciting market to be entering. Consequently, most research is proprietary, patented or unpublished, and utilized for commercial purposes. This is not to say that some of the advances are not scientifically credible, only that they are not always accessible. The consequent risk is that a lot of unsubstantiated data is assumed, either over- or under-estimating the realistic yields of crops, the efficiencies of processes and technologies, etc.

It is possible that the nature of the feedstocks themselves will allow the industry to change: by-products that are now seen as waste, or used for one purpose, may come to be used for completely different purposes in the future. Certain tropical and sub-tropical crops, for example, may end up driving a wider bioenergy business rather than a solely biofuels business if their byproducts are converted into electricity. This could in turn affect how the LCA or NEB of a project is calculated and consequently change the economic viability of future projects.

The main issue constraining definitive comment on biodiesel vs. bioethanol is that neither fuel is fully developed yet. The reasons for this are as follows:

First, only a limited number of existing commercial crops have recorded data on their NEB. These existing crop yields could be improved with further crop selection, genetic engineering, etc.

Jatropha Seedlings
Jatropha seedlings getting a start in India
There is much still to learn about this promising crop

Second, commercial activity has not yet been undertaken or materially developed for specialist biofuel crops such as Jatropha curcas, Pongamia, etc. Large scale commercial activity will likely yield more energy efficient production of feedstocks, which will in turn improve the NEB.

Third, current production primarily concentrates on annual crops. Particularly so in Europe and America where the crops have to be replanted each year to be harvested. If plantations are used, less energy is expended to produce the crops when the plantation is harvested annually without having to replant the crop. Of course Palm Oil and to a lesser extent sugar do cater for this requirement and are therefore at a higher NEB level already.

Despite the seemingly higher NEB viability of biofuels in the developing world, a number of firms have built or are in the process of building major biofuel refineries in the US, Europe and other developed countries (21).

Conclusion – Bioethanol & Biodiesel are a toss-up – they both work well depending on the crop and the planting environment.

Attempting to analyze an industry such as biofuels is a very complex task. Both bioethanol and biodiesel crops, processes and refining technologies are constantly improving. Further, it is important to appreciate that crops and technologies may be developed to a scientifically reliable stage while data on them is not yet available in the public domain. The purpose of this article was to extract and review some of the current data and drivers impacting the biofuel sector within the conceptual construct of a PEEST analysis, highlighting some of the current factors, thinking and research in this field and providing the reader with a basic construct for future analysis of biofuel projects.

Footnotes and Reference Sources:

1 Fulton, L. 20/21 June 2005. Assessing the biofuels option. Presented at a conference in Paris – Biofuels for Transport: an International Perspective. (back)

2 Whelan, J. November 6, 2004. The Insider: European Biotech – No Pain, No Gain. New Scientist, p. 54-55. (back)

3 Anon. 17 August 2004. Greencars. (back)

4 British Association for Bio Fuels and Oils (back)

5 Tyson, K.S. et al. June 2004. Biomass Oil Analysis: Research Needs and Recommendations. National Renewable Energy Laboratory. (back)

6 De Nocker, N. et al. December 3-4, 1998. Comparison of LCA and external cost analysis for biodiesel and diesel. Presented at the 2nd International Conference on LCA in Agriculture, Agro-Industry and Forestry. (back)

7 MacLean, H.L. et al. 2000. A Life-Cycle Cost Analysis of Alternative Automobile Fuels. Journal of the Air & Waste Management Association, 50, 1769-1779. (back)

8 De Nocker, N. et al. December 3-4, 1998. Comparison of LCA and external cost analysis for biodiesel and diesel. Presented at the 2nd International Conference on LCA in Agriculture, Agro-Industry and Forestry. (back)

9 Food and Agriculture Organisation of the United Nations (back)

10 US Department of Energy – Net Energy Balance for Bioethanol Use. (back)

11 Kampman, B.E. et al. November 2003. Biomass: for vehicle fuels or power generation? CE: Solutions for environment, economy and technology. (back)

12 Richards, I.R. June 2000. Energy balances in the growth of oilseed rape for biodiesel and of wheat for bioethanol. British Association for Bio Fuels and Oils – Levington Agricultural Report. (back)

13 Anon. January 29, 2006. New Study Makes Case for Bioethanol. EurActiv. (back)

14 Anon. April 4, 2006. Net Energy Balance for Bioethanol Production and Use. U.S. Department of Energy – Energy Efficiency and Renewable Energy Biomass Program. (back)

15 (back)

16 (back)

17 Corley, H. Est. 1999. New Technologies for Plantation Crop Improvement. Cranfield University. (back)

18 Khalil, H. et al. 2000. The effect of various anhydride modifications on mechanical properties and water absorption of oil palm empty fruit bunches reinforced polyester composites. Polymer International, 50(4), 395-402. (back)

19 AET Technologies. International resource costs of biodiesel and bioethanol. UK Department for Transport. (back)

20 (back)

21 Winnie, J. June 10, 2005. Major Biofuel Projects. (back)


About the Author: Louis Strydom is an expert in new venture creation and project finance with wide experience on projects in the developing world. One of Louis’ main projects for the last year has been conducting a pre-feasibility study and promotion of a 230,000 acre site for a Jatropha plantation and biodiesel refinery in Kenya. Previously he was Senior Vice President of Project Finance at Decillion – a company listed on the Johannesburg Stock Exchange. Other positions included Senior Economist managing the Credit Policy and Risk Management division of the Export Credit Insurance Corporation of South Africa. Prior to that he was a Director with Triumvirate responsible for Marketing and Consulting on Crisis Management. Louis also has extensive experience in short term insurance with American International Group on fire/casualty risks, niche products and political risks in Africa, Europe, the Middle East, UK and USA.

Email the Editor about this Article
EcoWorld - Nature and Technology in Harmony

Posted in Consumption, Electricity, Energy, Energy & Fuels, Energy Efficiency, Engineering, Literature, Other, Science, Space, & Technology, Solar1 Comment


In a recent article here, the human body was referred to as a “Simple Machine,” and the piece explored the ramifications of life extension: human aging slowed, human diseases eliminated. This achievement, if you will, is the logical extension of our accelerating skills in genetic manipulation and nanotechnology.

There are a host of species whose “machinery” is much simpler than the human structure, and in these areas human-engineered change at the genetic level has already begun. Genetically engineered food crops, virtually unknown ten years ago, now occupy over 100 million acres of U.S. farmland. That is an astonishing 156,000 square miles or so, an area nearly the size of the entire state of California. Over half of the soybeans grown in the U.S. last year were genetically engineered, as was over a third of the entire U.S. corn crop.
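As a quick sanity check on the acreage figure above, the conversion is simple arithmetic (the California area used for comparison is an assumption added here, roughly 163,700 square miles including water):

```python
# Convert the article's GM crop acreage to square miles and compare to California.
ACRES_PER_SQUARE_MILE = 640

gm_acres = 100_000_000  # GM acreage cited above
gm_square_miles = gm_acres / ACRES_PER_SQUARE_MILE
print(f"{gm_square_miles:,.0f} square miles")  # 156,250 square miles

CALIFORNIA_SQUARE_MILES = 163_696  # assumed total area of California
print(gm_square_miles >= CALIFORNIA_SQUARE_MILES)  # False: slightly smaller
```

At 640 acres to the square mile, 100 million acres works out to 156,250 square miles, just shy of California rather than larger than it.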

The decision by major U.S. agribusiness to embrace genetically engineered crops was made with little thought to the public relations fiasco that ensued. Whether genetically engineered food would marginally harm some species or cause a catastrophic breakdown in global ecosystems seemed almost incidental; these corporations paid remarkably little attention to possible public reaction. In the event, that reaction, especially in Europe, has been fierce.

In the United Kingdom, the same country that produced “Dolly,” the first mammal cloned from an adult cell, opposition to agricultural biotechnology has been especially vehement. Prince Charles, perhaps the best known opponent, has said, “I happen to believe that this kind of genetic modification takes mankind into realms that belong to God, and to God alone.”

The result of this opposition has been a significant slowdown in the proliferation of genetically modified crops. But the available literature does not make clear just how bad these crops really are. Some of the products developed were clearly bad, such as Monsanto’s “terminator seed,” a genetically modified plant that sterilizes its own seeds once it has fully grown. News of this seed led people to the uninformed but plausible conclusion that it might mingle with other plant species, switch off their reproductive systems, and wipe out life on earth in one horrible genetic bridge too far. Worse, the commercial motive for these seeds was to force farmers to buy new seed year after year. Buying fresh seed annually is common practice in the U.S. and Europe, where farmers use hybrid seeds, but forcing farmers in the developing world to do the same would create a severe financial burden. Not very good PR, guys.

There is much benefit genetic engineering could provide, if it were known to be safe. It is very similar to nuclear power in that regard: nuclear plants cause no air pollution and have a nearly inexhaustible supply of fuel; if they could be made perfectly safe to operate, they would be a welcome alternative to damming rivers and burning fossil fuel. A new strain of genetically engineered rice, for example, grows grains containing an abundant supply of vitamin A. Vitamin A deficiency, one of the biggest sources of malnutrition in much of the developing world, affects over 100 million children and can cause blindness and death.

To those who believe that humans are a cancer on the earth, none of this matters. The deepest of the deep ecologists are entirely certain what future genetically modified crops should have: none. But to the rest of us, it would be nice to think that if we haven’t yet evaluated all the risks of “frankenfoods,” that doesn’t mean we never will. With proper analysis and research, humanity can hopefully reap the benefits of genetically modified foods without suffering the possible drawbacks.


Posted in Air Pollution, Engineering, Literature, Nature & Ecosystems, Other1 Comment
