How to use open threads

This page is for discussion of climate change generally.

Open threads

To access the open threads, click on the menu item “OPEN THREADS” in the navigation bar above, hover over “HOW TO USE OPEN THREADS”, then over “CLIMATE SCIENCE”. If the resulting menu is too long to fit on the screen, scroll it with the mouse wheel or the down-arrow key.

173 Thoughts on “How to use open threads”

  1. Richard C (NZ) on October 27, 2010 at 6:22 pm said:

    Bob D says:
    October 27, 2010 at 5:56 pm

    On the performance of models:

    G. G. Anagnostopoulos et al. (2010) “A comparison of local and aggregated climate model outputs with observed data”

    We compare the output of various climate models to temperature and precipitation observations at 55 points around the globe. We also spatially aggregate model output and observations over the contiguous USA using data from 70 stations, and we perform the comparison at several temporal scales, including a climatic (30-year) scale. Besides confirming the findings of a previous assessment study that model projections at point scale are poor, results show that the spatially integrated projections are also poor.
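
    For readers wondering what a comparison “at several temporal scales” amounts to in practice, here is a minimal sketch (synthetic series and invented numbers, not the paper’s data or code): correlate a modelled and an observed annual series directly, then again after averaging into non-overlapping 30-year blocks, the paper’s “climatic” scale.

    ```python
    # Illustrative sketch only: synthetic "observed" and "modelled" annual
    # series, compared at annual and 30-year (climatic) scales.
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1900, 2000)
    observed = 14.0 + 0.3 * np.sin(years / 8.0) + rng.normal(0.0, 0.4, years.size)
    modelled = 14.0 + 0.01 * (years - years[0]) + rng.normal(0.0, 0.4, years.size)

    def climatic_means(series, window=30):
        """Non-overlapping block means: the 'climatic (30-year) scale'."""
        n = series.size // window * window
        return series[:n].reshape(-1, window).mean(axis=1)

    r_annual = np.corrcoef(observed, modelled)[0, 1]
    r_climatic = np.corrcoef(climatic_means(observed), climatic_means(modelled))[0, 1]
    print(f"annual r = {r_annual:+.2f}; 30-year r = {r_climatic:+.2f}")
    ```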

  2. THREAD on October 29, 2010 at 9:43 am said:

    An Update on Radiative Transfer Model Development at Atmospheric and Environmental Research, Inc.

    J. S. Delamere, S. A. Clough, E. J. Mlawer, Sid-Ahmed Boukabara, K. Cady-Pereira, and M. Shepard, Atmospheric and Environmental Research, Inc., Lexington, Massachusetts

    Over the last decade, a suite of radiative transfer models has been developed at Atmospheric and Environmental Research, Inc. (AER) with support from the Atmospheric Radiation Measurement (ARM) Program. These models span the full spectral regime from the microwave to the ultraviolet, and range from monochromatic to band calculations. Each model combines the latest spectroscopic advancements with radiative transfer algorithms to efficiently compute radiances, fluxes, and cooling rates. These models have been extensively validated against high-resolution spectral measurements and broadband irradiance measurements. Several of these models are part of the broadband heating rate profile value-added product (BBHRP VAP), currently being established at the ARM Southern Great Plains (SGP) site.

    A Web site has been established to host the AER radiative transfer models. The Web site facilitates access to the models and is a convenient platform on which to provide model updates.

    Also see – “Atmospheric Thermodynamics and Heat”

    Radiative Transfer Climate Models – Google Search

  3. Richard C (NZ) on October 29, 2010 at 10:18 am said:

    Radiative forcing by well-mixed greenhouse gases:
    Estimates from climate models in the IPCC AR4

    Collins et al. (2006)

    The radiative effects from increased concentrations of well-mixed greenhouse gases (WMGHGs) represent the most significant and best understood anthropogenic forcing of the climate system. The most comprehensive tools for simulating past and future climates influenced by WMGHGs are fully coupled atmosphere-ocean general circulation models (AOGCMs). Because of the importance of WMGHGs as forcing agents, it is essential that AOGCMs compute the radiative forcing by these gases as accurately as possible. We present the results of a Radiative-Transfer Model Intercomparison (RTMIP) between the forcings computed by the radiative parameterizations of AOGCMs and by benchmark line-by-line (LBL) codes. The comparison is focused on forcing by CO2, CH4, N2O, CFC-11, CFC-12, and the increased H2O expected in warmer climates. The models included in the intercomparison include several LBL codes and most of the global models submitted to the Intergovernmental Panel on Climate Change (IPCC) 4th Assessment Report (AR4). In general, the LBL models are in excellent agreement with each other. However, in many cases, there are substantial discrepancies among the AOGCMs and between the AOGCMs and LBL codes. In some cases this is because the AOGCMs neglect particular absorbers, in particular the near-infrared effects of CH4 and N2O, while in others it is due to the methods for modeling the radiative processes. The biases in the AOGCM forcings are generally largest at the surface level. We quantify these differences and discuss the implications for interpreting variations in forcing and response across the multi-model ensemble of AOGCM simulations assembled for the IPCC AR4.
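
    As a rough illustration of what an RTMIP-style intercomparison measures, the sketch below scores a handful of hypothetical AOGCM forcings against a single hypothetical LBL benchmark; all values are invented (the real study compares many gases, levels, and models), but the bias-and-spread bookkeeping is the same idea.

    ```python
    # Illustrative sketch of an RTMIP-style check (values invented, not from
    # the paper): compare forcings from hypothetical AOGCM parameterizations
    # against a line-by-line (LBL) benchmark and report bias and spread.
    import statistics

    lbl_benchmark = 3.80                          # W m^-2, hypothetical LBL forcing
    aogcm_forcings = [3.5, 4.1, 3.2, 3.9, 4.4]    # hypothetical AOGCM values

    bias = [f - lbl_benchmark for f in aogcm_forcings]
    print(f"mean bias   = {statistics.mean(bias):+.2f} W m^-2")
    print(f"spread (sd) = {statistics.stdev(aogcm_forcings):.2f} W m^-2")
    ```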

    [Beware – IPCC incestuosity, scepticism required]

  4. Richard C (NZ) on October 29, 2010 at 10:42 am said:

    Radiative Transfer Climate Models – Google Search

  5. Richard C (NZ) on October 29, 2010 at 11:00 am said:

    Effects of bias in solar radiative transfer codes on global climate model simulations

    Arking 2005

    Department of Earth and Planetary Sciences, Johns Hopkins University, Baltimore, Maryland, USA

    Discussion and Conclusion
    [19] The radiative properties of the clear atmosphere are such that about half the solar radiation incident at TOA is absorbed by the surface, and only 25% is absorbed by the atmosphere. Hence, it is the surface that is the primary source of heat for the troposphere, most of which is in the form of emitted (infrared) radiation, but some of it is in the form of a thermodynamic heat exchange at the surface (comprising sensible and latent heat) that is carried upward by convection. Enhancing atmospheric absorption of solar radiation would transfer to the atmosphere some of the solar energy that would otherwise heat the surface.

    [20] As one might expect from a change in the radiation code which increases the absorption of solar radiation in the atmosphere, energy that is otherwise primarily absorbed by the surface is captured by the atmosphere. Hence, as we see in Figure 3, the convective flux necessarily decreases. The magnitude of the change seen, 15–17 W m⁻² at the surface, probably exaggerates the effect because the simulations were done under clear sky conditions. There is also a small increase in tropospheric temperatures, related to a small increase in column absorption below the tropopause, and a larger increase in stratospheric temperatures due to the increase of absorption above the tropopause.

    [21] In examining the response of the atmosphere to a doubling of CO2, we find the effects of enhanced absorption are much smaller because we are now looking at differences of differences. The effect on the tropospheric temperature is negligible, and the effect on the convective flux response is non-negligible only near the surface when we allow water vapor feedback.
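
    The bookkeeping Arking describes can be put as back-of-envelope arithmetic. In the sketch below the global-mean numbers are illustrative (not taken from the paper’s tables): roughly half of the TOA solar input is absorbed at the surface and a quarter in the atmosphere, and shifting 15–17 W m⁻² of absorption aloft must be balanced by a comparable drop in the convective flux.

    ```python
    # Back-of-envelope partitioning; all numbers approximate and illustrative.
    S_TOA = 342.0               # W m^-2, approximate global-mean solar input
    surface_abs = 0.50 * S_TOA  # roughly half absorbed at the surface
    atmos_abs = 0.25 * S_TOA    # roughly a quarter absorbed in the atmosphere

    # Enhanced atmospheric absorption captures energy that would otherwise
    # heat the surface, so the convective (sensible + latent) flux must fall
    # by a similar amount to restore the surface balance.
    shift = 16.0                # W m^-2, mid-range of the 15-17 quoted above
    print(f"surface absorption:     {surface_abs:.0f} -> {surface_abs - shift:.0f} W m^-2")
    print(f"atmospheric absorption: {atmos_abs:.0f} -> {atmos_abs + shift:.0f} W m^-2")
    ```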

  6. Richard C (NZ) on October 29, 2010 at 11:24 am said:

    Effects of bias in solar radiative transfer codes on global climate model simulations – Google Scholar Search

    Note: this is a better search than:

    radiative transfer codes global climate model simulations

    and contains for example;

    “An accurate parameterization of the infrared radiative properties of cirrus clouds for climate models” Yang 1998

  7. Richard C (NZ) on October 29, 2010 at 11:29 am said:

    Also see – “Clouds in Climate Models”

  8. THREAD on October 29, 2010 at 8:19 pm said:


    Not so Apologetic

    – Climate: “Climate Change Catastrophes in Critical Thinking”
    – Greenhouse Effect: “The Shattered Greenhouse: How Simple Physics Demolishes the ‘Greenhouse Effect’”
    – Volcanic CO2
    – Volcanic CFCs
    – Expanding Earth

    – Most Misquoted

  9. Richard C (NZ) on October 30, 2010 at 9:50 am said:

    See – Climate Models

    NON IPCC and Natural Forcings ONLY

    Atmospheric & Environmental Research, Inc.’s (AER)
    Radiative Transfer Working Group

    The foundation of our research and model development is the validation of line-by-line radiative transfer calculations with accurate high-resolution measurements.

  10. Richard C (NZ) on October 30, 2010 at 1:33 pm said:

    See – “Climate Models”

    NON IPCC and Natural Forcings ONLY

    Cloud Resolving Model (CRM)


  11. THREAD on October 30, 2010 at 4:07 pm said:

    SkepticalScience – a warmist enclave

  12. THREAD on October 30, 2010 at 5:11 pm said:

    Open Threads – Climate Conversation Group

    Climate Science


  13. Richard C (NZ) on October 30, 2010 at 7:05 pm said:

    Steven Goddard responds to Gareth Renowden’s (Hot Topic) ad-hom-ridden puff piece “Buffoons in arms: Goddard joins Monckton at SPPI” with a science smackdown.

    “It is difficult to argue against ad homs, but easy to discuss the science.”

    H/T Andy

  14. THREAD on October 31, 2010 at 8:54 am said:

    David Archibald

    Papers and Presentations.

  15. Richard C (NZ) on October 31, 2010 at 8:55 am said:

    Oops, wrong place – see down page.

  16. Richard C (NZ) on October 31, 2010 at 5:50 pm said:

    From – Discussion and conclusions

    This paper discusses the findings from the Radiative Transfer Model Intercomparison Project (RTMIP). The basic goal of RTMIP is to compare the radiative forcings computed with AOGCMs in the IPCC AR4 against calculations with LBL models.

    [But no comparison with real-world measurements of radiation – Note]

    These results suggest several directions for development of the radiative parameterizations in AOGCMs. First, tests of the accuracy of shortwave and longwave forcings at the surface should be given special attention. Second, the shortwave parameterizations in all the AOGCMs should be enhanced to include the effects of CH4 and optionally N2O on near-infrared radiation. Third, AOGCMs should evaluate the convergence of shortwave radiation in the atmosphere using benchmark calculations. This is a particularly clean test of the radiation physics, and the current models exhibit an improbably large spread of the convergence.

    Efforts to address these issues would have several benefits for the climate-modeling community and for groups using their models in scientific and societal applications. Better agreement of AOGCMs with LBL calculations would lead to greater confidence in simulations of past and future climate. It would also facilitate the analysis of forcing-response relationships from the complex and heterogeneous multi-model ensembles that have become a standard component of international climate-change assessments.

    [But no extension of the “Better agreement of AOGCMs with LBL calculations” recommendation to “LBL calculations agreement with empirical observations” (i.e. the scientific method), as in the AER Radiative Transfer Working Group’s stated foundation of “validation of line-by-line radiative transfer calculations with accurate high-resolution measurements”. See “Climate Science” Atmospheric & Environmental Research, Inc.’s (AER) Radiative Transfer Working Group]

  17. Richard C (NZ) on October 31, 2010 at 6:00 pm said:

    Download error – alternative link:-

  18. Richard C (NZ) on October 31, 2010 at 7:13 pm said:

    On the performance of models:

    “A comparison of local and aggregated climate model outputs with observed data”

    G. G. Anagnostopoulos et al. (2010)

    It is claimed that GCMs provide credible quantitative estimates of future climate change, particularly at continental scales and above. Examining the local performance of the models at 55 points, we found that local projections do not correlate well with observed measurements. Furthermore, we found that the correlation at a large spatial scale, i.e. the contiguous USA, is worse than at the local scale.

    However, we think that the most important question is not whether GCMs can produce credible estimates of future climate, but whether climate is at all predictable in deterministic terms. Several publications, a typical example being Rial et al. (2004), point out the difficulties that the climate system complexity introduces when we attempt to make predictions. “Complexity” in this context usually refers to the fact that there are many parts comprising the system and many interactions among these parts. This observation is correct, but we take it a step further. We think that it is not merely a matter of high dimensionality, and that it can be misleading to assume that the uncertainty can be reduced if we analyse its “sources” as nonlinearities, feedbacks, thresholds, etc., and attempt to establish causality relationships. Koutsoyiannis (2010) created a toy model with simple, fully-known, deterministic dynamics, and with only two degrees of freedom (i.e. internal state variables or dimensions); but it exhibits extremely uncertain behaviour at all scales, including trends, fluctuations, and other features similar to those displayed by the climate. It does so with a constant external forcing, which means that there is no causality relationship between its state and the forcing. The fact that climate has many orders of magnitude more degrees of freedom certainly perplexes the situation further, but in the end it may be irrelevant; for, in the end, we do not have a predictable system hidden behind many layers of uncertainty which could be removed to some extent, but, rather, we have a system that is uncertain at its heart.

    Do we have something better than GCMs when it comes to establishing policies for the future? Our answer is yes: we have stochastic approaches, and what is needed is a paradigm shift. We need to recognize the fact that the uncertainty is intrinsic, and shift our attention from reducing the uncertainty towards quantifying the uncertainty (see also Koutsoyiannis et al., 2009a). Obviously, in such a paradigm shift, stochastic descriptions of hydroclimatic processes should incorporate what is known about the driving physical mechanisms of the processes. Despite a common misconception of stochastics as black-box approaches whose blind use of data disregard the system dynamics, several celebrated examples, including statistical thermophysics and the modelling of turbulence, emphasize the opposite, i.e. the fact that stochastics is an indispensable, advanced and powerful part of physics. Other simpler examples (e.g. Koutsoyiannis, 2010) indicate how known deterministic dynamics can be fully incorporated in a stochastic framework and reconciled with the unavoidable emergence of uncertainty in predictions.
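
    The quoted point about a two-degree-of-freedom deterministic system can be made concrete with any simple chaotic map. The sketch below is not Koutsoyiannis’s actual toy model, just a stand-in (the Hénon map with its standard parameters): the dynamics are fully known and the “forcing” (the parameters) is constant, yet 30-step running means still wander in trend-like excursions.

    ```python
    # Stand-in for a two-degree-of-freedom deterministic toy model: fully
    # known dynamics, constant parameters, wandering multi-step means.
    import numpy as np

    def step(x, y):
        # Henon map with standard parameters: bounded, chaotic, deterministic.
        return 1.0 - 1.4 * x * x + y, 0.3 * x

    x, y = 0.1, 0.1
    series = []
    for _ in range(3000):
        x, y = step(x, y)
        series.append(x)

    running = np.convolve(series, np.ones(30) / 30.0, mode="valid")
    print(f"30-step means wander between {running.min():.2f} and "
          f"{running.max():.2f} under constant forcing")
    ```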

  19. Richard C (NZ) on October 31, 2010 at 8:03 pm said:

    Modeling the Dynamics of Long-Term Variability of Hydroclimatic Processes

    Department of Civil Engineering, Colorado State University, Fort Collins, Colorado
    Department of Statistics, Colorado State University, Fort Collins, Colorado
    Department of Atmospheric Science, and Colorado Climate Center, Colorado State University, Fort Collins, Colorado

    (Manuscript received 26 December 2001, in final form 9 September 2002)

    The stochastic analysis, modeling, and simulation of climatic and hydrologic processes such as precipitation, streamflow, and sea surface temperature have usually been based on assumed stationarity or randomness of the process under consideration. However, empirical evidence of many hydroclimatic data shows temporal variability involving trends, oscillatory behavior, and sudden shifts. While many studies have been made for detecting and testing the statistical significance of these special characteristics, the probabilistic framework for modeling the temporal dynamics of such processes appears to be lacking. In this paper a family of stochastic models that can be used to capture the dynamics of abrupt shifts in hydroclimatic time series is proposed. The applicability of such “shifting mean models” is illustrated by using time series data of annual Pacific decadal oscillation (PDO) indices and annual streamflows of the Niger River.

    Concluding remarks (abridged)
    Empirical evidence has shown that some hydroclimatic processes exhibit abrupt shifting patterns in addition to autocorrelation. Also, it has been documented that outputs from a physically based climate model exhibit abrupt changes on decadal timescales. Modeling the dynamics of such type of processes by using stochastic methods has been the main subject of the research reported herein. Certain stochastic models can replicate such abrupt shifting behavior particularly when apparent abrupt shifts in the mean occur. Such stochastic models do not seek to explain the underlying physical mechanism of the observed sudden shifts. However, they can be capable of generating or simulating equally likely traces or scenarios with features (e.g., the abrupt shifts) that are statistically similar or comparable to those shown by the observed records. Such simulated traces of the hydroclimatic process under consideration (e.g., annual rainfall over an area) may be useful for assessing the availability of resources in a future horizon and identifying the vulnerability space of specified resources.
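
    A “shifting mean model” of the kind proposed can be sketched in a few lines: the mean jumps to a new random level at random epochs, with ordinary noise superposed. All parameter values below are invented for illustration; the paper fits such models to the PDO and Niger River records.

    ```python
    # Minimal shifting-mean simulation; parameters invented for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    n_years, t, series = 500, 0, []
    while t < n_years:
        duration = int(rng.geometric(1.0 / 30.0))  # regime length, mean ~30 yr
        level = rng.normal(0.0, 1.0)               # new regime mean
        span = min(duration, n_years - t)
        series.extend(level + rng.normal(0.0, 0.5, span))
        t += span

    series = np.asarray(series)
    print(f"simulated {series.size} years; overall sd = {series.std():.2f}")
    ```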

  20. Richard C (NZ) on October 31, 2010 at 8:25 pm said:

    Stochastic Hydrology

    Advanced application of statistics and probability to hydrology as applied in the modeling of hydro-climatic sequences

    Computer rendering of stochastic models

    Fournier 1982

  21. Richard C (NZ) on October 31, 2010 at 8:33 pm said:

    Note – Arking 2005 predates recent findings of negative feedbacks from clouds

  22. val majkus on November 1, 2010 at 10:47 am said:

    Here’s the inimitable Dr Tim Ball on data manipulation:
    “Data collection is expensive and requires continuity – it’s a major role for government. They fail with weather data because money goes to political climate research. A positive outcome of corrupted climate science exposed by Climategate is re-examination beginning with raw data by the UK Met Office (UKMO). This is impossible because much is lost, thrown out after modification or conveniently lost, as in the case of records held by Phil Jones, the CRU director at the centre of Climategate. (Here and Here)

    Evidence of manipulation and misrepresentation of data is everywhere. Countries maintain weather stations and adjust the data before it’s submitted through the World Meteorological Organization (WMO) to the central agencies including the Global Historical Climatology Network (GHCN), the Hadley Center associated with CRU now called CRUTEM3, and NASA’s Goddard Institute for Space Studies (GISS).

    They make further adjustments before selecting stations to produce their global annual average temperature. This is why they produce different measures each year from supposedly similar data.

    There are serious concerns about data quality. The US spends more than others on weather stations, yet their condition and reliability are simply atrocious. Anthony Watts has documented the condition of US weather stations; it is one of government’s failures.” (end of quote)

    And Ken Stewart has shown how this has happened in Australia.
    His conclusion:
    “In a commendable effort to improve the state of the data, the Australian Bureau of Meteorology (BOM) has created the Australian High-Quality Climate Site Network. However, the effect of doing so has been to introduce into the temperature record a warming bias of over 40%. And their climate analyses on which this is based appear to increase this even further, to around 66%.”

    Has anyone done similar checking in New Zealand?

  23. THREAD on November 1, 2010 at 1:17 pm said:


    Ken Stewart

  24. Mike Jowsey on November 1, 2010 at 11:53 pm said:

    Richards S. Courtenay rocks

    Love this one:
    “So, the difference between a model’s results and observed reality informs about the model, and this difference is not “evidence” for the existence or otherwise of any postulated effect – for example, anthropogenic global warming – in the real climate system.
    If you cannot grasp this simple point then you should consider the following. Computer models based on fundamental physical laws can very accurately emulate the behaviours of battling spaceships, but this cannot provide any “evidence” for the existence of alien monsters in real space.”

  25. Mike Jowsey on November 1, 2010 at 11:55 pm said:

    Oops – apologies for mis-spelling name : hate it when that happens! S/b Richard S. Courtney

  26. I’d be getting a bit worried about this particular computer model:

    Climate change game launched

    An educational computer game in which users have to save the world from climate change offers an interesting solution – decide the problem is overpopulation and design a virus to kill millions.

    Fancy a bit of genocide in your stocking this Christmas, Johnny?

  27. Mike Jowsey on November 2, 2010 at 9:31 am said:

    OMG – this is for real! Reminds me of Hitler Youth camps – indoctrinate, indoctrinate, indoctrinate. THE END JUSTIFIES THE MEANS!

  28. Mike,
    Sounds like you and Richard North are reading the same messages into this:

  29. Richard C (NZ) on November 2, 2010 at 10:01 am said:

    Is this another own goal like 10:10?

    I’m putting this in “Controversy and scandal” anyway.

  30. Richard C (NZ) on November 3, 2010 at 3:52 pm said:

    the inconvenient SKEPTIC

    Radiative Heat Transfer: Simple Overview – first in series

    Discussion in comments

  31. Richard C (NZ) on November 9, 2010 at 9:47 am said:


    Overconfidence in IPCC’s detection and attribution: Part III

    October 24, 2010

    by Judith Curry

    Circularity in the argument

    Apart from the issue of the actual logic used for reasoning, there is circularity in the argument that is endemic to whatever reasoning logic is used. Circular reasoning is a logical fallacy whereby the proposition to be proved is assumed in one of the premises.

    The most serious circularity enters into the determination of the forcing data. Given the large uncertainties in forcings and model inadequacies (including a factor of 2 difference in CO2 sensitivity), how is it that each model does a credible job of tracking the 20th century global surface temperature anomalies (AR4 Figure 9.5)? This agreement is accomplished through each modeling group selecting the forcing data set that produces the best agreement with observations, along with model kludges that include adjusting the aerosol forcing to produce good agreement with the surface temperature observations. If a model’s sensitivity is high, it is likely to require greater aerosol forcing to counter the greenhouse warming, and vice versa for a low model sensitivity. The proposition to be proved (#7) is assumed in premise #3 by virtue of kludging of the model parameters and the aerosol forcing to agree with the 20th century observations of surface temperature. Any climate model that uses inverse modeling to determine any aspect of the forcing substantially weakens the attribution argument owing to the introduction of circular reasoning.
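
    The compensation Curry describes is easy to see numerically. In the sketch below all values are invented (the forcing ranges loosely echo the Kiehl 2007 figures quoted in the next comment): a high-sensitivity and a low-sensitivity model reproduce exactly the same 20th-century warming because each has paired its sensitivity with a convenient aerosol forcing.

    ```python
    # Illustrative compensation arithmetic; every number here is invented.
    models = [
        # (label, lambda in K per W m^-2, GHG forcing, aerosol forcing)
        ("high-sensitivity model", 1.0, 2.6, -1.4),
        ("low-sensitivity model",  0.6, 2.6, -0.6),
    ]
    for label, lam, f_ghg, f_aer in models:
        dT = lam * (f_ghg + f_aer)  # linearised response: dT = lambda * F_total
        print(f"{label}: dT = {dT:.2f} K")  # both print 1.20 K
    ```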

  32. Richard C (NZ) on November 9, 2010 at 10:00 am said:


    Overconfidence in IPCC’s detection and attribution: Part III

    October 24, 2010

    by Judith Curry

    Circularity in the argument

    Richard S Courtney | October 25, 2010 at 4:48 am | Reply

    Dr Curry:

    I think the problem with the models is more profound than you state when you write:

    “The most serious circularity enters into the determination of the forcing data. Given the large uncertainties in forcings and model inadequacies (including a factor of 2 difference in CO2 sensitivity), how is it that each model does a credible job of tracking the 20th century global surface temperature anomalies (AR4 Figure 9.5)? This agreement is accomplished through each modeling group selecting the forcing data set that produces the best agreement with observations, along with model kludges that include adjusting the aerosol forcing to produce good agreement with the surface temperature observations.”

    I stated my assessment on a previous thread of your blog and I take the liberty of copying it here because I think it goes to the heart of the issue of “Overconfidence”.

    My comment was in the thread titled “What can we learn from climate models” that is at

    It was as follows:

    “Richard S Courtney | October 6, 2010 at 6:07 am | Reply

    Dr Curry:

    Thank you for your thoughtful and informative post.

    In my opinion, your most cogent point is:

    “Particularly for a model of a complex system, the notion of a correct or incorrect model is not well defined, and falsification is not a relevant issue. The relevant issue is how well the model reproduces reality, i.e. whether the model “works” and is fit for its intended purpose.”

    However, in the case of climate models it is certain that they do not reproduce reality and are totally unsuitable for the purposes of future prediction (or “projection”) and attribution of the causes of climate change.

    All the global climate models and energy balance models are known to provide indications which are based on the assumed degree of forcings resulting from human activity, with anthropogenic aerosol cooling input to each model as a ‘fiddle factor’ to obtain agreement between past average global temperature and the model’s indications of average global temperature. This ‘fiddle factor’ is wrongly asserted to be parametrisation.

    A decade ago I published a peer-reviewed paper that showed the UK’s Hadley Centre general circulation model (GCM) could not model climate and only obtained agreement between past average global temperature and the model’s indications of average global temperature by forcing the agreement with an input of assumed anthropogenic aerosol cooling.

    And my paper demonstrated that the assumption of anthropogenic aerosol effects being responsible for the model’s failure was incorrect.

    (ref. Courtney RS ‘An assessment of validation experiments conducted on computer models of global climate using the general circulation model of the UK’s Hadley Centre’ Energy & Environment, Volume 10, Number 5, pp. 491-502, September 1999).

    More recently, in 2007, Kiehl published a paper that assessed 9 GCMs and two energy balance models.

    (ref. Kiehl JT, Twentieth century climate model response and climate sensitivity. GRL vol. 34, L22710, doi:10.1029/2007GL031383, 2007).

    Kiehl found the same as my paper except that each model he assessed used a different aerosol ‘fix’ from every other model.

    He says in his paper:

    “One curious aspect of this result is that it is also well known [Houghton et al., 2001] that the same models that agree in simulating the anomaly in surface air temperature differ significantly in their predicted climate sensitivity. The cited range in climate sensitivity from a wide collection of models is usually 1.5 to 4.5 deg C for a doubling of CO2, where most global climate models used for climate change studies vary by at least a factor of two in equilibrium sensitivity.

    The question is: if climate models differ by a factor of 2 to 3 in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy?

    Kerr [2007] and S. E. Schwartz et al. (Quantifying climate change – too rosy a picture?, available at ) recently pointed out the importance of understanding the answer to this question. Indeed, Kerr [2007] referred to the present work and the current paper provides the “widely circulated analysis” referred to by Kerr [2007]. This report investigates the most probable explanation for such an agreement. It uses published results from a wide variety of model simulations to understand this apparent paradox between model climate responses for the 20th century, but diverse climate model sensitivity.”

    And Kiehl’s paper says:

    “These results explain to a large degree why models with such diverse climate sensitivities can all simulate the global anomaly in surface temperature. The magnitude of applied anthropogenic total forcing compensates for the model sensitivity.”

    And the “magnitude of applied anthropogenic total forcing” is fixed in each model by the input value of aerosol forcing.

    Kiehl’s Figure 2 can be seen at
    Please note that it is for 9 GCMs and 2 energy balance models, and its title is:
    “Figure 2. Total anthropogenic forcing (W m⁻²) versus aerosol forcing (W m⁻²) from nine fully coupled climate models and two energy balance models used to simulate the 20th century.”

    The graph shows that the anthropogenic forcings used by the models span a large range of total anthropogenic forcing, from 1.22 W/m^2 to 2.02 W/m^2, with each of these values compensated to agree with observations by use of assumed anthropogenic aerosol forcing in the range -0.6 W/m^2 to -1.42 W/m^2. In other words, the total anthropogenic forcings used by the models vary by a factor of almost 2, and this difference is compensated by assuming values of anthropogenic aerosol forcing that vary by a factor of almost 2.4.

    Anything can be adjusted to hindcast observations by permitting that range of assumptions. But there is only one Earth, so at most only one of the models can approximate the climate system which exists in reality.

    The underlying problem is that the modellers assume additional energy content in the atmosphere will result in an increase of temperature, but that assumption is very, very unlikely to be true.

    Radiation physics tells us that additional greenhouse gases will increase the energy content of the atmosphere. But energy content is not necessarily sensible heat.

    An adequate climate physics (n.b. not radiation physics) would tell us how that increased energy content will be distributed among all the climate modes of the Earth. Additional atmospheric greenhouse gases may heat the atmosphere, they may have an undetectable effect on heat content, or they may cause the atmosphere to cool.

    The latter could happen, for example, if the extra energy went into a more vigorous hydrological cycle with resulting increase to low cloudiness. Low clouds reflect incoming solar energy (as every sunbather has noticed when a cloud passed in front of the Sun) and have a negative feedback on surface temperature.

    Alternatively, there could be an oscillation in cloudiness (in a feedback cycle) between atmospheric energy and hydrology: as the energy content cycles up and down with cloudiness, then the cloudiness cycles up and down with energy with their cycles not quite 180 degrees out of phase (this is analogous to the observed phase relationship of insolation and atmospheric temperature). The net result of such an oscillation process could be no detectable change in sensible heat, but a marginally observable change in cloud dynamics.

    However, nobody understands cloud dynamics so the reality of climate response to increased GHGs cannot be known.

    So, the climate models are known to be wrong, and it is known why they are wrong: i.e.

    1. they each emulate a different climate system and are each differently adjusted by use of ‘fiddle factors’ to get them to match past climate change,

    2. and the ‘fiddle factors’ are assumed (n.b. not “estimated”) forcings resulting from human activity,

    3. but there is only one climate system of the Earth so at most only one of the models can be right,

    4. and there is no reason to suppose any one of them is right,

    5. but there is good reason to suppose that they are all wrong because they cannot emulate cloud processes which are not understood.

    Hence, use of the models is very, very likely to provide misleading indications of future prediction (or “projection”) of climate change and is not appropriate for attribution of the causes of climate change.”



  33. Richard C (NZ) on November 11, 2010 at 12:09 pm said:

    Why ‘Science of Doom’ Doesn’t Understand the 1st Law of Thermodynamics

    Tuesday, November 9, 2010

    The alarmist Science of Doom blog post Do Trenberth and Kiehl understand the First Law of Thermodynamics? shows an absurd ‘thought experiment’ riddled with errors that is supposed to support and explain the IPCC Earth energy budget of Kevin Trenberth. In brief, the post shows a 30,000 Watt energy source [“earth”] enclosed inside a 3 meter thick PVC sphere [“the atmosphere”] and erroneously calculates that the 30,000 Watts miraculously turns into 1,824,900 Watts emitted from the inner surface of the PVC sphere. To anyone with a rudimentary understanding of physics, this is an obvious violation of the conservation of energy demanded by the 1st Law of Thermodynamics. Let’s examine the math errors of ‘Science of Doom’ (a climate scientist named Dr. Phillips) that produce this absurd result, beginning with the thought experiment diagram and assumptions:
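
    Whichever side of this dispute one takes, the constraint the 1st Law actually imposes is on net flows: at steady state, the net power crossing any closed surface around the source, including the outer surface of the shell, must equal the 30,000 W supplied. The sketch below computes only that uncontroversial outer-surface condition; the radius and emissivity are assumed (neither blog post’s geometry is reproduced here), and it deliberately says nothing about the gross emission of the inner surface, which is the quantity the two blogs disagree over.

    ```python
    # Neutral back-of-envelope: the steady-state outer-surface condition.
    import math

    P = 30_000.0        # W, internal source from the thought experiment
    R_OUTER = 5.0       # m, assumed outer radius (not specified in the quote)
    SIGMA = 5.67e-8     # W m^-2 K^-4, Stefan-Boltzmann constant
    EPS = 1.0           # assumed blackbody outer surface radiating to 0 K

    area = 4.0 * math.pi * R_OUTER ** 2
    T_outer = (P / (EPS * SIGMA * area)) ** 0.25
    print(f"outer flux = {P / area:.1f} W m^-2; T_outer = {T_outer:.0f} K")
    ```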

  34. Richard C (NZ) on November 12, 2010 at 5:41 pm said:


    Seafriends Marine Conservation and Education Centre. 7 Goat Island Rd. Leigh R.D.5. New Zealand

  35. val majkus on May 16, 2011 at 11:39 pm said:

    For Australians:

    Here’s a copy of an e mail I received today
    A group set up with Alan Jones as patron and some very eminent advisors, including Warwick Hughes
    And another media personality Andrew Bolt
    I heartily endorse the site; it’s time to get the message that people like Warwick have been researching for so long out before the public and into the media

    so I hope you can donate and encourage this effort

    and here’s the e mail

    It’s time: an opportunity to support true scientists presenting real-world climate science to inform the public.

    At 7:10am Tuesday morning, May 17th, and Wednesday, May 18th, tune in to radio 2GB, Sydney:

    16 May, 2011
    We’re delighted to be able to introduce you to the web site of the newly-formed Galileo Movement. The principal aim of this Movement is to first win the battle against the currently threatened tax on carbon dioxide and then to win the war against any drive for ever putting a price on it.
    From recent public polls it is obvious that the majority of Australians are opposed to this tax and we believe you are probably part of that majority.
    Our efforts are non-political as there are politicians in all parties who need to be convinced of the futility of taxing a beneficial trace gas in the atmosphere. So many people still think that carbon dioxide is a harmful pollutant; we need your help to educate these people that this is not so! We have strong connections to excellent scientific and other advisers worldwide. We’re ready.
    How can you help? Well, on the web site there is a printable flyer which you can print and distribute into letter boxes, at bus stops, shopping centres and railway stations; click on “flyer”.
    More importantly, we intend to run a major, national, professionally-managed campaign to gain access to significant members of the mass media (print journalists, radio and TV personalities) who need to understand that the Intergovernmental Panel on Climate Change (IPCC) has been spreading truly false information. This campaign will be costly, so we are appealing to you and all like-minded people to make a contribution towards this effort. You will note from our web site that virtually all our efforts are voluntary; there are some minor costs in maintaining formal financial records and in producing and maintaining the web site. Once we have won the war, any remaining funds are destined to go to the Royal Flying Doctor Service.
    Please distribute this email widely to make people aware, as this may be our last opportunity to unite to defeat this negative, economic and socially life-changing legislation. It is time to act: “The triumph of evil requires only for good men (and women) to do nothing”.
    Thank you,

    John Smeed, Director
    Case Smit, Director
    Malcolm Roberts, Project Manager
    P.S. Our apologies if you receive more than one copy of this email.
    P.P.S. Please let us have your suggestions for enhancing our web site.
    P.P.P.S. Listen to Australia’s most influential radio personality Alan Jones of Radio 2GB, Patron of the Galileo Movement, who will be interviewing distinguished American meteorologist Professor Lindzen at 7:10am Tuesday (May 17th) and at 7:10am on Wednesday. – The Galileo Movement’s voluntary Project Manager, Malcolm Roberts

    Malcolm Roberts
    BE (Hons), MBA (Chicago)
    Fellow AICD, MAIM, MAusIMM, MAME (USA), MIMM (UK), Fellow ASQ (USA, Aust)

    My personal declaration of interests is at:
    (manually go to and look for ‘Summaries’ and then click on ‘Aims, background and declaration of interests …’)

    180 Haven Road
    Pullenvale QLD 4069
    Home 07 3374 3374
    Mobile 04 1964 2379

    Please note: Apart from suburb and state, my contact details are not for publication nor broadcasting and are provided only for your own personal use to respond.

    For care to be effective, care needs to be informed

  36. Climate Extremes and Extreme Weather

  37. Richard C (NZ) on November 16, 2011 at 8:06 am said:

    Climate Extremes and Global Warming

    By ANDREW C. REVKIN – Dot Earth


    Passions are heightened by extraordinary recent climate-related disasters and concerns in poor countries that they’re already being affected by a greenhouse-gas buildup mainly caused (so far) by rich countries.

    But despite decades of work on greenhouse-driven warming, research aimed at clarifying how greenhouse-driven global warming will affect the rarest categories of raging floods, searing droughts and potent storms is both limited and laden with uncertainty.


    See Q and A with Chris Field, a leader of the panel’s [IPCC] Working Group 2 “Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX)”


  38. Richard C (NZ) on November 16, 2011 at 8:19 am said:

    Details and reaction to the IPCC Special Report:

    “Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX)”

    can be found in this (off-topic) thread under the “‘Monster’ increase in emissions” post starting here:-

  39. Richard C (NZ) on November 18, 2011 at 7:42 am said:

    Review fails to support climate change link

    * by: Graham Lloyd, Environment editor
    * From: The Australian
    * November 18, 2011 12:00AM

    WIDELY-HELD assumptions that climate change is responsible for an upsurge in extreme drought, flood and storm events are not supported by a landmark review of the science.

    And a clear climate change signal would not be evident for decades because of natural weather variability.

    Despite the uncertainties, politicians – including US President Barack Obama in his address to federal parliament yesterday – continue to link major weather events directly to climate change.

    Greens leader Bob Brown yesterday highlighted Mr Obama’s climate change comments and said the extreme weather impacts were “not just coming, they are happening”.

    But rather than bolster claims of a climate change link, the scientific review prepared by the world’s leading climate scientists for the UN’s Intergovernmental Panel on Climate Change highlights the level of uncertainty. After a week of debate, the IPCC will tonight release a summary of the report in Kampala, Uganda, as a prelude to the year’s biggest climate change conference, being held in Durban, South Africa.

    The full report will not be released for several months, but a leaked copy of the draft summary, details of which have been published by the BBC and a French news agency, has provided a good indication of what it found.

    While the human and financial toll of extreme weather events has certainly risen, the cause has been mostly due to increased human settlement rather than worse weather.


  40. Richard C (NZ) on November 18, 2011 at 6:44 pm said:

    IPCC scientists test the Exit doors

    RE: Mixed messages on climate ‘vulnerability’. Richard Black, BBC.

    AND UPDATED: The Australian reports the leaked IPCC review, AND a radio station just announced it as “IPCC says we don’t know if there is a reason for the carbon tax”. See more below.
    This is another big tipping point on the slide out of the Great Global Scam. IPCC scientists — facing the travesty of predictions-gone-wrong — are trying to salvage some face, and plant some escape-clause seeds for later. But people are not stupid.

    A conveniently leaked IPCC draft is testing the ground. What excuses can they get away with? Hidden underneath some pat lines about how anthropogenic global warming is “likely” to influence… ah cold days and warm days, is the get-out-of-jail clause that’s really a bombshell:

    “Uncertainty in the sign of projected changes in climate extremes over the coming two to three decades is relatively large because climate change signals are expected to be relatively small compared to natural climate variability”.

  41. Richard C (NZ) on December 7, 2011 at 9:09 am said:

    Common link in extreme weather events found – and no, it isn’t AGW

    Posted on December 5, 2011 by Anthony Watts

    From the University of Wisconsin-Madison, something you’ll never see posted on Climate Progress or mentioned by weepy Bill McKibben, because it harshes their mellow

    Global winds could explain record rains, tornadoes

    MADISON – Two talks at a scientific conference this week will propose a common root for an enormous deluge in western Tennessee in May 2010, and a historic outbreak of tornadoes centered on Alabama in April 2011.

    Both events seem to be linked to a relatively rare coupling between the polar and the subtropical jet streams, says Jonathan Martin, a University of Wisconsin-Madison professor of atmospheric and oceanic sciences.

    But the fascinating part is that the change originates in the western Pacific, about 9,000 miles away from the intense storms in the U.S. midsection, Martin says.

    The mechanism that causes the storms originates during spring or fall when organized complexes of tropical thunderstorms over Indonesia push the subtropical jet stream north, causing it to merge with the polar jet stream.

    The subtropical jet stream is a high-altitude band of wind that is normally located around 30 degrees north latitude. The polar jet stream is normally hundreds of miles to the north.

    Martin calls the resulting band of wind a “superjet.”


  42. Richard C (NZ) on December 26, 2011 at 7:30 am said:

    Looks like alarm will continue to focus on extreme weather in 2012 – a great new greenfield opportunity for climate science funding. Here’s something to watch for mid-year (a heads-up, if you will):-

    Harsh Political Reality Slows Climate Studies Despite Extreme Year

    Published: December 24, 2011

    At the end of one of the most bizarre weather years in American history, climate research stands at a crossroads.

    But for many reasons, efforts to put out prompt reports on the causes of extreme weather are essentially languishing. Chief among the difficulties that scientists face: the political environment for new climate-science initiatives has turned hostile, and with the federal budget crisis, money is tight.

    […page 2]

    Some steps are being taken. Peter A. Stott, a leading climate scientist in Britain, has been pressing colleagues on both sides of the Atlantic to develop a robust capability to analyze weather extremes in real time. He is part of a group that expects to publish, next summer, the first complete analysis of a full year of extremes, focusing on 2011.

    In an interview, Dr. Stott said the goal was to get to a point where “the methodologies are robust enough that you can do it in a kind of handle-turning way.”

    But he added that it was important to start slowly and establish a solid scientific foundation for this type of work. That might mean that some of the early analyses would not be especially satisfying to the public.

    “In some cases, we would say we have a confident result,” Dr. Stott said. “We may in some cases have to say, with the current state of the science, it’s not possible to make a reliable attribution statement at this point.”

  43. Richard C (NZ) on December 26, 2011 at 7:37 am said:

    Tom Nelson headlines the article a little differently:-

    Another lie from your New York Times: “the weather becomes more erratic by the year”

  44. Richard C (NZ) on January 8, 2012 at 11:42 am said:

    Useful tutorials from Steven Goddard at Real Science

    Learning To Distinguish Between Low CO2 And High CO2 Droughts | Real Science

    Droughts on the left side of the blue line were below 350 ppm CO2, and droughts on the right side were above 350 ppm. Before you can distinguish between them, you need to accurately determine how many angels can dance on the head of a pin. Then you have to tell the press that droughts seem like they are getting worse.

    Learning To Distinguish Between Low CO2 Tornadoes And High CO2 Tornadoes | Real Science

    Tornadoes to the left of the pink line are below 350 PPM tornadoes, and tornadoes to the right are high CO2 (supercharged) tornadoes. Can you spot the difference?

  45. Richard C (NZ) on April 6, 2013 at 11:35 am said:

    Failed winter climate predictions

    (The first 33 concern mostly Germany and Central Europe)

    1. “Due to global warming, the coming winters in the local regions will become milder.”
    Stefan Rahmstorf, Potsdam Institute of Climate Impact Research, University of Potsdam, 8 Feb 2006


    2. “Milder winters, drier summers: Climate study shows a need to adapt in Saxony Anhalt.”
    Potsdam Institute for Climate Impact Research, Press Release, 10 Jan 2010.


    3. “More heat waves, no snow in the winter” … “Climate models… over 20 times more precise than the UN IPCC global models. In no other country do we have more precise calculations of climate consequences. They should form the basis for political planning. … Temperatures in the wintertime will rise the most … there will be less cold air coming to Central Europe from the east. … In the Alps winters will be 2°C warmer already between 2021 and 2050.”
    Max Planck Institute for Meteorology, Hamburg, 2 Sept 2008.

    More >>>>>>>> [59 so far]

  46. Richard C (NZ) on April 6, 2013 at 12:18 pm said:

    Matt Ridley’s diary: My undiscovered island, and the Met Office’s computer problem


    At least somebody’s happy about the cold. Gary Lydiate runs one of Northumberland’s export success stories, Kilfrost, which manufactures 60 per cent of Europe’s and a big chunk of the world’s aircraft de-icing fluid, so he puts his money where his mouth is, deciding how much fluid to send to various airports each winter. Back in January, when I bumped into him in a restaurant, he was beaming: ‘Joe says this cold weather’s going to last three months,’ he said. Joe is Joe Bastardi, a private weather forecaster, who does not let global warming cloud his judgment. Based on jetstreams, el Niños and ocean oscillations, Bastardi said the winter of 2011–12 would be cold only in eastern Europe, which it was, but the winter of 2012–13 would be cold in western Europe too, which it was. He’s now predicting ‘warming by mid month’ of April for the UK.

    More >>>>>
