Climate Models

This thread is for discussion of computer climate models, or General Circulation Models (GCMs).

28 Thoughts on “Climate Models”

  1. Well, well, it looks like someone got the models wrong again.

    How often have we heard that droughts will increase due to global warming? It’s the single most-quoted effect that alarmists use when discussing Africa, for example.

    Seems they were wrong.
    http://www.nature.com/nature/journal/vaop/ncurrent/full/nature11377.html

    We find no evidence in our analysis of a positive feedback—that is, a preference for rain over wetter soils—at the spatial scale (50–100 kilometres) studied. In contrast, we find that a positive feedback of soil moisture on simulated precipitation does dominate in six state-of-the-art global weather and climate models—a difference that may contribute to excessive simulated droughts in large-scale models.

    This is why these blokes should have checked their models before shouting about the end of the world.

  2. Richard C (NZ) on September 18, 2012 at 8:51 pm said:

    New paper shows negative feedback from clouds ‘may damp global warming’

    A paper published today in The Journal of Climate uses a combination of two modelling techniques to find that negative feedback from clouds could result in “a 2.3-4.5% increase in [model projected] cloudiness” over the next century, and that “subtropical stratocumulus [clouds] may damp global warming in a way not captured by the [Global Climate Models] studied.” This strong negative feedback from clouds could alone negate the 3C alleged anthropogenic warming projected by the IPCC.

    As Dr. Roy Spencer points out in his book,

    “The most obvious way for warming to be caused naturally is for small, natural fluctuations in the circulation patterns of the atmosphere and ocean to result in a 1% or 2% decrease in global cloud cover. Clouds are the Earth’s sunshade, and if cloud cover changes for any reason, you have global warming — or global cooling.”

    According to the authors of this new paper, current global climate models “predict a robust increase of 0.5-1 K in EIS over the next century, resulting in a 2.3-4.5% increase in [mixed layer model] cloudiness.”

    EIS, or estimated inversion strength, has been shown by observations to be correlated with cloudiness, as demonstrated by the 2nd graph [at the link] from the University of Washington, indicating a 1 K increase in EIS results in an approximate 4-5% increase in low cloud cover [CF or cloud fraction]. Thus, a combination of observational data and modelling indicates clouds have a strong net negative feedback upon global warming that is “not captured” by current climate models.

    CMIP3 Subtropical Stratocumulus Cloud Feedback Interpreted Through a Mixed-Layer Model

    PETER M. CALDWELL,* YUNYAN ZHANG, and STEPHEN A. KLEIN

    >>>>>>>

    http://hockeyschtick.blogspot.co.nz/2012/09/new-paper-shows-negative-feedback-from.html
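
    # # #

    A rough way to see the size of the effect Spencer describes (my own back-of-envelope sketch, not from the Caldwell et al paper): take a global-mean shortwave “sunshade” effect of clouds of roughly -47 W/m2 (an assumed CERES-era figure), suppose it scales linearly with cloud cover, and ignore the partly compensating longwave effect of clouds.

      # Rough scaling of Spencer's "1% or 2%" cloud-cover change into W/m2.
      # The -47 W/m2 global-mean shortwave cloud effect is an assumed figure,
      # and linear scaling with cloud cover is a simplification.
      SW_CLOUD_EFFECT = -47.0                  # W/m2, assumed global mean
      for relative_change in (0.01, 0.02):     # 1% and 2% less cloud cover
          extra_absorbed = -SW_CLOUD_EFFECT * relative_change
          print(relative_change, round(extra_absorbed, 2))   # ~0.5 and ~0.9 W/m2

    Even a 1-2% change in cover is then of the order of 0.5-1 W/m2, which is why small cloud changes loom so large in these arguments.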

  3. Richard C (NZ) on October 18, 2012 at 7:54 am said:

    Climate change research gets petascale supercomputer

    1.5-petaflop IBM Yellowstone system runs 72,288 Intel Xeon cores

    Computerworld – Scientists studying Earth system processes, including climate change, are now working with one of the largest supercomputers on the planet.

    The National Center for Atmospheric Research (NCAR) has begun using a 1.5 petaflop IBM system, called Yellowstone, that is among the top 20 supercomputers in the world, at least until the global rankings are updated next month.

    For NCAR researchers it is an enormous leap in compute capability — a roughly 30 times improvement over its existing 77 teraflop supercomputer. Yellowstone is a 1,500 teraflops system capable of 1.5 quadrillion calculations per second.

    The NCAR-Wyoming Supercomputing Center in Cheyenne, where this system is housed, says that with Yellowstone, it now has “the world’s most powerful supercomputer dedicated to geosciences.”

    Along with climate change, this supercomputer will be used on a number of geoscience research issues, including the study of severe weather, oceanography, air quality, geomagnetic storms, earthquakes and tsunamis, wildfires, subsurface water and energy resources.

    […]

    Scientists will be able to use the supercomputer to model the regional impacts of climate change. A model with a 100 km (62 mile) grid is considered coarse because each grid cell covers a large area. But this new system may be able to refine the grid to as little as 10 km (6.2 miles), giving scientists the ability to examine climate impacts in greater detail.

    […]

    Yellowstone is running in a new $70 million data center. The value of the supercomputer contract was put at $25 million to $35 million. It has 100 racks, with 72,288 compute cores from Intel Sandy Bridge processors.

    >>>>>>>>

    http://www.computerworld.com/s/article/9232382/Climate_change_research_gets_petascale_supercomputer

    Rather large energy requirement too – “The facility was designed with a total capacity of 4 to 5 megawatts of electricity, but with Yellowstone now in production, usage is considerably lower. Total power for computing, cooling, office, and support functions has averaged 1.8 to 2.1 MW”

    NCAR-Wyoming Supercomputing Center
    Fact Sheet

    https://www2.ucar.edu/atmosnews/news/nwsc-fact-sheet

  4. Richard C (NZ) on October 18, 2012 at 8:39 am said:

    I queried John Christy as to which modeling group it was that has mimicked absolute temperature and trajectory this century so far in his EPS statement Figure 2.1. This was his reply:-

    Richard:

    This model labeled 27 should be inmcm4 (Russia)

    http://www.springerlink.com/content/x6647x575g82734j/

    John C.

    John R. Christy
    Director, Earth System Science Center
    Distinguished Professor, Atmospheric Science
    University of Alabama in Huntsville
    Alabama State Climatologist

    • Richard C (NZ) on October 18, 2012 at 8:55 am said:

      What did the Russians do that everyone else didn’t in CMIP5 for AR5? Did they ramp GHG forcing down to zero I wonder? They do say there were “some changes in the formulation”

      Abstract

      The INMCM3.0 climate model has formed the basis for the development of a new climate-model version: the INMCM4.0. It differs from the previous version in that there is an increase in its spatial resolution and some changes in the formulation of coupled atmosphere-ocean general circulation models. A numerical experiment was conducted on the basis of this new version to simulate the present-day climate. The model data were compared with observational data and the INMCM3.0 model data. It is shown that the new model adequately reproduces the most significant features of the observed atmospheric and oceanic climate. This new model is ready to participate in the Coupled Model Intercomparison Project Phase 5 (CMIP5), the results of which are to be used in preparing the fifth assessment report of the Intergovernmental Panel on Climate Change (IPCC).

      # # #

      Good to see a modeling group validating their model against observations (a GCM group, that is – RTM [radiative transfer model] groups do this religiously) – this is a major breakthrough.

    • Richard C (NZ) on October 18, 2012 at 1:29 pm said:

      Simulating Present-Day Climate with the INMCM4.0 Coupled Model of the Atmospheric and Oceanic General Circulations

      E. M. Volodin, N. A. Dianskii, and A. V. Gusev, 2010

      Institute of Numerical Mathematics, Russian Academy of Sciences, ul. Gubkina 8, Moscow, 119991 Russia
      e-mail: volodin@inm.ras.ru

      http://83.149.207.89/GCM_DATA_PLOTTING/documents/PhysAtm4_10VolodinLO.pdf

      Page 2:-

      This makes it possible to analyze systematic errors in simulating the present-day climate and to assess the range of its possible changes caused, for example, by anthropogenic forcing.

      Page 3:-

      On the basis of this model, a numerical experiment was carried out to simulate the modern climate. To this end, the concentrations specified for all radiatively active gases and aerosols corresponded to those in 1960.

      Page 4:-

      Name: Air temperature at the surface °C
      Observations: 14.0 ± 0.2 [34]
      INMCM3.0: 13.0 ± 0.1
      INMCM4.0: 13.7 ± 0.1

      Page 4:-

      The 1951–2000 NCEP reanalysis data [31] were used to compare the model atmospheric dynamics with observational data, and data from [32–41] were used to compare the integral atmospheric characteristics.

      Page 3:-

      The parameterizations of the basic physical processes in the model have changed only slightly; namely, some of the tuning parameters have changed. Among these are the parameterizations of radiation [18],

      18. V. Ya. Galin, “Parametrization of Radiative Processes
      in the DNM Atmospheric Model,”

      ‘Parametrization of radiative processes in the DNM atmospheric model’

      Galin, V.Y. [Russian Academy of Sciences, Moscow (Russian Federation)]
      1998

      https://www.etde.org/etdeweb/details_open.jsp?osti_id=300295

      Abstract:
      The radiative code of the atmospheric model (DNM model) of the Institute of Numerical Mathematics (IVM), Russian Academy of Sciences is described. The code uses spectral transmission functions and the delta-Eddington approximation to take into account the absorption and scattering of radiation in the atmosphere due to atmospheric gases, aerosols, and clouds. The simplest regularization procedure in combination with the nonmonotonic factorization method is used to find a stable solution to the ill-conditioned system of delta-Eddington equations. Computation algorithms are presented, and the results obtained are compared to both the data of benchmark line-by-line calculations and the model data of ICRCCM international radiative programs. It was found that the DNM model yields a high accuracy of computing the thermal and solar radiation.

      # # #

      Unfortunately I can’t access the body of the Galin paper. Unfortunate because the “absorption and scattering” characteristics of CO2 used (and any changes made in INMCM4.0) would make VERY interesting reading.

    • Richard C (NZ) on October 18, 2012 at 3:43 pm said:

      Section 5.2, ‘Heat emission’, on page 43 of Volodin, Dianskii and Gusev gives the formulae, the share of emission across the spectrum, and references to tables of coefficients.

      http://83.149.207.89/GCM_DATA_PLOTTING/documents/modelen.pdf

    • Richard C (NZ) on October 18, 2012 at 4:02 pm said:

      Description of the CCM INM RAS and model experiments

      Description of the atmospheric climate model inmcm4.0.(new) [hotlink]

      Short description of the coupled climate model inmcm3.0 and model experiments. [hotlink]

      Timetable of the model experiments.

      Selected publications [hotlinked]

      Volodin E.M., Diansky N.A.. “Prediction of the climate change in 19-22th centuries using coupled climate model”.

      Volodin E.M., Diansky N.A. “ENSO reconstruction in the Coupled Climate Model”.

      Volodin E.M.”Simulation of the modern climate. Comparison with observations and data of other climate models”.

      Volodin E.M. “Reliability of the future climate change forecasts”.

      Volodin E.M., Diansky N.A., Gusev A.V. “Simulating Present Day Climate with the INMCM4.0 Coupled Model of the Atmospheric and Oceanic General Circulations”.(new)

      Volodin E.M. “Atmosphere-Ocean General Circulation Model with the Carbon Cycle”.(new)

      http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_en.html

  5. “Pinatubo Climate Sensitivity and Two Dogs that didn’t bark in the night”

    Interesting article on climate sensitivity over at Lucia’s

    http://rankexploits.com/musings/2012/pinatubo-climate-sensitivity-and-two-dogs-that-didnt-bark-in-the-night/

    • Richard C (NZ) on October 24, 2012 at 12:47 pm said:

      Lucia’s blog analysis makes Nuccitelli et al’s DK12 Comment look somewhat ordinary.

      For about 2 years of data and “a single ocean heat capacity model” (one heat sink), Lucia’s model is “seeing an ocean capacity of 53 watt-months/deg C/m2 – equivalent to about 30 to 40m water depth”. Further down the page, the model is “(still) ‘seeing’ a total ocean heat capacity corresponding to about the top 30–40m of ocean”. This is for 60S to 60N only.

      According to Nuccitelli et al, that’s all “noise” and 5 yr smoothed data should be used down to 2000m.

      Can’t say I’m convinced by globally averaged approximations for these calculations. I think the 0-GCM approach, using observed ocean heat climatology (which one?) matched to TOA satellite observations cell-by-cell, is about the only way to arrive at anything anywhere near meaningful. Not that I follow it all at Lucia’s level.
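
      As a check on those units (my own arithmetic, using assumed round numbers for seawater density and specific heat, not values from Lucia’s post), the fitted capacity converts to an equivalent water depth like this:

        # Convert 53 watt-months/deg C/m2 into the depth of a water column with
        # the same heat capacity per unit area.  Density and specific heat are
        # assumed round seawater values.
        seconds_per_month = 30.44 * 86400            # s
        capacity = 53 * seconds_per_month            # J / (K m2)
        rho, c_p = 1025.0, 3990.0                    # kg/m3, J/(kg K)
        depth = capacity / (rho * c_p)               # m
        print(round(depth, 1))                       # ~34 m

      That lands at roughly 34 m, consistent with the quoted 30-40 m.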

  6. Richard C (NZ) on December 31, 2012 at 5:21 pm said:

    AR5 Chapter 11; Hiding the Decline (Part II)

    http://wattsupwiththat.com/2012/12/30/ar5-chapter-11-hiding-the-decline-part-ii/#more-76591

    Figure 11.33: Synthesis of near-term projections of global mean surface air temperature. a), b) and c):-

    http://wattsupwiththat.files.wordpress.com/2012/12/image_thumb1.png?w=936&h=1143

    They hid the decline! In the first graph, observational data ends about 2011 or 12. In the second graph though, it ends about 2007 or 8. There are four or five years of observational data missing from the second graph. Fortunately the two graphs are scaled identically which makes it very easy to use a highly sophisticated tool called “cut and paste” to move the observational data from the first graph to the second graph and see what it should have looked like:

    http://wattsupwiththat.files.wordpress.com/2012/12/image_thumb2.png?w=939&h=414

    Well oops. Once one brings the observational data up to date, it turns out that we are currently below the entire range of models in the 5% to 95% confidence range across all emission scenarios. The light gray shading is for RCP 4.5, the most likely emission scenario. But we’re also below the dark gray which is all emission scenarios for all models, including the ones where we strangle the global economy.

    + + +

    Also John Christy’s preliminary plot (incomplete) of CMIP5 RCP4.5 vs observations (UAH/RSS):-

    http://curryja.files.wordpress.com/2012/07/christy-fig.jpg?w=808&h=622

  7. Richard C (NZ) on February 3, 2013 at 11:34 am said:

    The controversy

    by Anastassia Makarieva, Victor Gorshkov, Douglas Sheil, Antonio Nobre, Larry Li

    Thanks to help from blog readers, those who visited the ACPD site and many others who we have communicated with, our paper has received considerable feedback. Some were supportive and many were critical. Some have accepted that the physical mechanism is valid, though some (such as JC) question its magnitude and some are certain it is incorrect (but cannot find the error). Setting aside these specific issues, most of the more general critical comments can be classified as variations on, and combinations of, three basic statements:

    1. Current weather and climate models (a) are already based on physical laws and (b) satisfactorily reproduce observed patterns and behaviour. By inference, it is unlikely that they miss any major processes.

    2. You should produce a working model more effective than current models.

    3. Current models are comprehensive: your effect is already there.

    Let’s consider these claims one by one.

    Models and physical laws

    […]

    Thus, while there are physical laws in existing models, their outputs (including apparent circulation power) reflect an empirical process of calibration and fitting. In this sense models are not based on physical laws. This is the reason why no theoretical estimate of the power of the global atmospheric circulation system has been available until now.

    The models reproduce the observations satisfactorily

    As we have discussed in our paper (p. 1046), current models fail when it comes to describing many water-related phenomena. But perhaps a more important point to make here is that even where behaviours are satisfactorily reproduced, it would not mean that the physical basis of the model is correct. Indeed, any phenomenon that repeats itself can be formally described or “predicted” completely without understanding its physical nature.

    […]

    For example, a climate model empirically fitted for a forest-covered continent cannot inform us about the climatic consequences of deforestation if we do not correctly understand the underlying physical mechanisms.

    You should produce a better model than the existing ones

    […]

    To expect that a few theorists, however keen, can achieve that is neither reasonable nor realistic. We have invested our efforts to show, using suitable physical estimates, that the effect we describe is sufficient to justify a wider and deeper scrutiny. (At the same time we are also developing a number of texts to show how current models in fact contain erroneous physical relationships; see, e.g., here.)

    Your effect is already present in existing models

    Many commentators believe that the physics we are talking about is already included in models. There is no omission. This argument assumes that if the processes of condensation and precipitation are reproduced in models, then the models account for all the related phenomena, including pressure gradients and dynamics. This is, however, not so. Indeed this is not merely an oversight but an impossibility. The explanation is interesting and deserves recognition – so we shall use this opportunity to explain.

    […]

    In current models in the absence of a theoretical stipulation on the circulation power, a reverse logic is followed. The horizontal pressure gradients are determined from the continuity equation, with the condensation rate calculated from the Clausius-Clapeyron law using temperature derived from the first law of thermodynamics with empirically fitted turbulence. However, as we have seen, to correctly reproduce condensation-induced dynamics, condensation rate requires an accuracy much greater than γ << 1. Meanwhile the imprecision of the first law of thermodynamics as applied to describe the non-equilibrium atmospheric dynamics is precisely of the same order of γ. The kinetic energy of the gas is not accounted for in equilibrium thermodynamics.

    […]

    Summary and outlook

    The Editor’s comment on our paper ends with a call to further evaluate our proposals. We second this call. The reason we wrote this paper was to ensure it entered the main-stream and gained recognition. For us the key implication of our theory is the major importance of vegetation cover in sustaining regional climates. If condensation drives atmospheric circulation as we claim, then forests determine much of the Earth’s hydrological cycle (see here for details). Forest cover is crucial for the terrestrial biosphere and the well-being of many millions of people. If you acknowledge, as the editors of ACP have, any chance – however large or small – that our proposals are correct, then we hope you concede that there is some urgency that these ideas gain clear objective assessment from those best placed to assess them.

    http://judithcurry.com/2013/01/31/condensation-driven-winds-an-update-new-version/

  8. Richard C (NZ) on April 30, 2013 at 4:33 pm said:

    New paper finds IPCC climate models unable to reproduce solar radiation at Earth’s surface

    A new paper published in the Journal of Geophysical Research – Atmospheres finds the latest generation of IPCC climate models were unable to reproduce the global dimming of sunshine from the ~ 1950s-1980s, followed by global brightening of sunshine during the 1990’s. These global dimming and brightening periods explain the observed changes in global temperature over the past 50-60 years far better than the slow steady rise in CO2 levels. The authors find the models underestimated dimming by 80-85% in comparison to observations, underestimated brightening in China and Japan as well, and that “no individual model performs particularly well for all four regions” studied. Dimming was underestimated in some regions by up to 7 Wm-2 per decade, which by way of comparison is 25 times greater than the alleged CO2 forcing of about 0.28 Wm-2 per decade. The paper demonstrates climate models are unable to reproduce the known climate change of the past, much less the future, that the forcing from changes in solar radiation at the Earth surface is still far from being understood and dwarfs any alleged effect of increased CO2.

    ‘Evaluation of multidecadal variability in CMIP5 surface solar radiation and inferred underestimation of aerosol direct effects over Europe, China, Japan and India’

    R. J. Allen, J. R. Norris, M. Wild

    DOI: 10.1002/jgrd.50426

    http://hockeyschtick.blogspot.co.nz/2013/04/new-paper-finds-ipcc-climate-models.html
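
    # # #

    As a rough check on the quoted ~0.28 Wm-2 per decade CO2 figure (my own arithmetic, not from the paper): the standard simplified expression dF = 5.35 ln(C/C0) from Myhre et al. (1998), with CO2 rising about 2 ppm per year – say roughly 375 to 395 ppm over a recent decade – gives about the same number.

      # Decadal CO2 forcing from the simplified 5.35*ln(C/C0) expression.
      # The 375 -> 395 ppm rise is an assumed, roughly 2 ppm/yr, decade.
      import math
      dF = 5.35 * math.log(395.0 / 375.0)   # W/m2 over one decade
      print(round(dF, 2))                   # ~0.28 W/m2, as quoted above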

  9. Richard C (NZ) on May 19, 2013 at 12:39 pm said:

    ‘Global warming slowdown retrospectively “predicted” ‘

    By Ashutosh Jogalekar

    When I was in graduate school I once came across a computer program that’s used to predict the activities of as yet unsynthesized drug molecules. The program is “trained” on a set of existing drug molecules with known activities (the “training set”) and is then used to predict those of an unknown set (the “test set”). In order to make learning the ropes of the program more interesting, my graduate advisor set up a friendly contest between me and a friend in the lab. We were each given a week to train the program on an existing set and find out how well we could do on the unknowns.

    After a week we turned in our results. I actually did better than my friend on the existing set, but my friend did better on the test set. From a practical perspective his model had predictive value, a key property of any successful model. On the other hand my model was one that still needed some work. Being able to “predict” already existing data is not prediction, it’s explanation. Explanation is important, but a model such as mine that merely explained what was already known is an incomplete model since the value and purpose of a truly robust model is prediction. In addition, a model that merely explains can be made to fit the data by tweaking its parameters with the known experimental numbers.

    These are the thoughts that went through my mind as I read a recent paper from Nature Climate Change in which climate change modelers “predicted” the last ten years of global temperature stagnation.

    […]

    This kind of retrospective calculation is a standard part of model building. But let’s not call it a “prediction”, it’s actually a “postdiction”. The present study indicates that models used for predicting temperature changes need some more work, especially when dealing with tightly coupled complex systems such as ocean sinks. In addition you cannot simply make these models work by tweaking the parameters; the problem with this approach is that it risks condemning the models to a narrow window of applicability beyond which they will lack the flexibility to take sudden changes into account. A robust model is one with a minimal number of parameters which does not need to be constantly tweaked to explain what has already happened and which is as general as possible. Current climate models are not useless, but in my opinion the fact that they could not prospectively predict the temperature stagnation implies that they lack robustness. They should really be seen as “work in progress”.

    I can also see how such a study will negatively affect the public image of global warming. People are usually not happy with prediction after the fact …

    >>>>>>

    http://blogs.scientificamerican.com/the-curious-wavefunction/2013/05/15/global-warming-slowdown-retrospectively-predicted/
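
    # # #

    Jogalekar’s training-set/test-set point is easy to demonstrate. A minimal sketch (toy data and polynomial “models” invented purely for illustration): the heavily tuned model fits the data it was trained on better, yet typically predicts the held-out data worse – explanation is not prediction.

      # Fit a modest and a heavily parameterised model to the same noisy
      # "observations", then score each on points that were held out of the fit.
      import numpy as np
      rng = np.random.default_rng(0)
      x = np.linspace(0.0, 1.0, 30)
      y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(30)
      x_train, y_train = x[::2], y[::2]      # half the points for "training"
      x_test, y_test = x[1::2], y[1::2]      # held-out points for "prediction"
      for degree in (3, 12):                 # modest vs heavily tuned polynomial
          p = np.polynomial.Polynomial.fit(x_train, y_train, deg=degree)
          fit_err = np.mean((p(x_train) - y_train) ** 2)
          pred_err = np.mean((p(x_test) - y_test) ** 2)
          print(degree, round(fit_err, 3), round(pred_err, 3))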

  10. Richard C (NZ) on June 19, 2013 at 2:46 pm said:

    ‘The “ensemble” of models is completely meaningless, statistically’

    Posted on June 18, 2013 by Anthony Watts

    This comment from rgbatduke, who is Robert G. Brown of the Duke University Physics Department, on the “No significant warming for 17 years 4 months” thread has gained quite a bit of attention [e.g. reproduced by Dr Judith Curry at Climate Etc] because it speaks clearly to truth. So that all readers can benefit, I’m elevating it to a full post.

    rgbatduke says:

    June 13, 2013 at 7:20 am

    http://wattsupwiththat.com/2013/06/18/the-ensemble-of-models-is-completely-meaningless-statistically/

    Last two paragraphs:

    “It would take me, in my comparative ignorance, around five minutes to throw out all but the best 10% of the GCMs (which are still diverging from the empirical data, but arguably are well within the expected fluctuation range on the DATA side), sort the remainder into top-half models that should probably be kept around and possibly improved, and bottom half models whose continued use I would defund as a waste of time. That wouldn’t make them actually disappear, of course, only mothball them. If the future climate ever magically popped back up to agree with them, it is a matter of a few seconds to retrieve them from the archives and put them back into use.

    Of course if one does this, the GCM predicted climate sensitivity plunges from the totally statistically fraudulent 2.5 C/century to a far more plausible and still possibly wrong ~1 C/century, which — surprise — more or less continues the post-LIA warming trend with a small possible anthropogenic contribution. This large a change would bring out pitchforks and torches as people realize just how badly they’ve been used by a small group of scientists and politicians, how much they are the victims of indefensible abuse of statistics to average in the terrible with the merely poor as if they are all equally likely to be true with randomly distributed differences.”
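
    # # #

    For what it’s worth, the sorting Brown describes is mechanically trivial. A minimal sketch (made-up model names and toy anomaly series, purely illustrative) of ranking individual runs against an observed series and keeping only the best fraction, rather than averaging the whole ensemble:

      # Rank model runs by RMSE against observations and keep the best ~10%.
      import numpy as np
      years = np.arange(1998, 2013)
      observed = 0.005 * (years - 1998)          # toy observed anomalies (deg C)
      models = {                                 # hypothetical model anomaly series
          "model_A": 0.010 * (years - 1998),
          "model_B": 0.020 * (years - 1998),
          "model_C": 0.030 * (years - 1998),
      }
      rmse = {name: np.sqrt(np.mean((series - observed) ** 2))
              for name, series in models.items()}
      keep = sorted(rmse, key=rmse.get)[: max(1, len(rmse) // 10)]
      print(keep)                                # the run(s) closest to the data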

  11. Richard C (NZ) on June 24, 2013 at 7:21 pm said:

    One of the first jobs for NIWA’s High Performance Computing Facility (HPCF) was snow modeling partly funded by the Ski Areas Association of New Zealand:

    ‘New Zealand snow areas confident they will adapt to any risks from climate change’

    16 December 2010

    New climate modelling shows seasonal snow levels at New Zealand ski areas will be reduced by the effects of climate change in the coming years, but the good news is the loss may actually be less than originally anticipated and we should be able to continue to make snow, even under a more extreme climate scenario

    http://www.niwa.co.nz/news/new-zealand-snow-areas-confident-they-will-adapt-any-risks-climate-change

    A lot less. I’ve just seen a newsclip from Mt Hutt (I think it was) where they were saying the 3m base was the most they had ever seen.

  12. Richard C (NZ) on June 26, 2013 at 7:06 pm said:

    ‘New Weather Service supercomputer faces chaos’

    By Steve Tracton

    The National Weather Service is currently in the process of transitioning its primary computer model, the Global Forecast System (GFS), from an old supercomputer to a brand new one [Weather and Climate Operational Supercomputer System (WCOSS)]. However, before the switch can be approved, the GFS model on the new computer must generate forecasts indistinguishable from the forecasts on the old one.

    One expects that ought not to be a problem, and to the best of my 30+ years of personal experience at the NWS, it has not been. But now, chaos has unexpectedly become a factor and differences have emerged in forecasts produced by the identical computer model but run on different computers.

    This experience closely parallels Ed Lorenz’s experiments in the 1960s, which led serendipitously to the development of chaos theory (aka the “butterfly effect”). What Lorenz found – to his complete surprise – was that forecasts run with identically the same (simplistic) weather forecast model diverged from one another as forecast length increased, solely due to even minute differences inadvertently introduced into the starting analyses (“initial conditions”).

    […]

    So what lay behind the chaotic-like divergence of solutions between the identically same GFS run on different computer systems? Simply speaking, the error in the model’s sequence of short range (3 hour) forecasts, which provide the “first guess” in the assimilation of the latest observations, does not result in precisely the same initial conditions for the next pair of GFS extended range forecasts (see the schematic illustration at the link).

    The differences in the simulations arise solely from exceedingly small, but apparently consequential differences in numerical calculations. These are associated with differences in the computer systems’ structure and logical organization (architecture) and compilers which translate programming codes (e.g., versions of Fortran) to machine language – and probably other factors way over my head to understand.

    >>>>>>>>

    http://www.washingtonpost.com/blogs/capital-weather-gang/wp/2013/06/25/new-weather-service-supercomputer-faces-chaos/
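
    # # #

    The Lorenz behaviour Tracton describes is easy to reproduce. A minimal sketch (classic Lorenz-63 parameters, crude fixed-step integration, all my own illustrative choices): run the same toy model twice with initial conditions differing by one part in ten billion and watch the forecasts part company.

      # Two Lorenz-63 trajectories whose initial x values differ by 1e-10.
      def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          x, y, z = state
          return (x + dt * sigma * (y - x),
                  y + dt * (x * (rho - z) - y),
                  z + dt * (x * y - beta * z))

      a = (1.0, 1.0, 1.0)
      b = (1.0 + 1e-10, 1.0, 1.0)                # a minute "analysis" difference
      for step in range(6001):
          if step % 2000 == 0:                   # print x for both runs
              print(step, round(a[0], 3), round(b[0], 3))
          a, b = lorenz_step(a), lorenz_step(b)

    By a few thousand steps the two runs bear no resemblance to each other. Tiny numerical differences between computer systems, compilers and architectures can do the same job as Lorenz’s rounded-off initial conditions.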

  13. Richard C (NZ) on June 28, 2013 at 10:08 pm said:

    ‘Policy Implications of Climate Models on the Verge of Failure’

    By Paul C. Knappenberger and Patrick J. Michaels
    Center for the Study of Science, Cato Institute, Washington DC

    [converted from a poster displayed at the AGU Science Policy Conference, Washington, June 24-26]

    INTRODUCTION

    Assessing the consistency between real-world observations and climate model projections is a challenging problem, but one that is essential prior to making policy decisions which depend largely on such projections. National and international assessments often mischaracterize the level of consistency between observations and projections. Unfortunately, policymakers are often unaware of this situation, which leaves them vulnerable to developing policies that are ineffective at best and dangerous at worst.

    Here, we find that at the global scale, climate models are on the verge of failing to adequately capture observed changes in the average temperature over the past 10 to 30 years – the period of the greatest human influence on the atmosphere. At the regional scale, specifically across the United States, climate models largely fail to replicate known precipitation changes both in sign as well as magnitude.

    […]

    CONCLUSIONS:

    It is impossible to present reliable future projections from a collection of climate models which generally cannot simulate observed change. As a consequence, we recommend that unless/until the collection of climate models can be demonstrated to accurately capture observed characteristics of known climate changes, policymakers should avoid basing any decisions upon projections made from them. Further, those policies which have already been established using projections from these climate models should be revisited.

    Assessments which suffer from the inclusion of unreliable climate model projections include those produced by the Intergovernmental Panel on Climate Change and the U.S. Global Climate Change Research Program (including the draft of their most recent National Climate Assessment). Policies which are based upon such assessments include those established by the U.S. Environmental Protection Agency pertaining to the regulation of greenhouse gas emissions under the Clean Air Act.

    http://wattsupwiththat.com/2013/06/27/policy-implications-of-climate-models-on-the-verge-of-failure/

    Re the EPA assessments, see ‘Amicus brief to the Supreme Court’ (filed May 23, 2013):

    http://www.climateconversation.wordshine.co.nz/open-threads/climate/regions/usa/#comment-213937

  14. Richard C (NZ) on July 12, 2013 at 10:35 am said:

    ‘Climate change: The forecast for 2018 is cloudy with record heat’

    Efforts to predict the near-term climate are taking off, but their record so far has been patchy.

    * Jeff Tollefson

    In August 2007, Doug Smith took the biggest gamble of his career. After more than ten years of work with fellow modellers at the Met Office’s Hadley Centre in Exeter, UK, Smith published a detailed prediction of how the climate would change over the better part of a decade [1]. His team forecasted that global warming would stall briefly and then pick up speed, sending the planet into record-breaking territory within a few years.

    The Hadley prediction has not fared particularly well. Six years on, global temperatures have yet to shoot up as it projected. Despite this underwhelming result, such near-term forecasts have caught on among many climate modellers, who are now trying to predict how global conditions will evolve over the next several years and beyond. Eventually, they hope to offer forecasts that will enable humanity to prepare for the decade ahead just as meteorologists help people to choose their clothes each morning.

    These near-term forecasts stand in sharp contrast to the generic projections that climate modellers typically produce, which look many decades ahead and don’t represent the actual climate at any given time. “This is very new to climate science,” says Francisco Doblas-Reyes, a modeller at the Catalan Institute of Climate Sciences in Barcelona, Spain, and a lead author of a chapter that covers climate prediction for a forthcoming report by the Intergovernmental Panel on Climate Change (IPCC). “We’re developing an additional tool that can tell us a lot more about the near-term future.”

    In preparation for the IPCC report, the first part of which is due out in September, some 16 teams ran an intensive series of decadal forecasting experiments with climate models. Over the past two years, a number of papers based on these exercises have been published, and they generally predict less warming than standard models over the near term. For these researchers, decadal forecasting has come of age. But many prominent scientists question both the results and the utility of what is, by all accounts, an expensive and time-consuming exercise.

    […]

    By starting in the present with actual conditions, Smith’s group hoped to improve the model’s accuracy at forecasting the near-term climate. The results looked promising at first. The model initially predicted temperatures that were cooler than those seen in conventional climate projections — a forecast that basically held true into 2008. But then the prediction’s accuracy faded sharply: the dramatic warming expected after 2008 has yet to arrive (see ‘Hazy view’). “It’s fair to say that the real world warmed even less than our forecast suggested,” Smith says. “We don’t really understand at the moment why that is.”

    […]

    Smith says that his group at the Hadley Centre has doubled the resolution of its model, which now breaks the planet into a grid with cells 150 kilometres on each side. Within a few years, he hopes to move to a 60-kilometre grid, which will make it easier to capture the connections between ocean activities and the weather that society is interested in. With improved models, more data and better statistics, he foresees a day when their models will offer up a probabilistic assessment of temperatures and perhaps even precipitation for the coming decade.

    In preparation for that day, he has set up a ‘decadal exchange’ to collect, analyse and publish annual forecasts. Nine groups used the latest climate models to produce ten-year forecasts beginning in 2011. An analysis of the ensemble [6] shows much the same pattern as Smith’s 2007 prediction: temperatures start out cool and then rise sharply, and within the next few years, barring something like a volcanic eruption, record temperatures seem all but inevitable.

    “I wouldn’t be keen to bet on that at the moment,” Smith says, “but I do think we’re going to make some good progress within a few years.”

    http://www.nature.com/news/climate-change-the-forecast-for-2018-is-cloudy-with-record-heat-1.13344

    # # #

    No mention of UKMO’s Dec 2012 five-year forecast to 2017, but basically all these near-term model predictions “start out cool and then rise sharply” no matter what year they start.

    I think they have a collective problem.

  15. Richard C (NZ) on August 29, 2013 at 2:42 pm said:

    Two GCM papers appear to be creating a “buzz” at present.

    First paper:

    ‘Recent global warming hiatus tied to equatorial Pacific surface cooling’

    Yu Kosaka and Shang-Ping Xie

    [Judith Curry] “….the same natural internal variability (primarily PDO) that is responsible for the pause is a major and likely dominant cause (at least at the 50% level) of the warming in the last quarter of the 20th century”

    http://judithcurry.com/2013/08/28/pause-tied-to-equatorial-pacific-surface-cooling/

    [John Michael Wallace of the University of Washington] “It argues that not only could the current hiatus in the warming be due to natural causes: so also could the rapidity of the warming from the 1970s until the late 1990s”

    http://www.climatecentral.org/news/new-study-ties-global-warming-hiatus-to-a-pacific-cooldown-16405

    Second paper:

    ‘Overestimated global warming over the past 20 years’

    Opinion & Comment by Fyfe, Gillett and Zwiers

    [Judith Curry] “Their conclusion ‘This difference might be explained by some combination of errors in external forcing, model response and internal climate variability’ is right on the money IMO”

    http://judithcurry.com/2013/08/28/overestimated-global-warming-over-the-past-20-years/

    [The Hockey Schtick] “The authors falsify the models at a confidence level of 90%, and also find that there has been no statistically significant global warming for the past 20 years”

    http://hockeyschtick.blogspot.co.nz/2013/08/new-paper-finds-climate-models-have.html

    # # #

    “Pause”, “hiatus”, and “divergence” now standard climatological terms in the literature apparently.

    • Richard C (NZ) on August 29, 2013 at 4:36 pm said:

      Twitter / BigJoeBastardi: Now “climate researchers” will …

      Now “climate researchers” will want huge grants to tell us that when pdo warms in 20 years, warming will resume,after drop to late 70s temps

      Twitter / RyanMaue: Cold-phase of PDO means …

      Cold-phase of PDO means “hiatus/less/pause/plateau” of warming. We need a Nature article w/climate models to prove this?

      Twitter / RyanMaue: I already blamed lack of global …

      I already blamed lack of global TC activity from 2007-2012 on colder Pacific conditions. I thought it was so apparent to be non-publishable

      Twitter / BigJoeBastardi: The arrogance and ignorance …

      The arrogance and ignorance of these guys, now “discovering” what many have forecasted to happen due to cold PDO is stunning

      http://tomnelson.blogspot.co.nz/2013/08/links_1320.html

    • Richard C (NZ) on August 29, 2013 at 4:43 pm said:

      Tisdale re Kosaka and Xie:

      “Anyone with a little common sense who’s reading the abstract and the hype around the blogosphere and the Meehl et al papers will logically now be asking: if La Niña events can stop global warming, then how much do El Niño events contribute? 50%? The climate science community is actually hurting itself when they fail to answer the obvious questions.”

      http://wattsupwiththat.com/2013/08/28/another-paper-blames-enso-for-the-warming-hiatus/

      ‘Global warming pause caused by La Nina’

      The researchers said similar decade-long pauses could occur in future, but the longer-term warming trend was “very likely to continue with greenhouse gas increases”.

      Read more: http://www.smh.com.au/environment/climate-change/global-warming-pause-caused-by-la-nina-20130829-2ss3p.html#ixzz2dKYCdLUn

      # # #

      Or “…the longer-term warming trend was “very likely to [turn to cooling] with [solar decreases]”

      It all depends on the (correct) attribution.

    • Richard C (NZ) on August 29, 2013 at 5:10 pm said:

      Settled science: The heat is hiding in the ocean while the Pacific Ocean cools, and it’s “pretty straightforward” and “complicated”, and “a chicken vs. egg problem” dogs the finding – ‘Pacific Ocean cools, flattening global warming’

      “Really, this seems pretty straightforward. The climate is complicated, and natural variability can mask trends seen over century-long timescales,” says climate scientist David Easterling of the National Oceanic and Atmospheric Administration’s National Climatic Data Center in Asheville, N.C.

      MIT’s Susan Solomon is more skeptical of the Pacific Ocean cooling as an explanation for the flattening, saying “a chicken vs. egg problem” dogs the finding. “Did the sea surface temperatures cool on their own, or were they forced to do so by, for example, changes in volcanic or pollution aerosols, or something else? This paper can’t answer that question.”

      http://tomnelson.blogspot.co.nz/2013/08/settled-science-heat-is-hiding-in-ocean.html

  16. Richard C (NZ) on September 12, 2013 at 3:16 pm said:

    New paper finds ‘up to 30% discrepancy between modeled and observed solar energy absorbed by the atmosphere’

    More problems for the climate models: A paper published today in Geophysical Research Letters finds that there is “up to 30% discrepancy between the modeled and the observed solar energy absorbed by the atmosphere.” The authors attribute part of this large discrepancy, which would alone have a greater radiative forcing effect than all of the man-made CO2 in the atmosphere, to water vapor absorption in the near UV region [see hotlink], “But the magnitude of water vapor absorption in the near UV region at wavelengths shorter than 384 nm is not known.” The authors note, “Water vapor is [the most] important greenhouse gas in the earth’s atmosphere” and set out to discover [apparently for the first time] “The effect of the water vapor absorption in the 290-350 nm region on the modeled radiation flux at the ground level.”

    ‘The influence of water vapor absorption in the 290-350 nm region on solar radiance: Laboratory studies and model simulation’

    Juan Du, Li Huang, Qilong Min, Lei Zhu

    Abstract

    [1] Water vapor is an important greenhouse gas in the earth’s atmosphere. Absorption of the solar radiation by water vapor in the near UV region may partially account for the up to 30% discrepancy between the modeled and the observed solar energy absorbed by the atmosphere. But the magnitude of water vapor absorption in the near UV region at wavelengths shorter than 384 nm is not known. We have determined absorption cross sections of water vapor at 5 nm intervals in the 290-350 nm region, by using cavity ring-down spectroscopy. Water vapor cross section values range from 2.94 × 10-24 to 2.13 × 10-25 cm2/molecule in the wavelength region studied. The effect of the water vapor absorption in the 290-350 nm region on the modeled radiation flux at the ground level has been evaluated using a radiative transfer model.

    >>>>>>>>

    http://hockeyschtick.blogspot.co.nz/2013/09/new-paper-finds-up-to-30-discrepancy.html
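
    # # #

    To get a feel for whether cross sections of that size could matter (my own back-of-envelope estimate, not from the paper): apply the reported laboratory cross sections to an assumed global-mean column water vapour of about 25 kg/m2, ignore scattering, and compute the vertical optical depth.

      # Vertical optical depth tau = sigma * N for an assumed 25 kg/m2 of
      # column water vapour, treating the whole column as absorbing.
      AVOGADRO = 6.022e23
      column_kg_per_m2 = 25.0                          # assumed precipitable water
      molecules_per_cm2 = column_kg_per_m2 * 1e3 / 18.015 * AVOGADRO / 1e4
      for sigma in (2.94e-24, 2.13e-25):               # cm2/molecule, from the abstract
          print(sigma, round(sigma * molecules_per_cm2, 3))

    That is an optical depth somewhere between about 0.02 and 0.25 across 290-350 nm – not negligible, if the laboratory values hold up in the real atmosphere.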

  17. Richard C (NZ) on September 23, 2013 at 7:10 pm said:

    ‘Leaked SPM AR5: Multi-decadal trends’

    Data Comparisons Written by: lucia

    […]

    A way into the section, the draft states:

    “Models do not generally reproduce the observed reduction in surface warming trend over the last 10–15 years …”

    […]

    Earlier in the draft we find:

    “There is very high confidence that climate models reproduce the observed large-scale patterns and multi-decadal trends in surface temperature, especially since the mid-20th century”

    So evidently the AR5 will admit that they have not reproduced observed warming in the past 10-12 years, speculate that it might be unpredictable climate variability, solar, volcanic or aerosol forcings, or possibly due to “too strong a response to increasing greenhouse-gas forcings”, which mostly amounts to excess climate sensitivity. That said, reading the leaked draft, I can’t help but wonder about their definition of “multi-decadal”. Generally, I assume that means “two or more decades”. So, I ran my script to get roughly 15, 20 and 25 year trends, comparing the observed earth trend to the spread in trends in the ‘AR5’ models forced using the rcp45 scenario.

    […]

    As you can see, while the 15 year trends (discussed in the leaked draft SPM) are just on the edge of the model spread, the longer term trends fall outside. So I would think that if they don’t have great confidence in predicting 15 year trends, they would have even less confidence in predicting “multi-decadal” trends. But what do I know?

    Anyway, possibly this leaked draft is a hoax. We’ll see.

    http://rankexploits.com/musings/2013/leaked-spm-ar5-multi-decadal-trends/
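
    # # #

    The sort of calculation Lucia describes is straightforward to reproduce. A minimal sketch (the temperature series below is a placeholder; her actual script, data sets and model ensemble are at the link):

      # Least-squares trends over the trailing 15, 20 and 25 years of a monthly
      # global-mean anomaly series, for comparison with the spread of the same
      # trends across model runs.
      import numpy as np

      def trailing_trend(series, months):
          """Trend in deg C per decade over the last `months` points."""
          y = series[-months:]
          x = np.arange(months) / 120.0        # time in decades
          return np.polyfit(x, y, 1)[0]

      observations = np.zeros(360)             # placeholder: 30 years of monthly anomalies
      for years in (15, 20, 25):
          print(years, round(trailing_trend(observations, years * 12), 3))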

  18. Richard C (NZ) on September 29, 2013 at 4:51 pm said:

    ‘Viewpoints: Reactions to the UN climate report’

    BBC

    Professor John Shepherd, ocean & earth science, University of Southampton

    “….no-one ever claimed that climate models could predict all these decadal wiggles”

    http://www.bbc.co.uk/news/science-environment-24296204

    # # #

    Successive decadal wiggles are what make up multidecadal projections. And Kosaka and Xie (2013) modeled (in retrospect) the present decadal wiggle when constrained by natural oceanic variation.

    Therefore, natural variation (e.g. PDO/AMO) must be integrated in the models before realistic projections can be made – the sceptics’ argument for yonks.

  19. Richard C (NZ) on November 15, 2013 at 10:02 am said:

    ‘New paper finds simple laptop computer program reproduces the flawed climate projections of supercomputer climate models’

    The Hockey Schtick

    A new paper finds a simple climate model based on just three variables “and taking mere seconds to run on an ordinary laptop computer, comes very close to reproducing the results of the hugely complex climate models.” and “The [laptop computer] model was based on three key processes: how much energy carbon dioxide prevents from escaping to space (radiative forcing), the relationship between rate of warming and temperature, and how rapidly the ocean takes up heat (ocean thermal diffusivity).”

    Actually, you only need one independent variable [CO2 levels] to replicate what the highly complex supercomputer climate models output. This has been well demonstrated by Dr. Murry Salby in his lecture, which shows 1:1 agreement between the supercomputer-simulated global temperature and CO2 levels over the 21st century: [see graph]

    More>>>>>>>

    http://hockeyschtick.blogspot.co.nz/2013/11/new-paper-finds-simple-laptop-computer.html
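
    # # #

    The three ingredients listed – CO2 radiative forcing, a temperature feedback, and ocean heat uptake – really are enough for a laptop-scale model. A minimal sketch (a two-box ocean in place of a diffusive one, and every parameter value is my own illustrative assumption, not taken from the paper):

      # Two-box energy balance model: forcing minus feedback warms a mixed layer,
      # which leaks heat into a deep ocean.  All numbers are assumed.
      import math

      LAMBDA = 1.3      # W/m2 per K, climate feedback
      C_MIX = 8.0       # W yr/m2 per K, mixed-layer heat capacity
      C_DEEP = 100.0    # W yr/m2 per K, deep-ocean heat capacity
      GAMMA = 0.7       # W/m2 per K, mixed-layer/deep-ocean exchange
      DT = 0.1          # years per step

      def co2_forcing(year):
          """5.35*ln(C/C0), CO2 assumed to grow 0.5%/yr from 370 ppm in 2000."""
          ppm = 370.0 * 1.005 ** (year - 2000.0)
          return 5.35 * math.log(ppm / 280.0)

      t_mix, t_deep, year = 0.0, 0.0, 2000.0
      while year < 2100.0:
          flux_down = GAMMA * (t_mix - t_deep)
          t_mix += DT * (co2_forcing(year) - LAMBDA * t_mix - flux_down) / C_MIX
          t_deep += DT * flux_down / C_DEEP
          year += DT
      print(round(t_mix, 2))   # warming (K) by 2100 relative to the 2000 state

    With these made-up numbers it prints a twenty-first-century warming of a couple of degrees; the point is the shape of the calculation, not the value.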

  20. I remember, even more clearly now 20 years later, when I was studying Electrical Engineering at university and doing Philosophy 101. We were the first year that had philosophy added to the course, with the intention of opening the eyes of potential engineers to a frame of reference for the decisions we may one day make as engineers.

    The topics I remember were:
    • Value judgments, how our personal values influence what’s right
    • The energy crisis – over-reliance on fossil fuels
    • Global warming, The Greenhouse effect, man-made CO2 emissions
    I very much enjoyed the Value judgments topic. Why do we make bridges only 2.5 times stronger than their maximum loading? What makes your decisions and values more important than others’? Does it take into account the potential for natural disasters? Excellent stuff!
    The Energy Crisis topic didn’t make as much sense to me. So many loaded learnings. We weren’t philosophizing, we were being brainwashed. I could understand that fossil fuel is a finite resource, but I also knew, even as I was being brainwashed, that technology was constantly helping us find more and larger deposits. We were being told to use nuclear energy, solar, wind, tidal, etc. This only made me think: what is the environmental cost of those resources? Why weren’t we discussing those in philosophy rather than being brainwashed? I was sure, even without evidence, that the environmental cost of making solar panels was likely to be high. Not only were fossil fuels required to make them, but how much processing and environmental damage? I knew we weren’t being encouraged to think, but to agree. Anyway, so what if I don’t agree?
    Then the topic of Global Warming. I can’t tell you why – it must have been instinct – but the whole topic did not sit right with me. Maybe because it was so accusatory? It was our fault! And therefore it was our responsibility to fix it. Nope! At 20 years of age I hadn’t had anything to do with our current position. I knew it wasn’t me, and I was feeling uncomfortable about the whole delivery of this brainwashing. I immediately agreed that we probably should stop polluting the planet and reduce our use of fossil fuels, but the rest was rubbish.
    I was not happy, and there was a lack of scientific evidence. And then, the evidence that was produced? Well, it was a chart of the earth’s temperature plotted against the sun’s radiation. I don’t have a copy any longer, all these years later, and I can’t find it online. What I saw, at least in my mind, was a direct correlation between the sun’s output and the earth’s temperature. It was as clear as day to me. I wanted to find evidence to support my gut feeling, and that chart was the only piece that seemed to matter, but there was nothing more. Keep in mind the internet wasn’t what it is today. The best thing about the internet at that moment in time was the release of Netscape, so I got to see boobs on the computer. Yes indeed, remember the very first steps into the World Wide Web? I do.
    Needless to say, I failed philosophy. I would not spew their lies. I still say, I have never learned more than I did when I failed philosophy.
    All these years later I have found a growing movement of educated and intelligent people who share my suspicion towards the global warming lie. OK, I’d better clarify that comment: the lie that global warming is due to man-made CO2 emissions.
    I encourage everyone to research this for themselves. It shouldn’t be a surprise that I refer you to a community I am involved with, SuspiciousObservers.
    Here is Ben’s latest conference talk, which is a great start and overview. Watch this if nothing else: Ben Davidson: The Variable Sun and Its Effects on Earth | EU2014
    Their website contains a wide variety of brilliant information including:
    • Starwater – water comes from stars and every planet has water
    • C(lie)mate – the global warming lie
    • Agenda 21
    Check out the daily SO news on YouTube at https://www.youtube.com/user/Suspicious0bservers
    See weather presented from a space perspective.
    It’s bigger than you think.
    Rikdownunda
