Thursday, May 8, 2008

Why Is Climate So Stable?

According to Dot Earth a few weeks ago, one of the main questions facing climate scientists is "what causes climate change?" I think that question is backwards. The right question is the one Lovelock starts with, which is "what causes climate stability?"

The Lovelock formulation, which treats the stability rather than the change as the phenomenon in need of explanation, seems to me the right way round.

How did the biosphere persist for so long? Are we just really really really lucky?

A variation of the anthropic principle indicates that this might well be the case: if we weren't really lucky (compared to other planets) we wouldn't have evolved far enough to ask the question. That's even scarier than the Gaia hypothesis, since our luck might well run out at any moment even without us rocking the boat. Both of these points of view make it unlikely that our current behavior will be benign.

(Indeed, climate models tend to be unstable; it seems to require a narrow range of parameters to get realistic behavior. The reasons for this are interesting, but can only be investigated if people relax and accept that the models themselves are worthy of certain forms of investigation. Had computers preceded the carbon spike, so that there was less controversy, I suspect we would have broader investigations of climate modeling than we do now.)

Starting from the point of view that climate change needs explaining is wrong. If we assume instead that it is stability that needs explaining, we are quickly left wondering at what point the stability we count on will fail, and how much of a push it would take to make it fail very badly.

We are left with a very silly proposition: that we must establish, beyond a shred of doubt, the least amount of carbon (etc.) in the atmosphere that is absolutely certain to have vast consequences. This treats carbon as a defendant in a courtroom, with intrinsic rights. But we only have one planet.

The main job we have as climatologists should be to establish beyond a shred of doubt the greatest amount of carbon absolutely certain NOT to have vast consequences.

Those who find our methods unimpressive, those who believe we contribute no information should rationally act as if any increase in concentration of radiatively active substances is extremely dangerous.

It is the fact that the 'skeptics' argue the exact opposite that convinces me they are not intellectually serious. It's not a coherent position at all. If they are serious, and really mean well, and really don't believe much that we say, they would argue for extreme caution. After all, if we don't really know at all how big or how small a change in aerosols or greenhouse gases might be enough to make the world dramatically worse, we really ought to stop doing any of that.

Update: A so-called skeptic, Patrick Frank, has an article in Skeptic magazine that makes a very poorly supported argument (my original, less accurate summary has been restated per the author's request in the comments) that we know essentially nothing, and concludes from that, somehow, that we can act with impunity:
Nevertheless, those who advocate extreme policies to reduce carbon dioxide emissions inevitably base their case on GCM projections, which somehow become real predictions in publicity releases. But even if these advocates admitted the uncertainty of their predictions, they might still invoke the Precautionary Principle and call for extreme reductions “just to be safe.” This principle says, “Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”34 That is, even if we don’t fully know that CO2 is dangerously warming Earth climate, we should curtail its emission anyway, just in case. However, if the present uncertainty limit in General Circulation Models is at least ±100 degrees per century, we are left in total ignorance about the temperature effect of increasing CO2. It’s not that we, “lack … full scientific certainty,” it’s that we lack any scientific certainty. We literally don’t know whether doubling atmospheric CO2 will have any discernible effect on climate at all.
If our knowledge of future climates is zero then for all we know either suppressing CO2 emissions or increasing them may make climate better, or worse, or just have a neutral effect. The alternatives are incommensurate but in our state of ignorance either choice equally has two chances in three of causing the least harm.35 Complete ignorance makes the Precautionary Principle completely useless. There are good reasons to reduce burning fossil fuels, but climate warming isn’t one of them.
(emphasis added)

Of course, the characterization of climate models is ludicrous. I would agree in some sense that complete ignorance makes the precautionary principle useless, and I am not a big fan of absolute principles. But in the present case the stakes are vast, and we do know that we are emitting greenhouse gases and aerosols that materially affect the climate, in vast quantities. Given only that knowledge and no other, it strikes me that we should restrain these activities to the greatest extent feasible.
Fortunately, Tapio Schneider has an article in the same issue, with a cogent if unsurprising presentation of the conventional wisdom. The usual "sense vs nonsense" sort of balance, but better than letting the person who doesn't even understand the evidence he's cherry-picking claim the label of "skeptic" unchallenged.

Update: The author of the ill-informed article has much to say below. Essentially my critique of it doesn't add anything to Tapio Schneider's, while his adds considerably to mine. Thanks to Hank Roberts for pointing this out in an email.

33 comments:

  1. Michael, good post.

    YES, we are very very lucky to be here at all. I agree.

    I see by the papers that NYTimes columnist Thomas Friedman has a new book coming out this summer titled "Hot, Flat and Crowded". Here's a quote from the book:

    "When the world is flat, you can innovate without having to emigrate ... we are about to see creative destruction on steroids."

  2. I disagree with the premise. Stability is the zero-order assumption from Stefan-Boltzmann. A hotter body radiates more. OK, there are feedbacks that can make for a hotter or colder surface even if the radiating temperature stays the same - and albedo certainly could cause bistability in principle (snowball earth), with or without a biosphere. (gratuitous self-citation, since no-one else ever has :-) )

    It's not even clear what you mean by "stable". We have had ~6C swings over a few thousand years, and even larger changes over the longer term! If you mean, why is the multi-annual energy balance generally within 0.1W/m^2 globally (or whatever), then I would refer to the Central Limit Theorem.
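
    To illustrate the zero-order point, here is a back-of-the-envelope relaxation calculation in Python. The numbers are purely illustrative; this is a one-box cartoon, not a model of anything:

        # One-box energy balance: a perturbed surface temperature relaxes back
        # toward radiative equilibrium because a hotter body radiates more.
        SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
        S0 = 1361.0       # solar constant, W m^-2
        ALBEDO = 0.3
        EPSILON = 0.61    # crude effective emissivity standing in for the greenhouse effect
        C = 4.0e8         # heat capacity of a ~100 m ocean mixed layer, J m^-2 K^-1

        def dT_dt(T):
            """Net heating rate (K/s): absorbed sunlight minus outgoing longwave."""
            absorbed = S0 * (1.0 - ALBEDO) / 4.0
            emitted = EPSILON * SIGMA * T ** 4
            return (absorbed - emitted) / C

        # Start 5 K above equilibrium and integrate forward with a simple Euler step.
        T = 293.0
        dt = 86400.0  # one day, in seconds
        for day in range(365 * 20):
            T += dT_dt(T) * dt

        print(round(T, 2))  # settles near ~288 K: the Planck response is a restoring feedback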

    There is in deep paleo a topic called the "faint young sun paradox", which addresses the question of why the ocean was liquid even when the sun was much dimmer. Appeals to just the right amount of methane amount to appeals to dumb luck, or else to unknown homeostasis mechanisms, or to somewhat smarter luck. It seems like luck anyway.

    Then there is the matter of how we got out of the snowball state, a question which engages much of Ray P's thinking.

    Looking over the last 30 M yrs, we see what amounts to an oscillatory instability which is not well understood. How hard can we push it?

    This line of thought can get you very close to the anthropic principle, a.k.a. lucky design.

  5. But Michael, re the last 30M+ years, doesn't it appear that the warmer state is overall more stable (ignoring for the moment the issue of a bumpy transition possibly involving a methane hydrate excursion or two)? IOW, isn't the instability pretty much associated with the ice sheets due to their ability to change states quickly?

  6. You said: "Those who find our methods unimpressive, those who believe we contribute no information should rationally act as if any increase in concentration of radiatively active substances is extremely dangerous.

    It is the fact that the 'skeptics' argue the exact opposite that convinces me they are not intellectually serious. It's not a coherent position at all. If they are serious, and really mean well, and really don't believe much that we say, they would argue for extreme caution. After all, if we don't really know at all how big or how small a change in aerosols or greenhouse gases might be enough to make the world dramatically worse, we really ought to stop doing any of that."
    -------------

    I have to disagree. We all act on rational expectations. I can name any number of scenarios that could happen to you or me in the course of a day. We don't live our lives as if even one of them IS going to happen, much less as if ALL of them are going to happen.

    To the extent that paleoclimate shows stability outside the extreme high and low limits set by physics, I'll attribute the stability to biological evolution.

    This, I sorta think, is the Lovelock/Margulis 'Gaia' position.

    But then I suppose I ought to include a certain amount of luck as well...

  8. You've completely misrepresented my article: "...begins with the very premise that we know nothing, and concludes we can act with impunity."

    I neither began with that premise, nor ended with that conclusion. Either you've commented on my article without properly reading it, or you've read it without comprehending it. Your discussion is shallow and tendentious.

  9. So I stand accused of being shallow and tendentious by someone who is willing to publish a belief that the uncertainty in climate models is 100 degrees per century.

  10. My thanks for modifying your comments, even though you declined to fully remove the original inaccuracies. However, you also wrote, "...concludes from that, somehow, that we can act with impunity."

    Show anywhere in my article where I concluded anything remotely like that.

    You also wrote that I, "...accused [you] of being shallow and tendentious"

    Rather, I wrote that your discussion was shallow and tendentious. There was no ad hominem in my comment. Please read for accuracy.

    It wasn't a "belief" I published, but a calculated estimate based on integrated cloud error. All my work is in the Supporting Information freely available on the Skeptic website. If you don't like the centennial uncertainty margins I imputed to GCMs, then show where my estimates are wrong.

    You work with climate models. How about if you propagate all the parameter uncertainties through the GCM of your choice, and report back the predictive uncertainty limits. Then we'd all know how to judge a climate prediction.

  11. The idea that parameters can be tuned independently is not Pat Frank's most amateurish mistake, since the famous Dr. Myles Allen makes a similar one.

    It is at the root of his confusion, though, just as it is with Dr. Allen's, in my opinion. Allen et al. at least have the sense to eliminate the models which produce nonphysical results in their parameter sweeps.

    What you do is propagate the uncertainty conditional on the model doing a sufficiently good job on the observed climate. That is in fact what I have been engaged to help do around here.

    You will find the spread much smaller than you expect if you limit the spread to models which produce a credible result on the unforced system.
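
    As a cartoon of what "conditional on a sufficiently good job on the observed climate" means, here is a toy sketch in Python. The "model", its parameter, and every number in it are invented purely for illustration; the point is only that an observational constraint narrows the projection spread:

        import random

        random.seed(0)
        F2X = 3.7        # rough forcing from doubled CO2, W m^-2
        OBS_T = 288.0    # "observed" global mean temperature, K
        OBS_TOL = 1.0    # how closely a model version must match it, K

        def toy_model(feedback):
            """Return (control temperature, warming for doubled CO2) for one parameter choice.
            The control climate drifts away from OBS_T as the feedback parameter becomes
            less realistic, which is what lets the observation act as a constraint."""
            control_T = 288.0 + 4.0 * (feedback - 1.2)  # crude link: bad parameter, biased climatology
            warming = F2X / feedback                    # equilibrium response = forcing / feedback
            return control_T, warming

        # Blind parameter sweep, feedback anywhere from 0.5 to 3.0 W m^-2 K^-1:
        runs = [toy_model(random.uniform(0.5, 3.0)) for _ in range(5000)]
        kept = [w for (t, w) in runs if abs(t - OBS_T) < OBS_TOL]

        def spread(xs):
            return max(xs) - min(xs)

        print("unconstrained warming spread:", round(spread([w for _, w in runs]), 2), "K")
        print("constrained warming spread:  ", round(spread(kept), 2), "K")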

  12. "There are good reasons to reduce burning fossil fuels, but climate warming isn’t one of them. (...) And aren’t we much better off accumulating resources to meet urgent needs than expending resources to service ignorant fears?"

    Hmm, that sounds very much like a proposal that we act with impunity.

    Michael, Pat's worst mistake was in imagining that he could fool anyone into thinking that without the models there's no proof that we have a problem. And who knows, maybe a few people who read his article and not Tapio's were fooled to the extent they weren't completely put off by the crazed rant Pat chose to end with.

  13. Where did I suggest that parameters can be tuned independently? You exhibit a penchant for assigning me positions that I have never taken.

    Your comment, "What you do is propagate the uncertainty conditional on the model doing a sufficiently good job on the observed climate," gives away the problem.

    Your parameterization scheme is empirical rather than physical, as you say it's based upon correspondence with observables rather than on physical theory. That means a given scheme has no obvious physical meaning, even though it may scale physical processes, and the parameters themselves have no obvious physics-based magnitude.

    And so the reason the parameter uncertainties haven't been propagated through the physics of a GCM is that you have no idea what those uncertainties are.

    Given your stated approach, you ought to approve of the cloud forcing uncertainty I use in the article, as it's entirely empirical and based upon correspondence between models and climate observables. And that forcing uncertainty, by the way, is as large as the entire excess forcing assigned to all the human-produced greenhouse gases. And yet, in your name, in its SPM the IPCC passes over the former and claims to detect the latter.

    You say that the spread of model error is smaller if it is limited to credible results on unforced systems. That is, you want to limit uncertainties to already tuned parameter schemes. This is a statistically false uncertainty because it snoops the results and inheres a circular reasoning.

    But in any case, look at Figure 8.2b from WGI Chapter 8 of the 4AR. It shows the global mean annual model temperature error, which looks to average about (+/-)2 C. How does one propagate a per-year (+/-)2 C uncertainty across 100 years in a time-step projection without accumulating a huge final uncertainty?

    I stopped by here to protest an outlandish misrepresentation of my article, not to endure repeated personal disparagements. If I made "amateurish" mistakes, go ahead and point one out in what I've actually done, rather than invent ones for your convenience. It's all there in the Supplemental Information. Try referring to the specifics of that.

    Steve Bloom, I see your posts here are as cavalier as they were at ClimateAudit.

  14. An exhaustive list of the places in which Mr. Frank displays confusion is not intrinsically a matter of great interest.

    Suffice it to say that the figure to which he refers is an intermodel rms error in climatology. The reader is left to puzzle out how to "propagate" this to long time scales. I hope most will see why Mr. Frank's approach is problematic.

  15. I think Mr. Frank does a good job of showing how your heavily invested beliefs are problematic.

    Not in his article, but in this response thread. You have displayed the same arrogant behavior in essentially all of your posts and responses.

    The joking title of your blog says it all. You mock all the same accusations you make of others, while engaging in every practice you decry.

  16. I've searched the Skeptic site for the supporting calculations that Patrick Frank says support his method for propagating errors through a Global Climate Model. Where are they?

    Paul Middents

  17. Steven, I heard this somewhere: "As with the case of the resort where young women go looking for rich husbands and rich husbands go looking for young women, the situation is not quite as symmetrical as it might at first appear."

    Your point that my reaction to Pat Frank makes me come off as arrogant had not escaped my attention prior to your comment. Indeed, I caution against coming off as arrogant.

    In the present case, I took the chance, because 1) it helped me make the case I was originally making, and 2) the article was so transparently incorrect, as well as offensive, that it really is not a worthy target for a serious rebuttal.

    The point here is that ignorance should lead to caution.

    Those alleging that our field is ignorant should be the ones arguing most stridently for caution. Yet they regularly jump to the opposite conclusion, Mr. Frank being a particularly extreme example.

    You must make a case that there is a systematic bias in our predictions to argue that signs indicating caution are invalid. While there are a few efforts to do that (like most of our opposition, I think, arguing backwards from the conclusion, but never mind that for now), there are plenty of arguments that start with ignorance, skip over bias, and go directly to a suggestion that no caution is indicated. It is those arguments that I am trying to call into question.

    For myself, the days when I find myself most suspicious of the field's abilities are the days when I fear most for the future. This seems coherent to me. The opposite does not.

  18. I looked a little harder and found Mr. Frank's supporting information:

    http://www.skeptic.com/the_magazine/featured_articles/v14n01resources/climate_belief_supporting_info.pdf

  19. Thank you Paul. On p 29 of Frank's material appears the following text.

    [others and]... IPCC are using the mean of each tn-1 point as the initial condition for calculation of each subsequent tn, but discarding the ±(numerical uncertainty) of each tn-1 at each iterated forward step of the GCM temperature projection. Every tn-1 is thus iterated forward as though it were perfectly accurate, physically. But this is incorrect. The numerical uncertainty, ±e1(tn-1), is a true uncertainty in the magnitude of each calculated temperature, tn-1. That same numerical uncertainty becomes a real physical uncertainty when the calculated temperature tn-1 is propagated forward to calculate further physical quantities. That is, as soon as each tn-1 is propagated into and through the physical theory of a GCM, its uncertainty is transformed from numerical into physical. That means the entire numerical uncertainty around each tn-1 becomes a true physical uncertainty in the magnitude of each tn-1 and must become part of the physical initial state of each subsequent tn. Every ±tn-1 must be included in the calculation of each subsequent tn .

    In practice, this means the numerical uncertainty accumulates across the years of a time-wise GCM temperature projection in a manner identical to the propagation of theory-bias error as described by equation S6 above. The accumulating numerical uncertainty then also produces a widening vertex of physical uncertainty about the mean temperature series, and the radius of this vertex increases with the number of projection years.


    This recapitulates the fundamental naive error which I already noted. I do not think it is worth ploughing through the thirty pages given an error of this caliber.

    The production of thirty pages of charts and graphs is not identical with the production of scientific sense.

    He's a chemist -- yet he isn't even mentioning, let alone worrying about, the real urgent problem, ocean pH change. Why not? That's far more clear and precise than the atmospheric change, and happening faster.

    I don't get it. It seems to me that people are attacking climate models because they can't bear the thought of what happens if we address simple physical chemistry changes.

  21. By the way, I'm seeing mention that Dr. Wunsch 'reviewed' this article.
    The article says he looked at part of an earlier version -- not the same thing. I'd like to know his opinion.

    I recall he wrote this not long ago: "global warming is both real and threatening in many different ways, some unexpected.... As a society, we need to take out insurance against catastrophe in the same way we take out homeowner's protection against fire.... Scientifically, we can recognize the reality of the threat, and much of what society needs to insure against. Statements of concern do not need to imply that we have all the answers."

    http://www.realclimate.org/index.php/archives/2007/03/swindled-carl-wunsch-responds/langswitch_lang/fr#more-417

  22. Does this help?

    "Calculation of the global mean RMS error, ..... demonstrated improved intermodel consistency since the TAR. Problems still remain, however, ..."
    ipcc-wg1.ucar.edu/wg1/Comments/drafts/AR4WG1_Ch08_SOD.pdf

  23. Or this -- an inline answer Gavin gave at RC:

    "Frank confuses the error in an absolute value with the error in a trend. It is equivalent to assuming that if a clock is off by about a minute today, that tomorrow it will be off by two minutes, and in a year off by 365 minutes. In reality, the errors over a long time are completely unconnected with the offset today. - gavin]"

    http://www.realclimate.org/index.php/archives/2008/05/what-the-ipcc-models-really-say/#comment-86545
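
    Here's a toy version of that clock point in Python (my own sketch, not Gavin's, with made-up numbers): a constant bias in the absolute value neither grows with time nor touches the trend.

        YEARS = range(100)
        TRUE_TREND = 0.02                                 # K per year, illustrative
        observed = [288.0 + TRUE_TREND * y for y in YEARS]
        modelled = [t + 2.0 for t in observed]            # "model" runs 2 K too warm throughout

        abs_error = [m - o for m, o in zip(modelled, observed)]
        model_trend = (modelled[-1] - modelled[0]) / (len(observed) - 1)

        print(round(max(abs_error), 2))   # still 2.0 in year 100, not 200
        print(round(model_trend, 4))      # 0.02 K/yr, identical to the true trend despite the bias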

  24. Hank, yes I was thinking something along those lines; by treating a climatology as a trend he shows a complete lack of familiarity with the approaches we take and so is ill-equipped to critique them.

    Also, he misses the point of climate modeling altogether, which is that the error is bounded in climate models for essentially the same reason that we have climate zones at all. The individual excursions are constrained by a remarkably interesting set of physical principles (with known mathematical expressions).

    We keep coming back to the same seasons because energy flows through the system in certain ways. We set up our climate models to behave analogously.

    That's why, though small errors accumulate to overwhelm weather prediction accuracy (far faster than numerical error, by the way), they don't accumulate in climate projections.

    There are plenty of other problems with climate models, but the way of thinking expressed by Pat Frank adds nothing sensible to the conversation.
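
    To make the contrast concrete, here is a toy sketch in Python (nothing like a GCM; the numbers are arbitrary): the same per-step errors either accumulate without bound or stay bounded, depending on whether a restoring constraint is present.

        import random

        random.seed(1)
        N_STEPS = 5000
        NOISE = 0.1   # per-step error, arbitrary units

        def mean_divergence(damping, n_pairs=100):
            """Average final separation of paired runs whose only difference is the
            independent per-step error each one receives."""
            total = 0.0
            for _ in range(n_pairs):
                x = y = 0.0
                for _ in range(N_STEPS):
                    x += -damping * x + random.gauss(0.0, NOISE)
                    y += -damping * y + random.gauss(0.0, NOISE)
                total += abs(x - y)
            return total / n_pairs

        print("no restoring constraint:  ", round(mean_divergence(0.0), 2))   # grows like sqrt(steps), roughly 8 here
        print("with restoring constraint:", round(mean_divergence(0.05), 2))  # stays small and bounded, roughly 0.4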

  25. So, Michael, I gather you think it's amateurish and problematic to take the uncertainties in intermediate results and propagate them forward through a sequential step-wise calculation, with those results, in order to know the uncertainty in the final result.

    Just as peculiarly, Gavin and you each seem to think that the uncertainty in the results of a stepwise calculation can be reduced by taking differences to produce an anomaly trend.

    That's not correct. The uncertainty about a trend in full values keeps its full magnitude if you recast the trend as anomalies by subtracting a mean or a base value.

    Further, climate modelers don't calculate trends. They calculate entire climates and then extract the trends with respect to a baseline. The uncertainties about the climatological values of cloudiness, or temperature, etc., keep their magnitude when climate anomalies are calculated across time.

    Uncertainties are not constrained by physical bounds, Michael. When the uncertainties exceed the bounds, it means you no longer know where within the bounds your physical system lies.

    This point is demonstrated, for example, by the minimal (+/-)2.8 Wm^-2 forcing uncertainty due to clouds. This uncertainty is as large as the entire purported forcing effect of all of the excess human-produced GHGs. That uncertainty means you don't know where the physical system exists within that (+/-)2.8 Wm^-2 and so the effect of GHGs can't be detected with any certainty.

    When that system is propagated forward in time, the uncertainty causes a spread in calculated outcomes at each step. That spread can't be ignored. It means that after a certain number of steps, the uncertainty is larger than your physical bounds, and you no longer know where your system is, within those bounds.

    This is a common problem in calculating the states of all time-dependent physical systems. With your training, it's beyond understanding how this escapes you.

  26. Pat, I am not likely the right person to lift the confusion under which you are laboring. However, mostly for the benefit of readers, I will make an effort in that direction.

    Your argument, while formulated a bit idiosyncratically, is sound for weather prediction but not for climate prediction.

    That is, your claim "It means that after a certain number of steps, the uncertainty is larger than your physical bounds, and you no longer know where your system is, within those bounds." is universally regarded as correct, but climate models do not attempt to predict the state of the system.

    They attempt to predict the statistical properties of the system, and do so effectively, for instance, matching paleontological evidence about paleoclimates.

    As Gavin points out, you also misunderstand the meaning of the figure you point to, which is not a trend but a climatology. There is nothing annual about it, so you cannot multiply it by a century to get a century-scale perturbation.

    Mostly, though, you have confused the meanings of the terms weather and climate. I recommend you look at the Lorenz system, specifically the regularity of its behavior as well as the unpredictability of its state.
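
    For the interested reader, a quick Lorenz-63 experiment along those lines (a sketch in Python, crude Euler integration): two almost identical initial states end up in completely different places, yet the long-run statistics of the two runs agree closely.

        def lorenz_run(x, y, z, n_steps=2000000, dt=0.001,
                       sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            """Integrate the Lorenz-63 system; return the final x and the time-mean of z."""
            z_sum = 0.0
            for _ in range(n_steps):
                dx = sigma * (y - x)
                dy = x * (rho - z) - y
                dz = x * y - beta * z
                x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
                z_sum += z
            return x, z_sum / n_steps

        xa, za_mean = lorenz_run(1.0, 1.0, 1.0)
        xb, zb_mean = lorenz_run(1.000001, 1.0, 1.0)   # one-part-per-million difference

        print("final states:  ", round(xa, 2), round(xb, 2))            # typically nothing alike (chaos)
        print("time-mean of z:", round(za_mean, 2), round(zb_mean, 2))  # agree to within a few tenths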

    Mike, the figure that I pointed to in the 4AR has nothing to do with the issues raised in my article. The caveat Gavin made about it is tangential and I prefer to let it lie for the time being.

    However, to the main point, uncertainty in cloud forcing, for example, amounts to an uncertainty in initial conditions. In each calculation of the evolution of the system one must calculate an ensemble of trajectories at each step, rather than just one, because the initial conditions have a spread of values represented by the uncertainty extremes.

    At every step in a serial calculation, the constant physical uncertainty in, e.g., cloud forcing, produces a new spread of values about each of the multiple intermediate states of the evolving system, and so every individual trajectory in every intermediate ensemble will further furcate into a new ensemble with every step in the calculation of subsequent states. Trivially, each furcation will derive from the two extremes of the relevant physical uncertainty. The result will be a widening cone of trajectories that will spread out in every phase-space dimension of the system for which there is a constant physical uncertainty of input.

    The spread of those trajectories will represent an uncertainty apart from, and in addition to, anything due to deterministic chaos.

    There is no magic statistical bullet that allows anyone to avoid the cumulative physical uncertainty that attends intermediate states in calculating step-wise projections. Climate futures projections are step-wise calculations.

    Averaging of trajectories will not reduce any of the physical uncertainty.

  28. "Climate futures projections are step-wise calculations."

    Not exactly.

    They are the cumulative statistics of step-wise calculations, calculations which are subject to constraints analogous to those which apply to the real world.

    They converge to meaningful results for the same reasons that the earth has a climate. The small errors that overwhelm an instantaneous prediction simply come out in the wash in the statistics.

    There's plenty of empirical evidence of this if you don't buy the argument a priori.

    Sorry, you are just blowing smoke. You may think you know what you are talking about but you don't.

    Do you suppose you are the first person to think of these things? On the one hand you claim to be surprised by my foolishness. On the other, nobody in the whole of science seems to have taken note of this point prior to you. It's not just me. It's you against the entire National Academy, for instance.

    Occam's razor, if nothing else, should give you pause.

  29. It's not me against Matthew Collins, Mike. See Collins, M. (2002) "Climate predictability on interannual to decadal time scales: the initial value problem" Climate Dynamics 19, 671–692.

    You can find the abstract here: http://www.springerlink.com/content/lhyxl04t4qmexnem/

    Collins produced an artificial climate using the HadCM3 and then tested how well it could reproduce that same climate when the initial conditions were very slightly varied. HadCM3 test projections went to zero fidelity after only four seasons, even though it was the perfect model for the reference climate. And he was testing climate, not weather.

    I've looked at the citation record of this paper in SciSearch to see who else may have done such a test. By that criterion, no one has done. But it's almost the perfect test for the effect of physical uncertainties on predictive reliability, and the result validates my point and my article.

    The HadCM3 outputs stayed bounded but the information content went rapidly to zero. This is exactly what is expected if uncertainties accumulate as the model propagates stepwise forward.

    Why not take the GCM of your choice and do the analogous tests? Publish the results in a peer-reviewed journal for us all to see. We'd then discover who is smoking what.

    There's no statistical magic bullet that removes physical uncertainties.

  30. The initial value problem is interesting and controversial but it is the boundary value problem where climate models have established utility.

    It is the residual 'stays bounded' part, which you acknowledge, that is of dominant interest in the GCM world. That's the predictable part, and that's what's normally called "climate".

    I wish the seasonal variability folks didn't call what they do "climate prediction". There is this middle ground between climate and weather. I think it's at best of academic interest. I doubt it will ever have much practical utility, despite the fact that they've been talking this stuff up for decades.

    Collins' assessment agrees with mine on this. The fact that you don't perceive the difference is further evidence that you aren't closely acquainted with practitioners in the field. Why don't you write Collins and ask him whether he thinks GCMs are of no use in predicting the future on multidecadal scales?

    Since he has done some paleoclimate work I expect he will disagree.
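
    To put the boundary-value point in toy form (and it is only a toy, in Python): the instantaneous state of a noisy, damped system is unpredictable, but the shift of its long-run mean under a steady forcing is quite predictable.

        import random

        random.seed(2)
        DAMPING = 0.05
        NOISE = 0.5
        N_STEPS = 200000

        def long_run_mean(forcing, x0):
            """Time-mean of a damped, noisy variable under a constant forcing."""
            x = x0
            total = 0.0
            for _ in range(N_STEPS):
                x += -DAMPING * x + forcing + random.gauss(0.0, NOISE)
                total += x
            return total / N_STEPS

        # Different initial states, different noise, with and without forcing:
        print(round(long_run_mean(0.0, x0=5.0), 2))   # hovers near 0
        print(round(long_run_mean(0.1, x0=-5.0), 2))  # hovers near forcing / DAMPING = 2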

  31. Shorter Pat Frank,

    If I throw a coin once, the uncertainty of the outcome is 50%.

    If I throw it 100x, the net physical[?] uncertainty of the outcome is 5000%.
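
    For the record, the actual arithmetic, as a quick sketch:

        import random

        random.seed(3)
        # Proportion of heads in 100 fair flips, repeated many times:
        trials = [sum(random.randint(0, 1) for _ in range(100)) / 100.0 for _ in range(10000)]
        mean = sum(trials) / len(trials)
        std = (sum((t - mean) ** 2 for t in trials) / len(trials)) ** 0.5

        print(round(mean, 3))  # ~0.5
        print(round(std, 3))   # ~0.05 = sqrt(0.5 * 0.5 / 100); per-flip uncertainty does not add up linearly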

  32. Mike, you wrote, "The initial value problem is interesting and controversial but it is the boundary value problem where climate models have established utility."

    How would you know? You've never done the experiment. Neither has anyone else. Why not repeat Collins' experiment over the long term? It would be a publishable study. Then we'd all know if the initial value problem is unimportant. Otherwise, your claim is unsupported, but badly undercut by Collins' work.

    It's pretty striking that you'd claim a decadal run is not relevant to climate. It's even more striking that you'd claim a climate model that runs to zero fidelity after only a year will somehow track back to a correct trajectory over longer times.

    Collins' result shows that you and Gavin are wrong to exclude the initial value problem. And the entire discussion in terms of initial value or boundary value problems ignores the problem of theory-bias. The tests I did in SI section 4.2 were to see how the cloud errors behaved. They behaved like theory bias, and not like Gaussian random error reducible by averaging.

    Theory bias means your projections will diverge continuously from the correct trajectory. Along every axis on which theory-bias operates, the trajectory in a stepwise calculation will continuously trifurcate something like 1+3^(s-1), where 's' is the number of steps, and '3' represents the mean and two uncertainty value extremes. After a multidecadal multi-stepwise calculation, the result will be a very large number of trajectories, and their spread will represent the lack of certainty in the result. This lack of certainty remains present no matter that all of the trajectories are bounded. This is the meaning of Collins' result. The HadCM3 produced bounded climates that had zero fidelity.

    All my Figure 4 calculation does is compress the cloud feedback uncertainty into two dimensions. The wide 2D uncertainty vertex speaks to the higher dimensional spread of the climate trajectories. If the IPCC wants to represent temperature excursions in two dimensions, it ought to represent the physical uncertainty properly, rather than giving us a physically meaningless ensemble SD.

    And for those readers who want to see what initial conditions do to long term Lorenz projections, go here: http://tinyurl.com/94gfm

    Click twice rapidly in the white field, trying to not move the mouse. This will produce two trajectories with very similar initial conditions. Keep watch, and the trajectories will diverge. Eventually, they'll be out of phase, but bounded. Zero fidelity. That's Earth climate and your climate projection, Mike, even with a perfect model. Collins' result in microcosm.

  33. I'm not sure whether Pat Frank willfully misunderstands my position in order to mislead others, or simply displays his incapacity for understanding the topic at hand. The last comment leans rather in the direction of deliberate obfuscation. As usual, it is difficult to be sure.

    I refer the interested reader to these two attempts of mine to elucidate the present matter.

    part 1

    part 2

    With that, comments are closed on this thread.
