Monday, July 02, 2007

RealClimate again...

They seem desperate to defend their models if they go to such lengths to dismiss criticism of the station methodology. Although they accept that individual stations might prove error-stricken, they believe (without any counter-checking) that either the companies running them fix the problems by interpolation, or that the errors don't affect the median temperature data.
Then they go on at length about how climate models are physical rather than statistical models and are therefore fairly unaffected by temperature record changes and station errors.

I can't believe that, because the models must have some boundary conditions (like starting temperatures) that have to be assembled from the real world. Sadly, they don't release the source code of their models so that this could be verified. My second problem is with the distinction between physical and statistical models that they draw upon.
They obviously assume that all the mechanisms involved in the world's climate are physically well understood and accounted for.
I want to draw on an example that is smaller in scale but just as chaotic as the world's climate: chip formation during broaching or drilling. Although the physical laws governing the process are known, chip formation itself is so chaotic that no prediction holds. Yes, if we know the material being cut we can say whether the chips will tend to be longer or shorter, but we cannot predict what kind of chips we will actually get.
We see the same in climate models: we can tell that there will be a change, up or down within the natural cycle, but we cannot predict the actual next step.
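As a loose illustration of that point (a toy system of my own, not a climate model and nothing taken from RealClimate or any GCM), consider the logistic map: its update rule is known exactly, yet two almost identical starting values soon diverge completely, so the individual trajectory is unpredictable even though the overall range of the behaviour stays bounded.

```python
# Toy illustration (not a climate model): the logistic map is a fully
# deterministic system whose governing rule is known exactly, yet two
# almost identical starting states quickly drift apart.

def logistic_map(x0, r=3.9, steps=30):
    """Iterate x -> r * x * (1 - x) and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_map(0.500000)   # one starting state
b = logistic_map(0.500001)   # a starting state differing by one millionth

for step in (5, 10, 20, 30):
    print(f"step {step:2d}: a={a[step]:.6f} b={b[step]:.6f} diff={abs(a[step] - b[step]):.6f}")
```

Both runs stay between 0 and 1 (the "cycle" stays bounded), but after a couple of dozen steps the two trajectories no longer have anything to do with each other.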
To claim that we have a full physical model of the whole world and its temperature and thermodynamic states is just an illusion. If we had one, we could project how the wind will change and how clouds will form; however, we cannot (as the IPCC report itself states).

We do have some rudimentary understanding of what governs the Earth's climate system (solar cycles, cosmic rays, cloud formation, heat transfer, albedo), but we still don't have physical equations that are more than a close approximation of the 'now' state.
The overall function governing the change in temperature would look something like this:

ΔT = f(cloud formation) * k1 + f(solar energy) * k2 + f(GHG) * k3 + f(aerosols) * k4 + f(albedo) * k5 + ...

And as RealClimate has already demonstrated, those weighting factors are not known and can become utterly complex...
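Just to make the point concrete, here is a small, purely illustrative sketch of that formula. The forcing values and the weighting factors k1..k5 are invented numbers, not measured quantities; the only point is that modest uncertainty in the unknown k's already produces a wide spread in the resulting ΔT.

```python
import random

# Purely illustrative numbers for the pseudo-formula
#   ΔT = f(clouds)*k1 + f(sun)*k2 + f(GHG)*k3 + f(aerosol)*k4 + f(albedo)*k5
forcings = {            # hypothetical f(...) values, arbitrary units
    "clouds":  -0.4,
    "sun":      0.3,
    "GHG":      1.6,
    "aerosol": -0.9,
    "albedo":  -0.1,
}
nominal_k = {name: 1.0 for name in forcings}    # assumed weights (unknown in reality)

def delta_t(k):
    """Weighted sum of the forcing terms."""
    return sum(forcings[name] * k[name] for name in forcings)

# Perturb each weight by +/-30 % to mimic our ignorance of the k's
random.seed(0)
samples = [
    delta_t({name: nominal_k[name] * random.uniform(0.7, 1.3) for name in forcings})
    for _ in range(1000)
]

print(f"nominal ΔT: {delta_t(nominal_k):+.2f}")
print(f"spread with uncertain weights: {min(samples):+.2f} .. {max(samples):+.2f}")
```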

3 comments:

Anonymous said...

Err, yes they do release the source code for the models. You are free to download and run GISS's model if you care to.

Also, the models' boundary conditions are the outside forcings upon the climate model. The temperature record would provide initial conditions. The models are not constrained to observations; if you downloaded the model you could work this out for yourself. Climate is a boundary condition problem, not an initial condition problem, and the boundary conditions are the outside factors controlling Earth's energy budget.

Here is the link:

http://www.giss.nasa.gov/tools/modelE/

Max said...

Ahhh, thank you. The last time I asked Mr. Mann whether his code was freely available, he said he was sorry that it was not...

I'll be back once I've waded through it...

Max said...

As I already said, it is a boundary problem, but when it comes to the boundary conditions affecting a climate system, we also need to know what they consist of. Do we know that exactly? They claim we do, but we assume certain energy transfers between the layers of the atmosphere and to the ground and back. We also make assumptions about the solar constant that are not entirely accurate.

We test some or most of these assumptions in laboratories in the hope that no couplings are missed. Usually you test one variable while keeping the others constant, but what if they interact and influence each other?
In that case we try to estimate the effect in the laboratory, but we still don't know whether it is the same in the real climate. The small sketch below shows what I mean.
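A tiny sketch (the response function is invented, only to show the principle): if two variables interact, varying them one at a time in the lab misses the coupling term entirely.

```python
# Invented response with an interaction (coupling) term: a + b + 0.5*a*b
def response(a, b):
    return a + b + 0.5 * a * b

base_a, base_b = 0.0, 0.0

# One-factor-at-a-time experiments: change each input on its own
effect_a = response(1.0, base_b) - response(base_a, base_b)   # 1.0
effect_b = response(base_a, 1.0) - response(base_a, base_b)   # 1.0

predicted = response(base_a, base_b) + effect_a + effect_b    # 2.0
actual    = response(1.0, 1.0)                                # 2.5

print(f"predicted from separate tests: {predicted:.1f}")
print(f"actual with both changed:      {actual:.1f}  (the 0.5 is the missed coupling)")
```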