Sunday, 3 June 2007

More Climate Inconvenient Truth - Too Much Rain

The climate models used to predict doom and gloom in 50 or 100 years unless we give up heating our homes, driving cars, flying anywhere, importing anything or bathing in warm water are meant to be accurate enough to use as a basis for a change to public policy that will cost trillions of dollars.

Unfortunately, there are more and more examples of climate models' inability to predict anything other than what day of the week it is at a nominated point in the future.

From New Scientist - a solid supporter of global warming catastrophism - comes this article:
Climate experts have cast doubt on the conclusions of a new study predicting that a warmer world would lead to more rainfall - a contradiction of the prediction of most climate change models - which was based on just 20 years of data.
Notice how New Scientist starts its article by immediately jumping to the defence of established orthodoxy rather than coming from a "New study casts doubts on the accuracy of climate models" angle.
Climate models predict that as the planet warms, more water will be suspended in the atmosphere, because hotter air can retain more humidity.

However, this will not be accompanied by an equal increase in rainfall, according to the same models: for every degree of warming, atmospheric humidity will increase by about 7%, while precipitation will only go up by between 1% and 3%.
This is a critical issue. The catastrophic predictions rely on greenhouse gases such as CO2 acting as a forcing agent on water vapour, which then operates as a feedback mechanism: the models predict that the extra water held in the atmosphere will drive temperatures up at 'unprecedented' rates in an upward spiral. If the atmosphere is not retaining water at the predicted rate then the temperature predictions need to be reduced accordingly.
Frank Wentz and colleagues at Remote Sensing Systems in California, US, looked at satellite records from between 1986 and 2005 to see whether the climate models' predicted relationship between rising temperatures and a comparatively small increase in rainfall was borne out in reality.

During that time, average temperatures increased by 0.4°C. The satellite data allowed the team to measure the total water vapour in the atmosphere and precipitation. They found that over the two decades, both factors increased by between 1.1% and 1.2% - or roughly 6.5% for each degree of warming.
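As an aside, the quoted numbers only come close to reconciling if the 1.1-1.2% increases are read as per-decade rates rather than 20-year totals (the article's wording is ambiguous). A quick back-of-envelope sketch, with that assumption flagged:

```python
# Back-of-envelope check of the quoted figures.
# Assumption: the 1.1-1.2% increases are per-decade rates, not 20-year
# totals -- the article text is ambiguous, but only the per-decade
# reading comes close to the quoted "roughly 6.5% for each degree".
warming = 0.4   # degC of average warming over 1986-2005
decades = 2.0   # length of the satellite record, in decades

for rate_per_decade in (1.1, 1.2):
    total_increase = rate_per_decade * decades   # % over the 20 years
    per_degree = total_increase / warming        # % per degC of warming
    print(f"{rate_per_decade}%/decade -> {per_degree:.1f}% per degC")
```

Read as per-decade rates, the figures give 5.5-6.0% per degree, in the same ballpark as the quoted "roughly 6.5%"; read as 20-year totals they would give only about 3% per degree, so the per-decade reading is assumed here.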
The fact is that the climate models got it wrong, wrong, wrong, as they always do. No getting around it, but surely there's some way to invalidate the study?
Too short a study?
Oh. OK. Twenty years is too short a time frame to be useful, yet the Climate Brown Shirts are happy to make predictions for 2020 (0.65 x 20 years away) or 2050 (2.15 x 20 years away). These people are not serious.
"The satellite data for the last 20 years shows an increase in rainfall that is three times what the models predicted," says Wentz. "This represents one of the first tests of the models used for the predictions of the Intergovernmental Panel on Climate Change. The results show a significant discrepancy between model and observations."
A threefold discrepancy is not a trifling issue. It is a major, elephant-in-the-room issue that needs to be factored into the climate models. Of course, the ensuing reduction in predicted temperature rises would call into question the wisdom of the billions of dollars spent on climate modelling, so it's unlikely to happen.
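For a rough sense of scale, setting the model range of 1-3% extra rain per degree (quoted above) against the observed figure of roughly 6.5% per degree gives a sketch like this (the observed value depends on how the earlier percentages are read, so treat it as indicative only):

```python
# Rough ratio of observed to modelled precipitation sensitivity.
# Observed ~6.5 %/degC and the model range of 1-3 %/degC are both
# taken from the figures quoted in the article above.
observed = 6.5                      # % precipitation increase per degC
model_low, model_high = 1.0, 3.0    # modelled range, % per degC

print(f"observed / model: {observed / model_high:.1f}x "
      f"to {observed / model_low:.1f}x")
```

That brackets Wentz's "three times" comfortably: even against the most generous model figure, the observations come in at more than double the predicted rate.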
Wentz and his colleagues admit that the reasons for the discrepancy are not clear. One possible explanation is that two decades is too short a time to detect a real trend, a criticism that has already been levelled at the study by several climatologists.
On the one hand 20 years is too short a time frame but on the other we're told that we need to take urgent action right now. This lack of consistency does not add credibility to climate science at all.
The increase in precipitation between 1986 and 2005 could have been greater than it will be when averaged over longer time scales.
It looks to me like numbers will need to be fudged.
Gavin Schmidt, a climatologist at the NASA Goddard Institute, points out that the 20 years studied were dominated by a couple of El Niño events, which increased precipitation during that time. "The trends are not really significant," he says. "I think some more work would be necessary to really pin their argument down."
And right on cue here's one of the Chief Fudgers himself, along to tell us that it's all in line with expectation and "The trends are not really significant." Of course, when we get a few hot days it's proof of global warming and not a trend that's "not really significant". That's one of the great things about climate science. When you're wrong you're right.
"Two decades is a relatively short period of time for this type of analysis, but it is all we have," Wentz told New Scientist. While the 1997 to 1999 El Niño increased precipitation by 3%, models predicted it would only result in a 1.5% increase - it's "another example of the climate models under-predicting rain variability", he says.
Quick, New Scientist, you need another 'authority' to derogate the importance of the study!
Roy Spencer of the University of Alabama in Huntsville, US, who was not involved in the study, is familiar with the satellite instruments used by Wentz. "It is not clear that the trend they are measuring is rainfall," he says.
That'll do. Find a bloke that wasn't involved in the study. Perfect.
The instrument measures the total amount of liquid water in the atmosphere, but does not give an indication of its altitude. If you don't know if it is falling, how do you know it is rain, asks Spencer. He believes that in a warming climate, warm air rising from the Earth's surface could hold liquid water in the atmosphere for longer.
"He believes..." He believes! He believes! He believes? What the hell does that mean? Isn't he meant to know? Isn't the science settled? What's belief got to do with it?
"I think this is probably the most accurate measurement of liquid water ever made," says Spencer, "but I question the physical interpretation" that there has been more precipitation.
Even better. It's impossible to deny the accuracy of the equipment so deny the interpretation. I still don't think that New Scientist has quite done enough to demonstrate how wrong 'non-consensus' scientists really are.
Spencer has himself expressed views that the scale of global warming may not be as bad as predicted. In the early 1990s, he participated in experiments led by John Christy, also of the University of Alabama in Huntsville, which suggested that the lower troposphere was cooling, not warming. Both scientists later admitted their measurements were faulty (see Sceptics forced into climate climb-down).
There you go! Trip back 15 years - a huge amount of time in climate science terms - and find someone who was wrong.
Christy says he is now working with Wentz on a new project using improved satellite instruments. Unlike the previous generation of instruments, these ones are able to gauge the relative altitude of raindrops.
John Christy is a brilliant scientist and a well-known climate change sceptic. Far from a humiliating 'climb-down', Christy thanked the people who discovered his error, factored it into his analysis and concluded that the earth was warming at between 1.2°C and 1.3°C per century, which he quite rightly describes as nothing to worry about.

Yet again we see the failure of climate models to predict the past with even a modicum of accuracy. Yet again we see the Climate Brown Shirt lobby come rushing to the aid of their unscientific religion. And we're meant to spend trillions of dollars on this stuff? Seriously?

2 comments:

Anonymous said...

You're a dope. Like all the others in climate denial, your "oops ... I guess I was wrong" will come way too late.

Darren said...

I'm glad you continue to post this. Someone has to stand up to the Church of Global Warming and finger them as the cult they are.