Missing heat

Miscellaneous — Drokles on August 2, 2011 at 12:24 pm

It is not only the Danish summer that lacks heat; so does the entire planet. At least according to the heat budget, in which the energy radiated in from the Sun must equal the radiation emitted by the Earth. Shifts in this balance translate into temperature changes. The reigning, indeed tyrannizing, theory is that the changes in the atmosphere's CO2 content that we are seeing in these intensely industrialized times have shifted the balance between incoming and outgoing radiation, so that warming of hitherto unseen proportions has set in. And as proof, the theory has been programmed into computer models, all of which confirm the theory.
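The balance described above, incoming solar radiation equaling outgoing thermal radiation, can be sketched with a zero-dimensional energy-balance estimate. This is a textbook simplification with assumed standard values for the solar constant and albedo, not a reconstruction of any particular climate model:

```python
# Zero-dimensional energy balance: absorbed solar = emitted thermal radiation.
# Assumed standard textbook values; a simplification for illustration only.
S = 1361.0        # solar constant, W/m^2
albedo = 0.30     # fraction of sunlight reflected back to space
sigma = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

# Sunlight is intercepted over a disc but spread over a sphere,
# hence the factor 1/4 on the absorbed flux per square meter.
absorbed = S * (1 - albedo) / 4.0

# Equilibrium temperature where emission sigma * T^4 equals absorption.
T_eq = (absorbed / sigma) ** 0.25
print(f"Effective emission temperature: {T_eq:.1f} K")  # about 255 K
```

A shift in either side of this balance, whether from greenhouse gases trapping outgoing radiation or from changes in cloud cover reflecting incoming radiation, moves the equilibrium temperature.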

The problem has been that the temperature, by all accounts, has not risen over the past 10 to 15 years, which has led to a debate about where on Earth the heat was hiding. One of the most serious suggestions was that the oceans had swallowed the heat and stashed it well below the surface, where it simply lies in wait to re-emerge with all its stored-up force and turn Shu-bi-dua's Costa Kalundborg into grim reality (except that Africa becomes like an oven).

But a large network of new, advanced buoys, deployed from around the turn of the millennium and able to dive far below the surface toward the lurking heat, has found nothing of the sort. Nor was the heat found among Osama bin Laden's collection of pizza boxes and porn films. Now, new measurements of physical reality apparently show that the heat is not accumulating here on Earth after all, as otherwise assumed, but is in fact leaking out into space faster than assumed. From Dr. Roy Spencer's employer, the University of Alabama in Huntsville:

The previously unexplained differences between model-based forecasts of rapid global warming and meteorological data showing a slower rate of warming have been the source of often contentious debate and controversy for more than two decades.

In research published this week in the journal “Remote Sensing,” Spencer and UAHuntsville’s Dr. Danny Braswell compared what a half dozen climate models say the atmosphere should do to satellite data showing what the atmosphere actually did during the 18 months before and after warming events between 2000 and 2011.

“The satellite observations suggest there is much more energy lost to space during and after warming than the climate models show,” Spencer said. “There is a huge discrepancy between the data and the forecasts that is especially big over the oceans.”

Not only does the atmosphere release more energy than previously thought, it starts releasing it earlier in a warming cycle. The models forecast that the climate should continue to absorb solar energy until a warming event peaks.

Instead, the satellite data shows the climate system starting to shed energy more than three months before the typical warming event reaches its peak.

“At the peak, satellites show energy being lost while climate models show energy still being gained,” Spencer said.

If this holds up, it suggests that Earth's temperature is already in balance and will not shift further. Unless the incoming radiation changes. And perhaps it will, according to a large three-part study of the Sun, which in the coming decades will not favor us with the same sunspot activity, an activity that possibly, via its deflecting effect on cosmic rays, governs Earth's cloud cover and thus the conditions for incoming and outgoing radiation.

Spencer's work on measuring the outgoing radiation has, however, drawn a good deal of criticism, which Live Science gladly relays:

However, no climate scientist contacted by LiveScience agreed.

The study finds a mismatch between the month-to-month variations in temperature and cloud cover in models versus the real world over the past 10 years, said Gavin Schmidt, a NASA Goddard climatologist. “What this mismatch is due to — data processing, errors in the data or real problems in the models — is completely unclear.”

Other researchers pointed to flaws in Spencer’s paper, including an “unrealistic” model placing clouds as the driver of warming and a lack of information about the statistical significance of the observed temperature changes. Statistical significance is the likelihood of results being real, as opposed to chance fluctuations unrelated to the other variables in the experiment.

“I cannot believe it got published,” said Kevin Trenberth, a senior scientist at the National Center for Atmospheric Research.

Several researchers expressed frustration that the study was attracting media attention.

“If you want to do a story then write one pointing to the ridiculousness of people jumping onto every random press release as if well-established science gets dismissed on a dime,” Schmidt said. “Climate sensitivity is not constrained by the last two decades of imperfect satellite data, but rather the paleoclimate record.”

Spencer agreed that his work could not disprove the existence of manmade global warming. But he dismissed research on the ancient climate, calling it a “gray science.”

But apart from the ideological battle between the guilt-ridden theory of man-made calamity and free human enterprise, there is also a scientific battle, namely between measurements and models, as hinted above. The climate is impossible to define precisely, let alone measure. Everything will be approximated and statistically processed. Models, however, are fundamentally only an expression of their creators' conceptions. There is much prestige and much funding at stake in this battle. American Thinker has an excellent description of the climate models' fundamental problem:

Climate science has embraced computer climate models as the tool it uses to compute the magnitude of the warming effect of CO2.  The climate models are riddled with problems.  Kevin Trenberth, a noted climate scientist and a prominent promoter of global warming alarmism, said this about the models: “none of the climate states in the models correspond even remotely to the current observed climate.”  The effect of CO2 is measured by a theoretical number called climate sensitivity.  There are more than 20 climate modeling groups around the world.  These groups each spend millions on programmers and supercomputers, searching for the value of climate sensitivity.  They all get different answers, differing by a ratio of more than two to one.  This failure of consensus would normally be considered a sign that the approach is not working.  But if climate science can’t make predictions of doom, it will cease to be important and funding will collapse.  The climate science establishment had to relax the normal rules of science for its own survival and for the sake of its post-normal-science political goals.

The global warming establishment devised a solution.  They decided to take the average of the various disagreeing models and claim that the average is closer to the truth than any of the models incorporated in the average.  They call this a multi-model ensemble.  The skeptic will ask if averaging together the results from more modeling groups makes the result better, why not spend a few billion dollars more and establish another 20 or 50 modeling groups to still better zero in on the truth?  To read the justifications for multi-model ensembles is to enter a reality distortion field.
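The "multi-model ensemble" averaging described in the passage above can be illustrated with a toy example. The sensitivity values below are invented for illustration, not actual model outputs; they are only chosen so that the highest exceeds the lowest by more than the two-to-one ratio the article mentions:

```python
# Toy illustration of a multi-model ensemble average.
# Hypothetical climate-sensitivity values (deg C per CO2 doubling);
# invented for illustration, not real model outputs.
sensitivities = [1.8, 2.4, 2.9, 3.3, 3.7, 4.1]

# The ensemble approach reports one number from disagreeing models.
ensemble_mean = sum(sensitivities) / len(sensitivities)
spread = max(sensitivities) / min(sensitivities)

print(f"Ensemble mean:    {ensemble_mean:.2f}")
print(f"Spread (max/min): {spread:.2f}")  # ratio above 2, as the article notes
```

Averaging reduces the random part of each model's error, but it cannot remove an error shared by all the models, which is precisely the skeptic's objection.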

The climate models make predictions that cannot be tested because you would have to wait 50 or 100 years to see if the predictions are correct.  The models are evaluated and calibrated by simulating the observed climate of the 20th century.  The entirely unjustified assumption is made that if the models can match the 20th-century climate they must be working well and will be able to predict the future.  This is known as backtesting.  The problem with backtesting is that models may fit the historical data for the wrong reasons.  If a model is complicated, with enough adjustable parameters, it may be capable of fitting almost anything.  Many people have devised stock market models that work well when tested against history.  If such models could predict the future movement of markets or pick winning stocks it would be far easier to make money in the stock market than it is.
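The backtesting pitfall in the passage above, that a model with enough adjustable parameters can fit history for the wrong reasons, can be shown with a deliberately extreme toy model that has one free parameter per data point. The data here are pure noise, standing in for any record with no exploitable pattern:

```python
import random

random.seed(42)

# A "historical record" and a "future": noisy series with no real pattern.
history = [random.gauss(0.0, 1.0) for _ in range(50)]
future = [random.gauss(0.0, 1.0) for _ in range(50)]

# An over-parameterized "model": one free parameter per data point.
# It backtests perfectly, because it simply memorizes the record.
model = {t: value for t, value in enumerate(history)}

backtest_error = sum(abs(model[t] - history[t]) for t in range(50)) / 50

# Out of sample, the memorized parameters carry no information,
# so the forecast error reverts to the noise level of the series.
forecast_error = sum(abs(model[t] - future[t]) for t in range(50)) / 50

print(f"Backtest error: {backtest_error:.3f}")  # exactly zero on history
print(f"Forecast error: {forecast_error:.3f}")  # roughly the noise amplitude
```

A perfect fit to the past here tells us nothing about predictive skill, which is the article's point about calibrating models against the 20th-century record.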

Do read the whole gripping article on the showdown between classical science and "post-normal science".

