Worse than nothing? – Watts Up With That?

Reposted by AIER

Robert L. Bradley Jr. – June 23, 2021

“Climate modeling is central to climate science …” (Steven Koonin, below)

If the history of climate modeling is written in the distant future, the main story could be how the simple, predictable answer turned out to be the wrong one, resulting in overestimated warming and false horrors from the heightened (man-made) greenhouse effect.

In the meantime, empirical and theoretical evidence for this groundbreaking judgment is mounting, despite the best efforts of the establishment to look the other way.

See a University of Colorado Boulder press release from earlier this month entitled “Warm Clouds, Cool Planet” and subtitled “Precipitation Feedback Cycle Means Models May Overestimate Warming.”

“Today’s climate models run hotter than their predecessors,” the announcement begins.

But a paper published this week shows how the models can err on the side of excessive warming: as Earth’s clouds warm, they cool the surface more than expected, a German-led team reported in Nature Climate Change.

“Our work shows that the increase in climate sensitivity in the latest generation of climate models should be treated with caution,” said CIRES Fellow Jennifer Kay, Associate Professor of Atmospheric and Oceanic Sciences at the University of Colorado Boulder and co-author of the paper.

The press release goes on to describe how incorporating this negative feedback will improve next-generation climate models, which matters in light of the upcoming Sixth Assessment of the Intergovernmental Panel on Climate Change (IPCC). But will conflicted modelers and the politicized IPCC be open about the elephant in the room?

Background

Strong positive feedbacks to the release of carbon dioxide (CO2) and other man-made greenhouse gases (GHGs) are what turn a modest, even benign, warming into the opposite. Increased evaporation in a warmer world (mainly from the oceans) has been thought to cause a strong positive feedback that doubles or even triples the primary warming.
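The doubling-or-tripling claim follows from the standard linear feedback relation, total warming = direct warming / (1 − f), where f is the combined feedback factor. A minimal Python sketch, using a hypothetical but conventional value of about 1.1 °C for the no-feedback warming from doubled CO2:

```python
# Illustrative arithmetic only: the zero-dimensional linear feedback
# relation. All numbers are assumptions chosen for illustration.

def amplified_warming(direct_warming_c: float, feedback_factor: float) -> float:
    """Total warming after feedbacks: direct warming / (1 - f)."""
    if not 0.0 <= feedback_factor < 1.0:
        raise ValueError("feedback factor must be in [0, 1) for a stable climate")
    return direct_warming_c / (1.0 - feedback_factor)

direct = 1.1  # rough no-feedback warming (deg C) for doubled CO2

print(amplified_warming(direct, 0.0))    # no feedback: 1.1 C
print(amplified_warming(direct, 0.5))    # f = 0.5 doubles the warming
print(amplified_warming(direct, 2 / 3))  # f = 2/3 triples it
```

The "doubling or tripling" of primary warming thus corresponds to feedback factors of 0.5 and about 0.67; the dispute described below is over how large f really is.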

Technically, water molecules trap heat, and clouds or water vapor in the upper tropical troposphere – where the air is extremely dry – trap significantly more heat, thickening the greenhouse. Whether water in this upper layer (approximately 30,000–50,000 feet) acts to trap (amplify) or release (dampen) heat is debatable, leaving even the sign of the externality unknown to climate economics. And it is in the upper troposphere that climate models clash with the data.

By assuming fixed relative humidity, all other things equal, modelers can assume away changed physical processes that could well negate the secondary warming. This contested assumption opens the door to unrealistic hyper-modeling. (For economists, the analogy would be assuming “perfect competition” to unleash hyper-theorizing.)

Model critics have questioned this simplified handling of complexity for decades. Meanwhile, climate models have predicted much stronger warming than has actually occurred.

Theorists have long quarreled with the modelers. Richard Lindzen of MIT, author of Dynamics in Atmospheric Physics, has advanced various hypotheses as to why the water-vapor feedback is much weaker than modeled. Judith Curry, whose blog Climate Etc. is a leading resource on the physics and related developments, is another critic of highly sensitive models.

“There are a number of credible perspectives that I try to take into account,” she says. “It’s a very complex problem and we don’t have the answers yet.”

And now we have way too much confidence in some very dubious climate models and inadequate data sets. And we’re not framing the problem broadly enough to … make credible projections about the range of things we could possibly see in the 21st century.

Mainstream recognition

Climate scientists know that climate models are extremely complicated and fallible. In What We Know About Climate Change (2018, p. 30), MIT’s Kerry Emanuel explains:

Computer modeling of the global climate is perhaps the most complex endeavor mankind has ever undertaken. A typical climate model consists of millions of lines of computer instructions designed to simulate an enormous range of physical phenomena….

Although the equations that represent the physical and chemical processes in the climate system are known, they cannot be solved exactly…. The rub is that many important processes occur at much smaller scales [than the model grid].

The parameterization problem resembles the fallacies of macroeconomics, in which the crucial causality of individual action is ignored. Microphysics is the driver of climate change, but the equations are unsettled and operate at sub-grid scales. Like macroeconomics, macro-climatology should have been heavily qualified and demoted long ago.

My mentor Gerald North, former head of the Department of Atmospheric Sciences at Texas A&M, made a number of observations in 1998–99 about the crude, oversold nature of climate models that are still relevant today.

We don’t know much about modeling climate. It’s as though we are modeling a human being. Models are in position to tell us the creature has two arms and two legs, but we are being asked to cure cancer.

There are good reasons for the lack of consensus in science. It’s just too early. The problem is difficult, and there are pitifully few ways to test climate models.

One has to fill in what happens between 5 km and the surface. The standard route is through atmospheric models. I cannot make a better excuse.

The different models couple to the oceans differently. There is quite a bit of leeway here (indefinite fudge factors). If a model is too sensitive, you can just couple in a little more ocean to bring it in line with the record. Because of this, models with different sensitivities all seem to mimic the record equally well. (Model builders would be offended by my explanation, but I think it is correct.)

[Model results] could also be sociological: to get the socially acceptable answer.

The IPCC’s Fifth Assessment (2013), the “official” or mainstream report, acknowledges fundamental uncertainties but accepts the model methodology and results at face value. “The complexity of the models,” it states (p. 824), “has increased substantially since the IPCC’s First Assessment Report in 1990 …”

However, every additional layer of complexity intended to improve some aspect of the simulated climate also introduces new sources of possible error (e.g., from uncertain parameters) and new interactions between model components that may, if only temporarily, degrade a model’s simulation of other aspects of the climate system. Furthermore, despite progress made, scientific uncertainty remains regarding the details of many processes.

The humbling nature of climate modeling was the subject of a 2019 piece in The Economist. “Predicting the climatic future is riddled with uncertainty,” it explained:

[Climate modeling] is a complicated process. A model’s code has to represent everything from the laws of thermodynamics to the intricacies of how air molecules interact with one another. Running it means performing trillions of mathematical operations a second – hence the need for supercomputers.

[S]uch models are crude. Millions of grid cells may sound like a lot, but it means that an individual cell, seen from above, covers roughly 10,000 square kilometres, while an air or ocean cell may have a volume of as much as 100,000 km³. Treating these enormous areas and volumes as single points misses much detail.

Clouds, for example, pose a particular challenge for modelers. Depending on how and where they form, they can either warm or cool the climate. But an individual cloud is much smaller than even the smallest grid cells, so its effect cannot be captured directly. The same goes for regional effects produced by, say, topographical features or islands.

Building models is also made harder by a lack of knowledge about how carbon – the central atom in molecules of carbon dioxide and methane, the main heat-trapping greenhouse gases other than water vapour – moves through the environment.

“But researchers are doing their best,” concluded The Economist.
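The grid numbers quoted above can be sanity-checked with simple arithmetic. A back-of-the-envelope Python sketch, where the 100 km cell edge and the 50 vertical levels are assumptions chosen to match the order of magnitude The Economist describes:

```python
# Back-of-the-envelope check of the quoted grid figures.
# The horizontal resolution and level count are illustrative assumptions.

EARTH_SURFACE_KM2 = 5.1e8  # approximate total surface area of Earth
CELL_EDGE_KM = 100.0       # assumed horizontal resolution

cell_area = CELL_EDGE_KM ** 2            # 10,000 km^2 per cell, as quoted
columns = EARTH_SURFACE_KM2 / cell_area  # ~51,000 atmospheric columns
vertical_levels = 50                     # a typical order of magnitude
total_cells = columns * vertical_levels  # "millions of grid cells" territory

print(f"cell area:   {cell_area:,.0f} km^2")
print(f"columns:     {columns:,.0f}")
print(f"total cells: {total_cells:,.0f}")
```

Note also that the quoted 100,000 km³ cell volume corresponds to a 10,000 km² cell roughly 10 km thick; an entire thunderstorm or island can disappear inside one such cell, which is the parameterization problem in miniature.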

In fact, climate models clearly overestimate global warming – by as much as half. And the gap keeps widening as a cool 2021 unfolds. As for the future, anthropogenic warming is constrained by the logarithmic, rather than linear, nature of the greenhouse effect. This saturation effect means that warming increments shrink as the atmosphere holds more CO2: the warming caused by a doubling of CO2 is repeated not at a tripling but only at a quadrupling.
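The saturation point can be made concrete with the widely cited simplified expression for CO2 radiative forcing, F = 5.35 × ln(C/C0) W/m² (Myhre et al., 1998). A minimal sketch:

```python
import math

# The logarithmic (saturation) effect: forcing grows with the log of the
# concentration ratio, so each doubling adds the same increment.

def co2_forcing(concentration_ratio: float) -> float:
    """Radiative forcing (W/m^2) relative to a reference CO2 level,
    per the Myhre et al. (1998) simplified expression."""
    return 5.35 * math.log(concentration_ratio)

f2 = co2_forcing(2.0)  # one doubling: ~3.7 W/m^2
f3 = co2_forcing(3.0)  # a tripling adds less than a second doubling would
f4 = co2_forcing(4.0)  # two doublings: exactly twice the forcing of one

print(f"2x CO2: {f2:.2f} W/m^2")
print(f"3x CO2: {f3:.2f} W/m^2")
print(f"4x CO2: {f4:.2f} W/m^2")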

The window for alarm is closing fast, which explains the shrill language of prominent politicians. But it is the underlying climate models, not the climate itself, that are running out of time.

“Unsettled” is becoming mainstream

The crude methodology and false conclusions of climate modeling are emerging from the shadows. Physicist and computational scientist Steven Koonin explains in his influential Unsettled: What Climate Science Tells Us, What It Doesn’t, and Why It Matters (Chapter 4):

Climate modeling is central to climate science…. However, many important phenomena occur on scales smaller than the 100 km (60 mile) grid size (such as mountains, clouds, and thunderstorms), and therefore researchers must make “sub-grid” assumptions to build a complete model …

Since the results generally don’t look much like the climate system we observe, modelers then adjust (“tune”) these parameters to get a better match with some features of the real climate system.

Too little tuning leaves the model unrealistic, but too much tuning “runs the risk of cooking the books – that is, predetermining the answer,” Koonin adds. He then quotes from a paper co-authored by fifteen world-class modelers:

Tuning is often seen as an unavoidable but dirty part of climate modeling, more engineering than science, a tinkering that does not merit recording in the scientific literature. Indeed, tuning may be seen as an unspeakable way of compensating for model errors.
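To make the tuning concern concrete, here is a deliberately toy sketch (the model, the parameter name, and every number are hypothetical): a one-parameter “model” is bisected until its output matches an observed target. The match says nothing about whether the tuned value is physically right, which is exactly the book-cooking risk Koonin describes.

```python
# Toy illustration of "tuning": adjust a free sub-grid parameter until
# the model output matches an observed target. Purely illustrative;
# real climate models tune dozens of parameters against many fields.

def toy_model(cloud_param: float) -> float:
    """Pretend global-mean temperature anomaly (deg C) for a given
    sub-grid cloud parameter. Hypothetical linear stand-in."""
    return 0.5 + 2.0 * cloud_param

OBSERVED_ANOMALY = 1.1  # the target the modeler tunes toward

# Simple bisection on the free parameter (toy_model is monotone).
lo, hi = 0.0, 1.0
for _ in range(50):
    mid = (lo + hi) / 2
    if toy_model(mid) < OBSERVED_ANOMALY:
        lo = mid
    else:
        hi = mid

print(f"tuned cloud_param: {mid:.4f}")
print(f"model output:      {toy_model(mid):.4f}")
```

Any monotone toy_model would have been tuned to the same perfect match; agreement with the record is bought, not earned, which is North’s point above about models with different sensitivities all mimicking the record equally well.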

Conclusion

Climate modeling has arguably been worse than nothing, because false information has been presented as true and as “consensus.” Alarmism and disruptive political activism (forced substitution of inferior energies; challenges to lifestyle norms) have taken on a life of their own. Fire, ready, aim has replaced prudence, from science to public policy.

Data continue to confound the naive climate models. A very difficult theory is slowly but surely explaining why. The climate debate is back to the physics, which it should never have left.
