
Warming by the Sun or by the Atmosphere ...?


Okular:
Real-world climatic significance of ’the enhanced greenhouse effect’ – a straightforward test aimed at potential falsification.

A surface with a temperature above 0 K would have an incoming and an outgoing heat flux – it would gain its heat from somewhere (incoming heat flux) and at the same time give off heat to its surroundings according to its thermodynamic state. This outgoing heat flux would be the heat loss of that surface.

The surface of the Earth is such a surface:

(From http://earthobservatory.nasa.gov/Features/EnergyBalance/page5.php.)

If the surface maintains thermodynamic equilibrium with its surroundings, the outgoing heat flux would balance the incoming exactly. And its temperature would remain constant.

To change the temperature of this surface, one thus has to do (at least) one of two things:
•   Increase the incoming heat flux – the heat gain, or
•   reduce the outgoing heat flux – the heat loss.
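In symbols, a minimal statement of the bookkeeping above, with C an effective heat capacity of the surface, \Phi_{in} the heat gain and \Phi_{out} the heat loss:

    C \frac{dT}{dt} = \Phi_{in} - \Phi_{out}

At equilibrium \Phi_{in} = \Phi_{out} and dT/dt = 0. Warming (dT/dt > 0) therefore requires either raising \Phi_{in} or lowering \Phi_{out}; there is no third option.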

So, turning to Earth’s surface – a ’body’ holding a mean temperature of +15°C (288 K) – and assuming it is in a state of thermodynamic equilibrium (this will of course never be exactly the case), the surface temperature remains constant and the heat flux OUT equals the heat flux IN from the Sun – according to the Stephens et al. 2012 study, ~165 W/m2.

The total mean heat loss flux from the global surface of our planet is acquired by summing the shares from latent heat transfer (through evaporation), sensible heat transfer (through conduction/convection) and net thermal radiation (net IR UP).
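As a minimal sketch of this bookkeeping in code (the round flux values are purely illustrative, of the same order as the Stephens et al. 2012 estimates rather than the study’s exact figures):

    # Toy surface heat budget, all fluxes in W/m^2.
    latent = 88.0     # latent heat transfer (evaporation)
    sensible = 24.0   # sensible heat transfer (conduction/convection)
    net_ir = 53.0     # net thermal radiation (net IR UP)
    solar_in = 165.0  # incoming heat flux from the Sun

    heat_loss = latent + sensible + net_ir
    print(heat_loss)             # 165.0 -> total heat OUT
    print(solar_in - heat_loss)  # 0.0 -> equilibrium, temperature constant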

What, then, can we do to raise the surface temperature from this state of equilibrium?

As already mentioned, there are two ways and two ways only:
1)   We can increase the incoming heat (the heat gain), or
2)   we can reduce the outgoing heat (the heat loss).

These two are distinctly different methods by which to accomplish surface warming.

With 1) the surface is warmed directly – by increasing a heat flux (the incoming one). Extra heat is supplied.

With 2) the surface is warmed INdirectly – by reducing a heat flux (the outgoing one). No extra heat is supplied.

(In both cases the opposite flux is assumed to remain constant.)


This distinction is crucial.


The two distinct ways to achieve surface warming can be called:
1)   The Solar Method, and
2)   The Atmospheric Method.

Why?

Simply because of what the 2nd law of thermodynamics dictates. It says that heat (the net energy flow) between two warm objects in thermal contact will always go from the warmer to the cooler object, because a system like this will always spontaneously move toward the highest possible state of entropy, which in this case would be thermodynamic equilibrium (even temperature). In nature this process is irreversible.

This is the process described: ”When two systems come into thermal contact, they exchange energy through the microscopic interactions of their particles. When the systems are at different temperatures, the result is a spontaneous net flow of energy from higher to lower temperature, so that the higher temperature decreases [through heat loss] and the lower increases [through heat gain].”

The Sun is warmer than the Earth’s surface. Thus it CAN transfer heat to the surface. And it does. The NET FLOW OF ENERGY between the Sun and the Earth’s surface will spontaneously go from the former to the latter. Always.

The atmosphere, on the other hand, is cooler than the Earth’s surface. It can NOT transfer heat to the surface. And it doesn’t. It will of course always and continuously transfer thermal ENERGY to the surface, simply because it has a temperature above 0 K. But it can never, on a global scale, transfer HEAT. The NET FLOW OF ENERGY between the atmosphere and the Earth’s surface will spontaneously go from the latter to the former. Always.
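A quick numerical illustration of the distinction between energy flow and net energy flow (heat), treating the surface and a cooler atmospheric layer as two blackbodies; the 255 K layer temperature and the unit emissivities are my simplifying assumptions:

    # Both bodies radiate, but the NET flow always goes warm -> cold.
    sigma = 5.67e-8            # Stefan-Boltzmann constant, W/m^2/K^4
    t_surface = 288.0          # K, mean global surface temperature
    t_atmosphere = 255.0       # K, a cooler atmospheric layer (illustrative)

    up = sigma * t_surface**4        # energy surface -> atmosphere, ~390 W/m^2
    down = sigma * t_atmosphere**4   # energy atmosphere -> surface, ~240 W/m^2
    print(up - down)                 # ~150 W/m^2 net: heat flows surface -> atmosphere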

As one can see, there are two strictly separate thermodynamic mechanisms at hand to explain surface warming.

And hence we have a way to determine how the observed global warming of the recent three and a half decades came about.

Through which of these two mechanisms was the global surface temperature raised?

This is important:

- The Sun can ONLY change (raise/lower) the surface temperature directly, that is
   by changing the net energy INput – heat gain.

- The atmosphere can ONLY change the surface temperature INdirectly, that is by
   changing the net energy OUTput – heat loss.


So, who is the culprit? The Sun or the Atmosphere?


We have two possible scenarios:

a)
If the solar input increases, the Earth’s surface will warm directly. That means the surface temperature will rise first. And THEN, as a response, the total heat loss from the surface will start increasing. To catch up. It will potentially increase until the heat gap is closed (heat IN + <--> heat OUT) and balance is restored.

b)
If the atmospheric forcing is strengthened, for instance by increasing the optical depth to surface IR radiation through a rising atmospheric content of GHGs, the Earth’s surface will warm INdirectly. That means the total surface heat loss will be suppressed (reduced) first. And THEN, as a response, the surface temperature will start rising. And it will potentially rise until the heat gap is closed (heat IN <--> heat OUT –) and balance is restored.

You see the opposite course of events here?
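A toy zero-dimensional model makes the opposite course of events explicit. This is a sketch only: the linearized heat loss k*T and every parameter value are invented for illustration.

    # C*dT/dt = gain - loss, with the heat loss linearized as k*T.
    # a) Solar Method: step the gain UP.
    # b) Atmospheric Method: cut the loss efficiency k.
    C, k, q_in, dt = 10.0, 1.0, 165.0, 0.01

    def run(gain_step, k_factor, steps=5000):
        T = q_in / k                        # start in equilibrium (loss = gain)
        for _ in range(steps):
            loss = k_factor * k * T
            T += dt * (q_in + gain_step - loss) / C
        return T, k_factor * k * T          # final temperature, final heat loss

    print(run(5.0, 1.00))   # a) warmer, and heat loss ends ABOVE its original 165
    print(run(0.0, 0.97))   # b) warmer, and heat loss climbs back TO its original 165

In both runs the surface ends up warmer, but the heat loss settles above its original level in case a) and merely recovers its original level in case b).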

The size of the heat gap and the time it takes to close it in each case will depend on the size and the rate of the change – how large is the change/divergence accomplished over how much time?

If the change is sudden and large, the closing process will take time. The imbalance will sustain for a considerable period.

If the change is small and gradual/incremental, the imbalance will hardly be observable, because the response mechanisms will have no problem keeping up with the continuous effort to open up the heat gap.


(From ’Earth's Climate Past and Future’ (W.F. Ruddiman, 2000).)

Anyway, at this point I think you understand what we need to look for to determine where the observed global warming came from.


Returning to points a) and b) above:

With direct surface heating (the Sun) the situation looks like this – heat IN + <--> heat OUT. Then the total surface heat loss will have to increase FROM its original level to be able to balance the increased input.

With INdirect surface heating (the atmosphere) the situation looks like this – heat IN <--> heat OUT –. Then the total surface heat loss will have to increase back up TO its original level to be able to mend the reduced output.

(In both cases, the surface temperature will rise in the meantime.)


This leads us to the following realization:

The total heat loss (the heat/net energy flux OUT) from the Earth’s surface during observed warming can, if:

The Sun is responsible for the warming, ONLY be observed to either remain constant/unchanged (if the increased forcing is very small and gradual) or to INCREASE.

And if:

The atmosphere is responsible for the warming, the total global surface heat loss can ONLY be observed to either remain constant/unchanged (if the increased forcing is very small and gradual) or to DECREASE.

Conversely, in the solar case, the total heat loss can NOT be observed to DECREASE during warming.

And, in the atmospheric case, the total heat loss can NOT be observed to INCREASE during warming.
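Expressed as code, the test reduces to a sign check on the trend of the summed heat-loss fluxes over the warming period. A hypothetical sketch, assuming the three series are given with heat loss counted positive:

    import numpy as np

    def attribute_warming(latent, sensible, net_ir):
        # Trend of total surface heat loss during an observed warming period.
        total_loss = np.asarray(latent) + np.asarray(sensible) + np.asarray(net_ir)
        slope = np.polyfit(np.arange(total_loss.size), total_loss, 1)[0]
        if slope > 0:
            return "heat loss increasing -> the Solar Method"
        if slope < 0:
            return "heat loss decreasing -> the Atmospheric Method"
        return "no trend -> forcing too small/gradual to distinguish"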


So there you have it. The test.

Has the total global heat loss from the Earth’s surface (latent heat transfer + sensible heat transfer + net IR) increased or decreased during the global warming period we’ve seen since the mid 70s?

Is the Sun or the atmosphere responsible?

Did the warming start at the surface or in the atmosphere?

Okular:
CLAUSIUS-CLAPEYRON

The Clausius-Clapeyron equation describes the relationship between the temperature of a liquid and its vapor pressure: The higher the temperature, the greater the vapor pressure.
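In its standard differential and integrated forms, with e_s the (saturation) vapor pressure, L the latent heat of vaporization and R_v the gas constant of the vapor (the integration assumes L roughly constant):

    \frac{de_s}{dT} = \frac{L e_s}{R_v T^2}
    \quad\Longrightarrow\quad
    e_s(T) = e_s(T_0) \exp\left[\frac{L}{R_v}\left(\frac{1}{T_0} - \frac{1}{T}\right)\right]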



This could be restated in the following way: the higher the temperature of the liquid, the greater the evaporation from its surface. Water, for instance, has a vapor pressure at 373 K of 1 atm. This means that at 100°C its vapor pressure exactly equals the atmospheric pressure at sea level and the water will start boiling – the upward pressure from the vapor being released from the water's surface at this point is as large as the constant downward pressure from (the weight of) the atmosphere above it. Water vapor will of course also be released from the water's surface long before it reaches this particular temperature. But the RATE of evaporation will always be related to the atmospheric pressure AND the temperature of the liquid – the rate of evaporation will be greater both if you increase the temperature AND if you lower the atmospheric pressure.
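A quick numerical check of the 373 K / 1 atm claim using the integrated form above. Taking a constant mean L of ~2.4 MJ/kg is my simplification; in reality L falls from ~2.50 to ~2.26 MJ/kg between 0°C and 100°C:

    import math

    L = 2.4e6               # J/kg, mean latent heat of vaporization (simplified)
    Rv = 461.5              # J/(kg K), specific gas constant of water vapor
    T0, e0 = 273.15, 611.0  # reference point: ~611 Pa at 0°C

    def e_sat(T):
        # Saturation vapor pressure (Pa) from integrated Clausius-Clapeyron.
        return e0 * math.exp((L / Rv) * (1.0 / T0 - 1.0 / T))

    print(e_sat(373.15))    # ~1.0e5 Pa, i.e. roughly 1 atm: water boils at 100°C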

Clausius-Clapeyron also, by extension, says something about the amount of water vapor a certain parcel of air can hold at a specific temperature/pressure configuration before it becomes saturated and the vapor starts condensing (dew).

Notice however, the operative word here is ’CAN’, not ’MUST’. It is NOT the case that a volume of humid air at say sea level HAS TO contain so and so much water vapor at a certain temperature. It has the POTENTIAL of containing so and so much water vapor. But other factors regulate how much water vapor it REALLY contains.

There seems to be a great deal of confusion here.

People seem to think that as the atmosphere grows warmer, it will automatically ’suck’ more water vapor out of the oceans, making this a positive feedback to the atmospheric warming. If this is NOT what they think, there really is no way to make head or tail of it all …

This approach of course turns the causal relationships all on their head. The Clausius-Clapeyron equation clearly states that it’s the LIQUID's temperature that matters. It is the LIQUID's temperature that decides how much vapor will be available.

In other words, it is the global sea surface temperatures that decide how much water vapor will be released into the troposphere, NOT the tropospheric temperature.

No, rather we have the following causal chain:

Rise in SST -> evaporation rate increases -> latent heat is transferred from the ocean to the troposphere, being released upon condensation -> rise in tropospheric temperature.

Evaporation from the ocean surface is a NEGATIVE feedback to ocean warming, not a POSITIVE feedback to tropospheric warming. In fact, the latent heat released upon condensation of the evaporated water is what CAUSES the warming of the troposphere in the first place – as a direct response to the original surface heating.



(From http://oceanworld.tamu.edu/resources/oceanography-book/heatbudgets.htm.)

No, as one recalls, the ONLY means by which the atmosphere can bring about warming of the Earth's surface is by reducing and then limiting its total heat loss. If that heat loss were to increase past its original level (the rate of heat loss before the forcing started working), then that would put a stop to the atmospheric forcing mechanism – it would no longer work. Crossing this line would mean the surface would no longer be able to accumulate the constant heat coming in from the Sun by means of reduced/limited heat loss. It would start cooling again.

It’s that easy. Plain and simple.

This means that ’an enhanced greenhouse effect’ would have to work by restraining the rate of evaporation rather than by accelerating it.

Otherwise it simply would not work.

Okular:
I had a look at the ERA Interim Reanalysis data (of the ECMWF) on the KNMI Climate Explorer regarding the four surface energy fluxes (net downward shortwave radiation (pos.), net outgoing longwave radiation (neg.), latent heat transfer (neg.) and sensible heat transfer (neg.)) from 1979 to 2012. And got some very interesting results. It turns out that of the three negative fluxes (regulating the rate and magnitude of heat loss from the surface) only the change in latent heat transfer really matters. Also, they’ve all grown more negative (more efficient in ridding the surface of heat, that is) globally during the modern warming. As one would expect.

The key seems to be in the latent heat transfer. Not (at all) in surface thermal radiation.

Here is net global surface solar radiation (SSR) from 1979 to 2012 (ERA Interim of the ECMWF – data downloaded from KNMI Climate Explorer):


Here are the other net global surface energy fluxes – sensible heat (green), thermal radiation (STR) (red) and latent heat (blue):


Subtracting the sum of the three outgoing net fluxes from the incoming net solar flux gives this net surface energy balance curve for the Earth as a whole from 1979 to 2012:


Robustly positive all along, yet still trending unmistakably downward and now finally getting pretty close to perfect balance – maybe within 3-5 years we’re there, crossing the line … The mean imbalance between incoming and outgoing (1979-2012) is +7.22 W/m^2 (which sounds like a lot).
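For the record, the operation behind these curves is simple once the four flux series are downloaded from the KNMI Climate Explorer. A sketch with invented placeholder numbers standing in for the real 1979-2012 annual means:

    import numpy as np

    def surface_balance(ssr, str_, slhf, sshf):
        # ERA-Interim sign convention: downward positive, so the three
        # loss fluxes (thermal, latent, sensible) come in as negative numbers.
        net = np.asarray(ssr) + np.asarray(str_) + np.asarray(slhf) + np.asarray(sshf)
        return net.mean(), np.cumsum(net)   # mean imbalance, running total

    mean_imb, accumulated = surface_balance(
        ssr=[165.2, 165.0, 164.9],    # net surface solar radiation (placeholder)
        str_=[-53.0, -53.1, -53.2],   # net surface thermal radiation (placeholder)
        slhf=[-83.0, -83.5, -84.0],   # latent heat flux (placeholder)
        sshf=[-22.0, -22.1, -22.2])   # sensible heat flux (placeholder)
    print(mean_imb, accumulated)      # W/m^2; the cumsum is the 'running total' curve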

This is still according to the ECMWF of course.

Just out of curiosity I made a running total on the data behind the plot above. It came out like this:


So the funny thing is, even though all of Earth’s net surface heat loss fluxes have steadily increased in strength/efficiency (become more negative) since 1979 (sensible heat by ~0.8 W/m^2, STR by 0.8-1 W/m^2 and latent heat by ~6 W/m^2, for a total of 7.6-7.8 W/m^2), and with the mean net solar input upon the global surface today pretty much equal to what it was in 1979, Earth has been accumulating a LOT of energy/heat. The global solar input has simply been larger over the last 34 years than the output from Earth’s surface, the heat loss processes working hard to catch up.

And that’s the funny bit. According to AGW theory, what would cause the energy imbalance is a DEcrease in the total net upward heat flux from the surface. For instance, in a theoretical steady state, with solar IN (considered static) exactly balanced by IR+latent+sensible OUT, more GHGs would indirectly lessen the total heat flux from the surface, making it less negative (more positive), which would then create the observed positive imbalance. But this theoretical course of events is quite the opposite of what apparently actually happens in the real world. Here the IR flux, the sensible heat flux and the latent heat flux are all increasing as a function of surface temperature. Or should we say, as a function of the increasing difference/divergence between the surface temperature and that of the air layer directly above it.

If the standard AGW hypothesis were right, the lapse rate should lift the mean temperature level off the ground with increasing concentration of GHGs in the atmosphere. That is to say, the incremental same-temperature levels would be situated gradually higher, from the tropospheric mean emission height on down to the surface. This means that in the end, the layer of air just above the ground/sea would warm independently of the surface (a tiny bit) and would thereby, at any given instant, reduce the temperature gradient between the air and the surface, reducing the total net heat flux from the ground/sea.

For this to be the case, though, the temperature gap between the surface and the air layer adjacent to it must either be observed to DEcrease or to remain stable (they both warm in step). If this gap were rather observed to INcrease, this whole construct would crumble. Then the surface cannot be the follower. Then the surface is the driver. Which is what all common sense is telling us is the case. Look at these two graphs:



This is ICOADS SST vs. ICOADS Tair. The first graph covers a large chunk of the Pacific Ocean (30N-40S, 150E-100W). The second a significant part of the North Atlantic (62N-0, 60-15W). Watch how the SST trends are distinctly steeper than the Tair trends in both diagrams. How would an air layer colder than the surface, and warming at a lower rate at that, force the warming of the surface? It couldn’t. And it doesn’t. And this agrees with the ERA Interim Reanalysis data.

Finally, I did the same operation for the tropical Pacific fluxes as I did for the global ones. Here is the result, directly compared to the global fluxes (tropical Pacific (24N-24S, 120E-80W) in black, global in red). From top to bottom – solar, sensible, IR and latent. Watch how much more positive the solar flux is in the Pacific and, accordingly, how much more negative the latent heat flux. For the other two fluxes the difference seems inconsequential:

Okular:
I’ll refer to Dee et al. 2011, “The ERA-Interim reanalysis: configuration and performance of the data assimilation system”, for a thorough discussion of the ERA Interim project. What comes out plainly when reading the document is that the reanalysis model has overestimated the mean surface solar input:

--- Quote ---Due to a programming error in the calculation of incident solar radiation as a function of solar zenith angle, the global solar radiation in ERA-Interim is overestimated by about 2 W/m^2.
--- End quote ---

and

--- Quote ---For solar irradiance, ERA-Interim uses a constant value of 1370 W/m^2 throughout, i.e. no account is taken of the solar cycle. Variations due to the varying distance between the Earth and the Sun are incorporated as described in Paltridge and Platt (1976).
--- End quote ---

According to the newest satellite estimates, the mean solar irradiance is ~1361.7 W/m^2 (http://wattsupwiththat.com/2011/01/14/total-solar-irradiation-tsi-value-lower-in-2008/). The range in total irradiance between high and low within each cycle is ~1 W/m^2 with PMOD and ~1.5 (1-2) W/m^2 with ACRIM.



Disregarding the solar cycle amplitudes won’t affect the long-term average (over several cycles). It will, however, affect the decadal variation. That means the graphs I’ve presented (based on ERA Interim) show less variation than reality. The 8.3 W/m^2 (1370 − 1361.7) difference between model assumption and real-world measurements is significant. It will probably overestimate the average energy input from the Sun at Earth’s global surface by ~1 W/m^2.
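The scaling behind that ~1 W/m^2 estimate, assuming the ~164 W/m^2 mean surface solar input simply scales in proportion with TSI:

    \Delta F_{surf} \approx \Delta TSI \times \frac{F_{surf}}{TSI} = 8.3 \times \frac{164}{1361.7} \approx 1\ W/m^2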

Dee et al. continue:

--- Quote ---The energy balance at the top of the atmosphere in ERA Interim has improved, with an estimated energy loss of 1.2 W/m^2 (7.4 W/m^2 for ERA 40). However, the energy balance at the surface boundary is poor in ERA Interim, with a global value of 6.9 W/m^2 (3.8 W/m^2 for ERA 40). This degradation occurs primarily over oceans and is associated with an increase in net solar radiation there. Over land the surface energy balance actually improves in ERA Interim, to 0.5 W/m^2 (1.3 W/m^2 for ERA 40).

Källberg (2011) suggests that the model clouds are the major contributor to the imbalance in surface energy, based on a correspondence between spin-up/spin-down of cloudiness and of the net energy fluxes.
--- End quote ---

My own calculated mean value for the global energy balance (1979-2012) turned out to be +7.22 W/m^2. Dee et al. find a +6.9 W/m^2 imbalance (1979-2010).

Based on the quotes above it seems justified to adjust the ERA Interim solar input down by 2 + 1 = 3 W/m^2. This would reduce the global net energy imbalance 1979-2012 to +4.22 W/m^2 (second graph below), which actually sounds AND looks much more plausible than the original +7.22 (first graph below):



Note how in the lower graph (the ‘new and improved’ +4.22 one) we’re already very close to perfect balance and have been so for a few years, quite on the verge of crossing the line into negative territory.

Here are the running totals (accumulated energy) for the +7.22 and the +4.22 scenarios:



We’re obviously at the summit plateau.

What’s very interesting to observe is how the evolution of Earth’s energy balance seems to follow the same pattern as ENSO East (NINO3.4). One might imagine an oceanic equilibrium line, across which the Earth system fluctuates in giant cycles. Below the equilibrium line the ocean’s heat loss is on average greater than the input from the Sun; there is a net loss of energy content. Above the equilibrium line the situation is reversed; there is a net buildup of energy content. The main regulating mechanism seems to be the rate of evaporation from the ocean surface.

From the 70s to the 80s this equilibrium line was somehow crossed. The Earth system shifted from a negative to a positive balance. And here’s the take-home message: After the shift is completed, the trend starts falling at once, on its way back towards the equilibrium line. The initial divergence is gradually and steadily reduced. But the positive energy imbalance is still there all along. Energy is accumulating in the system, only at a slowing rate until it finally reaches zero. We’re very close now to that point.

Compare this to the MEI curve. What do we see?


A mighty upward shift in 1976/77. Before this, the curve generally runs below the zero line. After, it is generally above. But what about the trend? It starts falling directly from 1977 onwards. It’s basically negative all the way till today. Yet the world has become warmer and warmer during this same period. For a few years now the MEI/NINO3.4 curve has fluctuated around the zero line, straddling the border between El Niño and La Niña dominance.

Coincidence?

Okular:
E.M. Smith - Tropopause Rules

I find it peculiar that everyone is so taken in by the whole notion of the so-called ’radiative greenhouse effect’ being such an ingrained necessity, such a self-evident, requisite part, as it were, of our atmosphere’s inner workings. The ’truth’ and the ’reality’ of the effect are completely taken for granted, a priori. And yet, the actual effect is still only a theoretical construct.

In fact, when looking at the real Earth system, it’s quite evident that this effect is not what’s setting the surface temperature of our planet.

The whole thing is actually pretty obvious.

The Earth, a rocky sphere at a distance from the Sun of ~149.6 million kilometers, where the Solar irradiance comes in at 1361.7 W/m2, with a mean global albedo, mostly from clouds, of 0.3 and with an atmosphere surrounding it containing a gaseous mass held in place by the planet’s gravity, producing a surface pressure of ~1013 mb, with an ocean of H2O covering 71% of its surface and with a rotation time around its own axis of ~24h, boasts an average global surface temperature of +15°C (288K).

Why this specific temperature? Because, with an atmosphere weighing down upon us with the particular pressure that ours exerts, this is the temperature level the surface has to reach and stay at for the global convectional engine to pull heat away from it fast enough to balance the particular averaged-out energy input from the Sun that we experience.

It’s that simple.

The higher the net energy input at the surface, the higher the potential surface temperature and the more powerful the convection – the higher the tropopause is located (pushed). This is all intimately connected and readily observable – compare the tropics with the polar regions.

The convectional engine simply transports the heat generated by the Sun at the surface up into the atmosphere as far as gravity allows. From above this level the heat will then be freely radiated back out into space.

This level is the border between the troposphere and the stratosphere. In the first one, convection reigns supreme. In the second, above the convection top (set by the surface atmospheric pressure, gravity and net energy input/heat), radiation takes over. Radiation does of course work all the way from the surface to space. But in the troposphere it’s just there – a second order heat transport mechanism in the midst of all the latent and sensible heat transport lifting the surface heat up and away from the ground to maintain balance.

There simply is no way radiation can ever control what’s going on temperaturewise at the surface, nor at the tropopause or anywhere in between. This can only occur where there is no convection – above the tropopause.

The surface is directly convectively coupled to the atmosphere above it. The atmosphere is warmed mainly by surface processes, primarily through latent and sensible heat transfer (deep convection). This is the ’blanket effect’ of the atmosphere. It insulates the surface simply by being warm. And by weighing down upon it, restricting evaporation and convection.

The dayside of the Earth receives vast amounts of solar energy, much more than it needs to reach ’actual’ daytime temperatures. The excess energy, mainly in the tropics, goes into driving the atmospheric circulation (through convection), bringing part of the heat north and south … and to the nightside. That’s why the Earth has much cooler days and much milder nights than the Moon – the constant spreading around of received heat. The atmosphere in this way simply evens out the temperature highs and lows and as a consequence raises the mean surface temperature of the Earth compared to that of the Moon.

The Sun manages perfectly well to heat the surface of the Earth to the +15°C that we experience.

The whole idea that the Sun on its own could only warm the surface to 255 K (−18°C) is a total illusion. This is the simple point that Joe Postma is making. I find it hard to grasp why people are so opposed to his realistic model and cling so hard to the unnatural notion of the evened-out solar input. There simply is no need or room for any ’extra’ atmospheric heating. That is, beyond the fact that it’s there, having a mass and a heat capacity, being warmed by and convectively (and hence, by temperature gradient) bound to the surface.
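For reference, the disputed 255 K (−18°C) figure is nothing more than the standard effective-temperature calculation with the solar input evened out over the whole sphere:

    T_e = \left[\frac{S(1-\alpha)}{4\sigma}\right]^{1/4} = \left[\frac{1361.7 \times (1-0.3)}{4 \times 5.67\times10^{-8}}\right]^{1/4} \approx 255\ K

The division by 4 is exactly the ’evened out solar input’ that Postma objects to.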


=== === ===


Ok. Examining the observational data from the real world, then, is there any specific AGW signal in the global (or regional) temperature record of the Earth during the modern warming era?

Or is something else, a naturally occurring phenomenon, pulling the strings ...?
