31 May 2010

What causes the CO2 increase?

What causes the CO2 increase of 110 parts per million from 1850 to 2010?  Is it human-made or natural?


The Keeling Curve is perhaps the premier piece of global warming propaganda.   


Its coincidence with the graph of human emissions over the same 50-year period contributes to the impression that humans are causing the carbon dioxide increase.

Dave Keeling searched for years before finding the site with the desired properties at the summit of Mauna Loa in Hawaii. 

CO2 is denser than air and tends to stay near the ground rather than at high altitudes.  Strong daily circulation currents reach up about 1–2 km.  The Mauna Loa site in Hawaii was chosen to sit above this zone of maximum CO2 activity.  Elevation has a smoothing effect on CO2 variations.


See also: Wisconsin Tower (below).

Keeling carefully selected his site at 3,400 metres so that the curve would be naturally smoothed, making it appear to coincide with human CO2 emissions since 1850.

It is also selected to be in the middle of the world's largest ocean so as to be dominated by the ocean signal. 
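The smoothing effect described above behaves like a low-pass filter. As a toy illustration only (the numbers are synthetic, not measurements; the seasonal amplitude, noise level and averaging window are my own assumptions), a moving average over a noisy ground-level signal collapses its spread the way a remote, elevated, ocean-dominated site would:

```python
import math
import random

# Synthetic ground-level CO2: a smooth seasonal cycle plus large
# short-term noise. All values are illustrative, not real data.
random.seed(0)
surface = [390 + 5 * math.sin(2 * math.pi * d / 365) + random.gauss(0, 20)
           for d in range(365)]

def moving_average(xs, window):
    """Simple boxcar smoothing: the mean of each `window`-long slice."""
    return [sum(xs[i:i + window]) / window
            for i in range(len(xs) - window + 1)]

smoothed = moving_average(surface, 30)

spread_raw = max(surface) - min(surface)
spread_smoothed = max(smoothed) - min(smoothed)
print(f"raw spread: {spread_raw:.0f} ppm, smoothed spread: {spread_smoothed:.0f} ppm")
```

The smoothed series retains the seasonal cycle and the long-term mean while discarding exactly the kind of short-term, near-ground variation the post is concerned about.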



All of the measuring sites of the Scripps Institution of Oceanography are located close to the ocean.  

This is for a reason: oceans readily dissolve carbon dioxide, smoothing the natural variations and giving rise to the so-called background amount.  The oceans control the amount of CO2 in the air through an equilibrium of emission and absorption governed by the equation:

CO2(g) <==> CO2(aq)
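A minimal sketch of this equilibrium uses Henry's law, which says the dissolved concentration is proportional to the partial pressure in the air above the water. The constant below is a textbook approximation for 25 °C and the 390 ppm figure is an assumed atmospheric level; neither comes from the post:

```python
# Henry's law: [CO2(aq)] = K_H * p_CO2 at equilibrium.
K_H = 3.3e-2    # mol/(L*atm), approximate Henry's constant for CO2 at 25 C
p_co2 = 390e-6  # atm, assumed atmospheric partial pressure (~390 ppm)

dissolved = K_H * p_co2  # mol/L of CO2(aq)
print(f"[CO2(aq)] = {dissolved:.2e} mol/L")
```

Anything that shifts this equilibrium, such as water temperature, moves CO2 between air and ocean; that is the lever the rest of the argument pulls on.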

The CO2 measured at Mauna Loa is technically true for that area (notwithstanding artificial adjustments, which I'll explain below), but to claim that it's representative of the whole Earth is global warming propaganda.  There can be quite substantial variations in the lower troposphere that Mauna Loa never picks up.

CO2 doesn't mix well and can be patchy.  It can take 50 years for CO2 from the northern hemisphere to reach the southern hemisphere.  (See: Study of hemispheric CO2 timing...).

When you read the so-called global average given on the NOAA Earth System Research Laboratory website, what you're getting is the average of marine surface sites.  As I mentioned, due to carbon dioxide's high solubility in water, the oceans smooth variations in the signal, and this is the reason for the site selection.  There could be all sorts of CO2 variation over land and we wouldn't know, based on this specially prepared analysis by Scripps, NOAA, etc.

On land the CO2 can vary by 100ppm in a day. E.g. compare Diekirch, Luxembourg to Mauna Loa:

It can change by ~100ppm in minutes depending on wind speed and direction.

At the Wisconsin Tower, CO2 concentrations can be very high on summer mornings.

So, while it's true that Mauna Loa measures a background level, that background ignores the fluctuations that occur in the lower 2 km of the troposphere and over land.  Mauna Loa is good for trends but masks much of the true story close to the ground.

Historical measurements show much less smoothness and continuity than the specially smoothed Mauna Loa curve.  Note the very flat, linear shape at Mauna Loa in the middle of the Pacific (green line) compared with sites near the Atlantic and on high continental mountains (red line):
Furthermore, the Mauna Loa site allows adjustments to be made whenever CO2 levels go out of bounds, with the variations blamed on the nearby Mauna Loa volcano or on the wind.  The Mauna Loa record, and all contemporary carbon dioxide measurements made by Scripps and other authorities, exclude CO2 variations outside a certain bound, so CO2 measurements appear much smoother than they really are.  Again, nothing deceptive in and of itself, but presenting this as the whole story for the whole world is disingenuous, because there is a conscious attempt to match the measured CO2 curve to human industrial output.

There are around 38,000 billion tonnes of carbon in the ocean waters.



Judging by the area under the following graph, the total cumulative contribution of humans from 1850 to today is approximately 1.18 trillion tonnes of CO2.
 
Using the above chart of carbon fluxes and stores and converting it to a CO2 equivalent, there are 150.33 Tt of CO2 in the ocean, air and soils.  This is the CO2 that's in regular flux in the biosphere.

1.18 Tt CO2 (human) ÷ 150.33 Tt CO2 (nature) = a 0.78% increase.  Given the rapid solubility of CO2 in water, a 0.78% increase cannot lead to a 40% increase in the air.
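The arithmetic above can be reproduced directly. The 41 Tt of carbon and the 44/12 molar-mass conversion are my reconstruction of where the 150.33 Tt CO2 figure comes from, since CO2 weighs 44/12 times as much as the carbon it contains:

```python
# Cumulative human CO2 since 1850, as estimated in the post.
human_tt_co2 = 1.18  # Tt CO2

# Carbon circulating in ocean, air and soils, converted to CO2 by the
# molar-mass ratio 44/12. The 41 Tt C input is a reconstructed assumption.
nature_tt_c = 41.0
nature_tt_co2 = nature_tt_c * 44.0 / 12.0  # Tt CO2

share_pct = human_tt_co2 / nature_tt_co2 * 100.0
print(f"natural store: {nature_tt_co2:.2f} Tt CO2, human share: {share_pct:.2f}%")
```

This reproduces the post's 150.33 Tt and 0.78% figures, though note it compares a cumulative human flow against a natural stock; the comparison itself is the post's, not mine.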

With nature's huge sources and sinks is it not more reasonable to suggest that a small change in the natural equilibrium causes the CO2 increase?




Swedish climate expert Dr. Fred Goldberg estimates that humans contribute 4% to the carbon dioxide that's currently in air.

The rise of 110 ppm in CO2 since 1850 can be fully explained by supposing a 1 °C increase in ocean temperature, via the reduced solubility of CO2 in warmer water.
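The temperature side of this claim can be sketched with the van 't Hoff dependence of Henry's constant. The 2400 K coefficient is a commonly quoted textbook value for CO2, and the 15 °C sea-surface temperature is my assumption. The sketch quantifies only the change in solubility per degree; whether that translates into the full 110 ppm depends on how much of the ocean store re-equilibrates with the air:

```python
import math

# van 't Hoff form of the temperature dependence of Henry's constant:
#   kH(T2) = kH(T1) * exp(C * (1/T2 - 1/T1)),  C ~ 2400 K for CO2.
C = 2400.0        # K, textbook approximation for CO2
T1 = 288.15       # K, assumed mean sea-surface temperature (~15 C)
T2 = T1 + 1.0     # K, after 1 C of warming

ratio = math.exp(C * (1.0 / T2 - 1.0 / T1))  # solubility(T2) / solubility(T1)
print(f"solubility change for +1 C: {100.0 * (ratio - 1.0):.1f}%")
```

The result is a few percent less dissolved CO2 per degree of warming; against a ~38,000 Gt carbon ocean store, that is the order of magnitude the argument leans on.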




Real Climate wants us to believe that nature causes the large seasonal variation in CO2 but has nothing to do with the yearly increase.  If Real Climate were right and δ13C were a human signature, then the graph of δ13C would look like a fairly straight line, like human emissions.  Instead, it takes the shape of the natural seasonal variations.
 

Past CO2 concentrations varied by 100 ppm on their own, without human intervention.  It appears that ocean heat releases the CO2 with a time delay of about 800 years.

And a study by Jaworowski in Antarctica suggests that past CO2 variations were even greater than 100 ppm and have been minimised by contamination of ice core samples and by CO2 loss from the ice over time.  (More references here)

The oceans are nowhere near saturated with CO2 and have plenty of capacity to absorb ours.  Our cumulative emissions are small: less than 1% of the natural ocean store.  Given the ocean's massive reserve and high flux with the atmosphere, dissolved CO2 and carbonates in the ocean seem the most likely source of the CO2 increase.

09 May 2010

Is warmth bad?

Let us assume global warming is real.  Is such warmth bad?

There's no correlation between warmer temperatures and bad weather.  Yes, there is more energy with more heat, but that doesn't necessarily mean bad weather.

The decade just passed was supposedly the warmest in 130 years, yet there was no corresponding increase in hurricanes, tornadoes or bad weather.





GISS global temp followed by two graphs of hurricane activity.

Tornadoes in the US are decreasing:

There's no anecdotal or physical evidence to suggest that the medieval warm period had worse weather than today.  In fact, all accounts are that warmth is good.

Some AGW adherents fear melting glaciers.  Warmth does melt glaciers; however, a glacier is an equilibrium between ice loss and precipitation.  Increased melting can be offset by increased precipitation.

Greenland is losing some mass at the edges but is gaining mass in the middle due to precipitation.
 

This is exactly what you'd expect from warmer temperatures: ice loss at the edge but more replenishment on top from snow.  I've seen AGW graphs that show the ice loss at the edges but ignore the replenishment on top from snow. 

The real danger is dryness and that's what the ice ages are associated with.

The New World Order environmental world government should have stuck with their global cooling scare of the 1970s.  We have much to fear from an ice age.  During ice ages the climate oscillates wildly by 8 degrees Celsius.



During an ice age there's less moisture to create the negative feedback mechanisms that keep the temperature in a small range: clouds, water vapour and evaporation.  The oceans become lower and saltier.  There is less rain.  There is more hardship.  There are huge ice sheets that chew up land in the northern hemisphere.  There would be millions of climate refugees.  An ice age is everything that Al Gore says global warming would be, but for real: gloom, doom, war, famine and hardship.

This current interglacial is the respite from the elements humans needed to domesticate plants and animals and create civilisation.  Of course, the warmers want to turn back the clock on human civilisation, consumption and comfort and so on, so it's no surprise to hear them shun life-giving warmth for death-conducive cold.

Earth has been 10 °C warmer through most of the last 500 million years:


(Source: Scotese with my own annotations.)

The dinosaurs thrived under hot conditions for 180 million years.

In the long run we would be better off if the polar ice caps melted completely.






Since Antarctica settled over the South Pole 60 million years ago, the world has been beset by cold and a more variable climate.  There's nothing that AGW or anything else can do to move Antarctica off the pole any time soon and thus warm the climate to a more desirable level.

Even the IPCC has doubts about the threats posed by warmth, because it's not clear what the effect of cloud feedbacks is.  There was also their recent embarrassing back-down on linking catastrophic weather events to global warming.

AGW scientists admit that global warming is uneven; their climate models show there would be more warming at the poles than the tropics.
FIG: Surface air temperature change in the two solstice seasons when using doubled-CO2 sea surface temperatures as calculated in the GISS (DBL CO2) and GFDL (ALT) models circa early–mid-1980s. [From Rind (1987).]
Even CRU/hockey team mouthpiece Realclimate admits it:
"..global warming is expected to warm the polar regions faster than the lower latitudes, hence reducing the meridional (north-south) temperature differences (gradient)"
And a smaller temperature difference between cold and warm areas makes for calmer weather.

So, taking hurricanes as an example: AGW sources often talk of how increased sea surface temperatures increase the likelihood of hurricanes.  But, that's only part of the story.  It's not just the temperature of the water but the temperature of the air as well.  It's the difference between the water and the air:
"A hurricane's strength depends on the temperature contrast between the ocean's water and the air high in the atmosphere, at the storm's top. If the world's oceans warm up, it's also possible that the upper atmosphere will also warm up. The result could be temperature contrasts much like those now. In other words, warmer oceans wouldn't necessarily mean more frequent or stronger hurricanes." - Doyle Rice, the USA TODAY weather editor.
Without cold air blasting out of the north there won't be enough temperature difference between the warm, moist air rising from tropical seas and the air higher up to allow clouds to form.  Without water vapour condensing to clouds there is no energy released from the latent heat of water vapour and hence no power for the hurricanes.

As long as the sea surface temperature is above 27 °C there is potential to form a hurricane.  However, if the air temperature rises along with the sea surface temperature, it takes power away from the hurricane.  Since AGW theory says the atmosphere warms the oceans, warming of the air must come before warming of the oceans, so there is no increase in the difference between the upper layers of air and the warm, humid air rising from the oceans.  That only one element of the hurricane issue, ocean temperature, is presented, and not the relative temperature needed for water vapour condensation, is typical of the one-sidedness of AGW science.
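The "difference, not absolute temperature" point can be put in heat-engine terms, as in potential-intensity theories of hurricanes where efficiency depends on the contrast between the sea surface and the storm outflow. The temperatures below are illustrative assumptions, not data:

```python
def carnot_efficiency(t_sea_k: float, t_outflow_k: float) -> float:
    """Thermodynamic efficiency of a heat engine running between the
    warm sea surface and the cold outflow at the storm top."""
    return (t_sea_k - t_outflow_k) / t_sea_k

today = carnot_efficiency(300.0, 200.0)    # ~27 C sea, ~-73 C outflow
warmed = carnot_efficiency(301.0, 201.0)   # both layers 1 K warmer

print(f"today: {today:.3f}  both warmed: {warmed:.3f}")
```

If only the sea warms, the efficiency rises; if both layers warm together, as in the USA TODAY passage quoted above, it barely changes.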

You could argue that natural cooling cycles in the air will exacerbate hurricanes given the ocean's overall warming.  But the oceans are cooling: Kevin Trenberth was mighty upset about the missing heat, missing because they believe in AGW yet the observations aren't showing the warming despite rising CO2 levels.  In any case, it's still not established that CO2 causes the upward trend in air temperatures.

AGW source environment.about.com admits there are other causes of the observed hurricane activity in the last few decades:
"Other scientists believe that the increase in severe hurricanes over the past decade is due to natural salinity and temperature changes deep in the Atlantic—part of a natural environmental cycle that shifts back and forth every 40-60 years."
It's linked to El Niño and other cycles.

Please stop this insane form of thermophobia -- the fear of warmth.  It's just an excuse to put forward a political agenda.

Update 15 July 2010:

An interesting article in the Vancouver Sun discusses how cold times in the past were bad for humans in China:

Cooling caused wars and drought in China - study (new link)

The article is laced with AGW overtones, despite the information being at odds with AGW, since it shows that warming is not bad -- cooling is.

Consider this: at higher latitudes, away from the tropics, the weather is worse in winter.  This is because the pole gets colder and there is a greater difference between the temperature of the pole and that of the tropics.

I was reminded of this by the story of the American sailor girl who hit rough weather in the Indian Ocean a few weeks ago and had to be rescued.  I thought to myself: why is the weather worse in the Southern Ocean during the southern winter?  According to AGW, more heat means worse weather, but winter has less heat and worse weather.  Then I realised it was the temperature difference.

Of course, that makes sense, because heat on its own doesn't do anything.  There must be a displacement of that heat through non-equilibrium with its surroundings.  More heat itself doesn't create energy displacement; differences do.

Hence my claim that temperature difference matters more than total temperature in creating bad weather is proven every six months with the changing of the seasons!