Editor's Notes:

Further review of the Wallace, Christy, and D’Aleo paper has occurred, and it is no doubt being chewed on over at skepticalscience.com, etc.    The article below resulted from Christy’s testimony before Congress in April of this year, in which he argued for removing CO2 from the EPA’s purview.

 

The first paragraph below is also meaningful: it underscores the importance of the phase change of water vapor (WV) into water droplets or clouds, and the fact that this release of energy occurs at about 10 km, much higher than where MODTRAN would say WV even occurs.

 

From Weekly Roundup - more on Christy and His Testimony

https://wattsupwiththat.com/2017/04/30/weekly-climate-and-energy-news-roundup-268/

 

The theory of latent heat is well tested. It is the absorption or release of energy through the phase change of a substance. For example, when liquid water at the surface is converted to a gas, water vapor, it absorbs heat without necessarily increasing in temperature. The heat absorbed changes the bonding energy among the molecules, resulting in the change of phase from liquid to gas. In the Charney hypothesis, the gas rises into the atmosphere until it condenses back to liquid water, releasing the energy as heat. This process would significantly amplify the warming caused by CO2. In the global climate models based on the Charney hypothesis, the release of latent energy is centered over the tropics at about 10 km (33,000 feet), at 250 to 200 mb of pressure. This is the so-called “hot-spot.”
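
For a sense of scale, here is a rough worked figure using the textbook latent heat of vaporization (a standard physics value, not a number from the article; L_v is about 2.26 MJ/kg near 100 °C and closer to 2.5 MJ/kg at typical condensation temperatures). Condensing one kilogram of water vapor releases

\[ Q = m L_v \approx (1\ \mathrm{kg}) \times (2.26 \times 10^{6}\ \mathrm{J/kg}) \approx 2.26\ \mathrm{MJ}, \]

which, at c_p ≈ 1005 J/(kg·K) for dry air, would be enough to warm a kilogram of air by over 2,000 °C if deposited all at once. This is why the phase change matters so much to the energy budget aloft.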

 

Using the Canadian Climate Model as an example, Christy gives a pictorial representation of the “hot-spot.” He outlines the area from the surface to 50,000 feet (about 15 km), making it clear where the pronounced atmospheric warming should occur, according to the modelers and the prevalent hypothesis. By keeping his analysis below 50,000 feet, Christy avoids confusing the principal issue with stratospheric cooling, for which there is no generally accepted explanation.

 

Christy shows that, in general, the global climate models (CMIP5), from 32 institutions, greatly overestimate tropospheric warming trends (50,000 feet or below). The number of simulations each institution contributes varies from one to eighteen. For the empirical data, Christy uses 3 different satellite datasets, 4 balloon datasets, and the average of 3 reanalysis datasets.  The different types of datasets correspond closely with one another, in contrast to the average of the models, which overestimates the observed warming by 2.5 to 3 times.

 

Clearly, the global climate models fail this basic test, and with them fails the hypothesis of a significant amplification of the effect of CO2 as encompassed in the Charney Report.  Following the procedures of the scientific method, they must be rejected until substantially revised.

 

Christy reveals that in its Fifth Assessment Report (AR5, 2013), the UN Intergovernmental Panel on Climate Change (IPCC) provided information that supports his conclusions.  In the Supplementary Material of Chapter 10, the report (Figure SM 10.1) showed that the tropical trends of climate models with greenhouse gases added failed to match actual trends, while climate models without greenhouse gases added agreed with actual trends. Christy simplified the material for his testimony (Figure 5). In short, the reasoning the IPCC offered elsewhere as proof of the strong influence of greenhouse gases was, in fact, proof of their weak influence.

 

Revised Paper by Wallace, Christy, and D’Aleo: In his testimony, Christy discusses the simple statistical model used in the August paper by Wallace, Christy, and D’Aleo. At the time of Christy’s testimony, the paper was being revised and strengthened.  The paper has been reviewed by several experts in the relevant sciences and statistics.

 

One of the major issues regarding the global climate models is their complexity. They involve multiple complex climate processes that have not been adequately resolved. Weather models also involve such processes, but they can be used to predict over short periods of time, a matter of days.  Much of the improvement in numerical weather prediction comes from improvement in measuring initial conditions. For meaningful climate predictions, initial conditions should be irrelevant. But if the current climate models are to be meaningful, a thorough understanding of the climate processes is necessary.

 

The widely accepted Kiehl-Trenberth Annual Global Mean Energy Budget Model (TWTW March 11 & 18) gives an example of the annual global energy flows that must be known with a high degree of precision before meaningful predictions can be made from global climate models.  Although research has been ongoing for over 35 years, adequate measurements of these energy flows may take decades more.

 

The beauty of the simplified approach used by Wallace, Christy, and D’Aleo (WCD) is that bulk atmospheric and surface measurements are used.  Thus, detailed knowledge and measurement of the processes involved are not necessary.  The statistical techniques employed are widely used in economics and other fields.  Certainly, economics is not considered a precise science. But, at this point, neither is climate science.

 

The central issue remains: will a significant increase in CO2 result in dangerously higher surface temperatures?  The EPA’s finding that greenhouse gases endanger human health and welfare (the endangerment finding) asserts that it will.  But as seen above, both the hypothesis and the models the EPA relies on fail basic testing.

 

The WCD approach addresses changes in global average temperatures, both atmospheric and surface, using four explanatory variables: 1) changes in CO2; 2) changes in solar activity; 3) changes in volcanic activity; and 4) changes in the coupled ocean-atmosphere phenomenon called the El Niño Southern Oscillation (ENSO) as indicated in NOAA’s Multivariate ENSO Index.

 

Change in the first variable (CO2) is considered human-caused, or anthropogenic. The others are natural and unpredictable, chaotic. (Some solar experts may disagree that changes in the sun are unpredictable.) The key issue is whether the three natural variables adequately explain changes in temperatures, without needing the influence of CO2. A minimal sketch of this kind of regression appears below.
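
As a rough illustration of the kind of regression involved, here is a minimal sketch in Python using synthetic placeholder data. The series and coefficients are invented for illustration and are not the actual WCD datasets or model:

```python
# A minimal sketch of regressing annual temperature anomalies on natural
# explanatory variables, in the spirit of the WCD approach.  All series here
# are synthetic placeholders, NOT the actual WCD datasets.
import numpy as np

rng = np.random.default_rng(0)
n = 57  # e.g., annual values for 1959-2015

# Hypothetical stand-ins for the three natural explanatory variables:
enso = rng.normal(size=n)                # ENSO index (e.g., NOAA MEI)
solar = rng.normal(size=n)               # solar-activity proxy
volcanic = -np.abs(rng.normal(size=n))   # volcanic aerosols (a cooling influence)

# A hypothetical temperature series driven only by the natural variables:
temp = 0.3 * enso + 0.1 * solar + 0.2 * volcanic + rng.normal(scale=0.1, size=n)

# Ordinary least squares with an intercept:
X = np.column_stack([np.ones(n), enso, solar, volcanic])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
fitted = X @ beta
r2 = 1 - np.sum((temp - fitted) ** 2) / np.sum((temp - temp.mean()) ** 2)
print("coefficients (intercept, ENSO, solar, volcanic):", np.round(beta, 3))
print("R^2:", round(r2, 3))
```

The point of the design is visible even in this toy version: CO2 is deliberately left out of the explanatory variables, so any explanatory power comes from the natural factors alone.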

 

Fourteen different temperature datasets are analyzed, including balloon and surface data from 1959 to 2015 and satellite data from 1979 to 2015.

 

As Christy stated in his March written testimony:

The fact that this statistical model explains 75-90 percent of the real annual temperature variability, depending on dataset, using these influences (ENSO, volcanoes, solar) is an indication the statistical model is useful.  In addition, the trends produced from this statistical model are not statistically different from the actual data (i.e. passing the “scientific-method” trend test which assumes the natural factors are not influenced by increasing GHGs). This result promotes the conclusion that this approach achieves greater scientific (and policy) utility than results from elaborate climate models which on average fail to reproduce the real world’s global average bulk temperature trend since 1979.
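
The “trend test” Christy describes can be illustrated with a simple check of whether two series’ linear trends differ by more than their combined standard errors allow. The sketch below uses synthetic data and a plain z-test on the slopes; it is not claimed to be the exact procedure used in the paper:

```python
# A sketch of a simple trend-equivalence check: do an observed series and a
# modeled series have linear trends that are statistically indistinguishable?
# Synthetic data and a plain z-test on the slopes; the actual test in the
# paper may differ (e.g., in how it treats autocorrelation).
import numpy as np

def trend_with_se(y):
    """OLS slope of y against time, with its standard error."""
    t = np.arange(len(y), dtype=float)
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - 2)   # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)   # coefficient covariance
    return beta[1], np.sqrt(cov[1, 1])

rng = np.random.default_rng(1)
t = np.arange(57, dtype=float)
observed = 0.010 * t + rng.normal(scale=0.1, size=57)  # placeholder series
modeled = 0.011 * t + rng.normal(scale=0.1, size=57)   # placeholder series

b_obs, se_obs = trend_with_se(observed)
b_mod, se_mod = trend_with_se(modeled)
z = (b_mod - b_obs) / np.hypot(se_mod, se_obs)
print(f"trend difference z-score: {z:.2f}")  # |z| < ~2: not significantly different
```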

 

It is important to note that the WCD report brings up two econometric issues that may result in overestimates of the influence of human-released CO2 on surface temperatures: 1) multi-collinearity and 2) simultaneity.

 

Multi-collinearity arises when variables, including influences other than those directly considered, share nearly the same trend.  For example, over the time period considered, on an annual scale, CO2 is rising roughly linearly.  On a similar scale, certain other activities may have roughly linear trends causing increasing temperatures, which are then incorrectly attributed to CO2.  One such human activity is urbanization. A diagnostic sketch follows.
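
One common diagnostic is the variance inflation factor (VIF). Below is a hedged sketch using synthetic data and the statsmodels library; the series names (co2, urban, enso) are invented stand-ins, not the WCD data:

```python
# A sketch of diagnosing multi-collinearity with variance inflation factors
# (VIFs).  Two trending regressors (invented stand-ins for CO2 and
# urbanization) are nearly collinear, so their VIFs blow up.  Synthetic data.
import numpy as np
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(2)
n = 57
t = np.arange(n, dtype=float)
co2 = t + rng.normal(scale=0.5, size=n)            # roughly linear rise
urban = 0.9 * t + rng.normal(scale=0.5, size=n)    # a similar linear trend
enso = rng.normal(size=n)                          # no trend

X = np.column_stack([np.ones(n), co2, urban, enso])  # include a constant
for i, name in zip([1, 2, 3], ["co2", "urban", "enso"]):
    print(name, round(variance_inflation_factor(X, i), 1))
# A common rule of thumb treats VIF > 10 as a sign of serious collinearity;
# the trending pair will far exceed it, while enso stays near 1.
```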

 

Simultaneity would occur when an increase in temperatures from natural causes results in an outgassing of CO2 from the oceans, which may then be falsely attributed to humans.  Such CO2 outgassing can be seen in the ice cores from Antarctica, where increases in CO2 follow a general warming.  There are statistical ways to address such issues; one standard econometric remedy is sketched below.
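
A textbook remedy for simultaneity is two-stage least squares (2SLS) with an instrumental variable. The sketch below is entirely synthetic and illustrative; the “instrument” is an invented placeholder, and this is not claimed to be the WCD paper’s actual treatment:

```python
# A sketch of one standard econometric remedy for simultaneity: two-stage
# least squares (2SLS) with an instrumental variable.  Entirely synthetic;
# the "instrument" is an invented placeholder.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
instrument = rng.normal(size=n)   # drives CO2; assumed uncorrelated with the
                                  # natural temperature shock (e.g., emissions)
shock = rng.normal(size=n)        # natural temperature variation
co2 = instrument + 0.8 * shock + rng.normal(size=n)  # outgassing: shock -> CO2
temp = 0.5 * co2 + shock          # the true CO2 effect here is 0.5

# Naive OLS is biased upward because co2 is correlated with the error (shock):
ols_slope = np.polyfit(co2, temp, 1)[0]

# Stage 1: project co2 onto the instrument.  Stage 2: regress temp on that
# projection, which is purged of the simultaneous feedback.
s, c = np.polyfit(instrument, co2, 1)
co2_hat = s * instrument + c
iv_slope = np.polyfit(co2_hat, temp, 1)[0]
print(f"OLS: {ols_slope:.2f} (biased up), 2SLS: {iv_slope:.2f} (near 0.5)")
```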

 

Christy’s comment concerning the IPCC models is appropriate:

The over-warming of the atmosphere by the IPCC models relates to a problem the IPCC AR5 encountered elsewhere.  In trying to determine the climate sensitivity, which is how sensitive the global temperature is relative to increases in GHGs, the IPCC authors chose not to give a best estimate.  [A high climate sensitivity is a foundational component of the last Administration’s Social Cost of Carbon.] The reason? … climate models were showing about twice the sensitivity to GHGs than calculations based on real, empirical data. I would encourage this committee, and our government in general, to consider empirical data, not climate model output, when dealing with environmental regulations.

 

See links under Challenging the Orthodoxy and Defending the Orthodoxy.

 

 

Section for a video or follow-on comment

We should revisit occasionally what the proper role of government is.   While the Constitution gives a good sense of direction, we need to add a core set of principles in order to deal with the future.

 

So many want to engineer society, remove risk, and assist certain groups, rather than let individuals thrive and raise up communities.  Why?

 

Is democracy where we all "get it good and hard," or is it the best means to a free society?

 

Should we go along with the special interests, or make the government fulfill its proper role? What is that role, and how do we achieve it?

 

When do deficits and governments become too large?

 

Government is becoming more elitist while trying to sell corrections to problems it created. What makes this possible?

 
