ED Notes:

This article considers how good the CO2 measurements are. Is the Mauna Loa record really solid, and have the ice core samples been used properly?

Once again we are seeing doubt rise in how AGW believers use or even construct data. Check out Fig 1. The difference between these two curves is revealing. We have been assuming the Mauna Loa data is pretty solid and that CO2 is a well-mixed greenhouse gas. Both assumptions come into question in his thoughts below. Fig 2 also shows how the data was selected to get the right curve for AGW.

 

 

Bespoke Climate Science: Temperature and CO2 Data Made to Order

Anthony Watts / February 9, 2017

Guest opinion: Dr. Tim Ball

WUWT Article Link

 

From the beginning, the pattern of the Intergovernmental Panel on Climate Change (IPCC) and its proponents was to produce the science required for the political agenda. It began publicly with James Hansen providing the science Senator Timothy Wirth required for his 1988 Congressional Hearing. The entire process was set up to allow for the creation of bespoke science to determine the political decisions. The Conference of the Parties (COP), the political agency, acts on the faux science of the Summary for Policymakers, which is released before the Physical Science Basis. That is why, when the emails were leaked from the Climatic Research Unit (CRU) just before COP 15, they had to abandon the Kyoto Protocol.

 

The revelation by Dr. John Bates that the “pause-busting” graph produced by NOAA was manipulated was no surprise. It was just another piece of bespoke science produced to push forward the AGW agenda and the Paris Climate Agreement. Bates used strange terminology, saying the graph was “hyped” and based on unverifiable, misleading data. This is Orwellian Newspeak for saying it was deliberately falsified for a predetermined result. They cheated. Bates is not a whistleblower because he waited until he retired to speak out. It is likely he would still be silent if Hillary Clinton had been elected.

 

If he was such a good climate scientist, why didn’t he see the corrupted science that was going on for most of his career? The answer is a combination of two things: he didn’t know much about climate, and speaking out would have jeopardized his career and pension. I can’t repeat often enough the experience of German meteorologist and physicist Klaus-Ekhart Puls.

 

“Ten years ago I simply parroted what the IPCC told us. One day I started checking the facts and data – first I started with a sense of doubt but then I became outraged when I discovered that much of what the IPCC and the media were telling us was sheer nonsense and was not even supported by any scientific facts and measurements.  To this day I still feel shame that as a scientist I made presentations of their science without first checking it.”

 

Puls was on the outside, but Bates was on the inside. It makes a mockery of the claim that Bates was an expert and rigorous in the scientific method. That is not possible given the amount of conflicting information available to anybody who took even a cursory look. Apparently, everything Bates did assumed the science of anthropogenic global warming (AGW) was settled. There were enough pieces of conflicting, missing, or inaccurate evidence raising red flags everywhere. He ignored them all. Even now, I am unaware of him asking why Tom Karl took the actions he did. The answer is simple: the data was inadequate, but what there was didn’t fit the AGW hypothesis. It was essential to make the temperature go up, both to make their predictions correct and to fit the CO2 curve they were producing.

John Maynard Keynes said,

“When the facts change, I change my mind. What do you do, Sir?”

In official Intergovernmental Panel on Climate Change (IPCC) science, they change, ignore, or create ‘facts’ to achieve the desired result. Dr. Bates’s less-than-adequate actions and exposure of the Karl fiasco at least took the corruption outside the skeptic community. It is also less likely to be covered up, refuted, obfuscated, or contradicted under the new regime.

 

However, even in the skeptic community, it is unlikely to receive the proper emphasis because people are still loath to believe such a global deception can or did occur. It is global in geographic extent, but it is also global in the amount of data corrupted. Tom Karl’s corruption only dealt with the temperature of the last 20 years. The facts were not fitting the hypothesis all along. Karl’s problem was that temperature stopped increasing while CO2 continued to increase. It was a Huxley moment,

“The great tragedy of science, a lovely hypothesis destroyed by an ugly fact.”

 

The first and easiest defensive strategy only made things worse. They stopped calling it global warming and began calling it climate change. Proponents of the deception tried another tactic. Benjamin Santer said the pause was not statistically significant and would only qualify if it lasted 17 years. When that period came and went, more drastic measures were required. Panic set in, so they went to the fallback position: alter the instrumental record. Step forward the acknowledged master, Tom Karl. It was so obviously wrong that even Dr. Bates noticed. He claims his bosses ignored his protests. He should have gone public right then!

 

Some of us knew the global instrumental temperature record was being adjusted to fulfill the political AGW objective. We also knew the paleoclimate record was adjusted through the manipulations of the “hockey stick.” Ironically, that graph incorporated manipulated proxy records for the handle and manipulated instrumental records for the blade. It also incorporated another escape/excuse technique: the data goes missing. Phil Jones, who created the blade, lost his data. It is likely we will see more data (and even source code) go missing as the deception is exposed. Will the mainstream media continue to ignore it?

 

The manipulation and production of bespoke temperature data were also carried out with the CO2 data. I can anticipate all the trolls trying to defend the ‘official’ CO2 science, as they have done every time anyone presented individual pieces of contradictory evidence. In climate science, the violent reaction against an individual is a sure measure of how close that person is to exposing the truth. The parallels with the temperature manipulation are overwhelming: a mass of CO2 evidence was made to fit the scenario, yet it still produces failed predictions.

    Some support for that comment is in the following information created to mislead and misdirect the public:

  • Most people think CO2 is the most abundant and important greenhouse gas. In fact, water vapor is by far the most abundant.
  • Because CO2 is such a small percentage of the greenhouse gases, they created a measure called “climate sensitivity,” which claims that CO2 is more “effective” as a greenhouse gas than water vapor. The sensitivity number has consistently decreased, and many say it is zero. A few say it is a negative (cooling) agent.
  • Estimates of the amount of annual human CO2 production added to the atmosphere are produced by the IPCC. They claim it is about 3% of the annual total. The number consistently increases despite changes in the world economy.
  • They claimed the length of time CO2 remains in the atmosphere, known as the residence time, is at least 100 years. It is only 5 to 7 years (a rough version of the arithmetic is sketched after this list). I know there are arguments about what residence time means, but that is irrelevant because the IPCC used the 100-year value in their calculations of the Global Warming Potential (GWP) of CO2.
  • Their calculations added the human production of CO2 to the atmosphere but ignored the human portion removed. Agriculture and forestry probably remove 50 percent of total production.
  • Antarctic ice cores and all other records show temperature increasing before CO2. Most of the public still don’t know this, even though it was determined in the late 1990s.
  • They falsely assumed that CO2 was evenly distributed throughout the atmosphere. The OCO2 satellite that began gathering data in September 2014 disproved it.
  • They promoted, or at least didn’t contradict, the claim that CO2 is a pollutant. Carbon dioxide is essential for all life on Earth. Research shows current levels of 400 ppm are approximately one-third the optimum for most plants. Empirical evidence from CO2 levels injected into commercial greenhouses indicates optimum yields at levels between 1000 and 1200 ppm. Interestingly, this is the average level of the last 300 million years. The plants are malnourished at 400 ppm.
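
Where the residence-time bullet does its arithmetic, a minimal sketch may help. Residence time is simply the atmospheric stock divided by the gross annual exchange flux; the stock and flux figures below are illustrative round numbers of my choosing, not values given in the article, and the point is only the ratio, not the precise inputs.

```python
# Residence time ~ atmospheric stock / gross annual exchange flux.
# Both figures below are assumed, illustrative round numbers.
atmospheric_stock_gtc = 800.0   # assumed carbon held in the atmosphere, GtC
gross_annual_flux_gtc = 150.0   # assumed gross yearly exchange with oceans/biosphere, GtC/yr

residence_time_years = atmospheric_stock_gtc / gross_annual_flux_gtc
print(f"residence time ~ {residence_time_years:.1f} years")  # ~5 years

# A Global Warming Potential (GWP) calculation that instead assumes a
# 100-year lifetime weights CO2 far more heavily, which is the complaint above.
```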

 

The last point is important because the public is led to believe that CO2 levels are dangerously high, have never been higher, and any further increase is potentially catastrophic. The higher historic record of CO2 was as threatening to their AGW claims as the existence of previous warmer periods was to their temperature claims. This lie was necessary to support the AGW hypothesis. They needed a low pre-industrial level for an upward trend to match the growth of industrialization. They also needed to eliminate troubling natural variability in the record.

 

Figure 1 shows CO2 levels from ice cores and stomata (pores on leaves) over a 2000-year span. It appeared in 2002 and contradicted the official claims based solely on ice cores.

 

Figure 1

 

The average level for the 2000-year ice core record is approximately 265 ppm, with variability of about 10 ppm created by a 70-year smoothing average, while the stomata record averages 300+ ppm with 50 ppm of variability. The stomata data show higher readings and greater variability than the excessively smoothed ice core record that became the base point for the modern instrumental readings (see Figure 3).
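
To see what a 70-year smoothing does to year-to-year variability, here is a minimal sketch on a synthetic series. The series, its amplitude, and the noise level are all invented for illustration; this is not the actual ice core or stomata data, only the generic effect of the window length.

```python
import numpy as np

# Synthetic 2000-year CO2-like series: a flat baseline plus a slow cycle and
# year-to-year noise (all numbers invented for illustration).
rng = np.random.default_rng(0)
years = np.arange(2000)
series = 300 + 25 * np.sin(2 * np.pi * years / 60) + rng.normal(0, 10, years.size)

# 70-year moving average, the smoothing window discussed above.
window = 70
kernel = np.ones(window) / window
smoothed = np.convolve(series, kernel, mode="valid")

print(f"raw std:      {series.std():.1f} ppm")   # ~20 ppm of variability
print(f"smoothed std: {smoothed.std():.1f} ppm") # a few ppm at most
```

The smoothed series retains the baseline but loses almost all of the variability, which is the contrast the paragraph above draws between the ice core and stomata records.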

 

How well do ice core records represent the atmosphere for each year? The answer is that they are not representative, and you have no idea which year is involved. I spent many hours discussing the limitations with one of the best glacier experts, Dr. Fritz Koerner, one of the few people to drill in both the Arctic and Antarctic. He told me in the late 1980s that his Arctic records were showing temperature changing before CO2.

 

It takes decades for a bubble of air to be trapped in the ice, so there is no way of knowing which year, or even decade, is represented. The surface of a glacier is very wet, as melting occurs even in winter under direct sunlight. Once the bubble is formed, meltwater moving through the ice contaminates it. As Brent C. Christner reported in “Detection, Recovery, Isolation and Characterization of Bacteria in Glacial Ice and Lake Vostok Accretion Ice,” bacteria form in the ice, releasing gases even in 500,000-year-old ice at considerable depth. The pressure of overlying ice causes a change below 50 m: brittle ice becomes plastic and begins to flow. The layers formed with each year of snowfall gradually disappear with increasing compression. It requires a considerable depth of ice over a long period to obtain a single reading at depth. Then there are the problems of contamination and losses during drilling and core recovery.

 

One of the first people to suffer attacks for daring to identify the problems with the CO2 data was Professor Zbigniew Jaworowski. In a paper titled “Climate Change: Incorrect information on pre-industrial CO2,” he told a US Senate Committee on Commerce, Science, and Transportation hearing that,

“The basis of most of the IPCC conclusions on anthropogenic causes and on projections of climatic change is the assumption of low level of CO2 in the pre-industrial atmosphere. This assumption, based on glaciological studies, is false.”

 

“The notion of low pre-industrial CO2 atmospheric level, based on such poor knowledge, became a widely accepted Holy Grail of climate warming models.  The modelers ignored the evidence from direct measurements of CO2 in atmospheric air indicating that in 19th century its average concentration was 335 ppmv[11] (Figure 2). In Figure 2 encircled values show a biased selection of data used to demonstrate that in 19th century atmosphere the CO2 level was 292 ppmv[12]. A study of stomatal frequency in fossil leaves from Holocene lake deposits in Denmark, showing that 9400 years ago CO2 atmospheric level was 333 ppmv, and 9600 years ago 348 ppmv, falsify the concept of stabilized and low CO2 air concentration until the advent of industrial revolution [13].”

 

The modelers did know about the 19th century data because Tom Wigley, who took over as Director of the Climatic Research Unit (CRU) from Hubert Lamb, introduced it to the climate science community. I remember this well because his 1983 article, “The Pre-Industrial Carbon Dioxide Level” in Climatic Change, became a seminar in my graduate climate class. Wigley did what many others have done in manipulating the climate story: he cherry-picked from a wide range of readings, eliminated the high readings, and claimed the pre-industrial level was approximately 270 ppm. This influenced the modelers Wigley was working with at the CRU and the IPCC. He was the key person directing the machinations, as revealed by the leaked CRU emails.

 

There are some 90,000 direct instrumental measures, beginning in 1812, in the record Wigley analyzed. Scientists wanted to understand the composition and dynamics of the atmosphere but, unlike today, began by collecting data before theorizing. These scientists were not collecting the data to prove global warming or any other theory. They took precise measurements with calibrated instruments, as Ernst-Georg Beck thoroughly documented. These measures were as troubling as the famous Figure 7c from the 1990 IPCC Report that showed the Medieval Warm Period (MWP). In an obituary, I quoted Beck’s friend Edgar Gartner as follows:

Due to his immense specialized knowledge and his methodical severity Ernst very promptly noticed numerous inconsistencies in the statements of the Intergovernmental Panel on Climate Change (IPCC). He considered the warming of the earth’s atmosphere as a result of a rise of the carbon dioxide content of the air of approximately 0.03 to 0.04 percent as impossible. And he doubted that the curve of the CO2 increase noted on the Hawaii volcano Mauna Loa since 1957/58 could be extrapolated linearly back to the 19th century. (Translated from the German)

 

Beck sent me his preliminary research, and I supported his efforts but warned him of the vicious attacks he would experience. He wrote to me in November 2009, ironically the same month the emails were leaked from the CRU, to say,

In Germany the situation is comparable to the times of medieval inquisition.

 

Wigley was not the first to misuse the 19th century data, but he did reintroduce it to the climate community. British steam engineer Guy Stewart Callendar believed that increasing CO2 would cause warming. He did what Wigley and all pro-IPCC climate scientists have done: selected only readings that support the hypothesis.

 

Figure 2 After Jaworowski (Trend lines added by author)

 

Figure 2 shows how the data group selected by Callendar dramatically lowers the average from approximately 370 ppm to 270 ppm and alters the trend from decreasing to increasing.
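
As a toy illustration of how selection alone can do this, here is a sketch on synthetic readings. The baseline, the noise level, and the rising “target” curve used to pick points are all invented for demonstration; this is not Callendar’s actual data or procedure, only the generic effect of keeping the values that fit a hypothesis.

```python
import numpy as np

# Synthetic scattered 19th-century-style readings with a *declining* trend
# (all numbers invented for illustration).
rng = np.random.default_rng(1)
years = np.arange(1820, 1900)
readings = 370 - 0.5 * (years - 1820) + rng.normal(0, 35, years.size)

# Biased selection: keep only readings near a hypothetical rising ramp,
# standing in for "choosing results consistent with the hypothesis."
target = np.linspace(270, 300, years.size)
mask = np.abs(readings - target) < 25

for label, x, y in [("all readings ", years, readings),
                    ("selected only", years[mask], readings[mask])]:
    slope = np.polyfit(x, y, 1)[0]  # fitted linear trend, ppm per year
    print(f"{label}: mean = {y.mean():.0f} ppm, trend = {slope:+.2f} ppm/yr")
```

The full set averages well above 300 ppm with a falling trend; the selected subset averages near 280 ppm with a rising one, which is the pattern the figure attributes to Callendar’s encircled values.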

 

Ernst-Georg Beck confirmed Jaworowski’s research. An article in Energy and Environment examined the readings in great detail and validated his findings. In a devastating conclusion, Beck states,

“Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC. Review of available literature raise the question if these authors have systematically discarded a large number of valid technical papers and older atmospheric CO2 determinations because they did not fit their hypothesis?  Obviously they use only a few carefully selected values from the older literature, invariably choosing results that are consistent with the hypothesis of an induced rise of CO2 in air caused by the burning of fossil fuel.”

 

So the pre-industrial level is some 50 ppm higher than the level put into the computer models that produce all future climate predictions. The models also incorrectly assume uniform global atmospheric distribution and virtually no variability of CO2 from year to year.

Beck also found,

“Since 1812, the CO2 concentration in northern hemispheric air has fluctuated exhibiting three high level maxima around 1825, 1857 and 1942 the latter showing more than 400 ppm.”

 

He provided a plot (Figure 3) comparing 19th century readings with ice core and Mauna Loa data. You can see how the Mauna Loa record links smoothly onto the end of the ice core record. Compare that with the variability of the 19th century readings and the short OCO2 record. In fact, all the records show the variability, but it is statistically eliminated in both the ice core and Mauna Loa records. Variability is a basic statistical measure generally ignored by climate science because it indicates the problems with their hypothesis. Consider what is lost if a 70-year smoothing is applied to any other climate data set. It virtually eliminates the modern instrumental record, which is less than 70 years long for most weather stations.
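
As a quick check on that last sentence: a 70-year moving average only produces output where the full window fits, so a record shorter than the window disappears entirely. A minimal sketch, with the record span chosen for illustration to match the length of the Mauna Loa era:

```python
# An annual record shorter than the smoothing window (span is illustrative).
record_years = list(range(1958, 2017))   # 59 annual values
window = 70

# A moving average yields one output value per full window that fits.
valid_points = max(0, len(record_years) - window + 1)
print(valid_points)  # 0 -- a 70-year window leaves nothing of a 59-year record
```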

 

Figure 3

 

Smoothing is applied to the Mauna Loa and all current atmospheric readings, which can vary up to 600 ppm in a single day. Statistician William Briggs advises that you never, ever smooth a time series. The loss is greater if high readings are eliminated before the smoothing. Charles Keeling, a devout ‘CO2 is causing warming’ believer, built the Mauna Loa station to achieve and control the results. In “50 Years of Continuous Measurement of CO2 on Mauna Loa,” Beck wrote,

“Mauna Loa does not represent the typical atmospheric CO2 on different global locations but is typical only for this volcano at a maritime location in about 4000 m altitude at that latitude.”

 

It is on the side of a volcano with CO2 leaking through very porous ground for hundreds of square kilometers around the crater. Keeling used the lowest afternoon readings and ignored natural sources to create the measures required.

As Beck noted, the Keeling family “owns the global monopoly of calibration of all CO2 measurements.”   Keeling’s son is a co-author of the IPCC reports, and that agency accepts Keeling data as representative of global readings.

 

The IPCC and its proponents needed to control two variables, temperature and CO2, to create their AGW narrative. Both variables had to begin low at the start of the pre-industrial period and increase steadily to the present. Both had to be higher in the modern record than at any time before. Manipulation of the temperature records, long known about in the skeptic community, was recently exposed by an insider, Dr. John Bates.

 

The immediate attempts to downplay his revelation are proof that it is problematic. NOAA had already disclosed its typical reaction by shamefacedly using the intellectual-property argument when Congressman Lamar Smith subpoenaed the data. This dodge was originally recommended by Phil Jones to IPCC and CRU members in a February 6, 2004, email and exploited by Michael Mann. It is egregious and unacceptable because it prevents the essential scientific practice of reproducing results. Worse, it is work paid for by the taxpayer and then used to impose unnecessary taxes, rules, and regulations on those taxpayers. It is the greatest deception in history, and it only gets worse as bureaucrats and so many others with political and financial interests continue to defend the indefensible.

 

Caveat from Anthony: As I have said on other occasions, I don’t view the atmospheric CO2 work of Ernst-Georg Beck as being particularly accurate or useful, due to the uncertainty of his chemical reduction method and the fact that many of his measurements were done within cities, which have highly variable CO2 levels that aren’t representative of global values. In Figure 3, note the “local effective concentration” label. I simply don’t consider the 19th century measurements accurate enough to be credibly compared to current global values. – Anthony Watts

 

Section for a video or follow-on comment

We should revisit occasionally what the proper role of government is. While the Constitution gave us a good sense of direction, we need to add a core set of principles to deal with the future.

 

So many want to engineer society, remove risk, and assist certain groups rather than let individuals thrive and build communities. Why?

 

Is democracy where we all "get it good and hard," or is it the best means to a free society?

 

Should we roll with the special interests or make the government fulfill its proper role? What is that role, and how do we achieve it?

 

When do deficits and governments become too large?

 

Government is becoming more elitist while trying to sell corrections to problems it created. What makes this possible?

 


 

Currently, as a society, we are having a most difficult time discussing political issues. What is driving this? And why would a rebirth in political culture be a good thing?

 

Market Economy

Are "markets" dead as some would conjecture? Or is free enterprise what got us here?

 

Economic Theories

At the heart of economics there are several possible schools of thought. What is the essence of these schools, and how do they relate to our lives?

  
