Weekly Climate and Energy News Roundup #468 – Watts Up With That?

The Week That Was: 2021-08-28 (August 28, 2021)
Brought to You by SEPP (www.SEPP.org)
The Science and Environmental Policy Project

Quote of the Week: “Those who can make you believe absurdities can make you commit atrocities” – Voltaire [H/t Tony Heller]

Number of the Week: – The missing 97%


By Ken Haapala, President, Science and Environmental Policy Project (SEPP)

Scope: This TWTW will focus on three additional significant omissions, or holes, in the “Summary for Policymakers” of the Physical Science Basis of the Sixth Assessment Report (AR6) by the UN’s Intergovernmental Panel on Climate Change (IPCC). The Summary for Policymakers (SPM) is used to justify drastic changes in the use of fossil fuels. Given the gravity of such changes, the document should meet the highest scientific standards and demonstrate the scientific integrity of the IPCC.

Specifically, statistician Steve McIntyre, who exposed the statistical difficulties in Mr. Mann’s hockey-stick, illustrates how the IPCC ignores the Southern Hemisphere in the construction of the New Hockey-stick featured in the SPM. Separately, statisticians Ross McKitrick and William Briggs illustrate that, as presented, the new field of attribution analysis has no foundation in theoretical statistics. Thus, after-the-fact statements such as the one by World Weather Attribution that “At the Ahr River the flood is estimated to be a 500-year event or rarer according to preliminary data” are little more than hot air.

The Number of the Week, the missing 97%, is based on calculations by Physics Professor emeritus Howard Hayden using assertions in earlier IPCC reports. This 97% is critical for life on the planet as we know it because it prevents the land masses from entering a deep freeze at night. Yet it is not discussed in recent IPCC reports except in passing references deep within them.

TWTW will also include recent observations on renewable power generation in the Pacific Northwest and California. Wind power continues to bounce like a child on a pogo-stick, and solar power slumbers through the California nights.


Lacking Data: One of the criticisms against those who argue for the Medieval Warm Period is that the data is largely confined to Europe. This is correct, although data from China and South America showing the warm period are increasing. Nonetheless, those who assert the “lack of data” argument should be perplexed by the New Hockey-stick in the IPCC AR6 SPM. The study known as PAGES (2019) provides the foundation for the New Hockey-stick. However, the IPCC does not identify it as the source.

Over a number of years, Steve McIntyre has traced the studies called PAGES to a private group called Past Global Changes based in Bern, Switzerland. McIntyre refers to earlier versions of PAGES studies as well when he writes:

“The 30-60N latitude band [latband] gets lots of attention in paleoclimate collections – probably more proxies than the rest of the world combined. The 30-60S latitude band is exactly the same size, but it is little studied. It is the world of the Roaring Forties and Furious Fifties, a world that is almost entirely ocean. The only land is New Zealand, Tasmania and the southern coast of Australia facing Antarctica, the tip of South Africa and the narrow part of South America: southern Chile and Argentina. But 96% or so [of this latitude band] is ocean.

“No Ocean Proxies

“Although the 60-30S is almost entirely ocean, PAGES 2019 did not use a single ocean proxy in its data. They used only eight series (out of 19 in PAGES 2017): seven tree ring series: two from New Zealand (both less than 500 years), three from Tasmania (one long, two less than 500 years), and two from southern South America (both less than 500 years); and one weird lake sediment from Chile (a “singleton” proxy using pigments in the sediments).

“Only One Long Proxy

“Only one proxy in the network has values prior to AD750 and only two proxies have values prior to AD1450. Thus, the only information directly comparing medieval and modern values comes from these two proxies: Mt Read, Tasmania (a series used as long ago as Mann et al 1998 and Jones et al 1998 and many times since); and the Laguna Aculeo pigment series – neither of which have shapes remotely similar to the PAGES2K 60-30S latband reconstruction – see below. (The latband reconstruction was calculated from the enormous file at NOAA here).” [Not included in TWTW.]

After presenting the data, much of it cut short and not revealing the hockey-stick pattern, McIntyre comes to what could be called IPCC magic:

“The blade of the reconstruction HS [Hockey-stick] goes from -1 sigma in early 20th century to more than 4 sigma in 2000. Yet there is no comparable deviation in any of the underlying proxies. The three South American proxies and the long Mt Read, Tasmania tree ring chronology don’t have anything like a blade; the four short tree ring chronologies (two Tasmania and two New Zealand) increased sharply in the 20th century, but not enough to yield the PAGES 2019 HS. (These tree chronologies have been selected from a much larger candidate population – a screening process that already imparts a serious bias.)

“The only 30-60S proxy with a value in the year 2000 is Mount Read, which has a value of ~1 sigma. Yet the PAGES 2019 30-60S (CPS) reconstruction has a value of over 4 sigma. How did they do that?

“PAGES 2019 provide code for the generation of figures from reconstructions but didn’t archive the code for the generation of the reconstructions. (At least in the links provided in any of the articles.) So, it’s impossible to precisely diagnose what’s going on.

“Although PAGES proclaim the importance of public archiving as a selection criterion, only one of the tree ring chronologies (the long Mount Read chronology) can be firmly associated with ITRDB measurement data archives…”

Then comes the real deficiency of the IPCC:

“But most of all, given that the 60-30S latband is almost entirely (~96%) ocean, it seems bizarre that PAGES 2019 did not use any ocean core proxies, especially since there are physical formulas for estimating SST from alkenone or Mg/Ca measurements. Any conversion of tree ring widths to temperature in deg C is the result of ad hoc statistical fitting, not a universal formula. Alkenone values have been measured all over the modern ocean and nicely fit known ocean temperatures. In addition, alkenone values for ocean cores going back to deeper time (even to the Miocene [23 million years ago]) give a consistent and reproducible narrative. So, there’s a lot to like about them as a candidate for a “good” proxy.

“While there are numerous high-resolution (10-year resolution) alkenone and Mg/Ca measurements in the North Atlantic with values through the last millennium and up to the present, to my knowledge, there were not any such series as of PAGES 2013 or PAGES 2017. (In my opinion, IPCC AR5 ought to have noted this and suggested that this deficiency be remedied.)” [Boldface in original]
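The “physical formulas for estimating SST from alkenone … measurements” that McIntyre mentions are empirical calibrations; a sketch of one widely used version, the global core-top calibration of Müller et al. (1998), is below. The calibration constants and the example value are from that published relation, not from the PAGES data discussed here; regional calibrations differ slightly.

```python
# Sketch: converting an alkenone unsaturation index (Uk'37) to sea surface
# temperature (SST) with the Mueller et al. (1998) global core-top calibration:
#   Uk'37 = 0.033 * SST + 0.044   (SST in deg C)
# This is one common calibration among several; it is shown for illustration.

def alkenone_to_sst(uk37: float) -> float:
    """Invert the linear calibration to estimate SST (deg C) from Uk'37."""
    return (uk37 - 0.044) / 0.033

# Example: a core-top Uk'37 value of 0.50 implies an SST of roughly 13.8 C.
print(round(alkenone_to_sst(0.50), 1))  # -> 13.8
```

This is the sense in which alkenone proxies rest on a universal formula, in contrast to the ad hoc statistical fitting needed to turn tree ring widths into temperatures.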

The so-called “peer reviewed” data on which the IPCC claims it relies is lacking. After presenting ocean core data showing a cooling, which may have been dropped in the latest version of PAGES or truncated with the latest data removed, McIntyre concludes:


“Given that the 60-30S latband is almost entirely ocean, it seems logical that IPCC and PAGES2K should use data from ocean proxies to estimate past temperature in this latitude band. But this isn’t what they’ve done. Instead, they’ve purported to estimate past temperature from a few scattered tree ring chronologies, only one of which reaches earlier than AD1850; and an idiosyncratic singleton pigment series. Ironically, the only 30-60S proxy series in PAGES 2019 that reaches back into the first millennium – the Mount Read, Tasmania tree ring series – was used by Mann et al 1998-1999, Jones et al 1998 and numerous other supposedly “independent” multiproxy studies. Neither of the two series reaching back to the medieval period permits the conclusion that modern period is warmer than medieval period. Caveat: I’m not saying that it isn’t; only that this data doesn’t show it, let alone support the big-bladed HS cited by IPCC. High-resolution alkenone measurements from ocean cores offshore Chile show a consistent decrease in ocean temperatures over the past two millennia that is neither reported nor discussed by IPCC (or PAGES 2019). [Boldface is Italics in original.]

“To be clear, some of the technical articles on 30-60S ocean core proxies by specialist authors are truly excellent and far more magisterial than the IPCC mustered, in particular, several articles on offshore Chile. Here are a few:

“Mohtadi et al 2007. Cooling of the southern high latitudes during the Medieval Period and its effect on ENSO link

“Kilian and Lamy 2012. A review of Glacial and Holocene paleoclimate records from southernmost Patagonia (49-55°S) link

“Collins et al 2019. Centennial‐Scale SE Pacific Sea Surface Temperature Variability Over the Past 2,300 Years link” [links given in original]

With such omissions and deficiencies, the IPCC AR6 SPM lacks the scientific standards needed to be a credible source of scientific knowledge. The IPCC does not meet Richard Feynman’s standards for employing the scientific method and for scientific integrity. See links under Challenging the Orthodoxy – IPCC and Defending the Orthodoxy.
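The screening bias McIntyre mentions, selecting chronologies from a much larger candidate population, can be demonstrated with a short simulation. The sketch below is a generic illustration of the statistical point, not a reconstruction of the PAGES procedure: it generates random walks containing no climate signal at all, keeps only those that happen to rise in the final window, and averages them. The screened composite acquires an upward “blade” that the unscreened composite lacks. All parameter values are illustrative.

```python
import random

# Illustration of screening (selection) bias. Every "proxy" below is a pure
# random walk with no signal. Screening on recent uptrend alone manufactures
# a blade in the composite. Parameters are arbitrary choices for the demo.

random.seed(42)
LENGTH, N_SERIES, BLADE = 1000, 500, 100  # steps, candidates, screening window

def random_walk(n):
    x, out = 0.0, []
    for _ in range(n):
        x += random.gauss(0, 1)
        out.append(x)
    return out

walks = [random_walk(LENGTH) for _ in range(N_SERIES)]

# "Screen": keep only walks that rose over the final BLADE steps.
selected = [w for w in walks if w[-1] > w[-BLADE]]

def mean_series(series):
    """Pointwise mean of a list of equal-length series."""
    return [sum(vals) / len(vals) for vals in zip(*series)]

all_mean = mean_series(walks)
sel_mean = mean_series(selected)

# The screened composite rises sharply over the final window; the unscreened
# composite stays near flat. The blade is purely an artifact of selection.
print(round(sel_mean[-1] - sel_mean[-BLADE], 2),
      round(all_mean[-1] - all_mean[-BLADE], 2))
```

The design choice matters: because the screening criterion is correlated with noise in the final window, conditioning on it biases the composite upward there while leaving the earlier (unscreened) portion flat, which is exactly the hockey-stick shape.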


False Certainty: An article discussed in the July 10 TWTW carried the headline “How Scientists Are So Confident They Know What’s Causing This Insane Weather.” The article stated:

“According to legendary Princeton geoscientist Michael Oppenheimer, scientists are no longer guessing when it comes to tying extreme events like this to climate change, because a whole new field now exists that aims to tie a nice, neat bow around these very questions.

“’There is now a well-developed science of ‘event attribution’ which deals with uncertainty,’ Oppenheimer told The Daily Beast. (His own research over the years has focused on what the specific hazards of climate change will be, not necessarily event attribution.)

“Here’s Oppenheimer’s explanation of how event attribution scientists do their jobs: They use Fractional Attribution of Risk (FAR), which he said is “the fraction of the intensity of an event (like a heatwave) that can be attributed to human-made greenhouse gases.” For example, event attribution scientists calculated the FAR on 2017’s Hurricane Harvey—after the fact—and it had, Oppenheimer explained, about two times the intensity that would have been the case without the greenhouse gases at 2017 levels. That gave Harvey a FAR score of 0.5.”
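The Harvey numbers quoted above are consistent with the standard definition of FAR (usually expanded as Fraction of Attributable Risk), FAR = 1 − P0/P1, where P1 is the probability or intensity measure of the event in the current climate and P0 its counterpart in a counterfactual climate without the added greenhouse gases. A minimal sketch of that arithmetic:

```python
# Sketch of the FAR arithmetic behind the quoted Harvey figure.
# FAR = 1 - P0/P1, where P1/P0 is the ratio of the event's probability
# (or intensity measure) with versus without the anthropogenic forcing.

def fraction_of_attributable_risk(ratio: float) -> float:
    """FAR from the ratio P1/P0 (must be > 0)."""
    return 1.0 - 1.0 / ratio

# A ratio of 2, as quoted for Hurricane Harvey, yields a FAR of 0.5.
print(fraction_of_attributable_risk(2.0))  # -> 0.5
```

Note that the entire calculation hinges on P0, the counterfactual probability, which can only come from a model of the unforced climate; that dependence is precisely what Briggs and McKitrick criticize below.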

The process is disturbing because there appears to be no theoretical basis for assigning a statistical probability to an event after it has occurred when no one was able to predict such an event beforehand. Earlier, the Global Warming Policy Foundation published a report by statistician William Briggs titled “The Climate Blame Game: Are we really causing extreme weather?” In the summary Briggs writes:

“Claims made in so-called climate change event attribution studies suffer from gross over-certainties and cannot be trusted. The techniques used in these studies are in their infancy and do not warrant the trust put into them. These studies assume either (a) perfect forecasting models, or (b) known, uncertainty-free causes of climate change. Neither condition holds. Because of this, attribution claims are far too certain or are wrong. They should not be used in any policy decisions.”

In an April 14 post on his blog, Briggs writes:

“In order to attribute individual weather events to humankind, scientists need a perfect model of the climate. They do not have this. Therefore, claims that we are responsible for any particular weather event are at best overconfident, if not plain wrong.”

“Attribution studies assume that the weather has been getting worse, yet empirical observations do not support this generic assumption.”

Econometrician Ross McKitrick, who, with Steve McIntyre, exposed Mr. Mann’s hockey-stick as based on a shoddy understanding of statistics, has taken the criticism of event attribution to the theoretical level, showing that the underlying process, called “optimal fingerprinting,” lacks a sound statistical foundation. He traces optimal fingerprinting to a paper by Myles Allen and Simon Tett published in Climate Dynamics in 1999. On August 10, 2021, the same journal published McKitrick’s criticism, “Checking for model consistency in optimal fingerprinting: a comment.” The abstract states:

Allen and Tett (1999, herein AT99) introduced a Generalized Least Squares (GLS) regression methodology for decomposing patterns of climate change for attribution purposes and proposed the “Residual Consistency Test” (RCT) to check the GLS specification. Their methodology has been widely used and highly influential ever since, in part because subsequent authors have relied upon their claim that their GLS model satisfies the conditions of the Gauss-Markov (GM) Theorem, thereby yielding unbiased and efficient estimators. But AT99 stated the GM Theorem incorrectly, omitting a critical condition altogether, their GLS method cannot satisfy the GM conditions, and their variance estimator is inconsistent by construction. Additionally, they did not formally state the null hypothesis of the RCT nor identify which of the GM conditions it tests, nor did they prove its distribution and critical values, rendering it uninformative as a specification test. The continuing influence of AT99 two decades later means these issues should be corrected. I identify 6 conditions needing to be shown for the AT99 method to be valid.

The Gauss-Markov (GM) theorem is named for mathematicians Carl Friedrich Gauss (1777-1855) and Andrey Markov (1856-1922). It is critical for Generalized Least Squares (GLS) regression analysis, a generalization of Ordinary Least Squares, to produce meaningful results. Among other issues, it requires that the errors in the analysis be uncorrelated, have equal variances, and have an expected value of zero. Failure to meet these requirements renders the results meaningless. [This form of statistics is separate from Bayesian statistics.]
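The practical cost of violating a GM condition can be shown with a short simulation. This is a generic illustration of the statistical point, not a reproduction of McKitrick's analysis of AT99: when errors are autocorrelated (violating the uncorrelated-errors condition), the textbook standard error that assumes independent errors badly understates the true uncertainty, producing exactly the kind of false confidence at issue. All parameters are illustrative.

```python
import random
import statistics

# Illustration: estimating a mean from AR(1) (autocorrelated) errors.
# The naive standard error s/sqrt(n) assumes uncorrelated errors (a GM
# condition). The true spread of the estimator, measured across many
# repeated trials, is roughly 3x larger here, so naive confidence
# intervals are far too narrow. Parameters are arbitrary demo values.

random.seed(0)
N, RHO, TRIALS = 200, 0.8, 2000

def ar1_sample(n, rho):
    """Draw n values from an AR(1) process x_t = rho*x_{t-1} + noise."""
    x, out = 0.0, []
    for _ in range(n):
        x = rho * x + random.gauss(0, 1)
        out.append(x)
    return out

means, naive_ses = [], []
for _ in range(TRIALS):
    y = ar1_sample(N, RHO)
    means.append(statistics.fmean(y))
    naive_ses.append(statistics.stdev(y) / N ** 0.5)

true_se = statistics.stdev(means)       # actual spread of the estimator
naive_se = statistics.fmean(naive_ses)  # what the independence-based formula claims
print(round(true_se / naive_se, 1))     # roughly 3x overconfidence here
```

For AR(1) errors the understatement factor approaches sqrt((1 + rho)/(1 − rho)), which is 3 at rho = 0.8; the simulation recovers that value empirically.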

Writing in Judith Curry’s “Climate Etc.,” McKitrick addresses the issues he raises in a less mathematical form than in the paper. He also gave Allen and Tett the opportunity to comment. McKitrick anticipates potential objections:

“1. Yes but look at all the papers over the years that have successfully applied the AT99 method and detected a role for GHGs. Answer: the fact that a flawed methodology is used hundreds of times does not make the methodology reliable, it just means a lot of flawed results have been published. And the failure to spot the problems means that the people working in the signal detection/Optimal Fingerprinting literature aren’t well-trained in GLS methods. People have assumed, falsely, that the AT99 method yields “BLUE” – i.e., unbiased, and efficient – estimates. Maybe some of the past results were correct. The problem is that the basis on which people said so is invalid, so no one knows.

“2. Yes but people have used other methods that also detect a causal role for greenhouse gases. Answer: I know. But in past IPCC reports they have acknowledged those methods are weaker as regards proving causality, and they rely even more explicitly on the assumption that climate models are perfect. And the methods based on time series analysis have not adequately grappled with the problem of mismatched integration orders between forcings and observed temperatures. I have some new coauthored work on this in process.

“3. Yes, but this is just theoretical nitpicking, and I haven’t proven the previously published results are false. Answer: What I have proven is that the basis for confidence in them is non-existent. AT99 correctly highlighted the importance of the GM theorem but messed up its application. In other work (which will appear in due course) I have found that common signal detection results, even in recent data sets, don’t survive remedying the failures of the GM conditions. If anyone thinks my arguments are mere nitpicking and believes the AT99 method is fundamentally sound, I have listed the six conditions needing to be proven to support such a claim. Good luck.” [Boldface added]

Until McKitrick’s six conditions are addressed, there is little reason to assume that studies based on Event Attribution or Optimal Fingerprinting are meaningful. This field in climate studies shows, again, that climate researchers and their journals need people who understand the foundations and limits of statistical techniques. See links under Challenging the Orthodoxy.


Pogo-stick Power: For almost two months, TWTW has followed the changes in wind power generation reported by the Bonneville Power Administration (BPA), whose territory includes the Columbia River Gorge “where the wind always blows.” The total nameplate capacity is 27,879 MW, of which 79.5% is hydro and 10.5% (2,930 MW) is wind. [The balance is from other sources such as nuclear, gas, biomass, etc.]

The failures in wind generation are glaring. On August 22, wind generation peaked at about 2,700 MW, then fell to zero by noon on August 23. It bounced up to 500 MW in the early afternoon of August 24 before falling to near zero in the early morning of August 25. By midnight it had climbed to around 2,000 MW, where it bounced around until late morning on August 27 before falling to near zero by midnight. It has stayed near zero since.

The rapid changes in wind power force rapid changes in hydropower to balance the load, matching generation with demand. Hydropower often varies rapidly between about 4,000 MW and about 10,000 MW. Only specially designed hydro turbines can take the stress of such rapid change.

Wind power in South Australia in August (winter) is just as erratic. Why anyone thinks such erratic power is suitable for modern civilization is beyond belief. There is no utility-scale backup in operation, and estimates of costs are not available. See links under Energy Issues—Australia and Energy Issues—US.


California Slumbers: The California Independent System Operator (CAISO) maintains daily graphs of electricity supply and demand, including for specific types of renewables. Over the past few days, natural gas generation varied from 18,000 MW down to 9,000 MW, imports varied between 10,000 MW and 5,000 MW, and solar varied between zero and 12,000 MW to meet a total demand of 26,000 to 28,000 MW. Over the period observed, without major storms, solar power was more predictable than wind power: it fails every evening.

New York used to be called the city that never sleeps. If it follows California in solar power, it and Los Angeles may be called the cities that always slumber. See link under Energy Issues—US.


Next TWTW: The above emphasized significant deficiencies in IPCC AR6, particularly in the Summary for Policymakers. These include the use of data whose origin is not known and certainly not peer-reviewed, and the false confidence expressed in Event Attribution or Optimal Fingerprinting.

The next TWTW will continue with similar deficiencies, such as the IPCC’s significant misinterpretation of the greenhouse effect, and whatever Steve McIntyre may post in his continuing review of the New Hockey-stick. Of significant interest is what the IPCC claims about ice core borings in the Antarctic, which show that variations in carbon dioxide followed variations in temperature for hundreds of thousands of years. This is contrary to the IPCC’s New Hockey-stick.


14th ICCC: The 14th International Conference on Climate Change presented by The Heartland Institute will be October 15 to 17, 2021, at Caesars Palace in Las Vegas. See https://climateconference.heartland.org/


Number of the Week: – The missing 97%: Using the “Simplified expression of radiative forcing for the trace gas carbon dioxide” found in the Third Assessment Report, Howard Hayden calculates that the radiative forcing from a doubling of CO2 is 3.7 Watts per square meter (W/m2). Yet the IPCC calls this tiny amount a forcing.

The “heat flow chart” in the Fifth Assessment Report gives the thermal outgoing radiation at the surface as 398 W/m2 and the thermal outgoing radiation at the top of the atmosphere as 239 W/m2. The latter figure is close to what is being found by satellites in the ongoing CERES experiment.

The difference between the thermal radiation at the surface and outgoing radiation at the top of the atmosphere is 159 W/m2, which keeps the average temperature of the earth about 34°C warmer than it would be without greenhouse gases. This is particularly important because as explained by John Tyndall following his experiments starting in 1859, greenhouse gases prevent the land masses from going into a deep freeze each night, killing all growing plants.

If we add the 3.7 W/m2 (forcing) to the 159 W/m2 (necessary for life) and divide the 3.7 W/m2 (forcing) by the total, we find that the forcing equals only about 2.3% of the greenhouse effect with a doubling of CO2. Forcing is a catchy name for 2.3% of something, but there is no catchy name for the other 97.7% of the greenhouse effect which is so necessary for life. Suggestions are welcome. Email Howard Hayden at [email protected].
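Hayden's arithmetic can be checked directly from the figures quoted above. The Third Assessment Report's simplified expression for CO2 forcing is dF = 5.35 × ln(C/C0) W/m2, and the AR5 heat-flow numbers put the total greenhouse effect at 398 − 239 = 159 W/m2:

```python
import math

# Sketch of the "missing 97%" arithmetic using the figures quoted above.
# TAR simplified expression for CO2 forcing: dF = 5.35 * ln(C/C0) W/m^2.
forcing_2xco2 = 5.35 * math.log(2)   # doubling of CO2: ~3.71 W/m^2

# AR5 heat-flow chart: surface thermal radiation minus top-of-atmosphere.
greenhouse_effect = 398 - 239        # 159 W/m^2

# Share of the (augmented) greenhouse effect contributed by the doubling.
share = forcing_2xco2 / (greenhouse_effect + forcing_2xco2)

print(round(forcing_2xco2, 2), f"{share:.1%}")  # -> 3.71 2.3%
```

The complement, about 97.7%, is the portion of the greenhouse effect that Hayden notes goes essentially undiscussed.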

Commentary: Is the Sun Rising?

Where The Sun Don’t Shine: Climate Alarmists’ Thinking

By I & I Editorial Board, Aug 20, 2021

Challenging the Orthodoxy — NIPCC

Climate Change Reconsidered II: Physical Science

Idso, Carter, and Singer, Lead Authors/Editors, Nongovernmental International Panel on Climate Change (NIPCC), 2013

Summary: https://www.heartland.org/_template-assets/documents/CCR/CCR-II/Summary-for-Policymakers.pdf

Climate Change Reconsidered II: Biological Impacts

Idso, Idso, Carter, and Singer, Lead Authors/Editors, Nongovernmental International Panel on Climate Change (NIPCC), 2014


Summary: https://www.heartland.org/media-library/pdfs/CCR-IIb/Summary-for-Policymakers.pdf

Climate Change Reconsidered II: Fossil Fuels

By Multiple Authors, Bezdek, Idso, Legates, and Singer eds., Nongovernmental International Panel on Climate Change, April 2019


Download with no charge:


Why Scientists Disagree About Global Warming

The NIPCC Report on the Scientific Consensus

By Craig D. Idso, Robert M. Carter, and S. Fred Singer, Nongovernmental International Panel on Climate Change (NIPCC), Nov 23, 2015


Download with no charge:


Nature, Not Human Activity, Rules the Climate

S. Fred Singer, Editor, NIPCC, 2008


Global Sea-Level Rise: An Evaluation of the Data

By Craig D. Idso, David Legates, and S. Fred Singer, Heartland Policy Brief, May 20, 2019

Challenging the Orthodoxy

The IPCC’s attribution methodology is fundamentally flawed

By Ross McKitrick, Climate Etc., Aug 18, 2021

Link to paper: Checking for model consistency in optimal fingerprinting: a comment

By Ross McKitrick, Climate Dynamics, Aug 10, 2021


Propaganda Masquerading As Climate Science

By William Briggs, His Blog, Aug 25, 2021

Link to report: The Climate Blame Game: Are We Really Causing Extreme Weather?

By William M Briggs, GWPF, 2021

April 14 post: https://wmbriggs.com/post/35291/

Boris Johnson’s wind delusion poses national security risk

By Staff, GWPF, Aug 8, 2021

Link to paper: The Workable Alternative to Net Zero

By Capell Aris and John Constable, GWPF, 2021


The settled science of fingerprinting

By John Robson, Climate Discussion Nexus, Aug 25, 2021

[SEPP Comment: On McKitrick’s paper, above]

The West Is a Fire Plain. Get Over It.

By Randal O’Toole, Liberty and Ecology, Aug 23, 2021 [H/t Jane Stroup]

Unsettling the apple cart VI: Koonin on apocalypses that ain’t

By John Robson, Climate Discussion Nexus, Aug 25, 2021

“Continuing University of Guelph professor Ross McKitrick’s look at Steven E. Koonin’s landmark book Unsettled: What Climate Science Tells Us, What it Doesn’t, and Why it Matters.”

Challenging the Orthodoxy – IPCC

PAGES2019: 30-60S

By Stephen McIntyre, Climate Audit, Aug 26, 2021

Source link
