There are different methods for measuring global temperatures. The satellite record, as compiled by meteorologists at the University of Alabama in Huntsville (UAH), is used by the United Nations' Intergovernmental Panel on Climate Change. It is the temperature time series most often quoted by sceptics of anthropogenic global warming.
Republican Senator Ted Cruz made much of the 18-year-long pause in this record during his cross-examination of Sierra Club President Aaron Mair at a US Senate subcommittee hearing late last year.
Mr Mair, like most climate justice activists, was unfamiliar with this evidence and had no idea that this record showed no warming since the super El Niño of 1997-1998. Rather than provide an explanation of the trends in any of the global temperature datasets, Mr Mair could only explain that global warming was real because there was a scientific consensus that global warming was real. Of course a "consensus" is a form of politics, and while it can deny a fact, it can't actually change one. Whether or not there is a trend in a series of numbers is determined by statistics, not consensus or opinion.
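By way of illustration, here is a minimal sketch of how such a question is settled: fit a linear trend by ordinary least squares and ask whether the slope is statistically distinguishable from zero. The anomaly numbers below are made up for the purpose of the example, not the actual UAH series.

```python
# Sketch: testing whether a temperature anomaly series has a
# statistically significant linear trend (illustrative data only).
import numpy as np
from scipy import stats

years = np.arange(1998, 2016)                    # an 18-year window
rng = np.random.default_rng(0)
anomalies = 0.2 + rng.normal(0.0, 0.15, years.size)  # flat series plus noise

result = stats.linregress(years, anomalies)
print(f"trend: {result.slope * 10:+.3f} C/decade, p-value: {result.pvalue:.3f}")
# If the p-value exceeds the conventional 0.05 threshold, the null
# hypothesis of "no trend" cannot be rejected -- a statistical verdict,
# independent of anyone's opinion.
```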
Until this year, there was no warming trend in the UAH satellite data. Then the February 2016 update to the database showed a surge in global temperatures, particularly in the northern hemisphere. This has been attributed to an El Niño event, exactly the same phenomenon that caused the surge in temperatures in 1997-1998. So, the pause has been broken, and the cause is not carbon dioxide.
As a colleague emailed me, "The extent of the observed increase in global temperatures is out of all proportion to the increase in carbon dioxide for the same period." That's correct. El Niño events are not caused by carbon dioxide. They are natural events which manifest as changes in ocean and atmospheric circulation patterns across the Pacific Ocean. An El Niño typically begins with a weakening of the trade winds and is marked by a fall in air pressure over Tahiti and a rise in surface pressure over northern Australia, as measured at Darwin.
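The standard measure of this see-saw is the Southern Oscillation Index. What follows is a simplified sketch along the general lines of the Troup SOI used by the Australian Bureau of Meteorology; the real index standardizes each calendar month against its own long-term climatology, which is omitted here, and the pressure values shown are illustrative.

```python
# Sketch: a simplified Southern Oscillation Index from monthly mean
# sea-level pressures (hPa) at Tahiti and Darwin.
import numpy as np

def troup_soi(tahiti_mslp, darwin_mslp):
    """Simplified Troup-style SOI: the standardized Tahiti-minus-Darwin
    pressure difference, scaled by 10."""
    diff = np.asarray(tahiti_mslp, dtype=float) - np.asarray(darwin_mslp, dtype=float)
    return 10.0 * (diff - diff.mean()) / diff.std()

# Illustrative values only.
tahiti = [1012.1, 1011.3, 1009.8, 1010.5]
darwin = [1008.4, 1009.0, 1010.2, 1008.9]
print(troup_soi(tahiti, darwin))
# Sustained strongly negative values (pressure low at Tahiti, high at
# Darwin) indicate El Nino conditions; strongly positive values, La Nina.
```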
While the UAH satellite data only goes back to 1979, there is a record of changes in air pressure at Darwin back to 1876. This time series indicates that super El Niños occurred in 1914/1915 and again in 1940/1941. Surface temperature records, as measured with mercury thermometers at some locations in eastern and northern Australia, suggest that temperatures in 1914/1915 and 1940/1941 were hotter than they are now.
So, while the global satellite temperature data indicates that February 2016 is the hottest month on record, this pertains to a record that only goes back to 1979. If we consider the much longer surface temperature record for many individual locations across Australia and other parts of the world, February 2016 is not that hot.
But this is an exceedingly contentious claim, rejected by a "consensus" of climate scientists who rely exclusively on homogenized temperature series. That is, the early temperature records are almost all adjusted down through the homogenization process, so the present appears warmer relative to the past. It is not contested that the official surface temperature records are adjusted, or that the early records are generally cooled. This is justified mostly on the basis of "world's best practice" and the expectation that temperature series should show global warming from at least 1910. I've explained the inconsistencies in the adjustments made in the homogenization of the original observed maximum temperatures at Darwin in a technical paper originally accepted for publication in the International Journal of Climatology (see postscript for more information).
Back in 1876, the weather station at Darwin was the responsibility of Charles Todd, Australia's Postmaster General and Superintendent of Telegraphs. His priority until 1872 had been the construction of an overland telegraph line from Adelaide to Darwin, along which he established 14 regional weather stations. Todd's first passions were meteorology and astronomy, both of which he drew on in his weather forecasting.
Air pressure measurements were important to the weather forecasts issued by Todd, and these measurements were standardized against local temperature readings. It was thus important that local temperatures were accurately recorded. While these records stood for over 100 years, beginning in 1996 the Australian Bureau of Meteorology started "adjusting" all the old temperature records used in the calculation of official temperature trends, including the maximum temperature series for Darwin.
The homogenization process tends not only to create a global warming trend where none previously existed, but also to remove the natural cooling and warming cycles so evident in the raw observational data. For example, in the unadjusted maximum temperatures recorded at Darwin, the hottest year is 1907. Temperatures then appear to cool until 1942, when there is a spike. Note that 1941/1942, like 1997/98 and 2015/2016, were El Niño years. These were also years of minimum lunar declination.
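A toy calculation, using entirely synthetic numbers, shows the mechanism: stepping the early part of a series down at a breakpoint can manufacture a warming trend from a trendless record.

```python
# Toy illustration (synthetic data): cooling the records before a
# hypothetical adjustment breakpoint steepens the fitted warming trend,
# even when the underlying series is trendless.
import numpy as np
from scipy import stats

years = np.arange(1907, 2015)
rng = np.random.default_rng(1)
raw = 32.0 + rng.normal(0.0, 0.4, years.size)  # trendless maxima, degrees C

adjusted = raw.copy()
adjusted[years < 1941] -= 0.6  # hypothetical step: cool all pre-1941 values

for label, series in (("raw", raw), ("adjusted", adjusted)):
    slope = stats.linregress(years, series).slope
    print(f"{label:9s} trend: {slope * 100:+.2f} C/century")
# The raw trend is near zero; the adjusted series acquires a warming
# trend that is an artefact of the step, not of the observations.
```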
Unlike modern meteorologists, Todd understood that climate change on Earth is driven by extraterrestrial phenomena. But he would likely have cautioned against single-cause explanations, recognizing that there are multiple and overlapping periodicities evident in the history of the Earth's climate. There are natural cycles that span tens of thousands of years, affected by changes in the Earth's tilt, and much shorter cycles affected by changes in solar activity. Early 20th-century astronomers and weather forecasters, particularly Inigo Owen Jones, were interested in the planets. They noted decades in advance that 1940/41 would be a year of Jupiter, Saturn and Uranus conjunction.
Todd would have outlawed the practice of homogenization. Scientists of that era considered the integrity of observational records sacrosanct.
In an online thread I recently read the following comment about homogenization:
Don't you love the word homogenise? When I was working in the dairy industry we used to have a homogeniser. This was a device for forcing the fat back into the milk. What it did was use very high pressure to compress and punish the fat until it became part of the milk. No fat was allowed to remain on the top of the milk; it all had to be the same consistency ... Force the data under pressure to conform to what is required. Torture the data if necessary until it complies ...
Clearly the Bureau's remodeling of historical temperature data is unnatural and unscientific. In erasing the natural climate cycles and generating a global warming trend, the capacity of modern climate scientists to forecast spikes in warming is greatly diminished, as is their capacity to forecast droughts and floods.
Because of the homogenization of the surface temperature record in the compilation of national and global climate statistics, those sceptical of anthropogenic global warming have long preferred the UAH satellite record, even though this record only begins in 1979.
The UAH global temperature record for the lower troposphere, which once showed no trend for 18 years, now shows a surge in warming. This warming, however, is neither catastrophic nor outside the bounds of natural variability. And it certainly hasn't been caused by carbon dioxide.
Postscript
A manuscript entitled "Reconciling temperature trends generated from raw versus homogenized monthly temperature datasets for Darwin, Australia, 1895 to 2014" was first submitted to the International Journal of Climatology in early June 2015. After major revisions, a letter was received from the journal's editor, Radan Huth, indicating that the manuscript had been accepted subject to minor revisions. In particular, the letter from Huth stated:
Both referees are satisfied with the revisions you have made; they have, nevertheless, a few additional minor comments. I am, therefore, very happy to grant conditional acceptance of the paper, subject to you making satisfactory revisions as clarified below. These revisions are minor, in the sense that there are no major recalculations or major analyses required, but there may be less-major analyses needed and there are a number of important revisions to the text that are necessary. Please review the attached document listing the file requirements for your revision.
These revisions were duly made and the manuscript resubmitted on 18th December 2015. On 10th February 2016, a letter was received from Huth indicating that the manuscript was now "denied publication" in the journal. A key concern from reviewer #2 was my addition of a comment in the text which indicated that the temperature series, as homogenized by the Australian Bureau of Meteorology, may not be a good representation of historical temperatures. In the earlier drafts of the manuscript I had let the data speak for itself, but in the minor revisions I had been asked to provide commentary.
The letter from Huth also included comments from a new third reviewer who rejected the manuscript, and provided a long defense of the methods currently used by the Australian Bureau of Meteorology for the homogenization of historical temperature data.
Earlier in 2015 a version of the same manuscript had been submitted to, and rejected by, two other international climate science publications.
In a letter to the editor of the journal Environmental Modeling and Assessment on 20th April 2015, accompanying the paper's submission to that journal, I wrote:
The science of climate change depends very much on reliable data that has been quality controlled. Adjusting time series through the process of homogenization has become routine, but is controversial. Furthermore, jargon and techniques unique to climate science have developed around homogenization, perhaps without adequate consideration of the value of more traditional statistical techniques that can be more easily replicated. Indeed, we believe that our use of control charts has potentially broad application in the assessment of temperature data.
In the case of Darwin, which has a particularly important temperature time series used in the calculation of Australian and also global mean annual temperatures, the homogenized temperature time series display quite different trends from the raw series. This has been a point of contention on blogs, in the popular press, and also in the technical literature. We hope this controversy can be resolved, at least in part, through meticulous consideration of the evidence, and by applying more standard statistical techniques, in particular control charts, before any subsequent assessment of rates of warming.
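As an aside, here is a minimal sketch of the individuals (Shewhart) control chart referred to in that letter, applied to an illustrative annual series. The data are synthetic, and the three-sigma limits are estimated from the mean moving range in the standard way.

```python
# Sketch: a Shewhart individuals control chart for an annual maximum
# temperature series (illustrative data). Points outside the limits
# flag years -- or adjustments -- that warrant closer inspection.
import numpy as np

def individuals_chart_limits(series):
    """Centre line and three-sigma limits for an individuals chart, with
    sigma estimated as mean moving range / 1.128 (the d2 constant for
    subgroups of size two)."""
    x = np.asarray(series, dtype=float)
    sigma = np.abs(np.diff(x)).mean() / 1.128
    centre = x.mean()
    return centre, centre - 3.0 * sigma, centre + 3.0 * sigma

rng = np.random.default_rng(2)
temps = 32.0 + rng.normal(0.0, 0.4, 50)  # illustrative annual maxima
centre, lcl, ucl = individuals_chart_limits(temps)
flagged = np.where((temps < lcl) | (temps > ucl))[0]
print(f"centre {centre:.2f} C, limits [{lcl:.2f}, {ucl:.2f}] C, "
      f"flagged indices: {flagged.tolist()}")
```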
The comment received back from the Editor-in-Chief was that:
This paper does not fall within the scope of Environmental Modeling and Assessment. Our readership is primarily interested in studies that analyse phenomena and complex decisions related to human interactions with the environment with the help of sophisticated quantitative models. While the case study discussed here and its findings may well be valuable and publishable in a more specialised Climatology journal, the modelling methodology appears to be entirely routine and of little interest to our readership.
The manuscript was subsequently submitted to the journal Theoretical and Applied Climatology, with the editor, Hartmut Grassl, sending it out to four scientists for peer review. He subsequently rejected the paper on the basis that:
Reviewers' comments on your work have now been received. You will see that two reviewers reject your paper while the other two ask for modifications. Therefore I have to reject your paper.
In fact, the comment from the first reviewer was as follows:
Reviewer #1: Please see attached review. I checked accept with minor revisions. The revisions are so minor that I almost checked "Accept as is". The paper does not need to be returned to me for further review. It is a very nice piece of work.