
Monday, 27 April 2015

Two new reviews of the homogenization methods used to remove non-climatic changes

By coincidence, two initiatives were launched this week to review the methods used to remove non-climatic changes from temperature data. One was launched by the Global Warming Policy Foundation (GWPF), a UK free-market think tank; the other by the Task Team on Homogenization (TT-HOM) of the Commission for Climatology (CCl) of the World Meteorological Organization (WMO). Disclosure: I chair the TT-HOM.

The WMO is one of the oldest international organizations and has the meteorological and hydrological services of almost all countries of the world as its members. The international exchange of weather data has always been important for understanding the weather and for making weather predictions. The main role of the WMO is to provide guidance and to define standards that make collaboration easier. The CCl coordinates climate research, especially when it comes to data measured by national weather services.

The review on homogenization, which the TT-HOM will write, is thus mainly aimed at helping national weather services produce better quality datasets to study climate change. This will allow weather services to provide better climate services to help their nations adapt to climate change.

Homogenization

Homogenization is necessary because much has happened in the world: the French and industrial revolutions, two world wars, the rise and fall of communism, and the start of the internet age. Inevitably, many changes have occurred in climate monitoring practices. Many global datasets start in 1880, the year toilet paper was invented in the USA and three decades before the Ford Model T.

As a consequence, the instruments used to measure temperature have changed, the screens that protect the sensors from the weather have changed, and the surroundings of the stations have often changed, with stations moved in response. These non-climatic changes in temperature have to be removed as well as possible to make more accurate assessments of how much the world has warmed.

Removing such non-climatic changes is called homogenization. For the land surface temperature measured at meteorological stations, homogenization is normally performed using relative statistical homogenization methods, in which a station is compared to its neighbours. If the neighbour is sufficiently nearby, both stations should show about the same climatic changes. Strong jumps or gradual increases happening at only one of the stations indicate a non-climatic change.
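
For readers who want to see the principle in action, here is a minimal sketch in Python with invented numbers: build the difference series between a candidate station and a nearby reference, then scan for the split with the largest mean shift. Operational methods such as SNHT or PRODIGE are far more sophisticated; this only illustrates how the shared climate signal cancels and the jump stands out.

```python
# Minimal sketch of relative homogenization; all numbers invented.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2000)

# Shared regional climate: a trend plus year-to-year weather.
regional = 0.01 * (years - years[0]) + rng.normal(0, 0.5, years.size)

candidate = regional + rng.normal(0, 0.2, years.size)
candidate[60:] += 0.8  # inserted non-climatic jump, e.g. a relocation
neighbour = regional + rng.normal(0, 0.2, years.size)

# In the difference series the shared climate signal largely cancels,
# leaving the non-climatic jump exposed.
diff = candidate - neighbour

def best_break(d, min_seg=5):
    """Scan all split points; return the one with the largest mean shift
    relative to the noise (a crude, SNHT-like statistic)."""
    best_k, best_t = None, 0.0
    for k in range(min_seg, d.size - min_seg):
        a, b = d[:k], d[k:]
        t = abs(a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / a.size
                                               + b.var(ddof=1) / b.size)
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

k, t = best_break(diff)
print(f"break detected in {years[k]} (inserted in {years[60]}), "
      f"size {diff[k:].mean() - diff[:k].mean():+.2f} °C, t = {t:.1f}")
```

The inserted 0.8°C jump is easy to find in the difference series, even though it is invisible against the weather noise in the candidate series itself.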

If there is a bias in the trend, statistical homogenization can reduce it. How well trend biases can be removed depends on the density of the network. In industrialised countries a large part of the bias can be removed for the last century. In developing countries and in earlier times removing biases is more difficult and a large part may remain. Because many governments unfortunately limit the exchange of climate data, the global temperature collections can also remove only part of the trend biases.

Some differences

There are some subtle differences. The Policy Foundation has six people from the UK, Canada and the USA, none of whom work on homogenization. The WMO team has nine people who do work on homogenization, from Congo, Pakistan, Peru, Canada, the USA, Australia, Hungary, Germany and Spain.

The TT-HOM team has simply started outlining its report. The Policy Foundation creates spin before having any results, with publications in their newspapers and blogs, and they showcase their bias from the start when they write [on their homepage]:
But only when the full picture is in will it be possible to see just how far the scare over global warming has been driven by manipulation of figures accepted as reliable by the politicians who shape our energy policy, and much else besides. If the panel’s findings eventually confirm what we have seen so far, this really will be the “smoking gun”, in a scandal the scale and significance of which for all of us can scarcely be exaggerated.
My emphasis. Talk about hyperbole by the click-whore journalists of the Policy Foundation. Why buy newspapers when their articles are worse than a random page on the internet? The Policy Foundation gave their team a very bad start.

Aims of the Policy Foundation

Hopefully, the six team members of the Policy Foundation will realise just how naive and loaded the questions they were supposed to answer are. The WMO has asked us whether we as TT-HOM would like to update our Terms of Reference; we are the experts after all. I hope the review team will update theirs, as that would help them to be seen as scientists seriously interested in improving science. Their current terms of reference are printed in italics below.

The panel is asked to examine the preparation of data for the main surface temperature records: HadCRUT, GISS, NOAA and BEST. For this reason the satellite records are beyond the scope of this inquiry.

"For this reason"? No reason is given; the Policy Foundation simply asserts that the satellite records are out of scope without any argument.

The satellite record is the most adjusted record of them all. The raw satellite data does not show much trend at all and initially even showed a cooling trend. Much of the warming in this uncertain and short dataset is thus introduced the moment the researchers remove the non-climatic changes (differences between satellites, drifts in their orbits and in the height of the satellites, for example). A relatively small error in these adjustments thus quickly leads to large trend errors.
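
A toy simulation shows how fast such errors grow; this is not the actual RSS or UAH processing, and all numbers are invented. Several satellites observe the same slowly warming truth, and each inter-satellite offset is estimated from a short, noisy overlap; every small offset error shifts the whole rest of the merged record.

```python
# Toy error propagation when chaining satellite records; numbers invented.
import numpy as np

rng = np.random.default_rng(1)
months = 12 * 35                        # a 35-year record
true = 0.15 / 120 * np.arange(months)   # 0.15 °C per decade, per-month steps
n_sats = 7
seg = months // n_sats                  # each satellite covers one segment

trend_errors = []
for _ in range(1000):
    merged = true.copy()
    for s in range(1, n_sats):
        # The offset to the previous satellite is estimated from a short,
        # noisy overlap; its error shifts all later satellites as well.
        merged[s * seg:] += rng.normal(0.0, 0.03)
    slope = np.polyfit(np.arange(months), merged, 1)[0]
    trend_errors.append(slope * 120 - 0.15)  # back to °C per decade

print(f"spread of merged trend errors: ±{np.std(trend_errors):.3f} °C/decade")
```

Even with only a few hundredths of a degree of uncertainty per merge, the spread of the trend errors is a noticeable fraction of the trend being measured.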

While independent studies for the satellite record are sorely missing, a blind validation study for station data showed that homogenization methods work. They reduce any temperature trend biases a dataset may have for reasonable scenarios. For this blind validation study we produced a dataset that mimics a real climate network with known non-climatic changes, so that we knew what the answer should be. We have a similar blind validation of the method used by NOAA to homogenize its global land surface data.

The following questions will be addressed.

1. Are there aspects of surface temperature measurement procedures that potentially impair data quality or introduce bias and need to be critically re-examined?

Yes. A well-known aspect is the warming bias due to urbanization. This has been much studied and was found to produce only a small warming bias. A likely reason is that urban stations are regularly relocated to less urban locations.

On the other hand, the reasons for a cooling bias in land temperatures have been studied much too little. In a recent series, I mention several reasons why current measurements are cooler than those in the past: changes in thermometer screens, relocations and irrigation. At this time we cannot tell how important each of these individual reasons is. Any of these reasons is potentially important enough to explain the 0.2°C per century cooling trend bias found in the GHCNv3 land temperatures. Together, the reasons mentioned above could explain a much larger cooling trend bias, which could dramatically change our assessment of the progress of global warming.

2. How widespread is the practice of adjusting original temperature records? What fraction of modern temperature data, as presented by HadCRUT/GISS/NOAA/BEST, are actual original measurements, and what fraction are subject to adjustments?

Or as Nick Stokes put it "How widespread is the practice of doing arithmetic?" (hat tip HotWhopper.)

Almost all longer station measurement series contain non-climatic changes. There is about one abrupt non-climatic change every 15 to 20 years. I know of two long series that are thought to be homogeneous: Potsdam in Germany and Mohonk Lake, New York, USA. There may be a few more. If you know more please write a comment below.

It is pretty amazing that the Policy Foundation knows so little about climate data that it asked its team to answer such a question. A question everyone working on the topic could have answered. A question that makes most sense when seen as an attempt to deceive the public and insinuate that there are problems.

3. Are warming and cooling adjustments equally prevalent?

Naturally not.

If we were sure that warming and cooling adjustments were of the same size, there would be no need to remove non-climatic changes from climate data before computing a global mean temperature signal.

It is known in the scientific literature that the land temperatures are adjusted upwards and the ocean temperatures are adjusted downwards.

It is pretty amazing that the Policy Foundation knows so little about climate data that it asked its team to answer such a question. A question everyone working on the topic could have answered. A question that makes most sense when seen as an attempt to deceive the public and insinuate that there are problems.

4. Are there any regions of the world where modifications appear to account for most or all of the apparent warming of recent decades?

The adjustments necessary for the USA land temperatures happen to be large, about 0.4°C.

That is explained by two major transitions: a change in the time of observation from afternoons to mornings (about 0.2°C) and the introduction of automatic weather stations (AWS), which in the USA happens to have produced a cooling bias of 0.2°C. (The bias due to the introduction of AWS depends on the design of the AWS and the local climate and thus differs a lot from network to network.)
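
The time-of-observation mechanism is easy to reproduce in a toy simulation; the diurnal cycle and noise levels below are invented, and this illustrates the mechanism only, not NOAA's actual correction. A max/min thermometer assigns the extremes of the 24 hours since the last reset to one "day", so an afternoon reset can count one hot afternoon in two successive days, while a morning reset tends to double-count cold mornings.

```python
# Toy illustration of the time-of-observation bias (TOB); numbers invented.
import numpy as np

rng = np.random.default_rng(2)
n_days = 3650
hours = np.arange(n_days * 24)

diurnal = -5 * np.cos(2 * np.pi * (hours % 24 - 2) / 24)  # min ~2h, max ~14h
weather = np.repeat(rng.normal(0, 3, n_days), 24)         # day-to-day variability
temp = 15 + diurnal + weather

def climate_mean(obs_hour):
    """Mean of (Tmax + Tmin) / 2 over 24 h windows ending at obs_hour."""
    resets = np.arange(obs_hour, hours.size, 24)
    means = [(temp[i - 24:i].max() + temp[i - 24:i].min()) / 2
             for i in resets if i >= 24]
    return np.mean(means)

print(f"afternoon (17 h) observations: {climate_mean(17):.2f} °C")
print(f"morning    (7 h) observations: {climate_mean(7):.2f} °C")
```

Switching a network from afternoon to morning readings thus produces an artificial cooling step, exactly the kind of non-climatic change homogenization has to remove.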

The smaller the meteorological network or region you consider, the larger the biases you can find. Many of them average out on a global scale.

5. Are the adjustment procedures clearly documented, objective, reproducible and scientifically defensible? How much statistical uncertainty is introduced with each step in homogeneity adjustments and smoothing?

The adjustments to the global datasets are objective and reproducible. These datasets are so large that there is no option other than processing them automatically.

The GHCN raw land temperatures are published and the processing software is published; everyone can repeat it. The same goes for BEST and GISS. "Clearly documented" and "defensible" are matters of opinion, and this can always be improved. But if the Policy Foundation is not willing to read the scientific literature, clear documentation does not help much.
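
As a sketch of what "everyone can repeat it" means: the GHCN-M files are fixed-width text, so a few lines of Python suffice to compare the unadjusted (qcu) and adjusted (qca) values of a station. The column layout below is my reading of the GHCN-M version 3 format, and the file names and station ID are placeholders; check both against the README that ships with the data.

```python
# Sketch of reading GHCN-M v3 monthly files; verify the layout in the README.
import numpy as np

def read_ghcn_m(path, station_id, element="TAVG"):
    """Return {year: 12 monthly values in °C (NaN = missing)}."""
    data = {}
    with open(path) as f:
        for line in f:
            if line[0:11] != station_id or line[15:19] != element:
                continue
            year = int(line[11:15])
            vals = []
            for m in range(12):  # 12 x [5-digit value + 3 flag characters]
                raw = int(line[19 + m * 8: 24 + m * 8])
                vals.append(np.nan if raw == -9999 else raw / 100.0)
            data[year] = np.array(vals)
    return data

# Hypothetical file names and station ID, for illustration only.
raw = read_ghcn_m("ghcnm.tavg.v3.qcu.dat", "42572730000")
adj = read_ghcn_m("ghcnm.tavg.v3.qca.dat", "42572730000")

for year in sorted(set(raw) & set(adj)):
    print(year, np.nanmean(adj[year]) - np.nanmean(raw[year]))
```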

Statistical homogenization reduces the uncertainty of large-scale trends. Another loaded question.

Announcement full of bias and errors

The article announcing the review [by Christopher Booker in The Telegraph, reposted by the Policy Foundation] is also full of errors.

Booker: The figures from the US National Oceanic and Atmospheric Administration (NOAA) were based, like all the other three official surface temperature records on which the world’s scientists and politicians rely, on data compiled from a network of weather stations by NOAA’s Global Historical Climate Network (GHCN).

No, the Climatic Research Unit and BEST gather data themselves. They do also use GHCN land surface data, but they would certainly notice if that data showed more or less global warming than their other data sources.

Also the data published by national weather services show warming. If someone assumes a conspiracy, it would be a very large one. Real conspiracies tend to be small and short.

Booker: But here there is a puzzle. These temperature records are not the only ones with official status. The other two, Remote Sensing Systems (RSS) and the University of Alabama (UAH), are based on a quite different method of measuring temperature data, by satellites. And these, as they have increasingly done in recent years, give a strikingly different picture.

The long-term trend is basically the same. The satellites see much stronger variability due to El Niño, which makes them better suited for cherry-picking short periods, if one is so inclined.
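
A small simulation shows why the variability matters; the noise levels are invented, only the principle counts. With the same underlying trend, the noisier series offers a much wider menu of short-window trends to pick from.

```python
# Spread of short-window trends as a function of noise level; numbers invented.
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1979, 2015)
trend = 0.015 * (years - years[0])  # the same underlying warming in both cases

def window_trends(noise_sd, window=10):
    series = trend + rng.normal(0, noise_sd, years.size)
    slopes = [np.polyfit(years[i:i + window], series[i:i + window], 1)[0]
              for i in range(years.size - window)]
    return np.array(slopes) * 10    # °C per decade

for label, sd in [("surface-like ", 0.10), ("satellite-like", 0.20)]:
    s = window_trends(sd)
    print(f"{label}: 10-yr trends range from {s.min():+.2f} "
          f"to {s.max():+.2f} °C/decade")
```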

Booker: In particular, they will be wanting to establish a full and accurate picture of just how much of the published record has been adjusted in a way which gives the impression that temperatures have been rising faster and further than was indicated by the raw measured data.

None of the studies using the global mean temperature will match this criterion because, contrary to WUWT wisdom, the adjustments reduce the temperature trend, which gives the "impression" that temperatures have been rising more slowly and less than was indicated by the raw measured data.

The homepage of the Policy Foundation team shows a graph for the USA (in Fahrenheit), reprinted below. This is an enormous cherry pick. The adjustments necessary for the USA land temperatures happen to be large and warming, about 0.4°C. The reasons for this were explained above in the answer to GWPF question 4.

[Figure: adjustments to the USA average temperature (USHCN), adjusted minus raw data, in Fahrenheit.]
That the US non-climatic changes are large relative to other regions should be known to somewhat knowledgeable people. Presented without context on the homepage of the Policy Foundation and The Telegraph, it will fool the casual reader by suggesting that this is typical.

[UPDATE. I have missed one rookie mistake. Independent expert Zeke Hausfather says: "It's a bad sign that this new effort features one graph on their website: USHCN version 1 adjusted minus raw. Unfortunately, USHCN v1 was replaced by USHCN v2 (with the automated PHA rather than manual adjustments) about 8 years ago. The fact that they are highlighting an old out-of-date adjustment graph is, shall we say, not a good sign."]

For the global mean temperature, the net effect of all adjustments is a reduction in the warming. The raw records show a stronger warming due to non-climatic changes, which climatologists reduce by homogenization.

Thus what really happens is the opposite of what happens to the USA land temperatures shown by the Policy Foundation. They do not show this because it does not fit their narrative of activist scientists, but this is the relevant temperature record with which to assess the magnitude of global warming and thus the relevant adjustment.

[Figure: the net effect of all adjustments on the global mean temperature record.]
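
Checking the direction of the effect is just a comparison of least-squares trends. The sketch below uses synthetic stand-in series rather than the real records; the early cold bias in the "raw" data is a toy example (as with historical sea surface temperatures), and the point is only that removing a spurious early cold bias reduces the fitted warming trend.

```python
# Comparing raw and adjusted trends; synthetic stand-in series, not real data.
import numpy as np

years = np.arange(1880, 2015)
truth = 0.007 * (years - years[0])  # ~0.7 °C per century, illustrative

raw = truth.copy()
raw[years < 1940] -= 0.3            # spurious early cold bias -> too-steep raw trend
adjusted = truth                    # the bias as removed by homogenization

def trend_per_century(series):
    return np.polyfit(years, series, 1)[0] * 100

print(f"raw:      {trend_per_century(raw):.2f} °C/century")
print(f"adjusted: {trend_per_century(adjusted):.2f} °C/century")
```
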
Previous reviews

I am not expecting serious journalists to write about this. [UPDATE. Okay, I was wrong about that.] Maybe later, when the Policy Foundation shows their results and journalists can ask independent experts for feedback. However, just in case, here is an overview of real work to ascertain the quality of the station temperature trend.

In a blind validation study we showed that homogenization methods reduce any temperature trend biases for reasonable scenarios. For this blind validation study we produced a dataset that mimics a real climate network. Into this data we inserted known non-climatic changes, so that we knew what the answer should be and could judge how well the algorithms work. It is certainly possible to make a scenario in which the algorithms would not work, but to the best of our understanding such scenarios would be very unrealistic.
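
For the curious, the logic of such a validation study fits in a few lines. The toy version below is deliberately crude and is not the benchmark we actually used: it generates a synthetic network with known inserted breaks, "homogenizes" each station against the composite of the others with a single-break correction, and checks whether the trend errors shrink.

```python
# Toy blind-validation experiment; the real benchmarks are far more realistic.
import numpy as np

rng = np.random.default_rng(4)
years, n_stations = np.arange(1900, 2000), 20
true_trend = 0.01                                  # °C/yr, by construction
climate = true_trend * (years - years[0]) + rng.normal(0, 0.4, years.size)

def make_station():
    series = climate + rng.normal(0, 0.3, years.size)
    k = rng.integers(20, 80)                       # known, inserted break
    series[k:] += rng.normal(0, 0.6)
    return series

network = np.array([make_station() for _ in range(n_stations)])

def homogenize(series, reference):
    """Crude single-break correction on the difference series; operational
    methods handle multiple breaks, seasonality and inhomogeneous references."""
    d = series - reference
    k = max(range(5, d.size - 5), key=lambda i: abs(d[:i].mean() - d[i:].mean()))
    out = series.copy()
    out[k:] -= d[k:].mean() - d[:k].mean()
    return out

errors_raw, errors_hom = [], []
for i in range(n_stations):
    reference = np.delete(network, i, axis=0).mean(axis=0)  # neighbour composite
    errors_raw.append(np.polyfit(years, network[i], 1)[0] - true_trend)
    errors_hom.append(
        np.polyfit(years, homogenize(network[i], reference), 1)[0] - true_trend)

rms = lambda e: float(np.sqrt(np.mean(np.square(e))))
print(f"RMS trend error, raw:         {rms(errors_raw):.4f} °C/yr")
print(f"RMS trend error, homogenized: {rms(errors_hom):.4f} °C/yr")
```

When the breaks are large compared to the noise, as in this toy setup, the homogenized trend errors come out clearly smaller; the real benchmarking studies do the same bookkeeping with much more realistic data and operational algorithms.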

We have a similar blind validation of the method used by NOAA to homogenize its global land surface data.

The International Surface Temperature Initiative (ISTI) has collected a large dataset with temperature observations. It is now working on a global blind validation dataset, with which we will not only be able to say that homogenization methods improve trend estimates, but also get a better numerical estimate of by how much. (In more data-sparse regions in developing countries, the methods probably cannot improve the trend estimate much; the previous studies were for Europe and the USA.)

Then we have BEST by physicist Richard Muller and his group of non-climatologists who started working on the quality of station data. They basically found the same result as the mainstream climatologists. This group actually put in work and developed an independent method to estimate the climatic trends, rather than just do a review. The homogenization method from this group was also applied to the NOAA blind validation dataset and produced similar results.

We have the review and guidance of the World Meteorological Organization on homogenization from 2003. The review of the Task Team on Homogenization will be an update of this classic report.

Research priorities

The TT-HOM has decided to focus on the monthly mean data used to establish global warming. Being a volunteer effort, we do not have the resources to tackle the more difficult topic of changes to extremes in detail. If someone has some money to spare, that is where I would do a review. That is a seriously difficult topic, where we do not know well how accurately non-climatic problems can be removed.

And as mentioned above, a good review of the satellite microwave temperature data would be very valuable. Satellite data is affected by strong non-climatic changes and almost its entire trend is due to homogenization adjustments; a relatively small error in the adjustments thus quickly leads to large changes in their trend estimates. At the same time I do not know of a (blind) validation study nor of an estimate of the uncertainty in satellite temperature trends.

If someone has some money to spare, I hope it is someone interested in science, no matter the outcome, and not a Policy Foundation with an obvious stealth agenda, clearly interested in a certain outcome. It is good that we have science foundations and universities to fund most of the research; funders who are interested in the quality of the research rather than the outcome.

The interest is appreciated. Homogenization is too much of a blind spot in climate science. As Neville Nicholls, one of the heroes of the homogenization community, writes:
When this work began 25 years or more ago, not even our scientist colleagues were very interested. At the first seminar I presented about our attempts to identify the biases in Australian weather data, one colleague told me I was wasting my time. He reckoned that the raw weather data were sufficiently accurate for any possible use people might make of them.
One wonders how this colleague knew this without studying it.

In theory it is nice that some people find homogenization so important as to do another review. It would be better if those people were scientifically interested. The launch party of the Policy Foundation suggests that they are interested in spin, not science. The Policy Foundation review team will have to do a lot of work to recover from this launch party. I would have resigned.

[UPDATE 2019: The GWPF seems to have stopped paying for their PR page about their "review", https://www.tempdatareview.org/. It now hosts Chinese advertisements for pills. I am not aware of anything coming out of the "review", no report, no summary of the submitted comments written by volunteers in their free time for the GWPF, no article. If you thought this was a PR move to attack science from the start, you may have had a point.]


Related reading

Just the facts, homogenization adjustments reduce global warming

HotWhopper must have a liberal billionaire and a science team behind her. A great, detailed post: Denier Weirdness: A mock delegation from the Heartland Institute and a fake enquiry from the GWPF

William M. Connolley gives his candid take at Stoat: Two new reviews of the homogenization methods used to remove non-climatic changes

Nick Stokes: GWPF inquiring into temperature adjustments

And Then There's Physics: How many times do we have to do this?

The Independent: Leading group of climate change deniers accused of creating 'fake controversy' over claims global temperature data may be inaccurate

Phil Plait at Bad Astronomy comment on the Telegraph piece: No, Adjusting Temperature Measurements Is Not a Scandal

John Timmer at Ars Technica is also fed up with being served the same story about some upward adjusted stations every year: Temperature data is not “the biggest scientific scandal ever” Do we have to go through this every year?

The astronomer behind the blog "And Then There's Physics" writes why the removal of non-climatic effects makes sense. In the comments he talks about adjustments made to astronomical data. Probably every numerical observational discipline of science performs data processing to improve the accuracy of their analysis.

Steven Mosher, a climate "sceptic" who has studied the temperature record in detail and is no longer sceptical about it, reminds us of all the adjustments demanded by the "sceptics".

Nick Stokes, an Australian scientist, has a beautiful post that explains the small adjustments to the land surface temperature in more detail.

Statistical homogenisation for dummies

A short introduction to the time of observation bias and its correction

New article: Benchmarking homogenisation algorithms for monthly data

Bob Ward at the Guardian: Scepticism over rising temperatures? Lord Lawson peddles a fake controversy

18 comments:

  1. Hi Victor

    It's a bit amusing that nowhere in your article, which is full of links, do you link to the URL of the new GWPF inquiry:
    http://www.tempdatareview.org/

    Also, you blame the GWPF for producing spin already, but here is their own announcement of the inquiry:

    INQUIRY LAUNCHED INTO GLOBAL TEMPERATURE DATA INTEGRITY
    Date: 25/04/15 Global Warming Policy Foundation
    London: 26 April 2015. The London-based think-tank the Global Warming Policy Foundation is today launching a major inquiry into the integrity of the official global surface temperature records.
    An international team of eminent climatologists, physicists and statisticians has been assembled under the chairmanship of Professor Terence Kealey, the former vice-chancellor of the University of Buckingham.

    Questions have been raised about the reliability of the surface temperature data and the extent to which apparent warming trends may be artefacts of adjustments made after the data are collected. The inquiry will review the technical challenges in accurately measuring surface temperature, and will assess the extent of adjustments to the data, their integrity and whether they tend to increase or decrease the warming trend.

    Launching the inquiry, Professor Kealey said:

    “Many people have found the extent of adjustments to the data surprising. While we believe that the 20th century warming is real, we are concerned by claims that the actual trend is different from – or less certain than – has been suggested. We hope to perform a valuable public service by getting everything out into the open.”

    To coincide with the inquiry launch Professor Kealey has issued a call for evidence:

    “We hope that people who are concerned with the integrity of climate science, from all sides of the debate, will help us to get to the bottom of these questions by telling us what they know about the temperature records and the adjustments made to them. The team approaches the subject as open-minded scientists – we intend to let the science do the talking. Our goal is to help the public understand the challenges in assembling climate data sets, the influence of adjustments and modifications to the data, and whether they are justifiable or not.”

    All submissions will be published.

    

    Further details of the inquiry, its remit and the team involved can be seen on its website www.tempdatareview.org

    The quote you gave is from Booker. It's unfair to blame GWPF for that.

    Marcel

  2. As I mentioned to you on my blog, you should just send something like this to the GWPF review panel. You've essentially done their job for them and could save them a great deal of time.

  3. Marcel,
    The GWPF could have chosen to distance themselves from Booker. They have not. It's hard not to conclude that they broadly agree with the tone of his article.

  4. Marcel Crok, I hope this blog makes its readers a bit better informed after reading it. Thus I prefer to link to sources of good information and try to avoid linking to political pressure groups. Google is good enough that anyone interested can copy a quote and immediately find it.

    It is interesting that you defend your Policy Foundation this way. Did you ever complain that WUWT does not link to the scientific articles they are abusing?

    When a blog that claims to be about science (WUWT) does not link to science, that sends a signal. It is a stronger signal than a blog that claims to be about science (VV) not linking to political spin.

    The quote I gave is from Christopher Booker, columnist with The Telegraph, and was published on the homepage of the Policy Foundation. The quotes fit to the loaded questions in the Terms of Reference of the review team. The level of ignorance also fits. Is it a bad assumption that the Policy Foundation agrees with the articles they post on their homepage?

    Small fun fact. The Telegraph titles:
    Top Scientists Start To Examine Fiddled Global Warming Figures
    The Policy Foundation titles:
    Top Scientists Start To Examine Adjusted Global Warming Figures
    Trying to look more neutral and demonstrating that they feel responsible for what they publish on their own homepage.

  5. AndThenTheresPhysics, once the team has read the scientific literature and is able to ask specific questions, I am naturally happy to answer them.

    As soon as our TT-HOM team has a text, we will also circulate it among the experts in homogenization. We know who they are and do not need to build an expensive homepage for that.

    Knowing my introverted colleagues, who have no interest in stupid political games, but in doing good science, this homepage will anyway not work. You will have to write them and show you are a serious person who put in some work. That is how science works. The homepage, that is how politics works.

  6. Marcel,
    "The quote you gave is from Booker. It's unfair to blame GWPF for that. "

    It is prominently featured on the GWPF website.

  7. The homogenization guidance of TT-HOM will have three parts.

    1. A report with the basic information on the homogenization of monthly data: the methods, how to apply them for users, and, for more experienced people, the mathematical basis and the needs for research.

    2. A frequently asked questions (FAQ) page on the internet. This way we can stay up to date. The answers will be shorter and more informal and will link to information sources, including the report.

    3. An up-to-date list of free homogenization software on the internet, with some guidance about the strengths and weaknesses of each package.

    Once we have something to show, we will ask the homogenization distribution list for feedback. If you are interested in contributing, have questions as user of homogenization methods, or would like to join the list, please send me a mail.

  8. If the neighbour is sufficiently nearby, both stations should show about the same non-climatic changes.

    Shouldn't that be the "same climatic changes" (i.e. delete the "non-")
    --
    William

  9. William, yes you are right, thank you. I have immediately corrected this error. Was a night job. :)

  10. Victor, if you intend to submit this to the review, it would be wise to remove the snark. Snark is justified of course, but it might lessen the chances of your insights being heard. Also I'd be happy to proof read it and remove grammatical errors. Mail me if interested (no charge :-).
    --
    William

  11. I wrote nothing the review team should not already know if they are only minimally qualified. Thus I did not plan on submitting this post. If they have questions that need my expertise, I would be happy to answer them.

    I would welcome some corrections. Thank you very much for the offer. Only Google knows your email; at Blogger, the blogger does not see it. (I guess on Wordpress you do.) Could you write me?

  12. Regarding the 'snark' - considering this field has become your bailiwick, I can understand the roots of your sarcasm. That said, I would indeed follow William's advice and remove it and submit it to the GWPF.

  13. Kevin O'Neill, they are not reviewing me. I do not have a dataset, I do not have a homogenization method, I just understand a little how they work and have made a nice blind validation study.

    How would it help science if I submit something to this review team? Or what other consideration do you have for your advice?

  14. William, your corrections have been implemented. Thank you very much. It reads much better now.

    And thank you to all my readers who bear with me while I deface the English language.


  15. Sarcasm is good, cynicism is necessary. Marcel Crok is truly expert in playing the implausible deniability card and demanding something or other.

  16. Eli did you read my new page on moderation and are you testing the limits? Or is this just your normal friendly self? I would not mind if you tone down a bit when you comment here.


  17. No Victor, Eli has been diving into the Legacy Tobacco Archives, and as Lily Tomlin said, no matter how cynical you become, it's never enough to keep up, and that's the truth.

    So when you deal with people like the gwpf and Marcel Crok, if you let them pretend to innocence they run right over you.

  18. As ATTP, John Timmer and others have said, these antics of the GWPF are déjà vu all over again. That's quite aside from the ridiculous presumptions and allegations on which their Snark*-hunting is based.

    However I have a suggestion that would make this enterprise much more interesting for those watching. Specifically, the GWPF should also apply 'homogenisation' to the zero-th degree (see what I did there?) indicators of warming - sea ice and glacier melting, phenological shifts, ocean heat content, sea level rise, increases in the occurrence of extreme heat records, and other such physical manifestations of the effect that human carbon emissions have on the planet's climate.

    I for one would be mightily fascinated with what the GWPF might manage to produce, especially if they published in the peer-reviewed literature.



    (*Of course Victor's commentary needs to have snark contained within - it arises from the very subject matter.)


Comments are welcome, but comments without arguments may be deleted. Please try to remain on topic. (See also moderation page.)

