In the words of Anthony Watts, the "sceptic" with one of the most-read blogs, this abstract is a "new peer reviewed paper recently presented at the European Geosciences Union meeting." A bit closer to the truth: it is a conference contribution by Steirou and Koutsoyiannis, based on a graduation thesis (in Greek), which was submitted to the EGU session "Climate, Hydrology and Water Infrastructure". An EGU abstract is typically half a page; it is not possible to do a real review of a scientific study based on such a short text. In practice, the purpose of an EGU abstract is to decide who gets a talk and who gets a poster, nothing more; everyone is welcome to come to EGU.
I have never seen an abstract rejected at EGU; rejection rates are in the order of a few percent, and those are typically empty or duplicate abstracts resulting from technical problems during submission. It would have been better if this abstract had been submitted to the homogenisation session at EGU. That would have fitted the topic much better and would have allowed a more objective appraisal of this work. Had I been the convener of the EGU homogenisation session, I would probably have accepted the abstract, but given it a poster, because the errors signal inexperience with the topic, and I would have talked to the authors at the poster.
Goal-oriented as he is, Anthony Watts found the two most erroneous statements. Only one of them is in the abstract (printed in full below); the other is in the slides of the talk, which were not reviewed at all.
We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. Based on a systematic study of scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant. From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in the two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends. One of the most common homogenization methods, ‘SNHT for single shifts’, was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent data normally distributed, but not in data with long-term persistence. The above results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.
Homogenisation changes the trend
The first statement cited by Anthony Watts is from the slides: of 67% of the weather stations examined, questionable adjustments were made to raw data that resulted in "increased positive trends, decreased negative trends, or changed negative trends to positive," whereas "the expected proportions would be 1/2 (50%)."
This is plainly wrong. You would not expect the proportions to be 1/2: inhomogeneities can have a typical sign, e.g. when an entire network changes from North-wall measurements (typical in the 19th century) to fully closed double-louvre Stevenson screens in the gardens, from a screen that is open to the North or to the bottom (Wild, Pagoda, Montsouri) to a Stevenson screen, or from a Stevenson screen to an automatic weather station, as currently happens to save labor. The urban heat island (UHI) produces a bias in the series; thus, if you remove the UHI, the homogenisation adjustments will have a bias. The move from stations in cities to typically cooler airports also produces a bias, and again this means you should not expect the proportions to be 1/2. And so on. See e.g. Böhm et al. (2001), Menne et al. (2010), Brunetti et al. (2006), Begert et al. (2005), or my recent posts on homogenisation. Similarly, the change from roof precipitation measurements to near-ground precipitation measurements causes a bias (Auer et al., 2005).
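To make this concrete, here is a minimal toy simulation (my own sketch in Python/NumPy; all numbers are hypothetical and chosen only for illustration, not taken from any study). If break sizes have a biased sign, as with the transitions described above, then even a perfect homogenisation increases the trend at far more than half of the stations:

```python
import numpy as np

rng = np.random.default_rng(42)
n_stations, n_years = 1000, 100
true_trend = 0.007           # degC per year, hypothetical
years = np.arange(n_years)

increased = 0
for _ in range(n_stations):
    climate = true_trend * years + rng.normal(0, 0.3, size=n_years)
    # One break per station; the sign distribution is biased towards
    # cooling jumps (e.g. open stand -> Stevenson screen, city -> airport).
    break_year = rng.integers(10, 90)
    shift = rng.normal(-0.2, 0.2)    # mean -0.2 degC: biased, not symmetric
    raw = climate.copy()
    raw[break_year:] += shift
    # Idealised homogenisation: the break is removed perfectly.
    trend_raw = np.polyfit(years, raw, 1)[0]
    trend_adj = np.polyfit(years, climate, 1)[0]
    increased += trend_adj > trend_raw

print(f"adjustments increased the trend at {increased / n_stations:.0%} of stations")
```

With these made-up numbers, roughly 80% of the adjustments increase the trend. A 50% expectation would only hold if the break sizes were symmetric around zero, which is exactly what the documented history of the networks contradicts.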
The same people who complain about biases in the global mean temperature trend due to the urban heat island effect do not want homogenisation to change the trends. Dear "sceptics", this is not consistent. Do you want homogenisation to remove the artificial trend due to the UHI or not? It sounds to me as if you only want to confuse the casual reader.
Validation of statistical homogenisation
The first sentence of the second statement cited by Anthony Watts is at least from the abstract, but the second sentence is found only in the unreviewed slides: "homogenization practices used until today are mainly statistical, not well justified by experiments, and are rarely supported by metadata. It can be argued that they often lead to false results: natural features of hydroclimatic times series are regarded as errors and are adjusted."
The WMO recommendation is to homogenise climate data using parallel measurements first, but also to perform statistical homogenisation, as one can never be sure that all inhomogeneities are recorded in the metadata of the station.
I was involved in the COST Action HOME, which has just finished a blind numerical experiment that justifies statistical homogenisation and clearly shows that it improves the quality of temperature data (Venema et al., 2012). Many validation studies of homogenisation algorithms had been published before (see the references in Venema et al., 2012).
In a different approach, statistical homogenisation methods were also validated against breaks documented in the metadata of the Swiss network (Kuglitsch et al., 2012). The size of the inhomogeneities found is also in accordance with numerous experiments with parallel measurements; see Böhm et al. (2010) and Brunet et al. (2010) and references therein.
It would definitely be good to be able to homogenise data using parallel measurements more often. Unfortunately, that is often simply not possible, because the need for a change is not known several years in advance. Thus statistical homogenisation will always be needed as well; as the validation studies show, it produces good results and makes the trends in temperature series more reliable.
Personnel appraisal
A student can make an error, and conferences are there to talk about preliminary results. Much more worrying is that Anthony Watts keeps getting his facts wrong. An EGU abstract is simply not a peer-reviewed paper. Of the three sentences Watts cited from the "peer reviewed paper", two can only be found in the slides of the talk, which are not reviewed. Every post on Watts Up With That? on a topic I am knowledgeable about contains serious factual errors and clear misrepresentations. I am not talking about having a different opinion, but about facts. If clear facts are already wrong, I start doubting the rest. One wonders why the readers of Watts Up With That? keep reading that stuff. There seems to be little interest in the truth among these self-proclaimed sceptics.

UPDATE: Anthony Watts has corrected the "peer-review" error:
"REPLY: I was of the impression that it was 'in press' but I've change the wording to reflect that. Hopefully we'll know more soon. – Anthony"

No idea how you can confuse the linked 260-word conference abstract and some PowerPoint slides with a scientific paper that is "in press", in other words a full scientific article that is finished except for being published.
UPDATE 2: After a remark by Willis Eschenbach, I have changed the statement that homogenisation improves climate data to the more accurate one that it improves temperature data.
UPDATE 3: The post that started this discussion was written by Marcel Crok at his blog: The State of the Climate.
UPDATE 2015: Now, two and a half years later, there is still no new activity on this study. I guess that means that Koutsoyiannis admits his conference abstract did not show what The State of the Climate and WUWT wanted you to believe. Together with all the other WUWT fails, it is clear that someone interested in climate change should never get his information from WUWT.
More posts on homogenisation
- Statistical homogenisation for dummies
  A primer on statistical homogenisation with many pictures.
- Homogenisation of monthly and annual data from surface stations
  A short description of the causes of inhomogeneities in climate data (non-climatic variability) and how to remove it using the relative homogenisation approach.
- New article: Benchmarking homogenisation algorithms for monthly data
  Raw climate records contain changes due to non-climatic factors, such as relocations of stations or changes in instrumentation. This post introduces an article that tested how well such non-climatic factors can be removed.
- HUME: Homogenisation, Uncertainty Measures and Extreme weather
  Proposal for future research in homogenisation of climate network data.
Some interesting backlinks
- Investigation of methods for hydroclimatic data homogenization?
  A short discussion of the history and some arguments.
- Blog discussions, conference presentations and peer review
  Demetris Koutsoyiannis' take on the discussion and his reply to the above post.
- Where's the Skepticism?
  Tamino discusses Anthony Watts' post; the comments are also worth reading.
- Station Homogenization as a Statistical Procedure
  The comment function at Climate Audit makes it easy to respond to each other and produces a lively discussion.
- New peer reviewed paper recently presented at the European Geosciences Union meeting.
  And finally, do not forget to read the comments in the WUWT post. Have a good laugh and lose all respect for pseudo-skeptics.
References
Begert, M., Schlegel, T., and Kirchhofer, W.: Homogeneous temperature and precipitation series of Switzerland from 1864 to 2000. Int. J. Climatol., 25, 65–80, doi: 10.1002/joc.1118, 2005.
Brunet, M., Asin, J., Sigró, J., Banón, M., García, F., Aguilar, E., Esteban Palenzuela, J., Peterson, T. C., and Jones, P.: The minimization of the screen bias from ancient Western Mediterranean air temperature records: an exploratory statistical analysis. Int. J. Climatol., doi: 10.1002/joc.2192, 2010.
Brunetti M., Maugeri, M., Monti, F., and Nanni, T.: Temperature and precipitation variability in Italy in the last two centuries from homogenized instrumental time series. International Journal of Climatology, 26, 345–381, doi: 10.1002/joc.1251, 2006.
Böhm, R., Auer, I., Brunetti, M., Maugeri, M., Nanni, T., and Schöner, W.: Regional temperature variability in the European Alps 1760–1998 from homogenized instrumental time series. International Journal of Climatology, 21, 1779–1801, doi: 10.1002/joc.689, 2001.
Böhm, R., Jones, P.D., Hiebl, J., Frank, D., Brunetti, M., and Maugeri, M.: The early instrumental warm-bias: a solution for long central European temperature series 1760–2007. Climatic Change, 101, 41–67, doi: 10.1007/s10584-009-9649-4, 2010.
Kuglitsch, F. G., R. Auchmann, R. Bleisch, S. Brönnimann, O. Martius, and M. Stewart, Break detection of annual Swiss temperature series, J. Geophys. Res., doi: 10.1029/2012JD017729, in press, 2012.
Menne, M. J., Williams, C. N. jr., and Palecki M. A.: On the reliability of the U.S. surface temperature record. J. Geophys. Res. Atmos., 115, no. D11108, doi: 10.1029/2009JD013094, 2010.
Steirou, E., Investigation and evaluation of methods for homogenization of hydroclimatic data, Diploma thesis, 156 pages, Department of Water Resources and Environmental Engineering – National Technical University of Athens, Athens, 2011.
Venema, V., O. Mestre, E. Aguilar, I. Auer, J.A. Guijarro, P. Domonkos, G. Vertacnik, T. Szentimrey, P. Stepanek, P. Zahradnicek, J. Viarre, G. Müller-Westermeier, M. Lakatos, C.N. Williams, M.J. Menne, R. Lindau, D. Rasol, E. Rustemeier, K. Kolokythas, T. Marinova, L. Andresen, F. Acquaotta, S. Fratianni, S. Cheval, M. Klancar, M. Brunetti, Ch. Gruber, M. Prohom Duran, T. Likso, P. Esteban, Th. Brandsma: Benchmarking homogenization algorithms for monthly data. Climate of the Past, 8, 89–115, doi: 10.5194/cp-8-89-2012, 2012.
Comments

Congratulations on your post correcting the misleading presentation on homogenization, and Anthony's misrepresentation of its validity and significance.
I hope you do not get banned from this web site for contradicting Herr Watts. This has happened to me.
Looking at the original presentation, I realized it didn't seem like a peer-reviewed presentation, but I don't have the scientific background to make a definitive statement. Also, a lot of literature has been written on homogenization and on why newer equipment reads colder than the old equipment, as you point out, and this was unknown to or ignored by the presenter. What is surprising is that Anthony Watts, a weatherman who criticizes the placement of surface station equipment, should know this, but appears not to. This is clearly a case of bias induced by cognitive dissonance.
Bravo. Good post, kept very professional, and allowing an update because of a comment from a non-scientist. I wish there were more like you.
:-) There was a time when I was not a scientist. Hopefully, there will be a time when I am no longer a scientist and enjoy my time in the sun.
Victor, you've done a very nice job rebutting all the nonsense at the WUWT site about homogenisation.
The title of Watts' post itself is misleading. All the authors claim is that the true trend is somewhere between 0.42 and 0.76 °C. The relative difference to the adjusted trend therefore spans (0 °C to 0.34 °C)/0.76 °C = 0 to 0.45. That is by no means "about half", as stated by Anthony in the title!
The funny thing in the EGU presentation is the part about SNHT and long-term persistence. Well, we shouldn't be surprised if a method based on independent, white-noise series fails in such cases. When dealing with a relative homogenisation method, it is almost impossible to get a significant natural long-term persistence signal in the difference series, as nearby stations exhibit almost the same climate signal.
Gregor, I forgot to respond to those trend estimates, next to the complete misrepresentation of how a homogenisation algorithm works. The 0.76 °C per century is the more objective value. If the raw data had had a trend of 2 °C per century and we had not homogenised to get the 0.76 °C value, these "sceptics" would complain. And it is not that I had not commented on homogenisation before; there is no excuse for this misinformation by WUWT.
I decided not to write about the long-term persistence, as it is quite technical and the slides provide too little information, but I expect that the conference presentation is somehow wrong in this respect. Together with Henning Rust and Olivier Mestre, I have written a paper on homogenisation and long-term persistence, and we found that homogenisation did not remove LTP: the correction method did not reduce the Hurst coefficient. Homogenisation does remove LTP in the difference time series, but not in the station data itself. We did not test the detection part, as we did not expect any problems there and it would have been much more work. The influence of homogenisation on the Hurst coefficient can now be tested on the HOME benchmark dataset. A small toy simulation after the reference below illustrates both your point about the difference series and the behaviour of SNHT under persistence.
Rust, H.W., O. Mestre, and V.K.C. Venema: Less jumps, less memory: homogenized temperature records and long memory. JGR-Atmospheres, 113, D19110, doi: 10.1029/2008JD009919, 2008.
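Here is the toy simulation mentioned above (my own sketch in Python/NumPy, not from the presentation or from our paper). It uses an AR(1) process with high autocorrelation as a crude stand-in for a persistent climate signal, since generating true long-term persistence would need fractional noise; all parameter values are hypothetical. It computes the single-shift SNHT statistic on homogeneous white noise, on persistent series, and on the difference series of two stations sharing the persistent signal:

```python
import numpy as np

rng = np.random.default_rng(0)

def snht_max(x):
    """Maximum over all split points k of the single-shift SNHT statistic
    T(k) = k*mean(z_1..k)^2 + (n-k)*mean(z_k+1..n)^2 (Alexandersson, 1986)."""
    z = (x - x.mean()) / x.std(ddof=1)
    n = len(z)
    return max(k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
               for k in range(5, n - 4))  # skip very short segments

def ar1(n, phi, rng):
    """AR(1) series: a crude stand-in for a persistent climate signal."""
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

n, trials = 100, 500
white = [snht_max(rng.normal(size=n)) for _ in range(trials)]
red = [snht_max(ar1(n, 0.9, rng)) for _ in range(trials)]

# Two nearby stations = shared persistent signal + independent local noise;
# relative homogenisation works on their difference series.
diff = []
for _ in range(trials):
    common = ar1(n, 0.9, rng)
    station1 = common + rng.normal(0, 0.5, size=n)
    station2 = common + rng.normal(0, 0.5, size=n)
    diff.append(snht_max(station1 - station2))

for name, vals in (("white noise", white),
                   ("persistent series", red),
                   ("difference series", diff)):
    print(f"{name:18s} median SNHT max: {np.median(vals):6.1f}")
```

On absolute persistent series the statistic is strongly inflated, producing spurious breaks, but in the difference series the shared signal cancels and the statistic drops back to white-noise levels. This is why relative homogenisation is not affected in the way the slides suggest.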