
Wednesday, June 12, 2019

The World Meteorological Organisation will build the greatest global climate change network

“Having left a legacy of a changing climate, this [reference climate network] is the very least successive generations can expect from us in order to enable them to more precisely determine how the climate has changed.”
 

Never trust a headline. The WMO cannot build the network. But the highest body of the World Meteorological Organisation (WMO) has approved our plans for a Global Climate Reference Station Network. Its Congress, in which the leaders of all member organisations meet in neutral Geneva, Switzerland, has approved the report on a Global Surface Reference Network by the reference network Task Team of the Global Climate Observing System (GCOS). The WMO is the oldest international organisation and coordinates the work of its members, mostly national weather services. So the WMO will not build the network itself; we are now looking for volunteers.

(Disclosure: I am a member of the Task Team.* Funny: in a team with big name climatologists I am somehow the "Climate scientist representative".)

Humanity is performing the greatest experiment in its history. We better measure it accurately. For humanity and for science.

Never trust a headline. What the heck does “greatest” mean? As someone trying to estimate how much the climate has changed, I would have been so happy if people had continued the really poor measurement methods they used in the 19th century. Mercury thermometers were placed in the north (pole) facing window of an unheated room. Being so close to the building is not good for ventilation, and sunlight could reach the sensor or heat the wall beneath it. I would have lost that fight: mercury thermometers are now forbidden, weather prediction models would by now be more accurate than such observations, and the finance minister would have forced us to switch to automatic measurements. We may think that how we measure today is good enough, but people in 2100 will likely disagree.

At the very least, following the biggest technological steps will be unavoidable. When that happens we will make long parallel comparisons of the old and new set-ups. Estimating differences in the averages is not enough: the variability is also affected, and that is harder to estimate. The reasons for measurement errors will change, and with them the dependence of the errors on the weather.
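As a minimal sketch of what such a parallel comparison involves (with invented numbers standing in for real station data), one can estimate both the mean difference and the change in variability from an overlap period:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical parallel series: ten years of daily temperatures (in degC)
# from an old and a new set-up; the new one is assumed to read ~0.2 K warmer.
old = rng.normal(loc=10.0, scale=4.0, size=3650)
new = old + 0.2 + rng.normal(scale=0.3, size=old.size)

# The difference in the averages is the easy part ...
mean_diff = np.mean(new - old)

# ... but the change in variability needs more of the distribution,
# e.g. the ratio of standard deviations and differences in quantiles.
std_ratio = np.std(new, ddof=1) / np.std(old, ddof=1)
q_diff = np.quantile(new, [0.05, 0.5, 0.95]) - np.quantile(old, [0.05, 0.5, 0.95])

print(round(mean_diff, 3), round(std_ratio, 3), np.round(q_diff, 2))
```

With ten years of daily data the mean difference is estimated very precisely, but the quantile differences, which matter for extremes, remain much noisier; that is why variability changes are the hard part.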

Any data processing performed today, even if it is only averaging or applying a calibration factor, runs on hardware and software that will not be available in 2100. Any instrument we buy off the shelf will not be available in 2100 either; the upper air reference network is already being forced to change its instruments because Vaisala will soon no longer sell them. So “best” means that we have open hardware and open software, so that we can keep on building the instrument, can redo the data processing from scratch, and can recreate the exact same processing on newer computers, or whatever we use after the Butlerian Jihad.

[Photo: a station of the US Climate Reference Network with a prominent wind shield for the rain gauges.]
A station of the US Climate Reference Network.

Never trust a headline. What does measuring climate mean? I work on improving trend estimates based on historical measurements made in many different ways, by comparing neighbouring stations with each other (statistical homogenisation). This makes me acutely aware that there is only so much statistical homogenisation can do; a considerable error remains. It works relatively well for annual average temperatures because the correlations between stations are high. Much harder are estimates of changes in the variability around the means, which are important for changes in extreme weather. Especially estimates of changes in precipitation, humidity, insolation, cloud cover, snow depth, etc. have wide confidence intervals because statistical homogenisation is very hard for them. For these variables, having reference data that does not need to be statistically homogenised is crucial; they are very important for climate change impacts and for understanding how the climate is changing. Reference networks can not only help in quantifying these confidence intervals, but, as an independent line of evidence, also provide confidence that the confidence intervals are right.

The preliminary proposal for variables to observe in reference quality is:

  • Air temperature
  • Precipitation
  • Pressure
  • Wind speed and direction (10 m)
  • Relative humidity
  • Surface radiation (down and up)
  • Land Surface Temperature
  • Soil moisture
  • Soil temperature
  • Snow/ice (Snow Water Equivalent)
  • Albedo
If you disagree or have additional ideas, please contact us.


Tiered system of systems approach.

Never trust a headline. By itself this network will not be the best way to study climate change; we also need the other stations. The reference network will be the stable backbone of the entire climate observing system, the part best at estimating long-term trends, while the other stations are needed to reduce sampling errors and study spatial patterns.

Maintaining a reference station will clearly be more expensive than maintaining a standard climate station, so the number of stations will be limited. For the long-term warming we expect to need about 200 stations, well spread over the world. This takes into account that, even if we select locations where we expect nothing to happen in the next century, we will still lose some stations to conflict or "progress".

At a reference station (or nearby), measurements with the locally standard set-up should preferably also be made, so that the two can be compared and provide information on any measurement problems. This will improve the quality of the entire network. A network with 200 reference stations would on average have about one station per country. For the comparison with the national networks, at least one station per country would also be desirable, but large countries will need multiple stations, and it is more efficient when countries with a reference station host several, because a large part of the costs are overheads (well-trained operators and well-instrumented laboratories).


A society grows great when old men plant trees whose shade they know they will never sit in - Greek proverb (I did not check the provenance, experience tells me, the source of such quotes is always wrong, but do leave a comment).

Never trust a headline. The reference network is not only interesting for studying climate change. If it were, we would need to wait many decades before it becomes useful, and in this age that would likely mean it would not be funded. Due to the metrological [sic] standards for computing confidence intervals and the traceability to SI standards, the measurements will be comparable all over the world within specified confidence intervals for the absolute values, not just for the (e.g., temperature) anomalies mostly used to study climate change. Together with the representativeness of the stations for their region, this makes the network useful for validating absolute estimates from satellites or atmospheric models.

Also the comparison of the reference measurements with the national networks will produce valuable information within the first decade. For example, the US Climate Reference Network shows that the warming estimates of the national network are reliable and, if anything, underestimate the warming in America; the reference network has the larger trend.

[Graph comparing the US Climate Reference Network (USCRN) and the normal US network (ClimDiv).]
The US Climate Reference Network (USCRN; red line) is below the normal national station network (ClimDiv; green line) at the beginning and above it at the end. The trend of the reference network is thus larger. (The values themselves are quite noisy because America is just a small part of the Earth, and trends over such short periods do not contain information on long-term warming.)
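The trend comparison in the figure boils down to fitting a linear trend to each anomaly series and comparing the slopes. A minimal sketch, with synthetic numbers standing in for the USCRN and ClimDiv series:

```python
import numpy as np

def decadal_trend(years, anomalies):
    """Least-squares linear trend, expressed per decade."""
    slope, _intercept = np.polyfit(years, anomalies, deg=1)
    return 10.0 * slope

rng = np.random.default_rng(0)
years = np.arange(2005, 2020)
# Synthetic stand-ins (not real data): the reference series is assumed
# to warm slightly faster, as in the USCRN/ClimDiv comparison.
climdiv = 0.020 * (years - 2005) + rng.normal(scale=0.1, size=years.size)
uscrn = 0.025 * (years - 2005) + rng.normal(scale=0.1, size=years.size)

print(decadal_trend(years, climdiv), decadal_trend(years, uscrn))
```

With only 15 noisy annual values the two trend estimates overlap heavily, which is the point of the caption: such short periods say little about long-term warming.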

Never trust a headline. We are land animals and it thus comes naturally to us to see climate stations as prototypical for climate observations, but the climate system is much richer. There is already a network for reference upper air measurements (GRUAN) made with weather balloons (radiosondes). The high metrological quality of the ARGO floats probably also makes them a reference network; they measure ocean temperature profiles to estimate the ocean heat content.

Both the upper air and the oceans are wonderfully uniform media to measure; characterising the influence of the surroundings and preventing changes therein will be the main additional challenge of a land station network.

Studying climatic changes in urban regions is also important. Here it would be even more important to accurately describe the surroundings, because changes will happen. Thus urban regions would need their own reference network.

We hope that our reference network will stimulate the founding of further reference networks. The cryosphere (the part of the Earth which is frozen) needs specialised observations. Hydrological and marine surface observations in reference quality would be very valuable; we should never forget that 70% of the Earth is water. Observations of tiny airborne particles (aerosols) and clouds could be made in reference quality.



In other news: the WMO Congress has also decided to make and share more real-time observations for weather predictions. The norms for quality and quantity will become stricter and will be monitored.

20-25% of WMO members are already compliant.

25-30% would be compliant if they shared their data internationally. Many of these countries are big, so they represent a larger part of the world.

The rest will need international support to build the capacity to extend their measurement program and share the data.

Hopefully, the Green Climate Fund can help. The 24/7 monitoring by the WMO will give feedback to the funders on the value of their investment.

Climatology has the advantage that national weather services perform observations operationally. This institutional support has produced the long series we can use to study climate change. We currently see huge changes in the biosphere. Insects seem to be vanishing, but this is really hard to study without long-term observations. The ecological long-term observational programs need institutional support.

Where possible these reference networks should aim to use the same locations, so that the observations can support each other, as well as to reduce costs. It may be easier to obtain funding for reference networks in a large coalition than for every network separately. So I hope that these other communities will develop similar plans. If you know of anyone in these communities, please point them to this post or our report.

We estimate that this reference land station network will cost a few million dollars per year. Running it for a decade would thus still cost much less than a single satellite mission, which measures far fewer climate variables with much less accuracy and less confidence in that accuracy. If you know someone at Lockheed Martin or Airbus who may be interested in building a space-grade reference network and has the right lobbyists, please tell them of this initiative.

Coming back to the first paragraph: we need volunteers. We need weather services interested in setting up reference stations, and we need ones interested in becoming a Lead Centre. A Lead Centre would coordinate the network, organise joint calibrations and comparison campaigns, lead the drawing up of measurement requirements, etc. To spread the workload, one option would be to dedicate one Lead Centre to one instrument or observation type. Please talk about this with your colleagues and spread this post.

UPDATE November 2020. The World Meteorological Organization Commission for Observation, Infrastructure and Information Systems (INFCOM) has approved the plan. The climate reference network implementation plan is now part of the WMO Infrastructure Commission workplan, which includes among its outputs and deliverables the establishment of a GSRN, the identification of candidate stations, and the call for the Lead Centre. Based on this and on the recommendations of the report of the GSRN task team, published in February 2019 (GCOS-226), a new task team has been established to develop (i) a draft implementation plan for the GSRN, (ii) a proposal for management and governance structures of the GSRN, and (iii) a process for nominating and approving stations contributing to the GSRN.


* The opinions in the post are mine, the report represents the opinion of the Task Team.

Further reading

Thorne P.W., H.J. Diamond, B. Goodison, S. Harrigan, Z. Hausfather, N.B. Ingleby, P.D. Jones, J.H. Lawrimore, D.H. Lister, A. Merlone, T. Oakley, M. Palecki, T.C. Peterson, M. de Podesta, C. Tassone, V. Venema and K.M. Willett, 2018: Towards a global land surface climate fiducial reference measurements network. Int J Climatol., 38, pp. 2760–2774. https://doi.org/10.1002/joc.5458

The report of the GCOS Task Team: GCOS Surface Reference Network (GSRN): Justification, requirements, siting and instrumentation options

GCOS, 2017: Report of the 1st Meeting of the GCOS Surface Reference Network (GSRN) Task Team
Maynooth, Ireland, 1-3 November 2017.

My first post trying to get the discussion going in October 2016: A stable global climate reference network

January 2018 GCOS Newsletter on designing a GCOS Surface Reference Network

Saturday, October 7, 2017

A short history of homogenisation of climate station data



The WMO Task Team on Homogenisation (TT-HOM) is working on guidance for scientists and weather services who want to homogenise their data. I thought the draft chapter on the history of homogenisation would double as a nice blog post. It is a pretty long history, starting well before people were worrying about climate change. Comments and other important historical references are very much appreciated.

Problems due to inhomogeneities have long been recognised and homogenisation has a long history. In September 1873, at the “International Meteorologen-Congress” in Vienna, Carl Jelinek requested information on national multi-annual data series ([[k.k.]] Hof- und Staatsdruckerei, 1873), but decades later, in 1905 G. Hellmann (k.k. Zentralanstalt für Meteorologie und Geodynamik, 1906) still regretted the absence of homogeneous climatological time series due to changes in the surrounding of stations and new instruments and pleaded for stations with a long record, “Säkularstationen”, to be kept as homogeneous as possible.

Although this “Conference of directors” of the national weather services recommended maintaining a sufficient number of stations under unchanged conditions, these basic inhomogeneity problems still exist today.

Detection and adjustments

Homogenisation has a long tradition. For example, in early times documented change points were removed with the help of parallel measurements. Differing observing times at the astronomical observatory of the k.k. University in Vienna (Austria) were adjusted using multi-annual 24-hour measurements of the astronomical observatory of the k.k. University in Prague (today Czech Republic). Measurements from Milano (Italy) between 1763 and 1834 were adjusted to 24-hour means using measurements from Padova (Kreil, 1854a, 1854b).

However, for the majority of breaks we do not know the break magnitude; furthermore, it is most likely that series contain undocumented inhomogeneities as well. Thus there was a need for statistical break detection methods. In the early 20th century Conrad (1925) applied and evaluated the Heidke criterion (Heidke, 1923) using ratios of two precipitation series. As a consequence he recommended the use of additional criteria to test the homogeneity of series, dealing with the succession and alternation of algebraic signs: the Helmert criterion (Helmert, 1907) and the “painstaking” Abbe criterion (Conrad and Schreier, 1927). The use of Helmert’s criterion for pairs of stations and of Abbe’s criterion was still described as an appropriate tool in the 1940s (Conrad, 1944). Some years later the double-mass principle was popularised for break detection (Kohler, 1949).


German Climate Reference Station which was founded in 1781 in Bavaria on the mountain Hohenpeißenberg.

Reference series

Julius Hann (1880, p. 57) studied the variability of absolute precipitation amounts and of the ratios between stations. He used these ratios for quality control. This inspired Brückner (1890) to check precipitation data for inhomogeneities by comparison with neighbouring stations; he did not use any statistics.

In their book “Methods in Climatology” Conrad and Pollak (1950) formalised this relative homogenization approach, which is now the dominant method to detect and remove the effects of artificial changes. The building of reference series, by averaging the data from many stations in a relatively small geographical area, has been recommended by the WMO Working Group on Climatic Fluctuations (WMO, 1966).
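A reference built this way is just a weighted average of neighbouring series. One common choice, shown here as an illustration rather than as any official recipe, weights each neighbour by its correlation with the candidate station:

```python
import numpy as np

def reference_series(neighbours, weights):
    """Weighted average of neighbouring station series.

    neighbours: 2-D array, one row per neighbouring station's time series.
    weights: one weight per station, e.g. its correlation with the candidate.
    """
    w = np.asarray(weights, dtype=float)
    x = np.asarray(neighbours, dtype=float)
    return (w[:, None] * x).sum(axis=0) / w.sum()
```

Because the regional climate signal is common to all stations, subtracting such a reference from the candidate series removes most of the real climate variability and leaves mainly the station's own inhomogeneities, which is what makes relative homogenisation work.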

The papers by Alexandersson (1986) and Alexandersson and Moberg (1997) made the Standard Normal Homogeneity Test (SNHT) popular. The broad adoption of SNHT was also due to the clear guidance on how to use this test together with reference series to homogenise station data.
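The core of the single-breakpoint SNHT can be sketched in a few lines: standardise the candidate-minus-reference series, then find the split point that maximises the test statistic T(k) = k·z̄₁² + (n−k)·z̄₂², where z̄₁ and z̄₂ are the means before and after the split. This is a simplified sketch of the test's central idea, not a full implementation:

```python
import numpy as np

def snht(candidate, reference):
    """Single-breakpoint SNHT on a candidate minus reference series.

    Returns the most likely break position (number of values before the
    break) and the maximum test statistic; in practice the statistic is
    compared against tabulated critical values (not included here) to
    decide whether the break is significant.
    """
    q = np.asarray(candidate, dtype=float) - np.asarray(reference, dtype=float)
    z = (q - q.mean()) / q.std(ddof=1)   # standardised difference series
    n = z.size
    t = np.empty(n - 1)
    for k in range(1, n):
        z1 = z[:k].mean()                # mean before the candidate break
        z2 = z[k:].mean()                # mean after it
        t[k - 1] = k * z1**2 + (n - k) * z2**2
    k_max = int(np.argmax(t)) + 1
    return k_max, float(t[k_max - 1])
```

For example, a clear 0.5 K shift inserted after position 30 of a 60-value difference series with small noise is found at or very near k = 30.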

Modern developments

SNHT is a single-breakpoint method, but climate series typically contain more than one break. A major step forward was thus the design of methods that detect and correct multiple change-points and can work with inhomogeneous references (Szentimrey, 1999; Mestre, 1999; Caussinus and Mestre, 2004). These kinds of methods were shown to be more accurate in the benchmarking study of the EU COST Action HOME (Venema et al., 2012).

The paper by Caussinus and Mestre (2004) also provided the first description of a method that jointly corrects all series of a network simultaneously. This joint correction method was able to improve the accuracy of all but one contribution to the HOME benchmark that was not yet using this approach (Domonkos et al., 2013).

The ongoing work to create appropriate datasets for studies of climate variability and change promoted the continual development of better methods for change-point detection and correction. To follow this process the Hungarian Meteorological Service started a series of “Seminars for Homogenization” in 1996 (HMS, 1996; WMO, 1999; OMSZ, 2001; WMO, 2004; WMO, 2006; WMO, 2010).

Related reading

Homogenization of monthly and annual data from surface stations
A short description of the causes of inhomogeneities in climate data (non-climatic variability) and how to remove it using the relative homogenization approach.
Statistical homogenisation for dummies
A primer on statistical homogenisation with many pictures.
Just the facts, homogenization adjustments reduce global warming
Many people only know that climatologists increase the land surface temperature trend, but do not know that they also reduce the ocean surface trend and that the net effect is a reduction of global warming. This does not fit too well with the conspiracy theories of the mitigation sceptics.
Five statistically interesting problems in homogenization
Series written for statisticians and climatologists looking for interesting problems.
Why raw temperatures show too little global warming
The raw land surface temperature probably shows too little warming. This post explains the reasons why: thermometer screen changes, relocations and irrigation.
New article: Benchmarking homogenization algorithms for monthly data
Raw climate records contain changes due to non-climatic factors, such as relocations of stations or changes in instrumentation. This post introduces an article that tested how well such non-climatic factors can be removed.

References

Alexandersson, H., 1986: A homogeneity test applied to precipitation data. J. Climatol., 6, pp. 661-675.
Alexandersson, H. and A. Moberg, 1997: Homogenization of Swedish temperature data. Part I: Homogeneity test for linear trends. Int. J. Climatol., 17, pp. 25-34.
Brückner, E., 1890: Klimaschwankungen seit 1700 nebst Bemerkungen über Klimaschwankungen der Diluvialzeit. E.D. Hölzel, Wien and Olnütz.
Caussinus, H. and O. Mestre, 2004: Detection and correction of artificial shifts in climate series. Appl. Statist., 53, Part 3, pp. 405-425.
Conrad, V. and C. Pollak, 1950: Methods in Climatology. Harvard University Press, Cambridge, MA, 459 p.
Conrad V., O. Schreier, 1927: Die Anwendung des Abbe’schen Kriteriums auf physikalische Beobachtungsreihen. Gerland’s Beiträge zur Geophysik, XVII, 372.
Conrad, V., 1925: Homogenitätsbestimmung meteorologischer Beobachtungsreihen. Meteorologische Zeitschrift, 482–485.
Conrad V., 1944: Methods in Climatology. Harvard University Press, 228 p.
Domonkos, P., V. Venema, O. Mestre, 2013: Efficiencies of homogenisation methods: our present knowledge and its limitation. Proceedings of the Seventh seminar for homogenization and quality control in climatological databases, Budapest, Hungary, 24 – 28 October 2011, WMO report, Climate data and monitoring, WCDMP-No. 78, pp. 11-24.
Hann, J., 1880: Untersuchungen über die Regenverhältnisse von Österreich-Ungarn. II. Veränderlichkeit der Monats- und Jahresmengen. S.-B. Akad. Wiss. Wien.
Heidke P., 1923: Quantitative Begriffsbestimmung homogener Temperatur- und Niederschlagsreihen. Meteorologische Zeitschrift, 114-115.
Helmert F.R., 1907: Die Ausgleichrechnung nach der Methode der kleinsten Quadrate. 2. Auflage, Teubner Verlag.
Peterson T.C., D.R. Easterling, T.R. Karl, P. Groisman, N. Nicholls, N. Plummer, S. Torok, I. Auer, R. Boehm, D. Gullett, L. Vincent, R. Heino, H. Tuomenvirta, O. Mestre, T. Szentimrey, J. Salinger, E.J. Forland, I. Hanssen-Bauer, H. Alexandersson, P. Jones, D. Parker, 1998: Homogeneity adjustments of in situ atmospheric climate data: A review. Int. J. Climatol., 18, 1493-1517.
Hungarian Meteorological Service (HMS), 1996: Proceedings of the First Seminar for Homogenization of Surface Climatological Data, Budapest, Hungary, 6-12 October 1996, 44 p.
Kohler M.A., 1949: Double-mass analysis for testing the consistency of records and for making adjustments. Bull. Amer. Meteorol. Soc., 30: 188 – 189.
k.k. Hof- und Staatsdruckerei, 1873: Bericht über die Verhandlungen des internationalen Meteorologen-Congresses zu Wien, 2.-10. September 1873, Protokolle und Beilagen.
k.k. Zentralanstalt für Meteorologie und Geodynamik, 1906: Bericht über die internationale meteorologische Direktorenkonferenz in Innsbruck, September 1905. Anhang zum Jahrbuch 1905. k.k. Hof- und Staatsdruckerei.
Kreil K., 1854a: Mehrjährige Beobachtungen in Wien vom Jahre 1775 bis 1850. Jahrbücher der k.k. Central-Anstalt für Meteorologie und Erdmagnetismus. I. Band – Jg 1848 und 1849, 35-74.
Kreil K., 1854b: Mehrjährige Beobachtungen in Mailand vom Jahre 1763 bis 1850. Jahrbücher der k.k. Central-Anstalt für Meteorologie und Erdmagnetismus. I. Band – Jg 1848 und 1849, 75-114.
Mestre O., 1999: Step-by-step procedures for choosing a model with change-points. In Proceedings of the second seminar for homogenisation of surface climatological data, Budapest, Hungary, WCDMP-No.41, WMO-TD No.962, 15-26.
OMSZ, 2001: Third Seminar for Homogenization and Quality Control in climatological Databases, Budapest.
Szentimrey, T., 1999: Multiple Analysis of Series for Homogenization (MASH). Proceedings of the second seminar for homogenization of surface climatological data, Budapest, Hungary; WMO, WCDMP-No. 41, 27-46.
Venema, V., O. Mestre, E. Aguilar, I. Auer, J.A. Guijarro, P. Domonkos, G. Vertacnik, T. Szentimrey, P. Stepanek, P. Zahradnicek, J. Viarre, G. Müller-Westermeier, M. Lakatos, C.N. Williams,
M.J. Menne, R. Lindau, D. Rasol, E. Rustemeier, K. Kolokythas, T. Marinova, L. Andresen, F. Acquaotta, S. Fratianni, S. Cheval, M. Klancar, M. Brunetti, Ch. Gruber, M. Prohom Duran, T. Likso,
P. Esteban, Th. Brandsma. Benchmarking homogenization algorithms for monthly data. Climate of the Past, 8, pp. 89-115, doi: 10.5194/cp-8-89-2012, 2012. See also the introductory blog post and a post on the weaknesses of the study.
WMO, 1966: Climatic Change, Report of a working group of the Commission for Climatology. Technical Note 79, WMO – No. 195. TP.100, 79 p.
WMO 1999: Proceedings of the Second Seminar for Homogenization of Surface Climatological Data, Budapest, Hungary, 9 – 13 November 1998, 214 p.
WMO, 2004: Fourth Seminar for Homogenization and Quality Control in Climatological Databases, Budapest, Hungary, 6-10 October 2003, WCDMP-No 56, WMO-TD No. 1236, 243 p.
WMO, 2006: Proceedings of the Fifth Seminar for Homogenization and Quality Control in Climatological Databases, Budapest, Hungary, 29 May – 2 June 2006. Climate Data and Monitoring WCDMP- No 71, WMO/TD- No. 1493.
WMO, 2010: Proceedings of the Meeting of COST-ES0601 (HOME) Action, Management Committee and Working groups and Sixth Seminar for Homogenization and Quality Control in Climatological Databases, Budapest, Hungary, 26 – 30 May 2008, WMO reports on Climate Data and Monitoring, WCDMP-No. 76.

Monday, August 21, 2017

Germany weather service opens up its data

Some good news from Germany. The government has decided to make the data of the German weather service (DWD) freely available for all. This comes after a parliamentary hearing in April on a bill to make the data freely available. All but one expert at the hearing were positive about this change.

Data was mostly already free for research, but truly free data still helps science a lot. If data is only free for research, you have to sign a contract. For a global study that means 200 contracts (in the best case, where all countries offer this), in the local language, with hard-to-find contact persons, with different conditions each time, and often covering only a part of the data. If the data is truly free, you can download it automatically, create regional and global collections, enrich them with additional information, add value with data processing (homogenisation, quality control, extremes, etc.), and publish them for everyone to use. It would also make the data streams more transparent.

This move was aided by the resolution of the World Meteorological Organisation (WMO) calling on its members, the weather services, to free their data:
Strengthen their commitment to the free and unrestricted exchange of [Global Framework for Climate Services] GFCS relevant data and products;

Increase the volume of GFCS relevant data and products accessible to meet the needs for implementation of the GFCS and the requirements of the GFCS partners;
Unfortunately, there is still no legally binding requirement to share the data. The weather services cannot force their governments to do so, but the resolution makes it clear that governments refusing to open their data are hurting their people.




There is also a downside: the German weather service, [[Deutscher Wetterdienst]] (DWD), currently earns about 3.5 million Euro per year selling data. To put that in perspective, it is about 1 percent of their 305 million Euro budget. (The DWD earns about 20% of its budget itself and thus costs only about 3 Euro per citizen per year.)

Because of these earnings, many weather services are reluctant to open up their data; especially in poorer countries these earnings can be a considerable part of the budget. On the other hand, the benefits to society of open data are sure to be much higher, because more people and companies will actually use the data and because better data products can be produced. When it comes to climate data, I hope that the international climate negotiations can free the data in return for funding for the observational networks of poorer countries.

The main problem in Germany is, or optimistically was, the commercial weather services. They fear competition, both from the DWD itself and because free data lowers the barrier to entry for other companies to start offering better services. These companies were so successful that for a long time it was even forbidden for the DWD to publish its weather predictions on its homepage, predictions the DWD still had to make anyway because it is its job to warn of dangerous weather. That was an enormous destruction of value created with taxpayer money, just to create an artificial market for (often worse) weather predictions.

There is a similar problem where commercial media companies have succeeded in limiting the time that public broadcasting organisations can make their information available for watching/listening/download. This destruction of public capital is still ongoing.

Good that for weather and climate data common sense has won in Germany. Only a small number of countries have made their data fully open, but I have the impression that there is a trend. It would be great if someone would track this, if only to create more pressure to open the data holdings.

Related reading

Link to the DWD open data portal.

German parliament press office: Experts endorse free provision of weather service data. In German: Experten befürworten entgeltfreies Angebot der Wetterdienst-Wetterdaten.

DWD press release: Amendment to the Deutscher Wetterdienst Act in force since 25 July 2017. Tasks and responsibilities of Deutscher Wetterdienst updated to take account of today's environment.

Free our climate data - from Geneva to Paris.

Congress of the World Meteorological Organization, free our climate data.

Thursday, July 27, 2017

WMO Recognition of Long-Term Observing Stations

From the July 2017 newsletter of the WMO [World Meteorological Organization] Integrated Global Observing System (WIGOS). With some additional links & [clarifications].

Long-term meteorological observations are part of the irreplaceable cultural and scientific heritage of mankind that serves the needs of current and future generations for long-term high quality climate records. They are unique sources of past information about atmospheric parameters, and thus are references for climate variability and change assessments. To highlight this importance, WMO has a mechanism to recognize long-term observing stations. By so doing, the Organization promotes sustainable observational standards and best practices that facilitate the generation of high-quality time series data.

The initiative envisages maintaining long-term observing stations, including in particular stations with more than 100 years of observations (Centennial Stations), in support of climate applications (DRR [Disaster Risk Reduction Programme], GFCS [Global Framework for Climate Services], etc.) and research (climate assessment, climate adaptation, etc.). While acknowledging the efforts by Members to run and maintain appropriate observing systems, including long-term observing stations, it also acknowledges the existing and potential difficulties which Members’ NMHSs [National Meteorological and Hydrological Services; mostly national weather services] face due to their overall resource constraints and competing societal interests at the national level.

The mechanism involves close collaboration between the Commission for Climatology (CCl), the Commission for Basic Systems (CBS), the Commission for Instruments and Methods of Observations (CIMO), the Global Climate Observing System (GCOS) through an ad-hoc advisory board, as well as the WMO Members and the Secretariat. The 69th Session of WMO Executive Council (May 2017) recognized a first set of 60 long-term observing stations following an invitation letter from WMO Secretariat to Members to submit no more than three candidate stations. Further invitation letters will be released every second year to extend the list of WMO recognized long-term observing stations. The next call for the nomination of candidate stations will be issued in early 2018.

The recognition mechanism is based on recognition criteria that address the length, completeness and consistency of observations at a station, the availability of minimum station metadata, data rescue, WMO observing standards including siting classification, observational data quality control and the future of the observing station. A self-assessment template for recognition criteria compliance of individual observing stations has been developed for Members to submit candidate stations, which has to be filled in for each candidate station. After review by the above mentioned advisory board, a list of stations is tabled at Executive Council sessions for final decision. It is envisaged to renew the recognition of observing stations every ten years to ensure criteria compliance.

A special WMO Website has been implemented that provides information on the mechanism and lists candidate and recognized stations:

https://public.wmo.int/en/our-mandate/what-we-do/observations/long-term-observing-stations

Furthermore, the recognition will be reflected in the WIGOS station catalogues. It is also planned to design a certificate per recognized station as well as a metal plate for installation at the station site.

Monday, October 10, 2016

A stable global climate reference network


Historical climate data contains inhomogeneities, for example due to changes in the instrumentation or the surroundings. Removing these inhomogeneities to get more accurate estimates of how much the Earth has actually warmed is a really interesting problem. I love the statistical homogenization algorithms we use for this; I am a sucker for beautiful algorithms. As an observationalist it is great to see the historical instruments, read how scientists understood their measurements better and designed new instruments to avoid errors.

Still, for science it would be better if future climatologists had an easier task and could work with more accurate data. Let's design a climate-change-quality network that is as stable as we can humanly make it, to study the ongoing changes in the climate.

Especially now that the climate is changing, it is important to accurately predict the climate for the coming season, year, decade and beyond at a regional and local scale. That is information (local) governments, agriculture and industry need to plan, adapt, prepare and limit the societal damage of climate change.

Historian Sam White argues that the hardship of the Little Ice Age in Europe was not just about cold, but also about turbulent and unpredictable weather. In the coming century, too, much hardship can be avoided with better predictions. To improve decadal climate predictions of regional changes and to understand the changes in extreme weather we need much better measurements. For example, with a homogenized radiosonde dataset, the improvements in the German decadal prediction system became much clearer than with the old dataset.

We are performing a unique experiment with the climate system and the experiment is far from over. It would also be scientifically unpardonable not to measure this ongoing change as well as we can. If your measurements are more accurate, you can see new things. Methodological improvements that lead to smaller uncertainties are one of the main factors that bring science forward.



A first step towards building a global climate reference network is agreeing on a concept. This modest proposal for preventing inhomogeneities due to poor observations from being a burden to future climatologists is hopefully a starting point for this discussion. Many other scientists are thinking about this. More formally there are the Rapporteurs on Climate Observational Issues of the Commission for Climatology (CCl) of the World Meteorological Organization (WMO). One of their aims is to:
Advance specifications for Climate Reference Networks; produce a statement of guidance for creating climate observing networks or climate reference stations with aspects such as types of instruments, metadata, and siting;

Essential Climate Variables

A few weeks ago Han Dolman and colleagues wrote a call to action in Nature Geoscience titled "A post-Paris look at climate observations". They argue that while the political limits are defined for temperature, we need climate-quality observations for all essential climate variables listed in the table below.
We need continuous and systematic climate observations of a well-thought-out set of indicators to monitor the targets of the Paris Agreement, and the data must be made available to all interested users.
I agree that we should measure much more than just temperature. It is quite a list, but we need that to understand the changes in the climate system and to monitor the changes in the atmosphere, oceans, soil and biology we will need to adapt to. Not in this list, but important, are biological changes; ecology especially needs support for long-term observational programs, because it lacks the institutional support that the national weather services provide on the physical side.

Measuring multiple variables also helps in understanding measurement uncertainties. For instance, in the case of temperature measurements, additional observations of insolation, wind speed, precipitation, soil temperature and albedo are helpful. The US Climate Reference Network measures this wind speed at the height of the instrument (and of humans) rather than at the meteorologically typical height of 10 meters.

Because of my work, I am mainly thinking of the land surface stations, but we need a network for many more observations. Please let me know where the ideas do not fit the other climate variables.

Table. List of the Essential Climate Variables; see original for footnotes.
Domain: Atmospheric (over land, sea and ice)
Surface: Air temperature, Wind speed and direction, Water vapour, Pressure, Precipitation, Surface radiation budget.
Upper-air: Temperature, Wind speed and direction, Water vapour, Cloud properties, Earth radiation budget (including solar irradiance).
Composition: Carbon dioxide, Methane, and other long-lived greenhouse gases, Ozone and Aerosol, supported by their precursors.

Domain: Oceanic
Surface: Sea-surface temperature, Sea-surface salinity, Sea level, Sea state, Sea ice, Surface current, Ocean colour, Carbon dioxide partial pressure, Ocean acidity, Phytoplankton.
Sub-surface: Temperature, Salinity, Current, Nutrients, Carbon dioxide partial pressure, Ocean acidity, Oxygen, Tracers.

Domain: Terrestrial
River discharge, Water use, Groundwater, Lakes, Snow cover, Glaciers and ice caps, Ice sheets, Permafrost, Albedo, Land cover (including vegetation type), Fraction of absorbed photosynthetically active radiation, Leaf area index, Above-ground biomass, Soil carbon, Fire disturbance, Soil moisture.

Comparable networks

There are comparable networks and initiatives, which likely shape how people think about a global climate reference network. Let me thus describe how they fit into the concept and where they are different.

There is the Global Climate Observing System (GCOS), which is mainly an undertaking of the World Meteorological Organization (WMO) and the Intergovernmental Oceanographic Commission (IOC). They observe the entire climate system; the idea of the above list of essential climate variables comes from them (Bojinski and colleagues, 2014). GCOS and its member organizations are important for the coordination of the observations, for setting standards so that measurements can be compared, and for defending the most important observational capabilities against government budget cuts.

Especially important from a climatological perspective is a new program to ask governments to recognize centennial stations as part of the world heritage. If such long series are stopped or the station is forced to move, a unique source of information is destroyed or damaged forever. That is comparable to destroying ancient monuments.



A subset of the meteorological stations is designated as the GCOS Surface Network, measuring temperature and precipitation. These stations have been selected for their length and quality and to cover all regions of the Earth. Their monthly data is automatically transferred to global databases.

National weather services normally take good care of their GCOS stations, but a global reference network would have much higher standards and also provide data at a better temporal resolution than monthly averages, to be able to study changes in extreme weather and weather variability.



There is already a global radiosonde reference network, the GCOS Reference Upper-Air Network (GRUAN, Immler and colleagues, 2010). This network provides measurements with well characterized uncertainties, and extensive parallel measurements are made when a site transitions from one radiosonde design to the next. No proprietary software is used, to make sure it is known exactly what happened to the data.

Currently they have about 10 sites, a similar number is on the list to be certified, and the plan is to make this a network of about 30 to 40 stations; see map below. Partners to start a site in South America would be especially welcome.



The observational system for the ocean, Argo, is, as far as I can see, similar to GRUAN. It measures temperature and salinity (Roemmich and colleagues, 2009). If your floats meet the specifications of Argo, you can participate. Compared to land stations the measurement environment is wonderfully uniform. The instruments typically work for a few years. Their life span is thus between that of a weather station and that of a one-way radiosonde ascent. This means the instruments may deteriorate somewhat during their lifetime, but maintenance problems are more important for weather stations.

A wonderful explanation of how Argo works, for kids:


Argo has almost four thousand floats. They are working on a network with spherical floats that can go deeper.



Finally there are a number of climate reference networks of land climate stations. The best known is probably the US Climate Reference Network (USCRN, Diamond and colleagues, 2013). It has 131 stations. Every station has 3 identical high-quality instruments, so that measurement problems can be detected and the outlier attributed to a specific instrument. To find these problems quickly, all data is relayed online and checked at their main office. Regular inspections are performed and everything is well documented.



The USCRN selected new locations for its stations, which are expected to be free of human changes to the surroundings in the coming decades. This way it takes some time until the data becomes climatologically interesting, but the stations can already be compared with the normal network, and this gives some confidence that its homogenized data is okay for the national mean; see below. The number of stations was sufficient to compute a national average in 2005/2006.



Other countries, such as Germany and the United Kingdom, have opted to make existing stations into a national climate reference network. The UK Reference Climatological Stations (RCS) have a long observational record spanning at least 30 years and their distribution aims to be representative of the major climatological areas, while the locations are unaffected by environmental changes such as urbanisation.


The German Climate Reference Station founded in 1781 on the mountain Hohenpeißenberg in Bavaria. The kind of weather station photo WUWT does not dare to show.
In Germany the climate reference network consists of existing stations with a very long history. Originally these were the stations where conventional manual observations continued. Unfortunately, they will now also switch to automatic observations; fortunately, only after making a long parallel measurement to see what this does to the climate record*.

An Indian scientist has proposed an Indian Climate Reference Network of about 110 stations (Jain, 2015). His focus is on precipitation observations. While temperature is a good way to keep track of the changes, most of the impacts are likely due to changes in the water cycle and storms. Precipitation measurements have large errors; it is very hard to make precipitation measurements with an error below 5%. When these errors change, that produces important inhomogeneities. Such jumps in precipitation data are hard to remove with relative statistical homogenization because the correlations between stations are low. If there is one meteorological parameter for which we need a reference network, it is precipitation.
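The low neighbor correlations are the crux: relative homogenization looks for a break in the difference between a candidate and a neighboring reference series, and the weaker the shared regional signal, the more the local noise drowns the jump. A toy simulation can sketch the effect; all numbers below are invented for illustration, and the simple mixing of a shared and a local component is only a caricature of real station cross-correlations.

```python
import random

random.seed(1)
n = 200  # years of annual data

# Regional climate signal shared by neighboring stations.
regional = [random.gauss(0, 1.0) for _ in range(n)]

def station(shared, break_size=0.0):
    """Simulated series: a shared regional part, local noise, and an
    optional inhomogeneity (break) halfway through the record."""
    return [shared * regional[i]
            + (1.0 - shared) * random.gauss(0, 1.0)
            + (break_size if i >= n // 2 else 0.0)
            for i in range(n)]

def break_snr(shared):
    """Size of a 1.0 break in the candidate-minus-neighbor difference
    series, relative to the spread of that difference series."""
    candidate = station(shared, break_size=1.0)
    neighbor = station(shared)  # homogeneous neighbor
    diff = [c - r for c, r in zip(candidate, neighbor)]
    step = sum(diff[n // 2:]) / (n // 2) - sum(diff[:n // 2]) / (n // 2)
    mean = sum(diff) / n
    spread = (sum((d - mean) ** 2 for d in diff) / n) ** 0.5
    return step / spread

snr_high = break_snr(0.9)  # temperature-like: well-correlated neighbors
snr_low = break_snr(0.3)   # precipitation-like: weakly correlated neighbors
print(round(snr_high, 2), round(snr_low, 2))
```

The shared regional part cancels in the difference series, so with highly correlated neighbors the same 1.0 break stands out clearly, while with weakly correlated neighbors it is barely distinguishable from the noise.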

Network of networks

For a surface station Global Climate Reference Network, the current US Climate Reference Network is a good template when it comes to the quality of the instrumentation, management and documentation.

A Global Climate Reference Network does not have to do the heavy lifting all alone. I would see it as the temporally stable backbone of the much larger climate observing system. We still have all the other observations that help to make sampling errors smaller and provide the regional information you need to study how energy and mass moves through the climate system (natural variability).

We should combine them in a smart way to benefit from the strengths of all networks.



The Global Climate Reference Network does not have to be large. If the aim is to compute a global mean temperature signal, we would need just as many samples as we would need to compute the US mean temperature signal. This is in the order of 100 stations. Thus on average, every country in the world would have one climate reference station.

The figure on the right from Jones (1994) compares the temperature signal from 172 selected stations (109 in the Northern Hemisphere, 63 in the Southern Hemisphere) with the temperature signal computed from all available stations. There is nearly no difference, especially with respect to the long-term trend.

Callendar (1961) used only 80 stations, but his temperature reconstruction fits quite well with the modern reconstructions (Hawkins and Jones, 2013).
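A minimal Monte-Carlo sketch makes the point that around 100 stations suffice for a global mean trend. The data is synthetic (a common signal of about 1 °C per century plus independent station noise; real stations are spatially correlated, so this is purely illustrative), and all numbers are invented:

```python
import random

random.seed(42)

n_stations = 3000  # hypothetical "all available stations" network
n_years = 100

# Shared global signal: about 1 degree C warming over the century.
signal = [0.01 * year for year in range(n_years)]

# Each station sees the shared signal plus independent local noise.
stations = []
for _ in range(n_stations):
    noise_scale = 0.5  # degrees C of year-to-year local variability
    stations.append([s + random.gauss(0, noise_scale) for s in signal])

def network_mean(subset):
    """Average the station series year by year."""
    return [sum(st[y] for st in subset) / len(subset)
            for y in range(n_years)]

all_mean = network_mean(stations)
sample_mean = network_mean(random.sample(stations, 100))

# Century trend (last year minus first year) from 100 stations
# versus the full network.
trend_all = all_mean[-1] - all_mean[0]
trend_100 = sample_mean[-1] - sample_mean[0]
print(round(trend_all, 2), round(trend_100, 2))
```

Both estimates land close to the built-in 1 °C signal: averaging over 100 stations already suppresses the local noise by a factor of ten.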

Beyond the global means

The number of samples/stations can be modest, but it is important that all climate regions of the world are sampled; some regions warm/change faster than others. It probably makes sense to have more stations in especially vulnerable regions, such as mountains, Greenland, Antarctica. We really need a stable network of buoys in the Arctic, where changes are fast and these changes also influence the weather in the mid-latitudes.


Crew members and scientists from the US Coast Guard icebreaker Healy haul a buoy across the sea ice during a deployment. In the lead a polar bear watcher and a rescue swimmer.
To study changes in precipitation we probably need more stations. Rare events contribute a lot to the mean precipitation rate. The threshold to get into the news seems to be a month's rain sum falling on one day. Enormous downpours below that level are not even newsworthy. This makes precipitation data noisy.

To study changes in extreme events we need more samples and might need more stations as well. How much more depends on how strong the synergy between the reference network and the other networks is and thus how much the other networks could then be used to produce more samples. That question needs some computational work.

The idea of using 3 redundant instruments in the USCRN is something we should also use in the GCRN, and I would propose to also create clusters of 3 stations. That would make it possible to detect and correct inhomogeneities by making comparisons. Even in a reference network there may still be inhomogeneities due to (unnoticed) changes in the surroundings or management.
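With triple redundancy, the attribution itself is simple: compare each reading with the median of the three, and the instrument that strays is the suspect. A hypothetical sketch (the `tolerance` value is invented; a real network would derive it from the characterized instrument uncertainties):

```python
def flag_outlier(readings, tolerance=0.3):
    """Given three simultaneous readings from identical instruments,
    return the index of a suspect instrument, or None if all agree.

    tolerance is a hypothetical threshold in degrees C; a real network
    would derive it from the characterized instrument uncertainties.
    """
    median = sorted(readings)[1]
    deviations = [abs(r - median) for r in readings]
    worst = max(range(3), key=lambda i: deviations[i])
    return worst if deviations[worst] > tolerance else None

# Instrument 2 drifted warm; the other two still agree.
print(flag_outlier([12.1, 12.2, 13.5]))  # → 2
print(flag_outlier([12.1, 12.2, 12.2]))  # → None
```

The same median-comparison logic would work one level up for a cluster of 3 stations, with a larger tolerance to allow for real small-scale climate differences between the sites.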


We should also carefully study whether it might be a problem to use only pristine locations. That could mean that the network is no longer representative of the entire world. We should probably include stations in agricultural regions: they cover a large part of the surface and may respond differently from natural regions. But agricultural practices (irrigation, plant types) will change.

Starting a new network at pristine locations has the disadvantage that it takes time until the network becomes valuable for climate change research. Thus I understand why Germany and the UK have opted to use locations where there are already long historical observations. Because we only need 100+ stations, it may be possible to select, from the 30 thousand existing stations, locations that are pristine and will likely stay pristine in the coming century. If not, I would not compromise and would use a new pristine location for the reference network.

Finally, when it comes to the number of stations, we probably have to take into account that, no matter how hard we try, some stations will become unsuitable due to war, land-use change and many other unforeseen problems. Just look back a century and consider all the changes we experienced; the network should be robust against such changes for the next century.

Absolute values or changes

Argo (ocean) and GRUAN (upper air) do not specify the instruments, but set specifications for the measurement uncertainties and their characterization. Instruments may thus change, and this change has to be managed. In the case of GRUAN they perform many launches with multiple instruments.

For a climate reference land station I would prefer to keep the instrument design exactly the same for the coming century.

To study changes in the climate, climatologists look at the local changes (compute anomalies) and average those. We have seen a temperature increase of about 1°C since 1900 and are confident it is warming, even though the uncertainty in the average absolute temperature is of the same order of magnitude. Determining changes directly is easier than first estimating the absolute level and then looking at whether it changes. By keeping the instruments the same, you can study changes more easily.
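A minimal illustration of why anomalies are more forgiving than absolutes: a constant station bias, however large, drops out when each series is expressed relative to its own baseline. The station values below are invented for the sketch:

```python
# Two hypothetical stations seeing the same regional warming but with
# different unknown absolute biases (siting, screen type, calibration).
years = list(range(1991, 2021))
true_change = [0.02 * (y - years[0]) for y in years]  # ~0.6 C over 30 years

station_a = [14.0 + c for c in true_change]  # absolute level about 14.0 C
station_b = [12.5 + c for c in true_change]  # absolute level about 12.5 C

def anomalies(series, baseline_years=10):
    # Express each value relative to the station's own early-period mean.
    base = sum(series[:baseline_years]) / baseline_years
    return [v - base for v in series]

# The absolute difference keeps the full 1.5 C bias; in the anomaly
# difference the bias has dropped out entirely.
print(round(station_a[-1] - station_b[-1], 6))  # → 1.5
print(round(abs(anomalies(station_a)[-1] - anomalies(station_b)[-1]), 6))
```

Keeping the instruments physically identical is what justifies the assumption behind this trick: that each station's bias really is (near) constant in time.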


This is an extreme example, but how much thermometer screens weather and yellow before they are replaced depends on the material (and the climate). Even if we have better materials in the future, we had better keep the material the same for stable measurements.
For GRUAN, managing the change can solve most problems. Upper-air measurements are hard; the sun is strong, the air is thin (bad ventilation) and clouds and rain make the instruments wet. Because the instruments are only used once, they cannot be too expensive. On the other hand, starting each time with a freshly calibrated instrument makes the characterization of the uncertainties easier. Parallel measurements to manage changes are also likely more reliable up in the air than at the surface, where two instruments measuring side by side can legitimately measure a somewhat different climate, especially for precipitation, where undercatchment strongly depends on the local wind, or for temperature, when cold air flows hug the orography at night.

Furthermore, land observations are used to study changes in extreme weather, not just the mean state of the atmosphere. The uncertainty of the rain rate depends on the rain rate itself. Strongly. Even in the laboratory, and likely more so outside, where the influence factors (wind, precipitation type) also depend on the rain rate. I see no way to keep undercatchment the same without at least specifying the outside geometry of the gauge and wind shield in minute detail.

The situation for temperature may be less difficult with high-quality instruments, but is similar. When it comes to extremes, the response time (better: response function) of the instruments also becomes important, as does how much downtime the instrument experiences, which is often related to severe weather. It will be difficult to design new instruments that have the same response functions and the same errors over the full range of values. It will also be difficult to characterize the uncertainties over the full range of values and rates of change.

Furthermore, the instruments of a land station are used for a long time while not being observed. Thus weather, flora, fauna and humans become error sources. Instruments which have the same specifications in the laboratory may thus still perform differently in the field. Rain gauges may be more or less prone to getting clogged by snow or insects, more or less attractive for drunks to pee in. Temperature screens may be more or less prone to be blocked by icing or for bees to build their nest in. Weather stations may be more or less attractive to curious polar bears.

This is not a black and white situation. Which route to prefer will depend on the quality of the instruments. In the extreme case of an error-free measurement, there is no problem with replacing it with another error-free instrument. Metrologists in the UK are building an instrument that acoustically measures the temperature of the air without needing a thermometer, which ideally would have the temperature of the air, but in practice never does. If new instruments are really a lot better in 50 years and we exchange them after 2 or 3 generations, that would still be a huge improvement over the current situation with an inhomogeneity every 15 to 20 years.



The software of GRUAN is all open source, so that when we understand the errors better in the future, we know exactly what we did and can improve the estimates. If we specify the instruments, that would mean we need open hardware as well. The designs would need to be open and specified in detail. Simple materials should be used, to be sure we can still obtain them in 2100. An instrument measuring humidity using the dew point of a mirror will be easier to build in 2100 than one using a special polymer film. Such instruments can still be built by the usual companies.

If we keep the instrumentation of the reference network the same, the normal climate network, the GCOS network, will likely have better equipment in 2100. We will discover many ways to make more accurate observations, to cut costs and to make the management easier. There is no way to stop progress for the entire network, which in 2100 may well have over 100 thousand stations. But I hope we can stop progress for a very small climate reference network of just 100 to 200 stations. We should not see the reference network as the top of the hierarchy, but as the stable backbone that complements the other observations.

Organization

How do we make this happen? First the scientific community should agree on a concept and show how much the reference network would improve our understanding of the climatic changes in the 21st century. Hopefully this post is a step in this direction and there is an article in the works. Please add your thoughts in the comments.

With on average one reference station per country, it would be very inefficient if every country managed its own station. Keeping to the high metrological and documentation standards is an enormous task. Given that the network would be the same size as the USCRN, the GCRN could in principle be managed by one global organization, like the USCRN is managed by NOAA. It would, however, probably be more practical to have regional organizations, for better communication with the national weather services and to reduce travel costs for maintenance and inspections.

Funding


The funding of a reference network should be additional funding. Otherwise it will be a long, hard struggle in every country involved to build a reference station. In developing countries the maintenance of one reference station may well exceed the budget of their current network. We already see that some meteorologists fear that the centennial stations program will hurt the rest of the observational network. Without additional funding, there will likely be quite some opposition and friction.

In the Paris climate treaty, the countries of the world have already pledged to support climate science to reduce costs and damages. We need to know how close we are to the 2°C limit as feedback to the political process, and we need information on all the other changes as well to assess the damages from climate change. Compared to the economic consequences of these decisions, the costs of a climate reference network are peanuts.

Thus my suggestion would be to ask the global climate negotiators to provide the necessary funding. If we go there, we should also ask the politicians to agree on the international sharing of all climate data. Restrictions on access to data are holding climate research and climate services back, and both are necessary to plan adaptation and to limit damages.

The World Meteorological Organization had its congress last year. The directors of the national weather services have shown that they are not able to agree on the international sharing of data. For weather services selling data is often a large part of their budget. Thus the decision to share data internationally should be made by politicians who have the discretion to compensate these losses. In the light of the historical responsibility of the rich countries, I feel a global fund to support the meteorological networks in poor countries would be just. This would compensate them for the losses in data sales and would allow them to better protect themselves against severe weather and climate conditions.

Let's make sure that future climatologists can study the climate in much more detail.

Think of the children.


Related information

Hillary Rosner in the NYT on the global greenhouse gas reference network: The Climate Lab That Sits Empty

Free our climate data - from Geneva to Paris

Congress of the World Meteorological Organization, free our climate data

Climate History Podcast with Dr. Sam White mainly on the little ice age

A post-Paris look at climate observations. Nature Geoscience (manuscript)

Why raw temperatures show too little global warming

References

Bojinski, Stephan, Michel Verstraete, Thomas C. Peterson, Carolin Richter, Adrian Simmons and Michael Zemp, 2014: The Concept of Essential Climate Variables in Support of Climate Research, Applications, and Policy. Bulletin of the American Meteorological Society, doi: 10.1175/BAMS-D-13-00047.1.

Callendar, Guy S., 1961: Temperature fluctuations and trends over the earth. Quarterly Journal Royal Meteorological Society, 87, pp. 1–12. doi: 10.1002/qj.49708737102.

Diamond, Howard J., Thomas R. Karl, Michael A. Palecki, C. Bruce Baker, Jesse E. Bell, Ronald D. Leeper, David R. Easterling, Jay H. Lawrimore, Tilden P. Meyers, Michael R. Helfert, Grant Goodge, Peter W. Thorne, 2013: U.S. Climate Reference Network after One Decade of Operations: Status and Assessment. Bulletin of the American Meteorological Society, doi: 10.1175/BAMS-D-12-00170.1.

Dolman, A. Johannes, Alan Belward, Stephen Briggs, Mark Dowell, Simon Eggleston, Katherine Hill, Carolin Richter and Adrian Simmons, 2016: A post-Paris look at climate observations. Nature Geoscience, 9, September, doi: 10.1038/ngeo2785. (manuscript)

Hawkins, Ed and Jones, Phil. D. 2013: On increasing global temperatures: 75 years after Callendar. Quarterly Journal Royal Meteorological Society, 139, pp. 1961–1963, doi: 10.1002/qj.2178.

Immler, F.J., J. Dykema, T. Gardiner, D.N. Whiteman, P.W. Thorne, and H. Vömel, 2010: Reference Quality Upper-Air Measurements: guidance for developing GRUAN data products. Atmospheric Measurement Techniques, 3, pp. 1217–1231, doi: 10.5194/amt-3-1217-2010.

Jain, Sharad Kumar, 2015: Reference Climate and Water Data Networks for India. Journal of Hydrologic Engineering, doi: 10.1061/(ASCE)HE.1943-5584.0001170. (Manuscript)

Jones, Phil D., 1994: Hemispheric Surface Air Temperature Variations: A Reanalysis and an Update to 1993. Journal of Climate, doi: 10.1175/1520-0442(1994)007<1794:HSATVA>2.0.CO;2.

Pattantyús-Ábrahám, Margit and Wolfgang Steinbrecht, 2015: Temperature Trends over Germany from Homogenized Radiosonde Data. Journal of Climate, doi: 10.1175/JCLI-D-14-00814.1.

Roemmich, D., G.C. Johnson, S. Riser, R. Davis, J. Gilson, W.B. Owens, S.L. Garzoli, C. Schmid, and M. Ignaszewski, 2009: The Argo Program: Observing the global ocean with profiling floats. Oceanography, 22, p. 34–43, doi: 10.5670/oceanog.2009.36.

* The transition to automatic weather stations in Germany happened to have almost no influence on the annual means, contrary to what Klaus Hager and the German mitigation sceptical blog propagandise based on badly maltreated data.

** The idea to illustrate the importance of smaller uncertainties by showing two resolutions of the same photo comes from metrologist Michael de Podesta.

Saturday, June 13, 2015

Free our climate data - from Geneva to Paris

Royal Air Force- Italy, the Balkans and South-east Europe, 1942-1945. CNA1969

Neglecting to monitor the harm done to nature and the environmental impact of our decisions is only the most striking sign of a disregard for the message contained in the structures of nature itself.
Pope Francis

The 17th Congress of the World Meteorological Organization in Geneva ended today. After countless hours of discussion they managed to pass an almost completely rewritten resolution on sharing climate data in the last hour.

The glass is half full. On the one hand, the resolution clearly states the importance of sharing data. It demonstrates that it is important to help humanity cope with climate change by making it part of the global framework for climate services (GFCS), which is there to help all nations to adapt to climate change.

The resolution considers and recognises:
The fundamental importance of the free and unrestricted exchange of GFCS relevant data and products among WMO Members to facilitate the implementation of the GFCS and to enable society to manage better the risks and opportunities arising from climate variability and change, especially for those who are most vulnerable to climate-related hazards...

That increased availability of, and access to, GFCS relevant data, especially in data sparse regions, can lead to better quality and will create a greater variety of products and services...

Indeed free and unrestricted access to data can and does facilitate innovation and the discovery of new ways to use, and purposes for, the data.
On the other hand, if a country wants to, it can still refuse to share the most important datasets: the historical station observations. Many datasets will be shared: satellite data and products, ocean and cryosphere (ice) observations, and measurements of the composition of the atmosphere (including aerosols). However, information on streamflow and lakes and most of the climate station data are exempt.

The resolution does urge Members to:
Strengthen their commitment to the free and unrestricted exchange of GFCS relevant data and products;

Increase the volume of GFCS relevant data and products accessible to meet the needs for implementation of the GFCS and the requirements of the GFCS partners;
But there is no requirement to do so.

The most positive development is not on paper. Data sharing may well have been the main discussion topic among the directors of the national weather services at the Congress. They got the message that many of them find this important and they are likely to prioritise data sharing in future. I am grateful to the people at the WMO Congress who made this happen, you know who you are. Some directors really wanted to have a strong resolution as justification towards their governments to open up the databases. There is already a trend towards more and more countries opening up their archives, not only of climate data, but going towards open governance. Thus I am confident that many more countries will follow this trend after this Congress.

Also good about the resolution is that WMO will start monitoring data availability and data policies. This will make visible how many countries are already taking the high road and speed up the opening of the datasets. The resolution requests WMO to:
Monitor the implementation of policies and practices of this Resolution and, if necessary, make proposals in this respect to the Eighteenth World Meteorological Congress;
In a nice twist, the WMO calls the data to be shared "GFCS data", basically saying: if you do not share climate data, you are responsible for the national damages of climatic changes you could have adapted to, and for the failed adaptation investments. The term "GFCS data" does, however, understate how important this data is for basic climate research. Research that is needed to guide expensive political decisions on mitigation and, in the end, again on adaptation and, ever more likely, geo-engineering.

If I may repeat myself, we really need all the data we can get for an accurate assessment of climatic changes, a few stations will not do:
To reduce the influence of measurement errors and non-climatic changes (inhomogeneities) on our (trend) assessments we need dense networks. These errors are detected and corrected by comparing each station to its neighbours. The closer the neighbours are, the more accurately we can assess the real climatic changes. This is especially important when it comes to changes in severe and extreme weather, where the removal of non-climatic changes is very challenging.
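The idea of comparing a station to its neighbours can be sketched in a few lines of code. This is a minimal illustration with synthetic data, not the operational homogenization algorithm: neighbouring stations share the regional climate signal, so subtracting a neighbour composite from the candidate removes that common signal and leaves any non-climatic jump visible. The station setup and the simple maximum-shift break test below are my own illustrative assumptions.

```python
# Minimal sketch of relative homogenization: detect a non-climatic jump
# in a candidate station by comparing it to the mean of its neighbours.
# All data here is synthetic; the break test is a deliberately simple
# mean-shift search, not an operational method such as SNHT.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1950, 2020)
# Shared regional climate: a slow warming trend plus year-to-year weather.
regional_climate = 0.01 * (years - years[0]) + rng.normal(0, 0.3, years.size)

# Five neighbour stations see the same regional climate plus local noise.
neighbours = np.stack([regional_climate + rng.normal(0, 0.2, years.size)
                       for _ in range(5)])

# Candidate station: same climate, but an artificial 0.8 degree jump in
# 1985 (think of a relocation of the thermometer).
candidate = regional_climate + rng.normal(0, 0.2, years.size)
candidate[years >= 1985] += 0.8

# The difference series removes the common climate signal; only the
# non-climatic break remains (plus noise).
diff = candidate - neighbours.mean(axis=0)

def detect_break(d):
    """Return the index and size of the largest mean shift in d."""
    best_k, best_shift = None, 0.0
    for k in range(5, d.size - 5):  # require a few years on each side
        shift = abs(d[k:].mean() - d[:k].mean())
        if shift > best_shift:
            best_k, best_shift = k, shift
    return best_k, best_shift

k, shift = detect_break(diff)
print(f"Break detected around {years[k]}, size roughly {shift:.2f} degrees")
```

The closer and more numerous the neighbours, the smaller the noise in the difference series, and the smaller the breaks that can be reliably detected; that is why dense networks matter so much for trend assessments.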
The problem, as so often, is mainly money. Weather services earn some revenue from selling climate data. This revenue cannot be large compared to the impacts of climate change or to the investments needed to adapt, but relative to the budget of a weather service, especially in poorer countries, it does make a difference. At the very least, the weather services will have to ask their governments for permission.

Thus we will probably have to up our game. The mandate of the weather services is not enough, we need to make clear to the governments of this world that sharing climate data is of huge benefit to every single country. Compared to the costs of climate change this is a no-brainer. Don Henry writes that "[The G7] also said they would continue efforts to provide US$100 billion a year by 2020 to support developing countries' own climate actions." The revenues from selling climate data are irrelevant compared to that number.

As it happens, a large political climate summit is coming up: COP21 in Paris in December. This week there was a preparatory meeting in Bonn to work on the text of the climate treaty. The current proposal already contains optional text about climate research:
[Industrialised countries] and those Parties [nations] in a position to do so shall support the [Least Developed Countries] in the implementation of national adaptation plans and the development of additional activities under the [Least Developed Countries] work programme, including the development of institutional capacity by establishing regional institutions to respond to adaptation needs and strengthen climate-related research and systematic observation for climate data collection, archiving, analysis and modelling.
An earlier climate treaty (COP4 from 1998) already speaks about the exchange of climate data (FCCC/CP/1998/16/Add.1):
Urges Parties to undertake free and unrestricted exchange of data to meet the needs of the Convention, recognizing the various policies on data exchange of relevant international and intergovernmental organizations;
"Urges" is not enough, but that is a basis that could be reinforced. With the kind of money COP21 is dealing with it should be easy to support weather services of less wealthy countries to improve their observation systems and make the data freely available. That would be an enormous win-win situation.

To make this happen, we probably need to show that the climate science community stands behind it. We would need a group of distinguished climate scientists from as many countries as possible to support a "petition" requesting better measurements in data-sparse regions and free and unrestricted data sharing.

To get heard, we would probably also need to write articles for the national newspapers; to get published, these would again have to be written by well-known scientists. To get attention, it would also be great if many climate blogs wrote about the action on the same day.

Maybe we could make this work. My impression was already that basically everyone in the climate science community finds the free exchange of climate data very important and sees the current situation as a major impediment to better climate research. After last week's article on data sharing the response was enormous and exclusively positive. This may have been the first time that a blog post of mine that did not respond to something in the press got over 1000 views. It was certainly my first tweet that got over 13 thousand views and 100 retweets:


This action of my little homogenization blog was even at the top of the Twitter page on the Congress of the WMO (#MeteoWorld), right next to the photo of the newly elected WMO Secretary-General Petteri Taalas.



With all this internet enthusiasm and the dedication of the people fighting for free data at the WMO, and likely many more outside of the WMO, we may be able to make this work. If you would like to stay informed, please fill in the form below or write to me. If enough people show interest, I feel we should try. I do not have the time either, but this is important.

Related reading

Congress of the World Meteorological Organization, free our climate data

Why raw temperatures show too little global warming

Everything you need to know about the Paris climate summit and UN talks

Bonn climate summit brings us slowly closer to a global deal by Don Henry (Public Policy Fellow, Melbourne Sustainable Society Institute at University of Melbourne) at The Conversation.

Free climate data action promoted in Italian. Thank you Sylvie Coyaud.

If my Italian, that is, Google Translate, serves me well, this post asks the Pope to include the sharing of climate data in his encyclical. Weather data is a common good.


* Photo at the top: By Royal Air Force official photographer [Public domain], via Wikimedia Commons