
Thursday, 19 September 2019

European Meteorological Society Meeting highlights on station data quality and communication #EMS2019

Last week I was at the Annual Meeting of the European Meteorological Society in Copenhagen, Denmark. Here are the highlights for station data (quality) and communication.

Warming in Svalbard

Øyvind Nordli and colleagues estimated the warming on the Arctic island of Svalbard/Spitsbergen; see the figure below. They use the linear red line to estimate the total warming and arrive at 3.8°C. I would say it warmed a whopping 6°C (11°F). The graph itself already shows why such a linear-trend-based estimate underestimates the total warming.
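To illustrate the point with made-up numbers (a minimal sketch, not the Nordli et al. analysis): when warming accelerates, a straight-line fit cannot follow the late rise, so its start-to-end rise understates the total change.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2019)
x = (years - years[0]) / (years[-1] - years[0])
true_signal = 6.0 * x**3  # hypothetical accelerating warming, 6 °C in total
temperature = true_signal + rng.normal(0.0, 0.5, years.size)  # plus weather noise

# Total warming estimated as (linear trend) x (length of the period).
slope = np.polyfit(years, temperature, 1)[0]
trend_estimate = slope * (years[-1] - years[0])

print(f"true total warming:    {true_signal[-1] - true_signal[0]:.1f} °C")
print(f"linear-trend estimate: {trend_estimate:.1f} °C")  # about 5.4 °C
# The regression line undershoots the recent rapid warming, so the
# trend-based estimate lies systematically below the true total change.
```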

The monthly data was already published in 2014. At that time I would have called it 5°C of warming; recent years were very warm.

They put a lot of work into the homogenization; they even made modern parallel measurements to estimate the effect of past relocations of the station. The next paper, almost published, is on the daily data, so that we can study changes in the number of growing, freezing or melting days.



Warming in the tropical hot spot

There is a small region high up in the air in the tropics that is dear to many climate "skeptics": the tropical hot spot. It is one of the coldest places on Earth and it warms strongly when the world is warming (for any reason). Because some observations do not show as much warming there, climate "skeptics" have declared this region to be the arbiter of climate truth, and these observations and satellite estimates to be the best we have and the most informative about the changes of our climate.


The warming in a GISS model equilibrium run for a 2% increase in solar forcing, showing a maximum between 20°N and 20°S around 300 mb (10 km).

Back to reality: it is really hard to make good measurements of such a cold place starting from such a tropically warm place. The thermometer needs to be reliable over about 100°C of range. That is a lot. It is not easy to launch a weather balloon to such heights and cold; the balloon expands enormously on the way up. And the countries making these measurements are among the poorest on Earth.

What I had not realized is how few weather balloons make it to such heights. A poster by Souleymane Sy showed this; see the figure below. For trend estimates the sharp drop-off above the 300 mb pressure level is especially worrying. Changes in this drop-off level due to changes in equipment can easily lead to changes in the estimated temperature. There is a part of the tropical hot spot below 300 mb; that would be the part I would prioritize in trend estimates.


Number of radiosonde stations recording at least a given percentage of temperature and relative humidity monthly data at mandatory pressure levels from 1978 to the present for the Tropics (20°N to 20°S).

Weather forecasts in America and Europe

Communication at the EMS mostly means presenting the daily TV weather forecasts. There was a lovely difference between the American and European presenters. The Americans explained how to dumb down your forecast as much as possible: a study found that most high school students in Alabama could not find their county on a map of Alabama, so the advice is to put a city name next to every number on the map. The Europeans presented their educational work.

Our Irish friends had made three one-hour shows about the weather on consecutive days between 7 and 8pm, when normally the soaps are running; light information from a botanical garden with a small audience.

German weather presenter Karsten Schwanke got a prize for his educational weather forecasts, which add information on climate change; for example, in the case of hurricane Dorian, showing the increase in sea surface temperature. For Schwanke, providing context is the main task of TV weather; the local numbers are available from a weather app.


Karsten Schwanke explains the relationship between the jet stream, wildfires and the drought in Europe. In German.

An increasing problem is fake weather forecasts. Amateurs who can make a decent map are often seen as reliable sources, which can be dangerous in the case of severe weather.

American weathercaster Jay Trobec reported that in America it is common to have weather information three times during a news block: at the start, in the middle and at the end. In Europe you just get the weather at the end. In America the weather is live, with a presenter explaining that everyone should leave the disaster area they themselves travelled to for this live broadcast. In Europe the weather is typically reported from the studio and shown in videos. Trobec stated that during severe weather people watch TV rather than use the internet.


Live hurricane weather. :-)

The difference is likely that there is not that much severe weather in Europe: you normally watch the weather to see whether you need to take an umbrella, rarely to see whether your house will soon be destroyed. Live weather would be looking at a weather presenter slowly getting wet in the drizzle. In addition, European public media have an educational mandate; they are paid by the public to make society better, while American media are commercial and will do whatever makes money.

In the harbor of Copenhagen sits the famous Little Mermaid. Tourist boats went to see it, had to keep quite a distance and could only show her back. Typically the boats waited only a few seconds because there was nothing to see. But due to commercial pressure they had to have the Little Mermaid on their tour schedule. They follow demand, whether the outcome is good or not.

Short hits communication

  • When asked what a 30% probability of rain means in a weather prediction, most people gave the wrong answer: that 30% of the region would experience rain. The formally correct answer is that in 30% of the cases in which this prediction is made you will experience rain (see the toy simulation after this list). To be fair to the people, I often explain the need to give such a percentage by saying that in case of showers we cannot tell whether it will rain in Bonn or Cologne. I feel this is quite a common explanation and the main effect. The German weather service is working on providing more detailed probabilistic information to weather brigades. That seems to be appreciated (and they mostly answered the question correctly).
  • Amanda Ruggeri won the journalism award for her story on sea level rise in Miami, which was reviewed by ClimateFeedback who found its scientific credibility to be "very high". Recommended read.
  • EUMETSAT operates the European meteorological satellites once they are in space. They also make MOOCs (Massive Open Online Courses). They have one on the oceans and one on the atmosphere. They are a great way to introduce these topics to new people, and in the future they plan to do more live.
  • Climate change is seen as the top global threat according to global polling by the Pew Research Center. In 2018, 67 percent of the world saw climate change as a major threat to their country.
  • During a Q&A someone remarked that it would be good to talk more about the history of climatology, because people are spreading the rumor that climatology is a new field of science to make it sound less solid.
  • In case I have any Finnish speaking readers, Finland has a two-yearly bulletin on weather and climate, recently revamped.
  • Copernicus has a "new" journal on statistical climatology, ideally suited for homogenization studies: Advances in Statistical Climatology, Meteorology and Oceanography (ASCMO). It does not have an Impact Factor yet, but seeing the editorial team and reading a few articles it is clearly a serious journal and likely will get one soon. It is worth building up such a journal to have an outlet for statistical/methodological studies on climate. We already published there once; post upcoming.
  • Did you know about STATMOS, an American Research Network for Statistical Methods for Atmospheric and Oceanic Sciences?
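To make the frequency interpretation of the rain probability concrete, here is a toy simulation (my illustration, not from the talk): collect all days on which a "30% chance of rain" forecast was issued and count how often it actually rained.

```python
import numpy as np

rng = np.random.default_rng(0)
forecast_probability = 0.30
n_days = 10_000  # days on which "30% chance of rain" was forecast

# For a well-calibrated forecast, rain occurs on 30% of exactly these days,
# regardless of how much of the region each rain event covers.
rain = rng.random(n_days) < forecast_probability
print(f"observed rain frequency: {rain.mean():.3f}")  # close to 0.30
```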

Short hits observations

  • I had seen people use measurements of cosmic rays to estimate the soil moisture between the surface and the probe, but using them to measure the amount of snow on top of a glacier was new to me.
  • Michal Zak of the Czech Hydrometeorological Institute and colleagues had an interesting way to estimate how urban a station is. They computed the absolute day-to-day differences of the maximum and of the minimum temperature and subtracted one from the other (a small sketch follows after this list). If the maximum temperature varies more, a station is likely urban; if the minimum varies more, it is likely rural. For Prague and its surroundings the differences between stations were not particularly large, and smaller than the seasonal cycle, but it could be a useful check. It could also be a measure that helps to select climatologically similar pairs of stations in relative statistical homogenization.
  • The Homogenization Seminar in Budapest will be held from 18 to 21 May 2020. Announcements will follow, e.g., on the homogenization list. (I should write fewer mails to the homogenization list; at EMS someone asked to be added to the homogenization newsletter.)
  • Carla Mateus studied Data Rescue (DARE) as a scientific problem. By creating one really high quality transcribed dataset as a benchmark, she studied how accurately various groups transcribed historical observations. Volunteers of the Irish meteorological society were an order of magnitude more accurate (0.3% errors) than students (3.3%). Great talk.
  • Our colleagues from Catalonia studied the influence of the time of observation. Manual observations tend to be made at 8am, while automatic measurements often use a normal calendar day. This naturally mattered most for the minimum temperature. Such small breaks are hard to find with statistical homogenization, to formulate it diplomatically.
  • Monika Lakatos has ambitious plans to study changes in hourly precipitation in Hungary, motivated by increases in rain intensity (precipitation amount on rainy days).
  • Peter Domonkos studied how well network-wide trends are corrected in the new MULTITEST benchmark dataset (the presentation is available as a pptx file). He found that his method (ACMANTv4) was able to reduce this error by about 30%, and other methods did worse. It would be interesting to study what is different about the MULTITEST dataset or this analysis, because the results of Williams et al. (2012) are much more optimistic; there 50 to 90% of the trend error is removed for similarly dense networks.
  • ACMANTv4 is on GitHub and about to be published. Some colleagues already used it. 
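Here is a minimal sketch of how I understand the day-to-day variability measure mentioned above; the function and the synthetic test data are mine, not from the talk.

```python
import numpy as np

def urbanity_index(tmax, tmin):
    """Mean absolute day-to-day change of Tmax minus that of Tmin.

    Following the idea described above: positive values (Tmax varies more)
    would point to an urban station, negative values (Tmin varies more)
    to a rural one.
    """
    return np.abs(np.diff(tmax)).mean() - np.abs(np.diff(tmin)).mean()

# Hypothetical daily series for one year: seasonal cycle plus weather noise.
rng = np.random.default_rng(1)
day = np.linspace(0.0, 2.0 * np.pi, 365)
tmax = 15.0 + 10.0 * np.sin(day) + rng.normal(0.0, 2.0, 365)
tmin = 5.0 + 8.0 * np.sin(day) + rng.normal(0.0, 1.5, 365)

print(f"urbanity index: {urbanity_index(tmax, tmin):+.2f} K")
```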

Meteorological Glossaries

Miloslav Müller gave a talk on the new Slovak meteorological glossary, listing many other glossaries. So I now have a bookmark folder full of glossaries.

To finish, a great audience comment from the last day, not directly weather related: "In Russian education everything is explained, you do not have to remember or study." I loved that expression. That is the reason I studied physics. I also loved biology, but there you have to remember so much, and my memory is very poor for random stuff like the names of organisms. When you understand something, you (I?) automatically remember it; it does not even feel like learning.

Related reading

The IPCC underestimates global warming. This post explains why using linear regression underestimates total warming.

Annual Meeting of the European Meteorological Society

Wednesday, 27 December 2017

Open science and science communication at #EGU18, the European Geosciences Union General Assembly

The EGU General Assembly 2018 will bring 14,500 geoscientists from all over the world to Vienna. Those are not just climate scientists plotting to take over the world: climatology is just one of the 22 disciplines present. In my last post I already pointed to the scientific meetings relevant for climate data scientists.

In this post I want to point to more general meetings (sessions, debates and short courses) that may be of interest to climate scientists: on Open Science, science communication, scientific publishing and climate change.

Such big conferences have downsides: if you meet someone, you have to get their contact information immediately, because the chances of meeting twice by accident are small. But an advantage is the attention for topics affecting many sciences, which are not covered at more focussed workshops and smaller meetings. There are, for example, sessions on nonlinear physics, which poses methodological problems almost all geoscientists have to deal with; I attended those a lot as a young scientist.

There was also a time when I worked on many different topics (clouds, downscaling, homogenisation) and EGU was great for meeting all these communities in the same week. Lately I have mostly focussed on homogenisation and have not visited EGU for some time. If you are mostly interested in one session, it is a long trip and an expensive conference for just a few talks and a poster session.

Maybe I paid less attention to this in the past, but it looks like EGU nowadays has a very wide range of meetings on Open Data, Open Code, Open Science, Publishing, Citizen Science and Science Communication. As I am thinking of destroying a multi-billion dollar scientific publishing industry, those are topics very much on my mind. So I think I will visit EGU this year and have curated a list below of meetings that I think climate scientists could be interested in. (Descriptions are often shortened.)

Bottom-up scientific conferences

Old rabbits (by which the Germans do not only mean our professorial chemistry bunny) can skip to the list, but for young scientists and outsiders I thought it would be interesting to explain how such a huge conference is organised. With 14,500 geoscientists writing more than 20,000 abstracts on the work they would like to present, it is impossible for the conference organisers to determine what will happen; the content of the conference is very much a bottom-up affair.

The conference is split into 22 disciplinary divisions and 13 divisions of general interest. One of these divisions is "Climate: Past, Present, Future"; it could have been called "climate". Within these divisions there are dozens of so-called "sessions": meetings on a specific topic.

Everyone can propose a session. For this EGU there was a call-for-sessions with a deadline in September. As far as I know the only condition is that one of the organisers of the session needs to have a PhD.

The next step is the call-for-abstracts, which for EGU2018 ends on the 10th of January. Everyone can submit an abstract to a session of their liking, describing what they would like to talk about. Again bottom-up.

Normally abstracts are accepted. When I organised the downscaling session, I could see the numbers: about 1% of the abstracts were rejected, mostly duplicates or empty ones where something had gone wrong during submission. If the organiser thinks there is something wrong with your work, the abstract is normally still accepted, but you will likely get a poster.

Space is limited and the organiser can only select one third of the abstracts as talks; the others become posters. One time block with talks lasts one and a half hours, in which six presentations can be given. The minimum size of a session is thus 18 abstracts (six talks times three). If your session gets fewer, the division leaders will merge your session with another one on a similar topic. Getting to 18 abstracts is the main barrier to organising your own session.

Talks are best for broadcasting a new result: you reach more people, but there is only time for a few questions and thus little feedback. Posters are much better for feedback. For a convener, an important criterion for making an abstract a talk or a poster is thus the stage the study seems to be in: in the early phase feedback is important; if the work is finished, broadcasting is important. In addition, talks are typically for studies of more general interest, and if it is known how well someone talks, that is also an important consideration. Thus if you want to get a talk, make sure to mention some results to make clear you are in the final stages, and make sure the abstract is clear and contains no typos, which are proxies for being able to present your work clearly.

The posters at EGU are typically well visited, especially the main evening poster session with free beer and wine to get people talking. Personally, I spend most of my time at the posters. If a talk is not interesting, 15 precious minutes are gone; if a poster is not interesting, you just walk on.

Some sessions at EGU use a format between a talk and a poster, called a PICO session. Here people present their work in a 2-minute talk, and afterwards every presenter stands next to a touch screen with the presentation for detailed discussions. The advantage of a poster over a PICO is that the poster is up all day.



Next to these sessions where people present their latest work, you can also reserve rooms for splinter meetings to talk with each other, or organise short courses. Many of the interesting meetings listed below are short courses.

Science

Great Debate 4 Low-risk geo-engineering: are techniques available now?

With the Paris agreement, a majority of the world’s countries have agreed to keep anthropogenic warming below 2 °C. According to the Intergovernmental Panel on Climate Change*, this target would require not only reducing all man-made greenhouse gas emissions to zero but also removal of large amounts of carbon dioxide from the atmosphere, or other type of geo-engineering techniques. The issue of geo-engineering has been heavily debated during the last years and we are therefore asking: Are the potential risks with geo-engineering sufficiently known? Are safe geo-engineering techniques available? Are they available now?

This debate will address these questions of crucial importance for today’s society. It will discuss the most recent discoveries of geo-engineering techniques, their potential to reduce global warming and their potential risks.
* I am not sure whether the IPCC states this. The scenarios that stay below 2 °C do include carbon dioxide removal. But scenarios are just that, scenarios, not predictions of the future.

I think we urgently need to talk about a geo-intervention. There is no reason to wait until the Earth has warmed 2 °C; climate change does unacceptable damage right now.

Great Debate 2 Hands on or hands off?

A great debate on whether scientists should get involved in policy.
In recent years there has been a growing distrust of experts in the public imagination which has been expressed in numerous debates from Brexit to the US presidential election. This gives rise to serious questions about the role of scientists in policy making and the political sphere. As geoscientists, our disciplines can have a real impact on the way humanity organises itself, so what should our role in that be? There are serious tensions here between the desire for our knowledge to have real impact and make a difference, the need for scientific detachment and objectivity, and respect for broader perspectives and for democracy itself.

The key questions for this debate are:
  • Should geoscientists restrict themselves to knowledge generation and stay out of the policy world?
  • Or should we be getting involved and making change happen?
  • Should our voices as experts be heard louder than others?
  • Or does evidence-based policy undermine democracy?
  • Should we be hands on or keep our hands off?
Conferences are busy, so let me answer the questions so you do not have to go.

Four of the five organisers are from the UK, but I hope that at least outside of Anglo-America it is uncontroversial for scientists to inform the public and policy makers of their findings. Scientists are humans and have human rights, including free speech. Germany and several other European countries have even set up climate service centres to facilitate the flow of information from science to groups that need to adapt to climatic changes.

When it goes further, trying to convince people of certain solutions, please let go of your saviour complex; you will most likely not achieve much. The way scientists are trained to think and communicate works well for science, but it is not particularly convincing outside of it. The chance that you are good at convincing people is not much better than that of some random dude or the grandma down the road.

When it comes to informing people of our findings, our voice should naturally be louder than that of groups misinforming people. In countries with functioning media that does not need to be particularly loud. The opposite of evidence-based policy is misinformation-based policy. It is clearly less democratic, and an abuse of power, to set up a misinformation campaign to get your way politically because the public would not support your policies if they knew the truth. That is a violation of people's self-determination, of their control over their own lives.

Educational and Outreach Symposia Session Geoethics: ethical, social and cultural implications of geoscience knowledge, education, communication, research and practice

 

Educational and Outreach Symposia Session Vision for Earth Observations in 2040

Educational and Outreach Symposia is a surprising place for this session. I hope the right people find it.
As both Earth science and technology advance while the expectations for the extent, quality, and timeliness of environmental information to be provided to the world’s population increases, the opportunity exists to harness the increased knowledge and capability to improve those products and services.

The World Meteorological Organization has made important contributions in making the connection between knowledge and products in the areas of weather, water, and climate through its periodic visions, most recently the Vision for the Global Observing System (GOS) in 2025. The WMO is now in the process of doing an update for the 2040 time frame, taking into account both surface and space-based measurements.

In this session, presentations that look ahead to the 2040 time frame and address expected observational capability that can realistically be expected to be available in that time frame, the expected demand for products and services informed by observational data that may be required for public use in that time frame, and mechanisms for connecting the two are all sought. Presentations for this session can address the full range of products for Earth System Science and are not limited to those addressed by the WMO in its development of the 2040 vision.
I hope a global climate station reference network will be part of this vision for 2040.

Interdisciplinary Session Big data and machine learning in geosciences

This session aims to bring together researchers working with big data sets generated from monitoring networks, extensive observational campaigns and extremely detailed modeling efforts across various fields of geosciences. Topics of this session will include the identification and handling of specific problems arising from the need to analyze such large-scale data sets, together with methodological approaches towards automatically inferring relevant patterns in time and space aided by computer science-inspired techniques. Among others, this session shall address approaches from the following fields:
  • Dimensionality and complexity of big data sets
  • Data mining and machine learning
  • Deep learning in geo- and environmental sciences
  • Visualization and visual analytics of big data
  • Complex networks and graph analysis
  • Informatics and data science

Interdisciplinary PICO Session R’s deliberate role in Earth sciences

 

Interdisciplinary Session Citizen Science in the Era of Big Data

I wish they were a bit clearer on what kind of statistics specifically for Big Data they are thinking of. I guess that with lots of data it is easy to get a result that is statistically significant, but physically negligibly small and not interesting. And if you start analysing the data like a fishing expedition, you should be extra careful not to fall into the multiple-testing trap. It may be that this is what they mean by "Big Data approaches"; see the little illustration after the session description.
Citizen Science (the involvement of laypeople in scientific processes) is gaining momentum in one discipline after another, thereby more and more data on biodiversity, earthquakes, weather, climate, health issues among others are being collected at different scales. In many cases these datasets contain huge amounts of data points collected by various stakeholders. There is definitely power in numbers of data points; however, the full potential of these datasets is not realized yet. Traditional statistics often fail to utilize these prospects. Statistics for Big Data can unveil hidden patterns that would otherwise not be visible in datasets. Since Big Data approaches and citizen science are still developing fields, most projects miss Big Data analyses.

In this session we are looking for successful approaches of working with Big Data in all fields of citizen science. We want to ask and find answers to the following questions:
  • Which Big Data approaches can be used in citizen science?
  • What are the biggest challenges and how to overcome them?
  • How to ensure data quality?
  • How to involve citizen scientists in Big Data Analyses, or is it possible?
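As a little illustration of the two pitfalls mentioned above (my example, with made-up numbers): with a huge sample a physically negligible effect becomes highly "significant", and running many tests on pure noise without correction produces false positives.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Pitfall 1: with a million data points, a negligible 0.01 K difference
# between two samples is highly "statistically significant".
a = rng.normal(0.00, 1.0, 1_000_000)
b = rng.normal(0.01, 1.0, 1_000_000)
print("p-value for a 0.01 K effect:", stats.ttest_ind(a, b).pvalue)

# Pitfall 2: a fishing expedition. 1000 t-tests on pure noise at
# alpha = 0.05 yield about 50 "discoveries"; Bonferroni removes them.
pvals = np.array([stats.ttest_ind(rng.normal(size=100),
                                  rng.normal(size=100)).pvalue
                  for _ in range(1000)])
print("false positives without correction:", int((pvals < 0.05).sum()))
print("after Bonferroni correction:", int((pvals < 0.05 / 1000).sum()))
```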

Scientific publishing

Great Debate 1 Who pays for Scientific Publishing?

This Great Debate will address the following questions: whether the profits generated by traditional publishers are justifiable and sustainable, to what extent scientists should contribute to the business, what are the current and future alternatives, and what role will preprint servers play?
See also my recent post on the new preprint servers for the Earth Sciences and the townhall meeting below on self-archiving and EarthArXiv.

Townhall Meeting EarthArXiv - a preprint server for the Earth Sciences

Preprints and preprint servers are set to revolutionise and disrupt the standard approaches to scholarly publishing in the Earth Sciences. Yet, despite being widely used and demonstrably successful in several other core science disciplines, the concept of preprints is new to many Earth Scientists. As a result, education is needed, so that Earth Scientists can benefit from the use of preprints and preprint servers. In this townhall we will introduce the general concepts of preprints and preprint servers, illustrating this with a demonstration of EarthArXiv, a community-led preprint server. We will also lead a general discussion of the use of preprints.

PICO Session Future of (hydrological) publishing

This session was one of the reasons to write this blog post. It sounds really interesting, and it is somewhat hidden in the Hydrology Division, where non-hydrologists may miss it. This could be a good place for my coming out with the idea of grassroots scientific publishing, where the scientific community takes back control of the quality assessment, beginning with making more informative open reviews of already published articles.
In recent years, the current and future system of scientific publishing has been heavily debated. Most of these discussions focused on criticizing aspects of the current system such as:
  • the scientific publishing industry being one of the most profitable branches (Guardian, 2017) in media, because the scientific community basically does all the work for free
  • the peer review system being corrupted, or at least not functioning perfectly
  • the limited access to scientific papers due to its current business model
  • the surging number of submitted papers in recent years, especially with strict publication requirements for PhD candidates. This is putting more pressure on editors, reviewers and readership, and will decrease the visibility and impact of each publication.
Times are changing, which can be seen in the increased demand and supply for open access publishing. However, we believe there might be plenty of other ideas and suggestions on how to improve scientific publishing. We invite and challenge everyone from the scientific community to propose ideas on how to do so in 5 minute presentations. Afterwards we will continue the discussion to answer questions such as: Who needs to pay for reading our work? Who should publish our work? How to cope with the excessive amount of submitted papers? Should we even be publishing?

Short course What are the key problems in Climate Science?

Climate science is a wide discipline that encompasses many of the EGU divisions, yet it is not always easy to know what the key problems are outside of your own specific area. ... During the short course, four climate experts from different divisions will introduce the “key problems” in their discipline, giving you an overview of what the current “hot topics” are. This course will provide you with enough background to venture into other divisions during the rest of the meeting. The floor will then be open for questions and discussion with our experts.
With so many disciplines together, EGU would theoretically be an important place to learn about problems in other fields and see how they fit with yours.

However, the talks at EGU are very short, just 12 minutes. They do not leave much time for an introduction and are thus hard to follow for outsiders. It may be nice to extend the idea of this short course from just four hot topics to many more. Make it into a science slam, where you do not talk about your own work, but introduce your field in a way an outsider can get it.

It looks like these four key problems and their speakers are selected by the conveners. A science slam could be open to all like the normal talks.

Open Science

Townhall Meeting OSGeo Townhall: Open Science demystified

OSGeo is hosting this Townhall event to support the collaborative development of open source geospatial software by promoting sustainable Open Science within EGU. The Open Source Geospatial Foundation, or OSGeo, is a not-for-profit umbrella organization for Free and Open Source geospatial tools, including QGIS, gvSIG, GRASS GIS, Geoserver and many others.
The paradigm of Open Science is based on the tiers Open Access, Open Data and Free Open Source Software (FOSS). However, the interconnections between the tiers remain to be improved. This is a critical factor to enable Open Science.
This Townhall meeting reaches out all across EGU, especially welcoming Early Career Scientists, to network and discuss the current challenges and opportunities of the FOSS tier, including:
  • the easy approach to choosing software licences
  • recognition for scientific software: how to write a software paper?
  • software life cycle: who will maintain your software after you've finished your PhD and found a decent job?
  • funding software development: evolving software begun for your own research needs into something larger that serves others' needs and boosts your scientific reputation.
  • software reviews: how to set up software development such that other developers get involved in an early stage?
  • how can OSGeo help you with all these questions?

Short course How to find and share data in geosciences?

This short course aims to present some tips and tricks to accelerate the process of finding, processing and sharing the Geosciences data. We will also discuss the importance of open science and the opportunities it provides.

Short course Writing reproducible geoscience papers using R Markdown, Docker, and GitLab

I think code reproducibility is overrated. It is much stronger to make an independent reproduction, and if the result depends on minute details being the same, it is most likely not a useful result. But most of the same methods can be useful to speed up scientific progress. Sharing data and code is wonderful and helps other scientists get going faster. The code should thus preferably also run somewhere else.
Reproducibility is unquestionably at the heart of science. Scientists face numerous challenges in this context, not least the lack of concepts, tools, and workflows for reproducible research in today's curricula.
This short course introduces established and powerful tools that enable reproducibility of computational geoscientific research, statistical analyses, and visualisation of results using R (http://www.r-project.org/) in two lessons:

1. Reproducible Research with R Markdown
Open Data, Open Source, Open Reviews and Open Science are important aspects of science today. In the first lesson, basic motivations and concepts for reproducible research touching on these topics are briefly introduced. During a hands-on session the course participants write R Markdown documents, which include text and code and can be compiled to static documents (e.g. HTML, PDF).
R Markdown is equally well suited for day-to-day digital notebooks as it is for scientific publications when using publisher templates.
To understand the rest of the description, I need to explain what Docker means:
Docker is a tool that can package an application and its dependencies in a virtual container that can run on any Linux server. This helps enable flexibility and portability on where the application can run, whether on premises, public cloud, private cloud, bare metal, etc.
GitLab is a collaborative coding platform based on the version control system Git, comparable to the probably better known GitHub and Bitbucket.
2. GitLab and Docker
In the second lesson, the R Markdown files are published and enriched on an online collaboration platform. Participants learn how to save and version documents using GitLab (http://gitlab.com/) and compile them using Docker containers (https://docker.com/). These containers capture the full computational environment and can be transported, executed, examined, shared and archived. Furthermore, GitLab's collaboration features are explored as an environment for Open Science.
P.S. Those homepages really suck big time, except if their goal is to scare away anyone who is not a hard-core coder and does not already know the product. That is why I mostly linked to Wikipedia.

Short course Building and maintaining R packages

R is a free and open software that gained paramount relevance in data science, including fields of Earth sciences such as climatology, hydrology, geomorphology and remote sensing. R heavily relies on thousands of user-contributed collections of functions tailored to specific problems, called packages. Such packages are self-consistent, platform independent sets of documented functions, along with their documentations, examples and extensive tutorials/vignettes, which form the backbone of quantitative research across disciplines.

This short course focuses on consolidated R users that have already written their functions and wish to i) start appropriately organizing these in packages and ii) keep track of the evolution of the changes the package experiences. While there are already plenty of introductory courses to R we identified a considerable gap in the next evolutionary step: writing and maintaining packages.

Short course Improving statistical evaluations in the geosciences

I love the (long) description of topics. Looks like just what a geo-scientist needs.

During my studies I got lucky. Studying physics, the only statistics we got was some error propagation for lab work. Somehow I was not happy with that, and I found a statistics course in the sociology department. There was not much mathematics; one student even asked what the dot between X and Y was. I did not even understand the question, but the teacher casually answered that it was the multiplication sign. Maybe out of necessity, the course focussed on the big ideas, on the main problems and on the typical mistakes people make. It looks like this could be a similar course, but likely with more math.

Session Open Data, Reproducible Research, and Open Science

Open Data and Open Science not only address publications, but scientific research results in general, including figures, data, models, algorithms, software, tools, notebooks, laboratory designs, recipes, samples and much more.

Furthermore, they relate to the communication, review, and discussion of research results and consider changing needs regarding incentives, quality assessment, metrics, impact, reputation, grants and funding. Thus Open Data and Open Science encompass licensing, policy-making, infrastructures and scientific heritage, while safeguarding the dynamic nature of science and its evolving forms.
...
The speakers present success stories, failures, best practices, solutions and introduce networks. It is aimed to show how researchers, citizens, funding agencies, governments and other stakeholders can benefit from Open Data, Reproducible Research, and Open Science in various flavors, acknowledging the drawbacks and highlighting the opportunities available for geoscientists.

The session shall open a space to exchange experiences and to present either successful examples or failed efforts. Learning from others and understanding what to adopt and what to change should help your own undertakings and new initiatives become successes.

Educational and Outreach Symposia Session Promoting and supporting equality of opportunities in geosciences

Following the success of previous years, this session will be exploring reasons for the existence of underrepresentation of different groups (cultural, national and gender) by welcoming a debate with scientists, decision-makers and policy analysts related to geosciences.

The session will be focusing on both remaining obstacles that contribute to underrepresentation and on best practices and innovative ideas to tackle obstacles.

Science communication

Short course Help! I'm presenting at a scientific conference!

Sounds like this short course on giving a scientific presentation is tailored to newbies, although the seniors could also use some help. The seniors are hardest to change: they have learned that they have a highly motivated captive audience and that after a crappy talk everyone will pretend it was a good one.
Presenting at a scientific conference can be daunting for early career scientists and established ones alike. How can you optimally take advantage of those 12 minutes to communicate your research effectively? How do you cope with nervousness? What happens if someone asks a question that you don’t think you can answer? Is your talk tailored to the audience?

Giving a scientific talk is a really effective way to communicate your research to the wider community and it is something anyone can learn to do well! This short course provides the audience with hands-on tips and tricks in order to make your talk memorable and enjoyable for both speaker and audience.

Short course Once upon a time in Vienna

A short course on storytelling, which is really important for readable prose, although this blog post is probably not the best place to make that case.
This is an interactive workshop led by a professional communications facilitator and writer, and academics with a range of earth science outreach experience. Through a combination of expert talks, informal discussion, and practical activities, the session will guide you through the importance of storytelling, how to find exciting stories within your own research, and the tools to build a memorable narrative arc.

Short course Rhyme your research

After seeing the term "experienced science-poet" I was forced to include this short course. They missed the opportunity to write the description as a poem.
Poetry is one of the oldest forms of art, potentially even predating literacy. However, what on Earth does it have to do with science? One is usually subjective and emotive, whilst the other (for the most part) is objective and empirical. However, poetry can be a very effective tool in communicating science to a broader audience, and can even help to enhance the long-term retention of scientific content. During this session, we will discuss how poetry can be used to make (your) science more accessible to the world, including to your students, your professors, your (grand)parents, and the general public.

Writing a poem is not a particularly difficult task, but writing a good poem requires both dedication and technique; anyone can write poetry, but it takes practice and process to make it effective. In this session, experienced science-poets will discuss the basics of poetry, before encouraging all participants to grab a pen and start writing themselves. We aim to maximise empowerment and minimise intimidation. Participants will have the opportunity to work on poems that help to communicate their research, and will be provided with feedback and advice on how to make them more effective, engaging and empathetic. Those who wish to do so may also recite their creations during the “EGU Science Poetry Slam 2018”.

Educational and Outreach Symposia Session Scientists, artists and the Earth: co-operating for a better planet sustainability

As communicators and artists, we have a shared responsibility to raise awareness of the importance of planet sustainability. Educating people in this regard has normally been executed through traditional educational methods. But there is evidence that science-art collaborations play a vital role in contributing to this issue, through the emotional and human connection that the arts can provide. This session, already in its fourth edition, has presented interesting and progressive art-science collaborations across a number of disciplines focussed on representing Earth science content. We have witnessed that climate change, natural hazards, meteorology, palaeontology, earthquakes and volcanoes, and geology have been successfully presented through music, visual art, photography, theatre, literature and digital art, where the artists explored new practices and methods in their work with scientists. A fundamental part of all art is the presentation of the final work. Therefore we provide a related 'performative session', to allow artists to perform excerpts of their work and fully reveal the impact of this work in communicating the bigger planet sustainability message. This related session is entitled "A pilot-platform for performing your Earth&Art work".

Short course Visualization in Earth Science: best practices

This short course is co-organized by the ESSI division: Earth & Space Science Informatics.
With constantly growing data sizes, both an effective visualization, as well as an efficient data analysis are getting more and more important. Different tasks in visualization require different visualization strategies. Geoscience data presents particular challenges, being typically large, multivariate, multidimensional, time-varying and uncertain. This short course aims at the presentation and demonstration of commonly available visualization tools, that are especially well suited to analyze earth science data sets. We at DKRZ -- the German Climate Computing Centre -- have many years of experience in the visualization of earth science data sets, and the goal of this workshop is to pass this knowledge on to you. We will show, explain and demonstrate the tools live, with which we work in our daily routine, and show you how to create effective and meaningful visualizations using free software.

Short course How to cartoon science

 

Short course Science for Policy: What is it and how can scientists become involved in policy processes?

Organised by the EGU policy expert Chloe Hill.
Part 1: will focus on basic science for policy and communication techniques that can be used to engage policymakers. It will be of particular interest to anyone who wants to make their research more policy relevant and learn more about science-policy.

Part 2: will include invited speakers who will outline specific EU processes and initiatives and explain how scientists can become involved with them.

Short course Communicating your research to teachers, schools and the public - interactively

If you are serious about communicating your research to teachers, schools and the public then you should know something about these audiences, be familiar with the most effective ways of engaging with each of them and be clear about what the ‘take away’ messages would be. ... Methods of engaging the public and families through open days and similar events are different again, and usually use a range of activities to engage and educate at the same time. We will discuss insights and strategies for these different audiences and ask you to have a go yourself.

Short course Debunking myths and fake news: how can geoscientists fight misinformation and false claims

Maybe you’ve had an argument on social media with a climate change denier who is convinced the Earth is not warming. Or maybe you’ve received an email from a scared relative forwarding you a piece from an unreliable website about how total solar eclipses produce harmful rays that can make you blind. How do you go about convincing them they are mistaken without them holding on even more to their false beliefs? In the age of Brexit and Trump, of fake news and of expert snubbing, geoscientists have a role to play in tactfully fighting misinformation related to the Earth, space and planetary sciences. This short course will explore ways in which researchers can promote evidence and facts, prevent fake news from spreading, and successfully debunk false claims.

Short course Connect2Communicate: communicating your message with charisma, clarity and conviction

Making use of established techniques from the world of theatre and improvisation, this session will enable participants to make genuine connection with their audience.

Short course Science writing: selling your research through press releases and articles

Our press office once organised a short workshop on writing press releases, given by a former journalist. He could explain well what a journalist wants from a press release, but did not understand that the interests of scientists are different. This course may be better: it is given by scientists.
The course will consist of: an introduction on how to identify a good science story; general tips on how to write with clarity and flair; an introduction on how to go about promoting your work via press releases and working with embargoes; tips on working with press officers and journalists; practical exercises on headline writing; and practical exercises about turning abstracts into press releases.

Short course Communicating geoscience to the media

The news media is a powerful tool to help scientists communicate their research to wider audiences. However, at times, messages in news reports do not properly reflect the real scientific facts and discoveries, resulting in misleading coverage and wary scientists. This is especially problematic in fields such as climate science, where climate skeptics can twist the research results to draw conclusions that are baseless. A way scientists have to prevent misleading or even inaccurate coverage is to improve the way they communicate and work with journalists. In this short course, co-organised with the CL and CR divisions, we will bring together science journalists and researchers with experience working with the media to provide tips and tricks on how scientists can better prepare for interviews with reporters. We will also provide pointers on how to ensure a smooth working relationship between researchers and journalists by addressing the needs and expectations of both parties. The focus will be on climate topics, but much of the advice would be applicable to other geoscience areas.

Educational and Outreach Symposia Session ECORD IODP Outreach: Past, Present and Future

The International Ocean Discovery Program is an international programme that works to explore the oceans and the rocks beneath them. ...
This session addresses the formats by which we disseminate scientific information and discoveries arising from ocean drilling – what have we done in the past, what are we doing now, and what ideas do we have for the future engagement of students with ocean research drilling. Experiences and examples of best practice, illustrated in poster or oral format, will be presented by school teachers, university lecturers and researchers describing their outreach efforts in the lab, field and geoscience classrooms to promote high-quality geoscience education at all levels.

Educational and Outreach Symposia Session Games for Geoscience

Games have the power to ignite imaginations and place you in someone else’s shoes or situation, often forcing you into making decisions from perspectives other than your own. This makes them potentially powerful tools for communication, through use in outreach, disseminating research, in education at all levels, and as a method to train the public, practitioners and decision makers in order to build environmental resilience. The session is a chance to share your experiences and best practice with using games to communicate geosciences, be they analogue, digital and/or serious games.

Educational and Outreach Symposia Session Communication and Education in Geoscience: Practice, Research and Reflection

Do you consider yourself a science communicator? Does your research group or institution participate in public engagement activities? Have you ever evaluated or published your education and outreach efforts?

Scientists communicate to non-peer audiences through numerous pathways including websites, blogs, public lectures, media interviews, and educational collaborations. A considerable amount of time and money is invested in this public engagement and these efforts are to a large extent responsible for the public perception of science. However, few incentives exist for researchers to optimize their communication practices to ensure effective outreach. This session encourages critical reflection on science communication practices and provides an opportunity for science communicators to share best practice and experiences with evaluation and research in this field.

Related reading

Slides of the talk: How to convene a session at the EGU General Assembly by Stephanie Zihms, Roelof Rietbroek and Helen Glaves.

EGU2018 and its call-for-abstracts.

The call-for-sessions of EMS2018 is currently open. Suggestions for improvements of the description of the "Climate monitoring: data rescue, management, quality and homogenization" session are welcome.

The fight for the future of science in Berlin. My report on this year's conference on scholarly communication, with lots of ideas and initiatives on Open Science and publishing.

Where is a climate data scientist to go in 2018?

Wednesday, 20 December 2017

Where is a climate data scientist to go in 2018?

Where is a climate data scientist to go in the next year? There are two oldies in Old Europe (EGU and EMS) and three new opportunities: the Early Instrumental Meteorological Series meeting in Bern, AMOS in Sydney, and the Data Management Workshop in Peru.

Early Instrumental Meteorological Series - Conference and Workshop

The Oeschger Centre for Climate Change Research (OCCR) organises a conference and workshop on Early Instrumental Meteorological Series. Hosts are Stefan Brönnimann (Institute of Geography) and Christian Rohr (Institute of History).

The first two days are organised like a conference, the last two days like a workshop. It will take place from 18 to 21 June 2018 at the University of Bern, Switzerland. Registration and abstract submission are due by 15 March 2018.
The goal of this conference and workshop is to discuss the state of knowledge on early instrumental meteorological series from the 18th and early 19th century. The first two days will be in conference-style and will encompass invited talks from different regions of the world (including participation by skype) on existing compilations and on individual records, but also on instruments and archives as well as on climate events and processes. Contributed presentations (most will be posters) are welcome.

The third and fourth days target a smaller audience and are in workshop-style. The goal is to compile a detailed inventory of all early instrumental records: What has been measured, where, when and by whom? Is the location of the original data known? Have they been imaged, digitised, homogenised, or are they already in existing archives? This work will help to focus future data rescue activities.

Data Management Workshop in Peru

New is the “Data Management for Climate Services” workshop taking place in Lima, Peru, from 28 May to 1 June 2018. I am trying to learn Spanish, but languages are clearly not my strong point. At a restaurant I would now be able to order cat, turtle and chicken. I think I will eat a lot of chicken. Fortunately the workshop will be carried out in two languages, with a professional translation service between Spanish and English.

The workshop is inspired by the series of EUMETNET Data Management Workshops held every two years in Europe. It would be great if similar initiatives would be tried on other continents.

The abstract submission deadline is soon: 15 January 2018.
  • Session 1: METADATA
    • Methods for data rescue and cataloguing; data rescue projects.
    • Methods of metadata rescue for the past and the present; systems for metadata storage; applications and use of metadata.
    • Methods for quality control of different meteorological observations of different specifications; processes to establish operational quality control.
  • Session 2: DATA HOMOGENIZATION
    • Methods for the homogenization of monthly climate data; projects and results from homogenization projects; investigations on parallel climate observations; use of metadata for homogenization.
  • Session 3: GRIDDED DATA
    • Verification of gridded data based on observations; products based on gridded data; methods to produce gridded data; adjustments of gridded data in complex topographies such as the Andes.
  • Session 4: CLIMATE SERVICES
    • Products and climate information: methods and tools of climate data analysis; presentation of climate products and information; products on extreme events
    • Climate services in Ibero-America: projects on climate services in Ibero-America.
    • Interface with climate information users: approaches to building the interface with climate information users; experiences from exchanges with users; user requirements on climate services.

EMS Annual Meeting

This time the EMS Annual Meeting: European Conference for Applied Meteorology and Climatology will be in Budapest, Hungary, from 3 to 7 September 2018. The abstract submission deadline is still far away, but if you have ideas for sessions, this is the moment to speak up: the call for sessions is open until January 4th, and new sessions can be proposed.

The session on "Climate monitoring: data rescue, management, quality and homogenization" is organised by Manola Brunet-India, Ingeborg Auer, Dan Hollis and me. If you have any suggestions for improvements of our session description, please tell me. New this year is that we have explicitly added marine data. The forgotten 70% will be forgotten no longer.
Robust and reliable climatic studies, particularly those assessments dealing with climate variability and change, greatly depend on availability and accessibility to high-quality/high-resolution and long-term instrumental climate data. At present, a restricted availability and accessibility to long-term and high-quality climate records and datasets is still limiting our ability to better understand, detect, predict and respond to climate variability and change at lower spatial scales than global. In addition, the need for providing reliable, opportune and timely climate services deeply relies on the availability and accessibility to high-quality and high-resolution climate data, which also requires further research and innovative applications in the areas of data rescue techniques and procedures, data management systems, climate monitoring, climate time-series quality control and homogenisation.
In this session, we welcome contributions (oral and poster) in the following major topics:
  • Climate monitoring, including early warning systems and improvements in the quality of the observational meteorological networks.
  • More efficient transfer of the data rescued into the digital format by means of improving the current state-of-the-art on image enhancement, image segmentation and post-correction techniques, innovating on adaptive Optical Character Recognition and Speech Recognition technologies and their application to transfer data, defining best practices about the operational context for digitisation, improving techniques for inventorying, organising, identifying and validating the data rescued, exploring crowd-sourcing approaches or engaging citizen scientist volunteers, conserving, imaging, inventorying and archiving historical documents containing weather records.
  • Climate data and metadata processing, including climate data flow management systems, from improved database models to better data extraction, development of relational metadata databases and data exchange platforms and networks interoperability.
  • Innovative, improved and extended climate data quality controls (QC), including both near real-time and time-series QCs: from gross-errors and tolerance checks to temporal and spatial coherence tests, statistical derivation and machine learning of QC rules, and extending tailored QC application to monthly, daily and sub-daily data and to all essential climate variables.
  • Improvements to the current state-of-the-art of climate data homogeneity and homogenisation methods, including methods intercomparison and evaluation, along with other topics such as climate time-series inhomogeneities detection and correction techniques/algorithms, using parallel measurements to study inhomogeneities and extending approaches to detect/adjust monthly and, especially, daily and sub-daily time-series and to homogenise all essential climate variables.
  • Fostering evaluation of the uncertainty budget in reconstructed time-series, including the influence of the various data processes steps, and analytical work and numerical estimates using realistic benchmarking datasets.
The next step is to analyse the data to understand what happens with the climate system. For this there is the session: "Climate change detection, assessment of trends, variability and extremes".

AMOS-ICSHMO

It is too late to submit abstracts, but you can still visit the Joint 25th AMOS National Conference and 12th International Conference for Southern Hemisphere Meteorology and Oceanography, AMOS-ICSHMO 2018, to be held at UNSW Sydney from 5 to 9 February 2018.

There is a new session on "Data homogenisation and other statistical challenges in climatology", organised by Blair Trewin and Sandy Burden.
This session is intended as a forum to present work addressing major statistical challenges in climatology, from the perspectives of both climatologists and statisticians. It is planned to have a particular focus on climate data homogenisation, including the potential for merging observations from multiple sources. However, papers on all aspects of statistics in climatology are welcome, including (but not limited to) spatial analysis and uncertainty, quality control, cross-validation, and extreme value and threshold analysis. Statistical analyses of temperature and rainfall will be of most interest, but studies using any meteorological data are welcome.
If I see it right, the session has four talks:
  • Testing for Collective Significance of Temperature Trends (Radan Huth)
  • Investigating Australian Temperature Distributions using Record Breaking Statistics and Quantile Regression (Elisa Jager)
  • A Fluctuation in Surface Temperature in Historical Context: Reassessment and Retrospective on the Evidence (James Risbey)
  • The Next-Generation ACORN-SAT Australian Temperature Data Set (Blair Trewin)
And there is a session on "Historical climatology in the Southern Hemisphere" organised by Linden Ashcroft, Joëlle Gergis, Stefan Grab, Ruth Morgan and David Nash.
Historical instrumental and documentary records contain valuable weather and climate data, as well as detailed records of societal responses to past climatic conditions. This information offers valuable insights into current and future climate research and climate change adaptation strategies. While the use of historical climate information is a well-developed field in the Northern Hemisphere, a vast amount of untapped resources exist in the southern latitudes. Recovering this material has the potential to dramatically improve our understanding of Southern Hemisphere climate variability and change. In this session we welcome interdisciplinary submissions on the rescue, interpretation and analysis of historical weather, climate, societal and environmental information across the Southern Hemisphere. This can include:
  • Instrumental data rescue (land and ocean) projects and practices
  • Comparison of documentary, instrumental and palaeoclimate reconstructions
  • Historical studies of extreme events
  • Past social engagement with weather, climate and the natural environment
  • Development of long-term climate records and chronologies.
It has five talks:
  • An Australian History of Anthropogenic Climate Change (Ruth Morgan)
  • Learning from the Present to Understand the Past: The Case of Precipitation Covariability between Tasmania and Patagonia (Martin Jacques-Coper)
  • Learning from Notorious Maritime Storms of the Late 1800’s (Stuart Browning)
  • Climate Data Rescue Activities at Meteo-France in the Southern Hemisphere (Alexandre Peltier)
  • Recovering Historic Southern Ocean Climate Data using Ships’ Logbooks and Citizen Science (Petra Pearce)

EGU General Assembly

EGU will be held in Vienna, Austria, from 8 to 13 April 2018. The abstract submission deadline is looming: the 10th of January.

The main session from my perspective is: "Climate Data Homogenization and Analysis of Climate Variability, Trends and Extremes", organised by Xiaolan Wang, Rob Roebeling, Petr Stepanek, Enric Aguilar and Cesar Azorin-Molina.
Accurate, homogeneous, and long-term climate data records are indispensable for many aspects of climate research and services. Realistic and reliable assessments of historical climate trends and climate variability are possible with accurate, homogeneous and long-term time series of climate data and their quantified uncertainties. Such climate data are also indispensable for assimilation in a reanalysis, as well as for the calculation of statistics that are needed to define the state of climate and to analyze climate extremes. Unfortunately, many kinds of changes (such as instrument and/or observer changes, changes in station location and/or environment, observing practices, and/or procedures) that took place during the data collection period could cause non-climatic changes (artificial shifts) in the data time series. Such shifts could have huge impacts on the results of climate analysis, especially when it concerns climate trend analysis. Therefore, artificial shifts need to be eliminated, as much as possible, from long-term climate data records prior to their application.

The above described factors can influence different essential climate variables, including atmospheric (e.g., temperature, precipitation, wind speed), oceanic (e.g., sea surface temperature), and terrestrial (e.g., albedo, snow cover) variables from in-situ observing networks, satellite observing systems, and climate/earth-system model simulations. Our session calls for contributions that are related to:
  • Correction of biases, quality control, homogenization, and validation of essential climate variables data records.
  • Development of new datasets and their analysis (spatial and temporal characteristics, particularly of extremes), examining observed trends and variability, as well as studies that explore the applicability of techniques/algorithms to data of different temporal resolutions (annual, monthly, daily, sub-daily).
  • Rescue and analysis of centennial meteorological observations, with focus on wind data prior to the 1960s, as a unique source to fill in the gap of knowledge of wind variability over century time-scales and to better understand the observed slowdown (termed “stilling”) of near-surface winds in the last 30-50 years.
Also the session on "Atmospheric Remote Sensing with Space Geodetic Techniques" contains a fair bit of homogenisation. For most satellite datasets homogenisation is done very differently, as they do not have as much redundant data, but the homogenisation of humidity datasets based on the geodetic data of the global navigation satellite system ([[GNSS]], consisting of GPS, GLONASS and Galileo) is very similar.
Today atmospheric remote sensing of the neutral atmosphere with space geodetic techniques is an established field of research and applications. This is largely due to the technological advances and development of models and algorithms as well as, the availability of regional and global ground-based networks, and satellite-based missions. Water vapour is under sampled in current operational meteorological and climate observing systems. Advancements in Numerical Weather Prediction Models (NWP) to improve forecasting of extreme precipitation, requires GNSS troposphere products with a higher resolution in space and shorter delivery times than are currently in use. Homogeneously reprocessed GNSS observations on a regional and global scale have high potential for monitoring water vapour climatic trends and variability, and for assimilation into climate models. Unfortunately, these time series suffer from inhomogeneities (for example instrumental changes, changes in the station environment), which can affect the analysis of the long-term variability. NWP data have recently been used for deriving a new generation of mapping functions and in Real-Time GNSS processing these data can be employed to initialise Precise Point Positioning (PPP) processing algorithms, shortening convergence times and improving positioning. At the same time, GNSS-reflectometry is establishing itself as an alternative method for retrieving soil moisture.
We welcome, but do not limit ourselves to, contributions on the subjects below:
  • Physical modelling of the neutral atmosphere using ground-based and radio-occultation data.
  • Multi-GNSS and multi-instruments approaches to retrieve and inter-compare tropospheric parameters.
  • Real-Time and reprocessed tropospheric products for forecasting, now-casting and climate monitoring applications.
  • Assimilation of GNSS measurements in NWP and in climate models.
  • Methods for homogenization of long-term GNSS tropospheric products.
  • Studies on mitigating atmospheric effects in GNSS positioning and navigation, as well as observations at radio wavelengths.
  • Usage of NWP data in PPP processing algorithms.
  • Techniques on retrieval of soil moisture from GNSS observations and studies of ground-atmosphere boundary interactions.
Homogenisation is often needed for ecological data too. Thus the session "Digital environmental models for Ecosystem Services mapping" by Miquel Ninyerola, Xavier Pons and Lluis Pesquer may also be interesting.
The session aims to focus on understanding, modelling, analysing and improving each step of the process chain for producing digital environmental surface grids (terrain, climate, vegetation, etc.) able to be used in Ecosystem Services issues: from the sensors (in situ as well as Earth Observation data) to the map dissemination. In this context, topics as data acquisition/ingestion, data assimilation, data processing, data homogenization, uncertainty and quality controls, spatial interpolation methods, spatial analysis tools, derived metrics, downscaling techniques, box-tools, improvements on metadata and web map services are invited. Spatio-temporal analyses and model contribution of large series of environmental data and the corresponding auxiliary Earth Observation data are especially welcome as well as studies that combine cartography, GIS, remote sensing, spatial statistics and geocomputing. A rigorous geoinformatics and computational treatment is required in all topics.
EGU also has a nice number of Open Science, Science communication and Publishing sessions (you can find links in my newer post). I hope I will find the time to also write about them in a next post.

Other conferences

The Budapest homogenisation workshop was this year, so I do not expect another one in 2018. In case you missed it, the proceedings are now published and contain many interesting extended abstracts.

Also the last EUMETNET Data Management Workshop was in 2017. If there are any interesting meetings that I missed, please tell us in the comments.

Wednesday, 13 September 2017

My EMS2017 highlights

When I did my PhD, our professor wanted everyone to write short reports about conferences they had attended. It was a quick way for him to see what was happening, but it is also helpful to remember what you learned and often interesting to read yourself again some time later. Here is my short report on last week's Annual Meeting of the European Meteorological Society (EMS), the European Conference for Applied Meteorology and Climatology 2017, 4–8 September 2017, Dublin, Ireland.

This post is by its nature a bit of loose sand, but there were some common themes: more accurate temperature measurements by estimating the radiation errors, the eternal problems of estimating various trends, collaborations between WEIRD and developing countries, and global stilling.

Radiation errors

Air temperature sounds so easy, but is hard to measure. What we actually measure is the temperature of the sensor, and because air is a good insulator, the temperature of the air and of the sensor can easily differ. Heat can flow into the sensor, for example, from self-heating of electric resistance sensors or from the sensor holder, but the most important heat flow is radiative: the sun shining on the sensor, or the sensor losing heat as infra-red radiation to the cold atmosphere.



In the [[metrology]] (not meteorology) session there was a talk and several posters on the beautiful work by the Korea Research Institute of Standards and Science to reduce the influence of radiation on temperature measurements. They used two thermometers, one dark and one light coloured, to estimate how large the radiation errors are and to be able to correct for them. This set-up was tested outside and in their amazing calibration laboratory.

These were sensors to measure the vertical temperature profile, going up to 15 km high. Thus they needed to study the sensors over a huge range of temperatures (-80°C to 25°C); it is terribly cold at the tropopause. The dual sensor was also exposed to a large range of solar irradiances, from 0 to 1500 Watts per square meter; the sun is much stronger up there. The pressure ranged from 10 hPa to the 1000 hPa we typically have at the surface. The low pressure makes the air an even better insulator. The radiosondes drift with the wind, which reduces ventilation, so the wind only needed to be tested from 0 to 10 meters per second.
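As a back-of-the-envelope illustration of the dual-thermometer idea (my sketch, not the KRISS algorithm): if one assumes the radiation error of each sensor is proportional to its solar absorptivity, two sensors with different coatings pin down the error term and let you extrapolate to a radiation-free air temperature. The absorptivity values below are hypothetical.

```python
def air_temperature(t_dark, t_light, a_dark=0.9, a_light=0.2):
    """Extrapolate two sensor readings (°C) to zero solar absorptivity.

    Assumes T_sensor = T_air + k * a, with the same (unknown) radiative
    term k for both sensors; a_dark and a_light are hypothetical
    absorptivities of the dark and light coloured sensors.
    """
    # Eliminating k from the two equations and solving for T_air:
    return (a_dark * t_light - a_light * t_dark) / (a_dark - a_light)

# Example: in full sun the dark sensor reads 1.8 °C, the light one 0.4 °C.
print(air_temperature(1.8, 0.4))  # -> 0.0, the radiation-corrected estimate
```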

I have seen this set-up used to study radiation errors for automatic weather stations; it would be great to also use it at operational stations to reduce radiation errors.

The national metrology institute of the UK is working on a thermometer that does not have a radiation error because it directly measures the temperature of the air. Michael de Podesta does so by measuring the speed of sound very accurately. The irony is that it is hard to see how well this new acoustic thermometer works outside the lab, because the comparison thermometer has radiation errors.
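The underlying physics can be sketched with the textbook relation for an ideal gas, c = sqrt(γ·R·T): the speed of sound depends only on the temperature (and composition) of the gas. The real instrument is of course far more refined and has to account for humidity and the exact composition of the air; this is only the idea.

```python
GAMMA = 1.4     # ratio of specific heats for dry air
R_DRY = 287.05  # specific gas constant of dry air in J/(kg K)

def acoustic_temperature(speed_of_sound):
    """Temperature (K) of dry air from the speed of sound (m/s)."""
    return speed_of_sound ** 2 / (GAMMA * R_DRY)

print(acoustic_temperature(343.2) - 273.15)  # roughly 20 °C
```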

Michael de Podesta's live experiments with the most accurate thermometer in human history:



To lighten up this post: I was asked to chair the metrology session because the organiser of the session (the convener) gave a talk himself. The talks are supposed to be 12 minutes, with 3 minutes for questions and changing to the next speaker. Because multiple sessions run at the same time and people may switch between them, it is important to stick to the schedule. People also need some time between the time blocks to recharge.

One speaker went past the 12 minutes and had his back towards me, so that I could not signal that his time was up. Thus I walked across the screen in front of him to the other side. This earned some praise on Twitter.

If you speak a foreign language (and are nervous) it can be hard to deviate from the prepared talk.

Satellite climate data

There were several talks on making stable datasets from satellite measurements so that they become useful for climate change studies. The early satellites especially were not intended for quantitative use, but only for looking at moving cloud systems. Later satellites, too, were mostly designed for meteorological use rather than for climate studies.

Ralf Quast's study of how the spectral response of satellite instruments deteriorates in space was interesting. The sensitivity to visible light did not decline equally for all colours, but deteriorated faster for blues than for reds. This was studied by looking at several calibration targets expected to be stable: the Sahara desert, the dark oceans, and the bright tops of tropical convective clouds. The estimates for post-launch measurements were similar to the pre-launch calibrations in the lab.
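A toy version of such a drift estimate (my sketch, not the actual calibration chain) would regress the signal recorded over a stable target against time in orbit, channel by channel; the decay rates below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.linspace(0.0, 10.0, 120)            # monthly looks over 10 years
true_rates = {"blue": -0.030, "red": -0.008}   # hypothetical decay per year

for channel, rate in true_rates.items():
    # Simulated counts over a pseudo-invariant target such as the Sahara.
    counts = 100.0 * np.exp(rate * years) + rng.normal(0.0, 0.5, years.size)
    # A linear fit in log space recovers the exponential decay rate.
    slope, _ = np.polyfit(years, np.log(counts), 1)
    print(f"{channel}: estimated {slope:+.3f} per year (true {rate:+.3f})")
```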

Gerrit Hall explained that there are 17 sources of uncertainty for visible satellite measurements, from the noise when looking at the Earth, and at space for partial calibration, to several calibration constants and comparisons to [[SI]] standards (the measurement units everyone but the USA uses).

The noise levels also change over time, typically going up over the lifetime, but sometimes also going down for a period. The constant noise level in the design specification, often used for computing uncertainties, is thus only a first estimate. When looking at space the channels (measuring different frequencies of light) should be uncorrelated, but they are not always.

Global Surface Reference Network

Peter Thorne gave a talk about a future global surface climate reference network. I wrote about this network for climate change studies before.

A manuscript describing the main technical features of such a network is almost finished. The Global Climate Observing System of WMO is now setting up a group to study how we can make this vision a reality to make sure that future climatologists can study climate change with a much higher accuracy. The first meeting will be in November in Maynooth.

Global stilling

The 10-meter wind speed seems to be declining in much of the mid-latitudes, which is called "global stilling". It is especially prevalent in middle Europe (as the locals say; in my youth this was called east Europe). In the last decade there seems to be an uptick again; see the graph to the right from The State of the Climate 2016.

Cesar Azorin-Molina presented the work of his EU project STILLING in a longer talk in the Climate monitoring session giving an overview of global stilling research. Stilling is also expected to be one of the reasons for the reduction in pan evaporation.

The stilling could be due to forest growth and urbanisation, both of which make the surface rougher to the wind, but it could also be due to changes in the large-scale circulation. From vertical wind profiles one can get an idea of the roughness of the surface and thus study whether that is the reason, but not much such data is available over longer periods.

If you have such data, or know of such data, please contact Cesar. The same goes for normal wind data, which is hard to get hold of, especially observations from developing countries. The next talk was about a European wind database and its quality control, which will hopefully improve the data situation in Europe.

Fitting the climate monitoring session's focus on data quality, Cesar also studied the influence of the ageing of the cup anemometers that measure wind speed. Their ball bearings tend to wear out, producing lower observed wind speeds. By making parallel measurements with new equipment and instruments a few years old, he quantified this problem, which is quite big.

Because these anemometers are normally calibrated and replaced regularly, I would not expect this to bias the long-term trend; only if the wear is larger now than it was in the past would it create a trend bias. But it does create quite a lot of noise in the difference time series between one station and a neighbour, making relative homogenisation harder.
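To make the role of that noise concrete, here is a crude sketch of relative homogenisation (a minimal illustration, nothing like the operational methods): compute the difference series between a candidate and a neighbour and scan for the split point with the largest shift in the mean. More noise in the differences means a weaker test statistic and breaks that are harder to find.

```python
import numpy as np

def most_likely_break(candidate, neighbour, margin=5):
    """Scan the difference series for the most likely mean shift.

    Returns the index of the split that maximises the absolute
    t-statistic between the segment means, plus that statistic.
    """
    diff = np.asarray(candidate, float) - np.asarray(neighbour, float)
    best_t, best_i = 0.0, None
    for i in range(margin, diff.size - margin):
        a, b = diff[:i], diff[i:]
        se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
        t = abs(a.mean() - b.mean()) / se
        if t > best_t:
            best_t, best_i = t, i
    return best_i, best_t

# Synthetic example: a 0.5 °C break at index 60, plus extra noise in the
# differences (as worn bearings would produce).
rng = np.random.default_rng(1)
neighbour = rng.normal(10.0, 1.0, 120)
candidate = neighbour + np.where(np.arange(120) < 60, 0.0, 0.5)
candidate += rng.normal(0.0, 0.3, 120)
print(most_likely_break(candidate, neighbour))  # split found near index 60
```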



Marine humidity observations

My ISTI colleague Kate Willett was the recipient of the WCRP/GCOS International Data Prize 2016. She leads the ISTI benchmarking group and is especially knowledgeable when it comes to humidity observations. The prize was a nice occasion to invite her to talk about the upcoming HadISD marine humidity dataset. It looks to become a beautiful dataset with carefully computed uncertainties.

The 2-meter relative humidity over land has been declining since about 2000, and it is thus interesting to see how it changes over the ocean. Preliminary results suggest that the relative humidity is also declining over the ocean. Both quality control of individual values and bias corrections are important.

Developing countries

There was a workshop on the exchange of information about European initiatives in developing countries. Saskia Willemse of Meteo Swiss organised it after her experiences from a sabbatical in Bhutan. As in the rest of science, a large problem is that funding is often only available for projects and equipment, while it takes a long time to lift an organisation to a higher level: people need to learn how to use the equipment in practice, and the equipment is often not interoperable.

More collaboration could benefit both sides. Developing countries need information to adapt to climate change and to improve weather predictions. To study the climate system, science needs high-quality observations from all over the world. For me it is, for example, hard to find out how measurements are made now, and especially how they were made in the past. We have no parallel measurements in Africa and few in Asia. The Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) has far too few observations in developing countries. We will probably run into the same problem again with a global station reference network.

At the next EMS (in Budapest) there will be a session on this topic to get a discussion going on how we can collaborate better. The organisers will reach out to groups already doing this kind of work in the WMO, UNEP and the World Bank. One idea was to build a blog to give an overview of what is already happening.

I hope that it will be possible to have sustainable funding for weather services in poor countries, for capacity building and for making observations, in return for opening up their data stores. That would be something the UN climate negotiations could do via the [[Green Climate Fund]]. Compared to the costs of reducing greenhouse gases and adapting our infrastructure, the costs of weather services are small, and we need to know what will happen for efficient planning.

Somewhat related to this is the upcoming Data Management Workshop (DMW) in Peru modelled after the European EUMETNET DMWs, but hopefully with more people from South and Central America. The Peru workshop is organised by Stefanie Gubler of the Swiss Climandes project and will be held from 28th of May to the 1st of June 2018. More information follows later.

Wet bulb temperature

For the heat stress of workers, the wet bulb temperature is important. This is the temperature of a well-ventilated thermometer covered in a wet piece of cloth. If there is some wind, the wet bulb temperature gives an indication of the thermal comfort of a sweating person.
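If you want to play with it, Stull's (2011) empirical fit gives the wet bulb temperature directly from air temperature and relative humidity (valid roughly between -20 and 50 °C and for relative humidities above about 5%); the real quantity is defined thermodynamically, this is just a convenient approximation.

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Wet bulb temperature (°C) from Stull's (2011) empirical formula;
    t_c is the air temperature in °C, rh_pct the relative humidity in %."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

print(round(wet_bulb_stull(20.0, 50.0), 1))  # about 13.7 °C
```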

The fun fact I discovered is that weather forecasts of the wet bulb temperature are more accurate than those of temperature and relative humidity individually. There is even some skill up to 3 weeks ahead. Skill here only means that the weather prediction is better than using the climatic value. Any skill can have economic value, but forecasts that are sufficiently useful for the public would have a much shorter range.
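Skill relative to climatology is usually expressed as a mean-squared-error skill score; a minimal sketch of the computation (the variable names are mine):

```python
import numpy as np

def mse_skill_score(forecast, observed, climatology):
    """1 = perfect forecast, 0 = no better than the climatic value,
    negative = worse than simply forecasting climatology."""
    forecast, observed = np.asarray(forecast), np.asarray(observed)
    mse_forecast = np.mean((forecast - observed) ** 2)
    mse_climatology = np.mean((np.asarray(climatology) - observed) ** 2)
    return 1.0 - mse_forecast / mse_climatology
```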

Plague

The prize for the best Q&A goes to the talk on plague in the Middle Ages and its relationship with the weather in the preceding period (a somewhat cool previous summer, a somewhat warm previous winter and a warm summer: good rat weather).

Question: why did you only study the plague in the Middle Ages?
Answer: I am a mediaevalist.

Other observational findings

Ian Simpson studied different ways to compute the climate normals (the averages over 30 years). The main differences between temperature datasets were in China, due to a difference between how China itself computes the daily mean temperature (from synoptic fixed-hour measurements at 0, 6, 12 and 18 hours universal time) and how most climatological datasets do it (as the average of the minimum and maximum temperature). Apart from that, the main differences were seen where data was incomplete, because datasets use different methods to handle missing values.
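The difference between the two definitions of the daily mean is easy to see in a toy example (the hourly values below are made up):

```python
import numpy as np

# Hypothetical hourly temperatures (°C) for one day, index = hour UTC.
hourly = np.array([8.9, 8.5, 8.2, 8.0, 7.9, 8.1, 9.0, 10.5, 12.4, 14.2,
                   15.8, 17.0, 17.8, 18.2, 18.0, 17.3, 16.1, 14.6, 13.2,
                   12.1, 11.2, 10.4, 9.8, 9.3])

print(hourly.mean())                       # full 24-hour mean:    ~12.35
print(hourly[[0, 6, 12, 18]].mean())       # fixed synoptic hours: ~12.23
print((hourly.min() + hourly.max()) / 2)   # (Tmin + Tmax) / 2:     13.05
```

For an asymmetric diurnal cycle the definitions generally disagree, so a change of definition shows up as an inhomogeneity.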

There was another example where the automatic mode (joint detection) of HOMER produced bad homogenisation results. The manual mode of HOMER is very similar to PRODIGE, a good method recommended by HOME, but the joint detection part is new and has not been studied well yet. I would advise against using it on its own.

Lisa Hannak of the German weather service looked at inhomogeneities in parallel data: manual observations made next to automatic measurements. Because the two are so highly correlated, it is possible to see very small, and quite frequent, inhomogeneities. An interesting new field. Not directly related to EMS, but there will be a workshop on parallel data in November as part of the Spanish IMPACTRON project.

The European daily climate dataset ECA&D, which is often used to study changes in extreme weather, will soon have a homogenised version. Some breaks in earlier periods were not corrected because there were no good reference stations for those periods. I would suggest at least correcting the mean in such cases; that is better than doing nothing, and a large inhomogeneity in a dataset people expect to be homogenised is a problem.

One of the things that seems to help us free meteorological and climate data is the trend towards open government: as much as possible of the data the government has gathered is made available to the public via an [[API]]. Finland is working on such an initiative right now and has also freed the data of its weather service. Many people, and especially consultants, use such data. We can piggyback on this trend.

One can also estimate humidity with GPS satellites. Such data naturally also need to be homogenised. Roeland Van Malderen works on a benchmark to study how well this homogenisation would work.

The Austrian weather service ZAMG is working on an update for the HISTALP dataset with temperature and precipitation for the Greater Alpine Region. The new version will use HOMER. Two regions are ready.

It was great to see that Mexico is working on the homogenisation of some of their data. Unfortunately the network is very sparse after the 90s, which makes homogenisation difficult and the uncertainty in the trends large.