Friday, 17 February 2012

HUME: Homogenisation, Uncertainty Measures and Extreme weather

Proposal for future research in homogenisation

To keep this post short, a background in homogenisation is assumed and not every argument is fully rigorous.

Aim

This post aims to start a discussion on the research priorities in the homogenisation of historical climate data from surface networks. It will argue that, with the increasing scientific work on changes in extreme weather, the homogenisation community should work more on daily data and especially on quantifying the uncertainties remaining in homogenised data. Comments on these ideas are welcome, as are further thoughts. Hopefully we can reach a consensus on research priorities for the coming years; speaking with a common voice will strengthen our position with research funding agencies.

State-of-the-art

From the homogenisation of monthly and yearly data, we have learned that the size of breaks is typically of the order of the climatic changes observed in the 20th century and that the period between two detected breaks is around 15 to 20 years. These inhomogeneities are thus a significant source of error and need to be removed. The benchmark of the COST Action HOME has shown that such breaks can be removed reliably and that homogenisation improves the usefulness of temperature and precipitation data for studying decadal variability and secular trends. Not all problems are optimally solved yet: the solutions for the inhomogeneous-reference problem are still quite ad hoc, the HOME benchmark found mixed results for precipitation, and the handling of missing data can probably be improved. Furthermore, the homogenisation of other climate elements, and of data from different, for example dry, regions should be studied. In general, however, annual and monthly homogenisation can be seen as a mature field.

The homogenisation of daily data, by contrast, is still in its infancy. Daily datasets are essential for studying extremes of weather and climate; here the focus is not on the mean values, but on what happens in the tails of the distributions. Looking at the physical causes of inhomogeneities, one would expect many of them to especially affect the tails. Likewise, the IPCC AR4 report warns that changes in extremes are often more sensitive to inhomogeneous climate monitoring practices than changes in the mean.
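The numbers above (roughly one break every 15 to 20 years, break sizes of the same order as the 20th-century warming) can be made concrete with a small simulation. The sketch below is purely illustrative; the function name and all parameter values are my own assumptions for demonstration, not fitted to any real network.

```python
import random

def synthetic_station_series(n_years=100, mean_break_interval=17.5,
                             break_sd=0.8, noise_sd=0.5, seed=42):
    """Simulate an annual temperature anomaly series with inhomogeneities.

    Breaks occur on average once per mean_break_interval years, following
    the rule of thumb in the post; break sizes (break_sd, in degC) are of
    the same order as the 20th-century trend. Illustrative numbers only.
    """
    rng = random.Random(seed)
    offset = 0.0          # accumulated non-climatic offset from past breaks
    series, breaks = [], []
    for year in range(n_years):
        # On average one break every mean_break_interval years.
        if rng.random() < 1.0 / mean_break_interval and year > 0:
            offset += rng.gauss(0.0, break_sd)
            breaks.append(year)
        climate_signal = 0.007 * year          # ~0.7 degC per century
        weather_noise = rng.gauss(0.0, noise_sd)
        series.append(climate_signal + offset + weather_noise)
    return series, breaks

series, breaks = synthetic_station_series()
```

Running this a few times with different seeds shows why homogenisation matters: the accumulated offsets are easily as large as the imposed secular trend.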

Monday, 16 January 2012

Homogenisation of monthly and annual data from surface stations

To study climate change and variability, long instrumental climate records are essential, but they are best not used directly. These datasets are essential since they are the basis for assessing century-scale trends and for studying the natural (long-term) variability of climate, amongst others. Their value, however, strongly depends on the homogeneity of the underlying time series. A homogeneous climate record is one where variations are caused only by variations in weather and climate. In our recent article we wrote: “Long instrumental records are rarely if ever homogeneous”. A non-scientist would simply write: homogeneous long instrumental records do not exist. In practice there are always inhomogeneities due to relocations, changes in the surroundings, instrumentation, shelters, etc. If a climatologist only writes “the data is thought to be of high quality”, then removes half of the data and does not mention the homogenisation method used, it is wise to assume that the data is not homogeneous.

Results from the homogenisation of instrumental western climate records indicate that detected inhomogeneities in mean temperature series occur at a rate of roughly one per 15 to 20 years. It should be kept in mind that most measurements have not been made specifically for climatic purposes, but rather to meet the needs of weather forecasting, agriculture and hydrology (Williams et al., 2012). Moreover, the typical size of the breaks is often of the same order as the climatic change signal during the 20th century (Auer et al., 2007; Menne et al., 2009; Brunetti et al., 2006; Caussinus and Mestre, 2004; Della-Marta et al., 2004). Inhomogeneities are thus a significant source of uncertainty in the estimation of secular trends and decadal-scale variability.

If all inhomogeneities were purely random perturbations of the climate records, their collective effect on the mean global climate signal would be negligible. However, certain changes are typical for certain periods and occurred at many stations. These are the most important causes discussed below, as they can collectively lead to artificial biases in climate trends across large regions (Menne et al., 2010; Brunetti et al., 2006; Begert et al., 2005).

In this post I will introduce a number of typical causes of inhomogeneities and methods to remove them from the data.
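The core idea behind the removal methods mentioned here, relative homogenisation, can be illustrated with a minimal sketch: compute the difference between a candidate station and a homogeneous reference built from neighbours, look for a step in that difference series, and shift the earlier segment accordingly. Operational algorithms (SNHT, PRODIGE, and others) use proper statistical tests and handle multiple breaks; the functions and parameter choices below are my own simplified stand-ins, for illustration only.

```python
def detect_break(candidate, reference):
    """Locate the most likely single break by comparing a candidate series
    to a homogeneous reference from neighbouring stations.

    The difference series removes the shared regional climate signal, so a
    station-specific inhomogeneity shows up as a step. Returns the split
    index and the step size (later minus earlier segment mean).
    """
    diff = [c - r for c, r in zip(candidate, reference)]
    n = len(diff)
    best_split, best_score, best_step = None, 0.0, 0.0
    for k in range(5, n - 5):              # require 5 values per segment
        left, right = diff[:k], diff[k:]
        step = sum(right) / len(right) - sum(left) / len(left)
        # Weight by segment lengths so splits are not favoured by chance
        # fluctuations in very short segments.
        score = abs(step) * (len(left) * len(right)) ** 0.5 / n
        if score > best_score:
            best_split, best_score, best_step = k, score, step
    return best_split, best_step

def correct_break(candidate, split, step):
    """Adjust the earlier segment so that the whole series is homogeneous
    with the most recent measurement conditions."""
    return [v + step for v in candidate[:split]] + list(candidate[split:])

# Toy example: a 0.5 degC artificial warm bias before year 30
# (e.g. an old wall screen), removed by a relocation.
reference = [0.01 * t for t in range(60)]
candidate = [r + (0.5 if t < 30 else 0.0) for t, r in enumerate(reference)]
split, step = detect_break(candidate, reference)
homogenised = correct_break(candidate, split, step)
```

Note that the correction aligns the past with the present, the usual convention, so that ongoing measurements can extend the homogenised record without further adjustment.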

Tuesday, 10 January 2012

New article: Benchmarking homogenisation algorithms for monthly data

The main paper of the COST Action HOME on the homogenisation of climate data has been published today in Climate of the Past. This post briefly describes the problem of inhomogeneities in climate data and how such data problems are corrected by homogenisation. The main part explains the topic of the paper: a new blind validation study of homogenisation algorithms for monthly temperature and precipitation data, in which all of the most used and best algorithms participated.

Inhomogeneities

To study climatic variability the original observations are indispensable, but not directly usable. Next to real climate signals they may also contain non-climatic changes. Corrections to the data are needed to remove these non-climatic influences; this is called homogenisation. The best known non-climatic change is the urban heat island effect. The temperature in cities can be warmer than in the surrounding countryside, especially at night. Thus as cities grow, one may expect that temperatures measured in cities become higher. On the other hand, many stations have been relocated from cities to nearby, typically cooler, airports.

Other non-climatic changes can be caused by changes in measurement methods. Meteorological instruments are typically installed in a screen to protect them from direct sun and wetting. In the 19th century it was common to use a metal screen on a north-facing wall. However, the building may warm the screen, leading to higher temperature measurements. When this problem was realised, the so-called Stevenson screen was introduced, typically installed in gardens, away from buildings. This is still the most common weather screen, with its typical double-louvre door and walls. Nowadays automatic weather stations, which reduce labour costs, are becoming more common; they protect the thermometer with a stack of white plastic cones. This necessitated a change from manually read liquid-in-glass thermometers to automated electrical resistance thermometers, which reduces the recorded temperature values.



One way to study the influence of changes in measurement techniques is to make simultaneous measurements with historical and current instruments, procedures or screens. This picture shows three meteorological shelters next to each other in Murcia (Spain). The rightmost shelter is a replica of the Montsouri screen, in use in Spain and many European countries in the late 19th and early 20th century. In the middle is a Stevenson screen equipped with automatic sensors; leftmost, a Stevenson screen equipped with conventional meteorological instruments.
Picture: Project SCREEN, Center for Climate Change, Universitat Rovira i Virgili, Spain.


A further example of a change in the measurement method is that the precipitation amounts observed in the early instrumental period (roughly before 1900) are biased: they are about 10% lower than nowadays because the measurements were often made on a roof. At the time, instruments were installed on rooftops to ensure that the gauge was never shielded from the rain, but it was later found that, due to the turbulent flow of the wind over roofs, some rain droplets and especially snowflakes did not fall into the opening. Consequently, measurements are nowadays performed closer to the ground.

Sunday, 8 January 2012

What distinguishes a benchmark?

Benchmarking is a community effort

Science has many terms for studying the validity or performance of scientific methods: testing, validation, intercomparison, verification, evaluation, and benchmarking. Every term has a different, sometimes only subtly different, meaning. Initially I had wanted to compare all these terms with each other, but that would have become a very long post, especially as the meaning of every term differs between business, engineering, computation and science. Therefore, this post will only propose a definition for benchmarking in science and explain what distinguishes it from other approaches, casually called other validation studies from now on.

In my view benchmarking has three distinguishing features.
1. The methods are tested blind.
2. The problem is realistic.
3. Benchmarking is a community effort.
The term benchmark has become fashionable lately. It is, however, also used for validation studies that do not display these three features. This is not wrong, as there is no generally accepted definition of benchmarking. In fact, an important article on benchmarking by Sim et al. (2003) defines "a benchmark as a test or set of tests used to compare the performance of alternative tools or techniques", which would include any validation study. They then limit the topic of their article, however, to interesting benchmarks, which are "created and used by a technical research community". Yet if benchmarking is used for any type of validation study, the word has no added value. Thus I hope this post can be a starting point for a generally accepted and more restrictive definition.

Friday, 16 December 2011

Natural cures for Asthma?

About two years ago, I was diagnosed with asthma. After some changes to my lifestyle, the last whole-body plethysmography measurement showed that my lungs are fine again, although I no longer use any medication. The asthma is gone!

I would like to share these changes with you, hoping they may also benefit you, and also to hear back what has benefited others and what has not. My personal experiment is rather small (n=1), thus it may well be that some improvements were just accidental and not due to the lifestyle changes. I have to say that I only had very light asthma. I never had an asthmatic attack, but I regularly wheezed lightly when exhaling at night, my voice was not as strong any more, and my lungs produced too much mucus (leading to some coughing and a coated tongue). Another sign was that the reliever medication (bronchodilators) made jogging a lot easier.

Asthma has been on the rise in the West for the last 50 years. This alone points to lifestyle factors being important, but not much seems to be known about which factors these are. It has been noted that children growing up on farms are less affected by asthma than urban children. Based on this, it has been theorised that childhood contact with microbes is beneficial, but I guess there are quite a few other differences between life in the countryside and in cities.

The main changes I made are that I started with intermittent fasting, nowadays sleep on a firm surface, and do much more walking and hiking. Also important may have been that I do not eat any grains any more, do less jogging and more sprinting, and that I regularly tan for more vitamin D.

Monday, 5 December 2011

Plan B for our planet

The Kyoto protocol is running out and the climate conference in Durban will likely end without a new, stronger protocol to reduce (the growth of) greenhouse gas emissions. Maybe we need a plan B.

Under the Kyoto protocol, a cap is set on the greenhouse gas emissions of the participating industrialised countries. Within this group, emission rights can be traded, so that emissions are cut in the most efficient way. With a similar aim, emissions can also be reduced by financing emission reductions in emerging economies and developing countries.

The problem with the Kyoto protocol is that a cap on greenhouse emissions only makes sense if everyone participates, or at least will participate as soon as they are rich enough. It is possible to do so: the Montreal protocol to curb emissions of chlorofluorocarbons (CFCs) to protect the ozone layer works well. In the case of the Montreal protocol, however, only the producers of fridges, air conditioners and spray cans were affected. Greenhouse gases are emitted by the energy, agricultural and building sectors. These are very powerful agents with an interest in the status quo. With propaganda and by encouraging conflicts, they can make sure that there are always some large greenhouse gas producers not participating.

Maybe a global cap is not needed. Maybe we can see the problem as a dynamic one: how can we develop cost-efficient technologies to reduce greenhouse gas emissions? These technologies will be developed if there is a clear price signal; emitting CO2 and wasting energy should be costly. Furthermore, economies of scale will be important to reduce prices and increase competition.

For this, all we need are higher prices for greenhouse gas emissions (for non-renewable energy, etc.) in a large part of the world, but not necessarily all of it. This part of the world should be allowed to protect itself against imports produced with cheap energy. That is all the world would need to agree upon.

I expect that Europe would be the region to start working this way. Due to the import levies, the playing field would be levelled and Europe's industry would be able to compete here and now with industries from outside. In the long run, Europe's industry would become more efficient and would be a world leader in green technologies: technologies that will be needed everywhere once the prices of energy, concrete and fertiliser start to rise due to shortages.

This is quite an attractive position: no disadvantage now, due to the levy, and likely advantages later, due to a technological leading role. It might well be that many countries would like to cooperate and join this green region, to be able to export without levies and to be part of the future.

An agreement that climate import levies are allowed may be easier to achieve than a global cap on greenhouse emissions.

Further reading

More posts on economics.

Thursday, 1 December 2011

Where good ideas come from

Steven Johnson wrote a book on creativity and innovation, "Where good ideas come from - The seven patterns of innovation". The book is very well written and captivating. In fact, I was reading another book (written by a scientist), which I put on hold to be able to read Good Ideas.

Afterwards, I was somewhat disappointed by the book. It was captivating, the examples were interesting and many of the references are worth diving into, but I did not learn much about innovation and creativity. I expect that many people who were already interested in the topic will feel the same.

In the last chapter Johnson tries to do some science himself. He takes a long list of innovations and divides them up along two axes: individual versus team work (networked) and market versus non-market. Most of the innovations fall in the category networked and non-market. He finds this interesting because markets are supposed to be so great, and market incentives should lead people to make more inventions, but that does not seem to be the case.

If you look at the innovations themselves, you quickly notice that the market innovations include some services but are mainly products and their components: the airplane, steel, the induction motor, contact lenses, etc. The non-market innovations do include some consumer products, but are mainly scientific ideas, theories, discoveries and instruments: Braille, the periodic table, RNA splicing, chloroform, the EKG, cosmic background radiation, etc.

Depending on how many consumer products and how many scientific ideas you put on the list, you can get any ratio of market to non-market innovations. Furthermore, if Western societies did not give most of their funding for basic research to universities, the list would also have looked very different. Thus it is not possible to conclude from this list whether market or non-market forces are best at innovating.

I like Johnson's conclusion, and I think he is right, judging from what motivates people to innovate, but you cannot reach this conclusion from the analysis in the last chapter.

Want to read something good by Steven Johnson? Read his book Emergence or go to his blog, with many interesting ideas on innovation on the internet.

Monday, 14 November 2011

Freedom to learn

On Psychology Today, Peter Gray writes an intriguing blog on education called Freedom to Learn. In it he argues that forcing children to learn stifles their innate motivation to teach themselves and is thus counterproductive.

The post titled Seven Sins of forced education gives as the fourth sin: "Interference with the development of personal responsibility and self-direction."

Children are biologically predisposed to take responsibility for their own education... The implicit and sometimes explicit message of our forced schooling system is: "If you do what you are told to do in school, everything will work out well for you." Children who buy into that may stop taking responsibility for their own education.

When I look back on my time as a student, what surprises me most is that I never had the idea to explore a topic myself. I just read the books the school or university told me to read, and that was it. Only as a researcher did I start to take responsibility for my own education.

Another sin that fits well with the theme of this blog, variability, is the "Reduction in diversity of skills, knowledge, and ways of thinking." In a highly diversified economy, it is detrimental to teach everyone the same topics.

A highly recommended blog, Freedom to learn.

Further reading

More posts on paleo (diet and lifestyle topics inspired by evolutionary thinking).

Paleo and fruitarian lifestyles have a lot in common
A comparison of the main ideas and recommendations of these two lifestyles.
Natural cures for asthma?
Some natural approaches that helped me cure or reduce my asthma.
Sleep and diversity
Differences in sleeping times, from early bird to night owls, may provide security advantages.
Is obesity bias evolutionary?
A critical comment on an article, which states that humans have an intrinsic propensity to eat too much.
Freedom to learn
Forcing children to learn stifles their innate motivation to teach themselves and may thus be counter productive.
The paleo culture
After the Ancestral Health Symposium 2012, a discussion started about the sometimes self-centred culture of the paleo community.