Sunday, 13 December 2015

My theoretical #AGU15 program

This Monday the Fall Meeting of the American Geophysical Union (AGU2015) starts. An important function of conferences is finding out who is working on what. Browsing the program already does much of that, even if you are not attending. Here is an overview of the presentations I would have had a look at, given my interest in climate data quality.

Links, emphasis and [explanations] are mine. The titles are linked to the AGU abstracts, where you can also mail, tweet or facebook the abstracts to others who may be interested.

Session: Taking the Temperature of the Earth: Long-Term Trends and Variability across All Domains of Earth's Surface

(Talks | Posters)

Inland Water Temperature and the recent Global Warming Hiatus

Simon J Hook, Nathan Healey, John D Lenters and Catherine O'Reilly
Extract abstract. We are using thermal infrared satellite data in conjunction with in situ measurements to produce water temperatures for all the large inland water bodies in North America and the rest of the world for potential use as a climate indicator. Recent studies have revealed significant warming of inland waters throughout the world. The observed rate of warming is – in many cases – greater than that of the ambient air temperature. These rapid, unprecedented changes in inland water temperatures have profound implications for lake hydrodynamics, productivity, and biotic communities. Scientists are just beginning to understand the global extent, regional patterns, physical mechanisms, and ecological consequences of lake warming.
See also my previous post on the fast warming of rivers and lakes and the decrease in their freezing periods. Unfortunately the abstract does not say much about the "hiatus" mentioned in the title.

There is also a detailed study on the relationship between air and water temperature for Lake Tahoe.

Global near-surface temperature estimation using statistical reconstruction techniques

Colin P Morice, Nick A Rayner and John Kennedy
abstract. Incomplete and non-uniform observational coverage of the globe is a prominent source of uncertainty in instrumental records of global near-surface temperature change. In this study the capabilities of a range of statistical analysis methods are assessed in producing improved estimates of global near-surface temperature change since the mid 19th century for observational coverage in the HadCRUT4 data set. Methods used include those that interpolate according to local correlation structure (kriging) and reduced space methods that learn large-scale temperature patterns.

The performance of each method in estimating regional and global temperature changes has been benchmarked in application to a subset of CMIP5 simulations. Model fields are sub-sampled and simulated observational errors added to emulate observational data, permitting assessment of temperature field reconstruction algorithms in controlled tests in which globally complete temperature fields are known.

The reconstruction methods have also been applied to the HadCRUT4 data set, yielding a range of estimates of global near-surface temperature change since the mid 19th century. Results show relatively increased warming in the global average over the 21st century owing to reconstruction of temperatures in high northern latitudes, supporting the findings of Cowtan & Way (2014) and Karl et al. (2015). While there is broad agreement between estimates of global and hemispheric changes throughout much of the 20th and 21st century, agreement is reduced in the 19th and early 20th century. This finding is supported by the climate model trials that highlight uncertainty in reconstructing data sparse regions, most notably in the Southern Hemisphere in the 19th century. These results underline the importance of continued data rescue activities, such as those of the International Surface Temperature Initiative and ACRE.

The results of this study will form an addition to the HadCRUT4 global near-surface temperature data set.
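
To give a flavour of what "interpolation according to local correlation structure" means, here is a minimal simple-kriging sketch on synthetic data, in the spirit of their benchmarking: a "complete" toy field is sub-sampled, reconstructed from the remaining observations and compared with the withheld truth. The covariance function, length scale and toy field are my own assumptions for illustration; this is not the HadCRUT4 analysis method.

```python
import numpy as np

def gaussian_cov(d, sill=1.0, length=1500.0):
    """Isotropic squared-exponential covariance as a function of distance (km)."""
    return sill * np.exp(-(d / length) ** 2)

def simple_kriging(obs_xy, obs_val, tgt_xy, noise=0.05):
    """Infill target points from observed (roughly zero-mean) anomalies."""
    d_oo = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
    d_to = np.linalg.norm(tgt_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
    K = gaussian_cov(d_oo) + noise * np.eye(len(obs_val))  # obs-obs covariance
    k = gaussian_cov(d_to)                                  # target-obs covariance
    estimate = k @ np.linalg.solve(K, obs_val)
    # Kriging variance: prior variance minus the variance explained by the observations.
    variance = gaussian_cov(0.0) - np.einsum("ij,ji->i", k, np.linalg.solve(K, k.T))
    return estimate, variance

# Benchmark in the spirit of the abstract: sub-sample a "complete" toy field,
# reconstruct it from the sub-sample and compare with the withheld truth.
rng = np.random.default_rng(0)
grid = np.array([(x, y) for x in range(0, 5000, 250) for y in range(0, 5000, 250)], float)
truth = np.sin(grid[:, 0] / 1200.0) + 0.5 * np.cos(grid[:, 1] / 900.0)
kept = rng.choice(len(grid), size=60, replace=False)
estimate, variance = simple_kriging(grid[kept], truth[kept], grid)
print("RMS reconstruction error: %.2f" % np.sqrt(np.mean((estimate - truth) ** 2)))
```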

The EUSTACE project: delivering global, daily information on surface air temperature

Nick A Rayner and Colin P Morice
At first sight, you may think my colleagues have gone crazy: a daily, spatially complete, global, centennial, high-resolution temperature dataset!

I would be so happy if we could get halfway reliable estimates of changes in weather variability and extremes from some high-quality, high-density station networks for the recent decades. It is really hard to detect and remove changes in variability caused by changes in monitoring practices, and these non-climatic changes are most likely huge.

However, if you read carefully, they only promise to make the dataset and do not promise that the data are fit for any specific use. One of the main ways mitigation sceptics misinform the public is by pretending that datasets provide reliable information for any application. In reality, the reliability of a feature needs to be studied first. Let's be optimistic and see how far they get; they have just started and bring a nice bag of tricks and plenty of mathematical prowess.
Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, we must develop an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. These relationships can be derived either empirically or with the help of a physical model.

Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals would be used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. ...
A separate poster will provide more details on the satellite data.
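
The empirical route mentioned in the abstract, relating satellite skin-temperature retrievals to surface air temperature, could in its simplest form look like the regression sketch below. The collocated land surface temperature (LST) and 2-m temperature samples are hypothetical and the seasonal predictor is my own choice; EUSTACE's statistical models will be considerably more sophisticated.

```python
import numpy as np

# Hypothetical collocated samples: satellite land surface temperature (lst, K),
# day of year (doy), and the in situ 2-m air temperature (t2m, K) we want to estimate.
rng = np.random.default_rng(1)
n = 1000
doy = rng.integers(1, 366, n)
season = np.cos(2 * np.pi * (doy - 30) / 365.25)
lst = 283.0 + 12.0 * season + rng.normal(0.0, 3.0, n)
t2m = 0.8 * lst + 55.0 + 1.5 * season + rng.normal(0.0, 1.0, n)  # synthetic "truth"

# Empirical relationship: t2m ~ a + b * lst + c * seasonal term
X = np.column_stack([np.ones(n), lst, season])
coef, *_ = np.linalg.lstsq(X, t2m, rcond=None)
estimate = X @ coef
print("regression coefficients:", np.round(coef, 2))
print("RMS error of the empirical air-temperature estimate: %.2f K"
      % np.sqrt(np.mean((estimate - t2m) ** 2)))
```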

The International Surface Temperature Initiative

Peter Thorne, Jay H Lawrimore, Kate Willett and Victor Venema
The Initiative is a multi-disciplinary effort to improve our observational understanding of all relevant aspects of land surface air temperatures from the global-mean centennial scale trends to local information relevant for climate services and climate smart decisions. The initiative was started in 2010 with a meeting that set the overall remit and direction. In the intervening 5 years much progress has been made, although much remains to be done. This talk shall highlight: the over-arching initiative framework, some of the achievements and major outcomes to date, as well as opportunities to get involved. It shall also highlight the many challenges yet to be addressed as we move from questions of global long-term trends to more local and regional data requirements to meet emerging scientific and societal needs.

Methodologies and Resulting Uncertainties in Long-Term Records of Ozone and Other Atmospheric Essential Climate Variables Constructed from Multiple Data Sources

Trends in atmospheric temperature and winds since 1959

Steven C Sherwood, Nidhi Nishant and Paul O'Gorman

Sherwood and colleagues have generated a new radiosonde dataset, removing artificial instrumental changes as well as they could. They find that the tropical hotspot does exist and that the model predictions of enhanced warming in the tropical troposphere thus fit the observations. They also find that the recent tropospheric trend is not smaller than before. I hope there is no Australian Lamar Smith who needs a "hiatus" and is willing to harass scientists for political gain.

That there is not even a non-significant deviation from the long-term warming trend is surprising, given the recently more frequent cooling phases of the El Niño–Southern Oscillation (ENSO), the La Niña phases. One would expect the influence of ENSO to be even stronger in the tropospheric temperatures than in the 2-m temperature. This makes it more likely that the (insignificant) trend change in the 2-m temperature is a measurement artefact.
Extract abstract. We present an updated version of the radiosonde dataset homogenized by Iterative Universal Kriging (IUKv2), now extended through February 2013, following the method used in the original version (Sherwood et al 2008 Robust tropospheric warming revealed by iteratively homogenized radiosonde data J. Clim. 21 5336–52). ...

Temperature trends in the updated data show three noteworthy features. First, tropical warming is equally strong over both the 1959–2012 and 1979–2012 periods, increasing smoothly and almost moist-adiabatically from the surface (where it is roughly 0.14 K/decade) to 300 hPa (where it is about 0.25 K/decade over both periods), a pattern very close to that in climate model predictions. This contradicts suggestions that atmospheric warming has slowed in recent decades or that it has not kept up with that at the surface. ...

Wind trends over the period 1979–2012 confirm a strengthening, lifting and poleward shift of both subtropical westerly jets; the Northern one shows more displacement and the southern more intensification, but these details appear sensitive to the time period analysed. Winds over the Southern Ocean have intensified with a downward extension from the stratosphere to troposphere visible from austral summer through autumn. There is also a trend toward more easterly winds in the middle and upper troposphere of the deep tropics, which may be associated with tropical expansion.

Uncertainty in Long-Term Atmospheric Data Records from MSU and AMSU

In session: Methodologies and Resulting Uncertainties in Long-Term Records of Ozone and Other Atmospheric Essential Climate Variables Constructed from Multiple Data Sources
Carl Mears

This talk presents an uncertainty analysis of known errors in tropospheric satellite temperature changes and an ensemble of possible estimates that makes computing uncertainties for a specific application easier.
The temperature of the Earth’s atmosphere has been continuously observed by satellite-borne microwave sounders since late 1978. These measurements, made by the Microwave Sounding Units (MSUs) and the Advanced Microwave Sounding Units (AMSUs), yield one of the longest truly global records of Earth’s climate. To be useful for climate studies, measurements made by different satellites and satellite systems need to be merged into a single long-term dataset. Before and during the merging process, a number of adjustments are made to the satellite measurements. These adjustments are intended to account for issues such as calibration drifts or changes in local measurement time. Because the adjustments are made with imperfect knowledge, they are not likely to reduce errors to zero, and thus introduce uncertainty into the resulting long-term data record. In this presentation, we will discuss a Monte-Carlo-based approach to calculating and describing the effects of these uncertainty sources on the final merged dataset. The result of our uncertainty analysis is an ensemble of possible datasets, with the applied adjustments varied within reasonable bounds, and other error sources such as sampling noise taken into account. The ensemble approach makes it easy for the user community to assess the effects of uncertainty on their work by simply repeating their analysis for each ensemble member.
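
The last sentence describes the intended workflow for users: simply repeat your analysis for every ensemble member and look at the spread. A minimal sketch of that idea for a global trend, using a hypothetical ensemble of global-mean anomaly series (the real RSS ensemble is, of course, gridded and much richer):

```python
import numpy as np

def decadal_trend(series, time_in_years):
    """Least-squares linear trend in K per decade."""
    return 10.0 * np.polyfit(time_in_years, series, 1)[0]

# Hypothetical ensemble: 100 realisations of a monthly global-mean anomaly series,
# differing by randomly perturbed (made-up) merging adjustments.
rng = np.random.default_rng(2)
years = 1979 + np.arange(12 * 37) / 12.0
signal = 0.015 * (years - years[0])                       # about 0.15 K per decade
perturbations = 0.05 * rng.normal(0.0, 0.02, (100, years.size)).cumsum(axis=1)
ensemble = signal + perturbations

# The user's analysis (here: a global trend) is simply repeated for every member.
trends = np.array([decadal_trend(member, years) for member in ensemble])
print("trend = %.3f +/- %.3f K/decade (2-sigma ensemble spread)"
      % (trends.mean(), 2.0 * trends.std()))
```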

Other sessions

The statistical inhomogeneity of surface air temperature in global atmospheric reanalyses

In session: Evaluating Reanalysis: What Can We Learn about Past Weather and Climate? (Talks I | Talks II | posters)
Craig R Ferguson and Min-Hee Lee
Recently, a new generation of so-called climate reanalyses has emerged, including the 161-year NOAA—Cooperative Institute for Research in Environmental Sciences (NOAA-CIRES) Twentieth Century Reanalysis Version 2c (20CR V2c), the 111-year ECMWF pilot reanalysis of the twentieth century (ERA-20C), and the 55-year JMA conventional reanalysis (JRA-55C). These reanalyses were explicitly designed to achieve improved homogeneity through assimilation of a fixed subset of (mostly surface) observations. We apply structural breakpoint analysis to evaluate inhomogeneity of the surface air temperature in these reanalyses (1851-2011). For the modern satellite era (1979-2013), we intercompare their inhomogeneity to that of all eleven available satellite reanalyses. Where possible, we distinguish between breakpoints that are likely linked to climate variability and those that are likely due to an artificial observational network shift. ERA-20C is found to be the most homogeneous reanalysis, with 40% fewer artificial breaks than 20CR V2c. Despite its gains in homogeneity, continued improvements to ERA-20C are needed. In this presentation, we highlight the most spatially extensive artificial break events in ERA-20C.
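
The abstract does not say which breakpoint test is used. As an illustration of the general idea of structural breakpoint analysis, here is a minimal SNHT-style sketch that scans a synthetic annual series for the most likely single shift in the mean; real applications work on difference or composite series and allow for multiple breaks.

```python
import numpy as np

def snht_statistic(series):
    """SNHT-type statistic T(k) for a single shift in the mean of a standardised series."""
    z = (series - series.mean()) / series.std(ddof=1)
    n = len(z)
    t = np.empty(n - 1)
    for k in range(1, n):                       # candidate break after element k-1
        t[k - 1] = k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
    return t

# Synthetic annual series with an artificial 0.6 K shift starting in 1976.
rng = np.random.default_rng(3)
years = np.arange(1951, 2011)
series = rng.normal(0.0, 0.3, years.size)
series[years >= 1976] += 0.6

t = snht_statistic(series)
best = np.argmax(t)
print("most likely break after %d (T = %.1f)" % (years[best], t[best]))
# Whether T exceeds a critical value, and whether the break coincides with a known
# change in the observing system, decides whether it is flagged as artificial.
```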
There is also a more detailed talk about the quality of humidity in reanalysis over China.

Assessment of Precipitation Trends over Europe by Comparing ERA-20C with a New Homogenized Observational GPCC Dataset

In session: Evaluating Reanalysis: What Can We Learn about Past Weather and Climate?
Elke Rustemeier, Markus Ziese, Anja Meyer-Christoffer, Peter Finger, Udo Schneider and Andreas Becker
...The monthly totals of the ERA-20C reanalysis are compared to two corresponding Global Precipitation Climatology Centre (GPCC) products: the Full Data Reanalysis Version 7 and the new HOMogenized PRecipitation Analysis of European in-situ data (HOMPRA Europe).
ERA-20C ... covers the time period 1900 to 2010. Only surface observations are assimilated, namely marine winds and pressure. This allows comparison with independent, non-assimilated data.
Sounds interesting; unfortunately, the abstract does not give many results yet.

Cyclone Center: Insights on Historical Tropical Cyclones from Citizen Volunteers

In session: Era of Citizen Science and Big Data: Intersection of Outreach, Crowd-Sourced Data, and Scientific Research
Peter Thorne, Christopher Hennon, Kenneth Knapp, Carl Schreck, Scott Stevens, James Kossin, Jared Rennie, Paula Hennon, Michael Kruk
The cyclonecenter.org project started in fall 2012 and has been collecting citizen scientist volunteer tropical cyclone intensity estimates ever since. The project is hosted by the Citizen Science Alliance (zooniverse) and the platform is supported by a range of scientists. We have over 30 years of satellite imagery of tropical cyclones, but the analysis to date has been done on an ocean-basin by ocean-basin basis and, worse still, practices have changed over time. We therefore do not, presently, have a homogeneous record relevant for discerning climatic changes. Automated techniques can classify many of the images but have a propensity to be challenged during storm transitions. The problem is fundamentally one where many pairs of eyes are invaluable as there is no substitute for human eyes in discerning patterns. Each image is classified by ten unique users before it is retired. This provides a unique insight into the uncertainty inherent in classification. In the three years of the project much useful data has accrued. This presentation shall highlight some of the results and analyses to date and touch on insights as to what has worked and what perhaps has not worked so well. There are still many images left to complete, so it's far from too late to jump over to www.cyclonecenter.org and help out.

Synergetic Use of Crowdsourcing for Environmental Science Research, Applications and Education

In session: Era of Citizen Science and Big Data: Intersection of Outreach, Crowd-Sourced Data, and Scientific Research
Udaysankar Nair
...Contextual information needed to effectively utilize the data is sparse. Examples of such contextual information include ground truth data for land cover classification, presence/absence of species, prevalence of mosquito breeding sites and characteristics of urban land cover. Often, there are no agencies tasked with routine collection of such contextual information, which could be effectively collected through crowdsourcing.

Crowdsourcing of such information, which is useful for environmental science research and applications, also provides opportunities for experiential learning at all levels of education. An appropriately designed crowdsourcing activity can transform students from passive recipients of information into generators of knowledge. ... One example is crowdsourcing of land use and land cover (LULC) data using Open Data Kit (ODK) and associated analysis of satellite imagery using Google Earth Engine (GEE). Implementation of this activity as an inquiry-based learning exercise, for both middle school students and pre-service teachers, will be discussed. Another example will detail the synergy between crowdsourcing for biodiversity mapping in southern India and environmental education...
There is also a crowdsourced project on land use and land cover for (urban) climatology: the World Urban Database. A great initiative.


Steps Towards a Homogenized Sub-Monthly Temperature Monitoring Tool

In session: Characterizing and Interpreting Changes in Temperature and Precipitation Extremes
Jared Rennie and Kenneth Kunkel
Land surface air temperature products have been essential for monitoring the evolution of the climate system. Before a temperature dataset is included in such reports, it is important that non-climatic influences be removed or changed so the dataset is considered homogeneous. These inhomogeneities include changes in station location, instrumentation and observing practices. Very few datasets are free of these influences and therefore require homogenization schemes. While many homogenized products exist on the monthly time scale, few daily products exist... Using these datasets already in existence, monthly adjustments are applied to daily data [of NOAA's Global Historical Climatology Network – Daily (GHCN-D) dataset]...
Great to see NOAA taking the first steps towards homogenization of its large daily data collection, a huge and important task.

Note that the daily data are only adjusted for changes in the monthly means. This is an improvement, but for weather extremes, the topic of this session, the rest of the marginal distribution also needs to be homogenized.
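
In its simplest form, "monthly adjustments applied to daily data" amounts to adding the monthly correction to every day of that month, as in the sketch below with a hypothetical daily series and made-up adjustments (NOAA's actual implementation may differ). The sketch also shows the limitation just mentioned: the means shift, but the variability within the month stays exactly the same.

```python
import numpy as np
import pandas as pd

# Hypothetical daily series plus made-up monthly adjustments (K) from a
# homogenization of the monthly means; a real product has one adjustment per
# station, calendar month and homogeneous sub-period.
rng = np.random.default_rng(4)
days = pd.date_range("1961-01-01", "1961-12-31", freq="D")
daily_raw = pd.Series(5.0 + 10.0 * np.sin(2 * np.pi * (days.dayofyear - 80) / 365.25)
                      + rng.normal(0.0, 3.0, days.size), index=days)
monthly_adjustment = pd.Series([0.4] * 6 + [-0.1] * 6, index=range(1, 13))

# Apply the monthly adjustment to every day of the corresponding month.
daily_adjusted = daily_raw + daily_raw.index.month.map(monthly_adjustment).values

january = daily_raw.index.month == 1
print("mean shift applied in January: %.2f K"
      % (daily_adjusted[january] - daily_raw[january]).mean())
# The day-to-day variability within the month is untouched; only the mean shifts.
print("January standard deviation raw vs adjusted: %.2f / %.2f"
      % (daily_raw[january].std(), daily_adjusted[january].std()))
```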


Temperature Trends over Germany from Homogenized Radiosonde Data

In session: Methodologies and Resulting Uncertainties in Long-Term Records of Ozone and Other Atmospheric Essential Climate Variables Constructed from Multiple Data Sources
Wolfgang Steinbrecht and Margit Pattantyús Ábráham
We present the homogenization procedure and results for Germany’s historical radiosonde [(RS)] records, dating back to the 1950s. Our manual homogenization makes use of the different RS networks existing in East and West Germany from the 1950s until 1990.

The largest temperature adjustments, up to 2.5K, are applied to Freiberg sondes used in the East in the 1950s and 1960s. Adjustments for Graw H50 and M60 sondes, used in the West from the 1950s to the late 1980s, and for RKZ sondes, used in the East in the 1970s and 1980s, are also significant, 0.3 to 0.5K. Small differences between Vaisala RS80 and RS92 sondes used throughout Germany since 1990 and 2005, respectively, were not corrected for at levels from the ground to 300 hPa.

Comparison of the homogenized data with other radiosonde datasets, RICH (Haimberger et al., 2012) and HadAT2 (McCarthy et al., 2008), and with Microwave Sounding Unit satellite data (Mears and Wentz, 2009), shows generally good agreement. HadAT2 data exhibit a few suspicious spikes in the 1970s and 1980s, and some suspicious offsets up to 1K after 1995. Compared to RICH, our homogenized data show slightly different temperatures in the 1960s and 1970s. We find that the troposphere over Germany has been warming by 0.25 ± 0.1K per decade since the early 1960s, slightly more than reported in other studies (Hartmann et al., 2013). The stratosphere has been cooling, with the trend increasing from almost no change near 230hPa (the tropopause) to -0.5 ± 0.2K per decade near 50hPa. Trends from the homogenized data are more positive by about 0.1K per decade compared to the original data, both in troposphere and stratosphere.
Statistical relative homogenization can only partially remove trend biases. Given that the trend needed to be corrected upwards, the real temperature trend may thus be larger.

Observed Decrease of North American Winter Temperature Variability

In session: Methodologies and Resulting Uncertainties in Long-Term Records of Ozone and Other Atmospheric Essential Climate Variables Constructed from Multiple Data Sources
Andrew Rhines, Martin Tingley, Karen McKinnon, Peter Huybers
There is considerable interest in determining whether temperature variability has changed in recent decades. Model ensembles project that extratropical land temperature variance will detectably decrease by 2070. We use quantile regression of station observations to show that decreasing variability is already robustly detectable for North American winter during 1979--2014. ...

We find that variability of daily temperatures, as measured by the difference between the 95th and 5th percentiles, has decreased markedly in winter for both daily minima and maxima. ... The reduced spread of winter temperatures primarily results from Arctic amplification decreasing the meridional temperature gradient. Greater observed warming in the 5th relative to the 95th percentile stems from asymmetric effects of advection [air movements] during cold versus warm days; cold air advection is generally from northerly regions that have experienced greater warming than western or southwestern regions that are generally sourced during warm days.
Studies on changes in variability are the best. Guest appearances of the Arctic polar vortex in the eastern USA had given me the impression that the variability had increased, not decreased. Interesting.
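
The variability measure used in the abstract, the difference between the 95th and 5th percentiles of daily winter temperatures, is easy to compute; here is a minimal sketch for one hypothetical station, one value per DJF winter. The paper itself uses quantile regression over many stations, which is statistically more powerful.

```python
import numpy as np
import pandas as pd

# Hypothetical daily minimum temperatures for one station, 1979-2014.
rng = np.random.default_rng(5)
days = pd.date_range("1979-01-01", "2014-12-31", freq="D")
t_min = pd.Series(rng.normal(-5.0, 8.0, days.size), index=days)

# Winter (DJF) values; December is assigned to the following winter.
djf = t_min[t_min.index.month.isin([12, 1, 2])]
winter = djf.index.year + (djf.index.month == 12).astype(int)

# Variability per winter as the 95th minus the 5th percentile of daily values.
spread = djf.groupby(winter).agg(lambda x: np.percentile(x, 95) - np.percentile(x, 5))
trend = 10.0 * np.polyfit(spread.index, spread.values, 1)[0]
print("trend in the 95th-5th percentile spread: %.2f K per decade" % trend)
```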

Century-Scale Variation of Standard Deviation in European Historical Temperature Records

In session: Methodologies and Resulting Uncertainties in Long-Term Records of Ozone and Other Atmospheric Essential Climate Variables Constructed from Multiple Data Sources
Fenghua Xie
The standard deviation (STD) variability in long historical temperature records in Europe is analyzed. It is found that STD is changeable with time, and a century-scale variation is revealed, which further indicates a century-scale intensity modulation of the large-scale temperature variability.

The Atlantic multidecadal oscillation (AMO) can cause significant impacts on the standard deviation. During the periods 1870–1910 and 1950–80, increasing standard deviation corresponds to an increasing AMO index, while during the periods 1920–50 and 1980–2000 decreasing standard deviation corresponds to a decreasing AMO index. The findings herein suggest a new perspective on the understanding of climatic change.
Studies on changes in variability are the best. This intriguing oscillation in the standard deviation of temperature was found before for the Greater Alpine Region by Reinhard Böhm. He also found it in precipitation and pressure (with a little imagination). One should be careful with such studies: changes in the standard deviation due to changes in monitoring practices (inhomogeneities) are mostly not detected, nor corrected. Only a few national and regional daily datasets have been (partially) homogenized in this respect.
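
The basic diagnostic behind such studies is a running standard deviation over a multi-decadal window, as in the sketch below for a hypothetical monthly anomaly series with an artificial century-scale modulation built in. As noted above, on real data such curves can also reflect changes in monitoring practices rather than the climate.

```python
import numpy as np
import pandas as pd

# Hypothetical monthly temperature anomalies, 1850-2010, with a slowly varying
# amplitude to mimic a century-scale modulation of the variability.
rng = np.random.default_rng(6)
months = pd.date_range("1850-01-01", "2010-12-01", freq="MS")
amplitude = 1.0 + 0.3 * np.sin(2 * np.pi * np.arange(months.size) / (100 * 12))
anomalies = pd.Series(amplitude * rng.normal(0.0, 1.0, months.size), index=months)

# 30-year (360-month) running standard deviation, centred on each month.
running_std = anomalies.rolling(window=30 * 12, center=True).std()
print(running_std.dropna().iloc[::120].round(2))   # one value every ten years
```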


Could scientists ‘peer-review’ the web?

A Town Hall presentation by Emmanuel Vincent of Climate Feedback and Dan Whaley, the founder of Hypothesis, which is the software basis of Climate Feedback. Tuesday, 15 December, 12:30 to 13:30.

I am quite sure that I missed relevant presentations. Please add them in the comments.




Related reading

Why raw temperatures show too little global warming

Lakes are warming at a surprisingly fast rate

3 comments:

  1. Victor - it appears things are looking up, with the satellites finally coming closer to reality, and possibly more scientific confirmation for Karl.

  2. The abstract of Sherwood on radiosondes is the strongest evidence that the microwave retrievals of temperature have some sort of problem. He had a paper on this earlier in case you want to see the (earlier?) evidence. Iterative homogenization methods are potentially more powerful, but you have to validate them very well: a small error can lead to larger trend errors rather than the reductions in trend errors that the algorithms are supposed to deliver.

    The better uncertainty estimates of RSS sound interesting and I expect them to be considerable. Let's see the results and later the paper. I would not expect better error estimates to change the public "hiatus" "debate". The change in slope was never statistically significant, but why care about statistics if you can make a political point against mitigation?

    The large changes in the microwave-estimated temperature trends have up to now mostly come from discovering that unknown unknowns need to be taken into account. Thus my bet would be on an unknown unknown.

    You may find my earlier comment at Climate Etc. interesting. I hope to make it into a post one day. It fits the current ATTP post, as it is about the consilience of evidence:

    Given that the difference [between models and microwave temperature trends] is mainly due to the missing tropical hotspot in the satellite temperature trend, it seems more likely than not that there is some problem with the satellite trends.

    The tropical hotspot 1) is seen in some radiosonde datasets, 2) is seen in radiosonde winds, 3) is expected from basic physics (we know that the moist adiabatic temperature profile should be a good approximation in the tropics due to the large amount of convection), 4) is consistent with the strong response of the troposphere compared to the surface at shorter time scales, and 5) is seen in climate models.

    But we will only know this with confidence when we find the reason for the problem with the satellite trends, or when we find problems with all five pieces of evidence arguing against them.

  3. Australia does have Lamar Smith equivalents, but they don't have quite the same range of tools available to them as their American counterparts (especially as none of the authors of the Sherwood et al paper are government employees).

