Uncertainty in observed datasets takes many forms, from the quality and consistency of the underlying data, to the choices made within a chosen gridding/interpolation method (parametric uncertainty), to the station network selection and analytical framework (structural uncertainty). Of these, structural uncertainty generally has the largest influence on the resulting gridded product, particularly in the representation of extremes and their trend estimates. However, datasets are rarely produced with uncertainty estimates, and users are often unaware that the choice of observational product can substantially affect results. This project will assess how changing station networks or parameter settings within interpolation methods affects trends in temperature extremes and, in turn, whether this could affect detection and attribution analyses. The objective of this research problem is to test the sensitivity of gridded output to changing parameters and input station networks, and to discuss in detail how and why the results vary when input parameters change and which choices matter most (or least) for the climate of the region. Students will decide which parameter settings to test and how the station networks are set up. Data for the ETCCDI temperature indices, e.g. the annual maximum of daily maximum temperature (TXx) and the annual minimum of daily minimum temperature (TNn), contained in the HadEX2 observational extremes indicators dataset (Donat et al., 2013) will be supplied for different regions. Ultimately the results from this project could feed into Research Problem 3.
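As a concrete starting point, the network-sensitivity idea can be sketched as a leave-one-out experiment on synthetic daily data (a minimal Python illustration; `grid_cell_index` with uniform weights is a hypothetical simplification, not the HadEX2 gridding method, which uses angular-distance weighting with its own tunable parameters):

```python
import numpy as np

def annual_txx(tmax_daily):
    """TXx: annual maximum of daily maximum temperature."""
    return float(np.max(tmax_daily))

def annual_tnn(tmin_daily):
    """TNn: annual minimum of daily minimum temperature."""
    return float(np.min(tmin_daily))

def grid_cell_index(station_values):
    """Combine station-level index values into one grid-cell value.
    Real methods use distance-based weights; uniform weights here."""
    return float(np.mean(np.asarray(station_values, dtype=float)))

# Leave-one-out network sensitivity: drop each station in turn and
# record how much the gridded TXx value moves.
rng = np.random.default_rng(0)
stations = [rng.normal(25.0, 5.0, 365) for _ in range(8)]  # synthetic daily Tmax
txx = [annual_txx(s) for s in stations]
full = grid_cell_index(txx)
leave_one_out = [grid_cell_index([v for j, v in enumerate(txx) if j != i])
                 for i in range(len(txx))]
spread = max(leave_one_out) - min(leave_one_out)  # network-choice footprint
```

The same loop, applied instead to varying an interpolation parameter (e.g. a decorrelation length or minimum-station threshold) over a range of values, probes the parametric rather than the structural component of the uncertainty.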
Climate science resorts to spatial statistics for predicting future changes, detecting changes on large time scales, and modeling unobserved locations and times (broadly termed climate reconstruction, using a variety of, often ad hoc, imputation techniques). The amount of data involved is so large that handling it becomes a statistical problem in its own right: in the presence of very large datasets, the estimation of parametric models, prediction at unobserved sites, and the associated uncertainty estimation may not be computationally feasible. Consequently, one of the main objectives of statistical climatology is to extract the relevant information hidden in complex spatio-temporal climatological datasets. To identify spatial patterns, the best-known statistical techniques in climate studies are based on the concept of variance, like the k-means clustering algorithm, or Empirical Orthogonal Function (EOF) analysis, which decomposes estimated variance-covariance matrices. This makes sense for applications that aim at identifying patterns with respect to mean behavior. In particular, it is ideally suited when the variable of interest follows a mixture of normal distributions, because Gaussian random vectors are fully characterized by their mean vectors and covariance matrices. Such variance-based tools are, however, ill-suited to extremes, which are governed by the tails rather than the bulk of the distribution. A possible avenue to bridge this methodological gap resides in taking advantage of multivariate extreme value theory (EVT) and adapting it to the context of dimension reduction. The problem of dimension reduction is challenging here, since multivariate EVT is by nature non-parametric (unlike Gaussian modeling through correlation matrices), and most applications of non-parametric multivariate EVT have dealt with very low dimensions (fewer than 10).
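For reference, the variance-based EOF decomposition mentioned above amounts to a singular value decomposition of the centred (time x space) data matrix; a minimal sketch on synthetic data (function and variable names are illustrative):

```python
import numpy as np

def eof_analysis(field, n_modes=3):
    """EOF analysis of a (time x space) data matrix: remove the time mean,
    take the SVD, and return the leading spatial patterns (EOFs), their
    principal-component time series, and the explained variance fractions."""
    anomalies = field - field.mean(axis=0)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)
    eofs = vt[:n_modes]                  # spatial patterns
    pcs = u[:, :n_modes] * s[:n_modes]   # time series of each mode
    return eofs, pcs, var_frac[:n_modes]

# Synthetic field with one dominant oscillating spatial mode plus noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0 * np.pi, 200)
pattern = np.sin(np.linspace(0.0, np.pi, 50))
field = np.outer(np.sin(t), pattern) + 0.1 * rng.normal(size=(200, 50))
eofs, pcs, var_frac = eof_analysis(field)
```

Because the decomposition is driven entirely by the covariance of the anomalies, the leading mode here recovers the bulk oscillation; tail events contribute little, which is precisely the limitation the EVT-based dimension reduction aims to address.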
Both cold and warm temperature extremes have warmed since the middle of the 20th century, and a number of detection and attribution studies have demonstrated that human influence on the climate system has very likely contributed to this warming (IPCC, 2013). It is also widely accepted that human influence has affected the characteristics of warm spells/heat waves and other indicators of temperature that are related to impacts, such as the number of frost days per year (days with minimum temperature below 0°C), the number of tropical nights per year (days with minimum temperature above 20°C, a key factor associated with the health impacts of heat waves), and the number of warm days per year (days with daily maximum temperature above 25°C). While it is generally accepted that human influence has affected these indices, this has not yet been confirmed with formal detection and attribution studies. The objective of the project is, therefore, to use formal detection and attribution methods to determine whether this acceptance, which is reported in IPCC assessments, is, in fact, reasonable. This will involve (i) the careful assessment of a range of observed and simulated temperature indices that are contained in the HadEX2 observational extremes indicators dataset (Donat et al., 2013) and that have been extracted from CMIP5 simulations (Sillmann et al., 2013), respectively, and where appropriate, (ii) the application of well-established detection and attribution methods (e.g., Hegerl and Zwiers, 2011) to determine whether observed changes in these indices can be attributed to human influence on the climate system. The indicators need careful assessment prior to becoming a subject of a detection and attribution analysis because model biases, or index definitions that are inappropriate for the climate to which they are applied, may create situations in which the indices become ineffective as indicators of variability or change.
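The index definitions above, and the regression at the heart of single-fingerprint detection, are simple to state; a minimal Python sketch (the thresholds follow the definitions in the text; `scaling_factor` is an illustrative ordinary-least-squares reduction of the full optimal-fingerprinting regression, which additionally accounts for internal variability and noise in the fingerprint):

```python
import numpy as np

def frost_days(tmin):
    """Number of days with daily minimum temperature below 0 °C."""
    return int(np.sum(np.asarray(tmin, dtype=float) < 0.0))

def tropical_nights(tmin):
    """Number of days with daily minimum temperature above 20 °C."""
    return int(np.sum(np.asarray(tmin, dtype=float) > 20.0))

def warm_days(tmax):
    """Number of days with daily maximum temperature above 25 °C."""
    return int(np.sum(np.asarray(tmax, dtype=float) > 25.0))

def scaling_factor(obs, fingerprint):
    """OLS scaling factor beta in obs ~ beta * fingerprint.  In detection
    and attribution, detection is claimed when beta is inconsistent with
    zero, attribution when it is also consistent with one."""
    obs = np.asarray(obs, dtype=float)
    fingerprint = np.asarray(fingerprint, dtype=float)
    return float(fingerprint @ obs / (fingerprint @ fingerprint))
```

In practice the observed and model-simulated index series would first be screened as described above (bias, applicability of the threshold to the regional climate) before being fed into the regression.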
The assessment of the prediction skill of extreme climate events is the first step towards an efficient application of seasonal prediction in both society and industry. Multi-model global retrospective predictions will be used by the students to investigate the ability of current operational systems to predict the 10th and 90th percentiles of seasonal precipitation, temperature and wind. They will compare the skill and reliability of the predictions for extreme events with the forecast quality of the seasonal averages, explore the conditional skill by stratifying the events as a function of a subset of large-scale variability modes (NAO, ENSO), and investigate how the skill and the prediction uncertainty change as more prediction systems are added. A discussion of the relevance of predicting seasonal extreme events for different sectors is expected. Our research unit at IC3 is developing a set of R functions to perform the analyses on climate predictions, which will be released via CRAN.
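One standard way to quantify the skill of probabilistic forecasts of percentile exceedances is the Brier skill score against a climatological reference; a minimal sketch (written in Python for illustration, although the project's tooling is in R):

```python
import numpy as np

def brier_score(prob_forecasts, outcomes):
    """Mean squared difference between forecast probabilities and binary
    outcomes (1 if the extreme event occurred, 0 otherwise)."""
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    return float(np.mean((p - o) ** 2))

def brier_skill_score(prob_forecasts, outcomes, p_clim):
    """Skill relative to always forecasting the climatological probability
    (p_clim = 0.1 for exceedance of the 90th percentile).  Positive values
    indicate skill over climatology; 1 is a perfect forecast."""
    bs = brier_score(prob_forecasts, outcomes)
    bs_ref = brier_score([p_clim] * len(outcomes), outcomes)
    return 1.0 - bs / bs_ref
```

Stratifying the forecast-outcome pairs by, e.g., ENSO phase before computing the score gives the conditional skill discussed above, and recomputing it as models are added to the multi-model ensemble tracks how skill changes with ensemble size.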
Introductory material on predicting seasonal or decadal extremes:
An analysis will be made of the CMIP5 ensembles of coupled model simulations, comparing simulations that include both anthropogenic and natural forcings on climate with those that include only natural forcings. Estimates will be made of the changed probability of unusually warm/cold/dry/wet seasonal means between the two sets of runs. Investigations will also be carried out into the reliability of these estimates, by comparing with observational estimates of the variability of temperature and precipitation, and into the uncertainty in these estimates due to modelling uncertainties. If comparable estimates [from a companion project - Project 4] are available from atmosphere-only runs forced with observed SSTs and sea ice conditions, it would be interesting to compare and contrast results of the change in probability (e.g. for a particularly cold winter or wet summer in Northern Europe) from coupled model runs (which provide the overall change of risk) with results from SST- and sea ice-forced runs (which provide the change of risk conditional on a particular marine state, e.g. with the ENSO and Arctic sea ice conditions seen in a particular year). In this way it would be possible to investigate to what extent any changed risk (e.g. of a cold winter or a wet summer) can be partitioned into a component attributable to anthropogenic climate change and a component attributable to natural internal climate variability.
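The changed-probability calculation reduces to comparing exceedance frequencies between the two ensembles; a minimal sketch with synthetic ensembles (function names are illustrative; the 0.6-degree shift and thresholds are arbitrary assumptions for the example):

```python
import numpy as np

def exceedance_prob(ensemble, threshold):
    """Fraction of ensemble members whose seasonal mean exceeds a
    threshold (for cold extremes, use < threshold instead)."""
    return float(np.mean(np.asarray(ensemble, dtype=float) > threshold))

def risk_ratio(p_all, p_nat):
    """Probability ratio between all-forcings and natural-only ensembles."""
    return p_all / p_nat

def far(p_all, p_nat):
    """Fraction of attributable risk, FAR = 1 - P_nat / P_all: the share
    of the event probability attributable to anthropogenic forcing."""
    return 1.0 - p_nat / p_all

# Synthetic example: all-forcings ensemble is a uniformly warmed copy of
# the natural-only ensemble (same internal variability).
rng = np.random.default_rng(2)
nat = rng.normal(15.0, 1.0, 5000)   # natural-only seasonal-mean temperatures
all_f = nat + 0.6                    # all-forcings: shifted warmer
p_nat = exceedance_prob(nat, 17.0)
p_all = exceedance_prob(all_f, 17.0)
```

Sampling uncertainty in `p_nat` and `p_all` (e.g. by bootstrap over members) feeds directly into the reliability assessment described above, and the same quantities computed from the SST-forced runs give the conditional version of the risk change.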
A shift in the distribution of variables such as daily maximum winter temperatures and daily precipitation extremes (Coelho et al., 2008) towards higher values has been attributed to anthropogenic climate change for various mid-latitude regions in the past (e.g. Pall et al., 2011; Otto et al., 2012). However, while there are many process-based arguments also suggesting a change in the shape of these distributions, attribution studies demonstrating this have not yet been undertaken. With the very large ensemble of simulations of the European winter (DJF) 2013/2014, the students in this project will have the opportunity to explore, in the first instance, how the estimated parameters of the Generalized Extreme Value (GEV) distribution vary under a climate change scenario (Kharin and Zwiers, 2005). While it is expected that the location parameter will change (i.e. the GEV distribution shifts to a warmer state), it is unknown how the scale and shape parameters might vary, as well as higher-order extreme diagnostics such as the extremal index. The students will look at these measures over a range of different climate fields. Secondly, there are 11 ensembles of this past winter as it might have been in a world without anthropogenic climate change, each forced with observed SSTs from which one of 11 different plausible patterns of warming has been removed. The students will investigate whether and how the distributions of the analysed variables have changed.
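Fitting a GEV to block maxima and comparing the fitted parameters between factual and counterfactual ensembles can be sketched as follows (synthetic data standing in for ensemble output; note that scipy's shape parameter `c` equals minus the shape parameter xi in the convention usual in climatology):

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic "winter maxima" drawn from a known GEV, standing in for the
# block maxima of one ensemble; in the project these come from model output.
true_c, true_loc, true_scale = 0.1, 20.0, 2.0   # scipy convention: c = -xi
maxima = genextreme.rvs(true_c, loc=true_loc, scale=true_scale,
                        size=1000, random_state=42)

# Maximum-likelihood fit of shape, location and scale.
c_hat, loc_hat, scale_hat = genextreme.fit(maxima)
```

Repeating the fit for each of the 11 counterfactual ensembles and comparing `(loc_hat, scale_hat, c_hat)` with the factual values separates a pure shift of the distribution (location only) from changes in its width or tail behaviour (scale and shape), with confidence intervals obtainable by resampling the ensemble members.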
Soil moisture is an important quantity in the assessment and investigation of droughts. In this project the students will study the importance of initial soil moisture conditions versus subsequent meteorological forcing using a reverse-ESP (Ensemble Streamflow Prediction) approach proposed by Wood and Lettenmaier (2008). Furthermore, they will investigate changes in drought occurrence probability caused by trends in the mean and variability of soil moisture, and related changes in soil moisture-temperature coupling (Mueller and Seneviratne, 2012). We will divide the students into two sub-groups to address these two topics. They will work with the R programming language and use a conceptually simple water balance model to infer soil moisture from meteorological information (Orth and Seneviratne, 2014). We will focus on North America and compare the results with respective findings for Europe.
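The initial-conditions-versus-forcing question behind the reverse-ESP design can be illustrated with a single-bucket water balance model (a hypothetical stand-in for the Orth and Seneviratne (2014) model, sketched in Python for brevity although the project work is in R):

```python
import numpy as np

def bucket_model(precip, pet, capacity=100.0, s0=50.0):
    """Conceptual single-bucket water balance: daily soil moisture S gains
    precipitation, loses evapotranspiration scaled by relative wetness
    S/capacity, and sheds any excess above capacity as runoff."""
    s, series = s0, []
    for p, e in zip(precip, pet):
        s = s + p - e * (s / capacity)
        s = min(max(s, 0.0), capacity)  # runoff cap and dry limit
        series.append(s)
    return np.array(series)

# Reverse-ESP logic in miniature: two runs share identical forcing but
# start wet vs. dry; the gap between them measures how long the initial
# soil moisture state matters relative to the forcing.
forcing_p = [1.0] * 200   # mm/day precipitation
forcing_e = [2.0] * 200   # mm/day potential evapotranspiration
wet = bucket_model(forcing_p, forcing_e, s0=90.0)
dry = bucket_model(forcing_p, forcing_e, s0=10.0)
memory_gap = wet - dry    # shrinks as the forcing takes over
```

The full reverse-ESP setup swaps this around systematically, pairing observed initial conditions with resampled climatological forcings and vice versa, to quantify which of the two dominates drought development at each lead time.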