Case studies: RACER consortium
RACER will demonstrate the use of techniques for addressing the four main research themes using case studies in different hazard areas, each with a clear end-user focus. These case studies are described below.
1. Flood risk analysis under uncertainty
Flood risk analysis relies on hydrological models. The sources of uncertainty in these models have been well studied by hydrologists: model structural uncertainty is arguably the principal outstanding challenge. A first objective here is therefore to exploit modern statistical techniques to address this challenge.
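As an illustration of the parameter-uncertainty part of this problem, the sketch below fits a GEV distribution to synthetic annual-maximum flows and bootstraps a confidence interval for the 100-year flood. The data, record length and distribution choice are invented for illustration; structural error (the choice of distribution itself) needs the separate treatment discussed above.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Synthetic annual-maximum flows (m^3/s) standing in for a 60-year gauged
# record; in practice these would come from a catchment flow archive.
ams = genextreme.rvs(c=-0.1, loc=100, scale=30, size=60, random_state=rng)

# Fit a GEV distribution and estimate the 100-year flood. This captures
# parameter uncertainty only; model structural error needs separate
# treatment (e.g. comparing alternative distributions).
c, loc, scale = genextreme.fit(ams)
q100 = genextreme.ppf(1 - 1/100, c, loc=loc, scale=scale)

# Nonparametric bootstrap to attach a confidence interval to the estimate.
boot = []
for _ in range(200):
    resample = rng.choice(ams, size=ams.size, replace=True)
    cb, lb, sb = genextreme.fit(resample)
    boot.append(genextreme.ppf(1 - 1/100, cb, loc=lb, scale=sb))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"100-year flood: {q100:.0f} m^3/s (95% CI {lo:.0f}-{hi:.0f})")
```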
Hydrological nonstationarity is another major challenge, with the potential impacts of climate and land use change at the forefront of the planning and policy problem. In standard practice, nonstationarity is often addressed via speculative changes to parameter values, or by inferring temporal variations from observed spatial variations and expert knowledge. Although progress in this area has been made at Imperial College, a major remaining challenge is the handling of uncertainties associated with future projections based on a synthesis of expert opinion and spatial data. A second objective is therefore to address this challenge.
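One common statistical device for nonstationarity is to let a distribution parameter vary with a covariate. As a minimal sketch, and not the consortium's method, the code below fits a GEV whose location parameter drifts linearly in time by direct maximum likelihood; the linear-trend form and all numbers are illustrative assumptions on synthetic data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(0)
years = np.arange(60)

# Synthetic annual maxima with an upward drift in the GEV location
# parameter, standing in for climate- or land-use-driven change.
x = genextreme.rvs(c=-0.1, loc=100 + 0.5*years, scale=20, random_state=rng)

def nll(theta):
    # theta = (mu0, mu1, log_sigma, xi); location mu0 + mu1*t varies in
    # time, the simplest nonstationary extension of the GEV.
    # (scipy's shape c is minus the conventional GEV shape xi.)
    mu0, mu1, log_sigma, xi = theta
    return -genextreme.logpdf(x, c=-xi, loc=mu0 + mu1*years,
                              scale=np.exp(log_sigma)).sum()

# Start the trend term from an ordinary least-squares slope, then refine
# all four parameters by maximising the likelihood.
slope0 = np.polyfit(years, x, 1)[0]
fit = minimize(nll, x0=[x.mean(), slope0, np.log(x.std()), 0.0],
               method="Nelder-Mead")
mu0, mu1, log_sigma, xi = fit.x
print(f"estimated trend in location: {mu1:.2f} units per year")
```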
The management of flood risk requires consideration of competing objectives (water supply, agriculture, heritage, ecology, etc.). This multi-criteria problem raises two challenges: 1) developing measures of benefits and disadvantages, and tools for communicating multi-criteria trade-offs; and 2) prioritising management options under uncertainty. EPSRC’s FRMRC programme has addressed the first of these, resulting in a landscape management decision-support tool, Polyscape (recently renamed LUCI - Land Utilisation and Capability Indicator). The second remains an objective for PURE.
For this case study, anticipated achievements and outputs are as follows:
- A methodology for treating model structural error in flood frequency and hydrograph predictions; and associated guidelines to improve the treatment of uncertainty in hydrological models;
- Guidelines on conducting flood frequency analysis under climate and land use change;
- New decision-support methods and tools for flood risk management under uncertainty;
- Demonstrations of applicability and implications in two UK catchments.
2. Probabilistic seismic hazard analysis: a case study of the UK, and implications for the global effort (Edinburgh, BGS, UCL, Birkbeck)
Standard practice in probabilistic seismic hazard analysis (PSHA) is to derive frequency-magnitude relationships from the instrumental, historical and recent (Holocene) geological record, and to combine these with models of ground motion to produce probabilistic assessments of ground shaking. PSHA is routinely used for anti-seismic design purposes, and also has uses in the reinsurance and planning communities.
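The frequency-magnitude relationship at the core of PSHA is usually the Gutenberg-Richter law, log10 N(>=M) = a - b*M. A minimal sketch on a synthetic catalogue, using the standard Aki maximum-likelihood estimator of the b-value (the catalogue, completeness magnitude and observation window are all invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic catalogue: magnitudes above a completeness threshold Mc = 3.0,
# drawn from a Gutenberg-Richter (exponential) distribution with true
# b = 1.0, over a 50-year observation window.
mc, years, b_true = 3.0, 50.0, 1.0
beta = b_true * np.log(10)
mags = mc + rng.exponential(1/beta, size=400)

# Aki (1965) maximum-likelihood estimate of the b-value.
b_hat = 1 / (np.log(10) * (mags.mean() - mc))

# Annual rate of events with M >= m from the fitted relationship,
# anchored at the observed rate above the completeness magnitude.
rate_mc = mags.size / years          # events/yr with M >= Mc

def rate(m):
    """Annual rate of events with magnitude >= m."""
    return rate_mc * 10**(-b_hat * (m - mc))

print(f"b = {b_hat:.2f}; annual rate of M>=5: {rate(5.0):.3f}")
```

In a full PSHA these rates would then be convolved with a ground-motion model to give exceedance probabilities for shaking intensity; the short, incomplete records discussed below make b_hat and rate_mc themselves uncertain.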
Sources of uncertainty in PSHA include short and incomplete earthquake records; errors in earthquake magnitude estimation; and incomplete knowledge of fault locations and histories. However, a full uncertainty analysis has yet to be made for any PSHA study: we will carry out such an analysis for the first time. The UK provides a convenient case study due to ease of access to relevant data, but the lessons learned will be applicable elsewhere. Moreover, seismic hazard in the UK is highly topical due to the new nuclear build programme. One aim of this work element is therefore to conduct a ‘root and branch’ evaluation of PSHA, producing an assessment of the best current methodology along with a full uncertainty analysis.
PSHA has relatively modest data requirements. However, if more detailed data on fault locations and histories are available, these can be exploited to improve hazard assessments. To explore this, we will link to a current NERC project (NE/I024127/1) examining physical interactions between faults in the Italian Apennines. This region has excellent historical and instrumental earthquake data, and its active faults are well constrained. We will use these data, together with the outputs of the NERC project, to produce a hazard assessment with quantified uncertainty, and hence to demonstrate the quantification of uncertainty in locations where such detailed data are available.
For this case study, anticipated achievements and outputs are as follows:
- Updated basic earthquake catalogue for the UK based on new calibrations under uncertainty;
- Full characterisation of the net effect of all sources of error affecting seismic hazard;
- Updated seismic hazard map for the UK, for use in next-generation power plant construction and other critical facilities;
- Guidelines on conducting PSHA for applications outside the UK;
- A framework for seismic hazard assessment in areas, such as Italy, where fault interactions are well constrained.
3. Tsunami hazard assessment and warning
Although tsunamis can be highly destructive, large events are infrequent and poorly represented in historical data sets and documentary records. Moreover, relevant geological data are limited in geographical extent. As a result, tsunami hazard assessments rely upon physics-based models that simulate tsunami waves using various approaches. The inputs to these models are often highly uncertain, especially with respect to tsunami source characteristics and near-shore bathymetry, which determines local wave amplification patterns. Comprehensive model evaluation is therefore particularly problematic.
In addition to these problems, no rigorous assessment has been undertaken of uncertainty in real-time tsunami warnings. In current operational warning systems, time-critical near-field warnings are often underestimates due to errors in initial estimates of earthquake magnitude. In contrast, transoceanic warnings are often overestimates and are criticized as false alarms. At present, warnings such as those produced by the Pacific Tsunami Warning Centre (PTWC) make no attempt to convey this uncertainty.
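To see how magnitude error feeds through to a warning, the sketch below pushes a Gaussian rapid magnitude estimate through an assumed log-linear amplitude scaling. The coefficients, thresholds and error sizes are placeholders, not any operational (e.g. PTWC) relation, but the output illustrates what a warning with quantified uncertainty could look like.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical rapid source estimate: Mw = 8.2 with a 0.3-unit standard
# error (initial magnitudes of the largest events are often biased low,
# so both numbers are purely illustrative).
mw_est, mw_sigma = 8.2, 0.3

# Assumed log-linear amplitude scaling at one coastal point:
#   log10(A) = c1*Mw + c0.
# c0 and c1 are placeholder coefficients, not an operational relation.
c0, c1 = -7.8, 1.0

mw = rng.normal(mw_est, mw_sigma, size=10_000)
amp = 10**(c1*mw + c0)                      # wave amplitude, metres

# A probabilistic warning conveys a range and exceedance probabilities,
# not a single number.
p10, p50, p90 = np.percentile(amp, [10, 50, 90])
p_exceed = (amp > 3.0).mean()               # chance of exceeding 3 m
print(f"amplitude 10-90%: {p10:.1f}-{p90:.1f} m; P(>3 m) = {p_exceed:.2f}")
```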
In this case study we will analyse the sources of uncertainty in operational tsunami warnings, focusing initially on rapid assessment of these uncertainties in real-time settings. We will also develop an open catastrophe model for tsunami risks. Anticipated achievements and outputs are:
- A methodology for assessing source, parameter and structural uncertainty in tsunami models, for different types of tsunami source (earthquake, landslide etc.) and levels of data availability;
- Development of tsunami risk assessment strategies with quantified uncertainties, demonstrated for specific areas that are of interest to user communities, such as Japan and Cascadia;
- Development of a statistically valid scheme for systematic testing and uncertainty evaluation of tsunami warning systems, to better enable cost-benefit analysis of proposed improvements;
- An open catastrophe model for tsunami risks, integrated with GEM and OpenQuake.
4. Probabilistic forecasting of weather-related hazards
For many weather-related hazards, days-ahead forecasts are required for event management and for medium-term planning. Numerical weather prediction (NWP) models are the primary tools for producing such forecasts. Outputs from multiple models are often available, provided by different modelling centres as forecast products. Each model typically provides multiple scenarios as well as a single “best” (in some sense) forecast. There is therefore potentially much to be gained from combining all the available information so as to exploit the strengths of each individual product; this falls squarely within the scope of Objective B above. Reflecting the interests of our project partners Aon Benfield, EuroTempest, Narec Capital and the Met Office, we will consider forecasts of two specific hazards: days-ahead windspeed forecasts for the UK and Europe, and 5-15 day forecasts of extreme cold outbreaks for the UK.
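A minimal sketch of one way to combine such products: weight each model's predictive distribution by its past skill and sample from the resulting mixture (the essence of Bayesian model averaging). All numbers below are invented, and in practice the weights would be estimated over a training period rather than fixed.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setting: three models issue a mean/spread forecast of 10 m wind
# speed (m/s) for the same day and location.
means   = np.array([12.0, 14.5, 13.0])   # per-model forecast means
sigmas  = np.array([2.0, 3.0, 2.5])      # per-model forecast spreads
weights = np.array([0.5, 0.2, 0.3])      # skill-based weights, sum to 1

# Combined predictive distribution: a weighted Gaussian mixture. Sampling
# from it gives probabilities for any event of interest to an end user.
comp = rng.choice(3, size=50_000, p=weights)
draws = rng.normal(means[comp], sigmas[comp])
p_gale = (draws > 17.2).mean()           # P(near-gale, ~17.2 m/s)
mean_combined = draws.mean()
print(f"combined mean {mean_combined:.1f} m/s; P(>17.2 m/s) = {p_gale:.3f}")
```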
The windspeed analysis will focus on forecasts at 10 m height, at lead times from 1 day to 2 weeks, and will assess the extent to which the value of information to end users (e.g. the insurance, power and energy, and travel sectors) can be enhanced by providing an appropriate synthesis with fully quantified uncertainties. Current business practice often employs forecasts only to a lead time of 5 days. Our research will provide clear, state-of-the-art guidance on the achievable real-time forecast skill and uncertainty for high windspeeds out to a 2-week lead. These findings will be used with other state-of-the-art data to create more informed and reliable probabilistic predictions of business impact, in particular of insured wind loss and wind farm power output.
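Skill assessments of probabilistic forecasts of this kind commonly use the continuous ranked probability score (CRPS). A sketch of its standard sample-based estimator, applied to two invented ensembles, shows how it rewards forecasts that are both sharp and well centred:

```python
import numpy as np

rng = np.random.default_rng(5)

def crps_ensemble(members, obs):
    """Sample-based CRPS (lower is better): E|X - y| - 0.5 * E|X - X'|."""
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = np.abs(members[:, None] - members[None, :]).mean()
    return term1 - 0.5 * term2

obs = 15.0                                 # verifying wind speed, m/s
sharp  = rng.normal(15.0, 1.0, size=50)    # well-centred, small spread
biased = rng.normal(20.0, 1.0, size=50)    # confidently wrong ensemble
print(crps_ensemble(sharp, obs), crps_ensemble(biased, obs))
```

Averaged over many forecast-observation pairs at each lead time, scores like this quantify how skill decays out to the 2-week limit.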
The work on cold winter conditions will focus on 5-15 day forecasts of extreme cold in the UK, which are relevant in planning for impacts such as increased incidence of road traffic accidents; icing and landing risks for air traffic; and personal injury and structural damage to buildings caused by the excessive weight of snowfall or by freeze-thaw processes.
Anticipated achievements and outputs from this work are as follows:
- Full assessment of forecast skill and uncertainty for ten state-of-the-art NWP ensemble forecast models in predicting high wind speeds and extreme cold conditions, up to two weeks ahead;
- Optimised, probabilistic multi-model forecast systems for both windspeed and cold conditions;
- Models giving informed and reliable real-time probabilistic predictions of UK/European business impacts, focusing in particular upon insurance market wind loss by postcode; wind farm power output; and transport and health impacts of cold conditions.
5. Uncertainty in volcanic ash forecasting
The 2010 Eyjafjallajökull eruption demonstrated the vulnerability of the air transport sector and of society to even a small volcanic eruption. New guidelines were subsequently introduced by the UK Civil Aviation Authority and EUROCONTROL, requiring predictions of ash cloud concentration rather than, as previously, location alone. Currently, however, no formal attempt is made to quantify the large uncertainties in forecasts associated with volcanic ash transport and dispersion (VATD) model inputs (relating to meteorology and to eruption source parameters), parameters (e.g. deposition and sedimentation rates), and structural error (due to missing processes such as aggregation). The objective in this case study is to quantify these uncertainties: this is of direct interest to our project partner the Met Office. Similar dispersion models are used for the management of incidents such as nuclear accidents or, on a more routine basis, to generate predictions of air quality: a study of uncertainties in volcanic ash transport could therefore have much wider applicability.
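One concrete example of input uncertainty: the mass eruption rate, a key VATD model input, is usually inferred from observed plume height via an empirical fit such as that of Mastin et al. (2009), and it scales so steeply with height that modest height errors dominate. A Monte Carlo sketch (the height, its error and the magma density are illustrative values):

```python
import numpy as np

rng = np.random.default_rng(9)

# Plume height is typically known only to within a few km, yet the mass
# eruption rate (MER) scales very nonlinearly with it. Using the
# Mastin et al. (2009) fit
#   H [km] = 2.00 * V**0.241,  V = volumetric rate [m^3/s DRE],
# propagate a Gaussian height uncertainty through to the MER.
h_obs, h_sigma = 9.0, 2.0                  # observed height +/- error, km
rho = 2500.0                               # assumed magma density, kg/m^3

h = rng.normal(h_obs, h_sigma, size=20_000)
h = h[h > 0]                               # discard unphysical draws
mer = rho * (h / 2.00) ** (1 / 0.241)      # mass eruption rate, kg/s

p10, p50, p90 = np.percentile(mer, [10, 50, 90])
print(f"MER 10-90%: {p10:.2e}-{p90:.2e} kg/s (factor {p90/p10:.0f})")
```

The order-of-magnitude spread in the source term is exactly the kind of input uncertainty that this case study aims to carry through to the final ash concentration forecast.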
Anticipated achievements and outputs are:
- A methodology for assessing source, parameter and structural uncertainty in dispersion models, for different types of volcanic eruption;
- Quantification of the relative importance of different sources of uncertainty, and hence identification of the measurements needed to constrain it;
- Combination of multiple uncertainties into a single probabilistic volcanic ash forecast.
For further details, or enquiries about becoming involved with the work of the consortia, contact:
Prof Thorsten Wagener, Department of Civil Engineering, University of Bristol
Dr Lisa Hill (CREDIBLE Project Manager)
Prof Richard Chandler, Department of Statistical Science, University College London