The uncertainty framework developed by CREDIBLE will bring new approaches into the field of natural hazards and adapt them to its specific needs. It will create consistency and greater scientific rigour in the estimation of uncertainty in natural hazard risk assessment. In doing so, it will enhance the capacity, knowledge and skills of stakeholders from private and public sector organisations, and improve societal security through better and more consistent decision-making under uncertainty.
Our vision is of a coherent treatment of risk and uncertainty that spans all facets of risk management across the natural hazards. It is now widely recognised within environmental science that a more sophisticated treatment of uncertainty is highly desirable. Recent methodological and computational developments within statistics now make such a treatment possible, particularly when coupled with the growing number of scientists working at the interface between hazards and statistics, and with expert elicitation techniques for structuring and assessing uncertainty.
CREDIBLE seeks to develop:
1. A framework and software suite of statistical methods to build a consistent foundation for uncertainty and risk assessment in natural hazards, to improve the transparency and defensibility of risk management choices.
2. Methods and software to assess less quantifiable aspects of uncertainty and risk, notably model limitations and future scenarios.
3. Visualisation and decision analysis methods and software to bridge the gap between hazard assessment and risk assessment, to support policy and decision-making, and to promote shared ownership of choices and actions.
4. Benchmarking studies, an online handbook, educational modules and a path for the adoption and evolution of the proposed methods and tools.
Work package 1: A Bayesian framework for forward propagation of uncertainty
Lead: Dr Jonty Rougier
This work package seeks to expand the role of probabilistic models for uncertainty assessment and quantification in natural hazards. The intention is to take state-of-the-art statistical techniques, and adapt them to the challenges of natural hazards.
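As a simple illustration of forward propagation, the sketch below pushes Monte Carlo samples of uncertain inputs through a toy "footprint" model and summarises the induced distribution of hazard intensity. The model form, input distributions and threshold are invented for illustration; they are not CREDIBLE's own.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy footprint model: hazard intensity at a site as a
# function of event magnitude m and distance d.  Purely illustrative.
def footprint_intensity(m, d):
    return np.exp(m) / (1.0 + d) ** 2

# Uncertain inputs described by (assumed) probability distributions.
n = 100_000
magnitude = rng.normal(loc=2.0, scale=0.3, size=n)
distance = rng.uniform(5.0, 15.0, size=n)

# Forward propagation: push the samples through the model and
# summarise the induced distribution of hazard intensity.
intensity = footprint_intensity(magnitude, distance)
p_exceed = float(np.mean(intensity > 0.1))  # P(intensity > damage threshold)
q05, q50, q95 = np.quantile(intensity, [0.05, 0.5, 0.95])
print(f"P(exceed) = {p_exceed:.3f}, 90% interval = ({q05:.3f}, {q95:.3f})")
```

The same pattern applies unchanged when the toy function is replaced by an expensive simulator, which is where emulation and other state-of-the-art statistical techniques come in.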
Work package 2: A diagnostic framework for backward analysis of controls on footprint and risk model output
Lead: Professor Thorsten Wagener
While work package 1 addresses the forward propagation of uncertainty through the modelling process, it is increasingly recognised that analysing in the opposite direction - from model output back to inputs and parameters - is equally important. This work package develops diagnostic methods for that backward analysis.
Work package 3: Accounting for uncertainties that cannot be included in the Bayesian framework
Lead: Professor Keith Beven
This work package focuses on the many sources of uncertainty in environmental systems that arise because of a lack of knowledge of some aspect of inputs, process representations, observed response and loss/damage estimation.
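One widely used response to such epistemic uncertainties is a GLUE-style (Generalised Likelihood Uncertainty Estimation) analysis: sample parameter sets, run the model, and retain only the "behavioural" sets whose errors stay within a limit of acceptability. The sketch below illustrates the pattern on an invented toy runoff model; the model, noise level and acceptability limit are assumptions for illustration, not project code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-store runoff model with a single recession parameter k.
def simple_model(k, rain):
    q, out = 0.0, []
    for r in rain:
        q = k * q + (1.0 - k) * r
        out.append(q)
    return np.array(out)

# Synthetic "observations": the model at k = 0.7 plus noise.
rain = rng.gamma(2.0, 2.0, size=50)
observed = simple_model(0.7, rain) + rng.normal(0.0, 0.2, size=50)

# Sample candidate parameters and keep only the behavioural ones,
# i.e. those whose RMSE stays within the limit of acceptability.
k_samples = rng.uniform(0.1, 0.95, size=5000)
behavioural = np.array([
    k for k in k_samples
    if np.sqrt(np.mean((simple_model(k, rain) - observed) ** 2)) < 0.3
])
print(len(behavioural), behavioural.min(), behavioural.max())
```

The retained ensemble of behavioural parameter sets, rather than a single calibrated optimum, then carries the uncertainty forward into predictions.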
Work package 4: Expanding the Bayesian framework to merge multiple sources of information
Lead: Professor David Stephenson
Some natural hazards have precursors with sufficient lead times that significant interventions are possible (e.g. volcanoes, storms, tsunamis). The emphasis in this work package is planning for real-time event management, i.e. providing a framework to assist the risk manager in the run-up to an event, during its unfolding, and in the aftermath.
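The core operation in such real-time updating is Bayes' rule applied repeatedly as precursor observations arrive. A minimal sketch, with an invented prior and invented likelihood values:

```python
# Sequential Bayesian updating of an event probability as precursor
# observations arrive.  All probabilities below are invented.
def bayes_update(prior, p_obs_given_event, p_obs_given_no_event):
    """Return P(event | observation) via Bayes' rule."""
    num = p_obs_given_event * prior
    return num / (num + p_obs_given_no_event * (1.0 - prior))

p_eruption = 0.02  # assumed prior from the background rate
# Each precursor: (P(obs | event), P(obs | no event)), invented values.
precursors = [(0.8, 0.1), (0.6, 0.2), (0.9, 0.05)]
for like_event, like_no_event in precursors:
    p_eruption = bayes_update(p_eruption, like_event, like_no_event)
    print(f"updated P(eruption) = {p_eruption:.3f}")
```

Each successive precursor that is more likely under "event" than under "no event" raises the posterior probability, giving the risk manager a continuously updated basis for intervention decisions.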
Work package 5: Framework and protocols for linking hazard / footprint / risk models to consequences and decision-making
Lead: Professor Jim Hall
This work package is concerned with decision analysis and policy. The most pressing difficulties are associated with ambiguous information, the need to appraise the probable outcomes of a series of actions, and the need to communicate the outcomes of analyses to policy-makers and the general public.
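A minimal sketch of the decision-analytic step - appraising the probable outcomes of a set of actions and choosing the one with the lowest expected loss - using invented scenario probabilities and a hypothetical loss table:

```python
import numpy as np

# Choose the action with the lowest expected loss over hazard
# scenarios.  Probabilities and the loss table are invented.
p = np.array([0.85, 0.12, 0.03])  # P(no event), P(moderate), P(severe)

# losses[action] = loss under each scenario: cost of taking the action
# plus the damages it fails to avert (arbitrary units).
losses = {
    "do nothing": np.array([0.0, 50.0, 500.0]),
    "reinforce":  np.array([10.0, 15.0, 120.0]),
    "evacuate":   np.array([40.0, 45.0, 60.0]),
}

expected = {action: float(p @ loss) for action, loss in losses.items()}
best = min(expected, key=expected.get)
print(expected, "->", best)
```

In practice the scenario probabilities come with their own uncertainty, which is precisely where the ambiguity and communication challenges described above arise.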
Volcanic ash dispersion
Volcanoes display a wide range of potentially hazardous phenomena, including explosive eruptions that generate ash fall, suspended ash hazards, high temperature pyroclastic flows, lahars, landslides and lavas. Hazard footprints, loss functions, model characteristics and modelling strategies are consequently diverse. The focus of this study is on quantifying the uncertainty in the Met Office dispersion model NAME III, and on communicating that uncertainty to decision-makers during a hazard event.
Multi-phase fluid flow
Avalanches and lahars provide a useful testbed for the use of semi-empirical methods in hazard assessment for run-out of multi-phase fluid flow hazards. For avalanches, the alpha-beta model has been calibrated on hundreds of events, while the most popular operational model for lahar run-out, LAHARZ, is calibrated on only 27 events at 9 volcanoes. Contrasting the risks from these two hazards, with their very different epistemic uncertainties, will allow us to address the first open question of the IRASMOS programme: "The quantification of risk includes a considerable degree of uncertainty. How should we deal with this uncertainty in the practical realisation of protection strategies?"
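The alpha-beta model itself is a simple linear regression of the runout angle alpha on the terrain angle beta (the average slope from the start zone to the point where the path first flattens to 10 degrees). A sketch, using coefficients often quoted for the Norwegian calibration; treat both the coefficients and the example numbers as illustrative rather than operational:

```python
import math

# alpha-beta model: runout angle alpha regressed on terrain angle beta.
# The coefficients (alpha = 0.96 * beta - 1.4 degrees) are those often
# quoted for the Norwegian calibration; illustrative only.
def alpha_from_beta(beta_deg, a=0.96, b=-1.4):
    return a * beta_deg + b

def runout_reach(drop_height_m, alpha_deg):
    """Horizontal reach implied by a straight line at angle alpha."""
    return drop_height_m / math.tan(math.radians(alpha_deg))

beta = 30.0                    # terrain-derived predictor (degrees)
alpha = alpha_from_beta(beta)  # predicted runout angle
print(alpha, runout_reach(800.0, alpha))
```

The regression's residual scatter is itself a measure of the model's epistemic uncertainty, and contrasts with the much smaller calibration set behind LAHARZ.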
Windstorms
The mean loss from windstorms depends not only on the frequency and intensity of the hazard but also on the spatial extent of the storm footprint. A common assumption in catastrophe (CAT) models is that this footprint is constant. However, extra-tropical cyclones are observed to have widely differing spatial extents. We aim to test methodologies for estimating the risk of intense wind speeds (>30 m/s) in a large windstorm event. We will use high-resolution ensemble weather hindcasts produced by the Met Office (TIGGE data) and exploit the windstorm footprint uncertainty in a risk assessment tool (the Met Office's Weather Impact Forecasting model).
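One simple way to use such an ensemble is to estimate, cell by cell, the fraction of members exceeding the 30 m/s threshold, and to characterise footprint-extent uncertainty by the spread of exceedance areas across members. A sketch with a synthetic ensemble standing in for real hindcast fields:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic ensemble of gust footprints standing in for real hindcast
# fields: 50 members on a 20 x 20 grid (values in m/s).
n_members, ny, nx = 50, 20, 20
gusts = rng.gamma(shape=9.0, scale=3.0, size=(n_members, ny, nx))

threshold = 30.0  # intense-wind threshold from the study (m/s)
# Per-cell exceedance probability: fraction of members above threshold.
p_exceed = np.mean(gusts > threshold, axis=0)
# Footprint-extent uncertainty: exceedance area (cell count) per member.
extent = np.sum(gusts > threshold, axis=(1, 2))
print(p_exceed.mean(), extent.mean(), extent.std())
```

The member-to-member spread in `extent` is exactly the footprint-size variability that the constant-footprint CAT assumption suppresses.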
Floods
Using the LISFLOOD-FP suite of flood inundation models (Bates and De Roo, 2000) and the benchmark data sets assembled by the University of Bristol following the major flood events in Carlisle (2005), Tewkesbury (2007) and Cumbria (2009), we will explore the following questions: Can we use a bias-aware Kalman filter to systematically estimate modelling errors (or biases) in forcing data and spatially distributed parameter fields? What is the degree of non-stationarity of extreme value risk distributions due to climate, land use and population change in the basin? To what extent can model emulators capture shallow water flow physics and dynamics?
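A common way to make a Kalman filter "bias aware" is state augmentation: append the unknown forcing bias to the state vector and estimate it jointly with the physical state. A minimal sketch with invented scalar dynamics (a stand-in, not the LISFLOOD-FP setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# State augmentation: the unknown forcing bias b is appended to the
# state and estimated jointly with the water level h.  The scalar
# dynamics below are invented for illustration.
F = np.array([[0.9, 1.0],     # h evolves with persistence 0.9 plus bias b
              [0.0, 1.0]])    # bias modelled as (near-)constant
H = np.array([[1.0, 0.0]])    # we observe h only
Q = np.diag([0.05, 1e-4])     # process noise (the bias drifts slowly)
R = np.array([[0.1]])         # observation noise variance

true_bias = 0.5
x = np.array([0.0, 0.0])      # initial estimate: level 0, bias 0
P = np.eye(2)

h_true = 0.0
for _ in range(200):
    # Synthetic truth and noisy observation.
    h_true = 0.9 * h_true + true_bias + rng.normal(0.0, np.sqrt(0.05))
    z = h_true + rng.normal(0.0, np.sqrt(0.1))
    # Predict step.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update step.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated bias = {x[1]:.2f} (true {true_bias})")
```

Because the bias is part of the state, the filter gradually attributes the systematic part of the innovations to it, separating bias from random noise.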
Droughts
In practice, droughts build up over extended timescales and their severity is modified by management actions taken during such antecedent periods, meaning that probabilistic analysis of drought risk must account for these management interventions. Recognition of non-stationarity is challenging existing frameworks and stimulating new thinking for water resource management. We wish to understand the implications of adopting decision-making approaches that are based on potentially incorrect assumptions, to identify decision-making approaches that are as robust as possible. An existing model for the south east of England, developed in the ARCC Water Project, will be our test bed.
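One such robustness-oriented approach is minimax regret: evaluate each management option under every scenario, compute its regret against the best option in that scenario, and choose the option whose worst-case regret is smallest. A sketch with an invented scenario set and cost table (not ARCC Water Project figures):

```python
import numpy as np

# Minimax-regret robustness check.  Columns of each cost vector are
# invented scenarios: stationary, drier climate, high demand.
options = {
    "status quo":        np.array([ 5.0, 60.0, 45.0]),
    "new reservoir":     np.array([25.0, 30.0, 28.0]),
    "demand management": np.array([12.0, 35.0, 20.0]),
}

# Regret = cost minus the best achievable cost in that scenario.
best_per_scenario = np.min(np.vstack(list(options.values())), axis=0)
worst_regret = {
    opt: float((cost - best_per_scenario).max()) for opt, cost in options.items()
}
robust_choice = min(worst_regret, key=worst_regret.get)
print(worst_regret, "->", robust_choice)
```

Unlike expected-loss ranking, this choice does not require scenario probabilities at all, which is precisely its appeal when the stationarity assumptions behind those probabilities may be wrong.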
Extremes, non-stationarity and catalogue augmentation
The number of natural hazard events observed and available for model calibration and evaluation is generally small. Parametric uncertainty therefore remains large after calibration, and we want to understand whether catalogue augmentation can lead to a significant reduction. In this study we will combine hierarchical spatial-temporal modelling and extreme value theory with physical understanding of the hazard, to define measures of similarity to assess hazard transferability for augmentation of hazard catalogues under current and under potentially non-stationary future conditions.
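The payoff from augmentation can be seen by comparing the sampling variability of a fitted extreme value model before and after pooling. The sketch below fits a Generalised Pareto distribution to a small synthetic catalogue and to an augmented one, and compares bootstrap spreads of the shape parameter; all data are simulated, and the "similar region" is assumed perfectly transferable for illustration.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)

# Simulated catalogues of threshold exceedances: a small local one and
# one augmented with events from a (synthetic) similar region.
true_shape, true_scale = 0.2, 1.0
local = genpareto.rvs(true_shape, scale=true_scale, size=25, random_state=rng)
extra = genpareto.rvs(true_shape, scale=true_scale, size=150, random_state=rng)
pooled = np.concatenate([local, extra])

def shape_spread(sample, n_boot=100):
    """Bootstrap standard deviation of the fitted GPD shape parameter."""
    shapes = []
    for _ in range(n_boot):
        resample = rng.choice(sample, size=len(sample), replace=True)
        c, _, _ = genpareto.fit(resample, floc=0.0)
        shapes.append(c)
    return float(np.std(shapes))

spread_local = shape_spread(local)
spread_pooled = shape_spread(pooled)
print(spread_local, spread_pooled)  # augmentation should reduce the spread
```

In the real setting the similarity measures developed in this study would determine which events may be pooled and with what weight, and the reduction in spread would be correspondingly more modest.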
For further details, or enquiries about becoming involved with the work of the consortium, contact:
Prof Thorsten Wagener, Department of Civil Engineering, University of Bristol
Dr Lisa Hill (CREDIBLE Project Manager)
Prof Richard Chandler, Department of Statistical Science, University College London