OR/14/042 Assessing and quantifying uncertainty

Royse, K R, and Hughes, A G (editors). 2014. Meeting Report: NERC Integrated Environmental Modelling Workshop (Held at the British Geological Survey, Keyworth, 4–5 February). British Geological Survey Internal Report, OR/14/042.

Sharing data and models so that rational decisions and policies can be based on their outputs requires a rigorous assessment of uncertainty. Mechanisms, standards and tools need to be developed so that uncertainty can be managed in a simple but robust way. An assessment of current uncertainty tools is required, drawing both on the environmental science sector and on other sectors such as the medical and engineering communities.

When uncertainty is assessed and quantified, it is not the uncertainty itself that is reduced but the user’s confidence in the modelled outputs that is improved. Communication of uncertainty is strongly dependent on user context; it is therefore the end-user who will determine what information they need and how it should be displayed and communicated. We have to be clear about our assumptions, and about the uncertainty metrics we use, when communicating results to end-users.

Within a linked modelled system, as well as understanding the parametric and structural uncertainty of each component, there is a need to understand the uncertainty passed across the interfaces between components. We therefore need to develop tools that track parametric uncertainty through each component of the linked modelling process. There is also a need to understand the sensitivity of the results to the model parameters, how that sensitivity can vary depending on the question being asked, and how this affects the overall uncertainty of the linked model results.
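
As a concrete illustration, one simple way to track parametric uncertainty through linked components is to propagate a common ensemble of parameter samples through the whole chain and summarise the spread at each interface. The sketch below is a minimal Monte Carlo example in Python; the two components (a toy recharge model feeding a toy groundwater model), the parameter distributions and all numerical values are hypothetical and chosen only to show the mechanics, not any particular BGS or NERC model.

  import numpy as np

  rng = np.random.default_rng(42)
  n_samples = 1000

  # Hypothetical linked components: a recharge model feeding a groundwater model.
  def recharge_model(rainfall, runoff_coeff):
      """Toy component 1: recharge as a fraction of rainfall (mm/yr)."""
      return rainfall * (1.0 - runoff_coeff)

  def groundwater_model(recharge, storativity):
      """Toy component 2: head rise driven by recharge (m)."""
      return recharge / (storativity * 1000.0)

  # Parametric uncertainty expressed as prior distributions (illustrative values only).
  rainfall = rng.normal(800.0, 80.0, n_samples)        # mm/yr
  runoff_coeff = rng.uniform(0.2, 0.4, n_samples)      # dimensionless
  storativity = rng.lognormal(np.log(0.05), 0.3, n_samples)

  # Propagate the same samples through both components so the uncertainty at the
  # interface (recharge) and at the end of the chain (head rise) can be tracked.
  recharge = recharge_model(rainfall, runoff_coeff)
  head = groundwater_model(recharge, storativity)

  for name, values in [("recharge (mm/yr)", recharge), ("head rise (m)", head)]:
      lo, hi = np.percentile(values, [5, 95])
      print(f"{name}: median={np.median(values):.2f}, 5-95% range=({lo:.2f}, {hi:.2f})")

The same ensemble could be re-used to ask a different question of the chained model (for example a peak rather than a median), which is one way the sensitivity to a given parameter can change with the question being posed.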

Work that needs to be done includes:

  1. Develop tools and standards that will allow for the management of uncertainty within linked modelled systems
  2. Develop robust and repeatable methods to validate models and understand how observations bias results (see the sketch after this list)
  3. Develop tools that can understand and track how uncertainty is propagated through linked components and how parametric uncertainty may change depending on the question being asked or the components that are being integrated
  4. Be clear about the limitations of models and look towards codifying known limitations to ensure that their use is fit for purpose
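
The following is a minimal sketch of the validation concern in point 2. The synthetic “truth”, the model error and the observation error characteristics are all invented for illustration, but it shows how bias and noise in the observations alone can change a validation score such as RMSE, independently of any deficiency in the model itself.

  import numpy as np

  rng = np.random.default_rng(0)

  # Hypothetical "truth" and a model that reproduces it reasonably well.
  truth = np.sin(np.linspace(0, 2 * np.pi, 50)) * 5.0 + 20.0   # e.g. water level (m)
  model = truth + rng.normal(0.0, 0.3, truth.size)              # model error only

  def rmse(sim, obs):
      return float(np.sqrt(np.mean((sim - obs) ** 2)))

  # Observations carry their own error; here a systematic bias plus random noise.
  for obs_bias, obs_noise in [(0.0, 0.0), (0.0, 0.5), (0.5, 0.5)]:
      obs = truth + obs_bias + rng.normal(0.0, obs_noise, truth.size)
      print(f"obs bias={obs_bias}, noise sd={obs_noise}: RMSE(model, obs)={rmse(model, obs):.2f}")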

