certainty, conducted in RTDC-1 as Work Package 1.2 (WP1.2). This initial review is summarised in Section 2.

RTDC-2 is organised in three work packages. WP2.1 is researching key drivers and methodologies for the treatment of uncertainty, addressing regulatory compliance, the communication of uncertainty, approaches to system PA, and techniques for sensitivity analysis. WP2.2 is proceeding in parallel with WP2.1 and is testing and developing the framework outlined in WP1.2 by undertaking a series of exercises to provide examples of uncertainty treatment from different European programmes at different stages of development; the work is divided into tasks that consider the main types of uncertainties (scenario, model, parameter), the treatment of spatial variability, and the development of probabilistic safety assessment tools. WP2.3 is a synthesis task, pulling together the WP1.2 review, the research on the treatment of uncertainty under WP2.1 and the testing and development work under WP2.2, to arrive at final guidance on approaches to the treatment of uncertainty during PA and safety case development, containing state-of-the-art examples from RTDC-2 for a range of key areas. Most of this work is still underway, so we focus here on the outcome of the WP1.2 review, a brief description of the work in progress, and, in Section 4, a longer summary of the one complete task in RTDC-2 on the regulatory evaluation of uncertainty.

2. WP1.2 Initial Review of the Treatment of Uncertainty

The aim of WP1.2 was to develop a document that synthesises the state of the art at the beginning of the project, providing examples of approaches to the treatment of different types of uncertainty at different stages of safety case development and highlighting areas where further development would be most helpful. Information on the treatment of uncertainties was gathered from PAMINA participants and several other organisations using a questionnaire, and via a limited wider review of the literature. The questionnaire responses obtained represent 16 disposal programmes in 13 countries, including all of the countries with advanced programmes to implement geological disposal, allowing the review to give wide coverage of global activity. Selected results from the review are given here; a more complete summary is provided in [1].

2.1 Types of Uncertainties Considered in PA

There is consensus both on how uncertainties considered in PAs should be classified and on the nature of those uncertainties, although this is masked by variations in terminology and differences in the way uncertainties are treated in programmes. Uncertainties in PAs are generally classified as:

1. Uncertainties arising from incomplete knowledge or a lack of understanding of the behaviour of engineered systems, physical processes, site characteristics and their representation using simplified models and computer codes. This type of uncertainty is often called “model” uncertainty. It includes uncertainties that arise from the modelling process, including assumptions associated with the reduction of complex “process” models to simplified or stylised conceptual models for PA purposes, assumptions associated with the representation of conceptual models in mathematical form, and the inexact implementation of mathematical models in numerical form and in computer codes.


2. Uncertainties associated with the values of the parameters that are used in the implemented models. They are variously termed “parameter” or “data” uncertainties.

3. Uncertainties associated with the possible occurrence of features, events and processes (FEPs) external to the disposal system that may impact the natural or engineered parts of the disposal system over time. These are usually referred to as “scenario” or “system” uncertainties (see the illustrative sketch below).
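As a purely illustrative aid, and not part of PAMINA or any programme's tooling, the sketch below tags example uncertainties with the three classes just described; all class names, field names and register entries are hypothetical.

```python
# Purely illustrative sketch: tagging uncertainties with the three classes
# described above. All names and register entries are hypothetical and are
# not taken from any actual PA.
from dataclasses import dataclass
from enum import Enum


class UncertaintyClass(Enum):
    MODEL = "model"          # simplification of processes into conceptual/mathematical/numerical models
    PARAMETER = "parameter"  # values of the parameters used in the implemented models ("data")
    SCENARIO = "scenario"    # external FEPs that may impact the disposal system over time


@dataclass
class Uncertainty:
    name: str
    uncertainty_class: UncertaintyClass
    note: str = ""


register = [
    Uncertainty("simplified near-field transport model", UncertaintyClass.MODEL),
    Uncertainty("host-rock hydraulic conductivity", UncertaintyClass.PARAMETER),
    Uncertainty("timing of future glaciation", UncertaintyClass.SCENARIO),
]

for u in register:
    print(f"{u.name}: {u.uncertainty_class.value}")
```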

All three classes of uncertainty are related to each other, and particular uncertainties can be handled in different ways, such that they might be dealt with in one class or another for any single iteration of a PA/safety case, depending on programmatic decisions (e.g., on how best to implement PA calculations or to communicate results) and practical limitations (e.g., on funding or timescales).

The classification system for uncertainties given above essentially arises from the way the PA is implemented, and says little about the nature of the uncertainties. With respect to nature, a useful distinction can be made between epistemic and aleatory uncertainties. Epistemic uncertainties are knowledge-based and are therefore, in principle, reducible. Aleatory uncertainties, on the other hand, are random in nature and are irreducible.

All three classes of uncertainty contain elements that are epistemic and aleatory, although it may be generally true that “scenario” uncertainties contain a larger element of aleatory uncertainty than the other two groups. To take an example, “parameter” uncertainties typically arise for the following reasons:

- The parameter values have not been determined exactly. This type of uncertainty is largely epistemic in nature, and can be reduced with further effort.

- The models use single (or spatially averaged) values for parameters, derived from measurements at discrete locations, whereas in reality there is continuous variation in parameter values over space, as well as over time. This type of uncertainty is partly aleatory in nature and cannot be reduced by further effort (see the numerical sketch below).
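The following minimal sketch uses entirely invented numbers for a hypothetical, spatially variable parameter (a permeability-like quantity): the uncertainty in the estimated mean shrinks as more measurements are taken (the epistemic component), while the estimated point-to-point spread does not (the aleatory component).

```python
# Illustrative sketch with invented numbers: repeated measurement of a
# hypothetical, spatially variable parameter. The standard error of the
# estimated mean (epistemic component) shrinks as ~1/sqrt(n), whereas the
# estimated point-to-point spread (aleatory component) does not.
import numpy as np

rng = np.random.default_rng(0)

true_mean, true_spread = 1.0e-10, 0.3e-10  # hypothetical permeability-like values

for n_samples in (5, 50, 500):
    measurements = rng.normal(true_mean, true_spread, n_samples)
    std_error_of_mean = measurements.std(ddof=1) / np.sqrt(n_samples)  # reducible with more data
    estimated_spread = measurements.std(ddof=1)                        # does not shrink with more data
    print(f"n={n_samples:4d}  uncertainty in mean ~ {std_error_of_mean:.2e}  "
          f"spatial spread ~ {estimated_spread:.2e}")
```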

2.2 Dealing with Uncertainty in the Quantitative PA

2.2.1 Parameter uncertainty

Uncertainties associated with model parameter values can be treated conveniently within most computational schemes. Common approaches to treating parameter value uncertainty include the following:

1. Setting probability distribution functions (PDFs) for parameter values, which are sampled during the course of a probabilistic assessment.

2. Repeated deterministic calculations in which individual parameter values are varied across a range of likely or possible values, including deterministic calculations using values representing the best understanding available (“best estimate”) to better understand the system, e.g., with regard to sensitivities.

3. Deterministic calculations in which deliberately pessimistic values of parameters are taken, producing a “conservative” estimate of the value of receptor quantities in order to demonstrate compliance with limits (see the sketch below).
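The sketch below applies the three approaches to a deliberately simple, hypothetical dose model; the model form, parameter names, distributions and ranges are all invented for illustration and are not drawn from any actual assessment.

```python
# Illustrative sketch of the three approaches, applied to a deliberately
# simple, hypothetical dose model. Model form, parameter names, ranges and
# distributions are invented for illustration only.
import numpy as np

rng = np.random.default_rng(42)


def toy_dose(release_rate, sorption_factor, dilution):
    """Hypothetical receptor dose (arbitrary units); stands in for a full PA model chain."""
    return release_rate / (sorption_factor * dilution)


# 1. Probabilistic assessment: assign PDFs to uncertain parameters and sample them.
n = 10_000
doses = toy_dose(
    release_rate=rng.lognormal(mean=0.0, sigma=0.5, size=n),
    sorption_factor=rng.uniform(5.0, 50.0, size=n),
    dilution=rng.triangular(1e3, 5e3, 1e4, size=n),
)
print("probabilistic: mean dose", doses.mean(), "95th percentile", np.percentile(doses, 95))

# 2. Repeated deterministic calculations: vary one parameter across its range
#    around a best-estimate case to explore sensitivities.
best_estimate = toy_dose(release_rate=1.0, sorption_factor=20.0, dilution=5e3)
for sorption in (5.0, 20.0, 50.0):
    print("deterministic, sorption =", sorption, "-> dose", toy_dose(1.0, sorption, 5e3))

# 3. Conservative deterministic calculation: deliberately pessimistic values
#    throughout, giving a bounding estimate for comparison against a limit.
conservative = toy_dose(release_rate=2.0, sorption_factor=5.0, dilution=1e3)
print("best estimate:", best_estimate, " conservative bound:", conservative)
```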

