IRSN, Institut de radioprotection et de sûreté nucléaire


Computer codes

SUNSET (Sensitivity and UNcertainty Statistical Evaluation Tool)

The SUNSET software is a statistical tool providing a collection of methods for information treatment in risk analysis studies.



Information treatment in SUNSET


Integrating uncertainty in risk analysis studies


SUNSET allows the user to evaluate the uncertainties associated with the results of risk analysis studies.

It includes statistical tools to perform a probabilistic assessment of uncertainties, in which the uncertainty sources are modelled as random variables. Probabilistic methods have the advantage of providing a realistic and reasonable evaluation of the uncertainty margin associated with a given result, and they are simple to use. Regardless of how many uncertain variables are considered, probability theory makes it possible to start from a rather limited number of calculations and to evaluate confidence intervals for each result. However, probabilistic methods require a lot of information: a probability density function must be selected for each uncertain parameter, together with all the possible dependencies between the uncertain parameters. Unfortunately, such knowledge is rarely available. To overcome this lack of knowledge in practical studies, a principle of minimal information is used. It leads, for example, to selecting the uniform distribution when only the uncertainty range of a parameter is known, or to assuming independence between two uncertain parameters when nothing is known about their dependence.
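As an illustration, this propagation scheme can be sketched in a few lines of Python. The model, parameter names and ranges below are hypothetical stand-ins for a complex computer code, not part of SUNSET; the sample size of 59 follows Wilks' formula, which guarantees that the largest of 59 runs bounds the true 95th percentile with 95 % confidence.

```python
import random

def propagate(model, ranges, n=200, seed=0):
    """Propagate uncertainty through `model` by Monte Carlo sampling.

    Following the minimal-information principle, each uncertain parameter
    is sampled from a uniform distribution over its known range, and the
    parameters are sampled independently.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        sample = {name: rng.uniform(lo, hi) for name, (lo, hi) in ranges.items()}
        results.append(model(**sample))
    return sorted(results)

# Hypothetical two-parameter model standing in for a complex code (K).
def model(gap_conductance, power_peaking):
    return 600.0 + 400.0 * power_peaking / gap_conductance

ranges = {"gap_conductance": (0.8, 1.2), "power_peaking": (0.9, 1.1)}
out = propagate(model, ranges, n=59)

# First-order Wilks estimate: the sample maximum is a 95 %/95 %
# upper bound of the 95th percentile when n = 59.
upper_bound = out[-1]
```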

To quantify the influence of these choices on the uncertainty margins, techniques based on the Dempster-Shafer evidence framework are available in SUNSET. They allow the user to evaluate the robustness of the uncertainty analysis in the presence of incomplete knowledge.
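The core idea of the Dempster-Shafer framework can be sketched as follows: instead of a single probability distribution, the evidence is described by intervals carrying masses, from which lower (belief) and upper (plausibility) bounds on the probability of any event are computed. The focal intervals and masses below are purely illustrative.

```python
# Illustrative basic belief assignment: each focal element is an
# interval of temperatures (K) carrying a mass; masses sum to 1.
focal = [((1000.0, 1100.0), 0.6), ((1050.0, 1160.0), 0.4)]

def belief(a, b):
    """Lower probability bound: mass of focal elements entirely inside [a, b]."""
    return sum(m for (lo, hi), m in focal if a <= lo and hi <= b)

def plausibility(a, b):
    """Upper probability bound: mass of focal elements intersecting [a, b]."""
    return sum(m for (lo, hi), m in focal if hi >= a and lo <= b)
```

The gap between belief and plausibility is exactly the robustness margin the text describes: it measures how much the result could move under a different, equally admissible, uncertainty model.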


SUNSET can also be used for sensitivity analyses. A sensitivity analysis aims to identify the variables that contribute most to the overall uncertainty of the model response, and also to study the explicit relationship between the variables and each response. To perform this kind of analysis, SUNSET integrates algebraic and statistical tools combining design-of-experiments theory and regression techniques.
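A minimal sketch of such a screening analysis, assuming independent inputs, is to rank the variables by their sample correlation with the response (a simple stand-in for the regression-based indices mentioned above); the two-input model here is hypothetical.

```python
import random
import statistics

def pearson(x, y):
    """Sample Pearson correlation coefficient between two sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

rng = random.Random(1)
n = 500
x1 = [rng.uniform(0.0, 1.0) for _ in range(n)]
x2 = [rng.uniform(0.0, 1.0) for _ in range(n)]
# Hypothetical linear response in which x1 dominates the uncertainty.
y = [3.0 * a + 0.5 * b for a, b in zip(x1, x2)]

s1 = pearson(x1, y)  # large: x1 drives the response uncertainty
s2 = pearson(x2, y)  # small: x2 contributes little
```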


Both uncertainty and sensitivity analyses require the model used in the risk analysis study to be run several times in order to propagate information from its inputs to its outputs. Several coupling functionalities are therefore available in SUNSET: for analytical models, the user can introduce the analytical expression directly in a SUNSET datadeck, whereas for complex computer codes, coupling can be either non-intrusive (i.e. through data files) or intrusive (i.e. through the development of dedicated coupling procedures).
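For non-intrusive coupling, the exchange through data files typically amounts to rendering each sampled parameter set into an input deck and parsing the quantity of interest back out of the code's output listing. The keyword names and output format below are illustrative, not SUNSET's actual syntax.

```python
import re
from string import Template

# Hypothetical input-deck template for an external code.
DECK = Template("GAP_CONDUCTANCE = $gap\nPOWER_PEAKING = $peak\n")

def build_input(sample):
    """Render one sampled parameter set into an input-file body."""
    return DECK.substitute(gap=sample["gap"], peak=sample["peak"])

def parse_output(listing):
    """Extract the peak cladding temperature from the code's output listing."""
    match = re.search(r"PCT\s*=\s*([0-9.]+)", listing)
    return float(match.group(1))
```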

Example of application: SUNSET has been used to perform uncertainty analyses in the case of a loss-of-coolant accident (OECD BEMUSE Programme).



Example of SUNSET results: 95th percentile associated with the first peak cladding temperature (BEMUSE Programme) following a probabilistic and a Dempster-Shafer approach. The 95th percentile is classically interpreted as the upper bound of the uncertainty margin in nuclear safety studies. The figure displays vertically the different methods used to estimate this quantity (Dempster-Shafer or probabilistic) and horizontally the different values of the percentile. With the probabilistic approach, the 95th percentile is estimated at 1124 K, while it can take any value between 1058 K and 1156 K in the Dempster-Shafer framework. This range reflects the effect of a misleading choice in the uncertainty modelling, which is comparable to the dispersion between participants' reference calculations.


Evaluation, fusion and aggregation of information


  • Evaluation and fusion of information


In safety assessment, several uncertainty analyses using different computer codes and involving different experts are generally performed. Each uncertainty study or expert can be viewed as an information source. Taking advantage of these analyses is therefore a matter of information evaluation (i.e. how to measure the quality of the provided information) and fusion (i.e. how to combine the pieces of information given by several sources on the same output, such as a temperature).


For the evaluation of information, SUNSET can compute two numerical criteria, informativeness and calibration, which measure respectively the precision of the information and its coherence with observed reference values (such as experimental ones). For information fusion, which aims to build a summary of all the information provided by the sources, SUNSET includes several fusion operators. The evaluation and fusion tools are first of all based on the probabilistic approach. In this case, the usual fusion operator is the mean, from which it is difficult to clearly identify possible conflicts between the sources. A possibility-based approach is therefore also available in SUNSET to offer greater flexibility in the choice of fusion operators. More precisely, working in the possibilistic framework makes it possible to define conjunctive and disjunctive operators. The first type is equivalent to taking the intersection and assumes that all sources are reliable; it is useful for quantifying the conflict among sources, and it produces precise but potentially unreliable results in case of strong conflict. The second type is equivalent to taking the union and makes the conservative assumption that at least one source is reliable; it generally produces imprecise but reliable results.
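On a discrete grid, the conjunctive and disjunctive operators reduce to pointwise minimum and maximum of the sources' possibility distributions, and the height of the conjunctive result measures their agreement. The two sources below are illustrative, not BEMUSE data.

```python
def conjunctive(sources):
    """Pointwise minimum (intersection): assumes all sources are reliable."""
    return {x: min(s[x] for s in sources) for x in sources[0]}

def disjunctive(sources):
    """Pointwise maximum (union): assumes at least one source is reliable."""
    return {x: max(s[x] for s in sources) for x in sources[0]}

def conflict(sources):
    """1 minus the height of the conjunctive fusion (0 = full agreement)."""
    return 1.0 - max(conjunctive(sources).values())

# Two sources giving possibility distributions for a temperature (K)
# on a common discrete grid -- illustrative values only.
s1 = {1000: 0.0, 1050: 1.0, 1100: 0.5, 1150: 0.0}
s2 = {1000: 0.0, 1050: 0.5, 1100: 1.0, 1150: 0.0}
```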


  • Aggregation of information


When an analyst has to evaluate a risk associated with a complex system involving several conflicting stakes, compromises must be made (for example, balancing the benefit of maintaining the containment of a nuclear reactor against the induced increase in the risk of overpressure). Such choices are based on the analyst's knowledge of the system and on the relative importance of the different stakes involved in the accident. It is therefore necessary to derive formal approaches to aggregate the available information in order to generate further insight into the risk assessment. This formalisation is all the more necessary when the information to aggregate is provided by a panel of experts. To achieve the aggregation, SUNSET integrates Multi-Criteria Decision Analysis (MCDA) tools based on outranking methods such as ELECTRE and PROMETHEE. The aim of MCDA is not to replace intuitive judgement or experience but to complement and challenge intuition, acting as a sounding board against which ideas can be tested. MCDA therefore has to be embedded in the wider process of problem structuring, by identifying criteria that model the experts' preferences in a form which is transparent and easy to work with.
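A minimal sketch of an outranking calculation, in the spirit of PROMETHEE II with the "usual" (strict) preference function, ranks alternatives by their net outranking flow; the decision table, criteria and weights below are hypothetical.

```python
def net_flows(table, weights, prefer):
    """PROMETHEE II net outranking flows for a small decision table.

    table:   {alternative: {criterion: value}}
    weights: {criterion: weight}, summing to 1
    prefer:  preference function prefer(a, b) in [0, 1]
    """
    names = list(table)
    flows = {}
    for a in names:
        net = 0.0
        for b in names:
            if a == b:
                continue
            # Weighted preference of a over b minus that of b over a.
            net += sum(w * (prefer(table[a][c], table[b][c])
                            - prefer(table[b][c], table[a][c]))
                       for c, w in weights.items())
        flows[a] = net / (len(names) - 1)
    return flows

# Illustrative decision table (larger is better on both criteria).
table = {"strategy_A": {"safety": 3, "cost": 2},
         "strategy_B": {"safety": 1, "cost": 1}}
weights = {"safety": 0.6, "cost": 0.4}
usual = lambda a, b: 1.0 if a > b else 0.0  # "usual" preference function
flows = net_flows(table, weights, usual)
```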

Example of application: the evaluation and fusion tools available in SUNSET have been applied to synthesise the uncertainty results of the OECD BEMUSE benchmark, related to the simulation of a loss-of-coolant accident.

The MCDA methodology has been applied to define multi-criteria indices that take into account the diversity of stakes involved in the case of a radionuclide release occurring during a postulated nuclear accident (cf. PRIME project).


Example of SUNSET results: fusion of the information related to the uncertainty associated with the first peak cladding temperature in a large-break LOCA scenario (BEMUSE Programme). SUNSET was used to compute the discrete values of each triangle, represented by the yellow, pink, light blue and dark blue dots. Each triangle represents the coherence between users of the same thermal-hydraulic code, obtained by fusing the information they provided. The degree of coherence can be read on the vertical axis: if the top of the triangle is close to 1, the users of the code are very coherent (see the yellow triangle, for example), whereas a value close to 0 (light blue triangle) indicates that they are in conflict.



Data reconstruction


The objective of reconstruction techniques is to accurately approximate a phenomenon on a given region when only a limited number of observations or simulations is available.

In safety assessment, they can be used to construct an approximation of the computer code (a meta-model, or response surface) in order to reduce the computational cost of uncertainty propagation.

These techniques also play a key role in the treatment of spatial data, such as measurements over a given territory. In this case, they are used to predict new values of the phenomenon under study, for example to construct a map from a set of discrete observations coming from sensors.


SUNSET provides two interpolation techniques to perform this reconstruction. The first is a deterministic barycentric approach called the inverse distance weighted method. It is appropriate for predicting the local behaviour of the phenomenon without assuming a model describing the whole data set. However, it does not provide the prediction error associated with each predicted value, which is crucial in safety assessment. To circumvent this limitation, a second interpolation technique, based on the stochastic kriging approach, is available in SUNSET. Its construction relies on the identification of the spatial dependence between the observed data (i.e. the similarity between two data points as a function of their distance) through the computation of a semi-variogram. This allows the user to quantify an estimation error that controls the confidence in the predicted values.
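The inverse distance weighted method is simple enough to sketch in full: the prediction at a point is a weighted average of the observations, with weights decaying as a power of the distance. The sensor readings below are illustrative.

```python
def idw(x, y, points, power=2.0):
    """Inverse distance weighted prediction at location (x, y).

    points: list of (xi, yi, value) observations, e.g. sensor readings.
    """
    num = den = 0.0
    for xi, yi, v in points:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v  # exact interpolation at a sensor location
        w = d2 ** (-power / 2.0)  # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den

# Illustrative sensor readings: (x, y, measured dose rate).
sensors = [(0.0, 0.0, 1.0), (1.0, 0.0, 3.0)]
```

Unlike kriging, this estimator carries no error model, which is precisely the limitation the semi-variogram construction addresses.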


Applications: SUNSET is currently used as a complementary tool to the mapping software involved in environmental monitoring studies performed at IRSN.



Example of SUNSET results related to environmental monitoring: map of equivalent dose rate. Left: predicted values; right: estimation error (the black points stand for the sensors; the colour intensity is proportional to the estimated value). The post-processing of the SUNSET results (i.e. the map) was performed with ArcGIS.


SUNSET datadeck construction and portability


A SUNSET datadeck is composed of a sequence of actions, each action corresponding to a given functionality chosen by the user for his or her analysis. Actions are specified through a Graphical User Interface (GUI) called PelGUIS, developed at IRSN. The SUNSET results can be visualised through the SUNSET GUI for basic plots, or through external software such as Excel for more sophisticated post-processing.



Example of PelGUIS: Construction of an action



Ongoing work


SUNSET integrates the methodological developments carried out at IRSN in collaboration with the University of Toulouse and the Ecole Centrale Marseille. They concern the modelling, propagation, synthesis and reconstruction of information. In future SUNSET versions, it is planned to implement Bayesian network approaches for risk analysis and to provide new models for data reconstruction, such as lognormal kriging or kriging with a heterogeneous nugget effect.


Moreover, SUNSET has been coupled with several complex computer codes used or developed at IRSN, such as the SYLVIA code for fire safety or the ASTEC code, which simulates all the phenomena occurring during a severe accident in a water-cooled nuclear reactor. It has also been integrated in the CLARA2 tool devoted to risk analysis in the case of a pollutant release during maritime transport (cf. CLARA project).

A new coupling with the PROMETHEE platform is planned in order to optimise the available computational resources while exploiting the statistical methods of SUNSET.


