Preprint, Working Paper. Year: 2017

Calibration and prediction of two nested computer codes

Abstract

Thanks to the increase in computing power, risk quantification relies more and more on computer modeling. Methods of risk quantification based on a fixed computational budget exist, but the computer codes involved are almost always treated as a single black box. In this paper, we are interested in analyzing the behavior of a complex phenomenon that consists of two nested computer codes. By two nested computer codes, we mean that some inputs of the second code are outputs of the first code. Each code can be approximated by a parametrized computer model. First, we propose methods to calibrate the parameters of the computer models and to build a predictor of the nested phenomenon for a given set of observations. The presented methods make it possible to take into account observations of the first code, of the second code, and of the nested code. Second, the choice of the observations is studied. Sequential design methods, that is, the step-by-step addition of observation points to the set of observations, are examined. These sequential designs aim at improving the performance of calibration or prediction while optimizing the computational budget. We show that calibrating the two computer models independently is not efficient for predicting the nested phenomenon, and we propose an original method that significantly improves the prediction performance.
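To make the nested structure concrete, here is a minimal, hypothetical sketch, not the method proposed in the paper: two toy codes in which an output of the first code is an input of the second, composed into a nested code, followed by a simple joint least-squares calibration of both parameters from observations of the nested phenomenon. The toy models, parameter values, and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def code1(x, theta1):
    # First computer model, with parameter theta1 (toy example).
    return np.sin(theta1 * x) + x

def code2(x, y1, theta2):
    # Second computer model: takes the original input x and the
    # output y1 of the first code as inputs (toy example).
    return theta2 * y1 ** 2 + np.cos(x)

def nested_code(x, theta1, theta2):
    # Nested phenomenon: composition of the two codes.
    return code2(x, code1(x, theta1), theta2)

# Synthetic observations of the nested code (assumed "true" parameters 1.3, 0.7).
rng = np.random.default_rng(0)
x_obs = np.linspace(0.0, 3.0, 20)
y_obs = nested_code(x_obs, 1.3, 0.7) + 0.05 * rng.standard_normal(20)

# Joint calibration of (theta1, theta2) from observations of the nested code only.
def residuals(theta):
    return nested_code(x_obs, theta[0], theta[1]) - y_obs

fit = least_squares(residuals, x0=[1.0, 1.0])
print("calibrated (theta1, theta2):", fit.x)
```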
Main file: 2017_02_22_SMP_Article.pdf (1014.01 KB). Origin: files produced by the author(s).

Dates and versions

hal-01566968, version 1 (21-07-2017)

Identifiers

  • HAL Id: hal-01566968, version 1

Cite

Sophie Marque-Pucheu, Guillaume Perrin, Josselin Garnier. Calibration and prediction of two nested computer codes. 2017. ⟨hal-01566968⟩
