
# Pirot, Guillaume


## Search results

11 results


Showing items 1–10 of 11

- Publication (open access): Distance-based kriging relying on proxy simulations for inverse conditioning (2013-01-10)
Ginsbourger, David; Rosspopoff, Bastien; Durrande, Nicolas

- Publication (open access): Stochastic heterogeneity modeling of braided river aquifers: a methodology based on multiple point statistics and analog data
In this thesis, a new pseudo-genetic method to model the heterogeneity of sandy gravel braided-river aquifers is proposed. It is tested and compared with other modeling approaches on a contaminant transport case study. Indeed, in Switzerland and in other mountainous regions, braided-river aquifers represent an important water resource that needs to be preserved. Managing this resource requires a good understanding of groundwater flow and transport in braided-river aquifers. As the complex heterogeneity of such sedimentary deposits strongly influences groundwater flow and transport, predictions of groundwater behavior need to rely on a wide spectrum of geological model realizations.

To achieve realistic modeling of the sedimentary deposits of braided-river aquifers, the proposed pseudo-genetic algorithm combines analog data with multiple-point statistics and process-imitating methods. The integration of analog data is a key feature, providing additional, complementary and necessary information in the modeling process. Indeed, hydrogeologists are often confronted with field-data scarcity because of budget, time and field constraints. On the one hand, recent multiple-point statistics algorithms can produce realistic stochastic realizations from training sets with complex structures while easily honoring conditioning data. On the other hand, process-imitating methods generate realistic patterns by mimicking physical processes.

The proposed pseudo-genetic algorithm consists of two main steps. The first step builds the main geological units by stacking successive topography realizations one above the other, mimicking the successive large flood events that contribute to the formation of the sedimentary deposits. The successive topographies are multiple-point statistics realizations drawn from a training set composed of digital elevation models of an analog braided river at different time steps, and each topography is generated conditionally on the previous one. The second step generates fine-scale heterogeneity within the main geological units. For each geological unit, this is performed by iterative deformations of the unit's bottom surface, thus imitating the process of scour filling. With three main parameters (the aggradation rate, the number of successive iterations and the intensity of the deformations), the algorithm can produce a wide range of realistic cross-stratified sedimentary deposits.
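The stacking-with-erosion idea behind the first step can be sketched in a few lines. This is only a toy illustration, not the authors' implementation: a smoothed random field and an assumed constant aggradation rate stand in for the conditional multiple-point statistics topography simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

def topography(nx, ny, previous=None, aggradation=0.5):
    """Stand-in for one topography realization.

    In the method described above each surface would be an MPS
    simulation conditioned on the previous topography; here a
    smoothed random field plus an assumed aggradation rate is used.
    """
    noise = rng.normal(0.0, 1.0, (nx, ny))
    # crude smoothing to mimic spatial continuity
    surface = (noise + np.roll(noise, 1, axis=0) + np.roll(noise, 1, axis=1)) / 3.0
    if previous is not None:
        surface += previous + aggradation
    return surface

def stack_units(nx=50, ny=50, n_floods=5):
    """Step 1: build geological units by stacking successive topographies.

    Each new flood surface may locally erode the older deposits, so
    earlier bounding surfaces are clipped wherever the new surface
    cuts below them.
    """
    surfaces = [topography(nx, ny)]
    for _ in range(n_floods - 1):
        new = topography(nx, ny, previous=surfaces[-1])
        # erosion: no older bounding surface can lie above the new one
        surfaces = [np.minimum(s, new) for s in surfaces]
        surfaces.append(new)
    return np.stack(surfaces)  # shape (n_floods, nx, ny)

units = stack_units()
print(units.shape)  # (5, 50, 50): one bounding surface per flood event
```

After stacking, consecutive bounding surfaces delimit one geological unit each; the second step of the algorithm would then fill each unit with fine-scale structures.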

The method is tested on a contaminant transport example, using as reference tritium tracer-experiment concentration data from the MADE site, Columbus, Mississippi, USA. In this test case, data scarcity is assumed. Analog data are integrated into the geological modeling process to determine the required input parameters (characteristic dimensions and statistical properties of the conductivity) for two variants of the proposed pseudo-genetic algorithm as well as for multi-Gaussian simulation and object-based methods. For each conceptual model, flow and transport simulations are run over 200 geological model realizations to cover part of the uncertainty due to the input parameters. The plume-behavior predictions of the different conceptual models are then compared.

The results show that geological structures strongly influence the plume behavior; the choice of, or restriction to, specific conceptual models therefore impacts the prediction uncertainty. Although little information is available to the modeler, reasonable predictions can be achieved by using analog data. Of course, with limited information it is impossible to make a prediction accurate enough to match the reference, and no single conceptual model produces better predictions than the others, but all are useful to cover the uncertainty range. The results also underline the need for a wide exploration of the input parameters of the various conceptual models in order to recover the uncertainty.

- Publication (open access): Simulation of braided river elevation model time series with multiple-point statistics
A new method is proposed to generate successive topographies in a braided river system. Indeed, braided river morphology models are a key factor influencing river–aquifer interactions and have repercussions on ecosystems, flood risk and water management. The method is essentially based on multivariate multiple-point statistics simulations with digital elevation models as training data sets. On the one hand, airborne photography and LIDAR acquired at successive time steps have contributed to a better understanding of the geomorphological processes, although the available data are sparse over time and river scales. On the other hand, geostatistics provides simulation tools for multiple and continuous variables, which allow the exploration of the uncertainty of many assumption scenarios. An illustration of the approach demonstrates the ability of multiple-point statistics to produce realistic topographies from the information provided by digital elevation models at two time steps.

- Publication (open access): Influence of conceptual model uncertainty on contaminant transport forecasting in braided river aquifers (2015)
Huber, E.; Straubhaar, J.; Huggenberger, P.
Hydrogeologists are commonly confronted with field-data scarcity. An interesting way to compensate for this data paucity is to use analog data. The questions of prediction accuracy and uncertainty assessment when using analog data must then be raised. These questions are investigated in the current paper for contaminant transport forecasting in braided river aquifers. Using analog data from the literature, multiple unconditional geological realizations are produced following different geological conceptual models (multi-Gaussian, object-based, pseudo-genetic). These petrophysical realizations are tested on a contaminant transport problem based on the MADE-II tracer-experiment dataset. The simulations show that reasonable contaminant transport predictions can be achieved using analog data. The initial concentration conditions and their location with respect to the conductivity heterogeneity field have a stronger influence on the plume behavior than the resulting equivalent permeability. The results also underline the necessity of including a wide variety of geological conceptual models and of not restraining parameter-space exploration within each concept as long as no field data allow for conceptual-model or parameter-value falsification.

- Publication (open access): A pseudo genetic model of coarse braided-river deposits (2015-12)
A new method is proposed to produce three-dimensional facies models of braided-river aquifers based on analog data. The algorithm consists of two steps. The first step involves building the main geological units: the principal inner structures of the aquifer are produced by stacking multiple-point statistics simulations of successive topographies, thus mimicking the major successive flooding events responsible for the erosion and deposition of sediments. The second step consists of generating fine-scale heterogeneity within the main geological units. These smaller-scale structures are generated by mimicking the trough-filling process occurring in braided rivers; the imitation of the physical processes relies on the local topography and on a local approximation of the flow. This produces realistic cross-stratified sediments, comparable to what can be observed in outcrops. The three main input parameters of the algorithm offer control over the proportions, the continuity and the dimensions of the deposits. Calibration of these parameters does not require invasive field measurements and can rely partly on analog data.

- Publication (open access): A practical guide to performing multiple-point statistical simulations with the Direct Sampling algorithm
Meerschman, Eef; Van Meirvenne, Marc
The Direct Sampling (DS) algorithm is a recently developed multiple-point statistical simulation technique. It directly scans the training image (TI) for a given data event instead of storing the training probability values in a catalogue prior to simulation. By using distances between the given data events and the TI patterns, DS can simulate categorical, continuous and multivariate problems. Benefiting from the wide spectrum of potential applications of DS requires an understanding of the user-defined input parameters. We therefore list the most important parameters and assess their impact on the generated simulations. Real-case TIs are used, including an image of ice-wedge polygons, a marble slice and snow crystals, all three as continuous and categorical images. We also use a 3D categorical TI representing a block of concrete to demonstrate the capacity of DS to generate 3D simulations. First, a quantitative sensitivity analysis is conducted on the three parameters balancing simulation quality and CPU time: the acceptance threshold *t*, the fraction of the TI to scan *f* and the number of neighbors *n*. In addition to a visual inspection of the generated simulations, the performance is analyzed in terms of speed of calculation and quality of pattern reproduction. Whereas decreasing the CPU time through *t* and *n* comes at the expense of simulation quality, reducing the scanned fraction of the TI allows substantial computational gains without degrading the quality as long as the TI contains enough reproducible patterns. We also illustrate the quality improvement resulting from post-processing and the potential of DS to simulate bivariate problems and to honor conditioning data.
We thus provide a comprehensive guide to performing multiple-point statistical simulations with the DS algorithm and give recommendations on how to set the input parameters appropriately.

- Publication (open access): Influence of conceptual model uncertainty on contaminant transport forecasting in braided river aquifers (2015-12)
Huber, Emanuel; Huggenberger, Peter

- Publication (open access): Distance-based kriging relying on proxy simulations for inverse conditioning
Ginsbourger, David; Rosspopoff, Bastien; Durrande, Nicolas
Let us consider a large set of candidate parameter fields, such as hydraulic conductivity maps, on which we can run an accurate forward flow and transport simulation. We address the issue of rapidly identifying a subset of candidates whose responses best match a reference response curve. In order to keep the number of calls to the accurate flow simulator computationally tractable, a recent distance-based approach relying on fast proxy simulations is revisited and turned into a non-stationary kriging method in which the covariance kernel is obtained by combining a classical kernel with the proxy. Once the accurate simulator has been run for an initial subset of parameter fields and a kriging metamodel has been inferred, the predictive distributions of misfits for the remaining parameter fields can be used as a guide to select candidate parameter fields sequentially. The proposed algorithm, *Proxy-based Kriging for Sequential Inversion* (ProKSI), relies on a variant of the *Expected Improvement*, a popular criterion for kriging-based global optimization. A statistical benchmark of ProKSI's performance illustrates the efficiency and robustness of the approach when different kinds of proxies are used.

- Publication (metadata only): Conditioning of Multiple-Point Statistics Facies Simulations to Tomographic Images (2014-07)
Lochbühler, Tobias; Linde, Niklas
Geophysical tomography captures the spatial distribution of the underlying geophysical property at a relatively high resolution, but the tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may cause significant bias when taken as a basis for predictive flow and transport modeling and are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms are used as a training image. The training image is scanned with a direct sampling algorithm for patterns in the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predict the radar traveltimes within the expected data-error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and on an aquifer analog of alluvial sedimentary structures with five facies. For both cases, the MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared to unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement over other approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for (1) the reproduction of sharp interfaces, (2) the incorporation of realistic geological constraints and (3) the generation of multiple realizations enabling uncertainty assessment.

- Publication (open access): A practical guide to performing multiple-point statistical simulations with the Direct Sampling algorithm (2013-03)
Meerschman, Eef; Van Meirvenne, Marc
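The roles of the three DS parameters discussed in the abstracts above (acceptance threshold *t*, scanned fraction *f*, number of neighbors *n*) can be illustrated with a minimal 1D toy version of the scan loop. This is only a sketch under simplifying assumptions (unconditional 1D simulation, a sequential path, mismatch fraction as distance), not the published DS implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def direct_sampling_1d(ti, length, t=0.1, f=0.5, n=3):
    """Toy 1D Direct Sampling sketch (illustrative only).

    For each cell, the n previously simulated values form the data
    event; a random fraction f of the training image (ti) is scanned,
    and the value following the first pattern whose distance is <= t
    is sampled (falling back to the best candidate seen).
    """
    sim = [ti[rng.integers(len(ti))]]  # seed with a random TI value
    for _ in range(length - 1):
        event = np.array(sim[-n:])             # conditioning data event
        m = len(event)
        starts = rng.permutation(len(ti) - m)  # random scan path in the TI
        n_scan = max(1, int(f * len(starts)))  # scan only a fraction f
        best_val, best_d = None, np.inf
        for s in starts[:n_scan]:
            d = np.mean(ti[s:s + m] != event)  # mismatch distance
            if d < best_d:
                best_d, best_val = d, ti[s + m]
            if d <= t:                         # acceptance threshold
                break
        sim.append(best_val)
    return np.array(sim)

# binary training image with short runs of ones ("channels")
ti = np.array([0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 0, 0, 1, 1, 1, 1, 0] * 20)
realization = direct_sampling_1d(ti, length=100, t=0.0, f=0.3, n=4)
print(realization[:20])
```

Lowering *t* or raising *f* and *n* improves pattern reproduction at the cost of more scanning, which is the quality/CPU-time trade-off analyzed in the guide.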