Showing items 1 - 8 of 8
  • Publication
    Open access
    Contaminant source localization via Bayesian global optimization
    (2019); Tipaluck Krityakierne; David Ginsbourger
    Abstract. Contaminant source localization problems require efficient and robust methods that can account for geological heterogeneities and accommodate relatively small data sets of noisy observations. As realism commands high-fidelity simulations, computation costs call for global optimization algorithms under parsimonious evaluation budgets. Bayesian optimization approaches are well adapted to such settings as they allow the exploration of parameter spaces in a principled way so as to iteratively locate the point(s) of global optimum while maintaining an approximation of the objective function with an instrumental quantification of prediction uncertainty. Here, we adapt a Bayesian optimization approach to localize a contaminant source in a discretized spatial domain. We thus demonstrate the potential of such a method for hydrogeological applications and also provide test cases for the optimization community. The localization problem is illustrated for cases where the geology is assumed to be perfectly known. Two 2-D synthetic cases that display sharp hydraulic conductivity contrasts and specific connectivity patterns are investigated. These cases generate highly nonlinear objective functions that present multiple local minima. A derivative-free global optimization algorithm relying on a Gaussian process model and on the expected improvement criterion is used to efficiently localize the point of minimum of the objective functions, which corresponds to the contaminant source location. Even though concentration measurements contain a significant level of proportional noise, the algorithm efficiently recovers the contaminant source location. The variations of the objective function are essentially driven by the geology, followed by the design of the monitoring well network. The data and scripts used to generate objective functions are shared to favor reproducible research.
This contribution is important because the functions present multiple local minima and are inspired from a practical field application. Sharing these complex objective functions provides a source of test cases for global optimization benchmarks and should help with designing new and efficient methods to solve this type of problem.
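The expected improvement criterion mentioned in this abstract has a well-known closed form under a Gaussian process model. As a minimal, hedged illustration (this is not the authors' code; `mu`, `sigma` and `f_best` are hypothetical GP predictive means, standard deviations and the best observed objective value), it could be computed as:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Expected improvement for minimization, given GP predictive
    mean `mu` and standard deviation `sigma` at candidate points,
    and the best objective value `f_best` observed so far."""
    sigma = np.maximum(sigma, 1e-12)  # guard against zero predictive variance
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Toy usage: pick the candidate with the largest expected improvement.
mu = np.array([0.5, 0.2, 0.9])     # hypothetical GP means
sigma = np.array([0.1, 0.3, 0.05]) # hypothetical GP standard deviations
best_next = int(np.argmax(expected_improvement(mu, sigma, f_best=0.4)))
```

The criterion trades off a low predicted mean against a high predictive uncertainty, which is what lets the algorithm escape the multiple local minima described above.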
  • Publication
    Open access
    A pseudo genetic model of coarse braided-river deposits
    A new method is proposed to produce three-dimensional facies models of braided-river aquifers based on analog data. The algorithm consists of two steps. The first step involves building the main geological units. The production of the principal inner structures of the aquifer is achieved by stacking Multiple-Point-Statistics simulations of successive topographies, thus mimicking the major successive flooding events responsible for the erosion and deposition of sediments. The second step of the algorithm consists of generating fine scale heterogeneity within the main geological units. These smaller-scale structures are generated by mimicking the trough-filling process occurring in braided rivers; the imitation of the physical processes relies on the local topography and on a local approximation of the flow. This produces realistic cross-stratified sediments, comparable to what can be observed in outcrops. The three main input parameters of the algorithm offer control over the proportions, the continuity and the dimensions of the deposits. Calibration of these parameters does not require invasive field measurements and can rely partly on analog data.
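The first, unit-building step of the algorithm can be caricatured in a few lines. In this minimal sketch, random synthetic surfaces stand in for the multiple-point-statistics topography simulations of the paper: each new flood surface erodes (truncates) the older surfaces above it, and the system then aggrades by a fixed amount.

```python
import numpy as np

rng = np.random.default_rng(0)

def stack_topographies(n_events, nx, aggradation=0.5):
    """Stack successive synthetic 1-D topographies to delimit geological
    units, mimicking repeated flood events: each new surface truncates
    any older surface lying above it, then deposition raises the datum.
    (Random-walk surfaces stand in for the MPS simulations of the paper.)"""
    surfaces = [np.zeros(nx)]  # initial base surface
    datum = 0.0
    for _ in range(n_events):
        datum += aggradation
        topo = datum + rng.normal(0.0, 1.0, nx).cumsum() * 0.1
        # erosion: older surfaces cannot stand above the new one
        surfaces = [np.minimum(s, topo) for s in surfaces]
        surfaces.append(topo)
    return np.array(surfaces)

units = stack_topographies(n_events=5, nx=50)
```

The stacked, mutually truncating surfaces bound the "main geological units" of the first step; the trough-filling of the second step would then operate inside each unit.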
  • Publication
    Open access
    Influence of conceptual model uncertainty on contaminant transport forecasting in braided river aquifers
    (2015-12); Huber, Emanuel; Huggenberger, Peter
    Hydrogeologists are commonly confronted with field data scarcity. An interesting way to compensate for this paucity of data is to use analog data. This raises the questions of prediction accuracy and uncertainty assessment when analog data are used, which are investigated in the current paper for the case of contaminant transport forecasting in braided river aquifers. Using analog data from the literature, multiple unconditional geological realizations are produced following different geological conceptual models (Multi-Gaussian, Object-based, Pseudo-Genetic). These petrophysical realizations are tested in a contaminant transport problem based on the MADE-II tracer experiment dataset. The simulations show that reasonable contaminant transport predictions can be achieved using analog data. The initial concentration conditions and their location within the heterogeneous conductivity field have a stronger influence on the plume behavior than the resulting equivalent permeability. The results also underline the necessity of including a wide variety of geological conceptual models, and of not restraining parameter space exploration within each concept, as long as no field data allow for falsification of conceptual models or parameter values.
  • Publication
    Open access
    Stochastic heterogeneity modeling of braided river aquifers: a methodology based on multiple point statistics and analog data
    In this thesis a new pseudo-genetic method to model the heterogeneity of sandy gravel braided-river aquifers is proposed. It is tested and compared with other modeling approaches on a case study of contaminant transport. Indeed, in Switzerland, as in other mountainous regions, braided-river aquifers represent an important water resource that needs to be preserved. Managing this resource requires a good understanding of groundwater flow and transport in braided-river aquifers. As the complex heterogeneity of such sedimentary deposits strongly influences groundwater flow and transport, predictions of groundwater behavior need to rely on a wide spectrum of geological model realizations.
    To achieve realistic modeling of the sedimentary deposits of braided-river aquifers, the proposed pseudo-genetic algorithm combines analogue data with multiple-point statistics and process-imitating methods. The integration of analogue data is a key feature, as it provides additional, complementary and necessary information in the modeling process. Indeed, hydrogeologists are often faced with field data scarcity because of budget, time and field constraints. Recent multiple-point statistics algorithms, on the one hand, can produce realistic stochastic realizations from training sets with complex structures while easily honoring conditioning data. On the other hand, process-imitating methods can generate realistic patterns by mimicking physical processes.
    The proposed pseudo-genetic algorithm consists of two main steps. The first step builds the main geological units by stacking successive topography realizations one above the other, thus mimicking the successive large flood events contributing to the formation of the sedimentary deposits. The successive topographies are multiple-point statistics realizations from a training set composed of digital elevation models of an analogue braided river at different time steps. Each topography is generated conditionally on the previous one. The second step generates fine-scale heterogeneity within the main geological units. This is performed for each geological unit by iterative deformations of the unit's bottom surface, thus imitating the process of scour filling. With three main parameters, the aggradation rate, the number of successive iterations and the intensity of the deformations, the algorithm can produce a wide range of realistic cross-stratified sedimentary deposits.
    The method is tested on a contaminant transport example, using as reference tritium tracer experiment concentration data from the MADE site, Columbus, Mississippi, USA. In this test case, an assumption of data scarcity is made. Analogue data are integrated into the geological modeling process to determine the required input parameters -- characteristic dimensions and statistical properties of the conductivity -- for two variants of the proposed pseudo-genetic algorithm, as well as for multi-Gaussian simulation and object-based methods. For each conceptual model, flow and transport simulations are run over 200 geological model realizations to cover part of the uncertainty due to the input parameters. The plume behavior predictions of the different conceptual models are then compared.
    The results show that geological structures strongly influence the plume behavior; the choice of, or the restriction to, specific conceptual models therefore impacts the prediction uncertainty. Although little information is available to the modeler, it is possible to achieve reasonable predictions by using analogue data. Of course, with limited information it is impossible to make a prediction accurate enough to match the reference; no single conceptual model produces better predictions than the others, but all are useful to cover the uncertainty range. The results also underline the need for a wide exploration of the input parameters of the various conceptual models in order to recover the uncertainty.
  • Publication
    Metadata only
    Conditioning of Multiple-Point Statistics Facies Simulations to Tomographic Images
    (2014-7); Lochbühler, Tobias; Linde, Niklas
    Geophysical tomography captures the spatial distribution of the underlying geophysical property at a relatively high resolution, but the tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may cause significant bias when taken as a basis for predictive flow and transport modeling and are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms are used as a training image. The training image is scanned with a direct sampling algorithm for patterns in the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predicted the radar traveltimes within the expected data error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and an aquifer analog of alluvial sedimentary structures with five facies. For both cases, MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared to unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement to other approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for: (1) reproduction of sharp interfaces, (2) incorporation of realistic geological constraints and (3) generation of multiple realizations that enables uncertainty assessment.
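The post-processing acceptance step described in this abstract amounts to a simple rejection rule: keep only the realizations whose forward-simulated traveltimes fit the observations within the expected data error. A hedged sketch, with hypothetical traveltime arrays standing in for the forward-simulated and observed radar data (the forward solver itself is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

def accept_realizations(simulated_times, observed_times, sigma_error, tol=2.0):
    """Keep realizations whose RMS traveltime misfit lies within `tol`
    standard deviations of the data error. A sketch of the rejection
    step; the threshold convention is an assumption for illustration."""
    misfit = np.sqrt(np.mean((simulated_times - observed_times) ** 2, axis=1))
    return misfit <= tol * sigma_error

observed = rng.normal(50.0, 5.0, size=20)               # hypothetical traveltimes (ns)
sims = observed + rng.normal(0.0, 1.0, size=(100, 20))  # 100 candidate realizations
accepted = accept_realizations(sims, observed, sigma_error=1.0)
```

Only the accepted subset then enters the transport-prediction ensemble, which is what tightens the uncertainty compared to unconditioned MPS simulations.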
  • Publication
    Open access
    Simulation of braided river elevation model time series with multiple-point statistics
    A new method is proposed to generate successive topographies in a braided river system; it is essentially based on multivariate multiple-point statistics simulations with digital elevation models as training data sets. Braided river morphology models are a key factor influencing river-aquifer interactions and have repercussions for ecosystems, flood risk and water management. On the one hand, airborne photography and LIDAR acquired at successive time steps have contributed to a better understanding of the geomorphological processes, although the available data are sparse across time and river scales. On the other hand, geostatistics provides simulation tools for multiple and continuous variables, which allow the exploration of the uncertainty of many assumption scenarios. An illustration of the approach demonstrates the ability of multiple-point statistics to produce realistic topographies from the information provided by digital elevation models at two time steps.
  • Publication
    Open access
    A practical guide to performing multiple-point statistical simulations with the Direct Sampling algorithm
    (2013-3); Meerschman, Eef; Van Meirvenne, Marc
    The Direct Sampling (DS) algorithm is a recently developed multiple-point statistical simulation technique. It directly scans the training image (TI) for a given data event instead of storing the training probability values in a catalogue prior to simulation. By using distances between the given data events and the TI patterns, DS can simulate categorical, continuous and multivariate problems. Benefiting from the wide spectrum of potential applications of DS requires an understanding of the user-defined input parameters. We therefore list the most important parameters and assess their impact on the generated simulations. Real-case TIs are used, including an image of ice-wedge polygons, a marble slice and snow crystals, all three as both continuous and categorical images. We also use a 3D categorical TI representing a block of concrete to demonstrate the capacity of DS to generate 3D simulations. First, a quantitative sensitivity analysis is conducted on the three parameters balancing simulation quality and CPU time: the acceptance threshold t, the fraction of the TI to scan f and the number of neighbors n. In addition to a visual inspection of the generated simulations, the performance is analyzed in terms of calculation speed and quality of pattern reproduction. Whereas decreasing the CPU time through t and n comes at the expense of simulation quality, reducing the scanned fraction of the TI allows substantial computational gains without degrading the quality, as long as the TI contains enough reproducible patterns. We also illustrate the quality improvement resulting from post-processing and the potential of DS to simulate bivariate problems and to honor conditioning data. We provide a comprehensive guide to performing multiple-point statistical simulations with the DS algorithm and recommendations on how to set the input parameters appropriately.
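For readers unfamiliar with the algorithm, the roles of t, f and n can be illustrated with a deliberately simplified one-dimensional sketch (the published DS algorithm handles multi-dimensional grids, random simulation paths and multivariate distances; none of that is reproduced here, and the sequential left-to-right path is a simplifying assumption):

```python
import numpy as np

rng = np.random.default_rng(2)

def direct_sampling_1d(ti, n_sim, t=0.1, f=0.5, n=3):
    """Toy 1-D Direct Sampling: for each node, scan a random fraction `f`
    of the training image `ti` and paste the value following the first
    pattern whose distance to the data event (up to `n` already-simulated
    neighbors) falls below the acceptance threshold `t`; otherwise keep
    the best match found among the scanned candidates."""
    sim = np.empty(n_sim, dtype=ti.dtype)
    sim[0] = ti[rng.integers(len(ti))]
    for i in range(1, n_sim):
        event = sim[max(0, i - n):i]          # conditioning data event
        k = len(event)
        n_scan = max(1, int(f * len(ti)))     # fraction of TI to scan
        candidates = rng.permutation(len(ti) - k)[:n_scan]
        best_pos, best_dist = candidates[0], np.inf
        for pos in candidates:
            d = np.mean(ti[pos:pos + k] != event)  # mismatch fraction
            if d < best_dist:
                best_pos, best_dist = pos, d
            if best_dist <= t:                     # acceptance threshold
                break
        sim[i] = ti[best_pos + k]
    return sim

ti = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, 0, 1, 1, 1, 1, 0])
realization = direct_sampling_1d(ti, n_sim=12)
```

Even in this toy form, the trade-offs of the abstract are visible: a larger t or smaller f ends the scan earlier (faster, looser pattern reproduction), while a larger n conditions each node on a longer data event.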
  • Publication
    Open access
    Distance-based kriging relying on proxy simulations for inverse conditioning
    (2013-1-10); Ginsbourger, David; Rosspopoff, Bastien; Durrande, Nicolas