  • Publication
    Open access
    Influence of conceptual model uncertainty on contaminant transport forecasting in braided river aquifers
    (2015-12)
    Huber, Emanuel; Straubhaar, J.; Huggenberger, Peter
    Hydrogeologists are commonly confronted with field data scarcity. An interesting way to compensate for this paucity of data is to use analog data, which raises the questions of prediction accuracy and uncertainty assessment when analog data are used. These questions are investigated in this paper for the case of contaminant transport forecasting in braided river aquifers. Using analog data from the literature, multiple unconditional geological realizations are produced following different geological conceptual models (multi-Gaussian, object-based, pseudo-genetic). These petrophysical realizations are tested on a contaminant transport problem based on the MADE-II tracer experiment dataset. The simulations show that reasonable contaminant transport predictions can be achieved using analog data. The initial concentration conditions and their location relative to the conductivity heterogeneity field have a stronger influence on the plume behavior than the resulting equivalent permeability. The results also underline the necessity of including a wide variety of geological conceptual models and of not restraining parameter space exploration within each concept, as long as no field data allow for falsification of conceptual models or parameter values.
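    A minimal sketch of the multi-concept ensemble workflow this abstract describes, assuming placeholder functions throughout: simulate_field and run_transport are hypothetical names, and the lognormal field is a synthetic stand-in for a real geostatistical simulator and transport solver.

      import numpy as np

      CONCEPTS = ["multi-Gaussian", "object-based", "pseudo-genetic"]

      def simulate_field(concept, seed):
          """Placeholder: one unconditional hydraulic-conductivity realization."""
          rng = np.random.default_rng(seed)
          return rng.lognormal(mean=-9.0, sigma=2.0, size=(20, 50, 10))

      def run_transport(k_field):
          """Placeholder: one plume statistic from a transport simulation."""
          return float(np.log(k_field).std())   # stand-in for a real solver

      # Explore each conceptual model with many unconditional realizations,
      # then compare the spread of the predicted plume statistic per concept.
      for concept in CONCEPTS:
          stats = [run_transport(simulate_field(concept, s)) for s in range(50)]
          print(f"{concept}: mean={np.mean(stats):.3f}, std={np.std(stats):.3f}")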
  • Publication
    Open access
    A pseudo genetic model of coarse braided-river deposits
    A new method is proposed to produce three-dimensional facies models of braided-river aquifers based on analog data. The algorithm consists of two steps. The first step involves building the main geological units: the principal inner structures of the aquifer are produced by stacking multiple-point statistics simulations of successive topographies, thus mimicking the major successive flooding events responsible for the erosion and deposition of sediments. The second step generates fine-scale heterogeneity within the main geological units. These smaller-scale structures are generated by mimicking the trough-filling process occurring in braided rivers; the imitation of the physical processes relies on the local topography and on a local approximation of the flow. This produces realistic cross-stratified sediments, comparable to what can be observed in outcrops. The three main input parameters of the algorithm offer control over the proportions, the continuity and the dimensions of the deposits. Calibration of these parameters does not require invasive field measurements and can rely partly on analog data.
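    An illustrative-only sketch of the two-step structure described in this abstract; both helpers are synthetic stand-ins, not the published algorithm (no MPS engine or flow approximation is actually called).

      import numpy as np

      def next_topography(previous, rng):
          """Placeholder for an MPS simulation of the topography after a flood."""
          return previous + rng.normal(scale=0.1, size=previous.shape)

      def fill_troughs(topo):
          """Placeholder for the trough-filling step that creates cross-strata."""
          return (topo < np.median(topo)).astype(int)   # crude facies code

      rng = np.random.default_rng(0)
      topo = np.zeros((100, 100))
      layers = []
      for _ in range(10):                     # step 1: stack successive topographies
          topo = next_topography(topo, rng)
          layers.append(fill_troughs(topo))   # step 2: fine-scale heterogeneity
      facies_model = np.stack(layers)         # a crude 3D stack of geological units
      print(facies_model.shape, facies_model.mean())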
  • Publication
    Open access
    Simulation of braided river elevation model time series with multiple-point statistics
    A new method is proposed to generate successive topographies in a braided river system. Braided river morphology models are a key factor influencing river-aquifer interactions and have repercussions for ecosystems, flood risk and water management. The method is essentially based on multivariate multiple-point statistics simulations with digital elevation models as training data sets. On the one hand, airborne photography and LIDAR acquired at successive time steps have contributed to a better understanding of the geomorphological processes, although the available data are sparse over time and river scales. On the other hand, geostatistics provides simulation tools for multiple and continuous variables, which allow the exploration of the uncertainty of many assumption scenarios. An illustration of the approach demonstrates the ability of multiple-point statistics to produce realistic topographies from the information provided by digital elevation models at two time steps.
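    A hedged sketch of how two elevation snapshots can be arranged as a bivariate training image, as the abstract describes; the DEMs here are synthetic and mps_cosimulate is a hypothetical stand-in for a multivariate MPS engine.

      import numpy as np

      rng = np.random.default_rng(1)
      # Synthetic stand-ins for two LIDAR-derived DEMs of the same river reach.
      dem_t  = np.cumsum(rng.normal(size=(50, 80)), axis=1)
      dem_t1 = dem_t + rng.normal(scale=0.2, size=dem_t.shape)

      # Bivariate training image: variable 0 = topography at t, 1 = at t+1.
      training_image = np.stack([dem_t, dem_t1])

      def mps_cosimulate(ti, known):
          """Placeholder for a multivariate MPS co-simulation."""
          increment = ti[1] - ti[0]            # observed t -> t+1 change
          return known + rng.permutation(increment.ravel()).reshape(known.shape)

      dem_t2 = mps_cosimulate(training_image, known=dem_t1)   # next time step
      print(dem_t2.shape)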
  • Publication
    Open access
    A practical guide to performing multiple-point statistical simulations with the Direct Sampling algorithm
    (2013-3)
    Meerschman, Eef; …; Van Meirvenne, Marc; …
    The Direct Sampling (DS) algorithm is a recently developed multiple-point statistical simulation technique. It directly scans the training image (TI) for a given data event instead of storing the training probability values in a catalogue prior to simulation. By using distances between the given data events and the TI patterns, DS can simulate categorical, continuous and multivariate problems. Benefiting from the wide spectrum of potential applications of DS requires an understanding of the user-defined input parameters. We therefore list the most important parameters and assess their impact on the generated simulations. Real-case TIs are used, including an image of ice-wedge polygons, a marble slice and snow crystals, all three as continuous and categorical images. We also use a 3D categorical TI representing a block of concrete to demonstrate the capacity of DS to generate 3D simulations. First, a quantitative sensitivity analysis is conducted on the three parameters balancing simulation quality and CPU time: the acceptance threshold t, the fraction of the TI to scan f and the number of neighbors n. In addition to a visual inspection of the generated simulations, the performance is analyzed in terms of calculation speed and quality of pattern reproduction. Whereas decreasing the CPU time through t and n comes at the expense of simulation quality, reducing the scanned fraction of the TI allows substantial computational gains without degrading the quality, as long as the TI contains enough reproducible patterns. We also illustrate the quality improvement resulting from post-processing, and the potential of DS to simulate bivariate problems and to honor conditioning data. We provide a comprehensive guide to performing multiple-point statistical simulations with the DS algorithm, with recommendations on how to set the input parameters appropriately.
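    The roles of t, f and n can be made concrete with a toy, heavily simplified re-implementation of the DS scanning loop for one continuous 2D variable; this is a didactic sketch, far slower and cruder than the published algorithm, and the training image is synthetic.

      import numpy as np

      def direct_sampling(ti, shape, t=0.05, f=0.3, n=8, seed=0):
          """Toy DS: sequential simulation driven by direct TI scanning."""
          rng = np.random.default_rng(seed)
          sim = np.full(shape, np.nan)
          for flat in rng.permutation(shape[0] * shape[1]):   # random path
              x, y = divmod(int(flat), shape[1])
              # Data event: up to n nearest already-simulated neighbors.
              informed = np.argwhere(~np.isnan(sim))
              if len(informed):
                  d2 = ((informed - (x, y)) ** 2).sum(axis=1)
                  near = informed[np.argsort(d2)[:n]]
                  event = [((i - x, j - y), sim[i, j]) for i, j in near]
              else:
                  event = []
              best_val, best_dist = None, np.inf
              n_scan = max(1, int(f * ti.size))               # fraction f of the TI
              for idx in rng.choice(ti.size, size=n_scan, replace=False):
                  u, v = divmod(int(idx), ti.shape[1])
                  dist, ok = 0.0, True
                  for (di, dj), val in event:
                      ui, vj = u + di, v + dj
                      if not (0 <= ui < ti.shape[0] and 0 <= vj < ti.shape[1]):
                          ok = False
                          break
                      dist += abs(ti[ui, vj] - val)
                  if not ok:
                      continue
                  dist /= max(len(event), 1)
                  if dist < best_dist:
                      best_dist, best_val = dist, ti[u, v]
                  if dist <= t:                               # acceptance threshold
                      break
              sim[x, y] = best_val if best_val is not None else ti[u, v]
          return sim

      # Synthetic continuous training image rescaled to [0, 1].
      rng = np.random.default_rng(42)
      ti = np.cumsum(np.cumsum(rng.normal(size=(40, 40)), axis=0), axis=1)
      ti = (ti - ti.min()) / (ti.max() - ti.min())
      realization = direct_sampling(ti, shape=(25, 25), t=0.05, f=0.3, n=8)

    Lowering t or raising n tightens pattern matching at extra CPU cost, while lowering f reduces scanning time, which mirrors the trade-offs reported in the abstract.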
  • Publication
    Open access
    Distance-based kriging relying on proxy simulations for inverse conditioning
    (2013-1-10)
    Ginsbourger, David; Rosspopoff, Bastien; …; Durrande, Nicolas; …
    Let us consider a large set of candidate parameter fields, such as hydraulic conductivity maps, on which we can run an accurate forward flow and transport simulation. We address the issue of rapidly identifying a subset of candidates whose responses best match a reference response curve. In order to keep the number of calls to the accurate flow simulator computationally tractable, a recent distance-based approach relying on fast proxy simulations is revisited and turned into a non-stationary kriging method whose covariance kernel is obtained by combining a classical kernel with the proxy. Once the accurate simulator has been run for an initial subset of parameter fields and a kriging metamodel has been inferred, the predictive distributions of misfits for the remaining parameter fields can be used as a guide to select candidate parameter fields sequentially. The proposed algorithm, Proxy-based Kriging for Sequential Inversion (ProKSI), relies on a variant of Expected Improvement, a popular criterion for kriging-based global optimization. A statistical benchmark of ProKSI’s performance illustrates the efficiency and robustness of the approach when different kinds of proxies are used.
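    A hedged numerical sketch of the sequential logic, not the authors' code: misfits of evaluated candidates feed a simple kriging model whose covariance is computed on proxy-response distances, and an Expected Improvement score picks the next field to run through the accurate simulator. Both simulators below (proxy and accurate_misfit) are synthetic stand-ins.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      n_cand = 200
      theta = rng.uniform(-3, 3, size=(n_cand, 5))     # candidate parameter fields
      proxy = theta @ rng.normal(size=(5, 3))          # cheap proxy responses

      def accurate_misfit(i):
          """Synthetic stand-in for the expensive flow-and-transport misfit."""
          return float(np.sum((theta[i] - 1.0) ** 2) + 0.1 * rng.normal())

      def kernel(P, Q, ell=2.0):
          """Gaussian kernel evaluated on distances between proxy responses."""
          d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / (2.0 * ell ** 2))

      evaluated = list(rng.choice(n_cand, size=10, replace=False))
      misfits = [accurate_misfit(i) for i in evaluated]

      for _ in range(20):                              # sequential design loop
          K = kernel(proxy[evaluated], proxy[evaluated])
          K += 1e-8 * np.eye(len(evaluated))           # jitter for stability
          ks = kernel(proxy, proxy[evaluated])
          mu = ks @ np.linalg.solve(K, np.array(misfits))
          var = np.clip(1.0 - np.sum(ks * np.linalg.solve(K, ks.T).T, axis=1),
                        1e-12, None)
          s = np.sqrt(var)
          best = min(misfits)
          z = (best - mu) / s
          ei = (best - mu) * norm.cdf(z) + s * norm.pdf(z)   # EI for minimization
          ei[evaluated] = 0.0                          # skip already-run fields
          nxt = int(np.argmax(ei))
          evaluated.append(nxt)
          misfits.append(accurate_misfit(nxt))

      print("best misfit after sequential runs:", min(misfits))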