Showing items 1 - 10 of 13
  • Publication
    Open access
    Efficiency of template matching methods for Multiple-Point Statistics simulations
    (2021-8)
    Sharifzadeh Lari, Mansoureh
    Almost all Multiple-Point Statistics (MPS) methods internally use a template matching method to select the patterns that best match conditioning data. The purpose of this paper is to analyze the performance of ten of the most frequently used template matching techniques in the framework of MPS algorithms. Performance is measured in terms of computing efficiency, accuracy, and memory usage. The methods were tested with both categorical and continuous training images (TI). The analysis considers the ability of those methods to locate, rapidly and with minimum error, a data event with a specific proportion of known pixels and a certain amount of noise. Experiments indicate that the Coarse to Fine using Entropy (CFE) method is the fastest in all configurations. Skipping methods are efficient as well. In terms of accuracy, and without noise, all methods except CFE and cross-correlation (CC) perform well. CC is the least accurate in all configurations if the TI is not normalized; it performs better when normalized training images are used. The Binary Sum of Absolute Difference is the most robust against noise. Finally, in terms of memory usage, CFE is the worst among the ten methods that were tested; the other methods do not differ significantly.
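    As a rough illustration of what such a template matching step involves, the following Python sketch (hypothetical names and data, not taken from the paper) scores a partially informed data event against training-image windows with two of the benchmarked measures, the sum of absolute differences and a normalized cross-correlation.
```python
# Minimal sketch (not the paper's code): comparing a data event against
# training-image (TI) windows with two of the benchmarked scores, the Sum of
# Absolute Differences (SAD) and a normalized cross-correlation (CC).
# Names, sizes and data are illustrative assumptions.
import numpy as np

def sad(data_event, window, mask):
    """Sum of absolute differences over the informed pixels only."""
    return np.abs(data_event[mask] - window[mask]).sum()

def normalized_cc(data_event, window, mask):
    """Cross-correlation on standardized values; higher means more similar."""
    a, b = data_event[mask], window[mask]
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

rng = np.random.default_rng(0)
ti = rng.random((200, 200))            # continuous training image
event = ti[50:55, 80:85].copy()        # a 5x5 data event taken from the TI
mask = rng.random(event.shape) < 0.6   # only ~60% of the pixels are informed

best, best_score = None, np.inf
for i in range(ti.shape[0] - 5):
    for j in range(ti.shape[1] - 5):
        s = sad(event, ti[i:i+5, j:j+5], mask)
        if s < best_score:
            best, best_score = (i, j), s
print("best SAD match at", best, "score", round(best_score, 4))
print("CC at that location:",
      round(normalized_cc(event, ti[best[0]:best[0]+5, best[1]:best[1]+5], mask), 4))
```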
  • Publication
    Open access
    The Traveling Pilot Point method. A novel approach to parameterize the inverse problem for categorical fields
    Categorical parameter distributions are commonplace in hydrogeological systems consisting of geologic facies/categories with distinct properties, e.g., high-permeability channels embedded in a low-permeability matrix. Parameter estimation is difficult in such systems because the discontinuities in the parameter space hinder the inverse problem. Previous research in this area has focused on the use of stochastic methods. In this paper, we present a novel approach based on Traveling Pilot points (TRIPS) combined with subspace parameter estimation methods to generate realistic categorical parameter distributions that honor calibration constraints (e.g., measured water levels). In traditional implementations, aquifer properties (e.g., hydraulic conductivity) are estimated at fixed pilot point locations. In the TRIPS implementation, both the properties associated with the pilot points and their locations are estimated. Tikhonov regularization constraints are incorporated in the parameter estimation process to produce realistic parameter depictions. For a synthetic aquifer system, we solved the categorical inverse problem by combining the TRIPS methodology with two subspace methods: Null Space Monte Carlo (NSMC) and Posterior Covariance (PC). A posterior ensemble developed with the rejection sampling (RS) method is compared against the TRIPS ensembles. The comparisons indicated similarities among the various ensembles and with the reference parameter distribution. Between the two subspace methods, the NSMC method produced an ensemble with more variability than the PC method. These preliminary results suggest that the TRIPS methodology has promise and could be tested on more complicated problems.
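    A minimal sketch of the parameterization idea, under the assumption that a nearest-pilot-point mapping is an acceptable simplification of the paper's categorical rendering (names, dimensions and values are invented):
```python
# Minimal sketch, not the authors' implementation: in a TRIPS-like
# parameterization both the pilot point locations and their categories are
# free parameters; here a categorical field is rendered by assigning each
# grid cell the category of its nearest pilot point (a simplification chosen
# only to illustrate the idea).
import numpy as np

def categorical_field(xy_pilots, cat_pilots, nx=100, ny=80):
    """Map (x, y, category) pilot points to a nearest-neighbour categorical grid."""
    gx, gy = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    cells = np.stack([gx.ravel(), gy.ravel()], axis=1).astype(float)
    d = np.linalg.norm(cells[:, None, :] - xy_pilots[None, :, :], axis=2)
    return cat_pilots[np.argmin(d, axis=1)].reshape(nx, ny)

rng = np.random.default_rng(1)
n_pilots = 12
xy = rng.uniform([0, 0], [100, 80], size=(n_pilots, 2))   # "traveling" locations
cat = rng.integers(0, 2, size=n_pilots)                    # e.g. channel vs. matrix

field = categorical_field(xy, cat)
# An inversion would perturb both xy and cat (subject to regularization)
# until simulated heads match the measured water levels.
print(field.shape, np.bincount(field.ravel()))
```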
  • Publication
    Open access
    Missing data simulation inside flow rate time-series using multiple-point statistics
    The direct sampling (DS) multiple-point statistical technique is proposed as a non-parametric missing data simulator for hydrological flow rate time-series. The algorithm uses the patterns contained in a training data set to reproduce the complexity of the missing data. The proposed setup is tested on the reconstruction of a flow rate time-series under several missing data scenarios, and in a comparative test against an ARMAX time-series model. The results show that DS generates more realistic simulations than ARMAX and better recovers the statistical content of the missing data. The predictive power of both techniques increases considerably when a correlated flow rate time-series is used, but DS can also use incomplete auxiliary time-series with comparable predictive power. This makes the technique a handy simulation tool for practitioners dealing with incomplete data sets.
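    The following Python sketch illustrates a direct-sampling-style gap filler for a 1D series; it is a simplified stand-in for the paper's DS setup, and the window size, threshold and data are illustrative assumptions.
```python
# Minimal sketch of direct-sampling-style gap filling for a 1D series
# (a simplified stand-in for the paper's setup; parameter names and
# values are illustrative assumptions).
import numpy as np

def ds_fill(series, training, half_window=5, threshold=0.05, max_scan=2000, seed=0):
    rng = np.random.default_rng(seed)
    out = series.copy()
    for t in np.flatnonzero(np.isnan(out)):
        lags = np.arange(-half_window, half_window + 1)
        lags = lags[lags != 0]
        known = [(l, out[t + l]) for l in lags
                 if 0 <= t + l < len(out) and not np.isnan(out[t + l])]
        best_val, best_d = training[rng.integers(len(training))], np.inf
        for _ in range(max_scan):
            c = rng.integers(half_window, len(training) - half_window)
            d = np.mean([abs(training[c + l] - v) for l, v in known]) if known else 0.0
            if d < best_d:
                best_val, best_d = training[c], d
            if d <= threshold:          # accept the first close-enough pattern
                break
        out[t] = best_val
    return out

rng = np.random.default_rng(2)
train = np.sin(np.linspace(0, 60, 3000)) + 0.1 * rng.standard_normal(3000)
obs = np.sin(np.linspace(0, 20, 1000)) + 0.1 * rng.standard_normal(1000)
obs[400:450] = np.nan                   # a gap to reconstruct
filled = ds_fill(obs, train)
print(np.isnan(filled).sum(), "missing values left")
```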
  • Publication
    Open access
    Conditioning multiple-point statistics simulations to block data
    Multiple-point statistics (MPS) allows the generation of random fields that reproduce spatial statistics derived from a training image. MPS methods consist in borrowing patterns from the training set; the simulation domain is therefore assumed to be at the same resolution as the conceptual model, although geometrical deformations can be handled by such techniques. Whereas punctual conditioning data corresponding to the scale of the grid node can be easily integrated, accounting for data available at larger scales is challenging. In this paper, we propose an extension of MPS able to deal with block data, i.e. target mean values over subsets of the simulation domain. Our extension is based on the direct sampling algorithm and consists in adding a criterion for the acceptance of the candidate node scanned in the training image, so as to constrain the simulation to block data. Likelihood ratios are used to compare the averages of the simulated variable taken over the informed nodes in the blocks with the target mean values. Moreover, the block data may overlap and their support can be of any shape and size. Illustrative examples show the potential of the presented algorithm for practical applications.
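    To make the extra acceptance step concrete, here is a minimal sketch in which a simple tolerance check on the running block average stands in for the paper's likelihood-ratio criterion (block geometry and numbers are invented):
```python
# Minimal sketch of the extra acceptance test for block data (a simplified
# tolerance check standing in for the paper's likelihood-ratio criterion;
# block geometry and numbers are invented for illustration).
import numpy as np

def block_compatible(candidate, block_values, block_size, target_mean, tol=0.1):
    """Accept the candidate if the average over the informed nodes plus the
    candidate stays within a tolerance that loosens when few nodes are known."""
    informed = np.append(block_values, candidate)
    return abs(informed.mean() - target_mean) <= tol * (block_size / len(informed))

# Example: a block of 25 nodes with target mean 2.0, 6 nodes already simulated.
already = np.array([1.8, 2.3, 2.1, 1.7, 2.4, 2.0])
for cand in (2.1, 5.0):
    ok = block_compatible(cand, already, block_size=25, target_mean=2.0)
    print(f"candidate {cand}: {'accepted' if ok else 'rejected'}")
```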
  • Publication
    Open access
    Integrating aerial geophysical data in multiple-point statistics simulations to assist groundwater flow models
    (2015-10)
    Dickson, Neil
    Comte, Jean-Christophe
    McKinley, Jennifer
    Ofterdinger, Ulrich
    The process of accounting for heterogeneity has made significant advances in statistical research, primarily in the framework of stochastic analysis and the development of multiple-point statistics (MPS). Among MPS techniques, the direct sampling (DS) method is tested to determine its ability to delineate heterogeneity from aerial magnetics data in a regional sandstone aquifer intruded by low-permeability volcanic dykes in Northern Ireland, UK. The use of two two-dimensional bivariate training images aids in creating spatial probability distributions of heterogeneities of hydrogeological interest, despite relatively 'noisy' magnetics data (i.e. including hydrogeologically irrelevant urban noise and regional geologic effects). These distributions are incorporated into a hierarchy system where previously published density function and upscaling methods are applied to derive regional distributions of the equivalent hydraulic conductivity tensor K. Several K models, determined from different stochastic realisations of MPS dyke locations, are computed within groundwater flow models and evaluated by comparing modelled heads with field observations. Results show a significant improvement in model calibration compared to a simplistic homogeneous and isotropic aquifer model that does not account for the dyke occurrence evidenced by airborne magnetic data. The best model is obtained when normal and reverse polarity dykes are computed separately within MPS simulations and when a probability threshold of 0.7 is applied. The presented stochastic approach also provides improvement over a previously published deterministic anisotropic model based on the unprocessed (i.e. noisy) airborne magnetics. This demonstrates the potential of coupling MPS to airborne geophysical data for regional groundwater modelling.
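    A minimal sketch of the ensemble post-processing step mentioned above, where stacked binary realisations give a per-cell dyke probability that is thresholded at 0.7 before assigning hydraulic conductivities (all data here are random placeholders):
```python
# Minimal sketch: several MPS realisations of dyke occurrence are stacked
# into a probability map, and a threshold (0.7 in the paper) turns it into
# a dyke indicator that can then be mapped to hydraulic conductivities.
# The realisations and the conductivity values are placeholders.
import numpy as np

rng = np.random.default_rng(3)
base = rng.random((120, 100))                      # spatially varying dyke probability
realisations = rng.random((50, 120, 100)) < base   # 50 binary dyke realisations
prob_dyke = realisations.mean(axis=0)              # per-cell occurrence probability
dyke = prob_dyke >= 0.7                            # probability threshold

K = np.where(dyke, 1e-9, 1e-5)                     # low-K dyke vs. sandstone aquifer
print("dyke cells:", int(dyke.sum()), "of", dyke.size)
```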
  • Publication
    Open access
    Simulation of braided river elevation model time series with multiple-point statistics
    A new method is proposed to generate successive topographies in a braided river system. Braided river morphology models are a key factor influencing river-aquifer interactions and have repercussions for ecosystems, flood risk and water management. The method is essentially based on multivariate multiple-point statistics simulations with digital elevation models as training data sets. On the one hand, airborne photography and LIDAR acquired at successive time steps have contributed to a better understanding of the geomorphological processes, although the available data are sparse over time and river scales. On the other hand, geostatistics provides simulation tools for multiple and continuous variables, which allow the exploration of the uncertainty of many assumption scenarios. An illustration of the approach demonstrates the ability of multiple-point statistics to produce realistic topographies from the information provided by digital elevation models at two time steps.
  • Publication
    Open access
    A practical guide to performing multiple-point statistical simulations with the Direct Sampling algorithm
    (2013-3)
    Meerschman, Eef
    Van Meirvenne, Marc
    The Direct Sampling (DS) algorithm is a recently developed multiple-point statistical simulation technique. It directly scans the training image (TI) for a given data event instead of storing the training probability values in a catalogue prior to simulation. By using distances between the given data events and the TI patterns, DS allows the simulation of categorical, continuous and multivariate problems. Benefiting from the wide spectrum of potential applications of DS requires an understanding of the user-defined input parameters. Therefore, we list the most important parameters and assess their impact on the generated simulations. Real-case TIs are used, including an image of ice-wedge polygons, a marble slice and snow crystals, all three as continuous and categorical images. We also use a 3D categorical TI representing a block of concrete to demonstrate the capacity of DS to generate 3D simulations. First, a quantitative sensitivity analysis is conducted on the three parameters balancing simulation quality and CPU time: the acceptance threshold t, the fraction of TI to scan f and the number of neighbors n. In addition to a visual inspection of the generated simulations, the performance is analyzed in terms of speed of calculation and quality of pattern reproduction. Whereas decreasing the CPU time by adjusting t and n comes at the expense of simulation quality, reducing the scanned fraction of the TI allows substantial computational gains without degrading the quality as long as the TI contains enough reproducible patterns. We also illustrate the quality improvement resulting from post-processing and the potential of DS to simulate bivariate problems and to honor conditioning data. We provide a comprehensive guide to performing multiple-point statistical simulations with the DS algorithm and recommendations on how to set the input parameters appropriately.
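    The sketch below is an illustrative, much reduced reimplementation of the DS loop intended only to show where the three parameters t, f and n act; it is not the authors' code and omits conditioning, multivariate variables and post-processing.
```python
# Minimal sketch of a DS-style loop and of the three parameters discussed in
# the guide: acceptance threshold t, scanned TI fraction f, number of
# neighbours n. Illustrative only; unconditional, categorical, 2D.
import numpy as np

def ds_simulate(ti, shape, t=0.05, f=0.3, n=20, seed=0):
    rng = np.random.default_rng(seed)
    sim = np.full(shape, np.nan)
    path = rng.permutation(shape[0] * shape[1])          # random simulation path
    max_scan = int(f * ti.shape[0] * ti.shape[1])        # fraction of TI to scan
    for p in path:
        i, j = divmod(int(p), shape[1])
        known = np.argwhere(~np.isnan(sim))
        if len(known) == 0:
            sim[i, j] = ti[rng.integers(ti.shape[0]), rng.integers(ti.shape[1])]
            continue
        d2 = (known[:, 0] - i) ** 2 + (known[:, 1] - j) ** 2
        nb = known[np.argsort(d2)[:n]]                   # n closest informed nodes
        lags = nb - np.array([i, j])
        vals = sim[nb[:, 0], nb[:, 1]]
        best_val, best_d = ti[0, 0], np.inf
        for _ in range(max_scan):
            ci, cj = rng.integers(ti.shape[0]), rng.integers(ti.shape[1])
            pi, pj = ci + lags[:, 0], cj + lags[:, 1]
            ok = (pi >= 0) & (pi < ti.shape[0]) & (pj >= 0) & (pj < ti.shape[1])
            if not ok.any():
                continue
            d = np.mean(ti[pi[ok], pj[ok]] != vals[ok])  # mismatch fraction (categorical)
            if d < best_d:
                best_val, best_d = ti[ci, cj], d
            if d <= t:                                   # acceptance threshold
                break
        sim[i, j] = best_val
    return sim

ti = ((np.indices((80, 80)).sum(axis=0) // 8) % 2).astype(float)  # striped categorical TI
sim = ds_simulate(ti, (40, 40))
print(np.unique(sim))
```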
  • Publication
    Open access
    Parallel Multiple-point Statistics Algorithm Based on List and Tree Structures
    (2013-2)
    Walgenwitz, Alexandre
    Multiple-point statistics are widely used for the simulation of categorical variables because the method allows a conceptual model to be integrated via a training image and complex heterogeneous fields to be simulated. The multiple-point statistics inferred from the training image can be stored in several ways. The tree structure used in classical implementations has the advantage of being efficient in terms of CPU time, but it is very demanding in terms of RAM and therefore imposes limitations on the size of the template, which makes a proper reproduction of complex structures difficult. Another technique consists in storing the multiple-point statistics in lists. This alternative requires much less memory and allows for a straightforward parallel algorithm. Nevertheless, the list structure does not benefit from the shortcuts given by the branches of the tree for retrieving the multiple-point statistics. Hence, a serial algorithm based on a list structure is generally slower than a tree-based algorithm. In this paper, a new approach using both list and tree structures is proposed. The idea is to index the lists by trees of reduced size: the leaves of the tree correspond to distinct sublists that constitute a partition of the entire list. The size of the indexing tree can be controlled, so the resulting algorithm keeps memory requirements low while efficiency in terms of CPU time is significantly improved. Moreover, this new method benefits from the parallelization of the list approach.
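    The following sketch illustrates the indexing idea under a strong simplification: the reduced tree is represented by a dictionary keyed by the first k node values of the template, and each leaf holds the sublist of patterns sharing that prefix.
```python
# Minimal sketch (not the paper's code) of indexing the pattern list by a
# tree of reduced depth: the "tree" is a dict keyed by the first k node
# values of the template; each leaf is the sublist of patterns sharing that
# prefix, so retrieval only visits compatible sublists.
import numpy as np
from collections import defaultdict

def build_indexed_list(ti, template, k=3):
    """Scan the TI and store each data event in the sublist of its k-value prefix."""
    index = defaultdict(list)
    for i in range(ti.shape[0]):
        for j in range(ti.shape[1]):
            pts = [(i + di, j + dj) for di, dj in template]
            if all(0 <= x < ti.shape[0] and 0 <= y < ti.shape[1] for x, y in pts):
                pattern = tuple(int(ti[x, y]) for x, y in pts)
                index[pattern[:k]].append(pattern)
    return index

def count_matches(index, partial, k=3):
    """Count stored patterns compatible with a partially informed data event
    (None = uninformed node); only the relevant sublists are visited."""
    prefixes = [key for key in index
                if all(p is None or p == key[m] for m, p in enumerate(partial[:k]))]
    total = 0
    for key in prefixes:
        for pat in index[key]:
            if all(p is None or p == v for p, v in zip(partial, pat)):
                total += 1
    return total

template = [(0, 0), (0, 1), (1, 0), (0, -1), (-1, 0)]      # small 5-node template
ti = (np.indices((60, 60)).sum(axis=0) // 6) % 2           # categorical TI
idx = build_indexed_list(ti, template)
print(count_matches(idx, (1, None, 1, None, 0)))
```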
  • Publication
    Open access
    Conditioning Facies Simulations with Connectivity Data
    When characterizing and simulating underground reservoirs for flow simulations, one of the key characteristics that needs to be reproduced accurately is connectivity. More precisely, field observations frequently allow the identification of specific points in space that are connected. For example, in hydrogeology, tracer tests are frequently conducted that show which springs are connected to which sinkhole. Similarly, well tests often provide connectivity information in a petroleum reservoir. To account for this type of information, we propose a new algorithm to condition stochastic simulations of lithofacies to connectivity information. The algorithm is based on the multiple-point philosophy but does not necessarily imply the use of multiple-point simulation. The challenge lies in generating realizations, for example of a binary medium, such that the connectivity information is honored as well as any prior structural information (e.g. as modeled through a training image). The algorithm consists in using a training image to build a set of replicates of connected paths that are consistent with the prior model. This is done by scanning the training image to find point locations that satisfy the constraints. Any path (a string of connected cells) between these points is therefore consistent with the prior model. For each simulation, one path is sampled from this set to generate hard conditioning data prior to running the simulation algorithm. The paper presents the algorithm in detail, with examples of two-dimensional and three-dimensional applications with multiple-point simulations.
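    A minimal sketch of the core step, assuming a binary training image and a shortest-path (BFS) extraction as one way of drawing a connected path between two locations:
```python
# Minimal sketch: find a connected path between two locations of the
# connected facies in a binary training image and use the path cells as
# hard conditioning data. Illustrative only, with an invented training image;
# BFS is used here as one simple way of extracting a path.
import numpy as np
from collections import deque

def connected_path(ti, start, end):
    """Shortest 4-connected path through facies-1 cells, or None."""
    prev, q = {start: None}, deque([start])
    while q:
        cur = q.popleft()
        if cur == end:
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        i, j = cur
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if (0 <= ni < ti.shape[0] and 0 <= nj < ti.shape[1]
                    and ti[ni, nj] == 1 and (ni, nj) not in prev):
                prev[(ni, nj)] = cur
                q.append((ni, nj))
    return None

ti = np.zeros((50, 50), dtype=int)
ti[10:40, 5] = 1
ti[25, 5:45] = 1                           # a channel connecting two regions
path = connected_path(ti, (12, 5), (25, 40))
hard_data = [(i, j, 1) for i, j in path]   # conditioning points for the simulation
print(len(hard_data), "hard data cells along the sampled path")
```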
  • Publication
    Open access
    Three-dimensional high resolution fluvio-glacial aquifer analog - Part 2: Geostatistical modeling
    (2011-7-21)
    Comunian, Alessandro
    Bayer, Peter
    The heterogeneity of sedimentary structures at the decimeter scale is crucial to the understanding of groundwater flow and transport. In a series of two papers, we provide a detailed analysis of a fluvio-glacial aquifer analog: the Herten site. The geological data along a series of 2D sections in a quarry, the corresponding GPR measurements, and their sedimentological interpretation are described in the companion paper. In this paper, we focus on the three-dimensional reconstruction of the heterogeneity. The resulting numerical model is provided as electronic supplementary material for further studies. Furthermore, the geostatistical parameters derived from this analysis and the methodology described in the paper could be used in the future for the simulation of similar deposits where less data are available. To build the 3D model, we propose a hierarchical simulation method which integrates various geostatistical techniques. First, we model the subdivision of the domain into regions corresponding to the main sedimentological structures (e.g. a sedimentation event). Within these volumes, we use multiple-point statistics to describe the internal heterogeneity. What is unusual here is that we do not try to use a complex training image for the multiple-point algorithm accounting for all the non-stationarity and complexity, but instead use a simple conceptual model of heterogeneity (ellipsoidal shapes as a training image) and constrain the multiple-point simulations within the regions by a detailed interpolation of orientation data derived from the 2D sections. This method produces realistic geological structures. The analysis of the flow and transport properties (hydraulic conductivity and tracer breakthrough curves) of the resulting model shows that it is closer to the properties estimated directly from the 2D geological observations than to those estimated from a model of heterogeneity based on transition probabilities that does not include the modeling of the large-scale structures.
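    As a toy illustration of the hierarchical idea (not the authors' workflow), the sketch below first imposes a trivial layered zonation in place of the sedimentological regions and then fills each region independently with placeholder internal heterogeneity:
```python
# Minimal sketch of the hierarchical idea: the domain is first split into
# regions (a trivial layered zonation stands in for the sedimentological
# structures), and each region is then filled independently with its own
# internal heterogeneity (random placeholder values stand in for the
# orientation-constrained MPS step of the paper).
import numpy as np

rng = np.random.default_rng(4)
nx, ny, nz = 60, 40, 20
zones = np.zeros((nx, ny, nz), dtype=int)
zones[:, :, 7:14] = 1                     # level 1: three "sedimentation events"
zones[:, :, 14:] = 2

facies = np.empty_like(zones)
facies_per_zone = {0: [0, 1], 1: [2, 3, 4], 2: [1, 4]}   # allowed facies per region
for z, allowed in facies_per_zone.items():
    mask = zones == z                     # level 2: internal heterogeneity per region
    facies[mask] = rng.choice(allowed, size=mask.sum())
print(np.unique(facies, return_counts=True))
```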