Showing items 1 - 10 of 32
  • Publication
    Open access
    Preconditioners for the conjugate gradient algorithm using Gram–Schmidt and least squares methods
    This paper is devoted to the study of some preconditioners for the conjugate gradient algorithm used to solve large sparse symmetric positive definite linear systems. The construction of a preconditioner based on the Gram–Schmidt orthogonalization process and the least squares method is presented. Some results on the condition number of the preconditioned system are provided. Finally, numerical comparisons are given for different preconditioners.
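The preconditioned conjugate gradient iteration this abstract refers to can be sketched in a few lines. The diagonal (Jacobi) preconditioner below is a simple stand-in, since the abstract does not detail the paper's Gram–Schmidt/least-squares construction:

```python
import numpy as np

def pcg(A, b, apply_M_inv, tol=1e-10, max_iter=100):
    """Preconditioned conjugate gradient for a symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    z = apply_M_inv(r)            # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Jacobi (diagonal) preconditioner as a hypothetical stand-in for the
# paper's Gram-Schmidt / least-squares construction.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
d = np.diag(A)
x = pcg(A, b, lambda v: v / d)
```

A better preconditioner reduces the condition number of the preconditioned system and hence the number of iterations; the abstract's numerical comparisons address exactly that trade-off.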
  • Publication
    Open access
    Three-dimensional high resolution fluvio-glacial aquifer analog - Part 2: Geostatistical modeling
    (2011-7-21)
    Comunian, Alessandro
    ;
    Bayer, Peter
    The heterogeneity of sedimentary structures at the decimeter scale is crucial to the understanding of groundwater flow and transport. In a series of two papers, we provide a detailed analysis of a fluvio-glacial aquifer analog: the Herten site. The geological data along a series of 2D sections in a quarry, the corresponding GPR measurements, and their sedimentological interpretation are described in the companion paper. In this paper, we focus on the three-dimensional reconstruction of the heterogeneity. The resulting numerical model is provided as electronic supplementary material for further studies. Furthermore, the geostatistical parameters derived from this analysis and the methodology described in the paper could be used in the future for the simulation of similar deposits where less data are available. To build the 3D model, we propose a hierarchical simulation method which integrates various geostatistical techniques. First, we model the subdivision of the domain into regions corresponding to the main sedimentological structures (e.g. a sedimentation event). Within these volumes, we use multiple-point statistics to describe the internal heterogeneity. What is unusual here is that we do not try to use a complex training image accounting for all the non-stationarity and complexity in the multiple-point algorithm, but instead use a simple conceptual model of heterogeneity (ellipsoidal shapes as a training image) and constrain the multiple-point simulations within the regions by a detailed interpolation of orientation data derived from the 2D sections. This method produces realistic geological structures.
The analysis of the flow and transport properties (hydraulic conductivity and tracer breakthrough curves) of the resulting model shows that they are closer to the properties estimated directly from the 2D geological observations than to those estimated from a model of heterogeneity based on transition probabilities that does not include the modeling of the large-scale structures.
  • Publication
    Open access
    A practical guide to performing multiple-point statistical simulations with the Direct Sampling algorithm
    (2013-3)
    Meerschman, Eef
    ;
    Van Meirvenne, Marc
    The Direct Sampling (DS) algorithm is a recently developed multiple-point statistical simulation technique. It directly scans the training image (TI) for a given data event instead of storing the training probability values in a catalogue prior to simulation. By using distances between the given data events and the TI patterns, DS makes it possible to simulate categorical, continuous and multivariate problems. Benefiting from the wide spectrum of potential applications of DS requires an understanding of the user-defined input parameters. Therefore, we list the most important parameters and assess their impact on the generated simulations. Real-case TIs are used, including an image of ice-wedge polygons, a marble slice and snow crystals, all three as continuous and categorical images. We also use a 3D categorical TI representing a block of concrete to demonstrate the capacity of DS to generate 3D simulations. First, a quantitative sensitivity analysis is conducted on the three parameters balancing simulation quality and CPU time: the acceptance threshold t, the fraction of the TI to scan f and the number of neighbors n. In addition to a visual inspection of the generated simulations, the performance is analyzed in terms of calculation speed and quality of pattern reproduction. Whereas decreasing the CPU time through t and n comes at the expense of simulation quality, reducing the scanned fraction of the TI allows substantial computational gains without degrading the quality, as long as the TI contains enough reproducible patterns. We also illustrate the quality improvement resulting from post-processing and the potential of DS to simulate bivariate problems and to honor conditioning data. We provide a comprehensive guide to performing multiple-point statistical simulations with the DS algorithm, with recommendations on how to set the input parameters appropriately.
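The roles of t, f and n can be illustrated with a deliberately simplified one-dimensional categorical sketch of the scan-and-accept logic (a toy illustration of the idea, not the actual DS implementation):

```python
import random

def direct_sampling_1d(ti, n_cells, t=0.1, f=0.5, n=2, seed=0):
    """Toy 1D categorical Direct Sampling sketch.

    t: acceptance threshold on the pattern mismatch fraction
    f: fraction of the training image (ti) scanned per node
    n: number of already-simulated neighbors forming the data event
    """
    rng = random.Random(seed)
    sim = [None] * n_cells
    path = list(range(n_cells))
    rng.shuffle(path)                        # random simulation path
    for cell in path:
        # data event: offsets and values of up to n nearest informed cells
        informed = sorted((j for j in range(n_cells) if sim[j] is not None),
                          key=lambda j: abs(j - cell))
        event = [(j - cell, sim[j]) for j in informed[:n]]
        value = rng.choice(ti)               # fallback if nothing matches
        # scan a random fraction f of the TI; accept the first pattern within t
        for pos in rng.sample(range(len(ti)), max(1, int(f * len(ti)))):
            pairs = [(ti[pos + off] if 0 <= pos + off < len(ti) else None, v)
                     for off, v in event]
            mismatch = sum(1 for tv, v in pairs if tv != v) / max(1, len(pairs))
            if mismatch <= t:
                value = ti[pos]
                break
        sim[cell] = value
    return sim

# toy binary training image: two facies in contiguous bands
sim = direct_sampling_1d([0] * 10 + [1] * 10, 20)
```

Raising t accepts poorer matches sooner (faster, lower quality), while lowering f scans less of the TI per node; this mirrors the trade-offs the sensitivity analysis in the abstract quantifies.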
  • Publication
    Open access
    Parallel Multiple-point Statistics Algorithm Based on List and Tree Structures
    (2013-2)
    Walgenwitz, Alexandre
    Multiple-point statistics are widely used for the simulation of categorical variables because the method allows for integrating a conceptual model via a training image and then simulating complex heterogeneous fields. The multiple-point statistics inferred from the training image can be stored in several ways. The tree structure used in classical implementations has the advantage of being efficient in terms of CPU time, but is very RAM demanding and thus imposes limitations on the size of the template, which makes a proper reproduction of complex structures difficult. Another technique consists in storing the multiple-point statistics in lists. This alternative requires much less memory and allows for a straightforward parallel algorithm. Nevertheless, the list structure does not benefit from the shortcuts given by the branches of the tree for retrieving the multiple-point statistics. Hence, a serial algorithm based on the list structure is generally slower than a tree-based algorithm. In this paper, a new approach using both list and tree structures is proposed. The idea is to index the lists by trees of reduced size: the leaves of the tree correspond to distinct sublists that constitute a partition of the entire list. The size of the indexing tree can be controlled, so the resulting algorithm keeps memory requirements low while efficiency in terms of CPU time is significantly improved. Moreover, this new method benefits from the parallelization of the list approach.
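The indexing idea can be pictured with a small sketch in which a dictionary keyed on a truncated data event plays the role of the reduced indexing tree, and each key maps to the sublist partitioning the full list (names and the flat-dictionary index are illustrative simplifications):

```python
from collections import defaultdict

def build_indexed_list(events, depth=2):
    """Partition a list of (data_event, facies) pairs into sublists.

    The key is the first `depth` template values of each data event; in the
    paper this role is played by a tree of reduced, controllable size.
    """
    index = defaultdict(list)
    for event, facies in events:
        index[event[:depth]].append((event, facies))
    return index

def facies_counts(index, event, depth=2):
    """Retrieve occurrence counters for a data event: jump to the sublist
    via the index, then scan only that sublist instead of the whole list."""
    counts = defaultdict(int)
    for stored_event, facies in index.get(event[:depth], []):
        if stored_event == event:
            counts[facies] += 1
    return dict(counts)

# toy template of 3 nodes, two facies 'A'/'B'
events = [((0, 1, 0), 'A'), ((0, 1, 1), 'B'), ((0, 1, 0), 'A'), ((1, 0, 0), 'B')]
index = build_indexed_list(events)
counts = facies_counts(index, (0, 1, 0))
```

Because each sublist is independent, retrieval (and simulation) can proceed over sublists in parallel, which is the property the paper exploits.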
  • Publication
    Metadata only
    Conditioning of Multiple-Point Statistics Facies Simulations to Tomographic Images
    (2014-7)
    Lochbühler, Tobias
    ;
    Linde, Niklas
    Geophysical tomography captures the spatial distribution of the underlying geophysical property at a relatively high resolution, but the tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may cause significant bias when taken as a basis for predictive flow and transport modeling and are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms are used as a training image. The training image is scanned with a direct sampling algorithm for patterns in the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predict the radar traveltimes within the expected data error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and on an aquifer analog of alluvial sedimentary structures with five facies. For both cases, MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared to unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement over other approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for: (1) reproduction of sharp interfaces, (2) incorporation of realistic geological constraints and (3) generation of multiple realizations, enabling uncertainty assessment.
  • Publication
    Open access
    Conditioning multiple-point statistics simulations to block data
    Multiple-point statistics (MPS) makes it possible to generate random fields reproducing spatial statistics derived from a training image. MPS methods consist in borrowing patterns from the training set. Therefore, the simulation domain is assumed to be at the same resolution as the conceptual model, although geometrical deformations can be handled by such techniques. Whereas punctual conditioning data corresponding to the scale of the grid node can easily be integrated, accounting for data available at larger scales is challenging. In this paper, we propose an extension of MPS able to deal with block data, i.e. target mean values over subsets of the simulation domain. Our extension is based on the direct sampling algorithm and consists in adding a criterion for the acceptance of the candidate node scanned in the training image to constrain the simulation to block data. Likelihood ratios are used to compare the averages of the simulated variable taken over the informed nodes in the blocks with the target mean values. Moreover, the block data may overlap, and their support can be of any shape and size. Illustrative examples show the potential of the presented algorithm for practical applications.
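The extra acceptance criterion for candidate nodes can be pictured as follows; for simplicity this sketch uses an absolute-difference test on the running block mean, a stand-in for the likelihood ratios used in the paper, and all names are illustrative:

```python
def block_accepted(candidate, cell, blocks, sim, tol=0.05):
    """Check a candidate value against target block means.

    blocks: list of (cells, target_mean). For each block containing `cell`,
    the mean over informed nodes plus the candidate must stay within `tol`
    of the target mean. Overlapping blocks are handled naturally, since
    every block containing the cell is checked.
    """
    for cells, target in blocks:
        if cell not in cells:
            continue
        values = [v for c, v in sim.items() if c in cells and c != cell]
        values.append(candidate)
        if abs(sum(values) / len(values) - target) > tol:
            return False
    return True

sim = {0: 0.2, 1: 0.3}            # already-simulated nodes
blocks = [({0, 1, 2}, 0.3)]       # one block over cells {0, 1, 2}, target mean 0.3
ok = block_accepted(0.4, 2, blocks, sim)    # mean(0.2, 0.3, 0.4) = 0.3
bad = block_accepted(1.0, 2, blocks, sim)   # mean 0.5, far from target
```

In the full algorithm this test is applied to the value found at the candidate TI node, rejecting candidates that would drive a block away from its target mean.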
  • Publication
    Open access
    Conditioning Multiple-Point Statistics Simulation to Inequality Data
    Stochastic modeling is often employed in environmental sciences for the analysis and understanding of complex systems. For example, random fields are key components in uncertainty analysis or Bayesian inverse modeling. Multiple-point statistics (MPS) provides efficient tools for simulating fields that reproduce the spatial statistics depicted in a training image (TI), while accounting for local or block conditioning data. Among MPS methods, the direct sampling algorithm is a flexible pixel-based technique that consists in first assigning the conditioning data values (so-called hard data) in the simulation grid, and then populating the rest of the simulation domain in a random order by successively pasting a value from a TI cell sharing a similar pattern. In this study, an extension of the direct sampling method is proposed to account for inequality data, that is, constraints in given cells consisting of lower and/or upper bounds for the simulated values. Indeed, inequality data are often available in practice. The new approach involves adapting the distance used to compare and evaluate the match between two patterns to account for such constraints. The proposed method, implemented in the DeeSse code, allows generating random fields both reflecting the spatial statistics of the TI and honoring the inequality constraints. Finally, examples of topography simulations illustrate the capabilities of the proposed method.
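One way to picture the adapted distance is a per-node mismatch that is zero whenever the TI value satisfies the inequality constraint and grows with the violation otherwise. This is an illustrative sketch of the idea, not the exact formula implemented in DeeSse:

```python
def node_mismatch(ti_value, node):
    """Per-node contribution to the pattern distance.

    node is either ('hard', value) for ordinary hard data, or
    ('ineq', lower, upper) for inequality data, where lower/upper may be
    None for a one-sided bound.
    """
    kind = node[0]
    if kind == 'hard':                     # plain hard-data distance
        return abs(ti_value - node[1])
    lower, upper = node[1], node[2]
    if lower is not None and ti_value < lower:
        return lower - ti_value            # penalize by the shortfall
    if upper is not None and ti_value > upper:
        return ti_value - upper            # penalize by the excess
    return 0.0                             # value honors the inequality

inside = node_mismatch(2.0, ('ineq', 1.0, 3.0))   # within both bounds
below = node_mismatch(0.5, ('ineq', 1.0, None))   # violates lower bound by 0.5
hard = node_mismatch(2.0, ('hard', 1.5))          # ordinary hard-data case
```

Summing such contributions over the data event gives a pattern distance that treats an in-bounds TI value as a perfect match at that node, which is what lets the simulation honor inequality constraints.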
  • Publication
    Metadata only
    Preconditioners for the conjugate gradient algorithm using Gram-Schmidt and least squares methods
    This paper is devoted to the study of some preconditioners for the conjugate gradient algorithm used to solve large sparse symmetric positive definite linear systems. The construction of a preconditioner based on the Gram-Schmidt orthogonalization process and the least squares method is presented. Some results on the condition number of the preconditioned system are provided. Finally, numerical comparisons are given for different preconditioners.
  • Publication
    Open access
    3D multiple-point statistics simulation using 2D training images
    (2012-3)
    Comunian, Alessandro
    One of the main issues in the application of multiple-point statistics (MPS) to the simulation of three-dimensional (3D) blocks is the lack of a suitable 3D training image. In this work, we compare three methods of overcoming this issue using information coming from two-dimensional (2D) training images. One approach is based on the aggregation of probabilities. The other approaches are novel. One relies on merging the lists obtained using the impala algorithm from diverse 2D training images, creating a list of compatible data events that is then used for the MPS simulation. The other (s2Dcd) is based on sequential simulations of 2D slices constrained by the conditioning data computed at the previous simulation steps. These three methods are tested on the reproduction of two 3D images that are used as references, and on a real case study where two training images of sedimentary structures are considered. The tests show that it is possible to obtain 3D MPS simulations with at least two 2D training images. The simulations obtained, in particular those obtained with the s2Dcd method, are close to the references according to a number of comparison criteria. The CPU time required to simulate with the s2Dcd method is two to four orders of magnitude smaller than that required by an MPS simulation performed using a 3D training image, while the results obtained are comparable. This computational efficiency and the possibility of using MPS for 3D simulation without the need for a 3D training image facilitate the inclusion of MPS in Monte Carlo, uncertainty evaluation, and stochastic inverse problem frameworks.
  • Publication
    Open access
    Optimisation issues in 3D multiple-point statistics simulation
    (Julián M. Ortiz and Xavier Emery, Mining Engineering Department, University of Chile, 2008-12)
    Walgenwitz, Alexandre
    ;
    Froidevaux, Roland
    Multiple-point statistics simulation has gained wide acceptance in recent years and is routinely used for simulating geological heterogeneity in hydrocarbon reservoirs and aquifers. In classical implementations, the multiple-point statistics inferred from the reference training image are stored in a dynamic data structure called a search tree. The size of this search tree depends on the search template used to scan the training image and on the number of facies to be simulated. In 3D applications this size can become prohibitive. One promising avenue for drastically reducing the RAM requirements consists of using dynamically allocated lists instead of search trees to store and retrieve the multiple-point statistics. Each element of this list contains the identification of the data event together with occurrence counters for each facies. First results show that implementing this list-based approach reduces RAM requirements by a factor of 10 or more. The paper discusses this novel list-based approach in detail and presents RAM and CPU performance comparisons with the classical tree-based approach.