3D Geological Image Synthesis from 2D Examples Using Generative Adversarial Networks

Author(s)
Coiffier, Guillaume
Renard, Philippe 
Centre d'hydrogéologie et de géothermie 
Lefebvre, Sylvain
Publication date
2020-10
In
Frontiers in Water
No.
2
From page
598
To page
612
Peer-reviewed
Yes
Keywords
  • geology
  • heterogeneity
  • stochastic model
  • groundwater
  • generative adversarial network
  • deep learning
Abstract
Generative Adversarial Networks (GANs) are becoming an alternative to Multiple-point Statistics (MPS) techniques for generating stochastic fields from training images. However, a difficulty shared by all training-image-based techniques (including GANs and MPS) is generating 3D fields when only 2D training data sets are available. In this paper, we introduce a novel approach called Dimension Augmenter GAN (DiAGAN) that enables GANs to generate 3D fields from 2D examples. The method is simple to implement and is based on the introduction of a random cut sampling step between the generator and the discriminator of a standard GAN. Numerical experiments show that the proposed approach provides an efficient solution to this long-standing problem.
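
The core idea described in the abstract is the random cut sampling step placed between a 3D generator and an ordinary 2D discriminator. The following is a minimal PyTorch sketch of that idea only, not the authors' implementation: it assumes a cubic generated volume of shape (batch, channels, D, H, W), and the function name, shapes, and use of torch are illustrative assumptions.

import torch

def random_orthogonal_cuts(volume, n_cuts=1):
    # volume: (batch, channels, D, H, W) output of a 3D generator.
    # Assumes a cubic volume (D == H == W) so all cuts have the same size.
    b, c, d, h, w = volume.shape
    cuts = []
    for _ in range(n_cuts):
        axis = torch.randint(0, 3, (1,)).item()      # choose a cutting direction
        if axis == 0:                                # slice perpendicular to depth
            i = torch.randint(0, d, (1,)).item()
            cuts.append(volume[:, :, i, :, :])
        elif axis == 1:                              # slice perpendicular to height
            i = torch.randint(0, h, (1,)).item()
            cuts.append(volume[:, :, :, i, :])
        else:                                        # slice perpendicular to width
            i = torch.randint(0, w, (1,)).item()
            cuts.append(volume[:, :, :, :, i])
    # Stack the cuts into a batch of 2D images for a standard 2D discriminator.
    return torch.cat(cuts, dim=0)

In such a setup the 2D discriminator scores these random cuts against real 2D training images, so its gradients still flow back through the cut into the 3D generator.
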
Identifiers
https://libra.unine.ch/handle/123456789/30430
DOI: 10.3389/frwa.2020.560598
Publication type
journal article
File(s) to download
 main article: 2023-01-17_110_9534.pdf (3.73 MB)