Investigating Hyperparameter Optimization and Transferability for ES-HyperNEAT: A TPE Approach

Author(s)
Claret, Romain 
Institut du management de l'information 
O'Neill, Michael
Cotofrei, Paul 
Institut du management de l'information 
Stoffel, Kilian 
Institut du management de l'information 
Publication date
2024-08
In
Proceedings of the Genetic and Evolutionary Computation Conference Companion
Vol.
24
From page
1879
To page
1887
Peer reviewed
true
Abstract
Neuroevolution of Augmenting Topologies (NEAT) and its advanced version, Evolvable-Substrate HyperNEAT (ES-HyperNEAT), have shown great potential in developing neural networks. However, their effectiveness heavily depends on the selection of hyperparameters. This study investigates the optimization of ES-HyperNEAT hyperparameters using the Tree-structured Parzen Estimator (TPE) on the MNIST classification task, exploring a search space of over 3 billion potential combinations. TPE effectively navigates this vast space, significantly outperforming random search in terms of mean, median, and best accuracy. During the validation process, the best hyperparameter configuration found by TPE achieves an accuracy of 29.00% on MNIST, surpassing previous studies while using a smaller population size and fewer generations. The transferability of the optimized hyperparameters is explored in logic operations and Fashion-MNIST tasks, revealing successful transfer to the more complex Fashion-MNIST problem but limited to simpler logic operations. This study emphasizes a method to unlock the full potential of neuroevolutionary algorithms and provides insights into the hyperparameters’ transferability across tasks of varying complexity.
Event name
Genetic and Evolutionary Computation Conference (GECCO ’24)
Location
Melbourne, VIC, Australia
Identifiers
https://libra.unine.ch/handle/123456789/32945
https://doi.org/10.1145/3638530.3664144
Publication type
conference paper
File(s) to download
Main article: Investigating Hyperparameter Optimization and Transferability for ES-HyperNEAT: A TPE Approach (885.51 KB)