Investigating Hyperparameter Optimization and Transferability for ES-HyperNEAT: A TPE Approach
Date issued
August 2024
In
Proceedings of the Genetic and Evolutionary Computation Conference Companion
Vol
24
Pages
1879–1887
Peer reviewed
Yes
Abstract
NeuroEvolution of Augmenting Topologies (NEAT) and its extension, Evolvable-Substrate HyperNEAT (ES-HyperNEAT), have shown great potential for evolving neural networks. However, their effectiveness depends heavily on the choice of hyperparameters. This study investigates the optimization of ES-HyperNEAT hyperparameters with the Tree-structured Parzen Estimator (TPE) on the MNIST classification task, exploring a search space of over 3 billion potential combinations. TPE navigates this vast space effectively, significantly outperforming random search in mean, median, and best accuracy. During validation, the best hyperparameter configuration found by TPE achieves an accuracy of 29.00% on MNIST, surpassing previous studies while using a smaller population size and fewer generations. The transferability of the optimized hyperparameters is explored on logic-operation and Fashion-MNIST tasks, revealing successful transfer to the more complex Fashion-MNIST problem but only limited transfer to the simpler logic operations. This study demonstrates a method for unlocking the full potential of neuroevolutionary algorithms and provides insights into the transferability of hyperparameters across tasks of varying complexity.
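As a concrete illustration of the optimization loop the abstract describes, the sketch below runs a Tree-structured Parzen Estimator search using Optuna's `TPESampler`. The hyperparameter names, ranges, and the `evaluate_es_hyperneat` stub are illustrative assumptions, not the paper's actual search space or implementation.

```python
import random

import optuna
from optuna.samplers import TPESampler


def evaluate_es_hyperneat(params: dict) -> float:
    """Hypothetical stand-in for one ES-HyperNEAT run on MNIST.

    A real implementation would evolve a network with `params` and return
    validation accuracy; a random score keeps this sketch runnable.
    """
    return random.random()


def objective(trial: optuna.Trial) -> float:
    # Assumed ES-HyperNEAT-style hyperparameters; names and ranges are
    # placeholders, not the configuration space used in the paper.
    params = {
        "initial_depth": trial.suggest_int("initial_depth", 0, 3),
        "max_depth": trial.suggest_int("max_depth", 1, 4),
        "variance_threshold": trial.suggest_float("variance_threshold", 0.01, 0.5),
        "band_threshold": trial.suggest_float("band_threshold", 0.01, 0.5),
        "activation": trial.suggest_categorical("activation", ["sigmoid", "tanh", "relu"]),
    }
    return evaluate_es_hyperneat(params)


# TPE models the densities of good and bad trials observed so far and
# proposes new configurations with high expected improvement.
study = optuna.create_study(direction="maximize", sampler=TPESampler(seed=42))
study.optimize(objective, n_trials=50)
print(study.best_params, study.best_value)
```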
Event name
Genetic and Evolutionary Computation Conference (GECCO ’24)
Location
Melbourne, VIC, Australia
Publication type
conference paper
File(s)
Name
Investigating Hyperparameter Optimization and Transferability for ES-HyperNEAT: A TPE Approach
Type
Main Article
Size
885.51 KB
Format
Unknown
