Abstract
The pursuit of optimal neural network architectures has surged in popularity alongside the growth of modern deep learning research. This Ph.D. thesis
presents a comprehensive analysis and exploration of Neural Architecture
Search (NAS) using Evolutionary Algorithms (EAs). It introduces two novel
methods: the evolution of custom activation functions with Grammatical
Evolution, and an Ecological Neural Architecture Search (ENAS) for evolving
the evolutionary hyperparameters in tandem with the neural network’s
architecture. The analysis of these methods show that they outperform
standard neural network architectures as well as some of the most prominent
machine learning algorithms to date.
To complement the experimental evaluations of these novel methods, this
thesis also describes a new Python framework for implementing evolutionary
algorithm inspired NAS. This framework aims to strike a balance between
performance, versatility and simplicity, allowing developers to optimise their
neural network’s architecture with ease.
In summary, this thesis makes a significant contribution to the research on
Neural Architecture Search (NAS) and Evolutionary algorithm inspired NAS
(EA-NAS) by demonstrating the effectiveness of a task-specific approach to
neural network architecture design. It underscores the importance of this
endeavour by offering developers a Python framework to independently test
and optimise their own neural networks to achieve superior performance on
their own datasets.
| Date of Award | 2025 |
|---|---|
| Original language | English |
| Sponsors | Bangor University & Artificial Intelligence, Machine Learning and Advanced Computing (AIMLAC) CDT |
| Supervisor | Franck Vidal (Supervisor) & William Teahan (Supervisor) |
Keywords
- Grammatical Evolution
- neural networks
- NEAT
- Doctor of Philosophy (PhD)
- ENAS
- Neuvo
- GEAF