alphaNEAT is a flexible, experimental NEAT implementation written in Java. It is inspired by Kenneth O. Stanley's original C++ NEAT implementation, Colin Green's SharpNEAT, and the evo-NEAT Java implementation by vishnugh.
NEAT stands for NeuroEvolution of Augmenting Topologies. It is a neuroevolution algorithm that searches for a neural network suited to a given problem, using a genetic algorithm to drive the search. Most importantly, NEAT does not require users to provide the topology of the network to optimize; instead, it evolves network topologies through a complexification process and adapts the classic genetic operators to work across differing topologies. For an in-depth description, consult the original paper.
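One of NEAT's core ideas is the use of historical markings: every new connection gene receives a global innovation number, which lets two independently evolved genomes be aligned gene-by-gene during crossover. The sketch below is a minimal, hypothetical illustration of that scheme (it is not alphaNEAT's actual API; all class and method names here are invented for the example):

```java
import java.util.*;

// Minimal sketch of NEAT's historical markings (hypothetical code, not
// alphaNEAT's API): innovation numbers align genes across genomes.
public class InnovationDemo {
    record Gene(int innovation, int from, int to, double weight) {}

    // Matching genes (same innovation number) are inherited from either
    // parent at random; disjoint/excess genes come from the fitter parent,
    // as described in the NEAT paper.
    static List<Gene> crossover(List<Gene> fitter, List<Gene> other) {
        Map<Integer, Gene> otherByInnov = new HashMap<>();
        for (Gene g : other) otherByInnov.put(g.innovation(), g);
        List<Gene> child = new ArrayList<>();
        Random rng = new Random(42);
        for (Gene g : fitter) {
            Gene match = otherByInnov.get(g.innovation());
            child.add(match != null && rng.nextBoolean() ? match : g);
        }
        return child;
    }

    public static void main(String[] args) {
        List<Gene> a = List.of(new Gene(1, 0, 2, 0.5),
                               new Gene(2, 1, 2, -0.3),
                               new Gene(4, 0, 3, 0.9));
        List<Gene> b = List.of(new Gene(1, 0, 2, 0.1),
                               new Gene(3, 2, 3, 0.7));
        // The child carries exactly the fitter parent's gene structure
        // (innovations 1, 2, 4), with matching genes drawn from either parent.
        for (Gene g : crossover(a, b)) System.out.println(g.innovation());
    }
}
```

Because the child's structure follows the fitter parent, its innovation numbers are deterministic even though matching genes are picked at random.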
- Externally defined algorithm parameters as a Java parameters file (parameter descriptions are in the `NEATConfig` class).
- Centralized RNG (random number generator) using the apache-commons-rng package.
- Support for resuming interrupted evolution through Java's object serialization/deserialization mechanism.
- Support for concurrent evaluation of networks across multiple threads.
- Continuous computation of evolution statistics and support for saving stats in CSV format.
- Possibility to start evolution with disconnected input neurons to force feature selection.
- Multi-interrupting add-node mutation: adding a node disables an existing link and inserts a new node plus two new links in its place. If the disabled link later becomes enabled again, it can be re-interrupted by a different node and a different pair of links.
- Activation mutation operator (Experimental): mutate nodes' activation functions to one of the allowed functions.
- Link reorientation mutation (Experimental): mutate a network by reorienting one of the links.
- Link filtering: impose restrictions on the proportions of each link type (loops, recurrent).
- Phased search: allows transitioning to a simplification phase when mean complexity surpasses a given threshold.
- Species phased search: only species with high mean complexity transition to simplification.
- (During simplification, the node-deletion and link-deletion operators take effect.)
- Dead-end (dangling) nodes repair mechanism.
- Simple API.
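The add-node mutation mentioned in the feature list can be illustrated with a small, hypothetical sketch (these classes are invented for the example and are not alphaNEAT's actual types). The key detail is that the interrupted link is only disabled, never deleted, so a later mutation can re-enable it:

```java
import java.util.*;

// Hypothetical sketch of the classic add-node mutation (not alphaNEAT's
// actual classes): disable a link and splice in a node with two new links.
public class AddNodeDemo {
    static class Link {
        int from, to;
        double weight;
        boolean enabled = true;
        Link(int from, int to, double weight) {
            this.from = from; this.to = to; this.weight = weight;
        }
    }

    // Interrupt `target` with a new node `nextNodeId`; returns the two
    // replacement links. The in-link gets weight 1 and the out-link keeps
    // the old weight, so the network's behavior is initially preserved.
    static List<Link> addNode(Link target, int nextNodeId) {
        target.enabled = false;                                    // disable, don't delete
        Link in  = new Link(target.from, nextNodeId, 1.0);
        Link out = new Link(nextNodeId, target.to, target.weight);
        return List.of(in, out);
    }

    public static void main(String[] args) {
        Link l = new Link(0, 1, 0.8);
        List<Link> fresh = addNode(l, 2);
        System.out.println(l.enabled);                                        // false
        System.out.println(fresh.get(0).from + "->" + fresh.get(0).to);      // 0->2
        System.out.println(fresh.get(1).from + "->" + fresh.get(1).to);      // 2->1
    }
}
```

Since the old link survives in a disabled state, alphaNEAT's multi-interrupting variant can interrupt it again with another node if it is ever re-enabled.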
Precompiled JARs with all dependencies are available on the releases page.
The following is a simple usage example for the classic XOR problem domain. The evalXOR method is located in XORExample.java. This snippet runs NEAT evolution for 1000 generations using the provided configuration. Sample configurations for the XOR example are given in xor\xorConfigs.cfg.
```java
import encoding.Genome;
import engine.ANEAT;
import examples.XORExample;

public class Example {

    public static void main(String[] args) {
        String configPath = "path/to/neat/config/file";
        int generations = 1000;

        ANEAT aneat = new ANEAT(configPath);
        aneat.run(XORExample::evalXOR, generations, null);

        // Retrieve the best genome found during evolution
        Genome bestGenome = aneat.getBestGenome();
    }
}
```
For more details concerning saving/resuming evolution, please consult XORExample.java.
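Resuming relies on Java's standard object serialization, so the mechanism can be illustrated generically. The `Checkpoint` class below is hypothetical (alphaNEAT serializes its own internal state; see XORExample.java for the real save/resume calls):

```java
import java.io.*;

// Generic illustration of the Java serialization round-trip that
// alphaNEAT's resume support is built on. Checkpoint is a hypothetical
// stand-in for the library's actual serialized state.
public class SerializationDemo {
    static class Checkpoint implements Serializable {
        int generation;
        double bestFitness;
        Checkpoint(int generation, double bestFitness) {
            this.generation = generation;
            this.bestFitness = bestFitness;
        }
    }

    public static void main(String[] args) throws Exception {
        File f = File.createTempFile("checkpoint", ".ser");
        f.deleteOnExit();

        // Save state to disk...
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(f))) {
            out.writeObject(new Checkpoint(500, 3.92));
        }

        // ...then restore it, e.g. after an interrupted run.
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(f))) {
            Checkpoint c = (Checkpoint) in.readObject();
            System.out.println(c.generation + " " + c.bestFitness);
        }
    }
}
```

A downside of this format (noted in the roadmap below) is that serialized files are tied to the classes' binary layout, which is why a move to JSON or YAML is planned.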
- Alleviate the somewhat high memory usage.
- Transition from Java's serialization/deserialization format to more lightweight formats (JSON, YAML).
- Include example evaluation functions for more problem domains (e.g., pole balancing).
- General code improvements and optimizations.
- Although alphaNEAT was tested thoroughly, it would be great to add formal JUnit tests.
If you find any bug or problem with the code, please open an issue.
All contributions are welcome!