Recurrent Neuroevolution of Augmenting Topologies (RNEAT)¶
Neuroevolution of Augmenting Topologies (NEAT) uses evolutionary genetic algorithms to evolve neural architectures, where the best-performing neural network is selected according to certain criteria. In NEORL, NEAT builds a neural network that minimizes or maximizes an objective function by following the {action, state, reward} terminology of reinforcement learning. In RNEAT, genetic algorithms evolve recurrent neural networks for optimization purposes in a reinforcement learning context.
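The {action, state, reward} framing above can be illustrated with a minimal sketch (not NEORL's internal implementation): the candidate solution plays the role of the state, the evolved network proposes a perturbation (the action), and the objective value supplies the reward.

```python
def sphere(x):
    """Objective to minimize: F(x) = sum(xi^2)."""
    return sum(xi**2 for xi in x)

def step(state, action):
    """Apply an action (perturbation) to the current state (solution)."""
    new_state = [s + a for s, a in zip(state, action)]
    # For a minimization problem, a natural reward is the negative objective,
    # so that maximizing reward minimizes the objective.
    reward = -sphere(new_state)
    return new_state, reward

# A hypothetical "perfect" action drives the state to the optimum at the origin.
state, reward = step([3.0, -4.0], [-3.0, 4.0])
```

This is only a conceptual sketch of the terminology; RNEAT itself evolves the network that produces such actions.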
Original paper: Stanley, K. O., & Miikkulainen, R. (2002). Evolving neural networks through augmenting topologies. Evolutionary computation, 10(2), 99-127.
What can you use?¶
Multi processing: ✔️
Discrete spaces: ❌
Continuous spaces: ✔️
Mixed Discrete/Continuous spaces: ❌
Parameters¶
-
class neorl.hybrid.rneat.RNEAT(mode, fit, bounds, config, ncores=1, seed=None)[source]¶
Recurrent NeuroEvolution of Augmenting Topologies (RNEAT)
- Parameters
mode – (str) problem type, either min for a minimization problem or max for maximization (RL defaults to max)
fit – (function) the fitness function
bounds – (dict) input parameter type and lower/upper bounds in dictionary form. Example: bounds={'x1': ['int', 1, 4], 'x2': ['float', 0.1, 0.8], 'x3': ['float', 2.2, 6.2]}
config – (dict) dictionary of RNEAT hyperparameters, see Notes below for the hyperparameters available to change
ncores – (int) number of parallel processors
seed – (int) random seed for sampling
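One common way a library can support both a min and a max mode with a single internal search direction is to flip the sign of the user's fitness when mode='min'. Whether NEORL does exactly this internally is an assumption; the sketch below only illustrates the idea behind the mode parameter.

```python
def wrap_fitness(fit, mode):
    """Return a fitness function suitable for an internal maximizer.

    Hypothetical helper, not part of the NEORL API.
    """
    if mode not in ('min', 'max'):
        raise ValueError("mode must be 'min' or 'max'")
    if mode == 'max':
        return fit
    # Minimization becomes maximization of the negated objective.
    return lambda x: -fit(x)

f_min = wrap_fitness(lambda x: sum(xi**2 for xi in x), 'min')
f_max = wrap_fitness(lambda x: sum(xi**2 for xi in x), 'max')
```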
-
evolute(ngen, x0=None, save_best_net=False, checkpoint_itv=None, startpoint=None, verbose=False)[source]¶
This function evolutes the RNEAT algorithm for a number of generations.
- Parameters
ngen – (int) number of generations to evolute
x0 – (list) initial position of the NEAT (must have the same size as the x variable)
save_best_net – (bool) save the winner neural network to a pickle file
checkpoint_itv – (int) generation frequency at which to save checkpoints for restarting purposes (e.g. 1: save every generation, 10: save every 10 generations)
startpoint – (str) name/path of the checkpoint file used to start the search (the checkpoint file can be saved by invoking the argument checkpoint_itv)
verbose – (bool) print statistics to screen
- Returns
(tuple) (best individual, best fitness, and a dictionary containing major search results)
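The checkpoint_itv semantics described above can be sketched as a small helper (hypothetical, not part of the NEORL API): a checkpoint is written every itv generations, and passing None disables checkpointing.

```python
def should_checkpoint(gen, itv):
    """Return True when a checkpoint should be written at generation `gen`."""
    if itv is None:
        # checkpointing disabled
        return False
    return gen % itv == 0

# With checkpoint_itv=10, generations 10 and 20 of a 20-generation run are saved.
saves = [g for g in range(1, 21) if should_checkpoint(g, 10)]
```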
Example¶
Train an RNEAT agent to optimize the 5-D sphere function
from neorl import RNEAT
import numpy as np

def Sphere(individual):
    """Sphere test objective function.
       F(x) = sum_{i=1}^d xi^2
       d=1,2,3,...
       Range: [-100,100]
       Minima: 0
    """
    return sum(x**2 for x in individual)

nx=5
lb=-100
ub=100
bounds={}
for i in range(1,nx+1):
    bounds['x'+str(i)]=['float', lb, ub]

# modify your own NEAT config
config = {
    'pop_size': 50,
    'num_hidden': 1,
    'activation_mutate_rate': 0.1,
    'survival_threshold': 0.3,
}

# model config
rneat=RNEAT(fit=Sphere, bounds=bounds, mode='min', config=config, ncores=1, seed=1)

# a random initial guess (provide one individual)
x0=np.random.uniform(lb, ub, nx)

x_best, y_best, rneat_hist=rneat.evolute(ngen=200, x0=x0,
                                         verbose=True, checkpoint_itv=None,
                                         startpoint=None)
Notes¶
- The following major hyperparameters can be changed when you define the config dictionary (default values in parentheses):

| Hyperparameter | Description |
| --- | --- |
| pop_size | The number of individuals in each generation (30) |
| num_hidden | The number of hidden nodes to add to each genome in the initial population (1) |
| elitism | The number of individuals to survive from one generation to the next (1) |
| survival_threshold | The fraction of each species allowed to reproduce each generation (0.3) |
| min_species_size | The minimum number of genomes per species after reproduction (2) |
| activation_mutate_rate | The probability that mutation will replace the node's activation function (0.05) |
| aggregation_mutate_rate | The probability that mutation will replace the node's aggregation function (0.05) |
| weight_mutate_rate | The probability that mutation will change the connection weight by adding a random value (0.5) |
| bias_mutate_rate | The probability that mutation will change the bias of a node by adding a random value (0.7) |
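The defaults listed above suggest a simple merge of user-supplied hyperparameters over library defaults. The sketch below mirrors the table; whether NEORL merges the config exactly this way internally is an assumption.

```python
# Defaults copied from the hyperparameter table above.
RNEAT_DEFAULTS = {
    'pop_size': 30,
    'num_hidden': 1,
    'elitism': 1,
    'survival_threshold': 0.3,
    'min_species_size': 2,
    'activation_mutate_rate': 0.05,
    'aggregation_mutate_rate': 0.05,
    'weight_mutate_rate': 0.5,
    'bias_mutate_rate': 0.7,
}

def merge_config(user_config):
    """Overlay user choices on the defaults, rejecting unknown keys.

    Hypothetical helper for illustration, not part of the NEORL API.
    """
    unknown = set(user_config) - set(RNEAT_DEFAULTS)
    if unknown:
        raise KeyError(f"unknown hyperparameters: {sorted(unknown)}")
    return {**RNEAT_DEFAULTS, **user_config}

cfg = merge_config({'pop_size': 50, 'survival_threshold': 0.2})
```

Keys you omit keep their defaults, which matches how the Example section overrides only four of the nine hyperparameters.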
Acknowledgment¶
Thanks to our fellows in NEAT-Python, whose NEAT implementation we used to build our optimization classes.