Neural Harris Hawks Optimization (NHHO)¶
A module for surrogate-based Harris Hawks Optimization trained by an offline data-driven tri-training approach. The surrogate models are feedforward neural networks constructed with TensorFlow.
Original paper: Huang, P., Wang, H., & Jin, Y. (2021). Offline data-driven evolutionary optimization based on tri-training. Swarm and Evolutionary Computation, 60, 100800.
What can you use?¶
Multi processing: ✔️
Discrete spaces: ✔️
Continuous spaces: ✔️
Mixed Discrete/Continuous spaces: ✔️
Parameters¶
class neorl.hybrid.nhho.NHHO(mode, bounds, fit, nhawks, num_warmups=None, int_transform='nearest_int', nn_params={}, ncores=1, seed=None)

Neural Harris Hawks Optimizer
Parameters:

mode – (str) problem type, either "min" for a minimization problem or "max" for a maximization problem
bounds – (dict) input parameter type and lower/upper bounds in dictionary form. Example: bounds={'x1': ['int', 1, 4], 'x2': ['float', 0.1, 0.8], 'x3': ['float', 2.2, 6.2]}
fit – (function) the fitness function
nhawks – (int) number of hawks in the group
num_warmups – (int) number of warmup samples used to train the surrogate models, which are evaluated by the real fitness function fit (if None, num_warmups=20*len(bounds))
int_transform – (str) method of handling int/discrete variables, choose from: nearest_int, sigmoid, minmax (see the sketch after this list for a mixed discrete/continuous space)
nn_params – (dict) parameters for building the surrogate models in dictionary form. Keys are: test_split, learning_rate, activation, num_nodes, batch_size, epochs, save_models, verbose, plot. See the Notes below for descriptions.
ncores – (int) number of parallel processors used to train the three surrogate models (only ncores=1 or ncores=3 are allowed)
seed – (int) random seed for sampling
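The bounds dictionary supports mixed discrete/continuous spaces. Below is a minimal, hedged sketch of such a setup; the quadratic fitness function and the specific variable choices are illustrative assumptions, not part of the library's documentation:

from neorl import NHHO

def FIT(individual):
    #simple quadratic fitness over mixed int/float variables (illustrative)
    return sum(float(x)**2 for x in individual)

#'x1' is discrete, 'x2' and 'x3' are continuous
BOUNDS={'x1': ['int', 1, 4], 'x2': ['float', 0.1, 0.8], 'x3': ['float', 2.2, 6.2]}

#int_transform controls how the discrete variable 'x1' is handled internally
nhho=NHHO(mode='min', bounds=BOUNDS, fit=FIT, nhawks=10,
          int_transform='nearest_int', ncores=1, seed=1)
#individuals, fitnesses = nhho.evolute(ngen=10) would then run the optimization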
evolute(ngen, x0=None, verbose=False)

This function evolutes the NHHO algorithm for a number of generations.

Parameters:

ngen – (int) number of generations to evolute
x0 – (list of lists) initial positions of the hawks (must be of the same size as nhawks; see the sketch after the example below)
verbose – (bool) print statistics to screen

Returns:

(tuple) (list of best individuals, list of best fitnesses)
Example¶
from neorl import NHHO
import time
import sys

#Define the fitness function
def FIT(individual):
    """Sphere test objective function.
       F(x) = sum_{i=1}^d xi^2
       d=1,2,3,...
       Range: [-100,100]
       Minima: 0
    """
    y = sum(x**2 for x in individual)
    return y

#Setup the parameter space (d=5)
nx = 5
BOUNDS = {}
for i in range(1, nx+1):
    BOUNDS['x'+str(i)] = ['float', -100, 100]

nn_params = {}
nn_params['num_nodes'] = [60, 30, 15]
nn_params['learning_rate'] = 8e-4
nn_params['epochs'] = 100
nn_params['plot'] = False          #will accelerate training
nn_params['verbose'] = False       #will accelerate training
nn_params['save_models'] = False   #will accelerate training

try:
    ngen = int(sys.argv[1])  #get ngen as an external argument for testing
except:
    ngen = 50                #or use default ngen

t0 = time.time()
nhho = NHHO(mode='min', bounds=BOUNDS, fit=FIT, nhawks=20,
            nn_params=nn_params, ncores=3, seed=1)
individuals, fitnesses = nhho.evolute(ngen=ngen, verbose=True)
print('Comp Time:', time.time()-t0)

#make evaluation of the best individuals using the real fitness function
real_fit = [FIT(item) for item in individuals]

#print the best individuals/fitness found
min_index = real_fit.index(min(real_fit))
print('------------------------ Final Summary --------------------------')
print('Best real individual:', individuals[min_index])
print('Best real fitness:', real_fit[min_index])
print('-----------------------------------------------------------------')
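As a hedged follow-up sketch (not part of the original example), evolute can also be started from user-supplied hawk positions via x0; the random initialization below is an illustrative assumption and reuses nx, ngen, and nhho from the example above:

import random

#x0 must be a list of lists: one inner list per hawk (nhawks=20 here),
#each holding nx=5 values within the corresponding bounds of [-100, 100]
x0 = [[random.uniform(-100, 100) for _ in range(nx)] for _ in range(20)]
individuals, fitnesses = nhho.evolute(ngen=ngen, x0=x0, verbose=False)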
Notes¶
The tri-training concept uses semi-supervised learning to leverage surrogate models that approximate the real fitness function, which accelerates optimization for expensive fitness functions. Three feedforward neural network models are trained and used to determine the best individual from one generation to the next; that individual is then added to the data used to retrain the three surrogate models.

The real fitness function fit is ONLY used to evaluate the num_warmups samples. Afterwards, the three neural network models are used to guide the Harris hawks optimizer.

For num_warmups, choose a reasonable value to accommodate the number of design variables x in your problem. If None, the default number of warmup samples is 20 times the size of x.

The total number of cost evaluations via the real fitness function fit for NHHO is num_warmups.

The total number of cost evaluations via the surrogate models for NHHO is 2 * nhawks * ngen.
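As a small illustrative calculation based on the two rules above and the settings of the example (len(bounds)=5, nhawks=20, ngen=50, num_warmups=None):

num_warmups = 20 * 5             #default 20*len(bounds) -> 100 real fitness evaluations
surrogate_evals = 2 * 20 * 50    #2 * nhawks * ngen -> 2000 surrogate evaluations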
The following variables can be used in the nn_params dictionary to construct the surrogate models:

Hyperparameter    Description
num_nodes         List of number of nodes, e.g. [64, 32] creates a two-layer network with 64 and 32 nodes (default: [100, 50, 25])
learning_rate     The learning rate of the Adam optimizer (default: 6e-4)
batch_size        The minibatch size (default: 32)
activation        Activation function type (default: relu)
test_split        Fraction of test data or test split (default: 0.2)
epochs            Number of training epochs (default: 20)
verbose           Flag to print the different surrogate errors to screen (default: True)
save_models       Flag to save the neural network models (default: True)
plot              Flag to generate plots for the surrogate training loss and surrogate prediction accuracy (default: True)
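For convenience, the defaults in the table above can be collected into a complete nn_params dictionary; this is an illustrative sketch assembled from the table, not an additional API:

#nn_params populated with the default values listed in the table above
nn_params = {'num_nodes': [100, 50, 25],
             'learning_rate': 6e-4,
             'batch_size': 32,
             'activation': 'relu',
             'test_split': 0.2,
             'epochs': 20,
             'verbose': True,
             'save_models': True,
             'plot': True}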