Harris Hawks Optimization (HHO)

A module for the Harris Hawks Optimization with parallel computing support and mixed discrete/continuous optimization ability.

Original paper: Heidari, A. A., Mirjalili, S., Faris, H., Aljarah, I., Mafarja, M., & Chen, H. (2019). Harris hawks optimization: Algorithm and applications. Future generation computer systems, 97, 849-872.

What can you use?

  • Multi processing: ✔️

  • Discrete spaces: ✔️

  • Continuous spaces: ✔️

  • Mixed Discrete/Continuous spaces: ✔️

Parameters

class neorl.evolu.hho.HHO(mode, bounds, fit, nhawks, int_transform='nearest_int', ncores=1, seed=None)[source]

Harris Hawks Optimizer

Parameters
  • mode – (str) problem type, either “min” for minimization problem or “max” for maximization

  • bounds – (dict) input parameter type and lower/upper bounds in dictionary form. Example: bounds={'x1': ['int', 1, 4], 'x2': ['float', 0.1, 0.8], 'x3': ['float', 2.2, 6.2]}

  • fit – (function) the fitness function

  • nhawks – (int) number of hawks in the group

  • int_transform – (str) method of handling int/discrete variables, choose from: nearest_int, sigmoid, minmax.

  • ncores – (int) number of parallel processors (must be <= nhawks)

  • seed – (int) random seed for sampling
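The discrete transforms can be pictured with a small sketch (illustrative only; the actual NEORL internals may differ): nearest_int rounds the continuous position to the closest integer, while minmax linearly rescales the continuous range onto the integer bounds.

```python
def nearest_int(x, lb, ub):
    # Round the continuous position to the closest integer,
    # then clip it into the variable's [lb, ub] bounds
    return int(min(max(round(x), lb), ub))

def minmax_int(x, xmin, xmax, lb, ub):
    # Linearly rescale x from the continuous range [xmin, xmax]
    # onto the integer range [lb, ub], then round
    frac = (x - xmin) / (xmax - xmin)
    return int(round(lb + frac * (ub - lb)))

print(nearest_int(2.7, 1, 4))             # -> 3
print(minmax_int(0.45, 0.0, 1.0, 1, 4))   # -> 2
```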

evolute(ngen, x0=None, verbose=False, **kwargs)[source]

This function evolutes the HHO algorithm for a given number of generations.

Parameters
  • ngen – (int) number of generations to evolute

  • x0 – (list of lists) initial position of the hawks (must be of same size as nhawks)

  • verbose – (bool) print statistics to screen

Returns

(tuple) (best individual, best fitness, and dictionary containing major search results)

Example

from neorl import HHO

#Define the fitness function
def FIT(individual):
    """Sphere test objective function.
                    F(x) = sum_{i=1}^d xi^2
                    d=1,2,3,...
                    Range: [-100,100]
                    Minima: 0
    """
    y=sum(x**2 for x in individual)
    return y

#Setup the parameter space (d=5)
nx=5
BOUNDS={}
for i in range(1,nx+1):
    BOUNDS['x'+str(i)]=['float', -100, 100]

#setup and evolute HHO
hho=HHO(mode='min', bounds=BOUNDS, fit=FIT, nhawks=20, ncores=1, seed=1)
x_best, y_best, hho_hist=hho.evolute(ngen=200, verbose=True)
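The same interface extends to mixed spaces. Below is a sketch of a bounds dictionary combining the three variable types (the parameter names and categories here are hypothetical):

```python
# Hypothetical mixed space: a discrete, a continuous, and a categorical variable
BOUNDS = {
    'x1': ['int', 1, 4],                      # discrete
    'x2': ['float', 0.1, 0.8],                # continuous
    'x3': ['grid', ('low', 'mid', 'high')],   # categorical
}

# The optimizer is then constructed as before, e.g.:
# hho = HHO(mode='min', bounds=BOUNDS, fit=FIT, nhawks=20,
#           int_transform='nearest_int', ncores=1, seed=1)
print(list(BOUNDS))   # -> ['x1', 'x2', 'x3']
```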

Notes

  • HHO is inspired by the cooperative behavior and chasing style of Harris’ hawks in nature, known as the surprise pounce. Several hawks cooperatively pounce on the prey from different directions in an attempt to surprise it. The prey here can be a rabbit, which represents the global optimum.

  • HHO employs different exploration and exploitation strategies in the form of soft and hard besieges as well as rapid dives before attacking the prey. These strategies are parameter-free; only nhawks needs to be specified by the user.

  • We provide a flexible HHO implementation that can handle continuous (float), discrete (int), and categorical (grid) variables, as well as their mix. The user can control the type of discrete transformation via the argument int_transform.

  • The ncores argument controls parallel evaluation of the fitness of all hawks in the swarm after the position update. Set ncores <= nhawks for optimal resource allocation.

  • Look for an optimal balance between nhawks and ngen; it is recommended to keep nhawks small to allow for more position updates over more generations.

  • The total number of cost evaluations for HHO is 2*nhawks*ngen. This is an upper-bound estimate, as there is randomness in whether some of the hawks are evaluated or not.
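The upper bound above is handy when budgeting runs; a small helper (hypothetical, not part of the HHO API):

```python
def max_evaluations(nhawks, ngen):
    # Upper bound on fitness evaluations: each hawk may be
    # evaluated up to twice per generation
    return 2 * nhawks * ngen

# The example above (nhawks=20, ngen=200) performs at most:
print(max_evaluations(20, 200))   # -> 8000
```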