www.gp-field-guide.org.uk

9.4 MO GP via Operator Bias

While it is very common to use only modifications of the selection phase to perform multi-objective search, it is also possible to combine MOO selection with genetic operators exhibiting an inbuilt search bias which can steer the algorithm towards optimising certain objectives.

In some sense the classical repair operators, which are used in constrained optimisation to deal with hard constraints, are an extreme form of the idea of using operators to help MOO search.1 More generally, it is possible to imagine search operators with softer biases which favour the achievement of one or more objectives. These can be the same as, or different from, the objectives that bias the selection of parents.

The pygmies and civil servants approach proposed in (Ryan, 1994, 1996) combines the separation typical of Pareto-based approaches with biased search operators. In this system two lists are built, one where individuals are ranked based on fitness and the other where individuals are ranked based on a linear combination of fitness and size (i.e., a parsimonious fitness function). During crossover, the algorithm draws one parent from the first list and the other from the second list. This can be seen as a form of disassortative mating aimed at maintaining diversity in the population. Another example of this kind is (?), where crossover was modified so that an offspring is retained only if it dominates either of its parents.
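The two-list parent selection described above can be sketched as follows. This is a minimal illustration, not the original implementation: the individual representation (a dict with `score` and `size` fields), the `size_penalty` weight, and the choice of drawing uniformly from the top half of each list are all assumptions made for the example.

```python
import random

def fitness(ind):
    # hypothetical raw fitness (higher is better)
    return ind["score"]

def parsimonious_fitness(ind, size_penalty=0.1):
    # linear combination of fitness and program size (assumed weighting)
    return ind["score"] - size_penalty * ind["size"]

def select_parents(population):
    # "civil servants": ranked purely on fitness
    civil_servants = sorted(population, key=fitness, reverse=True)
    # "pygmies": ranked on the parsimonious (size-penalised) fitness
    pygmies = sorted(population, key=parsimonious_fitness, reverse=True)
    # disassortative mating: one parent drawn from each list
    k = max(1, len(population) // 2)
    return random.choice(civil_servants[:k]), random.choice(pygmies[:k])

pop = [{"score": random.random(), "size": random.randint(1, 50)}
       for _ in range(20)]
p1, p2 = select_parents(pop)
```

Because the pygmy list discounts large programs, crossing a high-fitness parent with a small-but-fit parent exerts a steady pressure towards compact offspring without changing the selection scheme itself.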

Furthermore, as discussed in Sections 5.2 and 11.3.2, there are several mutation operators with a direct or indirect bias towards smaller programs. This provides a pressure towards the evolution of more parsimonious solutions throughout a run.

As with the staged fitness functions discussed in the previous section, it is also possible to activate operators with a known bias towards smaller programs only when the main objective -- say a 100% correct solution -- has been achieved. This was tested in (Pujol, 1999; Pujol and Poli, 1997), where GP was used to evolve neural networks. After a 100% correct solution was found, one hidden node of each network in the population was replaced by a terminal, and the evolution process was resumed. This pruning procedure was repeated until the specified number of generations had been reached.
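The staged strategy above can be sketched generically: run the ordinary evolutionary loop, and switch on a size-reducing operator only once a perfect solution appears. Everything here is an assumption for illustration: `evaluate`, `vary`, and `prune` are hypothetical user-supplied functions, and fitness is taken to be normalised so that 1.0 means 100% correct.

```python
def evolve(population, evaluate, vary, prune, generations):
    # staged pruning (sketch): optimise the main objective first,
    # then activate a shrinking operator once a perfect solution appears
    solved = False
    for gen in range(generations):
        scores = [evaluate(ind) for ind in population]
        if not solved and max(scores) >= 1.0:
            solved = True  # main objective met: switch on pruning
        population = vary(population, scores)
        if solved:
            # keep pruning every generation after success,
            # as in the network-pruning procedure described above
            population = [prune(ind) for ind in population]
    return population
```

The key design point is that parsimony never competes with correctness during the first stage; size reduction only starts once there is nothing left to gain on the primary objective.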
