
1.1 Genetic Programming in a Nutshell

In genetic programming we evolve a population of computer programs. That is, generation by generation, GP stochastically transforms populations of programs into new, hopefully better, populations of programs, cf. Figure 1.1. GP, like nature, is a random process, and it can never guarantee results. GP's essential randomness, however, can allow it to escape traps that capture deterministic methods. Like nature, GP has been very successful at evolving novel and unexpected ways of solving problems. (See Chapter 12 for numerous examples.)

The basic steps in a GP system are shown in Algorithm 1.1. GP finds out how well a program works by running it, and then comparing its behaviour to some ideal (line 3). We might be interested, for example, in how well a program predicts a time series or controls an industrial process. This comparison is quantified to give a numeric value called fitness. Those programs that do well are chosen to breed (line 4) and produce new programs for the next generation (line 5). The primary genetic operations used to create new programs from existing ones are crossover, which combines randomly chosen parts of two selected parents, and mutation, which randomly alters part of a single selected parent.
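As a concrete illustration (not taken from the text), subtree crossover and mutation can be sketched on programs represented as nested Python lists; the representation and all function names here are assumptions made for the example:

```python
import random

# Programs as nested lists: ["+", "x", ["*", "x", "x"]] is the tree
# for x + x*x. This representation is illustrative only.

def all_paths(tree, path=()):
    """List the path (tuple of child indices) to every node in the tree."""
    paths = [path]
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            paths.extend(all_paths(child, path + (i,)))
    return paths

def get_subtree(tree, path):
    """Follow a path of child indices down to a subtree."""
    for i in path:
        tree = tree[i]
    return tree

def with_subtree(tree, path, subtree):
    """Return a copy of tree with the node at path replaced by subtree."""
    if not path:
        return subtree
    new = list(tree)
    new[path[0]] = with_subtree(new[path[0]], path[1:], subtree)
    return new

def crossover(parent1, parent2):
    """Graft a random subtree of parent2 onto a random point of parent1."""
    donor = get_subtree(parent2, random.choice(all_paths(parent2)))
    return with_subtree(parent1, random.choice(all_paths(parent1)), donor)

def mutate(tree, random_program):
    """Replace a random subtree with a freshly generated random program."""
    return with_subtree(tree, random.choice(all_paths(tree)), random_program())
```

Both operators return new trees and leave the parents untouched, which fits the generational scheme described above.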

1: Randomly create an initial population of programs from the available primitives (more on this in Section 2.2).
2: repeat
3:     Execute each program and ascertain its fitness.
4:     Select one or two program(s) from the population with a probability based on fitness to participate in genetic operations (Section 2.3).
5:     Create new individual program(s) by applying genetic operations with specified probabilities (Section 2.4).
6: until an acceptable solution is found or some other stopping condition is met (e.g., a maximum number of generations is reached).
7: return the best-so-far individual.

Algorithm 1.1: Genetic Programming
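Putting the steps together, Algorithm 1.1 can be sketched as a short Python program. This is a minimal illustration under assumed primitives (+, -, * over a variable x and numeric constants), and for brevity it uses subtree-replacement mutation as its sole genetic operation and tournament selection as its fitness-based selection scheme; none of these specific choices come from the text.

```python
import operator
import random

FUNCS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
TERMS = ["x", 1.0, 2.0]  # assumed primitive set for this example

def random_program(depth=3):
    # Line 1: grow a random tree from the available primitives.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    op = random.choice(list(FUNCS))
    return [op, random_program(depth - 1), random_program(depth - 1)]

def run(prog, x):
    # Execute a program tree on input x.
    if prog == "x":
        return x
    if not isinstance(prog, list):
        return prog
    return FUNCS[prog[0]](run(prog[1], x), run(prog[2], x))

def fitness(prog, cases):
    # Line 3: quantify behaviour against an ideal; higher is better.
    return -sum(abs(run(prog, x) - y) for x, y in cases)

def select(pop, cases, k=3):
    # Line 4: tournament selection, one common fitness-based scheme.
    return max(random.sample(pop, k), key=lambda p: fitness(p, cases))

def mutate(prog, depth=2):
    # Line 5: subtree-replacement mutation (sole operator in this sketch).
    if not isinstance(prog, list) or random.random() < 0.3:
        return random_program(depth)
    new = list(prog)
    i = random.randrange(1, len(new))
    new[i] = mutate(new[i], depth)
    return new

def evolve(cases, pop_size=50, generations=30):
    pop = [random_program() for _ in range(pop_size)]          # line 1
    best = max(pop, key=lambda p: fitness(p, cases))
    for _ in range(generations):                               # line 2
        pop = [mutate(select(pop, cases)) for _ in range(pop_size)]  # lines 4-5
        gen_best = max(pop, key=lambda p: fitness(p, cases))
        if fitness(gen_best, cases) > fitness(best, cases):
            best = gen_best
        if fitness(best, cases) == 0:                          # line 6: stop early
            break
    return best                                                # line 7
```

For example, `evolve([(x, x * x) for x in range(-2, 3)])` searches for a program whose outputs match x squared on those test cases.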
