www.gp-field-guide.org.uk

8.3 Mixing Grammars and Probabilities

A variety of other systems have been proposed which combine the use of grammars and probabilities. We mention only a few here; a more extensive review is available in (Shan, McKay, Essam, and Abbass, 2006).

Ratle and Sebag (2001) used a stochastic context-free grammar to generate program trees. The probability of applying each rewrite rule was adapted using a standard EDA approach so as to increase the likelihood of using successful rules. The system could also be run in a mode where rule probabilities depended upon the depth of the non-terminal symbol to which a rewrite rule was applied, so that the same non-terminal could be expanded with different probabilities at different depths, providing a higher degree of flexibility.
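The core loop of such a system can be sketched as follows. This is a minimal illustration, not Ratle and Sebag's actual implementation: the toy grammar, the depth bound, and the incremental learning-rate update rule are all assumptions made here for concreteness. Programs are sampled from a stochastic context-free grammar, and an EDA-style update then shifts each rule's probability towards its observed frequency among selected (successful) individuals.

```python
import random

# Hypothetical stochastic CFG: each non-terminal maps to a list of
# (production, probability) pairs.  Symbols absent from the grammar
# ("x", "+", ...) are terminals.
GRAMMAR = {
    "expr": [(["op", "expr", "expr"], 0.5), (["term"], 0.5)],
    "op":   [(["+"], 0.5), (["*"], 0.5)],
    "term": [(["x"], 0.5), (["1"], 0.5)],
}

def sample(symbol, grammar, depth=0, max_depth=6):
    """Expand a symbol by sampling productions according to their probabilities."""
    if symbol not in grammar:
        return [symbol]  # terminal symbol
    rules = grammar[symbol]
    if depth >= max_depth:
        # Beyond the depth bound, force the shortest production so the
        # expansion terminates (an assumption; real systems vary here).
        production = min(rules, key=lambda r: len(r[0]))[0]
    else:
        production = random.choices([r[0] for r in rules],
                                    weights=[r[1] for r in rules])[0]
    out = []
    for s in production:
        out.extend(sample(s, grammar, depth + 1, max_depth))
    return out

def eda_update(grammar, rule_usage_counts, learning_rate=0.1):
    """EDA-style adaptation: move each rule's probability towards its
    relative usage frequency in the selected individuals.
    rule_usage_counts maps (non_terminal, rule_index) -> count."""
    new = {}
    for nt, rules in grammar.items():
        counts = [rule_usage_counts.get((nt, i), 0) for i in range(len(rules))]
        total = sum(counts)
        if total == 0:
            new[nt] = list(rules)  # no evidence: leave probabilities unchanged
            continue
        new[nt] = [
            (prod, (1 - learning_rate) * p + learning_rate * (c / total))
            for (prod, p), c in zip(rules, counts)
        ]
    return new
```

The depth-dependent mode mentioned above would amount to indexing the probability table by `(non_terminal, depth)` rather than by the non-terminal alone.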

The approach taken in program evolution with explicit learning (PEEL) (Shan, McKay, Abbass, and Essam, 2003) was slightly more general. PEEL used a probabilistic L-system where rewrite rules were both depth- and location-dependent. The probabilities with which rules were applied were adapted by an ant colony optimisation (ACO) algorithm (Dorigo and Stützle, 2004). Another feature of PEEL was that the L-system's rules could be automatically refined via splitting and specialisation.
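The ACO mechanism for adapting rule probabilities can be illustrated with the standard pheromone scheme below. This is a generic ACO sketch under assumed parameter names (`evaporation`, `quality`), not PEEL's specific update: each rule carries a pheromone level, pheromone evaporates each generation, good solutions deposit pheromone on the rules they used, and a rule's selection probability is its share of its non-terminal's total pheromone.

```python
def aco_update(pheromone, used_rules, quality, evaporation=0.1):
    """One ACO generation step: evaporate all pheromone, then deposit
    an amount proportional to solution quality on each rule the
    solution actually used.  pheromone maps
    (non_terminal, rule_index) -> pheromone level."""
    for key in pheromone:
        pheromone[key] *= (1 - evaporation)
    for key in used_rules:
        pheromone[key] += quality
    return pheromone

def rule_probabilities(pheromone, non_terminal):
    """A rule's selection probability is its pheromone share within
    its non-terminal (the usual ACO choice; PEEL's details differ)."""
    keys = [k for k in pheromone if k[0] == non_terminal]
    total = sum(pheromone[k] for k in keys)
    return {k: pheromone[k] / total for k in keys}
```

In a depth- and location-dependent system such as PEEL, the pheromone table would be keyed by the rule's position in the derivation as well, so the same rewrite rule can acquire different probabilities in different contexts.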

Other programming systems based on probabilistic grammars which are optimised via ant systems include ant-TAG (Abbass, Hoai, and McKay, 2002; Shan, Abbass, McKay, and Essam, 2002), which uses a tree-adjunct grammar as its main representation, and generalised ant programming (GAP) (Keber and Schuster, 2002), which is based on a context-free grammar. Other systems which learn and use probabilistic grammars include grammar model based program evolution (GMPE) (Shan, McKay, Baxter, Abbass, Essam, and Hoai, 2004), the system described in (Bosman and de Jong, 2004a,b) and Bayesian automatic programming (BAP) (Regolin and Pozo, 2005).
