Noisy Fitness Evaluation in Genetic Algorithms and the Dynamics of Learning
Magnus Rattray
Jonathon L. Shapiro
Computer Science Department
University of Manchester
Oxford Road
Manchester M13 9PL, U.K.
rattraym@cs.man.ac.uk
ABSTRACT
An exact theory is presented which describes selection in a Genetic
Algorithm (GA) under a stochastic fitness measure and correctly
accounts for finite population effects. Although the theory can be
applied to a number of selection schemes, we only consider Boltzmann
selection in detail here as the results for this form of selection are
particularly transparent when fitness is corrupted by additive
Gaussian noise. Finite population effects are shown to be of
fundamental importance in this case, as the noise has no effect in the
infinite population limit. In the limit of weak selection we show how
the effects of any Gaussian noise can be removed by increasing the
population size appropriately. The theory is tested on two closely
related problems: the one-max problem corrupted by Gaussian noise and
generalization in a perceptron with binary weights. The averaged
dynamics can be accurately modelled for both problems using a
formalism which describes the dynamics of the GA using methods from
statistical mechanics. The second problem is a simple example of a
learning problem, and by considering it we show how an accurate
characterization of noise in the fitness evaluation (in this case, the
negative training error) may be relevant in machine
learning. The training error is the number of misclassified training
examples in a batch and can be considered as a noisy version of the
generalization error if an independent batch is used for each
evaluation. The noise is due to the finite batch size, and in the limit
of large problem size and weak selection we show how the effect of
this noise can be removed by increasing the population size. This
allows the optimal batch size to be determined, which minimizes
computation time as well as the total number of training examples
required.
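
The setting described above can be illustrated with a minimal sketch: a GA using Boltzmann selection on the one-max problem, where each fitness evaluation is corrupted by independent additive Gaussian noise. All parameter values (selection strength `beta`, noise level `sigma`, string length, population size, mutation rate) are illustrative choices for the sketch, not the values used in the paper's experiments.

```python
# Sketch: Boltzmann selection on noisy one-max (illustrative, not the
# paper's exact experimental setup).
import numpy as np

rng = np.random.default_rng(0)

def noisy_fitness(pop, sigma):
    # True fitness is the number of ones in each bit string; each
    # evaluation is corrupted by additive Gaussian noise of s.d. sigma.
    return pop.sum(axis=1) + rng.normal(0.0, sigma, size=len(pop))

def boltzmann_select(pop, beta, sigma):
    # Boltzmann selection: reproduce with probability proportional to
    # exp(beta * fitness), using the noisy fitness measurement.
    f = noisy_fitness(pop, sigma)
    w = np.exp(beta * (f - f.max()))   # shift for numerical stability
    idx = rng.choice(len(pop), size=len(pop), replace=True, p=w / w.sum())
    return pop[idx]

def mutate(pop, rate):
    # Flip each bit independently with the given probability.
    flips = rng.random(pop.shape) < rate
    return np.where(flips, 1 - pop, pop)

# N-bit one-max; in a finite population the noise slows convergence,
# an effect which vanishes in the infinite population limit.
N, P = 100, 200
pop = rng.integers(0, 2, size=(P, N))
for _ in range(50):
    pop = mutate(boltzmann_select(pop, beta=0.2, sigma=2.0), rate=1.0 / N)

print(pop.sum(axis=1).mean())  # mean true fitness after evolution
```

Replacing `noisy_fitness` with the negative training error of a binary-weight perceptron on a fresh batch of examples gives the learning problem considered in the paper, with the batch size playing the role of the noise level.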