

Often the crossover operator and selection method are so effective that they drive the genetic algorithm to produce a population of nearly identical individuals. Once the population consists of similar individuals, the likelihood of discovering new solutions typically decreases.
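A simple way to detect this loss of diversity is to monitor the average pairwise Hamming distance of the population. The sketch below is a minimal diagnostic, assuming a bitstring encoding; the function name is illustrative, not from the original text.

```python
import itertools

def mean_hamming_distance(population):
    """Average pairwise Hamming distance between bitstring chromosomes.

    A value near zero means the population has become nearly uniform,
    i.e. diversity has been lost. Assumes at least two individuals,
    all of equal length.
    """
    pairs = list(itertools.combinations(population, 2))
    total = sum(sum(a != b for a, b in zip(x, y)) for x, y in pairs)
    return total / len(pairs)
```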

On one hand, you want the genetic algorithm to find good individuals; on the other, you want it to maintain diversity. A common problem in optimization (not just in genetic algorithms) is how to define an objective function that accurately (and consistently) captures the effects of multiple objectives.
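One common, if simplistic, way to combine several objectives, such as solution quality and diversity, into a single objective function is a weighted sum. The sketch below is illustrative; the weights and the two objective terms are hypothetical placeholders.

```python
def scalarized_fitness(quality, diversity, w_quality=0.8, w_diversity=0.2):
    """Collapse two objectives into one fitness value via a weighted sum.

    The weights are hypothetical and must be tuned; a fixed weighted sum
    cannot consistently capture all trade-offs between objectives, which
    is precisely the difficulty noted above.
    """
    return w_quality * quality + w_diversity * diversity
```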

In general, genetic algorithms are better suited than gradient-based search methods when the search space has many local optima. Since the genetic algorithm traverses the search space using the genotype rather than the phenotype, it is less likely to get trapped in a local maximum or minimum.
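To make the genotype/phenotype distinction concrete: with a binary encoding, the operators act on the bitstring (the genotype), while fitness is evaluated on the decoded value (the phenotype). A minimal sketch, assuming a bitstring genotype mapped to a real number in a fixed interval:

```python
def decode(genotype, lower=-5.0, upper=5.0):
    """Map a bitstring genotype to a real-valued phenotype in [lower, upper].

    Crossover and mutation operate on the bits; the fitness function sees
    only the decoded real value. The interval bounds are illustrative.
    """
    as_int = int("".join(str(bit) for bit in genotype), 2)
    max_int = 2 ** len(genotype) - 1
    return lower + (upper - lower) * as_int / max_int
```

For instance, decode([1, 0, 0, 0]) gives -5 + 10 * 8/15 ≈ 0.33.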

8.4 The Algorithm

In summary, the Genetic Algorithm goes as follows (a minimal Python sketch follows the list):

1. Define a basic population of chromosomes.

2. Evaluate the fitness of each chromosome.

3. Select parents with a probability proportional to their fitness (rank-based or roulette-wheel selection).

4. Breed pairs of parents (crossover).

5. Apply mutation to the children.

6. Evaluate the fitness of the children.

7. Keep the population size constant through a rejection process: discard the worst solutions, either from the old population alone or from the old population plus the new children.

8. Start again at step 3.

9. The GA stops either when a satisfactory level of fitness is attained or when the population has become uniform (indicating convergence, possibly to a local optimum).
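The following is a minimal Python sketch of this loop, assuming a bitstring encoding, roulette-wheel selection, one-point crossover, and bit-flip mutation; all parameter values are illustrative defaults, not prescriptions.

```python
import random

def run_ga(fitness, genome_length=16, pop_size=50, n_children=50,
           crossover_rate=0.7, mutation_rate=0.01, max_generations=200):
    """Minimal generational GA over bitstrings, following steps 1-9 above.

    `fitness` maps a list of 0/1 genes to a number to be maximized;
    roulette-wheel selection as written assumes non-negative fitness.
    """
    # 1. Define a basic population of chromosomes.
    population = [[random.randint(0, 1) for _ in range(genome_length)]
                  for _ in range(pop_size)]

    for _ in range(max_generations):
        # 2./6. Evaluate the fitness of each chromosome.
        scores = [fitness(ind) for ind in population]

        # 9. Stop if the population has become uniform.
        if all(ind == population[0] for ind in population):
            break

        # 3. Roulette-wheel selection: pick parents with probability
        #    proportional to fitness (small floor avoids zero weights).
        weights = [max(s, 1e-12) for s in scores]
        children = []
        while len(children) < n_children:
            parent1, parent2 = random.choices(population, weights=weights, k=2)
            # 4. One-point crossover with probability crossover_rate.
            if random.random() < crossover_rate:
                cut = random.randint(1, genome_length - 1)
                child = parent1[:cut] + parent2[cut:]
            else:
                child = parent1[:]
            # 5. Bit-flip mutation on each gene of the child.
            child = [1 - g if random.random() < mutation_rate else g
                     for g in child]
            children.append(child)

        # 7. Rejection: keep the best pop_size individuals from the old
        #    population plus the new children; 8. loop back to selection.
        merged = sorted(population + children, key=fitness, reverse=True)
        population = merged[:pop_size]

    # Return the best individual found.
    return max(population, key=fitness)
```

For example, on the classic OneMax toy problem (maximize the number of ones), best = run_ga(sum) should return an all-ones or nearly all-ones bitstring.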

8.5 Convergence

There is, to this day, no theoretical proof that a GA will eventually converge, nor that it will converge to the global optimum of the fitness function. There are, however, a number of “rules of thumb” that one may follow to ensure good performance of the algorithm.

Convergence of a GA depends on choosing the parameters correctly. One should keep in mind the following:

• If the mutation is too weak (too infrequent, or making steps that are too small), there will not be enough exploration. Hence, the algorithm might get stuck in some local optimum.

• If, conversely, the mutation is too strong (too frequent, or making steps that are too large), the algorithm behaves more like a random search and will be very slow to converge (see the sketch after this list).

• Crossover can slow down convergence by destroying good solutions.

• Convergence depends strongly on the encoding and on the location of the genes on the chromosomes.

• The number of “children” produced at each generation determines the speed at which one explores the search space.
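To see the first two points empirically, one can sweep the mutation rate using the run_ga sketch from Section 8.4 on the OneMax problem. This is a rough illustration (a single stochastic run per rate), not a benchmark.

```python
# Requires the run_ga sketch from Section 8.4.
for rate in (0.001, 0.01, 0.1, 0.5):
    best = run_ga(sum, mutation_rate=rate)
    print(f"mutation_rate={rate}: best fitness = {sum(best)} / 16")
```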

