
Many software packages are based on this method, including Matlab's "fminsearch" function. The algorithm also handles noise reasonably well. The major drawbacks of the simplex method are that its performance depends strongly on the choice of the initial simplex, which is not always easy to make well, and that it gets stuck fairly easily in local minima. Furthermore, the algorithm interleaves function evaluations with decision steps that depend on the results of those evaluations, so the evaluations cannot be pipelined except in the rare cases when all vertices must be reevaluated. This leads to a low and somewhat erratic experimental duty cycle and thus to potential thermal drifts.

11.3.4 Particle Swarm Optimization

Relatively recently, an attempt to model social decision-making behavior led to the invention of a new class of Direct Search algorithms called "Particle Swarm Optimization" [Eberhart and Kennedy, 1995]. These algorithms are based on the random placement of "particles" throughout the parameter space. Each particle probes the fitness of the function at its position and keeps track of the position of the best fitness it has ever observed. The particles' positions are updated sequentially by simulating their motion through parameter space under forces that accelerate them randomly towards the current globally known best position and back towards the position of their personal best fitness. The particles are assigned a uniform inertial mass to favor exploration, and the space is given a viscosity that eventually damps the motion and causes convergence.

Particle Swarm Optimization is considered a class of algorithms, as there are many possible ways in which it can be implemented. Choices include whether a particle's knowledge of the global best position is limited to information from only a few neighboring particles, whether particles are added to or removed from the swarm dynamically, whether other points of attraction or repulsion are kept, and the exact values of the particles' mass and the space's viscosity. This flexibility makes it possible to tailor the algorithm to the circumstances of a given problem.

Particle Swarm Optimization has several major advantages. It is very robust against noisy data and against getting stuck in local minima, provided the number of particles, their mass, and the viscosity are chosen well. It can be modified to allow pipelining with a 100% function evaluation duty cycle; the modification requires a slight relaxation of the definition of the global best fitness, which must exclude points whose evaluations are still in progress. This restriction does not noticeably reduce the performance of the algorithm. Last but not least, the algorithm is extremely easy to implement and has very little overhead, as the sketch below illustrates.
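To make the update rule concrete, the following is a minimal sketch in Python of one common discretized variant, the so-called inertia-weight form, in which a constant inertia factor below one plays the combined role of the particles' mass and the space's viscosity, damping the velocities over time. The function name, the parameter values, and the simple box-constraint handling are illustrative assumptions, not the implementation used in this work.

```python
import numpy as np

def particle_swarm_minimize(f, bounds, n_particles=20, n_iters=200,
                            inertia=0.7, c_personal=1.5, c_global=1.5,
                            seed=None):
    """Minimize f over the box given by bounds, a list of (low, high) pairs."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(bounds)

    # Random placement of particles throughout the parameter space.
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros((n_particles, dim))

    # Each particle keeps track of the best position it has ever observed ...
    best_pos = pos.copy()
    best_val = np.array([f(p) for p in pos])

    # ... and the swarm shares the globally known best position.
    g = int(np.argmin(best_val))
    g_pos, g_val = best_pos[g].copy(), best_val[g]

    for _ in range(n_iters):
        for i in range(n_particles):
            r1, r2 = rng.random(dim), rng.random(dim)
            # Random accelerations towards the personal and the global best;
            # the inertia factor (< 1) damps the motion like a viscosity,
            # so the swarm explores first and converges later.
            vel[i] = (inertia * vel[i]
                      + c_personal * r1 * (best_pos[i] - pos[i])
                      + c_global * r2 * (g_pos - pos[i]))
            pos[i] = np.clip(pos[i] + vel[i], lo, hi)

            val = f(pos[i])
            if val < best_val[i]:
                best_val[i], best_pos[i] = val, pos[i].copy()
                if val < g_val:
                    g_val, g_pos = val, pos[i].copy()

    return g_pos, g_val
```

For example, `particle_swarm_minimize(lambda x: (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2, [(-2, 2), (-1, 3)])` should land near the Rosenbrock minimum at (1, 1). The pipelined variant described above maps onto this sketch by dispatching the `f(pos[i])` calls asynchronously and updating `g_pos` and `g_val` only from evaluations that have already completed; nothing else needs to change.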


One disadvantage is the number of function evaluations that the algorithm
