The.Algorithm.Design.Manual.Springer-Verlag.1998
Simulated Annealing

The inspiration for simulated annealing comes from the physical process of cooling molten materials down to the solid state. When molten steel is cooled too quickly, cracks and bubbles form, marring its surface and structural integrity. To end up with the best final product, the steel must be cooled slowly and evenly. Annealing is a metallurgical technique that uses a disciplined cooling schedule to bring the steel efficiently to a low-energy, optimal state.

In thermodynamic theory, the energy state of a system is described by the energy state of each of the particles constituting it. The energy state of each particle jumps about randomly, with such transitions governed by the temperature of the system. In particular, the probability of transition from energy E1 to energy E2 at temperature T is given by

    P(E1, E2, T) = e^((E1 - E2) / (k_B * T))

where k_B is a constant, called Boltzmann's constant.

What does this formula mean? Consider the value of the exponent under different conditions. When moving from a high-energy state to a lower-energy state, the exponent is positive, so the probability of accepting the transition is very high. However, there is also a non-zero probability of accepting a transition into a higher-energy state, with small energy jumps much more likely than big ones. The higher the temperature, the more likely such uphill energy jumps will occur.

What relevance does this have for combinatorial optimization? A physical system, as it cools, seeks to reach a minimum-energy state. For any discrete set of particles, minimizing the total energy is a combinatorial optimization problem. Through random transitions generated according to the above probability distribution, we can simulate the physics to solve arbitrary combinatorial optimization problems.
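The Boltzmann acceptance rule described above can be sketched as a small Python function. This is only an illustration; the function name and the choice of normalizing k_B to 1 are our assumptions, not part of the text:

```python
import math

def transition_probability(e1, e2, temperature, k_b=1.0):
    """Probability of accepting a move from energy e1 to energy e2 at the
    given temperature, following the Boltzmann distribution.  Downhill
    moves (e2 <= e1) are always accepted; uphill moves are accepted with
    probability exp((e1 - e2) / (k_b * temperature))."""
    if e2 <= e1:
        return 1.0
    return math.exp((e1 - e2) / (k_b * temperature))

# Small uphill jumps are far more likely than big ones ...
assert transition_probability(10.0, 11.0, 5.0) > transition_probability(10.0, 20.0, 5.0)
# ... and high temperatures make uphill jumps more likely overall.
assert transition_probability(10.0, 11.0, 5.0) > transition_probability(10.0, 11.0, 0.5)
```

The two assertions demonstrate the behavior the text describes: the acceptance probability falls off exponentially with the size of the energy jump, and rises with temperature.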
Simulated-Annealing()
    Create initial solution S
    Initialize temperature t
    repeat
        for i = 1 to iteration-length do
            Generate a random transition from S to S_i
            If C(S) >= C(S_i) then S = S_i
            else if e^((C(S) - C(S_i)) / (k * t)) > random[0, 1) then S = S_i
        Reduce temperature t
    until (no change in C(S))
    Return S

There are three components to any simulated annealing algorithm for combinatorial search:

● Concise problem representation - The problem representation includes both a representation of the solution space and an appropriate and easily computable cost function C() measuring the quality of a given solution.

● Transition mechanism between solutions - To move from one state to the next, we need a collection of simple transition mechanisms that slightly modify the current solution. Typical transition mechanisms include swapping the position of a pair of items or inserting/deleting a single item. Ideally, the effect these incremental changes have on the cost of the solution can be computed incrementally, so that cost-function evaluation takes time proportional to the size of the change (typically constant) instead of linear in the size of the solution.

● Cooling schedule - These parameters govern how likely we are to accept a bad transition as a function of time. At the beginning of the search, we are eager to use randomness to explore the search space widely, so the probability of accepting a negative transition is high. As the search progresses, we seek to limit transitions to local improvements and optimizations. The cooling schedule can be regulated by the following parameters:

❍ Initial system temperature - Typically t_1 = 1.

❍ Temperature decrement function - Typically t_k = alpha * t_{k-1}, where 0.8 <= alpha <= 0.99. This implies an exponential decay in the temperature, as opposed to a linear decay.

❍ Number of iterations between temperature change - Typically, 100 to 1,000 iterations might be permitted before lowering the temperature.

❍ Acceptance criteria - A typical criterion is to accept any transition from s_i to s_{i+1} when C(s_{i+1}) < C(s_i), and to accept a negative transition whenever

    e^((C(s_i) - C(s_{i+1})) / (k * t)) >= r,

where r is a random number 0 <= r < 1.
The constant k normalizes this cost function, so that almost all transitions are accepted at the starting temperature.

❍ Stop criteria - Typically, when the value of the current solution has not changed or improved within the last iteration or so, the search is terminated and the current solution reported.

Creating the proper cooling schedule is somewhat of a trial-and-error process. It might pay to start from an existing implementation of simulated annealing, pointers to which are provided in Section .
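Putting the pseudocode and cooling-schedule parameters together, the whole method can be sketched as a generic Python skeleton. Everything here is an illustrative assumption rather than the book's own code: the function names, the parameter defaults (alpha = 0.9, k = 0.01, 100 iterations per temperature), the epoch cap standing in for the "no change in C(S)" stop criterion, and the toy objective at the end:

```python
import math
import random

def simulated_annealing(initial, cost, random_transition,
                        t0=1.0, alpha=0.9, iterations=100,
                        k=0.01, max_epochs=200):
    """Generic simulated annealing skeleton.  `cost` plays the role of C();
    `random_transition` proposes a slightly modified solution.  A real
    implementation would evaluate cost incrementally rather than from
    scratch on every call."""
    s = initial
    t = t0
    for _ in range(max_epochs):
        old_cost = cost(s)
        for _ in range(iterations):
            candidate = random_transition(s)
            delta = cost(s) - cost(candidate)
            if delta >= 0:
                s = candidate          # downhill move: always accept
            elif random.random() < math.exp(delta / (k * t)):
                s = candidate          # uphill move: accept with Boltzmann probability
        if cost(s) >= old_cost:
            break                      # no change in C(S): stop
        t *= alpha                     # exponential temperature decay
    return s

# Toy usage: minimize f(x) = (x - 3)^2, perturbing x by a small random step.
random.seed(7)
best = simulated_annealing(
    initial=0.0,
    cost=lambda x: (x - 3.0) ** 2,
    random_transition=lambda x: x + random.uniform(-0.5, 0.5),
)
```

The toy objective has a single minimum, so it exercises only the downhill behavior; the uphill-acceptance branch matters on rugged cost landscapes such as the traveling salesman problem discussed next.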