The.Algorithm.Design.Manual.Springer-Verlag.1998
The Partition Problem

Partition[S,k]
    (* compute prefix sums: p[i] = s_1 + ... + s_i *)
    p[0] = 0
    for i=1 to n do p[i] = p[i-1] + s_i
    (* initialize boundary conditions *)
    for i=1 to n do M[i,1] = p[i]
    for j=1 to k do M[1,j] = s_1
    (* evaluate main recurrence *)
    for i=2 to n do
        for j = 2 to k do
            M[i,j] = infinity
            for x = 1 to i-1 do
                s = max(M[x,j-1], p[i]-p[x])
                if (M[i,j] > s) then
                    M[i,j] = s
                    D[i,j] = x

Figure: Dynamic programming matrices M and D in partitioning
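The pseudocode above can be sketched in Python roughly as follows. This is a minimal sketch, not the book's implementation; the function name `partition_tables` and the 1-indexed table layout (row and column 0 unused) are our own choices.

```python
def partition_tables(s, k):
    """Fill the dynamic programming tables M (range cost) and D (divider
    positions) for partitioning sequence s into k contiguous ranges,
    minimizing the maximum range sum. Tables are 1-indexed; index 0 unused."""
    n = len(s)

    # prefix sums: p[i] = s_1 + ... + s_i, so any range sum is a difference
    p = [0] * (n + 1)
    for i in range(1, n + 1):
        p[i] = p[i - 1] + s[i - 1]

    INF = float("inf")
    M = [[INF] * (k + 1) for _ in range(n + 1)]
    D = [[0] * (k + 1) for _ in range(n + 1)]

    # boundary conditions: one range takes the whole prefix; one element
    # costs itself no matter how many ranges are allowed
    for i in range(1, n + 1):
        M[i][1] = p[i]
    for j in range(1, k + 1):
        M[1][j] = s[0]

    # main recurrence: try every position x for the last divider
    for i in range(2, n + 1):
        for j in range(2, k + 1):
            for x in range(1, i):
                cost = max(M[x][j - 1], p[i] - p[x])
                if M[i][j] > cost:
                    M[i][j] = cost
                    D[i][j] = x
    return M, D
```

Because the inner loop looks at up to n divider positions at constant cost each, filling all kn cells takes O(kn^2) time, matching the analysis below.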
The implementation above in fact runs faster than advertised. Our original analysis assumed that it took O(n^2) time to update each cell of the matrix, because we selected the best of up to n possible divider positions, each of which requires a sum of up to n terms. In fact, it is easy to avoid computing these sums by storing the n prefix sums p_i = s_1 + ... + s_i, since s_{x+1} + ... + s_i = p_i - p_x. This enables us to evaluate the recurrence in linear time per cell, yielding an O(k n^2) algorithm.

By studying the recurrence relation and the dynamic programming matrices of the figure above, you should be able to convince yourself that the final value of M[n,k] will be the cost of the largest range in the optimal partition. However, what good is that? For most applications, what we need is the actual partition that does the job. Without it, all we are left with is a coupon for a great price on an out-of-stock item.

The second matrix, D, is used to reconstruct the optimal partition. Whenever we update the value of M[i,j], we record which divider position x was required to achieve that value. To reconstruct the path used to get to the optimal solution, we work backward from D[n,k] and add a divider at each specified position. This backwards walking is best achieved by a recursive subroutine:

ReconstructPartition(S,D,n,k)
    if (k = 1) then
        print the first partition {s_1, ..., s_n}
    else
        ReconstructPartition(S,D,D[n,k],k-1)
        print the kth partition {s_{D[n,k]+1}, ..., s_n}

It is important to grasp the distinction between storing the value of a cell and storing the decision/move that achieved it. The decision plays no role in the computation itself, but it is presumably the real thing that you are interested in. For most of the examples in this chapter, we will not worry about reconstructing the answer. However, study this example closely to ensure that you know how to obtain the winning configuration when you need it.
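The recursive reconstruction can be sketched in Python as below. This is a hedged sketch: the function returns the ranges rather than printing them, and for the demo we supply the divider table D as a small dict holding only the entries the backward walk visits, with values one can check by hand for partitioning 1..9 into three ranges.

```python
def reconstruct_partition(s, D, n, k):
    """Walk backward through the divider table D (here a dict keyed by
    (n, k) pairs) and return the k contiguous ranges of the partition.
    Indices follow the book's 1-indexed convention for s_1 .. s_n."""
    if k == 1:
        return [s[:n]]                      # first range: s_1 .. s_n
    x = D[(n, k)]                           # last divider sits after element x
    return reconstruct_partition(s, D, x, k - 1) + [s[x:n]]

# dividers for splitting s = 1..9 into k = 3 ranges; for this small
# instance the optimal partition {1..5}{6,7}{8,9} (largest sum 17)
# can be verified by hand
s = [1, 2, 3, 4, 5, 6, 7, 8, 9]
D = {(9, 3): 7, (7, 2): 5}
print(reconstruct_partition(s, D, 9, 3))
# → [[1, 2, 3, 4, 5], [6, 7], [8, 9]]
```

Note how only k - 1 table lookups are made: the reconstruction costs O(k) time once the tables are filled, regardless of n.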