The Algorithm Design Manual, Springer-Verlag, 1998
Best, Worst, and Average-Case Complexity

Using the RAM model of computation, we can count how many steps our algorithm takes on any given input instance by simply executing it on that input. However, to really understand how good or bad an algorithm is, we must know how it behaves over all instances. To understand the notions of best-, worst-, and average-case complexity, one must think about running an algorithm on all possible instances of data that can be fed to it.

For the problem of sorting, the set of possible input instances consists of all possible arrangements of all possible numbers of keys. We can represent every input instance as a point on a graph, where the x-axis is the size of the problem (for sorting, the number of items to sort) and the y-axis is the number of steps taken by the algorithm on that instance. Here we assume, quite reasonably, that it doesn't matter what the values of the keys are, just how many of them there are and how they are ordered. It should not take longer to sort 1,000 English names than it does to sort 1,000 French names, for example.

Figure: Best, worst, and average-case complexity

As shown in the figure, these points naturally align themselves into columns, because only integers represent possible input sizes. After all, it makes no sense to ask how long it takes to sort 10.57 items. Once we have these points, we can define three different functions over them:

● The worst-case complexity of the algorithm is the function defined by the maximum number of steps taken on any instance of size n. It represents the curve passing through the highest point of each column.

● The best-case complexity of the algorithm is the function defined by the minimum number of steps taken on any instance of size n. It represents the curve passing through the lowest point of each column.

● Finally, the average-case complexity of the algorithm is the function defined by the average number of steps taken on any instance of size n.

In practice, the most useful of these three measures proves to be the worst-case complexity, which many people find counterintuitive. To illustrate why worst-case analysis is important, consider trying to project what will happen to you if you bring n dollars to gamble in a casino. The best case, that you walk out owning the place, is possible but so unlikely that you should place no credence in it. The worst case, that you lose all n dollars, is easy to calculate and distressingly likely to happen. The average case, that the typical person loses 87.32% of the money that they bring to the casino, is difficult to compute and its meaning is subject to debate. What exactly does average mean? Stupid people lose more than smart people, so are you smarter or dumber than the average person, and by how much? People who play craps lose more money than those playing the nickel slots. Card counters at blackjack do better on average than customers who accept three or more free drinks. We avoid all these complexities and obtain a very useful result by just considering the worst case.

The important thing to realize is that each of these time complexities defines a numerical function, representing time versus problem size. These functions are as well defined as any other numerical function, be it a simple polynomial or the price of General Motors stock as a function of time. Time complexities are complicated functions, however. In order to simplify our work with such messy functions, we will need the big Oh notation.
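The three definitions above can be checked empirically for small inputs: run an algorithm on every instance of size n, count its steps, and take the minimum, maximum, and mean of the resulting column of points. The sketch below does this for insertion sort, counting key comparisons over all n! orderings of n distinct keys. This example is not from the book; the function names and the choice of counting comparisons as "steps" are illustrative assumptions.

```python
from itertools import permutations

def insertion_sort_steps(items):
    # Sort a copy of items with insertion sort, returning the number of
    # key comparisons performed (our stand-in for RAM-model steps).
    a = list(items)
    steps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            steps += 1  # one comparison per inner-loop test
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return steps

def complexity_profile(n):
    # Best-, worst-, and average-case step counts over all n! instances
    # of size n -- one column of points on the graph described above.
    counts = [insertion_sort_steps(p) for p in permutations(range(n))]
    return min(counts), max(counts), sum(counts) / len(counts)

best, worst, avg = complexity_profile(4)
print(best, worst, avg)  # best is n-1 = 3; worst is n(n-1)/2 = 6
```

Computing one such triple per value of n traces out exactly the three curves the figure describes; of course, enumerating all n! instances is only feasible for tiny n, which is why the functions are analyzed rather than measured.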
Mon Jun 2 23:33:50 EDT 1997