The.Algorithm.Design.Manual.Springer-Verlag.1998

Growth Rates

In working with the big Oh notation, we cavalierly discard the multiplicative constants. The functions f(n) = 0.001 n² and g(n) = 1000 n² are treated identically, even though g(n) is a million times larger than f(n) for all values of n. The method behind this madness is illustrated by the figure below, which tabulates the growth rates of several functions arising in algorithm analysis, on problem instances of reasonable size. Specifically, it shows how long it takes for algorithms that use f(n) operations to complete on a fast computer where each operation takes one nanosecond (10⁻⁹ seconds). Study the table for a few minutes and the following conclusions become apparent:

Figure: Growth rates of common functions measured in nanoseconds

● All of these algorithms take about the same amount of time for n = 10.

● The algorithm whose running time is n! becomes useless well before n = 20.

● The algorithm whose running time is 2ⁿ has a greater operating range, but it becomes impractical for n > 40.

● The algorithm whose running time is n² is perfectly reasonable up to about n = 100, but it quickly deteriorates with larger inputs. For n > 1,000,000 it is likely to be hopeless.

● Both the n and n lg n algorithms remain practical on inputs of up to one billion items.

● You can't hope to find a real problem where an O(lg n) algorithm is going to be too slow in practice.
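These conclusions can be checked with a short script. The sketch below assumes the usual roster of growth functions for such a table (lg n, n, n lg n, n², 2ⁿ, n!) and converts operation counts to wall-clock seconds at one nanosecond per operation:

```python
import math

NS_PER_OP = 1e-9  # one nanosecond per operation, as in the text

def seconds(ops):
    """Wall-clock time for `ops` operations at 1 ns each."""
    return ops * NS_PER_OP

# Assumed set of growth functions (the standard cast for such tables).
FUNCTIONS = {
    "lg n":   lambda n: math.log2(n),
    "n":      lambda n: n,
    "n lg n": lambda n: n * math.log2(n),
    "n^2":    lambda n: n ** 2,
    "2^n":    lambda n: 2 ** n,
    "n!":     lambda n: math.factorial(n),
}

if __name__ == "__main__":
    for n in (10, 20, 40, 100):
        cells = ", ".join(
            f"{name}: {seconds(f(n)):.3g} s" for name, f in FUNCTIONS.items()
        )
        print(f"n = {n:>3} -> {cells}")
```

At n = 20, the n! entry already works out to roughly 2.4 × 10⁹ seconds, about 77 years, consistent with the second bullet above.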

The bottom line is that even by ignoring constant factors, we can get an excellent idea of whether a given algorithm will be able to run in a reasonable amount of time on a problem of a given size. An algorithm whose running time is n² seconds will beat one whose running time is 1,000,000 · n seconds only so long as n < 1,000,000. Such enormous constant factor differences between algorithms occur in practice far less frequently than such large problems do.
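That crossover is easy to verify directly. Using running times of n² seconds and 1,000,000 · n seconds for concreteness (constants chosen to give the million-fold gap described above), a few probes around the break-even point show where each algorithm wins:

```python
def quadratic_time(n):
    # Hypothetical algorithm with a tiny constant factor: n^2 seconds.
    return n ** 2

def linear_time_big_constant(n):
    # Hypothetical rival with an enormous constant: 1,000,000 * n seconds.
    return 1_000_000 * n

if __name__ == "__main__":
    for n in (1_000, 999_999, 1_000_001, 10_000_000):
        faster = ("n^2" if quadratic_time(n) < linear_time_big_constant(n)
                  else "10^6 * n")
        print(f"n = {n:>10,}: {faster} wins")
```

The quadratic algorithm wins exactly while n < 1,000,000; past that point, the rival's enormous constant factor no longer matters.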

