The Algorithm Design Manual. Springer-Verlag, 1998.
Lecture 5 - quicksort

Worst Case for Quicksort

Suppose instead our pivot element splits the array as unequally as possible. Thus instead of n/2 elements in the smaller half, we get zero, meaning that the pivot element is the biggest or smallest element in the array.

Now we have n-1 levels, instead of lg n, for a worst-case time of Θ(n²), since the first n/2 levels each have at least n/2 elements to partition.

Thus the worst-case time for Quicksort is worse than that of Heapsort or Mergesort.

To justify its name, Quicksort had better be good in the average case. Showing this requires some fairly intricate analysis.

The divide-and-conquer principle applies to real life. If you break a job into pieces, it is best to make the pieces of equal size!

Listen To Part 5-9

Intuition: The Average Case for Quicksort

Suppose we pick the pivot element at random in an array of n keys. Half the time, the pivot element will be from the center half of the sorted array. Whenever the pivot element is from positions n/4 to 3n/4, the larger remaining subarray contains at most 3n/4 elements.

If we assume that the pivot element is always in this range, what is the maximum number of partitions we need to get from n elements down to 1 element?

file:///E|/LEC/LECTUR17/NODE5.HTM (5 of 10) [19/1/2003 1:34:35]
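The gap between the worst case and the random-pivot case can be seen directly by counting comparisons. The following is an illustrative sketch (not the lecture's in-place partition): `pick_pivot` is a hypothetical helper I introduce so the same routine can model both the degenerate first-element pivot on sorted input and a random pivot.

```python
import random

def quicksort(items, pick_pivot):
    """Sort a list and count the comparisons made while partitioning.

    pick_pivot(m) returns an index in range(m). This is a teaching
    sketch using new lists, not the in-place partition from the lecture.
    """
    if len(items) <= 1:
        return list(items), 0
    i = pick_pivot(len(items))
    pivot = items[i]
    rest = items[:i] + items[i + 1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    comps = len(rest)  # n-1 comparisons to split around the pivot
    s_sorted, s_comps = quicksort(smaller, pick_pivot)
    l_sorted, l_comps = quicksort(larger, pick_pivot)
    return s_sorted + [pivot] + l_sorted, comps + s_comps + l_comps

n = 200
data = list(range(n))

# Worst case: a first-element pivot on already-sorted input yields
# (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons, i.e. Theta(n^2).
ordered, worst = quicksort(data, lambda m: 0)

# Random pivots: the expected count is near 2 n ln n, far below n^2/2.
random.seed(7)
ordered2, average = quicksort(data, lambda m: random.randrange(m))
```

Running both on the same 200-element sorted array makes the contrast concrete: the degenerate pivot performs exactly 19,900 comparisons, while random pivots stay in the low thousands.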
Listen To Part 5-10

What have we shown? At most log_{4/3} n levels of decent partitions suffice to sort an array of n elements, since each decent partition leaves a larger subarray of at most 3/4 the size. But how often, when we pick an arbitrary element as pivot, will it generate a decent partition? Since any number ranked between n/4 and 3n/4 would make a decent pivot, we get one half the time on average.

If we need log_{4/3} n levels of decent partitions to finish the job, and half of the random partitions are decent, then on average the recursion tree to quicksort the array has about 2 log_{4/3} n levels. Since O(n) work is done partitioning on each level, the average time is O(n lg n). More careful analysis shows that the expected number of comparisons is about 2 n ln n ≈ 1.386 n lg n.

Listen To Part 5-11

Average-Case Analysis of Quicksort

To do a precise average-case analysis of quicksort, we formulate a recurrence giving the exact expected time T(n):

T(n) = (n - 1) + (1/n) sum_{p=1}^{n} [ T(p-1) + T(n-p) ]

Each possible pivot p is selected with equal probability 1/n. The number of comparisons needed to do the partition is n-1.
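The recurrence above can be tabulated directly, which is a quick way to check the claimed 2 n ln n behavior. A minimal sketch (the function name is mine, not from the lecture), assuming T(0) = T(1) = 0 as base cases:

```python
import math

def expected_comparisons(n_max):
    """Tabulate T(n) = (n-1) + (1/n) * sum_{p=1}^{n} (T(p-1) + T(n-p)),
    with T(0) = T(1) = 0: the exact expected comparison count when every
    pivot rank p is equally likely and partitioning costs n-1 comparisons.
    """
    T = [0.0] * (n_max + 1)
    for n in range(2, n_max + 1):
        T[n] = (n - 1) + sum(T[p - 1] + T[n - p] for p in range(1, n + 1)) / n
    return T

T = expected_comparisons(500)

# The recurrence solves to T(n) = 2(n+1)H_n - 4n, where H_n is the n-th
# harmonic number; since H_n ~ ln n, this approaches 2 n ln n.
H = sum(1.0 / k for k in range(1, 501))
closed_form = 2 * 501 * H - 4 * 500
```

Small cases confirm the table: T(2) = 1 (one comparison sorts two elements) and T(3) = 8/3, matching the closed form, and T(500) agrees with 2(501)H_500 - 2000 to within rounding.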