
Average case analysis of Quick Sort

Suppose the pivot ends up at position i (1 ≤ i ≤ n) of the sorted order; in the average case each of these n positions is equally likely. The partition then leaves i-1 elements to the left of the pivot and n-i elements to its right, so the two recursive calls cost T(i-1) and T(n-i):

Position of pivot | Left of pivot | Right of pivot | T(nleft) | T(nright)
1                 | 0             | n-1            | T(0)     | T(n-1)
2                 | 1             | n-2            | T(1)     | T(n-2)
3                 | 2             | n-3            | T(2)     | T(n-3)
...               | ...           | ...            | ...      | ...
n-2               | n-3           | 2              | T(n-3)   | T(2)
n-1               | n-2           | 1              | T(n-2)   | T(1)
n                 | n-1           | 0              | T(n-1)   | T(0)

The number of comparisons made by the first call on partition:

Assume the left_to_right pointer moves over k smaller elements, making k comparisons.

By the time the right_to_left pointer crosses left_to_right, it has made n-k+1 comparisons.

So the first call on partition makes a total of n+1 comparisons.
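To make the counting concrete, here is a minimal Python sketch of a two-pointer partition of this kind, instrumented to count key comparisons. It is an assumed implementation (first element taken as the pivot, pointer names borrowed from the text); the original post gives no code, and this variant may make a couple of extra comparisons in edge cases.

def partition(a, low, high):
    # Two-pointer partition sketch (assumed scheme; the post gives no code).
    # Uses a[low] as the pivot and counts key comparisons.
    pivot = a[low]
    left_to_right = low + 1
    right_to_left = high
    comparisons = 0
    while True:
        # left_to_right moves over elements not greater than the pivot
        while left_to_right <= high:
            comparisons += 1
            if a[left_to_right] > pivot:
                break
            left_to_right += 1
        # right_to_left moves over elements greater than the pivot
        while right_to_left > low:
            comparisons += 1
            if a[right_to_left] <= pivot:
                break
            right_to_left -= 1
        if left_to_right >= right_to_left:
            break                      # pointers have crossed
        a[left_to_right], a[right_to_left] = a[right_to_left], a[left_to_right]
        left_to_right += 1
        right_to_left -= 1
    a[low], a[right_to_left] = a[right_to_left], a[low]   # pivot to its final place
    return right_to_left, comparisons

# Example: 7 elements, about n + 1 = 8 comparisons
# partition([5, 3, 8, 1, 9, 2, 7], 0, 6)  ->  (3, 8)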

The average case complexity of quicksort is:

T(n) = (comparisons made by the first call on partition) + (1/n) Σ over the n equally likely pivot positions [T(nleft) + T(nright)]

= (n+1) + (2/n)[T(0) + T(1) + T(2) + ... + T(n-1)]          ...(1)

Multiplying (1) through by n, and then writing the same equation with n replaced by n-1:

nT(n) = n(n+1) + 2[T(0) + T(1) + T(2) + ... + T(n-2) + T(n-1)]          ...(2)

(n-1)T(n-1) = (n-1)n + 2[T(0) + T(1) + T(2) + ... + T(n-2)]          ...(3)

Subtracting equation (3) from equation (2):

nT(n) - (n-1)T(n-1) = [n(n+1) - (n-1)n] + 2T(n-1)

        = 2n + 2T(n-1)

nT(n) = 2n + (n-1)T(n-1) + 2T(n-1)
      = 2n + (n+1)T(n-1)

Dividing both sides by n:

T(n) = 2 + (n+1)T(n-1)/n

Dividing throughout by (n+1), the recurrence relation obtained is:

T(n)/(n+1) = 2/(n+1) + T(n-1)/n
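As a quick sanity check on the algebra (my own sketch, assuming T(0) = 0 since an empty subarray costs nothing), the values produced by recurrence (1) and by this simplified recurrence can be compared numerically; they agree up to floating-point rounding.

def t_from_sum(n):
    # Recurrence (1): T(m) = (m + 1) + (2/m) * [T(0) + ... + T(m-1)], with T(0) = 0
    t = [0.0]
    for m in range(1, n + 1):
        t.append((m + 1) + 2.0 * sum(t) / m)
    return t[n]

def t_from_telescope(n):
    # Simplified form: T(m) = 2 + (m + 1) * T(m-1) / m, i.e. T(m)/(m+1) = 2/(m+1) + T(m-1)/m
    t = 0.0
    for m in range(1, n + 1):
        t = 2 + (m + 1) * t / m
    return t

for n in (5, 10, 100):
    print(n, t_from_sum(n), t_from_telescope(n))   # the two columns agree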

Using the method of substitution:

T(n)/(n+1) = 2/(n+1) + T(n-1)/n

T(n-1)/n = 2/n + T(n-2)/(n-1)

T(n-2)/(n-1) = 2/(n-1) + T(n-3)/(n-2)

T(n-3)/(n-2) = 2/(n-2) + T(n-4)/(n-3)

-------------------------------------

T(3)/4 = 2/4 + T(2)/3

T(2)/3 = 2/3 + T(1)/2

T(1)/2 = 2/2 + T(0)


Adding the left-hand sides and the right-hand sides of all these equations:

T(n)/(n+1) + [T(n-1)/n + T(n-2)/(n-1) + ... + T(2)/3 + T(1)/2]

= [T(n-1)/n + T(n-2)/(n-1) + ... + T(2)/3 + T(1)/2] + T(0) + [2/(n+1) + 2/n + 2/(n-1) + ... + 2/3 + 2/2]

Canceling the common terms from both sides, and using T(0) = 0:

T(n)/(n+1) = 2[1/2 + 1/3 + 1/4 + ... + 1/n + 1/(n+1)]

= 2 Σ 1/k, the sum taken over 2 ≤ k ≤ (n+1)

[this equals 2(H(n+1) - 1), where H(n+1) is the (n+1)-th harmonic number]

≈ 2 ∫ dx/x, with x varying from 2 to (n+1)

= 2[loge(n+1) - loge2]



Hence, T(n) = O(2(n+1)[loge(n+1) - loge2])

   = O(n log n)
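As a rough numeric confirmation (again my own sketch), the bracketed harmonic-number form above, T(n) = 2(n+1)(H(n+1) - 1), can be compared with the logarithmic approximation 2(n+1)[loge(n+1) - loge2]; both grow at the same n log n rate, with the ratio slowly drifting toward 1.

import math

def t_closed(n):
    # Closed form from the telescoping sum: T(n) = 2(n+1)(H(n+1) - 1),
    # where H(m) = 1 + 1/2 + ... + 1/m is the m-th harmonic number
    h = sum(1.0 / k for k in range(1, n + 2))
    return 2 * (n + 1) * (h - 1)

for n in (10, 100, 1000, 10000):
    approx = 2 * (n + 1) * (math.log(n + 1) - math.log(2))
    print(n, round(t_closed(n)), round(approx), round(t_closed(n) / approx, 3))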


AVERAGE CASE : T(n) = O(n log n)
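Finally, the bound can be checked against the algorithm itself. The sketch below is mine, not the post's code, and for brevity it uses the simpler Lomuto partition (first element as pivot) rather than the two-pointer scheme described above; this changes the constant factor slightly but not the growth rate. Averaged over random permutations, the comparison count grows like n log n, in line with the result above.

import math
import random

def quicksort_comparisons(a):
    # Sorts a in place with a first-element pivot (Lomuto partition) and
    # returns the number of key comparisons made.
    count = 0
    stack = [(0, len(a) - 1)]          # explicit stack instead of recursion
    while stack:
        low, high = stack.pop()
        if low >= high:
            continue
        pivot = a[low]
        i = low
        for j in range(low + 1, high + 1):
            count += 1                 # one key comparison per scanned element
            if a[j] < pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[low], a[i] = a[i], a[low]    # pivot moves to its final position i
        stack.append((low, i - 1))
        stack.append((i + 1, high))
    return count

random.seed(1)
for n in (100, 1000, 10000):
    trials = 20
    avg = sum(quicksort_comparisons(random.sample(range(n), n)) for _ in range(trials)) / trials
    print(n, round(avg), round(2 * n * math.log(n)))   # both columns grow like n log n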
