MHB Theta(n^2) running time of Quicksort Algorithm

Summary
The discussion revolves around proving that a specific scenario of the Quicksort algorithm has a running time of Theta(n^2). The user describes a case where the array is partitioned into sub-arrays of sizes i and i(j-1), with i being a constant. They derive a recurrence relation and express concerns about how to demonstrate that the resulting summation leads to n^2. A participant suggests that if the user can show T(ij) equals Theta(j^2), they can conclude that the running time is indeed Theta(n^2), since i remains constant as n increases. The conversation emphasizes the relationship between the parameters and the asymptotic behavior of the algorithm.
ATroelstein
I am trying to prove the following scenario of Quicksort has a running time of Theta(n^2). Initially, we only have arrays of size n, where n = ij. The idea is that at every partition step of Quicksort, you end up with two sub-arrays where one is of size i and the other is of size i(j-1). i in this case is an integer constant greater than 0. I have drawn out the recursive tree of some examples and understand why this is a worst-case scenario and that the running time will be Theta(n^2). To prove this, I've used repeated substitution to solve the recurrence equation:

T(n) = T(ij) = m if j = 1, where m just denotes a constant
T(n) = T(ij) = T(i) + T(i(j-1)) + cn if j > 1

T(i) = m
T(2i) = m + m + c*2i = 2m + 2ci
T(3i) = m + 2m + 2ci + 3ci = 3m + 5ci

So it looks like the solution to the recurrence is:

$$T(n) = jm + ci\left(\sum_{k=1}^{j}k - 1\right)$$

At this point, I'm a bit lost as to what to do. It looks like the summation at the end will be on the order of j^2 when expanded out, but I need to show how that relates to n^2. Any explanation of how to continue from here would be appreciated.
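As a quick sanity check of the closed form above, the following Python sketch compares it against the recurrence for small j. The function names and the values of i, m and c are arbitrary placeholders chosen only for illustration; they are not part of the problem statement.

Code:
# Sanity check of the conjectured closed form against the recurrence.
# i, m, c below are arbitrary placeholder values, not from the problem.

def T_recurrence(j, i, m, c):
    """T(ij) = m if j == 1, else T(i) + T(i(j-1)) + c*i*j (here n = i*j)."""
    if j == 1:
        return m
    return T_recurrence(1, i, m, c) + T_recurrence(j - 1, i, m, c) + c * i * j

def T_closed(j, i, m, c):
    """Conjectured closed form: T(ij) = j*m + c*i*(sum_{k=1}^{j} k - 1)."""
    return j * m + c * i * (j * (j + 1) // 2 - 1)

i, m, c = 3, 1, 1
for j in range(1, 20):
    assert T_recurrence(j, i, m, c) == T_closed(j, i, m, c)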
 
ATroelstein said:
I am trying to prove the following scenario of Quicksort has a running time of Theta(n^2). Initially, we only have arrays of size n, where n = ij. The idea is that at every partition step of Quicksort, you end up with two sub-arrays where one is of size i and the other is of size i(j-1). i in this case is an integer constant greater than 0. I have drawn out the recursive tree of some examples and understand why this is a worst-case scenario and that the running time will be Theta(n^2). To prove this, I've used repeated substitution to solve the recurrence equation:

T(n) = T(ij) = m if j = 1, where m just denotes a constant
T(n) = T(ij) = T(i) + T(i(j-1)) + cn if j > 1

T(i) = m
T(2i) = m + m + c*2i = 2m + 2ci
T(3i) = m + 2m + 2ci + 3ci = 3m + 5ci

So it looks like the solution to the recurrence is:

$$T(n) = jm + ci\left(\sum_{k=1}^{j}k - 1\right)$$

At this point, I'm a bit lost as to what to do. It looks like the summation at the end will be on the order of j^2 when expanded out, but I need to show how that relates to n^2. Any explanation of how to continue from here would be appreciated.


Why are you factorising \(n\)?

In the worst case the pivot splits the array of size \(n\) into sub-arrays of size \(0\) and size \(n-1\). This splitting requires \(n-1\) comparisons, so:

\[T_{wc}(n)= k(n-1)+T_{wc}(n-1)=k\sum_{r=1}^{n-1} r=k\,\frac{n(n-1)}{2}=\frac{k}{2}(n^2-n)\]

which implies that \(T_{wc}\in \Theta(n^2)\)

CB
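For a concrete feel for that count, here is a rough Python sketch (not from the thread) that runs a Quicksort which always pivots on the first element over already-sorted input, which forces the 0 / n-1 split at every level, and counts one comparison per non-pivot element. The function name and the choice of input are illustrative assumptions.

Code:
# Rough illustration of the worst case described above: first-element pivot on
# already-sorted input gives splits of sizes 0 and n-1 at every level.

def quicksort_comparisons(a):
    """Return the number of element-to-pivot comparisons made."""
    if len(a) <= 1:
        return 0
    pivot, left, right = a[0], [], []
    for x in a[1:]:                      # one comparison per non-pivot element
        (left if x < pivot else right).append(x)
    return (len(a) - 1) + quicksort_comparisons(left) + quicksort_comparisons(right)

n = 100
assert quicksort_comparisons(list(range(n))) == n * (n - 1) // 2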
 
CaptainBlack said:
Why are you factorising \(n\)?

In the worst case the pivot splits the array of size \(n\) into sub-arrays of size \(0\) and size \(n-1\). This splitting requires \(n-1\) comparisons, so:

\[T_{wc}(n)= k(n-1)+T_{wc}(n-1)=k\sum_{r=1}^{n-1} r=k\,\frac{n(n-1)}{2}=\frac{k}{2}(n^2-n)\]

which implies that \(T_{wc}\in \Theta(n^2)\)

CB
Thank you for your reply, CaptainBlack. I am factorizing n because the specific problem I am working on asks me to show that, while the case you described is the absolute worst-case scenario (sub-arrays of size 0 and n-1), the scenario I gave also has a running time of Theta(n^2). I am able to find many examples of the scenario you gave in textbooks and online, but unfortunately I am unable to find any examples of the case I have given above, and that's why I've become so stuck at this point. Thanks.
 
ATroelstein said:
Thank you for your reply, CaptainBlack. I am factorizing n because the specific problem I am working on asks me to show that, while the case you described is the absolute worst-case scenario (sub-arrays of size 0 and n-1), the scenario I gave also has a running time of Theta(n^2). I am able to find many examples of the scenario you gave in textbooks and online, but unfortunately I am unable to find any examples of the case I have given above, and that's why I've become so stuck at this point. Thanks.

Can you tell us exactly what case you are considering?

In particular, how do the i's and j's behave asymptotically as n becomes large?

CB
 
Since i is an integer constant, its asymptotic behavior doesn't depend on n. The value of j, however, grows as n increases, since n = ij. To make my scenario a little clearer, consider the case where i = 1. This falls into my scenario and is just one step away from the worst case you described, as now the partition produces sub-arrays of size 1 and n-1. The next case in my scenario would be i = 2, giving sub-arrays of size 2 and n-2, and so on. So now I'm trying to show that no matter what the integer constant i is, the running time is still Theta(n^2), since j becomes large as n becomes large but i does not. Thank you.
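To make the shape of that recursion concrete, here is a small sketch (the function name and values are picked only for illustration) that lists the sub-array sizes produced at each partition level when one side always has size i:

Code:
# Sub-array sizes at each partition level for a fixed i (illustrative values).

def split_sequence(n, i):
    """Return the (i, n - i) splits produced until only a size-i piece remains."""
    splits = []
    while n > i:
        splits.append((i, n - i))
        n -= i
    return splits

print(split_sequence(12, 3))   # [(3, 9), (3, 6), (3, 3)]
print(split_sequence(12, 2))   # [(2, 10), (2, 8), (2, 6), (2, 4), (2, 2)]

There are j - 1 such levels, and the work at each level is proportional to the size of the piece being partitioned, which is what drives the quadratic total.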
 
ATroelstein said:
Since i is an integer constant, its asymptotic behavior doesn't depend on n. The value of j, however, grows as n increases, since n = ij. To make my scenario a little clearer, consider the case where i = 1. This falls into my scenario and is just one step away from the worst case you described, as now the partition produces sub-arrays of size 1 and n-1. The next case in my scenario would be i = 2, giving sub-arrays of size 2 and n-2, and so on. So now I'm trying to show that no matter what the integer constant i is, the running time is still Theta(n^2), since j becomes large as n becomes large but i does not. Thank you.

If you can show that:

\[T(ij)=\Theta(j^2)\]

you are done, since in this context \(i\) is a constant and so \(\Theta(j^2)=\Theta(i^2j^2)=\Theta(n^2) \)

CB
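One way to finish along these lines (a sketch only, using the closed form derived in the first post, not CB's worked answer): expanding the closed form gives

\[T(ij) = jm + ci\left(\frac{j(j+1)}{2} - 1\right) = \frac{ci}{2}\,j^2 + \left(m + \frac{ci}{2}\right)j - ci \in \Theta(j^2),\]

and since \(j = n/i\) with \(i\) a fixed positive constant,

\[\Theta(j^2) = \Theta\!\left(\frac{n^2}{i^2}\right) = \Theta(n^2).\]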
 