Complexity of this heap algorithm

  • Thread starter fiksx
I'm trying to work out the running time of BUILD-HEAP in the heapsort algorithm.

Code:
    BUILD-HEAP(A)
    heapsize := size(A);
      for i := floor(heapsize/2) downto 1
        do HEAPIFY(A, i);
      end for
    END
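For concreteness, here is a runnable sketch of the pseudocode above in Python (my own translation, not from the thread; it assumes a max-heap and uses 0-based indexing rather than the 1-based indexing of the pseudocode):

```python
def heapify(a, i, heapsize):
    """Sift the element at index i down until the max-heap property holds."""
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < heapsize and a[left] > a[largest]:
        largest = left
    if right < heapsize and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        heapify(a, largest, heapsize)

def build_heap(a):
    heapsize = len(a)
    # Indices heapsize//2 .. heapsize-1 are leaves, so start at the
    # last internal node and work up to the root.
    for i in range(heapsize // 2 - 1, -1, -1):
        heapify(a, i, heapsize)

demo = [1, 3, 5, 7, 2, 6, 4]
build_heap(demo)
print(demo[0])  # → 7 (the root now holds the maximum)
```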
The basic idea behind why the time is linear is due to the fact that the time complexity of heapify depends on where it is within the heap. It takes O(1)O(1)O(1) time when the node is a leaf node (which makes up at least half of the nodes) and O(logn)O(\log n)O(logn) time when it’s at the root.


The ## O(n) ## time can be proven by solving the following:
##\sum_{h=0}^{\lfloor \lg n \rfloor} \frac{n}{2^{h+1}} O(h) = O\left(n \sum_{h=0}^{\lfloor \lg n \rfloor} \frac{h}{2^{h}}\right)##
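The sum on the right converges: ##\sum_{h \ge 0} h/2^h = 2##, which is why multiplying by ##n## gives a linear total. A quick numeric sanity check (my own sketch, not from the thread):

```python
# Partial sums of h / 2^h approach 2, so n * sum(h / 2^h) is O(n).
partial = sum(h / 2 ** h for h in range(60))
print(partial)  # approaches 2
```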

suppose this is the tree:

Code:
          4          .. height 2
         / \
        2   6        .. height 1
       / \ / \
      1  3 5  7      .. height 0

what I understand here: ##O(h)## means the worst case for heapify for each node, so height ##= ln_2 n## if the node is the root. For example, to heapify nodes 2,1,3 it takes ##ln_2 3 = 1.5##; the height of root node 2 is 1, so the call to HEAPIFY is ##ln_2 n = height = O(h)##

I'm not sure about this $$\frac{n}{2^{h+1}}$$ ; is it the number of nodes for any given height? Suppose height is 1 and the sum of nodes is 3, such as 2,1,3, so ##\frac{n}{2^{h+1}} = \frac{3}{2^{0+1}} = 1.5 = 2##; when height is 0 there are at most two nodes. Am I correct?

Suppose the given height is 0, so it is the last layer; then when the sum of nodes is 7, ##\frac{n}{2^{h+1}} = \frac{7}{2^{0+1}} = \frac{7}{2} = 3.5 = 4?## -> {1,3,5,7} if the root is 4


is the summation up to ##lg n## because it sums the total height when it does heapify?

and last, to count the big-O: BUILD-HEAP will call HEAPIFY ##\frac{n}{2}## times, and each call will be ##ln_2 n## = the height of the root, so ##\frac{n}{2} \cdot ln_2 n##?


Please correct me if I am wrong, thanks!

This is the reference I used; I also read about this in the CLRS book.


[1]: https://i.stack.imgur.com/KYOjp.png
 
Are you still looking for help? I'm not sure why no one else has replied, although your post is a bit hard to read; if you put more effort into composing a post, you are more likely to get a helpful answer. Anyway, here goes...

The basic idea behind why the time is linear is due to the fact that the time complexity of heapify depends on where it is within the heap. It takes O(1)O(1)O(1) time when the node is a leaf node (which makes up at least half of the nodes) and O(logn)O(\log n)O(logn) time when it’s at the root.
This has just been copied and pasted from the referenced article: do you understand what it is saying? If so, why did you not correct O(1)O(1)O(1) and O(logn)O(\log n)O(logn) when these should be ## O(1) ## and ## O(\log n) ##?

what I understand here: ##O(h)## means the worst case for heapify for each node, so height ##= ln_2 n## if the node is the root. For example, to heapify nodes 2,1,3 it takes ##ln_2 3 = 1.5##; the height of root node 2 is 1, so the call to HEAPIFY is ##ln_2 n = height = O(h)##
I don't understand what this means. Do you understand why in Big-O notation we only use ## \log ## without specifying the base, not ## \ln = \log_e ## or ## \log_2 ## etc?

I'm not sure about this $$\frac{n}{2^{h+1}}$$ ; is it the number of nodes for any given height? Suppose height is 1 and the sum of nodes is 3, such as 2,1,3, so ##\frac{n}{2^{h+1}} = \frac{3}{2^{0+1}} = 1.5 = 2##; when height is 0 there are at most two nodes. Am I correct?
Firstly, you refer to the 'sum of nodes' when I think you mean the 'number of nodes' or alternatively the 'count of nodes'.

Secondly, the number of nodes ## n ## in the expression you quote is fixed - it is 7 for your example tree. So for ## h = 0 ## there are exactly $$ \lceil \frac{n}{2^{h+1}} \rceil = \lceil \frac{7}{2^{0+1}} \rceil = \lceil 3.5 \rceil = 4 $$
nodes at layer 0 (please do not write 1.5 = 2).
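That ceiling formula is easy to check numerically; here is a quick sketch (my own, not from the thread, assuming a complete binary tree with ## n = 7 ##):

```python
import math

# Number of nodes at each height h in a 7-node complete binary tree:
# the bound ceil(n / 2^(h+1)) should give 4 leaves, 2 internal nodes,
# and 1 root for h = 0, 1, 2.
n = 7
counts = [math.ceil(n / 2 ** (h + 1)) for h in range(int(math.log2(n)) + 1)]
print(counts)  # → [4, 2, 1]
```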

is the summation up to ##lg n## because it sums the total height when it does heapify?

and last, to count the big-O: BUILD-HEAP will call HEAPIFY ##\frac{n}{2}## times, and each call will be ##ln_2 n## = the height of the root, so ##\frac{n}{2} \cdot ln_2 n##?
Again I am not sure what you are saying, or of your mathematical knowledge - do you understand the concept of limits? What we are interested in is not the actual number of times we heapify for any particular heap, rather what is the limit as the size of the heap tends to ## \infty ##.
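To make that concrete, here is a rough empirical sketch (my own instrumented build-heap with an iterative sift-down, not code from the thread): if build-heap were ## \Theta(n \log n) ## the ratio of comparisons to ## n ## would keep growing with ## n ##; instead it stays bounded.

```python
def build_heap_comparisons(n):
    """Build a max-heap from ascending input and count key comparisons."""
    a = list(range(n))  # ascending input: every sifted element sinks far
    size = len(a)
    count = 0
    for i in range(size // 2 - 1, -1, -1):
        j = i
        while True:
            largest, left, right = j, 2 * j + 1, 2 * j + 2
            if left < size:
                count += 1
                if a[left] > a[largest]:
                    largest = left
            if right < size:
                count += 1
                if a[right] > a[largest]:
                    largest = right
            if largest == j:
                break
            a[j], a[largest] = a[largest], a[j]
            j = largest
    return count

for n in (2 ** 10, 2 ** 14, 2 ** 18):
    print(n, build_heap_comparisons(n) / n)  # ratio stays bounded
```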

PS the CLRS book you mention is really good, but unless you have a firm grasp of calculus or at least limit theory I would recommend the Steven Skiena book mentioned in the article instead (or as well).
 

