Complexity of this heap algorithm

In summary: BUILD-HEAP will call HEAPIFY ##\frac{n}{2}## times, and each call costs at most ##\log_2 n## = the height of the root?
  • #1
fiksx
I'm trying to work out the running time of BUILD-HEAP in the heapsort algorithm.

Code:
    BUILD-HEAP(A)
      heapsize := size(A);
      for i := floor(heapsize/2) downto 1
        do HEAPIFY(A, i);
      end for
    END
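
For reference, here is the same procedure written out as runnable Python (my own translation of the pseudocode, not from the book: a max-heap with 0-based indexing, so the children of index i are 2i+1 and 2i+2):

Code:
    def heapify(a, i, heapsize):
        # Sift a[i] down until the subtree rooted at i is a max-heap.
        while True:
            left, right = 2 * i + 1, 2 * i + 2
            largest = i
            if left < heapsize and a[left] > a[largest]:
                largest = left
            if right < heapsize and a[right] > a[largest]:
                largest = right
            if largest == i:
                return  # already in place: O(1) when a[i] is a leaf
            a[i], a[largest] = a[largest], a[i]
            i = largest  # descend; at most height-of-node iterations in total

    def build_heap(a):
        # Call heapify on every non-leaf node, bottom-up.
        heapsize = len(a)
        for i in range(heapsize // 2 - 1, -1, -1):
            heapify(a, i, heapsize)

    a = [4, 2, 6, 1, 3, 5, 7]
    build_heap(a)
    print(a)  # [7, 3, 6, 1, 2, 5, 4], a valid max-heap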

The basic idea behind why the time is linear is due to the fact that the time complexity of heapify depends on where it is within the heap. It takes O(1)O(1)O(1) time when the node is a leaf node (which makes up at least half of the nodes) and O(logn)O(\log n)O(logn) time when it’s at the root. The O(n)O(n)O(n) time can be proven by solving the following:
##\sum_{h=0}^{\lfloor \lg n \rfloor} \left\lceil \frac{n}{2^{h+1}} \right\rceil O(h) = O\!\left(n \sum_{h=0}^{\lfloor \lg n \rfloor} \frac{h}{2^{h}}\right)##

suppose this is the tree:

Code:
       4          .. height 2
      / \
     2   6        .. height 1
    / \ / \
    1 3 5 7       .. height 0

What I understand is that ##O(h)## here means the worst case for HEAPIFY at each node, and that height is ##\log_2 n## when the node is the root. For example, to heapify the subtree 2, 1, 3 it takes ##\log_2 3 \approx 1.58##, and the height of the root node 2 is 1, so the cost of a call to HEAPIFY is ##\log_2 n = \text{height} = O(h)##?

I'm not sure about this ##\frac{n}{2^{h+1}}##: is it the number of nodes for any given height? Suppose the height is 1 and the sum of nodes is 3, such as 2, 1, 3, so ##\frac{n}{2^{h+1}} = \frac{3}{2^{0+1}} = 1.5 = 2##, and when the height is 0 there are at most two nodes. Am I correct?

Suppose the given height is 0, so it is the last layer; then when the sum of nodes is 7, ##\frac{n}{2^{h+1}} = \frac{7}{2^{0+1}} = \frac{7}{2} = 3.5 = 4##? That gives {1, 3, 5, 7} if the root is 4. And is the summation up to ##\lg n## because it sums over the total height it covers when it does heapify?

And last is to compute the big-O: BUILD-HEAP will call HEAPIFY ##\frac{n}{2}## times, and each call will cost ##\log_2 n## = the height of the root, so ##\frac{n}{2} \cdot \log_2 n##? Please correct me if I am wrong, thanks!

This is the reference I used: https://www.growingwiththeweb.com/data-structures/binary-heap/build-heap-proof/ and I also read about this in the CLRS book [1].
[1]: https://i.stack.imgur.com/KYOjp.png
[2]: https://www.HostMath.com/Show.aspx?Code=\sum_{h=0}^ {\lfloor lg n \rfloor} \frac{n}{2^{h+1}}O(h)=O(n\sum_{h=0}^ {lg n}\frac{h}{2^{h}})
 
  • #2
Are you still looking for help? I'm not sure why no one else has replied, although your post is a bit hard to read; if you put more effort into composing a post you are more likely to get a helpful answer. Anyway, here goes...

fiksx said:
The basic idea behind why the time is linear is due to the fact that the time complexity of heapify depends on where it is within the heap. It takes O(1)O(1)O(1) time when the node is a leaf node (which makes up at least half of the nodes) and O(logn)O(\log n)O(logn) time when it’s at the root.
This has just been copied and pasted from the referenced article: do you understand what it is saying? If so, why did you not correct O(1)O(1)O(1) and O(logn)O(\log n)O(logn) when these should be ## O(1) ## and ## O(\log n) ##?

fiksx said:
What I understand is that ##O(h)## here means the worst case for HEAPIFY at each node, and that height is ##\log_2 n## when the node is the root. For example, to heapify the subtree 2, 1, 3 it takes ##\log_2 3 \approx 1.58##, and the height of the root node 2 is 1, so the cost of a call to HEAPIFY is ##\log_2 n = \text{height} = O(h)##?
I don't understand what this means. Do you understand why in Big-O notation we only use ## \log ## without specifying the base, not ## \ln = \log_e ## or ## \log_2 ## etc?
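(Hint, spelling out what I mean: ##\log_2 n = \dfrac{\ln n}{\ln 2}##, so switching the base only changes the value by a constant factor, and Big-O notation ignores constant factors.)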

fiksx said:
I'm not sure about this ##\frac{n}{2^{h+1}}##: is it the number of nodes for any given height? Suppose the height is 1 and the sum of nodes is 3, such as 2, 1, 3, so ##\frac{n}{2^{h+1}} = \frac{3}{2^{0+1}} = 1.5 = 2##, and when the height is 0 there are at most two nodes. Am I correct?
Firstly you refer to the 'sum of nodes' when I think you mean 'number of nodes' or alternatively 'count of nodes'.

Secondly, the number of nodes ## n ## in the expression you quote is fixed - it is 7 for your example tree. So for ## h = 0 ## there are exactly $$ \lceil \frac{n}{2^{h+1}} \rceil = \lceil \frac{7}{2^{0+1}} \rceil = \lceil 3.5 \rceil = 4 $$
nodes at layer 0 (please do not write 1.5 = 2).
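
A quick way to check this counting for yourself (a throwaway script of my own, not from CLRS or the article):

Code:
    import math

    def nodes_at_height(n, h):
        # CLRS upper bound on the number of nodes of height h in an n-node heap:
        # ceil(n / 2^(h+1)); it is exact for a perfect tree such as n = 7.
        return math.ceil(n / 2 ** (h + 1))

    n = 7
    for h in range(math.floor(math.log2(n)) + 1):
        print(h, nodes_at_height(n, h))
    # prints:
    # 0 4   (the four leaves 1, 3, 5, 7)
    # 1 2   (the two internal nodes 2 and 6)
    # 2 1   (the root 4)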

fiksx said:
And is the summation up to ##\lg n## because it sums over the total height it covers when it does heapify?

And last is to compute the big-O: BUILD-HEAP will call HEAPIFY ##\frac{n}{2}## times, and each call will cost ##\log_2 n## = the height of the root, so ##\frac{n}{2} \cdot \log_2 n##?
Again I am not sure what you are saying, or of your mathematical background: do you understand the concept of limits? What we are interested in is not the actual number of operations for any particular heap, but rather how that number grows as the size of the heap tends to ## \infty ##.
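
To sketch where the linear bound actually comes from (this is essentially the argument in the article you linked and in CLRS, not anything new): the per-node cost ##O(h)## is weighted by a geometrically shrinking number of nodes, and the resulting series converges to a constant,
$$\sum_{h=0}^{\lfloor \lg n \rfloor} \left\lceil \frac{n}{2^{h+1}} \right\rceil O(h) = O\!\left(n \sum_{h=0}^{\infty} \frac{h}{2^{h}}\right) = O(2n) = O(n),$$
because differentiating the geometric series ##\sum_{h \ge 0} x^{h} = \frac{1}{1-x}## gives ##\sum_{h \ge 0} h x^{h} = \frac{x}{(1-x)^{2}}##, which equals 2 at ##x = \tfrac{1}{2}##. So your bound of ##\frac{n}{2} \log_2 n## is valid, but it is not tight.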

PS the CLRS book you mention is really good, but unless you have a firm grasp of calculus or at least limit theory I would recommend the Steven Skiena book mentioned in the article instead (or as well).
 

1. What is a heap algorithm?

A heap is a tree-based data structure that keeps its elements organized so that the largest or smallest element can be retrieved efficiently; heap algorithms are the operations that build and maintain this structure. Heaps are commonly used to implement priority queues and sorting algorithms such as heapsort.

2. How does a heap algorithm work?

A heap is stored as a binary tree in which each node has at most two children, and every parent is no smaller (max-heap) or no larger (min-heap) than its children, so the root always holds the highest or lowest value. This ordering allows efficient retrieval of the top element, as well as efficient insertion and deletion of elements.
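
For instance, a minimal session with Python's standard heapq module (a min-heap, shown purely as an illustration) demonstrates these operations:

Code:
    import heapq

    data = [4, 2, 6, 1, 3, 5, 7]
    heapq.heapify(data)          # rearrange in place into a min-heap, O(n)
    print(data[0])               # 1 -- the smallest element sits at the root
    heapq.heappush(data, 0)      # insertion, O(log n)
    print(heapq.heappop(data))   # 0 -- remove and return the smallest, O(log n)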

3. What makes the complexity of a heap algorithm important?

The complexity of a heap algorithm is important because it determines the efficiency of the algorithm. This includes the time and space complexity, which can impact the performance of the algorithm in terms of speed and memory usage. Understanding the complexity of a heap algorithm can help in choosing the best algorithm for a specific problem.

4. What is the time complexity of a heap algorithm?

The time complexity depends on the specific operation being performed. Peeking at the top element takes O(1) time, while inserting an element or deleting the top element takes O(log n) time, where n is the number of elements in the heap; building a heap from an unsorted array takes O(n) time, as discussed above. This means the cost of each operation grows only slowly as the size of the data increases.
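
As a rough illustration (numbers of my own choosing, not from the thread): with ##n = 10^6## elements, a single insertion or deletion touches at most about ##\lceil \log_2 10^6 \rceil = 20## levels of the tree, whereas scanning an unsorted array for the minimum would touch all ##10^6## elements.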

5. What are the advantages of using a heap algorithm?

There are several advantages to using a heap, including efficient retrieval of the top element, efficient insertion and deletion, and the ability to keep the data partially ordered so that the minimum or maximum is always immediately available. Heaps are also the natural way to implement priority queues, which are commonly used in scheduling and resource-allocation problems.
