MinusTheBear
I am learning about Big-O notation, and am kind of confused. So, Big-O notation is an upper bound that measures the amount of required resources as n approaches a sufficient size. It essentially measures the scalability of an algorithm. My question, however, is with the notation. If an algorithm is O(n), this means it is linear and that the algorithm never exceeds this upper bound. I realize this means that the algorithm is technically also O(n^c) for any c ≥ 1.
Now, my question is two-fold, and maybe I've answered them myself:
1) Can the ACTUAL growth rate of an O(n) algorithm EXCEED n?
2) Is the actual growth rate of an O(n) algorithm in the worst case, then, n?
Based on my understanding, it would seem that the growth of an O(n) algorithm can exceed n, since the actual growth rate may be kn + m, where k and m are positive integers. Which leads me to believe that the actual growth rate in the worst case is not n.
Sorry if this all seems obvious, it all just seems very weird to me and very broad in what it actually means.
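To make the question concrete, here is a small sketch (the function and constants are hypothetical, chosen only for illustration) showing a growth rate kn + m that exceeds n for every input, yet still satisfies the formal definition of O(n), which only requires f(n) ≤ c·n for some constant c and all n past some threshold n0:

```python
# Hypothetical cost function f(n) = 3n + 5, i.e. kn + m with k = 3, m = 5.
def f(n):
    return 3 * n + 5

# f(n) exceeds n for every positive n...
assert all(f(n) > n for n in range(1, 1000))

# ...yet f is still O(n): pick c = 4 and n0 = 5, and 3n + 5 <= 4n
# holds for all n >= 5, which is exactly what the definition demands.
c, n0 = 4, 5
assert all(f(n) <= c * n for n in range(n0, 1000))
```

So "O(n)" never claims the cost is at most n itself, only at most some constant multiple of n once n is large enough, which is why constant factors like k and offsets like m disappear in the notation.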