Was the Professor's Big O Notation Example Correct?

  • Thread starter: Allday
  • Tags: Analysis
SUMMARY

The discussion centers on the interpretation of Big O notation, specifically the example given in an MIT OpenCourseWare lecture where the professor states that 2n² = O(n³). Participants agree that the statement is technically correct, since O(n³) is an upper bound for 2n², but it is not the tightest bound; the most precise statement would be 2n² = O(n²). The conversation highlights the importance of distinguishing between the formal definition and practical usage in computer science, emphasizing that while both statements are valid, clarity and precision matter in algorithm analysis.

PREREQUISITES
  • Understanding of Big O notation and its formal definition
  • Familiarity with algorithm complexity analysis
  • Basic knowledge of asymptotic notation
  • Experience with algorithm examples, such as merge sort and polynomial functions
NEXT STEPS
  • Study the formal definition of Big O notation in detail
  • Learn about the differences between upper bounds and tight bounds in algorithm analysis
  • Explore examples of common algorithms and their complexities, such as O(n log n) for merge sort
  • Review resources like "The Algorithm Design Manual" by Steven Skiena for deeper insights into algorithm analysis
USEFUL FOR

Computer science students, software engineers, and anyone involved in algorithm design and analysis will benefit from this discussion, particularly those looking to clarify their understanding of Big O notation and its applications.

Allday
I decided to watch an MIT OpenCourseWare class on algorithms and the second lecture has me checking my understanding. A link to the lecture is here (http://ocw.mit.edu/courses/electric...ation-recurrences-substitution-master-method/)

My question is pretty basic (I think). Right in the beginning of the lecture (2:50 on the video) the prof. writes on the blackboard,

2 n^2 = O(n^3)

as an example. I was thinking that it should be O(n^2), but then I realized the statement is true as written (2 n^2 grows slower than n^3 ... and n^4 and n^5 ...); the statement with the most information, though, is 2 n^2 = O(n^2). Do you think it was a mistake on the blackboard, or did he do it to show that O(n^3) is also true?
 
I was under the impression that you could ignore constant factors and lesser terms when deducing the Big O of a function. So 2n^2 => O(n^2).

Or 5n^3 + 8000000000n^2 + 4612876198274194109438109284n + 8 => O(n^3). When you reach large values of n, the n^3 term eclipses the other terms.
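To make that concrete, here is a quick numerical check (my own illustrative sketch, not from the thread or the lecture) showing the n^3 term eclipsing even enormous lower-order coefficients:

# Illustrative sketch: for large n, the n^3 term dominates the polynomial
# no matter how large the constants on the lower-order terms are.

def f(n):
    return 5 * n**3 + 8_000_000_000 * n**2 + 4_612_876_198_274_194_109_438_109_284 * n + 8

for n in (10, 10**6, 10**12, 10**24):
    print(n, f(n) / (5 * n**3))
# The ratio f(n) / (5 n^3) tends to 1 as n grows, so f(n) = O(n^3).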
 
I agree:
2 n^2 => O(n^2)
 
If I recall correctly, big O notation refers to an upper bound. So strictly speaking, an algorithm that is O(n^2) is also O(n^3), because the latter will be larger than the former. However, it is best to keep your upper bound reasonably tight: simply stating that an algorithm is O(infinity) is technically always true, and also never useful (unless your algorithm will literally never finish).
 
DimReg said:
If I recall correctly, big O notation refers to an upper bound. So strictly speaking, an algorithm that is O(n^2) is also O(n^3), because the latter will be larger than the former.
There are several uses, but in all cases it is related to asymptotic behavior as n goes to infinity. So 2n^2 => O(n^3) would be inaccurate.
 
.Scott said:
There are several uses, but in all cases it is related to asymptotic behavior as n goes to infinity. So 2n^2 => O(n^3) would be inaccurate.

Inaccurate, but not incorrect, at least according to: http://en.wikipedia.org/wiki/Big_O_notation#Formal_definition

(If you use n^2 as f(n) and n^3 as g(n), the formal definition is easily satisfied.)

This is just semantics, but my guess is that the prof was trying to illustrate this point.
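For reference, here is the formal definition being cited, with constants worked out for this example (my own sketch of the standard definition, not a quote from the lecture or the Wikipedia page):

\[
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ \text{such that}\ |f(n)| \le c\,|g(n)| \ \text{for all } n \ge n_0 .
\]
For $f(n) = 2n^2$ and $g(n) = n^3$, take $c = 2$ and $n_0 = 1$:
\[
2n^2 \le 2n^3 \quad \text{for all } n \ge 1, \qquad \text{so } 2n^2 = O(n^3).
\]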
 
DimReg said:
Inaccurate, but not incorrect, at least according to: http://en.wikipedia.org/wiki/Big_O_notation#Formal_definition

(If you use n^2 as f(n) and n^3 as g(n), the formal definition is easily satisfied.)

This is just semantics, but my guess is that the prof was trying to illustrate this point.
At least in computer science, if you called a merge sort O(n^2), you would be both inaccurate and incorrect - because it is O(n log n).
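For context, here is a minimal merge sort sketch (my own illustration, not from the thread); the recurrence T(n) = 2 T(n/2) + O(n) that it follows is what solves to the O(n log n) bound mentioned above:

# Minimal merge sort sketch (illustrative only).
# Each call splits the input in half and merges in linear time,
# giving T(n) = 2 T(n/2) + O(n), which solves to O(n log n).

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge step: O(n)
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]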
 
Well, I know of at least one computer science textbook that agrees with me, and it's where I learned this definition. The book is http://sist.sysu.edu.cn/~isslxm/DSA/textbook/Skiena.-.TheAlgorithmDesignManual.pdf , and on its 36th page Skiena writes:

"The Big Oh notation provides for a rough notion of equality when comparing functions. It is somewhat jarring to see an expression like n^2 = O(n^3), but its meaning can always be resolved by going back to the definitions in terms of upper and lower bounds. It is perhaps most instructive to read the “=” here as meaning one of the functions that are. Clearly, n^2 is one of the functions that are O(n^3)."

Maybe the greater computer science community disagrees with him on the definition of Big Oh notation.
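Restated in set notation (my own rephrasing of Skiena's point, not a quote):

\[
2n^2 \in O(n^2) \subsetneq O(n^3),
\]
so "2n^2 = O(n^3)" merely asserts membership in the larger class, while O(n^2) is the tighter statement.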
 
Allday said:
I decided to watch an MIT OpenCourseWare class on algorithms and the second lecture has me checking my understanding. A link to the lecture is here (http://ocw.mit.edu/courses/electric...ation-recurrences-substitution-master-method/)

My question is pretty basic (I think). Right in the beginning of the lecture (2:50 on the video) the prof. writes on the blackboard,

2 n^2 = O(n^3)

as an example. I was thinking that it should be O(n^2), but then I realized the statement is true as written (2 n^2 grows slower than n^3 ... and n^4 and n^5 ...); the statement with the most information, though, is 2 n^2 = O(n^2). Do you think it was a mistake on the blackboard, or did he do it to show that O(n^3) is also true?

As already pointed out, there is a formal definition of big O and then there is the typical usage of it in computer science. From the formal standpoint, 2n^2 = O(n^3) is just fine, since it satisfies the formal definition. In other words, O(n^3) is an upper bound for 2n^2; however, it is not the best upper bound we can obtain here.

So I think he was just saying be careful. There is a difference between having an upper bound and having the best upper bound.
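A rough numerical way to see that distinction (my own sketch, using the usual ratio framing, not from the post): if f(n)/g(n) stays bounded, g(n) is an upper bound; if it also stays bounded away from zero, the bound is tight.

# Illustrative sketch: an upper bound vs. the best (tight) upper bound.
# If f(n)/g(n) stays below some constant, then f(n) = O(g(n)).
# If it also stays above some positive constant, the bound is tight.

def f(n):
    return 2 * n**2

for n in (10, 100, 1000, 10_000):
    print(n, f(n) / n**2, f(n) / n**3)
# f(n)/n^2 is constant (= 2):    O(n^2) is a tight bound.
# f(n)/n^3 shrinks toward zero:  O(n^3) is a valid but loose upper bound.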
 
Thanks, everyone. Thinking of the "=" sign not as a true equality but as indicating membership in a set makes it clearer for me.
 
