I have a question about Big Oh notation, in which the Landau symbol [itex]O[/itex] is involved. It is defined as follows: [tex]f(n)=O(g(n))[/tex] if there exist an integer [itex]n_0[/itex] and a constant [itex]C>0[/itex] such that [tex]f(n) \leq C \cdot g(n)[/tex] for all integers [itex]n \geq n_0[/itex]. For example, [tex]f(n)= 5 \cdot n = O(n)[/tex].

Now the equality sign is not to be understood as an equality sign in the common sense, but rather as [tex]f(n) \in O(g(n))[/tex]; see Big Oh Fallacies and Pitfalls and the Wikipedia article. (We have [itex]5 \cdot n = O(n)[/itex] and [itex]2 \cdot n = O(n)[/itex], but [itex]5 \cdot n \neq 2 \cdot n[/itex].) So it is better to think of [tex]O(g(n))[/tex] as a set of functions.

My question: does anyone know why the notation [itex]f(n)=O(g(n))[/itex] became common? I remember my professor mentioning that physicists introduced this sloppy notation.
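To make the point concrete, here is a small sketch (my own illustration, not from any reference) that checks the defining inequality [itex]f(n) \leq C \cdot g(n)[/itex] numerically for chosen witnesses [itex]C[/itex] and [itex]n_0[/itex]. It shows why "=" cannot be ordinary equality: both [itex]5n[/itex] and [itex]2n[/itex] satisfy the definition for [itex]g(n)=n[/itex], yet the functions differ.

```python
def satisfies_big_oh(f, g, C, n0, n_max=10_000):
    """Check f(n) <= C * g(n) for all integers n0 <= n <= n_max.

    This only tests the inequality on a finite range, so it can
    support -- but never prove -- the claim f(n) = O(g(n)).
    """
    return all(f(n) <= C * g(n) for n in range(n0, n_max + 1))

f = lambda n: 5 * n
h = lambda n: 2 * n
g = lambda n: n

# Both functions satisfy the definition with suitable witnesses...
print(satisfies_big_oh(f, g, C=5, n0=1))  # True: 5n is in O(n)
print(satisfies_big_oh(h, g, C=2, n0=1))  # True: 2n is in O(n)

# ...yet the functions themselves are not equal,
# so "5n = O(n) and 2n = O(n)" cannot imply "5n = 2n".
print(f(10) == h(10))  # False
```

So if "=" were transitive and symmetric here, we could wrongly conclude [itex]5n = 2n[/itex]; reading it as set membership avoids the fallacy.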