Operator O in Physics: Definition & Uses


Discussion Overview

The discussion centers on the operator O in the context of asymptotic notation used in mathematics and computer science, particularly in relation to growth rates of functions. Participants explore its definition, applications, and the distinctions between big O, little o, and other related notations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants seek a clear definition of the operator O and its applications in physics and algorithms.
  • Others explain that O(f(x)) indicates an upper bound on the growth of f(x), while o(f(x)) suggests a function that grows slower than f(x).
  • A participant notes that O is often used to describe worst-case scenarios in algorithms, while little o may not have as intuitive an origin.
  • There are discussions about the relationship between O and o, with some arguing that o can always be replaced by O, but not vice versa.
  • Some participants express preferences for using set notation (f(n) ∈ O(g(n))) to clarify the meaning of O as a set of functions rather than a single function.
  • Concerns are raised about the proper use of the equality sign in the context of asymptotic notation, with some preferring to avoid its misuse.
  • Participants discuss the implications of using Θ notation, which indicates that two functions grow at the same rate.
  • There is mention of the Vinogradov notation as an alternative to big O notation.

Areas of Agreement / Disagreement

Participants express differing views on the definitions and implications of big O and little o notations, with no consensus reached on the best practices for their use. The discussion remains unresolved regarding the intuitive understanding of little o compared to big O.

Contextual Notes

Some participants highlight the limitations of their definitions and the potential for confusion regarding the use of asymptotic notation, particularly in relation to specific functions and their growth rates.

pivoxa15
This operator, capital O, is used in physics such as electrodynamics and can be placed in front of functions (i.e. Of(x)). Could someone explain it clearly with a definition? In what situations is it used?
 
It seems the order O of f(x), which is g(x) (i.e. f(x) = O(g(x))), gives the largest order of magnitude of growth for f(x).

Whereas the little order o pairs f(x) with a function that is one order of magnitude larger than the largest order of growth of f(x). Although I haven't rigorously defined "order of magnitude", I had the limit definitions of these notations in mind. Why would they use o to describe a larger function than O?
 
I don't know the history behind the notation, but I'm sure you could find it if you're really interested.

A function f is O(g(x)) if f grows no faster than g asymptotically. It is o(g(x)) if f grows strictly slower than g asymptotically.

Small o is useful when you have a function (or algorithm) that grows (runs) faster than one function but slower than another. The best factoring algorithms, for example, are o(2^n) but Ω(n^c). (Ω is just the opposite of o: f is Ω(g(x)) if f grows at least as fast as g asymptotically.)
 
I also realized that
if f(x)=o(g(x)) then f(x)=O(g(x))
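A quick sketch of why this implication holds, assuming the limit definition of little-o (this derivation is my own addition, not from the thread):

```latex
f(x) = o(g(x)) \iff \lim_{x\to\infty}\frac{|f(x)|}{|g(x)|} = 0
\;\Longrightarrow\; \exists N \text{ such that } \frac{|f(x)|}{|g(x)|} \le 1 \text{ for all } x \ge N
\;\Longrightarrow\; |f(x)| \le 1 \cdot |g(x)| \text{ for all } x \ge N
\;\Longrightarrow\; f(x) = O(g(x)).
```

The converse fails: taking f = g gives f = O(g) but the ratio is 1, not 0, so f ≠ o(g).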
 
Yeah. O is like <=, o like <, Ω like >=. There is also little-omega (>), but that's rare.
 
Asymptotic notation has to do with "within a constant multiple" rather than just strictly greater than or less than. You write
f(n) = O(g(n)) iff there exist some N and some c > 0 such that for all n >= N, you have 0 <= f(n) <= c * g(n)

One book I have uses f(n) ∈ O(g(n)) instead of f(n) = O(g(n)) to emphasize that O(g(n)) is actually a set of functions: those bounded asymptotically by a constant multiple of g(n).
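To make the constants in that definition concrete, here is a minimal numerical sketch; the function `holds_for_witness` and the sample witness (c = 3, N = 2) are my own illustration, not from the thread:

```python
# Check a big-O witness numerically: given candidate constants c and N,
# verify 0 <= f(n) <= c * g(n) for n = N .. N + samples - 1.
# (A finite check cannot prove the bound for all n; it can only fail
# fast on a bad witness.)

def holds_for_witness(f, g, c, N, samples=1000):
    return all(0 <= f(n) <= c * g(n) for n in range(N, N + samples))

# Example: f(n) = 2n^2 + 3 is O(n^2); the witness c = 3, N = 2 works
# because 2n^2 + 3 <= 3n^2 whenever n^2 >= 3.
f = lambda n: 2 * n * n + 3
g = lambda n: n * n

print(holds_for_witness(f, g, c=3, N=2))   # True
print(holds_for_witness(f, g, c=2, N=2))   # False: 2n^2 + 3 > 2n^2
```

Any single pair (c, N) that works is enough to establish membership in O(g(n)); the constants need not be tight.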
 
0rthodontist said:
Asymptotic notation has to do with "within a constant multiple" rather than just strictly greater than or less than. You write
f(n) = O(g(n)) iff there exist some N and some c > 0 such that for all n >= N, you have 0 <= f(n) <= c * g(n)

Yes, of course. That's what I mean when I say "asymptotically less than" and such.

The operator comparisons are good for people just getting used to the notation, as long as they understand the underlying definition. It helps keep people from flipping them by accident (hopefully!).

0rthodontist said:
One book I have uses f(n) ∈ O(g(n)) instead of f(n) = O(g(n)) to emphasize that O(g(n)) is actually a set of functions: those bounded asymptotically by a constant multiple of g(n).

I rather prefer that notation. I can't stand to see the = sign abused. In my posts above I only used "f is O(g(x))" rather than "f = O(g(x))" for that reason. Set notation isn't that common, so I didn't want to cause confusion, but I also don't like extreme notation abuse. (I can live with little stuff, but = is fundamental and should be used properly.)
 
pivoxa15 said:
It seems the order (O) of f(x), which is g(x) (or f(x)=O(g(x))) provides the largest order of magnitude of growth for f(x).

It's not the largest order of magnitude of growth, but rather an upper bound on the order of growth. It doesn't need to be optimal in any way: x^2 = O(x^34) as x -> infinity. You will also see this big-O notation with other limit points, like 0 with Taylor polynomials: e^x = 1 + x + O(x^2) just means there is a constant A > 0 such that in some small enough neighbourhood of x = 0 we have |e^x - 1 - x| <= A|x^2|.
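To make that Taylor-remainder reading of big-O concrete, here is a quick numerical check; the constant A = 0.6 and the interval [-0.5, 0.5] are my own choices for illustration, not from the post:

```python
import math

# For e^x = 1 + x + O(x^2) near 0, the remainder is e^x - 1 - x.
# The remainder equals x^2/2 + x^3/6 + ..., so near 0 it is roughly
# x^2/2; check |e^x - 1 - x| <= A * x^2 on a grid in [-0.5, 0.5]
# with A = 0.6 (the ratio peaks just under 0.595 at x = 0.5).
A = 0.6
xs = [i / 1000 for i in range(-500, 501)]
ok = all(abs(math.exp(x) - 1 - x) <= A * x * x for x in xs)
print(ok)  # True
```

The same A would fail on a larger interval; big-O at a limit point only promises a bound in *some* neighbourhood of that point.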


About the "=": I don't tend to think of O(f(x)) as meaning a set of functions, but rather as standing for some specific function that's bounded by a constant times f(x); all we really care about is that this function's growth is bounded in this way. For example, writing x + cos(x) = x + O(1), the O(1) to me really stands for the function cos(x), but all that's going to matter later on is that this term is bounded in absolute value by a constant. The convenience is not having to keep exact track of which function is hidden by the big-O at each step when all you care about are bounds.

There's also the Vinogradov notation, the double arrow f ≪ g, which means the same thing as the Landau big-O, f = O(g).
 
shmoe said:
It's not the largest order of magnitude of growth, but rather an upper bound on the order of growth. It doesn't need to be optimal in any way: x^2 = O(x^34) as x -> infinity.

With your example there is no doubt it is correct, but o can be used as well and is a better alternative. Since every o can be replaced by O (but not vice versa), it seems always best to use o where possible (otherwise, what is the point of having o?).

There is no doubt O is used more often than o, and that could be why Wiki has only one entry, called big O, not small o.
 
pivoxa15 said:
With your example there is no doubt it is correct, but o can be used as well and is a better alternative. Since every o can be replaced by O (but not vice versa), it seems always best to use o where possible (otherwise, what is the point of having o?).

There is no doubt O is used more often than o, and that could be why Wiki has only one entry, called big O, not small o.

Θ is best, if you can use it. f(x) is Θ(g(x)) iff f(x) is O(g(x)) and g(x) is O(f(x)). To extend the above analogy, it's like =.
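A small numerical illustration of that two-sided bound; the example f(n) = 3n^2 + n and the constants 3 and 4 are my own, not from the post:

```python
# f(n) = 3n^2 + n is Θ(n^2): it is squeezed between constant multiples
# of n^2, namely 3*n^2 <= f(n) <= 4*n^2 for all n >= 1
# (the upper bound holds because n <= n^2 once n >= 1).
def f(n):
    return 3 * n * n + n

theta = all(3 * n * n <= f(n) <= 4 * n * n for n in range(1, 1001))
print(theta)  # True
```

No such sandwich exists for f against n^3, which is why f is O(n^3) but not Θ(n^3).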
 
Big-O comes from examining the worst-case behavior of an algorithm, but little-o doesn't seem to have such an intuitive origin. I guess you could use it to say that one algorithm runs asymptotically better than another, but I haven't seen that used. When you say an algorithm is big-O of a function, there's usually the understanding that there are some inputs for which the algorithm is exactly Θ of the function: you want to find the smallest possible function that the algorithm is big-O of. If you say an algorithm is little-o of a function, then you just want to ask what the closer bound is.
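As a concrete instance of worst-case analysis in the big-O sense, here is a minimal sketch (the example and the comparison counter are my own illustration): linear search makes at most n comparisons, and exactly n when the key is absent, so its worst-case running time is Θ(n), and O(n) is the tight big-O bound.

```python
# Linear search that also reports how many comparisons it made.
# Best case: 1 comparison (key is first). Worst case: len(items)
# comparisons (key is last or absent), giving worst-case time Θ(n).
def linear_search(items, key):
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == key:
            return i, comparisons
    return -1, comparisons

print(linear_search([5, 3, 9], 9))   # (2, 3): found after 3 comparisons
print(linear_search([5, 3, 9], 7))   # (-1, 3): worst case, n comparisons
```

Saying the algorithm "is O(n)" summarizes the worst case; on lucky inputs it finishes far sooner, which is exactly why big-O is an upper bound rather than an exact growth rate.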
 
