# Big-O and recurrence equations.

1. Feb 10, 2007

### cscott

Can anyone point me to resources showing me how to prove recurrence equations like:

$$I(n) \le \begin{cases} c & \text{if } n = 0 \\ d + I(n-1) & \text{if } n \ge 1 \end{cases}$$

i.e. $$I(n) \le c + dn$$

Also for proving things like: $30000n^2 + n \log n$ is $O(n^3)$ using the basic definition of big-O notation ($f(n) \le k \cdot g(n)$ for some constant $k$ and all $n \ge n_0$)

I never took algebra so I never learned this stuff while CS majors did :( It'd be helpful if I knew what to search for.

Last edited: Feb 10, 2007
2. Feb 12, 2007

### parabrahm

Hi,

I'm not sure I entirely understand your example, but generally recurrence relations can be proven using mathematical induction. Studying proof by mathematical induction will help. The same method of proof that allows us to prove things like:

$$\sum_{i=1}^{n} i = \frac {n(n + 1)} {2}$$

also allows us to prove recurrence relations (the sum above is one). There are some variations, but at the risk of oversimplifying, the general idea is to establish the truth of a proposition by showing that it follows from smaller instances of the same proposition, provided the truth of the smallest instance can be established explicitly. A proof by induction normally establishes a base case and then applies an inductive hypothesis (the inductive step): assuming the proposition holds for all smaller instances, we show that it holds for the instance at hand.
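To make this concrete for the recurrence in the original post, here is a minimal induction sketch (using the same $c$ and $d$ as above):

```latex
% Claim: I(n) \le c + dn for all n \ge 0.
% Base case (n = 0): the recurrence gives I(0) \le c = c + d \cdot 0.
% Inductive step (n \ge 1): assume I(n-1) \le c + d(n-1). Then
\begin{align*}
I(n) &\le d + I(n-1)            && \text{(the recurrence)}\\
     &\le d + \bigl(c + d(n-1)\bigr) && \text{(inductive hypothesis)}\\
     &= c + dn.
\end{align*}
% Hence I(n) \le c + dn for all n \ge 0, by induction.
```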

Regarding big-O, the whole purpose of its existence is to provide a "fuzzy ruler" with which to measure the efficiency of algorithms. With that in mind, we normally want to leave out tricky things like constants and such. For easy examples like the one you listed it is sufficient to note that:

$30000n^2 + n \log n \in O(n^3)$

By noting that, for $n \ge 1$:

$30000n^2 + n \log n \le 30000n^2 + n^2 = 30001n^2 \le 30001n^3$

Thus, $30000n^2 + n \log n \in O(n^3)$ (take $k = 30001$ and $n_0 = 1$ in the definition).
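The bound can also be spot-checked numerically; the witness constants $k = 30001$ and $n_0 = 1$ below are just one possible choice (any larger values work too):

```python
import math

def f(n):
    return 30000 * n**2 + n * math.log(n)

# Witness constants for the big-O definition: f(n) <= k * n**3
# for all n >= n0. These are one valid choice, not the only one.
k, n0 = 30001, 1

# Spot-check the inequality over a range of n >= n0.
assert all(f(n) <= k * n**3 for n in range(n0, 10_000))
```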

For harder functions it helps to study the asymptotic behavior of the functions in question. That is to say, to see if $f(n) \in O(g)$ we look at the limit:

$$\lim_{n\rightarrow\infty}\frac {f(n)} {g(n)}$$

From there we have a couple of possibilities. If the limit exists and is finite, we know that $f(n) \in O(g)$. If the limit is infinite, then $f(n) \notin O(g)$. Additionally, if the limit is finite and nonzero, we can say that $O(f) = O(g)$.
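For the example in this thread, the ratio test can be watched numerically; with $f(n) = 30000n^2 + n \log n$ and $g(n) = n^3$ the ratio shrinks toward 0, which is consistent with $f \in O(g)$:

```python
import math

def f(n):
    return 30000 * n**2 + n * math.log(n)

def g(n):
    return n**3

# f(n)/g(n) = 30000/n + log(n)/n**2 tends to 0 as n grows,
# so f is O(g) (in fact f grows strictly slower than g).
ratios = [f(10**k) / g(10**k) for k in range(1, 7)]
assert all(a > b for a, b in zip(ratios, ratios[1:]))  # strictly decreasing
```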

Last edited: Feb 12, 2007
3. Feb 13, 2007

### uart

Are you guys sure about that? I could have sworn it was $$O(n^2)$$

4. Feb 13, 2007

### cscott

I've used induction to prove convergence of sequences or formulas like the one you posted, but I've never used it in the context of CS.

So parabrahm you are saying basically $n \log n \le n^2$, so the whole sum is bounded by a constant times $n^2$, which in turn is less than the same constant times $n^3$?? I thought we always went with the best possible order...

I would have said $O(n^2)$ as well, but they also say to prove $30000n^2$ is $O(n^3 \log n)$?!? I don't understand how $30000n^2$ isn't simply $O(n^2)$.

Last edited: Feb 13, 2007
5. Feb 13, 2007

### Alkatran

6. Feb 14, 2007

### uart

Yes that's discussing recurrence relations. But the $30000n^2 + n \log n$ was not given as part of a recurrence relation but just a simple function of n.

$$30000n^2 + n \log n < 30001 n^2$$ for all $n \ge 1$ (since $n \log n < n^2$), and it is therefore $$O(n^2)$$
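The tighter bound checks out numerically as well; since $\log n < n$ for $n \ge 1$, the $n \log n$ term is dominated by $n^2$:

```python
import math

def f(n):
    return 30000 * n**2 + n * math.log(n)

# Since log(n) < n for n >= 1, we have n*log(n) < n**2, hence
# f(n) < 30001 * n**2 -- the tighter O(n^2) bound; no n^3 needed.
assert all(f(n) < 30001 * n**2 for n in range(1, 10_000))
```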

Last edited: Feb 14, 2007
7. Feb 19, 2007

### parabrahm

You guys are correct. Sorry I was away for a bit.