# Big O question: f(n) = n^(1.01) + n(log(n))^5, g(n) = n^(1.01)

## Homework Statement

Given that:

f(n) = n^(1.01) + n(log(n))^5
g(n) = n^(1.01)

Show that:
a) f(n) ∈ O(g(n))
b) f(n) ∈ Ω(g(n))
c) f(n) ∈ Θ(g(n))

## Homework Equations

Any correct method is acceptable. It doesn't have to use the definition of big O notation directly; limits or any other valid approach will do.

## The Attempt at a Solution

I can see why b is true. I also know a is true because the "solution" my teacher gave states it, so I can automatically conclude that c is true: if both a and b hold, then c must hold as well.

I tried plugging in n = 1,000,000, and n(log(n))^5 comes out larger than n^(1.01), which makes intuitive sense but seems to contradict a.

Any help in showing that a is in fact true would be greatly appreciated!

zcd
Try taking the limit of the ratio $$\frac{f(n)}{g(n)}$$. If the limit converges, you use the result to get a relationship between the two.
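This suggestion is easy to sanity-check symbolically (a sketch, assuming SymPy is installed; the exponent 1.01 is written as the exact rational 101/100 so the computation stays symbolic):

```python
from sympy import symbols, log, limit, oo, Rational

n = symbols('n', positive=True)
f = n**Rational(101, 100) + n * log(n, 2)**5   # f(n) = n^1.01 + n*(log2 n)^5
g = n**Rational(101, 100)                      # g(n) = n^1.01

# A finite nonzero limit of f/g means f ∈ Θ(g), hence both O(g) and Ω(g).
print(limit(f / g, n, oo))  # → 1
```

The limit being exactly 1 reflects that the n(log n)^5 term is asymptotically negligible next to n^(1.01), even though it dominates for "small" n.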

Actually, I already knew that; I had just missed something logically in the limit procedure, which I now get. Thankfully the problem only asks for the answer, because I can foresee the outcome of very many applications of l'Hôpital's rule without actually having to differentiate that many times and hurt my poor brain and hand! :D

Now I get it algebraically thanks to the limit but, conceptually, I still can't see this! The n[log_2(n)]^5 term seems to win out against n^(1.01), and here it even has an additional n^(1.01) in the numerator to "help it win the fight". Could you please help me understand it fundamentally, without directly relying on a mathematical theorem?

zcd
If f(n) ∈ Ω(g(n)), then you already know that the log term does not in fact win against the power term in growth. It may feel counterintuitive because g(n) has the power term as well, but if the log term is insignificant in comparison, then you're really only comparing the powers.

1) f(n) ∈ Ω(g(n)) is not given in the question but rather in the solution that the teacher gave. In other words, you're not supposed to know that when you attempt the question.

2) I get what you're saying about if it were insignificant, I'd only have to compare the powers but I can't accept that it is insignificant. Let n = 10^9:

http://www.wolframalpha.com/input/?i=(10^9*(log_2(10^9))^5)/(10^9)^1.01

the result from Wolfram Alpha is clearly greater than 1 (implying that the log term wins).

What am I doing wrong?

zcd
My mistake, I meant to say f(n) ∈ O(g(n)) in my previous post.

Your n isn't large enough; try comparing 10^100 vs 10^1000.

I think I worded the question confusingly at first, but the point is that none of the three statements to be shown are given in the question (I have the answers from my teacher; that's why I know them all).

Could you comment specifically on the Wolfram Alpha part of my previous post, please?

zcd
> I think I worded the question confusingly at first, but the point is that none of the three statements to be shown are given in the question (I have the answers from my teacher; that's why I know them all).
>
> Could you comment specifically on the Wolfram Alpha part of my previous post, please?
You didn't choose n large enough. The power n^(.01) will eventually win against log_2(n), but it takes much longer. If you solve for their intersections, the larger one, where the power term overtakes log_2(n), is around 7*10^(299):

http://www.wolframalpha.com/input/?i=n^(.01)=log_2+n

Be careful when choosing values to check for growth; even though 10^9 seems like a large number for daily use, it's still a small finite number compared to infinity.
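One way to see just how "small" 10^9 is here (a numerical sketch in pure Python): compare the competing terms via their base-10 logarithms, so that astronomically large n stays representable as an ordinary float.

```python
import math

def log10_ratio(t):
    """log10 of (log2 n)^5 / n^0.01 for n = 10^t.

    Positive means the log term still dominates;
    negative means the power n^0.01 has finally won.
    """
    log2_n = t / math.log10(2)              # log2(n) = log10(n) / log10(2)
    return 5 * math.log10(log2_n) - 0.01 * t

for t in (9, 100, 1000, 5000, 9999):
    side = "log term wins" if log10_ratio(t) > 0 else "power wins"
    print(f"n = 10^{t:>4}: {side}")
```

Running this shows the log term still ahead at n = 10^1000 but beaten by n = 10^5000, consistent with the Wolfram Alpha experiments in this thread.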

My god!

I had to use 10^9999! Even 10^999 didn't "prove" the point:
http://www.wolframalpha.com/input/?i=(10^9999*(log_2(10^9999))^5)/(10^9999)^1.01

Okay so, basically, any positive finite power (no matter how small) always eventually wins against a logarithm raised to any positive finite power (no matter how large)?

(For the question I just asked, assume that the powers are positive and that the only variables are the base of the polynomial's power and the argument of the logarithmic function.)

Is there an exponent, w, small enough that x^w <= log_2(x) for all sufficiently large x? If so, what is it? (If the answer to my question above is yes, then this second question can automatically be ignored.)
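For what it's worth, a numerical sketch (pure Python, with a hypothetical tiny exponent w = 0.001 chosen for illustration) suggests the answer is no: even this w eventually overtakes log_2(x), though the crossover is astronomically far out. Working in u = ln(x) keeps the numbers representable:

```python
import math

W = 0.001  # an illustrative tiny exponent; any w > 0 behaves the same way

def gap(u, w=W):
    """ln(x^w) - ln(log2 x) at x = e^u; positive once the power has won."""
    return w * u - math.log(u / math.log(2))

# Bisect for the upper crossing: gap < 0 at u = 10 (log still winning),
# gap > 0 at u = 1e6 (power has won for good).
lo, hi = 10.0, 1e6
while hi - lo > 1e-9:
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if gap(mid) < 0 else (lo, mid)

print(f"x^{W} overtakes log2(x) for good around x = 10^{lo / math.log(10):.0f}")
```

The crossover lands in the vicinity of x ≈ 10^4000, which matches the spirit of the thread: the power always wins, just unimaginably late.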

zcd
You can try to prove the first claim with the same method as before:
$$\lim_{n\to\infty}\frac{\log(n)}{n^w}$$ for some w>0. The base of the log doesn't matter, so working with the natural log will make it easier.

Is what I just did (in the attachment) nonsense? If not, what do I do next? If it is, then what's the first mistake I made and how can I correct it?

zcd
You did the limit incorrectly; here's a (counter) proof of a similar claim to help you get started:

If we start by assuming (for contradiction) that $$n^w\in O(\ln(n))$$ for some w>0, then there exist c,N>0 such that $$c\ln(n)\geq n^w$$ for all n>N, and hence $$\lim_{n\to\infty}\frac{\ln(n)}{n^w}\geq \frac{1}{c}>0.$$ However,
$$\lim_{n\to\infty}\frac{\ln(n)}{n^w}=\lim_{n\to\infty}\frac{\frac{1}{n}}{wn^{w-1}}=\lim_{n\to\infty}\frac{1}{wn^w}=0$$
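This limit holds for arbitrarily small w; a quick symbolic check (a sketch, assuming SymPy is installed) with w = 1/1000:

```python
from sympy import symbols, log, limit, oo, Rational

n = symbols('n', positive=True)
w = Rational(1, 1000)  # an arbitrarily small positive exponent

# ln(n)/n^w → 0, so ln(n) grows strictly slower than any positive power.
print(limit(log(n) / n**w, n, oo))  # → 0
```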

Okay, so what you just said means w can be an arbitrarily small (positive, nonzero) finite number and the monomial would still win out against the logarithm, no matter which base and power it's raised to, like I said earlier, right?

Edit: It may seem like I'm just repeating what you implied, but I need explicit confirmation to be sure I'm not assuming the above incorrectly.

zcd
There's a subtle difference here. What I proved was that no positive power is in O(log n), i.e., that log n does not provide an upper bound for power functions. That does not immediately imply that every power function gives an upper bound for log(n).

Okay, and stating that n^w ∈ O(ln(n)) is false means n^w <= c * ln(n) is false, which in turn means n^w > c * ln(n) is true, which indirectly (via the proof by contradiction) means that w can be an arbitrarily small (positive, nonzero) finite number and the monomial still wins out against the logarithm, no matter which base and power it's raised to, right?

zcd
That's not necessarily true either. The statement n^w ∈ O(ln(n)) is equivalent to "∃c,N>0 such that ∀n>N, n^w < c*ln(n)". Since this is false, we can only assume its negation, which is "∀c,N>0 ∃n>N, n^w >= c*ln(n)". This is not the same as saying n^w always outgrows ln(n); it only says that ln(n) does not outgrow n^w. You'll have to do a separate proof (in the same pattern) of your specific claim.

Edit: I forgot to mention that w is real, so imagine that I stated it in the proof.

zcd
That's the gist of it. If you really want to be rigorous, l'Hôpital's rule can't be applied to functions of integers (it's a calculus result), so define functions $$f,g:\mathbb{R}\to\mathbb{R}\quad\text{as}\quad f(x)=x^w,g(x)=\ln(x)$$ and then apply l'Hôpital's rule. Since the integers are a subset of the reals, the result holds for the integer sequence whenever it holds in the continuous analog. Also, don't assume the theorem you're proving at the outset, since that ends up as circular logic.

Thanks for the addition, since I care about rigour. I got the integer/real point but, as for the circular-logic part, I don't see how what I did is circular reasoning, because I started with an inequality that did not isolate the constant and then: 1) took the limit of both sides, 2) isolated the constant, 3) applied l'Hôpital's rule. So what, specifically, is circular?

zcd
The circular reasoning lies in the fact that you assumed your proposition is true before proceeding to prove it is true.

Sorry, double posted.

So which part of my work should I change, mechanically speaking?

Is it the "Let's assume [. . .]" part? If so, should I change that to "If [. . .]" instead?

I know it seems like a redundant question, but I really need the exact details so I can fully understand the solution, so please tell me.

Oh, wait! I think the wording was just confusing. When I said "Let's assume [. . .]", I meant to assume that the big omega statement is equivalent to the inequality, not that the big omega statement itself is true.

Having said that, is my statement fully correct?
