Proving the divergence of the integral of 1/f(x) as x → ∞

000

Homework Statement


There exists a function f(x) such that the improper integral of 1/f(x) out to infinity diverges, and f(x) ≥ x for all values of x. Prove that this function must be a linear polynomial.

Homework Equations


None that I know of.

The Attempt at a Solution


No idea where to start.
 
You can't; as stated, it is not true. There exist many non-polynomial functions that are "asymptotic" to x such that the integral of 1/f(x) diverges. However, if you require that f(x) be a polynomial, then it is true that f must be linear.
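
For the polynomial direction, a sketch of the standard comparison argument (the cutoff ##a \ge 1## and the constant ##C > 0## below are placeholders introduced for the sketch): if f is a polynomial of degree ##n \ge 2## with ##f(x) \ge x## for all x, then f grows like its leading term, so there are constants ##C > 0## and ##a \ge 1## with ##f(x) \ge C x^n## for all ##x \ge a##, and
$$\int_a^\infty \frac{dx}{f(x)} \;\le\; \int_a^\infty \frac{dx}{C x^n} \;=\; \frac{a^{1-n}}{C(n-1)} \;<\; \infty \qquad (n \ge 2).$$
A degree-0 polynomial cannot satisfy ##f(x) \ge x## for all x, so if the integral diverges and f is a polynomial, f must have degree 1.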
 
HallsofIvy said:
You can't; as stated, it is not true. There exist many non-polynomial functions that are "asymptotic" to x such that the integral of 1/f(x) diverges. However, if you require that f(x) be a polynomial, then it is true that f must be linear.

f(x) must always be at least as large as x; are there any asymptotic functions that fit that criterion?
 
000 said:
f(x) must always be at least as large as x; are there any asymptotic functions that fit that criterion?

How about ##f(x) = x + 1/x##?
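
A quick check of this counterexample, restricting to ##x \ge 1## so that ##f(x) = x + 1/x \ge x##:
$$\int_1^\infty \frac{dx}{x + 1/x} \;=\; \int_1^\infty \frac{x\,dx}{x^2 + 1} \;=\; \lim_{R\to\infty} \frac{1}{2}\ln\!\frac{R^2+1}{2} \;=\; \infty,$$
so the integral diverges even though f is not a polynomial, which is exactly the kind of counterexample HallsofIvy described.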
 