# Homework Help: Derivative of an integral

1. Dec 6, 2007

### Mothrog

I have a function of the form

$$\int^{r}_{0} tf(t)dt$$

I'm supposed to take the derivative with respect to r of this integral. By the fundamental theorem of calculus, is the derivative not

$$rf(r)$$

The problem being I need, for the problem to work out correctly, to have a df/dr term. So, am I mistaken in my solution?

2. Dec 6, 2007

### sutupidmath

Well, it depends on what properties the function under the integral sign, tf(t), has.
There are some conditions this function has to fulfill in order for the answer to be what you said. The first one is that f(t) > 0, which directly implies that $$\int^{r}_{0} f(t)dt$$ is monotonically increasing.

3. Dec 6, 2007

### HallsofIvy

No, there is no need that f(t) > 0. By the fundamental theorem of calculus, as long as g(x) is continuous, if $$G(x)= \int_a^x g(t)dt$$, then $$\frac{dG}{dx}= g(x)$$. In this case, g(x) = xf(x), so, as long as f is continuous, if $$F(r)= \int_0^r tf(t)dt$$, then $$\frac{dF}{dr}= rf(r)$$. I do not understand why you would "need to have a df/dr term." Perhaps if you were to post the actual problem, it would make more sense.
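The FTC claim above can be sanity-checked symbolically. A minimal sketch using SymPy, with f(t) = e^t as a sample continuous f (the choice is arbitrary, purely for illustration):

```python
# Check that d/dr of the integral from 0 to r of t*f(t) dt equals r*f(r),
# using a sample f(t) = exp(t) (an arbitrary continuous choice).
import sympy as sp

t, r = sp.symbols('t r', positive=True)
f = sp.exp(t)  # sample continuous f, chosen only for this check

F = sp.integrate(t * f, (t, 0, r))   # F(r) = integral of t*f(t) from 0 to r
dF = sp.diff(F, r)                   # dF/dr
expected = r * f.subs(t, r)          # r*f(r)

print(sp.simplify(dF - expected))    # 0, confirming dF/dr = r*f(r)
```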

4. Dec 6, 2007

### Mothrog

It is desired to show that if, for constant k,

$$\frac{d}{dr} g(r) = \frac{d}{dr}k\frac{r^2}{\int^{r}_{0} tf(t)dt} < 0$$​

Then

$$\frac{d}{dr}f(r) > 0$$​

When I take the derivative of g, I get

$$\frac{dg}{dr} = k\frac{2r\int^{r}_{0}tf(t)dt - r^2(rf(r))}{(\int^{r}_{0} tf(t)dt)^2}$$​

If that is correct, I don't see how it is possible to show the desired relation. The definition of g given is the only information known about g. Am I missing something?
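The quotient-rule computation of dg/dr can also be checked symbolically. A sketch in SymPy, assuming a hypothetical sample f(t) = t + 1 (any continuous positive f would serve equally well here):

```python
# Verify that d/dr [ k*r^2 / F(r) ] matches the claimed quotient-rule
# expression k*(2r*F(r) - r^2*(r*f(r))) / F(r)^2, for a sample f.
import sympy as sp

t, r, k = sp.symbols('t r k', positive=True)
f = t + 1  # hypothetical sample f, only to check the algebra

F = sp.integrate(t * f, (t, 0, r))     # F(r) = integral of t*f(t) from 0 to r
g = k * r**2 / F
dg = sp.diff(g, r)

claimed = k * (2*r*F - r**2 * (r * f.subs(t, r))) / F**2
print(sp.simplify(dg - claimed))       # 0, so the derivative above is right
```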

Last edited: Dec 6, 2007
5. Dec 7, 2007

Bump.

6. Dec 7, 2007

### Dick

Well, here's something. Let's throw k away: if it's positive it doesn't add anything to the problem, and if it can be either sign, then the claim just plain isn't true. Now let

$$F(r)=\int^{r}_{0} tf(t)dt$$ and $$G(r)=\frac{r^2}{2} f(r)$$.

Now look at the numerator of dg/dr: it equals 2r(F(r) - G(r)), so for r > 0 it's negative exactly when G(r) > F(r). But G(0) = F(0) = 0. So if G'(r) > F'(r), then I can conclude G(r) > F(r). Now G'(r) = rf(r) + (r^2/2)*f'(r) and F'(r) = rf(r). So if f'(r) > 0 then dg/dr < 0, and by the same argument, if f'(r) < 0 then dg/dr > 0. Trouble is, I can't see how the converse would hold, which seems to be what you are actually trying to prove.
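The argument above can be illustrated with a concrete case. A sketch assuming f(t) = t, so f'(t) = 1 > 0, and dropping k as suggested:

```python
# Concrete instance of the argument: with f(t) = t (so f' > 0),
# g(r) = r^2 / F(r) should have dg/dr < 0 for all r > 0.
import sympy as sp

t, r = sp.symbols('t r', positive=True)
f = t  # sample f with f'(t) = 1 > 0, an assumption for illustration

F = sp.integrate(t * f, (t, 0, r))   # F(r) = r**3/3
g = r**2 / F                         # k dropped, as suggested above
dg = sp.simplify(sp.diff(g, r))
print(dg)                            # -3/r**2, negative for every r > 0
```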

7. Dec 7, 2007

### Mothrog

I'm not sure you can assume $$g(0) = F(0) = 0$$. Can you walk me through your logic on that step?

Last edited: Dec 7, 2007
8. Dec 7, 2007

### Dick

Not the little f and g. The big F and G. It's not an assumption, look at their definitions.

9. Dec 7, 2007

### Mothrog

But G(r) depends only on little f, so how can you conclude that G(0) = 0 just because F(0) = 0?

10. Dec 7, 2007

### Dick

G(0)=f(0)*0^2/2=0. F(0) is the integral from 0 to 0 of t*f(t) which=0.

11. Dec 7, 2007

### Mothrog

Ah, OK. That makes sense.

12. Dec 7, 2007

### Mothrog

Actually, looking back at the problem again, I needed to show that f'(r) > 0 implies g'(r) < 0, so I don't need to prove the converse. So, problem solved. Thanks for your help. I was tearing my hair out over that one.

13. Dec 7, 2007

### Dick

No problem. Good thing you don't have to prove the converse; it's not true.