2 True/False Questions -- Integral and convergence

In summary: statement a) is true. If ##f## were not identically zero, say ##f(a) > 0## for some ##a##, then by continuity ##f## would be positive on a small subinterval of ##[0,1]## around ##a##, and the integral over that subinterval could not be zero. Statement b) is false: a continuous, non-negative function made of ever taller but ever narrower peaks centered at the positive integers can have a convergent improper integral without tending to zero at infinity.
  • #1
cacofolius

Homework Statement


a) If ##f: [0,1] \rightarrow \mathbb{R}## is continuous and ##\int^{b}_{a} f(x)dx = 0## for every interval ##[a,b] \subset [0,1]##, then ##f(x)=0 \forall x \in [0,1]##

b) Let ##f: [0,\infty) \rightarrow [0,\infty)## be continuous. If ##\int^{\infty}_{0} f(x)dx## converges, then ##f(x) \rightarrow 0## as ##x \rightarrow \infty##.

Homework Equations



The Attempt at a Solution


For the a) part, my guess is that it is true (I know that guesses don't mean much). I've tried to come up with a counterexample such as

## f(x)=
\left\{
\begin{array}{ll}
1 & \mbox{if } x \in \mathbb{Q} \\
0 & \mbox{if } x \notin \mathbb{Q}
\end{array}
\right.
##, which would satisfy the integral condition, but the requirements that the function be continuous and that ##f: [0,1] \rightarrow \mathbb{R}## (which prevent me from using symmetric intervals around zero, and hence something like sin(x)) make me think that the statement is true. However, I'm not sure how to prove it.

b) As before, the ##f: [0,\infty) \rightarrow [0,\infty)## condition makes me think it is true, since I cannot use bounded oscillating functions like sin(x), which would otherwise give me a counterexample (here's a similar problem, but without the restriction to ##[0,\infty)## or the continuity condition: http://math.stackexchange.com/questions/538750/if-the-improper-integral-int-infty-a-fx-dx-converges-then-lim-x→∞fx ). Also, I have in my notes, just before Cauchy's integral criterion (which requires the function to be decreasing, something not specified here), that if

##S_{N}= \sum_{k=0}^{N} A_{k}## converges as ##N \rightarrow \infty##, then ##A_{k} \rightarrow 0##, but I cannot find a proof of this.

Any hint would be much appreciated.
 
  • #2
Suppose f were not identically 0 on the interval. That is, suppose that, for some ##a## in the interval, ##f(a) = b > 0##. Then, since f is continuous, there would be an interval around ##a## where ##f(x) > 0##, and the integral over that interval could not be 0.
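Spelled out as a minimal sketch of that last step: by continuity there is a subinterval ##[c,d] \subset [0,1]## of positive length, containing ##a##, on which ##f(x) > b/2##, and then

##\int^{d}_{c} f(x)\,dx \;\geq\; \frac{b}{2}\,(d-c) \;>\; 0,##

contradicting the assumption that the integral vanishes on every subinterval of ##[0,1]##.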
 
  • #3
Thanks for your reply HallsofIvy, but I don't quite understand what you are saying. The statement has the condition that the integral is zero on every interval ##[a,b] \subset [0,1]##, so how could I suppose that, for some ##a##, the integral is not zero?
I thought of a counterexample (let me know if you think this is correct): let ##F'(x)=f(x)##, and define ##F(x)## to be constant on every interval ##[a,b] \subset [0,1]##. Then, since ##f(x)## is continuous,

##\int^{b}_{a} f(x)\,dx = F(b) - F(a)=0##, and this does not require that ##f(x)=0##, so the statement would be false. Is this right?
 
  • #4
cacofolius said:
Thanks for your reply HallsofIvy, but I don't quite understand what you are saying. The statement has the condition that the integral is zero on every interval ##[a,b] \subset [0,1]##, so how could I suppose that, for some ##a##, the integral is not zero?
You can't, and I did not say that. I said: assume the conclusion is false, i.e. that the function is not identically 0, and then derive a contradiction.

I thought of a counterexample (let me know if you think this is correct): let ##F'(x)=f(x)##, and define ##F(x)## to be constant on every interval ##[a,b] \subset [0,1]##. Then, since ##f(x)## is continuous,

##\int^{b}_{a} f(x)\,dx = F(b) - F(a)=0##, and this does not require that ##f(x)=0##, so the statement would be false. Is this right?
No, that is not correct: if F(x) is constant on every interval, its derivative, f, is identically 0.
 
  • #5
Thank you for your patience; I misread you. I see what you mean now, and I'll work on that.

Meanwhile I thought of something for the b) part. If I could treat the problem (since f is continuous) from a series point of view (maybe justifying this with the Archimedean property of the real numbers, which states that for every ##x \in \mathbb{R}## there is an ##n \in \mathbb{N}## such that ##n>x##), then:
The series ##\sum_{n=0}^{\infty} a_{n}## converges if the sequence of partial sums ##S_{k}=\sum_{n=0}^{k} a_{n}## converges, and the sum of the series is ##\lim_{k \to \infty} S_{k}=L##.

Now ##a_{n}= S_{n} - S_{n-1} \rightarrow L - L = 0## as ##n \rightarrow \infty##. Therefore the statement is true.
 
  • #6
What statement is true? What you have proved is "if a series converges, then its individual terms must go to 0." I don't see how that has anything to do with the original statement, "if the integral of a continuous function, f, over any subinterval of an interval is 0, then f is 0 on the interval". For one thing, "continuous" is crucially important in this statement but plays no part in your "series" variation.
 
  • #7
Part b) is a different problem: it says that if the integral from zero to infinity converges, then the integrand must go to zero as x tends to infinity. I was referring to that problem when I brought up the series.

For the a) problem, following your suggestion: suppose that for some ##a \in [0,1]## we have ##f(a)>0##. Then, since ##f(x)## is continuous, there is an interval ##(a-\epsilon, a+\epsilon)## around ##a## on which ##f(x)>0##, and therefore

##\int^{a+\epsilon}_{a-\epsilon} f(x)\,dx = F(a+\epsilon) - F(a-\epsilon) > 0##, since a continuous function that is strictly positive on an interval has a strictly positive integral over it. This contradicts the hypothesis, and so the original statement is true.
 
  • #8
cacofolius said:

Homework Statement


a) If ##f: [0,1] \rightarrow \mathbb{R}## is continuous and ##\int^{b}_{a} f(x)dx = 0## for every interval ##[a,b] \subset [0,1]##, then ##f(x)=0 \forall x \in [0,1]##

b) Let ##f: [0,\infty) \rightarrow [0,\infty)## be continuous. If ##\int^{\infty}_{0} f(x)dx## converges, then ##f(x) \rightarrow 0## as ##x \rightarrow \infty##.

b) As before, the ##f: [0,\infty) \rightarrow [0,\infty)## condition makes me think it is true, since I cannot use bounded oscillating functions like sin(x), which would otherwise give me a counterexample (here's a similar problem, but without the restriction to ##[0,\infty)## or the continuity condition: http://math.stackexchange.com/questions/538750/if-the-improper-integral-int-infty-a-fx-dx-converges-then-lim-x→∞fx ). Also, I have in my notes, just before Cauchy's integral criterion (which requires the function to be decreasing, something not specified here),

Cauchy's integral test will not assist you: it says that a series converges if the integral of a related decreasing function converges. You are asked to prove or disprove that if an integral of an arbitrary continuous non-negative function converges then that function tends to zero.

Intuitively it seems like statement (b) should be true. But consider a continuous, non-negative function which is zero except for triangular peaks centered on the positive integers, such that the heights of the peaks increase without limit but the widths decrease fast enough that the sum of the areas of the peaks is finite. Does that not suggest how one might construct a counterexample?
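To make the construction concrete, here is a minimal numerical sketch of one possible choice (an illustration of the idea above rather than the only recipe; the specific peak areas ##2^{-n}## are an assumption made for convenience): the peak centered at the integer ##n## is a triangle of height ##n## whose base width is chosen so that its area is ##2^{-n}##. The total area of all peaks is then ##\sum_{n \ge 1} 2^{-n} = 1##, while ##f(n) = n## grows without bound.

```python
# Numerical sketch of the peaked counterexample: a continuous, non-negative
# function that is zero except for triangular peaks of height n centered at
# each integer n >= 1, with the base width of the n-th peak chosen so that
# its area is 2**(-n).

def peak_width(n: int) -> float:
    """Base width of the n-th peak, chosen so that n * width / 2 = 2**(-n)."""
    return 2 * 2.0 ** (-n) / n

def f(x: float) -> float:
    """The counterexample function: zero away from the peaks, height n at x = n."""
    n = round(x)                  # the only peak that could contain x
    if n < 1:
        return 0.0
    half = peak_width(n) / 2
    dist = abs(x - n)
    if dist >= half:
        return 0.0
    return n * (1 - dist / half)  # linear ramp: n at the centre, 0 at the edges

# The integral over [0, N + 1/2] is exactly the sum of the first N peak areas,
# so no numerical quadrature is needed.
for N in (5, 10, 20, 40):
    integral = sum(2.0 ** (-n) for n in range(1, N + 1))
    print(f"integral over [0, {N} + 1/2] = {integral:.10f},   f({N}) = {f(N):g}")
```

The printed partial integrals approach 1 while ##f(N) = N## keeps growing, which is exactly the behaviour statement (b) would forbid, so the statement is false.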
 
  • #9
Thanks pasmith, indeed that would be a counterexample, but I have no idea how to define such a function explicitly (except for the parts where ##f(x)=0##, of course). Nevertheless, wouldn't describing it as you did (whatever the explicit expression for its formula might be) count as a correct answer for concluding that the statement is false?
 

1. What is the difference between an integral and convergence?

An integral is a mathematical operation that represents the area under a curve; it is used to compute the accumulated value of a function over a given interval. Convergence, on the other hand, refers to the behavior of a sequence, series, or improper integral: a series is said to converge if its partial sums approach a fixed value, and to diverge if they do not.

2. How can I determine if an integral is convergent or divergent?

There are several tests that can be used to determine whether an improper integral converges or diverges, such as the comparison test, the limit comparison test, and direct evaluation of the limit of ##\int^{b}_{a} f(x)\,dx## as ##b \rightarrow \infty##. These tests usually involve comparing the given integrand to a function whose improper integral is already known to converge or diverge.
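For instance, a typical use of the comparison test: since ##0 \le \frac{1}{x^{2}+1} \le \frac{1}{x^{2}}## for ##x \ge 1## and ##\int^{\infty}_{1} \frac{dx}{x^{2}} = 1## converges, the integral ##\int^{\infty}_{1} \frac{dx}{x^{2}+1}## converges as well.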

3. Can an integral and a series both be convergent?

Yes, it is possible for both an integral and a related series to be convergent; this is exactly what the integral test describes for positive, decreasing functions. It is not automatic in general, however, so each should be checked separately rather than assuming that the convergence of one implies the convergence of the other.

4. What is the relationship between the convergence of a series and the convergence of its integral?

For a function ##f## that is positive and decreasing on ##[1,\infty)##, the integral test says that the series ##\sum_{n=1}^{\infty} f(n)## and the improper integral ##\int^{\infty}_{1} f(x)\,dx## either both converge or both diverge. Without the monotonicity assumption neither implication holds in general; the peaked function discussed in this thread, for instance, has a convergent integral while ##\sum_{n} f(n)## diverges.
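In a worked form, for ##f## positive and decreasing on ##[1,\infty)## one has, for every ##N##,

##\int^{N+1}_{1} f(x)\,dx \;\le\; \sum_{n=1}^{N} f(n) \;\le\; f(1) + \int^{N}_{1} f(x)\,dx,##

so the partial sums are bounded exactly when the partial integrals are, and the series and the integral converge or diverge together.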

5. Can an integral and a series both be divergent?

Yes, it is possible for both an integral and a series to be divergent. In the setting of the integral test (a positive, decreasing integrand), the series and the integral in fact diverge together, and in that case both grow without bound.
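A concrete illustration: ##f(x) = 1/x## is positive and decreasing on ##[1,\infty)##, the improper integral ##\int^{\infty}_{1} \frac{dx}{x}## diverges, and the harmonic series ##\sum_{n=1}^{\infty} \frac{1}{n}## diverges with it.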
