Hello everyone!
I'm a student of electrical engineering, preparing for a theoretical exam in math that will cover topics such as differential geometry, multiple integrals, vector analysis, complex analysis, and so on. The other day I was browsing through the required-knowledge sheet our professor sent us, and one question in particular is still perplexing to me.
The reason I'm seeking help on this site is that I wasn't able to find a concrete answer to my question either in the textbook or online. This thread is more or less "theoretical" in nature. Any help would be very much appreciated.
1. The problem statement, all variables, and given/known data
The question goes like this:
When does the parameter-dependent integral $$F(y)=\int_a^b f(x,y)dx$$ uniformly converge?
Homework Equations
In order to answer this question, one must be familiar with the definition of the parameter-dependent integral. Directly from my notes:
Suppose ##f(x,y)## is a continuous function ##f:[a,b] \times [c,d] \to \mathbb{R}##; then $$F(y) = \int_a^b f(x,y) dx$$ is known as a parameter-dependent integral.
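For concreteness, here is a small example of my own (not from the notes). Take ##f(x,y)=e^{-xy}## on ##[0,1]\times[0,1]##; then $$F(y)=\int_0^1 e^{-xy}dx = \frac{1-e^{-y}}{y}$$ for ##y \neq 0## and ##F(0)=1##, and ##F## is a continuous function of the parameter ##y##.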
The Attempt at a Solution
I found a small "note to self" while going through my lecture notes that says:
In order for $$F(y) = \int_a^\infty f(x,y)dx$$ to exist, uniform convergence is a mandatory condition.
Unfortunately, I never explained that claim, but from what I understand, uniform convergence is a stronger type of convergence than pointwise convergence.
In terms of the parameter-dependent integral, if we have a difference $$F(y+h)-F(y)=\int_a^\infty f(x,y + h)dx - \int_a^\infty f(x,y)dx$$ and split the domain of integration into two integrals, one with finite and the other with infinite integration boundaries, then we get $$\int_a^m [f(x,y + h) - f(x,y)] dx + \int_m^\infty [f(x,y + h) - f(x,y)]dx$$
And here's the catch. The first integral can be made arbitrarily small by choosing ##h## small enough, since ##f(x,y)## is continuous (and therefore uniformly continuous on the compact rectangle ##[a,m]\times[c,d]##). However, the second integral is where uniform convergence should somehow turn out to be necessary.
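If I try to write out the estimate the way I think it is meant (my own reconstruction, not from the notes), with uniform convergence available it would go like this: $$|F(y+h)-F(y)| \le \int_a^m |f(x,y+h)-f(x,y)|dx + \left|\int_m^\infty f(x,y+h)dx\right| + \left|\int_m^\infty f(x,y)dx\right|$$ Uniform convergence lets me pick one ##m## that makes each of the two tail terms smaller than ##\epsilon/3## for every parameter value at once, and then uniform continuity of ##f## on ##[a,m]\times[c,d]## makes the first term smaller than ##\epsilon/3## for ##|h|## small enough, so ##|F(y+h)-F(y)|<\epsilon##.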
From what I've read here, the parameter-dependent integral converges uniformly for ##y\in[c,d]## if for each ##\epsilon > 0## there exists a ##\delta(\epsilon)## such that $$\left|F(y) - \int_a^m f(x,y)dx\right| < \epsilon$$ for all ##m \geq \delta(\epsilon)## and all ##y\in[c,d]##.
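To convince myself that the uniformity really matters, I also looked at what I believe is a standard counterexample (my own addition, not from the lecture notes): take ##f(x,y)=ye^{-xy}## on ##[0,\infty)\times[0,1]##. Then $$F(y)=\int_0^\infty ye^{-xy}dx=\begin{cases}1, & y>0\\ 0, & y=0\end{cases}$$ so ##F## is discontinuous at ##y=0## even though ##f## is continuous, and the tail $$\int_m^\infty ye^{-xy}dx=e^{-my}$$ goes to ##0## for each fixed ##y>0## but not uniformly on ##(0,1]## (at ##y=1/m## it equals ##e^{-1}##).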
My question, therefore, is: given what I've provided, is it possible that uniform convergence is a necessary condition when evaluating the parameter-dependent improper integral simply because, without it, the continuity of the function ##F(y)## cannot be ensured?
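Purely as a numerical sanity check (a quick sketch of my own in Python, using the counterexample above and SciPy's quad routine), the supremum of the tail over ##y## stays near 1 no matter how large ##m## gets, while for any fixed ##y>0## it dies off:

```python
import numpy as np
from scipy.integrate import quad

# Counterexample from above: f(x, y) = y * exp(-x*y) on [0, inf) x [0, 1].
# Analytically the tail  int_m^inf f(x, y) dx  equals exp(-m*y), so for each
# fixed y > 0 it tends to 0 as m grows, but its supremum over y in (0, 1]
# stays close to 1, i.e. the convergence is not uniform.
def f(x, y):
    return y * np.exp(-x * y)

for m in (10, 100, 1000):
    tails = [quad(f, m, np.inf, args=(y,))[0] for y in np.linspace(1e-4, 1.0, 200)]
    print(f"m = {m:4d}: tail at y=1 ~ {tails[-1]:.2e}, sup over y ~ {max(tails):.3f}")
```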
I really hope this post makes sense to you, but regardless, I wish you a happy Sunday!