Uniform convergence of a parameter-dependent integral

  • #1
Hello everyone!

I'm an electrical engineering student preparing for a theoretical math exam covering topics such as differential geometry, multiple integrals, vector analysis, and complex analysis. The other day I was browsing through the list of required knowledge our professor sent us, and one question in particular is still perplexing me.
I'm seeking help here because I wasn't able to find a concrete answer to my question either in the textbook or online. This thread is more or less "theoretical" in nature. Any help would be much appreciated.

1. The problem statement, all variables, and given/known data
The question goes like this:
When does the parameter-dependent integral $$F(y)=\int_a^b f(x,y)\,dx$$ converge uniformly?

Homework Equations


In order to answer this question, one must be familiar with the definition of the parameter-dependent integral. Directly from my notes:
Suppose ##f(x,y)## is a continuous function ##f:[a,b] \times [c,d] \to \mathbb{R}##; then $$F(y) = \int_a^b f(x,y)\,dx$$ is known as a parameter-dependent integral.
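To make the definition concrete, here is a small example I worked out myself (it's not from the notes): taking ##f(x,y) = e^{-xy}## on ##[0,1] \times [1,2]## gives $$F(y) = \int_0^1 e^{-xy}\,dx = \frac{1 - e^{-y}}{y},$$ a perfectly ordinary continuous function of the parameter ##y##.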

The Attempt at a Solution


I found a small "note to self" while going through my lecture scripts that says:
In order for $$F(y) = \int_a^\infty f(x,y)dx $$ to exist, uniform convergence is a mandatory condition
Unfortunately, I never explained that claim, but from what I understand, uniform convergence is a stronger notion than pointwise convergence.
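To check my understanding of the difference, I looked at the standard example $$F(y) = \int_0^\infty y\,e^{-xy}\,dx,$$ which converges for every fixed ##y \ge 0## (to ##1## for ##y > 0## and to ##0## at ##y = 0##), so ##F## is discontinuous at ##y = 0##. The convergence is pointwise on ##[0,1]## but not uniform: the tail is ##\int_m^\infty y\,e^{-xy}\,dx = e^{-my}##, which for any fixed cutoff ##m## approaches ##1## as ##y \to 0^+##, so no single ##m## works for every ##y## at once.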

In terms of the parameter-dependent integral, if we take the difference $$F(y+h)-F(y)=\int_a^\infty f(x,y+h)\,dx - \int_a^\infty f(x,y)\,dx$$ and split the domain of integration into two parts, one with finite and the other with infinite integration boundaries, we get $$\int_a^m [f(x,y+h) - f(x,y)]\,dx + \int_m^\infty [f(x,y+h) - f(x,y)]\,dx.$$

And here's the catch. The first integral can be made arbitrarily small by choosing ##h## small, because ##f(x,y)## is continuous, and hence uniformly continuous, on the compact set ##[a,m] \times [c,d]##. However, it is the second integral that should somehow show why uniform convergence is needed.
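If I assume uniform convergence, the tail estimate does go through (this is my attempt to fill in the step my notes skip): uniform convergence gives an ##m## such that ##\left|\int_m^\infty f(x,y)\,dx\right| < \epsilon## for every ##y \in [c,d]## simultaneously, so $$\left|\int_m^\infty [f(x,y+h) - f(x,y)]\,dx\right| \le \left|\int_m^\infty f(x,y+h)\,dx\right| + \left|\int_m^\infty f(x,y)\,dx\right| < 2\epsilon,$$ with the same ##m## serving both ##y## and ##y+h##. Combined with the first integral, this gives ##|F(y+h) - F(y)| < 3\epsilon## for ##h## small enough, i.e. ##F## is continuous.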

From what I've read here, the parameter-dependent integral converges uniformly for ##y \in [c,d]## if for each ##\epsilon > 0## there exists an ##M(\epsilon)##, independent of ##y##, such that $$\left|\int_m^\infty f(x,y)\,dx\right| < \epsilon$$ for all ##m \geq M(\epsilon)## and all ##y \in [c,d]##.

My question, therefore, is: based on what I've provided, is it possible that uniform convergence is required when evaluating the parameter-dependent improper integral simply because, without it, the continuity of the function ##F(y)## cannot be ensured?
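For what it's worth, the sufficient criterion I keep running into (the Weierstrass test for integrals; I'm citing it as a general fact, since my notes don't state it) is: if there is a ##g(x)## with ##|f(x,y)| \le g(x)## for all ##y \in [c,d]## and ##\int_a^\infty g(x)\,dx < \infty##, then ##\int_a^\infty f(x,y)\,dx## converges uniformly on ##[c,d]##. For example, ##\int_0^\infty e^{-x}\cos(xy)\,dx## converges uniformly in ##y## because ##|e^{-x}\cos(xy)| \le e^{-x}## and ##\int_0^\infty e^{-x}\,dx = 1##.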

I really hope this post makes sense to you; regardless, I wish you a happy Sunday!
 

Answers and Replies

  • #3
Yeah, I was wondering why this question would even exist or how it pertains to parameter-dependent integrals. I contemplated this task for a while, and I don't think I'm even going in the right direction.

I'm sorry for posting nonsense, but this is the line of reasoning I took while trying to solve the task.
 
  • #4
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2020 Award
Yeah, I was wondering why this question would even exist or how it pertains to parameter-dependent integrals. I contemplated this task for a while, and I don't think I'm even going in the right direction.

I'm sorry for posting nonsense, but this is the line of reasoning I took while trying to solve the task.
You learn something every day:

https://www.encyclopediaofmath.org/index.php/Parameter-dependent_integral

Although I'm still not sure what the question is asking.
 
  • #5
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed
I found a small "note to self" while going through my lecture scripts that says:
In order for $$F(y) = \int_a^\infty f(x,y)dx $$ to exist, uniform convergence is a mandatory condition
Your note to self is wrong: in order for ##F(y) = \int_a^b f(x,y) \, dx## to exist, all you need is for ##f(x,y)## to be ##x##-integrable for each ##y##. If you want ##F(y)## to be continuous or have continuous derivatives, THEN you may need to assume more properties of ##f(x,y)##. Things may be a bit different when ##a## and ##b## are both finite than when one or both are infinite.
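For a standard illustration of that distinction: if ##f## is continuous on a compact rectangle ##[a,b] \times [c,d]## with ##a,b## finite, then ##F(y)## is automatically continuous, so nothing extra is needed. With an infinite limit, even existence becomes a ##y##-dependent question: $$\int_0^\infty e^{-xy}\,dx = \frac{1}{y}$$ exists for ##y > 0## but diverges for ##y \le 0##. On an interval like ##(0,1]## the convergence is not uniform (the tail ##\int_m^\infty e^{-xy}\,dx = e^{-my}/y## blows up as ##y \to 0^+##), while on ##[1,2]## it is uniform, since ##e^{-xy} \le e^{-x}## there.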
 
  • #6
That helped a lot, thank you for your time and patience!
 
  • #7
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed
That helped a lot, thank you for your time and patience!
Who are you responding to? In the future, please reply to the individual post you're addressing, so that the attributions stay clear.
 
