Integral over a Gaussian PDF whose parameters depend on the integration variable

SUMMARY

This discussion focuses on solving the integral of a Gaussian probability density function (PDF) whose parameters depend on the variable of integration, specifically $$\int_a^b \mathcal{N}(f(x_1,...,x_n,t),g(x_1,...,x_n,t))\, dt$$. The participants clarify that the mean and variance of the normal distribution change with the variable t, and they use the rule $$\mathcal{N}(\mu_1,\sigma_1) + \mathcal{N}(\mu_2,\sigma_2) = \mathcal{N}(\mu_1 + \mu_2,\sqrt{\sigma_1^2+\sigma_2^2})$$ for sums of independent normals. They also explore the use of Riemann sums to approximate the integral and conclude that as the number of partitions increases, the mean approaches $$\int_a^b f\, dt$$ while the variance diminishes towards zero.

PREREQUISITES
  • Understanding of Gaussian distributions and their properties
  • Familiarity with Riemann sums and integration techniques
  • Knowledge of functions of multiple variables, specifically $$f(x_1,...,x_n,t)$$ and $$g(x_1,...,x_n,t)$$
  • Basic calculus concepts, including limits and anti-derivatives
NEXT STEPS
  • Study the properties of Gaussian distributions, particularly under linear transformations
  • Learn about Riemann sums and their applications in numerical integration
  • Explore the concept of convergence in integrals and how it relates to variance
  • Investigate the implications of varying the function g in the context of Gaussian integrals
USEFUL FOR

Mathematicians, statisticians, and students studying probability theory or numerical analysis, particularly those interested in Gaussian integrals and their applications.

ariberth
Hello math helpers,
I am trying to understand how one could solve the following integral:
$$\int_a^b \mathcal{N}(f(x_1,...,x_n,t),g(x_1,...,x_n,t))\, dt,$$ where $$\mathcal{N}$$ is the normal distribution, and $$f(x_1,...,x_n,t): \mathbb{R}^{n+1} \rightarrow \mathbb{R}$$, $$g(x_1,...,x_n,t): \mathbb{R}^{n+1} \rightarrow
\mathbb{R}$$. So the mean and variance change with t. I read that $$\mathcal{N}(\mu_1,\sigma_1) + \mathcal{N}(\mu_2,\sigma_2) = \mathcal{N}(\mu_1 + \mu_2,\sigma_1+\sigma_2).$$ Does that mean that I just need to integrate f and g respectively? :confused:
 
ariberth said:
Hello math helpers,
I am trying to understand how one could solve the following integral:
$$\int_a^b \mathcal{N}(f(x_1,...,x_n,t),g(x_1,...,x_n,t))\, dt,$$ where $$\mathcal{N}$$ is the normal distribution, and $$f(x_1,...,x_n,t): \mathbb{R}^{n+1} \rightarrow \mathbb{R}$$, $$g(x_1,...,x_n,t): \mathbb{R}^{n+1} \rightarrow
\mathbb{R}$$. So the mean and variance change with t.

Hi ariberth! Welcome to MHB! (Smile)

Since $x_1,...,x_n$ are not referenced anywhere, we can assume them to be constant and reduce the problem to:
$$\int_a^b \mathcal{N}(f(t),g(t)) dt$$

I read that $$\mathcal{N}(\mu_1,\sigma_1) + \mathcal{N}(\mu_2,\sigma_2) = \mathcal{N}(\mu_1 + \mu_2,\sigma_1+\sigma_2).$$ Does that mean that I just need to integrate f and g respectively? :confused:

Not quite. It's a little more complex.
It should be:
$$\mathcal{N}(\mu_1,\sigma_1) + \mathcal{N}(\mu_2,\sigma_2) = \mathcal{N}(\mu_1 + \mu_2,\sqrt{\sigma_1^2+\sigma_2^2})$$
or with an alternative and easier notation:
$$\mathcal{N}(\mu_1,\sigma_1^2) + \mathcal{N}(\mu_2,\sigma_2^2) = \mathcal{N}(\mu_1 + \mu_2,\sigma_1^2+\sigma_2^2)$$

We can write the integral as the limit of, say, a Left Riemann Sum (see the definition of a Riemann integral):
$$\int_a^b \mathcal{N}(f(t),g(t)) dt = \lim_{n \to \infty} \sum_{i=0}^{n-1} \mathcal{N}(f(t_i),g(t_i)) \Delta t
$$
where $\Delta t = \frac{b-a}n$ and $t_i = a + i\Delta t$.

Let's pick an example.

Suppose we pick $a=0, b=1, f(t)=t, g(t)=1$, and $n=2$.
What will be the Riemann sum:
$$\sum_{i=0}^{n-1} \mathcal{N}(f(t_i),g(t_i)) \Delta t$$
? (Wondering)

And what if we pick $n=4$?
 
I like Serena said:
Let's pick an example.

Suppose we pick $a=0, b=1, f(t)=t, g(t)=1$, and $n=2$.
What will be the Riemann sum:
$$\sum_{i=0}^{n-1} \mathcal{N}(f(t_i),g(t_i)) \Delta t$$
? (Wondering)

And what if we pick $n=4$?

Thanks a lot for the tip with the Riemann sums. Following your hint, I discovered that I can use the fact that the normal distribution is closed under linear transformations. So for the first example, where $a=0, b=1, f(t)=t, g(t)=1$, and $n=2$, this leads to:
$$\sum_{i=0}^{n-1} \mathcal{N}(f(t_i),g(t_i)) \Delta t = \sum_{i=0}^{1} \mathcal{N}(\tfrac{i}{2},1)\, \tfrac{1}{2} = (\mathcal{N}(0,1) + \mathcal{N}(\tfrac{1}{2},1))\, \tfrac{1}{2} = \mathcal{N}(\tfrac{1}{2},2)\, \tfrac{1}{2} = \mathcal{N}(\tfrac{1}{4},\tfrac{1}{2})$$
Using that, in the limit this would lead to: $$\lim_{n \to \infty} \sum_{i=0}^{n-1} \mathcal{N}(f(t_i),g(t_i)) \Delta t = \lim_{n \to \infty} \mathcal{N}\left( \sum_{i=0}^{n-1}\Delta t\, f(t_i),\ \sum_{i=0}^{n-1}(\Delta t)^2 g(t_i)\right)$$ Is that correct?
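As an aside, this single step can be checked numerically. Below is a small Python sketch (my own illustration, not part of the thread) that computes the parameters of the combined normal $\sum_i \mathcal{N}(f(t_i),g(t_i))\,\Delta t$, reading $g$ as the variance and using the rules $\mathcal{N}(m_1,v_1)+\mathcal{N}(m_2,v_2)=\mathcal{N}(m_1+m_2,v_1+v_2)$ and $c\,\mathcal{N}(m,v)=\mathcal{N}(cm,c^2v)$:

```python
# Sketch (illustration, not from the thread): parameters of the normal that a
# Left Riemann Sum of N(f(t_i), g(t_i)) terms collapses to, with g read as the
# variance. Sums of independent normals add means and variances; scaling by
# dt scales the mean by dt and the variance by dt**2.

def riemann_normal_params(f, g, a, b, n):
    """Return (mean, variance) of sum_i N(f(t_i), g(t_i)) * dt."""
    dt = (b - a) / n
    mean = sum(f(a + i * dt) * dt for i in range(n))
    var = sum(g(a + i * dt) * dt ** 2 for i in range(n))
    return mean, var

# The example above: a=0, b=1, f(t)=t, g(t)=1, n=2.
print(riemann_normal_params(lambda t: t, lambda t: 1, 0, 1, 2))  # (0.25, 0.5)
```

This reproduces $\mathcal{N}(\frac14,\frac12)$ for $n=2$.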
 
ariberth said:
Thanks a lot for the tip with the Riemann sums. Following your hint, I discovered that I can use the fact that the normal distribution is closed under linear transformations. So for the first example, where $a=0, b=1, f(t)=t, g(t)=1$, and $n=2$, this leads to:
$$\sum_{i=0}^{n-1} \mathcal{N}(f(t_i),g(t_i)) \Delta t = \sum_{i=0}^{1} \mathcal{N}(\tfrac{i}{2},1)\, \tfrac{1}{2} = (\mathcal{N}(0,1) + \mathcal{N}(\tfrac{1}{2},1))\, \tfrac{1}{2} = \mathcal{N}(\tfrac{1}{2},2)\, \tfrac{1}{2} = \mathcal{N}(\tfrac{1}{4},\tfrac{1}{2})$$
Using that, in the limit this would lead to: $$\lim_{n \to \infty} \sum_{i=0}^{n-1} \mathcal{N}(f(t_i),g(t_i)) \Delta t = \lim_{n \to \infty} \mathcal{N}\left( \sum_{i=0}^{n-1}\Delta t\, f(t_i),\ \sum_{i=0}^{n-1}(\Delta t)^2 g(t_i)\right)$$ Is that correct?

Yep - assuming that g(t) is the variance instead of the standard deviation. (Nod)
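That variance convention can also be sanity-checked by sampling. The sketch below (my addition, not from the thread) draws from $\frac12\left(\mathcal{N}(0,1)+\mathcal{N}(\frac12,1)\right)$ and compares the empirical mean and variance against $\mathcal{N}(\frac14,\frac12)$:

```python
# Sketch (illustration): Monte Carlo check that (1/2)(N(0,1) + N(1/2,1))
# has mean 1/4 and *variance* 1/2, i.e. N(1/4, 1/2) in variance notation.
import random

random.seed(0)
n_samples = 200_000
xs = [0.5 * (random.gauss(0.0, 1.0) + random.gauss(0.5, 1.0))
      for _ in range(n_samples)]
mean = sum(xs) / n_samples
var = sum((x - mean) ** 2 for x in xs) / n_samples
print(round(mean, 2), round(var, 2))  # should be close to 0.25 and 0.5
```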
 
So that means I have to solve the following two limits:

$$\lim\limits_{n \to \infty}\sum_{i=0}^{n-1}\Delta t\, f(t_i)$$ and $$ \lim\limits_{n \to \infty}\sum_{i=0}^{n-1}(\Delta t)^2 g(t_i)$$
The first one is easy, since $$\lim\limits_{n \to \infty}\sum_{i=0}^{n-1}f(t_i)\, \Delta t = \int_a^b f(t)\, dt,$$
and I just have to find the anti-derivative of f.

Is there a way to do the same thing with the second:
$$ \lim\limits_{n \to \infty}\sum_{i=0}^{n-1}g(t_i)\, (\Delta t)^2 = \,??$$
 
Good!

Let's do another example.
Or rather, the same example with n=4.
What's the pattern?
 
$$\sum_{i=0}^{n-1} \mathcal{N}(f(t_i),g(t_i)) \Delta t = \sum_{i=0}^{3} \mathcal{N}(\tfrac{i}{4},1)\, \tfrac{1}{4} =\tfrac{1}{4}(\mathcal{N}(0,1) + \mathcal{N}(\tfrac{1}{4},1) + \mathcal{N}(\tfrac{2}{4},1) +\mathcal{N}(\tfrac{3}{4},1)) = \tfrac{1}{4}\, \mathcal{N}(\tfrac{6}{4},4) = \mathcal{N}(\tfrac{6}{16},\tfrac{4}{16})$$ The only pattern I see is that the scalar from $$\Delta t$$ is the inverse of the variance, but that depends on how g is chosen. So I don't really know what the pattern is... :confused:
 
ariberth said:
$$\sum_{i=0}^{n-1} \mathcal{N}(f(t_i),g(t_i)) \Delta t = \sum_{i=0}^{3} \mathcal{N}(\tfrac{i}{4},1)\, \tfrac{1}{4} =\tfrac{1}{4}(\mathcal{N}(0,1) + \mathcal{N}(\tfrac{1}{4},1) + \mathcal{N}(\tfrac{2}{4},1) +\mathcal{N}(\tfrac{3}{4},1)) = \tfrac{1}{4}\, \mathcal{N}(\tfrac{6}{4},4) = \mathcal{N}(\tfrac{6}{16},\tfrac{4}{16})$$ The only pattern I see is that the scalar from $$\Delta t$$ is the inverse of the variance, but that depends on how g is chosen. So I don't really know what the pattern is... :confused:

The pattern is that $\mu$ approaches $\frac 1 2$ as expected, since $\int f(t)dt = \int_0^1 t\,dt = \frac 12$.
And we see that $\sigma^2$ becomes smaller and smaller, approaching 0.

Indeed, $\sum g(t_i) \Delta t$ approaches $\int g(t)dt$.
So multiplying it with another $\Delta t$ makes it approach 0.

This matches the fact that if you keep taking averages (ad infinitum), you are eventually left with the expected mean and negligible variance.
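To make the pattern concrete, here is a quick Python sketch (my addition, not from the thread) tabulating the Riemann-sum mean $\sum f(t_i)\,\Delta t$ and variance $\sum g(t_i)\,(\Delta t)^2$ for the running example as $n$ grows:

```python
# Sketch: for f(t)=t, g(t)=1 on [0,1], the Riemann-sum mean tends to the
# integral 1/2, while the variance sum g(t_i)*dt^2 = n*(1/n)^2 = 1/n -> 0.

def params(f, g, a, b, n):
    dt = (b - a) / n
    mean = sum(f(a + i * dt) * dt for i in range(n))
    var = sum(g(a + i * dt) * dt ** 2 for i in range(n))
    return mean, var

for n in (2, 4, 16, 256, 4096):
    mean, var = params(lambda t: t, lambda t: 1, 0, 1, n)
    print(f"n={n:5d}  mean={mean:.4f}  var={var:.6f}")
```

The mean column approaches $\frac12$ and the variance column shrinks like $\frac1n$.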
 
I like Serena said:
The pattern is that $\mu$ approaches $\frac 1 2$ as expected, since $\int f(t)dt = \int_0^1 t\,dt = \frac 12$.
And we see that $\sigma^2$ becomes smaller and smaller, approaching 0.

Indeed, $\sum g(t_i) \Delta t$ approaches $\int g(t)dt$.
So multiplying it with another $\Delta t$ makes it approach 0.

This matches the fact that if you keep taking averages (ad infinitum), you are eventually left with the expected mean and negligible variance.

I don't believe it. So the variance always approaches 0, no matter what choice of g I take?
 
ariberth said:
I don't believe it. So the variance always approaches 0, no matter what choice of g I take?

Yup.

Well... maybe if you have a $g$ that approaches infinity...
... or take an interval that is infinitely large...
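A small sketch (my addition; the choices of $g$ are arbitrary examples) shows the same behaviour for several bounded functions: the variance term $\sum g(t_i)\,(\Delta t)^2 \approx \Delta t \int_a^b g(t)\,dt$ shrinks towards 0 as $n$ grows, regardless of the shape of $g$:

```python
# Sketch: for any bounded g on a finite interval, sum g(t_i)*dt^2 is roughly
# dt * (integral of g), so it shrinks like 1/n regardless of the shape of g.
import math

def var_sum(g, a, b, n):
    dt = (b - a) / n
    return sum(g(a + i * dt) * dt ** 2 for i in range(n))

examples = {
    "1 + t^2": lambda t: 1 + t ** 2,
    "exp(t)": math.exp,
    "100|sin 50t|": lambda t: 100 * abs(math.sin(50 * t)),
}
for name, g in examples.items():
    print(name, [round(var_sum(g, 0, 1, n), 5) for n in (10, 100, 1000)])
```

Only an unbounded $g$, or an unbounded interval, can keep the sum from vanishing.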
 
