# Convolution of densities and distributions

1. Oct 18, 2010

### Kreizhn

Hello everyone,

I have a quick theoretical question regarding probability. If you answer, I would appreciate it if you would be as precise as possible about terminology.

Here is the problem: I'm working on some physics problems that do probability in abstract spaces, and the author freely moves between calling some poorly defined function f a density, a measure, and a distribution. From my knowledge of measure theory, these are all very different things, though they are all interrelated.

In particular, the author talks about having two objects, say $x_1$ sampled from $f_1$ and $x_2$ sampled from $f_2$. He wants to calculate the joint sampling distribution (?) of the product $x = x_1 \cdot x_2$, which is defined via the convolution

$$(f_1 \star f_2)(x) = \int f_1(x \cdot x_2^{-1}) f_2(x_2) d x_2$$

Where I hope it's clear that we're using multiplicative notation rather than the classical additive notation.
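For what it's worth, here is a quick Monte Carlo sanity check of my own (not from the paper) of what that multiplicative convolution gives in a concrete case. Taking $x_1, x_2$ independent Uniform(0,1) and $d\mu(x_2) = dx_2/x_2$ (the Haar measure on the positive reals), the convolution evaluates to $\int_x^1 dy/y = -\ln x$ on $(0,1)$, so the CDF of the product is $x - x\ln x$:

```python
import math
import random

random.seed(0)
N = 200_000

# Sample the product Z = X1 * X2 of two independent Uniform(0,1) variables.
samples = [random.random() * random.random() for _ in range(N)]

# The multiplicative convolution with Haar measure dy/y gives the density
# f(z) = -ln z on (0, 1), hence the CDF  F(z) = z - z*ln(z).
z = 0.5
empirical = sum(s <= z for s in samples) / N
theoretical = z - z * math.log(z)

print(f"empirical F({z}) = {empirical:.4f}, theoretical = {theoretical:.4f}")
assert abs(empirical - theoretical) < 0.01
```

The choice of uniforms and of the Haar measure is mine, purely to have a case where the integral can be done by hand.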

What it comes down to is that I'm trying to figure out what the author really means when talking about f. Is f a density or a distribution?

The Wikipedia article http://en.wikipedia.org/wiki/Convolution#Applications says that densities obey convolution.

It would seem to me that this must be a density since if it were a distribution we would need to integrate over "sections" or at least non-zero measure sets.

Anyway, I really just want an answer as to whether densities, distributions, or both obey convolution. Thanks.

Last edited by a moderator: Apr 25, 2017
2. Oct 18, 2010

Convolution makes sense for both densities and distribution functions. If $$X_1, X_2$$ have densities $$f_1, f_2$$ the density of $$X_1 + X_2$$
is the convolution of $$f_1, f_2$$. The same is true for the distribution functions.
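A quick numerical illustration of the density statement (my own sketch, not part of the thread): discretizing two Uniform(0,1) densities on a grid and convolving them reproduces the triangular density of $X_1 + X_2$, which equals 1 at $x = 1$.

```python
# Discretize two Uniform(0,1) densities and convolve them numerically;
# the result approximates the triangular density of X1 + X2.
h = 0.001                  # grid spacing
n = int(1 / h)             # 1000 grid points covering [0, 1)
f1 = [1.0] * n             # density of Uniform(0,1)
f2 = [1.0] * n

# (f1 * f2)(x) ~ h * sum_k f1[k] * f2[j - 1 - k], evaluated at x = 1 (j = n).
j = n
conv_at_1 = h * sum(f1[k] * f2[j - 1 - k] for k in range(n))

print(conv_at_1)           # triangular density of X1 + X2 equals 1 at x = 1
assert abs(conv_at_1 - 1.0) < 0.01
```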

3. Oct 18, 2010

### Kreizhn

Perhaps you could answer something else then that's been bugging me.

Theoretically, a distribution is the measure of the preimage under a random variable, right? That is, let $(\Omega, \Sigma, \mu)$ be a probability space with probability measure $\mu$, and let $(\mathbb R, \nu)$ be the real line under the Lebesgue measure. X is a random variable if $X: \Omega \to \mathbb R$ is measurable. Then the distribution of X is $f:\mathcal L_\nu(\mathbb R) \to \mathbb R$ given by $f(S) = \mu(X^{-1}(S))$, where $\mathcal L_\nu(\mathbb R)$ is the set of Lebesgue measurable real sets.
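On a finite space that definition is easy to make concrete. A toy sketch of my own (all names are just for illustration): the distribution of X is the pushforward of $\mu$ along X.

```python
from fractions import Fraction

# Toy probability space: Omega = {a, b, c} with probability measure mu on
# its points (so Sigma is the full power set).
mu = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 4)}

# A random variable X: Omega -> R.
X = {"a": 0, "b": 1, "c": 1}

def pushforward(S):
    """Distribution of X: f(S) = mu(X^{-1}(S))."""
    return sum(p for omega, p in mu.items() if X[omega] in S)

print(pushforward({1}))               # mu({b, c}) = 1/2
assert pushforward({1}) == Fraction(1, 2)
assert pushforward({0, 1}) == 1       # total mass is preserved
```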

In this case, how can we define a convolution on distributions? Namely, if $f_1, f_2$ are distributions, then

$$(f_1\star f_2)(x) = \int_{\Omega} f_1(y) f_2(x-y) d\mu(y)$$
(where I hope that you'll forgive the fact that I've reverted back to additive notation: it just seems easier for this discussion).

However, we're integrating over singletons in the $\sigma$-algebra, and hence the convolution will be zero. Does this make sense? Am I missing something? Should this instead be

$$(f_1 \star f_2)(x) = \int_{\Sigma} f_1(Y) f_2(X \setminus Y) \, d\mu(Y)$$

4. Oct 18, 2010

You seem to be using $$f_1, f_2$$ to represent measurable functions rather than the measures generated by the distributions. If so, there isn't anything out of sorts with your notation - it is simply the convolution of two functions.

It is possible to define the convolution of the associated measures. Look at the discussion that begins near the bottom of page 1 of the pdf at this link:

http://www.galaxy.gmu.edu/stats/syllabi/it971/Lecture11.pdf

Last edited by a moderator: Apr 25, 2017
5. Oct 19, 2010

### Kreizhn

Sounds good. But I wonder if this makes sense; it is taken almost verbatim from a paper I'm reading.

What is f here? It doesn't seem to me like it is actually a measure; at best a distribution? The author then also later notes that

$$\int d\mu(g) f^{\star m}(g) = 1, \qquad \forall m \in \mathbb N$$
where

$$f \in \mathcal F, f^{\star m} = \underbrace{f\star \cdots \star f }_{m \text{ times }}$$.
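As a sanity check on that identity (my own reasoning, assuming $\int f \, d\mu = 1$, that Fubini applies, and that $\mu$ is invariant under right translation, e.g. a Haar measure):

$$\int d\mu(g)\, (f_1 \star f_2)(g) = \int\!\!\int d\mu(g)\, d\mu(h)\, f_1(g h^{-1})\, f_2(h) = \left(\int f_1 \, d\mu\right)\!\left(\int f_2 \, d\mu\right) = 1$$

so the normalization of $f^{\star m}$ would follow by induction on m.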

Doesn't that imply it's a density? The author refers to f in this context as a measure, a distribution and a density, so I'm very confused.

Last edited by a moderator: May 5, 2017
6. Oct 19, 2010

### Office_Shredder

Staff Emeritus
Distribution, density, and measure can all mean the same thing. A density is something that you integrate to find the probability that you're inside a given region. But a measure, as long as the measure of the whole space is 1, is the same thing: you can integrate it over a region to find the probability you are in that region, as long as you interpret the measure as a density. "Probability distribution" sometimes refers to the cumulative distribution, but if it's being used interchangeably with "density" then it probably doesn't here.

7. Oct 19, 2010

### Kreizhn

Why would you integrate a measure to find the probability of being in a region? Let A be our region: if $\mu$ is the probability measure, the probability of being in A is just $\mu(A)$. By definition, the domain of a measure is the associated sigma-algebra, while the domain of a density is the underlying set; how can these be interpreted as the same thing?

Also, a distribution must have a random variable associated to it, while a probability measure need not. As a matter of fact, distributions need not have an associated density. A distribution is another measure on the space, and if one is lucky enough that it is absolutely continuous and the domain/codomain of the random variable are both sigma-finite, we can define that density using the Radon-Nikodym theorem.
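As a standard example of that last point (my own illustration): if $X$ is standard normal, its distribution $P_X$ is absolutely continuous with respect to Lebesgue measure $\lambda$, and the Radon-Nikodym derivative is exactly the familiar density:

$$P_X(B) = \int_B \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \, dx, \qquad \frac{dP_X}{d\lambda}(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$$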

Am I missing something here? These words all have very precise mathematical meanings and while they can all be related to each other in special circumstances, they are certainly different things.

8. Oct 19, 2010

"But a measure, as long as the measure of the whole space is 1, is the same thing. You can integrate it over a region to find the probability you are in that region as long as you interpret the measure as a density"

No, densities are not equivalent to the measure.

density --> distribution function <--> measure.

Consider a VERY simple case, the density

$$f(x) = 1, 0 \le x \le 1$$

You integrate this to find probability:

$$P(X <.5) = \int_0^{.5} f(x) \, dx = \int_0^{.5} 1 \,dx = .5$$

The associated distribution function is defined in terms of the anti-derivative of f.

$$F(x) = \int_0^x f(t) \, dt = x \quad 0 \le x \le 1$$

and $F(x) = 0$ for $x < 0$, $F(x) = 1$ for $x > 1$.

Both the density and distribution function are point functions. The associated measure is a set function:

$$\mu(\{t \mid t \le x\}) = F(x)$$

In general, if $B$ is a Borel subset of the real line, and $F$ is the distribution function of a random variable $X$, the associated probability measure is

$$\mu(B) = \int_B \, dF$$

(Lebesgue-Stieltjes integral). If $F$ happens to be absolutely continuous with respect to Lebesgue measure, there is a density.
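A small numeric sketch of my own of the set-function view, for the uniform example above: on intervals, $\mu((a,b]) = F(b) - F(a)$.

```python
def F(x):
    """Distribution function of Uniform(0, 1)."""
    if x < 0:
        return 0.0
    if x > 1:
        return 1.0
    return x

def mu_interval(a, b):
    """Lebesgue-Stieltjes measure of the interval (a, b]: F(b) - F(a)."""
    return F(b) - F(a)

print(mu_interval(0.0, 0.5))            # 0.5
assert mu_interval(0.0, 0.5) == 0.5
assert mu_interval(-2.0, 3.0) == 1.0    # the whole space has mass 1
assert mu_interval(0.25, 0.75) == 0.5
```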

"Also, a distribution must have a random variable associated to it, while a probability measure need not."
Semantics. It is true that a probability measure is simply a special type of measure (one that assigns mass 1 to the underlying space) but, if you are going to the bother of referring to a "probability measure", you typically have a random variable in mind.

9. Oct 19, 2010

### Kreizhn

I believe in theory it is possible to consider two probability spaces $S_i=(\Omega_i, \Sigma_i, \mu_i), i\in\left\{1,2\right\}$. In this case, I think we really do require an explicit random variable, represented by a measurable function $X:\Omega_1 \to \Omega_2$. The distribution is then not a point function, because there isn't necessarily a well-defined order on $\Omega_2$ to make it a cumulative function. So the probability distribution of X, denoted $P_X$, is given by $P_X(B) = \mu_1(X^{-1}(B))$, which is again a measure.