Meaning of probability measure

In summary: when the cumulative distribution function is differentiable, its derivative is the probability density function; in general, though, P(dw) denotes integration with respect to the probability measure itself, which need not have a density.
  • #1
David1234
What is meant by
[tex]\int f(w) P (dw)[/tex]

I don't really understand [tex] P (dw) [/tex] here. Does it mean [tex] P (\{x : x \in B(w, \delta)\}) [/tex] for infinitesimally small [tex] \delta [/tex]?

For example, take [tex] P(x)=1/10[/tex] for [tex] x=1, 2, \ldots, 10 [/tex]. How can we interpret this in terms of the above integral?

Thanks...
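For a purely discrete measure like the one in the example, the integral collapses to a sum over the atoms, which can be checked numerically. A minimal pure-Python sketch (the function and variable names are mine, just for illustration):

```python
# Discrete probability measure: P({x}) = 1/10 for x = 1, ..., 10.
# For such a measure, the integral  ∫ f(w) P(dw)  reduces to the sum
#   Σ_x f(x) · P({x})  over the atoms x.

P = {x: 1 / 10 for x in range(1, 11)}  # the measure, given by its atoms


def integrate(f, P):
    """Integrate f against a purely atomic measure P (dict: atom -> mass)."""
    return sum(f(x) * mass for x, mass in P.items())


# Example: the expectation of the identity, E[X] = (1 + ... + 10)/10
print(integrate(lambda x: x, P))  # ≈ 5.5 (up to float rounding)
```

With f(w) = 1 at w = 1 and 0 otherwise, as in post #3, the same sum gives 1 · P({1}) = 1/10, matching the guess there.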
 
  • #2
I never took measure theory. Therefore my totally naive answer would be that P(dw) is just a distribution function. Of course, I'm probably completely wrong, given that I don't even know what a measure is.
 
  • #3
P(dw) is like a distribution function... maybe. I am confused about P(dw): is it the probability of dw? Then what is dw? Following the above example, say we have [tex] f(w)=1 [/tex] for w = 1 and 0 otherwise. What is the meaning of dw here, and hence the value of P(dw) at w = 1? I guess the above integral would give the value 1/10.
 
  • #4
I have never seen "P(dw)". I think you mean what I would call dP(w) = P'(w)dw, where P'(w), the derivative of the cumulative probability distribution, is the probability density function. In that case, [itex]\int F(w)dP= \int F(w)P'(w)dw[/itex] is the expected value of F.
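When a density P'(w) does exist, the expectation really is an ordinary integral of f(w)·P'(w), and can be approximated by any quadrature rule. A sketch using the midpoint rule and the standard exponential density (names and the truncation point are my own choices):

```python
# When P has a density p(w) = P'(w), the expectation
#   E[f] = ∫ f(w) dP(w) = ∫ f(w) p(w) dw
# is an ordinary integral.  Midpoint-rule sketch for the Exp(1)
# density p(w) = exp(-w) on [0, ∞), truncated at b = 50.

import math


def expectation(f, p, a, b, n=100_000):
    """Approximate ∫_a^b f(w) p(w) dw by the midpoint rule with n panels."""
    h = (b - a) / n
    mids = (a + (i + 0.5) * h for i in range(n))
    return sum(f(w) * p(w) for w in mids) * h


p = lambda w: math.exp(-w)                    # density of Exp(1)
mean = expectation(lambda w: w, p, 0.0, 50.0)
print(mean)  # ≈ 1.0, the mean of Exp(1)
```

The point of the rest of the thread, of course, is that this rewriting is only available when P is differentiable; the measure-theoretic notation does not require that.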
 
  • #5
I guess if P(w) has a derivative we can write it that way. I got the expression from a textbook by Patrick Billingsley. In general, when P(w) is not differentiable (as in the example above), we cannot write the expression in that form.
 
  • #6
The notation
[tex]
\int_\Omega X(\omega) \, \mathcal{P}(d\omega)
[/tex]

is used in probability to denote the expectation of the random variable [tex] X [/tex]
with respect to the probability measure (distribution) [tex] \mathcal{P} [/tex] over the probability space [tex] \Omega [/tex].

If [tex] \Lambda [/tex] is any measurable set, then

[tex]
\int_\Lambda X(\omega) \, \mathcal{P}(d\omega) = E[X \cdot 1_{\Lambda}]
[/tex]

If the probability space is the real line with measure [tex] \mu [/tex], then

[tex]
\int_\Lambda X(\omega) \, \mathcal{P}(d\omega) = \int_\Lambda f(x) \, \mu(dx)
[/tex]

is the Lebesgue-Stieltjes integral of [tex] f [/tex] with respect to the
probability measure [tex] \mu [/tex].

In more traditional form, if [tex] F [/tex] is the distribution function of [tex] \mu [/tex], and [tex] \Lambda [/tex] is an interval [tex] (a,b) [/tex], then

[tex]
\int_\Lambda X(\omega) \, \mathcal{P}(d\omega) = \int_\Lambda f(x) \, \mu(dx) = \int_{(a,b)} f(x) \, dF(x)
[/tex]

If the probability measure doesn't have any atoms, the final integral is just a Lebesgue integral. If there are atoms, you need to take care to specify the interval according to whether the endpoints are or are not included - e.g.

[tex]
\int_{a+0}^{b+0} f(x) \,dF(x), \quad \int_{a-0}^{b-0} f(x) \, dF(x)
[/tex]

and so on.
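The identity [tex] \int_\Lambda X(\omega)\,\mathcal{P}(d\omega) = E[X \cdot 1_\Lambda] [/tex] can be illustrated by Monte Carlo sampling. A sketch, with my own choice of space and function: Ω = [0, 1] with P the uniform (Lebesgue) measure, X(ω) = ω², and Λ = (0, 1/2), so the exact value is ∫₀^½ ω² dω = 1/24:

```python
# Monte Carlo illustration of  ∫_Λ X(ω) P(dω) = E[X · 1_Λ].
# Ω = [0, 1] with P = uniform (Lebesgue) measure, X(ω) = ω²,
# Λ = (0, 1/2).  Exact value: ∫_0^{1/2} ω² dω = 1/24 ≈ 0.041667.

import random

random.seed(0)
N = 1_000_000
samples = (random.random() for _ in range(N))

# Average of X(ω)·1_Λ(ω) over the samples: terms with ω ≥ 1/2 contribute 0.
estimate = sum(w * w for w in samples if w < 0.5) / N

print(estimate)  # ≈ 1/24
```

The indicator 1_Λ simply zeroes out the samples that fall outside Λ, which is exactly what restricting the domain of integration to Λ does on the left-hand side.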

Billingsley is one of the "classic" probability texts. Chang's "A Course in Probability Theory" is another - I studied from it many years ago, and have the second edition. His writing is a little terse, but there is a lot packed into his book.
 
  • #7
Thanks a lot for the detail answer. I guess by Chang you mean Kai Lai Chung... :)
 
  • #8
Yes, I did mean Kai Lai Chung - I would give a general description of my typing ability, but the description wouldn't be "safe for work".
Sorry for the confusion - glad the answer helped.
 
  • #9
HallsofIvy said:
I have never seen "P(dw)". I think you mean what I would call dP(w) = P'(w)dw, where P'(w), the derivative of the cumulative probability distribution, is the probability density function. In that case, [itex]\int F(w)dP= \int F(w)P'(w)dw[/itex] is the expected value of F.

This isn't really correct: P may not be differentiable. When you write [itex]\int_B f(x) P(dx)[/itex], you are referring to the Lebesgue integral of f with respect to P. It is the same as writing [itex]\int_B f \,dP[/itex]; the notation just makes explicit where the argument lives, so there is no confusion. I also have to disagree with statdad: this is not the Stieltjes integral, it is just the plain Lebesgue integral. For a Stieltjes integral you take a distribution function F of P and then work it out as [itex]\int_B f(x) \,dF(x)=\int_B f(x) P(dx)[/itex].
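For the discrete example from post #1, the Stieltjes formulation [itex]\int_B f(x)\,dF(x)[/itex] picks up exactly the jumps of F, reproducing the sum over atoms. A crude Riemann–Stieltjes-sum sketch (function names are mine; a fine partition is used so each jump lands in its own subinterval):

```python
# For the discrete P with P({x}) = 1/10, x = 1..10, the Stieltjes
# integral ∫ f dF over an interval containing all atoms picks up the
# jumps of the CDF F, reproducing  Σ_x f(x) · P({x}).


def F(t):
    """CDF of P: F(t) = (number of atoms ≤ t) / 10."""
    return sum(1 for x in range(1, 11) if x <= t) / 10


def stieltjes(f, F, a, b, n=10_000):
    """Riemann–Stieltjes sum  Σ f(t_i) · [F(t_{i+1}) − F(t_i)]  on [a, b]."""
    h = (b - a) / n
    return sum(f(a + i * h) * (F(a + (i + 1) * h) - F(a + i * h))
               for i in range(n))


# ∫ x dF(x) over [0, 11] ≈ Σ x · (1/10) = 5.5
print(stieltjes(lambda x: x, F, 0.0, 11.0))
```

Only the subintervals straddling a jump of F contribute, so the sum converges to Σ f(x)·P({x}) as the partition is refined.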

Billingsley is a nice textbook, and I would also recommend Ash, Real Analysis and Probability.
 
  • #10
Thanks...

I will have a look at "Real Analysis and Probability" by Ash.
 

What is the meaning of a probability measure?

A probability measure is a function that assigns to each measurable set of outcomes in a random experiment a number between 0 and 1, is countably additive, and assigns the whole sample space probability 1. It represents the likelihood of an event occurring and captures the uncertainty associated with an outcome.

How is a probability measure different from a probability distribution?

A probability measure can be defined on any measurable space, while a probability distribution usually refers to the particular probability measure that a random variable induces on its value space (typically the real line), often described by a distribution function or density. Every probability distribution is a probability measure; "measure" is the more general notion.

What are the different types of probability measures?

There are two main types of probability measures: discrete and continuous. Discrete probability measures are used for countable outcomes, while continuous probability measures are used for uncountable outcomes. There are also mixed probability measures that combine a discrete part and a continuous part.
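A mixed measure can be handled by treating its discrete and continuous parts separately and adding the results. A sketch for one concrete (invented) example, P = 0.3·δ₀ + 0.7·Uniform(0, 1):

```python
# Expectation under a mixed probability measure, as a sketch:
#   P = 0.3 · δ₀  +  0.7 · Uniform(0, 1)
# (a point mass at 0 plus a continuous part).  For such a measure,
#   E[f] = 0.3 · f(0) + 0.7 · ∫_0^1 f(x) dx.


def expectation_mixed(f, n=100_000):
    atom = 0.3 * f(0.0)                      # discrete part: weight × value
    h = 1.0 / n                              # midpoint rule for the density
    cont = 0.7 * sum(f((i + 0.5) * h) for i in range(n)) * h
    return atom + cont


# E[X] = 0.3·0 + 0.7·(1/2) = 0.35
print(expectation_mixed(lambda x: x))
```

The measure-theoretic integral ∫ f dP covers all three cases (discrete, continuous, mixed) with a single notation, which is exactly why the P(dw) form discussed in this thread is used.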

How is a probability measure used in statistics and data analysis?

In statistics and data analysis, probability measures are used to calculate the likelihood of different outcomes and to make predictions. They are also used to model real-world events and phenomena, and to make decisions based on uncertain information.

Can a probability measure be greater than 1?

No, a probability measure cannot assign any event a value greater than 1. By definition, the entire sample space receives probability 1 (certainty), and every event receives a value between 0 and 1; a value greater than 1 would have no probabilistic interpretation.
