Proof involving means in continuous distributions


Discussion Overview

The discussion revolves around the properties of means in continuous probability distributions, specifically exploring the relationship between the mean and the integral of absolute deviations from the mean. Participants are examining whether a specific equality involving the mean and probability density functions can be proven, and they are considering the implications of symmetry in distributions.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant recalls that the mean acts as a fulcrum, suggesting that the integral of absolute deviations from the mean should balance on either side of the mean.
  • Another participant proposes that the equality \(\int^{∞}_{μ} |x-μ| ρ(x) dx = \int^{μ}_{-∞} |x-μ| ρ(x) dx\) may need to be proven for each specific probability density function (PDF).
  • There is a suggestion to standardize the random variable to have a mean of zero to simplify the proof, using a transformation \(Y = X - μ\).
  • One participant argues that if the PDF is symmetric, the relationship holds, but questions arise about non-symmetric distributions and whether the equality still applies.
  • Another participant challenges the assertion that both sides of the fulcrum must be equal, suggesting that in some cases, one side may need to be negated to maintain balance.
  • Concerns are raised about proving the property for non-symmetric distributions, with suggestions that transformations may be necessary to relate back to the original variables.

Areas of Agreement / Disagreement

Participants express differing views on the validity of the proposed equality and the implications of symmetry in distributions. There is no consensus on whether the equality holds for non-symmetric distributions, and the discussion remains unresolved regarding the general case.

Contextual Notes

Participants note that proving the equality may depend on the specific properties of the probability density function being used, and there are unresolved questions about the implications of symmetry and transformations on the mean and the integrals involved.

Bipolarity
I recall reading somewhere that the mean value of a continuous variable is situated at a point that acts as a fulcrum about which all other values are considered "weights".

In other words, if we define the mean as

[tex]μ = \int^{∞}_{-∞} x ρ(x) dx[/tex] (where rho is the probability density)

then can we prove that

[tex]\int^{∞}_{μ} |x-μ| ρ(x) dx = \int^{μ}_{-∞} |x-μ| ρ(x) dx[/tex]

I am not sure my question is very clear considering I don't understand this too well, but perhaps someone understands what I mean?
I'm also not sure the equation is even correct, but my memory tells me I did this a while ago and my gut tells me my memory is not wrong. :D

EDIT: Made a big error with the LaTeX. Just fixed it.

BiP
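As a quick numerical sketch (added in editing, not part of the original posts), the claimed identity can be checked on a deliberately asymmetric density: the exponential with rate a, whose mean is 1/a.

```python
import math

# Check (added in editing) of the fulcrum identity from the OP,
#   ∫_{-∞}^{μ} |x-μ| ρ(x) dx  =  ∫_{μ}^{∞} |x-μ| ρ(x) dx,
# on the asymmetric exponential density ρ(x) = a e^{-a x} for x >= 0.

def trapezoid(f, a, b, n=20000):
    """Composite trapezoidal rule for the integral of f over [a, b]."""
    h = (b - a) / n
    return ((f(a) + f(b)) / 2 + sum(f(a + i * h) for i in range(1, n))) * h

rate = 1.5
rho = lambda x: rate * math.exp(-rate * x)  # density, supported on x >= 0
mu = 1.0 / rate                             # mean of Exponential(rate)

left = trapezoid(lambda x: (mu - x) * rho(x), 0.0, mu)    # |x-μ| weight below the mean
right = trapezoid(lambda x: (x - mu) * rho(x), mu, 50.0)  # above the mean (tail truncated at 50)

print(left, right)  # both come out ≈ e^{-1}/rate ≈ 0.245
```

Both wings agree to within the quadrature error, even though this density is not symmetric about its mean.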
 
Bipolarity said:
then can we prove that
[tex]\int^{∞}_{μ} |x-μ| ρ(x) dx = \int^{μ}_{-∞} |x-μ| ρ(x) dx[/tex]

Hey BiPolarity.

Basically, the fulcrum argument says that if you look at the mean equation, the two contributions balance:

[tex]\int^{∞}_{μ} xp(x)dx = \int^{μ}_{-∞}xp(x)dx[/tex]

Now in terms of proving this, we can use the property that

[tex]\int^{∞}_{-∞}xp(x)dx = μ[/tex]

We also know that

[tex]\int^{∞}_{μ} xp(x)dx + \int^{μ}_{-∞}xp(x)dx = \int^{∞}_{-∞}xp(x)dx = μ[/tex]

So if both pieces are equal, then both must equal [tex]\frac{μ}{2}[/tex], which is the intuitive explanation of the fulcrum interpretation.

Now in terms of showing this as a general property, I'm afraid that, as far as I know, you will need to do it for each PDF individually, since p(x) is a general probability density function.

What you can do for certain types of PDFs, such as symmetric and even functions, is show that this property holds by using properties of those general classes; but other than that, to prove it for a general PDF you will need to plug in the function and prove it.

One approach that I think would be optimal for the general case is to standardize your random variable to have mean 0 and a symmetric distribution about that mean, which would prove it for the general symmetric distribution; the trouble is that many distributions are not symmetric under a normal standardization.
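For what it's worth, a quick closed-form check (added in editing) with an asymmetric density suggests the two pieces of the mean integral need not be equal, a point the thread returns to below.

```python
import math

# Check (added in editing) of the split of the mean integral for the
# exponential density ρ(x) = e^{-x} (rate 1, so μ = 1). Closed forms:
#   ∫_1^∞ x e^{-x} dx = 2/e   and   ∫_0^1 x e^{-x} dx = 1 - 2/e.

upper = 2.0 / math.e        # piece of ∫ xρ(x) dx above the mean
lower = 1.0 - 2.0 / math.e  # piece below the mean

print(upper, lower, upper + lower)  # ≈ 0.736 and ≈ 0.264: they sum to μ = 1 but are not equal
```

So for an asymmetric density the two pieces of [tex]\int xρ(x)dx[/tex] are generally unequal; it is the absolute-deviation integrals from the OP that balance.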
 
After reading your post, it would make sense to standardize your random variable to have mean 0 using the simple transformation [tex]Y = X - μ[/tex].

You can then move on to prove the identity [tex]\int^{0}_{-∞}yp(y)dy = \int^{∞}_{0}yp(y)dy[/tex] which removes μ completely from the proof.

So with regard to your original expression, you need to lose the absolute value signs, and you need a random-variable transformation (as opposed to an ordinary integral substitution) to get the above result.
 
chiro said:
You can then move on to prove the identity [tex]\int^{0}_{-∞}yp(y)dy = \int^{∞}_{0}yp(y)dy[/tex] which removes μ completely from the proof.

I see. Thanks chiro!

By the way, I think you mean [tex]\int^{0}_{-∞}yp(y)dy = -\int^{∞}_{0}yp(y)dy[/tex].
There needs to be a negative sign because if the PDF is standardized, then one side of the fulcrum must have negative values. Or you could put absolute values in front of the y?

Oh and another thing, suppose that the probability distribution is not symmetric. Then does the property still hold?

BiP
 
Bipolarity said:
By the way, I think you mean [tex]\int^{0}_{-∞}yp(y)dy = -\int^{∞}_{0}yp(y)dy[/tex].

No that's not correct.

Remember, both are the same value: you get the same quantity on the left as on the right, which means they are equal, not negatives of each other.

As an example, if you have a fulcrum with two equal weights, they are both, say, 2, not 2 and -2. They are balanced just like the fulcrum, and this is expressed with an equality.
 
chiro said:
No that's not correct.

Actually chiro, I would have to disagree. If the PDF is [itex]ρ(x) = \frac{1}{\sqrt{\pi}}e^{-x^{2}}[/itex], then the left wing of the fulcrum will be the negative of the right wing, so you need to negate one of them to get the other.

Also, I think even if the probability distribution is not symmetric, the equality in my original post should hold, as long as the mean is defined. I should like to prove this for non-symmetric probability distributions, but I don't know how to.

BiP
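A numeric sketch of this point (added in editing): centering X distributed as Exponential(1) to Y = X - 1 gives a zero-mean but non-symmetric density p(y) = e^{-(y+1)} for y ≥ -1, and the two wings do come out as negatives of each other.

```python
import math

# Check (added in editing): with Y = X - 1 for X ~ Exponential(1), the density
# p(y) = e^{-(y+1)} on y >= -1 has mean zero but is not symmetric. The claim is
#   ∫_{-∞}^{0} y p(y) dy = -∫_{0}^{∞} y p(y) dy   (each wing has magnitude 1/e).

def trapezoid(f, a, b, n=20000):
    """Composite trapezoidal rule for the integral of f over [a, b]."""
    h = (b - a) / n
    return ((f(a) + f(b)) / 2 + sum(f(a + i * h) for i in range(1, n))) * h

p = lambda y: math.exp(-(y + 1.0))              # density of Y = X - 1
neg = trapezoid(lambda y: y * p(y), -1.0, 0.0)  # left wing, exact value -1/e
pos = trapezoid(lambda y: y * p(y), 0.0, 50.0)  # right wing, exact value +1/e

print(neg, pos)  # ≈ -0.368 and ≈ 0.368: negatives of each other
```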
 
Bipolarity said:
Actually chiro, I would have to disagree.

Yeah, you are right: even if the density is symmetric, multiplying by x gives an odd (non-symmetric) function when x can be both positive and negative. This also means I might be wrong above about the relationship with [tex]\frac{\mu}{2}[/tex]: that will only hold if x is always positive. If x can be negative, then you need to deal with a more general case.

In terms of non-symmetric distributions, I'm not sure how you prove the general case either because the function is so general that unless it has a property like being symmetric or whatever, then it's really unconstrained and too general to work with.

The only way I can think of is if you transform the PDF to something symmetric by introducing a transformation of the random variable and then showing some kind of argument relating the transformed variable and transformed mean back to the original variable and original mean.

I don't know if the above would work though, but it's the only thing I can think of currently.
 
chiro said:
If x is negative, then you will need to deal with a more general case.

Actually I think it is true even if x is negative. I have been doing some calculations in Maple with various probability distributions, and the equation I posted in the OP appears to hold no matter the probability distribution, as long as the mean for that distribution is defined, i.e. as long as

[tex]\int^{∞}_{-∞}xρ(x)dx[/tex]

is finite. So the "idea" of the mean being the fulcrum must be true, though I can only say this out of intuition.

However, I myself do not know much about probability distributions. Could you recommend some PDFs I can run calculations on, just to see whether or not it's correct?

EDIT: If you look at the equation in my OP, the absolute value is designed to account for negative x values left of the mean.

BiP
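For what it's worth, the intuition can be confirmed directly from the definition of the mean (a short derivation added in editing, assuming only that μ is finite). Since

[tex]\int^{∞}_{-∞}(x-μ)ρ(x)dx = \int^{∞}_{-∞}xρ(x)dx - μ\int^{∞}_{-∞}ρ(x)dx = μ - μ = 0[/tex]

splitting the integral at μ gives

[tex]\int^{∞}_{μ}(x-μ)ρ(x)dx = -\int^{μ}_{-∞}(x-μ)ρ(x)dx = \int^{μ}_{-∞}(μ-x)ρ(x)dx[/tex]

Since x-μ = |x-μ| to the right of μ and μ-x = |x-μ| to the left, this is exactly the equality in the OP, for any PDF with a finite mean, symmetric or not.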
 
Bipolarity said:
Actually I think it is true even if x is negative.

No, the negative property doesn't hold when x is always positive: p(x) is always positive, so xp(x) is always positive, which means both the LHS and RHS integrals are positive. Again, think about having 2 and dividing it into 1 on the LHS and 1 on the RHS, because the integral for both parts is positive. It's when you have negative x that you need to consider the negative case you mentioned (like the normal distribution).

In terms of distributions, I'd try it on the exponential, normal, and uniform for starters: basically anything you can integrate out for the mean expression to get an analytic answer, so that you can check it symbolically. Set up the integral and see what you get.
 
chiro said:
In terms of distributions, I'd try it on the exponential, normal, uniform for starters.

I actually do not know the PDFs for those distributions myself; could you list the actual functions?

BiP
 
Bipolarity said:
Could you list the actual functions?

Google is your friend ;), but for these distributions you have:

[itex]U(x) = \frac{1}{b-a}[/itex] for b > a and a < x < b, which is the uniform distribution.
[itex]E(x) = ae^{-ax}[/itex] for x > 0 and a > 0, which is the exponential distribution.
[itex]N(x) = \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}[/itex], which is the standard normal N(0,1).
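A sketch (added in editing) that checks the equality from the OP against all three of these densities, comparing each side of the fulcrum to a closed-form value derived for this check: uniform on (a, b) gives (b-a)/8 per side, the exponential with rate a gives e^{-1}/a, and the standard normal gives 1/√(2π). The specific parameter choices below are illustrative.

```python
import math

# Check (added in editing) of ∫_{-∞}^{μ}|x-μ|ρ dx = ∫_{μ}^{∞}|x-μ|ρ dx for the
# three densities listed above. Closed form for each side:
#   uniform on (a,b): (b-a)/8;  exponential(rate a): e^{-1}/a;  N(0,1): 1/√(2π).

def trapezoid(f, a, b, n=20000):
    """Composite trapezoidal rule for the integral of f over [a, b]."""
    h = (b - a) / n
    return ((f(a) + f(b)) / 2 + sum(f(a + i * h) for i in range(1, n))) * h

cases = [
    # (name, density, lower cutoff, upper cutoff, mean, closed form for each side)
    ("uniform(2,6)", lambda x: 0.25, 2.0, 6.0, 4.0, 0.5),
    ("exponential(1.5)", lambda x: 1.5 * math.exp(-1.5 * x), 0.0, 50.0, 1 / 1.5,
     math.exp(-1) / 1.5),
    ("normal(0,1)", lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi),
     -12.0, 12.0, 0.0, 1 / math.sqrt(2 * math.pi)),
]

results = {}
for name, rho, lo, hi, mu, exact in cases:
    left = trapezoid(lambda x: (mu - x) * rho(x), lo, mu)   # |x-μ| below the mean
    right = trapezoid(lambda x: (x - mu) * rho(x), mu, hi)  # |x-μ| above the mean
    results[name] = (left, right, exact)
    print(f"{name}: left={left:.5f}  right={right:.5f}  closed form={exact:.5f}")
```

In each case the two sides match the closed form, whether or not the density is symmetric.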
 
