Relating Moments from one Distribution to the Moments of Another

  • #1
Steve Zissou
TL;DR Summary
I hope to clarify my question soon, but I'd like to know how we can relate the moments of one sort of distribution to the moments of another.
Ok, I'm sure I can find a smarter way to pose this question, and I will try to define the question more carefully in coming days. That having been said, consider this:
Let's say we have a random variable X (or whatever). I can calculate the moments of this variable with no problem. In fact let's say X is normal. But I would like to characterize the distribution by a different distribution. In other words it is useful to model X by describing it with a different distribution. How can I relate the moments of X (that I know) to the moments of a new, different distribution?

Does this question make sense?

Thanks, friends
 
  • #2
Steve Zissou said:
TL;DR Summary: I hope to clarify my question soon, but I'd like to know how we can relate the moments of one sort of distribution to the moments of another.

I can calculate the moments of this variable with no problem.
To calculate the moments you need to know the probability distribution. So how do you calculate the second moment of X?
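For reference, the ##n##-th (raw) moment is ##E[X^n]##; the second moment, for instance, is
$$E[X^2]=\int_{-\infty}^{\infty}x^2 f(x)\,dx,$$
which for a normal distribution with mean ##\mu## and standard deviation ##\sigma## works out to ##\mu^2+\sigma^2##.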
 
  • #3
Steve Zissou said:
In fact let's say X is normal. But I would like to characterize the distribution by a different distribution.

Saying that X is normally distributed doesn't define a particular distribution unless you also state the specific values of the parameters of that normal distribution. Let's say you begin by saying that X is normally distributed with mean 30 and standard deviation 15.8. Suppose you decide to characterize X by a different distribution. Are you asking how to characterize X with a distribution that may have different numerical values for its mean and standard deviation? Are you asking if there is only one way to pick different numerical values for the mean and standard deviation of the new distribution? If so, you haven't defined a precise mathematical problem. What criteria are we to use to judge which particular different numerical values "characterize" X?

If you are thinking of a situation where we have data from a sample that we use to estimate the unknown parameters of a distribution, then there may be different customary estimators (i.e. formulas for making the estimates) for different distributions. The estimators are random variables. The distribution of an estimator depends on the distribution we assume for X. There are various criteria for which estimators are "best" - depending on one's interests. In such cases, if you change the assumed distribution for X, you might change your estimating formula and get different numerical estimates for the parameters of the new distribution. There is no general procedure that applies to all such situations. It depends on the particular distributions and goals of estimation.
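As a minimal sketch of that point (illustrative only; the sample and parameter values below are invented), the same data fitted under a normal assumption and under a gamma assumption gives different parameter estimates, and hence different implied moments:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.normal(loc=30.0, scale=15.8, size=5_000)
data = data[data > 0]                 # a gamma model requires positive values

# Maximum-likelihood fits under two different distributional assumptions
mu_hat, sigma_hat = stats.norm.fit(data)
alpha_hat, loc_hat, theta_hat = stats.gamma.fit(data, floc=0)  # fix location at 0

print("normal fit: mu=%.2f, sigma=%.2f" % (mu_hat, sigma_hat))
print("gamma fit:  alpha=%.2f, beta=%.3f" % (alpha_hat, 1.0 / theta_hat))
print("gamma-implied mean=%.2f, variance=%.1f"
      % (alpha_hat * theta_hat, alpha_hat * theta_hat**2))
```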
 
  • #4
hutchphd and Stephen Tashi, thank you for your quick replies. Stephen, I think you've already made a great point. But regardless, here is my attempt to "tighten up" my question.

Ok, so let's say we have some data that ostensibly "comes from" the normal:

$$f(x)=\frac{1}{\sigma\sqrt{2\pi}}\exp\left[ -\frac{1}{2}\left( \frac{x-\mu}{\sigma} \right)^2 \right]$$

...and has mean ##\mu##, variance ##\sigma^2## and skew 0 (whatever).

Now, for whatever reason, we'd like to "fit" this data with the Gamma distribution:

$$f(x)=\frac{\beta^\alpha}{\Gamma(\alpha)}x^{\alpha-1}\exp\left[ -\beta x \right]$$

with mean ##\frac{\alpha}{\beta}##, variance ##\frac{\alpha}{\beta^2}## and skew ##\frac{2}{\sqrt{\alpha}}.##

Is there a way to think about relating ##\mu## and ##\sigma^2## to ##\alpha## and ##\beta##?

Thanks!
 
  • #5
Steve Zissou said:
Now, for whatever reason, we'd like to "fit" this data with the Gamma distribution:

Steve Zissou said:
Is there a way to think about relating ##\mu## and ##\sigma^2## to ##\alpha## and ##\beta##?

Yes - provided you define "fit" in some precise way. For example, you might seek to find the gamma distribution that is the "least squares" fit to the normal distribution.

One complication to that approach is that there may not be a unique solution since there are several parameters involved. And a practical consideration is that one may not care about the tails of the distributions involved.
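The simplest precise choice, for comparison (a minimal sketch, and not the only reasonable definition of "fit"), is to equate the first two moments directly. Setting the gamma mean and variance equal to ##\mu## and ##\sigma^2## gives
$$\frac{\alpha}{\beta}=\mu,\qquad \frac{\alpha}{\beta^2}=\sigma^2 \quad\Longrightarrow\quad \alpha=\frac{\mu^2}{\sigma^2},\qquad \beta=\frac{\mu}{\sigma^2}.$$
The gamma's skew ##\frac{2}{\sqrt{\alpha}}## is then fixed and positive, so it cannot match the normal's skew of 0.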
 
  • #6
@Stephen Tashi
I see. So 1) define some best fit, and 2) make some specification for behavior of the tails
 
  • #7
Steve Zissou said:
@Stephen Tashi
I see. So 1) define some best fit, and 2) make some specification for behavior of the tails

In general, define a function that measures the "fit" between two distributions in some precise way so that the function is a function of the parameters of interest. Treat the parameters of one distribution as given and try to find the parameters of the other distribution that maximize the fit.

Of course defining the task precisely does not guarantee it has a solution or a unique solution.

A function measuring fit could have rules that eliminate the tails from consideration or it could weight what happens on the tails less than what happens on the more probable values.

We often see articles along the lines of "The Such-and-Such Approximation to the So-and-So Distribution" (e.g. the Normal approximation to the Binomial). I tend to get lost in the details of such articles without having a clear idea of how the authors have defined the goodness of approximation. If you look up articles of this type, you might find examples relating the moments of two distributions.
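A minimal sketch of this procedure (my own illustration, not a method prescribed in this thread): fix the normal parameters, define a weighted squared-error "misfit" as a function of the gamma parameters, and minimize it numerically. The grid, weighting, and starting values are all assumptions made for the example.

```python
import numpy as np
from scipy import stats, optimize

mu, sigma = 30.0, 15.8                       # the normal parameters from post #3
x = np.linspace(1e-6, mu + 6 * sigma, 2000)  # grid on x > 0 (the gamma's support)
dx = x[1] - x[0]

target = stats.norm.pdf(x, loc=mu, scale=sigma)
weight = target          # weight errors by probability: downweights the tails

def misfit(params):
    """Weighted integrated squared difference between the gamma and normal pdfs."""
    alpha, beta = params
    if alpha <= 0 or beta <= 0:
        return np.inf
    approx = stats.gamma.pdf(x, a=alpha, scale=1.0 / beta)  # SciPy's scale is 1/beta
    return np.sum(weight * (approx - target) ** 2) * dx

# Start from the moment-matched values alpha = mu^2/sigma^2, beta = mu/sigma^2
start = [mu**2 / sigma**2, mu / sigma**2]
result = optimize.minimize(misfit, start, method="Nelder-Mead")
print("fitted alpha, beta:", result.x)
```

Changing the weight function (or zeroing it out on the tails) changes which gamma parameters come out, which is exactly the point about the definition of "fit" not being unique.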
 
  • #8
I have always found perusing Abramowitz and Stegun chapter 26 to be edifying. It is pretty condensed.
 

What is the concept of "Relating Moments from one Distribution to the Moments of Another"?

The concept refers to the process of comparing the statistical properties or moments of two different probability distributions. This allows for a better understanding of the relationship between the two distributions and can provide insights into their similarities and differences.

Why is it important to relate moments from one distribution to the moments of another?

Relating moments can help in identifying patterns and trends in data, which can be useful in various fields such as finance, economics, and engineering. It also allows for the comparison of different datasets and can aid in making predictions and decisions based on the statistical properties of the distributions.

What are the common moments used for comparing distributions?

The most commonly used moments are mean, variance, skewness, and kurtosis. These moments provide information about the central tendency, spread, and shape of a distribution, which are important factors in understanding and comparing two distributions.

How do you compare the moments of two distributions?

The moments of two distributions can be compared by computing each moment for both distributions and examining the differences. If the corresponding moments are close in value, the distributions are broadly similar in location, spread, and shape; large differences in particular moments point to where the distributions differ. Note, however, that matching a few moments does not by itself guarantee the distributions are the same.
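As a small illustration of this comparison (the datasets below are made up for the example), one can compute the first four sample moments side by side:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(loc=30.0, scale=15.8, size=10_000)   # roughly normal data
b = rng.gamma(shape=3.6, scale=8.3, size=10_000)    # gamma data with similar mean and variance

for name, sample in [("sample a", a), ("sample b", b)]:
    print(name,
          "mean=%.2f" % np.mean(sample),
          "var=%.1f" % np.var(sample),
          "skew=%.2f" % stats.skew(sample),
          "excess kurtosis=%.2f" % stats.kurtosis(sample))
```

The means and variances come out close, while the skewness and kurtosis reveal the difference in shape.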

What are some limitations of comparing moments from different distributions?

One limitation is that moments do not provide a complete picture of the distributions and may not capture all the characteristics of the data. Additionally, comparing moments may not be appropriate for distributions with extreme values or outliers, as they can heavily influence the results. It is also important to consider the sample size and the underlying assumptions of the distributions when comparing moments.
