Relating Moments from one Distribution to the Moments of Another

  • Context: Undergrad
  • Thread starter: Steve Zissou
  • Tags: Stats
SUMMARY

This discussion focuses on the relationship between the moments of a normal distribution and those of a Gamma distribution. The participants emphasize the importance of defining a precise method for "fitting" one distribution to another, particularly in terms of parameters such as mean (μ), variance (σ²), and the parameters of the Gamma distribution (α and β). They highlight that there is no universal procedure for this task, as it depends on the specific distributions and the criteria for goodness of fit. The conversation also references the need to consider the behavior of distribution tails when establishing a fitting function.

PREREQUISITES
  • Understanding of normal distribution parameters (mean, variance, skewness)
  • Familiarity with Gamma distribution and its parameters (α, β)
  • Knowledge of statistical fitting techniques and goodness-of-fit criteria
  • Basic concepts of moments in probability distributions
NEXT STEPS
  • Research methods for fitting distributions, such as least squares fitting
  • Explore the concept of moments in probability theory and their applications
  • Study the implications of tail behavior in statistical modeling
  • Investigate articles on distribution approximations, particularly those relating moments of different distributions
USEFUL FOR

Statisticians, data analysts, and researchers involved in statistical modeling and distribution fitting, particularly those working with normal and Gamma distributions.

Steve Zissou
TL;DR
I hope to clarify my question soon, but I'd like to know how we can relate the moments of one sort of distribution to the moments of another.
Ok, I'm sure I can find a smarter way to pose this question, and I will try to define the question more carefully in coming days. That having been said, consider this:
Let's say we have a random variable X (or whatever). I can calculate the moments of this variable with no problem. In fact let's say X is normal. But I would like to characterize the distribution by a different distribution. In other words it is useful to model X by describing it with a different distribution. How can I relate the moments of X (that I know) to the moments of a new, different distribution?

Does this question make sense?

Thanks, friends
 
Steve Zissou said:
TL;DR Summary: I hope to clarify my question soon, but I'd like to know how we can relate the moments of one sort of distribution to the moments of another.

I can calculate the moments of this variable with no problem.
To calculate the moments, you need to know the probability distribution. So how do you calculate the second moment of X?
 
Steve Zissou said:
In fact let's say X is normal. But I would like to characterize the distribution by a different distribution.

Saying that X is normally distributed doesn't define a particular distribution unless you also state the specific values of the parameters of that normal distribution. Let's say you begin by saying that X is normally distributed with mean 30 and standard deviation 15.8. Suppose you decide to characterize X by a different distribution. Are you asking how to characterize X with a distribution that may have different numerical values for its mean and standard deviation? Are you asking if there is only one way to pick different numerical values for the mean and standard deviation of the new distribution? If so, you haven't defined a precise mathematical problem. What criteria are we to use to judge which particular different numerical values "characterize" X?

If you are thinking of a situation where we have data from a sample that we use to estimate the unknown parameters of a distribution, then there may be different customary estimators (i.e. formulas for making the estimates) for different distributions. The estimators are random variables. The distribution of an estimator depends on the distribution we assume for X. There are various criteria for which estimators are "best", depending on one's interests. In such cases, if you change the assumed distribution for X, you might change your estimating formula and get different numerical estimates for the parameters of the new distribution. There is no general procedure that applies to all such situations. It depends on the particular distributions and the goals of estimation.
 
Likes: hutchphd
hutchphd and Stephen Tashi, thank you for your quick replies. Stephen I think you've already made a great point. But regardless here is my attempt to "tighten up" my question.

Ok, so let's say we have some data that ostensibly "comes from" the normal:

$$f(x)=\frac{1}{\sigma\sqrt{2\pi}}\exp\left[ -\frac{1}{2}\left( \frac{x-\mu}{\sigma} \right)^2 \right]$$

...and has mean ##\mu##, variance ##\sigma^2## and skew 0 (whatever).

Now, for whatever reason, we'd like to "fit" this data with the Gamma distribution:

$$f(x)=\frac{\beta^\alpha}{\Gamma(\alpha)}x^{\alpha-1}\exp\left[ -\beta x \right]$$

with mean ##\frac{\alpha}{\beta}##, variance ##\frac{\alpha}{\beta^2}## and skew ##\frac{2}{\sqrt{\alpha}}.##

Is there a way to think about relating ##\mu## and ##\sigma^2## to ##\alpha## and ##\beta##?

Thanks!
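One standard answer to this question is the method of moments: equate the Gamma's mean and variance to the normal's, i.e. set ##\frac{\alpha}{\beta}=\mu## and ##\frac{\alpha}{\beta^2}=\sigma^2##, which solves to ##\beta=\frac{\mu}{\sigma^2}## and ##\alpha=\frac{\mu^2}{\sigma^2}##. A minimal sketch (the function name is illustrative, and the numbers echo the μ = 30, σ = 15.8 example from earlier in the thread):

```python
# Method of moments: choose Gamma(alpha, beta) whose mean and
# variance equal the normal's mu and sigma^2.
# Solving alpha/beta = mu and alpha/beta**2 = sigma**2 gives:
def gamma_params_from_moments(mu, sigma2):
    """Return (alpha, beta) of the Gamma with mean mu and variance sigma2."""
    beta = mu / sigma2         # rate parameter
    alpha = mu**2 / sigma2     # shape parameter
    return alpha, beta

alpha, beta = gamma_params_from_moments(30.0, 15.8**2)
# Check: the matched Gamma recovers the normal's first two moments.
assert abs(alpha / beta - 30.0) < 1e-9
assert abs(alpha / beta**2 - 15.8**2) < 1e-9
```

Note that only the first two moments can be matched this way; the Gamma's skew ##\frac{2}{\sqrt{\alpha}}## is then forced, while the normal's skew is 0.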
 
Steve Zissou said:
Now, for whatever reason, we'd like to "fit" this data with the Gamma distribution:

Steve Zissou said:
Is there a way to think about relating ##\mu## and ##\sigma^2## to ##\alpha## and ##\beta##?

Yes - provided you define "fit" in some precise way. For example, you might seek to find the gamma distribution that is the "least squares" fit to the normal distribution.

One complication to that approach is that there may not be a unique solution since there are several parameters involved. And a practical consideration is that one may not care about the tails of the distributions involved.
 
@Stephen Tashi
I see. So 1) define some best fit, and 2) make some specification for behavior of the tails
 
Steve Zissou said:
@Stephen Tashi
I see. So 1) define some best fit, and 2) make some specification for behavior of the tails

In general, define a function that measures the "fit" between two distributions in some precise way so that the function is a function of the parameters of interest. Treat the parameters of one distribution as given and try to find the parameters of the other distribution that maximize the fit.

Of course defining the task precisely does not guarantee it has a solution or a unique solution.

A function measuring fit could have rules that eliminate the tails from consideration or it could weight what happens on the tails less than what happens on the more probable values.

We often see articles along the lines of "The such-and-such Approximation to the so-and-so Distribution" (e.g. the normal approximation to the binomial). I tend to get lost in the details of such articles without having a clear idea of how the authors have defined the goodness of approximation. If you look up articles of this type, you might find examples relating the moments of two distributions.
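The recipe described above (fix one distribution's parameters, then search over the other's to maximize a precisely defined fit) can be sketched numerically. This is only one possible choice of fit function: squared pdf differences on a finite grid, which implicitly ignores the far tails. All the numbers here are illustrative (a normal with mean 10 and standard deviation 3 as the target), and the moment-matched values serve as the optimizer's starting point:

```python
import numpy as np
from scipy import stats, optimize

# One possible "least squares" fit of a Gamma pdf to a normal pdf:
# minimize the sum of squared pdf differences over a grid. The grid
# [0.01, 25] truncates the tails (the Gamma has no support below 0).
def sq_err(params, x, target):
    a, b = params  # shape alpha, rate beta
    if a <= 0 or b <= 0:
        return np.inf
    # scipy's gamma takes a scale parameter, so scale = 1/beta
    return np.sum((stats.gamma.pdf(x, a, scale=1.0 / b) - target) ** 2)

x = np.linspace(0.01, 25.0, 500)
target = stats.norm.pdf(x, loc=10.0, scale=3.0)

# Moment-matched values (alpha = mu^2/sigma^2, beta = mu/sigma^2)
# make a sensible initial guess for the search.
a0, b0 = 10.0**2 / 3.0**2, 10.0 / 3.0**2
res = optimize.minimize(sq_err, x0=[a0, b0], args=(x, target),
                        method="Nelder-Mead")
a_hat, b_hat = res.x
```

Changing the grid, or weighting the squared errors, changes the answer; that is exactly the point made above about tails: the fitted parameters depend on how the fit criterion treats the low-probability regions.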
 
I have always found perusing Abramowitz and Stegun chapter 26 to be edifying. It is pretty condensed.
 