Mean and s.d. of product of two variables


Monique
Staff Emeritus
Science Advisor
Gold Member
I'm not a math wizard and now I need to calculate the mean and s.d. of the product of two variables. Apparently there is a delta method that should be able to do the trick, but I don't know how to apply it. I have SPSS, should that be able to help me out?
 
I found this: http://www-stat.stanford.edu/~susan/courses/s200/lectures/lect5.pdf

Unfortunately the formulas are a bit too much for me. Does the second formula on page 2 make sense to anyone?
 
In general you need to use an approximation method (e.g. Taylor series) or simulation.

A useful formula is: E[XY] = E[X]E[Y] + Cov[X,Y]. This assumes you know the covariance between X and Y.
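This identity is easy to check numerically. A minimal sketch in Python with NumPy, using arbitrary example parameters for a pair of correlated normal variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two correlated variables (arbitrary example means and covariance).
mean = [2.0, 3.0]
cov = [[1.0, 0.5],
       [0.5, 2.0]]
x, y = rng.multivariate_normal(mean, cov, size=1_000_000).T

# E[XY] estimated directly vs. E[X]E[Y] + Cov[X,Y]
lhs = np.mean(x * y)
rhs = np.mean(x) * np.mean(y) + np.cov(x, y)[0, 1]
print(lhs, rhs)  # the two estimates agree closely
```

The identity itself is exact for any X and Y with finite covariance; the simulation only illustrates it.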

As for the formulas: Y = g(X) is approximated around the mean of X, call it m, by the Taylor expansion

Y ≈ g(m) + (X - m)g'(m) + (X - m)^2 g''(m)/2.

In a first-order approximation the second-order term (X - m)^2 g''(m)/2 is dropped, so Y ≈ g(m) + (X - m)g'(m) and therefore E[Y] = E[g(m) + (X - m)g'(m)] = E[g(m)] + E[(X - m)g'(m)]. Since m is a constant, for an arbitrary function h and an arbitrary random variable Z, E[h(m)] = h(m) and E[Z h(m)] = h(m)E[Z]. Hence E[Y] = g(m) + g'(m)E[X - m] = g(m) + g'(m)(E[X] - m) = g(m), because E[X] = m.

In a second-order approximation, you keep the additional term E[(X - m)^2 g''(m)/2] = g''(m)E[(X - m)^2]/2 = g''(m)Var[X]/2, so E[Y] ≈ g(m) + g''(m)Var[X]/2.
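Applied to the product Z = XY, the standard first-order (delta-method) results are E[Z] = E[X]E[Y] + Cov[X,Y] (which is exact) and Var[Z] ≈ E[Y]^2 Var[X] + E[X]^2 Var[Y] + 2 E[X]E[Y] Cov[X,Y]. A sketch comparing this approximation against simulation, with arbitrary example numbers (not taken from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary example parameters for two correlated normals.
mx, my = 2.0, 3.0          # means of X and Y
vx, vy, cxy = 1.0, 2.0, 0.5  # Var[X], Var[Y], Cov[X,Y]

# Delta-method results for Z = X*Y:
mean_z = mx * my + cxy                               # exact for the mean
var_z = my**2 * vx + mx**2 * vy + 2 * mx * my * cxy  # first-order variance
sd_z = np.sqrt(var_z)

# Compare against simulation.
x, y = rng.multivariate_normal([mx, my],
                               [[vx, cxy], [cxy, vy]],
                               size=1_000_000).T
print(mean_z, np.mean(x * y))  # mean matches the simulation
print(sd_z, np.std(x * y))     # s.d. is approximate: slightly low
```

The first-order variance omits the higher-order terms (for jointly normal X and Y the exact variance adds Var[X]Var[Y] + Cov[X,Y]^2), which is why the simulated s.d. comes out somewhat larger than the delta-method value.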

EnumaElish
___________________________________________
I would definitely have logged in as EnumaElish had PF administration awarded that account the privilege of posting replies, after I reset my e-mail address Tuesday, October 28, 2008.
 
