Taylor Series and Random Variables

In summary, when a Taylor expansion is used to find the approximate mean and variance of a function of a random variable, the bias correction factor in Equation A is an overcorrection if the residual series has negative skewness. The reason is that the final retained term of the expansion, which involves the expectation of the cubed deviation from the mean, is proportional to the skewness of the random variable y, so a negative skewness makes the true mean smaller than the corrected estimate.
  • #1
GottaLoveMath

Homework Statement



A standard procedure for finding an approximate mean and variance of a function of a variable is to use a Taylor expansion of the function about the mean of the variable. Suppose the variable is y, with mean μ and standard deviation σ.

[tex]
f(y) = f(\mu) + f'(\mu)(y-\mu) + \frac{f''(\mu)}{2!}(y-\mu)^{2} + \frac{f'''(\mu)}{3!}(y-\mu)^{3} + \cdots
[/tex]

Consider the case where f(·) is e^(·). By taking the expectation of both sides of this equation, explain why the bias correction factor given in Equation A is an overcorrection if the residual series has a negative skewness, where the skewness ρ of a random variable y is defined by

[tex]
\rho = \frac{E\left[(y-\mu)^{3}\right]}{\sigma^{3}}
[/tex]

Equation A:

[tex]
\hat{x}_t = e^{\,m_t + s_t}\, e^{\sigma^{2}/2}
[/tex]

where x_t is the observed series, m_t is the trend, and s_t is the seasonal effect.

Homework Equations

The Attempt at a Solution



I'm not really sure where to start. If someone could point me in the right direction, it would be greatly appreciated.
 
  • #2
Try what is suggested. If you begin with expected values like this:
[tex]
E(f(y)) = f(\mu) + f'(\mu)\,E(y-\mu) + \frac{f''(\mu)}{2!}\, E\left((y-\mu)^{2}\right) + \frac{f'''(\mu)}{3!}\, E\left((y-\mu)^{3}\right)
[/tex]

What do the assumptions tell you about the final term (the one containing the expectation of the cube)?
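Filling in that step as a sketch (using the definitions in the problem statement, with ρ the skewness of y): for f(·) = e^(·) every derivative at μ equals e^μ, and E(y − μ) = 0, so to third order

[tex]
E\left(e^{y}\right) \approx e^{\mu}\left(1 + \frac{\sigma^{2}}{2} + \frac{E\left[(y-\mu)^{3}\right]}{6}\right) = e^{\mu}\left(1 + \frac{\sigma^{2}}{2} + \frac{\rho\,\sigma^{3}}{6}\right).
[/tex]

The factor e^(σ²/2) in Equation A expands as 1 + σ²/2 + σ⁴/8 + ⋯, so it matches this only through the quadratic term. When ρ < 0 the cubic term pulls the true mean below e^μ(1 + σ²/2), while multiplying by e^(σ²/2) can only raise the estimate; to this order of approximation the bias correction therefore overshoots the true mean, i.e. it overcorrects.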
 

1. What is a Taylor series?

A Taylor series represents a function as an infinite sum of terms built from the function's derivatives at a single point. Truncating the series gives polynomial approximations of increasing degree that are accurate near that point.
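For reference, the series of f about a point a (assuming f has derivatives of all orders there and the series converges) is

[tex]
f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^{n}.
[/tex]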

2. How is a Taylor series used in probability and statistics?

In probability and statistics, Taylor expansions are used to approximate the mean, variance, and other moments of a function of a random variable, as in this problem (and, more generally, in the delta method): expanding f(y) about the mean of y and taking expectations term by term turns E(f(y)) and Var(f(y)) into expressions involving the moments of y. They are also used to expand moment-generating and characteristic functions.
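As an illustration (a minimal sketch, not from the original thread; the distribution, its parameters, and the sample size are arbitrary choices), the following compares a Monte Carlo estimate of E(e^y) with the truncated Taylor approximations and with an e^(σ²/2)-style correction for a negatively skewed y:

[code=python]
import numpy as np

rng = np.random.default_rng(0)

# Negatively skewed example: y is the negative of a gamma variable
# (an arbitrary illustrative choice; any distribution with rho < 0 works).
shape, scale = 4.0, 0.1
y = -rng.gamma(shape, scale, size=1_000_000)

mu, sigma = y.mean(), y.std()
rho = np.mean((y - mu) ** 3) / sigma ** 3             # sample skewness (negative here)

mc_mean = np.exp(y).mean()                            # Monte Carlo E[e^y]
taylor2 = np.exp(mu) * (1 + sigma**2 / 2)             # Taylor up to the quadratic term
taylor3 = np.exp(mu) * (1 + sigma**2 / 2 + rho * sigma**3 / 6)  # ... plus the cubic term
corrected = np.exp(mu) * np.exp(sigma**2 / 2)         # Equation A style correction

print(f"skewness rho       : {rho:.3f}")
print(f"Monte Carlo E[e^y] : {mc_mean:.6f}")
print(f"2nd-order Taylor   : {taylor2:.6f}")
print(f"3rd-order Taylor   : {taylor3:.6f}")
print(f"exp(sigma^2/2) fix : {corrected:.6f}")
[/code]

With this setup the e^(σ²/2)-corrected value comes out above the Monte Carlo mean, while the third-order Taylor value sits closer to it, which is the overcorrection discussed in the thread.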

3. What are the assumptions for using a Taylor series for a random variable?

The main assumptions are that the function is sufficiently smooth at the expansion point, so that it and the derivatives used in the expansion are well defined there, and that the neglected remainder is small. In practice the second condition means the higher-order moments of the random variable contribute little compared with the retained terms.

4. What is the relationship between a Taylor series and a moment-generating function?

The moment-generating function M_y(t) = E(e^(ty)) is closely related to Taylor series: when it exists near t = 0, the coefficients of its Taylor expansion about 0 are the moments of the random variable divided by n!, so the moments E(y^n) can be read off from (or computed as derivatives of) that expansion.
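When the moment-generating function exists in a neighbourhood of t = 0, its expansion about 0 is

[tex]
M_{y}(t) = E\left(e^{ty}\right) = \sum_{n=0}^{\infty} \frac{E\left(y^{n}\right)}{n!}\,t^{n},
[/tex]

so the nth moment E(y^n) is the nth derivative of M_y at t = 0.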

5. Can a Taylor series be used for any type of random variable?

Yes, the approach applies to any random variable for which the relevant moments exist and the function is smooth at the expansion point. However, for heavy-tailed variables or functions with sharp changes, other tools, such as the characteristic function or direct simulation, may be more suitable for approximating the distribution of a function of a random variable.
