Maximum likelihood to fit a parameter of this model

SUMMARY

This discussion focuses on using maximum likelihood estimation (MLE) to fit the parameter λ in an Exponential Weighted Moving Average (EWMA) model for time series data. The model is defined by the recursion σ_t^2 = λσ_{t-1}^2 + (1-λ)y_{t-1}^2 for t > 250, with σ_{250}^2 initialized as the average of the first 250 squared observations. The likelihood function is constructed as L(β, λ) = ∏_{i=1}^N f_β(x_i(λ)), where x_i(λ) makes the dependence on λ explicit. The discussion emphasizes differentiating the likelihood with respect to each unknown to derive the maximum likelihood estimators.

PREREQUISITES
  • Understanding of Exponential Weighted Moving Average (EWMA) models
  • Familiarity with maximum likelihood estimation (MLE) techniques
  • Knowledge of probability density functions and their parameters
  • Basic calculus for differentiation of functions
NEXT STEPS
  • Study the derivation of maximum likelihood estimators in statistical models
  • Learn about the properties and applications of Exponential Weighted Moving Average (EWMA) models
  • Explore the concept of independent and identically distributed (i.i.d.) random variables
  • Investigate different probability distributions and their density functions
USEFUL FOR

Statisticians, data scientists, and analysts working with time series data who need to apply maximum likelihood estimation techniques to model parameters effectively.

member 428835
Hi PF!

Given random time series data ##y_i##, we assume the data follow an EWMA (exponential weighted moving average) model: ##\sigma_t^2 = \lambda\sigma_{t-1}^2 + (1-\lambda)y_{t-1}^2## for ##t > 250##, where ##\sigma_t## is the standard deviation, and ##\sigma_{250}^2 = \frac{1}{250}\sum_{i=1}^{250} y_i^2## initializes the recursion. How would we use maximum likelihood to estimate ##\lambda##?
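To make the recursion concrete, here is a minimal sketch in Python; the function name, the NumPy usage, and the choice of ##\lambda## in the usage note are illustrative, not part of the question:

```python
import numpy as np

def ewma_variance(y, lam, m=250):
    """Run the EWMA recursion sigma_t^2 = lam*sigma_{t-1}^2 + (1-lam)*y_{t-1}^2
    for t > m, initializing sigma_m^2 as the mean of the first m squared observations.
    Array position i corresponds to time i+1 in the post's 1-based notation."""
    y = np.asarray(y, dtype=float)
    sigma2 = np.full(len(y), np.nan)        # undefined before the initialization point
    sigma2[m - 1] = np.mean(y[:m] ** 2)     # sigma_{250}^2 sits at 0-based index m-1
    for t in range(m, len(y)):
        sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * y[t - 1] ** 2
    return sigma2
```

For example, `ewma_variance(y, lam=0.9)` returns the conditional variances implied by a candidate ##\lambda##; the question is how to choose that ##\lambda## from the data by maximum likelihood.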

In general, the principle seems to be that we first choose a distribution ##P(y_i)## the data likely came from (a Bernoulli variable, say, for flipping a coin and estimating the probability of heads ##p##, or a normal distribution if we've been given a sample of people's heights and want to estimate the mean and standard deviation). Next, since the data are i.i.d. (we assume this is true), we optimize ##\prod_i P(y_i)## with respect to the parameter we seek (##p## or ##\mu## in the previous examples; here it should be ##\lambda##). I'm just confused about how the assumed model with ##\sigma## plays a role. Any help is greatly appreciated.
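To illustrate that general recipe (pick a density, then maximize the product of densities over the sample), here is a minimal sketch for the normal case; the simulated data and the numerical optimizer are illustrative choices, and in this simple case the maximizers can of course also be written down in closed form:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
y = rng.normal(loc=1.5, scale=0.8, size=500)   # placeholder sample

def neg_log_likelihood(params, data):
    mu, sigma = params
    if sigma <= 0:
        return np.inf                          # keep the optimizer in the valid region
    # log of the product of densities = sum of log-densities
    return -np.sum(norm.logpdf(data, loc=mu, scale=sigma))

res = minimize(neg_log_likelihood, x0=[0.0, 1.0], args=(y,), method="Nelder-Mead")
mu_hat, sigma_hat = res.x                      # close to the sample mean and (biased) std
```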
 
The squaring just adds unnecessary superscripts for this exercise, so let's write ##S_i## for ##\sigma^2_i## and ##X_i## for ##y_i^2##.

Typically we assume the ##X_i## are i.i.d. Say the distribution of ##X_i## has parameters ##\mathbf\beta = \langle \beta_1, \ldots, \beta_K\rangle##, and the probability density function of ##X_i## is ##f_{\mathbf\beta}##. We need to estimate ##\mathbf\beta## and ##\lambda## given observations ##s_1, \ldots, s_N## of the random variables ##S_1, \ldots, S_N##.

Given the equations
$$S_t = \lambda S_{t-1} + (1-\lambda)X_{t-1}$$
for ##t = 2, \ldots, N##
and the initial condition ##S_1 = X_1##,
we can write the realized values ##x_1, \ldots, x_N## of the random variables ##X_1, \ldots, X_N## in terms of just ##\lambda## by inserting the observed values of ##S_1, \ldots, S_N##. Write these as ##x_1(\lambda), \ldots, x_N(\lambda)## to emphasise this dependence.
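A small sketch of that inversion, assuming the observed ##s_1, \ldots, s_N## are held in a NumPy array (the helper name is illustrative):

```python
import numpy as np

def x_of_lambda(s, lam):
    """Invert s_t = lam*s_{t-1} + (1-lam)*x_{t-1} for t = 2, ..., N:
    x_{t-1}(lam) = (s_t - lam*s_{t-1}) / (1 - lam).
    Returns the N-1 values x_1(lam), ..., x_{N-1}(lam) pinned down by the recursion."""
    s = np.asarray(s, dtype=float)
    return (s[1:] - lam * s[:-1]) / (1.0 - lam)
```

Each returned entry is a realized ##x_i## expressed as a function of ##\lambda##, which is exactly the dependence the likelihood below carries.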

The likelihood of the observed data given ##\mathbf \beta,\lambda## is
$$\mathscr L(\mathbf\beta, \lambda) = \prod_{i=1}^N f_{\mathbf\beta}(x_i(\lambda))$$

This expression has ##K+1## unknowns: ##\beta_1, \ldots, \beta_K, \lambda##. We partially differentiate it (or, equivalently for locating the maximum, its logarithm) with respect to each of those unknowns in turn and set each derivative equal to zero, giving ##K+1## equations, the same number as we have unknowns. Solving those equations yields the maximum likelihood estimators of the unknowns.

Note how we needed the density function of ##X_i## to form the expression for ##\mathscr L##.
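To show the whole pipeline end to end, here is a rough numerical sketch that maximizes ##\mathscr L## directly instead of solving the ##K+1## score equations by hand. Purely for illustration it assumes the ##X_i## are exponential with a single rate parameter ##\beta##, i.e. ##f_\beta(x) = \beta e^{-\beta x}## for ##x \ge 0##; the density choice, the placeholder observations, and the starting values are all assumptions of this sketch, not something fixed by the thread:

```python
import numpy as np
from scipy.optimize import minimize

def x_of_lambda(s, lam):
    # x_{t-1}(lam) = (s_t - lam*s_{t-1}) / (1 - lam) for t = 2, ..., N
    return (s[1:] - lam * s[:-1]) / (1.0 - lam)

def neg_log_likelihood(params, s):
    beta, lam = params
    if beta <= 0 or not (0.0 < lam < 1.0):
        return np.inf                          # outside the parameter space
    x = x_of_lambda(s, lam)
    if np.any(x < 0):
        return np.inf                          # outside the exponential's support
    # log f_beta(x) = log(beta) - beta*x, summed over the x_i(lam)
    return -np.sum(np.log(beta) - beta * x)

# Placeholder observations s_1, ..., s_N; in practice these come from the data.
s_obs = np.array([1.00, 0.90, 1.10, 1.05, 0.95, 1.02])
res = minimize(neg_log_likelihood, x0=[1.0, 0.5], args=(s_obs,), method="Nelder-Mead")
beta_hat, lambda_hat = res.x
```

With a different density for the ##X_i##, only `neg_log_likelihood` changes; the construction of ##x_i(\lambda)## and the maximization step stay the same.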
 
