Linear Gaussian parameter estimation

SUMMARY

The discussion centers on the challenges of estimating scalar parameters in a multivariate linear-Gaussian model, specifically when some elements of the matrix A are known a priori. The user, Carlos, successfully derives the Maximum Likelihood Estimation (MLE) for matrices A and Q but encounters difficulties when trying to estimate scalar elements of A that depend on Q. The conversation highlights the complexity of matrix calculus in this context and suggests using an iterative approach to optimize parameters while holding others constant, as well as exploring the special case of bivariate Gaussian for simplification.

PREREQUISITES
  • Understanding of multivariate linear-Gaussian models
  • Familiarity with Maximum Likelihood Estimation (MLE)
  • Proficiency in MATLAB, particularly the Symbolic toolbox
  • Knowledge of matrix calculus
NEXT STEPS
  • Explore iterative optimization techniques for parameter estimation
  • Learn about bivariate Gaussian models to simplify calculations
  • Research joint parameter estimation methods in Gaussian models
  • Investigate advanced matrix calculus techniques for symbolic solutions
USEFUL FOR

Data scientists, statisticians, and researchers working with multivariate linear-Gaussian models, particularly those involved in parameter estimation and optimization techniques.

aydos
Hi,

I have a multivariate linear-Gaussian model and I am trying to estimate a particular set of scalar parameters of the model.
I know how to derive the MLE in order to find the matrices A and Q (linear transfer function and covariance respectively).
I take the log of the joint distribution, differentiate with respect to each of the two parameters above, set each expression to zero, and solve. The expression for A depends only on the data. The expression for Q depends on A and on the data, so I solve for A first and then for Q.
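For concreteness, here is a small sketch of that two-step closed-form MLE (in Python/NumPy rather than Matlab; the model is assumed to be y_n = A x_n + e_n with e_n ~ N(0, Q), and all the numbers below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented ground truth for a 2x2 example.
A_true = np.array([[1.0, 0.5],
                   [0.0, 2.0]])
Q_true = np.array([[0.3, 0.05],
                   [0.05, 0.2]])

# Simulate data from y_n = A x_n + e_n, e_n ~ N(0, Q).
N = 10000
X = rng.normal(size=(2, N))
Y = A_true @ X + rng.multivariate_normal(np.zeros(2), Q_true, size=N).T

# When ALL of A is free, the MLE of A depends on the data only
# (least squares), and the MLE of Q is the residual covariance.
A_hat = Y @ X.T @ np.linalg.inv(X @ X.T)
R = Y - A_hat @ X
Q_hat = (R @ R.T) / N
```

This reproduces the ordering described in the post: A first (data only), then Q from the residuals of A.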

However, I have a specific application where I do not want to estimate the entire matrix A. Some scalar elements of the matrix I know a priori and some other scalar elements are the parameters to be estimated.

This is where my problems start:
1- The matrix calculus gets very hairy and I do not know how to solve this symbolically.
2- I tried to skip that step by expanding the linear equation into scalar expressions with the Matlab Symbolic toolbox, since the parameters to be estimated are now scalars. I then used Matlab's differentiation and solving tools, and this seems to work in principle.
3- In the original problem (the one I know how to solve), the expressions for A do not depend on Q. But now the Matlab solution shows that my estimated scalar Aij parameters do depend on Q, and I believe that is correct. So I have a set of parameters that all depend on each other, and I am not sure how to solve for them.

Any light on what I might need here would be appreciated.

Regards,
Carlos

BTW, this is my first post here; how do I insert LaTeX expressions in these posts?
 
Welcome to PF, Carlos.

You can insert LaTeX into your post by typing

[ tex] your code here [ /tex]
or
[ itex] your code here [ /itex] (for an inline formula)

When you click on the \Sigma symbol you are provided a short latex reference.

As for your question: have you tried to solve your equations in the special case of a bivariate Gaussian, just to get a feeling for how the calculations go and whether they can be done at all? :smile:
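To make that suggestion concrete, here is a sketch of the objective in the bivariate special case (assuming the model [itex]y_n = A x_n + e_n[/itex] with [itex]e_n \sim \mathcal{N}(0, Q)[/itex], which matches the setup described above):

[tex]
\log L(A, Q) = -\frac{N}{2}\log\det(2\pi Q)
  - \frac{1}{2}\sum_{n=1}^{N} (y_n - A x_n)^\top Q^{-1} (y_n - A x_n),
\qquad
Q^{-1} = \frac{1}{q_{11}q_{22} - q_{12}^2}
  \begin{pmatrix} q_{22} & -q_{12} \\ -q_{12} & q_{11} \end{pmatrix}.
[/tex]

Expanding the quadratic form, the off-diagonal [itex]q_{12}[/itex] multiplies cross terms that mix row 1 and row 2 of [itex]A[/itex]; once some entries of [itex]A[/itex] are pinned down a priori, the stationarity equations for the remaining free entries therefore retain a dependence on [itex]Q[/itex], which is consistent with what the Matlab solution showed.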
 
Hi aydos. As mentioned before, it's a bit difficult to tell what you're doing if you aren't more explicit with the formulas. That said, I'd point out that you cannot generally find the MLE for each part of the parameters separately and have that work out to be the MLE over all of the parameters. The usual Gaussian problem is a special case, where the mean estimate doesn't depend on the covariance, and so you can do it in an iterated fashion.

Most of the time, though, you need to estimate all of the parameters jointly. Nevertheless, you can still use an iterated approach: hold all but one parameter constant, optimize that one, then move on to the next parameter, and so on. You may need to take only small steps in each parameter at each iteration, and do many iterations, to ensure convergence. Alternatively, it may be possible to solve the entire set of equations at once, but that will usually not be the case and, even when it is, the solution may be very messy.
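A sketch of that iterated approach for the partially-known-A case (Python/NumPy; the 2x2 matrices, the choice of which entry is known, and the helper name `fit_partial` are all invented for illustration). Holding Q fixed, the stationarity condition for the free entries of A is linear in them: [Q⁻¹AS]ᵢⱼ = [Q⁻¹C]ᵢⱼ with S = XXᵀ and C = YXᵀ, which is exactly where the Q-dependence observed in the Matlab solution comes from. Given A, the update for Q is just the residual covariance, so the two closed-form solves can be alternated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented 2x2 setup: suppose A[0,0] = 0.9 is known a priori and the
# other three entries of A, plus Q, are to be estimated.
A_true = np.array([[0.9, 0.3],
                   [0.2, 0.7]])
Q_true = np.array([[0.5, 0.1],
                   [0.1, 0.4]])
known_mask = np.array([[True, False],
                       [False, False]])

N = 20000
X = rng.normal(size=(2, N))
E = rng.multivariate_normal(np.zeros(2), Q_true, size=N).T
Y = A_true @ X + E

def fit_partial(X, Y, A_known, known_mask, n_iter=30):
    """Coordinate-wise MLE: alternate a GLS solve for the free entries
    of A (given Q) with the residual-covariance update of Q (given A).
    Only the masked entries of A_known are read."""
    d, N = Y.shape
    free = list(zip(*np.where(~known_mask)))      # free (row, col) indices
    A_fixed = np.where(known_mask, A_known, 0.0)  # known entries, zeros elsewhere
    S = X @ X.T                                   # sufficient statistics
    C = Y @ X.T
    Q = np.eye(d)
    A = A_fixed.copy()
    for _ in range(n_iter):
        Qi = np.linalg.inv(Q)
        # Stationarity in the free entries: [Qi A S]_{ij} = [Qi C]_{ij},
        # a linear system in the free entries (this is where Q enters).
        M = np.array([[Qi[i, r] * S[s, j] for (r, s) in free]
                      for (i, j) in free])
        b = np.array([(Qi @ C)[i, j] - (Qi @ A_fixed @ S)[i, j]
                      for (i, j) in free])
        A = A_fixed.copy()
        for (r, s), v in zip(free, np.linalg.solve(M, b)):
            A[r, s] = v
        R = Y - A @ X
        Q = (R @ R.T) / N                         # MLE of Q given A
    return A, Q

A_hat, Q_hat = fit_partial(X, Y, A_true, known_mask)
```

Note that if the mask is all-free this reduces to the ordinary case where A is independent of Q; the coupling only appears once some entries are pinned down.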
 
