Proof of the MMSE Estimator of $\mathbf{\theta}\left[n\right]$

prhzn

Homework Statement



Consider a parameter \mathbf{\theta} which changes with time according to the deterministic relation

\mathbf{\theta}\left[n\right] = \mathbf{A}\mathbf{\theta}\left[n-1\right],\quad n\geq 1,

where \mathbf{A} is a known p\times p matrix and \mathbf{\theta}\left[0\right] is an unknown parameter which is modeled as a random (p\times 1) vector. Note that once \mathbf{\theta}\left[0\right] is specified, so is \mathbf{\theta}\left[n\right] for n\geq 1.

Prove that the MMSE estimator of \mathbf{\theta}\left[n\right] is

\mathbf{\hat{\theta}}\left[n\right] =\mathbf{A}^n\mathbf{\hat{\theta}}\left[0\right],

where \mathbf{\hat{\theta}}\left[0\right] is the MMSE estimator of \mathbf{\theta}\left[0\right], or equivalently,

\mathbf{\hat{\theta}}\left[n\right] = \mathbf{A}\mathbf{\hat{\theta}}\left[n-1\right].

Homework Equations



MMSE: \hat{\mathbf{\theta}} = \mathbb{E}\left[\mathbf{\theta}|\mathbf{x}\right] = \int \mathbf{\theta}\, p\left(\mathbf{\theta}|\mathbf{x}\right)\mathrm{d}\mathbf{\theta}
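
For reference, the reason the conditional mean is the MMSE estimator: for any estimator \check{\mathbf{\theta}}\left(\mathbf{x}\right),

\mathbb{E}\left[\|\mathbf{\theta}-\check{\mathbf{\theta}}\|^2\right] = \mathbb{E}\left[\|\mathbf{\theta}-\mathbb{E}\left[\mathbf{\theta}|\mathbf{x}\right]\|^2\right] + \mathbb{E}\left[\|\mathbb{E}\left[\mathbf{\theta}|\mathbf{x}\right]-\check{\mathbf{\theta}}\|^2\right],

since the cross term vanishes by iterated expectation, and the right-hand side is minimized by choosing \check{\mathbf{\theta}} = \mathbb{E}\left[\mathbf{\theta}|\mathbf{x}\right].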

The Attempt at a Solution



So far I don't have a good attempt, as my main problem is how to start. Until now, all the MMSE exercises I've done specified the PDFs of some of the variables, or gave other information that made the starting point more obvious; with this one I feel a bit lost. So I'm mainly looking for a hint on how to start, so that I can make a fair attempt on my own.
 
Not sure if this is correct; could someone tell me whether it is?

We know that \mathbf{\theta}[n] = \mathbf{A}\mathbf{\theta}[n-1] for n\geq 1.

Then \mathbf{\theta}[n] = \mathbf{A}\left(\mathbf{A}\mathbf{\theta}[n-2]\right) and so on, resulting in \mathbf{\theta}[n] = \mathbf{A}^n\mathbf{\theta}[0].
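
Written out as an induction step: if \mathbf{\theta}[n-1] = \mathbf{A}^{n-1}\mathbf{\theta}[0], then

\mathbf{\theta}[n] = \mathbf{A}\mathbf{\theta}[n-1] = \mathbf{A}\,\mathbf{A}^{n-1}\mathbf{\theta}[0] = \mathbf{A}^{n}\mathbf{\theta}[0],

with the trivial base case \mathbf{\theta}[0] = \mathbf{A}^{0}\mathbf{\theta}[0].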

The MMSE estimator is then

\mathbf{\hat{\theta}}[n] = \mathbb{E}\left[\mathbf{\theta}[n]|\mathbf{x}\right]

Substituting \mathbf{\theta}[n] = \mathbf{A}^n\mathbf{\theta}[0], and using that \mathbf{A}^n is a known deterministic matrix which can be pulled out of the conditional expectation by linearity, we get

\mathbf{\hat{\theta}}[n] = \mathbb{E}\left[\mathbf{A}^n\mathbf{\theta}[0]|\mathbf{x}\right] = \mathbf{A}^n\mathbb{E}\left[\mathbf{\theta}[0]|\mathbf{x}\right].

Since \mathbb{E}\left[\mathbf{\theta}[0]|\mathbf{x}\right] = \mathbf{\hat{\theta}}[0] is by definition the MMSE estimator of \mathbf{\theta}[0], this gives \mathbf{\hat{\theta}}[n] = \mathbf{A}^n\mathbf{\hat{\theta}}[0]; the recursive form \mathbf{\hat{\theta}}[n] = \mathbf{A}\mathbf{\hat{\theta}}[n-1] then follows from \mathbf{A}^n = \mathbf{A}\mathbf{A}^{n-1}. Hence the proof is complete.
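
As a sanity check, here is a small numerical experiment in a linear-Gaussian setting. The observation model \mathbf{x} = \mathbf{H}\mathbf{\theta}[0] + \mathbf{w} and all matrices and covariances below are made-up assumptions for illustration only, not part of the problem; in that setting the conditional mean has a closed form, so one can compare \mathbf{A}^n\mathbf{\hat{\theta}}[0] with the MMSE estimate of \mathbf{\theta}[n] computed directly.

import numpy as np

# Illustrative linear-Gaussian setup (all choices here are assumptions):
# prior theta[0] ~ N(0, C_theta), observation x = H theta[0] + w, w ~ N(0, C_w).
rng = np.random.default_rng(0)
p, m, n = 3, 4, 5                         # state dim, observation dim, time index
A = rng.normal(size=(p, p)) / np.sqrt(p)  # hypothetical state-transition matrix
H = rng.normal(size=(m, p))               # hypothetical observation matrix
C_theta = np.eye(p)                       # prior covariance of theta[0] (zero mean)
C_w = 0.1 * np.eye(m)                     # observation-noise covariance

# One realization of the data.
theta0 = rng.multivariate_normal(np.zeros(p), C_theta)
x = H @ theta0 + rng.multivariate_normal(np.zeros(m), C_w)

# MMSE estimate of theta[0]: Gaussian conditional mean E[theta[0] | x].
C_x = H @ C_theta @ H.T + C_w
theta0_hat = C_theta @ H.T @ np.linalg.solve(C_x, x)

# Propagated estimate: theta_hat[n] = A^n theta_hat[0].
An = np.linalg.matrix_power(A, n)
theta_n_hat_prop = An @ theta0_hat

# Direct MMSE estimate of theta[n] = A^n theta[0], using the joint Gaussian of
# (theta[n], x): E[theta[n] | x] = Cov(theta[n], x) C_x^{-1} x  (zero means).
theta_n_hat_direct = (An @ C_theta @ H.T) @ np.linalg.solve(C_x, x)

print(np.allclose(theta_n_hat_prop, theta_n_hat_direct))      # expect True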
 