Recent content by kasraa

  1. Compute Mean Square Error (MSE) for a Problem

    Hi all, I want to compute the mean square error (MSE) for a problem, but I'm not sure I'm doing it right. Suppose I want to estimate a variable x (e.g. the position of an object). The estimation process depends on the realizations of some specific random variables (i.e. Gaussian...
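A common way to compute an MSE like this is plain Monte Carlo: repeat the experiment many times, draw fresh realizations of the random variables each time, run the estimator, and average the squared errors. A minimal sketch (the linear measurement model, noise level, and sample-mean estimator here are illustrative assumptions, not from the original post):

```python
import numpy as np

rng = np.random.default_rng(0)

x_true = 2.0        # fixed quantity to estimate (assumed)
sigma = 0.5         # measurement-noise std (assumed)
n_meas = 10         # measurements per trial
n_trials = 100_000  # Monte Carlo repetitions

sq_errors = np.empty(n_trials)
for t in range(n_trials):
    # fresh realizations of the random variables for this trial
    z = x_true + sigma * rng.normal(size=n_meas)
    x_hat = z.mean()                      # the estimator (sample mean here)
    sq_errors[t] = (x_hat - x_true) ** 2  # squared error for this trial

mse = sq_errors.mean()
# For this unbiased estimator, MSE = sigma^2 / n_meas = 0.025
print(mse)
```

The key point is that the average is taken over repeated realizations of all the random inputs, not over a single run.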
  2. Question on Importance Sampling (Monte Carlo method)

    Hi, Suppose I have N iid samples from a distribution q, and I want to estimate another distribution, p, using those samples (importance sampling). By "standard importance sampling", I mean the case where the samples (prior samples, i.e. samples from q) have equal weights ( w_i = 1/N ). In...
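The usual self-normalized importance-sampling recipe: draw the samples from q, weight each one by the ratio p/q, and normalize the weights to sum to one; expectations under p then become weighted averages. A sketch, with Gaussian p and q chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def normal_pdf(x, mu, s):
    """Density of N(mu, s^2)."""
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Target p = N(1, 1), proposal q = N(0, 4) -- illustrative choices.
N = 200_000
x = rng.normal(0.0, 2.0, size=N)     # N iid samples from q

# Before weighting, every sample carries equal weight 1/N (the "standard" case);
# importance sampling replaces that with normalized p/q ratios.
w = normal_pdf(x, 1.0, 1.0) / normal_pdf(x, 0.0, 2.0)
w /= w.sum()                         # self-normalized: weights sum to 1

mean_p = np.sum(w * x)               # estimate of E_p[X]
print(mean_p)                        # should be close to 1.0
```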
  3. Conditional & unconditional MSE (in MMSE estimation)

    In my notation, X is the RV we're trying to estimate, so the prior (the unconditional pdf, which in the case of the Kalman filter is our estimate from the previous step) is p(x) . Actually nothing is wrong with it (using Bayes' rule to reach the posterior). I believe I explained my...
  4. Conditional & unconditional MSE (in MMSE estimation)

    So you're confused about conditional/unconditional MSE too (just like me), right? :D
  5. Conditional & unconditional MSE (in MMSE estimation)

    Sorry, but I can't follow your last post (I don't understand the phrase "not minimizing the trace of the covariance matrix to find the Kalman gain" ...). What I understand is that the Kalman filter and MMSE estimation are related (in fact, I think the Kalman filter is the MMSE estimator for the case of Gaussian variables (or...
  6. Conditional & unconditional MSE (in MMSE estimation)

    Sorry, but I can't follow your last post (I don't understand the phrase "not minimizing the trace of the covariance matrix to find the Kalman gain" ...). What I understand is that the Kalman filter and MMSE estimation are related (in fact, I think the Kalman filter is the MMSE estimator for the case of Gaussian variables (or...
  7. Conditional & unconditional MSE (in MMSE estimation)

    I believe the covariance matrix of p \left( x | Z \right) when they're jointly Gaussian is R_{XX}-R_{XZ}R_{ZZ}^{-1}R_{ZX} , whose trace is the *minimum* MSE. I believe the minimization took place when you selected E \left[ x|Z \right] as your estimator. About double...
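That formula can be checked numerically: the trace of R_XX - R_XZ R_ZZ^{-1} R_ZX should match the Monte Carlo MSE of the estimator E[x|Z] = R_XZ R_ZZ^{-1} Z (for zero-mean jointly Gaussian X and Z). A sketch; the particular joint covariance below is an arbitrary positive-definite choice, not from the thread:

```python
import numpy as np

rng = np.random.default_rng(2)

# Joint covariance of (X, Z), each 2-dimensional -- illustrative numbers.
R = np.array([[2.0, 0.3, 0.6, 0.2],
              [0.3, 1.5, 0.1, 0.6],
              [0.6, 0.1, 1.0, 0.2],
              [0.2, 0.6, 0.2, 1.2]])
Rxx, Rxz = R[:2, :2], R[:2, 2:]
Rzx, Rzz = R[2:, :2], R[2:, 2:]

# Posterior (conditional) covariance; its trace is the minimum MSE.
P = Rxx - Rxz @ np.linalg.solve(Rzz, Rzx)
mmse_theory = np.trace(P)

# Monte Carlo: draw (X, Z), estimate X by E[X|Z], average the squared error.
n = 200_000
samples = rng.multivariate_normal(np.zeros(4), R, size=n)
X, Z = samples[:, :2], samples[:, 2:]
X_hat = (Rxz @ np.linalg.solve(Rzz, Z.T)).T   # E[X|Z] for zero means
mmse_mc = np.mean(np.sum((X - X_hat) ** 2, axis=1))

print(mmse_theory, mmse_mc)   # the two should agree closely
```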
  8. Conditional & unconditional MSE (in MMSE estimation)

    Part one: The posterior p \left( x|Z \right) has a mean and a (co)variance. Its mean is the MMSE estimator, E \left[ x|Z \right] , and its variance (or the trace of its covariance matrix, if it's a random vector) is the minimum mean squared error. Am I right? So the trace of...
  9. Conditional & unconditional MSE (in MMSE estimation)

    Thanks for your reply. Actually, I have read it. My question is about MMSE estimation in general (with the Kalman filter only as one of its implementations for a particular case). Let me explain more. As I asked in (1) and (2), I'm not sure exactly what conditional/unconditional MSE are (and...
  10. Conditional & unconditional MSE (in MMSE estimation)

    Hi, 1- Please explain conditional & unconditional mean square error, and their difference. 2- Which one is the solution to minimum MSE estimation? (That is, which one is minimized by selecting the conditional expectation E \left[ X|Y \right] as the estimator?) 3- What is...
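One way to see the distinction numerically: the conditional MSE E[(X - E[X|Y])^2 | Y = y] generally depends on the observed y, while the unconditional MSE is its average over Y, and the conditional expectation minimizes both. A sketch with a toy model chosen only to make the conditional MSE visibly y-dependent (all the numbers below are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed toy model: Y ~ Uniform(0.5, 1.5), and given Y = y, X ~ N(y, y^2).
# Then E[X|Y] = Y, the conditional MSE given Y = y is y^2 (depends on y),
# and the unconditional MSE is its average over Y: E[Y^2] = 13/12.
n = 500_000
Y = rng.uniform(0.5, 1.5, size=n)
X = rng.normal(Y, Y)               # conditional std equals Y

sq_err = (X - Y) ** 2              # squared error of the estimator E[X|Y] = Y

# Conditional MSE near two particular observations y:
cond_small = sq_err[np.abs(Y - 0.6) < 0.01].mean()   # ~ 0.6^2 = 0.36
cond_large = sq_err[np.abs(Y - 1.4) < 0.01].mean()   # ~ 1.4^2 = 1.96
uncond = sq_err.mean()                               # ~ 13/12

print(cond_small, cond_large, uncond)
```

In the jointly Gaussian case (e.g. the Kalman filter) the conditional MSE happens not to depend on y at all, which is why the two notions are easy to conflate there.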
  11. Calculating Conditional Expectation for IID Normal Variables

    Also, please tell me if you know a good reference for understanding these subjects. I am trying to strengthen my probability/statistics background in order to understand stochastic processes and estimation theory deeply.
  12. Calculating Conditional Expectation for IID Normal Variables

    Thanks. I'm really confused! Can you please suggest a solution through the definition of E{x1*x2|x1+x2=x}? I want to correct my beliefs about this kind of problem (working with sums of RVs and ...). I think E{x1*x2|x1+x2=x} is the double integral over x1 and x2 of x1 * x2 * p(x1,x2|x1+x2=x) dx1 dx2...
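Rather than attacking that double integral directly, there's a standard shortcut for iid normals: write x1*x2 = ((x1+x2)^2 - (x1-x2)^2)/4 and use the fact that X1+X2 and X1-X2 are independent for iid Gaussians, so E{x1*x2 | x1+x2=x} = (x^2 - E[(X1-X2)^2])/4 = (x^2 - 2*sigma^2)/4. A quick Monte Carlo check (the particular sigma and conditioning value are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

sigma, s = 1.0, 1.5   # iid N(0, sigma^2); condition on X1 + X2 = s (assumed)

n = 2_000_000
x1 = rng.normal(0.0, sigma, size=n)
x2 = rng.normal(0.0, sigma, size=n)

# Approximate conditioning on the sum by keeping samples with X1+X2 near s:
mask = np.abs(x1 + x2 - s) < 0.01
mc = np.mean(x1[mask] * x2[mask])

theory = (s**2 - 2 * sigma**2) / 4   # = 0.0625 for s = 1.5, sigma = 1
print(mc, theory)
```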
  13. Calculating Conditional Expectation for IID Normal Variables

    I'm confused. Can you please explain more why the first solution (-1) is wrong? X_1 and X_2 are two RVs, and we're told that X_1+X_2=y. 1) I understand that the sum of two RVs is itself an RV, but here, once we're told the sum is y, it acts like a constant? (a realization of Y, that is Y=y, I...