Understanding Conditional Expectation, Variance, and Precision Matrices

SUMMARY

This discussion focuses on the derivation of conditional expectation, variance, and precision matrices in the context of Gaussian Markov Random Fields (GMRFs), specifically referencing Lindgren, Rue, and Lindström (2011). The conditional expectation is defined as $$E(u_{i,j}\vert u_{-i,j}) = \frac{1}{a}(u_{i-1,j}+u_{i+1,j}+u_{i,j-1}+u_{i,j+1})$$ with variance $$Var(u_{i,j}\vert u_{-i,j}) = \frac{1}{a}$$, where 'a' is a scaling factor. The precision matrix for first-order neighbors is presented, and an extension to second-order neighbors is discussed, leading to a more complex precision matrix structure. The inquiry seeks clarity on the derivation of these expressions and the significance of the scaling factor 'a'.

PREREQUISITES
  • Understanding of Gaussian Markov Random Fields (GMRFs)
  • Familiarity with conditional autoregressive (CAR) models
  • Knowledge of precision matrices in statistical modeling
  • Basic concepts of conditional expectation and variance
NEXT STEPS
  • Study the derivation of conditional expectation in GMRFs
  • Explore the implications of precision matrices in statistical inference
  • Examine the relationship between GMRFs and CAR models in detail
  • Learn about the scaling factors in statistical models and their significance
USEFUL FOR

Researchers, statisticians, and data scientists interested in advanced statistical modeling, particularly those working with Gaussian Markov Random Fields and conditional autoregressive models.

MAXIM LI
Homework Statement
$$E(u_{i,j}\vert u_{-i,j}) = \frac{1}{a}(u_{i-1,j}+u_{i+1,j}+u_{i,j-1}+u_{i,j+1})$$
Relevant Equations
$$E(u_{i,j}\vert u_{-i,j}) = \frac{1}{a}(u_{i-1,j}+u_{i+1,j}+u_{i,j-1}+u_{i,j+1})$$
My question relates to subsection 2.2.1 of [this article][1]. This subsection recalls the work of Lindgren, Rue, and Lindström (2011) on Gaussian Markov Random Fields (GMRFs). The subsection starts with a two-dimensional regular lattice where the 4 first-order neighbours of $u_{i,j}$ are identified. The article defines the full conditional distribution through the expectation $$E(u_{i,j}\vert u_{-i,j}) = \frac{1}{a}(u_{i-1,j}+u_{i+1,j}+u_{i,j-1}+u_{i,j+1})$$ and variance $$Var(u_{i,j}\vert u_{-i,j}) = \frac{1}{a}.$$
This is then redefined in terms of the precision matrix, where the upper right quadrant is
$$\begin{array}{cc}
-1 & \\
a & -1
\end{array}$$
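As a numerical sanity check (my own sketch, not from the article): building this first-order precision matrix on a small lattice and applying the standard GMRF identities ##E(u_i\vert u_{-i}) = -Q_{ii}^{-1}\sum_{j\neq i}Q_{ij}u_j## and ##Var(u_i\vert u_{-i}) = Q_{ii}^{-1}## reproduces the conditional mean and variance quoted above.

```python
import numpy as np

# My own sketch (not from the article): build the first-order precision
# matrix Q on an n x n lattice (diagonal a, -1 for each of the 4 nearest
# neighbours) and read off the full conditionals via the GMRF identities
#   E(u_i | u_-i)   = -(1/Q_ii) * sum_{j != i} Q_ij * u_j
#   Var(u_i | u_-i) =  1 / Q_ii

def lattice_precision(n, a):
    idx = lambda i, j: i * n + j  # flatten the 2-D lattice index
    Q = np.zeros((n * n, n * n))
    for i in range(n):
        for j in range(n):
            Q[idx(i, j), idx(i, j)] = a
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                if 0 <= i + di < n and 0 <= j + dj < n:
                    Q[idx(i, j), idx(i + di, j + dj)] = -1
    return Q

n, a = 5, 4.5                  # a > 4 keeps Q diagonally dominant
Q = lattice_precision(n, a)
u = np.random.default_rng(0).normal(size=n * n)

k = 2 * n + 2                  # the interior site (i, j) = (2, 2)
cond_mean = u[k] - Q[k, :] @ u / Q[k, k]   # u[k] cancels the j = i term
neigh_sum = u[k - n] + u[k + n] + u[k - 1] + u[k + 1]

print(np.isclose(cond_mean, neigh_sum / a))  # True: mean is (1/a) * neighbours
print(np.isclose(1 / Q[k, k], 1 / a))        # True: variance is 1/a
```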

Extending to second-order neighbours (i.e. the neighbours of first-order neighbours), the precision matrix becomes (again, just the upper right quadrant)
$$\begin{array}{ccc}
1 & & \\
-2a & 2 & \\
4+a^2 & -2a & 1
\end{array}$$
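My own sketch of where these numbers might come from (an assumption on my part, not stated in the article): if the second-order model corresponds to squaring the precision, ##Q_2 = QQ##, then its stencil is the 5-point stencil convolved with itself, which gives ##4+a^2## at the centre, ##-2a## one step out along the axes, ##2## on the diagonals, and ##1## two steps out.

```python
import numpy as np

# My own sketch: if the second-order model corresponds to Q2 = Q @ Q, its
# stencil is the 5-point stencil convolved with itself. Convolving
# [0 -1 0; -1 a -1; 0 -1 0] with itself gives 4 + a^2 at the centre,
# -2a one step out on the axes, 2 on the diagonals, and 1 two steps out.

a = 3.0
s = np.array([[0, -1, 0],
              [-1, a, -1],
              [0, -1, 0]], dtype=float)

out = np.zeros((5, 5))         # 2-D convolution by brute force
for i in range(3):
    for j in range(3):
        out[i:i + 3, j:j + 3] += s[i, j] * s

print(out[2, 2])  # centre:                 4 + a^2 = 13.0
print(out[1, 2])  # first-order neighbour: -2a      = -6.0
print(out[1, 1])  # diagonal neighbour:     2.0
print(out[0, 2])  # second-order neighbour: 1.0
```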

I am new to this topic and am trying to understand where the expressions for the conditional expectation and variance come from, and how the precision matrices were derived. I'd appreciate a full explanation and derivation for both the first-order and second-order cases. I tried looking in the book 'Gaussian Markov Random Fields: Theory and Applications', and this looks very similar to a conditional autoregressive (CAR) model, defined in Chapter 1. However, there the full conditionals are written as

$$
x_i \vert \mathbf{x}_{-i} \sim N\left(\sum_{j\neq i}\beta_{ij}x_{j},\kappa_i^{-1} \right)
$$

and the elements of the corresponding precision matrix are stated to be ##Q_{ii} = \kappa_i## and ##Q_{ij} = -\kappa_{i}\beta_{ij}## for ##i\neq j##. This seems to be more general, which leaves me wondering how the conditional mean and variance at the start of this post were derived (along with the precision matrices). Where did ##a## come from, and why do we scale by this amount? Any help addressing this is much appreciated.

Note that ##\mathbf{x}_{-i}## means the vector of random variables excluding ##x_i##.
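For what it's worth, here is the reconciliation I suspect (my own check, happy to be corrected): taking ##\kappa_i = a## and ##\beta_{ij} = 1/a## for the four nearest neighbours in the CAR formulas reproduces the first-order precision entries above.

```python
# My own sanity check (not from the article): in the CAR parametrisation,
# Q_ii = kappa_i and Q_ij = -kappa_i * beta_ij for neighbours j ~ i.
# Choosing kappa_i = a and beta_ij = 1/a for the 4 nearest neighbours gives
# back the first-order stencil (diagonal a, neighbours -1), a conditional
# mean of (1/a) * (sum of neighbours), and a conditional variance of 1/a.

a = 8.0                      # chosen so 1/a is exact in floating point
kappa = a
beta = 1.0 / a               # weight beta_ij for each nearest neighbour

Q_ii = kappa                 # diagonal entry of the precision matrix
Q_ij = -kappa * beta         # off-diagonal entry for a neighbour pair

print(Q_ii)       # 8.0   -> the 'a' on the diagonal
print(Q_ij)       # -1.0  -> the -1 for each neighbour
print(1 / kappa)  # 0.125 -> the conditional variance 1/a
```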

[1]: https://becarioprecario.bitbucket.io/spde-gitbook/ch-intro.html#sec:spde
 
