Matrix notation - Two jointly Gaussian vectors pdf

AI Thread Summary
The discussion revolves around deriving the conditional probability density function (pdf) of two jointly Gaussian vectors using block matrix notation. Participants address issues with matrix inversions and multiplication orders, emphasizing the importance of symmetry in the calculations. One user identifies an error in the inverse matrix expression and notes that the matrix may not be invertible if the vectors are equal. After some back-and-forth, the original poster successfully derives the desired result by reorganizing the matrix multiplication. Clarifications are sought regarding the commutativity of covariance matrices and the logic behind certain substitutions in the derivation.
EmmaSaunders1
Hello

I am having trouble deriving, using block matrix notation, the conditional pdf of two jointly Gaussian vectors.

I assume it just involves some rearranging of eq 1 (attached), but I am unsure whether taking the inverse of the resultant matrix in eq 1 is valid and whether the order of multiplication holds.

Thoughts appreciated
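For reference, the result I am trying to reach is the standard conditional-Gaussian formula (assuming the usual partitioning of the mean as ##[\bar{x};\bar{y}]## and of the covariance as ##S = [S_{xx}, S_{xy}; S_{yx}, S_{yy}]##): ##x## given ##y## is Gaussian with
$$\mu_{x|y} = \bar{x} + S_{xy} S_{yy}^{-1}(y - \bar{y}), \qquad S_{x|y} = S_{xx} - S_{xy} S_{yy}^{-1} S_{yx}.$$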
 

Attachments

  • gauss1.JPG
There's an error: inv([I,0;A,I]) = [I,0;-A,I]. Otherwise it looks on the right track. If you expand the product, you should get an answer that is symmetric in x and y, since the choice of factorization shouldn't matter. The Woodbury matrix identity may or may not be useful here.

And yes, the matrix isn't guaranteed to be invertible (e.g. if X = Y).
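The block-triangular inverse quoted above is easy to sanity-check numerically. A minimal sketch with NumPy (the matrix A here is an arbitrary random block, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
I = np.eye(n)
Z = np.zeros((n, n))

# Block lower-triangular matrix M = [I, 0; A, I]
M = np.block([[I, Z], [A, I]])

# Claimed inverse: [I, 0; -A, I]
M_inv = np.block([[I, Z], [-A, I]])

# Both products should give the 2n x 2n identity
print(np.allclose(M @ M_inv, np.eye(2 * n)))
print(np.allclose(M_inv @ M, np.eye(2 * n)))
```

This works because the off-diagonal blocks cancel: [I,0;A,I]·[I,0;-A,I] has lower-left block A·I + I·(-A) = 0.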
 
Hello

Thanks very much for your help. I have multiplied out the problem and looked for symmetry as you suggested. I do, however, have an extra term in comparison to the final solution.

Would you possibly be able to take a look at the attached? Perhaps I am missing something. Is there a concept or theorem which shows the extra term is zero, or is the calculation fundamentally wrong? I also notice there is an "×" sign in the original attachment, which I assumed to be matrix multiplication rather than a cross product. Is that correct?

Thanks again for your help
 

Sorry, the original version looks correct; ignore my previous comments. In effect you're showing that [I, -Sxy*inv(Syy); 0, I]*[x; y] given y is Gaussian. Notice that the exponent reduces to

-(1/2) * [x'-xbar', y'-ybar'] * (inv(S) - [0,0; 0,inv(Syy)]) * [x-xbar; y-ybar]

and use

[0,0; 0,inv(Syy)] = [I, 0; -inv(Syy)*Syx, I] * [0,0; 0,inv(Syy)] * [I, -Sxy*inv(Syy); 0, I].
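The exponent manipulation in that hint can be checked numerically. Writing the shear factor as C = [I, -Sxy*inv(Syy); 0, I], the quantity inv(S) - [0,0; 0,inv(Syy)] should equal C' * [P, 0; 0, 0] * C, where P is the inverse of the Schur complement Sxx - Sxy*inv(Syy)*Syx (the conditional covariance). A sketch, assuming a randomly generated joint covariance just for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2

# Random symmetric positive-definite joint covariance S = [Sxx, Sxy; Syx, Syy]
L = rng.standard_normal((2 * n, 2 * n))
S = L @ L.T + 2 * n * np.eye(2 * n)
Sxx, Sxy = S[:n, :n], S[:n, n:]
Syx, Syy = S[n:, :n], S[n:, n:]
Syy_inv = np.linalg.inv(Syy)

# Conditional covariance of x given y: Schur complement of Syy in S
S_cond = Sxx - Sxy @ Syy_inv @ Syx
P = np.linalg.inv(S_cond)

Z = np.zeros((n, n))
# Shear factor C = [I, -Sxy*inv(Syy); 0, I]
C = np.block([[np.eye(n), -Sxy @ Syy_inv], [Z, np.eye(n)]])

# Exponent identity: inv(S) - [0,0; 0,inv(Syy)] == C' * [P,0; 0,0] * C
lhs = np.linalg.inv(S) - np.block([[Z, Z], [Z, Syy_inv]])
rhs = C.T @ np.block([[P, Z], [Z, Z]]) @ C
print(np.allclose(lhs, rhs))
```

This is just the block-matrix inversion formula in disguise: the top-left block of inv(S) is exactly inv(S_cond), which is why the quadratic form in x - xbar - Sxy*inv(Syy)*(y - ybar) carries the conditional covariance.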
 
Hi, thanks for your help.

I have managed to obtain the desired result. It was simply a matter of grouping the matrix multiplication into two parts, separated by the "×" sign in the first attachment, to make the multiplication easier. Could you please clarify one point: during the expansion I assumed that the product of two different covariance matrices is commutative. Is this assumption okay?

I would also like to understand the simpler route you outlined, but I am unable to follow the logic of the substitution, as shown in the attached.

Your help is appreciated.

Thanks
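On the commutativity question: covariance matrices are symmetric, but symmetric matrices commute only when they share a common eigenbasis, so two arbitrary covariance matrices generally do not commute. A minimal NumPy check (randomly generated matrices, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

def random_cov(rng, n):
    # A random symmetric positive-definite "covariance" matrix
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A = random_cov(rng, n)
B = random_cov(rng, n)

# Symmetric matrices commute only if simultaneously diagonalizable,
# so for generic A and B this comparison fails
print(np.allclose(A @ B, B @ A))
```

Note also that the product A @ B of two symmetric matrices is itself symmetric exactly when A and B commute, which is one quick way to spot the issue in a derivation.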
 

