Multivariable change of variable.

Kuma
Hi there.

In the multivariate case it is proven that if

X ~ Nm(U, S), where S is the covariance matrix, and Y = a + CX for an invertible m x m matrix C, then:

Y ~ Nm(a + CU, CSC')
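As a quick numerical sanity check of that theorem (a sketch only; the specific U, S, C, a below are made-up illustrative values, not from the thread), we can sample Y = a + CX and compare the empirical mean and covariance against a + CU and CSC':

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters for X ~ N(U, S); not taken from the thread.
U = np.array([1.0, -2.0])
A = rng.standard_normal((2, 2))
S = A @ A.T + 2 * np.eye(2)              # a valid (positive definite) covariance
C = np.array([[2.0, 1.0], [0.0, 3.0]])   # invertible 2 x 2
a = np.array([5.0, -1.0])

# Sample X and form Y = a + C X (each row of X is one sample).
X = rng.multivariate_normal(U, S, size=200_000)
Y = a + X @ C.T

# The empirical mean and covariance should be close to a + CU and CSC'.
print(np.allclose(Y.mean(axis=0), a + C @ U, atol=0.1))
print(np.allclose(np.cov(Y.T), C @ S @ C.T, atol=0.5))
```

With 200,000 samples the Monte Carlo error is well inside the tolerances used above.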

I am trying to apply this proof to a similar problem.

Suppose we have X1, ..., Xk with Xi ~ Nm(Ui, Si) for i = 1, ..., k, and the Xi independent.

What is the distribution of Y = a + sum (i=1 to k) CiXi?

I'm trying to apply the change of variable formula to derive this, but I don't know how to use it for a function of several random vectors. In the proof with only one X I can follow it.

Here Y = a + (C1X1 + C2X2 + C3X3 + ... + CkXk)
 
That does not look like a generalization; it looks like a special case of your first statement. Stack the X_i into a single vector X = (X_1', \dots, X_k')'. By independence its covariance matrix is block diagonal with blocks S_1, \dots, S_k, and the "C" in your first theorem is just the block matrix with a single row of blocks, \begin{bmatrix} C_1 & C_2 & \cdots & C_k\end{bmatrix}. Multiplying out CU and CSC' block by block then gives Y ~ Nm(a + sum C_i U_i, sum C_i S_i C_i').
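If it helps, here is a small numpy sketch of that stacking argument for k = 2 (all matrices below are arbitrary illustrative values): with the block row C = [C1 C2] and the block-diagonal covariance S, CU multiplies out to C1 U1 + C2 U2 and CSC' to C1 S1 C1' + C2 S2 C2'.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_cov(n):
    # A random positive definite matrix to serve as a covariance.
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

# Two independent components X1 ~ N(U1, S1), X2 ~ N(U2, S2).
U1, U2 = rng.standard_normal(3), rng.standard_normal(3)
S1, S2 = random_cov(3), random_cov(3)
C1, C2 = rng.standard_normal((2, 3)), rng.standard_normal((2, 3))
a = rng.standard_normal(2)

# Stack X = (X1, X2): mean U = (U1, U2), block-diagonal covariance S
# (the off-diagonal blocks are zero because X1 and X2 are independent).
U = np.concatenate([U1, U2])
S = np.block([[S1, np.zeros((3, 3))],
              [np.zeros((3, 3)), S2]])
C = np.hstack([C1, C2])        # the single "row of blocks" [C1  C2]

# The one-variable theorem gives Y ~ N(a + CU, CSC');
# multiplying out the blocks recovers the sum form.
mean_Y = a + C @ U
cov_Y = C @ S @ C.T

print(np.allclose(mean_Y, a + C1 @ U1 + C2 @ U2))            # → True
print(np.allclose(cov_Y, C1 @ S1 @ C1.T + C2 @ S2 @ C2.T))   # → True
```

The same bookkeeping works for any k, which is why the sum form Y ~ Nm(a + sum C_i U_i, sum C_i S_i C_i') drops straight out of the single-matrix theorem.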
 