Citation needed: The only multivariate rotationally invariant distribution with iid components is the multivariate normal distribution

AI Thread Summary
The discussion centers on the proposition that a random vector with independent and identically distributed (iid) components and mean zero is invariant under orthogonal transformations only if its distribution is multivariate normal. The argument works with the probability density function (PDF) of the random vector: invariance under orthogonal transformations forces the marginal PDF to be even, and the symmetric matrix appearing in the invariant quadratic form must be a multiple of the identity, which pins down the form of the PDF. The conversation points to the Maxwell characterization of the multivariate normal distribution as a key source, and the thread ends with a citation of an article supporting the claim.
DrDu
TL;DR Summary
I need a citation for the proposition that the only multivariate rotationally invariant distribution with iid components is a multivariate normal distribution.
I need a citation for the following proposition: Let ##X=(X_1, \dots, X_n)^T## be a random vector with iid components ##X_i## and mean 0. Then the distribution of ##X## is invariant under orthogonal transformations only if the distribution of the ##X_i## is a normal distribution.
Thank you for your help!
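The proposition can be illustrated numerically: for iid standard normal components a rotation leaves the marginal distribution unchanged, while for iid uniform components a rotated coordinate is no longer uniform (the sum of two uniforms is triangular). A minimal sketch, assuming NumPy is available, comparing excess kurtosis before and after a 45-degree rotation (the sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 2, 200_000

# Rotation by 45 degrees: an orthogonal matrix Q with Q^T Q = I
Q = np.array([[1.0, 1.0], [-1.0, 1.0]]) / np.sqrt(2)

def excess_kurtosis(s):
    """Sample excess kurtosis: 0 for normal, -1.2 for uniform(-1,1)."""
    s = (s - s.mean()) / s.std()
    return np.mean(s**4) - 3.0

# iid standard normal components: the rotated marginal law is unchanged
X = rng.standard_normal((n, N))
Y = Q @ X
print(excess_kurtosis(X[0]), excess_kurtosis(Y[0]))  # both near 0

# iid uniform components: the rotated marginal is triangular, not uniform
U = rng.uniform(-1, 1, (n, N))
V = Q @ U
print(excess_kurtosis(U[0]), excess_kurtosis(V[0]))  # near -1.2 vs near -0.6
```

The fourth moment is used here only as a cheap distributional fingerprint; any test distinguishing the uniform from the triangular law would do.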
 
The PDF of ##X## is
$$f(x_1)\cdots f(x_n)$$
where ##f## is the PDF of each ##X_i##. Invariance under orthogonal transformations requires ##f## to be even, since the transformation which multiplies the ##i##th component by ##-1## and fixes the others is orthogonal. We can then write ##f(z) = g(z^2)##, and invariance means
$$g(x_1^2) \cdots g(x_n^2) = F(x^T A x)$$
for some symmetric matrix ##A## which satisfies ##R^T A R = A## for every orthogonal ##R##. This is equivalent to the requirement that ##A## commute with every orthogonal ##R##. I believe this in fact forces ##A## to be a multiple of the identity. If so, we have
$$g(x_1^2) \cdots g(x_n^2) = F(x_1^2 + \dots + x_n^2)$$
where the multiplier of the identity has been absorbed into ##F##. Setting all but one of the ##x_i## to zero then shows that
$$g(x_j^2)\, g(0)^{n-1} = F(x_j^2).$$
Writing ##g = Ch## where ##h(0) = 1##, we find ##F = C^n h##, so
$$h(z_1) \cdots h(z_n) = h(z_1 + \dots + z_n)$$
for all ##(z_1, \dots, z_n) \in [0, \infty)^n##. I think we can now proceed by induction on ##n##, noting that for ##n = 2## and continuous ##h## this is Cauchy's multiplicative functional equation, giving ##h(z) = h(1)^z = \exp(z \log h(1))##.
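The two pivotal facts in the argument above — that the joint Gaussian density depends only on ##x^T x##, and that ##h(z) = e^{cz}## satisfies ##h(z_1)h(z_2) = h(z_1 + z_2)## — can be checked numerically. A minimal sketch, assuming NumPy; the evaluation points are arbitrary:

```python
import numpy as np

# Standard normal marginal density f, and the joint density as a product
def f(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def joint(x):
    return np.prod(f(np.asarray(x)))

# Two points with the same Euclidean norm (both have ||x|| = 5):
# rotational invariance says the joint density agrees at both
a = np.array([3.0, 4.0, 0.0])
b = np.array([0.0, 0.0, 5.0])
print(np.isclose(joint(a), joint(b)))  # True

# The multiplicative relation h(z1) h(z2) = h(z1 + z2) for h(z) = exp(-z/2)
h = lambda z: np.exp(-z / 2)
z1, z2 = 1.3, 2.7
print(np.isclose(h(z1) * h(z2), h(z1 + z2)))  # True
```

This only verifies that the normal density has the claimed invariance; the cited characterization is needed for the converse, i.e. that it is the only such distribution.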
 
Look up the Maxwell characterization of the multivariate normal distribution.
 