How Do You Derive the PDF of a Linearly Transformed Vector?

SUMMARY

The discussion focuses on deriving the probability density function (pdf) of a linearly transformed vector Y, defined as Y = AX + b, where A is an invertible n×n matrix and X is an n-vector of jointly continuous random variables. The pdf transforms according to f_Y(y) = f_X(x) |det(J)|, where x = A^{-1}(y - b) and J is the Jacobian matrix of this inverse map, so that |det(J)| = 1/|det(A)|. The key takeaway is that the pdf of Y can be expressed in terms of the original pdf f(x) by incorporating the determinant of the Jacobian, which accounts for the change of variables in the transformation.
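Spelled out, the formula summarized above is the standard multivariate change-of-variables result (a sketch, using the same symbols as the summary):

```latex
% For Y = AX + b with A invertible, the inverse map is
% x = A^{-1}(y - b), whose Jacobian matrix is J = A^{-1}.
f_Y(y) = f_X\!\left(A^{-1}(y - b)\right)\,\left|\det A^{-1}\right|
       = \frac{f_X\!\left(A^{-1}(y - b)\right)}{\left|\det A\right|}
```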

PREREQUISITES
  • Understanding of jointly continuous random variables and their pdfs.
  • Knowledge of matrix operations, specifically invertible matrices.
  • Familiarity with the concept of the Jacobian matrix in multivariable calculus.
  • Basic principles of transformation of random variables.
NEXT STEPS
  • Study the derivation of the Jacobian matrix for vector transformations.
  • Learn about the properties of determinants and their role in transformations.
  • Explore examples of linear transformations of random variables in probability theory.
  • Investigate the implications of linear transformations on the behavior of probability distributions.
USEFUL FOR

Students and professionals in statistics, data science, and applied mathematics who are working with multivariate probability distributions and transformations of random variables.

alpines4

Homework Statement


Define X to be an n-vector of jointly continuous random variables X1, ..., Xn with joint pdf f(x) mapping R^n to R. Let A be an invertible n×n matrix and set Y = AX + b. I want to derive the pdf of Y in terms of f(x), the original pdf.



Homework Equations






The Attempt at a Solution



Given a random variable X with PDF f(x), the pdf of Y = g(X) is (given that g is one-to-one and thus has an inverse) f(g^{-1}(y)) |d g^{-1}(y)/dy|. I don't know how to generalize this to a matrix, however. I assume it will be kind of similar... Any help is appreciated. I just need some tips to get started. Thank you!
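As a quick numerical sanity check of the one-dimensional formula (my own example, not from the thread: I pick the affine map g(x) = 2x + 1 and a standard-normal X, so Y should be N(1, 4)):

```python
import math

def f_X(x):
    # Standard normal pdf.
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def f_Y(y):
    # Change of variables for Y = g(X) with g(x) = 2x + 1:
    # f_Y(y) = f_X(g^{-1}(y)) * |d g^{-1}(y)/dy| = f_X((y - 1)/2) * (1/2).
    return f_X((y - 1) / 2) * 0.5

def normal_pdf(y, mu, sigma):
    # Direct N(mu, sigma^2) pdf, for comparison.
    z = (y - mu) / sigma
    return math.exp(-z * z / 2) / (sigma * math.sqrt(2 * math.pi))

# Y = 2X + 1 is N(1, 2^2), so the two expressions must agree everywhere.
for y in [-3.0, 0.0, 1.0, 2.5]:
    assert abs(f_Y(y) - normal_pdf(y, 1.0, 2.0)) < 1e-12
```

Note the derivative factor applies to the inverse map g^{-1}, not to g itself; writing f(g^{-1}(y)) g'(y) would not integrate to 1.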
 
alpines4 said:
Given a random variable X with PDF f(x), the pdf of Y = g(X) is f(g^{-1}(y)) |d g^{-1}(y)/dy|. I don't know how to generalize this to a matrix, however.


Intuitively, the pdf of Y = g(X) is given by conservation of probability:
f_X(x) |dx| = f_Y(y) |dy|.
In the case of vectors x and y, the volume elements dx and dy are related through the determinant of the Jacobian matrix of the map between x and y.
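Following that hint, the vector case works out as (a sketch):

```latex
% For Y = AX + b, the inverse map is x = A^{-1}(y - b),
% with Jacobian matrix \partial x / \partial y = A^{-1}.
% Conservation of probability, f_X(x)\,|dx| = f_Y(y)\,|dy|, then gives
f_Y(y) = f_X\!\left(A^{-1}(y - b)\right)
         \left|\det\frac{\partial x}{\partial y}\right|
       = \frac{f_X\!\left(A^{-1}(y - b)\right)}{|\det A|}
```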
 
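A small numerical illustration of the vector formula (my own example, not from the thread): if X has independent standard-normal components, then Y = AX + b is Gaussian with mean b and covariance A Aᵀ, so the change-of-variables density must agree with that Gaussian's pdf.

```python
import math

# Hypothetical 2x2 example: an invertible A and a shift b.
A = [[2.0, 1.0],
     [0.0, 3.0]]
b = [1.0, -1.0]

det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]
A_inv = [[ A[1][1] / det_A, -A[0][1] / det_A],
         [-A[1][0] / det_A,  A[0][0] / det_A]]

def f_X(x):
    # Joint pdf of two independent standard normals.
    return math.exp(-(x[0] ** 2 + x[1] ** 2) / 2) / (2 * math.pi)

def f_Y(y):
    # Change of variables: f_Y(y) = f_X(A^{-1}(y - b)) / |det A|.
    d = [y[0] - b[0], y[1] - b[1]]
    x = [A_inv[0][0] * d[0] + A_inv[0][1] * d[1],
         A_inv[1][0] * d[0] + A_inv[1][1] * d[1]]
    return f_X(x) / abs(det_A)

def gaussian_pdf(y):
    # Direct N(b, Sigma) pdf with Sigma = A A^T, for comparison.
    S = [[A[0][0] ** 2 + A[0][1] ** 2, A[0][0] * A[1][0] + A[0][1] * A[1][1]],
         [A[0][0] * A[1][0] + A[0][1] * A[1][1], A[1][0] ** 2 + A[1][1] ** 2]]
    det_S = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    S_inv = [[ S[1][1] / det_S, -S[0][1] / det_S],
             [-S[1][0] / det_S,  S[0][0] / det_S]]
    d = [y[0] - b[0], y[1] - b[1]]
    q = (d[0] * (S_inv[0][0] * d[0] + S_inv[0][1] * d[1])
         + d[1] * (S_inv[1][0] * d[0] + S_inv[1][1] * d[1]))
    return math.exp(-q / 2) / (2 * math.pi * math.sqrt(det_S))

# The two densities agree at arbitrary points.
for y in [[1.0, -1.0], [3.0, 2.0], [-2.0, 0.5]]:
    assert abs(f_Y(y) - gaussian_pdf(y)) < 1e-12
```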
