Are there theorems that deal with the general question: given a set of random variables ##\{X_1,X_2,\ldots,X_M\}##, when does there exist some subspace ##S## of the vector space of random variables such that ##S## contains each ##X_i## and ##S## has a basis of mutually independent random variables?
"Linearly independent" presumably implies we are considering a set of random variables to be a vector space under the operations of multiplication by scalars and addition of the random variables.
Suppose we take "transformed" to mean transformed by a linear transformation on a vector space. If the vector space containing the random variables ##\{X_1,X_2,\ldots,X_M\}## has a finite basis ##B_1,B_2,\ldots,B_n## consisting of mutually independent random variables, then (trivially) for each ##X_k## there exists a (possibly non-invertible) linear transformation ##T## that sends some linear combination of the ##B_i## to ##X_k##.
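A standard concrete instance of this (a sketch of my own, assuming the familiar Gaussian setting, not a claim about the general case): if the ##X_k## are jointly Gaussian with covariance ##\Sigma = LL^T##, then they lie in the span of mutually independent standard normals ##B_1,\ldots,B_n## via ##X = LB##. A small numerical check in Python/NumPy; the names `cov`, `L`, `B`, `X` are mine:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target covariance for three jointly Gaussian variables X_1, X_2, X_3.
cov = np.array([[2.0, 0.6, 0.3],
                [0.6, 1.0, 0.2],
                [0.3, 0.2, 1.5]])

# Cholesky factor: cov = L @ L.T
L = np.linalg.cholesky(cov)

# Rows of B are samples of three mutually independent standard normals
# (playing the role of the basis B_1, B_2, B_3 above).
n_samples = 200_000
B = rng.standard_normal((n_samples, 3))

# Each X_k is a fixed linear combination of the independent B_i:
# X = B @ L.T, i.e. X_k = sum_i L[k, i] * B_i.
X = B @ L.T

# The sample covariance of X approximates the target covariance,
# confirming the X_k live in the span of the independent B_i.
print(np.round(np.cov(X, rowvar=False), 2))
print(np.round(cov, 2))
```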
If the smallest vector space containing the ##X_i## is infinite-dimensional (e.g., the vector space of all measurable functions on the real number line), I don't know what happens.
I don't recall any texts that focus on vector spaces of random variables. Since the product of random variables is also a random variable, the topic for textbooks seems to be the algebra of random variables. But that approach downplays the concept of probability distributions.