A doubt on statistical independence, orthogonality and uncorrelatedness

  • Context: Graduate 
  • Thread starter: dexterdev
  • Tags: Doubt, Orthogonality

Discussion Overview

The discussion revolves around the concepts of statistical independence, uncorrelatedness, and orthogonality in the context of random variables. Participants explore the relationships between these concepts and seek clarification on their definitions and implications, particularly in statistical and vector space contexts.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant seeks clarity on the concepts of statistical independence, uncorrelatedness, and orthogonality, questioning if they are the same and how they relate to random variables.
  • Another participant distinguishes between statistical independence and independence in vector spaces, suggesting that they are not identical due to differing definitions.
  • It is proposed that random variables can be viewed as vectors, and the relationship between statistical independence and vector independence is questioned, with no definitive answer provided.
  • A participant notes that orthogonality typically means the same as uncorrelatedness in statistics, particularly when considering two random variables affecting a third.
  • Mathematical expressions are presented to illustrate the relationship between joint probability density functions and expected values, though the implications are not universally agreed upon.

Areas of Agreement / Disagreement

Participants express differing views on the relationships between the concepts discussed, with no consensus reached on whether statistical independence and vector independence are equivalent or how they should be interpreted in various contexts.

Contextual Notes

There are unresolved questions regarding the definitions and implications of statistical independence, uncorrelatedness, and orthogonality, particularly in relation to vector spaces and the mathematical relationships between them.

dexterdev

Hi friends,
I want to make my concepts of statistical independence, uncorrelatedness and orthogonality clear. Suppose I have two random variables x and y. I have attached two pictures illustrating the above concepts. Are they correct, and which is the more general picture? If there is any mistake, please point it out.


What do statistical independence and linear independence mean? Are they the same?

Does pdf(x,y) = pdf(x)*pdf(y) always imply E(XY) = E(X)E(Y)? Can anyone please explain that?

-Devanand T
 

Attachments

  • ind_uncorr_orth1.jpg (24.6 KB)
  • ind_uncorr_orth2.jpg (24.9 KB)
You'd get better answers if you only asked one important question per thread!

What do you mean by "linear independence"? Are you referring to independence of vectors in a vector space? That type of independence is not identical to statistical independence, if only because independence in vector spaces is defined without any reference to probability or expected values.

It is possible for a person to view a set of random variables as vectors in various ways.

So the relevant question is "Can I view random variables as vectors in such a way that a set of random variables is statistically independent if and only if the set is independent as a set of vectors?" I can't answer that; I'll have to think about it.

It is more common to view random variables as vectors in such a way that two random variables are uncorrelated if and only if they are orthogonal as vectors. I think the general idea is that you view a set of random variables as forming not only a vector space, but a vector space with an inner product. I think the inner product is the covariance. If you do things this way, the "natural" addition of random variables becomes the way you add them as vectors, and the "natural" way you multiply a random variable by a constant becomes the way you multiply it as a vector by a scalar.
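To make the covariance-as-inner-product view concrete, here is a small sketch (not from the thread; the discrete example is a standard one, chosen for illustration): X uniform on {-1, 0, 1} and Y = X² have zero covariance, so they are "orthogonal" under that inner product, yet they are clearly not statistically independent.

```python
from fractions import Fraction

# X is uniform on {-1, 0, 1}; Y = X**2. Joint pmf puts 1/3 on each pair.
support = [(-1, 1), (0, 0), (1, 1)]
p = Fraction(1, 3)

E_X = sum(p * x for x, _ in support)        # 0
E_Y = sum(p * y for _, y in support)        # 2/3
E_XY = sum(p * x * y for x, y in support)   # (-1*1 + 0*0 + 1*1)/3 = 0

cov = E_XY - E_X * E_Y                      # 0 -> orthogonal under covariance
print(cov)                                  # 0

# Yet X and Y are dependent: P(X=0, Y=0) = 1/3, while
# P(X=0) * P(Y=0) = 1/3 * 1/3 = 1/9, so pmf(x,y) != pmf(x)*pmf(y).
```

So uncorrelated (orthogonal in this inner product) does not imply independent, which is one reason the two notions of "independence" in the question must be kept apart.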

There could be other ways of viewing random variables as vectors, where addition and scalar multiplication in the vector space have a different definition than the "natural" one. That's why it's hard to answer the question "Can I view random variables as vectors in such a way that a set of random variables is statistically independent if and only if the set is independent as a set of vectors?" You can't automatically say "no".
 
In stats, "orthogonal" generally means the same as "uncorrelated", but it is usually used in the context of two r.v.s that affect a third. The point is that if they're uncorrelated you can run the two regressions independently. See http://en.wikipedia.org/wiki/Orthogonality#Statistics.2C_econometrics.2C_and_economics
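A quick sketch of that regression point (hypothetical data, not from the thread): when two centered regressors are orthogonal in the sample sense (sum of products is zero), each single-variable least-squares slope recovers the corresponding coefficient of the two-variable model, so the two regressions can indeed be run independently.

```python
# Centered regressors chosen so that sum(a*b for a, b in zip(x1, x2)) == 0.
x1 = [-1, 0, 1, -1, 1, 0]
x2 = [1, -2, 1, 1, 1, -2]
y = [2*a + 3*b for a, b in zip(x1, x2)]   # true model: y = 2*x1 + 3*x2

def slope(x, y):
    # Least-squares slope for a centered regressor: sum(x*y) / sum(x*x).
    return sum(a*b for a, b in zip(x, y)) / sum(a*a for a in x)

print(slope(x1, y), slope(x2, y))   # 2.0 3.0 -- each simple regression
                                    # recovers its multiple-regression coefficient
```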

If pdf(x,y) = pdf(x)·pdf(y), then
E(XY) = ∫∫ xy·pdf(x,y) dx dy = ∫∫ xy·pdf(x)·pdf(y) dx dy = (∫ x·pdf(x) dx)·(∫ y·pdf(y) dy) = E(X)E(Y).
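The same factorization can be checked exactly in the discrete case (a made-up example, not from the thread): if the joint pmf is the product of the marginals, then E(XY) = E(X)E(Y) with exact rational arithmetic.

```python
from fractions import Fraction
from itertools import product

# Hypothetical independent discrete X and Y: joint pmf = product of marginals.
px = {1: Fraction(1, 2), 2: Fraction(1, 2)}
py = {0: Fraction(1, 4), 3: Fraction(3, 4)}
joint = {(x, y): px[x] * py[y] for x, y in product(px, py)}

E_X = sum(p * x for x, p in px.items())                  # 3/2
E_Y = sum(p * y for y, p in py.items())                  # 9/4
E_XY = sum(p * x * y for (x, y), p in joint.items())     # 27/8

print(E_XY == E_X * E_Y)   # True: independence implies E(XY) = E(X)E(Y)
```

Note the converse fails: E(XY) = E(X)E(Y) (uncorrelatedness) does not imply that the joint pmf factors.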
 
Thanks guys... will ask one question per thread
 
