Uniform distribution - Independent random variables?

AI Thread Summary
Uniform distribution does not imply independence among random variables. For instance, if X is uniformly distributed between 0 and 1, and Y is defined as Y = 1 - X, then X and Y are not independent despite both being uniformly distributed. Uniformity of a random variable means only that its outcomes are equally likely; it says nothing about dependence between variables. To show that two variables are dependent, one can check whether E[XY] ≠ E[X]E[Y], but finding E[XY] = E[X]E[Y] does not guarantee independence. Determining whether uniformly distributed variables are independent requires information about their joint distribution, not just their marginals.
karlihnos
Hi guys,

I have this doubt but I am not sure: if I have a uniform distribution, can I conclude that the events or random variables are independent?

Thank you
 
Could you please explain a little bit more about your data? How many random variables? Which of those variables are uniform...
 
karlihnos said:
Hi guys,

I have this doubt but I am not sure: if I have a uniform distribution, can I conclude that the events or random variables are independent?

Thank you

No. For example, choose a random variable X from a uniform distribution on (0,1) and then let
Y = 1 - X. X and Y are certainly not independent, but both have uniform distributions.
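A minimal Python sketch (not part of the original thread) illustrates mathman's example: the marginals of X and Y = 1 - X both look uniform, yet the two variables are perfectly (negatively) correlated, so they cannot be independent.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=100_000)  # X ~ U(0, 1)
y = 1.0 - x                              # Y = 1 - X, also U(0, 1)

# Both marginals look uniform: means near 0.5, variances near 1/12.
print(x.mean(), x.var())   # ~0.5, ~0.0833
print(y.mean(), y.var())   # ~0.5, ~0.0833

# But X and Y are not independent: knowing X fixes Y exactly,
# and their correlation is -1.
print(np.corrcoef(x, y)[0, 1])  # ~ -1.0
```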
 
karlihnos said:
Hi guys,

I have this doubt but I am not sure: if I have a uniform distribution, can I conclude that the events or random variables are independent?

Thank you

Generally, a distribution without any additional information (e.g. parameters in some distributions) doesn't say anything about dependence. Uniformity of a random variable only means that its realizations are equally likely, i.e. its density is constant: ##f_X(x_1) = f_X(x_2) = \dots = \frac{1}{b-a}## where ##X \sim U(a,b)## and ##x_i \in [a,b]##. The notion of (in)dependence is much more tricky: it is a property of the joint distribution of the variables, not of any single marginal. Are you talking about multiple uniformly distributed random variables? mathman gave a nice example of correlated uniformly distributed rvs above.
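To make the marginal-versus-joint point concrete, here is a small illustrative sketch (my addition, using scipy's Kolmogorov-Smirnov test as one assumed way to check a marginal for uniformity): each marginal of the pair (X, 1 - X) passes a uniformity check, while the dependence only shows up when you look at the pair jointly.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, 5_000)
y = 1.0 - x

# A marginal test such as Kolmogorov-Smirnov looks at one variable
# at a time: both X and Y are typically consistent with U(0, 1).
print(stats.kstest(x, "uniform"))  # usually a large p-value
print(stats.kstest(y, "uniform"))  # usually a large p-value

# Dependence lives in the joint distribution, which the marginals
# cannot reveal: here the joint support is just the line y = 1 - x.
```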
 
karlihnos said:
Hi guys,

I have this doubt but I am not sure: if I have a uniform distribution, can I conclude that the events or random variables are independent?

Thank you

A test that will definitely tell you two variables are dependent is checking whether E[XY] ≠ E[X]E[Y] for two variables X and Y.

The converse is not true though, funnily enough: you can have E[XY] = E[X]E[Y] and still have dependent variables. When the equality does hold, it provides a kind of 'evidence' that they are independent, but it isn't conclusive.
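A short numerical sketch (again not from the thread; the pair X ~ U(-1, 1), Y = X² is a standard textbook example assumed here for illustration) shows both sides of chiro's point: the expectation test flags the dependent pair (X, 1 - X), but fails to flag a pair that is uncorrelated yet dependent.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Case 1: X ~ U(0, 1), Y = 1 - X.  E[XY] differs from E[X]E[Y],
# so the test correctly flags the pair as dependent.
x = rng.uniform(0.0, 1.0, n)
y = 1.0 - x
print(np.mean(x * y), np.mean(x) * np.mean(y))  # ~1/6 vs ~1/4

# Case 2: X ~ U(-1, 1), Y = X**2.  E[XY] = E[X^3] = 0 = E[X]E[Y],
# yet Y is a deterministic function of X, i.e. they are dependent.
x2 = rng.uniform(-1.0, 1.0, n)
y2 = x2 ** 2
print(np.mean(x2 * y2), np.mean(x2) * np.mean(y2))  # both ~0
```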
 
Thanks to all. So I think the best way to check for it is the test that chiro mentioned.
 