Joint probability for an infinite number of random variables

rmas
Hi,

I have the following question:
How do we estimate the joint probability Pr(X_1, \ldots, X_n) when n \rightarrow \infty?

Thanks a lot.
 
Take a step back. How would you do the problem for finite n?
 
Thank you for your reply.

Using the product rule?

For k=2, P(X_1, X_2)=P(X_2|X_1) \times P(X_1)
For k=3, P(X_1, X_2, X_3)=P(X_3|X_1, X_2) \times P(X_2|X_1) \times P(X_1)

\vdots

For k=n, P(X_1,... X_n)=P(X_n|X_1,... X_{n-1}) \times ... \times P(X_2|X_1)P(X_1)
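
Below is a minimal Python sketch of this chain-rule factorization for n = 3 binary variables; the conditional tables are made up purely for illustration.

from itertools import product

# Hypothetical conditional tables: P(X1), P(X2|X1), P(X3|X1,X2).
p_x1 = {0: 0.6, 1: 0.4}
p_x2_given_x1 = {0: {0: 0.7, 1: 0.3},          # P(X2=. | X1=0)
                 1: {0: 0.2, 1: 0.8}}          # P(X2=. | X1=1)
p_x3_given_x1_x2 = {(0, 0): {0: 0.9, 1: 0.1},  # P(X3=. | X1=0, X2=0)
                    (0, 1): {0: 0.5, 1: 0.5},
                    (1, 0): {0: 0.4, 1: 0.6},
                    (1, 1): {0: 0.1, 1: 0.9}}

def joint(x1, x2, x3):
    # Chain rule: P(X1, X2, X3) = P(X3|X1, X2) * P(X2|X1) * P(X1)
    return p_x3_given_x1_x2[(x1, x2)][x3] * p_x2_given_x1[x1][x2] * p_x1[x1]

print(joint(1, 0, 1))  # P(X1=1, X2=0, X3=1) = 0.6 * 0.2 * 0.4 = 0.048
print(sum(joint(a, b, c) for a, b, c in product([0, 1], repeat=3)))  # sums to 1.0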
 
rmas said:
Hi,

I have the following question:
How do we estimate the joint probability Pr(X_1, \ldots, X_n) when n \rightarrow \infty?

Thanks a lot.

Hey rmas and welcome to the forums.

For your question, are you looking for a general formula or do you have a particular distribution (or distributions) in mind and want to calculate an actual (or estimated) value for some realization of your variables?

I'm guessing you have a distribution in mind, but correct me if I am wrong.

Also, what are the properties of the distribution? Are the variables completely independent? First-order conditionally independent (think Markovian)? Higher-order conditionally independent?

In other words, what other constraints do you have that would help you simplify the problem as far as it can be simplified?
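
For example, under a first-order Markov assumption (each X_i depends only on X_{i-1}), the chain-rule product above collapses to

P(X_1, \ldots, X_n) = P(X_1) \prod_{i=2}^{n} P(X_i | X_{i-1})

so only the one-step conditionals are ever needed, no matter how large n gets; that is the kind of simplification such constraints buy.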
 
Hi :smile:

Thanks a lot!

Let me make some assumptions about the dependencies among the variables. Let's say there are k dependent and n-k independent variables.
I express the dependence through the function p (think of p(X_j) as the parents of X_j if we view the dependence graphically).

P(X_1, \ldots, X_n) = P(X_1) \times P(X_2) \times \cdots \times P(X_i) \times \underbrace{\prod_{j=i+1}^{i+k} P(X_j | p(X_j))}_{\textrm{the } k \textrm{ dependent variables}} \times P(X_{i+k+1}) \times \cdots \times P(X_{n-1}) \times P(X_{n \rightarrow \infty})

I don't really have a particular distribution in mind, but I am wondering whether it is possible to find a general relation between P(X_1, \ldots, X_n), the k dependent variables, and the n-k independent variables.
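
As a toy numerical check of this kind of factorization (all variables binary; the parent map and probability tables below are invented purely for illustration):

from itertools import product

n = 4
parents = {1: (), 2: (), 3: (1,), 4: (1, 3)}   # X1, X2 independent; X3, X4 have parents

# P(X_j = 1 | parent values); keys are tuples of the parents' values.
tables = {
    1: {(): 0.3},
    2: {(): 0.6},
    3: {(0,): 0.2, (1,): 0.7},
    4: {(0, 0): 0.1, (0, 1): 0.5, (1, 0): 0.4, (1, 1): 0.9},
}

def joint(x):
    # x maps variable index -> 0/1.
    # Joint = product of P(X_j) for the independent variables
    #         times product of P(X_j | p(X_j)) for the dependent ones.
    prob = 1.0
    for j in range(1, n + 1):
        pa_vals = tuple(x[k] for k in parents[j])
        p1 = tables[j][pa_vals]
        prob *= p1 if x[j] == 1 else 1.0 - p1
    return prob

# Sanity check: summing over all 2^n assignments gives 1.
print(sum(joint(dict(zip(range(1, n + 1), vals)))
          for vals in product([0, 1], repeat=n)))

Multiplying the marginals of the independent variables with the conditional tables of the dependent ones always gives a valid joint, which the sanity check confirms.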

(and please, correct me if I am wrong)

Thanks.
 
rmas said:
Hi :smile:

Thanks a lot!

Let me make some assumptions about the dependencies among the variables. Let's say there are k dependent and n-k independent variables.
I express the dependence through the function p (think of p(X_j) as the parents of X_j if we view the dependence graphically).

P(X_1, \ldots, X_n) = P(X_1) \times P(X_2) \times \cdots \times P(X_i) \times \underbrace{\prod_{j=i+1}^{i+k} P(X_j | p(X_j))}_{\textrm{the } k \textrm{ dependent variables}} \times P(X_{i+k+1}) \times \cdots \times P(X_{n-1}) \times P(X_{n \rightarrow \infty})

I don't really have a particular distribution in mind, but I am wondering whether it is possible to find a general relation between P(X_1, \ldots, X_n), the k dependent variables, and the n-k independent variables.

(and please, correct me if I am wrong)

Thanks.

What kind of relation are you looking for?

Is it like some kind of bound (inequality) of some sort?

The constraints you've given are pretty broad. Based on what you have said, I can't think of a relation that would be useful; the setup is too general to yield any useful properties.

But I guess if you wanted to analyze the two systems, and you had more constraints, you could use them to derive bounds of some sort.

My suggestion (and it is just a suggestion) is to take your model and start off with a toy version that has a lot of constraints. Use that as your first model to investigate.

Slowly peel off constraints, or at least relax them, to make your model broader but still manageable. As you move to broader representations, use the findings from the more constrained models to learn something about the broader ones.

Apart from this, I can't really help you, but good luck!
 
Namaste & G'day Postulate: A strongly-knit team wins on average over a less knit one Fundamentals: - Two teams face off with 4 players each - A polo team consists of players that each have assigned to them a measure of their ability (called a "Handicap" - 10 is highest, -2 lowest) I attempted to measure close-knitness of a team in terms of standard deviation (SD) of handicaps of the players. Failure: It turns out that, more often than, a team with a higher SD wins. In my language, that...
Hi all, I've been a roulette player for more than 10 years (although I took time off here and there) and it's only now that I'm trying to understand the physics of the game. Basically my strategy in roulette is to divide the wheel roughly into two halves (let's call them A and B). My theory is that in roulette there will invariably be variance. In other words, if A comes up 5 times in a row, B will be due to come up soon. However I have been proven wrong many times, and I have seen some...
Back
Top