Joint probability for an infinite number of random variables

In summary, the thread discusses estimating the joint probability of an infinite number of random variables and accounting for dependencies among them. The original poster assumes a split into dependent and independent variables, expresses the dependence through a function, and asks for a general relation between the joint probability, the dependent variables, and the independent variables. The other poster suggests starting with a heavily constrained toy model and gradually relaxing the constraints to derive bounds for broader representations.
  • #1
rmas
Hi,

I have the following question:
How do we estimate the joint probability [itex]Pr(X_1, ... X_n)[/itex] when [itex]n \rightarrow \infty[/itex] ?

Thanks a lot.
 
  • #2
Take a step back. How would you do the problem for finite n?
 
  • #3
Thank you for your reply.

Using the product rule ?

For k=2, [itex]P(X_1, X_2)=P(X_2|X_1) \times P(X_1)[/itex]
For k=3, [itex]P(X_1, X_2, X_3)=P(X_3|X_1, X_2) \times P(X_2|X_1) \times P(X_1)[/itex]

[itex]\vdots[/itex]

For k=n, [itex]P(X_1,... X_n)=P(X_n|X_1,... X_{n-1}) \times ... \times P(X_2|X_1)P(X_1)[/itex]
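For finite n, that chain-rule factorization can be evaluated directly once the conditionals are known. As a minimal sketch (assuming, purely for illustration, a two-state Markov chain, so each conditional collapses to [itex]P(X_k|X_{k-1})[/itex]):

```python
# Chain rule: P(X_1, ..., X_n) = P(X_1) * prod_k P(X_k | X_1, ..., X_{k-1}).
# Illustrative assumption: a 2-state Markov chain, so each conditional
# reduces to P(X_k | X_{k-1}). All numbers below are made up.

def joint_probability(xs, p_first, trans):
    """Joint probability of the realization xs = [x_1, ..., x_n].

    p_first: {state: P(X_1 = state)}
    trans:   {(prev, cur): P(X_k = cur | X_{k-1} = prev)}
    """
    prob = p_first[xs[0]]
    for prev, cur in zip(xs, xs[1:]):
        prob *= trans[(prev, cur)]
    return prob

p_first = {0: 0.5, 1: 0.5}
trans = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}
print(joint_probability([0, 0, 1, 1], p_first, trans))  # 0.5 * 0.9 * 0.1 * 0.8 = 0.036
```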
 
  • #4
rmas said:
Hi,

I have the following question :
How do we estimate the joint probability [itex]Pr(X_1, ... X_n)[/itex] when [itex]n \rightarrow \infty[/itex] ?

Thanks a lot.

Hey rmas and welcome to the forums.

For your question, are you looking for a general formula, or do you have a particular distribution (or distributions) in mind and want to calculate an actual (or estimated) value for some realization of your variables?

I'm guessing you have a distribution in mind, but correct me if I'm wrong.

Also, what are the properties of the distribution? Are the variables completely independent? First-order conditionally independent (think Markovian)? Higher-order conditionally independent?

In other words, what other constraints do you have that will help you simplify the problem as much as it can be simplified?
 
  • #5
Hi :smile:

Thanks a lot !

Let me make some assumptions about the dependence among the variables. Let's say that there are [itex]k[/itex] dependent and [itex]n-k[/itex] independent variables.
I express the dependence through a function [itex]p[/itex], where [itex]p(X_j)[/itex] denotes the variables that [itex]X_j[/itex] depends on (if we think about it graphically).

[itex]P(X_1, \ldots, X_n) = P(X_1) \times P(X_2) \times \cdots \times P(X_i) \times \underbrace{\prod_{j=i+1}^{i+k} P(X_j \mid p(X_j))}_{\textrm{the } k \textrm{ dependent variables}} \times P(X_{i+k+1}) \times \cdots \times P(X_{n-1}) \times P(X_{n \rightarrow \infty})[/itex]

I don't really have a particular distribution in mind, but I am wondering whether it is possible to find a general relation between [itex]P(X_1, \ldots, X_n)[/itex], the [itex]k[/itex] dependent variables, and the [itex]n-k[/itex] independent variables?

(and please, correct me if I am wrong)

Thanks.
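Under these assumptions, the factorization above can be evaluated mechanically: multiply the marginals of the independent variables and, for each of the [itex]k[/itex] dependent variables, a conditional given its parent [itex]p(X_j)[/itex]. A minimal sketch, where all names and numbers are hypothetical and each dependent variable has a single parent:

```python
# Factorization: marginals for the independent variables, and for each
# dependent variable a conditional given its single parent p(X_j).
# All distributions below are made-up illustrative numbers.

def factored_joint(xs, marginal, parents, conditional):
    """xs: {name: value}.
    marginal:    {name: {value: prob}} for the independent variables.
    parents:     {name: parent_name} for the dependent variables.
    conditional: {(name, value, parent_value): prob}."""
    prob = 1.0
    for name, value in xs.items():
        if name in parents:
            prob *= conditional[(name, value, xs[parents[name]])]
        else:
            prob *= marginal[name][value]
    return prob

marginal = {"X1": {0: 0.5, 1: 0.5}, "X2": {0: 0.4, 1: 0.6}}
parents = {"X3": "X2"}                      # X3 is the lone dependent variable
conditional = {("X3", 1, 1): 0.7, ("X3", 0, 1): 0.3,
               ("X3", 1, 0): 0.2, ("X3", 0, 0): 0.8}
print(factored_joint({"X1": 1, "X2": 1, "X3": 1},
                     marginal, parents, conditional))  # 0.5 * 0.6 * 0.7 = 0.21
```

The dictionary lookup `parents[name]` plays the role of the graphical function p: it maps each dependent variable to the variable it is conditioned on.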
 
  • #6
rmas said:
Hi :smile:

Thanks a lot !

Let me make some assumptions about the existing dependency among the variables. Let's say that there are [itex]k[/itex] dependent and [itex]n-k[/itex] independent variables.
I expressed the relation of dependence through the function p (if we think about it graphically).

[itex]P(X_1, \ldots, X_n) = P(X_1) \times P(X_2) \times \cdots \times P(X_i) \times \underbrace{\prod_{j=i+1}^{i+k} P(X_j \mid p(X_j))}_{\textrm{the } k \textrm{ dependent variables}} \times P(X_{i+k+1}) \times \cdots \times P(X_{n-1}) \times P(X_{n \rightarrow \infty})[/itex]

I don't really have a particular distribution but I am wondering whether it is possible to find a general relation between the [itex]P(X_1,... X_n)[/itex], the [itex]k[/itex] dependent variables and the [itex]n-k[/itex] independent variables ?

(and please, correct me if I am wrong)

Thanks.

What kind of relation are you looking for?

Is it like some kind of bound (inequality) of some sort?

The constraints you've given are pretty broad. Based on what you have said, I can't think of a relation that would be useful; the system is too unconstrained to derive any useful properties.

But if you wanted to analyze the two groups of variables and had more constraints, you could use those constraints to derive bounds of some sort.

My suggestion (and this is just a suggestion) is to start with a toy version of your model that has a lot of constraints, and use that as your first model to investigate.

Slowly peel off the constraints, or at least relax them, to make your model broader but still manageable. As you move to broader representations, use the findings from the more constrained models to figure out something about the broader ones.

Apart from this, I can't really help you, but good luck!
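In the spirit of that toy-model suggestion, a maximally constrained starting point is n i.i.d. Bernoulli(p) variables: the joint probability of any fixed realization is [itex]p^k (1-p)^{n-k}[/itex], which vanishes as [itex]n \rightarrow \infty[/itex], so one typically tracks the log-probability, which simply grows linearly in n. A sketch under that assumption:

```python
import math

# Toy model: n i.i.d. Bernoulli(p) variables. The joint probability of any
# fixed realization shrinks geometrically in n, so we work in log space.

def log_joint_iid(xs, p):
    """Log joint probability of a 0/1 sequence under i.i.d. Bernoulli(p)."""
    k = sum(xs)
    return k * math.log(p) + (len(xs) - k) * math.log(1 - p)

for n in (10, 100, 1000):
    xs = [1] * (n // 2) + [0] * (n - n // 2)
    print(n, log_joint_iid(xs, 0.5))   # for p = 0.5 this is exactly -n * log(2)
```

Relaxing the i.i.d. constraint to, say, a Markov chain changes the per-step factor but not the qualitative picture: the raw joint probability still goes to zero, and the log-probability per variable is the quantity with a well-defined limit.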
 

Related to Joint probability for an infinite number of random variables

1. What is joint probability for an infinite number of random variables?

Joint probability for an infinite number of random variables is the probability of multiple events occurring simultaneously, where the events are described by an infinite number of random variables. It is a measure of how likely it is for all the variables to take on specific values at the same time.

2. How is joint probability calculated for an infinite number of random variables?

Joint probability is calculated using the chain rule, which factors the joint distribution into a product of conditional probabilities; when the random variables are independent, this reduces to multiplying the marginal probability of each variable. These probabilities come from the joint probability distribution, which assigns probabilities to combinations of values of the random variables.
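As a minimal numeric sketch of the product form (exact when the variables are independent; the numbers are illustrative):

```python
# Independent case: the joint probability is the product of the marginals.
probs = [0.5, 0.3, 0.2]   # P(X_1 = x_1), P(X_2 = x_2), P(X_3 = x_3), assumed independent
joint = 1.0
for p in probs:
    joint *= p
print(joint)  # 0.5 * 0.3 * 0.2 = 0.03
```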

3. What is the difference between joint probability and conditional probability?

Joint probability considers the probability of multiple events occurring simultaneously, while conditional probability considers the probability of an event occurring given that another event has already occurred. In other words, joint probability looks at the probability of all events happening together, while conditional probability looks at the probability of one event happening given that another has already occurred.
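A small worked sketch of the distinction, drawing two cards from a standard 52-card deck without replacement, with A = "first card is an ace" and B = "second card is an ace":

```python
# Conditional vs. joint probability for two draws without replacement.
p_a = 4 / 52                 # P(A): first card is an ace
p_b_given_a = 3 / 51         # P(B | A): second ace, given the first was an ace
p_joint = p_a * p_b_given_a  # P(A and B) = P(A) * P(B | A)
print(p_b_given_a, p_joint)  # P(B | A) = 3/51, while P(A and B) = 1/221
```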

4. Can joint probability be used for dependent variables?

Yes, joint probability can be used for dependent variables. However, the calculation is more complex for dependent variables and requires additional information, such as the conditional dependence structure between the variables (correlation alone is generally not enough).

5. How is joint probability used in real-world applications?

Joint probability is used in many real-world applications, including in the fields of statistics, finance, and engineering. It is used to calculate the likelihood of multiple events occurring simultaneously, which can help in making decisions and predictions. For example, in finance, joint probability can be used to assess the risk of different investments by considering the probability of multiple economic events happening at the same time.
