Joint probability mass function

In summary, the thread is about reading the notation for a joint probability mass function in a two-coin problem. The notation involves the sample space Ω, the encoding 0 = Tails and 1 = Heads, and P_X,Y(x,y), with different probabilities assigned to different outcome pairs. The original poster clarifies that X = 0 if the outcome is TT and X = 1 if the outcome is HH, HT, or TH, while Y = 0 if the outcome is HH and Y = 1 if the outcome is HT, TT, or TH. The conversation ends with a suggestion to use LaTeX to write the notation correctly.
  • #1
zyQuzA0e5esy2y
I'm having difficulties reading this, could someone please explain to me?

2 coins
Ω = {HH, HT, TH, TT}
0 = Tails, 1 = Heads

P_X,Y(x,y) = P({TT} ∩ {HH}) for (x,y) = (0,0)
             P({HH, HT, TH} ∩ {HH}) for (x,y) = (1,0)
             P({TT} ∩ {TT, TH, HT}) for (x,y) = (0,1)
             P({HH, HT, TH} ∩ {TH, HT, TT}) for (x,y) = (1,1)
             0 otherwise

P_X,Y(x,y) = P(∅) for (x,y) = (0,0)
             P({HH}) for (x,y) = (1,0)
             P({TT}) for (x,y) = (0,1)
             P({TH, HT}) for (x,y) = (1,1)
             0 otherwise

P_X,Y(x,y) = 0.25 for (x,y) = (1,0) or (0,1)
             0.50 for (x,y) = (1,1)
             0 otherwise

That's basically the layout I was given. I'm not sure how to read "for (x,y) = (0,0)" and the others similar to it.

Solved it, thank you. If anyone is curious, I forgot to mention that X = 0 if the outcome is TT, and X = 1 if the outcome is HH, HT, or TH.
Y = 0 if the outcome is HH.
Y = 1 if the outcome is HT, TT, or TH.
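The definitions above can be checked numerically. The following is a minimal Python sketch (not part of the original thread) that enumerates the sample space for two fair coins, applies the stated definitions of X and Y, and tallies the joint pmf; the function and variable names are my own illustration, not from the problem:

```python
from itertools import product
from collections import Counter

# Sample space for two fair coin flips; each outcome has probability 1/4.
omega = ["".join(p) for p in product("HT", repeat=2)]  # ['HH', 'HT', 'TH', 'TT']

def X(outcome):
    # X = 0 only for TT; X = 1 for HH, HT, TH (at least one head).
    return 0 if outcome == "TT" else 1

def Y(outcome):
    # Y = 0 only for HH; Y = 1 for HT, TH, TT (at least one tail).
    return 0 if outcome == "HH" else 1

pmf = Counter()
for outcome in omega:
    pmf[(X(outcome), Y(outcome))] += 0.25

print(pmf[(0, 0)], pmf[(1, 0)], pmf[(0, 1)], pmf[(1, 1)])
# prints: 0 0.25 0.25 0.5
```

This reproduces the values in the final display: 0.25 for (1,0) and (0,1), 0.50 for (1,1), and 0 for (0,0), since no outcome is simultaneously TT and HH.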
 
  • #2
If your question is about how to read notation, the way you have presented the notation isn't coherent. For example, I think you are using "n" to mean the intersection symbol [itex] \cap [/itex].

If this notation comes from a problem, I suggest you post the problem in the homework section and quote the entire problem exactly.

Perhaps you can use LaTeX: https://www.physicsforums.com/help/latexhelp/
 

What is a joint probability mass function?

A joint probability mass function is a statistical concept that describes the likelihood of two or more discrete random variables occurring simultaneously. It maps the possible outcomes of each variable and assigns a probability to each combination of outcomes.

Why is the joint probability mass function important?

The joint probability mass function is important because it allows us to understand the relationship between multiple random variables and their outcomes. It can be used to calculate the probabilities of various events and make predictions about future outcomes.

How is the joint probability mass function different from a marginal probability mass function?

The joint probability mass function considers the probabilities of multiple variables occurring together, while a marginal probability mass function only looks at the probability of one variable occurring. The joint probability mass function takes into account the relationship between variables, while the marginal probability mass function does not.
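As a concrete illustration of this relationship (my own sketch, not from the thread): a marginal pmf is obtained from a joint pmf by summing over the values of the other variable. The joint values below are taken from the coin example above:

```python
# Joint pmf from the two-coin example: keys are (x, y) pairs.
joint = {(0, 0): 0.0, (1, 0): 0.25, (0, 1): 0.25, (1, 1): 0.5}

def marginal_X(x):
    # Sum the joint pmf over all y values: P(X = x) = sum_y P(X = x, Y = y).
    return sum(p for (xv, yv), p in joint.items() if xv == x)

def marginal_Y(y):
    # Sum the joint pmf over all x values: P(Y = y) = sum_x P(X = x, Y = y).
    return sum(p for (xv, yv), p in joint.items() if yv == y)

print(marginal_X(0), marginal_X(1))  # prints: 0.25 0.75
print(marginal_Y(0), marginal_Y(1))  # prints: 0.25 0.75
```

Note that the marginals alone do not determine the joint pmf; the joint function carries the extra information about how X and Y vary together.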

What is the difference between a joint probability mass function and a joint probability distribution?

Both concepts describe the behavior of multiple random variables together, but a joint probability mass function applies only to discrete variables; for continuous variables the analogous object is a joint probability density function. "Joint probability distribution" is the umbrella term covering both cases. A joint probability mass function assigns a probability directly to each specific combination of outcomes, while a joint density must be integrated over a region of outcomes to yield a probability.

How is the joint probability mass function used in real-world applications?

The joint probability mass function can be used in a variety of real-world applications, such as risk assessment, market analysis, and quality control. It can also be used in fields such as genetics, where multiple variables may affect the occurrence of a particular trait or disease.
