How Do You Calculate the Probability of Wanting to Watch a Football Game?

  • Thread starter: Tajeshwar
  • Tags: Joint Probability
AI Thread Summary
The discussion revolves around calculating the probability of wanting to watch a football game based on given joint probabilities. Participants share methods for organizing the data, such as using probability tables and trees to clarify relationships between events. The main challenge is determining the missing probabilities needed to compute the marginal probability of interest. There is a focus on understanding concepts like joint probability and conditional probability, with references to Bayes' rule for further clarity. Overall, the thread emphasizes the importance of structured approaches in probability calculations.
Tajeshwar

Homework Statement

I am given the following three joint probabilities:
P(I am leaving work early, there is a football game that I want to watch this afternoon) = 0.1
P(I am leaving work early, there is not a football game that I want to watch this afternoon) = 0.05
P(I am not leaving work early, there is not a football game that I want to watch this afternoon) = 0.65
What is the probability that there is a football game that I want to watch this afternoon?

Homework Equations

The Attempt at a Solution

I started out by writing out these cases like P(E,F), P(E,~F), P(~E,~F), where E is leaving work early and F is that there is a game I want to watch, and I also made the marginal probability table. I was thinking I would be looking for the marginal probability P(F), which would be P(E,F) + P(~E,F), but I could not find the value for P(~E,F).
 
Tajeshwar said:
I started out by writing out these cases like P(E,F) P(E,~F) P(~E,~F)
What do you mean by "writing out the cases"? Did you write Bayes' rule?

I think the general idea of this problem is to get two simultaneous equations in the variables P(E) and P(F).

A useful pattern is

##P(A\mid B) = v = P(A,B)/P(B)##, which implies ##P(A,B) = v\,P(B)##.

##P(A\mid \lnot B) = w = P(A,\lnot B)/P(\lnot B)##, which implies ##P(A,\lnot B) = w\,P(\lnot B)##.

Then ##P(A,B) + P(A,\lnot B) = P(A) = v\,P(B) + w\,P(\lnot B) = v\,P(B) + w(1 - P(B))##.

So we get a linear equation in P(A) and P(B), namely ##P(A) = v\,P(B) + w(1 - P(B))##.
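
As a purely illustrative instance of that pattern (the numbers here are made up, not taken from the problem): suppose ##v = P(A\mid B) = 0.5##, ##w = P(A\mid \lnot B) = 0.2## and ##P(B) = 0.4##. Then
$$P(A) = v\,P(B) + w\,(1 - P(B)) = 0.5(0.4) + 0.2(0.6) = 0.20 + 0.12 = 0.32.$$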
 
Tajeshwar said:
I am given the following three joint probabilities:
P(I am leaving work early, there is a football game that I want to watch this afternoon) = 0.1
P(I am leaving work early, there is not a football game that I want to watch this afternoon) = 0.05
P(I am not leaving work early, there is not a football game that I want to watch this afternoon) = 0.65
What is the probability that there is a football game that I want to watch this afternoon?

I started out by writing out these cases like P(E,F), P(E,~F), P(~E,~F), where E is leaving work early and F is that there is a game I want to watch, and I also made the marginal probability table. I was thinking I would be looking for the marginal probability P(F), which would be P(E,F) + P(~E,F), but I could not find the value for P(~E,F).

I would (almost always) do a probability tree for this sort of problem. It clarifies what is going on and cuts through all these words!
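
One way the leaves of such a tree could be laid out for this problem (branching first on leaving early, with ##P(E) = 0.1 + 0.05 = 0.15## following from the given joint probabilities; the question marks are the entries still to be found):
$$\begin{array}{llc}
E \;(0.15) & \to F & 0.10 \\
 & \to \lnot F & 0.05 \\
\lnot E \;(?) & \to F & ? \\
 & \to \lnot F & 0.65
\end{array}$$
The four leaf probabilities together cover every possibility.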
 
Tajeshwar said:
I am given the following three joint probabilities:
P(I am leaving work early, there is a football game that I want to watch this afternoon) = 0.1
P(I am leaving work early, there is not a football game that I want to watch this afternoon) = 0.05
P(I am not leaving work early, there is not a football game that I want to watch this afternoon) = 0.65
What is the probability that there is a football game that I want to watch this afternoon?

I started out by writing out these cases like P(E,F), P(E,~F), P(~E,~F), where E is leaving work early and F is that there is a game I want to watch, and I also made the marginal probability table. I was thinking I would be looking for the marginal probability P(F), which would be P(E,F) + P(~E,F), but I could not find the value for P(~E,F).

You could make a little table:
$$\begin{array}{r|cc|c}
& \text{Game} & \text{No game} & \text{Total} \\ \hline
\text{Early} & 0.1 & 0.05 & 0.15 \\
\text{Not early} & --- & 0.65 & --- \\ \hline
\text{Total} & --- & 0.70 & ---
\end{array}
$$
Can you fill in the missing entries? After that, what entry or combinations of entries will you need in order to solve your problem?
 
Ray Vickson said:
You could make a little table:
$$\begin{array}{r|cc|c}
& \text{Game} & \text{No game} & \text{Total} \\ \hline
\text{Early} & 0.1 & 0.05 & 0.15 \\
\text{Not early} & --- & 0.65 & --- \\ \hline
\text{Total} & --- & 0.70 & ---
\end{array}
$$
Can you fill in the missing entries? After that, what entry or combinations of entries will you need in order to solve your problem?

Thank you so much. This was really helpful. Could you please also let me know how to build that table in this thread? Is there a tool here that helps with that?

Since I don't know how to make that table yet, I'll just write it out.

P(E,F) = 0.1
P(~E,F) = 0.2
P(E,~F) = 0.05
P(~E,~F) = 0.65

The total for F (there is a football game) will be 0.3.

However, I am not sure why the total of all four probabilities in this table should be equal to 1. That is how I calculated the missing one.

Any guidance on that would be appreciated.

Thank you!
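
For reference, here is the table format from the earlier post filled in with those numbers:
$$\begin{array}{r|cc|c}
& \text{Game} & \text{No game} & \text{Total} \\ \hline
\text{Early} & 0.10 & 0.05 & 0.15 \\
\text{Not early} & 0.20 & 0.65 & 0.85 \\ \hline
\text{Total} & 0.30 & 0.70 & 1.00
\end{array}$$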
 
PeroK said:
I would (almost always) do a probability tree for this sort of problem. It clarifies what is going on and cuts through all these words!

Thank you. Yes, I just did that and it is much easier. I'm not sure how I can post a tree here on this forum. I'd love to know if you use any tools for that sort of thing regularly.
 
Stephen Tashi said:
What do you mean by "writing out the cases"? Did you write Bayes' rule?

I think the general idea of this problem is to get two simultaneous equations in the variables P(E) and P(F).

A useful pattern is

##P(A\mid B) = v = P(A,B)/P(B)##, which implies ##P(A,B) = v\,P(B)##.

##P(A\mid \lnot B) = w = P(A,\lnot B)/P(\lnot B)##, which implies ##P(A,\lnot B) = w\,P(\lnot B)##.

Then ##P(A,B) + P(A,\lnot B) = P(A) = v\,P(B) + w\,P(\lnot B) = v\,P(B) + w(1 - P(B))##.

So we get a linear equation in P(A) and P(B), namely ##P(A) = v\,P(B) + w(1 - P(B))##.

Thank you for your response, but I am not very comfortable with these concepts yet.

P(A|B) is the probability of event A occurring given that B has taken place, right?

Then what is P(A,B)?

Sorry, I have not yet studied Bayes' rule. Is there a good tutorial you could recommend? Thank you.
 
Tajeshwar said:
Thank you so much. This was really helpful. Could you please also let me know how to build that table in this thread? Is there a tool here that helps with that?

Since I don't know how to make that table yet, I'll just write it out.

P(E,F) = 0.1
P(~E,F) = 0.2
P(E,~F) = 0.05
P(~E,~F) = 0.65

The total for F (there is a football game) will be 0.3.

However, I am not sure why the total of all four probabilities in this table should be equal to 1. That is how I calculated the missing one.

Any guidance on that would be appreciated.

Thank you!

When you asked how I built the table, you could be asking one of two possible things.
(1) How did you figure out what numbers to put in the various parts of the table?
or
(2) How did you type out the table so it looks nice?

Well, for (1): I just used the data directly given in the problem.

For (2): I used the LaTeX facility that is built into this Forum, typesetting the table as an "array" inside a displayed equation. To see the commands used, just right-click on the image and ask for "display as TeX commands". Another respondent has already given you a link to a LaTeX tutorial for this Forum.

Why should all the entries sum to 1? Well, there are only four possible "conditions" in this problem, and together they give all the possibilities (that is, everything). Since ##\Pr \{ \text{everything} \} = 1##, the four probabilities must add up to 1. That is just basic Probability 101.
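
Written out with the numbers from this problem, that normalization pins down the missing entry, and with it the answer:
$$P(\lnot E, F) = 1 - (0.1 + 0.05 + 0.65) = 0.2, \qquad P(F) = P(E,F) + P(\lnot E, F) = 0.1 + 0.2 = 0.3.$$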
 
Tajeshwar said:
Thank you for your response, but I am not very comfortable with these concepts yet.

P(A|B) is the probability of event A occurring given that B has taken place, right?

Then what is P(A,B)?

Sorry, I have not yet studied Bayes' rule. Is there a good tutorial you could recommend? Thank you.

##P(A|B)## is generally the probability of event A given event B. There's no sense that B must happen before A.

For example, suppose you throw a die twice. Let A be the event that the first throw is a 6 and B be the event that the total is 10. Then:

##P(A|B) = 1/3##

In this case, event A always comes before event B, but you can still look at ##P(A|B)##.

You can calculate this in at least three ways. First, use the probability tree. Second, draw a Venn diagram. Third, use the formula:

##P(A|B) = \frac{P(A \cap B)}{P(B)}##

What is happening is that first you are restricting your sample space to only that where you have event B. Then, you are taking the probability of event A within that restricted sample space.

If you use the tree or the Venn diagram, you should be able to justify the above formula. And, it's a good way to recreate the formula if you forget it.
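
As a quick check of the die example with that formula: there are three equally likely ordered pairs that total 10, namely (4,6), (5,5) and (6,4), and exactly one of them starts with a 6, so
$$P(B) = \tfrac{3}{36}, \qquad P(A\cap B) = \tfrac{1}{36}, \qquad P(A|B) = \frac{P(A\cap B)}{P(B)} = \frac{1/36}{3/36} = \frac{1}{3}.$$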
 
Tajeshwar said:
P(A|B) is the probability of event A occurring given that B has taken place, right?

Then what is P(A,B)?

In the notation you used, P(A,B) is the "probability of A and B" or "the probability that both A and B occur" or, if you think in terms of sets A and B (instead of statements A and B), it denotes the probability of A intersection B.

A more common notation for that concept would be ##P(A\land B)## or ##P(A \cap B)##.

To illustrate the distinction between "probability of A given B" versus "Probability of A and B", consider the example:
A = I will call my car insurance company today
B = I will have a traffic accident today

If you drive carefully ##P(A \land B)## is small. However ##P(A|B)## is large.

##P(A|B)## is defined to be ##P(A\land B)/ P(B)##. This is the definition of conditional probability; rearranging it in terms of ##P(B|A)## is what leads to Bayes' rule.

A confusing thing about distinguishing ##P(A \land B)## from ##P(A|B)## is that both notations refer to the same events - namely that A and B both occur. In terms of sets, both notations refer to the set ##A \cap B##. The two notations differ with respect to the "sample space", or total collection of events, that is being considered. ##P(A \land B)## and ##P(A \cap B)## refer to assigning probability to ##A \cap B## in some context where ##B## may or may not occur. ##P(A|B)## refers to assigning probability to ##A \cap B## in a narrower context where we are certain that ##B## occurs.
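
To put rough, made-up numbers on that insurance example (purely for illustration): if ##P(B) = 0.001## and ##P(A|B) = 0.9##, then
$$P(A \land B) = P(A|B)\,P(B) = 0.9 \times 0.001 = 0.0009,$$
so the joint probability is small even though the conditional probability is large.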
 