Simple Probability Question (HELP)

  • Thread starter Beowulf2007
  • Tags
    Probability
In summary: In (1), is it something to do with the events being mutually exclusive? If P(X) = 1, then X is the entire sample space, right? In (1), since P(T) = P(U) = 0, what can you say about T and U? In (2), since P(T) = P(U) = 1, what can you say about T and U? You need to use the axioms of probability, not intuition.
  • #1
Beowulf2007

Homework Statement



Given that [tex]P(T) = P(U) = 0[/tex], show that [tex]P(T \cup U) = 0[/tex].

Given that [tex]P(T) = P(U) = 1[/tex], show that [tex]P(T \cap U) = 1[/tex].


Homework Equations



I am told that I need to use the following equations.

(1) [tex]P(T \cup U) = P(T) + P(U)[/tex] if [tex]P(T \cap U) = 0[/tex]

(2) [tex]P(T \cup U) = P(T) + P(U) - P(T \cup U)[/tex]

The Attempt at a Solution



My Solution for question one.

Since we know that [tex]P(T) = P(U) = 0[/tex], then by using equation (1)

we get that [tex]P(T \cup U) = P(T) + P(U) = 0 + 0 = 0[/tex]

My Solution for question two.

Here I have a feeling that I need to use equation (2), but how do I deduce
[tex]P(T \cup U)[/tex]?

Best Regards
Beowulf..
 
  • #2
Beowulf2007 said:
(2) [tex]P(T \cup U) = P(T) + P(U) - P(T \cup U)[/tex]

The last term should be [itex]P(T \cap U)[/itex].

Since we know that [tex]P(T) = P(U) = 0[/tex], then by using equation (1)

we get that [tex]P(T \cup U) = P(T) + P(U) = 0 + 0 = 0[/tex]

How do you know you can use equation (1)? What if [itex]P(T \cap U) \ne 0[/itex]?

Here I have a feeling that I need to use equation (2), but how do I deduce
[tex]P(T \cup U)[/tex]?

Why do you have a feeling? Describe your reasoning.
 
  • #3
e(ho0n3 said:
The last term should be [itex]P(T \cap U)[/itex].

Thanks ;)

How do you know to use equation (2)? What if [itex]P(T \cap U) \ne 0[/itex]?
Why do you have a feeling? Describe your reasoning.

From what I see, [itex]P(T \cap U)[/itex] cannot be larger than one, for obvious reasons: that would give a negative probability.

I am not sure about that. Here is another way of proving question (2):

Could I say: if C is the entire probability space, then since P(T) = P(U), I could say [tex] P(C) = P(U) + P(T) = 2 [/tex], and then [itex]P(T \cap U) = P(U) \cdot P(T) = 1[/itex]?
Then [itex]P(T \cup U) = P(C) - P(U) \cdot P(T) = 1[/itex].

Is my argument reasonable? If yes, can it be used in (1) too?

Best Regards

Beowulf
 
  • #4
Beowulf2007 said:
From what I see it [itex]P(T \cap U) [/itex] cannot be larger than one for obvious reasons. Because that would give a negative probability.

I will write [itex]T \cap U[/itex] as TU for short. If I understand you correctly, you're saying that P(TU) cannot be greater than 1 because if it is, it's negative?!

Could I say: if C is the entire probability space, then since P(T) = P(U), I could say [tex] P(C) = P(U) + P(T) = 2 [/tex], and then [itex]P(T \cap U) = P(U) \cdot P(T) = 1[/itex]?
Then [itex]P(T \cup U) = P(C) - P(U) \cdot P(T) = 1[/itex].

Is my argument reasonable? If yes, can it be used in (1) too?

By the axioms of probability, P(C) = 1, not 2. Note that T and U have the same probability. What does that say about C, U and T? Also, P(TU) = P(T)P(U) if T and U are independent events. The problem does not mention that T and U are independent.
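For reference, here is one possible route for (2) that needs no independence assumption at all, using only the corrected inclusion-exclusion formula and the fact that no probability exceeds 1:

[tex]P(T \cap U) = P(T) + P(U) - P(T \cup U) = 1 + 1 - P(T \cup U) \geq 2 - 1 = 1,[/tex]

and since [itex]P(T \cap U) \leq 1[/itex] as well, it follows that [itex]P(T \cap U) = 1[/itex].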
 
  • #5
Hello Again,

The whole problem is as follows: Let (S, E, P) be a probability space and let T and U be events (nothing is said about the events).

Show, that if

(1) [tex]P(T) = P(U) = 0[/tex] then [tex]P(T \cup U) = 0[/tex]

(2) [tex]P(T) = P(U) = 1[/tex] then [tex]P(T \cap U) = 1[/tex]

My Solution (1): [tex]P(T \cup U) = P(T) + P(U) = 0 + 0 = 0[/tex]

My Solution (2): [tex]P(T \cap U) = P(T) \cdot P(U) = 1 \cdot 1 = 1[/tex]

Could this be it? What else is there to add?

Sincerely Yours
Beowulf
 
  • #6
Beowulf2007 said:
My Solution (1): [tex]P(T \cup U) = P(T) + P(U) = 0 + 0 = 0[/tex]

This will only work if P(TU) = 0. Can you prove that?

My Solution (2): [tex]P(T \cap U) = P(T) \cdot P(U) = 1 \cdot 1 = 1[/tex]

This will only work if T and U are independent. Can you prove that?
 
  • #7
e(ho0n3 said:
This will only work if P(TU) = 0. Can you prove that?

This will only work if T and U are independent. Can you prove that?

Question(a)

This will only work if P(TU) = 0? By this, do you mean proving that TU is the empty set?

The only thing that I know here is that it is a consequence of the formula P(T U U) = P(T) + P(U).

Is that what you mean?

Question(b)

Since nothing is said about the events being independent, do I assume they are?

Finally, if these two can be proved, does that then complete the solution for (1) and (2)?

Many thanks in advance.

Sincerely Beowulf.
 
  • #8
Beowulf2007 said:
This will only work if P(TU) = 0? By this, do you mean proving that TU is the empty set?

The only thing that I know here is that it is a consequence of the formula P(T U U) = P(T) + P(U).

Is that what you mean?

TU is not necessarily empty. Consider the elements in TU individually and their probabilities when considered as events.

Since nothing is said about the events being independent, do I assume they are?

No. Forget about independence for the moment. If P(X) = 1 where X is some arbitrary subset of the sample space, what can you say about X?

Finally, if these two can be proved, does that then complete the solution for (1) and (2)?

All you have to do is show that your results follow from the axioms of probability.
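As an aside, one possible way to finish (1) using only the corrected equation (2) and the non-negativity axiom, sketched here under that assumption:

[tex]P(T \cup U) = P(T) + P(U) - P(T \cap U) = 0 + 0 - P(T \cap U) = -P(T \cap U).[/tex]

The left-hand side is non-negative and the right-hand side is non-positive, so both [itex]P(T \cup U)[/itex] and [itex]P(T \cap U)[/itex] must equal 0.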
 
  • #9
Regarding (1), do I consider the events to be mutually exclusive? And therefore if T occurs then U also occurs, and thus P(T u U) = P(T) + P(U)?

Or is the explanation simpler?

e(ho0n3 said:
TU is not necessarily empty. Consider the elements in TU individually and their probabilities when considered as events.



No. Forget about independence for the moment. If P(X) = 1 where X is some arbitrary subset of the sample space, what can you say about X?

Do you mean P's complement here?

All you have to do is show that your results follow from the axioms of probability.

In (2), is it something to do with the events being mutually inclusive?

SR

Beowulf
 
  • #10
You're going about it the wrong way. For problem (1) you're told that P(T) = P(U) = 0. What does this say about T and U? Ditto for problem (2).
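One way to read this hint: P(T) = 0 says that the complement [itex]T^c[/itex] has probability 1, and P(T) = 1 says that [itex]T^c[/itex] has probability 0. By De Morgan's law, [itex](T \cap U)^c = T^c \cup U^c[/itex], so once (1) is proved, (2) follows by applying (1) to the complements:

[tex]P(T \cap U) = 1 - P(T^c \cup U^c) = 1 - 0 = 1.[/tex]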
 
  • #11
e(ho0n3 said:
You're going about it the wrong way. For problem (1) you're told that P(T) = P(U) = 0. What does this say about T and U? Ditto for problem (2).

Just so we understand each other, do you mean:

[tex]T \subseteq U[/tex] and [tex]U \subseteq T[/tex]

Therefore their union is defined as

[tex]T \cup U = T + U[/tex]

Therefore their intersection is defined as

[tex]T \cap U = T \cdot U[/tex]

And then use this fact to claim that my result in (1) and (2) are true!

How does that sound?

BeoWulf
 

FAQ: Simple Probability Question (HELP)

What is simple probability and how is it calculated?

Simple probability is the likelihood of an event occurring. It is calculated by dividing the number of favorable outcomes by the total number of possible outcomes.
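For example, for a roll of a fair six-sided die, the probability of an even number is

[tex]P(\text{even}) = \frac{3}{6} = \frac{1}{2},[/tex]

since 3 of the 6 equally likely faces are favorable.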

What is the difference between theoretical and experimental probability?

Theoretical probability is based on the assumption that all outcomes are equally likely, while experimental probability is based on actual results from an experiment or observation.

How do you determine if events are independent or dependent?

Events are independent if the outcome of one event does not affect the outcome of the other event. Events are dependent if the outcome of one event does affect the outcome of the other event.
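Equivalently, events A and B are independent exactly when [tex]P(A \cap B) = P(A) \cdot P(B)[/tex]. For example, two fair coin tosses are independent, so the probability of two heads is [itex]\tfrac{1}{2} \cdot \tfrac{1}{2} = \tfrac{1}{4}[/itex].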

Can the probability of an event be greater than 1?

No, the probability of an event cannot be greater than 1. A probability of 1 means the event is certain to occur, while a probability of 0 means the event is impossible.

What is the difference between odds and probability?

Odds are a way of expressing the probability of an event occurring, while probability is a measure of the likelihood of an event occurring.
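For example, an event with probability [itex]\tfrac{1}{4}[/itex] has odds of 1:3 in favor, since the odds in favor of an event A are

[tex]\frac{P(A)}{1 - P(A)} = \frac{1/4}{3/4} = \frac{1}{3}.[/tex]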
