Probability P(A\B) = P(A) - P(B)

Thread starter: magicarpet512
Tags: Probability

Homework Statement


Let A \subseteq B \subseteq S where S is a sample space.
Show that P(A \setminus B) = P(A) - P(B).


Homework Equations



A \setminus B denotes set difference; P is a probability function on S.

The Attempt at a Solution


I have,
I have,
P(A \setminus B) = P(A \cap B^{c})
= P(A) - P(A \cap B)
= P(A) - [P(B) - P(A^{c} \cap B)]
= P(A) - P(B) + P(A^{c} \cap B)

It seems like I'm close, but I've spent a while trying to figure out how to get rid of the P(A^{c} \cap B).

Any insight anyone?
Thanks!
 
I think your inclusion is backwards: you mean B \subseteq A \subseteq S, right?

Try writing A as the union of two disjoint sets--this will give you P(A) in terms of something that can be rearranged into what you're trying to prove.
 
spamiam said:
I think your inclusion is backwards: you mean B \subseteq A \subseteq S, right?

Yes, thank you.

spamiam said:
Try writing A as the union of two disjoint sets--this will give you P(A) in terms of something that can be rearranged into what you're trying to prove.

I've tried that... unless I'm missing something?
A as the union of disjoint sets is A = (A \cap B) \cup (A \cap B^{c}).
So, P(A) = P(A \cap B) + P(A \cap B^{c}).
When I plug this in and do some rearranging, I just get right back to where I ended up in the original post.
 
magicarpet512 said:
I've tried that... unless I'm missing something?
A as the union of disjoint sets is A = (A \cap B) \cup (A \cap B^{c}).
So, P(A) = P(A \cap B) + P(A \cap B^{c}).
When I plug this in and do some rearranging, I just get right back to where I ended up in the original post.

Ah, I see: your calculations just went off in an unexpected direction after the third equality. Look at that step again: since B \subseteq A, what is A \cap B?
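Since that observation completes the argument, here is a quick numerical sanity check. The sample space, the two events, and the uniform measure below are hypothetical choices (not from the thread), picked so that B \subseteq A as in the corrected inclusion:

```python
# Sanity check of P(A \ B) = P(A) - P(B) on a small finite sample space.
# Hypothetical example: S, A, B, and the uniform measure are assumptions,
# chosen so that B is a subset of A.
from fractions import Fraction

S = set(range(10))      # sample space {0, 1, ..., 9}
A = {0, 1, 2, 3, 4, 5}  # event A
B = {1, 2, 3}           # event B, chosen with B a subset of A

def P(E):
    """Probability of event E under the uniform measure on S."""
    return Fraction(len(E & S), len(S))

assert B <= A                    # B is a subset of A
assert A & B == B                # the key fact behind the hint
assert P(A - B) == P(A) - P(B)   # the identity holds exactly
print(P(A - B))  # 3/10
```

Using Fraction keeps the arithmetic exact, so the identity can be checked with `==` rather than a floating-point tolerance.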
 
