Understanding Probability: The Meaning of Δ and Proving A Δ B^c = A^c Δ B

matrix_204
Hi, I have a quick question: what does the triangle (Δ) mean? I was asked to prove the statement below, but the symbol isn't defined in the book, and I just want to get an idea of what the Δ means.

Show that A Δ B^c = A^c Δ B

Also, after trying to expand the two sides, I got stuck here...

For A Δ B^c = ... = ... = (A ∩ B^c) ∪ (B^c ∩ A)
and for A^c Δ B = ... = ... = (A^c ∩ B) ∪ (B ∩ A^c)

How are these equal?
 
A Δ B = (A − B) ∪ (B − A) is the symmetric difference of the sets A and B: it contains only those points of A which are not in B and those points of B which are not in A.

Example: If A = {1,2,7,9,11} and B = {3,5,7,11,13}, then the set difference A − B is the part of A not in B, namely A − B = {1,2,9}; likewise, the set difference B − A is the part of B not in A, namely B − A = {3,5,13}. Hence

A Δ B = (A − B) ∪ (B − A) = {1,2,9} ∪ {3,5,13} = {1,2,3,5,9,13},

and this is equivalent to what you have, since A − B = A ∩ B^c, where B^c is the complement of B in X (X being the universal set containing A and B).
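
If it helps to see this concretely, here is a quick Python check using the sets above; the universal set X = {1, ..., 14} and the helper name sym_diff are my choices for illustration, since any set containing A and B works the same way.

# Symmetric difference from the definition: S Δ T = (S − T) ∪ (T − S).
# X is an arbitrary universal set picked for this example (my assumption);
# any superset of A and B gives the same verdict on the identity below.
X = set(range(1, 15))
A = {1, 2, 7, 9, 11}
B = {3, 5, 7, 11, 13}
Ac, Bc = X - A, X - B  # complements of A and B in X

def sym_diff(S, T):
    return (S - T) | (T - S)

print(sym_diff(A, B))                      # {1, 2, 3, 5, 9, 13}
print(sym_diff(A, Bc) == sym_diff(Ac, B))  # True: A Δ B^c = A^c Δ B

(Python's built-in A ^ B computes the same symmetric difference.)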
 
Yes, that is also given in the question, but I can't make them equal. Even using your example, the two sides don't come out equal for me. So does that mean they aren't equal?
I'm stuck at the same place as in my first post.
 
Well, the two expressions you wrote aren't right. For A Δ B^c the second set is B^c, not B, so (A ∩ B^c) ∪ (B^c ∩ A) should really be (A ∩ (B^c)^c) ∪ (B^c ∩ A^c) = (A ∩ B) ∪ (B^c ∩ A^c), and you have a similar problem in the second one.
 
Could you please show me how you got that? I can see that it's correct, but I want to understand the steps.
 
Just use what Benorin said to expand (A − B) ∪ (B − A): A − B = A ∩ B^c.
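
For the record, here is the whole expansion written out; it uses nothing beyond A − B = A ∩ B^c, (B^c)^c = B, and the commutativity of ∩ and ∪:

$$A \,\Delta\, B^c = (A - B^c) \cup (B^c - A) = (A \cap B) \cup (B^c \cap A^c)$$
$$A^c \,\Delta\, B = (A^c - B) \cup (B - A^c) = (A^c \cap B^c) \cup (B \cap A)$$

Both right-hand sides are (A ∩ B) ∪ (A^c ∩ B^c) once you commute the unions and intersections, so A Δ B^c = A^c Δ B.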
 