Discrete Math: Proving something is logically equivalent

sjung915

Homework Statement


Show that (p ∧ q) → r and (p → r) ∧ (q → r) are not
logically equivalent.


Homework Equations


a → b ≡ ¬a ∨ b



The Attempt at a Solution


I'm sorry. I'm completely stumped on how to go about this problem. I'm not asking for the solution since I want to know how to do this instead of just getting the answer. Any help would be appreciated. Thank you.
Here is what I had, just so no one thinks I didn't try.

(p ∧ q) → r
=> ¬(p ∧ q) ∨ r
=> (¬p ∧ ¬q) ∨ r
=> (switched it around) r ∨ (¬p ∧ ¬q)
=> (distributed) (r ∨ ¬p) ∧ (r ∨ ¬q)
=> (¬p ∨ r) ∧ (¬q ∨ r)
=> (p → r) ∧ (q → r)

The problem says to disprove it, but somehow I'm getting that they are logically equivalent.
 
If you want to prove they are not equivalent then just figure out how you can assign true and false values to p, q and r so that the two sides give you different values.
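For instance, you could brute-force all eight assignments. A minimal Python sketch of that search (just one possible way to set it up, encoding a → b as ¬a ∨ b per the equation above):

Code:
from itertools import product

# Try every truth assignment to p, q, r and report where the two formulas disagree.
for p, q, r in product([True, False], repeat=3):
    lhs = (not (p and q)) or r                # (p ∧ q) → r
    rhs = ((not p) or r) and ((not q) or r)   # (p → r) ∧ (q → r)
    if lhs != rhs:
        print(p, q, r, lhs, rhs)

Any assignment it prints is a counterexample.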
 
Dick said:
If you want to prove they are not equivalent then just figure out how you can assign true and false values to p, q and r so that the two sides give you different values.

That makes sense. Here is what I got, please correct me if I'm wrong.

Let
p = true
q = false
r = false
then (p ∧ q) → r is true,
and (p → r) ∧ (q → r) is false.
Hence they are not logically equivalent.

Any mistakes?
 
sjung915 said:
That makes sense. Here is what I got, please correct me if I'm wrong.

Let
p = true
q = false
r = false
then (p ∧ q) → r is true,
and (p → r) ∧ (q → r) is false.
Hence they are not logically equivalent.

Any mistakes?

Looks ok to me. true → false is false. false → false is true.
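Spelling that check out (a quick Python sanity test, using the same ¬a ∨ b encoding of →):

Code:
p, q, r = True, False, False

lhs = (not (p and q)) or r                # (p ∧ q) → r  =  F → F  =  true
rhs = ((not p) or r) and ((not q) or r)   # (p → r) ∧ (q → r)  =  F ∧ T  =  false

print(lhs, rhs)  # True False, so the two formulas disagree on this assignment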
 
As far as the original logic goes...

sjung915 said:
=> ¬(p ∧ q) ∨ r
=> (¬p ∧ ¬q) ∨ r

Your problem is here: by De Morgan's law,
¬(p ∧ q) ≡ ¬p ∨ ¬q, not ¬p ∧ ¬q.
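If it helps, here is a quick Python check of De Morgan's law over all four assignments (just a sanity-check sketch):

Code:
from itertools import product

# De Morgan: ¬(p ∧ q) ≡ ¬p ∨ ¬q, for every assignment to p and q.
for p, q in product([True, False], repeat=2):
    assert (not (p and q)) == ((not p) or (not q))
print("De Morgan's law holds for all four assignments")

With the correct law, your second step becomes (¬p ∨ ¬q) ∨ r, which is just ¬(p ∧ q) ∨ r again, i.e. (p ∧ q) → r; since there is no ∨ over ∧ there, it does not distribute into (¬p ∨ r) ∧ (¬q ∨ r).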
 