If X is a man, and if X is unmarried, then X is a bachelor

Thread starter: nietzsche
Tags: Bachelor
nietzsche
I am wondering about something. Let's take the example of the bachelor:

(1) If X is a man, and
(2) if X is unmarried, then
(3) X is a bachelor.

So in this example, (1) is a necessary condition for (3), and (2) is also a necessary condition for (3). But considered together, if (1) and (2) are both satisfied, can that be considered a sufficient condition? Like in the following example,

(4) If X is an unmarried man, then
(5) X is a bachelor.

(4) is now a sufficient condition for (5). Am I right?

My actual question is: is there any other way of stating that (1) and (2) are together sufficient, other than writing them together as one condition, as in (4)? Or can I outright state that "together, (1) and (2) are sufficient for (3)"? I haven't taken a logic class, so I don't really know what the "rules" are...

(This is for a word problem in my analysis class.)

Thanks.
 
What about widowers and divorced men?
 
nietzsche said:
(1) If X is a man, and
(2) if X is unmarried, then
(3) X is a bachelor.
The way that you are dividing up the formulas is a bit confusing. Do you want to know how the following formulas are related, where P, Q, R are formulas?
a) P --> (Q --> R)
b) (P --> R) & (Q --> R)
c) (R --> P) & (R --> Q)
d) (P & Q) --> R​

You do not have to combine P and Q into a new atomic formula. You can just conjoin them.

(I think, by this definition, widowers and divorced men count as bachelors.)
 
Sorry, I didn't realize until later that perhaps Borek was trying to point to the problem of knowing when you have met all necessary conditions of a formula. Perhaps this is not the question that you were wondering about, but it sounds interesting, so I am going to think about it.

But first, if you just want to know how to say that a formula P is sufficient for R, one of the simplest ways is with P --> R, though any equivalent formula will work. P can of course be a complex formula, i.e., it's possible that P <=> (S & T), in which case, having already P --> R, you get (S & T) --> R. You can make P as complex as you'd like. Note, though, that (c) alone does not imply (d): knowing that P and Q are each necessary for R does not by itself make them jointly sufficient.
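To make that last point concrete, here is a tiny brute-force sketch (Python, purely my own illustration; the helper implies just spells out material implication). It exhibits a single truth assignment under which (c) is true but (d) is false.

Code:
def implies(a, b):
    # material implication: a --> b is false only when a is True and b is False
    return (not a) or b

# one assignment where (c) holds but (d) fails
P, Q, R = True, True, False

c = implies(R, P) and implies(R, Q)   # (c): (R --> P) & (R --> Q)
d = implies(P and Q, R)               # (d): (P & Q) --> R

print(c, d)   # prints: True False

So P and Q can each be necessary for R without their conjunction forcing R.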

One of the formulas listed above, (a), is equivalent to (d). You can probably see why even without knowing any formal logic. (d) says that it's impossible for R to be false when P and Q are both true. The outer implication in (a) says that it's impossible for Q --> R to be false when P is true, and Q --> R says that it's impossible for R to be false when Q is true. So you can see that (a) and (d) end up saying the same thing: R cannot be false when P and Q are both true.
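If you want to check the equivalence mechanically rather than by reading off the meanings, a brute-force sweep over all eight truth assignments does it (again, just a Python sketch of my own):

Code:
from itertools import product

def implies(a, b):
    return (not a) or b

# compare (a) P --> (Q --> R) with (d) (P & Q) --> R on every assignment
agree = all(
    implies(P, implies(Q, R)) == implies(P and Q, R)
    for P, Q, R in product([False, True], repeat=3)
)

print(agree)   # prints: True, i.e. (a) and (d) never disagree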

In your original terminology, we can say that P & Q is sufficient for R, or that P is sufficient for Q to be sufficient for R. And you can continue adding formulas into the antecedent, ((P & Q & S) --> R) <=> (P --> (Q --> (S --> R))), forming a chain of dependence. Does that make sense? (There is a quick check of this below.)

[Edit: what follows is partly wrong but corrected below.] To the other question, I can't think of anything wrong with the idea that, if a formula P includes as subformulas all necessary conditions for R, then P is sufficient for R. It doesn't make sense that everything necessary for R to be true or provable could be true or provable without R also being true or provable. (Or does it? It could be another way, but do you want it to be?) It seems consistent with the usual systems anyway.
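Here is that quick check of the three-antecedent chain (same brute-force idea as above, still just an illustrative sketch):

Code:
from itertools import product

def implies(a, b):
    return (not a) or b

# ((P & Q & S) --> R)  <=>  (P --> (Q --> (S --> R))), over all sixteen assignments
agree = all(
    implies(P and Q and S, R) == implies(P, implies(Q, implies(S, R)))
    for P, Q, S, R in product([False, True], repeat=4)
)

print(agree)   # prints: True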

In order to say that P includes all necessary conditions for R, you can use quantification over the set of formulas: if \phi ranges over formulas and \subseteq is the subformula relation,

\forall \phi [(R \rightarrow P) \wedge ((R \rightarrow \phi) \rightarrow (\phi \subseteq P))]​

So it occurs to me that this might be a second-order theorem:

\forall \phi [((R \rightarrow P) \wedge ((R \rightarrow \phi) \rightarrow (\phi \subseteq P))) \rightarrow (P \rightarrow R)]​

But it might have a propositional or first-order equivalent. I don't know; a quick thought says no. So there might be no way to say this in propositional or first-order logic.

Edit: So, whoops. I was thinking one thing but saying another. I was thinking of P as being a conjunction of all consequences of R. However, that is not the same as P including those consequences as subformulas. I want P to imply all of R's consequences, but their being subformulas of P does not guarantee this. And as soon as I correct this to

\forall \phi [((R \rightarrow P) \wedge ((R \rightarrow \phi) \rightarrow (P \rightarrow \phi))) \rightarrow (P \rightarrow R)]​

I see a proof of it: let \phi = R. Since R --> R is a tautology, the premise then gives P --> R. And it's clearer now that two formulas having the same consequences makes them equivalent (since they must then have each other as consequences). It reminds me of set equality by extensionality.
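Spelled out step by step, the instantiation looks like this (just my own restatement of the argument above, nothing new):

(i) (R \rightarrow P) \wedge ((R \rightarrow R) \rightarrow (P \rightarrow R))   [the premise with \phi instantiated as R]
(ii) R \rightarrow R   [a tautology]
(iii) P \rightarrow R   [from (i) and (ii), by modus ponens on the second conjunct]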
 