If two numbers are not equal, there are infinitely many numbers between them. Therefore, numbers that are equal have a finite number of numbers between them: zero. Therefore, numbers with a finite number of numbers between them are equal.
Assuming you mean real numbers with the usual order relation (without entering into a debate about whether this needs to be said, please)... this sounds correct. It's the contrapositive of a true statement, after all.
... Also, I guess if you're concerned about the flow of your logic, then I could say that you don't need the middle statement here, for the reason I pointed out above.
Thanks. I know it sounds like a "duh!" question, but it has implications. I was considering 1 = or != .999..., and the thought about there being infinitely many [real] numbers between two numbers came to me. One can name no numbers between .999... and 1. My understanding of numbers is that they do not exist until named, and cannot exist if they cannot be named. The argument would rely on this being true, which it is, isn't it?
Also the middle statement doesn't follow from the first statement (even if it's true) so the first use of "Therefore" is wrong.
Well, first of all, what does it mean to "name" a number? By that, do you mean "I can write it down"? Then it seems you would have to decide whether you are accepting formulas that stand for numbers with an endless string of digits after the decimal point--e.g., pi, e, and your example of .999... In that case, there are proofs that .999... = 1 at any desired level of rigor--just see http://en.wikipedia.org/wiki/0.999... If you are not accepting formulas, then you can't "name" .999... anyway--in fact, you can't "name" most of the real numbers (we know how to write only a handful of transcendental numbers, yet we know that most real numbers are transcendental). Yet, they're still there. EDIT: Well, really, you can "name" .999..., since you'll always know the next digit. But still, most of the reals cannot be represented this way; in fact, there are numbers whose approximation by rationals displays no pattern at all.
Hi, TylerH! It will be useful for you to develop logical notation skills, so that you can more easily see whether your implication is valid. Let "A": two numbers are not equal. Let "B": there are infinitely many numbers between the two numbers. Thus, your premise is: If A, then B. This is logically equivalent to its contrapositive form: If not-B, then not-A. Not-B: there are not infinitely many numbers between the two (meaning there is a finite amount of them, counting NO numbers between them as 0). Not-A: the two numbers are not not equal (meaning they are equal). Thus, your logic is fine.
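Since the equivalence of an implication and its contrapositive only involves four truth assignments, it can be checked by machine. Here's a minimal Python sketch (the helper name `implies` is just an illustrative label for material implication, not anything from the thread):

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication: p -> q is false only when p is true and q is false."""
    return (not p) or q

# "If A then B" should agree with its contrapositive "if not B then not A"
# on every possible truth assignment.
for A, B in product([False, True], repeat=2):
    assert implies(A, B) == implies(not B, not A)
```

The loop passes silently, confirming the two forms have identical truth tables.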
Actually, in set theory there are uncountably (even co-countably) many real numbers that cannot be defined, and even more that cannot be computed. So there definitely exist a lot of "unnamed" real numbers in set theory.
Actually, that sounds much more natural to me than you would think. I'm a CS nerd with a very good understanding of that sort of Boolean logic. It's just that your approach to reasoning--turning English into Boolean logic--for some reason didn't occur to me as a possibility. I wasn't in the words-to-logic mindset, I guess.
It's hard to say anything about that unless we know what YOUR definition of ".999..." is. If you think ".999..." means "the limit of the sequence .9, .99, .999, ...", then (since each term in the sequence is a partial sum of a geometric progression) ".999..." is just another name for "1". On the other hand, if your definition is different (or if you haven't yet really nailed down WHAT your own definition is), then all bets are off until after you have defined it. Once again, it depends what YOU mean by "named", and you haven't told us that. But you are on a slippery logical slope here. I suppose you probably accept that "the ratio of the circumference to the diameter of a circle" is the name of a number--otherwise usually called "pi". If so, you accept that numbers can be "named" in ways other than by a sequence of digits. Nobody knows the complete (infinite and non-repeating) sequence of decimal digits of pi, so nobody can "name" it except by stating some property that it has, like my definition above. So if you agree with that general idea about "naming", then what about "the smallest positive number that cannot be named"? Is that the name of a number, or not? If you think not, how do you propose to define what "naming" means, so you can always tell whether "names" like that are valid or invalid? Here be (probably an uncountably infinite number of) dragons...
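As a concrete illustration of the "limit of .9, .99, .999, ..." reading: the partial sums can be computed exactly with rational arithmetic, and the gap to 1 after n terms is exactly 10^(-n). A small Python sketch using the standard-library `fractions` module:

```python
from fractions import Fraction

# Partial sums of the geometric series 9/10 + 9/100 + 9/1000 + ...
# After n terms the sum is exactly 1 - 10**(-n), so the distance to 1
# shrinks by a factor of 10 at every step and can be made arbitrarily small.
s = Fraction(0)
for n in range(1, 8):
    s += Fraction(9, 10**n)
    assert 1 - s == Fraction(1, 10**n)
```

No rounding is involved: every partial sum and every gap is an exact rational number, which is why the limit is forced to be 1 and nothing else.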
When I say name a number, I mean a decimal expansion. Anything that can be decimally expanded is valid. The reason for decimally expanding is to make it clearer in what interval a number lies. Like sqrt 2, or 1.41... [the ellipsis in this case means more digits, not repeating digits], so we know it must be between 1.41 and 1.42. I've seen the Wikipedia article, but I prefer my way for its simplicity. But sometimes simplicity leaves a residue of ambiguity. That's why I come here to consult those smarter and better educated than myself.
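The "name the interval" idea can be made precise: truncating a decimal expansion after k places confines the number to a half-open interval of width 10^(-k). A quick Python sketch (the helper `truncation_interval` is a made-up name for this example, not anything standard):

```python
from fractions import Fraction

def truncation_interval(prefix: str):
    """Given a truncated decimal such as '1.41', return the interval
    [low, high) in which any number with those leading digits must lie."""
    low = Fraction(prefix)
    places = len(prefix.split(".")[1]) if "." in prefix else 0
    return low, low + Fraction(1, 10**places)

lo, hi = truncation_interval("1.41")
# sqrt(2) = 1.41421356..., so its truncation falls inside [1.41, 1.42)
assert lo <= Fraction(141421356, 100000000) < hi
```

Note the width of the interval is one unit in the last decimal place given, so "1.41..." pins the number between 1.41 and 1.42, never anything narrower.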
If that is what you mean by "naming" a number, then all real numbers can be named. I included the link to the Wikipedia article not to suggest a method of proof, but to show that as long as you're allowed to "name" a number by writing a series for it, there is nothing wrong with the argument.
Also let me point out that no one has ever actually written down the decimal expansion of most numbers, and that there would be no rule by which to write them all down (so you couldn't name them all even if allowed an infinite amount of time). They exist in spite of this.
I agree with that, but the OP's idea of "naming" has the strange property that there are numbers (in fact, almost all real numbers) which can be named, but it is impossible to actually say what their names are, because there is no method of computing the digits that make up the name.
No one seems to have commented on this, so I will. Two numbers that are equal have no numbers between them. The distance between the two numbers is zero. The interval containing the two (equal) numbers consists of a single point. I don't see the advantage in distinguishing between an infinite number of numbers between two given numbers, and a finite number (i.e., zero) of numbers.
It doesn't matter that you can't write or compute the whole decimal expansion of a number. The numbers can be named to the point necessary. The point is to name the bounded region they exist in. You can name sqrt 2 as 1.4... and you know it is more than .999...; that's all the info I need to accomplish my ends, the rest is extraneous. Mark44: I completely see your argument. The problem is that when I argued 1 - .999... = 0, my friend said 1 - .999... = 1/∞, which is right, but he refuses to admit that limits are the constant value that is approached (0, in this case), rather than the never-ending sequence. He's stuck on "it will approach, but never equal", which forces me to try to come up with some contrived real-number-theory method of proving I'm right. I know that his interpretation of a limit as an infinite process of approaching is wrong, but I have no way to argue it. Suggestions?
Ask him what he means by .999... Mathematicians define it to be a limit. The limit, by definition of limit, equals 1. Once he understands that .999... is just shorthand notation for a limit, it's hard to argue it doesn't equal one. For him to say .999... doesn't equal one is like arguing the limit as x goes to infinity of (x+1)/(x-1) isn't equal to 1.
No, he knows it's a limit. He's just got a messed up understanding of limits. He says they "approach" as if approaching is an infinite process of almost equaling. It's hard to explain his views. Here's a short excerpt:
Then he needs to define what he means by limit, because in this context the word "limit" means: lim_{x→p} f(x) = L if and only if for any given ε > 0 there exists a δ > 0 such that 0 < |x − p| < δ implies |f(x) − L| < ε. Notice the limit is equal to L, with the "=" symbol, by definition; "≈" appears nowhere in this definition.
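Applied to .999..., the sequence version of that definition says: given any ε > 0, some finite string of nines is already within ε of 1, and every longer string stays there. That threshold can be computed with exact rationals; a minimal Python sketch (the function name `nines_within` is made up for this example):

```python
from fractions import Fraction

def nines_within(eps: Fraction) -> int:
    """Smallest n such that |0.99...9 (n nines) - 1| = 10**(-n) < eps."""
    n, gap = 0, Fraction(1)
    while gap >= eps:
        n += 1
        gap = Fraction(1, 10**n)
    return n

# For eps = 1/100, three nines (0.999) already land within eps of 1,
# and no later partial sum ever leaves that band -- which is exactly
# what the "=" in the definition of a limit asserts.
assert nines_within(Fraction(1, 100)) == 3
assert nines_within(Fraction(1, 10**6)) == 7
```

The point for the friend: "approaching" is a property of the finite partial sums; the limit itself is a single fixed number, and the definition equates .999... to that number.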
I've pretty much tried that. Is there any good reference that explicitly states that a limit equals a value? Wikipedia won't work.