Prove 1+1 = 2 in Fewer Pages & Win US$5 at Evo & I

  • Thread starter: StevieTNZ
  • Tags: Evo
AI Thread Summary
The discussion revolves around the challenge to prove that 1+1=2 in fewer pages than currently documented, specifically referencing "Principia Mathematica," which takes hundreds of pages to reach this conclusion. Participants debate the nature of mathematical proof, with some arguing that 1+1=2 is a tautology based on definitions, while others emphasize the complexity of formal proofs using axioms like Peano's. The conversation touches on the distinction between mathematics and science, asserting that mathematical truths are not necessarily tied to physical reality. Some participants express frustration with the perceived lack of practical application in pure mathematics, while others defend its intellectual value. The discourse also highlights the circularity in defining mathematical concepts and the varying interpretations of what constitutes a valid proof, ultimately questioning the necessity and nature of rigorous mathematical proof versus intuitive understanding.
StevieTNZ
I dare anyone to prove 1+1 = 2, in one less page than it already is; not reducing the font size or changing font, either.

Whoever does so will receive US$5 off their next psychic appointment/end-of-world date prediction email subscription. Only redeemable at Evo and I's business to be set up at some point in the future.
 
How can you prove a stipulation? 1+1=2 is a statement that stipulates 1+1 may also be referred to as "2". There's nothing to prove.
 
zoobyshoe said:
How can you prove a stipulation? 1+1=2 is a statement that stipulates 1+1 may also be referred to as "2". There's nothing to prove.


Actually, I seem to recall that in an advanced math graduate class I took some 50 years ago, one of the exercises was to use Peano's Postulates to prove exactly that: 1+1=2.

I'm an engineer, not a mathematician, and I remember thinking the whole thing seemed stupid from my practical point of view, but I understood that as an exercise in mathematical formalism it could be useful to folks who care about that sort of thing.

Stevie, is there some reason for your rant or did you just need to get that off your chest?
 
Why not spend the energy on something applicable to the real world instead?
 
stevietnz said:
i dare anyone to prove 1+1 = 2, in one less page than already is; not reducing the font size or changing font, either.

Whoever does so will receive us$5 off their next psychic appointment/end-of-world date prediction email subscription. Only redeemable at [Evo's and my] business to be set up at some point in the future.

2-1=1
2-1+1=1+1
1+1=2

(Edit in bold)
 
If I hold up one finger, then another one finger, I get two fingers.
 
The "full" proof can be found in Russell and Whiteheads "Principia Mathematica", and no it can not be done on one page (it takes them about 300 pages to get to the proof).

But then Gödel came along and showed that the whole exericise was futile:rolleyes:
 
Nikitin said:
Why not spend the energy on something applicable to the real world instead?

Some people like reading Harry Potter. Some people like doing math exercises.
 
StevieTNZ said:
I dare anyone to prove 1+1 = 2, in one less page than it already is; not reducing the font size or changing font, either.
What, exactly, do you mean by "1", "+", "=", and "2"? The proof is incredibly trivial or incredibly complex depending on how deep you want to go.

Some claim you cannot prove a definition. That's not quite true. Better said, the proof of a definition is trivial: Cite the definition. End of proof. Given that, here's the trivial proof:

Proof that 1+1=2: The definition of 2 is that it is the natural number that satisfies 1+1=2. QED.​
Defining "1", "+", "=" as per the Peano axioms and defining 2 as the successor of 1 requires a tiny bit more work, but not much. On the other hand, going the full nine yards as was done by Whitehead requires an immense amount of work and won't fit on a page.

So what do you want? The proof is either trivial and requires but a line or two, or it's incredibly non-trivial and cannot be reproduced without writing a book.
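For the curious, here is roughly what the "tiny bit more work" looks like when spelled out in a proof assistant. This is only a sketch of our own in Lean; the type N, the function add, and the names one and two are ad hoc definitions for illustration, not anything from Principia or a standard library:

-- A minimal Peano-style sketch (ad hoc definitions, for illustration only).
inductive N where
  | zero : N
  | succ : N → N

open N

-- Addition by recursion on the second argument: a + 0 = a and a + S(b) = S(a + b).
def add : N → N → N
  | a, zero   => a
  | a, succ b => succ (add a b)

def one : N := succ zero
def two : N := succ one

-- With these definitions in place, 1 + 1 = 2 holds just by unfolding them.
example : add one one = two := rfl

Once the definitions are accepted, the proof itself is the single rfl line, which is exactly the point: the hard part is deciding what "1", "+", "=", and "2" mean.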
 
  • #10
This one has been kicking around since at least 1966 when I saw it on a math dep't bulletin board at U of Miami.
Today I found it many places by a search on "unknown but astute source"

This is a pdf from U of Chicago:

http://www.oekonometrie.uni-saarland.de/oekonometrie/oeko2011/FirstLesson.pdf
 
  • #11
jim hardy said:
http://www.oekonometrie.uni-saarland.de/oekonometrie/oeko2011/FirstLesson.pdf

That's funny :smile:
 
  • #12
jim hardy said:
This one has been kicking around since at least 1966 when I saw it on a math dep't bulletin board at U of Miami.
Today I found it many places by a search on "unknown but astute source"

This is a pdf from U of Chicago:

http://www.oekonometrie.uni-saarland.de/oekonometrie/oeko2011/FirstLesson.pdf

I was fine until we made zero a vector.
 
  • #13
Nikitin said:
Why not spend the energy on something applicable to the real world instead?
If this is the overarching mentality, then much of pure math is a waste of energy. In any case, intellectual stimulation is intellectual stimulation, regardless of how "applicable" it is to the real world.
 
  • #14
WannabeNewton said:
If this is the overarching mentality then much of pure math is a waste of energy.

Which is why there are so few workers in the world doing it. Still, it's certainly more fun than watching a basketball game (IMO).
 
  • #15
qspeechc said:
If I hold up one finger, then another one finger, I get two fingers.

Since no one noticed this, let me expand.

1 finger + 1 finger = 2 fingers

So "1+1=2" is true for concrete, physical objects like fingers, therefore it cannot be false for numbers. Numbers are just abstractions of real things like fingers.

Furthermore, when you add 1 the result gives the next number, and "2" is simply the name of the number after 1, therefore "1+1=2" is a tautology.
 
  • #16
qspeechc said:
If I hold up one finger, then another one finger, I get two fingers.

If you hold up one finger, then cut one of your other fingers off and hold it in the other hand, do you still have two fingers?
 
  • #17
qspeechc said:
So "1+1=2" is true for concrete, physical objects like fingers, therefore it cannot be false for numbers.
Nonsense. 1 drop of water + 1 drop of water = 1 bigger drop of water.
 
  • #18
qspeechc said:
Since no one noticed this, let me expand.

1 finger + 1 finger = 2 fingers

So "1+1=2" is true for concrete, physical objects like fingers, therefore it cannot be false for numbers. Numbers are just abstractions of real things like fingers.

Furthermore, when you add 1 the result gives the next number, and "2" is simply the name of the number after 1, therefore "1+1=2" is a tautology.

You want to experimentally verify mathematics? I don't think so. It works the other way around. You experimentally verify science, not math.
 
  • #19
D H said:
Nonsense. 1 drop of water + 1 drop of water = 1 bigger drop of water.

And 1 bigger drop = 2 drops. You are just giving two drops another name
 
  • #20
qspeechc said:
And 1 bigger drop = 2 drops. You are just giving two drops another name

That's the same thing you did when you appealed to fingers... You arbitrarily chose a naming scheme that would support your preconceived conclusion.
 
  • #21
qspeechc said:
And 1 bigger drop = 2 drops. You are just giving two drops another name
What if it was made up of 4 smaller drops? Your system does not give unique labels to your objects. How do you define your '1' drop?
 
  • #22
I thought about it, and the problem is actually that "drop" is a vague word; it describes something but doesn't define it. In "1 drop + 1 drop = 1 bigger drop" each time you use "drop" it actually has a different meaning.

ModusPwnd said:
That's the same thing you did when you appealed to fingers... You arbitrarily chose a naming scheme that would support your preconceived conclusion.

I don't understand. Was my meaning of "finger" arbitrary and chosen to support my conclusion?

ModusPwnd said:
You want to experimentally verify mathematics? I don't think so. It works the other way around. You experimentally verify science, not math.

Then tell me where the abstract numbers "1", "2" etc. come from.

bp_psy said:
What if it was made up of 4 smaller drops? Your system does not give unique labels to your objects. How do define your '1' drop?

I think you're getting at the same thing as me.

And everyone has conveniently forgotten this:

qspeechc said:
Furthermore, when you add 1 the result gives the next number, and "2" is simply the name of the number after 1, therefore "1+1=2" is a tautology.
 
  • #23
qspeechc said:
I don't understand. Was my meaning of "finger" arbitrary and chosen to support my conclusion?

Yes, I think so.
qspeechc said:
Then tell me where the abstract numbers "1", "2" etc. come from.

Why do they have to "come from" anywhere? They are postulated or defined axiomatically. When you start to attribute these abstract concepts onto "real world" objects like fingers and water drops you start doing science rather than mathematics.
 
  • #24
ModusPwnd said:
Yes, I think so.




Why do they have to "come from" anywhere? They are postulated or defined axiomatically. When you start to attribute these abstract concepts onto "real world" objects like fingers and water drops you start doing science rather than mathematics.


Ha ha ha. Yes. Indeed. Ho Ho Ho. Good one. Yes. Very good.
 
  • #25
Afaik, Boolean arithmetic is just as valid as standard arithmetic. In Boolean arithmetic 1+1=1
 
  • #26
It's also perfectly legitimate to specify a Galois field (GF) and use the corresponding arithmetic within it. For example, in GF(2)

1 + 1 = 0.​
 
  • #27
SW VandeCarr said:
Afaik, Boolean arithmetic is just as valid as standard arithmetic. In Boolean arithmetic 1+1=1
And arithmetic modulo 2 is just as valid as either, in which 1+1 = 0.

Bottom line: You can't prove 1+1=2 unless and until you define what "1", "+", "=", and "2" mean.
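To make that concrete, here is a throwaway Python sketch (our own illustration; Boolean "addition" is modeled here as logical OR) showing that what "1 + 1" evaluates to depends entirely on which arithmetic you pick:

# What "1 + 1" evaluates to depends on the arithmetic you choose.
print(1 + 1)        # ordinary integer arithmetic: 2
print((1 + 1) % 2)  # addition modulo 2, i.e. addition in GF(2): 0
print(1 | 1)        # Boolean "addition" as logical OR: 1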
 
  • #28
You can't pretend it wasn't clear we were talking about ordinary arithmetic all along. Why would anyone say "prove 1+1=2" if he meant in arithmetic modulo 2, for example?
 
  • #29
qspeechc said:
Ha ha ha. Yes. Indeed. Ho Ho Ho. Good one. Yes. Very good.
ModusPwnd was not joking.

You do realize that mathematics is not science, don't you?

Scientific theories cannot be proven to be true. Scientific theories are at best provisionally true. All it takes is one experiment to demonstrate that the theory does not comport with reality and kaboom! the theory is dead (or at least needs modification).

Mathematical theorems can be proven to be true, and once proved to be so, they remain true for all time. Mathematics is not necessarily connected with reality. That the universe is non-Euclidean does not mean that Euclid's geometry has been falsified. Euclid's theorems are as valid now as they were 2300 years ago.
 
  • #30
qspeechc said:
You can't pretend it wasn't clear we were talking about ordinary arithmetic all along. Why would anyone say "prove 1+1=2" if he meant in arithmetic modulo 2, for example?

Galois fields (finite number systems) have plenty of real-world applications. Any time you watch a Blu-ray or DVD, or listen to a CD, finite number systems are involved. The information on these media is encoded with a forward error correction technique called Reed-Solomon codes. Reed-Solomon codes can be considered a subset of BCH codes (short for Bose, Chaudhuri, Hocquenghem). That way, if you scratch the disk, all is not necessarily lost.

So these types of number systems are not too terribly removed from practical life after all. :smile:
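For what it's worth, in the byte field GF(2^8) that Reed-Solomon codecs work over, addition of field elements is just bitwise XOR, so every element is its own additive inverse. A small Python illustration (our own, not taken from any codec library):

# In GF(2^8), addition is bitwise XOR of the byte representations,
# so any element added to itself gives 0 -- in particular, 1 + 1 = 0.
one = 0b00000001
print(one ^ one)    # 0
print(0x57 ^ 0x57)  # 0, and likewise for any other element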
 
  • #31
qspeechc said:
You can't pretend it wasn't clear we were talking about ordinary arithmetic all along. Why would anyone say "prove 1+1=2" if he meant in arithmetic modulo 2, for example?

But you're asking for a proof. A proof is based on a certain set of assumptions which are (taken to be) consistent. Different kinds of arithmetic are equally valid according to the assumptions on which they are based.

EDIT: I didn't see the other posts that just preceded mine. I didn't intend to be redundant. However, consider this. In standard arithmetic we can say 1+1=10 in binary notation. It's still standard arithmetic. There's nothing special about base ten except habit and convenience.
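A trivial check of that last remark (a Python one-liner of our own): the sum is the same number either way, only the numeral changes.

print(1 + 1)       # 2 in decimal notation
print(bin(1 + 1))  # 0b10 -- the same number written in base two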
 
  • #33
ModusPwnd said:
Yes, I think so.

From The Pocket Oxford Dictionary, 5th ed.: “finger. 1. n. Any of five or (excluding thumb) four terminal members of hand.”
That is the definition I used, and as far as I know, the only definition in use.

ModusPwnd said:
Why do they have to "come from" anywhere? They are postulated or defined axiomatically. When you start to attribute these abstract concepts onto "real world" objects like fingers and water drops you start doing science rather than mathematics.

Oh sigh! I was hoping I didn't need to give this long-winded answer. I was trying to avoid answering it without being noticed. Well, anyway!

From Principles of Mathematics, 2nd ed. 1963 (? not sure) written by Allendoerfer and Oakley, both former mathematics professors:

Virtually all the mathematics with which you are familiar had its roots somewhere in nature. Arithmetic and algebra grew out of men's needs for counting, financial management, and other simple operations of daily life; geometry and trigonometry developed from problems of land measurement, surveying and astronomy. [...] In recent years new forms of mathematics have been invented to help us cope with problems in social science, business,[...etc.]. Let us lump all these sources of mathematical ideas together and call them Nature.

At first our approach to nature is descriptive, but as we learn more about it and perceive relationships between its parts, we begin to construct a Mathematical Model of nature. [...] Perhaps you are familiar with this sort of process through your study of geometry, in which the axioms form an abstract description of what man saw when he began to measure the earth [italics mine].[...]

The next step in the process is to deduce the consequences of our collection of axioms. By applying logical methods of deduction we then arrive at theorems. These theorems are nothing more than logical conclusions from our axioms and must not be assumed to be firm statements about relationships which are necessarily true in nature.

You get the idea. There's a diagram in the book, which basically goes:
Nature --> Definitions, Axioms --> Theorems, Rules --> Nature (through applications of mathematics)

Unfortunately, the way mathematics is taught, many people think mathematics has nothing to do with reality, and that there is no place in mathematics for intuition, insight, meaning, or examples (which are like the scientist's observations and data). Does anyone think mathematics is just a big game of logic with no meaning or intuition at all? That mathematicians just suck theorems and definitions out of their thumbs? I think not.
 
  • #34
  • #35
qspeechc said:
Since no one noticed this, let me expand.

1 finger + 1 finger = 2 fingers

So "1+1=2" is true for concrete, physical objects like fingers, therefore it cannot be false for numbers. Numbers are just abstractions of real things like fingers.

Furthermore, when you add 1 the result gives the next number, and "2" is simply the name of the number after 1, therefore "1+1=2" is a tautology.

This is an "engineering" kind of solution that utterly lacks mathematical rigor. It IS the kind of thing I personally agree w/ but I don't pretend that it is a rigorous proof.
 
  • #36
First define 0: 0 = {}
Define 1: 1={{}}
Define 2: 2={{},{{}}}
Define =: A = B iff for all x (x in A iff x in B)
In other words, two sets are equal if they have the same elements.
Define S(): S(x) = xU{x}
Define +:
1) A + 0 = A
2) A + S(B) = S(A) + B

Now for the proof:

Clearly S(0) = S({}) = {} U {{}} = {{}} = 1
and S(1) = S({{}}) = {{}}U{{{}}} = {{},{{}}} = 2
So 1 = S(0) and S(1) = 2
1 + 1 = 1 + S(0) = S(1) + 0 = S(1) = 2



Then again that is just as valid as defining 2 as 1+1.
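If anyone wants to check that construction mechanically, here is a small Python sketch (our own translation, using frozensets for the sets; B is recovered from S(B) by taking the union of its elements, which works for these von Neumann-style numerals):

# Mechanical check of the set-theoretic proof above, using frozensets.
zero = frozenset()                 # 0 = {}

def S(x):
    return x | frozenset({x})      # S(x) = x U {x}

one = S(zero)                      # {{}}
two = S(one)                       # {{}, {{}}}

def pred(b):
    # For a successor S(B) built this way, the union of b's elements is B.
    result = frozenset()
    for x in b:
        result |= x
    return result

def add(a, b):
    # The definition used above: A + 0 = A and A + S(B) = S(A) + B.
    return a if b == zero else add(S(a), pred(b))

assert add(one, one) == two        # i.e. 1 + 1 = 2 under these definitions
print("add(one, one) == two")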
 
  • #37
qspeechc said:
Lol, thanks.

Is someone actually on my side in this discussion?

Yes. I agree with you. I agree that 1 finger + 1 finger = 2 fingers proves that 1+1=2 (when everything is interpreted normally).

A formal proof of this result can be given but is not necessary. In fact, if we invent axioms and we find out that ##1+1\neq 2##, then our conclusion would not be that 1+1=2 is not true, but rather that our axioms are wrong and do not describe real life. Rigorous proofs are not invented in order to show that 1+1=2, because we know that from experience. Rigorous proofs are invented in order to show that our axioms are right.

I think it's very silly to pretend that mathematics is somehow disjoint from nature. We care about certain mathematics because it occurs in nature and in other fields. We don't just invent some arbitrary axioms that mean nothing; our axioms are always grounded in things we can observe and feel. I admit that mathematics abstracts things so much that it is sometimes no longer clear what we are studying, but initially at least the axioms always have some purpose and are not arbitrary.
 
  • #38
micromass said:
We don't just invent some arbitrary axioms that mean nothing...

Why not? I think some mathematicians do do that.

If math must be informed by and inform us of nature, then why is it distinguished from science? Empiricism is a requirement of science, not math.

Rigorous proofs are invented in order to show that our axioms are right.

Axioms cannot be right or wrong. They are assumed, by definition. What you are describing sounds more like a scientific postulate, like from physics.
 
  • #39
ModusPwnd said:
Why not? I think some mathematicians do do that.

Well, as far as I know, mathematicians sort of do it. They do it, but the work isn't particularly respected or cared about by the mathematical community.
 
  • #40
micromass said:
Yes. I agree with you. I agree that 1 finger + 1 finger = 2 fingers proves that 1+1=2 (when everything is interpreted normally).

Well, the OP asked for a proof of a formal statement: 1+1=2. That implies a formal proof, no?
It seems that the OP asked for something and then objected when he/she got a discussion about what was asked.

[EDIT] Sorry, it was qspeechc that objected, but the argument is the same. The stated subject of the thread is a formal proof.
 
  • #41
SW VandeCarr said:
It seems that the OP asked for something and then objected when he/she got a discussion about what was asked.
The OP has not said one single thing since the original post. And that's a problem.
 
  • #42
D H said:
The OP has not said one single thing since the original post. And that's a problem.

Might have had a busy day at work, give him time :)
 
  • #43
It is currently proved in x number of pages. I'm asking for the proof to be in x-1 pages. So three lines doesn't meet the criteria, sorry, FlexGunship.

(don't know the figure x, but it is certainly more than 100 - refer Principia Mathematica)
 
  • #44
Shows how much attention was paid when reading! Correct, I had not said anything - thus invalidating 'It seems that the OP asked for something and then objected when he/she got a discussion about what was asked.'

Plus, I asked for the proof to be in one page less than it currently is, i.e. Principia Mathematica. Therefore a proof in 3 lines will not suffice for a US$5 discount coupon.
 
  • #45
Nikitin said:
Why not spend the energy on something applicable to the real world instead?

Pure math is done because people find it enjoyable, not because there might be some later consequence or byproduct of its applicability that relates it to the real world. If this approach were taken from the very origin of mathematics then much of what we know today would be lost.
 
  • #46
qspeechc said:
... Unfortunately, the way mathematics is taught, many people think mathematics has nothing to do with reality, that there is no place in mathematics for intuition, insight, meaning, examples ...

I'm not sure what classes or books you have gone through, but in my experiences, this couldn't be further from the truth. The purpose of a text in mathematics is to develop one's knowledge and problem solving abilities with regards to the topic of the book, and the only way to do this is to build one's mathematical intuition, often through examples.

I'm indifferent towards everything else you've said :smile:
 
  • #47
f95toli said:
The "full" proof can be found in Russell and Whiteheads "Principia Mathematica", and no it can not be done on one page (it takes them about 300 pages to get to the proof).

But then Gödel came along and showed that the whole exericise was futile:rolleyes:

I believe that most mathematicians laughed at them when they heard that it took them 300 pages to prove a definition... :-D
 
  • #48
MathematicalPhysicist said:
I believe that most mathematicians laughed at them when they heard that it took them 300 pages to prove a definition... :-D

No. From Wikipedia: "'From this proposition it will follow, when arithmetical addition has been defined, that 1+1=2.' —Volume I, 1st edition, page 379 (page 362 in 2nd edition; page 360 in abridged version). (The proof is actually completed in Volume II, 1st edition, page 86, accompanied by the comment, 'The above proposition is occasionally useful.')"
 
  • #49
The point is that you don't "prove" 1+1=2. Russell and Whitehead were not trying to prove 1+1=2, they were trying to deduce it. They were trying to build a foundation from which all of mathematics would follow, including 1+1=2. If their system proved, or rather "deduced", anything else they would know it was wrong, and was not the foundation they were looking for.

Do you think that before 1+1=2 was "proven", all mathematicians went about saying "Well, dear me, if someone actually proves 1+1=2 is not true then all my life's work is rubbish"?

Edit: I think micromass has already made this point
 
  • #50
MathematicalPhysicist said:
I believe that most mathematicians laughed at them when they heard that it took them 300 pages to prove a definition... :-D

Yes, "1+1=2" is actually a definition. It's inconvenient to always go about saying "one plus one", so, for brevity, we call this "two". So we say "57" instead of "1+1+1+1+...+1"
 