Has anyone else tried to prove that 1 + 1 = 2 besides Russell?
I consider ##1+1=2## to be basically a definition.
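In a modern proof assistant that "definitional" character is almost literal: with ##2## built as the successor of ##1##, the two sides reduce to the same term. A minimal sketch in Lean 4 (using its built-in `Nat` literals):

```lean
-- In Lean 4, the numeric literals unfold to successor applications,
-- so `1 + 1` and `2` are definitionally equal and `rfl` closes the goal.
example : 1 + 1 = 2 := rfl
```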
Anyway, see here: http://us.metamath.org/mpegif/mmset.html#trivia
Wow! Amazingly useful! Very good webpage. I didn't know that they had it that well organized.
Wasn't Bertrand Russell's proof thorough enough? He went 100+ pages before he finally laid down enough groundwork to define 1+1, then another 20+ pages before he defined 1+1=2, and it was close to the end of the book before he finally proved it.
Even if you think there's something deficient about his attempt, most people are willing to treat it as proof enough.
It's not that it isn't enough; if anything, it's too much. Our proofs should be as simple as possible but not simpler. (I forget who said that.) I'm working on a proof that has maybe 10 steps, but it won't be ready until I can get a computer to output it using the same algorithm that outputs proofs for similar sentences. I'm trying to build an automated theorem prover that works for a variety of sentences. However, my ambition exceeds my knowledge: I don't know enough about the foundations of math to really prove this, and I don't even know set theory. I'm working on correcting that lack of knowledge now.
I'd imagine that 99% of mathematicians either treat this as a definition or simply don't care about slogging through a "rigorous" proof from set theory of the fact. Which provides a great response anytime someone complains about a beautiful, intuitive proof that's not totally rigorous: "you haven't even done a computer-verifiable proof that 1+1=2, so...shut up."
I'm only about half serious, I don't want any angry logicians or set theorists to yell at me.
It's interesting to note that mathematicians are not too interested in the foundations of mathematics. Some speculate that this is why almost no one read Frege's work.
I'm new to mathematics. I'm a philosopher who sees in mathematics a guide. I really like the idea of computer-verifiable proofs, because computers help us keep track of whether or not we've obeyed all of our rules. Back when I used to make philosophical arguments using logic, I deluded myself into believing that every step inevitably followed from the previous line. With computers I now know how easy it is to convince yourself that you've proved something when you really haven't. I'm wondering how popular the idea of computer-verifiable proofs is in math today.
There are many mathematicians interested in the foundations of mathematics.
Nobody reads Frege because Frege was wrong: his system turned out to be inconsistent, as Russell's paradox showed. His work is also outdated and difficult to read.
What Bertrand Russell did was first define all of the arithmetic operations in terms of sets, and from there prove that addition worked.
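Here is a hedged sketch of that set-theoretic route, in modern notation rather than Russell's own: numbers are classes of equinumerous sets, and addition comes from disjoint union.

```latex
% Frege-Russell cardinals (sketch, modern notation, not Russell's symbolism):
% 1 is the class of all singletons, 2 the class of all unordered pairs.
1 \;=\; \{\,x \mid \exists a\; x = \{a\}\,\}
\qquad
2 \;=\; \{\,x \mid \exists a\,\exists b\,(a \neq b \wedge x = \{a,b\})\,\}

% Cardinal addition via disjoint union: for disjoint A and B,
%   |A| + |B| = |A \cup B|.
% So 1 + 1 = 2 amounts to: the union of two disjoint singletons is a pair.
```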
The whole trick was "how to do it". Now that it's been explained in detail, it's kind of pointless for someone to restate it, except as translation. Which is kind of what you're doing.
There are only about 4 pages of "The Principles of Mathematics" that you need in order to understand what has to be done to get a computer "proof" engine to perform the proof.
For example, §113 and §114 provide a definition of addition based on set theory. Here's how §114 ends:
As you can see, it's only long because he's being very careful and explicit with both the logic and the way it is presented. He wants to be clearly understood.
You can then skip to §131, where he takes on the definition of the result of an addition:
As you can see, this is important information if you're looking to have a computer prove 1+1=2. It provides a basic definition that is provable in set theory.
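To make that concrete, here is one way a proof engine can verify it from scratch, using a hand-rolled successor type instead of any built-in arithmetic. This is a sketch in Lean 4; the names `N`, `add`, `one`, and `two` are mine, not Russell's.

```lean
-- Peano-style naturals, defined from nothing.
inductive N where
  | zero : N
  | succ : N → N

open N

-- Addition by recursion on the second argument.
def add : N → N → N
  | a, zero   => a
  | a, succ b => succ (add a b)

def one : N := succ zero
def two : N := succ one

-- add one one = succ (add one zero) = succ one = two,
-- so the proof is nothing but unfolding the definitions.
theorem one_add_one_eq_two : add one one = two := rfl
```

The whole proof is `rfl` because, once the definitions are in place, both sides compute to the same term; that is essentially the "long setup, short punchline" shape of Russell's version as well.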
As I recall, the actual proof is about 80% of the way through the book. If I find it in the next 5 or 10 minutes, I'll post it.
This is a good place to close this thread. The OP is no longer a member of this forum.