Is Addition Really a Basic Skill?

  • Thread starter: drcrabs
  • Tags: Addition
AI Thread Summary
The discussion centers on the nature of addition and whether it can be proven. Participants argue that addition is a fundamental concept defined by axioms rather than something that can be proven in the traditional sense. The Peano Axioms are referenced as a formal system that defines natural numbers and addition, illustrating that addition is inherently intuitive and based on human understanding of quantity. The conversation highlights that while one can demonstrate properties of addition (like commutativity), the operation itself is defined rather than proven. Some participants express frustration over the vagueness of the original question about proving addition, emphasizing that mathematical operations are accepted based on axiomatic foundations rather than empirical proof. The discussion also touches on the philosophical implications of mathematical definitions, particularly in relation to set theory and the concept of individuality in counting. Overall, the consensus is that addition is a defined operation rooted in axiomatic logic, not a provable theorem.
drcrabs
I reckon addition is a basic skill. But can it be proved?
How?
 
Good heavens! Have you ever gone through the checkout at the store? I have six of this and three of that...etc. Notice how addition is Abelian. Could it be possible it is not? Like three pops + two soups is not equal to two soups + three pops? What would the world be like then?
 
That kinda didn't answer my question
 
What is it you want to prove ?
 
Addition!
 
prove the color red
 
DrCrabs

The visible red light has a wavelength of approximately 650 nm.
There are three types of color-sensitive cones in the retina of the human eye, corresponding roughly to red, green, and blue sensitive detectors.
Experiments have produced response curves for three different kinds of cones in the retina of the human eye. The "green" and "red" cones are mostly packed into the fovea centralis. By population, about 64% of the cones are red-sensitive, about 32% green-sensitive, and about 2% blue-sensitive.
So if a body emits light with a wavelength of approximately 650 nm, the cones that respond to that wavelength absorb it, and hence we see the color red.
 
Maybe i should specify what I am wanting.

Can someone prove that if

1 + 1 = 2

then

1 + 2 = 3
 
first of all, that's not a proof, you're stating physical observations. second, wow, what a waste of time. and third, you missed my point. addition is such a basic, intuitive concept that it defies proof. we define formal systems that reflect our intuitive feeling of how addition works, and proofs can be set up within those formal systems. 2 is defined as the number equal to 1+1, and 3 is defined as the number equal to 2+1.

if you're asking me to prove the physical law that if i put two apples in a bag, and then put another three in, and then count the total number of apples in the bag, i will always get the number five, i can't do that. just like i can't prove the law of gravity is always true, even though overwhelming evidence shows it is. any rules we formulate about the physical world are made with the assumption that the world works in a regular, consistent way. this assumption cannot be proven.
 
  • #10
All i want to know is how to prove addition.
Is that so hard to do?

It wasn't a waste of time. I know that stuff like the back of my hand.
 
  • #11
omg, he's saying you CAN'T. it's not something you can prove, it just is what it is. in other words, it's not something that was built on foundations that show it's true.
 
  • #12
i thought i answered it. your question is extremely vague, and if i didn't understand exactly what you were thinking, then you'll have to explain it better.

in math, addition is DEFINED to go along with how we feel the world works. look up Peano's axioms for a formal way of defining addition. if you want to prove the physical property that our idea of addition is based on (ie, the apples in a bag i described before), well, you can't.

basically, any proof is based on initial assumptions. you can never say something is absolutely true, you can only say it is true if some other statement is true, the assumption. the initial assumptions are usually based on human intuition or physically observed phenomena, but neither of these can be "proven." you can prove the moon will orbit the Earth in an ellipse if gravity is an inverse square law, but you can't prove that about gravity. it's based on observation, and there's no reason we couldn't wake up tomorrow and find there's a new law of gravity.


and if you don't think your question is vague, think about the fact that the statements 1+1=2 and 1+2=3 are meaningless until 1, 2, 3, +, and = are defined. and once they are, what do you have left to prove? (a better question would be to prove that addition is commutative, which is nontrivial and can be done from Peano's axioms)
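For reference, the commutativity proof from the Peano axioms goes by induction in three steps. This is just a sketch of one standard route, taking the usual defining equations a + 0 = a and a + b* = (a + b)*, where * denotes the successor:

Lemma 1: 0 + a = a for all a (induction on a).
Lemma 2: a* + b = (a + b)* for all a and b (induction on b).
Theorem: a + b = b + a (induction on b; the base case is Lemma 1, and the inductive step uses Lemma 2).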
 
  • #13
Hi drcrabs,
I thought about your question, and there's not a simple answer, even though I'm sure many folks would dismiss this question as trivial. It occurred to me that you might be interested not simply in the "proof of addition", but in the general history of how human beings developed number systems and counting.
I have a wonderful book called The Universal History of Numbers by Georges Ifrah that you might enjoy. It covers human concepts of counting and its applications, from the counting of the number of fingers on our hands, to the invention of the abacus, to the underlying foundations of the modern computer.
It's really a fascinating read if you like that sort of historical data.
 
  • #14
1+1 = 2
subtract one from both sides
1 = 1

lol.
 
  • #15
It depends on where you start from.

From something like the Peano Axioms you can prove the existence of all of the arithmetic functions commonly used with the naturals and ensure that they have the properties that we intuitively expect them to have.

After that you have to develop Z, Q, R, etc., and do the same as described above at each step of the development. It's not something that can be properly described in a single post, but to give you a very basic idea of how this works:

In PA, the existence of 0 and a successor operation ' is postulated so that we can think of the natural numbers as the sequence 0, 0', 0'', 0''' and so on...

To prove addition exists we want to prove the existence of a unique function f: NxN->N, with the properties

(1) for all x in N, f(0,x) = x
(2) for all x and y in N, f(y',x) = (f(y,x))'

After this is proven, it's then shown that this function f, or addition has all of the properties we intuitively expect it to have: it's commutative, associative, etc.
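This recursive definition can be sketched directly in code. Below is a toy Python encoding of my own (an illustration, not anything from the thread): naturals are nested tuples, S plays the role of the successor ', and add implements exactly clauses (1) and (2):

```python
# Sketch of the Peano-style recursive definition of addition.
# Naturals are encoded as nested tuples: ZERO, S(ZERO), S(S(ZERO)), ...

ZERO = ()

def S(n):
    """Successor: n -> n'."""
    return (n,)

def add(y, x):
    """f(y, x) from the definition above."""
    if y == ZERO:            # (1) for all x in N, f(0, x) = x
        return x
    return S(add(y[0], x))   # (2) f(y', x) = (f(y, x))'

ONE = S(ZERO)
TWO = S(ONE)
THREE = S(TWO)

assert add(ONE, ONE) == TWO            # 1 + 1 = 2
assert add(ONE, TWO) == THREE          # 1 + 2 = 3
assert add(TWO, ONE) == add(ONE, TWO)  # commutativity, spot-checked
```

Running it, add(ONE, ONE) evaluates to literally the same term as TWO, which is the formal content of 1 + 1 = 2 in this encoding.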

This kind of approach (first-order PA) is older in style; you tend to see it more in older books like "Set Theory and Logic" by Robert Stoll and "Foundations of Analysis" by Edmund Landau. The modern approach is more set-theoretical, and usually based on ZF rather than PA. You start out defining numbers as sets, and then define basic arithmetic operations as operations on sets.

I was cleaning out my links the other day and came across this, which describes the set-theoretical approach.

http://www.uwec.edu/andersrn/sets.htm
 
  • #16
Yes thanks guys

but can someone just prove addition please!
:mad:
 
  • #17
One doesn't prove addition, one proves a theorem about addition.

The proof you want, that if 1+1=2, then 1+2=3 can be written as:

1 + 2
= 1 + (1 + 1)   {definition of 2}
= (1 + 1) + 1   {associativity}
= 2 + 1         {definition of 2}
= 3             {definition of 3}

assuming that the integers are a ring.

if you want to prove that from first principles then you're welcome to, but don't expect many mathematicians to care these days.
 
  • #18
Maybe this should be moved to the philosophy forum.
 
  • #19
You haven't even said in which set, but if we were to talk about the natural numbers and Peano's axioms, here is a proof that 1 + 1 = 2:

Peano's axioms:

1) 0 is a member of N.

2) For each element n that is a member of N there exists an element n* (which is a member of N also) called the successor of n.

3) 0 is not the successor of any element in N.

4) For each pair n, m in N, if n is not equal to m, then n* is not equal to m*.

5) If A is a subset of N, 0 is an element of A, and p being a member of A implies that p* is a member of A, then A = N.

Let 0* = 1 and 1* = 2

Define addition as:

n + 1 = n* (note this doesn't fully define addition, but it is sufficient for our purposes; to fully define addition we would also say n + p* = (n + p)*)

therefore:

1 + 1 = 1* = 2

Two things to notice here: firstly, the proof is very, very trivial and in all honesty it probably wasn't worth doing (more fool me)! The second thing to notice is that, except for the very last line, it is just a list of definitions; this is because addition is not something we prove, it is something we define.
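For what it's worth, in a modern proof assistant this derivation is essentially automatic, because both sides of the equation reduce to the same successor term by definition. A sketch in Lean 4 syntax (the same idea works in any system where natural-number addition is defined by recursion on the successor):

```lean
-- In Lean 4, Nat.add is defined by recursion on the second argument:
--   n + 0 = n,  n + (m + 1) = (n + m) + 1
-- so 1 + 1 unfolds to the same successor term as 2, and the proof is reflexivity.
example : 1 + 1 = 2 := rfl
example : 1 + 2 = 3 := rfl
```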
 
  • #20
in math, addition is DEFINED to go along with how we feel the world works.

Sorry, I feel the need to nitpick!

Addition is defined from axioms, but those axioms were selected because we feel they reflect how the world works.
 
  • #21
i'll post this one more time, jcsd said basically the same thing:

"and if you don't think your question is vague, think about the fact that the statements 1+1=2 and 1+2=3 are meaningless until 1,2,3,+, and = are defined. and once they are, what do you have left to prove?"

ok?


ok?


PLEASE don't say "can someone please just prove addition?" again. rephrase your question if you're still not satisfied.


and about what hurkyl said about it really being axioms that define addition... yea, that was sort of what i meant. i don't see a difference in saying "addition was defined such that..." and "the axioms addition is based on were defined such that..." i haven't studied formal logic that much, so maybe I am not being excruciatingly precise in my terms.
 
  • #22
The problem I've seen is that if you say "addition is defined to reflect how the world works", it seems to suggest that the definition of addition would change if we discover the world works differently, and that you need empirical justification for mathematical facts, rather than logical derivation.
 
  • #23
well that is exactly what i mean. and if we find out tomorrow that putting 2 apples in a bag and then putting another 2 leaves 6 apples, we had better change our definition of addition, which is to say, formulate new axioms. sure, math is independent of reality up to a point. you can define a system where addition is not commutative, or division is, or whatever you like. but the useful systems are those that reflect reality in some way, the best example being the real numbers. and if reality were to change, these would have to as well.
 
  • #24
The difference between "changing the axioms" and "formulating a new theory with new axioms" is important, I'm just trying to keep them from getting confused with each other.
 
  • #25
So you can't actually prove addition then?
 
  • #26
drcrabs,

If you understood what's been said by matt grime or jcsd above (on constructing the naturals), you wouldn't ask this question again!

So, since you don't understand what they're saying, I think you should just be patient and go through school and college...
 
  • #27
So what are you trying to say?
That you can't prove addition?
 
  • #28
We are saying that the question is nonsensical.
 
  • #29
Okay, first of all, you don't "prove addition", you DEFINE it!

Crankfan's method is the right "mathematical" way. You don't really need to "prove" 1 + 1 = 2 because basically, that's the way 2 is defined: 2 IS the "successor" of 1, and the successor of any number n is n+1. (Actually, using Peano's axioms, one defines successor first and then DEFINES addition in that way.)
 
  • #30
drcrabs said:
So what are you trying to say?
That you can't prove addition?

Drcrabs, I believe that your question is quite valid. I also believe that it belongs in the realm of mathematics even though most mathematicians would shove such questions onto the philosophy department.

The first thing that you need to understand is that mathematics is NOT a science. That is to say that it does not subscribe to, or use, the scientific method and for this reason it is incorrect to call it a science. It is based largely on axiomatic logic. Although the very foundation of mathematics (set theory) is actually based on a logical contradiction.

The definition of addition within mathematics is based on accepting the axioms. Axioms do not need to be proven, they merely need to be accepted. Once you have accepted the axioms then you can say something about how the axioms "operate" as in the case of addition.

Therefore in a very real sense it is true to say that mathematics cannot prove addition. Mathematicians merely accept the axioms that lead to a definition of the operation of addition. To "prove" addition in mathematics you must first accept all of the axioms without proof. So at the most fundamental level mathematics cannot prove addition.

There is a real problem with mathematics in set theory actually. It has been swept under the carpet for the past 200 years and continues to be swept under the carpet to this day. Prior to the formal definition of set theory there was no problem because the whole thing was based on intuition. Of course, that was actually a problem since every human has their own idea of what they see as "intuitive". Thus the need for a formal definition of Set Theory, and so in the early 1900s Georg Cantor got to impose his intuition onto everyone else by creating the formal foundation for set theory.

Other posters have mentioned the Peano Axioms, which is kind of ironic because Giuseppe Peano actually totally disagreed with Georg Cantor's formalism. Peano wanted to define set theory starting with the idea of a thing (an abstract concept that he called Unity). Unfortunately Peano wasn't clever enough to come up with a working definition for his idea of Unity that the mathematical community would accept. So Georg Cantor stepped up to the plate and offered nothing as the basis of Set Theory. Yes, he had nothing to offer, which was the concept of zero. He defined the number One based on the concept of zero. One is currently formally and officially defined as the set containing the empty set. The empty set of course being nothing, or zero. It's a long story and I won't go into the detail, but needless to say the mathematical community thought that Georg Cantor was a genius to be able to define the number One based on nothing! They loved it in particular because it allowed the natural numbers to be defined out of nothing. They could be "purely abstract" and untainted by any conceptual ideas of unity.

This was actually a bad thing in my humble opinion. And this is not just my opinion. Many mathematicians at the time were extremely upset by Cantor's introduction of this phantom idea of an empty set. Peano certainly disagreed with Cantor, Leopold Kronecker was fit to be tied and was perhaps Cantor's biggest critic. In fact one great mathematician, Henri Poincare expressed his disapproval, stating that Cantor's set theory would be considered by future generations as "a disease from which one has recovered." I absolutely agree with Poincare, and I feel that it's quite sad that the time has not yet ripened when the mathematical community will realize Cantor's folly.

In any case, because of Cantor's abstract foundation of nothing, mathematics is unable to prove addition. This would not be the case had Peano been permitted to introduce his concept of Unity.

Many mathematicians today would argue that all of this is mere philosophy. I totally disagree. I believe it goes to the heart of mathematics. Moreover, if Peano's idea of Unity is permitted to be the foundation of set theory then mathematics could indeed become a true science in the sense that the scientific method would then be applicable to mathematics. With Cantor's empty set theory there simply isn't anything there to which the scientific method can be applied. Cantor really did succeed in basing Set Theory on nothing!

Cantor's Set Theory (which is the basis of all modern mathematics) cannot be used to prove addition. This is because Cantor's fundamental element (the empty set) has no property of individuality. It is a phantom abstract idea that is seriously flawed when it comes to any concept of quantity.

Now for the moment let us assume that Peano's idea of unity had been accepted. Peano used the idea of One as the foundation of set theory. Zero is not a number at all in this formalism. Instead it represents the absence of number, or the absence of quantity. It is still a useful symbol and would be used in very much the same way that it is used today. Only the fundamental nature of its meaning would have been changed. In other words, zero would not be a number, but rather a symbol used to represent the absence of any quantity. It would not "officially" be a number by definition. It would become nothing more than a place-holding symbol that represents the absence of number. Mathematicians actually recognize this property of zero to some extent, but for some reason they continue to accept Cantor's formalism stating that zero is indeed a valid number.

I won't get into all of the far-reaching implications of all of this other than to say that Group Theory would probably be the most affected branch of mathematics were Set Theory to be changed.

However, just for the sake of proving addition let's assume that Peano's Set Theory had been accepted. Not the Peano Axioms, but his entire set theory which had formally been rejected when Cantor's empty-set theory was accepted.

Using Peano's set theory the number One is defined as Unity. Peano was unable to define what he meant by Unity in a way that the mathematical community would accept. (that's a shame) But we all intuitively know what unity means because this is in fact how we all intuitively perceive and experience the idea of number. When we first learn the idea of number we are taught this concept using flash cards with little pictures of individual things on them. The numbers are comprehended as collections (or sets) of these individual things. What teachers and mathematicians fail to bring to our attention is the importance of the prerequisite idea of individuality. We can't even begin to count things unless we can recognize the property of individuality of the objects that we are counting. So recognizing and defining this property of individuality is paramount to any concept of Set Theory or mathematics. Cantor swept this concept of individuality under the carpet by introducing the concept of an "empty set". He genuinely thought that by doing this he was actually defining individuality, and he was. Unfortunately he was defining a "qualitative" idea of individuality rather than a "quantitative" idea. In other words, qualitatively we can have only ONE empty set. Any set that is empty *is* the empty set. Therefore there is only ONE empty set qualitatively speaking. Unfortunately mathematics is a quantitative idea, so defining the concept of ONE based purely on a qualitative idea isn't really a solid foundation. Humanity will discover Cantor's folly at some time in the future; this is inevitable.

Getting back to Peano's quantitative idea of ONE, we can define the meaning of addition. And moreover we can prove that 1+1=2. The proof is based on the idea of individuality (or Unity). Given that the number ONE is defined as a complete definition of individuality for whatever it is that we are counting, the proof should be automatically evident from this definition. To say 1+1=2 simply means that a quantity of one individual thing collected together with another quantity of one individual thing is the same as two individual things.

Or if you want a more rigorous proof, you can view it as: 2 is the number such that after removing a quantity of one individual thing there is only one individual thing left.

The concept of individuality necessarily has to do with the definition of the "things" that are being quantified. Make no mistake about this. The one thing that the mathematical community liked so much about Cantor's empty set was that it seemed to free numbers from the idea of a "thing". It makes them "pure" so that we can speak about number without attaching them to any "things". This is an illusion. Cantor's "thing" is "nothing"! He called his "nothing" an empty set. When mathematicians say 1+1=2, what they are really saying is that 1 empty set + 1 empty set = a set containing 2 empty sets!

Cantor did not remove the concept of number from the concept of a thing. All he did was define a thing that has no quantitative property of individuality. It was actually a very bad idea. And I believe that Poincare was right and at some future time the mathematical community will wake up to Cantor's folly.

The biggest problem with Cantor's empty set is that it has no quantitative property of individuality. In other words, there's no way to define that you have only ONE empty set. An empty set is nothing. Can you have two nothings? If not, then how can you claim to have only ONE?
This is serious stuff and you have recognized it. There is no way to prove addition in mathematics because it is based on empty set theory. Until that is corrected mathematics will never be able to prove addition, nor can it ever claim to be a science. Although many people believe that mathematics is a science anyway. But by definition of the scientific method it cannot be a science as long as it's based on the concept of nothing.

On a more intuitive level, just think about the concept of a so-called "pure" number. What's pure about it? Numerals are not numbers! They are just the symbols that are used to represent the idea of number. Number itself is a quantitative idea. Yet we cannot intuitively have an idea of quantity unless we are thinking about a collection of many individual "things". Therefore to claim that we can imagine an idea of number that is not associated with the idea of individual "things" is total nonsense. It's an incomprehensible idea. Quantity makes no sense outside of the idea of a collection of things. Therefore to claim that we can imagine an idea of a so-called "pure" number that is removed from the idea of a collection of things is absurd.

Number is the idea of a collection of things. And this is true whether we are using things that have clearly defined properties of individuality, or whether we are pretending to give nothing a property of individuality. When we start thinking of nothing as though it is an individual thing we are only fooling ourselves. To begin with the very moment we do that we have a logical contradiction on our hands. The empty set is no longer empty. It contains this individual thing called nothing! We may as well have started with basketballs as to have started with nothing if we are going to do that.

You're onto a very real and serious concept. Keep at it and study the history of set theory! Maybe you will be the one who will finally convince the mathematical community that Poincare was right!

By the way, the way to prove that Cantor's empty set theory is logically inconsistent is through Group Theory. So if you want to make progress in this endeavor that's the place to focus!

Thanks for starting this thoughtful thread! :wink:
 
  • #31
Wow, what a long and completely off topic post.

One cannot "prove addition". It simply doesn't make sense as a sentence in the English Language.

The OP was asked to "prove red" and he didn't (though he appears to think he did). He gave a "definition" of red. That in itself doesn't "prove" red. What does prove even mean in that context?
 
  • #32
matt grime said:
Wow, what a long and completely off topic post.

One cannot "prove addition". It simply doesn't make sense as a sentence in the English Language.

The OP was asked to "prove red" and he didn't (though he appears to think he did). He gave a "definition" of red. That in itself doesn't "prove" red. What does prove even mean in that context?

I don't see the connection between "proving red" and "proving addition". Red is a phenomenon. Addition is a mathematical process (an operation on sets).

To ask that addition be "proven" is to ask that the mathematical process, or operation of addition, be logically justified. It makes sense to me, and seems like a valid question that any student should be interested in.
 
  • #33
NeutronStar said:
To ask that addition be "proven" is to ask that the mathematical process, or operation of addition, be logically justified. It makes sense to me, and seems like a valid question that any student should be interested in.


Erm, no, that isn't quite what prove means in mathematics, is it? To prove something one needs a hypothesis from which to make a deduction.

Ok, so prove minus, then, prove 3, prove composition of functions.
 
  • #34
Better still, prove eating or prove standing. After all, these too are processes or operations.
 
  • #35
NeutronStar said:
To ask that addition be "proven" is to ask that the mathematical process, or operation of addition, be logically justified.

A mathematical operation need not be logically justified. There's no need for that. Once an operation/function is defined on a set, there is no justification required.

While "explain addition" would be a good question, "prove addition" makes no sense.
 
  • #36
You should start with proving proof with no idea about what must be present in a valid proof. Or perhaps not..
 
  • #37
matt grime said:
Erm, no, that isn't quite what prove means in mathematics, is it? To prove something one needs a hypothesis from which to make a deduction.

Ok, so prove minus, then, prove 3, prove composition of functions.


IF a=1+1 THEN a=2

There is a hypothesis and conclusion to prove.

Attempting to prove minus, or 3, makes no sense because there is no hypothesis and conclusion. With 1+1=2 there is.

I'm not sure about the composition of functions; that could probably be put into a conditional statement and proven. But to do so we would probably need to refer extensively to the definitions of functions and composition. To prove the conditional statement above we would likewise need to refer extensively to the definitions of number (specifically the numbers 1 and 2) and the definition of the operation of addition. While that may not actually "prove addition", it would at least show that addition is a valid logical concept. And in a very real sense that is proving it.

The bottom line that will be the death of this whole thing is that the operation of addition is dependent on the definition of numbers and vice versa. So the whole thing becomes circular. That's because of the empty set definition of numbers. If the numbers were defined on a concept of Unity then they wouldn't depend on the operation of addition for their definition, and the process would no longer be circular. In other words, it would be "externally" provable via other forms of logic.

I might add that if Peano's Unity had been accepted as the foundation of the definition of number, then Kurt Gödel's incompleteness theorem would no longer apply to mathematics because mathematics would no longer be a self-contained logical system.

Drcrabs is either going to have to accept the axiomatic methods of mathematics or prove to the mathematical community why it is flawed. The former is way easier. :biggrin:
 
  • #38
Nonsense!
You INTRODUCE the mathematical symbol "2" by defining
2=1+1
You've got some silly, unmathematical preconception about what "2" is; get rid of that.
 
  • #39
arildno said:
Nonsense!
You INTRODUCE the mathematical symbol "2" by defining
2=1+1
You've got some silly, unmathematical preconception about what "2" is; get rid of that.

You may very well be correct. But if I have an unmathematical preconception about what "2" is I'm afraid that you'll have to blame that on the educational institutions that I've been taught by.

I was introduced to the idea of number in both kindergarten and in early grade school using a concept of collections of things (sets). The idea of "2" is ingrained in my intellect as the quantity that, after having removed a quantity of 1, leaves only 1 remaining. (that's actually backwards from the way it is taught, but I think it makes for a more rigorous definition)

I don't think of "2" as a printed numeral that we are using here to communicate the idea of two. That's just a symbol, no different from the English word "T-W-O". The actual concept of "2" is a collection of individual elements such that after removing an individual element all that remains is enough "stuff" to define precisely the definition of whatever it is that is being quantified.

This may sound a bit complicated, but in reality I believe that this is everyone's everyday intuitive experience of the idea of quantity that we have come to formalise as numbers.

So based on this concept of unions of sets I have no problem at all comprehending addition. Unfortunately the most rigorous axiomatic definitions for addition do not permit this intuitive view. We must resort to Cantor's idea of collections of nothing.
 
  • #40
That isn't a proof, neutron star, but even then proving 1 + 1 = 2 (which you'll see was done earlier, if something so trivial can be called a proof) is not known as 'proving addition'.

We don't have to start out with the empty set in order to build Peano's axioms (I listed them above; notice no mention of the empty set), though it is possible to construct a model of Peano's axioms starting with the empty set:

0 = {}

0* = 1 = {0} = {{}}

(0*)* = 1* = 2 = {0,1} = {{{}},{}}

etc.

This was done by von Neumann after Peano, and there are other ways of constructing models of Peano's axioms.
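Von Neumann's construction can be mimicked directly with Python frozensets. This is an illustrative sketch of my own, not anything standard: each number is encoded as the set of all smaller numbers, and the successor of n is n U {n}:

```python
# Von Neumann naturals as frozensets: 0 = {}, n* = n U {n}.

zero = frozenset()

def succ(n):
    """Successor: n* = n | {n}."""
    return n | frozenset({n})

one = succ(zero)    # {0}    = {{}}
two = succ(one)     # {0, 1} = {{}, {{}}}
three = succ(two)   # {0, 1, 2}

assert one == frozenset({zero})
assert two == frozenset({zero, one})
assert len(three) == 3  # each ordinal contains exactly its predecessors
```

A nice side effect of this encoding is that len(n) recovers the ordinary integer value of n.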
 
  • #41
So basically after reading these pages, you are trying to tell me that you actually can't prove addition?
 
  • #42
jcsd said:
We don't have to start out with the empty set in order to build Peanos axioms
Technically I disagree with that. Peano relies on the idea of 1 which has been formally accepted by the mathematical community to be defined by Cantor's definition. Therefore any reference to the number 1 is automatically a reference to the empty set by default. (In other words, Peano doesn't actually define the number 1 in his axioms, he merely uses the preexisting concept)

jcsd said:
0 = {}

0* = 1 = {0} = {{}}

(0*)* = 1* = 2 = {0,1} = {{{}},{}}
Now this has always been of interest to me. Since 1 = {0} = {{}}
and 2 = {0,1} = {{{}},{}}, the whole intuitive idea of addition as the union of sets gets blown out of the water, because {{}} U {{}} does not equal {{{}},{}}. It would be {{},{}}. This is actually the way we perceive number as a property of the real universe, by the way. We don't perceive 2 as {{{}},{}}, we perceive it as {{},{}}. So why the need to define it in such an unnatural way?

If we define 2 as {{},{}} then it would be easy to prove addition as the union of sets. We can't prove that 1+1=2 using our current mathematical formalism because it makes no sense. :smile:

Instead we need to rely on axioms that merely state that it is true by the rule of formalisms rather than being able to prove it by logical deduction.
 
  • #43
matt grime said:
The OP was aksed to "prove red" and he didn't (though he appears to think he did). He gave a "definition" of red. That in itself doesn't "prove" red.

Yea waddup Grimey. I've noticed that you realized that i gave a definition. The purpose of the 'definition of red' post was to point out that the person who was asking me to prove red was getting off the topic.
 
  • #44
You're not quite making sense. Peano did not make any reference to the number 1 in his axioms; the only number explicitly mentioned is zero, and the only properties it has are those that are defined by the axioms (though you can start with the number one in order to construct the non-negative integers). In Peano's system of the natural numbers, 1 is just another name for 0* and the only properties it has are those that are defined by Peano's axioms. Generically, 1 is the multiplicative identity in a semiring (I guess, as N is certainly not a ring under normal addition and multiplication), but this is something that emerges out of the definition of multiplication and addition in this set, and we need not assume any preconceived notion of the number '1' other than the one given to us by our definitions.


In Von Neumann's construction the successor of some number n is simply n U {n}. Defining 2 as {{},{}} indeed makes no sense.
 
Last edited:
  • #45
drcrabs said:
So basically after reading these pages, you are trying tell me that you actually can't prove addition?

It's not a case of 'can't'; it's just that the question makes no sense.
 
  • #46
Addition may be the most basic mathematical concept. In fact, I think it's safe to say that addition came first, and the numbers were defined in terms of it. Is there any other way to define 2 besides 1+1? If so, then maybe there would be a way to prove addition, but I don't think there is.

Also, all this talk about set theory is, in my opinion, off topic. Set theory is not the only way to define addition, and it implies there is only one way it could work. It's possible to imagine a universe where 1+3 is not equal to 2+2. Addition is, whether you like it or not, an empirical law, from which its set-theoretic definition was derived.

When you gave the definition of red, you were actually proving my point. Addition, like the color red, is not something up for debate. It is defined, plain and simple. From a definition, we can derive all sorts of things about a thing's properties, like that addition is commutative, or that red's wavelength is 650 nm. But the definitions themselves are, in a sense, handed down by God.
(Red may have been a confusing example, because it is subjective, whereas we all believe addition to be objective and based on logic. My point was that both are so basic and innate that to prove them makes no sense.)
 
  • #47
StatusX said:
Also, all this talk about set theory is, in my opinion, off topic. Set theory is not the only way to define addition, and it implies there is only one way it could work. It's possible to imagine a universe where 1+3 is not equal to 2+2. Addition is, whether you like it or not, an empirical law, from which its set-theoretic definition was derived.

I totally agree that addition is an empirical law of our universe. Heck, I think anyone who's been paying any attention at all to human history can clearly see that our mathematics arose from observing the quantitative property of our universe.

I totally disagree, however, that Set Theory was derived from this empirical observation. I wish it were; then addition would be provable.

Finally, the idea that set theory is off topic in any discussion of addition is absurd, in my humble opinion. My reason for feeling this way is that it is through the very concept of sets (or collections of things) that the universe exhibits its quantitative nature. So even though we can disguise the idea of quantity (or the addition of quantities) behind other types of logic, that doesn't change the fact that it can always be reduced to ideas of collections of the fundamental property of individuality, for this is the nature of the universe from which the idea came.

There's just no getting around it. If we are talking about number, we are talking about quantity. And if we are talking about quantity, we are talking about collections of individual things. It's a basic truth of the universe. Any concept of number that is not the concept of a collection of things is simply an incorrect concept of number; such a concept is something other than the concept of number. So to even mention mathematics or number, we necessarily have to think in terms of sets. There's just no other way to comprehend it.

True, we can write up a bunch of rules and axioms and follow them. But is that truly comprehension? Especially when we can't even prove them?
 
  • #48
Anyone who starts to talk about 'empirical laws' when talking about math proofs is most definitely on the wrong track. Maths does not claim to describe reality; the axioms of mathematical systems do not come from the observation of reality, they are true for the system because we define them to be true. It is perfectly possible to have a statement that is axiomatic in one mathematical system but false in another.
 
  • #49
All I am saying is that addition is so basic, it is impossible to prove within math (i.e., using set theory). All you can do is define it to match empirical observations. To prove the physical law of addition is impossible, just like any other physical law. In general, I'd agree that math and reality are completely separate (although it's naive to take this too far, since math is only useful when it helps us in the real world), but I think addition may be the point at the bottom where they meet. If anything, this is a philosophical question.
 
Last edited:
  • #50
jcsd said:
Anyone who starts to talk about 'empirical laws' when talking about math proofs is most definitely on the wrong track. Maths does not claim to describe reality; the axioms of mathematical systems do not come from the observation of reality, they are true for the system because we define them to be true. It is perfectly possible to have a statement that is axiomatic in one mathematical system but false in another.

Had you been a student of Pythagoras he would have had you thrown overboard into the Aegean sea for making such a remark. :biggrin:

Actually I agree with you that modern day mathematicians view mathematics in a totally different way than many historical mathematicians did. And while I don't share this modern view I can't say that it is incorrect in this day and age. But what I can say is that modern day mathematics is not a correct model of the quantitative nature of our universe. :approve:

I usually put my view in a conditional statement and claim that the statement is true.

IF "mathematics is supposed to be a good model of the quantitative nature of our universe" THEN "mathematics is logically flawed".

I claim that this statement is true, although I certainly don't intend to prove it on an Internet board to a hostile audience. I'll save it for a personal lecture to people who are genuinely interested in hearing it. :smile:

I believe that most mathematicians wouldn't even be interested in hearing the proof because they would deny the hypothesis right off the bat. (This appears to be jcsd's stance: he simply doesn't believe that mathematics is supposed to be a good model of the quantitative nature of the universe, and therefore any proof showing that it isn't a good model is trivial and uninteresting.)

As a scientist I don't share his view. I believe that it is extremely important that mathematics properly model the quantitative nature of the universe. Therefore I am concerned with any flaws in mathematics that might make the above conditional statement true.
 