Is Addition Really a Basic Skill?

  • Thread starter: drcrabs
  • Tags: Addition
In summary, the thread discusses how humans have developed number systems, and how those systems are grounded in our intuition and observations of the world around us.
  • #1
drcrabs
I reckon addition is a basic skill. But can it be proved?
How?
 
  • #2
Good heavens! Have you ever gone through the checkout at the store? I have six of this and three of that...etc. Notice how addition is Abelian. Could it possibly not be? Like three pops + two soups not being equal to two soups + three pops? What would the world be like then?
 
  • #3
That kinda didn't answer my question.
 
  • #4
What is it you want to prove ?
 
  • #5
Addition!
 
  • #6
prove the color red
 
  • #7
DrCrabs

The visible red light has a wavelength of approximately 650 nm.
There are three types of color-sensitive cones in the retina of the human eye, corresponding roughly to red, green, and blue sensitive detectors.
Experiments have produced response curves for the three different kinds of cones in the retina of the human eye. The "green" and "red" cones are mostly packed into the fovea centralis. By population, about 64% of the cones are red-sensitive, about 32% green-sensitive, and about 2% blue-sensitive.
So if a body emits light with a wavelength of approximately 650 nm, the cones that respond to that wavelength absorb it, and hence we see the color red.
 
  • #8
Maybe i should specify what I am wanting.

Can someone prove that if

1 + 1 = 2

then

1 + 2 = 3
 
  • #9
First of all, that's not a proof; you're stating physical observations. Second, wow, what a waste of time. And third, you missed my point. Addition is such a basic, intuitive concept that it defies proof. We define formal systems that reflect our intuitive feeling of how addition works, and proofs can be set up within those formal systems. 2 is defined as the number equal to 1+1, and 3 is defined as the number equal to 2+1.

If you're asking me to prove the physical law that if I put two apples in a bag, then put another three in, and then count the total number of apples in the bag, I will always get five, I can't do that. Just like I can't prove the law of gravity is always true, even though overwhelming evidence shows it is. Any rules we formulate about the physical world are made with the assumption that the world works in a regular, consistent way. This assumption cannot be proven.
 
  • #10
All I want to know is how to prove addition.
Is that so hard to do?

It wasn't a waste of time. I know that stuff like the back of my hand.
 
  • #11
OMG, he's saying you CAN'T. It's not something you can prove; it just is what it is. In other words, it's not something that was built on foundations that show it's true.
 
  • #12
I thought I answered it. Your question is extremely vague, and if I didn't understand exactly what you were thinking, then you'll have to explain it better.

In math, addition is DEFINED to go along with how we feel the world works. Look up Peano's axioms for a formal way of defining addition. If you want to prove the physical property that our idea of addition is based on (i.e., the apples in a bag I described before), well, you can't.

Basically, any proof is based on initial assumptions. You can never say something is absolutely true; you can only say it is true if some other statement, the assumption, is true. The initial assumptions are usually based on human intuition or physically observed phenomena, but neither of these can be "proven." You can prove the Moon will orbit the Earth in an ellipse if gravity is an inverse-square law, but you can't prove that about gravity. It's based on observation, and there's no reason we couldn't wake up tomorrow and find there's a new law of gravity.


And if you don't think your question is vague, think about the fact that the statements 1+1=2 and 1+2=3 are meaningless until 1, 2, 3, +, and = are defined. And once they are, what do you have left to prove? (A better question would be to prove that addition is commutative, which is nontrivial and can be done from Peano's axioms.)
 
  • #13
Hi drcrabs,
I thought about your question, and there's not a simple answer, even though I'm sure many folks would dismiss this question as trivial. It occurred to me that you might be interested not simply in the "proof of addition", but in the general history of how human beings developed number systems and counting.
I have a wonderful book called The Universal History of Numbers by Georges Ifrah that you might enjoy. It covers human concepts of counting and its applications, from counting the number of fingers on our hands, to the invention of the abacus, to the underlying foundations of the modern computer.
It's really a fascinating read if you like that sort of historical data.
 
  • #14
1+1 = 2
subtract one from both sides
1 = 1

lol.
 
  • #15
It depends on where you start from.

From something like the Peano Axioms you can prove the existence of all of the arithmetic functions commonly used with the naturals and ensure that they have the properties that we intuitively expect them to have.

After that you have to develop Z, Q, R, etc., and do the same as described above at each step of the development. It's not something that can be properly described in a single post, but to give you a very basic idea of how this works:

In PA, the existence of 0 and a successor operation ' is postulated so that we can think of the natural numbers as the sequence 0, 0', 0'', 0''' and so on...

To prove addition exists we want to prove the existence of a unique function f: NxN->N, with the properties

(1) for all x in N, f(0,x) = x
(2) for all x and y in N, f(y',x) = (f(y,x))'

After this is proven, it's then shown that this function f, or addition has all of the properties we intuitively expect it to have: it's commutative, associative, etc.

This kind of approach (first-order PA) is older in style; you tend to see it more in older books like "Set Theory and Logic" by Robert Stoll and "Foundations of Analysis" by Edmund Landau. The modern approach is more set-theoretical, and usually based on ZF rather than PA. You start out defining numbers as sets, and then define basic arithmetic operations as operations on sets.

I was cleaning out my links the other day and came across this, which describes the set-theoretical approach.

http://www.uwec.edu/andersrn/sets.htm
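As a rough, illustrative sketch of that set-theoretic approach (the code and helper names below are my own, not taken from the linked page), here is the von Neumann encoding, in which each natural number is the set of all smaller naturals:

```python
# Von Neumann encoding of the naturals: 0 is the empty set and the
# successor of n is the union n | {n}, so n is the set {0, 1, ..., n-1}.
# Illustrative sketch only -- the names here are my own.

def succ(n: frozenset) -> frozenset:
    """Successor operation: n' = n union {n}."""
    return n | frozenset([n])

zero = frozenset()   # {} -- the empty set
one = succ(zero)     # {0}
two = succ(one)      # {0, 1}
three = succ(two)    # {0, 1, 2}

# Encoded this way, the number n is literally a set with n elements:
print(len(zero), len(one), len(two), len(three))  # 0 1 2 3
```

Nothing here proves anything, of course; it just shows how "numbers as sets" can be made concrete enough for a machine to manipulate.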
 
  • #16
Yes thanks guys

but can someone just prove addition please!
:mad:
 
  • #17
One doesn't prove addition, one proves a theorem about addition.

The proof you want, that if 1+1=2, then 1+2=3 can be written as:

1 + 2
= 1 + (1 + 1) {since 2 is defined as 1 + 1}
= (1 + 1) + 1 {associativity}
= 2 + 1
= 3 {definition of 3}

assuming that the integers are a ring.

if you want to prove that from first principles then you're welcome to, but don't expect many mathematicians to care these days.
 
  • #18
Maybe this should be moved to the philosophy forum.
 
  • #19
You haven't even said in which set, but if we were to talk about the natural numbers and Peano's axioms, here is a proof that 1 + 1 = 2:

Peano's axioms:

1) 0 is a member of N.

2) For each element n that is a member of N there exists an element n* (which is a member of N also) called the successor of n.

3) 0 is not the successor of any element in N.

4) For each pair n, m in N, if n is not equal to m then n* is not equal to m*.

5) If A is a subset of N, 0 is an element of A, and p being a member of A implies that p* is a member of A, then A = N.

Let 0* = 1 and 1* = 2

Define addition as:

n + 1 = n* (note this doesn't fully define addition, but it is sufficient for our purposes; to fully define addition we would also say n + p* = (n + p)*)

therefore:

1 + 1 = 1* = 2

Two things to notice here. Firstly, the proof is very, very trivial, and in all honesty it probably wasn't worth doing (more fool me)! The second thing to notice is that, except for the very last line, it is just a list of definitions; this is because addition is not something we prove, it is something we define.
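To make the definitions above concrete, here is a small sketch in Python (the tuple encoding of numerals is my own choice, not part of Peano's axioms); it uses the standard base clause n + 0 = n together with the clause n + p* = (n + p)* quoted above:

```python
# Peano numerals encoded as nested tuples: () is 0 and (n,) is n*.
# Sketch only -- the encoding and names are my own.

zero = ()

def succ(n):
    """The successor n*: wrap n in one more tuple layer."""
    return (n,)

one = succ(zero)
two = succ(one)
three = succ(two)

def add(n, p):
    """Addition via the defining clauses: n + 0 = n and n + p* = (n + p)*."""
    if p == zero:
        return n
    return succ(add(n, p[0]))  # p = q* where q = p[0], so n + p = (n + q)*

# 1 + 1 = 1* = 2, exactly as in the post above:
print(add(one, one) == two)    # True
# and the thread's original question, "if 1+1=2 then 1+2=3":
print(add(one, two) == three)  # True
```

Proving that this add is commutative takes induction, which is exactly why commutativity is the nontrivial statement mentioned earlier in the thread.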
 
  • #20
in math, addition is DEFINED to go along with how we feel the world works.

Sorry, I feel the need to nitpick!

Addition is defined from axioms, but those axioms were selected because we feel they reflect how the world works.
 
  • #21
I'll post this one more time; jcsd said basically the same thing:

"and if you don't think your question is vague, think about the fact that the statements 1+1=2 and 1+2=3 are meaningless until 1,2,3,+, and = are defined. and once they are, what do you have left to prove?"

ok?


ok?


PLEASE don't say "can someone please just prove addition?" again. Rephrase your question if you're still not satisfied.


And about what hurkyl said about it really being axioms that define addition... yeah, that was sort of what I meant. I don't see a difference between saying "addition was defined such that..." and "the axioms addition is based on were defined such that..." I haven't studied formal logic that much, so maybe I am not being excruciatingly precise in my terms.
 
  • #22
The problem I've seen is that if you say "addition is defined to reflect how the world works", it seems to suggest that the definition of addition would change if we discover the world works differently, and that you need empirical justification for mathematical facts, rather than logical derivation.
 
  • #23
Well, that is exactly what I mean. And if we find out tomorrow that putting 2 apples in a bag and then putting in another 2 leaves 6 apples, we had better change our definition of addition, which is to say, formulate new axioms. Sure, math is independent of reality up to a point. You can define a system where addition is not commutative, or division is, or whatever you like. But the useful systems are those that reflect reality in some way, the best example being the real numbers. And if reality were to change, these would have to as well.
 
  • #24
The difference between "changing the axioms" and "formulating a new theory with new axioms" is important, I'm just trying to keep them from getting confused with each other.
 
  • #25
So you can't actually prove addition then?
 
  • #26
drcrabs,

If you understood what's been said by matt grime or jcsd above (on constructing the naturals), you wouldn't ask this question again!

So, since you don't understand what they're saying, I think you should just be patient and go through school and college...
 
  • #27
So what are you trying to say?
That you can't prove addition?
 
  • #28
We are saying that the question is nonsensical.
 
  • #29
Okay, first of all, you don't "prove addition", you DEFINE it!

Crankfan's method is the right "mathematical" way. You don't really need to "prove" 1+1=2 because basically that's the way 2 is defined: 2 IS the successor of 1, and the successor of any number n is n+1. (Actually, using Peano's axioms, one defines successor first and THEN defines addition in that way.)
 
  • #30
drcrabs said:
So what are you trying to say?
That you can't prove addition?

Drcrabs, I believe that your question is quite valid. I also believe that it belongs in the realm of mathematics, even though most mathematicians would shove such questions onto the philosophy department.

The first thing that you need to understand is that mathematics is NOT a science. That is to say that it does not subscribe to, or use, the scientific method and for this reason it is incorrect to call it a science. It is based largely on axiomatic logic. Although the very foundation of mathematics (set theory) is actually based on a logical contradiction.

The definition of addition within mathematics is based on accepting the axioms. Axioms do not need to be proven, they merely need to be accepted. Once you have accepted the axioms then you can say something about how the axioms "operate" as in the case of addition.

Therefore, in a very real sense it is true to say that mathematics cannot prove addition. Mathematicians merely accept the axioms that lead to a definition of the operation of addition. To "prove" addition in mathematics you must first accept all of the axioms without proof. So at the most fundamental level, mathematics cannot prove addition.

There is a real problem with mathematics in set theory actually. It has been swept under the carpet for the past 200 years and continues to be swept under the carpet to this day. Prior to the formal definition for set theory there was no problem because the whole thing was based on intuition. Of course, that was actually a problem since every human has their own idea of what they see as "intuitive". Thus the need for a formal definition for Set Theory and so in the early 1900's Georg Cantor got to impose his intuition onto everyone else by creating the formal foundation for set theory.

Other posters have mentioned the Peano Axioms, which is kind of ironic because Giuseppe Peano actually totally disagreed with Georg Cantor's formalism. Peano wanted to define set theory starting with the idea of a thing (an abstract concept that he called Unity). Unfortunately Peano wasn't clever enough to come up with a working definition for his idea of Unity that the mathematical community would accept. So Georg Cantor stepped up to the plate and offered nothing as the basis of Set Theory. Yes, he had nothing to offer, which was the concept of zero. He defined the number One based on the concept of zero. One is currently formally and officially defined as the set containing the empty set. The empty set of course being nothing, or zero. It's a long story and I won't go into the details, but needless to say the mathematical community thought that Georg Cantor was a genius to be able to define the number One based on nothing! They loved it in particular because it allowed the natural numbers to be defined out of nothing. They could be "purely abstract" and untainted by any conceptual ideas of unity.

This was actually a bad thing in my humble opinion. And this is not just my opinion. Many mathematicians at the time were extremely upset by Cantor's introduction of this phantom idea of an empty set. Peano certainly disagreed with Cantor, Leopold Kronecker was fit to be tied and was perhaps Cantor's biggest critic. In fact one great mathematician, Henri Poincare expressed his disapproval, stating that Cantor's set theory would be considered by future generations as "a disease from which one has recovered." I absolutely agree with Poincare, and I feel that it's quite sad that the time has not yet ripened when the mathematical community will realize Cantor's folly.

In any case, because of Cantor's abstract foundation of nothing, mathematics is unable to prove addition. This would not be the case had Peano been permitted to introduce his concept of Unity.

Many mathematicians today would argue that all of this is mere philosophy. I totally disagree. I believe it goes to the heart of mathematics. Moreover, if Peano's idea of Unity is permitted to be the foundation of set theory then mathematics could indeed become a true science, in the sense that the scientific method would then be applicable to mathematics. With Cantor's empty set theory there simply isn't anything there to which the scientific method can be applied. Cantor really did succeed in basing Set Theory on nothing!

Cantor's Set Theory (which is the basis of all modern mathematics) cannot be used to prove addition. This is because Cantor's fundamental element (the empty set) has no property of individuality. It is a phantom abstract idea that is seriously flawed when it comes to any concept of quantity.

Now for the moment let us assume that Peano's idea of unity had been accepted. Peano used the idea of One as the foundation of set theory. Zero is not a number at all in this formalism. Instead it represents the absence of number, or the absence of quantity. It is still a useful symbol and would be used in very much the same way that it is used today. Only the fundamental nature of its meaning will have been changed. In other words, zero would not be a number, but rather a symbol used to represent the absence of any quantity. It would not "officially" be a number by definition. It would become nothing more than a place-holding symbol that represents the absence of number. Mathematicians actually recognize this property of zero to some extent, but for some reason they continue to accept Cantor's formalism stating that zero is indeed a valid number.

I won't get into all of the far-reaching implications of all of this other than to say that Group Theory would probably be the most affected branch of mathematics were Set Theory to be changed.

However, just for the sake of proving addition let's assume that Peano's Set Theory had been accepted. Not the Peano Axioms, but his entire set theory which had formally been rejected when Cantor's empty-set theory was accepted.

Using Peano's set theory, the number One is defined as Unity. Peano was unable to define what he meant by Unity in a way that the mathematical community would accept. (That's a shame.) But we all intuitively know what unity means, because this is in fact how we all intuitively perceive and experience the idea of number. When we first learn the idea of number we are taught this concept using flash cards with little pictures of individual things on them. The numbers are comprehended as collections (or sets) of these individual things. What teachers and mathematicians fail to bring to our attention is the importance of the prerequisite idea of individuality. We can't even begin to count things unless we can recognize the property of individuality of the objects that we are counting. So recognizing and defining this property of individuality is paramount to any concept of Set Theory or mathematics. Cantor swept this concept of individuality under the carpet by introducing the concept of an "empty set". He genuinely thought that by doing this he was actually defining individuality, and he was. Unfortunately he was defining a "qualitative" idea of individuality rather than a "quantitative" idea. In other words, qualitatively we can have only ONE empty set. Any set that is empty *is* the empty set. Therefore there is only ONE empty set, qualitatively speaking. Unfortunately mathematics is a quantitative idea, so defining the concept of ONE based purely on a qualitative idea isn't really a solid foundation. Humanity will discover Cantor's folly at some time in the future; this is inevitable.

Getting back to Peano's quantitative idea of ONE, we can define the meaning of addition. And moreover we can prove that 1+1=2. The proof is based on the idea of individuality (or Unity). Given that the number ONE is defined as a complete definition of individuality for whatever it is that we are counting, the proof should be automatically evident from this definition. To say 1+1=2 simply means that a quantity of one individual thing collected together with another quantity of one individual thing is the same as two individual things.

Or if you want a more rigorous proof, you can view it as: 2 is the number such that after removing a quantity of one individual thing there is only one individual thing left.

The concept of individuality necessarily has to do with the definition of the "things" that are being quantified. Make no mistake about this. The one thing that the mathematical community liked so much about Cantor's empty set was that it seemed to free numbers from the idea of a "thing". It made them "pure", so that we can speak about numbers without attaching them to any "things". This is an illusion. Cantor's "thing" is "nothing"! He called his "nothing" an empty set. When mathematicians say 1+1=2, what they are really saying is that 1 empty set + 1 empty set = a set containing 2 empty sets!

Cantor did not remove the concept of number from the concept of a thing. All he did was define a thing that has no quantitative property of individuality. It was actually a very bad idea. And I believe that Poincare was right and at some future time the mathematical community will wake up to Cantor's folly.

The biggest problem with Cantor's empty set is that it has no quantitative property of individuality. In other words, there's no way to define that you have only ONE empty set. An empty set is nothing? Can you have two nothings? If not, then how can you claim to have only ONE?
This is serious stuff and you have recognized it. There is no way to prove addition in mathematics because it is based on empty set theory. Until that is corrected, mathematics will never be able to prove addition, nor can it ever claim to be a science. Many people believe that mathematics is a science anyway, but by the definition of the scientific method it cannot be a science as long as it's based on the concept of nothing.

On a more intuitive level, just think about the concept of a so-called "pure" number. What's pure about it? Numerals are not numbers! They are just the symbols that are used to represent the idea of number. Number itself is a quantitative idea. Yet we cannot intuitively have an idea of quantity unless we are thinking about a collection of many individual "things". Therefore to claim that we can imagine an idea of number that is not associated with the idea of individual "things" is total nonsense. It's an incomprehensible idea. Quantity makes no sense outside of the idea of a collection of things. Therefore to claim that we can imagine an idea of a so-called "pure" number that is removed from the idea of a collection of things is absurd.

Number is the idea of a collection of things. And this is true whether we are using things that have clearly defined properties of individuality, or whether we are pretending to give nothing a property of individuality. When we start thinking of nothing as though it is an individual thing, we are only fooling ourselves. To begin with, the very moment we do that we have a logical contradiction on our hands. The empty set is no longer empty: it contains this individual thing called nothing! We may as well have started with basketballs as to have started with nothing, if we are going to do that.

You're onto a very real and serious concept. Keep at it and study the history of set theory! Maybe you will be the one who will finally convince the mathematical community that Poincare was right!

By the way, the way to prove that Cantor's empty set theory is logically inconsistent is through Group Theory. So if you want to make progress in this endeavor, that's the place to focus!

Thanks for starting this thoughtful thread! :wink:
 
  • #31
Wow, what a long and completely off topic post.

One cannot "prove addition". It simply doesn't make sense as a sentence in the English Language.

The OP was asked to "prove red" and he didn't (though he appears to think he did). He gave a "definition" of red. That in itself doesn't "prove" red. What does "prove" even mean in that context?
 
  • #32
matt grime said:
Wow, what a long and completely off topic post.

One cannot "prove addition". It simply doesn't make sense as a sentence in the English Language.

The OP was asked to "prove red" and he didn't (though he appears to think he did). He gave a "definition" of red. That in itself doesn't "prove" red. What does "prove" even mean in that context?

I don't see the connection between "proving red" and "proving addition". Red is a phenomenon. Addition is a mathematical process (an operation on sets).

To ask that addition be "proven" is to ask that the mathematical process, or operation of addition, be logically justified. It makes sense to me, and seems like a valid question that any student should be interested in.
 
  • #33
NeutronStar said:
To ask that addition be "proven" is to ask that the mathematical process, or operation of addition, be logically justified. It makes sense to me, and seems like a valid question that any student should be interested in.


Erm, no, that isn't quite what prove means in mathematics, is it? To prove something one needs a hypothesis from which to make a deduction.

Ok, so prove minus, then, prove 3, prove composition of functions.
 
  • #34
Better still, prove eating or prove standing. After all, these too are processes or operations.
 
  • #35
NeutronStar said:
To ask that addition be "proven" is to ask that the mathematical process, or operation of addition, be logically justified.

A mathematical operation need not be logically justified. There's no need for that. Once an operation/function is defined on a set, there is no justification required.

While "explain addition" would be a good question, "prove addition" makes no sense.
 
