I just read an article (http://www.maa.org/devlin/devlin_06_08.html) saying that multiplication is not repeated addition. I am a first-year engineering student, and I am very interested in mathematics. The picture I have had of multiplication for ~12 years is definitely of repeated addition. Wikipedia also seems to give this definition: http://en.wikipedia.org/wiki/Multiplication

I read that there are four elementary operations of arithmetic: addition, subtraction, multiplication, and division. I notice that subtraction is the inverse of addition, and division the inverse of multiplication. So, if multiplication were indeed repeated addition, there would really only be two elementary operations: addition and subtraction. And since subtraction is inverse addition, that would mean that division is repeated subtraction, and it certainly isn't.

As a side note: I actually remember seeing it like this when I was very young and first learning arithmetic. Because I saw multiplication as repeated addition, it seemed to me that division was really not like the others, although I just accepted this and haven't thought about it since.

tl;dr: If multiplication isn't repeated addition, what is it? Do you think I am missing something in my explanation above?
Hmm, I think what he's getting at is the following: multiply 0.1 by 0.1. The answer is 0.01. But this is not repeated addition, since you can't add up 0.1 to itself 0.1 times. Of course, in higher mathematics, addition and multiplication can be totally different things. The multiplication of matrices cannot be seen as repeated addition of matrices, and there are many other kinds of examples.

What the author of the article wants is for teachers to explain to their students that there are two kinds of operations on the integers, addition and multiplication, and that these operations have nothing to do with each other. While this is certainly true in higher mathematics, I think the author is barking up the wrong tree. I think it's very useful for children to look at multiplication as repeated addition, and I wouldn't want my child to be taught otherwise. Of course, a few years later, we can say that this is not quite true, but I think it is very useful to start by thinking of multiplication as repeated addition. So, while he's technically right, I don't agree with him...
Try to multiply e by pi, where e = exp(1) ≈ 2.718... and pi ≈ 3.14... You'll see it's not repeated addition at all.
Multiplication is bilinear. That means (a+b)c = ac + bc and a(b+c) = ab + ac; in other words, the distributive property holds. In the very special case that a can be written as repeated addition of the multiplicative unit:

a = 1 + 1 + 1 + ... + 1

then ab can be written as repeated addition of b:

ab = (1 + 1 + ... + 1)b = 1b + 1b + ... + 1b = b + b + ... + b
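The unfolding above is easy to sketch in code. Here is a minimal Python illustration (the function name and structure are mine, not from the post), valid only under the stated assumption that a is a natural number:

```python
def repeated_addition(a, b):
    """Compute a*b by adding b to itself a times.

    Only works when a is a natural number, i.e. when a = 1 + 1 + ... + 1,
    so that distributivity unfolds (1 + ... + 1)*b into b + ... + b.
    """
    total = 0
    for _ in range(a):  # one addition of b per "1" in a
        total += b
    return total

print(repeated_addition(4, 71))  # 284, same as 4 * 71
```

The point of the sketch is exactly the post's caveat: the loop bound `range(a)` only makes sense for natural a, which is why the unfolding doesn't extend to fractions or reals on its own.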
Multiplication is certainly not repeated addition. If we ask what multiplication is, the correct answer is a description of what we do when we multiply. So the author is entirely correct, both technically and pedagogically: the method of multiplying is entirely different from the method of adding. As a matter of fact, we do not ordinarily calculate 4*71 the same way as 71+71+71+71. If we did, the matter would be different. Even though the methods are interchangeable, we simply do not call 71+71+71+71 multiplication! To ask what multiplication is if it is not addition is like asking what addition is if it is not multiplication. What kind of answer do you expect? Multiplication is the activity, or method, of multiplying.
I've always just considered repeated addition a special case for the natural numbers, one that can easily be extended to the rational numbers. It's not a contradiction, just an extension, in the same way that division, as an extension of the cancellation law, isn't a contradiction. This is similar to a question I've wondered about for a while: how would one rigorously define addition and multiplication?
I agree with micromass. When you start teaching children about math, you don't go straight to the reals; you start with specific examples involving integers. When you deal with integers, there is in fact a direct relationship between multiplication and addition. Even with the reals, though, you can break a number into its "integer" and "fractional" parts using distributivity, and handle the multiplication of the "integer" parts and the "fractional" parts separately. By the time students have worked through the rationals, the same kind of idea that was introduced in primary school can be used in high school.
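The integer/fractional split described above can be illustrated with exact fractions; the 3.25 example and all names below are mine, not from the post:

```python
from fractions import Fraction

# Split 3.25 into its "integer" part and "fractional" part,
# then multiply each part separately and recombine by distributivity.
x = Fraction(13, 4)                           # 3.25
integer_part = x.numerator // x.denominator   # 3
fractional_part = x - integer_part            # 1/4

# (3 + 1/4) * 4 = 3*4 + (1/4)*4, by the distributive property
result = integer_part * 4 + fractional_part * 4
print(result)  # 13, same as 3.25 * 4
```

Using `Fraction` rather than floats keeps the arithmetic exact, so the distributive recombination is visibly lossless.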
I sympathize with the author. It is not right to consider multiplication as an application of addition; the method of multiplication is simply not repeated addition, and young students ought not be taught that. These are two quite different operations with interesting relations, which can be presented in various interesting and, in my opinion, more instructive ways. And it is a valid point that this notion of repeated addition becomes strained when we get to rational numbers, and breaks down completely when it comes to real numbers (or at best the analogy survives in a twisted form). The method of multiplication, on the other hand, generalizes to rational and real numbers in a much more natural way. Why should we consider 2.3*4.2 repeated addition, and how does it help us?

There are other reasons why the notion of repeated addition is not appropriate. What happens in an application where you are calculating the area of, say, a rectangle? Say it is 3 m long and 2 m wide. To calculate the area, we have 3 m * 2 m = 6 m^2. What is added here? 2 m^2 + 2 m^2 + 2 m^2? This certainly does not generalize neatly to a rectangle of length 3.3 m and width 2.5 m. How is it intuitive to think of calculating area as repeated addition? The differences are much more important than the similarities. This is the seed of the problem.

What exactly is not rigorous about multiplication and addition?
Multiplication is, at heart, repeated addition. Your 1st grade teacher wasn't lying.

>Why should we consider 2.3*4.2 repeated addition and how does it help us?

If you ask me to pay you $2.30 times 4.2, then the "multiplication as repeated addition" metaphor shows me clearly that you are asking to be repeatedly paid $2.30, more than 4 times but fewer than 5 times. The total amount is 2.3+2.3+2.3+... where the number of iterations is 4.2. I can continue thinking of the multiplication as addition perfectly well, as long as I admit the idea of a fractional iteration.

Another example: construct a rectangle 2.3 inches wide by 4.2 inches long. Then repeatedly add 1x1 squares on top of the rectangle until it is exactly filled up. You are also allowed to insert pieces of squares. How many repeatedly added squares did it take to fill the rectangle? I believe I can solve the problem to arbitrarily high accuracy while performing only additions, subtractions, and comparisons.

The concept of "multiplication as repeated addition" applies in every case, depending on how flexibly you can think. For example, what is i? It is the number that, when repeatedly added i times, yields -1. If this sounds awkward, it's because i is awkward, not because the basic idea of multiplication has somehow changed. There are of course other ways of viewing multiplication; in the complex plane, it can operate like a rotation. But I maintain that the "multiplication as repeated addition" metaphor can remain valid in almost any scenario and, if it helps you think, shouldn't be discarded.

>This is similar to a question I've wondered about for a while. How would one rigorously define addition and multiplication?

You don't. They exist purely as concepts in the human mind. Regarded as an element of natural science, the human brain produces patterns, but cannot produce true or false ideas any more than the stomach can produce true or false acid.
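The "fractional iteration" idea above can be sketched as code. This is just my own illustration of the metaphor (all names are mine), using exact fractions; note honestly that the leftover step still leans on fraction multiplication for the partial iteration, which is where the metaphor has to stretch:

```python
from fractions import Fraction

def add_n_times(x, n):
    """Add x to itself n times, where n may have a fractional part.

    The whole-number iterations are literal additions; the leftover
    "fraction of one more addition" is taken as x scaled by that fraction.
    """
    whole, frac = divmod(n, 1)   # split n into whole and fractional iterations
    total = Fraction(0)
    for _ in range(int(whole)):  # the whole-number iterations: plain addition
        total += x
    total += x * frac            # the fractional iteration
    return total

print(add_n_times(Fraction(23, 10), Fraction(42, 10)))  # 483/50, i.e. 9.66
```

So 2.3 "added 4.2 times" comes out to 9.66, matching 2.3 * 4.2, with four full additions plus 0.2 of a fifth.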
While focusing on an analogy (possibly torturing it in the process) may help you think about something initially, it eventually becomes an obstruction, preventing you from developing an understanding of that thing in its own right. I'm pretty sure you're past the point where it becomes an obstruction when you are trying to think of adding something i times.
In the spirit of some of the previous posts, I wonder if it would be helpful to think of multiplication as the natural generalization of repeated addition to non-integer numbers, in much the same way that the gamma function interpolates the factorial function to non-natural numbers. In that sense, it is an artifact of human intuition that the manifestation of multiplication as repeated addition in the natural numbers carries such weight with us. If we grew up learning math from the perspective of the real numbers, the fact that multiplication and repeated addition happen to coincide for integer values might be seen simply as a natural by-product of the 'cleanliness' of the integers.

I suppose in this sense the process seems almost empirical: we are constantly exploring more and more exotic sets of numbers, and thus constantly having to refine our definition of multiplication to account for the new data. So what is multiplication? It seems to me to be one of those base facts that, if not considered self-evident, cannot be adequately explained. I think the author of the MAA article has it essentially correct: our notions of scaling and counting, central to meaningful sensory experience, seem to be at the core of our intuition for these two concepts.
As a personal story, I recall that I didn't know I wanted to major in math until sometime during my first calculus course. When I started my second calculus course, I was really upset that neither the professor nor the book bothered to prove things as rigorously as possible. For example, we still relied on "a function is continuous at a if you can draw it without picking up your pencil near a." Clearly, this isn't a very rigorous definition; in fact, speaking as a more mature math student (that is, more mature in math than your average Calc II student), it is a really silly definition. Stuff like this really irritated me, but now I see that these sorts of "definitions" are actually pretty good for teaching a Calc 1, 2, or 3 student.

First of all, without several weeks or months of an analysis course, it is really impossible to define continuity rigorously. The only alternative would be to start first-year college students in an analysis course, and I don't think that would work for most people. So students get crappy, though very intuitive, definitions of mathematical ideas. This way, we are not overloaded with tons of definitions; we get to develop an intuition for the material, and THEN we get to learn it more rigorously.

The same goes for multiplication. Start kids out by telling them that multiplication is repeated addition (though I, too, was taught that division was repeated subtraction, and I thought that was insanely stupid). Don't mention rationals or reals until they have a good intuitive understanding of the integers. And if you look at the way most areas of mathematics developed, you see that they usually started out describing real objects and then got more abstract. If I understand correctly, the foundations of calculus weren't laid until well after other areas of calculus had been developed.
Robert, the issue here is not rigor. There isn't anything non-rigorous about multiplication, so long as no one insists that it's "really just" addition at work all along, so I don't at all see the analogy with the pencil definition in calculus.
My point is that there is no need to teach kids anything other than that multiplication is repeated addition. Let kids work out the intuition of multiplication, then break the news to them years later. If not, I see two alternatives:

1) Give the kids an axiomatic definition of multiplication, then explain how it works on the integers, and go from there. I think this is a terrible idea; 6-year-olds will not grasp that.

2) Don't tell the kids what multiplication is, just teach them how to do it. Make them memorize multiplication tables, and then teach them how to do "long multiplication". Assuming this would actually work in the first place, it is another terrible idea, because it teaches the kids absolutely nothing and instead forces them to memorize how to multiply rather than getting an intuitive grasp of multiplication.

Either way, one can argue that there are downsides; I just think there are fewer downsides to teaching kids that multiplication is repeated addition.
Repeated addition is merely a way to multiply integers, and this is what children are taught: that we can calculate 3 x 2 by adding 3 + 3. They don't need to be told that multiplication always is repeated addition and that "it's just done differently, like so-and-so."

"This:

      15
    x 23
    ----
      45
     30
    ----
     345

is really just addition." What is the point of this? What is the supposed intuition it provides? The addition part of multiplication is reserved to the addition of single-digit integers, and multiplication in general should be thought of as separate from the general notion of addition. The important part is drawing the connections between multiplication and addition, like distributivity, or the fact that you can count stones arranged in a rectangle by multiplying the numbers of stones along the sides. How is it better to say "this is really just the same thing"? What something is is not equivalent to how it can formally be defined, and formal definitions are the last thing children should be taught. When they calculate the area of rectangles or multiply rational numbers in decimal form, remembering the repeated-addition feature of multiplication is nothing more than a hindrance to good intuition. What is added in 2 m * 3 m? What is added in 0.23 * 0.35?
I agree that multiplication is not repeated addition. But I think it's best to teach first-graders that it is repeated addition, because it will be much easier to grasp and to work with. I challenge everybody here to go to 6-year-olds and explain multiplication without referring to repeated addition; in my opinion it's not possible... Also, if you look back historically, multiplication really did begin as shorthand for repeated addition. It was only much later that people multiplied by other kinds of numbers. So there's no lying in telling children that multiplication of integers is repeated addition. Of course, once you end up doing fractions, the entire repeated-addition story collapses, and we should tell the children that. But not before they're used to multiplying...
The route must obviously go through addition in basic multiplication, but that is not the point. This is only part of learning how to multiply, not how to multiply in general. After learning basic multiplication by heart, one can start learning multiplication in general, and then draw the connections. This is in my opinion intuitively better, because all along the line, even up to university mathematics, multiplication and addition are the two main algebraic operations, with important relations to each other. Why confuse the two with each other for years?
I think some of you guys are confusing "non-intuitive" or "awkward" with "logically absurd". There is nothing logically absurd about adding something together i times; gongyae is right, it's merely non-intuitive. The same goes for 0.23 x 0.35: the equivalent addition is not absurd, just awkward.

The point is theoretically important because if, in any branch of mathematics (say, arithmetic), it transpires that multiplication can always be reduced to an operation of addition, that means multiplication is logically redundant (vide Occam's Razor). In everyday life, of course, some such operation as this:

      36
    x 12
    ----
      72
     360
    ----
     432

is far more convenient, practically speaking, than adding 36 twelve times. But the calculation is merely a conjuring trick: the application of certain mental short-cuts which are known to give a correct result. It does not suffice to prove that there really is some mystical entity called "multiplication", of the same logical status as "addition", though different in kind.

How do you know that six twelves are seventy-two, without calculating it? Because you learned your multiplication tables in primary school. Could you think of a way to PROVE to a child that 6 x 12 = 72 without resorting to addition? I suggest that addition is the more logically fundamental operation (at least in arithmetic) because a) a multiplication operation is always theoretically reducible to an addition operation; b) an addition operation is never reducible to a multiplication operation, except by employing formal rules which are themselves ultimately reducible to rules of addition.
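Claim (a) above, that a multiplication is always theoretically reducible to additions, can be cashed out as a sketch. This is my own illustration, not anything from the thread: it rebuilds long multiplication for natural numbers using only addition loops, including the place-value shifts.

```python
def long_multiply(a, b):
    """Multiply natural numbers a and b using only addition."""
    total = 0
    shift = 1
    for digit in reversed(str(b)):   # walk b's digits right to left
        # single-digit "times" is literal repeated addition of a
        partial = 0
        for _ in range(int(digit)):
            partial += a
        # multiplying by the place value is also repeated addition
        shifted = 0
        for _ in range(shift):
            shifted += partial
        total += shifted
        # grow the place value by a factor of ten, again via addition
        next_shift = 0
        for _ in range(10):
            next_shift += shift
        shift = next_shift
    return total

print(long_multiply(36, 12))  # 432
```

The code is hopelessly inefficient, which is rather the poster's point: the usual paper layout is a shortcut over exactly these additions, not a logically distinct operation.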
"Of course, once you end up doing fractions, then the entire repeated addition story collapses..."

I really don't think so, micromass. Try it on a piece of notepaper. If you try multiplying, say, one and three-quarters by five-eighths, you will find that every step can be reduced to an addition (although, towards the end, you would need to use subtraction to reduce the fraction to its lowest terms).
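As a sketch of that claim (my own illustration, with made-up names): multiplying 1 3/4 = 7/4 by 5/8 reduces to two integer products, numerator times numerator and denominator times denominator, and each of those unwinds into repeated addition.

```python
from fractions import Fraction

def add_times(a, b):
    """Integer a*b via repeated addition only."""
    total = 0
    for _ in range(b):
        total += a
    return total

# 1 3/4 = 7/4, and (7/4)*(5/8) = (7*5)/(4*8),
# with each integer product done purely by addition
num = add_times(7, 5)       # 35
den = add_times(4, 8)       # 32
print(Fraction(num, den))   # 35/32
```

`Fraction` handles the lowest-terms reduction here; done by hand, that is the subtraction step the post mentions.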
What are you talking about? "Mystical entity called multiplication"? No one is talking about any mystical entities here. Not the "same logical status" as addition? Define logical status.