Are there really 4 fundamental math operations?

It's strange to me that multiplication and division are considered fundamental operations.
It makes sense to me that addition is a fundamental operation, but multiplication seems like just a function or algorithm that takes several numbers and applies additions. This is true even for multiplication with real numbers.
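The intuition in the question can be sketched for integers (a minimal Python sketch; `multiply` is a name chosen for illustration, and it only covers integer multipliers, which is exactly where the objection about real numbers bites):

```python
# Multiplication as an algorithm that "applies additions" (integers only).
def multiply(n, m):
    # n * m computed as m added to itself |n| times, with the sign of n.
    total, sign = 0, (1 if n >= 0 else -1)
    for _ in range(abs(n)):
        total += sign * m
    return total

assert multiply(4, 6) == 24
assert multiply(-3, 5) == -15
assert multiply(0, 7) == 0
```

Note that nothing like this loop is available when `n` is, say, ##\sqrt{2}##, which is part of what the thread goes on to discuss.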


fresh_42
Mentor
It's strange to me that multiplication and division are considered fundamental operations.
It makes sense to me that addition is a fundamental operation, but multiplication seems like just a function or algorithm that takes several numbers and applies additions. This is true even for multiplication with real numbers.
First of all: addition and subtraction are the same operation, just as multiplication and division are the same operation; the latter is only the inverse of the former. Next, there are objects which we can multiply but not add, so multiplication isn't necessarily shorthand for repeated addition.

Both are algorithms in that sense, because each gives a rule to make a new element out of two others.

First of all: addition and subtraction are the same operation, just as multiplication and division are the same operation; the latter is only the inverse of the former. Next, there are objects which we can multiply but not add, so multiplication isn't necessarily shorthand for repeated addition.

Both are algorithms in that sense, because each gives a rule to make a new element out of two others.

Can you give me examples of those objects that we can only multiply but not add?

fresh_42
Mentor

Can you give me examples of those objects that we can only multiply but not add?
This depends a bit on what I may use.

Regular (i.e. invertible) matrices can be multiplied. If you add them, they might not be regular anymore. The simplest example is stretching or compressing a line. You can perform two stretches and get a new stretching factor, which is a multiplication. If you add them, you might get only ##\{0\}##, which isn't a line anymore (stretching by the same amount in opposite directions).
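The stretching example can be sketched in Python (a minimal illustration; `stretch` is a name chosen here for the map ##x \mapsto ax##):

```python
# A "stretch" of the line R is the map x -> a*x for a nonzero factor a.
def stretch(a):
    return lambda x: a * x

f, g = stretch(2), stretch(-2)

# Composing two stretches is again a stretch: the factors multiply.
composed = lambda x: f(g(x))        # x -> -4*x, still a stretch
assert composed(3) == -12

# Adding them pointwise collapses the line to {0}: no longer a stretch.
summed = lambda x: f(x) + g(x)      # x -> 2x + (-2)x = 0
assert all(summed(x) == 0 for x in range(-5, 6))
```

So the set of stretches is closed under composition (multiplication) but not under pointwise addition.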

In general the successive application of two functions is written as a multiplication, e.g. permutations on an ordered set or rotations in geometry.

It's a bit problematic to give examples, because if you have only one operation, then it doesn't matter how you write it: ##a+b\, , \,a \cdot b\, , \, a \circ b\, , \, a \diamondsuit b## and so on. So the examples of multiplication only must either leave the set on which the operation is defined when added, as in my first example, or simply be operations usually written as a multiplication, as in the example with the permutations.

From a mathematical point of view, addition and multiplication are just binary operations with some properties. On sets like numbers it just happens that you can define both. There are also other "fundamental" operations like ##\cap## and ##\cup## on subsets, or multiplications that aren't commutative or have other strange properties like ##a \cdot a = 0## or ##a\cdot a = a \neq 1##. All of them make some sense somewhere, physics and biology come to my mind.

epenguin
Homework Helper
Gold Member
But if the question is about ordinary numbers, ordinary/everyday/school arithmetic, why would mountains (and I) be wrong?

mathman
Other than for integers, multiplication is more than repeated addition.

fresh_42
Mentor
But if the question is about ordinary numbers, ordinary/everyday/school arithmetic, why would mountains (and I) be wrong?
Nobody said anybody was wrong. In the end it comes down to what can be considered "fundamental". In a world without zero, multiplication might be the fundamental operation. I understand that addition is closely related to counting, and therefore appears more natural than the measurement of the size of a field, which multiplication would be needed for. However, as long as we don't define "fundamental", e.g. historically, it will be a matter of taste.

epenguin
Homework Helper
Gold Member
Other than for integers, multiplication is more than repeated addition.
That is not some minor exception! It is in any case the one I brought up.
What is the simplest extension or generalisation of the integer concepts for which what you say is true?
Nobody said anybody was wrong. In the end it comes down to what can be considered "fundamental". In a world without zero, multiplication might be the fundamental operation. I understand that addition is closely related to counting, and therefore appears more natural than the measurement of the size of a field, which multiplication would be needed for. However, as long as we don't define "fundamental", e.g. historically, it will be a matter of taste.
If so, it seems to me some people have strange tastes.

I thought mathematics was supposed to be assuming the minimum and deriving the maximum. Why therefore assume laws of multiplication if you can prove them from those of addition?

fresh_42
Mentor
That is not some minor exception! It is in any case the one I brought up.
What is the simplest extension or generalisation of the integer concepts for which what you say is true?
##\mathbb{Z}[t]## for a transcendental ##t## and ##\mathbb{Z}[a]## for an algebraic ##a##
If so, it seems to me some people have strange tastes.
That's true without any doubt.
I thought mathematics was supposed to be assuming the minimum and deriving the maximum. Why therefore assume laws of multiplication if you can prove them from those of addition?
Because there are rings, which carry both structures at the same time. And the multiplicative structure in algebras is usually far from being the one in rings and fields of numbers. The notation ##(A \cdot B)(v)=A(B(v))## is certainly a convention, one induced from the fact that ##(A+B)(v)=A(v)+B(v)## is also possible, although not always defined. But this convention is easier to grasp than if we wrote ##A\diamondsuit B## (and easier to write).

But I admit, if you consider
$$\textrm{ counting } \rightarrow (\mathbb{N},+) \rightarrow (\mathbb{N_0},+) \rightarrow (\mathbb{Z},+) \rightarrow (\mathbb{Z},+,\circ) \rightarrow \ldots$$
as a canonical way of development, then addition is "more fundamental" than multiplication, just because it comes first in this construction and very likely also in anthropology.

What are your thoughts on this:

Addition is the only fundamental mathematical operation.
That's not to say that the others are derived from it; in fact, they are not. However, addition is the only one that can always be empirically and experimentally verified absolutely in all cases.

The invention of multiplication also introduces caveats and exceptions and special cases for commutation or irrationality etc.

[...] addition is the only one that can always be empirically and experimentally verified absolutely in all cases. [...]
Empirical and experimental evidence indicates that the addition of velocities V1 and V2 must always fall short of c...

To the original question, are there really only 4 fundamental operations, there have been many responses. Some would say "4? Of course not, at most 2. The inverse operations are just the normal operations on the inverse elements." There were other comments saying that counting addition is the most fundamental, and others on how velocities add. To all this I might agree, except that the way hyperbolic tangents add (which is how relativists add velocities) and the way counters add (which is how we ordinary humans add) are isomorphic to each other, so there really shouldn't be a philosophical dispute about their difference. They are the same type of addition.

Addition is the only fundamental mathematical operation.
That's not to say that the others are derived from it; in fact, they are not. However, addition is the only one that can always be empirically and experimentally verified absolutely in all cases.
With this I wholeheartedly disagree, on many levels.
Firstly, is the statement that addition is the only fundamental operation a fair one? Doesn't that sound perverse? What do we mean by "most fundamental"? To this question I like this answer.
##\mathbb{Z}[t]## for a transcendental ##t## and ##\mathbb{Z}[a]## for an algebraic ##a##
addition is "more fundamental" than multiplication, just because it comes first in this construction and very likely also in anthropology.
I like this answer a whole lot.

Secondly, addition needs no empirical or experimental verification. Consider apples on a table. Remove an apple, or put down an extra apple. Our intuition, our mode of experience, naturally "sees" numbers and their addition in this way. Making this intuition rigorous is what the integers are, and how their addition is defined. So are they empirical? Experimental? No; I would say they are more a product of the way humans experience the natural world than of the phenomenology of the natural world itself.

But most importantly, thirdly, I disagree when you say the other operations are not derived from addition. This one is the important one. They can be. And this, I feel, is actually deep.

Okay, so consider an abelian group. "Addition" refers to the product in this group. Why must the group be abelian? Good question. That too is deep; it leads to the question of where groups come from, but I'll ignore it. Anyway, consider this structure because it encapsulates the bare minimum for our notion of addition. Can we invent multiplication? Yup. What is multiplication? This is one deep point. It is a homomorphism from the group to itself, an endomorphism. In the intuition of multiplication, this means multipliers are maps from numbers to numbers that respect the distributive law.

But there might be more multipliers than there are numbers, because there may be more of these maps than elements in the set. Because we don't want that, we pose a further restriction on our abelian group: it must be generated by a single element (that is, it must be 1-dimensional). Then the number of endomorphisms and the number of elements in the set are the same. Then we can assign each map to a number, and when we multiply two numbers, we really look at one of the numbers as a map and the other number as the input. The output is the product.

Or, if this isn't rigorous enough, you can look at the collection of endomorphisms itself. Under the addition operator it is isomorphic to the original group. However, there is a further operation among functions that is not present among numbers: composition (this too is deep; any operation we are considering can be viewed as a composition). The composition of these functions defines multiplication.
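The endomorphism picture can be sketched for ##(\mathbb{Z},+)## (a minimal Python sketch; `times` is a name chosen here for the endomorphism "multiply by n", built from repeated addition only):

```python
# "Multiplication by n" as an endomorphism of (Z, +), built from repeated addition.
def times(n):
    def f(x):
        total, sign = 0, (1 if n >= 0 else -1)
        for _ in range(abs(n)):
            total += sign * x        # repeated addition (or subtraction)
        return total
    return f

# Composition of the endomorphisms times(m) and times(n) is times(m*n):
m, n, x = 3, 4, 7
assert times(m)(times(n)(x)) == times(m * n)(x) == 84

# Each endomorphism respects addition (the distributive law): n*(x+y) = n*x + n*y
assert times(n)(2 + 5) == times(n)(2) + times(n)(5)
```

The point of the sketch: the "product" of two numbers is recovered by treating one of them as a map and composing.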

epenguin
Homework Helper
Gold Member
Sorry for coming back late. I think, though, we have lost the point of the original question. Why are multiplication and division, let's just say of integers for the moment, defined to be fundamental operations, independent of, rather than derivable from, the laws or definition of ordinary addition of ordinary integers?

Is not the ordinary understanding of the meaning of multiplication n×m 'take n sets, each of m objects, and add them all up'? So it seems to me that all you need to understand at first is addition, or maybe not even understand it but have axioms for it, know how to do it. That given addition, you can define multiplication in terms of it, and you can prove what I have seen given as axioms for multiplication. The commutative law, for example.

With my definition above I am not like Wells' anti-hero Mr Polly, who could never remember whether it was six sevens or seven eights that made 56, and had no way to find out. Whereas I can find out using the above definition: I can put down a row of seven counters, do this eight times (I can make a nice rectangle of counters to be sure, and so check I have the right number, 8, of rows), and then count up the total number I have, as long as I can count that high. I can do it with Hindu-Arabic numbers by moving counters so as to form rows of 10. I can use the principle to write a computer program for multiplication, and I suppose somebody has already done this and I am using it. If my counters are square they can fit together and form rectangles, and I can calculate areas.

Is this not what everybody understands by multiplication?
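The counter-rectangle procedure can be written down directly (a minimal Python sketch of multiplication as repeated addition, with commutativity checked by re-counting the rectangle the other way):

```python
# Multiplication as "take n sets, each of m objects, and add them all up".
def mul(n, m):
    total = 0
    for _ in range(n):    # n rows of counters...
        total += m        # ...each row holding m counters
    return total

assert mul(7, 8) == 56    # Mr Polly's seven eights
# Counting the rectangle by rows or by columns gives the same total:
assert all(mul(a, b) == mul(b, a) for a in range(12) for b in range(12))
```

This only works for non-negative integer counts, which is where the thread's objection about non-integers comes in.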

(As for addition, this seems to me to be related to the concept of permanence. I can combine two heaps of potatoes or coins and consider that I still have the same number of these objects as before; then I can separate them in different ways into different heaps and consider the total constant. I can recombine the heaps and recover the original number I had. This concept is not innate, I believe, and is only grasped at about age 4-5. This idea of permanence is fundamental to economic exchanges and money. It also comes up in chemistry, where we consider atoms to be for practical purposes permanent. I'm always explaining to students that the stoichiometry that causes them such problems is only arithmetic that they have forgotten, though then division, ratios, and proportions come into it.)

Is there anything wrong with this, anything I am overlooking? Wouldn't civilisation collapse without this understanding?

Khashishi
It's arbitrary what we call a fundamental operation.

epenguin
Homework Helper
Gold Member
It's arbitrary what we call a fundamental operation.
Being able or unable to derive its rules from those of a smaller system is not an arbitrary question.

Khashishi
Is the "min" operation fundamental?
min(4, 6) = 4
Is the "modulo" operation fundamental?
7 mod 3 = 1
Is the number of combinations?
5 nCr 3 = 10
Why or why not? I assert that we call addition, subtraction, multiplication, division fundamental only because they are simple to learn and have been historically of high importance.
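For what it's worth, the reduction of two of these operations to the usual four can be sketched (a minimal Python illustration; `mod` here assumes non-negative `a` and positive `n`, and both helpers are names chosen for illustration):

```python
# "mod" via repeated subtraction; "nCr" via multiplication and division.
def mod(a, n):
    while a >= n:
        a -= n            # repeated subtraction
    return a

def nCr(n, r):
    num = den = 1
    for k in range(r):
        num *= n - k      # falling factorial n * (n-1) * ... * (n-r+1)
        den *= k + 1      # r!
    return num // den

assert mod(7, 3) == 1
assert nCr(5, 3) == 10
```

So these particular operations do reduce to the four school ones, which bears on whether they deserve the label "fundamental".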

Aufbauwerk 2045
I like the digital computer viewpoint. Express numbers in binary. Build a full adder from a few NAND gates. Subtract by adding negative numbers when you use 2's complement representation. Etc.
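That viewpoint can be sketched end-to-end (a minimal Python sketch: every gate built from NAND, addition by ripple carry, subtraction by two's complement; the 8-bit width and the function names are choices made for illustration):

```python
def NAND(a, b):
    return 1 - (a & b)

# Build the other gates from NAND alone.
def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

# One-bit full adder: returns (sum bit, carry-out).
def full_adder(a, b, cin):
    s = XOR(XOR(a, b), cin)
    cout = OR(AND(a, b), AND(cin, XOR(a, b)))
    return s, cout

# 8-bit ripple-carry addition (wraps modulo 256).
def add8(x, y, cin=0):
    result, c = 0, cin
    for i in range(8):
        s, c = full_adder((x >> i) & 1, (y >> i) & 1, c)
        result |= s << i
    return result

# Subtraction = add the two's complement: invert the bits and add 1 (via carry-in).
def sub8(x, y):
    return add8(x, (~y) & 0xFF, cin=1)

assert add8(100, 27) == 127
assert sub8(100, 27) == 73
```

From this angle, subtraction really is just addition in disguise, which is the post's point.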

epenguin
Homework Helper
Gold Member
Is the "min" operation fundamental?
min(4, 6) = 4
Is the "modulo" operation fundamental?
7 mod 3 = 1
Is the number of combinations?
5 nCr 3 = 10
Why or why not? I assert that we call addition, subtraction, multiplication, division fundamental only because they are simple to learn and have been historically of high importance.
I see now what you mean by arbitrariness of calling an operation fundamental.
Not sure about the first one, but the second seems to require ordinary addition and subtraction, while the third requires ordinary multiplication. If so, then these operations are more fundamental than yours.
But if not, the OP's question still remains: instead of calling the operations "fundamental", call them the "historically of high importance" operations; the question of whether they are independent or can be derived from others remains.

pwsnafu
Being able or unable to derive its rules from those of a smaller system is not an arbitrary question.
In Peano, addition is defined through the successor operator. So, addition is not fundamental either.
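The Peano construction can be sketched like this (a minimal Python sketch; the successor here is just a stand-in for the abstract operation, and the recursion mirrors the usual defining equations):

```python
# Peano-style definitions: only "successor" is primitive.
def succ(n):
    return n + 1          # stand-in for the abstract successor S(n)

def add(m, n):
    # m + 0 = m ;  m + S(n) = S(m + n)
    return m if n == 0 else succ(add(m, n - 1))

def mul(m, n):
    # m * 0 = 0 ;  m * S(n) = m * n + m
    return 0 if n == 0 else add(mul(m, n - 1), m)

assert add(3, 4) == 7
assert mul(3, 4) == 12
```

So in this setting addition is itself derived, and multiplication is derived from addition, one rung further up.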

epenguin
Homework Helper
Gold Member
In Peano, addition is defined through the successor operator. So, addition is not fundamental either.
Yes but a simple question was asked in #1 so how have we got to #20 and people insist on talking about something else? It begins to sound like embarrassment or evasion.

Can we derive multiplication from addition? was the question.

Whether we can derive addition from something else is a different question, on the face of it. Interesting, informing us what numbers really 'are', it would justify an article or a tutorial here, but on the face of it the question I asked would follow after, once the rules of the arithmetic are established. I am accepting for the moment that they are.

Biker
I would say that multiplication can be considered repeated addition, just like we can say an integral is just an anti-derivative. It's true that an integral is an anti-derivative, but it has properties that reach far beyond that. So it's okay to think of an integral as an anti-derivative, but not to reduce it to that. Likewise, multiplication can be thought of as repeated addition, but that isn't all it is.

Let's take the integers as our set and look at how addition, and multiplication are different. Under group axioms, we need our set, when two elements are operated on, to be: closed, associative, have an identity, and have an inverse. So, looking at addition, we see that: It's closed, it's associative, it has an identity, which is 0, and it has an inverse, which would just be -a, for some element a.

Now let's look at multiplication as repeated addition. Well, since addition was closed, repeated addition is also closed. It's associative, since addition was. The identity doesn't change; it's still 0. And the inverse also doesn't change; it's still going to be some -a.

Now let's look at multiplication as not repeated addition. Well, it's closed under the integers, it is associative, and it has an identity, which would be 1. Problem 1, in my eyes. The next one is that multiplication has no inverses in the integers! There is nothing I can multiply an integer by (other than 1 and -1) to get back my identity.

And that's where we can see that multiplication isn't actually repeated addition: not only do I have different identity elements, but one of the operations gives me a group structure while the other isn't a group at all! For those reasons, I would say multiplication is *not* repeated addition; repeated addition is merely a good way to think about it. But then again, it also depends on what you're considering as axioms.
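The failed group axiom can be checked by brute force on a finite window of integers (a minimal Python sketch; the window size is an arbitrary choice for illustration):

```python
# Brute-force check of the inverse axiom on a window of integers.
window = range(-20, 21)

# (Z, +): every a has an inverse -a with a + (-a) == 0.
assert all(any(a + b == 0 for b in window) for a in window)

# (Z, *): only 1 and -1 have an integer b with a * b == 1.
units = sorted(a for a in window if any(a * b == 1 for b in window))
assert units == [-1, 1]
```

So (##\mathbb{Z}##, +) is a group while (##\mathbb{Z}##, ·) is only a monoid, which is the structural difference the post describes.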

Maths is only a theory anyway but it does work very well, except for that complicated zero thing.

It's strange to me that multiplication and division are considered fundamental operations.
It makes sense to me that addition is a fundamental operation, but multiplication seems like just a function or algorithm that takes several numbers and applies additions. This is true even for multiplication with real numbers.
Modular forms are also fundamental operations.

Baluncore
2019 Award
I think it is good that elementary school students are taught fairy tales.
To be told that there are only four fundamental arithmetic operations makes a child's introduction to mathematics tolerable. The majority will feel safe and never need to question the truth. They will be able to work out the length of a long piece of string, the area of their garden, or the cost of 5.25 gallons of fuel at so much per gallon, and work out the change from the bank note they hand over. They might even file a tax return.

Those who go on to further mathematics will quickly forget the fairy tale as they are drawn into the fascinating complexity of reasoning and reality.

Is zero considered to be a member of the set of natural numbers by advanced students of math?
Intuitively it strikes me that the absence of a thing is not included in the idea of 'a quantity of things'.
For example, you cannot accuse somebody of stealing a zero amount of cash.
