# A question on vector summing

1. ### mech-eng

196
Hi, we cannot add a column vector to a row vector, even when both have the same number of elements. But say A = [1 2 3], where the first element represents i, the second j, and the third k; and say B = [1; 3; 5], a column vector whose first element likewise represents i, the second j, and the third k. According to the mathematical rules we cannot add them, yet we add such vectors all the time in physics. So I cannot understand where I am wrong?

2. ### Simon Bridge

15,471
We do not sum column with row vectors in physics.
Cartesian i-j-k vectors always have the same representation (either row or column, not both) - you have to pick a representation when you set up the geometry just like you have to pick a direction to call positive.

Can you show me a reference where an authoritative text or a paper has summed row and column vectors?

3. ### mech-eng

196
I am asking for the reason why we cannot add them. Say we want to add A = 1i + 2j + 3k
to B = 2i + 3j + 5k. We can add them easily, component by component. But if we write one of them as a column and the other as a row, why can we not add them? What is the reason? We just have to decide whether the expression on the right of the equals sign is a row or a column.

4. ### verty

1,950
Because you can't sum matrices that have different shapes. It's not a defined operation.
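A quick NumPy sketch (example values made up) shows what happens to the mismatched shapes in practice. NumPy happens to define an elementwise broadcast for a (1, 3) and a (3, 1) array, which actually makes the point: the result is a 3×3 matrix, not the vector sum the OP expects.

```python
import numpy as np

row = np.array([[1, 2, 3]])      # shape (1, 3): a row vector
col = np.array([[1], [3], [5]])  # shape (3, 1): a column vector

# Adding two vectors of the SAME shape is the ordinary vector sum:
print(row + np.array([[2, 3, 5]]))   # [[3 5 8]]

# Adding a row to a column is not a vector sum. NumPy broadcasts
# the mismatched shapes into a 3x3 matrix instead:
print(row + col)
# [[2 3 4]
#  [4 5 6]
#  [6 7 8]]
```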

5. ### Matterwave

3,865
Because there is no utility in defining such a sum. It's much simpler to always represent a displacement vector as a row vector or a column vector, not both. You can define whatever you want in math, but unless it's useful, don't expect other people to follow you.

6. ### HallsofIvy

41,260
Staff Emeritus
You cannot add column and row vectors because they lie in different vector spaces!

Given any vector space, V, its dual space, V*, is the set of all linear functions from V to the real numbers. It can be shown that, for V finite-dimensional, the dual space is isomorphic to the original space via the isomorphism that maps each basis vector, $v_i$, to the linear function, $v_i^*$, that maps $v_i$ to 1 and all other basis vectors to 0. One way of representing that is to write the vector $v= \sum a_iv_i$ as the "column vector" having $a_i$ as its ith component. We can then represent the function that maps the ith basis vector $v_i$ to 1 and all others to 0 as the row vector having "1" in its ith position. In that case, $f(v)$ is the matrix product of the row matrix representing f and the column matrix representing v.
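As a concrete sketch of this pairing (NumPy representation, with components chosen arbitrarily for illustration): the row vector acts on the column vector by matrix multiplication, producing a number rather than another vector.

```python
import numpy as np

# A vector v in V, represented as a column:
v = np.array([[1], [2], [3]])

# A linear functional f in V*, represented as a row:
f = np.array([[4, 5, 6]])

# f(v) is the 1x1 matrix product, i.e. a scalar: 4*1 + 5*2 + 6*3 = 32
print(f @ v)  # [[32]]
```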

7. ### gopher_p

575
I don't see any reason why we couldn't define an "addition-like" operation between row and column vectors in the way which you seem to have in mind. The issues, which other posters have mentioned (or at least alluded to), are (1) whether this operation is useful and (2) whether this operation can properly be called a sum.

Considering that the matrix product is defined on pairs of matrices of different shapes, that this product is useful (though maybe not immediately so), and that we call it a product despite the fact that its inputs can lie in different algebraic structures, I don't see any reason why your operation couldn't be called a sum, or why it's necessarily not useful.

8. ### DrewD

515
Apples and oranges is the short answer. HallsofIvy has the long answer. The in-between answer is what Matterwave says, but I would add that if you have vectors

##A=\left[\begin{array}{c}a_1\\a_2\\a_3\end{array}\right]##

and

##B=\left[b_1\ b_2\ b_3\right]##

you can always compute ##A^T+B## to get a row vector or ##A+B^T## to get a column vector. Why you would want to do that beats me, but if it ever came up, it is easy enough to do. What would ##A+B## equal? A row vector? A column vector? Neither makes sense. Sure, we could define ##A+B\equiv A^T+B##, but why?

Note also that one often defines ##i=[1\ 0\ 0]##, ##j=[0\ 1\ 0]##, and ##k=[0\ 0\ 1]## or the column versions.
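DrewD's transpose trick above can be sketched in NumPy (the particular values of A and B are made up for illustration):

```python
import numpy as np

A = np.array([[1], [2], [3]])   # column vector, shape (3, 1)
B = np.array([[4, 5, 6]])       # row vector, shape (1, 3)

# Transpose one side first, and ordinary addition works:
print(A.T + B)   # row vector:    [[5 7 9]]
print(A + B.T)   # column vector: [[5] [7] [9]]
```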

9. ### Simon Bridge

15,471
And there you have it. Kinda.
The answers basically boil down to "because it is against the rules".
You want to know "how come it's against the rules?"

To understand that we need to back up a bit:

If you want to add two things together, they have to be the same kind of thing.
It does not make sense to add 2 apples to 3 pears, but it works if you change what you are adding to "fruit".
2 apples = 2 fruit, 3 pears = 3 fruit, and it makes sense to say that 2 fruit + 3 fruit = 5 fruit.

In other words - the concept of addition only makes sense if we are adding the same sort of thing.

Most of the time this sort of conversion is second nature: we do it without thinking about it.
It is part of the everyday process of addition - but it is not part of the mathematical process.
In maths, the conversion is not taken for granted, precisely because there are so many different ways of doing it.

If I wanted to add apples to carrots, the "fruit conversion" I used before won't work... so I cannot make "convert to fruit" part of the definition of "addition".

Now back to what you were doing:
Your approach for adding a column vector to a row vector is, basically, to convert them to i-j-k vectors first, then add them. This is the same as calling the apples and pears "fruit" in the above example - you have converted the things being added into things that can be added.

Careful though:
the matrix form of a vector does not have to be based on the i-j-k vectors.
So if A is based on i-j-k and B is based on some other three vectors, your addition approach won't work as easily as you showed.

10. ### docarlson

2
Make a rule that allows such an operation and explore its consequences.

For instance, if the row vector (rv) is 1×3 and the column vector (cv) is 2×1, build two 2×3 matrices: repeat rv along the row dimension of cv, repeat cv along the column dimension of rv, and then add them elementwise.

Who knows, next theory of quantum gravity! Or not.
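Incidentally, docarlson's rule is exactly the broadcasting convention NumPy already implements (the particular numbers here are made up):

```python
import numpy as np

rv = np.array([[1, 2, 3]])   # 1x3 row vector
cv = np.array([[10], [20]])  # 2x1 column vector

# Repeat rv down to 2 rows and cv across to 3 columns, then add:
result = np.repeat(rv, 2, axis=0) + np.repeat(cv, 3, axis=1)
print(result)
# [[11 12 13]
#  [21 22 23]]

# NumPy's broadcasting applies the same rule automatically:
assert (rv + cv == result).all()
```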