Why Do Algebraic Rules Not Work the Same with Matrices?

Mdhiggenz

Homework Statement


Explain why each of the following algebraic rules will not work in general when the real numbers a and b are replaced by n×n matrices A and B:
##(a+b)^2 = a^2 + 2ab + b^2##

This question is a bit confusing to me, and I have no idea how to even start this problem.


Homework Equations





The Attempt at a Solution

 
Mdhiggenz said:

Homework Statement


Explain why each of the following algebraic rules will not work in general when the real numbers a and b are replaced by n×n matrices A and B:
##(a+b)^2 = a^2 + 2ab + b^2##

This question is a bit confusing to me, and I have no idea how to even start this problem.


Homework Equations





The Attempt at a Solution


Try a couple of small examples: let A and B be some specific 2x2 matrices. Compute both sides and see if you get equality or not.
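Ray's experiment can be sketched in a few lines of plain Python. The specific matrices below are arbitrary illustrative choices, not ones given in the thread:

```python
# Pick two specific 2x2 matrices and compare (A+B)^2 with A^2 + 2AB + B^2.
# Plain Python, no libraries needed.
def matmul(X, Y):
    # Row-by-column product of two 2x2 matrices.
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matadd(X, Y):
    # Entrywise sum of two 2x2 matrices.
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def scale(c, X):
    # Scalar multiple of a 2x2 matrix.
    return [[c * X[i][j] for j in range(2)] for i in range(2)]

# Arbitrary matrices with no special structure (hypothetical choices).
A = [[1, 2], [3, 4]]
B = [[0, 1], [2, 3]]

S = matadd(A, B)
lhs = matmul(S, S)                                    # (A+B)^2
rhs = matadd(matadd(matmul(A, A), scale(2, matmul(A, B))),
             matmul(B, B))                            # A^2 + 2AB + B^2

print(lhs)          # [[16, 24], [40, 64]]
print(rhs)          # [[17, 27], [37, 63]]
print(lhs == rhs)   # False
```

The two sides disagree for this choice of A and B, which is all a counterexample needs.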
 
Ray Vickson said:
Try a couple of small examples: let A and B be some specific 2x2 matrices. Compute both sides and see if you get equality or not.

So let's say I make A the identity matrix and B some random matrix. I would compute one side, ##(a+b)^2##, where a = A and b = B?

and do the same for the other side?

Thanks
 
Mdhiggenz said:
So let's say I make A the identity matrix and B some random matrix. I would compute one side, ##(a+b)^2##, where a = A and b = B?

and do the same for the other side?

Thanks

Don't pick one of them to be the identity; pick two different matrices, neither one an identity matrix. (Try it both ways to see why; that is, try it first when one is the identity, and then do it again with two non-identity matrices. Look at what happens, and that will give you a clue for solving the problem.)
 
Two matrices I wouldn't ever use for trying to find a counterexample would be the Identity and Zero matrices. They have too many special properties of their own.
 
LCKurtz: thanks for the tip. Do you think you can give me a matrix that you would always use for counterexamples, something simple?

And Ray, here is my work. Would you agree that the reason the two sides are not equal is the 2AB term?

[attached image: worked solution]
 
Mdhiggenz said:
LCKurtz: thanks for the tip. Do you think you can give me a matrix that you would always use for counterexamples, something simple?

And Ray, here is my work. Would you agree that the reason the two sides are not equal is the 2AB term?

[attached image: worked solution]
In particular, compare AB with BA.
 
So what I did was incorrect?
 
Mdhiggenz said:
So what I did was incorrect?
What you did was fine, and yes, it is the 2AB term that messes you up.

Did you compare AB with BA, to see why it's the 2AB that messes you up?
 
LCKurtz said:
Two matrices I wouldn't ever use for trying to find a counterexample would be the Identity and Zero matrices. They have too many special properties of their own.

Mdhiggenz said:
Lckurtz: thanks for the tip. You think you can give me a matrix that you would always use to prove, something simple.

No, there isn't a single special matrix. In the given example, as I think you are now aware, the point is that matrices don't generally satisfy ##AB=BA## so you don't get ##2AB## in the expansion when you multiply them out. So if you are looking for a counterexample, you don't want to choose, for example, A = the zero matrix, because it commutes with everything just because it always gives all zeroes. That is a general idea: If you are trying to prove some identity doesn't always work, look for an example by choosing variables that are unlikely to "accidentally" work because they have special properties. So if I were looking for examples in the current case, I wouldn't use the zero matrix, the identity matrix, a lower triangular matrix or a symmetric matrix. I would try a more "random" matrix that doesn't have any special properties. Does that make sense to you?
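As a concrete illustration of the non-commutativity LCKurtz describes, here is a minimal check in plain Python; the matrices are arbitrary choices of my own, deliberately avoiding the special forms he warns against:

```python
def matmul(X, Y):
    # Row-by-column product of two 2x2 matrices.
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Arbitrary matrices, avoiding "special" forms
# (zero, identity, triangular, symmetric).
A = [[1, 2], [3, 4]]
B = [[0, 1], [2, 3]]

AB = matmul(A, B)
BA = matmul(B, A)
print(AB)          # [[4, 7], [8, 15]]
print(BA)          # [[3, 4], [11, 16]]
print(AB == BA)    # False
```

Since AB and BA differ, the 2AB term in the real-number identity has no analogue here.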
 
SammyS said:
What you did was fine, and yes, it is the 2AB that messes you up.

Did you compare AB with BA ? --- to see why it's the 2AB that messes you up ?

I did AB and compared it to BA; they are different. This I understand. What I don't understand is what that has to do with 2AB?

Or is what you're saying that in order to show that something is 100% true we must look at both cases? For instance, ##A^2## no matter what will be ##A^2##. The same goes for ##B^2##, and A + B will be the same as B + A. However, 2AB does not equal 2BA, therefore the two sides can't possibly be equal.
 
LCKurtz said:
No, there isn't a single special matrix. In the given example, as I think you are now aware, the point is that matrices don't generally satisfy ##AB=BA## so you don't get ##2AB## in the expansion when you multiply them out. So if you are looking for a counterexample, you don't want to choose, for example, A = the zero matrix, because it commutes with everything just because it always gives all zeroes. That is a general idea: If you are trying to prove some identity doesn't always work, look for an example by choosing variables that are unlikely to "accidentally" work because they have special properties. So if I were looking for examples in the current case, I wouldn't use the zero matrix, the identity matrix, a lower triangular matrix or a symmetric matrix. I would try a more "random" matrix that doesn't have any special properties. Does that make sense to you?

Absolutely thanks for the clear explanation.
 
Mdhiggenz said:
I did AB and compared it to BA; they are different. This I understand. What I don't understand is what that has to do with 2AB?

Or is what you're saying that in order to show that something is 100% true we must look at both cases? For instance, ##A^2## no matter what will be ##A^2##. The same goes for ##B^2##, and A + B will be the same as B + A. However, 2AB does not equal 2BA, therefore the two sides can't possibly be equal.

Re: your comment above. Expand ##(A+B)(A+B)## symbolically the long way (distributive law), being careful about the order. What do you get?
 
Not quite sure what you mean; it says "Math Processing Error".
 
Do it just by manipulating the A and B's with a pencil and paper. Expand (A+B)(A+B).
 
LCKurtz said:
Do it just by manipulating the A and B's with a pencil and paper. Expand (A+B)(A+B).

Oh, I see now: ##A^2 + AB + BA + B^2##.

So writing AB + BA = 2AB implies that AB = BA, which would let you add them to get 2AB. That is incorrect because AB does not necessarily equal BA.

?
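That expansion, ##(A+B)^2 = A^2 + AB + BA + B^2##, holds for all square matrices and can be checked numerically. This sketch uses arbitrary example matrices of my own choosing:

```python
def matmul(X, Y):
    # Row-by-column product of two 2x2 matrices.
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matadd(*Ms):
    # Entrywise sum of any number of 2x2 matrices.
    return [[sum(M[i][j] for M in Ms) for j in range(2)] for i in range(2)]

# Arbitrary example matrices (hypothetical choices).
A = [[1, 2], [3, 4]]
B = [[0, 1], [2, 3]]

S = matadd(A, B)
lhs = matmul(S, S)                                  # (A+B)^2
rhs = matadd(matmul(A, A), matmul(A, B),
             matmul(B, A), matmul(B, B))            # A^2 + AB + BA + B^2
print(lhs == rhs)   # True
```

Only the further step of collapsing AB + BA into 2AB requires AB = BA, which fails in general.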
 
Yup, that's right.

Multiplication of real numbers has a property called commutativity, which means the order in which you multiply two numbers doesn't matter, e.g. ##2\times 4 = 4\times 2##. Hence, if a and b are real numbers, you can always say that ##(a+b)^2## equals ##a^2 + 2ab + b^2##.

Matrix multiplication doesn't have this property, as you found. When you're multiplying matrices, you have to remember that the order in which they appear is important. You can't just move them around like you are used to from working with real numbers.
 
It is worth considering all of this in light of vector spaces. Scalars (i.e., 1-vectors, or 1×1 matrices) don't exist in the same vector spaces in which N×N matrices exist (where N ≠ 1). Addition and scalar multiplication are defined separately in each distinct vector space. Matrix multiplication is something else altogether. So, when studying linear algebra, you can't just assume that "plus" means "plus" or that "times" means "times." You can't even assume that "zero" means "zero." The addition operation is defined differently for each vector space. Thus, the idea of adding a 3×1 vector to a 2×2 matrix doesn't make sense, because those two elements exist in different vector spaces; therefore, "addition" between two such elements is not even defined. Likewise, if v exists in a vector space V, then 0v = 0 is true, but the first 0 and the second 0 are not the same: the first 0 is a scalar, while the second 0 is the null vector of V.

Again, matrix multiplication is very different from multiplication between two scalars, and it is also different from scalar multiplication in any given vector space. Matrix multiplication actually maps matrices from one vector space to another (or to the same vector space when an N×N matrix is multiplied by another N×N matrix). It should therefore not surprise us that matrix multiplication is not commutative.

Here's a challenging question: Why do you suppose that matrix multiplication is defined as it is? That is, what was the motivating reason to define matrix multiplication the way it has been defined? But perhaps that is a question for a different thread.
 