Proving Matrix Identities for Commuting Matrices: (A+B)^2 and (A+B)^3

  • Thread starter: Physicsissuef
  • Tags: Matrix

Homework Help Overview

The discussion revolves around proving the validity of specific matrix equations involving commuting matrices A and B. The equations in question are (A+B)^2 = A^2 + 2AB + B^2 and (A+B)^3 = A^3 + 3A^2B + 3AB^2 + B^3 (the latter written as (A+B)^2 in the original post, a typo that is corrected later in the thread).

Discussion Character

  • Exploratory, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants discuss the initial steps for proving the equations without selecting specific matrices. There is a suggestion to consider the properties of matrix multiplication and the implications of commutativity. One participant proposes a method of expanding (A+B)(A+B) to derive the first equation, while another confirms the approach and suggests further multiplication for the second equation.

Discussion Status

The discussion is active, with participants exploring different methods to approach the proof. Some guidance has been provided regarding the expansion of matrix products and the importance of commutativity, though no consensus has been reached on the complete proof process.

Contextual Notes

Participants are working under the constraint of proving the equations rather than verifying them with specific examples. There is also a mention of a separate problem regarding determinants of matrices, indicating a broader context of matrix properties being discussed.

Physicsissuef

Homework Statement



Prove that if the matrices A and B commute (AB = BA), then the equations:

[tex]a) (A+B)^2=A^2+2AB+B^2 ; b) (A+B)^2=A^3+3A^2B+3AB^2+B^3[/tex]

are valid.

Homework Equations




The Attempt at a Solution



How should I start solving this task? Should I choose two commuting matrices? How would I know which matrices to choose?
 
You don't have to pick any two specific commuting matrices. In fact, if you did, it would be wrong, because you are asked to prove the identity, not verify it for one example.

For a start, note that in matrix multiplication the order usually matters: AB is not in general the same as BA. Let C = A + B. Now you have (A+B)^2 = C(A+B). Distribute C over the sum on the right to get CA + CB. Replace C with A + B, then distribute again, this time on the right. Finally, use the fact that AB = BA to get the identity above.
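As a numerical sanity check of the identity (not a substitute for the proof the problem asks for), one can pick a pair of matrices that commute by construction, e.g. B = 2A + 3I, since any polynomial in A commutes with A, and compare both sides. The helper names below (`matmul`, `matadd`, `scale`) are illustrative, not from the thread:

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matadd(*Ms):
    """Entrywise sum of 2x2 matrices."""
    return [[sum(M[i][j] for M in Ms) for j in range(2)] for i in range(2)]

def scale(c, M):
    """Multiply a 2x2 matrix by the scalar c."""
    return [[c * M[i][j] for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]
I = [[1, 0], [0, 1]]
B = matadd(scale(2, A), scale(3, I))   # B = 2A + 3I commutes with A

assert matmul(A, B) == matmul(B, A)    # AB = BA holds for this pair

S = matadd(A, B)
lhs = matmul(S, S)                     # (A+B)^2
rhs = matadd(matmul(A, A),             # A^2
             scale(2, matmul(A, B)),   # 2AB
             matmul(B, B))             # B^2
assert lhs == rhs                      # both sides agree: [[90, 126], [189, 279]]
```

This only spot-checks one commuting pair; the actual proof is the symbolic expansion described above.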
 
And can I solve it like this:

[tex](A+B)(A+B)=A^2+AB+BA+B^2[/tex]

AB=BA

[tex](A+B)^2=A^2+AB+AB+B^2=A^2+2AB+B^2[/tex]

?
 
Yes, of course!

There is a typo in your second question: it obviously should be (A+B)^3.

Now that you know that (A+B)^2 = A^2 + 2AB + B^2, multiply by (A+B) again!
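The same kind of spot-check works for the cubed identity. Again taking the commuting pair A and B = 2A + 3I (helper names are illustrative), one can verify (A+B)^3 = A^3 + 3A^2B + 3AB^2 + B^3 numerically:

```python
def mm(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add(*Ms):
    """Entrywise sum of 2x2 matrices."""
    return [[sum(M[i][j] for M in Ms) for j in range(2)] for i in range(2)]

def sc(c, M):
    """Multiply a 2x2 matrix by the scalar c."""
    return [[c * M[i][j] for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]
B = add(sc(2, A), sc(3, [[1, 0], [0, 1]]))   # B = 2A + 3I commutes with A
S = add(A, B)

lhs = mm(mm(S, S), S)                 # (A+B)^3
rhs = add(mm(mm(A, A), A),            # A^3
          sc(3, mm(mm(A, A), B)),     # 3 A^2 B
          sc(3, mm(A, mm(B, B))),     # 3 A B^2
          mm(mm(B, B), B))            # B^3
assert lhs == rhs                     # both sides agree: [[1674, 2430], [3645, 5319]]
```

As before, this checks one example only; the proof comes from multiplying (A+B)^2 = A^2 + 2AB + B^2 by (A+B) and using AB = BA.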
 
Ok. Thanks. I have another (I believe easy), but I can't figure out.

Choose two square matrices A and B (matrices with the same number of rows and columns):

a) 2 rows, 2 columns; b) 3 rows, 3 columns

and check the identity det(AB) = det(A)·det(B).

I know how to find det(A) and det(B), but what is det(AB)?
 
What do you mean by "what is det(AB)"? det(AB) is the determinant of the matrix obtained by multiplying A by B.
 
Oh, I think I get the point.
For det(AB), I should first find the product A·B and then take its determinant.

For det(A), I should find the determinant of A.

For det(B), I should find the determinant of B.

And then check that det(AB) = det(A)det(B).
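Exactly that procedure can be sketched for the 2x2 case, with matrices chosen arbitrarily for illustration (the helper names `mm` and `det2` are not from the thread):

```python
def mm(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(M):
    """Determinant of a 2x2 matrix: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

AB = mm(A, B)                           # first form the product A*B...
assert det2(AB) == det2(A) * det2(B)    # ...then its determinant factors
print(det2(A), det2(B), det2(AB))       # prints: -2 -2 4
```

Here det(A) = -2 and det(B) = -2, so det(AB) = (-2)·(-2) = 4, matching the determinant computed from the product directly.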
 
