Proof of the Left Distributive Law for matrices.

  • #1
Faizan Sheikh
Proof of the "Left Distributive Law" for matrices.

**A(B + C) = AB + AC**

Again we show that the general element of the left hand side is the same as the right hand side. We have

(A(B + C))ij = Σk Aik(B + C)kj      definition of matrix multiplication

= Σk Aik(Bkj + Ckj)      definition of matrix addition

= Σk (AikBkj + AikCkj)      distributive property of the real numbers

= Σk AikBkj + Σk AikCkj      commutative and associative properties of addition

= (AB)ij + (AC)ij      definition of matrix multiplication

where each sum is taken over k from 1 to n, with n the number of columns of A (equal to the number of rows of B and C). Σ is the sigma (summation) sign.
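The chain of equalities above can be checked numerically for any single entry (i, j). Here is a minimal sketch (editorial, not part of the original proof; the matrix values are arbitrary illustrative numbers) that computes the first and last lines of the derivation directly from the summation definition:

```python
# Check the proof's chain for one entry (i, j), using the summation
# definition (XY)_ij = sum over k of X[i][k] * Y[k][j].
# A is 2x3; B and C are 3x2, so the sum runs over k = 0, 1, 2.
A = [[1, 2, 3],
     [4, 0, -1]]
B = [[2, 5],
     [1, -3],
     [0, 4]]
C = [[7, 1],
     [2, 2],
     [3, -5]]

i, j = 0, 1  # any valid entry works the same way

# First line of the proof: sum of A_ik * (B + C)_kj
lhs = sum(A[i][k] * (B[k][j] + C[k][j]) for k in range(3))

# Last line of the proof: sum of A_ik * B_kj plus sum of A_ik * C_kj
rhs = (sum(A[i][k] * B[k][j] for k in range(3))
       + sum(A[i][k] * C[k][j] for k in range(3)))

assert lhs == rhs
print(lhs, rhs)  # both sides agree: 1 1
```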
______________________________________________________________

Can somebody tell me why there is a sigma sign in the first step? Wouldn't the proof be correct without the sigma signs? How do they even derive the sigma, or the sums, from the definition of matrix multiplication? Say we have a matrix A and a matrix B, and say AB is defined. Then the element in the first row and first column can be computed by multiplying and adding the corresponding entries of the first row of A and the first column of B. Now, for instance, if k = 1, wouldn't AikBkj become (Ai1)(B1j)? That would simply mean the first column of A multiplied by the corresponding entries from column 1 of B! But what is that supposed to mean? Shouldn't it have been the first row of A and the first column of B? Please help as soon as possible.
 
  • #2
Here 'k' is not a single fixed integer. It is a summation index that runs from 1 up to the number of columns of the first matrix (which equals the number of rows of the second matrix).

It means the (i,j)th element of the product matrix = the sum, over k, of the products of the (i,k)th element of the first matrix and the (k,j)th element of the second matrix.

| 1 2 3 |       | 3 0 |
| 9 0 3 |   x   | 2 8 |
                | 7 4 |

(1,1) element = 1x3 + 2x2 + 3x7 (k runs from 1 to 3). So he used Sigma (S) in the proof.
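That single-entry computation can be reproduced in a couple of lines (an editorial sketch, not from the post), which makes the role of the summation index k explicit:

```python
# (1,1) element of the product: sum over k of A[0][k] * B[k][0].
# Python indexes from 0, so k = 0, 1, 2 here corresponds to k = 1, 2, 3.
A = [[1, 2, 3],
     [9, 0, 3]]
B = [[3, 0],
     [2, 8],
     [7, 4]]

entry_11 = sum(A[0][k] * B[k][0] for k in range(3))
print(entry_11)  # 1*3 + 2*2 + 3*7 = 28
```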
 
  • #3
come on. this is just a big array of copies of the distributivity law for the dot product.

since a(b + c) = ab + ac, where these are numbers, multiplication by one number is linear, and since the sum of linear maps is linear, the dot product is also linear, and a matrix product is nothing but several dot products. done.
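The dot-product view above can be illustrated directly (illustrative vectors of my choosing, not from the post): distributivity for the dot product is exactly what each entry of the matrix identity uses.

```python
# Distributivity of the dot product: a . (b + c) = a . b + a . c,
# the scalar-level fact the matrix identity is built from.
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

a = [2, -1, 5]
b = [1, 4, 0]
c = [3, 2, -2]

bc = [bi + ci for bi, ci in zip(b, c)]  # b + c, componentwise
assert dot(a, bc) == dot(a, b) + dot(a, c)
print(dot(a, bc))  # -8 either way
```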


aiyee, I am blinded!
 
  • #4
Can somebody tell me why there is a sigma sign in the first step?
You just did:

"then the element in the first row and the first column can be computed by multiplying and adding the corresponding entries in the first row of A, and the first column of B"
 

What is the Left Distributive Law for matrices?

The Left Distributive Law for matrices states that the multiplication of a matrix by a sum of two matrices is equal to the sum of the multiplication of the matrix by each of the individual matrices. In other words, if A, B, and C are matrices of appropriate sizes, (A * (B + C)) = (A * B) + (A * C).

Why is the Left Distributive Law important in matrix operations?

The Left Distributive Law is important because it allows us to simplify and rearrange matrix equations. It also helps us to understand the relationship between matrix multiplication and addition.

Can you provide an example of the Left Distributive Law for matrices?

Yes. For example, if we have the matrices A = [1 2; 3 4], B = [5 6; 7 8], and C = [9 10; 11 12], then (A * (B + C)) = (A * B) + (A * C) holds. The result of (A * (B + C)) is [50 56; 114 128], and the result of (A * B) + (A * C) is also [50 56; 114 128].
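The arithmetic in this example can be double-checked with a short script (an editorial sketch, not part of the original answer):

```python
# Verify A(B + C) = AB + AC for the 2x2 example above.
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[9, 10], [11, 12]]

def matmul(X, Y):
    """2x2 product via the summation definition of matrix multiplication."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

BC = [[B[i][j] + C[i][j] for j in range(2)] for i in range(2)]  # B + C
lhs = matmul(A, BC)
rhs = [[matmul(A, B)[i][j] + matmul(A, C)[i][j] for j in range(2)]
       for i in range(2)]

assert lhs == rhs
print(lhs)  # [[50, 56], [114, 128]]
```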

Is the Left Distributive Law only applicable to matrices?

No, the Left Distributive Law can also be applied to other mathematical operations involving vectors, complex numbers, and certain algebraic structures.

How can the Left Distributive Law be proved for matrices?

The Left Distributive Law can be proved using the properties of matrix multiplication and the distributive property of real numbers. By expanding both sides of the equation and applying the properties, we can show that they are equal. This proof can also be generalized for matrices of any size as long as they are compatible for multiplication.
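The generalization to compatible matrices of any size can be exercised empirically; the sketch below (editorial, using randomly generated integer matrices) checks the identity across many random shapes and values:

```python
import random

def matmul(X, Y):
    """Matrix product via the summation definition (lists of rows)."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matadd(X, Y):
    """Entrywise matrix sum (lists of rows)."""
    return [[X[i][j] + Y[i][j] for j in range(len(X[0]))]
            for i in range(len(X))]

random.seed(0)
for _ in range(100):
    m, n, p = (random.randint(1, 5) for _ in range(3))
    A = [[random.randint(-9, 9) for _ in range(n)] for _ in range(m)]  # m x n
    B = [[random.randint(-9, 9) for _ in range(p)] for _ in range(n)]  # n x p
    C = [[random.randint(-9, 9) for _ in range(p)] for _ in range(n)]  # n x p
    assert matmul(A, matadd(B, C)) == matadd(matmul(A, B), matmul(A, C))
print("A(B + C) == AB + AC held for 100 random cases")
```

Of course, passing random trials is evidence, not proof; the entrywise sigma argument above is what establishes the law for all compatible matrices.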
