Proof of the Left Distributive Law for matrices.

  • Context: Graduate
  • Thread starter: Faizan Sheikh
  • Tags: Law, Matrices, Proof
Discussion Overview

The discussion revolves around the proof of the Left Distributive Law for matrices, specifically the equation A(B + C) = AB + AC. Participants explore the mathematical reasoning behind the proof, the role of the sigma notation in matrix multiplication, and the implications of linearity in matrix operations.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant presents a proof of the Left Distributive Law by showing that the general element of the left-hand side matches the right-hand side using definitions of matrix multiplication and addition.
  • Another participant questions the necessity of the sigma notation in the proof, suggesting that the proof could be valid without it and seeks clarification on how the sigma notation relates to matrix multiplication.
  • A different participant explains that 'k' in the sigma notation represents the number of columns in the first matrix, which corresponds to the number of rows in the second matrix, and illustrates this with an example of matrix multiplication.
  • Another participant argues that the proof essentially reiterates the distributive property of the dot product, asserting that since matrix multiplication involves linear maps, the distributive law follows naturally.
  • A follow-up post reiterates the question about the sigma notation, indicating a need for further clarification on its role in the proof.

Areas of Agreement / Disagreement

Participants express differing views on the necessity and interpretation of the sigma notation in the proof. While some clarify its meaning, others question its relevance, indicating that the discussion remains unresolved regarding the role of sigma in the context of matrix multiplication.

Contextual Notes

There are limitations in the discussion regarding the assumptions made about matrix dimensions and the definitions of matrix operations, which are not fully explored. The relationship between linearity and matrix multiplication is also touched upon but not deeply analyzed.

Faizan Sheikh
Proof of the "Left Distributive Law" for matrices.

A(B + C) = AB + AC

Again we show that the general element of the left-hand side is the same as that of the right-hand side. We have

$$(A(B + C))_{ij} = \sum_{k=1}^{n} A_{ik}(B + C)_{kj}$$ (definition of matrix multiplication)

$$= \sum_{k=1}^{n} A_{ik}(B_{kj} + C_{kj})$$ (definition of matrix addition)

$$= \sum_{k=1}^{n} \left( A_{ik}B_{kj} + A_{ik}C_{kj} \right)$$ (distributive property of the real numbers)

$$= \sum_{k=1}^{n} A_{ik}B_{kj} + \sum_{k=1}^{n} A_{ik}C_{kj}$$ (commutative and associative properties of addition)

$$= (AB)_{ij} + (AC)_{ij}$$ (definition of matrix multiplication)

where the sum runs over k from 1 to n, and n is the number of columns of A (which equals the number of rows of B and C).
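The entrywise derivation above can be sanity-checked numerically. Below is a minimal sketch in plain Python (no libraries); the helper names `mat_mul` and `mat_add` and the chosen dimensions are illustrative, not from the thread:

```python
import random

def mat_mul(A, B):
    """(i,j) entry of AB is the sum over k of A[i][k] * B[k][j]."""
    n = len(B)  # columns of A == rows of B
    return [[sum(A[i][k] * B[k][j] for k in range(n))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def mat_add(B, C):
    """Entrywise sum of two same-shaped matrices."""
    return [[B[i][j] + C[i][j] for j in range(len(B[0]))]
            for i in range(len(B))]

random.seed(0)
A = [[random.randint(-5, 5) for _ in range(3)] for _ in range(2)]  # 2x3
B = [[random.randint(-5, 5) for _ in range(4)] for _ in range(3)]  # 3x4
C = [[random.randint(-5, 5) for _ in range(4)] for _ in range(3)]  # 3x4

# A(B + C) and AB + AC agree entry by entry
assert mat_mul(A, mat_add(B, C)) == mat_add(mat_mul(A, B), mat_mul(A, C))
```

Note that the `sum(... for k in range(n))` inside `mat_mul` is exactly the sigma in the first step of the proof: each entry of the product is itself a sum, so the sigma appears the moment the definition is written out.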
______________________________________________________________

Can somebody tell me why there is a sigma sign in the first step? Wouldn't the proof be correct without the sigma signs? How do they even derive the sigma, or sums, from the definition of matrix multiplication? Say we have a matrix A and a matrix B, and say AB is defined; then the element in the first row and the first column can be computed by multiplying and adding the corresponding entries in the first row of A and the first column of B. Now if, for instance, k = 1, wouldn't $A_{ik}B_{kj}$ become $A_{i1}B_{1j}$? That would simply mean the first column of A multiplied by the corresponding entries from the first row of B! But what is that supposed to mean? Shouldn't it have been the first row of A and the first column of B? Please help as soon as possible.
 
Here 'k' is not a fixed integer. It is the summation index, and it runs from 1 up to the number of columns of the first matrix (which equals the number of rows of the second).

It means the (i,j)th element of the product matrix is the sum, over all k, of the products of the (i,k)th element of the first matrix and the (k,j)th element of the second.

| 1 2 3 |       | 3 0 |
| 9 0 3 |   x   | 2 8 |
                | 7 4 |

(1,1) element = 1x3 + 2x2 + 3x7 = 28 (here k runs from 1 to 3). So he used the sigma (Σ) in the proof.
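That k = 1..3 sum for the (1,1) entry can be spelled out in a couple of lines of plain Python (the variable names are just for illustration; note Python indexes from 0, so the (1,1) entry is `[0][0]`):

```python
A = [[1, 2, 3],
     [9, 0, 3]]
B = [[3, 0],
     [2, 8],
     [7, 4]]

# (1,1) entry: sum over k of A[1][k] * B[k][1] in 1-based notation
entry_11 = sum(A[0][k] * B[k][0] for k in range(3))
print(entry_11)  # 1*3 + 2*2 + 3*7 = 28
```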
 
come on. this is just a big array of copies of the distributivity law for the dot product.

since a(b + c) = ab + ac where these are numbers, multiplication by one number is linear; since the sum of linear maps is linear, the dot product is also linear; and a matrix product is nothing but several dot products. done.
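The dot-product view can be checked in a few lines. A minimal sketch in plain Python, with arbitrary example vectors (each entry of A(B + C) is a row of A dotted with a column of B + C, so distributivity of the dot product is the whole story):

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(x * y for x, y in zip(u, v))

u = [1.0, -2.0, 3.0]   # think: a row of A
v = [4.0, 0.0, -1.0]   # think: a column of B
w = [2.0, 5.0, 1.0]    # think: the matching column of C

# distributivity of the dot product: u·(v + w) = u·v + u·w
lhs = dot(u, [vi + wi for vi, wi in zip(v, w)])
rhs = dot(u, v) + dot(u, w)
assert lhs == rhs
```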


aiyee I am blinded!
 
"Can somebody tell me why there is a sigma sign in the first step?"
You just did:

"then the element in the first row and the first column can be computed by multiplying and adding the corresponding entries in the first row of A, and the first column of B"
 
