Product of Two Summations for Matrices and Vectors

AI Thread Summary
The discussion focuses on proving the equality of two summations involving matrices A and B and a position vector x. Participants explore methods to demonstrate the interchangeability of summations and the associative property of matrix multiplication. One user attempts to expand and group terms but struggles to formalize the proof, while others suggest using known proofs from resources like ProofWiki. The conversation highlights a preference for understanding the application of these principles over rigorous algebraic proofs. Ultimately, the discussion emphasizes the importance of grasping the foundational concepts of matrix operations and summations.
PWiz

Homework Statement


A and B are matrices and x is a position vector. Show that
$$\sum_{v=1}^n A_{\mu v}(\sum_{\alpha = 1}^n B_{v\alpha}x_{\alpha})=\sum_{v=1}^n \sum_{\alpha = 1}^n (A_{\mu v} B_{v\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^n \sum_{v=1}^n(A_{\mu v} B_{v\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^n (\sum_{v=1}^n(A_{\mu v} B_{v\alpha})x_{\alpha})$$

Homework Equations


N/A

The Attempt at a Solution


I tried expanding the summations out and multiplying the two brackets, each with n terms, but factorizing the product simply led back to what was first given in the question. I also tried visualizing the problem as matrices, but to no avail. This isn't really homework/part of my syllabus, but I'm trying to get the hang of tensors, and for that I need to understand summations. I'm familiar with basic summation procedures (e.g., for computing a standard deviation) but nothing of this sort. Any help is appreciated.
 
This is actually asking you to prove that ## \vec{\vec A}\; \left ( \vec{\vec B} \;\vec x\right ) = \left ( \vec{\vec A} \; \vec{\vec B} \right ) \vec x ##, right?
I don't think I understand what you did when you tried this "factorization of the product". Wouldn't it be more a thing like "grouping of the terms"?
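Before proving it, it may help to convince yourself numerically that the two sides agree. A minimal NumPy sketch — the sizes and variable names here are chosen purely for illustration, not taken from the problem:

```python
import numpy as np

# Sanity check that the two nested-sum expressions agree componentwise.
# Sizes and variable names are illustrative.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
x = rng.standard_normal(n)

# mu-th component of A(Bx): inner sum over alpha, outer sum over v
lhs = np.array([sum(A[mu, v] * sum(B[v, a] * x[a] for a in range(n))
                    for v in range(n)) for mu in range(n)])

# mu-th component of (AB)x: inner sum over v, outer sum over alpha
rhs = np.array([sum(sum(A[mu, v] * B[v, a] for v in range(n)) * x[a]
                    for a in range(n)) for mu in range(n)])

print(np.allclose(lhs, rhs))          # True
print(np.allclose(lhs, A @ (B @ x)))  # agrees with the matrix form too
```

The explicit loops mirror the index structure of the sums; the last line just confirms that both nested sums reproduce the familiar matrix product.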
 
BvU said:
This is actually asking you to prove that ## \vec{\vec A}\; \left ( \vec{\vec B} \;\vec x\right ) = \left ( \vec{\vec A} \; \vec{\vec B} \right ) \vec x ##, right?
I don't think I understand what you did when you tried this "factorization of the product". Wouldn't it be more a thing like "grouping of the terms"?
Yes, I tried grouping the terms after expansion and sorting them in such a way that they could be reduced to a summation sequence, but it didn't work for me.
 
The result appears to be intuitive but it's just that I can't find a way to show it on paper.
 
But they are already in a summation sequence. The only thing that really happens is the interchange of the summations.
 
BvU said:
But they are already in a summation sequence. The only thing that really happens is the interchange of the summations.
But how do I show that they're interchangeable?
 
Hmm, I don't have an obvious answer to that. On ProofWiki they do an even more general case (x is a matrix C and the matrices aren't even square). They work from the two ends towards the middle and then 'use associativity' of the contraction. But it's tough to read for a physicist.
Could you check it out?
 
Thanks for the link. So a proof that matrix multiplication is associative is as good as a proof that multiplication of summation sequences is associative? If so, does this hold universally for all summation sequences?
P.S. The proof seems to be easy enough to follow :)
 
Hey, I'm just a physicist. Switch summations without blinking an eye :wink: Interchange summation and integration as if it were nothing (not even with fingers crossed).

If you find this proof easy enough to follow, you're better equipped than I am to generalize. (But I think it's OK :smile: ).
Do you want me to invite an expert from the homework helpers forum? (Because I realize I can't really underpin that.)
 
BvU said:
Hey, I'm just a physicist. Switch summations without blinking an eye :wink: Interchange summation and integration as if it were nothing (not even with fingers crossed).

If you find this proof easy enough to follow, you're better equipped than I am to generalize. (But I think it's OK :smile: ).
Do you want me to invite an expert from the homework helpers forum? (Because I realize I can't really underpin that.)
No, it's fine really. I doubt I'll be requiring a rigorous mathematical proof of this sort to proceed with my studies, and besides, I'm just doing this to expand my understanding of metrics. I'd be happier understanding the application of its principles rather than getting stuck on the algebra. Thanks!
 
If you multiply two summations, you get one of every combination of the indices, regardless of the order:

$$\sum_{i=1}^n A_{i}\sum_{j = 1}^m B_{j}=A_1(B_1 + B_2 + \dots + B_m) + \dots + A_n(B_1 + B_2 + \dots + B_m)$$
$$= (A_1 + A_2 + \dots + A_n)B_1 + \dots + (A_1 + A_2 + \dots + A_n)B_m$$
$$=\sum_{j=1}^m (\sum_{i = 1}^n A_{i})B_{j}$$

Etc.
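In loop terms, this is just the statement that two nested finite loops can run in either order. A minimal sketch of the scalar case (the names and sizes are illustrative):

```python
import numpy as np

# Every (i, j) pair contributes exactly once,
# whichever index runs on the outside.
rng = np.random.default_rng(1)
n, m = 3, 5
A = rng.standard_normal(n)
B = rng.standard_normal(m)

sum_ij = sum(A[i] * B[j] for i in range(n) for j in range(m))
sum_ji = sum(A[i] * B[j] for j in range(m) for i in range(n))
product = A.sum() * B.sum()

print(np.isclose(sum_ij, sum_ji), np.isclose(sum_ij, product))  # True True
```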
 
Alternatively, you can use induction on n. Here's the inductive step; the second equality applies the induction hypothesis to the (n-1)-term double sum:

$$\sum_{v=1}^n A_{\mu v}(\sum_{\alpha = 1}^n B_{v\alpha}x_{\alpha})$$
$$= \sum_{v=1}^{n-1} A_{\mu v}(\sum_{\alpha = 1}^{n-1} B_{v\alpha}x_{\alpha}) + \sum_{v=1}^{n-1}(A_{\mu v} B_{vn}x_{n}) + A_{\mu n}(\sum_{\alpha = 1}^n B_{n\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^{n-1} \sum_{v=1}^{n-1}(A_{\mu v} B_{v\alpha}x_{\alpha}) + \sum_{v=1}^{n-1}(A_{\mu v} B_{vn}x_{n}) + A_{\mu n}(\sum_{\alpha = 1}^n B_{n\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^{n} \sum_{v=1}^{n-1}(A_{\mu v} B_{v\alpha}x_{\alpha}) + A_{\mu n}(\sum_{\alpha = 1}^n B_{n\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^{n} \sum_{v=1}^{n}(A_{\mu v} B_{v\alpha}x_{\alpha})$$
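For what it's worth, the first equality (peeling the v = n and α = n terms off the full sum) can also be checked numerically. A minimal sketch, assuming NumPy and 0-based indexing, so the mathematical index n becomes n - 1; the sizes and the choice μ = 0 are illustrative:

```python
import numpy as np

# Check the first equality of the inductive step:
# full n-term double sum vs. the three-piece decomposition.
rng = np.random.default_rng(2)
n, mu = 5, 0
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
x = rng.standard_normal(n)

full = sum(A[mu, v] * sum(B[v, a] * x[a] for a in range(n))
           for v in range(n))
split = (sum(A[mu, v] * sum(B[v, a] * x[a] for a in range(n - 1))
             for v in range(n - 1))
         + sum(A[mu, v] * B[v, n - 1] * x[n - 1] for v in range(n - 1))
         + A[mu, n - 1] * sum(B[n - 1, a] * x[a] for a in range(n)))

print(np.isclose(full, split))  # True
```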
 
