Product of Two Summations for Matrices and Vectors

SUMMARY

The discussion centers on proving the equality of two summation expressions involving matrices A and B and a vector x. The key identity to demonstrate is $$\sum_{v=1}^n A_{\mu v}\left(\sum_{\alpha = 1}^n B_{v\alpha}x_{\alpha}\right) = \sum_{\alpha = 1}^n \left(\sum_{v=1}^n A_{\mu v} B_{v\alpha}\right)x_{\alpha},$$ which rests on the fact that the order of two finite summations can be interchanged. Participants explored several approaches, including term-by-term grouping and induction, to establish this summation form of the associativity of matrix multiplication. The consensus is that a solid grasp of these manipulations is essential groundwork for tensors and matrix algebra.

PREREQUISITES
  • Matrix multiplication and properties
  • Summation notation and manipulation
  • Induction principles in mathematical proofs
  • Basic understanding of tensors
NEXT STEPS
  • Study the properties of matrix multiplication and its associativity
  • Learn about the interchangeability of summation sequences in mathematical proofs
  • Explore tensor algebra and its applications in physics
  • Review induction techniques for proving mathematical statements
USEFUL FOR

Students of mathematics, physicists, and anyone interested in deepening their understanding of matrix operations and tensor analysis.

PWiz

Homework Statement


A and B are matrices and x is a position vector. Show that
$$\sum_{v=1}^n A_{\mu v}(\sum_{\alpha = 1}^n B_{v\alpha}x_{\alpha})=\sum_{v=1}^n \sum_{\alpha = 1}^n (A_{\mu v} B_{v\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^n \sum_{v=1}^n(A_{\mu v} B_{v\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^n (\sum_{v=1}^n(A_{\mu v} B_{v\alpha})x_{\alpha})$$

Homework Equations


N/A

The Attempt at a Solution


I tried expanding the summations out and multiplying the two brackets, each with n terms, but factorizing the product simply led back to what was given in the question. I also tried visualizing the problem in terms of matrices, but to no avail. This isn't really homework/part of my syllabus, but I'm trying to get the hang of tensors, and for that I need to understand summations. I'm familiar with basic summation procedures for computing standard deviations and the like, but nothing of this sort. Any help is appreciated.
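
Since the claim is a finite identity, a quick numerical check (not a proof, but useful for intuition) is easy to set up. Here is a minimal sketch in Python, assuming NumPy is available; the explicit generator expressions mirror the summations in the problem rather than calling matrix products:

```python
import numpy as np

# Hypothetical small test case: random n x n matrices and an n-vector.
n = 4
rng = np.random.default_rng(seed=0)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
x = rng.standard_normal(n)

mu = 2  # fix one row index, as in the problem statement

# Left-hand side: sum over v of A[mu,v] times the inner sum over alpha.
lhs = sum(A[mu, v] * sum(B[v, a] * x[a] for a in range(n)) for v in range(n))

# Right-hand side: sum over alpha of (sum over v of A[mu,v]*B[v,a]) times x[a].
rhs = sum(sum(A[mu, v] * B[v, a] for v in range(n)) * x[a] for a in range(n))

print(np.isclose(lhs, rhs))  # True, up to floating-point rounding
```

Both sides touch exactly the same n² products A[mu,v]·B[v,a]·x[a]; only the grouping differs.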
 
This is actually asking you to prove that ## \vec{\vec A}\; \left ( \vec{\vec B} \;\vec x\right ) = \left ( \vec{\vec A} \; \vec{\vec B} \right ) \vec x ##, right?
I don't think I understand what you did when you tried this "factorization of the product". Wouldn't it be more a thing like "grouping of the terms"?
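
To spell out that correspondence (it is left implicit in the thread): componentwise, ##(Bx)_v = \sum_{\alpha=1}^n B_{v\alpha}x_\alpha## and ##(AB)_{\mu\alpha} = \sum_{v=1}^n A_{\mu v}B_{v\alpha}##, so
$$\left(A(Bx)\right)_\mu = \sum_{v=1}^n A_{\mu v}\left(\sum_{\alpha=1}^n B_{v\alpha}x_\alpha\right), \qquad \left((AB)x\right)_\mu = \sum_{\alpha=1}^n \left(\sum_{v=1}^n A_{\mu v}B_{v\alpha}\right)x_\alpha.$$
The problem's chain of equalities is exactly the componentwise form of ##A(Bx) = (AB)x##.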
 
BvU said:
This is actually asking you to prove that ## \vec{\vec A}\; \left ( \vec{\vec B} \;\vec x\right ) = \left ( \vec{\vec A} \; \vec{\vec B} \right ) \vec x ##, right?
I don't think I understand what you did when you tried this "factorization of the product". Wouldn't it be more a thing like "grouping of the terms"?
Yes, I tried grouping the terms after expansion and sorting them in such a way that they could be reduced to a summation sequence, but it didn't work for me.
 
The result appears to be intuitive but it's just that I can't find a way to show it on paper.
 
But they are already in a summation sequence. The only thing that really happens is the interchange of the summations.
 
BvU said:
But they are already in a summation sequence. The only thing that really happens is the interchange of the summations.
But how do I show that they're interchangeable?
 
Hmm, I don't have an obvious answer to that. On ProofWiki they prove an even more general case (x is a matrix C, and the matrices aren't even square). They work from the two ends towards the middle and then 'use associativity' of the contraction. But it's tough to read for a physicist.
Could you check it out?
 
Thanks for the link. So a proof that matrix multiplication is associative is as good as a proof that multiplication of summation sequences is associative? If so, does this hold universally for all summation sequences?
P.S. The proof seems to be easy enough to follow :)
 
Hey, I'm just a physicist. Switch summations without blinking an eye :wink: Interchange summation and integration as if it were nothing (not even with fingers crossed).

If you find this proof easy enough to follow, you're better equipped than I am to generalize. (But I think it's OK :smile: ).
You want me to invite an expert on the homework helpers forum? (Because I realize I can't really underpin that.)
 
BvU said:
Hey, I'm just a physicist. Switch summations without blinking an eye :wink: Interchange summation and integration as if it were nothing (not even with fingers crossed).

If you find this proof easy enough to follow, you're better equipped than I am to generalize. (But I think it's OK :smile: ).
You want me to invite an expert on the homework helpers forum? (Because I realize I can't really underpin that.)
No, it's fine, really. I doubt I'll need a rigorous mathematical proof of this sort to proceed with my studies, and besides, I'm just doing this to expand my understanding of metrics. I'd be happier understanding the application of its principles rather than getting stuck on the algebra. Thanks!
 
If you have two summations multiplied, you get one of every combination of the indices, regardless of the order:

$$\sum_{i=1}^n A_{i}\sum_{j = 1}^m B_{j}=A_1(B_1 + B_2 + \dots + B_m) + \dots + A_n(B_1 + B_2 + \dots + B_m)$$
$$= (A_1 + A_2 + \dots + A_n)B_1 + \dots + (A_1 + A_2 + \dots + A_n)B_m$$
$$=\sum_{j=1}^m (\sum_{i = 1}^n A_{i})B_{j}$$

Etc.
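
Concretely, the double sum enumerates every (i, j) pair exactly once, so the iteration order cannot matter for a finite sum. A minimal sketch in plain Python with hypothetical values:

```python
from itertools import product

# Hypothetical data: any finite sequences of numbers work.
A = [2, 3, 5]        # n = 3 terms
B = [7, 11, 13, 17]  # m = 4 terms

# Sum i-then-j, sum j-then-i, and sum over the flat set of (i, j) pairs.
ij = sum(A[i] * B[j] for i in range(len(A)) for j in range(len(B)))
ji = sum(A[i] * B[j] for j in range(len(B)) for i in range(len(A)))
flat = sum(A[i] * B[j] for i, j in product(range(len(A)), range(len(B))))

# All three orderings add up the same n*m terms, so they agree, and each
# equals (sum of A) * (sum of B) by distributing the product.
assert ij == ji == flat == sum(A) * sum(B)
```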
 
Alternatively, you can use induction on n. Here's the inductive step:

$$\sum_{v=1}^n A_{\mu v}(\sum_{\alpha = 1}^n B_{v\alpha}x_{\alpha})$$
$$= \sum_{v=1}^{n-1} A_{\mu v}(\sum_{\alpha = 1}^{n-1} B_{v\alpha}x_{\alpha}) + \sum_{v=1}^{n-1}(A_{\mu v} B_{vn}x_{n}) + A_{\mu n}(\sum_{\alpha = 1}^n B_{n\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^{n-1} \sum_{v=1}^{n-1}(A_{\mu v} B_{v\alpha}x_{\alpha}) + \sum_{v=1}^{n-1}(A_{\mu v} B_{vn}x_{n}) + A_{\mu n}(\sum_{\alpha = 1}^n B_{n\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^{n} \sum_{v=1}^{n-1}(A_{\mu v} B_{v\alpha}x_{\alpha}) + A_{\mu n}(\sum_{\alpha = 1}^n B_{n\alpha}x_{\alpha})$$
$$= \sum_{\alpha = 1}^{n} \sum_{v=1}^{n}(A_{\mu v} B_{v\alpha}x_{\alpha})$$
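
The induction handles general n; for a fixed small n the identity can also be spot-checked symbolically. A sketch assuming SymPy is available, treating the matrix and vector entries as indeterminates so the two sides are compared as polynomials rather than floating-point numbers:

```python
import sympy as sp

n = 3   # small fixed size; the induction above extends this to any n
mu = 0  # one fixed row index

# Matrices and vector with purely symbolic entries.
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"A_{i}{j}"))
B = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"B_{i}{j}"))
x = sp.Matrix(n, 1, lambda i, j: sp.Symbol(f"x_{i}"))

# The two sides of the identity, written exactly as the nested sums.
lhs = sum(A[mu, v] * sum(B[v, a] * x[a] for a in range(n)) for v in range(n))
rhs = sum(sum(A[mu, v] * B[v, a] for v in range(n)) * x[a] for a in range(n))

# Both expand to the same polynomial sum of A_{mu v} B_{v a} x_a terms.
assert sp.expand(lhs - rhs) == 0
```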
 
