
Product of two summations

  1. Feb 17, 2015 #1
    1. The problem statement, all variables and given/known data
    A and B are matrices and x is a position vector. Show that
    $$\sum_{v=1}^n A_{\mu v}(\sum_{\alpha = 1}^n B_{v\alpha}x_{\alpha})=\sum_{v=1}^n \sum_{\alpha = 1}^n (A_{\mu v} B_{v\alpha}x_{\alpha})$$
    $$= \sum_{\alpha = 1}^n \sum_{v=1}^n(A_{\mu v} B_{v\alpha}x_{\alpha})$$
    $$= \sum_{\alpha = 1}^n (\sum_{v=1}^n(A_{\mu v} B_{v\alpha})x_{\alpha})$$
    2. Relevant equations
    N/A

    3. The attempt at a solution
    I tried expanding the summations out and multiplying the two brackets, each with n terms, but factorizing the product simply led back to what was given in the question. I also tried visualizing the problem as matrices, but to no avail. This isn't really homework or part of my syllabus; I'm trying to get the hang of tensors, and for that I need to understand summations. I'm familiar with basic summation procedures, such as those for computing a standard deviation, but nothing of this sort. Any help is appreciated.
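    For a concrete sanity check, here is a short Python sketch (illustrative only; the size n = 4 and the random entries are arbitrary) that evaluates the first and last expressions above for a fixed row index ##\mu## and compares them:
    ```python
    import random

    n = 4
    A = [[random.randint(-5, 5) for _ in range(n)] for _ in range(n)]
    B = [[random.randint(-5, 5) for _ in range(n)] for _ in range(n)]
    x = [random.randint(-5, 5) for _ in range(n)]
    mu = 0  # any fixed row index

    # Left-hand side: sum over v of A[mu][v] * (sum over alpha of B[v][alpha] * x[alpha])
    lhs = sum(A[mu][v] * sum(B[v][a] * x[a] for a in range(n)) for v in range(n))

    # Right-hand side: sum over alpha of (sum over v of A[mu][v] * B[v][alpha]) * x[alpha]
    rhs = sum(sum(A[mu][v] * B[v][a] for v in range(n)) * x[a] for a in range(n))

    print(lhs, rhs, lhs == rhs)  # the two numbers always agree
    ```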
     
  3. Feb 17, 2015 #2

    BvU


    This is actually asking you to prove that ## \vec{\vec A}\; \left ( \vec{\vec B} \;\vec x\right ) = \left ( \vec{\vec A} \; \vec{\vec B} \right ) \vec x ## , right ?
    I don't think I understand what you did when you tried this "factorization of the product". Wouldn't it be more a thing like "grouping of the terms"?
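    As a quick illustration of that reading, a one-line numerical check (sketched here with numpy; any linear-algebra library would do) compares the two sides for random ##A##, ##B##, ##\vec x##:
    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))
    x = rng.standard_normal(3)

    # A(Bx) versus (AB)x: the mu-th component of each is exactly the double sum above
    print(np.allclose(A @ (B @ x), (A @ B) @ x))  # True (up to floating-point error)
    ```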
     
  4. Feb 17, 2015 #3
    Yes, I tried grouping the terms after expansion and sorting them in such a way that they could be reduced to a summation sequence, but it didn't work for me.
     
  5. Feb 17, 2015 #4
    The result appears intuitive; it's just that I can't find a way to show it on paper.
     
  6. Feb 17, 2015 #5

    BvU


    But they are already in a summation sequence. The only thing that really happens is the interchange of the summations.
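    For instance, with ##n = 2## and a fixed ##\mu##, either order of summation just lists the same four terms:
    $$\sum_{v=1}^2 \sum_{\alpha = 1}^2 A_{\mu v} B_{v\alpha}x_{\alpha} = A_{\mu 1}B_{11}x_1 + A_{\mu 1}B_{12}x_2 + A_{\mu 2}B_{21}x_1 + A_{\mu 2}B_{22}x_2 = \sum_{\alpha = 1}^2 \sum_{v=1}^2 A_{\mu v} B_{v\alpha}x_{\alpha}$$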
     
  7. Feb 17, 2015 #6
    But how do I show that they're interchangeable?
     
  8. Feb 17, 2015 #7

    BvU


    Hmm, I don't have an obvious answer to that. On ProofWiki they do an even more general case (x is a matrix C and the matrices aren't even square). They work from the two ends towards the middle and then 'use associativity' of the contraction. But it's tough to read for a physicist.
    Could you check it out?
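    As an illustration of that more general statement (the shapes below are arbitrary), the same interchange of sums is what makes the product associative for rectangular matrices too; a quick numpy sketch:
    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((2, 3))  # 2x3
    B = rng.standard_normal((3, 5))  # 3x5
    C = rng.standard_normal((5, 4))  # 5x4

    # (AB)C versus A(BC): both are 2x4, built from the same double sum over the inner indices
    print(np.allclose((A @ B) @ C, A @ (B @ C)))  # True
    ```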
     
  9. Feb 17, 2015 #8
    Thanks for the link. So a proof that shows matrix multiplication is associative is as good as a proof that shows that multiplication of summation sequences is associative? If so, does this hold universally for all summation sequences?
    P.S. The proof seems to be easy enough to follow :)
     
  10. Feb 17, 2015 #9

    BvU


    Hey, I'm just a physicist. I switch summations without blinking an eye :wink: and interchange summation and integration as if it were nothing (not even with fingers crossed).

    If you find this proof easy enough to follow, you're better equipped than I am to generalize. (But I think it's OK :smile: ).
    Do you want me to invite an expert from the homework helpers forum? (Because I realize I can't really underpin that.)
     
  11. Feb 17, 2015 #10
    No, it's fine, really. I doubt I'll be requiring a rigorous mathematical proof of this sort to proceed with my studies, and besides, I'm just doing this to expand my understanding of metrics. I'd be happier understanding the application of its principles than getting stuck on the algebra. Thanks!
     
  12. Feb 17, 2015 #11

    PeroK


    If you multiply two summations, you get one term for every combination of the indices, regardless of the order:

    $$\sum_{i=1}^n A_{i}\sum_{j = 1}^m B_{j}=A_1(B_1 + B_2 + \dots + B_m) + \dots + A_n(B_1 + B_2 + \dots + B_m)$$
    $$= (A_1 + A_2 + \dots + A_n)B_1 + \dots + (A_1 + A_2 + \dots + A_n)B_m$$
    $$=\sum_{j=1}^m (\sum_{i = 1}^n A_{i})B_{j}$$

    Etc.
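    The same point in loop form (a throwaway sketch; the names and sizes are illustrative): whichever index runs in the outer loop, every pair ##(i, j)## contributes exactly once:
    ```python
    import random

    n, m = 3, 5
    A = [random.randint(-9, 9) for _ in range(n)]
    B = [random.randint(-9, 9) for _ in range(m)]

    product = sum(A) * sum(B)                                       # (sum_i A_i)(sum_j B_j)
    i_outer = sum(A[i] * B[j] for i in range(n) for j in range(m))  # i first, then j
    j_outer = sum(A[i] * B[j] for j in range(m) for i in range(n))  # j first, then i

    print(product == i_outer == j_outer)  # True: the same n*m terms in every case
    ```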
     
  13. Feb 17, 2015 #12

    PeroK


    Alternatively, you can use induction on n. Here's the inductive step:

    $$\sum_{v=1}^n A_{\mu v}(\sum_{\alpha = 1}^n B_{v\alpha}x_{\alpha})$$
    $$= \sum_{v=1}^{n-1} A_{\mu v}(\sum_{\alpha = 1}^{n-1} B_{v\alpha}x_{\alpha}) + \sum_{v=1}^{n-1}(A_{\mu v} B_{vn}x_{n}) + A_{\mu n}(\sum_{\alpha = 1}^n B_{n\alpha}x_{\alpha})$$
    $$= \sum_{\alpha = 1}^{n-1} \sum_{v=1}^{n-1}(A_{\mu v} B_{v\alpha}x_{\alpha}) + \sum_{v=1}^{n-1}(A_{\mu v} B_{vn}x_{n}) + A_{\mu n}(\sum_{\alpha = 1}^n B_{n\alpha}x_{\alpha})$$
    $$= \sum_{\alpha = 1}^{n} \sum_{v=1}^{n-1}(A_{\mu v} B_{v\alpha}x_{\alpha}) + A_{\mu n}(\sum_{\alpha = 1}^n B_{n\alpha}x_{\alpha})$$
    $$= \sum_{\alpha = 1}^{n} \sum_{v=1}^{n}(A_{\mu v} B_{v\alpha}x_{\alpha})$$
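    If it helps, each displayed line can also be evaluated numerically for one random instance (a quick sketch, not a substitute for the induction); all five agree:
    ```python
    import random

    n, mu = 4, 0  # matrix size and a fixed row index (arbitrary choices)
    A = [[random.randint(-5, 5) for _ in range(n)] for _ in range(n)]
    B = [[random.randint(-5, 5) for _ in range(n)] for _ in range(n)]
    x = [random.randint(-5, 5) for _ in range(n)]

    def S(f, k):
        """Sum f(i) over i = 0, ..., k-1 (indices 1..k in the 1-based notation above)."""
        return sum(f(i) for i in range(k))

    # One expression per displayed line; index n in the text is index n-1 here (0-based lists).
    line1 = S(lambda v: A[mu][v] * S(lambda a: B[v][a] * x[a], n), n)
    line2 = (S(lambda v: A[mu][v] * S(lambda a: B[v][a] * x[a], n - 1), n - 1)
             + S(lambda v: A[mu][v] * B[v][n - 1] * x[n - 1], n - 1)
             + A[mu][n - 1] * S(lambda a: B[n - 1][a] * x[a], n))
    line3 = (S(lambda a: S(lambda v: A[mu][v] * B[v][a] * x[a], n - 1), n - 1)
             + S(lambda v: A[mu][v] * B[v][n - 1] * x[n - 1], n - 1)
             + A[mu][n - 1] * S(lambda a: B[n - 1][a] * x[a], n))
    line4 = (S(lambda a: S(lambda v: A[mu][v] * B[v][a] * x[a], n - 1), n)
             + A[mu][n - 1] * S(lambda a: B[n - 1][a] * x[a], n))
    line5 = S(lambda a: S(lambda v: A[mu][v] * B[v][a] * x[a], n), n)

    print(line1 == line2 == line3 == line4 == line5)  # True
    ```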
     