
Doing G A Gdagger in Python

  1. Jun 21, 2013 #1

    I am working on a device simulation algorithm, and am implementing it in Python on my laptop that runs Arch Linux. A particular step requires me to perform a computation of the form

    [tex]\tilde{G} = G \Sigma_{c}^{in} G^{\dagger}[/tex]

    where [itex]G[/itex] and [itex]\Sigma_{c}^{in}[/itex] are both [itex]N \times N[/itex] matrices, and [itex]\Sigma_{c}^{in}[/itex] is in fact a diagonal matrix.

N is on the order of 5040.

    Now, if I do this naively, the command to do it in Python is

    Code (Text):
    Gtilde_n = Gtilde_n + dot( dot(G, sigcIn), conjugate(transpose(G)) )
    (The addition is for purposes of sub-band summation, and isn't relevant to the question I'm posing.)

    but this takes about 2 hours when it works (sometimes it just crashes my laptop).

Now if I utilize the fact that [itex]\Sigma_{c}^{in}[/itex] is diagonal, then I can do something more explicit and write for loops of the form

    Code (Text):

    for row in range(N):
        for col in range(N):
            acc = 0
            for alpha in range(N):
                acc = acc + G[row, alpha] * sigcIn[alpha, alpha] * conjugate(G[col, alpha])
            Gtilde_n[row, col] = Gtilde_n[row, col] + acc
    But this doesn't look like a nice piece of code: it has three nested for loops, and it is a rather ancient way of doing the following calculation

    [tex]\tilde{G}_{ij} = \sum_{\alpha}\sum_{\beta} G_{i\alpha}(\Sigma_{c}^{in})_{\alpha\beta}(G^{\dagger})_{\beta j} = \sum_{\alpha,\beta} G_{i\alpha}(\Sigma_{c}^{in})_{\alpha\alpha}\delta_{\alpha\beta} (G^{\dagger})_{\beta j} =\sum_{\alpha} G_{i\alpha}(\Sigma_{c}^{in})_{\alpha\alpha}G^{*}_{j\alpha}[/tex]

    Is there some way to vectorize this?

    In general, is there a better way to perform this computation?
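    As a sketch of one way to vectorize this with NumPy (assuming G and sigcIn are NumPy arrays; the other names here are illustrative): since [itex]\Sigma_{c}^{in}[/itex] is diagonal, the first matrix multiply can be replaced by an elementwise column scaling, which turns one of the two O(N³) multiplications into an O(N²) broadcast.

    ```python
    import numpy as np

    N = 6  # small size for illustration; the real problem uses N ~ 5040
    rng = np.random.default_rng(0)
    G = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
    sigcIn = np.diag(rng.standard_normal(N))  # diagonal in-scattering matrix

    # Naive triple product: two full O(N^3) matrix multiplies
    naive = G @ sigcIn @ G.conj().T

    # Exploit the diagonal structure: broadcasting G * d scales
    # column alpha of G by d[alpha], so only one matrix multiply remains
    d = np.diag(sigcIn)          # 1-D array of the diagonal entries
    fast = (G * d) @ G.conj().T

    assert np.allclose(naive, fast)
    ```

    The remaining single matrix multiply is dispatched to the BLAS library backing NumPy, which should be far faster than any pure-Python loop.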
    Last edited: Jun 21, 2013
  3. Jun 21, 2013 #2
    Trivially, by partitioning α's range for each ij, but I would rather try to exploit this:
    [tex]G_{i\alpha} G^{*}_{j\alpha} = \left(G_{j\alpha} G^{*}_{i\alpha}\right)^{*} [/tex] and/or (given that Σ is real, which I suspect) premultiply each [itex](G_{i})_{\alpha}[/itex] vector component-wise by a vector made of the diagonal components of Σ. That could well be done in parallel.
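    A small sketch of the symmetry being suggested (the setup here is illustrative, not from the thread): when the diagonal of Σ is real, the result [itex]\tilde{G}[/itex] is Hermitian, so only one triangle would need to be computed explicitly.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 5
    G = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
    d = rng.standard_normal(N)  # real diagonal of Sigma

    # Component-wise premultiplication by the diagonal, then one multiply
    Gtilde = (G * d) @ G.conj().T

    # Because d is real, Gtilde[i, j] = conj(Gtilde[j, i]):
    assert np.allclose(Gtilde, Gtilde.conj().T)
    ```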

    Are there any known algebraic constraints on [itex]G[/itex]?
    Last edited: Jun 21, 2013
  4. Jun 21, 2013 #3
    Thanks for your reply.

    No, it's just the matrix representation of a Green's function.
  5. Jun 21, 2013 #4


    Staff: Mentor

    What does this notation mean - [itex]\Sigma_{c}^{in}[/itex]?

    You said that it was a diagonal matrix, but I'm curious about the significance of c and in.
  6. Jun 22, 2013 #5
    You're sure there's no constraint related to its eigenspaces?
    Last edited: Jun 22, 2013
  7. Jun 22, 2013 #6
    So [itex]\Sigma_{c}^{in} = \Sigma_{1}^{in} + \Sigma_{2}^{in}[/itex] is simply the sum of two so-called in-scattering matrices, computed from the self-energy in a many-body problem. There are two terms because there are two contacts. For each contact,

    [tex]\Sigma_{k}^{in} = f(E)\Gamma_k[/tex]

    k = 1, 2, and

    [tex]\Gamma_k = i[\Sigma_k - \Sigma_k^\dagger][/tex]

    It turns out that [itex]\Gamma_k[/itex] is a real matrix, i.e. all its elements are real.
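    As a quick numerical check (with an illustrative random Σ, not the thread's actual self-energy): the construction [itex]\Gamma_k = i[\Sigma_k - \Sigma_k^\dagger][/itex] is always Hermitian by definition; whether it is also real depends on the particular Σ.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N = 4
    Sigma = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))

    # Gamma = i (Sigma - Sigma^dagger) is Hermitian:
    # Gamma^dagger = -i (Sigma^dagger - Sigma) = Gamma
    Gamma = 1j * (Sigma - Sigma.conj().T)
    assert np.allclose(Gamma, Gamma.conj().T)
    ```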