General Linear Algebra equation

  • Thread starter Skatch
  • #1
This isn't really a homework question, but there are more people in this forum, so hopefully someone can help. It's just basic linear algebra.

I've got this operator in the form of a matrix, L. It acts on another matrix, u, either by multiplying on the left or right. The operation I want in this particular case is

[tex]Lu + uL.[/tex]

But I need to invert this, so I want to write it as Mu, so that I can find the inverse of M.

Is there some way to write Lu + uL in the form Mu?

I have a feeling there's something fundamental about multiplying on the right that won't allow me to do this.
 

Answers and Replies

  • #2
Matrix multiplication is not generally commutative, so it might be that Lu and uL are different.
However, if it turns out that Lu and uL are equal, then you have Lu + uL = Lu + Lu = 2Lu = Mu.
 
  • #3
Hi skatch! :smile:

Can we perhaps know the dimensions of L and u? I have a feeling that you want u to be a vector (i.e. an n×1 matrix), but that wouldn't make much sense...

In the case that you did mean u to be a vector, did you perhaps mean

[tex]Lu+u^TL[/tex]
 
  • #4
Yeah, unfortunately u and L definitely don't commute in this case. L is a finite difference operator, and the way I've set it up, it gives me the second-derivative approximations of u with respect to x when it multiplies on the left, and with respect to y when it multiplies on the right. So Lu + uL is what I was hoping to use as a discrete Laplacian operator, which in turn I would like to invert.

Might just have to go about this a different way.
 
  • #5
micromass: Nope, both are n×n matrices. u is an approximate solution over a grid of n×n points, u_ij = u(x_i, y_j).


I just might be headed down a dead end, and might need to define a different L for my discrete Laplacian so that I can invert it. This current setup looks like it probably won't work.
 
  • #6
Ray Vickson
You can regard u as an n^2-dimensional vector U, just by spreading it out as U = (u_{11},...,u_{1n},u_{21},...,u_{2n},...,u_{n1},...,u_{nn}). Now u → Lu + uL is a linear operator acting on the vector U, so it has a matrix representation M, which is an n^2 x n^2 matrix.

RGV
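
(Editorial aside, not part of the original reply: a minimal numpy sketch of this construction, with a placeholder random L. It builds the n^2 x n^2 matrix M column by column by applying u → Lu + uL to each basis matrix E_ij and flattening row by row, as described above.)

[code]
import numpy as np

n = 4
rng = np.random.default_rng(0)
L = rng.standard_normal((n, n))        # placeholder n x n operator

def op(u):
    # the map u -> Lu + uL that we want to represent by a single matrix M
    return L @ u + u @ L

# Build M column by column: column i*n + j is the image of the basis
# matrix E_ij, flattened row by row as described above.
M = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        M[:, i * n + j] = op(E).reshape(-1)

# Check: M applied to the flattened u reproduces Lu + uL.
u = rng.standard_normal((n, n))
print(np.allclose(M @ u.reshape(-1), op(u).reshape(-1)))   # True
[/code]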
 
  • #7
This might be a bit too abstract, but if you consider the map F(L) = L^2 as a mapping between matrix spaces, then you can consider its derivative. In this case the derivative of F at L in the direction of u is Lu + uL.

[tex] [D F(L)]u = Lu + uL[/tex]

where

[tex]F(L) = L^2[/tex]

and [DF(L)] is the "same" as the Jacobian matrix of F if you translate the matrices into column vectors in R^{n^2} and, after the calculations, change the result back into a matrix. Therefore

[tex]M=[DF(L)][/tex]

and finding M^-1 would be easy afterwards.
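
(Editorial aside, not part of the original reply: a quick finite-difference check of the claim that [DF(L)]u = Lu + uL for F(L) = L^2, using arbitrary numpy matrices.)

[code]
import numpy as np

n = 4
rng = np.random.default_rng(1)
L = rng.standard_normal((n, n))
u = rng.standard_normal((n, n))

F = lambda A: A @ A          # F(L) = L^2
eps = 1e-6

# Directional (Frechet) derivative of F at L in the direction u,
# approximated by a finite difference, versus the claimed formula Lu + uL.
numeric = (F(L + eps * u) - F(L)) / eps
exact = L @ u + u @ L

print(np.max(np.abs(numeric - exact)))   # O(eps): the leftover term is eps * u @ u
[/code]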
 
  • #8
Ignore my last post. This is easier.

[tex]Lu+uL=Mu[/tex]

therefore

[tex]M=(Lu+uL)u^{-1}[/tex]

then

[tex]M^{-1}=u(Lu+uL)^{-1}[/tex]
 
  • #9
trans: Won't quite work for my application; I need M to be independent of the matrix I want to apply it to. I'm going to be applying it many times, so I don't want to compute a new inverse each iteration.


Ray: I think you nailed it. After you said I would need an n^2 × n^2 matrix and a reshaped vector, I came up with [tex]L\otimes I + I\otimes L[/tex] as the matrix I need (using the Kronecker product), provided I reshape u correctly into an n^2 × 1 vector. [tex]L\otimes I[/tex] multiplied by the reshaped u gives me the same thing as multiplying the original u on the left by L, and [tex]I\otimes L[/tex] gives the multiplication on the right. Thanks!
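
(Editorial aside, not part of the original post: a small numpy check of this Kronecker-sum identity. The L below is a symmetric second-difference matrix of the kind described earlier in the thread; for symmetric L, L⊗I + I⊗L applied to the row-major flattening of u reproduces Lu + uL.)

[code]
import numpy as np

n = 5
# Symmetric 1-D second-difference operator (the kind of L described above).
L = (np.diag(-2.0 * np.ones(n))
     + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))
I = np.eye(n)

# Kronecker-sum representation of u -> Lu + uL on the row-major flattening of u.
M = np.kron(L, I) + np.kron(I, L)

rng = np.random.default_rng(2)
u = rng.standard_normal((n, n))

lhs = (L @ u + u @ L).reshape(-1)   # Lu + uL, flattened row by row
rhs = M @ u.reshape(-1)             # M acting on the flattened u
print(np.allclose(lhs, rhs))        # True

# M is n^2 x n^2 and independent of u, so it can be inverted (or, better,
# LU-factored) once and reused at every iteration.
[/code]

One caveat (my note, not from the thread): the eigenvalues of L⊗I + I⊗L are the pairwise sums of the eigenvalues of L, so M is invertible exactly when none of those sums vanish; for the second-difference L above the eigenvalues are all negative, so the inverse exists.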


(unrelated: don't suppose anyone can tell me how to write in-line tex instead of \equation lines?)
 
  • #10
Use the itex tag for inline tex.

:)
 
  • #11
T[itex]ha\eta k[/itex]s!
 
