## Conjugate gradient for nonsymmetric problem

Hi, I was wondering if it is possible to adapt the conjugate gradient method (or if there's a variation of it) to nonsymmetric boundary value problems.

For example, I want to solve something like a 2D square grid, where $f(x)=0$ for all $x$ on the boundary of the square, $f$ is held at prescribed values at two specified interior points (e.g. $f(x_{i_0,j_0})=1$, with $f(x_{i_1,j_1})$ fixed at another given value), and

$$f(x_{i,j})=.1f(x_{i-1,j})+.2f(x_{i+1,j})+.3f(x_{i,j-1})+.4f(x_{i,j+1})$$

for all other interior grid points $x_{i,j}$. If I reshape $f_{i,j}$ into a 1D vector $y_k$ and write the system of equations out, the matrix $A$ in the system I want to solve ($Ay=b$) is not symmetric.
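For concreteness, the sparse system could be assembled like this in Python (a sketch only: the grid size, the pinned-point locations, and the second pinned value are made-up placeholders, since the post doesn't specify them):

```python
import numpy as np
from scipy.sparse import lil_matrix

n = 10                      # interior grid is n x n (hypothetical size)
N = n * n

def k(i, j):
    """Flatten a 2D interior index (i, j) into a 1D index."""
    return i * n + j

# pinned interior points; the second value is a made-up placeholder
fixed = {(3, 3): 1.0, (6, 6): -1.0}

A = lil_matrix((N, N))
b = np.zeros(N)
for i in range(n):
    for j in range(n):
        row = k(i, j)
        A[row, row] = 1.0
        if (i, j) in fixed:
            b[row] = fixed[(i, j)]   # equation is just f_{ij} = value
            continue
        # f_ij - 0.1 f_{i-1,j} - 0.2 f_{i+1,j} - 0.3 f_{i,j-1} - 0.4 f_{i,j+1} = 0;
        # neighbors that fall on the boundary contribute f = 0 and drop out
        for di, dj, w in [(-1, 0, 0.1), (1, 0, 0.2), (0, -1, 0.3), (0, 1, 0.4)]:
            ii, jj = i + di, j + dj
            if 0 <= ii < n and 0 <= jj < n:
                A[row, k(ii, jj)] = -w
A = A.tocsr()
```

The unequal neighbor weights (0.1 to the left vs. 0.2 to the right, etc.) are exactly what makes $A$ nonsymmetric here: the coefficient coupling row $(i,j)$ to $(i-1,j)$ differs from the one coupling $(i-1,j)$ back to $(i,j)$.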

From what I've read, the conjugate gradient method only works for symmetric $A$, so I was wondering if there is some way to adapt the method, or a different way of setting up the system. If not, what would be the fastest way to solve this problem? (The only reason I'm interested in conjugate gradient is because I heard it's fast.) I'm currently using successive over-relaxation (SOR). Is there anything faster?

> **Quote by ihggin:** From what I've read, the conjugate gradient method only works for symmetric $A$, so I was wondering if there is some way to adapt the method, or a different way of setting up the system.

There are some traps here, because the biconjugate gradient method can be unstable. A practical stabilized variant is the BiCGSTAB algorithm (easy to find with a quick search).
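For example, SciPy ships BiCGSTAB off the shelf; here's a sketch on a small made-up nonsymmetric tridiagonal system (standing in for the grid matrix, with unequal off-diagonal weights mimicking the 0.1/0.2 stencil):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicgstab

# small made-up nonsymmetric, diagonally dominant test system
n = 50
A = diags([[-0.1] * (n - 1), [1.0] * n, [-0.4] * (n - 1)],
          offsets=[-1, 0, 1], format="csr")
b = np.ones(n)

x, info = bicgstab(A, b)            # info == 0 signals convergence
residual = np.linalg.norm(b - A @ x)
```

Unlike plain CG, BiCGSTAB makes no symmetry assumption, and unlike BiCG it needs only matrix-vector products with $A$ itself (no $A^T$ products).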
Another way the conjugate gradient method could be used is to solve

$$A^T A y = A^T b$$

> **Quote by Hurkyl:** Another way the conjugate gradient method could be used is to solve $A^T A y = A^T b$
True, provided it doesn't matter that the condition number of $A^T A$ is the square of the condition number of $A$, which may cost you numerical precision.

The biconjugate gradient method also involves multiplying vectors by $A^T$, but it doesn't degrade the condition number.
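The normal-equations idea can be tried without ever forming $A^T A$ explicitly, by wrapping the product $v \mapsto A^T(Av)$ as a linear operator so the sparsity of $A$ is preserved (a sketch on the same kind of made-up nonsymmetric test matrix):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

# made-up nonsymmetric, diagonally dominant test system
n = 50
A = diags([[-0.1] * (n - 1), [1.0] * n, [-0.4] * (n - 1)],
          offsets=[-1, 0, 1], format="csr")
b = np.ones(n)

# A^T A is symmetric positive definite (A is nonsingular here), so plain CG
# applies; the operator computes A^T (A v) without building A^T A
normal_op = LinearOperator((n, n), matvec=lambda v: A.T @ (A @ v), dtype=float)
y, info = cg(normal_op, A.T @ b)     # info == 0 signals convergence
residual = np.linalg.norm(b - A @ y)
```

The conditioning caveat above still applies; SciPy's `lsqr` implements essentially the same least-squares idea in a numerically steadier way, so it's often the safer choice in practice.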