Solving Non-Homogeneous Linear Systems with Eigenvector Expansions

  • Context: Graduate
  • Thread starter: I<3NickTesla
  • Tags: Eigenfunction

Discussion Overview

The discussion revolves around solving non-homogeneous linear systems using eigenvector expansions, exploring the analogy between matrix decompositions and function expansions in terms of eigenvectors and eigenfunctions. Participants delve into the mathematical framework and properties of Hermitian matrices, as well as the completeness of eigenfunctions.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Exploratory

Main Points Raised

  • One participant expresses confusion about the analogy between decomposing matrices and functions into their eigenvector and eigenfunction bases.
  • Another participant clarifies that a matrix acts as a linear map from one vector space to another, and emphasizes the completeness property of eigenfunctions.
  • A detailed explanation is provided regarding the spectral theorem for Hermitian matrices, stating that they possess a complete set of orthonormal eigenvectors that span the space.
  • The process of expanding both the solution vector and the known vector in terms of eigenvectors is outlined, leading to a formula for the solution of the linear system.
  • Participants discuss the relationship between the eigenvector expansion solution and Green's functions, noting similarities in their mathematical representations.

Areas of Agreement / Disagreement

Participants generally agree on the mathematical framework and properties of Hermitian matrices, but there is no consensus on the clarity of the analogy between matrix and function expansions, as one participant expresses confusion.

Contextual Notes

The discussion includes complex mathematical expressions and relies on the properties of linear algebra and functional analysis, which may not be fully accessible without additional context or definitions.

I<3NickTesla
In the picture attached I understand everything up to 1.12. I wrote "think of it like a matrix" at the time and that made sense, but now I don't really get it. There's obviously an analogy between decomposing a matrix into its eigenvector basis and a function into its eigenfunction basis, but I'm not really seeing it. Thanks.
 

Attachments

  • whatwhatwhat.jpg (41.9 KB)
Of course there is an analogy: a matrix can be thought of as a linear map from one finite-dimensional vector space to another. A linear operator is "the same" in the sense that it is a map from an (infinite-dimensional) vector space to another.

But that is not what you need here. What you use in 1.12 is the so-called completeness property of the eigenfunctions:

[tex] \sum_n f_n(x) f_n^*(x')=\delta(x-x')[/tex]
 
Let [itex]A[/itex] be an [itex]N\times N[/itex] Hermitian matrix, that is, [itex]A=A^\dagger[/itex]. Here [itex]\dagger[/itex] means conjugate transpose if [itex]A[/itex] is complex, and simply transpose if [itex]A[/itex] is real. By the spectral theorem of linear algebra, [itex]A[/itex] has a complete set of orthonormal eigenvectors, each of which satisfies [itex]A v_n = \lambda_n v_n[/itex]. Orthonormal means [itex](v_m, v_n) = \delta_{m,n}[/itex], where [itex](x, y)[/itex] is the inner product of two vectors [itex]x[/itex] and [itex]y[/itex], and [itex]\delta_{m,n}[/itex] is one if [itex]m=n[/itex] and zero otherwise. Since there are [itex]N[/itex] orthonormal eigenvectors, they must span our [itex]N[/itex]-dimensional space, so any vector can be represented as a sum of the eigenvectors.
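These two properties, orthonormality and completeness, can be checked numerically. Here is a minimal sketch using NumPy; the matrix A below is just an arbitrary Hermitian example, not from the thread's attachment:

```python
import numpy as np

# Build a random 4x4 Hermitian matrix: A = B + B^dagger is Hermitian by construction.
rng = np.random.default_rng(0)
B = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = B + B.conj().T

# For a Hermitian matrix, eigh returns real eigenvalues and
# orthonormal eigenvectors (as the columns of V).
lam, V = np.linalg.eigh(A)

# Orthonormality: (v_m, v_n) = delta_{mn}, i.e. V^dagger V = I
assert np.allclose(V.conj().T @ V, np.eye(4))

# Completeness: sum_n v_n v_n^dagger = I, the matrix analogue of
# sum_n f_n(x) f_n*(x') = delta(x - x')
assert np.allclose(V @ V.conj().T, np.eye(4))
```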

Now, consider
[tex] A x = b[/tex]
where we know [itex]b[/itex] but not [itex]x[/itex]. The idea is to expand both [itex]b[/itex] and [itex]x[/itex] in the eigenvectors,
[tex] \begin{eqnarray} x & = & \sum_{n=1}^N x_n v_n \\ b & = & \sum_{n=1}^N b_n v_n \end{eqnarray}[/tex]
From orthonormality, we can find the [itex]b_n[/itex] (these are just numbers)
[tex] b_n = (v_n,b).[/tex]
We can then solve for the [itex]x_n[/itex] as follows. We start with [itex]Ax=b[/itex] and substitute the series:
[tex] A \sum_{n=1}^N x_n v_n = \sum_{n=1}^N (v_n,b) v_n. [/tex]
The left hand side is then,
[tex] \begin{eqnarray} A \sum_{n=1}^N x_n v_n & = & \sum_{n=1}^N x_n A v_n \\ & = & \sum_{n=1}^N x_n \lambda_n v_n \end{eqnarray}[/tex]
Ax=b can therefore be written,
[tex] \sum_{n=1}^N x_n \lambda_n v_n = \sum_{n=1}^N (v_n,b) v_n [/tex]
Taking the inner product with [itex]v_m[/itex] yields
[tex] x_m \lambda_m = (v_m,b)[/tex]
so
[tex] x_m = \frac{(v_m,b)}{\lambda_m}.[/tex]
The solution is therefore
[tex] x = \sum_{n=1}^N \frac{v_n (v_n,b)}{\lambda_n}.[/tex]
This is the eigenvector expansion solution of the non-homogeneous linear system. It is analogous to the eigenvector expansion solution to the non-homogeneous SL problem (note the SL operator is Hermitian).
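The whole derivation can be verified numerically. Below is a sketch, again using NumPy's eigh; the symmetric matrix A and vector b are arbitrary examples:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.normal(size=(5, 5))
A = B + B.T                   # real symmetric, hence Hermitian, example matrix
b = rng.normal(size=5)

lam, V = np.linalg.eigh(A)    # columns of V are orthonormal eigenvectors

# Eigenvector expansion solution: x = sum_n v_n (v_n, b) / lambda_n.
# For real vectors the inner product (v_n, b) is just the dot product.
x = sum(V[:, n] * (V[:, n] @ b) / lam[n] for n in range(5))

# It should agree with a direct solve of A x = b.
assert np.allclose(A @ x, b)
assert np.allclose(x, np.linalg.solve(A, b))
```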

We almost have the Green's function analog, too. Note that the standard inner product is [itex](x,y)=x^\dagger y[/itex], so we can write,
[tex] \begin{eqnarray} x & = & \sum_{n=1}^N \frac{v_n v_n^\dagger b}{\lambda_n} \\ & = & \left( \sum_{n=1}^N \frac{v_n v_n^\dagger }{\lambda_n} \right) b. \end{eqnarray}[/tex]
The final quantity in parentheses is a matrix that is analogous to the Green's function for the SL problem.
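In matrix language that quantity is simply [itex]A^{-1}[/itex], which can be checked directly. A minimal sketch, where G plays the role of the discrete "Green's function" (the matrix name G is mine, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = B + B.conj().T            # Hermitian example matrix
lam, V = np.linalg.eigh(A)

# G = sum_n v_n v_n^dagger / lambda_n -- the matrix analogue of the Green's function
G = sum(np.outer(V[:, n], V[:, n].conj()) / lam[n] for n in range(4))

# Applying G to any b solves A x = b, i.e. G is the inverse of A.
assert np.allclose(G @ A, np.eye(4))
assert np.allclose(G, np.linalg.inv(A))
```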

hope this helps!
 
jasonRF said:
(full derivation quoted above)

That helped, thanks
 
