Eigenvalue problem with operators as matrix elements

wil3
Hello, I have a feeling that the solution to this question is going to be incredibly obvious, so my apologies if this turns out to be really dumb. How do I solve the following eigenvalue problem:

$$
\begin{bmatrix}
\partial_x^2 + \mu + u(x) & u(x)^2 \\
\overline{u(x)}^2 & \partial_x^2 + \mu + u(x)
\end{bmatrix}
\begin{bmatrix}
a(x)\\
b(x)
\end{bmatrix}
=
\omega(x)
\begin{bmatrix}
a(x)\\
b(x)
\end{bmatrix}
$$

where all x dependencies have been declared. I know the definition of the function u(x), but I need to solve for the eigenfrequency and eigenvectors.
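
Written out component-wise, this is the coupled pair of equations
$$\left(\partial_x^2 + \mu + u(x)\right) a(x) + u(x)^2\, b(x) = \omega(x)\, a(x),$$
$$\overline{u(x)}^2\, a(x) + \left(\partial_x^2 + \mu + u(x)\right) b(x) = \omega(x)\, b(x).$$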
 
wil3 said:
Hello, I have a feeling that the solution to this question is going to be incredibly obvious... How do I solve the following eigenvalue problem? [equation quoted above]
I assume that you mean to find the function omega of x? You need to specify, because it's unclear what you want. Is your use of ##\partial## indicating a partial derivative or a directional derivative? Be specific.
 
Typo in question: ##\omega## should not depend on x. I want to solve for ##\omega##, which is why I am calling this an eigenvalue problem. I'm not sure whether solving for ##\omega## will also give a and b, as in a standard linear system.

##\partial_x## indicates a partial derivative, which is standard notation in physics for this sort of problem. But the problem would be identical if you interpreted it as a directional derivative (both because the problem is one-dimensional and because the subscript x suggests a derivative in the x direction).
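
If a closed-form solution is out of reach, here is a minimal numerical sketch (not from this thread, and with placeholder choices for the grid, ##\mu##, and ##u(x)##): discretize ##\partial_x^2## with central finite differences on a periodic grid, assemble the 2x2 block operator as an ordinary ##2N\times 2N## matrix, and pass it to a dense eigensolver. Note that scipy.linalg.eig returns the eigenvalues ##\omega## and the eigenvectors together, so samples of ##a(x)## and ##b(x)## come out of the same call.

```python
# Minimal sketch: finite-difference discretization of the block operator
# eigenvalue problem.  N, L, mu, and u(x) = cos(x) are placeholder assumptions.
import numpy as np
from scipy.linalg import eig

N = 200                           # number of grid points (assumption)
L = 2 * np.pi                     # domain length, periodic (assumption)
x = np.linspace(0, L, N, endpoint=False)
dx = x[1] - x[0]
mu = 1.0                          # placeholder value for mu
u = np.cos(x)                     # placeholder definition of u(x)

# Second-derivative matrix: central differences with periodic wrap-around.
D2 = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1)
      + np.diag(np.ones(N - 1), -1))
D2[0, -1] = D2[-1, 0] = 1.0
D2 /= dx**2

# Diagonal blocks: d^2/dx^2 + mu + u(x); off-diagonal blocks: u^2 and conj(u)^2.
A = D2 + np.diag(mu + u)
B = np.diag(u**2)
M = np.block([[A, B],
              [np.conj(B), A]])

# eig returns the eigenvalues omega and the eigenvectors (a, b) together.
omega, vecs = eig(M)
order = np.argsort(omega.real)
a = vecs[:N, order]               # a(x) samples, one column per eigenvalue
b = vecs[N:, order]               # b(x) samples
print(omega[order][:5])           # a few of the smallest eigenvalues
```

For a fine grid, or if only a few eigenvalues are needed, a sparse solver such as scipy.sparse.linalg.eigs (with the blocks built from sparse matrices) would be the natural replacement for the dense call above.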
 
