Hi,

I asked this question in the quantum physics forum https://www.physicsforums.com/showthread.php?t=406171 but, as far as I can see, we could not find a proof there. Let me start with a description of the problem in quantum mechanical terms and then translate it into a more rigorous mathematical formulation.

Usually when considering bound states of the Schrödinger equation of a given potential one assumes that the wave function converges to zero for large radius.

One could argue that this is due to the requirement that the wave function be square integrable. But mathematically this is not necessary: one can construct wave functions consisting of a sequence of peaks whose widths shrink to zero, whose mutual distances grow, and whose heights increase with growing radius. Such a wave function does not converge to zero, but it nevertheless remains square integrable (provided the widths decrease fast enough). This may seem artificial, and the corresponding potential need not make sense physically, but it is perfectly valid mathematically.
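To make this concrete, here is a quick numerical sketch (my own toy example, not from the thread): place peaks at r = 1, 2, 3, ... with height n and width n^-6. Each peak then contributes roughly height² × width = n² · n⁻⁶ = n⁻⁴ to the squared L² norm, so the total norm is finite even though the peak heights grow without bound.

```python
import math

# Toy model of the "growing peaks" construction: peak n sits at r = n,
# has height n and width n**-6.  Its contribution to the squared L^2
# norm is roughly height**2 * width = n**-4, which is summable.
norm_sq = sum(n**2 * n**-6 for n in range(1, 100_000))
print(norm_sq)  # partial sums approach zeta(4) = pi**4 / 90 ≈ 1.0823
```

The exact profile of the peaks does not matter; only that the product height² × width decays fast enough to be summable.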

What I have not used so far is the fact that we are talking about a bound state, i.e. an eigenvalue in the discrete spectrum of the Hamiltonian. Is there an argument using properties of the discrete spectrum that forces the wave function to decay faster than 1/|r|, or something like that? Or could wave functions as pathological as the ones described above be bound states of some strange Hamiltonian?

Mathematically the problem looks as follows. There is a differential operator

[tex]H = -\Delta + V(\vec{r})[/tex]

defined on the Hilbert space [tex]L^2(R^D)[/tex]. Usually D=1 for academic problems and D=3 for the real world, but one can consider the general case and allow for arbitrary dimension D. [tex]\Delta[/tex] is the Laplacian on [tex]R^D[/tex], but perhaps one can go one step further and use a more general differentiable manifold [tex]M^D[/tex]. In that case the Laplacian has to be replaced by the Laplace-Beltrami operator, which we denote by [tex]\Delta_M[/tex].

We consider the eigenvalue equation

[tex](H-E)\psi = 0[/tex]

where [tex]E[/tex] is in the discrete spectrum.
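For orientation, here is a small stand-alone Python check (my own illustration, not from the thread) of the textbook case D=1 with [tex]V(x)=x^2[/tex]: the ground state [tex]\psi(x)=e^{-x^2/2}[/tex] has the discrete eigenvalue E=1 (in units where the Schrödinger operator is exactly [tex]-d^2/dx^2+x^2[/tex]), and it does decay to zero at infinity.

```python
import math

def psi(x):
    """Ground state of H = -d^2/dx^2 + x^2, with eigenvalue E = 1."""
    return math.exp(-x * x / 2)

h = 1e-4  # step for the finite-difference second derivative
for x in [0.0, 1.0, 3.0, 6.0]:
    second = (psi(x + h) - 2 * psi(x) + psi(x - h)) / h**2
    energy = (-second + x * x * psi(x)) / psi(x)
    # energy stays close to E = 1 at every sample point, while psi(x)
    # itself falls off like exp(-x**2 / 2)
    print(x, energy, psi(x))
```

This is of course the well-behaved case; the question is precisely which hypotheses on [tex]V[/tex] or [tex]H[/tex] force every discrete eigenfunction to behave this way.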

Then my questions are as follows:

1) What properties must the potential [tex]V(\vec{r})[/tex] have so that one can show that an eigenfunction [tex]\psi(\vec{r})[/tex] with eigenvalue [tex]E[/tex] in the discrete spectrum has the asymptotics

[tex]\lim_{|\vec{r}|\to\infty}\psi(\vec{r}) = 0[/tex]

2) Alternatively: is there a general property of [tex]H[/tex] defined on [tex]L^2(R^D)[/tex] (or, more precisely, on the Sobolev space [tex]W_2^2(R^D)[/tex]) that guarantees that every eigenfunction [tex]\psi(\vec{r})[/tex] has this asymptotic behaviour?

3) Alternatively: is there an argument based on general properties of [tex]L^2(R^D)[/tex] (or [tex]W_2^2(R^D)[/tex]) that rules out the above-mentioned pathological behaviour of [tex]\psi(\vec{r})[/tex]?

**Physics Forums | Science Articles, Homework Help, Discussion**


# L² Hilbert space, bound states, asymptotics of wave functions
