# Definition of a wavefunction as a function of position.

Hello, I wonder if anyone here can explain to me why the wavefunction as a function of position is defined like this:

$$\Psi(x,t) = \langle{x}|{\Psi(t)}\rangle$$

Why is it wise to use an inner product as a definition?

## Answers and Replies

A quantum state is an abstract object: a vector in a (generally infinite-dimensional) Hilbert space. A vector can be described a bit more concretely by its components with respect to a particular basis, e.g.:
##|\psi\rangle = \sum_i c_i |\phi_i\rangle##
where ##\{|\phi_i\rangle\}## is a basis for the Hilbert space. If it's an orthonormal basis, it has the advantage that a vector's components can be very easily recovered by an inner product: ##c_i = \langle \phi_i|\psi\rangle##. Hence, the wavefunction is just the projection of the quantum state onto the position basis, ##\{|x\rangle\}## (or, more generally, ##\{|\vec{r}\rangle\}##). Note the position basis isn't always an appropriate choice, such as when considering two-level quantum systems defined by, e.g., spin states.

In quantum mechanics, we also have the Born rule which says, roughly, that if we have a state expressed in an orthonormal basis and a measurement that allows us to determine which basis state the system is in, then the probability that the system will be in state ##|\phi_i\rangle## is ##|c_i|^2##. Thus, the squared magnitude of a particle's wavefunction has the usual interpretation as the probability density that the particle will be found at a particular place if its location is measured.
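The component-recovery step and the Born rule can be checked numerically. Here is a minimal sketch in a hypothetical 3-dimensional Hilbert space (numpy assumed; the random state is purely illustrative):

```python
import numpy as np

# A hypothetical 3-dimensional Hilbert space with the standard orthonormal basis.
rng = np.random.default_rng(0)
psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)          # normalize the state

basis = np.eye(3, dtype=complex)    # |phi_0>, |phi_1>, |phi_2> as columns

# Components recovered by inner products: c_i = <phi_i|psi>
# (np.vdot conjugates its first argument, matching <phi_i|)
c = np.array([np.vdot(basis[:, i], psi) for i in range(3)])

# Born rule: for a normalized state the probabilities |c_i|^2 sum to 1
probs = np.abs(c) ** 2
assert np.isclose(probs.sum(), 1.0)
```

Since the basis here is the standard one, the recovered components `c` are just the entries of `psi` itself; in any other orthonormal basis the same inner-product recipe applies.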

$\left|ψ\right\rangle$ is called the state vector for a system, and to understand its meaning, we need to project it onto a set of basis states.

Just as $\vec{A}$ is a vector that we give meaning by projecting it onto a complete basis (using an inner or "dot" product), e.g. real space in Cartesian coordinates:

$\vec{A}\cdot\hat{r} \rightarrow \langle{\hat{r}}\left|A\right\rangle$

where $\hat{r}$ ranges over the basis vectors $\hat{x}, \hat{y}, \hat{z} \rightarrow \left|x\right\rangle, \left|y\right\rangle, \left|z\right\rangle$

so now you'll calculate the inner product of $\vec{A}$ with each of the three basis vectors and obtain the "components" of $\vec{A}$ along each direction: $\langle{x}\left|A\right\rangle$, $\langle{y}\left|A\right\rangle$, and $\langle{z}\left|A\right\rangle$. Now you can understand the meaning of $\vec{A}$ because it has been projected onto a basis.

The state vector gives probability amplitudes. Probabilities of what? You need to project it onto a complete basis to give it meaning. A two-state system like the spin of a spin-1/2 particle is easy to understand. It can be "spin up" or "spin down". Let's take the z-axis as the reference for up or down, so our complete basis is formed by $\left|+z\right\rangle$ and $\left|-z\right\rangle$, meaning $\langle{+z}\left|ψ\right\rangle$ is the probability amplitude for the particle to be spin up and $\langle{-z}\left|ψ\right\rangle$ for spin down. (Take the squared magnitude for the actual probabilities.)

Now if our state vector is one that describes the position of a particle, giving an amplitude for it to be at any position $x$, we need to project it onto a basis to get a probability amplitude for it to be observed at any $x$. How many $x$ basis vectors are there? Infinitely many, since $x$ is continuous! We take this in stride, however, and calculate the probability amplitudes $\langle{x}\left|ψ\right\rangle$, each of which is just an inner product between two infinite-dimensional vectors.

All that background just to say since the amplitude $\langle{x}\left|ψ\right\rangle$ has different values for different values of $x$, it's just a function of $x$... the wavefunction, and we can just denote it as $\langle{x}\left|ψ\right\rangle \equiv ψ(x)$.
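To make the "infinitely many basis vectors" point concrete, one can discretize: on a grid, the state becomes a finite array of amplitudes and $\langle{x}\left|ψ\right\rangle$ is literally the entry at each grid point. A rough sketch (the Gaussian packet and grid are illustrative choices, not from the thread):

```python
import numpy as np

# Discretize position: each grid point x_j stands in for a basis ket |x_j>,
# and the state is just the array of amplitudes <x_j|psi>.
x = np.linspace(-5, 5, 1001)
dx = x[1] - x[0]

# Illustrative state: a Gaussian wave packet carrying momentum k0 (hbar = 1)
k0 = 2.0
psi = np.exp(-x**2 / 2) * np.exp(1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize: sum |psi|^2 dx = 1

# "Projecting onto |x_j>" just reads off the amplitude at x_j: the collection
# of these complex numbers IS the wavefunction psi(x), one value per x.
amplitude_at_origin = psi[500]                # x[500] == 0.0
prob_density = np.abs(psi)**2                 # Born rule: |<x|psi>|^2
```

In the continuum limit the sum over grid points becomes the integral $\int |\psi(x)|^2\,dx = 1$.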

dextercioby
The correct definition would be

$$\Psi \left(\vec{x},t\right) = \langle \vec{x},t|\Psi \rangle$$

and the full mathematical explanation is found in the theory of the representations of the Galilei group on a rigged Hilbert space.

Hello, I wonder if anyone here can explain to me why the wavefunction as a function of position is defined like this:

$$\Psi(x,t) = \langle{x}|{\Psi(t)}\rangle$$

Why is it wise to use an inner product as a definition?
Attached is an excerpt from Townsend's excellent quantum mechanics book that explains how wavefunctions and the Schrodinger equation emerge rather naturally from the bra-ket machinery.

Thank you guys. Your assistance has always been of great help to me.

So let's say I have a position basis. If I now apply the operator ##\hat{p}## to this basis, will I get a basis which corresponds to momentum space, so that the projections of ##|\Psi(t)\rangle## onto these basis vectors will be ##\Psi(p,t)##?

In other words: do operators transform the basis, the state vector, or both?

So let's say I have a position basis. If I now apply the operator ##\hat{p}## to this basis, will I get a basis which corresponds to momentum space, so that the projections of ##|\Psi(t)\rangle## onto these basis vectors will be ##\Psi(p,t)##?

In other words: do operators transform the basis, the state vector, or both?
No, that's not what QM operators do. (In the following, I will be assuming for simplicity, as I did in my first post, that the wavefunction is only spatially one dimensional, like the square well.)

There are two kinds of operators we use in QM: Hermitian and unitary. Hermitian operators are associated with measurements. That doesn't mean doing a measurement is the same as applying the operator. It just means that when we say ##\hat{P}## is the momentum operator, then its eigenvalues are the spectrum of possible momentum measurements. It's not a change of basis operator.

Unitary operators, on the other hand, are often physically realizable, and it does make sense to apply them to generic states like you're trying to do. Mathematically, they're generated by anti-Hermitian operators (which we can make by multiplying a Hermitian operator by ##i## and some arbitrary real constant) via an exponential map. The momentum operator generates translations in space: ##T(a) = e^{-ia\hat{P}/\hbar}, T(a)|x\rangle = |x+a\rangle## (a passive translation of the position basis to the right). In terms of wavefunctions, this means ##\langle x|T(a)|\Psi(t)\rangle = \Psi(x-a,t)## (an active translation of the wavefunction to the right). In the same manner, the Hamiltonian (the operator corresponding to the observable energy) generates time evolution of states and angular momentum operators generate rotation.
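The claim that ##T(a) = e^{-ia\hat{P}/\hbar}## translates the wavefunction can be verified numerically: in the momentum representation ##\hat{P}## acts as multiplication by ##\hbar k##, so applying the phase ##e^{-ika}## to the Fourier transform should reproduce ##\Psi(x-a,t)##. A sketch with ##\hbar = 1## (the grid and Gaussian are illustrative choices; numpy assumed):

```python
import numpy as np

# Apply T(a) = exp(-i a P) in the momentum representation, where P is
# multiplication by k. This should shift the wavefunction to the right:
# <x|T(a)|psi> = psi(x - a).
n = 1024
x = np.linspace(-20, 20, n, endpoint=False)
dx = x[1] - x[0]
a = 3.0

psi = np.exp(-x**2)                         # a Gaussian centered at x = 0

k = 2 * np.pi * np.fft.fftfreq(n, d=dx)     # momentum grid matching the FFT
psi_shifted = np.fft.ifft(np.fft.fft(psi) * np.exp(-1j * k * a))

# The result matches the analytically shifted Gaussian psi(x - a)
assert np.allclose(psi_shifted.real, np.exp(-(x - a)**2), atol=1e-6)
```

Note that the FFT treats the grid as periodic, so this only works cleanly when the wavefunction is negligible near the grid edges, as the Gaussian is here.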

So let's say I have a position basis. If I now apply the operator ##\hat{p}## to this basis, will I get a basis which corresponds to momentum space, so that the projections of ##|\Psi(t)\rangle## onto these basis vectors will be ##\Psi(p,t)##?

In other words: do operators transform the basis, the state vector, or both?
Attached is an extension of the previous Townsend excerpt, which explains momentum space wavefunctions and the relationship between the position basis and the momentum basis.

OK, so if I understood right, the inner product is a projection of a state vector's norm ##|\,|\psi(t) \rangle\,|## onto a normalised vector ##|x\rangle##. So this means that whatever I get is also a norm!

This means that:

$$\langle x | \psi(t)\rangle \neq \psi(x,t)$$

but rather:

$$\langle x | \psi(t)\rangle = |\psi(x,t)|$$

Is this correct? I mean, why wouldn't it be a norm?

Huh? An inner product does not give you a norm. You can define a norm via an inner product as ##\| |\psi\rangle \| = \sqrt{\langle \psi | \psi \rangle}##. The inner product is not a "projection of a state vector's norm"; it's a projection of one vector onto another.

$$\langle x | \psi(t)\rangle = |\psi(x,t)|$$

Is this correct?
No, it isn't. I don't really understand why you think it is.

OK, so if I understood right, the inner product is a projection of a state vector's norm ##|\,|\psi(t) \rangle\,|## onto a normalised vector $|x\rangle$.
What gave you that idea? ##\langle x|\psi\rangle## is the inner product of ##|\psi\rangle## and ##|x\rangle##.

So this means that whatever I get is also a norm!
I'm not sure what you mean by this.
This means that:

$$\langle x | \psi(t)\rangle \neq \psi(x,t)$$

but rather:

$$\langle x | \psi(t)\rangle = |\psi(x,t)|$$

Is this correct?
No, $$\langle x | \psi(t)\rangle = \psi(x,t)$$ is the correct expression.
I mean why wouldn't it be a norm?
Again, I'm not sure what you mean by this.

What gave you that idea?
Well, an inner product gives us a number, doesn't it (it may be complex or real, right)? After all, it is only a generalization of the scalar product, and they are very similar:

\begin{align}
\text{scalar product:}& & \vec{a}\cdot \vec{b} &= a_1b_1+a_2b_2+a_3b_3+\dots \xrightarrow{\text{geom. interp.}} \overbrace{|\vec{a}|}^{\rlap{\text{What would happen if this is normalized?}}}\,\underbrace{|\vec{b}|\,\cos{\varphi}}_{\rlap{\text{This is a projection of a norm.}}}\\
\text{inner product:}& & \left\langle a \right| \left. b\right\rangle &= \overline{a_1}b_1 + \overline{a_2}b_2 + \overline{a_3}b_3+\dots
\end{align}

The geometrical interpretation of the scalar product is known to me (Wiki), while I don't know the geometrical interpretation of an inner product. But if I look at the scalar product, I can say that in fact we project the norm of the vector ##\vec{b}## and not the vector ##\vec{b}## itself! We then multiply this norm by the norm of ##\vec{a}##, denoted ##|\vec{a}|##.

So what would happen if ##\vec{a}## was normalised? This means ##|\vec{a}| = 1##, and therefore the scalar product equals only the projection of the norm: ##\vec{a}\cdot\vec{b} = |\vec{b}|\cos\varphi##. So I thought that this also applies to the inner product ##\left \langle x\right| \left.\psi(t) \right\rangle##. After all, an inner product is a number and not a vector!

After all, an inner product is a number and not a vector!
No one ever said it gives you a vector. I still don't understand your confusion. The inner product gives you a complex number, as you said. Norms are always greater than or equal to zero, so clearly the inner product is not, by itself, a norm if its value can be complex.
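A two-line numerical illustration of that point (numpy assumed; the vectors are arbitrary examples, not from the thread):

```python
import numpy as np

# The inner product of two states is generally complex, so it cannot itself
# be a norm; the norm is the non-negative number sqrt(<psi|psi>).
u = np.array([1.0, 1j]) / np.sqrt(2)
v = np.array([1.0, 1.0]) / np.sqrt(2)

inner = np.vdot(u, v)                 # <u|v>: a complex number (here 0.5 - 0.5j)
norm_u = np.sqrt(np.vdot(u, u).real)  # sqrt(<u|u>): always real and >= 0

assert np.iscomplexobj(inner) and inner.imag != 0   # not a norm
assert norm_u >= 0                                  # a norm is non-negative
```

The inner product of a vector with itself, ##\langle u|u\rangle##, is always real and non-negative, which is exactly why it can be used to define a norm.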

...its value can be complex.
This changes everything indeed. Thank you.

CompuChip
Note that all inner products do induce a norm. If $\langle \cdot \mid \cdot \rangle$ is an inner product, then
$$|| \cdot ||: a \mapsto \sqrt{\langle a \mid a \rangle}$$
defines a norm.

I guess part of your confusion arises from the fact that the first inner product that you have learned, the scalar product $\vec a \cdot \vec b$ on $\mathbb{R}^n$, actually induces the Euclidean norm $||a||_\mathrm{Eucl} = \sqrt{a_1^2 + a_2^2 + \cdots + a_n^2}$. Because you already have so much intuition for these things, it is sometimes hard to distinguish between e.g. the inner product and its induced norm.

However there are many other inner products, the most common ones being
(a) $\langle \vec a \mid \vec b \rangle = a_1 b_1^* + \cdots + a_n b_n^*$ (with * denoting complex conjugation) on $\mathbb{C}^n$, which reduces to the previously mentioned product if a and b are real.
(b) $\langle \vec f \mid \vec g \rangle = \int_a^b f(x) g(x)^* \, dx$ on $C(a, b)$, the space of all continuous functions on the interval [a, b].

Also note that the converse is not necessarily true, e.g.
$$||a||_0 = \max\{ |a_1|, |a_2|, \ldots, |a_n| \}$$
is a norm on $\mathbb{R}^n$ as well, but there is no inner product $\langle \cdot \mid \cdot \rangle_0$ such that $||a||_0 = \sqrt{\langle a \mid a \rangle_0 }$.
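One standard way to verify that last claim, not spelled out in the post, is the parallelogram law: a norm is induced by an inner product only if ##\|a+b\|^2 + \|a-b\|^2 = 2\|a\|^2 + 2\|b\|^2## for all ##a, b##. The Euclidean norm passes; the max norm already fails for the standard basis vectors (a sketch, numpy assumed):

```python
import numpy as np

def parallelogram_holds(norm, a, b):
    """Check ||a+b||^2 + ||a-b||^2 == 2||a||^2 + 2||b||^2 for one pair a, b."""
    lhs = norm(a + b)**2 + norm(a - b)**2
    rhs = 2 * norm(a)**2 + 2 * norm(b)**2
    return bool(np.isclose(lhs, rhs))

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

# The Euclidean norm is induced by the dot product, so the law holds.
assert parallelogram_holds(np.linalg.norm, a, b)

# The max norm: lhs = 1 + 1 = 2 but rhs = 2 + 2 = 4, so the law fails,
# hence no inner product can induce it.
max_norm = lambda v: np.max(np.abs(v))
assert not parallelogram_holds(max_norm, a, b)
```

Failing the law for even one pair of vectors rules out any inner product, since the law is a theorem for every inner-product-induced norm.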

Demystifier
The correct definition would be

$$\Psi \left(\vec{x},t\right) = \langle \vec{x},t|\Psi \rangle$$

and the full mathematical explanation is found in the theory of the representations of the Galilei group on a rigged Hilbert space.
Can you give a reference please?

Learn mathematics from books written by mathematicians
Sorry, but I have to comment on your signature too:
Learn how to use mathematics in physics from books written by those who use mathematics in physics!

vanhees71
A good reference for the rigged-Hilbert-space approach and the Galilei group is

L. Ballentine, Quantum Mechanics, Addison-Wesley

For a mathematically more rigorous treatment see the two-volume book by Galindo/Pascual.

One should also note that in the bra-ket formalism it is important to specify which picture of time evolution one has in mind. In dextercioby's posting it was obviously the Heisenberg picture, where the operators representing observables carry the full time dependence and the state (statistical operator or representing state ket) is time independent. Then the full time dependence is in the (generalized) eigenvector of the representation. In the position representation you then have what dextercioby wrote:
$$\psi(t,\vec{x})=_{\mathrm{H}}\langle \vec{x},t|\psi \rangle_{\mathrm{H}}.$$
The wave function is invariant under the choice of the picture of time evolution. In non-relativistic quantum theory one more often sees the Schrödinger picture. Then the state kets carry the complete time dependence and the observable operators and their (generalized) eigenstates are time independent. Then you get
$$\psi(t,\vec{x})=_{\mathrm{S}} \langle \vec{x}|\psi,t \rangle_{\mathrm{S}}.$$
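This picture-independence is easy to check in a toy finite-dimensional model (a sketch assuming numpy and scipy; the Hamiltonian and state are random examples): the Schrödinger-picture amplitude ##\langle n|\psi,t\rangle_{\mathrm{S}} = \langle n|U(t)|\psi\rangle## equals the Heisenberg-picture amplitude formed with the evolved bra ##\langle n,t| = \langle n|U(t)##.

```python
import numpy as np
from scipy.linalg import expm

# Toy finite-dimensional check that the "wavefunction" (component of the state
# in a fixed observable's eigenbasis) is the same in both pictures.
rng = np.random.default_rng(1)
H = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (H + H.conj().T) / 2                 # random Hermitian Hamiltonian
t = 0.7
U = expm(-1j * H * t)                    # time-evolution operator (hbar = 1)

psi0 = rng.normal(size=4) + 1j * rng.normal(size=4)
psi0 /= np.linalg.norm(psi0)
n = np.array([1, 0, 0, 0], dtype=complex)   # a fixed eigenvector |n> at t = 0

# Schroedinger picture: the state evolves, the basis vector does not
schrodinger = np.vdot(n, U @ psi0)          # <n|psi, t>_S

# Heisenberg picture: the state is frozen; the eigenvector evolves as
# U^dagger |n>, so the bra is <n, t| = <n| U
heisenberg = np.vdot(U.conj().T @ n, psi0)  # <n, t|psi>_H

assert np.isclose(schrodinger, heisenberg)
```

Both expressions reduce to the same matrix element ##\langle n|U(t)|\psi\rangle##, which is the algebraic content of the picture independence.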

In dextercioby's posting it was obviously the Heisenberg picture, where the operators representing observables take the full time dependence and the state (statistical operator or representing state ket) is time independent. Then the full time dependence is in the (generalized) eigenvector of the representation. In position representation then you have what dextercioby wrote
$$\psi(t,\vec{x})=_{\mathrm{H}}\langle \vec{x},t|\psi \rangle_{\mathrm{H}}.$$
But how do you have a bra like ##\langle \vec{x},t|## to project onto in the Heisenberg picture? That is itself a time-dependent state, but all states in the Heisenberg picture are time independent. What am I missing?

dextercioby
Hendrik, I didn't have any picture in mind when I wrote that equality. I put the ##t## in the bra to make clear that ##t## comes from the Galilei group, or rather from a one-parameter subgroup of it, which is the time evolution.

vanhees71
@dextercioby: You can use the Galilei group in any picture of time evolution. It has nothing to do with the choice of the picture, since quantum theory is invariant under (time-dependent) unitary transformations, and you always can go from one picture to another by such a unitary transformation.

@LastOneStanding: A wave function is always the (generalized) scalar product of a (generalized) eigenstate of the operators representing observables, and the state ket. In the Heisenberg picture the operators are time dependent and thus the (generalized) eigenvectors are also time dependent, while the state kets are time independent.

@LastOneStanding: A wave function is always the (generalized) scalar product of a (generalized) eigenstate of the operators representing observables, and the state ket. In the Heisenberg picture the operators are time dependent and thus the (generalized) eigenvectors are also time dependent, while the state kets are time independent.
So not all kets are time independent in the Heisenberg picture? Only those that refer to physical states of systems?

kith