# Hermitian Operators?

1. Jun 15, 2014

### kq6up

I am a QM beginner so go easy on me. I have just noticed something. Let $$\hat{O}$$ be an hermitian operator. Then $$\left( \hat { O } \right) ^{ \dagger }\neq \hat { O }$$ when it is by itself. For example $$\left( \hat { p } \right) ^{ \dagger }=i\hbar \frac { \partial }{ \partial x } \neq \hat { p } =-i\hbar \frac { \partial }{ \partial x }$$.

In context with an eigenfunction $$\left( \hat { O } \Psi \right) ^{ \dagger }=\Psi ^{ \ast }\hat { O } ^{ \dagger }\neq \Psi ^{ \ast }\hat { O }$$. However, I guess it still doesn't work unless I give it a $$\Psi$$ to chew on the right. I guess this might finally work: $$\left( \hat { O } \Psi \right) ^{ \dagger }\Psi =\Psi ^{ \ast }\hat { O } ^{ \dagger }\Psi =\Psi ^{ \ast }\hat { O } \Psi$$.

However, if I test this with an actual example: $$\hat { O } =\hat { p } \quad\text{and}\quad \Psi ={ e }^{ ikx },\quad\text{then}\quad { e }^{ -ikx }(-i\hbar \frac { \partial }{ \partial x } ){ e }^{ ikx }\neq { e }^{ -ikx }(i\hbar \frac { \partial }{ \partial x } ){ e }^{ ikx }.$$ So that does not work either.

Could someone help clarify what is going on here? In what sense are these operators Hermitian? I guess not in the same sense that a matrix is Hermitian.

Thanks,
Chris Maness

2. Jun 15, 2014

### Bill_K

The Hermitian conjugate is the complex conjugate transpose. You've taken the complex conjugate, but not the transpose. In this case, transpose means that the partial derivative acts to the left.

3. Jun 15, 2014

### Fredrik

Staff Emeritus
This should be
$$\hat p^\dagger =\left(i\hbar\frac{\partial}{\partial x}\right)^\dagger =-i\hbar\underbrace{\left(\frac{\partial}{\partial x}\right)^\dagger}_{\displaystyle=-\frac{\partial}{\partial x}} =\hat p.$$
The function should always be to the right of the operator...and the dagger acts only on the operator, not on the function $\hat O\psi$.

4. Jun 15, 2014

### Bill_K

... or equivalently, integrating by parts, ∂/∂x acting to the left.

5. Jun 15, 2014

### Fredrik

Staff Emeritus
When we're not doing bra-ket notation, it seems very strange to think of any operator as "acting to the left". But yes, integration by parts is what we use in the standard non-rigorous argument for $D^\dagger=-D$ (where D is the operator that takes a differentiable function to its derivative).
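Not part of the thread, but the claim $D^\dagger=-D$ can be sanity-checked numerically; the sketch below assumes NumPy and periodic boundary conditions (grid size and spacing are arbitrary choices).

```python
import numpy as np

# Centered finite-difference d/dx on a periodic grid of n points, spacing h:
# (Df)_i = (f_{i+1} - f_{i-1}) / (2h), with periodic wrap-around.
# The resulting matrix is real and antisymmetric, so D^dagger = D^T = -D.
n, h = 8, 0.1
D = (np.roll(np.eye(n), 1, axis=1) - np.roll(np.eye(n), -1, axis=1)) / (2 * h)

assert np.allclose(D.conj().T, -D)   # D is anti-Hermitian

hbar = 1.0
P = -1j * hbar * D                   # discretized momentum operator p = -i*hbar*D
assert np.allclose(P.conj().T, P)    # so P comes out Hermitian
```

The boundary terms that integration by parts would produce are exactly what the periodic wrap-around removes.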

6. Jun 15, 2014

### kq6up

I see what expression satisfies my little thought experiment. However, I am having a hard time understanding this equation in terms of your insight: $$\frac{\partial \Phi^*}{\partial t} = -\frac{1}{i\hbar}\Phi^*H^* = -\frac{1}{i\hbar}\Phi^*H$$. This is from "The Schrödinger picture of the Ehrenfest Theorem" in the Wikipedia article on that topic. Is it wrong? I have used it to work out several problems correctly, but now, considering this new information and my failed thought experiments (proofs, I guess), my head is reeling.

Thanks,
Chris Maness

7. Jun 15, 2014

### Fredrik

Staff Emeritus
The issue here is whether it's OK to write $(Af)^* =f^*A^\dagger$. I'm inclined to say no, but I guess it depends on your tolerance for nonsensical mathematical expressions that can be made sense of by assigning a specific meaning to the notation. As you can tell by my choice of words, my tolerance is rather low. The only way I see to make sense of this equality is to interpret it as an abbreviated version of the statement
$$\int (Af)^*(x)g(x)\mathrm dx =\int f^*(x)(A^\dagger g)(x)\mathrm dx,~~ \text{for all g}.$$ You can see that this holds by noting that the left-hand side is equal to $\langle Af,g\rangle$, and the right-hand side is equal to $\langle f,A^\dagger g\rangle$. By definition of the $\dagger$ operation, these two numbers are equal for all g.

8. Jun 15, 2014

### kq6up

I am seeing now how that is a bit funky. For example, I kept the identities going, pretending these behave analogously to Hermitian matrices, which is probably why I am having a hard time with this in the first place.

If this were the case it continues like so: $(Af)^* =f^*A^\dagger=f^*A$ if A is hermitian. Argh!

I have seen this one around searching on this topic, but I get NO intuitive warm and fuzzy satisfaction from it at all. If I used it, it would just be by the force of definition -- which does not build my confidence in reaching for it.

Thanks,
Chris Maness

9. Jun 15, 2014

### stevendaryl

Staff Emeritus
It's a little more intuitive if you see how it relates to the analogous fact about matrices. Are you familiar with matrices? If $f$ and $g$ are column matrices, and $A$ is a square matrix (and the dimensions are appropriate) then you can define a product of the three of them as follows:

$f^\dagger (A g)$

where $f^\dagger$ is the complex conjugate of the transpose of $f$.

It's pretty simple to prove from the properties of matrix multiplication and transposition that

$f^\dagger (A^\dagger g) = (A f)^\dagger g$

In terms of components, this says:
$\sum_i (f^*_i (A^\dagger g)_i) = \sum_i (A f)^*_i g_i$

Hilbert spaces of functions can be thought of intuitively as a generalization of matrices to the case where the "indices" are continuous. So instead of $\sum_i$ you have $\int dx$. So the identity becomes:

$\int f^*(x) (A^\dagger g)(x) dx = \int (A f)^*(x) g(x) dx$
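As a quick check of the matrix identity above (my own sketch, not from the thread; it assumes NumPy), random complex vectors and a random matrix satisfy $f^\dagger (A^\dagger g) = (Af)^\dagger g$ to floating-point precision:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
f = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # "column matrix" f
g = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

A_dag = A.conj().T                 # Hermitian conjugate: conjugate transpose

lhs = f.conj() @ (A_dag @ g)       # f^dagger (A^dagger g)
rhs = (A @ f).conj() @ g           # (A f)^dagger g
assert np.isclose(lhs, rhs)
```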

10. Jun 15, 2014

### kq6up

The only remaining question is why asterisks appear instead of daggers in Hilbert space?

$$\int f^*(x) (A^\dagger g)(x) dx = \int (A f)^*(x) g(x) dx$$

Chris

11. Jun 15, 2014

### Staff: Mentor

It can almost certainly be made rigorous by means of the Rigged Hilbert space formalism.

But for the details one must consult specialist tomes. I did at one time; I won't do that again.

Without going that deep into it, Hall's book probably resolves it.

Thanks
Bill

12. Jun 16, 2014

### stevendaryl

Staff Emeritus
I'm not sure what your question means. Both are used in working with Hilbert space.

The basic inner product in Hilbert space is:

$\int f^*(x) g(x) dx$

This is analogous to the inner product for column matrices:

$f^\dagger g$

To see that they are exactly analogous, write the latter out in terms of components:

$f^\dagger g = \sum_i f^*_i g_i$

On the right side, there are no daggers, because the matrix components $f_i$ are just complex numbers.

Similarly, for a particular value of $x$, $f(x)$ is just a complex number. You can take its complex conjugate, $f(x)^*$, but using a dagger doesn't make any difference: dagger applied to a number is the same as asterisk.

The distinction, that people are not used to making, is between a function, $f$, which is an infinite object, and the value $f(x)$ of a function at a particular value of $x$. This is like the distinction between a column matrix $f$ and one of its components, $f_i$.
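The "continuous index" picture can be tried out numerically (a sketch of my own, assuming NumPy): replace $\sum_i$ by a Riemann sum $\sum_j \cdots\, \Delta x$ on a grid, and the matrix-style inner product reproduces the Hilbert-space one.

```python
import numpy as np

# Grid version of <f, g> = ∫ f*(x) g(x) dx on [0, 2π): the grid index j
# plays the role of the matrix index i, and the sum picks up a factor dx.
x = np.linspace(0.0, 2 * np.pi, 10_000, endpoint=False)
dx = x[1] - x[0]

f = np.exp(1j * x)      # plane wave e^{ikx}, k = 1
g = np.exp(2j * x)      # plane wave e^{ikx}, k = 2

inner_fg = np.sum(f.conj() * g) * dx
inner_ff = np.sum(f.conj() * f) * dx

assert abs(inner_fg) < 1e-8                    # distinct plane waves: orthogonal
assert np.isclose(inner_ff.real, 2 * np.pi)    # <f, f> = length of the period
```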

13. Jun 16, 2014

### Fredrik

Staff Emeritus
The dagger is the adjoint operation. It takes operators to operators. The asterisk is the complex conjugate of a function. The complex conjugate of a function $f:\mathbb R\to\mathbb C$ is the function $f^*:\mathbb R\to\mathbb C$ defined by $f^*(x)=(f(x))^*$ for all x in the domain of f. In the formula quoted above, $\dagger$ acts on the operator $A$, and $*$ acts on the functions $f$ and $Af$.

The formula holds because
$$\int f^*(x) (A^\dagger g)(x) dx =\langle f,A^\dagger g\rangle =\langle (A^\dagger)^\dagger f,g\rangle =\langle Af,g\rangle = \int (A f)^*(x) g(x) dx.$$

14. Jun 16, 2014

### stevendaryl

Staff Emeritus
This might be a nit-picky mathematical point that I should keep my mouth shut about, but there certainly can be an adjoint of a function. If $f$ is a function, then $f^\dagger$ is a functional--a function of functions that takes a function and returns a scalar. Mathematically, the action of the functional $f^\dagger$ on a function $g$ is defined by:

$f^\dagger(g) = \int f^*(x) g(x) dx$

So the $^\dagger$ operation acts on functions to return functionals, while the $^*$ acts on complex numbers to return complex numbers.
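This $f^\dagger$-as-a-functional idea can be written as a closure; a sketch of my own, assuming NumPy, with the integral approximated by a grid sum:

```python
import numpy as np

x = np.linspace(0.0, 2 * np.pi, 10_000, endpoint=False)
dx = x[1] - x[0]

def dagger(f):
    """Map the function f (sampled on the grid) to the functional
    f^dagger: g -> ∫ f*(x) g(x) dx, here approximated by a grid sum."""
    return lambda g: np.sum(f.conj() * g) * dx

f = np.exp(1j * x)
bra_f = dagger(f)        # a function of functions, returning a scalar

assert np.isclose(bra_f(f).real, 2 * np.pi)   # f^dagger(f) = <f, f>
```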

15. Jun 16, 2014

### Fredrik

Staff Emeritus
In bra-ket terminology, you're defining the adjoint of a ket as the corresponding bra, i.e. $f^\dagger =\langle f,\cdot\rangle$. I have seen that before (here at PF), but I have never found it useful.

16. Jun 16, 2014

### stevendaryl

Staff Emeritus
So the equation

$(A f)^\dagger = f^\dagger A^\dagger$

actually is not nonsense. It means that that as a functional, $(A f)^\dagger$ is the same as $f^\dagger A^\dagger$. Functionals are equal if they give the same value on every argument:

$(A f)^\dagger(g) = f^\dagger(A^\dagger(g))$

17. Jun 16, 2014

### stevendaryl

Staff Emeritus
Well, it does make identities such as the one the original poster was asking about almost trivial.

18. Jun 16, 2014

### kq6up

So for a matrix this is obvious to me. Let $H$ be any Hermitian matrix: $$(HM)^{\dagger}=M^{\dagger}H^{\dagger}=M^{\dagger}H$$

So in my mind, if the quoted equation is a true analog of matrix operation, the identity should continue to:

$(A f)^\dagger = f^\dagger A^\dagger=f^\dagger A$

Does it?

Thanks,
Chris

19. Jun 16, 2014

### stevendaryl

Staff Emeritus
Yes, in the sense that the far right side and the far left side produce the same result when applied to an argument (a square-integrable function) $g$:

$(A f)^\dagger g = (f^\dagger A) g = f^\dagger (A g)$

In terms of integrals, this is a compact way of writing:

$\int (A f)^*(x) g(x) dx = \int f^*(x) (A g)(x) dx$
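That integral identity can also be checked numerically for a concrete Hermitian $A$ (a sketch of my own, assuming NumPy; I use the periodic finite-difference momentum operator as $A$):

```python
import numpy as np

rng = np.random.default_rng(0)
n, h, hbar = 128, 0.05, 1.0

# Periodic centered-difference momentum operator P = -i*hbar*d/dx.
# D is real and antisymmetric, so P is Hermitian.
D = (np.roll(np.eye(n), 1, axis=1) - np.roll(np.eye(n), -1, axis=1)) / (2 * h)
P = -1j * hbar * D

f = rng.standard_normal(n) + 1j * rng.standard_normal(n)
g = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# ∫ (Pf)*(x) g(x) dx  vs  ∫ f*(x) (Pg)(x) dx, integrals as grid sums times h.
lhs = np.sum((P @ f).conj() * g) * h
rhs = np.sum(f.conj() * (P @ g)) * h
assert np.isclose(lhs, rhs)
```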

20. Jun 16, 2014

### kq6up

So it works in the sense that it really needs to be in the context of the integral because the integral is to the function what the summation is to the matrix element. Correct?

Chris