Understanding Hermitian Operators for QM Beginners

In summary: If ##A## is a matrix with complex entries, then ##A^\dagger## is the matrix obtained by transposing ##A## and taking the complex conjugate of each entry, and ##A## is called Hermitian if ##A^\dagger=A##. For a Hermitian matrix, any eigenvalue ##\lambda## with normalized eigenvector ##v## satisfies ##\lambda=\langle v,Av\rangle=\langle Av,v\rangle=\lambda^*##, so the eigenvalues of ##A## are real. This is the reason we like Hermitian matrices in quantum mechanics: their eigenvalues are real, as the possible outcomes of a measurement must be. Similarly, if ##A## is an operator acting on functions, then ##A^\dagger## is defined by ##\langle Af,g\rangle=\langle f,A^\dagger g\rangle## for all square-integrable ##f,g##, and the same argument shows that a Hermitian (self-adjoint) operator has real eigenvalues.
  • #1
kq6up
I am a QM beginner so go easy on me. I have just noticed something. Let $$\hat{O}$$ be an hermitian operator. Then $$\left( \hat { O } \right) ^{ \dagger }\neq \hat { O } $$ when it is by itself. For example $$\left( \hat { p } \right) ^{ \dagger }=i\hbar \frac { \partial }{ \partial x } \neq \hat { p } =-i\hbar \frac { \partial }{ \partial x }$$.

In context with an eigenfunction $$ \left( \hat { O } \Psi \right) ^{ \dagger }=\Psi ^{ \ast }\hat { O } ^{ \dagger }\neq \Psi ^{ \ast }\hat { O } $$. However, I guess it still doesn't work unless I give it a $$\Psi$$ to chew on the right. I guess this might finally work: $$ \left( \hat { O } \Psi \right) ^{ \dagger }\Psi =\Psi ^{ \ast }\hat { O } ^{ \dagger }\Psi =\Psi ^{ \ast }\hat { O } \Psi $$.

However, if I test this with an actual example: $$\hat { O } =\hat { p } \quad \text{and}\quad \Psi ={ e }^{ ikx },\quad \text{then}\quad { e }^{ -ikx }\left(-i\hbar \frac { \partial }{ \partial x }\right) { e }^{ ikx }\neq { e }^{ -ikx }\left(i\hbar \frac { \partial }{ \partial x }\right) { e }^{ ikx }.$$ So that does not work either.

Could someone help clarify what is going on here? In what sense are these operators Hermitian? I guess not in the same sense that a matrix is Hermitian.

Thanks,
Chris Maness
 
  • #2
kq6up said:
Let $$\hat{O}$$ be an hermitian operator. Then $$\left( \hat { O } \right) ^{ \dagger }\neq \hat { O } $$ when it is by itself. For example $$\left( \hat { p } \right) ^{ \dagger }=i\hbar \frac { \partial }{ \partial x } \neq \hat { p } =-i\hbar \frac { \partial }{ \partial x }$$.
The Hermitian conjugate is the complex conjugate transpose. You've taken the complex conjugate, but not the transpose. In this case, transpose means that the partial derivative acts to the left.
 
  • #3
kq6up said:
I am a QM beginner so go easy on me. I have just noticed something. Let $$\hat{O}$$ be an hermitian operator. Then $$\left( \hat { O } \right) ^{ \dagger }\neq \hat { O } $$ when it is by itself. For example $$\left( \hat { p } \right) ^{ \dagger }=i\hbar \frac { \partial }{ \partial x } \neq \hat { p } =-i\hbar \frac { \partial }{ \partial x }$$.
This should be
$$\hat p^\dagger =\left(i\hbar\frac{\partial}{\partial x}\right)^\dagger =-i\hbar\underbrace{\left(\frac{\partial}{\partial x}\right)^\dagger}_{\displaystyle=-\frac{\partial}{\partial x}} =\hat p.$$
kq6up said:
In context with an eigenfunction $$ \left( \hat { O } \Psi \right) ^{ \dagger }=\Psi ^{ \ast }\hat { O } ^{ \dagger }\neq \Psi ^{ \ast }\hat { O } $$.
The function should always be to the right of the operator...and the dagger acts only on the operator, not on the function ##\hat O\psi##.
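In case a concrete check helps, here is a minimal numerical sketch (my own illustration, not part of the original posts): discretize ##\partial/\partial x## on a periodic grid, and the resulting matrix is real and antisymmetric, so its conjugate transpose is its negative, and ##-i\hbar## times it comes out Hermitian, just as in the derivation above. The grid size, box length, and units are arbitrary choices.

[code]
import numpy as np

n, L = 64, 2 * np.pi           # grid points and box length (arbitrary)
dx = L / n
hbar = 1.0                     # units where hbar = 1 (assumption)

# Central-difference d/dx with periodic boundary conditions:
# (Df)[j] = (f[j+1] - f[j-1]) / (2*dx), indices wrapping around.
D = (np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)) / (2 * dx)
D[0, -1] = -1.0 / (2 * dx)
D[-1, 0] = +1.0 / (2 * dx)

p = -1j * hbar * D             # discretized momentum operator

print(np.allclose(D.conj().T, -D))  # True: the derivative is anti-Hermitian
print(np.allclose(p.conj().T, p))   # True: p itself is Hermitian
print(np.allclose(np.linalg.eigvals(p).imag, 0))  # True: real eigenvalues
[/code]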
 
  • #4
Fredrik said:
This should be
$$=\underbrace{\left(\frac{\partial}{\partial x}\right)^\dagger}_{\displaystyle=-\frac{\partial}{\partial x}}$$
... or equivalently, integrating by parts, ∂/∂x acting to the left.
 
  • #5
Bill_K said:
... or equivalently, integrating by parts, ∂/∂x acting to the left.
When we're not doing bra-ket notation, it seems very strange to think of any operator as "acting to the left". But yes, integration by parts is what we use in the standard non-rigorous argument for ##D^\dagger=-D## (where D is the operator that takes a differentiable function to its derivative).
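To spell out that standard non-rigorous argument (assuming the functions vanish at infinity, which is the usual hand-waving step):
$$\langle Df,g\rangle =\int_{-\infty}^\infty (f'(x))^*g(x)\,dx =\underbrace{\big[f^*(x)g(x)\big]_{-\infty}^\infty}_{=0} -\int_{-\infty}^\infty f^*(x)g'(x)\,dx =\langle f,-Dg\rangle,$$
so ##D^\dagger=-D##.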
 
  • #6
I see what expression satisfies my little thought experiment. However, I am having a hard time understanding this equation in terms of your insight: $$\frac{\partial \Phi^*}{\partial t} = -\frac{1}{i\hbar}\Phi^*H^* = -\frac{1}{i\hbar}\Phi^*H.$$ This is from "The Schrödinger picture of the Ehrenfest Theorem" in the Wikipedia article on that topic. Is it wrong? I have used it to work out several problems correctly, but now, considering this new information and my failed thought experiments (proofs, I guess), my head is reeling.

Thanks,
Chris Maness
 
  • #7
The issue here is whether it's OK to write ##(Af)^* =f^*A^\dagger##. I'm inclined to say no, but I guess it depends on your tolerance for nonsensical mathematical expressions that can be made sense of by assigning a specific meaning to the notation. As you can tell by my choice of words, my tolerance is rather low. The only way I see to make sense of this equality is to interpret it as an abbreviated version of the statement
$$\int (Af)^*(x)g(x)\mathrm dx =\int f^*(x)(A^\dagger g)(x)\mathrm dx,~~ \text{for all g}.$$ You can see that this holds by noting that the left-hand side is equal to ##\langle Af,g\rangle##, and the right-hand side is equal to ##\langle f,A^\dagger g\rangle##. By definition of the ##\dagger## operation, these two numbers are equal for all g.
 
  • #8
Fredrik said:
The issue here is whether it's OK to write ##(Af)^* =f^*A^\dagger##

I am seeing now how that is a bit funky. For example, I kept the identities going, pretending these operators behave analogously to Hermitian matrices -- which is probably why I am having a hard time with this in the first place.

If this were the case, it would continue like so: ##(Af)^* =f^*A^\dagger=f^*A## if A is Hermitian. Argh!

$$\int (Af)^*(x)g(x)\mathrm dx =\int f^*(x)(A^\dagger g)(x)\mathrm dx,~~ \text{for all g}.$$

I have seen this one around searching on this topic, but I get NO intuitive warm and fuzzy satisfaction from it at all. If I used it, it would just be by the force of definition -- which does not build my confidence in reaching for it.

Thanks,
Chris Maness
 
  • #9
kq6up said:
I have seen this one around searching on this topic, but I get NO intuitive warm and fuzzy satisfaction from it at all. If I used it, it would just be by the force of definition -- which does not build my confidence in reaching for it.

Thanks,
Chris Maness

It's a little more intuitive if you see how it relates to the analogous fact about matrices. Are you familiar with matrices? If [itex]f[/itex] and [itex]g[/itex] are column matrices, and [itex]A[/itex] is a square matrix (and the dimensions are appropriate) then you can define a product of the three of them as follows:

[itex]f^\dagger (A g)[/itex]

where [itex]f^\dagger[/itex] is the complex conjugate of the transpose of [itex]f[/itex].

It's pretty simple to prove from the properties of matrix multiplication and transposition, that

[itex]f^\dagger (A^\dagger g) = (A f)^\dagger g[/itex]

In terms of components, this says:
[itex]\sum_i (f^*_i (A^\dagger g)_i) = \sum_i (A f)^*_i g_i[/itex]

Hilbert spaces of functions can be thought of intuitively as a generalization of matrices to the case where the "indices" are continuous. So instead of [itex]\sum_i[/itex] you have [itex]\int dx[/itex]. So the identity becomes:

[itex]\int f^*(x) (A^\dagger g)(x) dx = \int (A f)^*(x) g(x) dx[/itex]
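A quick numerical sanity check of this matrix identity (my own sketch; the dimension and the random seed are arbitrary):

[code]
import numpy as np

rng = np.random.default_rng(0)
n = 5
f = rng.standard_normal(n) + 1j * rng.standard_normal(n)
g = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

A_dag = A.conj().T                  # the adjoint: conjugate transpose

lhs = np.vdot(f, A_dag @ g)         # sum_i f*_i (A^dagger g)_i
rhs = np.vdot(A @ f, g)             # sum_i (A f)*_i g_i
print(np.isclose(lhs, rhs))         # True
[/code]

(np.vdot conjugates its first argument, so it computes exactly the sums written above.)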
 
  • #10
The only remaining question is: why asterisks instead of daggers in Hilbert space?

$$\int f^*(x) (A^\dagger g)(x) dx = \int (A f)^*(x) g(x) dx$$

Chris
 
  • #11
stevendaryl said:
Hilbert spaces of functions can be thought of intuitively as a generalization of matrices to the case where the "indices" are continuous. So instead of [itex]\sum_i[/itex] you have [itex]\int dx[/itex]. So the identity becomes:

[itex]\int f^*(x) (A^\dagger g)(x) dx = \int (A f)^*(x) g(x) dx[/itex]

It can almost certainly be made rigorous by means of the Rigged Hilbert space formalism.

But for the details one must consult specialist tomes. I did at one time - won't do that again.

Without going that deep into it, Hall's book probably resolves it:
https://www.amazon.com/dp/146147115X/?tag=pfamazon01-20

Thanks
Bill
 
  • #12
kq6up said:
The only remaining question is: why asterisks instead of daggers in Hilbert space?
$$\int f^*(x) (A^\dagger g)(x) dx = \int (A f)^*(x) g(x) dx$$
Chris

I'm not sure what your question means. Both are used in working with Hilbert space.

The basic inner product in Hilbert space is:

[itex]\int f^*(x) g(x) dx[/itex]

This is analogous to the inner product for column matrices:

[itex]f^\dagger g[/itex]

To see that they are exactly analogous, write the latter out in terms of components:

[itex]f^\dagger g = \sum_i f^*_i g_i[/itex]

On the right side, there are no daggers, because the matrix components [itex]f_i[/itex] are just complex numbers.

Similarly, for a particular value of [itex]x[/itex], [itex]f(x)[/itex] is just a complex number. You can take its complex conjugate, [itex]f(x)^*[/itex], but using a dagger doesn't make any difference: dagger applied to a number is the same as asterisk.

The distinction, that people are not used to making, is between a function, [itex]f[/itex], which is an infinite object, and the value [itex]f(x)[/itex] of a function at a particular value of [itex]x[/itex]. This is like the distinction between a column matrix [itex]f[/itex] and one of its components, [itex]f_i[/itex].
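To see the sum-becomes-integral analogy numerically (my own sketch; the grid and example function are arbitrary choices): sample a function on a uniform grid, and the matrix-style inner product of the sample vectors, weighted by ##dx##, approximates the integral.

[code]
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 2000, endpoint=False)
dx = x[1] - x[0]
f = np.exp(1j * x)                  # an arbitrary example function

# Discrete analogue of <f, f>: sum_i f*_i f_i, weighted by dx so it
# approximates the integral of |f(x)|^2 over [0, 2*pi], which is 2*pi.
approx = (np.vdot(f, f) * dx).real
print(np.isclose(approx, 2.0 * np.pi))   # True
[/code]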
 
  • #13
kq6up said:
The only remaining question is: why asterisks instead of daggers in Hilbert space?

$$\int f^*(x) (A^\dagger g)(x) dx = \int (A f)^*(x) g(x) dx$$

Chris
The dagger is the adjoint operation. It takes operators to operators. The asterisk is the complex conjugate of a function. The complex conjugate of a function ##f:\mathbb R\to\mathbb C## is the function ##f^*:\mathbb R\to\mathbb C## defined by ##f^*(x)=(f(x))^*## for all x in the domain of f. In the formula quoted above, ##\dagger## acts on the operator ##A##, and ##*## acts on the functions ##f## and ##Af##.

The formula holds because
$$\int f^*(x) (A^\dagger g)(x) dx =\langle f,A^\dagger g\rangle =\langle (A^\dagger)^\dagger f,g\rangle =\langle Af,g\rangle = \int (A f)^*(x) g(x) dx.$$
 
  • #14
Fredrik said:
The dagger is the adjoint operation. It takes operators to operators. The asterisk is the complex conjugate of a function. The complex conjugate of a function ##f:\mathbb R\to\mathbb C## is the function ##f^*:\mathbb R\to\mathbb C## defined by ##f^*(x)=(f(x))^*## for all x in the domain of f. In the formula quoted above, ##\dagger## acts on the operator ##A##, and ##*## acts on the functions ##f## and ##Af##.

The formula holds because
$$\int f^*(x) (A^\dagger g)(x) dx =\langle f,A^\dagger g\rangle =\langle (A^\dagger)^\dagger f,g\rangle =\langle Af,g\rangle = \int (A f)^*(x) g(x) dx.$$

This might be a nit-picky mathematical point that I should keep my mouth shut about, but there certainly can be an adjoint of a function. If [itex]f[/itex] is a function, then [itex]f^\dagger[/itex] is a functional--a function of functions that takes a function and returns a scalar. Mathematically, the action of the functional [itex]f^\dagger[/itex] on a function [itex]g[/itex] is defined by:

[itex]f^\dagger(g) = \int f^*(x) g(x) dx[/itex]

So the [itex]^\dagger[/itex] operation acts on functions to return functionals, while the [itex]^*[/itex] acts on complex numbers to return complex numbers.
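This "dagger turns a function into a functional" idea can even be written as a few lines of illustrative code (a sketch of my own, using the same discretized integral as the earlier examples; the helper name dagger is hypothetical):

[code]
import numpy as np

def dagger(f_samples, dx):
    """Return the functional f^dagger: it maps samples of g to the
    (discretized) number integral of f*(x) g(x) dx."""
    def functional(g_samples):
        return np.vdot(f_samples, g_samples) * dx
    return functional

x = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
dx = x[1] - x[0]
f = np.exp(1j * x)

f_dag = dagger(f, dx)               # a functional, not a function of x
print(np.isclose(f_dag(f).real, 2.0 * np.pi))   # True: <f, f> = 2*pi
[/code]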
 
  • #15
In bra-ket terminology, you're defining the adjoint of a ket as the corresponding bra, i.e. ##f^\dagger =\langle f,\cdot\rangle##. I have seen that before (here at PF), but I have never found it useful.
 
  • #16
stevendaryl said:
This might be a nit-picky mathematical point that I should keep my mouth shut about, but there certainly can be an adjoint of a function. If [itex]f[/itex] is a function, then [itex]f^\dagger[/itex] is a functional--a function of functions that takes a function and returns a scalar. Mathematically, the action of the functional [itex]f^\dagger[/itex] on a function [itex]g[/itex] is defined by:

[itex]f^\dagger(g) = \int f^*(x) g(x) dx[/itex]

So the [itex]^\dagger[/itex] operation acts on functions to return functionals, while the [itex]^*[/itex] acts on complex numbers to return complex numbers.

So the equation

[itex](A f)^\dagger = f^\dagger A^\dagger[/itex]

actually is not nonsense. It means that, as a functional, [itex](A f)^\dagger[/itex] is the same as [itex]f^\dagger A^\dagger[/itex]. Functionals are equal if they give the same value on every argument:

[itex](A f)^\dagger(g) = f^\dagger(A^\dagger(g))[/itex]
 
  • #17
Fredrik said:
In bra-ket terminology, you're defining the adjoint of a ket as the corresponding bra, i.e. ##f^\dagger =\langle f,\cdot\rangle##. I have seen that before (here at PF), but I have never found it useful.

Well, it does make identities such as the one the original poster was asking about almost trivial.
 
  • #18
stevendaryl said:
So the equation

[itex](A f)^\dagger = f^\dagger A^\dagger[/itex]

So for a matrix this is obvious to me. Let ##H## be any Hermitian matrix: $$(HM)^{\dagger}=M^{\dagger}H^{\dagger}=M^{\dagger}H.$$

So in my mind, if the quoted equation is a true analog of matrix operation, the identity should continue to:

[itex](A f)^\dagger = f^\dagger A^\dagger=f^\dagger A[/itex]

Does it?

Thanks,
Chris
 
  • #19
kq6up said:
So for a matrix this is obvious to me. Let ##H## be any Hermitian matrix: $$(HM)^{\dagger}=M^{\dagger}H^{\dagger}=M^{\dagger}H.$$

So in my mind, if the quoted equation is a true analog of matrix operation, the identity should continue to:

[itex](A f)^\dagger = f^\dagger A^\dagger=f^\dagger A[/itex]

Does it?

Thanks,
Chris

Yes, in the sense that the far right side and the far left side produce the same result when applied to an argument (a square-integrable function) [itex]g[/itex]:

[itex](A f)^\dagger g = (f^\dagger A) g = f^\dagger (A g)[/itex]

In terms of integrals, this is a compact way of writing:

[itex]\int (A f)^*(x) g(x) dx = \int f^*(x) (A g)(x) dx[/itex]
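The same numerical check as before, now with a Hermitian matrix so that the dagger on the operator can be dropped (my own sketch; ##B+B^\dagger## is just a convenient way to manufacture a Hermitian matrix):

[code]
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = B + B.conj().T                  # B + B^dagger is always Hermitian

f = rng.standard_normal(n) + 1j * rng.standard_normal(n)
g = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# For Hermitian H: (H f)^dagger g = f^dagger (H g).
print(np.isclose(np.vdot(H @ f, g), np.vdot(f, H @ g)))   # True
[/code]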
 
  • #20
So it works in the sense that it really needs to be in the context of the integral, because the integral is to functions what the summation is to matrix elements. Correct?

Chris
 
  • #21
kq6up said:
So it works in the sense that it really needs to be in the context of the integral, because the integral is to functions what the summation is to matrix elements. Correct?

Chris

Definitely. Understanding functions as "vectors" in the Hilbert space, and [itex]\dagger[/itex] as the adjoint, requires that you understand [itex]f^\dagger g[/itex] as an integral:

[itex]f^\dagger g = \int f^*(x) g(x) dx[/itex]

[EDIT]
I keep bringing up the analogy with matrices. In matrices, the meaning of matrix multiplication is summation:
[itex]f^\dagger g = \sum_i f^*_i g_i[/itex]. Integration is sort of the generalization of this to a continuous index, [itex]x[/itex], instead of a discrete index, [itex]i[/itex].
 
  • #22
Ok, thank you. $$mind \Rightarrow blown$$

Chris
 
  • #23
Here is a handy little identity that I have used to solve a homework problem. I saw it on several authoritative sites on self-adjoint operators:

$$\left< { \hat{O}f }|{g } \right> =\left< { f }|{ \hat{O}g } \right>$$ if ##\hat{O}## is self-adjoint.

However, I tried this little mathematical experiment to see if it works:

Let $$\Psi=e^{ikx}$$, and $$\left< { \hat { p } \Psi }|{ \Psi } \right> =\left< { \Psi }|{ \hat { p } \Psi } \right> $$

The right side equals:

$$\left< { -i\hbar \frac { \partial }{ \partial x } \Psi }|{ \Psi } \right> =\left< { -i\hbar (-i)\Psi }|{ \Psi } \right> =-\hbar \left< { \Psi }|{ \Psi } \right> =-\hbar $$

The left side equals:

$$\left< { \Psi }|{ -i\hbar \frac { \partial }{ \partial x } \Psi } \right> =-i\hbar \left< { \Psi }|{ i\Psi } \right> =\hbar \left< { \Psi }|{ \Psi } \right> =\hbar $$

It no worky: the L.S. and the R.S. are not identical. Where did I go wrong here?

Thanks,
Chris
 
  • #24
kq6up said:
However, I tried this little mathematical experiment to see if it works:

Let $$\Psi=e^{ikx}$$, and $$\left< { \hat { p } \Psi }|{ \Psi } \right> =\left< { \Psi }|{ \hat { p } \Psi } \right> $$

The right side equals:

$$\left< { -i\hbar \frac { \partial }{ \partial x } \Psi }|{ \Psi } \right> =\left< { -i\hbar (-i)\Psi }|{ \Psi } \right> =-\hbar \left< { \Psi }|{ \Psi } \right> =-\hbar $$
What you wrote before the first equality looks like what you had on the left, not on the right. I don't know what you're doing next, but the derivative isn't going to disappear. (Edit: Ahh...after seeing strangerep's reply, I noticed that you assumed that ##\Psi=e^{ikx}##. OK, in that case d/dx is going to pull out a factor of ik from ##\Psi##.)

kq6up said:
The left side equals:
Are you sure you're not confusing left with right? Remember, on your left hand, the thumb is to the right. :wink:

To verify that ##\hat p## is self-adjoint, you should use the definition of this particular inner product to prove that ##\langle\hat p f,f\rangle=\langle f,\hat pf\rangle## for all square-integrable functions f. Actually, it doesn't have to be the same function on both sides. You can prove that ##\langle\hat p f,g\rangle=\langle f,\hat pg\rangle## for all square-integrable f,g.

Hint: integration by parts, or equivalently, use the product rule in the form ##f'g=(fg)'-fg'##.

(You can of course continue to denote the function by ##\Psi## if you want. I'm using f mainly because it's easier to type).
 
  • #25
Hello,

The momentum operator needs to be treated quite carefully (as does everything, ha ha). Its self-adjointness and eigenvalue spectrum depend very strongly on the boundary conditions of the states. I would suggest writing out those inner products in integral form and very carefully going through the computation. You will see when you have to make some assumption on the boundary condition.

EDIT: Fredrik beat me to it.
 
  • #26
I did the proof for the general version with the integrals. It does work. These things are very tricky indeed. It is the most nuanced math I have ever done.
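For readers following along, the general proof kq6up refers to presumably runs like this (assuming the boundary terms vanish, e.g. for square-integrable functions that decay at infinity):
$$\langle \hat p f,g\rangle =\int \big(-i\hbar f'(x)\big)^*g(x)\,dx =i\hbar\underbrace{\big[f^*(x)g(x)\big]_{-\infty}^\infty}_{=0} +\int f^*(x)\big(-i\hbar g'(x)\big)\,dx =\langle f,\hat p g\rangle.$$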

Regards,
Chris
 
  • #27
kq6up said:
$$\Psi = e^{ikx}$$ [...]
The right side equals:
$$\left< { -i\hbar \frac { \partial }{ \partial x } \Psi }|{ \Psi } \right> =\left< { -i\hbar (-i)\Psi }|{ \Psi } \right> = [\cdots] $$
The 1st step looks wrong. You've taken the derivative of ##\Psi^*##, not ##\Psi##.

But perhaps you've already seen this when you did it in terms of an integral?

(Oh, and I think you left out a "k", but that's not the problem.)
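For the record, here is the corrected version of that computation, keeping the derivative acting on ##\Psi## in both slots (still purely formal, since ##e^{ikx}## is only normalizable on a finite box, say with periodic boundary conditions):
$$\left< { \hat { p } \Psi }|{ \Psi } \right> =\left< { \hbar k\Psi }|{ \Psi } \right> =(\hbar k)^{*}\left< { \Psi }|{ \Psi } \right> =\hbar k\left< { \Psi }|{ \Psi } \right>,\qquad \left< { \Psi }|{ \hat { p } \Psi } \right> =\hbar k\left< { \Psi }|{ \Psi } \right>.$$
The conjugation in the bra is applied only after ##\hat p\Psi=\hbar k\Psi## has been computed, and since ##\hbar k## is real it survives the conjugation unchanged, so both sides agree.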
 
  • #28
But when it's in the bra isn't it supposed to be the conjugate?

Chris
 
  • #29
kq6up said:
But when it's in the bra isn't it supposed to be the conjugate?
No. The difference between the bra and the ket is denoted by the brackets, not by what's inside. Using your notation, [itex]| \Psi \rangle = |e^{ikx} \rangle[/itex] and [itex]\langle \Psi | = \langle e^{ikx} |[/itex]. The conjugate comes in when you write [itex]\langle \Phi | \Psi \rangle[/itex] as an integral.

But your notation is strange. [itex]e^{ikx}[/itex] implies that you are working in the position or momentum basis while [itex]| \Psi \rangle [/itex] denotes a general ket vector. The standard notation for the connection between these concepts is [itex]e^{ikx} = \langle x | \Psi \rangle =: \Psi(x)[/itex] where [itex]\Psi(x)[/itex] is called the wavefunction.
 
  • #30
kq6up said:
But when it's in the bra isn't it supposed to be the conjugate?

Chris
No. If A is an operator and f a square-integrable function, then Af means the same thing no matter where you put it.

The bra that corresponds to ##A|\alpha\rangle## is ##\langle\alpha|A^\dagger##, but I see no place to use that in your calculation.

What does a product of a bra and an operator even mean? It's defined by
$$\big(\langle\alpha|A\big)|\beta\rangle =\langle\alpha|\big(A|\beta\rangle\big).$$ The product ##\langle\alpha|A## is defined by saying that this equality holds for all ##|\beta\rangle##.

The bra that corresponds to an arbitrary vector (i.e. function) f is the map ##g\mapsto \langle f,g\rangle##, which can be written as ##\langle f,\cdot\rangle##. So the bra that corresponds to Af is the map ##\langle Af,\cdot\rangle##. This map takes an arbitrary ##g## to ##\langle Af,g\rangle##, which is equal to ##\langle f,A^\dagger g\rangle##. In bra-ket notation, the map that takes ##g## to ##\langle f,A^\dagger g\rangle## is written as ##\langle f|A^\dagger##. In inner product notation, we'd have to use something super ugly like ##\langle f,A^\dagger(\cdot)\rangle## or just invent a new symbol for it.
 
  • #31
I think what I need here is a good textbook that holds my hand through this formalism, for both the operators and the bra-ket notation. None of the books I have really gives a clear treatment from the bottom up -- just bits and pieces -- and I have looked at plenty of books the past few days: Mary Boas' text, and Math Methods by Riley & Hobson (which is a great book, but not fine-grained enough in this area). I am using Griffiths to learn QM, and the text is very clear, but when you get to the problem set one gets the sense that a lot of what is expected there is not really developed in the text itself.

Suggestions?

Edit: Zettili looks like he develops these details in a much more thorough way. I think I will go through his chapter 2, and report back.

Thanks,
Chris
 
  • #32
kq6up said:
Suggestions?
There's a reason I put a "permanent" suggestion in my (2nd) signature line below. :wink:
 
  • #33
strangerep said:
There's a reason I put a "permanent" suggestion in my (2nd) signature line below. :wink:

I second that.

Ballentine had a strong effect on me clearing up all sorts of issues.

Thanks
Bill
 
  • #34
Ok, reporting back. Zettili looks really good, but again it might presume a little too much for me. I can see now that my math training was very practical and not rigorous and formal. I had the requisite three semesters of calculus almost 20 years ago now, and I have had a very feeble introduction to differential equations. After my community college experience I transferred to UCR, where I earned my bachelor's in physics. I did fine in classical, but that was the only core class I did ok in. In QM, E&M, EM Radiation, and Stat Mech I gave up, and did almost no problems from the problem sets. At that point I should have carved out the time to fill in the missing holes, but discouragement was the reigning emotion. The profs tended to be merciful and give you an obligatory C if you just showed up for everything. All of the other physics courses did not require the math rigor of the aforementioned courses, and I did fairly well in them.

Fast forward to 2014. I decided to brush up and cover ground that I missed as an undergrad. Math was my first priority. Mary Boas' Methods was recommended, and I have been plowing through her text since February, doing most of the problems in each set. I am doing quite well with it, though the going is slow. After finishing her 100 pages of linear algebra (which I have never taken before), I started to peek ahead at some QM, and found that I could actually begin to understand it. It definitely was time for a break from that math book. In the past two weeks I have done more QM problems than I ever did as an undergrad -- with the added bonus of actually mostly understanding what I am doing. Sorry for all of the detail, but I thought it necessary to provide a basis for where I am at, mathematically speaking.

I mostly understand the physics involved, but some of this QM math is really strange to me. However, I am resolved to learn it, no matter how far back I have to go to build the correct foundation to get comfortable with the math that is involved. I will take a look at Ballentine today; it does look less demanding in the opening math section, but maybe I will need a course that teaches me real/complex analysis, or wherever folks learn things related to Hilbert spaces and other abstract generalized vector stuff. Also, most of the mathy/proofy statements here that are not part of just practically doing fairly straightforward lower-division physics are *mostly* lost on me.

Thanks for your time,

Regards,
Chris
 
  • #35
I spent a good portion of the day yesterday going over the first part of Chapter 1 in Ballentine. I can say it was very clear and helpful, except for a small rough patch concerning linear functionals. All my questions in regard to this thread have been cleared up.

There are times when self instruction can really be a bear, but it is nice to go at one's own pace.

Thank you and regards kind sirs,
Chris
 

1. What is a Hermitian operator?

A Hermitian operator is a mathematical object used in quantum mechanics to represent observables, such as position, momentum, or energy. It is a type of linear operator that has a special property called self-adjointness, which means that the operator is equal to its own adjoint (or conjugate transpose).

2. How are Hermitian operators used in quantum mechanics?

Hermitian operators are used to represent physical observables in quantum mechanics. They act on quantum states to produce measurable quantities, such as the position or momentum of a particle. They also play a crucial role in determining the allowed energy levels of a quantum system.

3. What is the significance of the eigenvalues of a Hermitian operator?

The eigenvalues of a Hermitian operator represent the possible outcomes of a measurement of the corresponding observable. In quantum mechanics, the probability of obtaining a particular eigenvalue is given by the square of the absolute value of the associated eigenstate's coefficient in the quantum state being measured.

4. How do you determine if an operator is Hermitian?

To determine if an operator is Hermitian, you can check if it satisfies the condition of self-adjointness. This means that the operator must be equal to its own adjoint (or conjugate transpose). In other words, if A is a Hermitian operator, then A† = A, where A† is the adjoint of A.
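As a concrete illustration, the check is one line in a numerical setting (a sketch of my own; the example is the Pauli matrix ##\sigma_y##, which is Hermitian):

[code]
import numpy as np

def is_hermitian(A, tol=1e-12):
    # Check A == A^dagger (conjugate transpose) up to rounding error.
    return np.allclose(A, A.conj().T, atol=tol)

A = np.array([[0.0, -1.0j],
              [1.0j,  0.0]])       # the Pauli matrix sigma_y
print(is_hermitian(A))             # True
[/code]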

5. Can all operators in quantum mechanics be represented as Hermitian operators?

No, not all operators in quantum mechanics can be represented as Hermitian operators. Only observables, such as position, momentum, and energy, can be represented by Hermitian operators. Other types of operators, such as unitary operators, are also important in quantum mechanics but do not have the property of self-adjointness.
