Does Commutativity Affect Linearity?

Summary:
The discussion explores the relationship between commutativity and linearity in operators, specifically focusing on the operators T and T' defined as T = (iħ d/dx + γ) and T' = (-iħ d/dx + γ). The key point is that while these operators commute, they can still be non-linear due to the presence of the constant γ, which affects their linearity depending on how it is applied to functions. If γ is treated as a constant translation, the operator becomes non-linear; however, if it acts as a multiplication operator, it can be linear. The conversation also clarifies that symmetry in operators does not imply linearity, as these are separate properties. The conclusion emphasizes that the nature of γ is crucial in determining the linearity of the operators involved.
SemM
Hi, I have in a previous thread discussed the case where:

\begin{equation}
TT' = T'T
\end{equation}

and someone said that this was a case of non-linear operators. Evidently they commute, so their commutator is zero and they can therefore be measured at the same time. What, however, makes them non-linear?

Thanks!
 
SemM said:
Hi, I have in a previous thread discussed the case where:

\begin{equation}
TT' = T'T
\end{equation}

and someone said that this was a case of non-linear operators. Evidently they commute, so their commutator is zero and they can therefore be measured at the same time. What, however, makes them non-linear?

Thanks!
First you tell me what ##T## is, and then I'll tell you whether it is linear or not, not the other way around. E.g. if ##T\; , \;T'## stand for matrices, then they are linear. The fact that they commute doesn't allow any conclusions besides that they commute. To say they are linear or non-linear just by the equation ##TT'=T'T## is nonsense, unless they have a fixed, unique meaning which I'm not aware of.
 
fresh_42 said:
First you tell me what ##T## is, and then I'll tell you whether it is linear or not, not the other way around. E.g. if ##T\; , \;T'## stand for matrices, then they are linear. The fact that they commute doesn't allow any conclusions besides that they commute. To say they are linear or non-linear just by the equation ##TT'=T'T## is nonsense, unless they have a fixed, unique meaning which I'm not aware of.

Thanks for that, I am not going to point out who said what in other posts, and get right to the point:

##T = \bigg(i\hbar \frac{d}{dx} + \gamma\bigg)##
##T' = \bigg(-i\hbar \frac{d}{dx} + \gamma\bigg)##

where ##\gamma## is a constant.

and ##TT' = T'T##. About linearity, I am not sure how to prove it.

However, for another property, Hermiticity, the proof that these operators are Hermitian has not yet been made. I am going to perform this on T, using

http://www.colby.edu/chemistry/PChem/notes/MomentumHermitian.pdf

to check that.

Thanks
 
SemM said:
Thanks for that, I am not going to point out who said what in other posts, and get right to the point:

##T = i\hbar d/dx - \gamma##
##T' = -i\hbar d/dx - \gamma##

where ##\gamma## is a constant.
In this case, ##\gamma## makes it non-linear, except if ## \gamma = 0##.
$$T(f+g) = i\hbar \dfrac{d}{dx} (f+g) - \gamma = \left( i\hbar \dfrac{d}{dx} f \right) + \left( i\hbar \dfrac{d}{dx} g \right) - \gamma $$ but $$T(f)+T(g) = \left( i\hbar \dfrac{d}{dx} f \right) + \left( i\hbar \dfrac{d}{dx} g \right) - 2\gamma $$ I learned to call it affine-linear, because it is still flat like a straight line, just not through the origin, and therefore ##T(0) = -\gamma \neq 0## in general, which makes it non-linear.

Can you prove why linearity of (an arbitrary) ##T## would imply ##T(0)=0## ?

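This additivity failure is easy to check symbolically. A minimal SymPy sketch (the symbols `hbar` and `gamma` are stand-ins for the constants above, under the "additive constant" reading of ##\gamma##):

```python
import sympy as sp

x = sp.symbols('x')
hbar, gamma = sp.symbols('hbar gamma', nonzero=True)
f = sp.Function('f')(x)
g = sp.Function('g')(x)

# "Additive constant" reading: T(psi) = i*hbar*psi' - gamma
def T(psi):
    return sp.I * hbar * sp.diff(psi, x) - gamma

lhs = T(f + g)          # i*hbar*(f+g)' - gamma
rhs = T(f) + T(g)       # i*hbar*f' + i*hbar*g' - 2*gamma
print(sp.simplify(lhs - rhs))  # gamma: additivity fails by a constant
print(sp.simplify(T(0)))       # -gamma: T does not map 0 to 0
```

Both prints are nonzero whenever ##\gamma \neq 0##, which is exactly the affine-linear behaviour described above.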
 
fresh_42 said:
In this case, ##\gamma## makes it non-linear, except if ## \gamma = 0##.
$$T(f+g) = i\hbar \dfrac{d}{dx} (f+g) - \gamma = \left( i\hbar \dfrac{d}{dx} f \right) + \left( i\hbar \dfrac{d}{dx} g \right) - \gamma $$ but $$T(f)+T(g) = \left( i\hbar \dfrac{d}{dx} f \right) + \left( i\hbar \dfrac{d}{dx} g \right) - 2\gamma $$ I learned to call it affine-linear, because it is still flat like a straight line, just not through the origin, and therefore ##T(0) = -\gamma \neq 0## in general, which makes it non-linear.

Can you prove why linearity of (an arbitrary) ##T## would imply ##T(0)=0## ?
Fantastic, let me try to answer this tomorrow! Thanks Fresh!
 
One further remark:
In case ##\gamma ## is actually ##\gamma \cdot \operatorname{id} = \gamma \cdot I## the operator ##T## becomes linear, as the sum of two linear operators is linear again. In this case we have:
$$
T(f+g) = {i}{\hbar} \dfrac{d}{dx} (f+g) + \gamma \cdot I (f+g)= {i}{\hbar} \dfrac{d}{dx}f + \gamma \cdot f + {i}{\hbar} \dfrac{d}{dx}g + \gamma \cdot g = T(f)+T(g)
$$
so it all depends on how to read ##\gamma##: as a constant translation "plus ##\gamma##" (##\gamma . f = \gamma##, which is non-linear) or as the constant linear operator "times ##\gamma##" (##\gamma . f = \gamma \cdot f##, which is linear).
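Under the "times ##\gamma##" reading the linearity checks pass. A SymPy sketch (symbol names `hbar`, `gamma`, `s` are placeholders):

```python
import sympy as sp

x = sp.symbols('x')
hbar, gamma, s = sp.symbols('hbar gamma s')
f = sp.Function('f')(x)
g = sp.Function('g')(x)

# "Multiplication operator" reading: T(psi) = i*hbar*psi' + gamma*psi
def T(psi):
    return sp.I * hbar * sp.diff(psi, x) + gamma * psi

print(sp.simplify(T(f + g) - (T(f) + T(g))))   # 0: additivity holds
print(sp.simplify(T(s * f) - s * T(f)))        # 0: homogeneity holds
```

Both defects vanish identically, so this version of ##T## is a linear operator.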
 
fresh_42 said:
One further remark:
In case ##\gamma ## is actually ##\gamma \cdot \operatorname{id} = \gamma \cdot I## the operator ##T## becomes linear, as the sum of two linear operators is linear again. In this case we have:
$$
T(f+g) = {i}{\hbar} \dfrac{d}{dx} (f+g) + \gamma \cdot I (f+g)= {i}{\hbar} \dfrac{d}{dx}f + \gamma \cdot f + {i}{\hbar} \dfrac{d}{dx}g + \gamma \cdot g = T(f)+T(g)
$$
so it all depends on how to read ##\gamma##: as a constant translation "plus ##\gamma##" (##\gamma . f = \gamma##, which is non-linear) or as the constant linear operator "times ##\gamma##" (##\gamma . f = \gamma \cdot f##, which is linear).
Thanks for this Fresh!

##\gamma## is a real physical constant. However, you mention ##\gamma \cdot \operatorname{id} = \gamma \cdot I##; does it mean that ##\gamma## should be a complex number in order to make ##T## linear?
 
SemM said:
##\gamma## is a real physical constant. However, you mention ##\gamma \cdot \operatorname{id} = \gamma \cdot I##; does it mean that ##\gamma## should be a complex number in order to make ##T## linear?
It doesn't matter whether ##\gamma## is real or complex. What matters is what it means. For example, let ##\gamma = -i \hbar c## for a constant number ##c \in \mathbb{C}##, which means ##c## can also be real. It doesn't matter. Then what is ##T.e^{cx}=T(e^{cx})\,##? Is it
$$
T.e^{cx}=T(e^{cx})= i \hbar \dfrac{d}{dx} e^{cx} + \gamma.e^{cx} = i \hbar c \cdot e^{cx} + \gamma = i \hbar c\cdot (e^{cx} -1)
$$
or is it
$$
T.e^{cx}=T(e^{cx})= i \hbar \dfrac{d}{dx} e^{cx} + \gamma.e^{cx} = i \hbar c \cdot e^{cx} + \gamma \cdot e^{cx} = i \hbar c\cdot (e^{cx} - e^{cx}) = 0
$$
The first case is non-linear, because ##\gamma ## is a constant translation away from the origin, whereas the second case is linear and ##e^{cx}## is an eigenvector of ##T## with eigenvalue ##0##.

Both possibilities are usually denoted simply by ##T = i \hbar \dfrac{d}{dx} + \gamma##. But the first case means ##T.f=T(f)=i \hbar f' + \gamma## and the second means ##T.f =T(f)= i \hbar f' + \gamma \cdot f##. In the second case, the operator is ##T= i \hbar \dfrac{d}{dx} + \gamma = i \hbar \dfrac{d}{dx} + \gamma \cdot I## with the identity operator ##I=\operatorname{id}=1##. Whether ##T## is linear or not depends on how ##\gamma## is applied to a function ##f##: either ##\gamma## does not act on ##f## at all and is simply added, or it acts as ##f \longmapsto \gamma \cdot f##. The latter is linear.
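The two readings of this ##e^{cx}## example can be verified symbolically. A SymPy sketch (symbols `x`, `c`, `hbar` are placeholders):

```python
import sympy as sp

x, c, hbar = sp.symbols('x c hbar')
gamma = -sp.I * hbar * c          # gamma = -i*hbar*c, as in the example
psi = sp.exp(c * x)

# Reading 1: gamma is a constant translation, simply added on
r1 = sp.I * hbar * sp.diff(psi, x) + gamma
print(sp.simplify(r1 - sp.I * hbar * c * (sp.exp(c * x) - 1)))  # 0: matches the first case

# Reading 2: gamma acts by multiplication, gamma.f = gamma*f
r2 = sp.I * hbar * sp.diff(psi, x) + gamma * psi
print(sp.simplify(r2))  # 0: exp(c*x) is an eigenvector with eigenvalue 0
```

Reading 1 reproduces the translated result ##i\hbar c (e^{cx}-1)##; reading 2 annihilates ##e^{cx}## exactly.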
 
fresh_42 said:
It doesn't matter whether ##\gamma## is real or complex. What matters is what it means. For example, let ##\gamma = -i \hbar c## for a constant number ##c \in \mathbb{C}##, which means ##c## can also be real. It doesn't matter. Then what is ##T.e^{cx}=T(e^{cx})\,##? Is it
$$
T.e^{cx}=T(e^{cx})= i \hbar \dfrac{d}{dx} e^{cx} + \gamma.e^{cx} = i \hbar c \cdot e^{cx} + \gamma = i \hbar c\cdot (e^{cx} -1)
$$
or is it
$$
T.e^{cx}=T(e^{cx})= i \hbar \dfrac{d}{dx} e^{cx} + \gamma.e^{cx} = i \hbar c \cdot e^{cx} + \gamma \cdot e^{cx} = i \hbar c\cdot (e^{cx} - e^{cx}) = 0
$$
The first case is non-linear, because ##\gamma ## is a constant translation away from the origin, whereas the second case is linear and ##e^{cx}## is an eigenvector of ##T## with eigenvalue ##0##.

Both possibilities are usually denoted simply by ##T = i \hbar \dfrac{d}{dx} + \gamma##. But the first case means ##T.f=T(f)=i \hbar f' + \gamma## and the second means ##T.f =T(f)= i \hbar f' + \gamma \cdot f##. In the second case, the operator is ##T= i \hbar \dfrac{d}{dx} + \gamma = i \hbar \dfrac{d}{dx} + \gamma \cdot I## with the identity operator ##I=\operatorname{id}=1##. Whether ##T## is linear or not depends on how ##\gamma## is applied to a function ##f##: either ##\gamma## does not act on ##f## at all and is simply added, or it acts as ##f \longmapsto \gamma \cdot f##. The latter is linear.
Thanks Fresh, the ##\gamma## is indeed applied to the function ##f##. This inevitably becomes nonlinear. The solution to the ODE ##TT'\psi= 0## therefore involves complex constants, and it has no Hermitian counterpart. This becomes a nontrivial case, from which no physically sensible properties can be derived.
 
  • #10
Dear Fresh42, can one relate non-linearity to asymmetry of the operator? I noticed, in an attachment given by Dr Du in another thread, that linear operators that vanish in the integral

\begin{equation}
\langle\psi, -i\hbar \, d/dx \, \phi\rangle - \langle -i\hbar \, d/dx \, \psi, \phi\rangle
\end{equation}

are considered as symmetric.

In this thread, considering T for instance, the expression above gives:

\begin{equation}
\langle\psi, (i\hbar \, d/dx+\gamma)\phi\rangle - \langle (i\hbar \, d/dx+\gamma) \psi, \phi\rangle
\end{equation}

this is still symmetric.

Do you have a reference on symmetry and operator linearity, or are these two unrelated?

Thanks!
 
  • #11
SemM said:
Do you have a reference on symmetry and operator linearity, or are these two unrelated?
Symmetry (or anti-symmetry) is unrelated to the linearity of the operators. An operator is a mapping, a function. It can have several properties, and linearity is one of them. The symmetry in the expressions above, on the other hand, is a relation between two operators which has nothing to do with the special nature of each of them. If ## \langle a,b \rangle = \langle b,a \rangle## then you can call ##a## and ##b## symmetric, regardless of what ##a## or ##b## are (as long as the product ##\langle\; , \; \rangle## is defined, of course).

It is as if someone said: "Tarjei Bø and Johannes Thingnes Bø are brothers." This is a symmetric relationship and it doesn't matter if we say Tarjei is Johannes Thingnes' brother or the other way around. It is a property between them. Now whether they are biathletes or not is a completely different matter.

And by the way, the longer I think about your operator ##i \hbar \dfrac{d}{dx} - \gamma## the more I have the feeling that it has to be read as ##i \hbar \dfrac{d}{dx} - \gamma\cdot I##, in which case it is a linear operator. But I don't have the book, so I can't know for sure.
 
  • #12
fresh_42 said:
Symmetry (or anti-symmetry) is unrelated to the linearity of the operators. An operator is a mapping, a function. It can have several properties, and linearity is one of them. The symmetry in the expressions above, on the other hand, is a relation between two operators which has nothing to do with the special nature of each of them. If ## \langle a,b \rangle = \langle b,a \rangle## then you can call ##a## and ##b## symmetric, regardless of what ##a## or ##b## are (as long as the product ##\langle\; , \; \rangle## is defined, of course).

It is as if someone said: "Tarjei Bø and Johannes Thingnes Bø are brothers." This is a symmetric relationship and it doesn't matter if we say Tarjei is Johannes Thingnes' brother or the other way around. It is a property between them. Now whether they are biathletes or not is a completely different matter.

And by the way, the longer I think about your operator ##i \hbar \dfrac{d}{dx} - \gamma## the more I have the feeling that it has to be read as ##i \hbar \dfrac{d}{dx} - \gamma\cdot I##, in which case it is a linear operator. But I don't have the book, so I can't know for sure.
Thanks for the excellent illustration of symmetry. So one can conclude that symmetry of two operators has no implications for the properties of each operator itself, only for their relationship to one another; and if they are, for instance, related in a matrix as pairs a, b, symmetry may be used in some fashion, should it be necessary or simplifying. But this is, according to what I read from your answer, not directly relevant to the properties of the operator itself and what it does to a function in its domain.

About the writing: these two operators are parts of a Hamiltonian that appears in a paper. Based on your point, I am thinking that, given that T must be paired with T', thus TT', the identity operator I becomes redundant, and it results in:

\begin{equation}
\bigg(\hbar^2 \frac{d^2}{dx^2} + \gamma^2\bigg)\psi = 0
\end{equation}

However, in order to study this whole operator, the two operators T and T' were investigated to see how the overall operator behaves.
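For what it's worth, the product ##TT'## can be expanded symbolically under the multiplication-operator reading of ##\gamma##. A SymPy sketch (symbol names are placeholders):

```python
import sympy as sp

x, hbar, gamma = sp.symbols('x hbar gamma')
psi = sp.Function('psi')(x)

# Multiplication-operator reading of gamma in both factors
T  = lambda p:  sp.I * hbar * sp.diff(p, x) + gamma * p
Tp = lambda p: -sp.I * hbar * sp.diff(p, x) + gamma * p

product = sp.expand(T(Tp(psi)))
check = sp.simplify(product - (hbar**2 * sp.diff(psi, x, 2) + gamma**2 * psi))
print(check)  # 0: TT'(psi) = hbar^2 psi'' + gamma^2 psi, the first-order cross terms cancel
```

Under this reading the ##\pm i\hbar\gamma\, d/dx## cross terms cancel, leaving only the second-derivative term and ##\gamma^2##.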

If this is however not sufficient, and there are some parts that remain in question, please let me know!

Cheers
 
  • #13
fresh_42 said:
And by the way, the longer I think about your operator ##i \hbar \dfrac{d}{dx} - \gamma## the more I have the feeling that it has to be read as ##i \hbar \dfrac{d}{dx} - \gamma\cdot I##, in which case it is a linear operator. But I don't have the book, so I can't know for sure.

Hi Fresh, I have looked into this in Kreyszig's Functional Analysis, and he writes the same as you here:

quote:
##"A^{*} = \beta(\alpha Q-i/\alpha D)"##
##"A = \beta(\alpha Q+i/\alpha D)"##

and

"##A^{*} A=\pi/h\big(\alpha^2Q^2+1/\alpha^2D^2-h/2\pi Ĩ \big)##"
"##AA^{*}=\pi/h\big(\alpha^2Q^2+1/\alpha^2D^2+h/2\pi Ĩ \big)##"

and hence:
##AA^{*} - A^{*} A=Ĩ##
end quote

So, applied to the operators here:
T = ##i \hbar \dfrac{d}{dx} - \gamma##
T* = ##-i \hbar \dfrac{d}{dx} - \gamma##

using the identity operator on the constant (why?), one gets:

T = ##i \hbar \dfrac{d}{dx} - \gamma Î##
T* =##-i \hbar \dfrac{d}{dx} - \gamma Î##

and then
##AA^{*} - A^{*} A=Ĩ##

as in Kreyszig. This means T and T* are one another's Hilbert-adjoint operators. But does it mean that if they are a Hilbert-adjoint pair, they are self-adjoint as well?
 
  • #14
Hey SemM.

Linear operators have the property that they are invertible [operators have to be] and that they have the properties f(A + B) = f(A) + f(B) and f(sA) = s*f(A).

The stuff with Hermitian operators means you have complex numbers, so the scalar is s = a + b*i and everything has to be consistent with complex numbers, but the idea is the same as in the real-number case.

Non-linear operators will have some sort of Taylor series expansion, where a function of the spectrum is applied to multiple powers of the diagonalized operator itself.

There is a result in linear algebra where you have f(O) = P*f(D)*P_inverse, where D contains the eigenvalues and P contains the eigenvectors, with f() being some transformation [including non-linear ones] that you apply to the operator O itself. As long as O can be properly diagonalized, you can transform it with some function.

There is a result in operator algebras which generalizes this to an operator on a Hilbert space [which may be infinite-dimensional, with results converging to finite real numbers], and you can literally expand a non-linear operator in the same way you do with a Taylor series expansion of a function. But you need to get its spectrum [eigenvalues and eigenvectors], and you can then expand enough terms to get an approximation if it is highly non-linear.

I'd read about diagonalization first, and then about applying a function to the eigenvalues, before looking at the result of the Taylor-like expansion in a book on operator algebras or C* algebras, and see what a non-linear function does to the eigenvalues. Engineering and pure mathematics books can do this for you.

You can do things like e^A, where A is an operator; provided you have the right conditions on A, any smooth/analytic function can be applied to the operator, and you can calculate it to some precision with a known error.

When you look at the derivative terms for the non-linear operator, they don't drop off after the linear term - just like you would expect for a non-linear function.
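The f(O) = P*f(D)*P_inverse recipe above can be sketched numerically. A minimal NumPy example (a made-up symmetric 2×2 matrix), computing e^A both via the spectrum and via a truncated Taylor series:

```python
import numpy as np

# A made-up real symmetric (hence diagonalizable, self-adjoint) operator
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Spectral route: A = P D P^T (orthonormal eigenvectors), so exp(A) = P exp(D) P^T
eigvals, P = np.linalg.eigh(A)
expA_spectral = P @ np.diag(np.exp(eigvals)) @ P.T

# Taylor route: exp(A) ~= sum_{k} A^k / k!, truncated after 30 terms
expA_series = np.zeros_like(A)
term = np.eye(2)
for k in range(30):
    expA_series += term
    term = term @ A / (k + 1)

print(np.allclose(expA_spectral, expA_series))  # True: both routes agree
```

The same pattern works for any analytic f applied to a diagonalizable matrix: replace `np.exp` by f on the eigenvalues.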
 
  • #15
chiro said:
Linear operators have the property that they are invertible [operators have to be]
Sorry, but where did you read this?
 
  • #16
SemM said:
But does it mean that if they are a Hilbert-adjoint pair, they are self-adjoint as well?
Look up the definition of "self-adjoint operators."
 
  • #17
Mark44 said:
Look up the definition of "self-adjoint operators."

I did, and it does not imply that.
 
  • #18
SemM said:
I did, and it does not imply that.
Hilbert-adjoint is a bit of a strange name for it. Drop Hilbert here. Imagine an operator ##T \, : \, H_1 \longrightarrow H_2## and its adjoint operator ##T^*##. Then ##\langle Tx,y \rangle_{H_2} = \langle x,T^*y \rangle_{H_1}##. Now what would self-adjoint require, and what could it mean?
 
  • #19
fresh_42 said:
Hilbert-adjoint is a bit of a strange name for it. Drop Hilbert here. Imagine an operator ##T \, : \, H_1 \longrightarrow H_2## and its adjoint operator ##T^*##. Then ##\langle Tx,y \rangle_{H_2} = \langle x,T^*y \rangle_{H_1}##. Now what would self-adjoint require, and what could it mean?

A self-adjoint operator requires a symmetric relation to its own Hermitian counterpart, so that ##TT^*=T^*T##, and therefore if these were two observables, they would be measurable at the same time, by the commutation relation ##TT^*-T^*T=0##.
 
  • #20
SemM said:
A self-adjoint operator requires a symmetric relation to its own Hermitian counterpart, so that ##TT^*=T^*T##, and therefore if these were two observables, they would be measurable at the same time, by the commutation relation ##TT^*-T^*T=0##.
No, it doesn't have anything to do with commutativity. Self-adjoint means ##T=T^*##, it is adjoint to itself. This requires ##H_1=H_2##. Whether ##[T,T^*]=0## or not is a different question and not always true. Of course it is if ##T=T^*##.
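This distinction is easy to see with matrices. A minimal NumPy sketch (the matrices `T` and `S` below are made-up examples, not the operators from this thread): a self-adjoint T trivially commutes with T* = T, while a generic T need not commute with its adjoint.

```python
import numpy as np

def comm(A, B):
    """Commutator [A, B] = AB - BA."""
    return A @ B - B @ A

# Self-adjoint: T equals its conjugate transpose, so [T, T*] = 0 trivially
T = np.array([[1.0,        2.0 + 1.0j],
              [2.0 - 1.0j, 3.0       ]])
print(np.allclose(T, T.conj().T))             # True: T = T*
print(np.allclose(comm(T, T.conj().T), 0))    # True

# Not self-adjoint, not even normal: [S, S*] != 0
S = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(np.allclose(comm(S, S.conj().T), 0))    # False
```

Matrices with [S, S*] = 0 are called normal; self-adjointness (S = S*) is a strictly stronger condition.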
 
  • #21
fresh_42 said:
No, it doesn't have anything to do with commutativity. Self-adjoint means ##T=T^*##, it is adjoint to itself. This requires ##H_1=H_2##. Whether ##[T,T^*]=0## or not is a different question and not always true. Of course it is if ##T=T^*##.
Ok, thanks for clarifying that!
 
  • #23
chiro said:
For the invertibility of a linear operator:

https://en.wikipedia.org/wiki/Continuous_linear_operator

Note the inverse property - you can't have that unless it's invertible.

The notation on that page uses ##A^{-1}## to represent the pre-image of the operator ##A##, not its inverse.

Note that the linear operator that maps a space to the zero vector is trivially continuous and clearly not invertible.

In general there is no requirement for a continuous linear operator, or any continuous function, to be one to one, hence invertible.
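A finite-dimensional illustration of that counterexample, as a NumPy sketch (the zero map on R², a made-up minimal case): it is linear and continuous, but it collapses everything to 0, so it cannot be one-to-one and has no inverse.

```python
import numpy as np

Z = np.zeros((2, 2))                     # the zero operator on R^2
x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])

print(np.allclose(Z @ x, Z @ y))         # True: two different inputs, same output
print(np.linalg.matrix_rank(Z))          # 0: the kernel is all of R^2
# np.linalg.inv(Z) would raise LinAlgError, since Z is singular
```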
 
  • #24
Check that the closure of the kernel implies invertibility.
 
  • #25
##\mathcal{H} \longrightarrow \{0\}## has a closed kernel and is definitely not invertible. The best argument against inverses is that the dual space ##\mathcal{H}^* =\{\, T\, : \,\mathcal{H} \rightarrow \mathbb{R}\text{ or }\mathbb{C}\}## is again a complete vector space, so the zero operator is really needed, and addition does not get along with invertibility.
 
  • #26
fresh_42 said:
One further remark:
In case ##\gamma ## is actually ##\gamma \cdot \operatorname{id} = \gamma \cdot I## the operator ##T## becomes linear, as the sum of two linear operators is linear again. In this case we have:
$$
T(f+g) = {i}{\hbar} \dfrac{d}{dx} (f+g) + \gamma \cdot I (f+g)= {i}{\hbar} \dfrac{d}{dx}f + \gamma \cdot f + {i}{\hbar} \dfrac{d}{dx}g + \gamma \cdot g = T(f)+T(g)
$$
so it all depends on how to read ##\gamma##: as a constant translation "plus ##\gamma##" (##\gamma . f = \gamma##, which is non-linear) or as the constant linear operator "times ##\gamma##" (##\gamma . f = \gamma \cdot f##, which is linear).
You should also specify: linear over what? The complexes, the reals, etc.
 
  • #27
chiro said:
Check that the closure of the kernel implies invertibility.
A function is invertible iff it is one-to-one. For a linear operator this is equivalent to the kernel being the zero vector only.
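In matrix terms this criterion can be sketched with NumPy (`A` and `B` are made-up examples): a square matrix is invertible exactly when its null space is trivial, i.e. its nullity is 0.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # det = -2, kernel = {0}
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # det = 0, kernel is a whole line

# nullity = number of columns minus rank (rank-nullity theorem)
nullity = lambda M: M.shape[1] - np.linalg.matrix_rank(M)

print(nullity(A), nullity(B))                         # 0 1
print(np.allclose(np.linalg.inv(A) @ A, np.eye(2)))   # True: A is invertible
```

Trying `np.linalg.inv(B)` would raise a LinAlgError, matching the kernel criterion.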
 
  • #28
Do you know how to link the nullity to the invertibility of the matrix? Do you understand why I said that?
 
  • #29
Post #27:
PeroK said:
A function is invertible iff it is one-to-one. For a linear operator this is equivalent to the kernel being the zero vector only.

Post #28:
chiro said:
Do you know how to link the nullity to the invertibility of the matrix?
Didn't @PeroK just do that in the post before yours?
 
  • #30
The point is that it is invertible - which is what I said a long time ago.

Do you agree or not?

If the nullity is zero and the matrix is square, it must be invertible.

Do you agree or not?
 
