Can Linear Combinations in Quantum Mechanics Include Multiplication of States?

Summary
Linear combinations in quantum mechanics are defined as sums of states multiplied by arbitrary constants, such as Ψ = αψ + βψ. Multiplying states, as in Ψ = αψ × βψ, does not qualify as a linear combination due to the nature of linearity, which involves addition and scalar multiplication. The discussion emphasizes that operations like AA' = A'A indicate commutation but do not imply linear dependence. Additionally, the concept of bilinearity is introduced, clarifying that multiplication of operators is a non-linear operation. Understanding these foundational concepts is crucial for grasping more advanced topics in quantum mechanics and functional analysis.
SeM
Hi, I read that a linear combination of a state ##\psi## can be written as:

\begin{equation}
\Psi = \alpha \psi + \beta \psi
\end{equation}

where ##\alpha## and ##\beta## are arbitrary constants.

Can this, however, also be a valid linear combination?
\begin{equation}
\Psi = \alpha \psi \times \beta \psi
\end{equation}

Thanks!
 
Thanks. Is there a definition for this? Something like "linear ..."?

Thanks
 
No, it's really me who is writing it, and I want to say that ##AA' = T## and ##A'A = T##, and that both ##A## and ##A'## are linearly dependent on some other operators, so that makes ##T## also dependent on some other operator, given that ##A## and ##A'## are a "linear ..." of ##T##.
 
SeM said:
No, it's really me who is writing it, and I want to say that ##AA' = T## and ##A'A = T##, and that both ##A## and ##A'## are linearly dependent on some other operators, so that makes ##T## also dependent on some other operator, given that ##A## and ##A'## are a "linear ..." of ##T##.
##AA'=A'A## means ##A## and ##A'## commute: ##[A,A']=0##. This is a non-linear property. You must not conclude any linear dependencies from this equation alone. E.g. take ##A= \begin{bmatrix}0&1&0\\0&0&0\\0&0&0\end{bmatrix}## and ##A'=\begin{bmatrix}0&0&0\\0&0&0\\0&0&1\end{bmatrix}##. They both multiply to ##T=0## from either side, but they are not linearly dependent on anything (except ##0##).
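
For a concrete check of this example, here is a minimal sketch using NumPy (added for illustration; the variable names are mine):

Python:
import numpy as np

# fresh_42's example: both products give the zero matrix T = 0,
# so A and A' commute, yet neither is a scalar multiple of the other.
A  = np.array([[0, 1, 0],
               [0, 0, 0],
               [0, 0, 0]])
Ap = np.array([[0, 0, 0],
               [0, 0, 0],
               [0, 0, 1]])

print(A @ Ap)                           # zero matrix
print(Ap @ A)                           # zero matrix
print(np.array_equal(A @ Ap, Ap @ A))   # True: [A, A'] = 0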
 
fresh_42 said:
##AA'=A'A## means ##A## and ##A'## commute: ##[A,A']=0##. This is a non-linear property. You must not conclude any linear dependencies from this equation alone. E.g. take ##A= \begin{bmatrix}0&1&0\\0&0&0\\0&0&0\end{bmatrix}## and ##A'=\begin{bmatrix}0&0&0\\0&0&0\\0&0&1\end{bmatrix}##. They both multiply to ##T=0## from either side, but they are not linearly dependent on anything (except ##0##).
Thanks! Is this why ##xA'A = xAA'## is the only allowed combination with another operator ##x##?
 
SeM said:
Thanks! Is this why ##xA'A = xAA'## is the only allowed combination with another operator ##x##?
No. However, I'm not quite sure what you mean. Usually small letters indicate vectors and capital letters matrices or operators in this context. As you had ##A'A=T=AA'##, you surely also have ##xA'A=xT=xAA'## and ##A'Ax=Tx=AA'x## for all vectors ##x##, and also ##XA'A=XT=XAA'## and ##A'AX=TX=AA'X## for any operator, i.e. matrix, ##X##. I got the impression, also from your other threads, that you should revisit linear algebra, since it seems you confuse fundamental concepts which are necessary to understand functional analysis. I have explained some basics about operators and such here:
https://www.physicsforums.com/insights/tell-operations-operators-functionals-representations-apart/
but the concepts of linearity, subspaces and linear independence, which are crucial for understanding functional analysis, are not explained there. One could say that linear algebra is often finite-dimensional and functional analysis infinite-dimensional. Of course this is a bit short and provocative, but there is some truth in it. So the basics of linear algebra are important, since they occur everywhere.

Linearity means addition and scalar multiplication (stretching and compressing vectors). What you wrote was ##AA'##, which is a multiplication of mappings and thus not linear. It is linear in a single argument, because the distributive laws hold: ##A(A'+B')=AA'+AB'## and ##(A+B)A'=AA'+BA'##. But this requires one factor to be fixed. It is therefore called bilinear (i.e. linear in each argument separately). As a whole, i.e. as a multiplication in contrast to an addition, ##AA'## is a non-linear concept.

This also answers your introductory question: ##\Psi = \alpha \psi + \beta \psi## is a linear combination (addition and stretching by ##\alpha## and ##\beta##, respectively), whereas ##\Psi = \alpha \psi \times \beta \psi## is not, because it is a multiplication.
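
Both points can be seen concretely in a small sketch (assuming NumPy; the matrices are random and purely illustrative): the product distributes over sums in each factor separately, but the map ##(A, A') \mapsto AA'## is not linear in the pair, because cross terms appear.

Python:
import numpy as np

rng = np.random.default_rng(0)
A, B, Ap, Bp = (rng.standard_normal((2, 2)) for _ in range(4))

# Bilinear: with one factor fixed, the product distributes over sums.
print(np.allclose(A @ (Ap + Bp), A @ Ap + A @ Bp))        # True
print(np.allclose((A + B) @ Ap, A @ Ap + B @ Ap))         # True

# Not linear as a map of the pair: (A+B)(A'+B') picks up the
# cross terms AB' + BA', so it differs from AA' + BB' in general.
print(np.allclose((A + B) @ (Ap + Bp), A @ Ap + B @ Bp))  # False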
 
By the way ##\Psi = \alpha \psi + \beta \psi = (\alpha+\beta)\psi##. Anyway, why is this question in the differential geometry forum?
 
  • #10
martinbn said:
By the way ##\Psi = \alpha \psi + \beta \psi = (\alpha+\beta)\psi##. Anyway, why is this question in the differential geometry forum?
Good question. Moved to linear algebra.
 
  • #11
fresh_42 said:
It is therefore called bilinear (i.e. linear in each argument separately). As a whole, i.e. as a multiplication in contrast to an addition, ##AA'## is a non-linear concept.

This also answers your introductory question: ##\Psi = \alpha \psi + \beta \psi## is a linear combination (addition and stretching by ##\alpha## and ##\beta##, respectively), whereas ##\Psi = \alpha \psi \times \beta \psi## is not, because it is a multiplication.
Thanks fresh_42 for the thorough outline on this. Indeed I am confused. This operator is a differential operator with a constant, and it appears twice, as in the factorized form of the operator ##H## of the Schrödinger equation.

So I was trying to say something about the two factorized components of some form of Hamiltonian (similar to the Schrödinger ##H##), and whether they are bounded/unbounded and/or linear. But it turns out they are neither bounded nor unbounded, because they both yield complex norms. So they can simply be called complex operators. And if I have:

##B = ih\,\frac{d}{dx} + g## and ##B' = -ih\,\frac{d}{dx} + g##, then in combination they satisfy ##BB' = B'B##, but they are neither bilinear nor linear?

Thanks!
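
As a quick symbolic check of the commutation claim in the post above, the following sketch applies both orderings of ##B## and ##B'## to a test function (assuming SymPy; both orderings reduce to ##h^2 f'' + g^2 f##):

Python:
import sympy as sp

# Symbols and a generic test function (names are illustrative)
x, h, g = sp.symbols('x h g', real=True)
f = sp.Function('f')

# B = ih d/dx + g and B' = -ih d/dx + g, applied to a function u
B  = lambda u:  sp.I * h * sp.diff(u, x) + g * u
Bp = lambda u: -sp.I * h * sp.diff(u, x) + g * u

# The imaginary cross terms cancel in both orderings,
# leaving h**2 f'' + g**2 f, so BB' - B'B = 0.
print(sp.expand(B(Bp(f(x))) - Bp(B(f(x)))))  # prints 0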
 
  • #12
SeM said:
Hi, I read that linear combinations of a state, Psi, can be as:

\begin{equation}
\Psi = \alpha \psi + \beta \psi
\end{equation}

where ##\alpha## and ##\beta## are arbitrary constants.

Can however this be a valid linear combination?\begin{equation}
\Psi = \alpha \psi \times \beta \psi
\end{equation}
The term "linear combination" is one that is defined in just about every linear algebra textbook. It is used in conjunction with another concept, the span of a set of vectors. The linear combination concept also comes up in differential equations in which the set of all solutions of a homogeneous differential equation is the set of linear combinations of a basic set of solutions.

Since you are working with operators and Hilbert spaces and such (based on other threads), I would strongly advise you to review the more basic concepts of linear algebra and differential equations.

Also, don't write $\alpha$ and similar -- this doesn't do anything on this site. Instead, use ##\alpha## (for inline TeX) or $$\alpha$$ (for standalone TeX).
 
  • #13
Mark44 said:
The term "linear combination" is one that is defined in just about every linear algebra textbook. It is used in conjunction with another concept, the span of a set of vectors. The linear combination concept also comes up in differential equations in which the set of all solutions of a homogeneous differential equation is the set of linear combinations of a basic set of solutions.

Since you are working with operators and Hilbert spaces and such (based on other threads), I would strongly advise you to review the more basic concepts of linear algebra and differential equations.

Also, don't write $\alpha$ and similar -- this doesn't do anything on this site. Instead, use ##\alpha## (for inline TeX) or $$\alpha$$ (for standalone TeX).

Thanks, will do!
 
  • #14
I read it today, and it was very useful and pedagogically well written. I learned a few things I was not aware of. Thanks. There is, however, a typo at:
"
A common example are matrix groups ##G## and vector spaces ##V##, where the operation is the application of the transformation represented by the matrix. An orthogonal matrix (representing a rotation) $$g=\begin{bmatrix}\cos \varphi & \sin \varphi \\ -\sin \varphi & \cos \varphi \end{bmatrix} \tag{3}$$ transforms a two dimensional vector in its by ##\varphi## rotated version.
"

It says "in its", which seems to be missing a word after it, or should it say "in it"?

Cheers and thanks for the link
 
  • #15
Maybe it's not the best English, but it is correct: a vector is transformed into a new one. The new one is the same as the old one, just rotated by an angle of ##\varphi##, so it is, in a way, the "by ##\varphi## rotated" version of itself. So maybe "its own version, which is rotated by ##\varphi##" might have been better. I simply substituted the subordinate clause "which is rotated by ##\varphi##" with the prepositional phrase "by ##\varphi## rotated", and replaced the genitive "version of itself" with the possessive "its version".
 
  • #16
fresh_42 said:
Maybe it's not the best English, but it is correct: a vector is transformed into a new one. The new one is the same as the old one, just rotated by an angle of ##\varphi##, so it is, in a way, the "by ##\varphi## rotated" version of itself. So maybe "its own version, which is rotated by ##\varphi##" might have been better. I simply substituted the subordinate clause "which is rotated by ##\varphi##" with the prepositional phrase "by ##\varphi## rotated", and replaced the genitive "version of itself" with the possessive "its version".

I think the article was excellent anyway!

Thanks!
 
