Linear combinations

  • #1
Hi, I read that linear combinations of a state, ##\Psi##, can be written as:

\begin{equation}
\Psi = \alpha \psi + \beta \psi
\end{equation}

where ##\alpha## and ##\beta## are arbitrary constants.

Can this, however, also be a valid linear combination?


\begin{equation}
\Psi = \alpha \psi \times \beta \psi
\end{equation}

Thanks!
 

Answers and Replies

  • #3
Thanks. Is there a name or definition for this, something like "linear ..."?

Thanks
 
  • #5
No, it's really me who is writing it. I want to say that ##AA' = T## and ##A'A = T##, and that both ##A## and ##A'## are linearly dependent with some other operators, so that makes ##T## also dependent on some other operator, given that ##A## and ##A'## are a "linear ..." of ##T##.
 
  • #6
fresh_42
Mentor
No, it's really me who is writing it. I want to say that ##AA' = T## and ##A'A = T##, and that both ##A## and ##A'## are linearly dependent with some other operators, so that makes ##T## also dependent on some other operator, given that ##A## and ##A'## are a "linear ..." of ##T##.
##AA'=A'A## means ##A## and ##A'## commute: ##[A,A']=0##. This is a non-linear property. You cannot conclude any linear dependence from this equation alone. E.g. take ##A= \begin{bmatrix}0&1&0\\0&0&0\\0&0&0\end{bmatrix}## and ##A'=\begin{bmatrix}0&0&0\\0&0&0\\0&0&1\end{bmatrix}##. They both multiply to ##T=0## from either side, but are not linearly dependent on anything (except ##0##).
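As a quick numerical sanity check of this counterexample (a NumPy sketch; the matrices are taken directly from the post above):

```python
import numpy as np

# The two matrices from the example above
A = np.array([[0, 1, 0],
              [0, 0, 0],
              [0, 0, 0]])
Ap = np.array([[0, 0, 0],
               [0, 0, 0],
               [0, 0, 1]])

T = A @ Ap  # both products give the zero matrix

assert np.array_equal(A @ Ap, Ap @ A)          # A and A' commute: [A, A'] = 0
assert np.array_equal(T, np.zeros((3, 3), dtype=int))

# Yet neither matrix is a scalar multiple of the other: A has its only
# nonzero entry at position (0,1), A' at (2,2), so no linear dependence.
```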
 
  • #7
##AA'=A'A## means ##A## and ##A'## commute: ##[A,A']=0##. This is a non-linear property. You cannot conclude any linear dependence from this equation alone. E.g. take ##A= \begin{bmatrix}0&1&0\\0&0&0\\0&0&0\end{bmatrix}## and ##A'=\begin{bmatrix}0&0&0\\0&0&0\\0&0&1\end{bmatrix}##. They both multiply to ##T=0## from either side, but are not linearly dependent on anything (except ##0##).


Thanks! Is this why xA'A = xAA' is the only allowed combination with another operator (x)?
 
  • #8
fresh_42
Thanks! Is this why xA'A = xAA' is the only allowed combination with another operator (x)?
No. However, I'm not quite sure what you mean. Usually lowercase letters denote vectors, and capital letters matrices or operators, in this context. Since ##A'A=T=AA'##, you certainly also have ##xA'A=xT=xAA'## and ##A'Ax=Tx=AA'x## for all vectors ##x##, and likewise ##XA'A=XT=XAA'## and ##A'AX=TX=AA'X## for any operator, i.e. matrix, ##X##. I got the impression, also from your other threads, that you should revisit linear algebra, since you seem to confuse fundamental concepts which are necessary to understand functional analysis. I have explained some basics about operators and such here:
https://www.physicsforums.com/insights/tell-operations-operators-functionals-representations-apart/
but the concepts of linearity, subspaces and linear independence, which are crucial for understanding functional analysis, are not explained there. One could say that linear algebra is often finite dimensional and functional analysis infinite dimensional. Of course this is a bit short and provocative, but there is some truth in it. So the basics of linear algebra are important, since they occur everywhere. Linearity is addition and scalar multiplication (stretching and compressing vectors). What you wrote was ##AA'##, which is a multiplication of mappings and thus not linear. It is linear in a single argument, because the distributive laws hold: ##A(A'+B')=AA'+AB'## and ##(A+B)A'=AA'+BA'##. But this requires one factor to be fixed. It is therefore called bilinear (i.e. linear in each argument separately). As a whole, that is, as a multiplication in contrast to an addition, ##AA'## is a non-linear concept.

This also answers your introductory question: ##\Psi = \alpha \psi + \beta \psi## is a linear combination (addition and stretching by ##\alpha , \beta ## resp.), whereas ##\Psi = \alpha \psi \times \beta \psi ## is none, because it is a multiplication.
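The distributive laws above (linearity in each factor separately, but not in the pair) can be checked numerically; a NumPy sketch with arbitrary random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, Ap, Bp = (rng.standard_normal((3, 3)) for _ in range(4))
a = 2.5  # an arbitrary scalar

# Linear in the second argument (first factor A held fixed):
assert np.allclose(A @ (Ap + Bp), A @ Ap + A @ Bp)
assert np.allclose(A @ (a * Ap), a * (A @ Ap))

# Linear in the first argument (second factor Ap held fixed):
assert np.allclose((A + B) @ Ap, A @ Ap + B @ Ap)

# But not linear as a map of the pair: scaling BOTH inputs by a
# scales the product by a**2, not by a.
assert np.allclose((a * A) @ (a * Ap), a**2 * (A @ Ap))
```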
 
  • #9
martinbn
Science Advisor
By the way ##\Psi = \alpha \psi + \beta \psi = (\alpha+\beta)\psi##. Anyway, why is this question in the differential geometry forum?
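In other words, a linear combination of a single state ##\psi## with itself is just another multiple of ##\psi##; a one-line NumPy check:

```python
import numpy as np

psi = np.array([1.0, 2.0, 3.0])
alpha, beta = 0.6, -1.3

# alpha*psi + beta*psi collapses to (alpha + beta)*psi:
# still a scalar multiple of psi, not a genuinely new direction.
assert np.allclose(alpha * psi + beta * psi, (alpha + beta) * psi)
```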
 
  • #10
fresh_42
By the way ##\Psi = \alpha \psi + \beta \psi = (\alpha+\beta)\psi##. Anyway, why is this question in the differential geometry forum?
Good question. Moved to linear algebra.
 
  • #11
It is therefore called bilinear (i.e. linear in each argument separately). As a whole, that is, as a multiplication in contrast to an addition, ##AA'## is a non-linear concept.

This also answers your introductory question: ##\Psi = \alpha \psi + \beta \psi## is a linear combination (addition and stretching by ##\alpha , \beta ## resp.), whereas ##\Psi = \alpha \psi \times \beta \psi ## is none, because it is a multiplication.


Thanks fresh_42 for the thorough outline on this. Indeed I am confused. This operator is a differential operator with a constant, and it appears twice, as in the factorized form of the operator ##H## of the Schrödinger equation.

So I was trying to say something about the two factorized components of some form of Hamiltonian (similar to the Schrödinger ##H##), and whether they are bounded/unbounded and/or linear. But it turns out they are neither bounded nor unbounded, because they both yield complex norms. So they can simply be called complex operators. And if I have:

##B = i\hbar\,\frac{d}{dx} + g## and ##B' = -i\hbar\,\frac{d}{dx} + g##, then in combination they satisfy ##BB' = B'B##, but they are NOT bilinear nor linear?

Thanks!
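Whether those two factorized operators commute can be checked symbolically; a SymPy sketch, where `hbar` and `g` stand for the constants in the post (the variable names are mine):

```python
import sympy as sp

x = sp.symbols('x', real=True)
hbar, g = sp.symbols('hbar g', real=True, positive=True)
f = sp.Function('f')(x)  # an arbitrary test function

# B = i*hbar*d/dx + g  and  B' = -i*hbar*d/dx + g, applied to f
B  = lambda u: sp.I * hbar * sp.diff(u, x) + g * u
Bp = lambda u: -sp.I * hbar * sp.diff(u, x) + g * u

BBp = sp.expand(B(Bp(f)))   # both orders give hbar**2 f'' + g**2 f
BpB = sp.expand(Bp(B(f)))

# For a CONSTANT g the cross terms cancel, so [B, B'] = 0
assert sp.simplify(BBp - BpB) == 0
```

Note that the cancellation relies on ##g## being a constant; each factor is itself a linear operator, while the product ##BB'## is bilinear in the pair, as explained in the reply above.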
 
  • #12
Hi, I read that linear combinations of a state, Psi, can be as:

\begin{equation}
\Psi = \alpha \psi + \beta \psi
\end{equation}

where ##\alpha## and ##\beta## are arbitrary constants.

Can however this be a valid linear combination?


\begin{equation}
\Psi = \alpha \psi \times \beta \psi
\end{equation}
The term "linear combination" is one that is defined in just about every linear algebra textbook. It is used in conjunction with another concept, the span of a set of vectors. The linear combination concept also comes up in differential equations in which the set of all solutions of a homogeneous differential equation is the set of linear combinations of a basic set of solutions.

Since you are working with operators and Hilbert spaces and such (based on other threads), I would strongly advise you to review the more basic concepts of linear algebra and differential equations.

Also, don't write $\alpha$ and similar -- this doesn't do anything on this site. Instead, use ##\alpha## (for inline TeX) or $$\alpha$$ (for standalone TeX).
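For concreteness, expressing a vector as a linear combination of given vectors amounts to solving a linear system for the coefficients; a NumPy sketch (the vectors are my own arbitrary example):

```python
import numpy as np

# Two vectors v1, v2 and a target w that lies in their span
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
w  = np.array([2.0, 3.0, 5.0])

# Find coefficients (alpha, beta) with w = alpha*v1 + beta*v2
# (least squares gives the exact solution when w is in the span)
coeffs, _, _, _ = np.linalg.lstsq(np.column_stack([v1, v2]), w, rcond=None)
alpha, beta = coeffs

assert np.allclose(alpha * v1 + beta * v2, w)  # w is in span{v1, v2}
```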
 
  • #13
The term "linear combination" is one that is defined in just about every linear algebra textbook. It is used in conjunction with another concept, the span of a set of vectors. The linear combination concept also comes up in differential equations in which the set of all solutions of a homogeneous differential equation is the set of linear combinations of a basic set of solutions.

Since you are working with operators and Hilbert spaces and such (based on other threads), I would strongly advise you to review the more basic concepts of linear algebra and differential equations.

Also, don't write $\alpha$ and similar -- this doesn't do anything on this site. Instead, use ##\alpha## (for inline TeX) or $$\alpha$$ (for standalone TeX).

Thanks, will do!
 
  • #14


I read it today, and it was very useful and pedagogically well written. I learned a few things I was not aware of. Thanks. There is a typo, however:

at :
"
A common example are matrix groups ##G## and vector spaces ##V## where the operation is the application of the transformation represented by the matrix. A orthogonal matrix (representing a rotation) ##g=\begin{bmatrix}\cos\varphi & \sin\varphi \\ -\sin\varphi & \cos\varphi\end{bmatrix}## (3) transforms a two dimensional vector in its by ##\varphi## rotated version.
"

it says "in its", which seems to be missing some word after it, or should it say "in it"?

Cheers and thanks for the link
 
  • #15
fresh_42
Maybe it's not the best English, but it is correct: a vector is transformed into a new one. The new one is the same as the old one, just rotated by an angle of ##\varphi##, so it is, in a way, "the by ##\varphi## rotated version" of itself. So maybe "its own version which is rotated by ##\varphi##" might have been better. I simply substituted the subordinate clause "which is rotated by ##\varphi##" by the prepositional phrase "its by ##\varphi## rotated version", and the prepositional genitive "version of itself" by its grammatical case "its version".
 
  • #16
Maybe it's not the best English, but it is correct: a vector is transformed into a new one. The new one is the same as the old one, just rotated by an angle of ##\varphi##, so it is, in a way, "the by ##\varphi## rotated version" of itself. So maybe "its own version which is rotated by ##\varphi##" might have been better. I simply substituted the subordinate clause "which is rotated by ##\varphi##" by the prepositional phrase "its by ##\varphi## rotated version", and the prepositional genitive "version of itself" by its grammatical case "its version".

I think the article was excellent anyway!

Thanks!
 
