# Bra-Ket Notation Manipulations: Quantum State Expansion


#### expos4ever

TL;DR Summary
Confused by manipulations within a sum using bra-ket notation (in a teaching video). The context is the expansion of a quantum state using an orthonormal basis.
I suspect it will help if you know about my background: I did some linear algebra in university but never used it and am now in my mid 60s. I am interested in understanding the mathematics of quantum physics. I have read a number of layman's texts on quantum mechanics, but they all gloss over the math.

I found this very interesting series online and I have a question about one of the videos that deals with "bras" and "bra-ket" notation; here is the link:

At the 8 minute mark, an example begins. At 8:25, he says "let's move the inner product to the front". While I think I understand the basics of inner products, I do not understand how such a move is justified or exactly what he is doing. It almost seems that he misspoke and should have said "let's move the inner product to the back". Has he misspoken? Assuming he has not, what is going on at this point?

I believe I understand the bit at 8:32 about "breaking the inner product apart", but I do not understand how the two Ai's together constitute an operator (although I believe I correctly understand what an operator is - it is a mathematical object that acts on a vector to generate another vector). Maybe it is the notation that is tripping me up.

Any words of wisdom greatly appreciated.

> At the 8 minute mark, an example begins. At 8:25, he says "let's move the inner product to the front". While I think I understand the basics of inner products, I do not understand how such a move is justified or exactly what he is doing. It almost seems that he misspoke and should have said "let's move the inner product to the back". Has he misspoken? Assuming he has not, what is going on at this point?
An inner product is just a complex number, and scalars commute with vectors: ##a \ket{\phi} = \ket{\phi} a##, so that step is allowed. The point of doing this is to reveal a structure,
\begin{align*} \big( \braket{A_i|\psi} \big) \ket{A_i} &= \ket{A_i} \big( \braket{A_i|\psi} \big)\\ &= \big( \ket{A_i}\bra{A_i} \big) \ket{\psi} \end{align*}
where I have used parentheses to make this more explicit.

I would also have said "move to the back," but when you work a lot with operators and kets, you get into the habit of reading equations right to left (since this is the order in which operators operate), so the author of the video might think of the right-most position as the "front."
> I believe I understand the bit at 8:32 about "breaking the inner product apart", but I do not understand how the two Ai's together constitute an operator (although I believe I correctly understand what an operator is - it is a mathematical object that **acts on a vector to generate another vector**). Maybe it is the notation that is tripping me up.
Given what I have put in bold, you should see that ##\ket{\cdot}\bra{\cdot}## is an operator, as it acts on a ket and returns a ket. Given ##\hat{\Lambda} \equiv \ket{\alpha}\bra{\beta}##,
\begin{align*} \hat{\Lambda} \ket{\psi} &= \ket{\alpha}\braket{\beta|\psi} \\ &= b \ket{\alpha} \end{align*}
with ##b \equiv \braket{\beta|\psi} \in \mathbb{C}##.

As a non-mathematician, can I add this...

> but I do not understand how the two Ai's together constitute an operator (although I believe I correctly understand what an operator is - it is a mathematical object that acts on a vector to generate another vector). Maybe it is the notation that is tripping me up.
In the present context, an operator can be represented by a matrix (which you probably already know).

Example in 2D with orthonormal basis vectors ##\ket {A_1} = \begin {pmatrix} 1 \\ 0 \end {pmatrix}## and ##\ket {A_2} = \begin {pmatrix} 0 \\ 1 \end {pmatrix}## so ##\bra {A_1} = (1~0)## and ##\bra {A_2} = (0~1)##.

##\ket {A_1} \bra {A_1} = \begin {pmatrix} 1 \\ 0 \end {pmatrix} (1~0) = \begin {pmatrix} 1 & 0 \\ 0 & 0\end{pmatrix}## This is the outer product of a ket and a bra (a special case of the tensor product), which produces a matrix.

Similarly ##\ket {A_2} \bra {A_2} = \begin {pmatrix} 0 & 0 \\ 0 & 1\end{pmatrix}##

##\sum_{i=1}^2 \ket {A_i} \bra {A_i} = \begin {pmatrix} 1 & 0 \\ 0 & 0\end{pmatrix} + \begin {pmatrix} 0 & 0 \\ 0 & 1\end{pmatrix} = \begin {pmatrix} 1 & 0 \\ 0 & 1\end{pmatrix}## which is the identity operator as noted in the video.
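For readers who want to check this numerically, here is a small sketch using NumPy (not part of the thread, just a convenient way to experiment with these matrices). It builds the two projectors ##\ket{A_i}\bra{A_i}## as outer products and verifies that their sum is the identity:

```python
import numpy as np

# Orthonormal basis kets as column vectors
A1 = np.array([[1], [0]])
A2 = np.array([[0], [1]])

# |A_i><A_i| : outer product of each ket with its own bra (conjugate transpose)
P1 = A1 @ A1.conj().T
P2 = A2 @ A2.conj().T

# The projectors sum to the identity operator
assert np.array_equal(P1 + P2, np.eye(2))

# Acting on any ket therefore returns that same ket
psi = np.array([[3], [4j]])
assert np.array_equal((P1 + P2) @ psi, psi)
```

Because the sum is the identity, inserting ##\sum_i \ket{A_i}\bra{A_i}## in front of any ket changes nothing, which is exactly what makes the expansion in the video work.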


Thanks to both who answered. I have not had time yet to go through your answers in detail, but I can think of one immediate follow-on. You both obviously think an expression of the form |x> <y| is legitimate - operator on the right, ket on the left. I was thinking that since the form <a| is used to denote an operator, and |b> a ket (vector), you could only ever have this form: <a| |b>, and not |b> <a|. See what I mean? I thought the operator had to be on the left. I suspect you will tell me that there is no such restriction, as they are both simply matrices from a mathematical perspective.

Think of it like this

An operator is something that, when acting on a ket from the left, returns a ket; and when acting on a bra from the right, it returns a bra.

##\ket{a} \bra{b}## fits the bill and is thus a way of writing an operator.

A bra is not an operator, since it produces a scalar when acting on a ket.
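One quick way to see the distinction is through array shapes: a bra times a ket is 1×1 (a scalar), while a ket times a bra is 2×2 (an operator). A small NumPy sketch, with made-up values purely for illustration:

```python
import numpy as np

a = np.array([[1], [2]])   # ket |a>, a 2x1 column
b = np.array([[3], [1]])   # ket |b>
bra_b = b.conj().T         # bra <b|, a 1x2 row

ket = np.array([[5], [7]]) # some ket |psi>

# A bra acting on a ket gives a 1x1 result, i.e. a scalar -- not a ket
scalar = bra_b @ ket
print(scalar.shape)        # (1, 1)

# |a><b| is 2x2; acting on a ket it returns another ket (a 2x1 column)
op = a @ bra_b
print((op @ ket).shape)    # (2, 1)
```

The bra maps a ket to a scalar, so it fails the "ket in, ket out" test; the ket-bra combination passes it.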

> You both obviously think an expression of the form |x> <y| is legitimate - operator on right, ket on left. I was thinking that since the form <a| is used to denote an operator, and |b> a ket (vector), you could only ever have this form : <a| |b>, and not |b> <a|. See what I mean? I thought the operator had to be on the left. I suspect you will tell me that there is no such restriction as they are both simply matrices from a mathematical perspective.
This could cause extra confusion but I’ll have a go…

When we write ##\braket {A|B}## (equivalent to ##\bra A \ket B##) the result is a scalar.

The bra, ##\bra A##, is acting as a ‘sort of’ operator which maps the vector ##\ket B## to a scalar. I believe the correct description for the bra when acting in this way is a ‘linear functional’.

Note that ##\bra A## is not a member of the same vector space as ##\ket A## and ##\ket B##. Kets belong to a space of column vectors; bras belong to a space of row vectors. Each space is the dual of the other.

Example: suppose we want the magnitude-squared (a scalar) for ##\ket B##. Is there a linear functional which will do it for us? Yes! It turns out to be the conjugate transpose of ##\ket B## itself, namely ##\bra B##, i.e. ##\braket { B|B} = ||B||^2##.
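If it helps, this can be checked numerically. A short NumPy sketch (the complex entries are made up for illustration; the conjugate in the bra is what makes the result real and non-negative):

```python
import numpy as np

B = np.array([[1 + 2j], [3j]])  # ket |B> with complex entries

bra_B = B.conj().T              # the linear functional <B| (conjugate transpose)

# <B|B> = ||B||^2 : |1+2j|^2 + |3j|^2 = 5 + 9 = 14
norm_sq = (bra_B @ B)[0, 0]
print(norm_sq.real)             # 14.0
```

Note that the plain transpose (without conjugation) would give ##(1+2j)^2 + (3j)^2##, which is complex, so the conjugation is essential for ##\braket{B|B}## to behave like a squared length.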

But the term linear operator means something which acts on a vector to produce another vector. Typically a linear operator can be represented by a matrix (e.g. think of a rotation matrix).

We can construct linear operators from vectors. This is what @DrClaude showed in Post #2. It may help to make up some simple (real) values for the vectors ##\ket {\alpha} , \ket {\beta}## and ##\ket {\psi}## and construct/use the matrix ##\ket {\alpha} \bra {\beta}##. It should help you see what’s going on. The linear operator (matrix) acts on ##\ket {\psi}## and produces a new vector which is 'in the direction of' ##\ket {\alpha}##.
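Following that suggestion, here is one possible set of made-up real values worked through in NumPy (the numbers are arbitrary, chosen only for illustration):

```python
import numpy as np

alpha = np.array([[1], [2]])  # ket |alpha> (made-up real values)
beta  = np.array([[3], [0]])  # ket |beta>
psi   = np.array([[4], [5]])  # ket |psi>

op = alpha @ beta.T           # the matrix |alpha><beta|

result = op @ psi             # |alpha> <beta|psi>
b = (beta.T @ psi)[0, 0]      # the scalar <beta|psi> = 3*4 + 0*5 = 12

# result equals b * alpha, i.e. a vector in the direction of |alpha>
assert np.array_equal(result, b * alpha)
print(result.ravel())         # [12 24]
```

Whatever ##\ket{\psi}## you feed in, the output is always some multiple of ##\ket{\alpha}##; only the scalar ##\braket{\beta|\psi}## changes.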

In linear algebra, a linear transformation ##f## is called an operator when ## f: A \to A##, i.e. when domain ##A## and codomain ##A## are the same set.

> In linear algebra, a linear transformation ##f## is called an operator when ## f: A \to A##, i.e. when domain ##A## and codomain ##A## are the same set.
Apologies if I’m being picky but in the context of QM, the domain and codomain are the same (complex Hilbert) space (not 'set').

But in other (non-QM) contexts, they needn't be the same vector space. E.g. you could have a linear transformation ##f## such that ##f: \mathbb {R^3} \to \mathbb {R^2}##.
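For instance, any 2×3 real matrix gives such a transformation. A minimal NumPy sketch (the matrix is a made-up example, a projection onto the first two coordinates):

```python
import numpy as np

# A 2x3 matrix is a linear transformation f: R^3 -> R^2,
# so its domain and codomain differ -- not an "operator" in the strict sense
f = np.array([[1, 0, 0],
              [0, 1, 0]])

v = np.array([3, 4, 5])  # a vector in R^3
w = f @ v                # a vector in R^2
print(w)                 # [3 4]
```

Since the input lives in ##\mathbb{R}^3## and the output in ##\mathbb{R}^2##, this matrix cannot be iterated on its own output, unlike a square matrix acting on a single space.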


> Apologies if I’m being picky but in the context of QM, the domain and codomain are the same (complex Hilbert) space (not 'set').
>
> But in other (non-QM) contexts, they needn't be the same vector space. E.g. you could have a linear transformation ##f## such that ##f: \mathbb {R^3} \to \mathbb {R^2}##.
Well, this is the linear algebra subforum.

In all my linear algebra books, an operator is a linear transformation for which the domain and the codomain are the same. Sure, you can have other linear transformations, but those are not called operators.

All I wanted to say was that bras are not operators in this sense.
