# Bras and kets

1. Sep 5, 2014

### K space

I'm new to the concepts of quantum mechanics and the bra-ket representation in general.
I've seen in the textbook that the completeness relation is used all the time when working with bras and kets. I'm a bit confused about how this relation is used when it's applied more than once in a single expression, and why it's then always applied as summations/integrals over different eigenvalues.

For example, consider:

$<ψ|\hat X|ψ>$

Expanding ψ in a basis:

$|ψ>=\sum_{a}c_a|a>$ and $<ψ|=\sum_{a}c_a<a|$
which gives

$<ψ|\hat X|ψ>=\sum_{a}c_ac^*_a<a|\hat X|a>=\sum_{a}c_ac^*_aa<a|a>$

In the book they would instead have two different sums for the bra and the ket:

$<ψ|\hat X|ψ>=\sum_{a}\sum_{b}c_ac^*_b<b|\hat X|a>=\sum_{a}\sum_{b}c_ac^*_ba<b|a>$

Will these two expressions equal each other? If so, why are summations over different eigenvalues always introduced, as in the latter case? It feels like I'm missing something fundamental here.
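A quick numerical sketch of the question (not from the thread; the eigenvalues and coefficients below are arbitrary, made-up values): in the special case of an orthonormal eigenbasis of $\hat X$, where $<b|a>=\delta_{ab}$, the double sum does collapse to the single sum.

```python
# Expand |psi> in an assumed orthonormal eigenbasis of X-hat and compare
# the single-sum shortcut with the full double sum. Values are arbitrary.

eigvals = [1.0, 2.0, 3.0]            # eigenvalues a of X-hat
c = [0.5 + 0.5j, 0.3 - 0.2j, 0.1j]   # expansion coefficients c_a

# Double sum: sum_a sum_b c_a c_b* a <b|a>, with <b|a> = delta_ab
double_sum = sum(
    c[a] * c[b].conjugate() * eigvals[a] * (1.0 if a == b else 0.0)
    for a in range(3) for b in range(3)
)

# Single sum: sum_a c_a c_a* a
single_sum = sum(c[a] * c[a].conjugate() * eigvals[a] for a in range(3))

print(abs(double_sum - single_sum) < 1e-12)  # True in this special case
```

Note that the agreement here relies entirely on the Kronecker delta killing every $a \neq b$ term; the replies below explain why this does not hold in general.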

2. Sep 5, 2014

### Quantum Braket

Be careful of your complex conjugates when creating a dual vector space and operating on it.
As for subscripts,

This would not be correct. The components of the dual of the state (the bra) are in general not the same as those of the ket; they are the complex conjugates. Thus you can say that the ket is a linear combination $|ψ>=\sum_{a}c_a|a>$
and that the bra is a linear combination $<ψ|=\sum_{b}c^*_b<b|$, where each $c_b$ is a complex number.
Or, you can do as the book shows below:

This is correct.

What text are you using? I would work on expanding states and creating a dual space, and do so with some concrete states (column vectors), to get a feel for how this works before moving to the basis-independent Dirac formalism. Convince yourself of some of these properties.
For a book, I recommend Shankar and/or Zettili to start with.

Last edited: Sep 5, 2014
3. Sep 5, 2014

### vela

Staff Emeritus
Let's just look at simple algebra. Let
$$y = x_1 + x_2 = \sum_{i=1}^2 x_i.$$ Then $y^2 = x_1^2 + 2x_1x_2+x_2^2$. Now if you use the same index variable, you'd get
$$\sum_{i=1}^2 x_i x_i = x_1^2+x_2^2.$$ Clearly, that's wrong. On the other hand,
$$\sum_{i=1}^2 \sum_{j=1}^2 x_i x_j = x_1^2+x_1x_2 + x_2x_1 + x_2^2$$ yields the correct result.
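The same algebra can be checked numerically (a small sketch with made-up values $x_1 = 3$, $x_2 = 5$):

```python
# y^2 = (x_1 + x_2)^2 must include the cross terms; reusing one index drops them.
x = [3.0, 5.0]
y = sum(x)                                     # y = x_1 + x_2

same_index = sum(xi * xi for xi in x)          # sum_i x_i x_i  -> misses cross terms
double = sum(xi * xj for xi in x for xj in x)  # sum_i sum_j x_i x_j

print(y**2, same_index, double)  # 64.0 34.0 64.0
```

Only the double sum reproduces $y^2$; the single-index version is missing the $2x_1x_2$ cross term.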

4. Sep 6, 2014

### Luca_Mantani

The two formulas are equal if the basis vectors are orthonormal,
$$<a|b>=\delta_{ab}.$$
Then the sum over the index $b$ gives nonzero terms only when $a=b$.
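The collapse can be sketched in a few lines (a hypothetical check; `f` stands in for whatever factor depends on $b$): for fixed $a$, $\sum_b f(b)\,\delta_{ab} = f(a)$.

```python
# Kronecker delta picks out the single b = a term of the sum.
def delta(a, b):
    return 1 if a == b else 0

f = [10, 20, 30]   # arbitrary values f(b)
a = 1
collapsed = sum(f[b] * delta(a, b) for b in range(3))
print(collapsed == f[a])  # True
```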

5. Sep 6, 2014

### K space

I'm a bit confused now that I see different opinions about whether the expressions are equal. More simply: if I expand $|ψ>=\sum_{a}c_a|a>$, does that always imply that I'm allowed to expand the corresponding bra as $<ψ|=\sum_{a}c_a^*<a|$ (with summation over the same a) for use in an expression containing both $|ψ>$ and $<ψ|$?

6. Sep 6, 2014

### Quantum Braket

Your last statement is correct. But play around with creating the dual using a summation over b, and then use orthonormality to see what happens.
The difficulty you may be running into is index freedom. Is the index free, or is it a dummy index being summed over?

7. Sep 6, 2014

### vela

Staff Emeritus
No, you can't and shouldn't. There's a reason why the book uses two different indices.

You're ending up with the same expression by accident. It only "works" because you're assuming an orthogonal basis consisting of eigenkets of $\hat{X}$. You're expanding $\lvert \psi \rangle$ in a very specific basis. Use any other basis, and you won't get the correct result. If you wanted to calculate $\langle \psi \lvert \hat{Y} \rvert \psi \rangle$, where $\hat{Y}$ is some other operator, your approach wouldn't generally work either.
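This failure is easy to see numerically (a hypothetical example, not from the thread): take an orthonormal basis, but an operator $\hat Y$ whose matrix is *not* diagonal in that basis, and compare the true double sum $\sum_a\sum_b c_a c_b^* <b|\hat Y|a>$ with the single-sum shortcut.

```python
# Even with an orthonormal basis, dropping the cross terms fails once the
# operator is off-diagonal in that basis. Values are arbitrary.
Y = [[0.0, 1.0], [1.0, 0.0]]   # matrix elements <b|Y|a>, off-diagonal
c = [0.6, 0.8]                 # expansion coefficients (real, for simplicity)

true_value = sum(c[a] * c[b] * Y[b][a] for a in range(2) for b in range(2))
shortcut   = sum(c[a] * c[a] * Y[a][a] for a in range(2))

print(true_value, shortcut)  # 0.96 0.0 -- the single-sum shortcut is wrong here
```

The entire expectation value lives in the cross terms, which the single sum throws away.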

8. Sep 6, 2014

### Quantum Braket

In standard nomenclature, and as is done in introductory books like Shankar (2nd edition, p. 14) or Griffiths, he can construct his dual this way. This is possible because the ket is constructed from an orthogonal basis (note that it does not need to be a basis of eigenkets of some operator). Granted, in general different indices should be used, but most texts sidestep this issue at the beginning for the sake of clarity and then explain the exceptions.

Very basically, and sufficient for the OP at this stage of learning to manipulate states in Dirac notation: if you construct (or expand) a ket in an orthogonal basis that spans the space, you can create the dual (bra) by taking the adjoint of the ket, assuming a Hilbert space with a countable basis.
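In the column-vector picture, "taking the adjoint" is just conjugate transposition (a minimal sketch with made-up components):

```python
# The bra is the conjugate transpose of the ket: column -> row,
# with each component complex-conjugated. Components are arbitrary.
ket = [1 + 2j, 3 - 1j]                          # |psi> as a column vector
bra = [z.conjugate() for z in ket]              # <psi| as the conjugated row

norm_sq = sum(b * k for b, k in zip(bra, ket))  # <psi|psi>
print(norm_sq)  # (15+0j): real and non-negative, as a norm must be
```

Pairing the ket with its own conjugate (rather than with itself) is what guarantees $<ψ|ψ>$ comes out real and non-negative.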

9. Sep 6, 2014

### Fredrik

Staff Emeritus
An index that's being summed over is always a dummy variable, i.e. it can be replaced by any other symbol without changing the meaning of the statement. That's why it's always correct to use different indices with different summations. If you use the same symbol in both places, you risk confusing yourself.

Example: Suppose that you want to sum all the components of a matrix M. You can sum the rows first, and then sum the results like this: $\sum_j\big(\sum_i M_{ij}\big)$. You can rewrite this as $\sum_i\sum_j M_{ij}$, and you can replace j by any other symbol, but if you choose to replace it by i, you end up with the extremely confusing $\sum_i\sum_i M_{ii}$. You made things even worse by also dropping one of the summation sigmas, ending up with $\sum_i M_{ii}$, which is only the sum of the diagonal components of the matrix.
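The matrix example above in numbers (a small sketch with an arbitrary 2x2 matrix):

```python
# Two independent indices sum every entry of M; reusing one index
# collapses the sum to the diagonal only.
M = [[1, 2],
     [3, 4]]

all_entries = sum(M[i][j] for i in range(2) for j in range(2))  # sum_i sum_j M_ij
diagonal    = sum(M[i][i] for i in range(2))                    # sum_i M_ii

print(all_entries, diagonal)  # 10 5
```

The full sum is 10, while the accidental "trace" is only 5: exactly the kind of terms the OP's single-index expansion silently discards.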
