# Bras and Kets and Tensors

1. Jun 22, 2008

### Phrak

Is there a way bras and kets can be understood in terms of vectors and tensors and coordinate bases?

I'm fairly sure that if a ket is thought of as a vector with an upper index, then its bra is a vector with a lower index, but getting the rest of it all to look like tensors is rather mysterious.

2. Jun 23, 2008

### strangerep

Kets (states) are just vectors in the Hilbert space. And after all, a Hilbert space is just a
vector space equipped with a Hermitian inner product (and some extra arcane stuff
about "completion" in the infinite-dimensional case).

For a finite-dimensional Hilbert space H, it's all rather easy. The bras actually live in the "dual"
space $H^*$ (i.e., the space of linear functionals over H, meaning the space of
linear mappings from H to the complex numbers). That's really what this upper/lower
index business is all about. For finite-dimensional spaces, the dual $H^*$ is actually
isomorphic to the primal space H, so people tend to forget about the distinction. But in infinite
dimensions, the primal and dual spaces are no longer isomorphic in general, so there
is no canonical foundation for raising and lowering indices in general. People also tend to be
more pedantic, and talk about "self-adjoint", or "symmetric" operators and such-like. This is
discussed in textbooks on "Functional Analysis".

3. Jun 24, 2008

### pmb_phy

Yes. Absolutely.

Kets can be thought of as elements of a vector space. However, I'm not sure if the term "coordinate basis" can be applied here. There is definitely a basis, though. The basis "vectors" are eigenvectors of Hermitian operators. Tensor products can be defined in terms of the tensor products of the "vectors" (kets).
That is correct.

Pete

4. Jun 24, 2008

### Phrak

Thank you, strangerep. I don't understand all this language, I'm afraid; I'm just now learning bras and kets. But could you tell me how the Hermitian inner product is accomplished in finite dimensions? With the real-entried vectors I'm familiar with, the inner product is accomplished with the metric tensor. Is there a similar operator in Hilbert space?

5. Jun 24, 2008

### Phrak

I take it that kets are always unit vectors. So that if |r> is a ket, b is a complex number and |s> = |b r>, then |s> is not properly a ket?

6. Jun 25, 2008

### strangerep

Imagine two vectors u,v with complex entries. Complex-conjugate the entries of one
vector, and then form the familiar inner product, e.g., $u^*_1v_1 + u^*_2v_2 + ...$.
Mostly, you don't need to worry about the metric here because it's just $\delta_{ij}$.
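A minimal numpy sketch of that inner product (the example vectors are just illustrative): conjugate the entries of the first vector and sum the products. `np.vdot` does exactly this, conjugating its first argument.

```python
import numpy as np

# Two vectors with complex entries (arbitrary example values)
u = np.array([1 + 2j, 3 - 1j])
v = np.array([2 - 1j, 1j])

# Hermitian inner product: conjugate the first vector, then sum u*_i v_i
ip = np.sum(np.conj(u) * v)

# np.vdot computes the same thing (it conjugates its first argument)
assert np.isclose(ip, np.vdot(u, v))
```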

7. Jun 25, 2008

### pmb_phy

You can always normalize them to unity, so a norm other than 1 doesn't imply that it isn't a ket.
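A quick sketch of that normalization step in numpy (example values, not anything canonical):

```python
import numpy as np

# An unnormalized ket as a complex vector (example values)
ket = np.array([3 + 4j, 0j])

# Its norm comes from the Hermitian inner product <ket|ket>
norm = np.sqrt(np.vdot(ket, ket).real)  # sqrt(|3+4j|^2) = 5.0

normalized = ket / norm
assert np.isclose(np.vdot(normalized, normalized).real, 1.0)
```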

By the way, the way to write |s> is |s> = b|r>. Did you see it written some other way? I have a vague recollection of such a notation but I can't recall where. Thanks.

Pete

8. Jun 25, 2008

### mrandersdk

> I'm fairly sure that if a ket is thought of as a vector with an upper index, then its bra is a vector with a lower index, but getting the rest of it all to look like tensors is rather mysterious.

is correct, but could you elaborate on what you mean by:

> but getting the rest of it all to look like tensors is rather mysterious.

9. Jun 25, 2008

### Fredrik

Staff Emeritus
In that case you may find this recent thread about bras and kets useful. Read #17 first, because it explains a silly mistake I made in #2.

The question is a little bit strange. An inner product on a real vector space V is just a function from $V\times V$ into $\mathbb R$ that satisfies a few conditions. It doesn't need to be "accomplished" by something else. You're probably thinking about how you can use matrix multiplication to define an inner product when the vector space is $\mathbb R^n$: $\left<x,y\right>=x^TSy$ where S is a symmetric matrix. But even in this case, the inner product isn't defined using the metric tensor. It is the metric tensor. S is just a matrix. The inner product/metric tensor is the map $(x,y)\mapsto \left<x,y\right>.$
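Fredrik's $\left<x,y\right>=x^TSy$ construction can be sketched in numpy with an arbitrary symmetric positive-definite S (the matrix below is just an example):

```python
import numpy as np

# A symmetric positive-definite matrix S defines an inner product on R^2
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])

def inner(x, y):
    # <x, y> = x^T S y; the inner product/metric tensor is the map
    # (x, y) -> <x, y>, while S is just its matrix in this basis
    return x @ S @ y

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])

assert np.isclose(inner(x, y), inner(y, x))   # symmetry
assert inner(x, x) > 0 and inner(y, y) > 0    # positive-definiteness
```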

10. Jun 28, 2008

### Phrak

That's a point. Without a coordinate system, it makes no sense to talk about a coordinate basis. Knowing that you're using coordinate bases, or vielbeins, would be important when applying operators like
$$\frac{ \partial x^i }{\partial x^{j'}}$$

Can tensors be defined as objects that transform as products of bras and kets?

My mistake. Apparently, the proper way to write bras and kets is to place only labels within the delimiters, as I've recently learned.

Last edited: Jun 28, 2008
11. Jun 28, 2008

### comote

In finite dimensional space I think of it this way.

I have a vector $$x$$ in Hilbert space. The Ket $$|x\rangle$$ is the representation as a column vector and the bra $$\langle x |$$ is the representation
as a row vector. Now $$\langle x|y\rangle$$ will give you a scalar and $$|x\rangle\langle y|$$ will give you a matrix.

I know that it is not exactly the same way it is normally used but this retains the spirit of the notation while, at least for me, getting rid of some of the confusion.
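comote's picture, sketched in numpy with illustrative vectors: the bra is the conjugate-transpose row vector, $\langle x|y\rangle$ is a scalar, and $|x\rangle\langle y|$ is a matrix.

```python
import numpy as np

# Kets as column vectors (example entries)
x = np.array([[1 + 1j], [2j]])   # |x>, shape (2, 1)
y = np.array([[1.0], [1.0]])     # |y>, shape (2, 1)

bra_x = x.conj().T               # <x|: the conjugate-transpose row vector

scalar = (bra_x @ y)[0, 0]       # <x|y> is a number
matrix = x @ y.conj().T          # |x><y| is a 2x2 matrix

assert matrix.shape == (2, 2)
```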

Last edited: Jun 28, 2008
12. Jun 28, 2008

### mrandersdk

I don't think that is a completely good way to look at it, because the ket is more general: it doesn't assume a particular basis. Having chosen a basis, though, you can use that view in the finite case.

But actually the column vector is more like a wavefunction (in a discrete finite version).

13. Jun 28, 2008

### comote

You have a point, usually though if what I am thinking about is independent of basis I shy away from Bra-Ket notation.

14. Jun 28, 2008

### mrandersdk

ok, but the point of a state ket is actually to make the state independent of a basis. You can then write it in a basis, e.g. the position basis, like this:

$$|\Psi> = \hat{I}|\Psi> = \int |x><x|\Psi> dx = \int <x|\Psi> |x> dx$$

$$\Psi(x) = <x|\Psi>$$ is what is called the wave function. So when you write a column vector, you are only writing $$\Psi(x) = <x|\Psi>$$, because the basis $$|x>$$ is assumed. So I think that $$\Psi(x) = <x|\Psi>$$ is actually more like a column vector; this one, though, has an infinite index, namely x.

In the finite discrete case we often use some $$c_i$$, and then write them in a column, but they are exactly the same as $$\Psi(x)$$.

So the point is that the ket notation is there to avoid referring to a basis before necessary (when you need to do explicit calculations it is often more convenient to choose some basis, e.g. the position, momentum or energy bases).
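The discrete finite analogue of that expansion can be sketched in numpy: pick an orthonormal basis, compute the coefficients $c_i = <i|\Psi>$, and reassemble the ket from them (the state vector here is an arbitrary example).

```python
import numpy as np

# An arbitrary state in a 3-dimensional Hilbert space
psi = np.array([1 + 1j, 2.0, -1j])

# An orthonormal basis: the rows of the identity matrix
basis = np.eye(3, dtype=complex)

# Coefficients c_i = <i|psi>: the discrete analogue of the wave function
c = np.array([np.vdot(e, psi) for e in basis])

# Resolution of identity: |psi> = sum_i c_i |i>
reconstructed = sum(ci * e for ci, e in zip(c, basis))
assert np.allclose(reconstructed, psi)
```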

15. Jun 28, 2008

### mrandersdk

not sure what your point is with that equality. And I guess you mean

$$\psi(x) = \int \delta(x-y)\psi(y)dy$$

16. Jun 28, 2008

### comote

Or

$$\psi(x) = \int\delta_x(y)\psi(y)dy = \langle\delta_x,\psi\rangle$$

We are both getting to the same point though. We have to remember what space we are working on. I would rather not even think about using Bra-Ket notation except when using proper vectors.

Yeah, I messed it up, that's why I deleted it and put this one up.

17. Jun 28, 2008

### mrandersdk

$$\psi(x) = \int\delta_x(y)\psi(y)dy = \langle\delta_x,\psi\rangle$$

is nothing like

$$|\Psi> = \hat{I}|\Psi> = \int |x><x|\Psi> dx = \int <x|\Psi>|x> dx$$

This is using a basis. You use a delta function to write something that doesn't say anything.

18. Jun 28, 2008

### comote

$$\psi$$ represents a vector in Hilbert space, just like $$|\psi\rangle$$
and $$\psi(x)$$ represents a number, just like
$$\langle x|\psi\rangle$$

I meant its value at a point in Euclidean space before; sorry for the confusion.

19. Jun 28, 2008

### Phrak

But this is the interesting part. In transforming a vector ket to a dual vector bra, as if you were lowering an index, the complex conjugate is taken of the vector coefficients. This can be accomplished tensorially if complex numbers are represented as column vectors,
$$c = \left( \begin{array}{c} a \\ b \end{array} \right)$$
The conjugate of c is taken with
$$\rho = \left( \begin{array}{cc} 1 & 0 \\ 0 & -1 \end{array} \right)$$
so that
$$c_{\beta} = \rho_{\beta\alpha} c^\alpha$$

In three dimensions, the lowering metric would be block diagonal,
$$g_{ij} = \left( \begin{array}{ccc} \rho & 0 & 0 \\ 0 & \rho & 0 \\ 0 & 0 & \rho \end{array} \right)$$

The idea is that whenever an index is lowered or raised, the complex conjugate is applied. But does this scheme hang together for, say, a unitary operator, where a unitary operator acting on a ket is a type (1,1) tensor (one upper and one lower index) that takes a ket and returns a ket? I'm not so sure it hangs together consistently.

20. Jun 28, 2008

### mrandersdk

still think there is a big difference. Are you thinking of the functions as something in, e.g., $$L^2(R^3)$$, which you can then make into a number by taking the inner product with the delta function? Because if you do, then the Hilbert space $$L^2(R^3)$$ is not general enough. The Hilbert space is bigger, and when you choose a basis you get a wave function in, e.g., $$L^2(R^3)$$ if you choose the position basis. But choosing different bases, you will get other "wave functions". Maybe it is because you are only used to working in the position basis, and have not yet seen bra-ket notation used in its full generality?