# The basic math of quantum mechanics

In summary: this thread is about the mathematics of quantum mechanics and the bra-ket notation used to write it down. One of the nice properties of bra-ket notation is that expanding a vector in an orthonormal basis, or reading off the matrix elements of a linear operator, amounts to simply writing out the brackets. The notation is not easy to understand at first, and that is the main obstacle when trying to learn it.
kindlin

I would love to read this paper, however I'm stumped in the very first equation.

$|\psi \rangle =\sum_k|a_k\rangle\langle a_k|\psi\rangle$

I actually have some basis in the ideas behind QM, and a general love of math, but I just have no idea how to approach it mathematically. I have tried to look up Dirac notation before and got some answers, but they were just as hard to understand as my original question. This seems to be my main problem with even beginning to teach myself some semblance of QM: all the explanations I find for my questions are more confusing than the original question!

I wouldn't be against buying a book, but I suspect that any book I buy will have equations similar to this and I will be in the exact same predicament, but \$100 poorer. I don't think this math is hard, I just don't know what it is. It's like trying to teach yourself Einstein notation with no idea what a summation over i is; it just wouldn't work. If I could get some basic understanding of the notation used throughout the linked paper, I think I would be well on my way to understanding the real quantities of QM, and not just the 'kind of' answers I am forced to settle for because of my lack of mathematical understanding.

A little about my background: I just graduated with my master's in civil-structural engineering, so I have some basic understanding of tensors and higher-level analysis, but I have never needed to be exposed to the higher mathematics that are so crucial for things like QM and GR (and, eventually, string theory or LQG). I hope to pursue a degree in theoretical physics as my career progresses, but in the meantime I would love to be able to pursue these topics without being completely overwhelmed by simple (or maybe not so simple?) notation.

kindlin said:
I'm stumped in the very first equation.

$|\psi \rangle =\sum_k|a_k\rangle\langle a_k|\psi\rangle$

I wouldn't be against buying a book, but I suspect that any book I buy will have equations similar to this and I will be in the exact same predicament, but \$100 poorer.

Setting the budget at less than \$100, where is the first place you get stumped in the Wikipedia article on the notation? http://en.wikipedia.org/wiki/Bra–ket_notation

jedishrfu
Find the V.Subrmanium IIT Madras lectures on YouTube. I have a similar interest and found these roughly 30 lectures very good. So far I have studied only the first five, which cover the mathematical background.

If you know linear algebra, bra-ket notation is just a slightly esoteric way of writing out vectors and linear operators. ##|v\rangle## is just a vector, e.g. ##\vec{v}##. ##\langle u|## is the linear functional dual to the vector ##\vec{u}## via an inner product defined on the vector space. In other words, ##\langle u | v \rangle##, which we interpret as the functional ##\langle u |## applied to the vector ##|v\rangle##, is just another way of writing ##\vec{u} \cdot \vec{v}##.

In fact, that's where the notation comes from in the first place. One sometimes writes a general inner product as ##(\vec{u},\vec{v})## or ##\langle\vec{u},\vec{v}\rangle##, so Dirac looked at that and said, "Well, we've got a bracket of two vectors. Let's use the right side of that as the notation for vectors themselves and call them 'kets', and the left side as the notation for dual vectors and call them (hehe) 'bras'. Then I can make an inner-product bracket just by multiplying a bra and a ket. Get it guys? Bra-ket? Bracket? Guys?"

Pretty much everything else follows from this. If you've got a linear operator ##\hat{O}## on your vector space, then ##\langle u | \hat{O} | v \rangle## is ##\vec{u} \cdot (\hat{O} \vec{v})##. If I've got an orthonormal basis of vectors (which we use almost exclusively in quantum mechanics) where I label the i-th vector ##|a_i\rangle##, then ##\langle a_i | \hat{O} | a_j \rangle## is the ij-th element of the matrix representing ##\hat{O}## with respect to this basis, ##O_{ij}##. Like I said, exactly the same as standard linear algebra. The expression you're having trouble with is the analogous result for vectors themselves: the i-th component of the coordinate vector of some vector ##\vec{u}## in this basis is just ##\vec{a}_i \cdot \vec{u} = \langle a_i | u \rangle##.

It seems a bit silly at first, but once you get used to it bra-ket notation has very nice properties. You can only put vectors together in so many ways in linear algebra (inner products, tensor products, and linear combinations basically exhaust the list) and when you use bra-ket notation you can usually see at a glance if you've done something that doesn't make sense.
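The matrix-element claim above is easy to check numerically. Here is a minimal pure-Python sketch; the 2x2 operator O and the standard basis are made-up illustrative values, not anything from the thread:

```python
# Check that <a_i| O |a_j> = a_i . (O a_j) recovers the ij-th
# matrix entry of O when {a_i} is an orthonormal basis.

def dot(u, v):
    # Inner product of two real vectors
    return sum(x * y for x, y in zip(u, v))

def apply_op(O, v):
    # Apply a matrix (stored as a list of rows) to a column vector
    return [dot(row, v) for row in O]

O = [[2.0, 1.0],
     [1.0, 3.0]]                      # made-up operator
basis = [[1.0, 0.0], [0.0, 1.0]]      # orthonormal basis |a_1>, |a_2>

# O_ij = <a_i| O |a_j>
O_ij = [[dot(a_i, apply_op(O, a_j)) for a_j in basis] for a_i in basis]
print(O_ij)  # [[2.0, 1.0], [1.0, 3.0]] -- recovers O itself
```

With the standard basis this trivially reads off the entries of O, but the same two lines work for any orthonormal basis.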

I'll give a concrete example. It might make things clearer.

If you have a two-dimensional vector space, you can pick a coordinate system and define two basis vectors a1 = (1,0) and a2 = (0,1), from which all other vectors can be built. (For the dot-product trick below to work, the basis vectors need to be orthonormal.) So choose a vector like ψ = (c,d):

(c,d) = (1,0) c + (0,1) d

but

c = (1,0)·(c,d)
d = (0,1)·(c,d)

so

(c,d) = (1,0) [(1,0)·(c,d)] + (0,1) [(0,1)·(c,d)]

or

ψ = ∑ₖ aₖ (aₖ·ψ)

which is basically the equation you were after. All it says is that any vector can be built out of basis vectors.
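The arithmetic above can be run directly. A tiny sketch, using made-up components c = 3, d = 4:

```python
# Rebuild psi = (c, d) from its components along the basis vectors,
# mirroring psi = sum_k a_k (a_k . psi). Values are illustrative.

def dot(u, v):
    # Inner product of two real vectors
    return sum(x * y for x, y in zip(u, v))

a1, a2 = (1.0, 0.0), (0.0, 1.0)   # orthonormal basis
psi = (3.0, 4.0)                  # the vector (c, d)

c = dot(a1, psi)  # component along a1
d = dot(a2, psi)  # component along a2

rebuilt = tuple(c * x + d * y for x, y in zip(a1, a2))
print(rebuilt)  # (3.0, 4.0)
```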

dextercioby
kindlin said:
I would love to read this paper, however I'm stumped in the very first equation.

$|\psi \rangle =\sum_k|a_k\rangle\langle a_k|\psi\rangle$
The Riesz representation theorem says that if ##\mathcal H## is a Hilbert space over ##\mathbb C##, then for each bounded linear functional ##\phi:\mathcal H\to\mathbb C##, there's a unique ##x\in\mathcal H## such that ##\phi=\langle x,\cdot\rangle##, i.e. a unique ##x\in\mathcal H## such that ##\phi(y)=\langle x,y\rangle## for all ##y\in\mathcal H##.

Bra-ket notation is essentially just the convention to denote a typical vector by ##|x\rangle## instead of ##x##, and the corresponding linear functional by ##\langle x|## instead of by ##\langle x,\cdot\rangle##. (As already indicated above, this denotes the map that takes an arbitrary ##y\in\mathcal H## to ##\langle x,y\rangle##).

So bras are linear functionals that take vectors (that we now call kets) to complex numbers. The translation from bra-ket notation to standard notation is pretty simple: ##\langle x|y\rangle## is defined as ##\langle x||y\rangle## (this is the value of ##\langle x|## at ##|y\rangle##, i.e. the output produced by ##\langle x|## when it takes ##|y\rangle## as input), so we have
$$\langle x|y\rangle = \langle x||y\rangle =\langle x,\cdot\rangle(y)=\langle x,y\rangle.$$ The formula you asked about is just a vector expressed using an orthonormal basis. Suppose that ##x\in\mathcal H## and that ##\{e_i\}_{i=1}^\infty## is an orthonormal basis for ##\mathcal H##. Then there's a unique sequence ##(c_i)_{i=1}^\infty## in ##\mathbb C## such that ##x=\sum_{i=1}^\infty c_i e_i##. It's very easy to find a formula for those numbers. For all positive integers ##i##, we have
$$\langle e_i,x\rangle =\left\langle e_i,\sum_{j=1}^\infty c_j e_j\right\rangle =\sum_{j=1}^\infty c_j \langle e_i,e_j\rangle = c_i.$$ So we can always write
$$x=\sum_{i=1}^\infty \langle e_i,x\rangle e_i.$$ If we use bra-ket notation as discussed above, and also the convention that it's OK to write the product of a number c and a vector x as xc rather than cx, then we can rewrite this as
$$|x\rangle =\sum_{i=1}^\infty |e_i\rangle\langle e_i|x\rangle.$$
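That expansion formula can be checked numerically in a finite-dimensional case. A small sketch using plain Python complex numbers, with an inner product that is conjugate-linear in the first slot to match the convention above; the vector x and the basis of C^2 are made-up values:

```python
def inner(x, y):
    # Inner product <x, y>, conjugate-linear in the first argument
    return sum(a.conjugate() * b for a, b in zip(x, y))

e = [(1+0j, 0j), (0j, 1+0j)]  # orthonormal basis {e_i} of C^2
x = (1+2j, 3-1j)              # made-up vector

# x = sum_i <e_i, x> e_i
coeffs = [inner(ei, x) for ei in e]
rebuilt = tuple(sum(c * ei[k] for c, ei in zip(coeffs, e))
                for k in range(2))

assert rebuilt == x  # the expansion reproduces x exactly
```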

kindlin said:
I'm stumped in the very first equation.

$|\psi \rangle =\sum_k|a_k\rangle\langle a_k|\psi\rangle$

Symbols bracketed like $|X\rangle$ are called kets and represent a column vector (possibly a fixed linear combination of basis states). So, for example, $|X\rangle$ might be shorthand for $\left[ \begin{array}{c} 1 \\ 0 \\0 \end{array} \right]$ or for $\left[ \begin{array}{c} 0 \\ 1 \\0 \end{array} \right]$ or for $\left[ \begin{array}{c} 0 \\ 0 \\ -1 \end{array} \right]$ or for $\left[ \begin{array}{c} \sqrt{1/2} \\ i \sqrt{1/2} \\0 \end{array} \right]$.

Symbols bracketed like $\langle X |$ are called bras and represent a row vector. For example, $\langle X |$ could be $\left[ \begin{array}{cccc} 0 & 0 & 0 & 1 \end{array} \right]$ or $\left[ \begin{array}{cccc} \sqrt{1/2} & i \sqrt{1/2} & 0 & 0 \end{array} \right]$.

Placing a bra next to a ket (resulting in a "bracket", har har) means "multiply these two things as matrices". So $|X\rangle\langle Y|$ means multiply a column vector pointing along X by a row vector pointing along Y. Note that the order matters quite a bit here: row-times-column (bra-ket) gives you a single complex number as the result, while column-times-row (ket-bra) gives you an NxN matrix as the result. When you multiply a ket $| X \rangle$ with its corresponding bra $\langle X |$, the 1x1 ordering gives you the inner product and the NxN ordering gives you the outer product.

So what $|\psi \rangle =\sum_k|a_k\rangle\langle a_k|\psi\rangle$ is saying is something like "you can break any wavefunction into the pieces $|a_k\rangle\langle a_k|\psi\rangle$, its projections onto each $a_k$". Alternatively, you can think of it as "if you keep the stuff along $a_k$ for each k, you've kept everything (without repetition)". In other words, the $a_k$ form a basis for the state space.
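The row-times-column versus column-times-row distinction is concrete enough to compute by hand. A short sketch with made-up two-component vectors X and Y:

```python
def bra_ket(x, y):
    # Row-times-column <x|y>: a single complex number
    return sum(a.conjugate() * b for a, b in zip(x, y))

def ket_bra(x, y):
    # Column-times-row |x><y|: an NxN matrix
    return [[a * b.conjugate() for b in y] for a in x]

X = (1+0j, 0j)
Y = (0j, 1+0j)

s = bra_ket(X, X)   # inner product of X with itself: a scalar
M = ket_bra(X, Y)   # outer product: a 2x2 matrix

assert s == 1                          # X is normalized
assert M == [[0j, 1+0j], [0j, 0j]]    # single 1 in the top-right corner
```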

Although it would not be my first port of call (Susskind's books and lectures are much better for getting started), after you get a smattering I would go through Chapter 2 of Ballentine:
https://www.amazon.com/dp/9814578584/?tag=pfamazon01-20

Things should be a lot clearer after that.

It's a bit more advanced, but continue on to Chapter 3 to understand the true basis of Schrödinger's equation etc.

Thanks
Bill


## 1. What is quantum mechanics?

Quantum mechanics is a branch of physics that studies the behavior of particles at the subatomic level. It describes how particles such as electrons and photons behave and interact with each other.

## 2. What is the basic math behind quantum mechanics?

The basic math of quantum mechanics is based on complex numbers and linear algebra. It involves concepts such as wave functions, operators, and eigenvalues, which are used to describe the probabilities of a particle's behavior and properties.

## 3. How does quantum mechanics differ from classical mechanics?

Quantum mechanics differs from classical mechanics in that it takes into account the probabilistic nature of particles at the subatomic level, whereas classical mechanics assumes determinism and predictability.

## 4. What are the key principles of quantum mechanics?

The key principles of quantum mechanics include the superposition principle, which states that particles can exist in multiple states at the same time, and the uncertainty principle, which states that it is impossible to know both the position and momentum of a particle with absolute certainty.

## 5. What are some real-world applications of quantum mechanics?

Quantum mechanics has many practical applications, including in the development of technologies such as transistors, lasers, and magnetic resonance imaging (MRI) machines. It also plays a crucial role in fields such as cryptography and quantum computing.
