# Quantization and heat vs work

1. Nov 10, 2007

### cp7970

I understand that adding heat to an atom will cause the electrons to populate higher energy levels... but apparently doing work will cause the energy levels themselves to change (increase, I guess?). Is this true, and if so, why?!

Thanks!!

Last edited: Nov 10, 2007
2. Nov 10, 2007

### Count Iblis

Indeed. The energy levels of a system depend on external parameters like the volume. Suppose the system is in some energy level E. But E depends on the volume, so let's write this as E(V). What will happen if we change the volume from V1 to V2? In the limit of an infinitely slow change from V1 to V2, the wavefunction of the system will just "follow the change" and you'll end up at energy level E(V2). This is the so-called "Adiabatic Theorem" of quantum mechanics.
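A minimal sketch of the "levels depend on an external parameter" point (my own illustration, not from the thread), using the textbook particle in a 1-D box. Here the box width L plays the role of the external parameter V: moving a wall (doing work) shifts every level E_n.

```python
# Energy levels of a particle in a 1-D box of width L:
#   E_n = n^2 * pi^2 * hbar^2 / (2 m L^2)
# Changing L (doing work on the walls) rescales the whole ladder of levels.
import math

HBAR = 1.0545718e-34  # J*s
M_E = 9.1093837e-31   # kg (electron mass, an arbitrary choice for the demo)


def box_level(n: int, L: float, m: float = M_E) -> float:
    """Energy in joules of level n for a particle of mass m in a box of width L."""
    return (n * math.pi * HBAR) ** 2 / (2.0 * m * L ** 2)


L1, L2 = 1e-9, 2e-9  # widths in metres, before and after the (slow) expansion
for n in (1, 2, 3):
    print(n, box_level(n, L1), box_level(n, L2))

# Doubling the width lowers every level by exactly a factor of 4,
# so a system that adiabatically "follows" its level ends up at E_n(L2).
```

In the adiabatic limit the quantum number n is unchanged while E_n(L) slides with L, which is exactly the "wavefunction follows the change" statement above.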

In the limit of an infinitely fast change from V1 to V2, the system stays in the same state (the so-called Sudden Approximation). However, that state is then no longer an energy eigenstate. What then happens is that if you measure the energy of the system, the system will collapse to some energy level with some probability. These probabilities can be computed using the formalism of quantum mechanics. So, you'll get a probability distribution over the possible energies. The expectation value of the energy will stay the same, i.e. it will be E(V1).
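A numerical sketch of the Sudden Approximation (my own example, in units hbar = m = 1): the width of a 1-D box is suddenly doubled while the system sits in the old ground state. Expanding that unchanged state in the eigenstates of the new box gives the probability distribution over the new levels, and the energy expectation value indeed stays at E_1 of the old box.

```python
# Sudden doubling of a 1-D box, ground state left "frozen" by the fast change.
# We expand the old ground state in the eigenbasis of the doubled box and
# check that (a) the |c_n|^2 sum to 1 and (b) sum |c_n|^2 E_n(2L) = E_1(L).
import numpy as np

L = 1.0
x = np.linspace(0.0, 2.0 * L, 40001)
dx = x[1] - x[0]

# Old ground state on [0, L]; zero on the newly opened half of the box.
psi = np.where(x <= L, np.sqrt(2.0 / L) * np.sin(np.pi * x / L), 0.0)
E1_old = np.pi ** 2 / (2.0 * L ** 2)  # E_1 of the original box

total_prob = 0.0
E_expect = 0.0
for n in range(1, 2001):
    # Eigenstates and energies of the doubled box [0, 2L].
    phi_n = np.sqrt(1.0 / L) * np.sin(n * np.pi * x / (2.0 * L))
    c_n = np.sum(psi * phi_n) * dx                # overlap amplitude <phi_n|psi>
    total_prob += c_n ** 2
    E_expect += c_n ** 2 * (n * np.pi) ** 2 / (8.0 * L ** 2)

print(total_prob, E_expect / E1_old)
```

The truncation at 2000 terms and the grid spacing leave a small (well under 1%) deficit; the exact sums are 1 and E_1(L).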

In the first, adiabatic case the system performs work. The entropy stays the same. In the second case of a sudden change, the system does not perform any work. The entropy increases.

Last edited: Nov 10, 2007
3. Nov 10, 2007

### cp7970

So heat is a way of changing a system's energy by means of a temperature change only, with the electrons just moving up to higher energy levels. But the levels themselves depend on external variables (like V, as you say), so as work changes these external variables, it inevitably changes the energy levels.

I think that's an OK way of thinking about things, yes?

Thanks very much!

4. Nov 10, 2007

### Count Iblis

It is better to leave temperature out of this for the moment. You have to take a very fundamental view here. In high school we teach children a bit about temperature, like it being related to energy, etc. However, what you should ask yourself now is:

What is temperature, really? I mean, if everything consists of particles, then given a box containing a number of particles, how do you define temperature?

What are heat and work? If the internal energy of a box containing particles changes, how do we split it up into a part that is heat and a part that is work? Why do we want to do this in the first place?

The answer to these questions is basically that we want to describe a system of a large number of particles statistically. We do not want to deal with the exact state of a system containing zillions of particles. So, we have limited information about the system. Given what we do know, like the internal energy and the volume, there are a huge number of states that the system can be in.

Now the fundamental postulate of statistical physics says that all these possible states are equally likely. Now, if you dump energy into a system, you can lose information. I.e., the number of possible states that the system (and the system the energy is coming from or going to) can be in, given what you knew before, can increase. It can only increase or stay the same, because all states are equally likely and "more states" is a more likely situation than "fewer states".
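A toy illustration of this counting argument (my own hypothetical example, not from the thread): take two small collections of two-level spins, with "energy" equal to the number of up-spins. While the subsystems are isolated, each keeps its own energy and the state count is a product of two binomial coefficients. Letting them exchange energy removes a constraint, so the number of accessible states can only grow.

```python
# Counting microstates of two spin subsystems, before and after they are
# allowed to exchange "energy" (up-spins). Removing the constraint that
# each subsystem keeps its own energy can only increase the state count.
from math import comb

N_A, N_B = 20, 20   # number of spins in subsystems A and B (arbitrary sizes)
n_A, n_B = 5, 15    # up-spins in each subsystem before contact

# Constrained: each subsystem holds on to its own energy.
omega_before = comb(N_A, n_A) * comb(N_B, n_B)

# Unconstrained: only the total energy n_A + n_B is fixed; the subsystems
# may share it in any way (k up-spins in A, the rest in B).
n_total = n_A + n_B
omega_after = sum(
    comb(N_A, k) * comb(N_B, n_total - k)
    for k in range(max(0, n_total - N_B), min(N_A, n_total) + 1)
)

print(omega_before, omega_after)
# The entropy, log(number of states), cannot decrease when the
# constraint is released: omega_after >= omega_before.
```

By the Vandermonde identity the unconstrained count is just comb(N_A + N_B, n_total), i.e. the combined system counted as one.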

Now, when you change the volume, the system stays in the same energy level, which just changes as a function of volume. So, you don't lose any information about the system. You can reverse the change by changing the volume back to what it was.

If you dump heat into the system, you increase the randomness, i.e. the number of states the system can be in. You cannot reverse this to get back to the previous situation.

5. Nov 10, 2007

### cesiumfrog

Is that all? This might be wandering slightly off-topic, but since discovering Bayesian statistics, isn't it important to specify the basis in which the distribution is uniform?

Isn't choosing the "natural" basis as much of a problem for thermodynamics as it is for quantum de-coherence and for MWI?

6. Nov 10, 2007

### Count Iblis

I was oversimplifying a bit. In classical statistical mechanics the state of a system of N particles is described by specifying all the positions and momenta of the N particles, so 6N coordinates in total. In this 6N-dimensional space you have a uniform probability distribution, i.e. the probability is proportional to d^(3N)p d^(3N)x.

In quantum mechanics things are simpler. You fix a small energy interval and assign a uniform probability distribution over the set of all energy eigenstates within that energy interval. This defines the density matrix for the micro-canonical distribution.
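A small numerical sketch of that construction (my own example, with a random Hermitian matrix standing in for a Hamiltonian and an arbitrary choice of energy window): equal weights on every eigenstate inside the window give the microcanonical density matrix.

```python
# Microcanonical density matrix: uniform weights over all energy
# eigenstates inside a small window [E, E + dE].
import numpy as np

rng = np.random.default_rng(0)
H = rng.normal(size=(50, 50))
H = (H + H.T) / 2.0                    # random Hermitian "Hamiltonian"

energies, states = np.linalg.eigh(H)   # eigenvalues and eigenvectors

E, dE = 0.0, 2.0                       # energy window (arbitrary for the demo)
inside = (energies >= E) & (energies < E + dE)
omega = int(inside.sum())              # number of eigenstates in the window

# rho = (1/omega) * sum_{n in window} |n><n|
P = states[:, inside]
rho = P @ P.T / omega

print(omega, np.trace(rho))
# The Boltzmann entropy of this ensemble is log(omega).
```

The trace of rho is 1 and its nonzero eigenvalues are all 1/omega, which is exactly the "all accessible states equally likely" postulate in density-matrix form.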

Basis problem in QM? Perhaps we should focus on the observer. We should accept that, in principle, one needs to rigorously define the state of an observer who has measured the spin of an electron and found it to be "spin up". In practice we are not going to write down the exact states of an observer. But if we do nothing of the sort and then come up with paradoxes, that's not convincing to me.

The important feature of MWI is that time evolution is unitary. But we know that if two states are related by a unitary transformation, they are basically the same. We are just looking at the state from a different perspective. So, the Heisenberg representation may be a more fundamental point of view.

So, in my opinion, the possible "basis states" that an observer can find him/herself in are not really a problem. They just define the observer in different possible states. An entangled Schrödinger-cat state with the radioactive atom in a box is, in my opinion, the same physical state as the cat and the atom that went into the box, because these states are related by a unitary transformation.

You can do the following thought experiment. Instead of a cat, you put a human in a box, and instead of the atom triggering an event leading to the death of the person, you just take a deterministic event. Let's say that nerve gas will be administered and the person dies.

The wave function of the person and the gas in the box undergoes a unitary time evolution. However, in principle, we can still talk to the person. Formally, you can write down observables which allow you to measure the original wavefunction of the person in the box before the gas was administered. So, in principle, you can still discuss things with the person in the state he was in before the gas was administered. As long as the superposition exists, the person is still (in principle) accessible to us.

7. Nov 11, 2007

### cesiumfrog

My argument is that, though I agree the possible states of the system form a space of dimension 6N, these points can be described in terms of completely arbitrary coordinate charts, so you can't impose any "uniform distribution" over this manifold until after you choose a metric.

For example, by making a distribution uniform over velocity, you are implicitly choosing for the distribution to be very non-uniform over velocity-squared (which in particular means that the relation between entropy and temperature is something you assumed without justification right from the start, not a result). Obviously I'm not saying we made the wrong choice, just that the choice should always be explicit and preferably well motivated. It seems not to be.
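This point can be made concrete with a quick sampling experiment (my own sketch, not from the thread): draw samples uniform in v and look at the same samples as v². The change of variables gives p(v²) = p(v)/(2v) = 1/(2 sqrt(v²)), so the distribution piles up near zero rather than staying flat.

```python
# A distribution uniform in v is not uniform in v^2: the choice of the
# variable in which the measure is "flat" is an implicit modelling choice.
import numpy as np

rng = np.random.default_rng(1)
v = rng.uniform(0.0, 1.0, size=1_000_000)  # uniform in v on [0, 1)
w = v ** 2                                 # the same samples, viewed as v^2

# Histogram of w: by the Jacobian, p(w) = 1 / (2 * sqrt(w)),
# so the low-w bins are far more populated than the high-w bins.
hist, _ = np.histogram(w, bins=10, range=(0.0, 1.0))
print(hist)
```

The first bin holds about sqrt(0.1) ≈ 32% of the samples and the last about 1 − sqrt(0.9) ≈ 5%, even though the underlying ensemble was "uniform". Which variable deserves the flat measure is exactly the choice cesiumfrog is asking to be made explicit.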

(My link to QM was referring to fairly philosophical arguments over whether the Born rule can be derived, perhaps using decision theory..)

Last edited: Nov 11, 2007
8. Nov 11, 2007

### Count Iblis

In classical mechanics we have Liouville's theorem and Boltzmann's H-theorem to motivate this. But what we want is that the time average of a macroscopic observable of a system, taken over sufficiently long time intervals, is equal to the ensemble average.

If I remember my old lecture notes correctly, this is not a settled question...