# Questions about quantum mechanics reducing the complexity of classical models

1. May 6, 2012

### IttyBittyBit

The author computes the entropy of the classical simulator using the Shannon entropy and the entropy of the quantum simulator using the von Neumann entropy, gets a smaller number for the latter, and concludes that quantum simulators require less input information.

Firstly, are these two measures directly comparable? For example, the von Neumann entropy of a pure state is 0, even if it's maximally entangled. The corresponding classical entropy would be nonzero (and maximized).

Also, it doesn't seem like the quantum simulator uses any less internal memory. It still seems to require log2(|S|) qubits (where S is the set of causal states), which is the same internal memory as the classical model.

Last edited: May 6, 2012
2. May 7, 2012

### kith

Yes. The von Neumann entropy is the Shannon entropy of the probability distribution given by the weights pi of the kets |ψi> contained in the density matrix, at least when the |ψi> are mutually orthogonal; in general it is the Shannon entropy of the density matrix's eigenvalues.
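As a small numerical check of this relation (function names are mine, a sketch in NumPy): the von Neumann entropy is computed from the eigenvalues of the density matrix, and it coincides with the Shannon entropy of the mixing weights when the mixed kets are orthogonal, but comes out strictly smaller for non-orthogonal kets.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]  # drop zero weights; 0*log(0) -> 0
    return float(-np.sum(p * np.log2(p)))

def von_neumann_entropy(rho):
    """Von Neumann entropy in bits: Shannon entropy of rho's eigenvalues."""
    return shannon_entropy(np.linalg.eigvalsh(rho))

# Orthogonal kets |0>, |1>, each with weight 1/2:
rho_orth = 0.5 * np.outer([1, 0], [1, 0]) + 0.5 * np.outer([0, 1], [0, 1])
print(von_neumann_entropy(rho_orth))  # 1.0, equal to the Shannon entropy of the weights

# Non-orthogonal kets |0>, |+>, same weights 1/2:
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_nonorth = 0.5 * np.outer([1, 0], [1, 0]) + 0.5 * np.outer(plus, plus)
print(von_neumann_entropy(rho_nonorth))  # about 0.60, strictly less than 1.0
```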

What do you mean by "the corresponding classical entropy"?

Btw: I haven't read the paper.

3. May 7, 2012

### IttyBittyBit

Yes, but the point that the paper is trying to make is that to communicate the state of a simulator, you need fewer bits if the simulator is quantum.

But the thing is: it's obvious that the information content of classical and quantum states is not directly comparable. For example, consider the case of 2 bits: {00, 01, 10, 11}. Say we have a classical distribution over these strings with P(00)=P(01)=0 and P(10)=P(11)=1/2. The entropy of this distribution is 1 bit. The corresponding quantum state would be (|10> + |11>)/√2; by 'corresponding' I mean that measuring this quantum state would give the same results as sampling from the classical distribution. However, the entropy of the quantum state is 0, since it is pure.
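The two numbers in this example can be checked directly (a NumPy sketch, with the states written in the computational basis ordered 00, 01, 10, 11): the classical distribution has Shannon entropy 1 bit, while the density matrix of the pure state (|10> + |11>)/√2 has a single unit eigenvalue, so its von Neumann entropy is 0.

```python
import numpy as np

# Classical distribution over the 2-bit strings {00, 01, 10, 11}
p = np.array([0.0, 0.0, 0.5, 0.5])
nz = p[p > 1e-12]
shannon = float(-np.sum(nz * np.log2(nz)))  # 1.0 bit

# "Corresponding" pure state (|10> + |11>)/sqrt(2) in the same basis;
# measuring it gives outcome probabilities |psi|^2 = p.
psi = np.array([0.0, 0.0, 1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi)

# Von Neumann entropy from the eigenvalues of rho: they are {1, 0, 0, 0}
evals = np.linalg.eigvalsh(rho)
evals = evals[evals > 1e-12]
von_neumann = float(-np.sum(evals * np.log2(evals)))  # 0.0, the state is pure
```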

However, this is not the main focus of my question. My main issue is that the internal memory of the simulator - the thing that is ostensibly more important when considering which model is 'simpler' - is the same in both models (actually, it's increased in the quantum case because it has to be reversible).

Last edited: May 7, 2012
4. May 7, 2012

### kith

The connection between the two is decoherence. In order to perform a measurement on your qubits, you have to interact with them. This introduces an environment, which leads to the decay of the coherences of the qubit state and takes the pure superposition state into a mixed one.

The full state (qubits + measurement device) gets entangled but remains pure. Only the qubit state alone is mixed. This is a very remarkable property of QM: the entropy of a subsystem can be greater than the entropy of the whole system.
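This subsystem-versus-whole property can be made concrete with a partial trace (a NumPy sketch; the helper name and the choice of a Bell state to stand in for "qubit + measurement device" are mine): the entangled pure state has zero entropy as a whole, while tracing out the second system leaves the first in a maximally mixed state with 1 bit of entropy.

```python
import numpy as np

def entropy_bits(rho):
    """Von Neumann entropy in bits from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Bell state (|00> + |11>)/sqrt(2): qubit entangled with the "device"
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_full = np.outer(psi, psi)

# Partial trace over the second subsystem: reshape the 4x4 matrix into a
# (2,2,2,2) tensor rho[i,j,k,l] and sum over the second-system indices j=l.
rho_sub = np.trace(rho_full.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(entropy_bits(rho_full))  # 0.0: the whole system is pure
print(entropy_bits(rho_sub))   # 1.0: the subsystem alone is maximally mixed
```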

I don't have time to read the paper and contribute to more specific issues, sorry.

5. May 7, 2012

### IttyBittyBit

Everything you said is correct, but I don't see how it relates to my question.