Cause of quantum effects on small and large scales

  • #1
entropy1
If I understand correctly, quantum effects become very small as the object in consideration becomes larger.

My question: what causes this? For instance, does it have to do with the relations between the quantum objects in the macro object, or does the math show that quantum effects become negligible at greater distances? (The latter would seem strange to me with respect to entanglement, which holds over large distances.)
 
  • #2
entropy1 said:
If I understand correctly, quantum effects become very small as the object in consideration becomes larger.

My question: what causes this? For instance, does it have to do with the relations between the quantum objects in the macro object, or does the math show that quantum effects become negligible at greater distances? (The latter would seem strange to me with respect to entanglement, which holds over large distances.)
With small objects, measuring their properties disturbs those properties significantly (observably). So one could say that if, say, a position could be measured without changing it, the object is classical. The progression from quantum to classical is tracked by a reduction in the disturbance due to measurement, i.e. a weakening of the non-commutativity of the operators. It is possible to bounce a lot of radiation off an aircraft without affecting its momentum, but this is not the case with an atom.
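To put rough numbers on that, here is a back-of-envelope sketch (the 780 nm photon, rubidium-87 atom, and 100-tonne aircraft are my illustrative assumptions, not figures from this thread):

```python
# Back-of-envelope sketch (illustrative numbers, not from the thread):
# the velocity kick a single 780 nm photon gives to a rubidium-87 atom
# versus a 100-tonne aircraft.

h = 6.626e-34                      # Planck's constant, J*s
photon_p = h / 780e-9              # photon momentum p = h / lambda, kg*m/s

m_atom = 87 * 1.661e-27            # Rb-87 mass, kg
m_aircraft = 1.0e5                 # aircraft mass, kg

print(f"atom recoil:     {photon_p / m_atom:.1e} m/s")      # ~6e-3 m/s, measurable
print(f"aircraft recoil: {photon_p / m_aircraft:.1e} m/s")  # ~8e-33 m/s, negligible
# One photon noticeably disturbs the atom's momentum (this recoil is what
# laser cooling exploits), while even vast numbers of photons leave the
# aircraft's momentum effectively unchanged.
```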
 
  • #3
Thanks! How does the non-commutativity of the operators weaken? :wideeyed:
 
  • #4
It's due to the value of Planck's constant. If it happened to have a much bigger value, you and I would perhaps experience quantum effects at human scales.
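A quick sketch of that scaling (the electron speed and baseball figures below are just illustrative choices): the de Broglie wavelength ##\lambda = h/(mv)## is atomic-sized for an electron but absurdly tiny for everyday objects.

```python
# Minimal sketch of how Planck's constant sets the scale: the de Broglie
# wavelength lambda = h / (m * v) for an electron versus a baseball.
# The speeds and masses are illustrative assumptions.

h = 6.626e-34  # Planck's constant, J*s

def de_broglie(mass_kg, speed_m_s):
    """Return the de Broglie wavelength h / (m*v) in metres."""
    return h / (mass_kg * speed_m_s)

print(f"electron (1e6 m/s): {de_broglie(9.11e-31, 1e6):.1e} m")  # ~7e-10 m, atomic scale
print(f"baseball (40 m/s):  {de_broglie(0.145, 40):.1e} m")      # ~1e-34 m, unmeasurably small
# With a much bigger h, the baseball's wavelength would be macroscopic and
# we would see wave effects at human scales.
```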
 
  • #5
entropy1 said:
quantum effects become very small as the object in consideration becomes larger.
Not necessarily. Only those effects that violate classical intuition become very small or very difficult to establish. Things like superconductivity, X-rays, or radioactive decay are large-scale quantum effects that have a large impact on our modern culture.
 
  • #6
entropy1 said:
Thanks! How does the non-commutativity of the operators weaken? :wideeyed:
By averaging and smearing, the effects become diluted.
 
  • #7
Mentz114 said:
By averaging and smearing, the effects become diluted.

Averaging what exactly?
 
  • #8
entropy1 said:
Averaging what exactly?
It depends. An individual gas molecule is a quantum object with the usual uncertainty in momentum and position. But we can define an average velocity for a molecule and treat the gas as an ensemble of classical(ish) particles.

The transition from quantum -> classical dynamics is a big subject and is much discussed in these forums. The measurement problem is related, so don't expect any easy answers.
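As a rough illustration of when that averaging works, here is a sketch using standard kinetic-theory quantities (room-temperature air; the criterion is my addition, not from this thread): the gas behaves classically when each molecule's thermal de Broglie wavelength is much smaller than the spacing between molecules.

```python
import math

# Sketch (illustrative, room-temperature air): compare the thermal de Broglie
# wavelength of a nitrogen molecule with the mean spacing between molecules.
# Wavelength << spacing means wave effects between molecules are negligible
# and classical statistics apply.

h = 6.626e-34       # Planck's constant, J*s
k = 1.381e-23       # Boltzmann constant, J/K
T = 300.0           # temperature, K
m = 28 * 1.661e-27  # mass of an N2 molecule, kg
P = 1.0e5           # pressure, Pa

lam = h / math.sqrt(2 * math.pi * m * k * T)   # thermal de Broglie wavelength
n = P / (k * T)                                # number density, molecules / m^3
spacing = n ** (-1 / 3)                        # mean intermolecular spacing

print(f"thermal wavelength: {lam:.2e} m")      # ~2e-11 m
print(f"mean spacing:       {spacing:.2e} m")  # ~3e-9 m
# The wavelength is ~100x smaller than the spacing, so the gas can be treated
# as an ensemble of classical(ish) particles.
```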
 
  • #9
entropy1 said:
Averaging what exactly?

States with lots of pieces are 'more perpendicular' to each other than states with fewer pieces. They allow for distinguishing finer-grained details.

For example, consider trying to distinguish a single qubit in the state ##\left| 0 \right\rangle## from a single qubit in the state ##\cos(1°) \left| 0 \right\rangle + \sin(1°) \left| 1 \right\rangle##. That's pretty hard! The states are only 1° apart, and the math says states need to be 90° apart to be 100% reliably distinguished. At 1° apart, you can barely do better than a coin flip.

But if I give you ten thousand copies of the qubit, then your task becomes distinguishing the compound state ##\left| 0 \right\rangle^{\otimes 10000}## from the compound state ##(\cos(1°) \left| 0 \right\rangle + \sin(1°) \left| 1 \right\rangle)^{\otimes 10000}##. The angle between those two states is ##\arccos[(\cos 1°)^{10000}] \approx 77°##. Suddenly we're making errors only rarely instead of about half the time.

If we jump up again, to a *million* copies, then the two possible inputs are basically perpendicular and we almost never make mistakes. Increasing the number of copies compounds the small difference so that we can reliably tell the states apart.
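If you want to check those angles yourself, here is a small sketch of my own (it uses the fact that the inner product of tensor products is the product of the inner products):

```python
import math

# Verify the compounding-angle formula from the post: the angle between
# |0>^(tensor N) and (cos(1°)|0> + sin(1°)|1>)^(tensor N) is
# arccos( cos(1°)^N ), since overlaps multiply under tensor products.

def compound_angle_deg(n_copies, angle_deg=1.0):
    overlap = math.cos(math.radians(angle_deg)) ** n_copies
    return math.degrees(math.acos(overlap))

for n in [1, 100, 10_000, 1_000_000]:
    print(f"{n:>9} copies -> {compound_angle_deg(n):6.2f} degrees apart")
# 1 copy -> 1°, 10,000 copies -> ~77°, a million copies -> essentially 90°
```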
 
  • #10
@Strilanc

I am not familiar with the ##\otimes## symbol yet (tensor product?), but it seems to me your clarification is what I wanted to know: the math that explains this is available, simple, and sound. Thanks! :smile:
 
  • #11
Does this help?
 
  • #12
  • #13
Mentz114 said:
By averaging and smearing, the effects become diluted.
sciencejournalist00 said:
Does this help?


Does it have to do with the probability waves of larger numbers of particles canceling out at larger distances? :smile:
 
  • #14
It has to do with the phases of the probability waves. When the phases of the waves are random, the quantum effects cancel each other out. Artificially, however, the phases can be aligned, giving rise to large-scale quantum effects.

Superconductivity, superfluidity, interference patterns from holograms (a superposition of trajectories), and coherence in lasers are all large-scale quantum effects. Not all quantum effects are lost at large scales.

For example, liquid helium is described by a single wavefunction for all its particles, which is why it flows without viscosity. And superconductors have all their electrons described by one wavefunction, which is why they have zero electrical resistance.
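A toy numerical sketch of that phase argument (my own, not from the thread): summing ##N## unit-amplitude waves, the total intensity grows like ##N## for random phases but like ##N^2## when the phases are aligned.

```python
import numpy as np

# Toy illustration: sum N unit-amplitude complex waves. With random phases
# the total intensity is of order N (interference washes out); with aligned
# phases it is N^2 (coherent, laser/superconductor-like behaviour).

rng = np.random.default_rng(0)
N = 10_000

random_waves = np.exp(1j * rng.uniform(0, 2 * np.pi, N))
aligned_waves = np.exp(1j * np.zeros(N))  # all waves in phase

print(f"random  phases: |sum|^2 ~ {abs(random_waves.sum())**2:.0f}  (order N = {N})")
print(f"aligned phases: |sum|^2 = {abs(aligned_waves.sum())**2:.0f}  (N^2 = {N**2})")
```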
 
