Cause of quantum effects on small and large scales


Discussion Overview

The discussion centers on the causes of quantum effects diminishing as the size of an object increases, exploring both theoretical and conceptual aspects. Participants examine the relationship between quantum and classical mechanics, the role of measurement, and the implications of quantum phenomena at larger scales.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants propose that quantum effects become negligible at larger scales due to the relationship between quantum objects within a macro object, while others question whether this is a mathematical artifact.
  • It is suggested that the transition from quantum to classical behavior can be tracked by the reduction in the impact of measurement on properties, with larger objects exhibiting less sensitivity to measurement.
  • One participant notes that the value of Planck's constant plays a crucial role in determining the scale at which quantum effects are experienced.
  • Others argue that not all quantum effects diminish with scale, citing examples such as superconductivity and radioactive decay, which remain significant at larger scales.
  • Participants discuss the concept of averaging and smearing in quantum states, suggesting that larger ensembles of particles can lead to classical-like behavior.
  • There is mention of the phase relationship between probability waves, with some arguing that random phases can lead to cancellation of quantum effects, while aligned phases can produce observable large-scale quantum phenomena.

Areas of Agreement / Disagreement

Participants express a mix of agreement and disagreement regarding the nature of quantum effects at larger scales. While some assert that quantum effects become negligible, others highlight significant large-scale quantum phenomena, indicating that the discussion remains unresolved.

Contextual Notes

The discussion touches on complex topics such as the measurement problem and the transition from quantum to classical dynamics, which are acknowledged as extensive subjects with ongoing debate.

entropy1
If I understand correctly, quantum effects become very small as the object in consideration becomes larger.

My question: what causes this? For instance: does it have to do with the relation between the quantum objects in the macro object, or does the math show that quantum effects become negligible at greater distances? (The latter would seem strange to me with respect to entanglement, which holds over large distances.)
 
entropy1 said:
If I understand correctly, quantum effects become very small as the object in consideration becomes larger.

My question: what causes this? For instance: does it have to do with the relation between the quantum objects in the macro object, or does the math show that quantum effects become negligible at greater distances? (The latter would seem strange to me with respect to entanglement, which holds over large distances.)
With small objects, measuring their properties affects those properties significantly (observably). So one could say that if, say, a position could be measured without changing it, the object is classical. The progression from quantum to classical is tracked by a reduction in the change due to measurement, i.e. a weakening of the non-commutativity of operators. It is possible to bounce a lot of radiation off an aircraft without appreciably affecting its momentum. But this is not the case with an atom.
 
Thanks! How does the non-commutativity of the operators weaken? :wideeyed:
 
It's due to the value of Planck's constant. If it happened to have a much bigger value, you and I would perhaps experience quantum effects at human scales.
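One way to see the role of Planck's constant is through the de Broglie wavelength λ = h/(mv): wave-like quantum behaviour shows up only when λ is comparable to the system's size. A minimal sketch (the masses and speeds below are just illustrative values; if h happened to be much larger, the baseball's wavelength would be macroscopic too):

```python
H = 6.626e-34  # Planck's constant in J*s (approximate CODATA value)

def de_broglie(mass_kg, speed_ms):
    """de Broglie wavelength: lambda = h / (m * v)."""
    return H / (mass_kg * speed_ms)

# An electron at ~1e6 m/s: wavelength ~7e-10 m, comparable to atomic
# spacings, so diffraction and other wave effects are readily observable.
print(de_broglie(9.11e-31, 1e6))

# A baseball (~0.145 kg) at 40 m/s: wavelength ~1e-34 m, absurdly small
# compared to the ball itself, so no wave behaviour is ever observed.
print(de_broglie(0.145, 40.0))
```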
 
entropy1 said:
quantum effects become very small as the object in consideration becomes larger.
Not necessarily. Only those effects that violate classical intuition become very small or very difficult to establish. Things like superconductivity, X-rays, or radioactive decay are large-scale quantum effects that have a large impact on our modern culture.
 
entropy1 said:
Thanks! How does the non-commutativity of the operators weaken? :wideeyed:
By averaging and smearing, the effects become diluted.
 
Mentz114 said:
By averaging and smearing, the effects become diluted.

Averaging what exactly?
 
entropy1 said:
Averaging what exactly?
It depends. An individual gas molecule is a quantum object with the usual uncertainty in momentum and position. But we can define an average velocity for a molecule and treat the gas as an ensemble of classical(ish) particles.

The transition from quantum to classical dynamics is a big subject and is much discussed in these forums. The measurement problem is related, so don't expect any easy answers.
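The averaging can be illustrated with a purely classical Monte Carlo toy model (this is a sketch of the statistics, not of actual quantum dynamics): give each molecule a velocity that fluctuates with spread sigma, and the ensemble average fluctuates only like sigma/sqrt(n).

```python
import random
import statistics

def mean_speed_spread(n_molecules, trials=2000, sigma=1.0):
    """Spread (std. dev.) of the ensemble-average velocity across many trials.

    Each molecule's velocity fluctuates with spread sigma; the average over
    n molecules fluctuates only ~ sigma / sqrt(n), so a large ensemble looks
    sharp and classical even though each member is 'fuzzy'.
    """
    means = [
        statistics.fmean(random.gauss(0.0, sigma) for _ in range(n_molecules))
        for _ in range(trials)
    ]
    return statistics.stdev(means)

random.seed(0)
print(mean_speed_spread(1))    # ~1.0: a single 'molecule' fluctuates fully
print(mean_speed_spread(100))  # ~0.1: the average of 100 is ten times sharper
```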
 
entropy1 said:
Averaging what exactly?

States with lots of pieces are 'more perpendicular' to each other than states with fewer pieces. They allow for distinguishing finer-grained details.

For example, consider trying to distinguish a single qubit in the state ##\left| 0 \right\rangle## from a single qubit in the state ##\cos(1°) \left| 0 \right\rangle + \sin(1°) \left| 1 \right\rangle##. That's pretty hard! The states are only 1° apart, and the math says states need to be 90° apart to be 100% reliably distinguished. At 1° apart, even the best possible measurement is barely better than a coin flip: the optimal error rate is about 49%.

But if I give you ten thousand copies of the qubit, then your task becomes distinguishing the compound state ##\left| 0 \right\rangle^{\otimes 10000}## from the compound state ##(\cos(1°) \left| 0 \right\rangle + \sin(1°) \left| 1 \right\rangle)^{\otimes 10000}##. The angle between those two states is ##\arccos[(\cos 1°)^{10000}] \approx 77°##. Suddenly errors become rare: the optimal error rate drops to about 1%.

If we jump up again, to a *million* copies, then the two possible inputs are essentially perpendicular and we almost never make mistakes. Increasing the number of copies amplifies the small per-qubit difference until the states can be reliably told apart.
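The numbers above are easy to check numerically. A small sketch (it uses the Helstrom bound, P_err = (1 - sin θ)/2, for the best possible error rate when distinguishing two pure states an angle θ apart; that bound is a standard result, not something from this thread):

```python
import math

def compound_angle(theta_deg, n):
    """Angle between |0> tensored n times and (cos t|0> + sin t|1>) tensored n times.

    The overlap of the compound states is cos(theta)^n, so the angle between
    them is arccos(cos(theta)^n), which approaches 90 degrees as n grows.
    """
    overlap = math.cos(math.radians(theta_deg)) ** n
    return math.degrees(math.acos(overlap))

def optimal_error(angle_deg):
    """Helstrom bound: minimum error rate for two pure states at this angle."""
    return (1.0 - math.sin(math.radians(angle_deg))) / 2.0

for n in (1, 10_000, 1_000_000):
    angle = compound_angle(1.0, n)
    print(f"n={n}: angle = {angle:.1f} deg, optimal error = {optimal_error(angle):.4f}")
```

For n = 1 this reproduces the ~49% error rate, for n = 10,000 the ~77° angle and ~1% error rate, and for n = 1,000,000 the states are numerically perpendicular.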
 
  • #10
@Strilanc

I am not familiar with the \otimes symbol yet (tensor product?), but it seems to me your clarification is what I wanted to know: the math to explain this is available, simple, and sound. Thanks! :smile:
 
  • #11
Does this help?
 
  • #13
Mentz114 said:
By averaging and smearing, the effects become diluted.
sciencejournalist00 said:
Does this help?


Does it have to do with the cancelling out of the probability waves of a larger number of particles at larger distances? :smile:
 
  • #14
It has to do with the phases of the probability waves. When the phases between the waves are random, the quantum effects cancel each other out. However, the phases can be artificially aligned, giving rise to large-scale quantum effects.

Superconductivity, superfluidity, interference patterns (superpositions of trajectories) from holograms, and coherence in lasers are all large-scale quantum effects. Not all quantum effects are lost at large scales.

For example, liquid helium is described by a single wavefunction for all its particles, which is why it flows without viscosity. Similarly, in a superconductor all the conduction electrons are described by one wavefunction, which is why it has zero electrical resistance.
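The phase argument can be sketched numerically: sum n unit phasors e^(i*phase). With random phases the contributions mostly cancel and the total grows only like sqrt(n), while aligned phases add coherently to n (a toy model of incoherent vs. coherent addition, not of any specific physical system):

```python
import cmath
import math
import random

def total_amplitude(n, aligned):
    """Magnitude of the sum of n unit phasors e^{i*phase}.

    Aligned phases add coherently, so the total grows like n;
    random phases mostly cancel, leaving a total of order sqrt(n).
    """
    if aligned:
        phases = [0.0] * n
    else:
        phases = [random.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    return abs(sum(cmath.exp(1j * p) for p in phases))

random.seed(1)
print(total_amplitude(10_000, aligned=True))   # 10000.0: coherent addition
print(total_amplitude(10_000, aligned=False))  # much smaller, of order sqrt(n)
```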
 
