Classical physics explains macroscopic phenomena, where measurements are deterministic rather than probabilistic.

Is this just because the signal-to-noise ratio is low in a microscopic event, whereas in a macroscopic event all the random processes add up to give a high signal-to-noise ratio, thereby suppressing weird events (like quantum tunneling, for example) in the macroscopic world?

For example:

If X(t) is a random process being measured, its relative error is σ_X/X̄.

If we define a new random process Y(t) = X₁(t) + X₂(t) + … + Xₙ(t), where the Xᵢ are uncorrelated and identically distributed copies of X, then the relative error is σ_Y/Ȳ = σ_X/(√n · X̄), meaning the relative fluctuation shrinks by a factor of √n.

Is that the reason why we see a difference in physics between microscopic and macroscopic objects?