#1 · aquaregia · 21 points · 0 comments
When I was first learning programming, I was surprised that computers only hold numbers to 32 or 64 bits of precision (sometimes more). What I'm wondering is: to what precision does stuff in the universe actually happen?
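For context on the 32/64-bit point, here is a quick Python sketch of how finite binary precision shows up; the values are standard IEEE 754 behavior (float64 for Python's `float`, float32 via `struct`):

```python
import struct

# Float64 (IEEE 754 double) carries roughly 15-17 significant
# decimal digits, so 0.1 + 0.2 is not exactly 0.3.
x = 0.1 + 0.2
print(x)         # 0.30000000000000004
print(x == 0.3)  # False

# Float32 keeps only ~7 significant digits; round-tripping a
# double through a 32-bit float discards the rest.
pi64 = 3.141592653589793
pi32 = struct.unpack('f', struct.pack('f', pi64))[0]
print(pi32 == pi64)  # False
```

So even "exact" inputs get rounded to the nearest representable binary fraction, which is the kind of approximation the question contrasts with physical reality.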
For example: if you had a super-slow-motion microscope and could zoom in and watch a single hydrogen atom bounce off of something, to what level of precision would the angle match what the relevant formulas say it should be?
Would a quantum computer calculate something to this degree of precision? I remember reading that a quantum computer could model how something would happen in reality EXACTLY, rather than as the approximation most computers produce.