Suppose I'm making a measuring device of some sort that gets its feed from a microcontroller. Would there be a huge difference between 8-bit, 16-bit, and 32-bit microcontrollers in terms of the device's ability to measure accurately? For example, I imagine the 8-bit could only measure in jumps of 0.5 mm, the 16-bit in jumps of 0.1 mm, and the 32-bit in jumps of 0.05 mm. Is that right?
Or will the difference be on a more microscopic scale?
I guess I'm mostly looking to see how big the difference is in terms of measurement accuracy.
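To make the guesses above concrete: if an N-bit value is used to quantize a fixed measurement range, the smallest distinguishable step is range / 2^N. A minimal sketch of that arithmetic (the 100 mm range here is an illustrative assumption, not something stated in the question):

```python
def step_size_mm(range_mm: float, bits: int) -> float:
    """Smallest distinguishable step when an N-bit value spans range_mm."""
    return range_mm / (2 ** bits)

# Step sizes for a hypothetical 100 mm measurement range:
for bits in (8, 16, 32):
    print(f"{bits:2d}-bit: {step_size_mm(100.0, bits):.12f} mm per step")
# 8-bit gives 100/256 = 0.390625 mm; 16-bit gives 100/65536,
# roughly 0.0015 mm; 32-bit is on the order of 2e-8 mm.
```

Under this assumption the gap between widths grows exponentially, not linearly as the 0.5/0.1/0.05 mm guess suggests.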