Since a CPU is ultimately built from physical components (transistors, capacitors), it must be operating on a continuous physical quantity such as voltage. How does it translate that continuous scale into a binary one?
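The usual answer, in sketch form: logic inputs are specified with threshold voltages, and anything between the thresholds is simply not guaranteed to read as either bit. The V_IL/V_IH values below are illustrative TTL-style numbers, not taken from any particular datasheet:

```python
# Minimal sketch: mapping a continuous input voltage to a logic level.
# The thresholds are illustrative TTL-style values; real parts specify
# their own guaranteed input levels.

V_IL = 0.8  # voltages at or below this are guaranteed to read as logic 0
V_IH = 2.0  # voltages at or above this are guaranteed to read as logic 1

def read_bit(voltage: float) -> int | None:
    """Interpret a continuous voltage as a digital bit."""
    if voltage <= V_IL:
        return 0
    if voltage >= V_IH:
        return 1
    return None  # forbidden region: the read is undefined

print(read_bit(0.3))  # 0
print(read_bit(3.1))  # 1
print(read_bit(1.4))  # None -- between thresholds, behavior not guaranteed
```

The gap between V_IL and V_IH is the noise margin: small analog disturbances get absorbed there instead of flipping bits.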
Has anyone seen the probability of a "false zero" (1 read as 0) or a "false one" (0 read as 1) being calculated on any type of circuitry?

Yes. There are enormous branches of electrical engineering devoted to exactly this question, and every piece of digital logic ever designed includes many such considerations. For example, if you are connecting two digital devices by a cable: what is the maximum length of cable you can use for a given error rate in the transmitted signal?
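To make that concrete, here is a back-of-the-envelope sketch under a deliberately simple model: additive Gaussian noise of standard deviation SIGMA on a two-level signal, decided against a threshold midway between the levels. All the numbers (V0, V1, SIGMA) are made up for illustration:

```python
from math import erfc, sqrt

# Sketch of a bit-error-rate calculation under an assumed noise model:
# additive Gaussian noise (std dev SIGMA) on a binary signal with
# levels V0 and V1, decided against a single midpoint threshold.
V0, V1 = 0.0, 3.3      # assumed signal levels (volts)
SIGMA = 0.4            # assumed noise standard deviation (volts)
V_TH = (V0 + V1) / 2   # decision threshold

def q(x: float) -> float:
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * erfc(x / sqrt(2))

# False one: a transmitted 0 is pushed above the threshold by noise.
p_false_one = q((V_TH - V0) / SIGMA)
# False zero: a transmitted 1 is pulled below the threshold.
p_false_zero = q((V1 - V_TH) / SIGMA)

print(f"P(false one)  = {p_false_one:.3e}")
print(f"P(false zero) = {p_false_zero:.3e}")
```

A longer cable attenuates the signal and picks up more noise, shrinking the margin relative to SIGMA; fixing a target error rate and solving the same formula in reverse is what yields a maximum usable cable length.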
Is this because of some kind of averaging algorithm (execute an operation many times, then take the average or some other summary statistic), or is there some other explanation? It's interesting that something which is inherently noisy at the most basic layer becomes something fairly deterministic at the top.
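To test the averaging intuition directly, here is a small simulation (the 1% per-read flip probability is an arbitrary illustrative figure): taking the majority of five independent noisy reads drives the error rate from about 10^-2 down to roughly 10^-5.

```python
import random

# Sketch: repeating a noisy read and taking a majority vote drives the
# effective error rate far below the raw per-read error rate.
P_ERR = 0.01   # assumed probability that a single read flips the bit
N_READS = 5    # odd number of repeated reads per bit
TRIALS = 1_000_000

random.seed(0)
wrong = 0
for _ in range(TRIALS):
    # Each read independently returns the true bit or a flipped one.
    flips = sum(random.random() < P_ERR for _ in range(N_READS))
    if flips > N_READS // 2:  # majority of the reads were wrong
        wrong += 1

print(f"raw error rate:      {P_ERR}")
print(f"majority-of-{N_READS} rate: {wrong / TRIALS:.2e}")  # roughly 1e-5
```

Ordinary logic gets its determinism mainly from generous noise margins rather than explicit voting, but redundancy of exactly this kind is used where residual errors matter, e.g. in ECC memory.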
Maybe randomness & determinism aren't mutually exclusive after all.