luckis11
How does it (it = what?) distinguish each 8-digit binary number (e.g. 10100100) from the previous one and the next one?
The discussion revolves around understanding 8-digit binary numbers, specifically focusing on how bytes are distinguished in computing, both at the hardware and software levels. Participants also explore the nature of binary signals and their representation in electrical terms.
Participants do not reach a consensus on the initial question regarding byte distinction, and there is ongoing confusion and debate about the nature of binary signals and their representation in computing.
There are unresolved assumptions regarding the definitions of bits and bytes, as well as the specifics of how signals are transmitted and interpreted in hardware and software contexts.
luckis11 said: PLEASE forget my previous question. I want to grasp this:
The bits 0 and 1 are what?
Is it whether a signal passes through a gate or not? This seems wrong, because a NOT gate converts a signal "1" into a signal "0", yet a signal always comes out of that gate, doesn't it?
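On the NOT-gate puzzle above, one common way to picture it (a sketch with illustrative, roughly TTL-like voltage thresholds, not taken from the thread) is that a bit is a voltage *level*, not the presence or absence of a signal. The gate's output wire always carries a voltage; a NOT gate just drives it to the opposite level:

```python
# Sketch: a bit modeled as a voltage level, not presence/absence of a signal.
# The 2.0 V threshold and 0 V / 5 V levels are illustrative assumptions,
# loosely TTL-like; real logic families differ.
V_LOW, V_HIGH = 0.0, 5.0

def to_bit(voltage):
    """Interpret a voltage on a wire as a logic level."""
    return 1 if voltage > 2.0 else 0

def not_gate(v_in):
    """A NOT gate always drives its output wire: it outputs the opposite
    voltage level. Its output is never 'no signal'."""
    return V_LOW if to_bit(v_in) == 1 else V_HIGH

print(to_bit(not_gate(V_HIGH)))  # input 1 -> output 0
print(to_bit(not_gate(V_LOW)))   # input 0 -> output 1
```

So a "0" out of a NOT gate is still an actively driven signal (a low voltage), which resolves the apparent contradiction in the question.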
Is it that, e.g., 101 means that on a wire (just a wire, no gates in between) there is an electrical pulse pattern of (wavefront, no wavefront, wavefront)? This also seems wrong, because if it were so, the signal arriving from one wire at the gate should be 111111111..., and the one from the other wire should be 000000000..., otherwise how could it be that...
Is there a link that explains this?
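The title question, how consecutive bits (and hence 8-bit groups) are told apart on a single wire, usually comes down to clocking: the voltage level is held for a fixed time per bit, and the receiver samples the wire once per bit period, so a run of 1s does not blur into one long pulse. A minimal sketch of that idea (function names and the samples-per-bit timing are illustrative assumptions, not from the thread):

```python
# Sketch: distinguishing consecutive bits on one wire via a shared clock.
# Each bit's voltage level is held for a fixed number of clock ticks;
# the receiver samples once per bit period, mid-bit.

def transmit(bits, samples_per_bit=4):
    """Hold each bit's level on the line for samples_per_bit clock ticks."""
    line = []
    for b in bits:
        line.extend([b] * samples_per_bit)
    return line

def receive(line, samples_per_bit=4):
    """Sample the line once per bit period (in the middle of each bit)."""
    return [line[i + samples_per_bit // 2]
            for i in range(0, len(line), samples_per_bit)]

sent = [1, 0, 1, 0, 0, 1, 0, 0]          # the byte 10100100 from the question
print(receive(transmit(sent)) == sent)   # True: clocking recovers each bit
```

Bytes are then distinguished simply by counting: the hardware treats every group of 8 clocked bits as one byte (serial links additionally add framing such as start/stop bits, which is outside this sketch).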