ISamson said:
Hello.
How do computers and electronics understand bit signals and information? How do they process it?
I know that a bit consists of ons and offs, but how does the computer understand this and know what to do with it?
Is there something I don't understand?

Thank you.
At the most basic level, as already noted, there is the concept of the logic gate. With combinations and arrays of such gates (AND, OR, NOR, XOR, etc.), series of 0s and 1s (e.g. bytes, words) can be transferred and processed by the CPU, using memory and storage devices. Now, at the electronics level, there is a lot going on to get this to function properly (voltage levels at the various components, input and output signal levels, etc.).
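To make the gate idea a bit more concrete, here is a small toy sketch in C (my own illustration, not how real hardware is designed; the names gate_and, gate_xor, half_adder and so on are made up for this example). It models a few gates as operations on single bits and wires two of them together into a half adder, the basic building block a CPU uses to add numbers:

```c
#include <stdio.h>
#include <stdint.h>

/* Toy "gates": each takes bits (0 or 1) and returns a bit. */
static uint8_t gate_and(uint8_t a, uint8_t b) { return a & b; }
static uint8_t gate_or (uint8_t a, uint8_t b) { return a | b; }
static uint8_t gate_xor(uint8_t a, uint8_t b) { return a ^ b; }
static uint8_t gate_nor(uint8_t a, uint8_t b) { return !(a | b); }

/* A half adder: two gates combined to add two bits into a sum and a carry. */
static void half_adder(uint8_t a, uint8_t b, uint8_t *sum, uint8_t *carry)
{
    *sum   = gate_xor(a, b); /* 1 when exactly one input is 1 */
    *carry = gate_and(a, b); /* 1 when both inputs are 1      */
}

int main(void)
{
    /* Walk through every input combination, like a truth table. */
    for (uint8_t a = 0; a <= 1; a++) {
        for (uint8_t b = 0; b <= 1; b++) {
            uint8_t s, c;
            half_adder(a, b, &s, &c);
            printf("a=%d b=%d -> sum=%d carry=%d (or=%d nor=%d)\n",
                   a, b, s, c, gate_or(a, b), gate_nor(a, b));
        }
    }
    return 0;
}
```

Chain the carries of such adders together and you get a circuit that adds whole bytes and words; that is essentially what the arithmetic part of a CPU does, only in silicon rather than in software.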
At a more abstract level, i.e. at a level closer to the user of the machine, comes the idea of the operating system. Back in the old days (there are far more experienced people here who can describe this in more detail than me, although I do have some experience as well), data was entered and processed through wires, switches and panels with bulbs, the switches later becoming electromechanical structures (relays). Then vacuum tubes were used as the basic circuitry for memory and the CPU (first generation), then transistors (second generation), later SSI, MSI and LSI integrated circuits (third generation) and finally VLSI integrated circuits (fourth generation). That is a very brief description; for more info you can take a look at Wikipedia, for instance.
At some point the need arose for a system to control the hardware and the various tasks to be performed, so the notion of an operating system came into being. The OS manages hardware resources, provides services for accessing those resources, and creates higher-level abstractions such as files, directories and processes. It is also the platform on which application and system programs run, and it acts as an extension of the machine, or in other words, what the user perceives as the machine at the logical level. So the OS plays a very important role in managing and processing data bits as useful information, in the interaction between user and machine.
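As a small illustration of that abstraction (again a hedged sketch of my own; the file name example.txt is made up), a program simply asks the OS for a file and hands it some bytes. It never touches disk sectors, voltage levels or magnetization; the OS translates the request into the right hardware operations:

```c
#include <stdio.h>

int main(void)
{
    /* Ask the OS to create/open a file; the OS decides where the
       bits actually end up on the storage device. */
    FILE *f = fopen("example.txt", "w");
    if (f == NULL) {
        perror("fopen");
        return 1;
    }

    /* Hand some bytes to the OS; it buffers them and drives the hardware. */
    fputs("hello, bits\n", f);

    /* Release the resource back to the OS. */
    fclose(f);
    return 0;
}
```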
I would also like to point out that the notion of 0s and 1s takes various physical forms inside a computer: it can be whether current flows or not at some component, whether something is magnetized or not, and in general anything that can take only one of two states.