How does a computer understand instructions and perform operations?


Discussion Overview

The discussion centers on how computers interpret instructions and perform operations, exploring the relationship between programming languages, binary encoding, and underlying hardware components such as transistors and capacitors. The scope covers a conceptual understanding of computer architecture and operation.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification

Main Points Raised

  • One participant expresses confusion about how computers understand instructions despite being composed of basic hardware components, seeking resources for further learning.
  • Another participant poses an analogy comparing a computer's operation to a mousetrap, suggesting a need for deeper thinking about the question.
  • A third participant explains that binary instruction values serve as addresses for microcode instructions, detailing how microcode operates as a machine language that controls logic operations.
  • Further elaboration describes how program code is processed by the CPU, with logic gates acting as switches that dictate operations based on data in registers and memory.
  • Participants note that modern computers are complex enough that understanding them has become a collective effort rather than an individual one, and emphasize the importance of breaking concepts down into manageable levels.

Areas of Agreement / Disagreement

The discussion presents multiple viewpoints on how computers process instructions, with no consensus on a singular explanation. Participants provide different analogies and technical details, indicating a range of understanding and interpretation.

Contextual Notes

Participants acknowledge the complexity of modern computing systems and the limitations in fully understanding all aspects of their operation. There is an emphasis on the evolving nature of computer architecture and the challenges in grasping its intricacies.

kingsarat
Hai,
I see many programming languages with instructions etc., and computer performing accordingly. Also, I hear computer converts the instructions into binary and it performs operations. Well, what I don't get is, even though it converts the instructions binary, how can it understand that it has to perform particular operation? All that computer has got is bunch of transistors and capacitors and some other hardware. How can it understand all numbers, instructions etc etc? If there any good books dealing with this basic subject please let me know.
 
To put you on the right track thinking-wise I will ask you a question: How does a mousetrap know how to catch a mouse?
 
The short of it is that the binary instruction value itself is used as the starting address of the microcode instructions to execute, although some computers (see RISC) run microcode or pseudo-microcode straight out of main memory. Microcode is a kind of machine language in which each bit of the instruction controls a logic operation that moves, stores, or processes data. That is, the bits themselves are connected to the control pins of the different logic components.

This might be a good starting place:

http://en.wikipedia.org/wiki/Microcode
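The idea above can be sketched in a few lines: the opcode acts as an index into a microcode table, and each microcode word is just a set of control bits wired to hardware. The opcodes, control bits, and table here are hypothetical, invented purely for illustration; real microcode formats are far wider and machine-specific.

```python
# Hypothetical control bits: each flag stands in for one control line.
LOAD_REG  = 0b001  # latch the data bus into a register
ALU_ADD   = 0b010  # enable the adder
WRITE_MEM = 0b100  # assert the memory-write line

# Hypothetical microcode ROM: opcode -> sequence of control words.
MICROCODE = {
    0x01: [LOAD_REG],           # "LOAD"  instruction
    0x02: [ALU_ADD, LOAD_REG],  # "ADD"   instruction
    0x03: [WRITE_MEM],          # "STORE" instruction
}

def control_words(opcode):
    """Return the control-word sequence that the opcode selects."""
    return MICROCODE[opcode]

if __name__ == "__main__":
    # The "ADD" opcode selects two control words in sequence.
    for word in control_words(0x02):
        print(f"control lines asserted: {word:03b}")
```

The point of the sketch is that nothing "understands" the opcode: it is simply used as an address, and the bits it selects directly drive the logic.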
 
kingsarat said:
...even though it converts the instructions to binary, how can it understand that it has to perform a particular operation? All a computer has is a bunch of transistors, capacitors, and some other hardware.
It's a really clever system. Verrrrry simply: the data held in the part of memory that holds the program (the program code) is fed to the processor and becomes the input to a number of logic gates. These act as switches that control which operations the processor performs next. Data from another part of memory is fed into various places in the processor (called registers), and the program code controls what happens to the data in those registers.
The processor goes through a similar routine each cycle: it takes an instruction in, loads registers, and manipulates data. It also makes decisions (the really clever part of programming) that govern what it does next: it may simply take the next instruction in its list, or it may 'jump' to another instruction, depending on the results of tests it has done on data. But again, what it does depends on the digital values in certain registers, which act as inputs to (literally) switching circuits.
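That fetch, decode, execute, and conditional-jump routine can be shown with a toy interpreter. The instruction names (DEC, JNZ, HALT) and the single register R0 are invented for illustration and do not correspond to any real instruction set.

```python
# A toy fetch-decode-execute loop. The program counts register R0
# down to zero using a conditional jump, just as the cycle described
# above does: fetch an instruction, decode it, act on register data,
# and decide which instruction comes next.

def run(program, r0):
    regs = {"R0": r0}
    pc = 0  # program counter: which instruction to fetch next
    while pc < len(program):
        op, arg = program[pc]  # fetch and decode
        if op == "DEC":
            regs["R0"] -= 1    # execute: manipulate register data
            pc += 1
        elif op == "JNZ":
            # decision: jump back if R0 is not zero, else fall through
            pc = arg if regs["R0"] != 0 else pc + 1
        elif op == "HALT":
            break
    return regs["R0"]

program = [("DEC", None), ("JNZ", 0), ("HALT", None)]
print(run(program, 3))  # prints 0
```

Notice that the "decision" is nothing mysterious: the jump is taken or not taken purely according to a value sitting in a register.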

The clever thing about the modern computer is that it takes data stored in its memory or mass storage, and that data consists of control signals that determine what happens next.

No one knows 'everything' about any modern computer any more. I think the change occurred at about the level of the BBC Micro: back then, one person had a chance of knowing pretty much all of the hardware operation and the entire operating system. Anything more complex has to be a team effort, so don't feel bad about being flummoxed by just what goes on. Everyone is, to some extent. The secret is to split the thing into 'levels' and ignore the BS merchants who flaunt lots of buzzwords as if they know it all. They don't. They just know more than you do! :smile:
 
