How does the computer understand computer code

  • Thread starter: Niaboc67
  • Tags: Code, Computer

Discussion Overview

The discussion revolves around how computers interpret and execute programming commands, focusing on the transition from high-level programming languages to machine code and the underlying hardware processes. Participants explore the abstraction layers involved in programming and the complexity of computer operations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants propose that the understanding of commands by computers is fundamentally based on binary (1's and 0's), but they question how such simplicity can lead to complex systems like video games or operating systems.
  • Others explain that there are multiple levels of abstraction, including machine code and logic gates, which facilitate the execution of commands by the processor.
  • A participant illustrates the transition from high-level languages to assembly language, detailing how specific commands are translated into machine code and ultimately into binary opcodes.
  • There is mention of the architecture of processors, including the use of pipelines to optimize instruction fetching, decoding, and execution.
  • Some participants highlight the complexity of decoding instructions, mentioning methods used in older mini computers that involve indexing into function tables based on instruction data.

Areas of Agreement / Disagreement

Participants generally agree on the existence of multiple abstraction layers in programming, but there is no consensus on the implications of these layers or the overall understanding of how computers process commands.

Contextual Notes

Participants reference various levels of abstraction and the complexity of computer architecture, but some assumptions about the audience's prior knowledge of computer science concepts may not be explicitly stated.

Niaboc67:
I've been programming on and off for a few years and just recently started taking it seriously. Something I've always wondered about is how the computer understands the commands I type in. Is it essentially binary? And if so, how could 1's and 0's possibly create something as complex as a video game or an operating system?
 
If you know a little bit about high-level programming languages, there are at least two additional levels on which this question can be answered. One level is machine code (assembly) - that's the level at which your commands are analyzed, interpreted, and executed. The other level is that of logic gates, where the zeros and ones change the logical states of other gates, which in turn makes it possible for a processor to execute the machine code.
 
Niaboc67 said:
I've been programming on and off for a few years and just recently started taking it seriously. Something I've always wondered about is how the computer understands the commands I type in. Is it essentially binary?
Yes

and if so how could 1's and 0's possibly create something as complex as a video game or an operating system?

Since you obviously know that they DO, why do you pose the question as though it seems unlikely to you that they could?

Just study some basic computer architecture. As Borek said, the lowest level of understanding is logic gates; above that are groups of logic gates controlled by machine code (1's and 0's), and above that are high-level languages. Each of these things builds from the bottom up.

As you are aware, by the time you get up to high-level languages, you are not actually programming the machine, you are programming a program (a compiler) which takes your statements back down to machine code.
 
Niaboc67 said:
and if so how could 1's and 0's possibly create something as complex as a video game or an operating system?

How could a bunch of simple protons, neutrons and electrons possibly create something as complex as the human body? See what I did there? :)
 
Niaboc67 said:
I've been programming on and off for a few years and just recently started taking it seriously. Something I've always wondered about is how the computer understands the commands I type in. Is it essentially binary? And if so, how could 1's and 0's possibly create something as complex as a video game or an operating system?

I think others have pointed this out, but there are several different layers of abstraction between high level languages and what is going on in the hardware.

For example, take the following C++ statement:

Code:
a = b+c; // a,b,c are integers

Go down one level of abstraction, and you end up with assembly language:
Code:
mov eax, b      ; fetch variable b from memory
mov ebx, c      ; fetch variable c from memory
add eax, ebx    ; add b + c
mov a, eax      ; store the result in variable a

The next layer of abstraction is to turn the assembly code into opcodes, so each of the instructions above will be translated into a different binary number. In addition, opcodes can be variable sizes on some platforms, so there is a table that maps each instruction name to the opcode format needed. Generally, the more common instructions have shorter encodings, but again it's platform dependent. For example, ARM processors use fixed-size opcodes.

Output of a lst file:
Code:
0000002A  A1 00000008 R     mov eax, b
0000002F  8B 1D 0000000C R  mov ebx, c
00000035  03 C3             add eax, ebx
00000037  A3 00000004 R     mov a, eax

The first column is the offset in the code, the second column is the opcode bytes, and the number before each "R" is the offset of the variable (to be relocated by the linker). Notice that some instructions generated more than one byte.


The next step down is to look at the architecture itself. For example, most likely the system will use pipelines, which break the process of fetching, decoding, and executing instructions into multiple overlapping stages.

The next step down is to look at digital circuit design.

Looking at an adder is a good but simple example of a circuit design.
 
Once at machine language, there are various methods to decode and execute instructions. One method used on some old 16-bit minicomputers was to use some number of bits from the instruction data to index into a "bit per function" table. Say the processor has 79 possible operations and 49 nops, represented as 7 bits of an instruction. The processor reads an instruction and uses the 7 bits to index into a table that is 80 bits wide. Only a single bit in each table entry is set, and the bit that is set triggers a specific operation (load, add, subtract, nop, ... ). A similar decoding process is used for most processors, using combinations of multiple table lookups and/or decoders and/or demultiplexers and/or ... .
 
