How exactly do computers execute code?

  • Thread starter: grandpa2390
  • Tags: Code, Computers

Discussion Overview

The discussion centers around the mechanisms by which computers execute code, particularly focusing on the transition from high-level programming to the low-level operations of hardware. Participants explore concepts related to logic circuits, CPU architecture, and the physical processes involved in code execution.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant expresses confusion about how hardware executes instructions encoded in binary, questioning the role of circuits and whether there is a "brain" within the computer that interprets these instructions.
  • Another participant suggests that understanding the CPU at the transistor level may clarify these processes, indicating that a detailed exploration of digital logic is necessary.
  • Some participants propose that the execution of code involves boolean logic and the use of registers within the CPU, which facilitate rapid operations on data.
  • There is mention of the importance of practical experience in understanding these concepts, with suggestions to build simple computers or study specific microprocessors like the 6502.
  • Multiple participants discuss the role of instruction cycles, opcodes, and micro-operations in the execution of code, highlighting the complexity of modern CPUs.
  • One participant emphasizes that while the components of a CPU are simple, their combination leads to complex functionality, managed through abstraction.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the best way to understand how computers execute code. There are multiple competing views on the level of detail required and the most effective methods for learning about CPU operations.

Contextual Notes

Some participants note that understanding the execution of code requires significant time and effort, particularly with textbooks and practical experience. There are also references to historical computing methods, such as punch cards, which contrast with modern practices.

Who May Find This Useful

This discussion may be useful for individuals interested in computer architecture, digital logic design, or those seeking to deepen their understanding of how programming translates to hardware operations.

grandpa2390
One thing that has always miffed me is how the hardware carries out a coded instruction. The code is 1's and 0's, and if I have understood anything, 1's and 0's are on and off. But how is the computer flipping the circuits on and off? Something must tell it to do that. But then what is telling it to follow those instructions, and so on?

I am unfamiliar with this, but I have friends who took circuits or logic circuits or something like that, so I guess at the most basic level, logic circuits are used in a magical way to control everything. Must all code carry with it instructions that activate those circuits?

I don't know. It's weird to me, almost as if there must be a brain in there deciphering the code and carrying out the orders.
 
What reading have you been doing so far on this? Seeing what you've been reading and not understanding will help us to answer your question the best that we can. Thanks. :smile:
 
A tutorial video like this might help. There are others on YouTube.



There is no magic. It is just boolean logic. I'm not sure the level of understanding you seek. Do you need a description of the CPU down to the level of each transistor?
 
anorlunda said:
There is no magic. It is just boolean logic. I'm not sure the level of understanding you seek. Do you need a description of the CPU down to the level of each transistor?
I know there's no magic. I don't think that video is what I'm looking for, but probably that's what I need: how the CPU works at the transistor level. Not just the generic explanation you get in any course, but how exactly, physically, the CPU processes.

I can understand how a computer worked back in the days when human involvement was required: manually plugging and unplugging as code, or with punch cards. When computers required human intercession, the software was physically altering the system, and that is what I don't see how a CPU does.
 
I think you'll have to build one.
I learned by troubleshooting them; hung a 'scope on every IC in them.
Of course, that was in the days of 7400 logic ICs, when a CPU occupied three boards seventeen inches square and you could do that.
By 1974 they'd progressed to the point that a CPU was one IC the size of your thumb.

Hobbyists are building them for the exact reason you state.
https://eater.net/8bit/

good luck
old jim
 
If you are curious enough to study, I recommend choosing the famous 6502 chip as your object.

A simple Google search for 6502 will point you to numerous books, articles, emulators, circuit diagrams, and even fan clubs of people who study and celebrate the 6502. The 6502 was a very capable microprocessor, but much, much simpler than today's CPUs. Those 6502 groups and forums are filled with people who share your interests; they can probably help you with very specific suggestions.

Once upon a time, I had all the 6502 op codes memorized, and I could read code in binary. Those were fun times. :biggrin:
 
Typically the opcode and/or its options are used to index the equivalent of tables, which in turn run a sequence of micro-operations and/or more table lookups. In some simple processors, there is a single table lookup with a large number of bits per entry: one bit for every possible operation. I recall a 16-bit micro based on the AMD 2900 family of chips that had just under 80 possible operations, so it used a table that was 80 bits wide, with only one bit set per entry.
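That table-lookup scheme can be sketched in a few lines of Python. Everything here (micro-op names, opcode values, table width) is invented for illustration; a real control store is just a much wider version of the same idea.

```python
# Sketch: the opcode indexes a control table whose entry is a wide
# bit-vector, one bit per possible micro-operation.
# Bit i of an entry corresponds to MICRO_OPS[i] (names are invented).
MICRO_OPS = ["load_a", "load_b", "alu_add", "alu_sub", "write_back"]

CONTROL_TABLE = {
    0b0001: 0b10111,  # hypothetical ADD: load_a, load_b, alu_add, write_back
    0b0010: 0b11011,  # hypothetical SUB: load_a, load_b, alu_sub, write_back
}

def micro_ops_for(opcode):
    """Return the micro-operations selected by the set bits of the entry."""
    word = CONTROL_TABLE[opcode]
    return [name for i, name in enumerate(MICRO_OPS) if (word >> i) & 1]
```

In the 80-bit-wide scheme described above, each entry would instead have exactly one bit set, naming a single operation directly.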
 
rcgldr said:
Typically the opcode and/or options are used to index the equivalent of tables, which in turn result in running a sequence of micro-operations, and/or more table lookups.

CPUs with micro code add another level of abstraction. That's why I recommended the 6502. No micro code.
 
  • #10
grandpa2390 said:
how the CPU works at the transistor level. Not just the generic explanation you get in any course, but how exactly, physically, the CPU processes.

You can start at the transistor level and work up from there by looking into Digital Logic. In my undergrad Digital Logic class we actually designed a very, very basic CPU. Unfortunately you just won't be able to really understand this stuff without spending a significant amount of time and effort with a good textbook on the subject. You can watch as many videos as you want, but until you get down and dirty solving digital logic problems you won't truly understand.
 
  • #11
The punch card is just in memory now, as a bunch of on/off bits. The clock is cycling, fetching, decoding, executing; it's built in. Your program's instructions are loaded in memory, and you have a number pointing to where the next one is. That number increments, or an instruction can change it.

It seems highly complex, but all of the components individually are fairly simple. How they are combined to make more complex machinery is also relatively simple, at each level. The key to managing the complexity is abstraction.
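The cycle described above (fetch, increment the pointer, decode, execute, with an instruction able to change the pointer) can be sketched as a toy interpreter. The instruction set and encodings here are invented for illustration:

```python
# A toy fetch-decode-execute loop. Instructions are (mnemonic, argument)
# pairs in "memory"; a real CPU does the same thing with binary opcodes.
memory = [
    ("LOAD", 5),    # put 5 in the accumulator
    ("ADD", 3),     # add 3 to it
    ("JMP", 4),     # jump over the next instruction
    ("ADD", 100),   # skipped
    ("HALT", 0),
]

acc = 0   # accumulator register
pc = 0    # program counter: the number pointing to the next instruction

while True:
    op, arg = memory[pc]   # fetch
    pc += 1                # the number increments...
    if op == "LOAD":       # decode + execute
        acc = arg
    elif op == "ADD":
        acc += arg
    elif op == "JMP":
        pc = arg           # ...or an instruction can change it
    elif op == "HALT":
        break
```

The JMP case is the "an instruction can change it" part: it overwrites the program counter, so the `ADD 100` is never executed and the accumulator ends at 8.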
 
  • #12
Drakkith said:
You can start at the transistor level and work up from there by looking into Digital Logic. ... You can watch as many videos as you want, but until you get down and dirty solving digital logic problems you won't truly understand.
Exactly. Many things that you just can't understand from a book will become common and trivial once you start using them. Don't try to expand your understanding without any practical experience.
 
  • #13
grandpa2390 said:
I don't know. It's weird to me, almost as if there must be a brain in there deciphering the code and carrying out the orders.

It seems that way because there IS a brain in there deciphering the code and carrying out the orders.

A processor has a set of instruction codes it is designed to understand. A common design feature is also to have "registers", which are a few memory locations right on the chip. Things done on registers are very fast. Let's say you have four registers A, B, C and D. Then a built-in instruction might be ADD A, B, C (encoded in some binary format) which the processor is wired to understand as "Add what's in register A to what's in register B and store the result in register C". When the processor is executing that instruction, it invokes special addition circuitry which is also right on the chip, as well as the circuitry which reads and writes things to the desired places.

Then it gets the next instruction and does whatever that says. And the next. And the next.

You might want to pick up a book on assembly language for some processor, any processor, just to see what those fundamental instructions look like. Assembly language is the text version of the commands. The processor of course encodes them as 1's and 0's. Maybe ADD is instruction 0101 and registers A, B, C, D are written as binary 00, 01, 10, 11 so ADD A, B, C to the computer really looks like 0101 00 01 10 for example.

The binary language, what the CPU actually understands, is called "machine language". There isn't really a good reason to learn or understand that unless you're really curious. And these days there is very little call for people to work in the slightly more-readable assembly language either.
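The made-up encoding in that example can be checked mechanically. This is only the post's illustrative scheme (a 4-bit opcode and three 2-bit register fields), not any real instruction set:

```python
# Pack the post's hypothetical "ADD A, B, C" into a 10-bit machine word:
# 4 opcode bits, then three 2-bit register fields.
OPCODES = {"ADD": 0b0101}
REGS = {"A": 0b00, "B": 0b01, "C": 0b10, "D": 0b11}

def assemble(mnemonic, src1, src2, dest):
    """Assemble one instruction into its binary machine-language word."""
    return (OPCODES[mnemonic] << 6) | (REGS[src1] << 4) | (REGS[src2] << 2) | REGS[dest]

word = assemble("ADD", "A", "B", "C")
print(f"{word:010b}")  # -> 0101000110, i.e. 0101 00 01 10 as in the post
```

An assembler is essentially this bit-packing done for a whole file of mnemonics; a disassembler runs the mapping in reverse.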
 
  • #14
RPinPA said:
And these days there is very little call for people to work in the slightly more-readable assembly language either.

It's good to have at least a passing understanding of assembly. If you ever get into trouble debugging, you might get dumped to an assembly view. Also, if your code is not working, you can compare the assembly to the C, and it will show you what part of the C language you violated.

BoB
 
  • #15
grandpa2390 said:
One thing that has always miffed me is how the hardware carries out a coded instruction. The code is 1's and 0's, and if I have understood anything, 1's and 0's are on and off. But how is the computer flipping the circuits on and off? Something must tell it to do that. But then what is telling it to follow those instructions, and so on?

I am unfamiliar with this, but I have friends who took circuits or logic circuits or something like that, so I guess at the most basic level, logic circuits are used in a magical way to control everything. Must all code carry with it instructions that activate those circuits?

I don't know. It's weird to me, almost as if there must be a brain in there deciphering the code and carrying out the orders.
You might want to look into what's called a finite state machine. Also, you should probably concentrate on the gate level, rather than the transistor level.
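A finite state machine is just a table mapping (current state, input) to a next state. A toy sketch (states and inputs invented; a CPU's control unit is the same idea, only vastly larger):

```python
# A finite state machine stepping through an instruction cycle on each
# clock tick. The transition table IS the machine's entire behavior.
TRANSITIONS = {
    ("FETCH", "tick"): "DECODE",
    ("DECODE", "tick"): "EXECUTE",
    ("EXECUTE", "tick"): "FETCH",
}

def run(state, inputs):
    """Feed a sequence of inputs to the machine; return the final state."""
    for symbol in inputs:
        state = TRANSITIONS[(state, symbol)]
    return state

print(run("FETCH", ["tick", "tick", "tick"]))  # -> FETCH (full cycle)
```

In hardware, the table is a block of gates and the current state is held in flip-flops, which is why the gate level is the natural place to study this.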
 
  • #16
There are newer editions, but this Tanenbaum book from 1998 does a good job of explaining how it all happens, from start to finish. At least, it would help with a conceptual understanding. This older edition can be had for a mere $12 used on Amazon: https://www.amazon.com/dp/0130959901/?tag=pfamazon01-20
 
  • #17
anorlunda said:
If you are curious enough to study, I recommend choosing the famous 6502 chip as your object.
I preferred the 6809. Ahh, assembly language programming on the Color Computer...
 
  • #18
grandpa2390 said:
At the most basic level, logic circuits are used in a magical way to control everything. All code must carry with it instructions that activate those circuits?
Yup.

For a reasonably basic look starting at the transistor level:
http://www.circuitstoday.com/logic-gates
(The above, and much more, can be found with: https://www.google.com/search?&q=logic+gate+circuit+diagram)

An early bible for computer operation (and more) is THE ART OF COMPUTER PROGRAMMING, Volume 1: Fundamental Algorithms, by Donald E. Knuth, Addison-Wesley Publishing Company, ISBN 0-201-03809-9. (That's the second edition; I believe there are later ones.)

Digital Logic is based on signals that are either On or Off (One or Zero, True or False); just like your refrigerator is either On or Off.

Then realize that any logical operation (or mathematical one, for that matter) can be created from the three logical operations AND, OR, and NOT. (It can also be done with other operations, but these are the most used and simple enough to keep straight.) That's the field called Boolean Algebra, quite useful when you get in a bit deeper.

AND = output is True if all inputs are True
OR = output is True if any one or more inputs are True
NOT = output is True if input is False; output is False if input is True (i.e. the output is the complement of the input)
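Those three definitions translate directly into truth-table functions. A minimal sketch, with XOR built from them to show how more complex logic is composed:

```python
# The three basic gates, written exactly as defined above.
def AND(*inputs):
    """Output is True if all inputs are True."""
    return all(inputs)

def OR(*inputs):
    """Output is True if any one or more inputs are True."""
    return any(inputs)

def NOT(x):
    """Output is the complement of the input."""
    return not x

# Anything can be built from these; e.g. XOR, the heart of a binary adder:
def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))
```

In silicon, each of these is a handful of transistors, and a CPU is millions to billions of them wired together.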

Hope this helps.
Cheers,
Tom
 
  • #19
Drakkith said:
You can start at the transistor level and work up from there by looking into Digital Logic. In my undergrad Digital Logic class we actually designed a very, very basic CPU. Unfortunately you just won't be able to really understand this stuff without spending a significant amount of time and effort with a good textbook on the subject. You can watch as many videos as you want, but until you get down and dirty solving digital logic problems you won't truly understand.
The free (with a small ad bar, on its own screen only) Android puzzle game https://apkpure.com/circuit-scramble-computer-logic-puzzles/com.Suborbital.CircuitScramble/download (a direct download link from apkpure) is fun and instructive for neophytes learning the workings of logic gates.
 
