How exactly do computers execute code?

In summary: the thread asks how a CPU physically carries out binary instructions without human intervention, and the answers trace it down through machine code, instruction decoding, and Boolean logic circuits.
  • #1
grandpa2390
One thing that has always mystified me is how the hardware carries out a coded instruction. The code is 1's and 0's, and if I have understood anything, 1's and 0's are on and off. But how is the computer flipping the circuits on and off? Something must tell it to do that, but then what is telling it to follow those instructions, and so on?

I am unfamiliar with the subject, but I have friends who took circuits or logic circuits or something like that, so I guess that at the most basic level, logic circuits are used in a seemingly magical way to control everything. Must all code carry with it instructions that activate those circuits?

I don't know. It's weird to me, almost as if there must be a brain in there deciphering the code and carrying out the orders.
 
  • #2
What reading have you been doing so far on this? Seeing what you've been reading and not understanding will help us to answer your question the best that we can. Thanks. :smile:
 
  • #3
A tutorial video like this might help. There are others on YouTube.

[embedded tutorial video]
There is no magic. It is just boolean logic. I'm not sure the level of understanding you seek. Do you need a description of the CPU down to the level of each transistor?
 
  • #4
anorlunda said:
There is no magic. It is just boolean logic. I'm not sure the level of understanding you seek. Do you need a description of the CPU down to the level of each transistor?
I know there's no magic, and I don't think that video is what I'm looking for. But yes, that's probably it: how the CPU works at the transistor level. Not just the generic explanation you get in any course, but how exactly, physically, the CPU is processing.

I can understand how a computer worked back in the days when human involvement was required: manually plugging and unplugging cables as code, or feeding in punch cards. When computers required human intercession, the software was physically altering the system. What I don't understand is how a CPU does that on its own.
 
  • #5
I think you'll have to build one.
I learned by troubleshooting them. Hung a 'scope on every IC in them.
Of course that was in the days of 7400 logic ICs, when a CPU occupied three boards seventeen inches square and you could do that.
By 1974 they'd progressed to the point that a CPU was one IC the size of your thumb.

Hobbyists are building them for the exact reason you state.
https://eater.net/8bit/

good luck
old jim
 
  • #6
If you are curious enough to study, I recommend choosing the famous 6502 chip as your object.

A simple Google search for 6502 will point you to numerous books, articles, emulators, circuit diagrams, and even fan clubs of people who study and celebrate the 6502. The 6502 was a very capable microprocessor, but much, much simpler than today's CPUs. Those 6502 groups and forums are filled with people who share your interests, and they can probably help you with very specific suggestions.

Once upon a time, I had all the 6502 op codes memorized, and I could read code in binary. Those were fun times. :biggrin:
 
  • #8
Typically the opcode and/or its options are used to index the equivalent of tables, which in turn result in running a sequence of micro-operations and/or more table lookups. In some simple processors there's a single table lookup with a large number of bits per entry: one bit for every possible operation. I recall a 16-bit micro based on the AMD 2900 family of chips that had just under 80 possible combinations of operations, so it used a table that was 80 bits wide, with only one bit set per entry.
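To make that table-lookup idea concrete, here is a minimal sketch in Python. The opcodes, table width, and operation names are invented for illustration (this is not the AMD 2900's actual instruction set); it only shows the shape of a one-hot control table, where each entry has exactly one bit set and the bit's position selects the operation:

# One-hot control table sketch (invented opcodes, not a real CPU):
# each table entry is a wide word with exactly ONE bit set, and the
# position of that bit selects which operation the control logic fires.

OPERATIONS = ["reg_to_alu", "alu_add", "alu_sub", "write_back", "halt"]

CONTROL_TABLE = {          # opcode -> one-hot entry, one bit set per entry
    0b0001: 0b00001,       # bit 0 -> reg_to_alu
    0b0010: 0b00010,       # bit 1 -> alu_add
    0b0011: 0b00100,       # bit 2 -> alu_sub
    0b0100: 0b01000,       # bit 3 -> write_back
    0b1111: 0b10000,       # bit 4 -> halt
}

def decode(opcode: int) -> str:
    """Look up the opcode and return the operation its single bit selects."""
    entry = CONTROL_TABLE[opcode]
    assert (entry & (entry - 1)) == 0, "exactly one bit must be set"
    return OPERATIONS[entry.bit_length() - 1]

print(decode(0b0010))  # alu_add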
 
  • #9
rcgldr said:
Typically the opcode and/or its options are used to index the equivalent of tables, which in turn result in running a sequence of micro-operations and/or more table lookups.

CPUs with microcode add another level of abstraction. That's why I recommended the 6502: no microcode.
 
  • #10
grandpa2390 said:
how the CPU works at the transistor level. Not just the generic explanation you get in any course, but how exactly, physically, the CPU is processing.

You can start at the transistor level and work up from there by looking into Digital Logic. In my undergrad Digital Logic class we actually designed a very, very basic CPU. Unfortunately you just won't be able to really understand this stuff without spending a significant amount of time and effort with a good textbook on the subject. You can watch as many videos as you want, but until you get down and dirty solving digital logic problems you won't truly understand.
 
  • #11
The punch card is just in memory now, as a bunch of on/off bits. The clock is cycling: fetching, decoding, executing. It's built in. Your program's instructions are loaded in memory, and you have a number pointing to where the next one is. That number increments, or an instruction can change it.

It seems highly complex, but all of the components individually are fairly simple. How they are combined to make more complex machinery is also relatively simple, at each level. The key to managing the complexity is abstraction.
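As a rough sketch of that fetch-increment-execute cycle, here is a toy three-instruction machine in Python. The opcodes and layout are invented purely for illustration; no real CPU works exactly like this, but the loop structure is the same idea the hardware clocks through:

# Toy fetch-decode-execute loop (invented 3-instruction machine).
# Memory holds both the program and its operands as plain numbers.

memory = [0x01, 42,      # LOAD <value>  -> accumulator = 42
          0x02, 8,       # ADD  <value>  -> accumulator += 8
          0x00]          # HALT
acc = 0                  # accumulator register
pc = 0                   # program counter: "the number pointing to the next one"

while True:
    opcode = memory[pc]              # fetch
    if opcode == 0x00:               # decode + execute: HALT
        break
    operand = memory[pc + 1]
    if opcode == 0x01:               # LOAD immediate
        acc = operand
    elif opcode == 0x02:             # ADD immediate
        acc += operand
    pc += 2                          # increment past opcode + operand

print(acc)  # 50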
 
  • #12
Drakkith said:
You can start at the transistor level and work up from there by looking into Digital Logic. ... You can watch as many videos as you want, but until you get down and dirty solving digital logic problems you won't truly understand.
Exactly. Many things that you just can't understand from a book become familiar and trivial once you start using them. Don't try to expand your understanding without any practical experience.
 
  • #13
grandpa2390 said:
I don't know. It's weird to me, almost as if there must be a brain in there deciphering the code and carrying out the orders.

It seems that way because there IS a brain in there deciphering the code and carrying out the orders.

A processor has a set of instruction codes it is designed to understand. A common design feature is also to have "registers", which are a few memory locations right on the chip. Things done on registers are very fast. Let's say you have four registers A, B, C and D. Then a built-in instruction might be ADD A, B, C (encoded in some binary format) which the processor is wired to understand as "Add what's in register A to what's in register B and store the result in register C". When the processor is executing that instruction, it invokes special addition circuitry which is also right on the chip, as well as the circuitry which reads and writes things to the desired places.

Then it gets the next instruction and does whatever that says. And the next. And the next.

You might want to pick up a book on assembly language for some processor, any processor, just to see what those fundamental instructions look like. Assembly language is the text version of the commands. The processor of course encodes them as 1's and 0's. Maybe ADD is instruction 0101 and registers A, B, C, D are written as binary 00, 01, 10, 11 so ADD A, B, C to the computer really looks like 0101 00 01 10 for example.

The binary language, what the CPU actually understands, is called "machine language". There isn't really a good reason to learn or understand that unless you're really curious. And these days there is very little call for people to work in the slightly more-readable assembly language either.
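Following that hypothetical encoding (the 0101 opcode and the two-bit register codes are the invented example above, not any real instruction set; the field order is assumed to be opcode, then the two sources, then the destination), here is a small Python sketch of packing and unpacking such an instruction word:

# Encode/decode the hypothetical format above:
# 4-bit opcode + three 2-bit register fields, e.g. ADD A, B, C -> 0101 00 01 10.

OPCODES = {"ADD": 0b0101}
REGS = {"A": 0b00, "B": 0b01, "C": 0b10, "D": 0b11}

def assemble(op: str, src1: str, src2: str, dst: str) -> int:
    """Pack the mnemonic and register names into one 10-bit machine word."""
    return (OPCODES[op] << 6) | (REGS[src1] << 4) | (REGS[src2] << 2) | REGS[dst]

word = assemble("ADD", "A", "B", "C")
print(f"{word:010b}")   # 0101000110 -> the 1's and 0's the CPU actually sees

# Decoding is just the reverse: mask off each field.
opcode = (word >> 6) & 0b1111
src1, src2, dst = (word >> 4) & 0b11, (word >> 2) & 0b11, word & 0b11
print(opcode == OPCODES["ADD"], src1, src2, dst)  # True 0 1 2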
 
  • #14
RPinPA said:
And these days there is very little call for people to work in the slightly more-readable assembly language either.

It's good to have at least a passing understanding of assembly. If you ever get into trouble debugging, you might get dumped into an assembly view. Also, if your code is not working, you can compare the assembly to the C, and it will show you which part of the C language you violated.

BoB
 
  • #15
grandpa2390 said:
One thing that has always mystified me is how the hardware carries out a coded instruction. The code is 1's and 0's, and if I have understood anything, 1's and 0's are on and off. But how is the computer flipping the circuits on and off? Something must tell it to do that, but then what is telling it to follow those instructions, and so on?

I am unfamiliar with the subject, but I have friends who took circuits or logic circuits or something like that, so I guess that at the most basic level, logic circuits are used in a seemingly magical way to control everything. Must all code carry with it instructions that activate those circuits?

I don't know. It's weird to me, almost as if there must be a brain in there deciphering the code and carrying out the orders.
You might want to look into what's called a finite state machine. Also, you should probably concentrate on the gate level, rather than the transistor level.
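As a minimal illustration of the finite-state-machine idea, here is a made-up two-state example in Python (a turnstile, not a CPU controller): each step combines the current state with the input to pick the next state.

# Tiny finite state machine (invented example): a turnstile with two
# states. The next state depends only on (current state, input event).

TRANSITIONS = {
    ("locked",   "coin"): "unlocked",
    ("locked",   "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

state = "locked"
for event in ["push", "coin", "push"]:       # inputs arriving on each "tick"
    state = TRANSITIONS[(state, event)]
    print(event, "->", state)
# push -> locked, coin -> unlocked, push -> locked

A CPU's control unit is essentially a much larger machine of this kind, with the instruction register and status flags as inputs and the control signals as outputs.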
 
  • #16
There are newer editions, but this Tanenbaum book from 1998 does a good job of explaining how it all happens, from start to finish. At least, it would help with a conceptual understanding. This older edition can be had for a mere $12 used on Amazon: https://www.amazon.com/dp/0130959901/?tag=pfamazon01-20
 
  • #17
anorlunda said:
If you are curious enough to study, I recommend choosing the famous 6502 chip as your object.
I preferred the 6809. Ahh, assembly language programming on the Color Computer...
 
  • #18
grandpa2390 said:
at the most basic level, logic circuits are used in a seemingly magical way to control everything. Must all code carry with it instructions that activate those circuits?
Yup.

For a reasonably basic look starting at the transistor level:
http://www.circuitstoday.com/logic-gates
(That page, and much more, can be found with: https://www.google.com/search?&q=logic+gate+circuit+diagram)

An early bible for computer operation (and much more) is The Art of Computer Programming, Volume 1: Fundamental Algorithms, by Donald E. Knuth, Addison-Wesley Publishing Company, ISBN 0-201-03809-9. (That's the second edition; I believe there are later ones.)

Digital Logic is based on signals that are either On or Off (One or Zero, True or False); just like your refrigerator is either On or Off.

Then realize that any logical operation (or mathematical one, for that matter) can be created from the three logical operations AND, OR, and NOT. (It can also be done with other operations, but these are the most used and simple enough to keep straight.) That's the field called Boolean algebra, quite useful when you get in a bit deeper. The three are defined below, with a short sketch of composing them after the list.

AND = output is True if all inputs are True
OR = output is True if any one or more inputs are True
NOT = output is True if input is False; output is False if input is True (i.e. the output is the complement of the input)
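Here is a minimal sketch in plain Python, with truth values standing in for voltage levels, showing that other operations really can be built from just these three; the XOR composition is a standard Boolean identity, not anything specific to this thread:

# The three primitive gates, with booleans standing in for voltages.
def AND(a, b): return a and b     # True only if all inputs are True
def OR(a, b):  return a or b      # True if any input is True
def NOT(a):    return not a       # complement of the input

# Composing them: XOR = (a OR b) AND NOT (a AND b)
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "->", int(XOR(a, b)))
# 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0

A half adder (one-bit addition) is just XOR for the sum bit and AND for the carry bit, which is where arithmetic-from-logic starts.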

Hope this helps.
Cheers,
Tom
 
  • #19
Drakkith said:
You can start at the transistor level and work up from there by looking into Digital Logic. In my undergrad Digital Logic class we actually designed a very, very basic CPU. Unfortunately you just won't be able to really understand this stuff without spending a significant amount of time and effort with a good textbook on the subject. You can watch as many videos as you want, but until you get down and dirty solving digital logic problems you won't truly understand.
The Android puzzle game Circuit Scramble (free, with a small ad bar on its own screen only) is fun and instructive for neophytes learning the workings of logic gates: https://apkpure.com/circuit-scramble-computer-logic-puzzles/com.Suborbital.CircuitScramble/download (direct download link from apkpure).
 

1. How do computers understand code?

Computers understand code through a process called compilation or interpretation. In compilation, the code is translated into machine code that the computer can directly execute. In interpretation, the code is read and executed line by line by a program called an interpreter.
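For instance, an interpreted language like Python first compiles source to an internal bytecode and then interprets that. The standard library's dis module lets you see those instructions (the exact opcode names vary by Python version):

import dis

def add(a, b):
    return a + b

dis.dis(add)   # prints the bytecode the interpreter executes, e.g.
               # LOAD_FAST a, LOAD_FAST b, a BINARY_ADD / BINARY_OP (+),
               # then RETURN_VALUE (names vary by Python version)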

2. What is the role of the CPU in executing code?

The CPU, or central processing unit, is responsible for executing code. It retrieves instructions from memory, decodes them, and then executes them. The speed and efficiency of the CPU play a crucial role in the overall performance of a computer.

3. How does memory play a role in executing code?

Memory is where the instructions and data needed to execute code are stored. When the CPU needs to execute a specific instruction, it retrieves it from memory. The amount and speed of memory also affect the overall performance of a computer.

4. What is the difference between compiled and interpreted code?

In compiled code, the source is converted into machine code before execution. This results in faster execution but requires a separate compilation step. In interpreted code, the source is executed line by line without being converted into machine code first, which makes debugging easier but can be slower.

5. Can a computer execute multiple lines of code at once?

Yes, modern computers have multiple cores, which allow them to execute multiple streams of instructions simultaneously. This is known as parallel processing and can greatly improve the speed and efficiency of executing code.
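A minimal sketch of that idea in Python, using the standard multiprocessing module to spread work across cores (the square function is an arbitrary example workload, not anything from this thread):

from multiprocessing import Pool

def square(n: int) -> int:        # arbitrary example workload
    return n * n

if __name__ == "__main__":
    with Pool() as pool:          # one worker process per CPU core by default
        results = pool.map(square, range(8))   # chunks run in parallel
    print(results)                # [0, 1, 4, 9, 16, 25, 36, 49]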
