Logic Gates and CPUs: Basic Design Structure of Computer Processors

AI Thread Summary
The discussion centers on the basic design structure of computer processors, emphasizing that processors are fundamentally composed of millions of logic gates, such as AND, OR, and NOR gates. These gates are organized to perform operations dictated by a sequence of instructions, effectively transforming input data into output. The conversation clarifies that while processors can theoretically be built using a single type of gate, modern microprocessors utilize a vast number of gates to enable parallel processing, enhancing computational speed.

Participants express a desire for a clearer understanding of how processors function, particularly in relation to quantum computing. They note that classical processors read binary data from memory, execute operations via logic gates, and write results back to memory or output devices. The discussion also touches on the importance of understanding computer architecture and synchronization in grasping how processors work, suggesting that a deeper study of digital systems and their components is necessary for a comprehensive understanding. Overall, the conversation highlights the complexity and organization of logic gates within processors and their role in computing.
Higgy
I would like to know the basic design structure of computer processors.

My concept of a processor is that it reads some binary data from memory, performs operations on it (according to a set of instructions, which it also reads in), and then writes the result to memory somewhere. (Is this correct?)

What I'm curious about is the "performs operations on it" part. Is the "operations" section of a processor just a huge array of millions of AND, OR, NOR, etc, gates, which are invoked in a sequence dictated by the "instructions"?

If that's true, could you (in principle) have a processor made out of just a single set of logic gates? If a processor is really just a chip with millions of logic gates on it, do modern microprocessors have so many gates simply so that many operations can be performed at once in parallel (to speed up computing time)?

Also, if anyone can point me to a nice discussion of this aspect of computing, I'd be really grateful. Thanks!
 
rcgldr said:
I'm not sure what you mean by a single set, but a computer could use a single type of gate, such as all NAND or all NOR gates, to implement its logic.

Wiki articles:

http://en.wikipedia.org/wiki/NAND_logic

http://en.wikipedia.org/wiki/NOR_logic
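The NAND-only construction rcgldr links to is easy to verify directly. The sketch below builds NOT, AND, and OR out of nothing but a NAND function and checks their truth tables exhaustively (the function names are just illustrative Python, not any standard library):

```python
# Building NOT, AND, and OR from NAND alone -- a sketch of why a single
# gate type suffices (NAND is "functionally complete").

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):          # NOT x  ==  x NAND x
    return nand(a, a)

def and_(a, b):       # a AND b  ==  NOT(a NAND b)
    return not_(nand(a, b))

def or_(a, b):        # De Morgan: a OR b  ==  (NOT a) NAND (NOT b)
    return nand(not_(a), not_(b))

# Exhaustively check the truth tables against Python's bitwise operators.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
    assert not_(a) == (1 - a)
```

The same game works with NOR gates, which is why either gate alone is enough to build an entire processor in principle.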

Thanks for answering. Maybe my question is too elementary, but I mostly just want to know if the chip in my computer that's called the CPU is really just filled with logic gates. I've taken a college course in logic (philosophy dept.) and a college course in circuits (physics dept.) which covered digital electronics, basic addition with NAND, NOR, etc, gates, plus I've been reading quite a bit online.

Unfortunately, I've found that most material falls into one of three categories: too simplistic ("a processor processes stuff!"), too abstract ("imagine a Turing machine..."), or too advanced to address this question.

To be completely transparent, I'm getting interested in quantum computing/information, and I'm trying to construct a mental picture of what a realization of a quantum computer would look like, by way of analogy to a classical computer. For example, a classical computer has a hard disk which stores binary information in the orientation of magnetic moments. This is read into memory (on/off transistors?), and then fed into the "processor" (logic gates?), and the output is written into memory or to the terminal.

Discussions of quantum computers often involve operations (pulse trains) on localized entities (ions, atoms, NVs, etc). I want to create a mental picture of how a real quantum computer might read in information, perform operations on it, and then write out data. There's plenty of literature out there that explains this, I know, but all within the universe of quantum computers. I would be more comfortable being able to say "oh, this here is what we'd call the hard disk in a normal computer, and this here is what we'd call the CPU...".
 
Higgy said:
I would like to know the basic design structure of computer processors.

My concept of a processor is that it reads some binary data from memory, performs operations on it (according to a set of instructions, which it also reads in), and then writes the result to memory somewhere. (Is this correct?)
Sure...

What I'm curious about is the "performs operations on it" part. Is the "operations" section of a processor just a huge array of millions of AND, OR, NOR, etc, gates, which are invoked in a sequence dictated by the "instructions"?
I don't think this is a good characterization of what's going on in a processor...

I think the reverse is more accurate: the sequence of instructions that the processor can perform are implemented as a cascade of millions of logic gates...
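One way to picture "instructions implemented as gates" is a one-bit ALU slice: every operation's gates compute their result on every cycle, and the instruction's opcode merely selects which result is routed to the output through a multiplexer. This is a toy sketch; the opcode encoding is invented for illustration, not taken from any real CPU:

```python
# A 1-bit ALU slice. All operations are computed by fixed gates all the
# time; the 2-bit opcode just selects which result reaches the output.

def alu_1bit(a, b, carry_in, opcode):
    and_out = a & b
    or_out = a | b
    xor_out = a ^ b
    sum_out = a ^ b ^ carry_in                  # full-adder sum
    carry_out = (a & b) | (carry_in & (a ^ b))  # full-adder carry
    results = {0: and_out, 1: or_out, 2: xor_out, 3: sum_out}
    return results[opcode], carry_out           # opcode acts as a mux select

# Same inputs, different opcode -> different operation.
assert alu_1bit(1, 1, 0, 0) == (1, 1)   # AND
assert alu_1bit(1, 0, 0, 3) == (1, 0)   # ADD: 1+0 = 1, no carry
assert alu_1bit(1, 1, 0, 3) == (0, 1)   # ADD: 1+1 = 0, carry 1
```

Chaining 32 or 64 of these slices through the carry line gives a word-wide ALU, which is the sense in which the instruction set is "implemented as a cascade of gates."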

If that's true, could you (in principle) have a processor made out of just a single set of logic gates? If a processor is really just a chip with millions of logic gates on it, do modern microprocessors have so many gates simply so that many operations can be performed at once in parallel (to speed up computing time)?

Also, if anyone can point me to a nice discussion of this aspect of computing, I'd be really grateful. Thanks!
I'm really not sure what you're trying to say here. Indeed, if you've taken the courses that you mentioned then you should really already understand, in principle, how the processor works as a collection of logic gates...

I don't know anything about quantum computers beyond their potential existence but I'd be surprised if they were all that analogous to digital computers. If my understanding is correct, they will be able to do certain calculations with literally infinite parallelism...
 
Jocko Homo said:
I don't think this is a good characterization of what's going on in a processor...

I think the reverse is more accurate: the sequence of instructions that the processor can perform are implemented as a cascade of millions of logic gates...
So a processor is just a collection of millions of logic gates on a chip. Data is fed in, the gates transform it appropriately, and the result is outputted to the terminal, memory, whatever.

Jocko Homo said:
I'm really not sure what you're trying to say here. Indeed, if you've taken the courses that you mentioned then you should really already understand, in principle, how the processor works as a collection of logic gates...
Whether or not the CPU in my computer is just a collection of logic gates, and nothing else, is what I'm asking. Once I know that, I would understand (or be on my way to understanding) how it works.

Jocko Homo said:
I don't know anything about quantum computers beyond their potential existence but I'd be surprised if they were all that analogous to digital computers. If my understanding is correct, they will be able to do certain calculations with literally infinite parallelism...
Perhaps I shouldn't have mentioned my interest in quantum computers. It'll open up a can of worms, I'm sure.

Thanks for the help.
 
Higgy said:
So a processor is just a collection of millions of logic gates on a chip. Data is fed in, the gates transform it appropriately, and the result is outputted to the terminal, memory, whatever.
Whether or not the CPU in my computer is just a collection of logic gates, and nothing else, is what I'm asking. Once I know that, I would understand (or be on my way to understanding) how it works.
Yes, your "processor is just a collection of" logic gates. Incidentally, so is your RAM...

I don't know how much you know about processors but I'll mention, for interest's sake, that its registers are just built-in RAM in the sense that they perform exactly the same function: the temporary storage of data...
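The "registers are built-in RAM" point comes down to the same storage primitive: cross-coupled NAND gates that hold a bit. Below is a sketch of a gated D latch, iterated a few steps until the feedback loop settles (a real latch settles via analog feedback, not iteration, so this is only a model):

```python
# A gated D latch from NAND gates: while "enable" is high the output
# follows D; when enable drops, the cross-coupled pair holds the last
# value. This is the storage primitive behind registers and SRAM.

def nand(a, b):
    return 0 if (a and b) else 1

def d_latch(d, enable, q_prev):
    s = nand(d, enable)
    r = nand(nand(d, d), enable)
    q, q_bar = q_prev, 1 - q_prev
    for _ in range(3):          # iterate the feedback loop until stable
        q = nand(s, q_bar)
        q_bar = nand(r, q)
    return q

q = d_latch(1, 1, 0)   # enable high: latch the 1
assert q == 1
q = d_latch(0, 0, q)   # enable low: D is ignored, the 1 is held
assert q == 1
```

A register is just a row of these (plus edge-triggering logic); a RAM is a large grid of them with address-decoding gates in front.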

Perhaps I shouldn't have mentioned my interest in quantum computers. It'll open up a can of worms, I'm sure.
That's what I saw when you mentioned it...
 
Higgy, it might help you if you pick up an introductory book on digital electronics.

In particular you should look at things like clocks and synchronization to help you understand in depth how everything gets synchronized (instructions, guaranteed data reads and writes and so on), and then combine that perspective with the logical gates perspective.
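The role of the clock can be sketched in a few lines: state lives in registers that change only on a clock edge, while the combinational logic between them is always "live" and gets a full cycle to settle before its result is captured. A toy 4-bit counter, with the clock modeled as an explicit tick() call:

```python
# Clocked state: the "+1" logic is combinational and always evaluable,
# but the register only captures its output on a clock edge (tick).

class ClockedCounter:
    def __init__(self):
        self.register = 0           # state held in flip-flops

    def next_value(self):           # combinational logic, always "live"
        return (self.register + 1) % 16

    def tick(self):                 # rising clock edge: capture the result
        self.register = self.next_value()

c = ClockedCounter()
for _ in range(5):
    c.tick()
assert c.register == 5
c.next_value()                      # evaluating the logic alone changes nothing
assert c.register == 5
```

This separation, combinational logic between clocked registers, is what guarantees the synchronized reads and writes mentioned above.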
 
I'd recommend this book. It starts at logic gates and works its way up to binary arithmetic and machine language, then goes into assembler, compiler, and operating system programming. It also has tools for download and specifications for you to build and program your own virtual 16-bit computer. I actually used Minecraft's Redstone to work with the logic and build the computer instead of the included tools, but it worked all the same.
 
Higgy said:
I've taken a college course in logic (philosophy dept.) and a college course in circuits (physics dept.) which covered digital electronics, basic addition with NAND, NOR, etc, gates, plus I've been reading quite a bit online.
My advice is this: if you want to know how a processor works, you need less electronics and more computer architecture, i.e. more of the functional aspects of the logic circuits.
Those two courses are good, but you will need more to understand how a processor works: how to design digital systems (more advanced components such as registers, muxes, and multipliers, then using them to design even more advanced components such as ALUs, memories, and control units), how to synchronize and connect them, and how to control them together. That's what you need to learn to understand a CPU.
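How those components fit together can be sketched as a toy fetch-decode-execute loop: a register file, a program counter, and a control unit that dispatches each decoded instruction to the ALU. The 3-field instruction format here is invented purely for illustration:

```python
# A toy fetch-decode-execute loop: registers + control unit + ALU.

def run(program):
    regs = [0, 0, 0, 0]                   # register file
    pc = 0                                # program counter
    while pc < len(program):
        op, dst, src = program[pc]        # fetch and decode
        if op == "LOADI":                 # control unit dispatches...
            regs[dst] = src               # ...an immediate load
        elif op == "ADD":                 # ...or an ALU operation
            regs[dst] = regs[dst] + regs[src]
        pc += 1                           # advance to the next instruction
    return regs

regs = run([
    ("LOADI", 0, 7),    # r0 = 7
    ("LOADI", 1, 5),    # r1 = 5
    ("ADD",   0, 1),    # r0 = r0 + r1
])
assert regs[0] == 12
```

In hardware, every piece of this loop (the decoder, the dispatch, the ALU) is itself built from the gates discussed earlier; the Python control flow just stands in for the control unit's wiring.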
Higgy said:
So a processor is just a collection of millions of logic gates on a chip. Data is fed in, the gates transform it appropriately, and the result is outputted to the terminal, memory, whatever.
Basically, yes, organized into components. But it's an oversimplification.
Higgy said:
do modern microprocessors have so many gates simply so that many operations can be performed at once in parallel (to speed up computing time)?
Ermm... using the same 8 gates, you can design the component so that the transfer is parallel or serial; it's the bus that makes it parallel.
In principle the sequence of operations is executed serially (op1, op2, ... opN; I won't go into how this can be made parallel). Two transfers from one register to another happen serially, but each transfer itself is parallel across the bus: with an 8-bit bus, 8 bits are transferred simultaneously in the first transfer, and after a clock period the second transfer occurs in the same manner.
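The arithmetic behind that point is simple enough to write down: bus width determines how many clocked transfers a given payload needs. A small sketch (the function is hypothetical, just to make the serial-vs-parallel contrast concrete):

```python
# Bus width vs. transfer count: an 8-bit bus moves a byte in one
# transfer; a 1-bit (serial) link needs eight, one bit per clock.

def transfers_needed(value_bits, bus_width):
    # Ceiling division: how many bus cycles to move value_bits bits.
    return -(-value_bits // bus_width)

assert transfers_needed(8, 8) == 1    # parallel: whole byte at once
assert transfers_needed(8, 1) == 8    # serial: one bit per cycle
assert transfers_needed(16, 8) == 2   # two bus transfers for 16 bits
```

So widening the bus (or the ALU) is one kind of parallelism; executing multiple instructions at once is a separate, much more involved kind.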

In terms of quantum mechanics, I think that each component is an operator that changes the state of the input bits, though I'm not sure if this is the correct interpretation.
 