How is a Computer Designed and Built?

  • Thread starter: SpanishPhysic
  • Tags: Computer Works
AI Thread Summary
To understand how computers work at a fundamental level, it is essential to grasp the relationship between high-level programming languages and machine code, the binary sequences (1's and 0's) that computers interpret. High-level languages like C++ or Java are translated into machine code by compilers, allowing the computer to execute commands such as saving a file. Each character or command is represented by a specific binary sequence, often based on standards like ASCII.

For a deeper understanding of computer architecture, resources such as the architecture manuals from CPU manufacturers like Intel or AMD are recommended. These documents explain how binary instructions are processed by the CPU, detailing the fetch-execute cycle in which the system reads and executes instructions sequentially. Additionally, exploring the OSI model can provide insight into networking and computing perspectives. For those interested in the electronic foundations of computers, studying digital systems and computer engineering texts will be beneficial, and understanding the circuitry and components used in computer construction will further clarify how computers operate at a fundamental level.
SpanishPhysic
Well, I want to learn how a computer works, and to understand everything about computers I decided to build one. I found this site: www.mycpu.eu and it's very complete, but the author doesn't explain the electronic and physical foundations of what he did to construct his computer (he doesn't explain how he designed each component). I want to learn, even if only at a basic level, how every piece is designed (including all kinds of calculations). My final aim is to know how a computer "knows" that, for example, 011111010101011 means "save a file" and not "play this song". I don't know if you understand me.

Can somebody help me? lol :D

Sorry for my bad English, I only want to learn.
 
Generally, when you write a program, you use a high-level programming language that resembles English to some extent, in that you can read it without too much hassle. Of course it is more formal than English, and the syntax must be exact, because the computer won't understand what you mean if you misspell something; being a machine, it has no idea of the context.

So, you have a program written in text that a human can read. In order for the computer to understand the program, it must be translated into "machine code", a sequence of 1's and 0's, so you use a compiler to convert the program into something the computer can "read".

As an easy example, the text that you type on your screen is not stored that way in your computer's memory. Each character in the text is represented as a short sequence of 1's and 0's according to the ASCII conversion table (there are other encodings too).

So let's say you have a function in your program to save a file:

Code:
void SaveFile(void)
{
    do_some_stuff();
    save_the_file();
}

When you translate that into machine code by compiling the program it becomes:

Code:
11001010100011001101011100011111010101

A very specific number sequence that the computer interprets correctly, as a function to save the file. The hardware in your computer is designed to "read" things in binary, not English.

For the record, the reason we use 1's and 0's (and not, say, 0-9) is that 1 and 0 correspond very nicely to the on/off positions of an electrical switch, which is what your computer is made of: millions of them, billions even, all working in complex but logical ways.
 
SpanishPhysic said:
Well, I want to learn how a computer works, and to understand everything about computers I decided to build one. I found this site: www.mycpu.eu and it's very complete, but the author doesn't explain the electronic and physical foundations of what he did to construct his computer (he doesn't explain how he designed each component). I want to learn, even if only at a basic level, how every piece is designed (including all kinds of calculations). My final aim is to know how a computer "knows" that, for example, 011111010101011 means "save a file" and not "play this song". I don't know if you understand me.

Can somebody help me? lol :D

Sorry for my bad English, I only want to learn.

There are different perspectives and each perspective helps contribute to the whole understanding.

For example, most programmers (should) know that all your data and code is stored somewhere in memory, and depending on the actual architecture of the platform you are using (which determines the instruction set), your understanding of what is going on will be a mixture of flow control and state space. You can tackle this at multiple levels, from the high-end view of a high-level language (like C++) down to a low level (i.e. the actual assembler code or, if you aren't as fortunate, the machine code).

If you want the whole picture, I recommend you look at the OSI model. It will give you a good enough framework to think about computers from all of these perspectives.

The perspective I described above takes for granted that there is an implementation to perform all the logical operations, to store binary data, and so on. All programmers take for granted that when a computer runs, it will execute instructions in a particular way, do so reliably (the probability of a hardware error is very small), access memory in a particular way, and so on.

If you want to go deeper into the rabbit hole, you will need to get into engineering knowledge, and if you want to go even deeper, then you will be delving into physics, material science, and some deep applied mathematics.

So once you have read about the OSI model in networking, you can use analogues from it to think about layering in the context of computing.

If you just want to know how you go from 111010100101 to "do this", you can skip the OSI model and download an architecture manual from a CPU vendor like Intel or AMD. You won't learn how the CPU works physically, but you'll understand exactly how you go from binary to output.
 
chiro said:
If you just want to know how you go from 111010100101 to "do this", you can skip the OSI model and download an architecture manual from a CPU vendor like Intel or AMD. You won't learn how the CPU works physically, but you'll understand exactly how you go from binary to output.

This is what I want to know. It may seem strange, but at my university I studied the microprocessor, RAM, hard disks, C++, Java, assembly language, mantissas, etc., but I still don't know the essence of how a computer works, which is exactly what you are describing.
 
SpanishPhysic said:
This is what I want to know. It may seem strange, but at my university I studied the microprocessor, RAM, hard disks, C++, Java, assembly language, mantissas, etc., but I still don't know the essence of how a computer works, which is exactly what you are describing.

Answering your question completely would require a lot of different knowledge on a lot of different subjects. I will answer it only from one perspective.

If you want a "high-level" answer of how you go from "11010100111" to "do this", get the architecture manual from a vendor like Intel or AMD.

Also get some documentation on executable formats for different platforms, like Windows or Unix/Linux based operating systems.

Once you understand how the machine "prepares" for execution, then you can look at the machine code of the actual instructions.

The basic idea is once the system is "prepared" for an application, it sets the instruction pointer and then does a fetch and execute cycle.

At a simple level it will read a byte. Depending on that byte it will read more bytes, and the process goes on until the application is closed (or crashes). This is a very superficial explanation, however, because a lot of variables play important roles in the background, especially things like interrupts and the way the state space (particularly the important registers) affects computation.

I would take the simple route and assume that you don't have to worry about interrupts for the moment. Look at the instructions and their opcode representation. This will provide the first clue of how you turn those 1's and 0's into "do this".

If you want to go deeper and ask questions like "how does the CPU fetch a word from RAM" or similar questions you will need to look at some computer engineering texts and books on digital systems. You could theoretically end up reading papers on semiconductors if you wanted to go deep enough.
 
That mycpu site uses fairly low-level components. You should be able to find out about the circuitry used in those components to get an idea of how the mycpu computer works. You didn't mention what your knowledge of electronics is.
 