
How a computer works

  1. Jul 24, 2011 #1
    Well, I want to learn how a computer works, and to understand everything about computers I decided to build one. I found this site: www.mycpu.eu and it's very complete, but it doesn't explain the electronic and physical foundations of what he did to construct his computer (it doesn't explain how he designed each of the components). Because I want to learn, even if only in a basic way, how every piece is designed (including all kinds of calculations), and my final aim is to know how a computer "knows" that, for example, 011111010101011 means
    "save a file" and not "play this song". I don't know if you understand me.

    Can somebody help me? lol :D





    Sorry for my bad English, I only want to learn.
     
  2. jcsd
  3. Jul 25, 2011 #2
    Generally, when you write a program, you use a high-level programming language that resembles English to some extent, in that you can read it without too much hassle. Of course it is more formal than English, and the syntax must be exact, because the computer won't understand what you mean if you misspell something; being a machine, it has no idea of the context.

    So, you have a program written in text that a human can read. In order for the computer to understand the program, it must be translated into "machine code", a sequence of 1's and 0's, so you use a compiler to convert the program into something the computer can "read".

    As an easy example, the text that you type on your screen is not stored that way in your computer's memory. Each character in the text is represented as a small sequence of 1's and 0's according to the ASCII conversion table (there are other encodings too).
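    Here is a minimal C++ sketch of that idea: it prints each character of a string alongside the number it is actually stored as, and that number's eight on/off bits.

    Code (Text):
    ```cpp
    #include <bitset>
    #include <iostream>
    #include <string>

    // Each character is really stored as a small number; show that number
    // and its 8-bit pattern of 1's and 0's.
    std::string bits_of(char c) {
        return std::bitset<8>(static_cast<unsigned char>(c)).to_string();
    }

    int main() {
        for (char c : std::string("Hi")) {
            std::cout << c << " = " << int(c) << " = " << bits_of(c) << "\n";
        }
        // H = 72 = 01001000
        // i = 105 = 01101001
    }
    ```
    So "Hi" sits in memory as the bit pattern 01001000 01101001; the ASCII table is just the agreed mapping between those numbers and the letters on your screen.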

    So let's say you have a function in your program to save a file:

    Code (Text):
    function SaveFile()
    {
        do_some_stuff;
        save_the_file;
    }
    When you translate that into machine code by compiling the program it becomes:

    Code (Text):
    11001010100011001101011100011111010101
    A very specific bit sequence that the computer interprets as the instructions to save the file. The hardware in your computer is designed to "read" things in binary, not English.

    For the record, the reason we use 1's and 0's (and not, say 0-9) is because 1 and 0 correspond very nicely to the on/off position of an electrical switch, which is what your computer is made of, millions of them, billions, trillions even, all working in complex but logical ways.
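    To make the switch idea concrete, here is a toy model in C++ (a simplification, of course; real hardware does this with transistors, not `bool`s): each switch is a `bool`, and combining switches through logic gates is already enough to do arithmetic. The half adder below adds two one-bit numbers using nothing but AND, OR, and NOT.

    Code (Text):
    ```cpp
    #include <cassert>

    // Model each electrical switch as a bool: true = on (1), false = off (0).
    bool AND(bool a, bool b) { return a && b; }
    bool OR (bool a, bool b) { return a || b; }
    bool NOT(bool a)         { return !a; }

    // A half adder: adds two one-bit numbers, built only from gates.
    void half_adder(bool a, bool b, bool &sum, bool &carry) {
        sum   = OR(AND(a, NOT(b)), AND(NOT(a), b)); // XOR built from AND/OR/NOT
        carry = AND(a, b);
    }

    int main() {
        bool s, c;
        half_adder(true, true, s, c); // 1 + 1 = 10 in binary
        assert(s == false && c == true);
    }
    ```
    Chain enough of these gates together and you get adders, comparators, memory cells, and eventually a whole CPU; that is what "billions of switches working in complex but logical ways" amounts to.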
     
  4. Jul 25, 2011 #3

    chiro

    User Avatar
    Science Advisor

    There are different perspectives and each perspective helps contribute to the whole understanding.

    For example, most programmers (should) know that all your data and code is stored somewhere in memory, and depending on the actual architecture of the platform you are using (which determines the instruction set), the understanding of what is going on is going to be a mixture of flow control and state space. You can tackle this at multiple levels, from the high-level view of a language like C++ down to the low level (i.e. the actual assembly code or, if you aren't as fortunate, the machine code).

    If you want to get the whole picture, I recommend you look at the OSI model. It will give you a good enough understanding to think about computers from all of these perspectives.

    The perspective I described above takes for granted that there is an implementation to perform all the logical operations, to store binary data, and so on. All programmers take for granted that when a computer runs, it will execute instructions in a particular way, it will do them reliably (the chance of unreliable execution is very small), it will access memory in a particular way, and so on.

    If you want to go deeper into the rabbit hole, you will need to get into engineering knowledge, and if you want to go even deeper, then you will be delving into physics, material science, and some deep applied mathematics.

    So if you read about the OSI model in networking, you can use analogues from it to think about layering in the context of computing.

    If you just want to know how you go from 111010100101 to "do this", you can skip the OSI and download an architecture manual from a CPU vendor like Intel or AMD. You won't get the knowledge about how the CPU physics work, but you'll understand exactly how you go from binary to output.
     
  5. Jul 25, 2011 #4
    This is what I want to know. It may sound strange, but at my university I studied the microprocessor, RAM, the hard disk, C++, Java, assembly language, mantissas... etc., but I still don't know the essence of how a computer works, which is exactly what I was asking you about.
     
  6. Jul 26, 2011 #5

    chiro

    User Avatar
    Science Advisor

    Answering your question completely would require a lot of different knowledge on a lot of different subjects. I will answer it only from one perspective.

    If you want a "high-level" answer of how you go from "11010100111" to "do this", get the architecture manual from a website like Intel or AMD.

    Also get some documentation of executable formats for different platforms: like windows or unix/linux based OS's.

    Once you understand how the machine "prepares" for execution, then you can look at the machine code of the actual instructions.

    The basic idea is once the system is "prepared" for an application, it sets the instruction pointer and then does a fetch and execute cycle.

    At a simple level it will read a byte. Depending on that byte it will read more bytes, and the process goes on until the application is closed (or crashes). This is a very superficial explanation, however, because a lot of variables play important roles in the background, especially things like interrupts and how the state space (in particular the important registers) affects the computation.

    I would take the simple route and assume that you don't have to worry about interrupts for the moment. Look at the instructions and their opcode representation. This will provide the first clue of how you turn those 1's and 0's into "do this".
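    The fetch-and-execute loop above can be sketched in a few lines of C++. To be clear, the opcodes here (0x01, 0x02, 0xFF) are invented for illustration; a real instruction set like the ones in the Intel or AMD manuals is far larger, but the loop has exactly this shape.

    Code (Text):
    ```cpp
    #include <cstdint>
    #include <iostream>
    #include <vector>

    // A toy CPU: one accumulator register, one instruction pointer,
    // and a fetch / decode / execute loop over bytes in "memory".
    uint8_t run(const std::vector<uint8_t>& memory) {
        uint8_t acc = 0;   // accumulator register
        size_t  ip  = 0;   // instruction pointer

        while (true) {
            uint8_t opcode = memory[ip++];          // fetch one byte
            switch (opcode) {                       // decode, then execute
                case 0x01: acc  = memory[ip++]; break;  // LOAD immediate
                case 0x02: acc += memory[ip++]; break;  // ADD immediate
                case 0xFF: return acc;                  // HALT
            }
        }
    }

    int main() {
        // "Machine code": LOAD 5, ADD 7, HALT
        std::cout << int(run({0x01, 5, 0x02, 7, 0xFF})) << "\n"; // prints 12
    }
    ```
    That sequence of raw bytes, {0x01, 5, 0x02, 7, 0xFF}, is the "1's and 0's" of this toy machine: the hardware turns them into "do this" purely by the meaning the decoder assigns to each opcode.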

    If you want to go deeper and ask questions like "how does the CPU fetch a word from RAM" or similar questions you will need to look at some computer engineering texts and books on digital systems. You could theoretically end up reading papers on semiconductors if you wanted to go deep enough.
     
  7. Jul 26, 2011 #6

    rcgldr

    User Avatar
    Homework Helper

    That mycpu site is using fairly low level components. You should be able to find out about the circuitry used in those components, to get an idea of how the mycpu computer works. You didn't mention what your knowledge of electronics is.
     