Simulate computer inside a computer

  1. Jan 26, 2016 #1
    If I simulate a virtual computer inside a real computer graphically, would it take more resources, or would it work like before? It's different from cloud computing; I'm simulating a computer graphically.
     
  3. Jan 26, 2016 #2

    anorlunda

    Science Advisor
    Gold Member

    Look up emulators and "virtual machine" on Wikipedia. After reading those articles, if you still have questions, post them here.

    Edit: I don't understand what you mean by graphically.
     
  4. Jan 26, 2016 #3

    russ_watters


    Staff: Mentor

    More resources than what? Before what?
     
  5. Jan 26, 2016 #4
    In general, yes. It will use more resources. There's no free lunch.

    It might be possible to trade resources, though. For example, certain unused portions of the CPU, like branch prediction, could be skipped by lowering the execution speed, since only one branch will be taken anyway.

    It also might be possible to limit resources by limiting the faithfulness of the simulation. A trivial example: A brick can simulate a computer that's been turned off.
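
    A minimal Python sketch of where that overhead comes from: an interpreting emulator for a made-up three-instruction guest machine (the instruction names are invented for illustration). Every guest instruction costs a fetch, a decode, and a dispatch on the host, i.e. several host operations per guest operation:

    Code:
    # Interpreting emulator for a hypothetical 3-instruction guest machine.
    def run(program):
        regs = {"A": 0}                    # single guest register
        pc = 0                             # guest program counter
        while pc < len(program):
            op, arg = program[pc]          # fetch
            if op == "LOAD":               # decode + execute
                regs["A"] = arg
            elif op == "ADD":
                regs["A"] += arg
            elif op == "HALT":
                break
            pc += 1                        # advance guest program counter
        return regs["A"]

    print(run([("LOAD", 2), ("ADD", 3), ("HALT", 0)]))  # -> 5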
     
  6. Jan 26, 2016 #5

    nsaspook

    Science Advisor

    Your modern Intel x86 instruction set CPU doesn't run native x86 instructions on the silicon. It runs RISC-like micro-instructions internally, with very complex micro-coded decoders for CISC instructions. It's both faster and more efficient to emulate/simulate the CISC instruction set on modern processes than to run the native instructions in silicon.

    http://www.hardwaresecrets.com/inside-pentium-m-architecture/4/
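
    A toy Python sketch of the idea (the instruction and micro-op names here are invented, not Intel's actual microcode): one complex CISC-style instruction expands into several simple RISC-like steps:

    Code:
    # Toy CISC-to-micro-op decoder.
    def decode(instruction):
        if instruction == "ADD [mem], reg":   # CISC: read-modify-write memory
            return ["LOAD  tmp, [mem]",       # micro-op 1: read memory
                    "ADD   tmp, reg",         # micro-op 2: ALU operation
                    "STORE [mem], tmp"]       # micro-op 3: write back
        return [instruction]                  # simple ops pass through as-is

    for uop in decode("ADD [mem], reg"):
        print(uop)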
     
  7. Jan 27, 2016 #6
    I thought simulating a computer would create a faster one, but I feel the exascale computer is still headed in the right direction. We'll wait for that and go from there.
     
  8. Jan 27, 2016 #7
    Well hmm, is it possible to simulate a computer with a particle simulation for the transistors? After all, they are just on and off switches. Each particle would represent an on or off switch, giving a different type of computing. Eventually it would simulate how a CPU works. That way you don't need to build those transistors; you just simulate how they work. But then again, simulating how these particles behave might take more computing power than a CPU can handle. What do you guys think? It's only a thought.
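
    A minimal Python sketch of that abstraction: model the transistors as ideal on/off switches, compose a NAND gate from them, and build other gates from NAND. The particle-level physics is deliberately ignored; that's the point of the abstraction:

    Code:
    # Transistors modeled as ideal switches: two in series pull a NAND
    # output low only when both are on.
    def nand(a, b):
        return 0 if (a and b) else 1

    # NAND is universal, so every other gate (and in principle a whole
    # CPU) can be composed from it.
    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))

    print(and_(1, 1), or_(0, 1), not_(1))  # -> 1 1 0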
     
  9. Jan 27, 2016 #8
    Typically they are not just on/off switches. Timing is critically important, and most systems use transition states to control it. For example, the state of one bit might be latched by the transition of another.

    These are technical issues for optical and quantum computing. Solving them is a high priority.
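
    A small Python sketch of that point (the structure is invented for illustration): a D flip-flop that latches its input only on the rising edge of a clock, i.e. on the transition rather than the level:

    Code:
    # Edge-triggered storage: the bit is captured on the 0 -> 1 clock
    # transition, not on the clock level.
    class DFlipFlop:
        def __init__(self):
            self.q = 0          # stored bit
            self.prev_clk = 0   # last clock level, used to detect edges

        def tick(self, clk, d):
            if self.prev_clk == 0 and clk == 1:  # rising edge: latch input
                self.q = d
            self.prev_clk = clk
            return self.q

    ff = DFlipFlop()
    for clk, d in [(0, 1), (1, 1), (0, 0), (1, 0)]:
        print(ff.tick(clk, d))  # prints 0, 1, 1, 0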
     
  10. Jan 27, 2016 #9
    You are right: all you need is timing and RAM. We'd need a lot of RAM, plus more RAM for the clock, assuming 1 byte each.
     
  11. Jan 27, 2016 #10
    Back when I was in school, we did a graphical simulator of the micro-coding of an 80x86 for teaching. It wasn't that hard. (Getting the artwork right was the hardest part.) But it was a simulator, not any sort of emulator intended to run code.
     
  12. Jan 29, 2016 #11
    Hmm, a quantum computer is for exponential speedups on certain calculations. Couldn't we optimize arithmetic or create a supercomputer with a graphical simulator and use that for our calculations? The actual computer device is limited to binary calculations, but a simulation program can do just about anything.
     
  13. Jan 29, 2016 #12

    analogdesign

    Science Advisor

    The simulation program is running on a computer that is limited to binary calculations. Where is the win?
     
  14. Jan 29, 2016 #13

    anorlunda

    Science Advisor
    Gold Member

    Enlighten me please, Jeff. Graphical? I don't know what you mean.
    • You used an interactive circuit builder in Spice?
    • You sketched artwork for a chip etching mask or a printed circuit mask?
     
  15. Jan 29, 2016 #14

    analogdesign

    Science Advisor

    SPICE is far, far too slow to use as an interactive microcode simulator.

    I'm sure Jeff meant he had a program that allowed the various registers to be printed to the screen as the simulation progressed. I've done similar things but I've always just printed them to stdout or redirected to a log file.
     
  16. Jan 29, 2016 #15
    Yes. The program displayed the CPU in block diagram form. It would execute commands by loading the registers, running the ALU, setting the flags, etc. Basically, it displayed the machine code and how the µcode drove its execution.

    It was intended for teaching CPU architecture classes.
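
    A rough Python sketch of what such a teaching simulator does internally (the register and micro-step names are invented for illustration): one machine instruction expands into micro-steps that move data between the registers and the ALU, printing the machine state after each step:

    Code:
    # Micro-steps for one made-up machine instruction, "ADD 7".
    regs = {"ACC": 0, "MDR": 0, "IR": "ADD 7"}

    microprogram = [
        ("load MDR with operand", lambda: regs.update(MDR=7)),
        ("ALU: ACC <- ACC + MDR", lambda: regs.update(ACC=regs["ACC"] + regs["MDR"])),
    ]

    for label, step in microprogram:
        step()                          # run one micro-step
        print(f"{label:24s} {regs}")    # display the state, as on screen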
     
  17. Jan 30, 2016 #16
    Can I emulate a CPU that can do more calculations per second than the CPU I'm running the emulation on?
     
  18. Jan 30, 2016 #17

    Borg

    Science Advisor
    Gold Member

    That's kind of like asking if you can run faster by carrying yourself.
     
  19. Jan 30, 2016 #18
    But Wikipedia says it is possible to emulate an IBM PC with a Commodore 64 here. I'm interested in the performance I would get from that.
     
  20. Jan 30, 2016 #19

    Vanadium 50

    Staff Emeritus
    Science Advisor
    Education Advisor

    A. "But Wikipedia says" is not a very good argument.

    B. You didn't - and should have - included the second half of that quote. "Yes, it's possible for a 64 to emulate an IBM PC, in the same sense that it's possible to bail out Lake Michigan with a teaspoon." It was disingenuous not to do that.

    Of course not. If you could run an emulator faster on the same hardware, we'd run emulators on top of emulators until we had infinitely fast computers and we could just sit back and await the robot apocalypse.
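
    The arithmetic behind that, as a quick Python sketch (the numbers are made up): if each emulation layer costs a slowdown factor s > 1, stacking layers compounds multiplicatively, so nesting can only ever make things slower:

    Code:
    # Effective guest speed after stacking emulation layers.
    host_speed = 1.0e9   # host instructions/second (illustrative)
    s = 10.0             # assumed slowdown per emulation layer

    for layers in range(4):
        print(f"{layers} layer(s): {host_speed / s**layers:.0e} guest instr/s")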
     
  21. Jan 30, 2016 #20
    Darn it, I had my hopes up. It's not that it would be faster, but how would you maximize calculations per second? Is increasing the number of CPUs the only way?
     