
Definition and alternatives for Von Neumann architecture?

  1. Sep 24, 2016 #1
    I have been studying computers and found that they evolved from basic mechanical devices with limited functions to the amazing machines we have today. It's all very new and interesting to me.

    I believe that programming is the act of writing an algorithm in a higher- or lower-level language. An algorithm is a sequence of detailed and unambiguous steps needed to solve a problem or complete a task.

    Older computing devices such as the Jacquard loom, the Pascaline, the US census counting machine, and the ABC computer were said to have limited programmability because they could only carry out a limited range of algorithms. Is this correct? Was the program for the Jacquard loom the punched cards that were fed into it?

    So did programming often mean changing the physical motions of these old computers?

    The architecture of a computer refers to its components and the interactions between them.

    The Von Neumann architecture consists of a CPU, a memory unit, and I/O.

    The CPU, the computational unit, is where the steps of algorithms are carried out. The memory unit contains the program and the data the program requires; the I/O is the connection to the outside world.
    My question is: what are the alternatives to this type of architecture, and is my understanding of it correct?
    What are the advantages of this architecture as compared to others?
     
  3. Sep 24, 2016 #2

    phinds

    Gold Member
    2016 Award

    One thing you missed is the VERY important characteristic that Von Neumann machines can reprogram themselves. The older mechanical types could not do that. They can do this because the data store and instruction store are one and the same, so instructions don't care whether they are operating on what we think of as data or on what is actually a machine instruction. In the very early days of programming I think much more was made of this than is the case now.

    A modern type of architecture that is NOT a Von Neumann machine and cannot self-modify is the Harvard architecture, more commonly used as a modified Harvard architecture. I don't know about today, but in the early days of DSPs that was the more common architecture. These machines have separate data and instruction stores and fetch from both simultaneously, gaining speed at the cost of flexibility that is often not needed.

    So Harvard has a speed advantage over Von Neumann, but storage and flexibility limitations in comparison. In a Von Neumann machine it doesn't matter whether you have a huge program and a small amount of data or a small program and a huge amount of data; all the memory is available for either one. In the Harvard architecture, program and data are each limited by their own store.
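    The shared-store idea can be illustrated with a toy sketch. This is a minimal hypothetical machine in Python, not any real instruction set: instructions and data live in one memory list, so a STORE instruction can overwrite another instruction, which is exactly the self-modification being described.

```python
# Toy von Neumann machine (hypothetical 3-word ISA, for illustration only).
# Instructions and data share one memory list, so code can rewrite itself.
def run(memory):
    """Execute 3-word instructions [opcode, a, b] from a shared memory list."""
    pc = 0  # program counter
    while True:
        op, a, b = memory[pc], memory[pc + 1], memory[pc + 2]
        if op == "HALT":
            return memory
        elif op == "ADD":     # memory[b] += memory[a]
            memory[b] += memory[a]
        elif op == "STORE":   # memory[b] = a  (b may point into the code!)
            memory[b] = a
        pc += 3

# Addresses 0-8 hold code, 9-10 hold data: one undifferentiated store.
program = [
    "STORE", "HALT", 6,   # 0: overwrite the opcode at address 6 with HALT
    "ADD",   9,      10,  # 3: memory[10] += memory[9]
    "ADD",   9,      10,  # 6: never runs as ADD -- it was patched to HALT
    42, 0,                # 9: data
]
```

    Running this, the second ADD never executes because the program's first instruction rewrote it. A Harvard machine could not do this: its instruction store is not writable by the running program.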
     
    Last edited: Sep 24, 2016
  4. Sep 24, 2016 #3

    phinds

    Gold Member
    2016 Award

    Here's an example of a VERY useful self-modification of the kind that I often did back in the '60s in machine language. I was writing programs for real-time data collection/processing at NASA. Now "real-time" in that case MEANT real time, not what it means today. Data came in at a certain speed and you could capture it or lose it. Those were the only options. To get the data from the telemetry systems onto a computer tape, a certain amount of processing was needed as the data passed through the computer. I remember one program where we started off at something like 150% of real time. That is, our processing of 10 seconds of data took 15 seconds. Since this was real time, we didn't HAVE 150% of real time; we had 100%.

    One of the tricks was as follows: certain types of data required decision making in a nested loop, but the result of the first decision determined, for ALL of the rest of the data inside that nest, which path the program took next within that section of the nest. In other words, the loop contained a decision instruction that on the first pass had to compare one value to another, but that on every subsequent pass, until the end of that part of the nest, produced the same result.

    Assembly language (effectively machine language) allowed for a kind of flexibility that is not available in higher-level languages. Once the decision was made, the first instruction of the string of decision instructions was replaced with a simple jump instruction, and at the end of the nest the decision instruction was put back. In situations where a lot of data was processed in that nest, the savings were noticeable.
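    A modern language can't patch its own machine instructions, but the effect of the trick can be emulated. This is a hedged sketch with made-up names (`process`, `scale_up`, `scale_down`): the repeated test is evaluated once, and the chosen branch is then bound as a plain function reference, just as the original opcode was overwritten with a jump.

```python
# Emulation of the loop-invariant branch patch: decide once on the first
# record, then reuse the chosen branch for the rest of the nest.
def process(records, threshold):
    def scale_up(x):
        return x * 2

    def scale_down(x):
        return x // 2

    step = None  # the "decision" has not been made yet
    out = []
    for r in records:
        if step is None:
            # Evaluated only on the first pass; afterwards the decision is
            # bypassed, like the comparison replaced by a jump instruction.
            step = scale_up if r >= threshold else scale_down
        out.append(step(r))
    return out
```

    The analogy is not exact: here a cheap `is None` test still runs each pass, whereas the assembly version removed even that by rewriting the instruction itself.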

    The rest of getting it down below 100% of real time was long, tedious, very serious rethinking of the algorithms, down to the level of using, where possible, shift instructions instead of multiplies, adds instead of multiplies, etc. It took over a month, but we did get it done.
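    The shift-for-multiply substitution mentioned above (often called strength reduction) can be shown in a few lines. On 1960s hardware a shift or an add was far cheaper than a multiply; the identities themselves are exact for integers.

```python
# Strength reduction: replace multiplies by shifts and adds.
x = 37

assert x * 8 == x << 3                  # multiply by 2**n -> left shift by n
assert x * 10 == (x << 3) + (x << 1)    # x*10 = x*8 + x*2: two shifts, one add
assert x * 3 == (x << 1) + x            # x*3 = x*2 + x: shift and add
```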
     
  5. Sep 24, 2016 #4
    Gonna take me some time to assimilate the replies. Will get back to you soon. Thanks!
     
  6. Sep 24, 2016 #5

    phinds

    Gold Member
    2016 Award

    I should add, to avoid confusion: when I spoke above about machines reprogramming themselves, you need to understand that ALGORITHMS can in effect reprogram themselves by storing decision trees in the data store, but that is not what I was talking about. That is a useful concept, and it is at the heart of what is today called "machine learning", but what I was talking about specifically is the capability of Von Neumann machines to have algorithms modify their own instructions, not just make different decisions based on something in the data store.
     
  7. Sep 25, 2016 #6

    FactChecker

    Science Advisor
    Gold Member

    I think you can see that the three components (CPU, memory, and I/O) are essential to achieve the goal that Von Neumann wanted. So it is hard to imagine another design that achieves the goal of reprogrammability without those three components in some shape or form, though there are designs that differ in the details. Analog computers can have swappable patch boards to run different programs. Field Programmable Gate Arrays (FPGAs) are also somewhat different. An overview of the subject of "reconfigurable computing" can be found at https://en.wikipedia.org/wiki/Reconfigurable_computing

    @phinds' point about programs that can modify their own algorithm is important. It carries the "reconfigurable" concept one step further. Neural networks are like that. So are many of the scripting languages that allow a running program to piece together new code on the fly and run it using "eval". These generally run on the same Von Neumann machines.
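    The "piece together new code on the fly" idea looks like this in Python, using the built-in `exec`. The operation name and symbol here are hypothetical runtime inputs chosen for illustration; the point is that the generated source string sits in ordinary data memory until it is executed.

```python
# A running program assembling new code as a string and executing it:
# code and data in the same store, in the von Neumann spirit.
op_name, op_symbol = "add", "+"   # hypothetical values decided at runtime
source = f"def {op_name}(a, b):\n    return a {op_symbol} b\n"

namespace = {}
exec(source, namespace)           # compile and run the generated definition
result = namespace[op_name](2, 3)
```

    Note that this flexibility is also why self-constructing code is treated cautiously today: `exec` on untrusted input is a classic security hazard.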
     
  8. Sep 25, 2016 #7
    Yes, these sorts of things were done in assembler for the very reasons you gave. Then along came computer scientists who smacked us about the head, telling us that modifying the code made the program non-reentrant. As if we cared?
    :-)
     
  9. Sep 25, 2016 #8

    phinds

    Gold Member
    2016 Award

    Yeah, I can guarantee you that if those purists ever had to knock a program down from 150% of real time to under 100%, they'd change their tune or get fired.
     
  10. Sep 25, 2016 #9

    Stephen Tashi

    Science Advisor

  11. Sep 26, 2016 #10

    FactChecker

    Science Advisor
    Gold Member

    Nothing gets uglier than trying to squeeze a program into a time or memory limit. One thing that people don't realize is that there are situations where the hardware is fixed for a long time and the software is guaranteed to grow. It can get ugly.
     
  12. Sep 26, 2016 #11

    FactChecker

    Science Advisor
    Gold Member

    I don't know what category a quantum computer fits into. That may not be settled yet.
     