Definition and alternatives for Von Neumann architecture?

In summary, computers evolved from simple mechanical devices to more complex machines that are able to carry out more complicated algorithms. Programming is the act of writing an algorithm in a higher- or lower-level language.
  • #1
Logical Dog
I have been studying about computers and found that they evolved from basic mechanical devices with limited functions to the amazing machines we have today. It's all very new and interesting to me.

I believe that programming is the act of writing an algorithm in a higher- or lower-level language. An algorithm is a sequence of detailed and unambiguous steps needed to solve a problem or complete a task.
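For instance, Euclid's method for the greatest common divisor is a classic example of such a sequence of unambiguous steps (a minimal Python sketch, just to illustrate what I mean by "algorithm"):

```python
def gcd(a, b):
    """Euclid's algorithm: while b is nonzero, replace (a, b)
    with (b, a mod b); when b reaches zero, a is the answer."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # -> 6
```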

Older computing devices such as the Jacquard loom, the Pascaline, the US census tabulating machine, and the ABC computer were said to have limited programmability because they could only carry out a limited range of algorithms. Is this correct? And was the program for the Jacquard loom the punched cards that were fed into it?

So programming these old machines often meant physically changing their configuration or motions?

The architecture of a computer refers to its components and the interactions between them.

The Von Neumann architecture consists of a CPU, a memory unit, and an I/O unit.

The CPU is the computational unit, where the steps of an algorithm are carried out. The memory unit holds the program and the data the program requires, and the I/O unit is the connection to the outside world.
My question is, what are the alternatives to this type of architecture and is my understanding of it correct?
What are the advantages of this architecture as compared to others?
 
  • #2
One thing you missed is the VERY important characteristic that Von Neumann machines can reprogram themselves. The older mechanical types could not do that. The reason they can do it is that the data store and the instruction store are the same store, so instructions don't care whether they are operating on what we think of as data or on what is actually a machine instruction. In the very early days of programming I think much more was made of this than is the case now.

A modern type of architecture that is NOT a Von Neumann machine and cannot self-modify is the Harvard architecture, more commonly used as a modified Harvard architecture. I don't know about today, but in the early days of DSPs that was the more common architecture. They have separate data and instruction stores and fetch from both simultaneously, gaining speed at the cost of flexibility those applications typically don't need.

So Harvard has a speed advantage over Von Neumann but storage and flexibility limitations in comparison. In a Von Neumann machine, it doesn't matter if you have a huge program and a small amount of data or a small program and a huge amount of data; all the memory is available for either one. In the Harvard architecture, program and data are each limited by their respective stores.
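To make the shared-store idea concrete, here is a toy sketch of a Von Neumann-style machine in Python. The instruction encoding is invented for this example (not any real instruction set): one array holds both the program and its data, so a STORE aimed at a program address rewrites an instruction before it is fetched.

```python
def run(mem):
    """Fetch-decode-execute loop.  Instructions are encoded as
    op * 100 + address: 1=LOAD, 2=ADD, 3=STORE, 9=HALT."""
    pc, acc = 0, 0
    while True:
        op, addr = divmod(mem[pc], 100)
        pc += 1
        if op == 1:      # LOAD: acc = mem[addr]
            acc = mem[addr]
        elif op == 2:    # ADD: acc += mem[addr]
            acc += mem[addr]
        elif op == 3:    # STORE: mem[addr] = acc
            mem[addr] = acc
        elif op == 9:    # HALT
            return mem

# One array holds BOTH the program and its data.
memory = [
    104,  # 0: LOAD 4  -> acc = mem[4] (900, which also encodes HALT)
    302,  # 1: STORE 2 -> overwrite the NEXT instruction with acc
    0,    # 2: placeholder; becomes 900 (HALT) before it is fetched
    0,    # 3: unused
    900,  # 4: a data cell that doubles as a valid instruction
]
run(memory)
print(memory[2])  # -> 900: the program rewrote its own code
```

A Harvard machine could not do this: its STORE can only reach the data store, so instruction 1 would have nowhere meaningful to write.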
 
  • #3
Here's an example of a VERY useful self-modification of the kind that I often did back in the 60's in machine language. I was writing programs for real-time data collection/processing at NASA. Now "real-time" in that case MEANT real time, not like what it means today. Data came in at a certain speed and you could capture it or lose it. Those were the only options. To get the data from the telemetry systems onto a computer tape, a certain amount of processing was needed as the data passed through the computer. I remember one program where we started off at something like 150% of real-time. That is, our processing for 10 seconds of data took 15 seconds. Since this was real time, we didn't HAVE 150% of real time, we had 100%.

One of the tricks was as follows: certain types of data required decision making in a nested loop, but the result of the first decision determined, for ALL of the rest of the data inside that nest, which path the program took next within that section of the nest. This meant the loop contained a decision instruction that, the first time through, had to compare one value to another, but on every subsequent pass until the end of that part of the nest, the decision produced the same result.

Assembly language (= machine language) allowed a kind of flexibility that is not available in higher-level languages, like this: once the decision was made, the first instruction of the string of decision instructions was replaced with a simple jump instruction, and at the end of the nest the original decision instruction was put back. For situations where a lot of data passed through that nest, the savings were noticeable.
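The patching trick can be mimicked very loosely in a high-level language by rewriting an entry in a dispatch table after the first pass. This is only an illustration of the idea, not what the original assembly looked like, and all the names here are made up:

```python
def process(items):
    """Process a batch where the first item's test determines the
    branch taken for the whole rest of the batch."""
    state = {"mode": None}

    def decide(x):
        # The "expensive" comparison: after the first item its result
        # is fixed for the remainder of the batch...
        state["mode"] = "even" if x % 2 == 0 else "odd"
        # ...so patch the dispatch table: replace this decision step
        # with a do-nothing "jump past it".
        steps[0] = lambda _: None

    steps = [decide]

    out = []
    for x in items:
        steps[0](x)          # real decision on pass 1, cheap no-op after
        out.append(x * 2 if state["mode"] == "even" else x * 3)
    return out

print(process([2, 4, 6]))  # -> [4, 8, 12]
```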

The rest of getting it down below 100% of real time was long, tedious, very serious rethinking of the algorithms, down to the level of using, where possible, shift instructions instead of multiplies, adds instead of multiplies, etc. It took over a month, but we did get it done.
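Those shift-for-multiply rewrites are what is now called strength reduction. The arithmetic identities behind them, written in Python notation:

```python
x = 37
assert x * 8 == (x << 3)               # multiply by 2**n == shift left by n
assert x // 4 == (x >> 2)              # floor-divide by 2**n == shift right (x >= 0)
assert x * 10 == (x << 3) + (x << 1)   # x*10 rebuilt from two shifts and an add
```

On machines of that era a multiply could take many times longer than a shift or an add, which is why rewrites like these were worth the effort.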
 
  • #4
Gonna take me some time to assimilate the replies. Will get back to you soon, thanks!
 
  • #5
I should add, to avoid confusion: when I spoke above about machines reprogramming themselves, you need to understand that ALGORITHMS can in effect reprogram themselves by storing decision trees in the data store, but that is not what I was talking about. That is a useful concept and is at the heart of what is today called "machine learning". What I was talking about specifically is the capability of Von Neumann machines to have algorithms modify their own instructions, not just make different decisions based on something in the data store.
 
  • #6
I think you can see that the three components CPU, memory, and I/O are essential to achieve the goal that Von Neumann wanted. So it is hard to imagine another design that achieves the goal of reprogrammability without those three components in some shape or form. There may be designs whose details differ. Analog computers can have swappable patch boards to run different programs. Field Programmable Gate Arrays (FPGAs) are also somewhat different. An overview of the subject of "reconfigurable computing" can be found at https://en.wikipedia.org/wiki/Reconfigurable_computing

@phinds' point about programs that can modify their own algorithm is important. It carries the "reconfigurable" concept one step further. Neural networks are like that. So are many of the scripting languages that allow a running program to piece together new code on the fly and run it using "eval". These generally run on the same Von Neumann machines.
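As a tiny illustration of the scripting-language point, Python can assemble source text at runtime and then execute it, blurring program and data much as a Von Neumann store does (a hypothetical sketch; the function name and operation are made up):

```python
op = "+"  # hypothetical: the operation is chosen at runtime
src = f"def combine(a, b):\n    return a {op} b\n"   # code built as string data

namespace = {}
exec(src, namespace)         # turn the string into a real function

print(namespace["combine"](2, 3))  # -> 5
```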
 
  • #7
phinds said:
Here's an example of a VERY useful self-modification of the kind that I often did back in the 60's in machine language. I was writing programs for real-time data collection/processing at NASA.

Yes, these sorts of things were done in assembler for the very reasons you gave. Then along came the Computer Scientists, who smacked us about the head telling us that modifying the code made the program non-reentrant. As if we cared?
:-)
 
  • #8
Yeah, I can guarantee you that if those purists ever had to knock a program down from 150% of real time to under 100%, they'd change their tune or get fired.
 
  • #10
phinds said:
Yeah, I can guarantee you that if those purists ever had to knock a program down from 150% of real time to under 100%, they'd change their tune or get fired.
Nothing gets uglier than trying to squeeze a program into a time or memory limit. One thing that people don't realize is that there are situations where the hardware is fixed for a long time and the software is guaranteed to grow. It can get ugly.
 
  • #11
I don't know what category a quantum computer fits into. That may not be settled yet.
 

What is Von Neumann architecture?

Von Neumann architecture is a computing model that describes the structure of a computer and how it processes information. It was developed by John von Neumann in the 1940s and is still the basis for most modern computers today.

What are the main components of Von Neumann architecture?

The main components of Von Neumann architecture are the central processing unit (CPU), which contains the arithmetic/logic unit and the control unit, memory, and input/output devices. The CPU is responsible for executing instructions and performing calculations, while memory stores both data and instructions. Input/output devices allow communication between the computer and the outside world, and the control unit coordinates and manages the flow of data between these components.

What are some alternatives to Von Neumann architecture?

One alternative to Von Neumann architecture is the Harvard architecture, which uses separate memories for data and instructions. Another is the modified Harvard architecture, which combines aspects of the Von Neumann and Harvard designs. There are also non-Von Neumann approaches, such as cellular automata and quantum computers, that are still in development and not yet widely used.

What are the advantages of using Von Neumann architecture?

Von Neumann architecture allows for efficient and sequential processing of instructions, making it easier to design and program computers. It also allows for the use of a single memory unit, reducing the complexity and cost of computer systems.

What are the limitations of Von Neumann architecture?

One limitation of Von Neumann architecture is the Von Neumann bottleneck: the CPU and memory are connected by a single bus, so an instruction fetch and a data access cannot happen at the same time, which can limit performance. Its strictly sequential execution of instructions can also impact the speed and efficiency of computing.
