Parallel Computing Operating Systems for the PC

Discussion Overview

The discussion centers on parallel operating systems suitable for PCs with multiple processors, exploring the capabilities of existing systems like Windows and Linux in handling parallel computations. Participants examine the nature of true parallel operating systems, the role of compilers, and the effectiveness of various programming models such as MPI and OpenMP.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Exploratory

Main Points Raised

  • Some participants assert that both Windows and Linux can work with multiple processors, but question whether they qualify as true parallel operating systems.
  • There is a claim that neither Windows nor Linux can effectively distribute computations across all available processors without specific programming efforts.
  • One participant mentions that dedicated operating systems for true parallel processing existed in the past, suggesting that MPI (Message-Passing Interface) was a common method for parallelizing code.
  • Another participant expresses skepticism about the ability of standard Linux to distribute workloads effectively, regardless of program implementation.
  • Some participants note that general-purpose parallel operating systems for PCs may not exist, and that compilers and operating systems may not optimally parallelize generic programs.
  • There is mention of the limitations of current PC architectures for massively parallel systems, emphasizing the importance of hardware and network topology in parallel computing.
  • Participants discuss the use of MPI and OpenMP in both supercomputer clusters and personal dual-core PCs, highlighting their roles in managing parallel computations.
  • One participant introduces the concept of GPU-based parallel computing units, such as NVIDIA's Tesla, as a potential alternative for high-performance parallel processing.
  • There is a viewpoint that the effectiveness of an operating system in utilizing multiple cores is secondary to the applications designed to leverage those cores.

Areas of Agreement / Disagreement

Participants express differing views on the capabilities of existing operating systems in achieving true parallelism. While some believe that standard operating systems can handle multiprocessing, others remain unconvinced and highlight the need for specific programming techniques and architectures.

Contextual Notes

Participants mention various programming models and architectures, but there is no consensus on the effectiveness of standard operating systems in fully utilizing multiple processors for parallel computations. The discussion also reflects uncertainty regarding the current state of parallel computing technologies and their practical applications.

RJ Emery
Anyone know of parallel operating systems designed to work with multiple PC processors?
 
RJ Emery said:
Anyone know of parallel operating systems designed to work with multiple PC processors?
Both Windows and the Linux variants work with multiple processors.
 
MeJennifer said:
Both Windows and the Linux variants work with multiple processors.
I do not believe either is a true parallel operating system. For example, neither can spread a computation among all the available processors. There may be special variants of Linux capable of doing this, but I am not yet aware of any.
 
RJ Emery said:
I do not believe either is a true parallel operating system. For example, neither can spread a computation among all the available processors. There may be special variants of Linux capable of doing this, but I am not yet aware of any.
Both Windows and Linux can spread and balance the computation provided the programmer properly parallelized the code.

I am not sure what else you think would be a "true" parallel operating system.
 
MeJennifer said:
Both Windows and Linux can spread and balance the computation provided the programmer properly parallelized the code. I am not sure what else you think would be a "true" parallel operating system.
It has been at least a decade since I last looked at this area. At the time, I recall there being dedicated operating systems that enabled true parallel processing on a network of PC processors. The term "network" here covered several different architectures and topologies. One attempt to generalize the special programming required for whatever network was in use was MPI (Message-Passing Interface). MPI was one way code could be parallelized and made portable among supercomputers and other distributed-computing alternatives.

If both Windows XP and Vista, and any Linux distro, can distribute the instructions of a given compute-intensive program among the available processors, then what tools similar to MPI, if not MPI itself, exist today to enable such parallelization?

I'm still not convinced standard Linux can truly distribute the workload of a properly parallelized program, regardless of how well the program is implemented.

However, I am open-minded, and perhaps you or others can convince me otherwise.
 
There probably is no general-purpose, highly optimized parallel operating system on PCs. Compilers and operating systems are probably just not smart enough to take a generic program and parallelize it optimally. Of course, certain computation problems are better suited to certain configurations of distributed processors [which, of course, requires knowledge of the CPU capabilities, memory speeds, networking speeds, and topology]. ...but I'm no expert; I took a class in parallel computation a while back.

This might be of interest to you:
http://openmosix.sourceforge.net/#What
as part of
http://clusterknoppix.sw.be/
which is found in
http://dirk.eddelbuettel.com/quantian.html
 
robphy said:
There probably is no general-purpose, highly optimized parallel operating system on PCs. Compilers and operating systems are probably just not smart enough to take a generic program and parallelize it optimally. Of course, certain computation problems are better suited to certain configurations of distributed processors [which, of course, requires knowledge of the CPU capabilities, memory speeds, networking speeds, and topology]. ...but I'm no expert; I took a class in parallel computation a while back.
There are compilers that automatically parallelize code, but the scope is fairly limited; it is better to have a good programmer, who understands the issues related to parallelization, take a look at it.

Note, though, that the current PC bus architecture was never designed for massively parallel systems. If you are looking for that, then the PC architecture is not going to work, and clearly 8 or even 16 processors is nowhere near the definition of a massively parallel system.
The main issue with architectures that enable massively parallel configurations is how to perform effective data transfer between the processors and their caches.

Years ago, the Inmos transputers, programmed in the Occam language, were in vogue. You could construct your own topology, for instance a hypercube configuration.
Currently there is no general-purpose massively parallel computer system available that I know of, unless you are willing to pay millions. :smile:

There are alternatives: if you can parallelize your code enough, you could use grid-style computing; look up Beowulf systems if you are interested. Using a good network topology with gigabit or faster interconnects, you could generate an immense level of computing power.

But overall, the ability to perform parallel computations depends primarily on the hardware, the topology, and the way a program is coded; the OS plays a relatively minor role in this.
 
Are you talking about dual or quad cores? To my knowledge, the supercomputer clusters in Ontario are dual-core machines that run Linux as the OS and use either MPI or OpenMP. And I believe Blue Gene has the same structure. As for personal dual-core PCs: MPI/OpenMP.
 
neurocomp2003 said:
Are you talking about dual or quad cores?
Does it make a difference? In terms of an operating system making full use of what is available, what impact does dual versus quad core have?

neurocomp2003 said:
To my knowledge, the supercomputer clusters in Ontario are dual-core machines that run Linux as the OS and use either MPI or OpenMP. And I believe Blue Gene has the same structure. As for personal dual-core PCs: MPI/OpenMP.
The consensus seems to be that standard Linux can handle the multiprocessing. As for the multithreading of programs, that would appear to be a function of using tools such as MPI or OpenMP, would you agree?

I was not aware of OpenMP until your post. Thank you for pointing it out.
 
Dual or quad cores, it doesn't matter; I was just trying to understand whether you were considering multicores or single-CPU networked clusters.

For multicore PCs, you can utilize OpenMP. And if you consider clusters, then you can have a hybrid OpenMP/MPI software scheme, where OpenMP is used on the multicores and MPI is used for communicating between PCs.

You can still use the simple functions that come with C in terms of passing data (send/recv/accept/connect/bind/listen); OpenMP/MPI are just higher-level layers on top of these.
 
MeJennifer said:
Currently there is no general-purpose massively parallel computer system available that I know of, unless you are willing to pay millions. :smile:
Not quite general purpose, but NVIDIA is coming out with Tesla. These are GPU-based parallel computing units with 128 processors. Starting next month you can buy the entry-level unit, which can reach 500 GFLOPS peak performance, for about $1500. The C-based toolkit is already available. Unfortunately, the first generation is 32-bit only.

GPU processors are not general purpose, but they can be used for simulations of physical processes, neural networks, image recognition, etc.

Exciting!
 
I don't actually think that it is important for an OS to take advantage of a multicore processor. The important thing is for the applications to use the multiple cores for their various processes. Suppose that you have just an OS with no other programs installed: what good is it to use 2 or 4 cores just for the OS? The important thing is for the OS to "see" the multiple processors and for the applications to use them. Except for the case where you want to run many computation-heavy apps at the same time... but how often does anyone do that?
 
dtsormpa said:
I don't actually think that it is important for an OS to take advantage of a multicore processor. The important thing is for the applications to use the multiple cores for their various processes. Suppose that you have just an OS with no other programs installed: what good is it to use 2 or 4 cores just for the OS? The important thing is for the OS to "see" the multiple processors and for the applications to use them. Except for the case where you want to run many computation-heavy apps at the same time... but how often does anyone do that?
Yes, you are correct. My concern was for various scheduled tasks to run on different processors, as allocated by the operating system, to effect greater overall throughput. The same holds true for general applications not necessarily written for a multi-processor environment.
 
