Parallel Computing Operating Systems for the PC

  • #1
RJ Emery
Anyone know of parallel operating systems designed to work with multiple PC processors?
 
  • #2
RJ Emery said:
Anyone know of parallel operating systems designed to work with multiple PC processors?
Both Windows and the Linux variants work with multiple processors.
 
  • #3
MeJennifer said:
Both Windows and the Linux variants work with multiple processors.
I do not believe either is a true parallel operating system. For example, neither can spread a computation among all the available processors. There may be special variants of Linux capable of doing this, but I am not yet aware of any.
 
  • #4
RJ Emery said:
I do not believe either is a true parallel operating system. For example, neither can spread a computation among all the available processors. There may be special variants of Linux capable of doing this, but I am not yet aware of any.
Both Windows and Linux can spread and balance the computation provided the programmer properly parallelized the code.

I am not sure what else you think would be a "true" parallel operating system.
 
  • #5
MeJennifer said:
Both Windows and Linux can spread and balance the computation provided the programmer properly parallelized the code. I am not sure what else you think would be a "true" parallel operating system.
It has been at least a decade since I last looked at this area. At the time, I recall there being dedicated operating systems that enabled true parallel processing on a network of PC processors. The term network here covered several different architectures and topologies. One attempt to generalize the special programming required for whatever network was in use was MPI (Message-Passing Interface). MPI was one way code could be parallelized and made portable among supercomputers and other distributed computing alternatives.

If both Windows XP and Vista, and any Linux distro, can distribute the instructions of a compute-intensive program among the available processors, then what tools similar to MPI, if not MPI itself, exist today to enable such parallelization?

I'm still not convinced standard Linux can truly distribute the workload of a properly parallelized program, regardless of how well the program is implemented.

However, I am open-minded, and perhaps you or others can convince me otherwise.
 
  • #6
There is probably no general-purpose, highly optimized parallel operating system for PCs. Compilers and operating systems are probably just not smart enough to take a generic program and parallelize it optimally. Of course, certain computational problems are better suited to certain configurations of distributed processors [which, of course, requires knowledge of the CPU capabilities, memory speeds, networking speeds, and topology]. ...but I'm no expert. I took a class in parallel computation a while back.

This might be of interest to you:
http://openmosix.sourceforge.net/#What
as part of
http://clusterknoppix.sw.be/
which is found in
http://dirk.eddelbuettel.com/quantian.html
 
  • #7
robphy said:
There is probably no general-purpose, highly optimized parallel operating system for PCs. Compilers and operating systems are probably just not smart enough to take a generic program and parallelize it optimally. Of course, certain computational problems are better suited to certain configurations of distributed processors [which, of course, requires knowledge of the CPU capabilities, memory speeds, networking speeds, and topology]. ...but I'm no expert. I took a class in parallel computation a while back.
There are compilers that automatically parallelize code, but their scope is fairly limited; it is better to have a good programmer who understands the issues involved in parallelization take a look at it.

Note, though, that the current PC bus architecture was never designed for massively parallel systems. If you are looking for that, then the PC architecture is not going to work, and clearly 8 or even 16 processors is nowhere near the definition of a massively parallel system.
The main issue with architectures that enable massively parallel configurations is how to perform effective data transfers between the processors and their caches.

Years ago, the Inmos Transputers, programmed in the Occam language, were in vogue. You could construct your own topology, for instance a hypercube configuration.
Currently there is no general-purpose massively parallel computer system available that I know of, unless you are willing to pay millions. :smile:

There are alternatives: if you can parallelize your code enough, you could use grid-style computing; look up Beowulf clusters if you are interested. Using a good network topology with gigabit or faster interconnects, you could generate an immense level of computing power.

But overall, the ability to perform parallel computations depends primarily on the hardware, the topology, and the way a program is coded; the OS plays a relatively minor role.
 
  • #8
Are you talking about dual or quad cores? To my knowledge, the supercomputer clusters in Ontario are dual-core machines that run Linux as the OS and use either MPI or OpenMP. And I believe Blue Gene has the same structure. As for personal dual-core PCs: MPI/OpenMP.
 
  • #9
neurocomp2003 said:
Are you talking about dual or quad cores?
Does it make a difference? In terms of an operating system making full use of what is available, what impact does dual versus quad core have?

neurocomp2003 said:
To my knowledge the supercomputer clusters in ontario are dual core that run linux as the OS. ANd use either MPI or openmp. And i believe bluegene has the same structure. As for personnel dual core PCs. MPI/openmp.
The consensus seems to be standard Linux can handle the multiprocessing. As for the multi-threading of programs, that would appear to be a function of using such tools as MPI or OpenMP, would you agree?

I was not aware of OpenMP until your post. Thank you for pointing it out.
 
  • #10
Dual or quad cores... it doesn't matter; I was just trying to understand whether you were considering multicore machines or single-CPU networked clusters.

For multicore PCs, you can use OpenMP. And if you consider clusters, then you can have a hybrid OpenMP/MPI software scheme, where OpenMP is used on the multicore nodes and MPI is used for communication between PCs.

You can still use the basic socket functions available from C for passing data (send/recv/accept/connect/bind/listen); OpenMP/MPI are just higher-level layers on top of these.
 
  • #11
MeJennifer said:
Currently there is no general purpose massively parallel computer system available that I know of, unless you are willing to pay millions. :smile:
Not quite general purpose, but NVIDIA is coming out with Tesla. These are GPU-based parallel computing units with 128 processors. Starting next month you can buy the entry-level unit, which can reach 500 GFLOPS peak performance, for about $1500. The C-based toolkit is already available. Unfortunately, the first generation is 32-bit only.

GPU processors are not general purpose, but they can be used for simulations of physical processes, neural networks, image recognition, etc.

Exciting!
 
  • #12
I don't actually think it is important for an OS to take advantage of a multi-core processor. The important thing is for the applications to use the multiple cores for their various processes. Suppose you have just an OS with no other programs installed. What good is there in using 2 or 4 cores just for the OS? The important thing is for the OS to "see" the multiple processors and for the applications to use them. Except for the case where you want to run many computation-heavy apps at the same time... but how often does anyone do that?
 
  • #13
dtsormpa said:
I don't actually think it is important for an OS to take advantage of a multi-core processor. The important thing is for the applications to use the multiple cores for their various processes. Suppose you have just an OS with no other programs installed. What good is there in using 2 or 4 cores just for the OS? The important thing is for the OS to "see" the multiple processors and for the applications to use them. Except for the case where you want to run many computation-heavy apps at the same time... but how often does anyone do that?
Yes, you are correct. My concern was for various scheduled tasks to run on different processors, as allocated by the operating system, to effect greater overall throughput. The same holds true for general applications not necessarily written for a multi-processor environment.
 

1. What is parallel computing?

Parallel computing is a type of computing where multiple processors or cores work together to solve a problem or perform a task. This allows for faster processing and improved performance compared to traditional serial computing.

2. How do parallel computing operating systems for the PC differ from traditional operating systems?

Parallel computing operating systems are specifically designed to manage multiple processors or cores and distribute tasks among them. This is in contrast to traditional operating systems, which are designed for single processor systems.

3. What are the benefits of using a parallel computing operating system on a PC?

Using a parallel computing operating system can lead to improved performance and speed for tasks that can be divided into smaller, independent parts. It can also allow for more efficient use of resources, as multiple cores can work on different tasks simultaneously.

4. Are there any downsides to using a parallel computing operating system?

One downside is that not all tasks can be parallelized, meaning they cannot be divided into smaller parts and distributed among multiple cores. In these cases, a parallel computing operating system may not provide any significant performance benefits. Additionally, parallel computing can be more complex and require specialized programming techniques.

5. What are some examples of parallel computing operating systems for the PC?

Some examples include Linux and Windows Server, both of which support symmetric multiprocessing, and cluster solutions such as openMosix or Beowulf that spread work across networked PCs. Hadoop and Apache Spark are often mentioned in this context, but they are distributed computing frameworks that run on top of an operating system rather than operating systems themselves.
