Starting with Parallel Processing: Learning with 16 Computers

Summary
To get started with parallel processing, it helps to understand the fundamentals of distributed computing and how to set up a network of computers for the purpose. With 16 computers available, building such a network makes parallel processing practical. One notable historical tool is DMAKE, a distributed make utility for large projects that spreads compilation across multiple computers: it assigns individual source modules to different compiler servers, keeping every machine busy and making efficient use of resources.

Beyond compilation, parallel processing applies to many other domains, such as analyzing satellite images: the images are divided into segments and each segment is processed on a different computer simultaneously, which illustrates the versatility of parallel algorithms. Overall, harnessing multiple computers for parallel processing can significantly improve performance on large-scale problems, making it a worthwhile area to explore for someone with a background in physics and computing.
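As a concrete starting point for networking the 16 machines, one widely used approach is MPI (Message Passing Interface). The sketch below is only a minimal illustration: it assumes an MPI implementation such as OpenMPI and the mpi4py package are installed on every node, with passwordless SSH and a hostfile listing the machines; the array sizes and names are made up for the example.

Code:
# Minimal sketch of scatter/compute/gather across MPI processes,
# assuming mpi4py and an MPI runtime are installed on all nodes.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's ID (0..size-1)
size = comm.Get_size()   # total number of processes, e.g. 16

# The root process creates the full workload and splits it into chunks.
if rank == 0:
    data = np.arange(1600, dtype='d')
    chunks = np.array_split(data, size)
else:
    chunks = None

# Each process receives one chunk, works on it, and returns a partial result.
chunk = comm.scatter(chunks, root=0)
partial_sum = chunk.sum()

# The root gathers the partial results and combines them.
totals = comm.gather(partial_sum, root=0)
if rank == 0:
    print("total =", sum(totals))

It would typically be launched with one process per computer, for example with something like mpirun -np 16 --hostfile hosts python example.py (the file names here are illustrative).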
welatiger
I graduated from the Faculty of Science, Department of Physics,
and I want to learn about parallel processing.
Where do I start?
I also have 16 computers. How can I network them together to do parallel processing?
 
One somewhat cool tool that never seemed to make it out of Microsoft in-house development was DMAKE, a distributed make utility for large multiple-source projects. The network contained a computer with the source files, another computer with the DMAKE app and associated tools, and multiple computers that acted as compiler servers. DMAKE would send off messages to each of the compiler servers to start compiling individual source modules. When a compile server signaled completion, DMAKE would assign it another module, until all modules were compiled. Once the compile steps for a library were completed, DMAKE would then start a link server to link all the just-produced objects into a library, or directly into an executable if it was a single-library project.

I saw it demoed while taking a device driver class for Windows NT at Microsoft University (back in the 1990s). The compiler and linker servers were kept efficiently busy until all the libraries were made and then linked to create a completed project. I don't recall if DMAKE kept a history of compile times from previous builds in order to optimize the scheduling. Computers are so fast these days that it would take a huge project for something like DMAKE to make sense now, but it was impressive at the time.
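The core idea is a simple work queue: a coordinator hands one module to each idle server and, as soon as a server reports back, hands it the next one. Here is a minimal sketch of that scheduling pattern in Python, run locally with worker processes standing in for compile servers; build_module and the module names are placeholders, not part of any real build system.

Code:
# Work-queue scheduling sketch: each worker takes one "module" at a time;
# as soon as it finishes, it is handed the next unbuilt module.
from concurrent.futures import ProcessPoolExecutor, as_completed
import time

def build_module(name):
    """Stand-in for compiling one source module on a compile server."""
    time.sleep(0.1)            # pretend to compile
    return name + ".o"

if __name__ == "__main__":
    modules = ["module_%d.c" % i for i in range(40)]
    objects = []
    # max_workers plays the role of the number of compile servers.
    with ProcessPoolExecutor(max_workers=16) as pool:
        futures = {pool.submit(build_module, m): m for m in modules}
        for fut in as_completed(futures):
            objects.append(fut.result())   # collect objects as builds finish
    # Once every module is built, a single "link" step combines them.
    print("linking", len(objects), "objects into the final library")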


Another example in the parallel algorithm thread was distributing the analysis of satellite photos by breaking them up into multiple parts and sending each part to a different computer to operate on.
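A rough sketch of that idea, using a random array as a stand-in for a satellite photo and local worker processes as a stand-in for the 16 computers (analyse_tile is a placeholder analysis function):

Code:
# Split a large image into tiles and analyse each tile in parallel.
import numpy as np
from multiprocessing import Pool

def analyse_tile(tile):
    """Placeholder analysis: e.g. mean brightness of one tile."""
    return float(tile.mean())

def split_into_tiles(image, rows, cols):
    """Cut a 2-D array into rows*cols roughly equal tiles."""
    return [t for strip in np.array_split(image, rows, axis=0)
              for t in np.array_split(strip, cols, axis=1)]

if __name__ == "__main__":
    image = np.random.rand(4096, 4096)           # stand-in for a satellite photo
    tiles = split_into_tiles(image, 4, 4)        # 16 tiles -> one per computer
    with Pool(processes=16) as pool:
        results = pool.map(analyse_tile, tiles)  # process tiles concurrently
    print("per-tile results:", results[:4], "...")

With real data, each tile would be sent over the network to a different machine (for example with MPI, as sketched earlier) rather than to a local worker process.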
 
