Computer Architecture - Synchronisation issue between different processes?


Discussion Overview

The discussion revolves around the importance of synchronization between different processes in parallel computing. Participants explore various scenarios where synchronization is crucial, including multi-threaded applications, deadlock situations, and the need for combining outputs from multiple threads.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Homework-related

Main Points Raised

  • Some participants emphasize that synchronization is essential for the proper functioning of multi-threaded or multi-tasking programs, as it prevents programming failures.
  • Examples of failure cases are provided, such as two processes incrementing a shared variable simultaneously, leading to incorrect results if not synchronized.
  • Others discuss the concept of decoupling applications in time using threads, where multiple sub-tasks can run in parallel, necessitating synchronization to combine their results.
  • A scenario involving deadlock is presented, where two processes lock resources needed by each other, leading to a situation where neither can proceed without proper scheduling or synchronization mechanisms.
  • One participant suggests that scheduling alone won't resolve deadlock issues and proposes using atomic calls to lock multiple resources simultaneously to avoid such situations.
  • Different requirements for synchronization are mentioned, including dependencies between threads and the need for messaging schemes or completion status tracking.

Areas of Agreement / Disagreement

Participants generally agree on the importance of synchronization in parallel computing, but there are multiple competing views on how to implement it effectively, particularly regarding deadlock resolution and resource locking strategies. The discussion remains unresolved on the best approaches to these issues.

Contextual Notes

Some participants express confusion over terminology and concepts, indicating that further clarification may be needed for those less familiar with the technical language used in the discussion.

Lumen Bliss
I'm doing my homework and I've already found out some things about synchronisation issues, but could anyone tell me in a little more detail why synchronisation between different processes, belonging to the same or to different programs, is an important task in parallel computing?

Thanks in advance! x
 
Synchronisation is important because a multi-threaded or multi-tasking program wouldn't function correctly without it. Beyond that, you get into specific situations that cause program failures if synchronisation is not implemented properly, and these are usually explained with examples showing the failure cases. A common example is two processes trying to increment a shared variable: the failure occurs when both processes read the "old value", so the variable ends up incremented by one instead of two. If the increment were synchronised, one process would finish incrementing the variable before the other could access it, eliminating the failure.
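The shared-counter failure described above can be sketched in a few lines. Python is used here only for brevity (the thread names no particular language); `safe_increment` and `unsafe_increment` are illustrative names, not anything from the discussion:

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    # Read-modify-write with no lock: two threads can both read the same
    # "old value" and each write back old+1, losing one of the increments.
    global counter
    for _ in range(n):
        counter += 1

def safe_increment(n):
    global counter
    for _ in range(n):
        with lock:            # only one thread at a time may do the update
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(100_000,))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 200000 -- no updates are lost while the lock is held
```

Running the same experiment with `unsafe_increment` instead can produce a total below 200000, because some increments overwrite each other.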

After understanding why synchronisation is important, you'll probably want to move on to the various methods for how it is done.
 
rcgldr said:
Why synchronisation is important is that a multi-threaded or multi-tasking program wouldn't function properly without it. ... A common example is two processes trying to increment some shared variable ...

Thank you, that helped a lot! x
 
Threads are used to decouple an application in time (just as modules are used to decouple an application logically).

Let's say you have a big task that can be split into multiple sub-tasks which can execute in parallel (meaning all sub-tasks can be decoupled in time). You can create multiple threads, with each thread running one sub-task. The solution of the big task is, of course, a combination of the solutions of the sub-tasks, and to combine those solutions you need to synchronise the threads. You also need to synchronise the threads whenever they access a common resource.
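The split-then-combine pattern described above can be sketched as follows. This is a minimal illustration, assuming the "big task" is summing a list split into chunks; `chunk_sum` and the chunk layout are invented for the example:

```python
import threading

def chunk_sum(chunk, results, i):
    # Each thread solves one sub-task independently.
    results[i] = sum(chunk)

data = list(range(1000))
chunks = [data[i:i + 250] for i in range(0, 1000, 250)]
results = [0] * len(chunks)

threads = [threading.Thread(target=chunk_sum, args=(c, results, i))
           for i, c in enumerate(chunks)]
for t in threads:
    t.start()
for t in threads:
    t.join()            # the synchronisation point: wait for every sub-task

total = sum(results)    # combine the sub-task solutions
print(total)  # 499500
```

The `join()` calls are the synchronisation: the combining step must not run until every sub-task has produced its partial solution.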
 
atomthick said:
Threads are used to decouple an application in time (like modules are used to create a logical decoupling of applications). ... To combine the solutions you need to synchronize the threads. ...
Sorry atomthick, that confused me a little; I need it in layman's terms, lol. Thanks for answering though! x
 
Another scenario is a deadlock. Imagine two processes, A and B, that each need to read the same two files, F1 and F2, to continue execution. Without synchronisation, with the processes running in an arbitrary order at each timeslot, a bad scenario goes like this:

Process A locks F1.
Process B locks F2.
Process A tries to lock F2 and blocks, because B holds its lock.
Process B tries to lock F1 and blocks, because A holds its lock.
...
The processes remain blocked forever because neither can get its remaining resource. This is a deadlock situation. Scheduling is needed to avoid it: for example, a scheduler might give precedence to A until it finishes and releases the files, and only then allow B to execute.
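The four steps above can be reproduced deterministically in a short sketch. This is illustrative Python, not anything from the thread: a `Barrier` forces the bad interleaving, and an acquire timeout stands in for "blocked forever" so the demo terminates:

```python
import threading

f1, f2 = threading.Lock(), threading.Lock()
barrier = threading.Barrier(2)   # forces the bad interleaving every run
outcome = {}

def worker(name, first, second):
    with first:                          # lock the first file
        barrier.wait()                   # both processes now hold one lock each
        # Try the second file; the timeout stands in for "blocked forever"
        got = second.acquire(timeout=0.2)
        outcome[name] = got
        if got:
            second.release()
        barrier.wait()                   # hold the first lock until both have tried

a = threading.Thread(target=worker, args=("A", f1, f2))
b = threading.Thread(target=worker, args=("B", f2, f1))
a.start(); b.start()
a.join(); b.join()
print(outcome["A"], outcome["B"])  # False False -- neither gets its second file
```

With real blocking acquires instead of timeouts, both threads would hang forever, which is exactly the deadlock described above.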

I hope I helped. I'm still an undergrad so make sure to double-check what I say ;)
 
atomthick said:
Threads are used to decouple an application in time.
Lumen Bliss said:
Sorry Atomthick, that confused me a little, I need it in layman terms.
"decouple an application in time" - I haven't seen this terminology used before either, but he means splitting an application or program into separate threads or processes that can run at the same time, which he explained later in his post. He then explains that if the output (he called it the solution) from each thread needs to be combined with the outputs of the other threads, some type of synchronisation is needed.

Constantinos said:
Another scenario is a deadlock situation. ... Process A locks F1 ... Process B locks F2 ... Process A tries to lock F2 ... scheduling is needed
Scheduling won't solve this issue. What is needed is a mechanism that allows locking multiple resources in a single "atomic" (uninterruptible by other threads) call. The calling thread waits (blocked) until all of the resources are unlocked, and then locks all of the requested resources at once. This also eliminates any issues related to priority between threads. The resources are usually unlocked one at a time, since only the locking of multiple resources at the same time is an issue. On Windows, the synchronisation function WaitForMultipleObjects() accomplishes this.
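The all-or-nothing idea can be sketched with a condition variable. This is only a toy analogue of the Windows mechanism, not its implementation; the `MultiLock` class and the "F1"/"F2" resource names are invented for the example:

```python
import threading

class MultiLock:
    """All-or-nothing locking of several named resources.

    A thread waits until *every* resource it needs is free, then takes
    them all in one atomic step, so the partial-hold state that causes
    deadlock never occurs.
    """
    def __init__(self):
        self._cond = threading.Condition()
        self._held = set()

    def acquire(self, *resources):
        with self._cond:
            # Block until none of the requested resources is held...
            self._cond.wait_for(lambda: self._held.isdisjoint(resources))
            # ...then take them all at once, under the same condition lock.
            self._held.update(resources)

    def release(self, *resources):
        with self._cond:
            self._held.difference_update(resources)
            self._cond.notify_all()   # wake any waiters that can now proceed

ml = MultiLock()
log = []

def process(name):
    ml.acquire("F1", "F2")   # both files or neither -- no deadlock window
    log.append(name)
    ml.release("F1", "F2")

ts = [threading.Thread(target=process, args=(n,)) for n in ("A", "B")]
for t in ts:
    t.start()
for t in ts:
    t.join()
print(sorted(log))  # ['A', 'B'] -- both processes complete
```

Because a thread never holds some of its resources while waiting for the rest, the circular-wait condition for deadlock cannot arise.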

Getting back to the original question, there are different requirements for synchronisation. Threads may depend on each other's outputs, in which case some type of messaging scheme is normally used. Or the sub-threads may not depend on each other's outputs, but some master process does, and it needs to know at least when those sub-threads complete. In some cases the threads are completely independent: if their maximum time to completion is known, the master process can simply wait that long instead of waiting for completion status from the threads, or the master process may not care at all when the sub-processes it spawns complete and terminate (exit).
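The messaging scheme mentioned above can be sketched with a thread-safe queue. The worker function and its squared-number "output" are invented for illustration:

```python
import queue
import threading

results = queue.Queue()   # thread-safe channel from workers to the master

def worker(i):
    # Each sub-thread posts its output as a message when it is done.
    results.put((i, i * i))

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()

# The master consumes exactly one completion message per worker, so it
# never needs to poll or guess when the sub-threads finish.
outputs = dict(results.get() for _ in range(4))
for t in threads:
    t.join()
print(sorted(outputs.items()))  # [(0, 0), (1, 1), (2, 4), (3, 9)]
```

Each blocking `get()` doubles as both the data transfer and the completion notification, which is why messaging schemes are a natural fit when threads depend on each other's outputs.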
 
