Computer Architecture - Synchronisation issue between different processes?

AI Thread Summary
Synchronization is crucial in parallel computing because it ensures that multi-threaded or multi-tasking programs function correctly. Without proper synchronization, processes may access shared variables simultaneously, leading to incorrect results, such as a shared variable ending up incremented by one instead of two. Deadlock situations can also arise when processes lock resources needed by each other, causing them to block indefinitely. Effective synchronization methods, such as atomic resource locking and messaging schemes, are essential for managing dependencies between threads and ensuring smooth execution. Understanding these concepts is vital for developing efficient parallel applications.
Lumen Bliss
I'm doing my homework and I've already found out some things about synchronisation issues, but could anyone tell me in a little more detail why synchronisation between different processes, belonging to the same or different programs, is an important task in parallel computing?

Thanks in advance! x
 
Synchronisation is important because a multi-threaded or multi-tasking program wouldn't function correctly without it. Beyond that, you get into the specific situations that cause failures when synchronisation is not implemented properly, which are usually illustrated with failure cases. A common example is two processes trying to increment a shared variable: both read the "old value", each adds one, and one write overwrites the other, so the variable ends up incremented by one instead of two. If the increment were synchronised, one process would finish incrementing the variable before the other could access it, eliminating the failure.

After understanding why synchronisation is important, you'll probably want to move onto the various methods for how it is done.
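The lost-update failure described above can be forced deterministically in a short sketch. This uses Python's threading module; the Barrier is there only to guarantee that both threads read the old value before either writes, simulating the unlucky interleaving that normally occurs at random:

```python
import threading

counter = 0
barrier = threading.Barrier(2)

def unsafe_increment():
    global counter
    old = counter        # both threads read the "old value" (0)...
    barrier.wait()       # ...before either one writes it back
    counter = old + 1    # one increment overwrites the other

threads = [threading.Thread(target=unsafe_increment) for _ in range(2)]
for t in threads: t.start()
for t in threads: t.join()
unsafe_result = counter
print(unsafe_result)     # 1 -- incremented by one instead of two

# Synchronised version: the lock makes the read-modify-write atomic,
# so one thread finishes before the other can access the variable.
counter = 0
lock = threading.Lock()

def safe_increment():
    global counter
    with lock:
        counter += 1

threads = [threading.Thread(target=safe_increment) for _ in range(2)]
for t in threads: t.start()
for t in threads: t.join()
print(counter)           # 2 -- the correct result
```

Without the forced interleaving the unsynchronised version would appear to work most of the time, which is exactly what makes these bugs hard to find.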
 
rcgldr said:
Why synchronisation is important is that a multi-threaded or multi-tasking program wouldn't function properly without it. ... After understanding why synchronisation is important, you'll probably want to move onto the various methods for how it is done.

Thank you that helped a lot! x
 
Threads are used to decouple an application in time (like modules are used to create a logical decoupling of applications).

Let's say you have a big task that can be split into multiple sub-tasks which can be executed in parallel (meaning all sub-tasks can be decoupled in time). You can create multiple threads, with each thread running one sub-task. Of course, the solution of the big task is a combination of the solutions of the sub-tasks. To combine the solutions you need to synchronize the threads. You also need to synchronize the threads if they access a common resource.
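As a concrete sketch of this split-solve-combine pattern (Python threads, with summing a list as a stand-in "big task"):

```python
import threading

data = list(range(100))                  # the "big task": sum all of data
chunks = [data[i::4] for i in range(4)]  # split it into 4 sub-tasks
partial = [0] * 4

def solve(i):
    partial[i] = sum(chunks[i])          # each thread solves one sub-task

threads = [threading.Thread(target=solve, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()        # synchronise: wait until every sub-task is finished
total = sum(partial)                     # combine the sub-solutions
print(total)                             # 4950, same as sum(data)
```

Here join() is the synchronisation point: combining the partial results before every thread has finished would give a wrong total.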
 
atomthick said:
Threads are used to decouple an application in time ... You can also synchronize the threads in case they access a common resource.
Sorry Atomthick, that confused me a little, I need it in layman's terms, lol. Thanks for answering though! x
 
Another scenario is a deadlock situation. Imagine that two processes, A and B, each need to read 2 files to continue their execution, and they both need the same two: F1 and F2. Without synchronisation, with the processes executing in random order at each timeslot, a bad scenario would go like this:

Process A locks F1
Process B locks F2
Process A tries to lock F2 and blocks because B holds its lock.
Process B tries to lock F1 and blocks because A holds its lock.
...
...
The processes remain blocked forever because neither can get its resources. This is a deadlock situation. Scheduling is needed to avoid this: for example, a scheduler might give precedence to A until it finishes and releases the files, then allow B to execute, and so on.

I hope I helped. I'm still an undergrad so make sure to double-check what I say ;)
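The F1/F2 scenario above can be reproduced in a small sketch (Python threading; acquire() is given a timeout so the demonstration terminates instead of blocking forever, and a Barrier forces the unlucky interleaving on every run):

```python
import threading

f1, f2 = threading.Lock(), threading.Lock()  # the two "files"
barrier = threading.Barrier(2)
blocked = []

def process(name, first, second):
    first.acquire()                  # lock one file
    barrier.wait()                   # both processes now hold one lock each
    if not second.acquire(timeout=0.2):  # a real program would block here forever
        blocked.append(name)
    else:
        second.release()
    barrier.wait()                   # neither gives up its lock until both have tried
    first.release()

a = threading.Thread(target=process, args=("A", f1, f2))
b = threading.Thread(target=process, args=("B", f2, f1))
a.start(); b.start(); a.join(); b.join()
print(sorted(blocked))               # ['A', 'B'] -- both deadlocked on the other's lock
```

A common avoidance rule is to make every process acquire the locks in the same global order (always F1 before F2), which breaks the circular wait.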
 
atomthick said:
Threads are used to decouple an application in time.
Lumen Bliss said:
Sorry Atomthick, that confused me a little, I need it in layman's terms.
"decouple an application in time" - I haven't seen this terminology used before either, but he means splitting an application or program into separate threads or processes that can run at the same time, which he explained later in his post. He then explains that if the output (he called it the solution) from each thread needs to be combined with the output of other threads, then some type of synchronization is needed.

Constantinos said:
Another scenario is a deadlock situation. ... Process A locks F1 ... Process B locks F2 ... Process A tries to lock F2 ... scheduling is needed
Scheduling won't solve this issue. What is needed is a mechanism that allows locking multiple resources in a single "atomic" (uninterruptible by other threads) call. The calling thread waits (blocked) until all of the resources are unlocked, and then locks all of the requested resources at one time. This also eliminates any issues related to priority between threads. The resources are usually unlocked one at a time, since only locking multiple resources at the same time is an issue. On Windows, the synchronization function WaitForMultipleObjects() accomplishes this.
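The all-at-once locking idea can be sketched portably with a condition variable. This MultiLock class is a hypothetical illustration of the technique, not the Windows API (WaitForMultipleObjects itself is Windows-specific):

```python
import threading

class MultiLock:
    """Sketch: block until ALL requested resources are free, then claim
    them in one atomic step, so no thread ever holds one resource while
    waiting for another (the situation that permits deadlock)."""

    def __init__(self):
        self._cond = threading.Condition()
        self._held = set()

    def acquire_all(self, names):
        with self._cond:
            # wait until none of the requested resources is held
            while self._held & set(names):
                self._cond.wait()
            self._held.update(names)     # take them all at once

    def release(self, name):
        with self._cond:
            self._held.discard(name)     # unlocking one at a time is fine
            self._cond.notify_all()      # wake threads waiting to acquire

# Two workers both need F1 and F2; neither can deadlock the other.
ml = MultiLock()
finished = []

def worker(name):
    ml.acquire_all(["F1", "F2"])
    finished.append(name)
    ml.release("F1")
    ml.release("F2")

ts = [threading.Thread(target=worker, args=(n,)) for n in ("A", "B")]
for t in ts: t.start()
for t in ts: t.join()
print(sorted(finished))  # ['A', 'B'] -- both complete, no deadlock possible
```

The key property is that acquire_all() either gets everything or holds nothing while it waits, so the circular-wait condition for deadlock can never arise.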

Getting back to the original question, there are different requirements for synchronization. Threads may depend on each other's outputs, in which case some type of messaging scheme is normally used. Or the sub-threads may not depend on other sub-threads' outputs, but some master process does, and it needs to know at least when those sub-threads complete. In some cases the threads are completely independent: if their maximum time to completion is known, the master process can simply wait that long instead of waiting for completion status from the threads, or the master process may not care at all when the sub-processes it spawns complete and terminate (exit).
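A minimal sketch of such a messaging scheme, using Python's queue.Queue (which is itself internally synchronised): each sub-thread sends its output to the master as a message, and the master blocks on get() until the results arrive:

```python
import queue
import threading

results = queue.Queue()

def worker(n):
    results.put((n, n * n))   # send this thread's output as a message

threads = [threading.Thread(target=worker, args=(i,)) for i in range(3)]
for t in threads:
    t.start()

# The master waits for one message per worker; get() blocks until a
# message is available, so no busy-waiting or join() is needed here.
collected = dict(results.get() for _ in range(3))
print(collected == {0: 0, 1: 1, 2: 4})   # True (messages may arrive in any order)
```

The queue also decouples the threads in time: a worker can finish and exit long before the master gets around to reading its message.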
 