Computer Architecture - Synchronisation issue between different processes?

In summary, synchronization is important in parallel computing because it allows multiple threads or processes to work together efficiently and avoid conflicts or issues such as deadlocks. Different methods of synchronization may be used depending on the specific needs and requirements of the program.
  • #1
Lumen Bliss
I'm doing my homework and I've already found out some things about synchronisation issues, but could anyone tell me in a little more detail why synchronisation between different processes, belonging to the same or to different programs, is such an important task in parallel computing?

Thanks in advance! x
 
  • #2
Synchronisation is important because a multi-threaded or multi-tasking program wouldn't function correctly without it. Beyond that, you get into the specific situations that cause program failures when synchronisation is not implemented properly, which are usually illustrated with failure cases. A common example is two processes trying to increment a shared variable: the failure occurs when both processes read the same "old value", so the variable ends up incremented by one instead of two. If the increment operation were synchronized, one process would finish updating the variable before the other could access it, eliminating the failure.

After understanding why synchronisation is important, you'll probably want to move on to the various methods for how it is done.
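As a concrete sketch of the shared-counter example above (in Python, with threads standing in for the two processes): without the lock, both threads can read the same "old value" and lose increments; with the lock, the read-increment-write sequence is atomic.

```python
import threading

# A shared counter incremented by two threads. Without a lock, each
# thread can read the same "old value" and write back old+1, so one
# increment is lost. The lock makes read-increment-write atomic.

counter = 0
lock = threading.Lock()

def increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:              # remove this line to expose the race
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # with the lock this is always 200000
```

Without the `with lock:` line the final count may come out below 200000, and the exact value varies from run to run, which is what makes this class of bug so hard to debug.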
 
  • #3
rcgldr said:
Why synchronisation is important is that a multi-threaded or multi-tasking program wouldn't function properly without it. ... A common example is two processes trying to increment some shared variable; the failure case occurs when both processes get the "old value" and end up incrementing the variable by one instead of two.

Thank you that helped a lot! x
 
  • #4
Threads are used to decouple an application in time (much as modules are used to decouple an application logically).

Let's say you have a big task that can be separated into multiple sub-tasks which can be executed in parallel (meaning all the sub-tasks can be decoupled in time). You can create multiple threads and have each thread run a sub-task. Of course, the solution of the big task is a combination of the solutions of the sub-tasks, and to combine those solutions you need to synchronise the threads. You also need to synchronise the threads whenever they access a common resource.
 
  • #5
atomthick said:
Threads are used to decouple an application in time (like modules are used to create a logical decoupling of applications). ... To combine the solutions you need to synchronize the threads.
Sorry Atomthick, that confused me a little, I need it in layman's terms, lol. Thanks for answering though! x
 
  • #6
Another scenario is a deadlock. Imagine two processes, A and B, that each need to read the same two files, F1 and F2, to continue executing. Without synchronisation, with the processes scheduled at random in each timeslot, a bad run could go like this:

Process A locks F1
Process B locks F2
Process A tries to lock F2 and blocks, because B holds its lock.
Process B tries to lock F1 and blocks, because A holds its lock.

Both processes remain blocked forever because neither can get the resource it needs. This is a deadlock. Some coordination is needed to avoid it. For example, a scheduler might give precedence to A until it finishes and releases the files, and only then allow B to execute.
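One standard way to rule out this deadlock is a fixed global lock order: if every thread always takes F1 before F2, the circular wait can never form. A small sketch (locks stand in for the two files; the names are illustrative):

```python
import threading

# The scenario above, with locks standing in for the two files.
# Because both threads acquire f1 before f2 (a fixed global order),
# the circular wait that causes the deadlock can never form.

f1 = threading.Lock()
f2 = threading.Lock()
done = []

def worker(name: str) -> None:
    with f1:            # every thread takes the locks in the same order
        with f2:
            done.append(name)

a = threading.Thread(target=worker, args=("A",))
b = threading.Thread(target=worker, args=("B",))
a.start(); b.start()
a.join(); b.join()
print(sorted(done))     # both threads finish; no deadlock
```

If one worker instead took f2 first and the other f1 first, the program could hang exactly as in the scenario above.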

I hope I helped. I'm still an undergrad so make sure to double-check what I say ;)
 
  • #7
atomthick said:
Threads are used to decouple an application in time.
Lumen Bliss said:
Sorry Atomthick, that confused me a little, I need it in layman terms.
decouple application in time - I haven't seen this terminology used before either, but he means splitting an application or program into separate threads or processes that can run at the same time, which he explained later in his post. He then explains that if the output (he called it the solution) from each thread needs to be combined with the outputs of the other threads, then some type of synchronization is needed.

Constantinos said:
Another scenario is a deadlock situation. ... Process A locks F1 ... Process B locks F2 ... Process A tries to lock F2 ... scheduling is needed
Scheduling won't solve this issue. What is needed is a mechanism that locks multiple resources in a single "atomic" (uninterruptible by other threads) call. The calling thread waits (blocked) until all of the resources are unlocked, and then locks all of the requested resources at one time. This also eliminates any issues related to priority between threads. The resources are usually unlocked one at a time, since only the locking of multiple resources at the same time is an issue. In the case of Windows, the synchronization function WaitForMultipleObjects() accomplishes this.
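Python's standard library has no direct equivalent of WaitForMultipleObjects, but the all-or-nothing idea can be sketched with non-blocking acquires: try to take every lock, and if any one is busy, release everything already held and retry, so a thread never sits on a partial set of resources while blocked on the rest. (This is an illustration of the idea, not the Windows API; a real implementation would also need a back-off delay to avoid busy-waiting.)

```python
import threading

# Illustrative all-or-nothing acquisition: either we end up holding
# every lock in the list, or we hold none of them and try again.

def lock_all(locks):
    while True:
        acquired = []
        for lk in locks:
            if lk.acquire(blocking=False):
                acquired.append(lk)
            else:
                for held in acquired:   # back off: release everything
                    held.release()
                break
        else:
            return                      # got all of them at once

f1, f2 = threading.Lock(), threading.Lock()
lock_all([f1, f2])
held = (f1.locked(), f2.locked())
print(held)                             # both locks held together
for lk in (f1, f2):
    lk.release()                        # unlocking one at a time is fine
```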

Getting back to the original question, there are different requirements for synchronization. Threads may depend on each other's outputs, in which case some type of messaging scheme is normally used. Or the sub-threads may not depend on each other's outputs, but some master process does, and it needs to know at least when those sub-threads complete. In some cases the threads are completely independent: if their maximum time to completion is known, the master process can simply wait that long instead of waiting for completion status from the threads, or the master process may not care at all when the sub-processes it spawns complete and terminate (exit).
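The first case, a master collecting outputs from its workers via messaging, can be sketched with a thread-safe queue (the squaring "sub-task" here is just a placeholder): workers post results, and the master blocks on `get()` until every expected message has arrived.

```python
import queue
import threading

# Workers post their outputs to a thread-safe queue; the master blocks
# on queue.get() until every expected result message has arrived.

results: "queue.Queue[int]" = queue.Queue()

def worker(value: int) -> None:
    results.put(value * value)          # placeholder sub-task output

workers = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for w in workers:
    w.start()

collected = sorted(results.get() for _ in range(4))   # master waits here
for w in workers:
    w.join()
print(collected)
```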
 

1. What is computer architecture?

Computer architecture refers to the design and organization of the components that make up a computer system, including the hardware, software, and networking infrastructure. It involves understanding how different components interact with each other and how they contribute to the overall functionality of the system.

2. What is synchronisation in computer architecture?

Synchronisation in computer architecture refers to the coordination of different processes or components to ensure that they are accessing shared resources in a controlled and orderly manner. This prevents conflicts and errors that may arise from simultaneous access to the same resource.

3. What is a synchronisation issue between different processes?

A synchronization issue between different processes occurs when multiple processes are trying to access and modify shared resources at the same time. This can lead to data inconsistencies or errors, as the processes may interfere with each other's operations.

4. How can synchronisation issues be resolved?

Synchronisation issues can be resolved by implementing synchronization mechanisms, such as locks, semaphores, or monitors, which ensure that only one process can access a shared resource at a time. These mechanisms help to coordinate the execution of processes and prevent conflicts.
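For example, one of the mechanisms named above, a semaphore, generalises a lock: initialised to N, it admits at most N threads into the protected section at once (with N = 1 it behaves like a plain lock). A small Python sketch:

```python
import threading

# A semaphore initialised to 1 (a binary semaphore) admits only one
# thread into the protected section at a time, like a lock; a larger
# initial count would allow that many threads in concurrently.

sem = threading.Semaphore(1)
shared = []

def append_safely(item: int) -> None:
    with sem:                       # only one thread inside at a time
        shared.append(item)

threads = [threading.Thread(target=append_safely, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(shared))               # all five items, none lost
```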

5. What are some common examples of synchronisation issues?

Some common examples of synchronisation issues include race conditions, where the order of execution of processes affects the outcome, and deadlock, where two or more processes are waiting for each other to release resources, resulting in a standstill. Other examples include data corruption and inconsistent output due to concurrent access to shared resources.
