
Sunday, October 2, 2016

rsktutors

Deadlock

Dear Reader,
        This morning I was using my computer for different kinds of work, and I noticed that it would sometimes hang (perhaps for unknown reasons). That prompted me to share some information about the deadlock concept, which often comes up in connection with operating systems. So let's begin our discussion.
What is a Deadlock?
        Before answering this question, note that a computer system has a limited set of resources: memory, printers, tape drives, the CPU, and so on. Every job the system performs has to share these same resources with other jobs, and it is at this point that a deadlock can arise.

        Now we can state the meaning of deadlock. A deadlock is the situation in which one process requests a resource that is already being used by another process, while that second process in turn needs a resource held by the first. Neither process can execute; each remains in the waiting state forever. This situation is known as a deadlock. The following simple figure explains this.

In general, a deadlock occurs in concurrent programming when two or more operations/processes are each waiting for the other to complete, with the result that nothing completes.
  
How to Avoid Deadlock:

        A deadlock can arise only when all four of the following conditions (known as the Coffman conditions) hold at the same time, so preventing any one of them is enough to prevent deadlock.

a. Mutual Exclusion:
        If one process is using shared (possibly modifiable) data, no other process may operate on that data at the same time. When more than one process tries to operate on the shared data, only one is allowed in; all the others are kept in the waiting state so that the critical section (the code that accesses the shared resource) is never entered concurrently. Once the first process has finished, one of the waiting processes is permitted to use the shared data. This technique is known as mutual exclusion.
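As a small illustration of mutual exclusion in Python (the counter and the thread count are arbitrary choices for this sketch), a `threading.Lock` guarantees that only one thread modifies the shared data at a time, so no updates are lost:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:       # only one thread may enter this critical section at a time
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 — no lost updates
```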


b. Hold and Wait:
        A process holds the resources it has already been granted while waiting for the additional resources it has requested. Deadlock can be prevented by requiring a process to request all of its resources at once, or to release the resources it holds before requesting new ones.
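One common way to break the hold-and-wait condition is all-or-nothing acquisition: a process takes every resource it needs in a single step, and if any resource is unavailable it releases everything it holds. A minimal Python sketch (the helper name and timeout are illustrative assumptions, not from the article):

```python
import threading

def acquire_all_or_none(locks, timeout=0.1):
    """Acquire every lock, or none: on any failure, release what we hold."""
    held = []
    for lk in locks:
        if lk.acquire(timeout=timeout):
            held.append(lk)
        else:
            for h in reversed(held):  # back out, so we never hold-and-wait
                h.release()
            return False
    return True

a, b = threading.Lock(), threading.Lock()
print(acquire_all_or_none([a, b]))  # True — both locks taken in one step
```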


c. No Preemption:
          Preemption means temporarily interrupting an operation being carried out by the computer. Here, a resource cannot be forcibly taken away from the process holding it; if preemption is not allowed, deadlock can occur.
d. Circular Wait:
          The processes form a circular chain in which each process is waiting for a resource held by the next process in the chain.
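Circular wait is often broken by imposing a fixed global ordering on resources, so that every process acquires its locks in the same order and no cycle of waits can form. A minimal Python sketch, using the object `id` as an arbitrary global order (an assumption for illustration only):

```python
import threading

def acquire_ordered(*locks):
    """Acquire locks in a fixed global order (here: by id), so no wait cycle can form."""
    ordered = sorted(locks, key=id)
    for lk in ordered:
        lk.acquire()
    return ordered

def release_all(ordered):
    for lk in reversed(ordered):
        lk.release()

a, b = threading.Lock(), threading.Lock()

# Two "processes" that both want the pair request it through the same helper,
# so they always take the locks in the same order and can never deadlock.
held = acquire_ordered(b, a)
release_all(held)
```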


OK. But how do we handle a deadlock once it has already occurred? Let us see.

How to handle Deadlock?

a. Preemption:
          Take a resource away from the process currently holding it and allocate it to another requesting process. This breaks the deadlock, but it may lead to other kinds of problems.
b. Rollback:
          Keep a regular record (checkpoint) of the system's operations at each stage. Whenever a deadlock occurs, immediately roll everything back to the previous checkpoint.
c. Killing one or more processes:
        Simply terminate one or more processes to break the deadlock. It is simple, but care must be taken in choosing which process to kill.

We can also avoid deadlock by using the Banker's algorithm, developed by Edsger Dijkstra. The Banker's algorithm considers each request as it occurs and checks whether granting it would leave the system in a safe state. If the resulting state is safe, the request is granted; otherwise, it is postponed.
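The heart of the Banker's algorithm is the safety check described above: simulate whether every process could still run to completion from the current state. A minimal Python sketch (the vectors below are a standard textbook-style example, not data from the article):

```python
def is_safe(available, allocation, need):
    """Banker's safety check: True if every process can still run to completion."""
    work = available[:]                      # resources currently free
    finished = [False] * len(allocation)
    progressed = True
    while progressed:
        progressed = False
        for i in range(len(allocation)):
            if not finished[i] and all(n <= w for n, w in zip(need[i], work)):
                # process i can finish; it then returns everything it holds
                work = [w + a for w, a in zip(work, allocation[i])]
                finished[i] = True
                progressed = True
    return all(finished)                     # safe iff everyone can finish

# Example state: 5 processes, 3 resource types
available  = [3, 3, 2]
allocation = [[0, 1, 0], [2, 0, 0], [3, 0, 2], [2, 1, 1], [0, 0, 2]]
need       = [[7, 4, 3], [1, 2, 2], [6, 0, 0], [0, 1, 1], [4, 3, 1]]

print(is_safe(available, allocation, need))  # True — a safe sequence exists
```

To decide on a request, the algorithm tentatively grants it (subtracting from `available`, adding to `allocation`) and grants it for real only if `is_safe` still returns `True`.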


With this information, I hope you can understand the deadlock situation and the methods for preventing and handling it. Please share this information, and feel free to leave comments or suggestions in the section below. See you in my next article.
Thank You.




Multiprocessing


Dear reader,

       Today, I would like to share information about one of the core concepts of computer systems: multiprocessing. Nowadays almost all computers fall under this category. In simple terms, we do many things with a computer at the same time; for example, typing a letter, listening to music, and downloading a file from the internet, all running simultaneously. This capability is known as multiprocessing. Let us move on to a detailed discussion.

Multiprocessing is the capability of a computer system to perform more than one operation/task/job at the same time. A multiprocessor operating system runs different jobs/tasks on more than one central processing unit within a single computer. In this case, all the CPUs are in communication, meaning they can share the computer system's peripherals: memory, the bus, and some I/O devices. Hence, this type of computer system is also known as a tightly coupled or symmetric system.

We use this type of computer system to process huge amounts of data at high speed, e.g. satellite control systems and weather-forecasting systems. The following figure gives an idea of the architecture of a multiprocessing operating system.



    In a multiprocessing computer system, every processor runs its own operating system instance and communicates with the others. This relationship is referred to as a master-slave relationship: when a job is assigned, the main processor (the master) controls all the operations related to that job. This method is very useful because such systems share the available resources, so a task can be completed in a shorter time than on a single-processor system. Another advantage is fault tolerance: if one of the communicating processors fails, the process is not terminated, whereas that is not possible in the case of a single-processor system.

To achieve this, both the processor and the motherboard must support multiprocessing. The operating system controls the entire work/job/task by allocating its various parts to different processors. This method follows the threading concept: a single operation can be divided into a number of small tasks (known as subroutines) that run independently, each on its own processor, simultaneously.

Don't confuse multiprocessing with multiprogramming. Multiprogramming refers to the execution of multiple programs by a single processor, interleaved over time, whereas multiprocessing means the parallel execution of multiple processes using more than one processor. It does not mean that a single process requires several processors.

In the early days, processors were very expensive and peripherals were very slow. When the computer executed a program, it had to access the required peripherals, and because those peripherals had much lower processing speed, the CPU was left waiting. The first multiprogramming computer system was the LEO III, designed by Lyons and Co. Multiprogramming was later enhanced by the technologies of virtual memory and virtual machines. Note that multiprogramming gives no assurance that an application/program/job/task will run in a timely manner. Multiprocessing systems that treat all CPUs equally are known as symmetric multiprocessing (SMP) systems, in which both hardware and software are connected to a single shared main memory. When the CPUs are not all equal, system resources may instead be divided according to asymmetric multiprocessing, clustered processing, or non-uniform memory access (NUMA) multiprocessing.

There are mainly three kinds of problems with multiprocessor systems, namely the locking system, shared data, and false sharing. Let us discuss these terms one by one.

Locking System:
        Locks help us write correct code for multiprocessors. The point of multiprocessing is to increase total performance by executing different tasks concurrently on different CPUs, which means we must provide safe access to the resources shared among the processors. Locks give us serializable access to those resources.

Shared Data:
        The cache-coherence protocol allows multiprocessors to access shared data in a serializable manner. Delays in serialization hurt system performance, and cache-coherence traffic on the interconnection network can also reduce it. These costs can be eliminated by avoiding the sharing of data altogether.

False Sharing:
        This type of problem arises when unrelated data items used by different processors are located next to each other in memory and therefore share a single cache line (a cache line is the unit in which memory is stored within the cache system). False sharing occurs when threads on different processors modify variables that reside on the same cache line. It is a well-known performance issue on SMP (symmetric multiprocessing) systems.

With this basic information, I hope you understand the importance of multiprocessing. See you in my next article. Please feel free to leave comments or suggestions in the section below.
Thank You.





