Difference between Grid Computing and Cluster Computing

Cluster Computing: 
A computer cluster is a local network of two or more homogeneous computers. A computation process running on such a network, i.e. a cluster, is called cluster computing.
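To make the cluster pattern concrete, here is a minimal Python sketch (the worker count and task function are illustrative placeholders, not part of any particular cluster framework): a central scheduler hands identical work units to a pool of homogeneous, dedicated workers, and the pool behaves like a single system.

```python
from multiprocessing import Pool

def process_chunk(chunk_id):
    # Every worker runs the same code on the same kind of node,
    # so the central scheduler can hand any chunk to any worker.
    return f"chunk {chunk_id} done"

if __name__ == "__main__":
    # The Pool plays the role of the cluster's central scheduler:
    # it owns four identical, dedicated workers and assigns tasks to them.
    with Pool(processes=4) as cluster:
        results = cluster.map(process_chunk, range(8))
    print(results)
```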

Grid Computing: 
Grid computing can be defined as a network of homogeneous or heterogeneous computers working together over long distances to perform a task that would be difficult for a single machine.
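By contrast, a grid is loosely coupled: autonomous, possibly heterogeneous nodes pull work when they have spare capacity and may leave at any time. The sketch below simulates that pull-based pattern with threads in a single process; the node names and speeds are made up for illustration, and a real grid (e.g. BOINC-style volunteer computing) would coordinate over the internet instead.

```python
import queue
import threading
import time

# A shared queue stands in for the grid's coordination point;
# nodes pull tasks only when they have unused capacity.
tasks = queue.Queue()
for i in range(9):
    tasks.put(i)

def grid_node(name, seconds_per_task):
    # Nodes are heterogeneous (different speeds) and autonomous:
    # each decides when to take work and may opt out at any time.
    while True:
        try:
            task = tasks.get_nowait()
        except queue.Empty:
            return  # this node drops out once no work remains
        time.sleep(seconds_per_task)  # slower hardware takes longer
        print(f"{name} finished task {task}")

nodes = [
    threading.Thread(target=grid_node, args=(f"node-{n}", 0.01 * n))
    for n in range(1, 4)
]
for t in nodes:
    t.start()
for t in nodes:
    t.join()
```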

Difference between Cluster and Grid Computing: 

| Cluster Computing | Grid Computing |
|---|---|
| Nodes must be homogeneous, i.e. they have the same type of hardware and operating system. | Nodes may be homogeneous or heterogeneous, with different operating systems and hardware. |
| Computers in a cluster are dedicated to the same work and perform no other task. | Computers in a grid contribute their unused processing resources to the grid computing network. |
| Computers are located close to each other. | Computers may be located at a huge distance from one another. |
| Computers are connected by a high-speed local area network. | Computers are connected by a low-speed connection or the internet. |
| Computers are connected in a centralized network topology. | Computers are connected in a distributed or decentralized network topology. |
| Scheduling is controlled by a central server, and the whole system has a centralized resource manager. | A grid may have servers, but most nodes behave independently, and every node manages its own resources. |
| The whole system functions as a single system. | Every node is autonomous, and any node can opt out at any time. |
| Cluster computing is used in areas such as WebLogic Application Servers, databases, etc. | Grid computing is used in areas such as predictive modeling, automation, simulations, etc. |

Last Updated: 21 Feb, 2023