**Hierarchical Clustering:**

Hierarchical clustering is an unsupervised clustering technique that builds clusters in a predefined order, arranged from top to bottom. Similar clusters are grouped together and organized in a hierarchy. It can be further divided into two types: agglomerative hierarchical clustering and divisive hierarchical clustering. In this approach, pairs of clusters are linked step by step until all the data objects are placed in the hierarchy.
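The bottom-up (agglomerative) variant can be sketched in a few lines: start with each point as its own cluster and repeatedly merge the closest pair. This is a minimal illustration, not a production implementation; the 1-D sample data, single-linkage distance, and target cluster count are illustrative assumptions.

```python
def single_linkage_distance(a, b):
    """Distance between two clusters = distance of their closest pair of points."""
    return min(abs(x - y) for x in a for y in b)

def agglomerative(points, n_clusters):
    # Start with every data point in its own singleton cluster.
    clusters = [[p] for p in points]
    # Repeatedly merge the two closest clusters until n_clusters remain.
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = single_linkage_distance(clusters[i], clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]  # merge j into i
        del clusters[j]
    return clusters

# Illustrative 1-D data: two tight groups plus an outlier.
print(agglomerative([1.0, 1.2, 5.0, 5.1, 9.0], 3))
```

Recording the sequence of merges (rather than stopping at a fixed count) is what produces the tree-like dendrogram associated with hierarchical clustering.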

**Non-Hierarchical Clustering:**

Non-hierarchical clustering forms new clusters by merging or splitting clusters, but it does not follow a tree-like structure the way hierarchical clustering does. This technique groups the data so as to maximize or minimize some evaluation criterion. K-means clustering is a widely used non-hierarchical method. Here, the data are partitioned into non-overlapping groups with no hierarchical relationships between them.
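K-means, as mentioned above, minimizes an evaluation criterion (the within-cluster distance to each centroid) by alternating two steps: assign each point to its nearest centroid, then move each centroid to the mean of its assigned points. Below is a minimal sketch of this loop (Lloyd's algorithm) on 1-D data; the sample points, initial centroids, and fixed iteration count are illustrative assumptions.

```python
def kmeans(points, centroids, iterations=10):
    groups = [[] for _ in centroids]
    for _ in range(iterations):
        # Assignment step: each point joins the group of its nearest centroid.
        groups = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            groups[nearest].append(p)
        # Update step: each centroid moves to the mean of its group
        # (an empty group keeps its old centroid).
        centroids = [sum(g) / len(g) if g else c
                     for g, c in zip(groups, centroids)]
    return centroids, groups

# Illustrative 1-D data with two obvious groups and two starting centroids.
centroids, groups = kmeans([1.0, 1.1, 5.0, 5.2], [0.0, 8.0])
```

Note the resulting groups are flat and non-overlapping, with no parent/child relationship between them, which is exactly the non-hierarchical property described above.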

**Difference between Hierarchical Clustering and Non-Hierarchical Clustering:**

| S.No. | Hierarchical Clustering | Non-Hierarchical Clustering |
|---|---|---|
| 1. | Creates clusters in a predefined order from top to bottom. | Forms new clusters by merging or splitting existing ones instead of following a hierarchical order. |
| 2. | Considered less reliable than non-hierarchical clustering. | Comparatively more reliable than hierarchical clustering. |
| 3. | Considered slower than non-hierarchical clustering. | Comparatively faster than hierarchical clustering. |
| 4. | Very problematic to apply when the data contain a high level of error. | Can perform better than hierarchical clustering even in the presence of error. |
| 5. | Results are comparatively easier to read and understand. | The clusters are harder to read and understand than those of hierarchical clustering. |
| 6. | Relatively less stable than non-hierarchical clustering. | A relatively stable technique. |

