Prerequisite: K-Means Clustering | Introduction
The elbow method is a popular technique for choosing the optimal number of clusters K for the K-Means algorithm. The idea is to run K-Means for a range of K values and plot the resulting cost (distortion) against K. As K increases, each cluster contains fewer points, so points lie closer to their centroids and the average distortion decreases. The elbow point is the value of K at which this decline sharply slows; beyond it, adding more clusters yields diminishing returns.
In the figure above, the points clearly form 3 clusters. Now, let's look at the plot of the squared error (cost) for different values of K.
The elbow clearly forms at K = 3, so 3 is the optimal number of clusters for performing K-Means on this data.
Here is another example, this time with 4 clusters.
The corresponding cost graph:
In this case the optimal value of K is 4, as is also apparent from the scatter plot of the points.
Below is the Python implementation:
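The code block that originally followed did not survive extraction. Below is a minimal sketch of the idea, assuming NumPy only: a plain Lloyd's-algorithm K-Means plus a loop that records the cost for K = 1 through 6 on synthetic three-cluster data. The synthetic data, the `kmeans` helper, and the NumPy-only approach are illustrative assumptions, not the article's original code.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain Lloyd's algorithm. Returns the final centroids and the
    cost: total within-cluster sum of squared distances (distortion)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign every point to its nearest centroid
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        # (keep the old centroid if a cluster ends up empty)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    # final cost under the converged centroids
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    cost = d2.min(axis=1).sum()
    return centroids, cost

# three well-separated synthetic clusters of 50 points each
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(50, 2))
               for c in ([0, 0], [5, 5], [10, 0])])

# best-of-5 restarts per K smooths out unlucky initialisations
costs = [min(kmeans(X, k, seed=s)[1] for s in range(5))
         for k in range(1, 7)]
for k, c in zip(range(1, 7), costs):
    print(f"K={k}: cost={c:.1f}")
```

Plotting `costs` against K (for example with matplotlib) reproduces the elbow curves described above: the cost drops steeply up to K = 3 and then flattens, so the elbow sits at K = 3.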