Prerequisite: K-Nearest Neighbours Algorithm
K-Nearest Neighbors is one of the most basic yet essential classification algorithms in Machine Learning. It belongs to the supervised learning domain and is widely applied in pattern recognition, data mining, and intrusion detection. It is readily applicable in real-life scenarios because it is non-parametric, meaning it makes no underlying assumptions about the distribution of the data (as opposed to algorithms such as GMM, which assume a Gaussian distribution of the given data).
This article will demonstrate how to implement the K-Nearest Neighbors classifier algorithm using Python's scikit-learn (sklearn) library.
Step 1: Importing the required Libraries
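A representative set of imports for this workflow is sketched below; the exact modules depend on the code used in the later steps, so treat this as an assumption rather than the article's original listing.

```python
# Libraries typically needed for a KNN classification workflow.
import numpy as np                                  # numerical arrays
import pandas as pd                                 # tabular data handling
import matplotlib.pyplot as plt                     # plotting
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier  # the KNN classifier
```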
Step 2: Reading the Dataset
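The original article's dataset is not specified here, so as a stand-in the sketch below loads scikit-learn's built-in Iris dataset into a pandas DataFrame; if you have a CSV file, `pd.read_csv` would take its place.

```python
import pandas as pd
from sklearn.datasets import load_iris

# Built-in Iris dataset used as a stand-in for the article's dataset.
iris = load_iris()
df = pd.DataFrame(iris.data, columns=iris.feature_names)
df["target"] = iris.target  # class label column

print(df.head())
```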
Step 3: Training the model
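A minimal training sketch, again assuming the Iris data as a stand-in: split the data into training and test sets, then fit a `KNeighborsClassifier` (the default `n_neighbors=5` is used here; the value of k is tuned in a later step).

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Hold out 20% of the data for evaluation; fixed seed for reproducibility.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# n_neighbors=5 is sklearn's default; tuned later in the article.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
```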
Step 4: Evaluating the model
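Evaluation can be sketched as below: predict on the held-out test set and compute the accuracy (the dataset and split mirror the assumed setup from the previous steps).

```python
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)

# Fraction of correctly classified test samples.
y_pred = knn.predict(X_test)
acc = accuracy_score(y_test, y_pred)
print("Test accuracy:", acc)
```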
We now try to find the optimal value of k, i.e. the number of nearest neighbours to consider.
Step 5: Plotting the training and test scores graph
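One way to produce such a graph (a sketch under the same Iris/split assumptions as above) is to fit a classifier for a range of k values, record the training and test accuracy for each, and plot both curves against k:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripted runs
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

neighbors = range(1, 16)
train_scores, test_scores = [], []
for k in neighbors:
    # Refit the model for each candidate value of k.
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    train_scores.append(knn.score(X_train, y_train))
    test_scores.append(knn.score(X_test, y_test))

plt.plot(neighbors, train_scores, marker="o", label="Training score")
plt.plot(neighbors, test_scores, marker="o", label="Test score")
plt.xlabel("n_neighbors (k)")
plt.ylabel("Accuracy")
plt.legend()
plt.savefig("knn_scores.png")
```

The best k is where the test score peaks without the training score being perfect (which would indicate overfitting at very small k).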
From the above plot, we can conclude that the optimal value of k is around 5.