Major Kernel Functions in Support Vector Machine (SVM)
  • Difficulty Level: Medium
  • Last Updated: 16 Jul, 2020

A kernel function is a method that takes data as input and transforms it into the form required for processing. The term "kernel" refers to the set of mathematical functions that give a Support Vector Machine a window through which to manipulate the data. In general, a kernel function transforms the training data so that a non-linear decision surface can be expressed as a linear equation in a higher-dimensional space. Essentially, it returns the inner product between two points in a standard feature space.

Standard Kernel Function Equation:

K (\bar{x}) = 1, \text{ if } ||\bar{x}|| \le 1
K (\bar{x}) = 0, \text{ otherwise}
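As a minimal sketch, this indicator-style kernel can be written directly in Python with NumPy (an illustration of the equation above, not part of any library):

```python
import numpy as np

def indicator_kernel(x):
    # K(x) = 1 if ||x|| <= 1 (point lies inside the unit ball), else 0
    return 1 if np.linalg.norm(x) <= 1 else 0

print(indicator_kernel(np.array([0.3, 0.4])))  # norm is 0.5, inside the unit ball
print(indicator_kernel(np.array([3.0, 4.0])))  # norm is 5.0, outside the unit ball
```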

Major Kernel Functions:

To implement kernel functions, first install the "scikit-learn" library from the command prompt/terminal:

    pip install scikit-learn
  • Gaussian Kernel: It is used to perform transformation when there is no prior knowledge about the data.

K (x, y) = e^{-\frac{||x - y||^2}{2 \sigma^2}}



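The Gaussian kernel above can be computed directly with NumPy (a minimal sketch; the bandwidth sigma is a free parameter you choose):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # K(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
    return np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))

# identical points give the maximum similarity of 1.0
print(gaussian_kernel(np.array([1.0, 2.0]), np.array([1.0, 2.0])))  # 1.0
# similarity decays as the points move apart
print(gaussian_kernel(np.array([0.0]), np.array([2.0])))  # exp(-2)
```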
  • Gaussian Kernel Radial Basis Function (RBF): Same as the Gaussian kernel above, with the radial basis method added to improve the transformation.

K (x, y) = e^{-\gamma ||x - y||^2}

The decision function can be written as a sum of kernel evaluations against reference points, e.g. K(x, x1) + K(x, x2): in the original figure, points where this sum is greater than 0 fall in one region (green) and points where it equals 0 fall in the other (red).


Code:

from sklearn.svm import SVC

# SVM classifier with the RBF kernel
classifier = SVC(kernel='rbf', random_state=0)
# x_train, y_train: training features and labels
classifier.fit(x_train, y_train)
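The snippet above assumes x_train and y_train already exist; a self-contained version using scikit-learn's make_moons toy dataset (an illustrative choice, not from the original article) might look like:

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# toy non-linear dataset: two interleaving half-moons
x_train, y_train = make_moons(n_samples=200, noise=0.1, random_state=0)

classifier = SVC(kernel='rbf', random_state=0)
classifier.fit(x_train, y_train)
print(classifier.score(x_train, y_train))  # training accuracy
```

Because the moons are not linearly separable, this dataset is a good showcase for a non-linear kernel like RBF.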
  • Sigmoid Kernel: This function is equivalent to a two-layer perceptron model of a neural network, and is used as an activation function for artificial neurons.

K (x, y) = \tanh(\gamma \cdot {x^T y} + {r})

Code:

from sklearn.svm import SVC

# SVM classifier with the sigmoid kernel
classifier = SVC(kernel='sigmoid')
# x_train, y_train: training features and labels
classifier.fit(x_train, y_train)
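The sigmoid kernel formula itself is easy to compute by hand with NumPy (a sketch; gamma and r here correspond to SVC's gamma and coef0 parameters):

```python
import numpy as np

def sigmoid_kernel(x, y, gamma=0.5, r=1.0):
    # K(x, y) = tanh(gamma * x^T y + r)
    return np.tanh(gamma * np.dot(x, y) + r)

# with gamma=0.5 and r=0, two unit-ish vectors with dot product 2 give tanh(1)
print(sigmoid_kernel(np.array([1.0, 1.0]), np.array([1.0, 1.0]), gamma=0.5, r=0.0))
```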
  • Polynomial Kernel: It represents the similarity of vectors in the training data in a feature space over polynomials of the original variables used in the kernel.

K (x, y) = (\gamma \cdot {x^T y} + {r})^d, \gamma > 0

Code:

from sklearn.svm import SVC

# SVM classifier with a degree-4 polynomial kernel
classifier = SVC(kernel='poly', degree=4)
# x_train, y_train: training features and labels
classifier.fit(x_train, y_train)
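The polynomial kernel (gamma * x^T y + r)^d can likewise be sketched in NumPy (gamma, r, and d correspond to SVC's gamma, coef0, and degree parameters):

```python
import numpy as np

def polynomial_kernel(x, y, gamma=1.0, r=1.0, d=4):
    # K(x, y) = (gamma * x^T y + r)^d
    return (gamma * np.dot(x, y) + r) ** d

# dot product 1, so (1*1 + 1)^4 = 16
print(polynomial_kernel(np.array([1.0, 0.0]), np.array([1.0, 0.0])))
```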

  • Linear Kernel: It is used when the data is linearly separable, and is simply the inner product of the two vectors.

K (x, y) = {x^T y}

Code:

from sklearn.svm import SVC

# SVM classifier with the linear kernel
classifier = SVC(kernel='linear')
# x_train, y_train: training features and labels
classifier.fit(x_train, y_train)
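To see all four kernels side by side, a self-contained comparison on a synthetic dataset (an illustrative setup, not from the original article) could look like:

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# synthetic binary classification dataset
x_train, y_train = make_classification(n_samples=200, random_state=0)

scores = {}
for kernel in ('linear', 'rbf', 'poly', 'sigmoid'):
    clf = SVC(kernel=kernel, random_state=0).fit(x_train, y_train)
    scores[kernel] = clf.score(x_train, y_train)
    print(kernel, scores[kernel])
```

Training accuracy alone is a rough comparison; in practice you would evaluate each kernel on a held-out test set or with cross-validation.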
