Major Kernel Functions in Support Vector Machine (SVM)

A kernel function takes data as input and transforms it into the form required for processing. The name "Kernel" comes from the set of mathematical functions used in Support Vector Machines that provide a window through which to manipulate the data. In general, a kernel function transforms the training data so that a non-linear decision surface can be expressed as a linear equation in a higher-dimensional space. Essentially, it returns the inner product between two points in a suitable feature space.

Standard Kernel Function Equation:

K(\bar{x}) = 1, if ||\bar{x}|| \le 1
K(\bar{x}) = 0, otherwise
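As a quick sanity check, this "window" kernel can be written directly in Python (a minimal sketch; the function name uniform_kernel is my own, and NumPy is assumed for the norm):

```python
import numpy as np

def uniform_kernel(x):
    # K(x) = 1 if ||x|| <= 1, else 0 -- the window described above
    return 1.0 if np.linalg.norm(x) <= 1 else 0.0

print(uniform_kernel(np.array([0.5, 0.5])))  # inside the unit ball -> 1.0
print(uniform_kernel(np.array([2.0, 0.0])))  # outside the unit ball -> 0.0
```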

Major Kernel Functions:

To implement these kernel functions, first install the scikit-learn library from a terminal:

    pip install scikit-learn
  • Gaussian Kernel: used to perform transformations when there is no prior knowledge about the data.

K(x, y) = e^{-\frac{||x - y||^2}{2\sigma^2}}
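The formula can be evaluated directly; here is a minimal sketch (the helper name gaussian_kernel is my own, not part of scikit-learn):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # K(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
    return np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))

x = np.array([1.0, 2.0])
print(gaussian_kernel(x, x))         # identical points -> 1.0
print(gaussian_kernel(x, x + 10.0))  # distant points -> nearly 0
```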



  • Gaussian Kernel Radial Basis Function (RBF): the same Gaussian kernel as above, written in radial basis form with a parameter γ that controls how far the influence of a single training point reaches.

K(x, y) = e^{-\gamma ||x - y||^2}

With two landmark points x1 and x2, the kernel values can be combined into a simplified decision rule:

K(x, x1) + K(x, x2) > 0 (Green)
K(x, x1) + K(x, x2) = 0 (Red)


Code:


from sklearn.svm import SVC
classifier = SVC(kernel='rbf', random_state=0)
classifier.fit(x_train, y_train)  # fit on the training samples and labels



  • Sigmoid Kernel: this function is equivalent to a two-layer perceptron model of a neural network, with tanh used as the activation function for the artificial neurons.

K(x, y) = tanh(\gamma \cdot x^T y + r)

Code:


from sklearn.svm import SVC
classifier = SVC(kernel='sigmoid')
classifier.fit(x_train, y_train)  # fit on the training samples and labels
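The sigmoid formula above can be cross-checked against scikit-learn's pairwise sigmoid_kernel helper, which computes tanh(γ·xᵀy + r), where r corresponds to the coef0 parameter (the sample matrix X below is illustrative):

```python
import numpy as np
from sklearn.metrics.pairwise import sigmoid_kernel

X = np.array([[1.0, 2.0], [3.0, 4.0]])
gamma, r = 0.5, 1.0

manual = np.tanh(gamma * X @ X.T + r)  # K(x, y) = tanh(gamma * x^T y + r)
library = sigmoid_kernel(X, X, gamma=gamma, coef0=r)
print(np.allclose(manual, library))  # True
```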



  • Polynomial Kernel: it represents the similarity of vectors in the training set in a feature space over polynomials of the original variables used in the kernel.

K(x, y) = (\gamma \cdot x^T y + r)^d, \gamma > 0

Code:


from sklearn.svm import SVC
classifier = SVC(kernel='poly', degree=4)
classifier.fit(x_train, y_train)  # fit on the training samples and labels



  • Linear Kernel: used when the data is linearly separable; it is simply the inner product of the two vectors.

K(x, y) = x^T y

Code:


from sklearn.svm import SVC
classifier = SVC(kernel='linear')
classifier.fit(x_train, y_train)  # fit on the training samples and labels
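Since the snippets above never define x_train and y_train, here is a self-contained sketch that generates a synthetic dataset (an assumption, via make_classification) and compares test accuracy across the four kernels discussed; exact scores will depend on the data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Hypothetical synthetic data standing in for the article's x_train / y_train
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
x_train, x_test, y_train, y_test = train_test_split(X, y, random_state=0)

scores = {}
for kernel in ('linear', 'poly', 'rbf', 'sigmoid'):
    clf = SVC(kernel=kernel, random_state=0).fit(x_train, y_train)
    scores[kernel] = clf.score(x_test, y_test)  # accuracy on held-out data
    print(kernel, round(scores[kernel], 3))
```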





