## Introduction

The **Radial Basis Function (RBF)** kernel is one of the most powerful and widely used kernels in the Support Vector Machine (SVM) family of classifiers. In this article, we'll discuss what makes this kernel so effective, look at how it works, and study examples of it in action. We'll also provide code samples for implementing the RBF kernel from scratch in Python, illustrating how to use it on your own data sets. Let's dive in!

## What are Kernels in SVM?

SVM is an algorithm that has shown great success in the field of classification. It separates data into different categories by finding the hyperplane that maximizes the margin, that is, the distance between the hyperplane and the closest points of each class. To handle data that is not linearly separable, SVM relies on kernel functions. Kernel functions are a very powerful tool for exploring high-dimensional spaces: they allow us to perform linear discrimination on nonlinear manifolds, which can lead to higher accuracy and robustness than traditional linear models alone.

If you want a quick overview of SVM kernels, check out this article: SVM Kernels: Polynomial Kernel - From Scratch Using Python.

A kernel function is simply a mathematical function that maps data from a low-dimensional input space into a higher-dimensional feature space in which the data becomes linearly separable, so a support vector machine can find a hyperplane that separates it.

For example, if the input 𝑥 is two-dimensional, a kernel function might implicitly map it into a three-dimensional space in which the two classes can be separated by a plane.
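To make the 2-D to 3-D idea concrete, here is a minimal sketch with two hypothetical points (the points, the `lift` function, and the separating plane `z = 1` are illustrative assumptions, not part of the SVM itself):

```python
import numpy as np

# Hypothetical 2-D points: one inside the unit circle, one outside it.
inner = np.array([0.2, 0.1])
outer = np.array([1.5, -1.0])

def lift(x):
    """Map a 2-D point into 3-D by appending its squared distance from the origin."""
    return np.array([x[0], x[1], x[0] ** 2 + x[1] ** 2])

# In 3-D, the plane z = 1 separates points inside the circle (z < 1)
# from points outside it (z > 1), even though no straight line does so in 2-D.
print(lift(inner))  # third coordinate is 0.05, below the plane
print(lift(outer))  # third coordinate is 3.25, above the plane
```

This explicit mapping is exactly what a kernel lets us avoid computing: the kernel evaluates inner products in the lifted space directly.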

In addition, for some kinds of problems, such as handwriting recognition and face detection, kernel methods can outperform algorithms such as neural networks or tree ensembles, because the kernel function extracts intrinsic properties of the data points.

## The RBF Kernel

The **Radial Basis Function Kernel** is a very powerful kernel used in SVM. Unlike linear or polynomial kernels, the RBF kernel is both more flexible and more efficient: it behaves like a combination of polynomial kernels of many different degrees, projecting non-linearly separable data into a higher-dimensional space where it becomes separable by a hyperplane. The RBF kernel equation is:

**K(X1, X2) = exp(-||X1 - X2||² / 2σ²)**

Here, **||X1 - X2||²** is known as the **Squared Euclidean Distance** between the two points, and **σ** is a free parameter that can be used to tune the equation. If we substitute **γ = 1 / 2σ²**, the equation becomes:

**K(X1, X2) = exp(-γ ||X1 - X2||²)**

The **Squared Euclidean Distance** is multiplied by the **gamma** parameter, and then we take the **exponent** of the negated product. This equation computes the **transformed inner products** for mapping the data into higher dimensions **directly, without actually transforming the entire dataset, which would be inefficient. This is why it is known as the RBF kernel function.**

The plot of the RBF kernel resembles the **Gaussian Distribution curve**, which is known as **a bell-shaped curve.** Thus the RBF kernel is also known as the **Gaussian Radial Basis Kernel.** It is used in machine learning algorithms such as **K-Nearest Neighbors** and **Support Vector Machines.**
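To make the kernel formula concrete, here is a quick numeric evaluation with two assumed 2-D points and an assumed γ = 1 (both chosen purely for illustration):

```python
import numpy as np

x1 = np.array([1.0, 2.0])
x2 = np.array([2.0, 0.0])
gamma = 1.0  # assumed value for illustration

sq_dist = np.sum((x1 - x2) ** 2)  # ||x1 - x2||^2 = (1-2)^2 + (2-0)^2 = 5
k = np.exp(-gamma * sq_dist)      # exp(-5) ≈ 0.0067
print(sq_dist, k)
```

Note the two limiting cases: the kernel equals 1 when the points coincide and decays toward 0 as the distance between them grows, which is exactly the bell-shaped behavior described above.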

## Implementing RBF kernel with SVM using Python

To see the RBF kernel in action, we will use the **Scikit-Learn make_circles** dataset, which generates two concentric circles of points that are not linearly separable.

## Creating the dataset

```python
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.datasets import make_circles

X, y = make_circles(n_samples=500, noise=0.06, random_state=42)
df = pd.DataFrame(dict(x1=X[:, 0], x2=X[:, 1], y=y))
```

```python
colors = {0: 'blue', 1: 'yellow'}
fig, ax = plt.subplots()
grouped = df.groupby('y')
for key, group in grouped:
    group.plot(ax=ax, kind='scatter', x='x1', y='x2', label=key, color=colors[key])
plt.show()
```

Let's first try a **linear kernel** on this data:

```python
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

clf = SVC(kernel="linear")
clf.fit(X, y)
pred = clf.predict(X)
print("Accuracy: ", accuracy_score(pred, y))
# Accuracy:  0.496
```

As expected, a linear kernel cannot separate the concentric circles and scores barely above chance.

Next, let's try the **Polynomial Kernel:**

```python
clf = SVC(kernel="poly")
clf.fit(X, y)
pred = clf.predict(X)
print("Accuracy: ", accuracy_score(pred, y))
# Accuracy:  0.566
```

## The RBF kernel using Python

```python
def RBF(X, gamma):
    # Free parameter gamma; default to 1 / n_features if not given
    if gamma is None:
        gamma = 1.0 / X.shape[1]
    # RBF kernel equation: K[i, j] = exp(-gamma * ||X[i] - X[j]||^2)
    K = np.exp(-gamma * np.sum((X - X[:, np.newaxis]) ** 2, axis=-1))
    return K
```

The function takes the data X and the **gamma** parameter. The gamma parameter can be any value that tunes the equation. You can try giving different values to gamma to see the change in the accuracy of prediction.
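One way to sanity-check our implementation is to compare it against scikit-learn's built-in `sklearn.metrics.pairwise.rbf_kernel`, whose default gamma is also `1 / n_features`. A self-contained sketch (the random sample `X_small` is an arbitrary assumption, and `RBF` repeats the same computation as the function above):

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def RBF(X, gamma):
    # Same computation as the function defined above
    if gamma is None:
        gamma = 1.0 / X.shape[1]
    return np.exp(-gamma * np.sum((X - X[:, np.newaxis]) ** 2, axis=-1))

rng = np.random.default_rng(42)
X_small = rng.normal(size=(5, 2))  # small random sample, for checking only

K_ours = RBF(X_small, gamma=None)
K_ref = rbf_kernel(X_small, gamma=None)  # scikit-learn's default gamma is 1/n_features

print(np.allclose(K_ours, K_ref))  # True
```

The diagonal of the kernel matrix is all ones, since every point is at zero distance from itself.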

We first transform the data with our RBF function, then train a linear SVM on the resulting kernel matrix:

```python
X = RBF(X, gamma=None)

clf = SVC(kernel="linear")
clf.fit(X, y)
pred = clf.predict(X)
print("Accuracy: ", accuracy_score(pred, y))
# Accuracy:  0.94
```
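For comparison, scikit-learn also ships the RBF kernel built in, so the whole pipeline can be done in one step with `SVC(kernel="rbf")`. A sketch on the same circles data (the exact accuracy will depend on the gamma chosen; `gamma="scale"` is scikit-learn's default):

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X_c, y_c = make_circles(n_samples=500, noise=0.06, random_state=42)

# Built-in RBF kernel; the default gamma="scale" sets gamma = 1 / (n_features * X.var())
clf_rbf = SVC(kernel="rbf", gamma="scale")
clf_rbf.fit(X_c, y_c)
acc = accuracy_score(clf_rbf.predict(X_c), y_c)
print("Accuracy:", acc)
```

Using the built-in kernel avoids materializing the full n × n kernel matrix ourselves, which is the practical choice for larger datasets.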