Hyperparameter tuning using GridSearchCV and KerasClassifier
  • Last Updated : 26 Nov, 2020

Hyperparameter tuning is done to improve a model's performance by searching over the settings that control how the neural network is built and trained. scikit-learn APIs such as GridSearchCV and RandomizedSearchCV can be used to perform hyperparameter tuning.

In this article, you'll learn how to use GridSearchCV to tune the hyperparameters of a Keras neural network.

Approach: 

  1. Wrap the Keras model with KerasClassifier so it can be used with the scikit-learn API.
  2. Perform cross-validation using KerasClassifier and GridSearchCV.
  3. Tune hyperparameters such as the number of epochs, the number of neurons and the batch size.
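
Before launching the search it helps to estimate its cost: GridSearchCV trains one model per fold for every parameter combination. A quick sketch with scikit-learn's ParameterGrid, using the grid tuned later in this article:

```python
from sklearn.model_selection import ParameterGrid

# the grid of hyperparameters tuned later in this article
params = {'batch_size': [100, 20, 50, 25, 32],
          'epochs': [200, 100, 300, 400],
          'unit': [5, 6, 10, 11, 12, 15]}

n_combinations = len(ParameterGrid(params))  # 5 * 4 * 6 = 120
n_fits = n_combinations * 10                 # cv=10 trains ten models per combination
print(n_combinations, n_fits)  # 120 1200
```

With 1200 network fits in total, a coarser grid (or RandomizedSearchCV) is often tried first in practice.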

Implementation of the scikit-learn classifier API for Keras:

tf.keras.wrappers.scikit_learn.KerasClassifier(
    build_fn=None, **sk_params
)



Code:


# import the libraries 
import numpy as np   # needed later for np.array
import pandas as pd
import tensorflow as tf
from sklearn.compose import ColumnTransformer
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import LabelEncoder, OneHotEncoder, StandardScaler
# import the wrapper from tensorflow.keras so it matches the tf.keras layers used below
from tensorflow.keras.wrappers.scikit_learn import KerasClassifier



The dataset can be downloaded from here.
Import the dataset, which we will use to predict whether a customer stays or leaves. 
 

Code:


# load the dataset; the last column is the binary target (1 if the customer left)
dataset = pd.read_csv('Churn_Modelling.csv')
X = dataset.iloc[:, 3:-1].values  # drop the identifier columns, keep the features
y = dataset.iloc[:, -1].values    # target column



Code: Preprocess the data


# label-encode the Gender column (index 2)
le = LabelEncoder()
X[:, 2] = le.fit_transform(X[:, 2])
# one-hot encode the Geography column (index 1)
ct = ColumnTransformer(transformers=[('encoder', OneHotEncoder(), [1])], remainder='passthrough')
X = np.array(ct.fit_transform(X))
# standardize the features
sc = StandardScaler()
X = sc.fit_transform(X)

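The encoding above can be sanity-checked on a toy array shaped like the first three feature columns (CreditScore, Geography, Gender); the rows below are made up for illustration:

```python
import numpy as np
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import LabelEncoder, OneHotEncoder

# toy rows: [CreditScore, Geography, Gender]
X_toy = np.array([[600, 'France', 'Female'],
                  [700, 'Spain', 'Male'],
                  [650, 'Germany', 'Female']], dtype=object)

le = LabelEncoder()
X_toy[:, 2] = le.fit_transform(X_toy[:, 2])  # Gender -> 0/1

ct = ColumnTransformer(transformers=[('encoder', OneHotEncoder(), [1])],
                       remainder='passthrough')
X_enc = np.array(ct.fit_transform(X_toy))
print(X_enc.shape)  # (3, 5): Geography expands to 3 one-hot columns, 2 columns pass through
```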


To use the KerasClassifier wrapper, we need to define the model inside a function, which is then passed to the build_fn argument of the KerasClassifier constructor. 

Code:


def build_clf(unit):
    # create the layers of the neural network
    ann = tf.keras.models.Sequential()
    ann.add(tf.keras.layers.Dense(units=unit, activation='relu'))
    ann.add(tf.keras.layers.Dense(units=unit, activation='relu'))
    ann.add(tf.keras.layers.Dense(units=1, activation='sigmoid'))
    ann.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
    return ann



Code: create an instance of the KerasClassifier class


model = KerasClassifier(build_fn=build_clf)

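GridSearchCV itself does not know which tuned keys belong where; the wrapper routes them by inspecting signatures: keys that match build_fn's arguments (unit here) are used when building the model, while fit() arguments such as epochs and batch_size go to training. The routing can be sketched with the standard inspect module (route_params below is an illustrative simplification, not the wrapper's actual code):

```python
import inspect

def build_clf(unit):  # stand-in for the model-building function above
    return f'model with {unit} neurons per hidden layer'

def route_params(build_fn, params, fit_args=('epochs', 'batch_size')):
    """Split a flat param dict the way the scikit-learn wrapper does (simplified)."""
    build_keys = set(inspect.signature(build_fn).parameters)
    build_params = {k: v for k, v in params.items() if k in build_keys}
    fit_params = {k: v for k, v in params.items() if k in fit_args}
    return build_params, fit_params

build_params, fit_params = route_params(build_clf,
                                        {'unit': 15, 'epochs': 200, 'batch_size': 20})
print(build_params)  # {'unit': 15}
print(fit_params)    # {'epochs': 200, 'batch_size': 20}
```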


Now we will create a dictionary of the hyperparameters we want to tune and pass it as the param_grid argument to GridSearchCV. 

Code:


# 'epochs' (not the old 'nb_epoch') is the argument name expected by fit()
params = {'batch_size': [100, 20, 50, 25, 32],
          'epochs': [200, 100, 300, 400],
          'unit': [5, 6, 10, 11, 12, 15]}
gs = GridSearchCV(estimator=model, param_grid=params, cv=10)
# now fit the dataset to the GridSearchCV object
gs = gs.fit(X, y)



The best_score_ attribute gives the best mean cross-validated score observed during the search, and best_params_ describes the combination of parameters that achieved it.

Code:


best_params = gs.best_params_
accuracy = gs.best_score_



Output:

Accuracy:  0.80325

Best Params:  {'batch_size': 20, 'epochs': 200, 'unit': 15}
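
best_score_ and best_params_ behave the same way for any scikit-learn estimator, so the pattern can be sanity-checked in seconds with a lightweight model before committing to the full Keras run; everything in this sketch (the synthetic data, the LogisticRegression grid) is illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

rng = np.random.RandomState(0)
X_demo = rng.randn(200, 4)
y_demo = (X_demo[:, 0] + X_demo[:, 1] > 0).astype(int)  # easy, near-separable target

grid = GridSearchCV(estimator=LogisticRegression(solver='liblinear'),
                    param_grid={'C': [0.01, 0.1, 1.0]},
                    cv=5)
grid.fit(X_demo, y_demo)
print('Accuracy: ', grid.best_score_)
print('Best Params: ', grid.best_params_)
```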
