Prerequisites: Decision Tree Classifier
Extremely Randomized Trees Classifier (Extra Trees Classifier) is a type of ensemble learning technique that aggregates the results of multiple de-correlated decision trees collected in a "forest" to output its classification result. In concept, it is very similar to a Random Forest Classifier and only differs from it in the manner in which the decision trees in the forest are constructed.
Each decision tree in the Extra Trees Forest is constructed from the original training sample. Then, at each test node, each tree is provided with a random sample of k features from the feature set, from which it must select the best feature to split the data according to some mathematical criterion (typically the Gini Index). This random sampling of features leads to the creation of multiple de-correlated decision trees.
To perform feature selection using the above forest structure, the normalized total reduction in the splitting criterion brought about by each feature (the Gini Index, if the Gini Index was used in the construction of the forest) is computed during the construction of the forest. This value is called the Gini Importance of the feature. The features are then sorted in descending order of Gini Importance and the user selects the top k features of his/her choice.
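As a minimal sketch of this selection procedure (the data set below is synthetic and the choice of k = 2 is arbitrary, both purely for illustration), scikit-learn's ExtraTreesClassifier exposes these importances through its feature_importances_ attribute, which can be sorted to keep the top k features:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier

# Hypothetical data set: 6 numeric features, only 3 of them informative
X, y = make_classification(n_samples=200, n_features=6,
                           n_informative=3, random_state=0)

# feature_importances_ holds the normalized total reduction of the
# splitting criterion (the Gini Importance) contributed by each feature
forest = ExtraTreesClassifier(n_estimators=100, random_state=0)
forest.fit(X, y)

# Order the features by descending Gini Importance and keep the top k
k = 2
top_k = np.argsort(forest.feature_importances_)[::-1][:k]
X_selected = X[:, top_k]
print("Selected feature indices:", top_k)
```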
Consider the following data (the classic "Play Tennis" data set, with 9 rows labelled "Yes" and 5 labelled "No"):

| Outlook | Temperature | Humidity | Wind | Play Tennis |
|---------|-------------|----------|------|-------------|
| Sunny | Hot | High | Weak | No |
| Sunny | Hot | High | Strong | No |
| Overcast | Hot | High | Weak | Yes |
| Rain | Mild | High | Weak | Yes |
| Rain | Cool | Normal | Weak | Yes |
| Rain | Cool | Normal | Strong | No |
| Overcast | Cool | Normal | Strong | Yes |
| Sunny | Mild | High | Weak | No |
| Sunny | Cool | Normal | Weak | Yes |
| Rain | Mild | Normal | Weak | Yes |
| Sunny | Mild | Normal | Strong | Yes |
| Overcast | Mild | High | Strong | Yes |
| Overcast | Hot | Normal | Weak | Yes |
| Rain | Mild | High | Strong | No |
Let us build a hypothetical Extra Trees Forest for the above data with five decision trees, and let k, the number of features in each random sample, be two. Here the decision criterion used will be Information Gain. First, we calculate the entropy of the data. The formula for calculating the entropy is:

Entropy(S) = -\sum_{i=1}^{c} p_i \log_2(p_i)

where c is the number of unique class labels and p_i is the proportion of rows with output label i.
Therefore, for the given data (9 rows labelled "Yes" and 5 labelled "No"), the entropy is:

Entropy(S) = -\frac{9}{14}\log_2\frac{9}{14} - \frac{5}{14}\log_2\frac{5}{14} \approx 0.940
Let the five decision trees be constructed such that each tree is provided with a random sample of two features: Outlook is sampled by two of the trees, Temperature by three, Humidity by three, and Wind by two.
Note that the formula for Information Gain is:

Gain(S, A) = Entropy(S) - \sum_{v \in Values(A)} \frac{|S_v|}{|S|}\, Entropy(S_v)

where Values(A) is the set of values feature A can take and S_v is the subset of rows for which feature A has value v.
Using the above-given formulas, the Information Gain of each feature with respect to the training sample works out to:

Gain(S, Outlook) = 0.246
Gain(S, Temperature) = 0.029
Gain(S, Humidity) = 0.151
Gain(S, Wind) = 0.048

Every tree in the forest is built from the original training sample, so each tree that samples a feature contributes this same gain for it.
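As one worked instance of the formula (using the per-value class counts from the table above: Sunny has 2 "Yes" and 3 "No" rows, Overcast has 4 "Yes" and 0 "No", and Rain has 3 "Yes" and 2 "No"):

Entropy(S_{Sunny}) = -\frac{2}{5}\log_2\frac{2}{5} - \frac{3}{5}\log_2\frac{3}{5} \approx 0.971

Entropy(S_{Overcast}) = 0

Entropy(S_{Rain}) = -\frac{3}{5}\log_2\frac{3}{5} - \frac{2}{5}\log_2\frac{2}{5} \approx 0.971

Gain(S, Outlook) = 0.940 - \frac{5}{14}(0.971) - \frac{4}{14}(0) - \frac{5}{14}(0.971) \approx 0.246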
Computing the total Info Gain for each feature across the forest:

Total Info Gain for Outlook = 0.246 + 0.246 = 0.492
Total Info Gain for Temperature = 0.029 + 0.029 + 0.029 = 0.087
Total Info Gain for Humidity = 0.151 + 0.151 + 0.151 = 0.453
Total Info Gain for Wind = 0.048 + 0.048 = 0.096
Thus, according to the above-constructed Extra Trees Forest, the most important feature for determining the output label is "Outlook".
The code below demonstrates how to perform feature selection using an Extra Trees Classifier.
Step 1: Importing the required libraries
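A sketch of the imports assumed in the remaining steps (pandas for data handling, Matplotlib for the comparison plot, and scikit-learn for the forest):

```python
# Importing the required libraries
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.ensemble import ExtraTreesClassifier
```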
Step 2: Loading and Cleaning the Data
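A sketch of the loading step, assuming the worked-example data has been saved to a hypothetical CSV file named play_tennis.csv with the column names used in the table above; since the features are categorical strings, they are converted to integer codes so the trees can split on them:

```python
# Loading the data (hypothetical file mirroring the table above)
df = pd.read_csv('play_tennis.csv')

# Separating the dependent and independent variables
y = df['Play Tennis']
X = df.drop('Play Tennis', axis=1)

# Encoding each categorical column (e.g. 'Sunny', 'Hot') as integer codes
X_encoded = X.apply(lambda col: pd.factorize(col)[0])
```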
Step 3: Building the Extra Trees Forest and computing the individual feature importances
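A sketch of the forest construction, mirroring the worked example: five trees, entropy (Information Gain) as the splitting criterion, and two randomly sampled features considered per split:

```python
# Building the Extra Trees Forest
extra_tree_forest = ExtraTreesClassifier(n_estimators=5,
                                         criterion='entropy',
                                         max_features=2)

# Training the model
extra_tree_forest.fit(X_encoded, y)

# Computing the individual feature importances (normalized total reduction
# of the splitting criterion contributed by each feature)
feature_importance = extra_tree_forest.feature_importances_
```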
Step 4: Visualizing and Comparing the results
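A sketch of the comparison plot, drawing one bar per feature so the computed importances can be compared at a glance:

```python
# Plotting a bar graph to compare the feature importances
plt.bar(X_encoded.columns, feature_importance)
plt.xlabel('Feature Labels')
plt.ylabel('Feature Importances')
plt.title('Comparison of Feature Importances')
plt.show()
```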
The above output validates our theory about feature selection using an Extra Trees Classifier. Note that the importance values may differ from run to run because of the random nature of the feature samples.