The effect of different feature selection methods on models created with XGBoost
Date
2024-11-08
Rights
Attribution 4.0 International CC BY 4.0
Abstract
This study examines the effect that different feature selection methods have on models created with XGBoost, a popular machine learning algorithm with strong built-in regularization. It shows that three different ways of reducing feature dimensionality produce no statistically significant change in the model's prediction accuracy. This suggests that the traditional rationale for removing noisy training features, namely preventing overfitting, may not apply to XGBoost, although feature selection may still be worthwhile for reducing computational cost.
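The comparison described above can be sketched as follows. The abstract does not name the three feature selection methods used, so the ones below (variance threshold, univariate k-best, and model-based selection) are illustrative assumptions, and scikit-learn's GradientBoostingClassifier stands in for XGBoost to keep the sketch dependency-free; the actual study used XGBoost itself.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import (SelectFromModel, SelectKBest,
                                       VarianceThreshold, f_classif)
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data: 30 features, of which only 10 are informative.
X, y = make_classification(n_samples=500, n_features=30,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# Hypothetical selection methods; the study's actual methods are not
# named in the abstract.
selectors = {
    "no_selection": None,
    "variance_threshold": VarianceThreshold(threshold=0.5),
    "k_best": SelectKBest(f_classif, k=10),
    "model_based": SelectFromModel(
        GradientBoostingClassifier(random_state=0)),
}

def evaluate(selector):
    """Fit a boosted-tree model on (optionally reduced) features
    and return held-out accuracy."""
    if selector is None:
        Xtr, Xte = X_tr, X_te
    else:
        Xtr = selector.fit_transform(X_tr, y_tr)
        Xte = selector.transform(X_te)
    clf = GradientBoostingClassifier(random_state=0).fit(Xtr, y_tr)
    return accuracy_score(y_te, clf.predict(Xte))

for name, sel in selectors.items():
    print(f"{name}: {evaluate(sel):.3f}")
```

On data like this, the accuracies typically land close together, mirroring the study's finding that dimensionality reduction changes boosted-tree accuracy little, even when it shrinks the feature set considerably.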