The effect of different feature selection methods on models created with XGBoost

dc.contributor.author: Neyra, Jorge
dc.contributor.author: Siramshetty, Vishal B.
dc.contributor.author: Ashqar, Huthaifa
dc.date.accessioned: 2024-12-11T17:02:46Z
dc.date.available: 2024-12-11T17:02:46Z
dc.date.issued: 2024-11-08
dc.description.abstract: This study examines the effect that different feature selection methods have on models created with XGBoost, a popular machine learning algorithm with strong built-in regularization. It shows that three different methods for reducing the dimensionality of the feature set produce no statistically significant change in the model's prediction accuracy. This suggests that the traditional rationale for removing noisy training features, namely preventing overfitting, may not apply to XGBoost, although feature selection may still be worthwhile for reducing computational complexity.
dc.description.uri: http://arxiv.org/abs/2411.05937
dc.format.extent: 11 pages
dc.genre: journal articles
dc.genre: preprints
dc.identifier: doi:10.13016/m29pfq-ivsa
dc.identifier.uri: https://doi.org/10.48550/arXiv.2411.05937
dc.identifier.uri: http://hdl.handle.net/11603/37105
dc.language.iso: en_US
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Data Science
dc.relation.ispartof: UMBC Student Collection
dc.rights: Attribution 4.0 International (CC BY 4.0)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Computer Science - Information Retrieval
dc.subject: Computer Science - Machine Learning
dc.title: The effect of different feature selection methods on models created with XGBoost
dc.type: Text
dcterms.creator: https://orcid.org/0000-0002-6835-8338
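The abstract describes reducing the dimensionality of the feature set before training. The record does not say which three selection methods the paper used, so as a hedged illustration only, one simple member of that family, a variance-threshold filter that drops near-constant columns, can be sketched in plain Python:

```python
# Minimal sketch of a variance-threshold feature filter. This is an
# assumption for illustration; the paper's actual selection methods
# are not listed in this metadata record.
from statistics import pvariance

def select_features(rows, threshold=0.0):
    """Keep columns whose population variance exceeds `threshold`.

    rows: list of equal-length numeric feature vectors.
    Returns (kept_column_indices, reduced_rows).
    """
    n_cols = len(rows[0])
    kept = [j for j in range(n_cols)
            if pvariance([r[j] for r in rows]) > threshold]
    reduced = [[r[j] for j in kept] for r in rows]
    return kept, reduced

# Example: column 1 is constant across rows, so it is dropped.
data = [[1.0, 5.0, 2.0],
        [2.0, 5.0, 1.0],
        [3.0, 5.0, 4.0]]
kept, reduced = select_features(data)
# kept == [0, 2]
```

The reduced rows would then be passed to the learner in place of the full feature matrix; the abstract's finding is that, for XGBoost, this step changed prediction accuracy by no statistically significant amount.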

Files

Original bundle

Name: 2411.05937v1.pdf
Size: 424 KB
Format: Adobe Portable Document Format