Building semi-supervised decision trees with semi-cart algorithm
Research output: Contribution to journal › Article › peer-review
Licence: CC BY
Decision trees are a fundamental statistical learning tool for addressing classification and regression problems through a recursive partitioning approach that effectively accommodates numerical and categorical data [1, 2]. The Classification and Regression Tree (CART) algorithm underlies modern boosting methodologies such as the Gradient Boosting Machine (GBM), Extreme Gradient Boosting (XGBoost), and the Light Gradient Boosting Machine (LightGBM). However, the standard CART algorithm cannot learn from unlabeled data, which leaves room for improvement. This study proposes several modifications that incorporate test data into the training phase. Specifically, we introduce a method based on graph-based semi-supervised learning, called "Distance-based Weighting," which computes a distance-based weight for each training record and removes irrelevant records from the training set, accelerating training and improving performance. We present the Semi-supervised Classification and Regression Tree (Semi-CART), a new implementation of CART that constructs a decision tree from weighted training data. We evaluated its performance on thirteen datasets from various domains. Our results demonstrate that Semi-CART outperforms standard CART methods and constitutes a useful contribution to statistical learning.
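The abstract does not spell out the weighting rule, so the snippet below is only a minimal sketch of the general idea, not the paper's implementation. It assumes each training record is weighted by its distance to the nearest test (unlabeled) record, that records whose weight falls below a chosen threshold are dropped as irrelevant, and it uses scikit-learn's `DecisionTreeClassifier` with `sample_weight` as a stand-in for a weighted CART; the function name `distance_based_weights` and the reciprocal-distance formula are hypothetical.

```python
import numpy as np
from sklearn.metrics import pairwise_distances
from sklearn.tree import DecisionTreeClassifier

def distance_based_weights(X_train, X_test, threshold=None):
    # Hypothetical weighting: each training record gets a weight that
    # decays with its distance to the nearest test (unlabeled) record.
    d = pairwise_distances(X_train, X_test).min(axis=1)
    w = 1.0 / (1.0 + d)          # closer to the test data => larger weight
    if threshold is not None:
        w[w < threshold] = 0.0   # treat low-weight records as irrelevant
    return w

# Usage: drop zero-weight records and fit a weighted CART-style tree.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 5))
y_train = (X_train[:, 0] > 0).astype(int)
X_test = rng.normal(size=(50, 5))

w = distance_based_weights(X_train, X_test, threshold=0.4)
mask = w > 0
tree = DecisionTreeClassifier(max_depth=4, random_state=0)
tree.fit(X_train[mask], y_train[mask], sample_weight=w[mask])
```

In the paper the weights come from a graph-based semi-supervised construction; the reciprocal-distance rule above is only a placeholder showing where sample weights enter tree construction.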
| Original language | English |
|---|---|
| Pages (from-to) | 4493-4510 |
| Number of pages | 18 |
| Journal | International Journal of Machine Learning and Cybernetics |
| Volume | 15 |
| Issue number | 10 |
| Early online date | 24 Apr 2024 |
| Publication status | Published - 1 Oct 2024 |