Building semi-supervised decision trees with semi-cart algorithm

Research output: Contribution to journal › Article › peer-review

Standard

Building semi-supervised decision trees with semi-cart algorithm. / Abedinia, Aydin; Seydi, Vahid.
In: International Journal of Machine Learning and Cybernetics, 24.04.2024.


Harvard

Abedinia, A & Seydi, V 2024, 'Building semi-supervised decision trees with semi-cart algorithm', International Journal of Machine Learning and Cybernetics. https://doi.org/10.1007/s13042-024-02161-z

APA

Abedinia, A., & Seydi, V. (2024). Building semi-supervised decision trees with semi-cart algorithm. International Journal of Machine Learning and Cybernetics. https://doi.org/10.1007/s13042-024-02161-z

CBE

Abedinia A, Seydi V. 2024. Building semi-supervised decision trees with semi-cart algorithm. International Journal of Machine Learning and Cybernetics. https://doi.org/10.1007/s13042-024-02161-z

MLA

Abedinia, Aydin and Vahid Seydi. "Building semi-supervised decision trees with semi-cart algorithm". International Journal of Machine Learning and Cybernetics. 2024. https://doi.org/10.1007/s13042-024-02161-z

Vancouver

Abedinia A, Seydi V. Building semi-supervised decision trees with semi-cart algorithm. International Journal of Machine Learning and Cybernetics. 2024 Apr 24. doi: 10.1007/s13042-024-02161-z

Author

Abedinia, Aydin ; Seydi, Vahid. / Building semi-supervised decision trees with semi-cart algorithm. In: International Journal of Machine Learning and Cybernetics. 2024.

RIS

TY - JOUR

T1 - Building semi-supervised decision trees with semi-cart algorithm

AU - Abedinia, Aydin

AU - Seydi, Vahid

PY - 2024/4/24

Y1 - 2024/4/24

N2 - Decision trees are a fundamental statistical learning tool for addressing classification and regression problems through a recursive partitioning approach that effectively accommodates numerical and categorical data [1, 2]. The Classification and regression tree (CART) algorithm underlies modern Boosting methodologies such as Gradient boosting machine (GBM), Extreme gradient boosting (XGBoost), and Light gradient boosting machine (LightGBM). However, the standard CART algorithm may require improvement due to its inability to learn from unlabeled data. This study proposes several modifications to incorporate test data into the training phase. Specifically, we introduce a method based on Graph-based semi-supervised learning called “Distance-based Weighting,” which calculates and removes irrelevant records from the training set to accelerate the training process and improve performance. We present Semi-supervised classification and regression tree (Semi-Cart), a new implementation of CART that constructs a decision tree using weighted training data. We evaluated its performance on thirteen datasets from various domains. Our results demonstrate that Semi-Cart outperforms standard CART methods and contributes to statistical learning.

AB - Decision trees are a fundamental statistical learning tool for addressing classification and regression problems through a recursive partitioning approach that effectively accommodates numerical and categorical data [1, 2]. The Classification and regression tree (CART) algorithm underlies modern Boosting methodologies such as Gradient boosting machine (GBM), Extreme gradient boosting (XGBoost), and Light gradient boosting machine (LightGBM). However, the standard CART algorithm may require improvement due to its inability to learn from unlabeled data. This study proposes several modifications to incorporate test data into the training phase. Specifically, we introduce a method based on Graph-based semi-supervised learning called “Distance-based Weighting,” which calculates and removes irrelevant records from the training set to accelerate the training process and improve performance. We present Semi-supervised classification and regression tree (Semi-Cart), a new implementation of CART that constructs a decision tree using weighted training data. We evaluated its performance on thirteen datasets from various domains. Our results demonstrate that Semi-Cart outperforms standard CART methods and contributes to statistical learning.

U2 - 10.1007/s13042-024-02161-z

DO - 10.1007/s13042-024-02161-z

M3 - Article

JO - International Journal of Machine Learning and Cybernetics

JF - International Journal of Machine Learning and Cybernetics

SN - 1868-8071

ER -
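
The abstract above describes weighting labelled training records by their proximity to the unlabelled (test) records, discarding "irrelevant" records, and then growing a CART tree on the weighted data. The following is a minimal illustrative sketch of that idea only, not the authors' Semi-CART implementation: the inverse-distance weighting, the drop threshold, and the function names are assumptions chosen for clarity, and a standard scikit-learn decision tree stands in for CART.

```python
# Illustrative sketch (assumed details, not the paper's code) of
# distance-based weighting for semi-supervised tree building:
# 1) weight each labelled record by closeness to the unlabelled set,
# 2) drop low-weight ("irrelevant") records,
# 3) fit a CART-style tree using the remaining records and weights.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.tree import DecisionTreeClassifier  # CART stand-in


def distance_based_weights(X_train, X_unlabeled, eps=1e-9):
    """Weight each training record by proximity to the unlabelled data.

    Inverse mean Euclidean distance is an assumed scheme for
    illustration; the paper defines its own graph-based formulation.
    """
    mean_dist = cdist(X_train, X_unlabeled, metric="euclidean").mean(axis=1)
    weights = 1.0 / (mean_dist + eps)
    return weights / weights.max()  # normalise to (0, 1]


def fit_weighted_tree(X_train, y_train, X_unlabeled, drop_quantile=0.1):
    """Drop the lowest-weight records, then fit a weighted decision tree."""
    w = distance_based_weights(X_train, X_unlabeled)
    keep = w > np.quantile(w, drop_quantile)  # discard least-relevant records
    tree = DecisionTreeClassifier(random_state=0)
    tree.fit(X_train[keep], y_train[keep], sample_weight=w[keep])
    return tree
```

This sketch only conveys the general mechanism of reusing unlabelled test data at training time through record weights; for the actual Semi-CART algorithm and its evaluation on the thirteen datasets, see the article at the DOI above.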