An Experimental Evaluation of Mixup Regression Forests

Research output: Contribution to journal › Article › peer-review


Over the past few decades, the remarkable prediction capabilities of ensemble methods have been used within a wide range of applications. Maximizing base-model accuracy and ensemble diversity is key to the heightened performance of these methods. One way to achieve diversity when training the base models is to generate artificial/synthetic instances and incorporate them with the original instances. Recently, the mixup method was proposed for improving the classification power of deep neural networks (Zhang et al., 2017). The mixup method generates artificial instances by combining pairs of instances and their labels; these new instances are used for training the neural networks, promoting their regularization. In this paper, new regression tree ensembles trained with mixup, which we will refer to as Mixup Regression Forest, are presented and tested. The experimental study with 61 datasets showed that the mixup approach improved the results of both Random Forest and Rotation Forest.
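The abstract does not specify how the synthetic instances are incorporated at training time (for example, whether mixing is applied per tree or to the pooled training set, or which mixing distribution and parameters are used). The sketch below is only a minimal illustration of mixup augmentation for regression with a standard random forest, assuming a Beta(α, α) mixing distribution as in Zhang et al. (2017); the function name `mixup_augment` and the value of α are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def mixup_augment(X, y, n_new, alpha=0.2, seed=None):
    """Create synthetic regression instances as convex combinations of random pairs.

    Mixing coefficients are drawn from Beta(alpha, alpha), following the mixup
    formulation of Zhang et al. (2017). The default alpha is illustrative only.
    """
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(X), size=n_new)          # first element of each pair
    j = rng.integers(0, len(X), size=n_new)          # second element of each pair
    lam = rng.beta(alpha, alpha, size=n_new)         # mixing coefficients in [0, 1]
    X_new = lam[:, None] * X[i] + (1.0 - lam[:, None]) * X[j]
    y_new = lam * y[i] + (1.0 - lam) * y[j]          # targets mixed with the same weights
    return X_new, y_new


# Toy usage: train a random forest on the original data augmented with mixup instances.
rng = np.random.default_rng(0)
X = rng.random((200, 5))
y = X.sum(axis=1) + rng.normal(scale=0.1, size=200)

X_mix, y_mix = mixup_augment(X, y, n_new=200, alpha=0.2, seed=0)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(np.vstack([X, X_mix]), np.concatenate([y, y_mix]))
```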

Keywords

  • Mixup, Random forest, Regression, Rotation forest
Original language: English
Article number: 113376
Journal: Expert Systems with Applications
Volume: 151
Early online date: 10 Apr 2020
Publication status: Published - 1 Aug 2020
