A Novel Hyperparameter-Free Approach to Decision Tree Construction That Avoids Overfitting by Design
Date
2019-07-22

Abstract
Decision trees are an extremely popular machine learning technique. Unfortunately, overfitting remains an open issue that can prevent decision trees from achieving good predictive performance. In this paper, we present a novel approach to decision tree construction that avoids overfitting by design, without sacrificing accuracy. A distinctive feature of our algorithm is that it requires neither hyperparameter optimization nor regularization techniques, which significantly reduces training time. Moreover, it produces much smaller and shallower trees than traditional algorithms, making the resulting models easier to interpret. For reproducibility, we provide an open-source implementation of the algorithm.
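
To illustrate the conventional workflow the abstract contrasts against (not the paper's algorithm), the sketch below shows how a standard scikit-learn CART tree overfits when grown without constraints, and how practitioners typically respond with cross-validated hyperparameter search. The dataset, parameter grid, and split are arbitrary choices for illustration only.

# Hypothetical illustration, not the proposed method: a standard decision tree
# grown to purity tends to overfit, and the usual remedy is hyperparameter
# tuning -- exactly the step the paper's approach claims to make unnecessary.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Unconstrained tree: grows until leaves are pure and typically overfits.
full_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("unconstrained depth:", full_tree.get_depth(),
      "| train acc:", full_tree.score(X_train, y_train),
      "| test acc:", full_tree.score(X_test, y_test))

# Conventional remedy: cross-validated search over size/pruning hyperparameters.
grid = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={"max_depth": [2, 3, 4, 5, 6], "min_samples_leaf": [1, 5, 10]},
    cv=5,
)
grid.fit(X_train, y_train)
best_tree = grid.best_estimator_
print("tuned params:", grid.best_params_,
      "| test acc:", best_tree.score(X_test, y_test))

The grid search adds a multiplicative cost (one fit per parameter combination per fold), which is the training-time overhead that a hyperparameter-free construction procedure would avoid.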