Thoughtful piece. Much more at the link. Here's the intro:
Ensemble Learning Algorithm Complexity and Occam’s Razor by Jason Brownlee on December 21, 2020 in Ensemble Learning
Occam’s razor suggests that in machine learning, we should prefer simpler models with fewer coefficients over complex models like ensembles.
Taken at face value, the razor is a heuristic that suggests more complex hypotheses make more assumptions that, in turn, will make them too narrow and not generalize well. In machine learning, it suggests complex models like ensembles will overfit the training dataset and perform poorly on new data.
In practice, ensembles are almost universally the type of model chosen on projects where predictive skill is the most important consideration. Further, empirical results show a continued reduction in generalization error as the complexity of an ensemble learning model is incrementally increased. These findings are at odds with the Occam’s razor principle taken at face value.
In this tutorial, you will discover how to reconcile Occam’s Razor with ensemble machine learning.
After completing this tutorial, you will know:
Occam's razor is a heuristic that suggests choosing simpler machine learning models, as they are expected to generalize better. The heuristic can be divided into two razors, one of which is true and remains a useful tool, and the other of which is false and should be abandoned.
Ensemble learning algorithms like boosting provide a specific case of how the second razor fails and added complexity can result in lower generalization error.
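The claim that added ensemble complexity can keep lowering generalization error is easy to probe empirically. Below is a minimal sketch, assuming scikit-learn is installed, that trains AdaBoost with an increasing number of weak learners on a synthetic dataset and reports held-out accuracy; the dataset parameters and estimator counts are illustrative choices, not from the article.

```python
# Sketch: does AdaBoost's held-out accuracy improve as we add weak learners?
# Assumes scikit-learn; dataset and sizes are arbitrary illustrative choices.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification problem.
X, y = make_classification(n_samples=2000, n_features=20, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=1)

scores = {}
for n in (10, 50, 200):
    # More estimators = a more "complex" ensemble hypothesis.
    model = AdaBoostClassifier(n_estimators=n, random_state=1)
    model.fit(X_tr, y_tr)
    scores[n] = model.score(X_te, y_te)  # accuracy on unseen data
    print(n, round(scores[n], 3))
```

On many datasets the held-out accuracy does not degrade, and often improves, as estimators are added, which is the behavior that appears to contradict the face-value reading of the razor.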
Let's get started. ...