Abhineet Agarwal - Understanding and overcoming the statistical limitations of decision trees
Description:
Explore a comprehensive lecture on the statistical limitations of decision trees and approaches to overcoming them. Delve into the performance gap between decision trees and more complex machine learning methods such as random forests and deep learning. Examine sharp squared-error generalization lower bounds for decision trees fitted to sparse additive generative models, and discover how these bounds connect to rate-distortion theory. Learn about the proposed Fast Interpretable Greedy-Tree Sums (FIGS) algorithm, which extends CART to grow multiple trees simultaneously. Investigate FIGS' ability to disentangle additive model components, reduce redundant splits, and improve prediction performance. Review experimental results across a range of datasets showing that FIGS outperforms other rule-based methods when the number of splits is limited. Gain insights into the application of FIGS in high-stakes domains, particularly its effectiveness in developing clinical decision instruments that outperform traditional tree-based methods by over 20%.
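The core idea behind tree sums can be illustrated with a toy sketch. The code below is not the authors' FIGS implementation (which grows all trees jointly, choosing the single best split across every tree at each step); it is a simplified stand-in that greedily adds one depth-1 tree (stump) at a time to the residuals of an additive target, showing how a sum of shallow trees can disentangle additive components that a single tree would need many redundant splits to capture. All function names and data here are illustrative inventions.

```python
# Toy sketch (NOT the authors' FIGS code): greedily fit a sum of
# depth-1 trees (stumps) to an additive target y = f1(x1) + f2(x2).
# Each round adds the stump that most reduces squared error on the
# current residuals; the final prediction is the sum of all stumps.
import random

def fit_stump(X, resid):
    """Return the (feature, threshold, left_mean, right_mean) stump
    minimizing squared error against the residuals."""
    best, best_err = None, float("inf")
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            left = [resid[i] for i, row in enumerate(X) if row[j] <= t]
            right = [resid[i] for i, row in enumerate(X) if row[j] > t]
            if not left or not right:
                continue
            lv, rv = sum(left) / len(left), sum(right) / len(right)
            err = (sum((v - lv) ** 2 for v in left)
                   + sum((v - rv) ** 2 for v in right))
            if err < best_err:
                best_err, best = err, (j, t, lv, rv)
    return best

def predict_stump(stump, row):
    j, t, lv, rv = stump
    return lv if row[j] <= t else rv

# Synthetic additive data: each feature contributes independently.
random.seed(0)
X = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(200)]
y = [(1.0 if x1 > 0 else -1.0) + (0.5 if x2 > 0.3 else -0.5)
     for x1, x2 in X]

# Grow a small sum of stumps; each one fits what the others missed,
# so the first two stumps roughly recover f1 and f2 separately.
stumps, resid = [], y[:]
for _ in range(4):
    s = fit_stump(X, resid)
    stumps.append(s)
    resid = [resid[i] - predict_stump(s, X[i]) for i in range(len(X))]

mse = sum(v * v for v in resid) / len(resid)
print(f"train MSE after {len(stumps)} stumps: {mse:.4f}")
```

Because the target is exactly additive, a handful of stumps drives the squared error close to zero, whereas a single tree would have to re-split on x2 inside both branches of its x1 split. The actual FIGS algorithm is distributed by its authors in the `imodels` Python package.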