
#1 Compressing tree ensembles through Level-wise Optimization and Pruning

Authors: Laurens Devos, Timo Martens, Deniz Oruc, Wannes Meert, Hendrik Blockeel, Jesse Davis

Tree ensembles (e.g., gradient boosting decision trees) are widely used in practice because they offer excellent predictive performance while remaining easy and efficient to learn. In some contexts, it is also important to optimize their size: this is specifically the case when models need to have verifiable properties (verification of fairness, robustness, etc. is often exponential in the ensemble's size), or when models run on battery-powered devices (smaller ensembles consume less energy, increasing battery autonomy). For these reasons, compression of tree ensembles is worth studying. This paper presents LOP, a method for compressing a given tree ensemble by pruning or entirely removing trees in it, while updating leaf predictions in such a way that predictive accuracy is mostly unaffected. Empirically, LOP achieves compression factors that are often 10 to 100 times better than those of competing methods.
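The core idea of pruning trees while refitting leaf values can be illustrated with a minimal sketch. This is not the LOP algorithm from the paper; it is a hypothetical toy example assuming an ensemble of decision stumps, where a subset of stumps is retained (pruning) and all leaf values are then jointly re-optimized by least squares so that the smaller ensemble still fits the training targets well.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (a stand-in for a real dataset).
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)

# A hypothetical "ensemble" of decision stumps: each stump splits on
# a threshold t, routing x to leaf 0 if x < t and leaf 1 otherwise.
thresholds = rng.uniform(-1, 1, size=20)

def design_matrix(X, kept):
    """One 0/1 indicator column per leaf of each retained stump."""
    cols = []
    for t in kept:
        idx = (X[:, 0] >= t).astype(int)
        for leaf in (0, 1):
            cols.append((idx == leaf).astype(float))
    return np.column_stack(cols)

# "Prune" the ensemble by keeping only a subset of the stumps, then
# jointly refit all remaining leaf values with least squares so the
# compressed ensemble's predictions stay close to the targets.
kept = thresholds[:5]
Z = design_matrix(X, kept)
leaf_values, *_ = np.linalg.lstsq(Z, y, rcond=None)

pred = Z @ leaf_values          # compressed ensemble's prediction
mse = float(np.mean((pred - y) ** 2))
print(f"training MSE after pruning + leaf refit: {mse:.4f}")
```

Because the ensemble's prediction is linear in its leaf values, refitting them after pruning reduces to an ordinary least-squares problem, which is why accuracy can be largely preserved even after removing most of the trees.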

Subject: ICML.2025 - Poster