Jacobian-Aligned Random Forests

ICLR 2026 Conference Submission 22283 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Random forests; Decision trees; Axis-aligned splits; Oblique decision boundaries; Feature interactions; Supervised preconditioning; Gradient-based feature transforms
Abstract: Axis-aligned decision trees are fast and stable but struggle on datasets with rotated or interaction-dependent decision boundaries, where informative splits require linear combinations of features rather than single-feature thresholds. Oblique forests address this with per-node hyperplane splits, but at added computational cost. We propose a simple alternative: JARF, the Jacobian-Aligned Random Forest. Concretely, we fit a random forest to estimate class probabilities, compute finite-difference gradients of those probabilities with respect to each feature, form the expected Jacobian outer product (EJOP), and use it as a single global linear preconditioner applied to all inputs before retraining an ordinary axis-aligned forest on the transformed features. This preserves the simplicity of axis-aligned trees while a single global rotation captures oblique boundaries and feature interactions that would otherwise require many axis-aligned splits to approximate. On tabular benchmarks, our preconditioned forest matches or surpasses oblique baselines while training faster. Our results suggest that supervised preconditioning can deliver the accuracy of oblique forests while keeping the simplicity of axis-aligned trees.
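The pipeline in the abstract (fit a probability forest, estimate per-feature finite-difference gradients, average their outer products into an EJOP, and derive a global linear preconditioner) can be sketched as follows. This is a minimal illustration, not the authors' code: the function name `ejop_preconditioner`, the step size `eps`, and the choice of the symmetric matrix square root of the EJOP as the preconditioner are all assumptions for the sketch.

```python
# Hypothetical sketch of the EJOP-based preconditioning described in the
# abstract; names and the square-root preconditioner choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def ejop_preconditioner(X, y, eps=1e-2, n_estimators=50, random_state=0):
    """Fit a forest to estimate class probabilities, approximate the Jacobian
    of p(y|x) by central finite differences, average J^T J over the data to
    form the EJOP, and return its symmetric square root as a global
    linear preconditioner."""
    rf = RandomForestClassifier(n_estimators=n_estimators,
                                random_state=random_state)
    rf.fit(X, y)
    n, d = X.shape
    G = np.zeros((d, d))  # running EJOP estimate
    for i in range(n):
        J = np.zeros((rf.n_classes_, d))  # Jacobian of p(y|x) at x_i
        for j in range(d):
            x_plus, x_minus = X[i].copy(), X[i].copy()
            x_plus[j] += eps
            x_minus[j] -= eps
            p_plus = rf.predict_proba(x_plus[None, :])[0]
            p_minus = rf.predict_proba(x_minus[None, :])[0]
            J[:, j] = (p_plus - p_minus) / (2 * eps)
        G += J.T @ J  # accumulate the Jacobian outer product
    G /= n
    # Symmetric square root of the (PSD) EJOP via eigendecomposition
    w, V = np.linalg.eigh(G)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T
```

Usage would then be a single transform followed by an ordinary axis-aligned forest, e.g. `X_pre = X @ ejop_preconditioner(X, y)` and refitting a `RandomForestClassifier` on `X_pre`; the preconditioner is computed once globally, in contrast to oblique forests that solve for a hyperplane at every node.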
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 22283