Gradient Boosting Neural Networks: GrowNet

Anonymous

Sep 29, 2021 (edited Oct 03, 2021) · ICLR 2022 Conference Blind Submission
  • Keywords: Deep Neural Networks, Gradient Boosting classifiers, NN architecture optimization
  • Abstract: A novel gradient boosting framework is proposed in which shallow neural networks are employed as "weak learners". General loss functions are considered under this unified framework, with specific examples presented for classification, regression, and learning to rank. A fully corrective step is incorporated to remedy the pitfalls of the greedy function approximation of classic gradient boosting decision trees. The proposed model outperforms state-of-the-art boosting methods on all three tasks across multiple datasets. An ablation study sheds light on the effect of each model component and of the model hyperparameters.
  • One-sentence Summary: A neural-network-based gradient boosting algorithm for classification, regression, and learning to rank.
  • Supplementary Material: zip
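The core idea in the abstract — stagewise gradient boosting where each weak learner is a shallow neural network fit to the negative gradient of the loss — can be sketched as follows. This is a minimal illustration for squared-error regression only, not the paper's GrowNet implementation: all function names, network sizes, and training settings here are our own assumptions, and the fully corrective step described in the abstract is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_shallow_net(X, y, hidden=8, steps=200, lr=0.1):
    """Train a one-hidden-layer tanh network on (X, y) with plain
    gradient descent on MSE. A stand-in for the shallow weak learners;
    the architecture and optimizer here are illustrative choices."""
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, hidden))
    b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=hidden)
    b2 = 0.0
    for _ in range(steps):
        H = np.tanh(X @ W1 + b1)          # hidden activations
        pred = H @ w2 + b2
        g = 2.0 * (pred - y) / n          # d(MSE)/d(pred)
        # backpropagate through the two layers
        gw2 = H.T @ g
        gb2 = g.sum()
        gH = np.outer(g, w2) * (1.0 - H**2)
        gW1 = X.T @ gH
        gb1 = gH.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        w2 -= lr * gw2; b2 -= lr * gb2
    return lambda Xq: np.tanh(Xq @ W1 + b1) @ w2 + b2

def boost(X, y, rounds=10, shrinkage=0.5):
    """Greedy stagewise boosting: each new learner fits the current
    residuals, i.e. the negative gradient of squared loss."""
    learners = []
    F = np.zeros(len(y))                  # ensemble prediction
    for _ in range(rounds):
        resid = y - F                     # negative gradient for MSE
        f = fit_shallow_net(X, resid)
        learners.append(f)
        F += shrinkage * f(X)             # shrunken additive update
    return learners, F

# Toy 1-D regression target: y = sin(3x)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0])
learners, F = boost(X, y)
mse = np.mean((F - y) ** 2)
```

Each round adds one shallow network trained on the residuals; the ensemble's training MSE should fall well below the variance of the target. For classification or ranking, the squared-loss residual would be replaced by the negative gradient of the corresponding loss.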