Tree-as-a-Prompt: Boosting Black-Box Large Language Models on Few-Shot Classification of Tabular Data

17 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: large language models; tabular data
Abstract: Large Language Models (LLMs) have achieved remarkable success across a wide range of natural language processing tasks, yet applying them to tabular data poses unique challenges in both performance and interpretability. The intrinsic structure and characteristics of tabular data call for tailored strategies to unlock the full potential of LLMs in this domain. Recognizing the proficiency of decision trees on tabular data, we introduce Tree-as-a-Prompt. We propose converting a decision tree into a prompt that is fed to the LLM alongside the original query, with the aim of improving the LLM's performance on tabular data. The decision tree is treated as part of a composite model together with the LLM and is optimized based on the LLM's predictions. Our results demonstrate that appending the decision tree as a prompt significantly boosts the performance of LLMs on tabular data. In addition, the decision tree helps elucidate the LLM's predictions, improving model interpretability across applications.
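The abstract's core idea, converting a learned decision tree into prompt text that accompanies the query, can be sketched minimally. The snippet below is an illustrative assumption, not the authors' actual pipeline: it fits a shallow scikit-learn tree on tabular features, serializes its decision rules with `export_text`, and prepends them to a query string. The `tree_to_prompt` helper and the prompt wording are hypothetical.

```python
# Hypothetical sketch of the "tree as prompt" idea: fit a shallow decision
# tree on tabular data, render its rules as text, and prepend that text to
# the query sent to a (black-box) LLM. Names and wording are illustrative.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text


def tree_to_prompt(X, y, feature_names, max_depth=2):
    """Fit a small decision tree and render its rules as prompt text."""
    tree = DecisionTreeClassifier(max_depth=max_depth, random_state=0)
    tree.fit(X, y)
    rules = export_text(tree, feature_names=feature_names)
    return "Decision rules learned from the training data:\n" + rules


data = load_iris()
prefix = tree_to_prompt(data.data, data.target, list(data.feature_names))
query = "Given a sample with petal width 0.2 cm, which class is it?"
full_prompt = prefix + "\nQuestion: " + query  # would be sent to the LLM
print(full_prompt)
```

In a composite setup as described, the tree itself would then be refined based on the LLM's predictions; this sketch only covers the tree-to-text step.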
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 770