Koya: A Recommender System for Large Language Model Selection

Published: 03 Mar 2023, Last Modified: 15 Apr 2023, AfricaNLP 2023
Keywords: Koya, large language model, recommender system
TL;DR: Koya is a recommender system for choosing the most compatible pretrained LLM to use for a downstream task and language of interest.
Abstract: Pretrained large language models (LLMs) are widely used for various downstream tasks in different languages. However, selecting the best LLM (from a large set of candidates) for a given downstream task and language is challenging and computationally expensive, making efficient use of LLMs difficult for low-compute communities. To address this challenge, we present Koya, a recommender system built to assist researchers and practitioners in choosing the right LLM for their task and language, without ever having to finetune the LLMs. Koya is built with the Koya Pseudo-Perplexity (KPPPL), our adaptation of pseudo-perplexity, and ranks LLMs in order of compatibility with the language of interest, making it easier and cheaper to choose the most compatible LLM. Evaluating Koya with five pretrained LLMs and three African languages (Yoruba, Kinyarwanda, and Amharic), we show an average recommender accuracy of 95%, demonstrating its effectiveness. Koya aims to offer an easy-to-use (through a simple web interface accessible at https://huggingface.co/spaces/koya-recommender/system), cost-effective, fast, and efficient tool to assist researchers and practitioners with low or limited compute access.
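To make the ranking idea concrete, here is a minimal sketch of standard pseudo-perplexity-based model ranking. The abstract does not specify how KPPPL adapts pseudo-perplexity, so this uses the conventional definition: exponentiate the negative mean of per-token masked log-probabilities, then rank models by ascending score (lower means the model is more compatible with the language). The function names and the assumption that per-token log-probabilities have already been obtained from each masked LM are illustrative, not from the paper.

```python
import math

def pseudo_perplexity(token_log_probs):
    """Standard pseudo-perplexity from per-token masked log-probabilities.

    Each entry is log P(token_i | rest of sentence), obtained by masking
    token i and scoring it with a masked LM (the model calls themselves
    are omitted here; this only aggregates their outputs).
    """
    n = len(token_log_probs)
    return math.exp(-sum(token_log_probs) / n)

def rank_models(model_scores):
    """Rank candidate LLMs by ascending pseudo-perplexity on the target
    language's evaluation text (lower score = more compatible)."""
    return sorted(model_scores, key=model_scores.get)
```

For example, if every token gets probability 0.5, the pseudo-perplexity is exactly 2.0; and given scores `{"model-a": 12.3, "model-b": 4.5}`, the recommended ordering is `["model-b", "model-a"]`.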