DaG LLM ver 1.0: Pioneering Instruction-Tuned Language Modeling for Korean NLP
Dongjun Jang, Sangah Lee, Sungjoo Byun, Jinwoong Kim, Jean Seo, Minseok Kim, Soyeon Kim, Chaeyoung Oh, Jaeyoon Kim, Hyemi Jo, Hyopil Shin
Published: 01 Jan 2023, Last Modified: 19 Feb 2025
CoRR 2023
CC BY-SA 4.0
Abstract:
This paper presents the DaG LLM (David and Goliath Large Language Model), a language model specialized for Korean and fine-tuned through Instruction Tuning across 41 tasks within 13 distinct categories.