From Dialogue to Mastery: Investigating Question-Asking and Interactive Learning with Large Language Models
Abstract: This paper investigates the potential of large language models (LLMs) to shift from passive data absorption to active, interactive learning through simulated student-teacher dialogues. We introduce a dataset of 1,322 contexts spanning domains such as song lyrics, news articles, movie plots, academic papers, and images, and analyze conversational interactions to assess the ability of LLMs to gain knowledge about these contexts. Our findings show that interactive learning significantly boosts performance: interactive student models surpass static learning approaches in just four dialogue turns on average. However, student models still trail teacher models equipped with full context knowledge. To further assess learning dynamics, we introduce the Cumulative Information Coverage (CIC) metric, which reveals that more insightful questions drive better outcomes, although rigid questioning patterns remain a limitation. These findings suggest that advancing interactive learning methods and extending machine learning theory to capture the dynamics of conversational learning could pave the way for more effective machine intelligence and educational technologies.
Paper Type: Long
Research Area: Dialogue and Interactive Systems
Research Area Keywords: interactive learning, large language models, conversational machine learning
Contribution Types: Model analysis & interpretability, Data resources
Languages Studied: English
Submission Number: 342