LLM-Enabled Semantic Caching for Affordable Web Access

Published: 22 Sept 2025 · Last Modified: 03 Jan 2026 · WiML @ NeurIPS 2025 · CC BY 4.0
Keywords: Large Language Models; Semantic Caching; Web Affordability; Vision-Language Models; Internet Accessibility
Submission Number: 156