APTQ: Attention-aware Post-Training Mixed-Precision Quantization for Large Language Models
Ziyi Guan, Hantao Huang, Yupeng Su, Hong Huang, Ngai Wong, Hao Yu
Published: 01 Jan 2024, Last Modified: 16 May 2025
DAC 2024
CC BY-SA 4.0