OpenReview.net

Hongyi Wang

MS student, EE, Tsinghua University

  • Joined May 2024

Names

Hongyi Wang (Preferred)

Emails

****@mails.tsinghua.edu.cn (Confirmed)

Personal Links

ORCID

Career & Education History

MS student
EE, Tsinghua University (mails.tsinghua.edu.cn)
2022 – 2025

Advisors, Relations & Conflicts

No relations added


Expertise

Software Hardware Co-design
2022 – 2025

Publications

  • Mixture of Attention Spans: Optimizing LLM Inference Efficiency with Heterogeneous Sliding-Window Lengths

    Tianyu Fu, Haofeng Huang, Xuefei Ning, Genghan Zhang, Boju Chen, Tianqi Wu, Hongyi Wang, Zixiao Huang, Shiyao Li, Shengen Yan, Guohao Dai, Huazhong Yang, Yu Wang
    • COLM 2025
    • Readers: Everyone
  • MoA: Mixture of Sparse Attention for Automatic Large Language Model Compression

    Tianyu Fu, Haofeng Huang, Xuefei Ning, Genghan Zhang, Boju Chen, Tianqi Wu, Hongyi Wang, Zixiao Huang, Shiyao Li, Shengen Yan, Guohao Dai, Huazhong Yang, Yu Wang
    • ICLR 2025 FM-Wild Workshop
    • Readers: Everyone
  • MoA: Mixture of Sparse Attention for Automatic Large Language Model Compression

    Tianyu Fu, Haofeng Huang, Xuefei Ning, Genghan Zhang, Boju Chen, Tianqi Wu, Hongyi Wang, Zixiao Huang, Shiyao Li, Shengen Yan, Guohao Dai, Huazhong Yang, Yu Wang
    • Submitted to ICLR 2025
    • Readers: Everyone

Co-Authors

  • Boju Chen
  • Genghan Zhang
  • Guohao Dai
  • Haofeng Huang
  • Huazhong Yang
  • Shengen Yan
  • Shiyao Li
  • Tianqi Wu
  • Tianyu Fu
  • Xuefei Ning
  • Yu Wang
  • Zixiao Huang
  • About OpenReview
  • Hosting a Venue
  • All Venues
  • Contact
  • Sponsors
  • Donate
  • Frequently Asked Questions
  • Terms of Use
  • Privacy Policy

OpenReview is a long-term project to advance science through improved peer review, with legal nonprofit status. We gratefully acknowledge the support of the OpenReview Sponsors. © 2025 OpenReview