OpenMoE: An Early Effort on Open Mixture-of-Experts Language Models
Fuzhao Xue, Zian Zheng, Yao Fu, Jinjie Ni, Zangwei Zheng, Wangchunshu Zhou, Yang You
Published: 01 Jan 2024, Last Modified: 05 Oct 2024
ICML 2024
CC BY-SA 4.0