
Mixture of Heterogeneous Grouped Experts for Language Modeling

Academic frontier · Score: 6.0
Source: cs.AI updates on arXiv.org · Published: 2026-04-29

Rating rationale: A heterogeneous grouped MoE that allows a different expert grouping in each layer; it addresses the rigidity of standard MoE designs while delivering good efficiency gains.
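
To make the idea concrete, below is a minimal, illustrative sketch (not the paper's implementation) of a grouped MoE feed-forward layer in which each layer may use a different, possibly unequal, expert grouping. The hierarchical group-then-expert routing, the `group_sizes` values, and all hyperparameters are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GroupedMoELayer(nn.Module):
    """One MoE layer whose experts are partitioned into groups of possibly
    unequal sizes (e.g. group_sizes=[2, 4, 8]). Routing here is hierarchical:
    pick a group for each token, then pick one expert inside that group.
    This is a hypothetical scheme, not necessarily the paper's."""

    def __init__(self, d_model: int, d_ff: int, group_sizes: list[int]):
        super().__init__()
        self.group_sizes = group_sizes
        self.num_experts = sum(group_sizes)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(self.num_experts)
        )
        self.group_router = nn.Linear(d_model, len(group_sizes))
        self.expert_router = nn.Linear(d_model, self.num_experts)
        # Offset of each group's first expert in the flat expert list.
        offsets = torch.tensor([0] + group_sizes[:-1]).cumsum(0)
        self.register_buffer("group_offsets", offsets)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model). Hard top-1 routing for simplicity;
        # a trainable version would use differentiable gating.
        group_probs = F.softmax(self.group_router(x), dim=-1)  # (T, G)
        group_idx = group_probs.argmax(dim=-1)                 # (T,)
        expert_logits = self.expert_router(x)                  # (T, E)

        out = torch.zeros_like(x)
        for g, size in enumerate(self.group_sizes):
            mask = group_idx == g
            if not mask.any():
                continue
            start = int(self.group_offsets[g])
            # Restrict expert selection to the chosen group's slice.
            local_logits = expert_logits[mask, start:start + size]
            local_probs = F.softmax(local_logits, dim=-1)
            local_idx = local_probs.argmax(dim=-1)
            token_ids = mask.nonzero(as_tuple=True)[0]
            for e in range(size):
                sel = token_ids[local_idx == e]
                if sel.numel() == 0:
                    continue
                weight = local_probs[local_idx == e, e].unsqueeze(-1)
                out[sel] = weight * self.experts[start + e](x[sel])
        return out


# Heterogeneous groupings: each layer gets its own grouping (chosen arbitrarily here).
layers = nn.ModuleList([
    GroupedMoELayer(d_model=64, d_ff=256, group_sizes=sizes)
    for sizes in ([4, 4], [2, 6], [1, 3, 4])
])
tokens = torch.randn(10, 64)
for layer in layers:
    tokens = tokens + layer(tokens)  # residual connection
```

The sketch only shows the structural point the rationale highlights: the expert partitioning is a per-layer choice rather than a single grouping fixed for the whole model.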