星际流动

CoMeT: Collaborative Memory Transformer for Efficient Long Context Modeling

Academic Frontiers · 7.3 — Constant-memory, linear-time transformer for arbitrary long sequences
Original: cs.LG updates on arXiv.org

Score 7.3 · Source: cs.LG updates on arXiv.org · Published 2026-04-20

Rating rationale: Constant-memory, linear-time transformer for arbitrary long sequences
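
The listing claims a constant-memory, linear-time transformer for arbitrarily long sequences, but gives no architectural details. As a generic illustration of that property only (this is a hypothetical sketch, not CoMeT's actual mechanism), a model can process a sequence chunk by chunk against a fixed-size memory: the state stays O(slots × d) regardless of sequence length, and each token is visited once, giving linear time.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def constant_memory_pass(tokens, d=16, chunk=8, mem_slots=4, seed=0):
    """Chunked pass with a fixed-size memory (hypothetical sketch).

    Memory is O(mem_slots * d) regardless of len(tokens); the loop
    touches each token once, so time is linear in sequence length.
    This is NOT CoMeT's published mechanism, only an illustration of
    the constant-memory / linear-time property the listing names.
    """
    rng = np.random.default_rng(seed)
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    memory = np.zeros((mem_slots, d))           # fixed-size state
    outputs = []
    for start in range(0, len(tokens), chunk):
        x = tokens[start:start + chunk]         # (c, d) current chunk
        q, k, v = x @ Wq, memory @ Wk, memory @ Wv
        attn = softmax(q @ k.T / np.sqrt(d))    # chunk attends to memory
        outputs.append(attn @ v + x)            # residual connection
        # fold the chunk into memory via an exponential moving average
        memory = 0.9 * memory + 0.1 * x.mean(axis=0)
    return np.concatenate(outputs), memory

seq = np.random.default_rng(1).standard_normal((40, 16))
out, mem = constant_memory_pass(seq, d=16)
```

Doubling the sequence length leaves `mem` the same shape and only doubles the number of loop iterations, which is the behavior the rating rationale highlights.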