
SpikingBrain2.0: Brain-Inspired Foundation Models for Efficient Long-Context and Cross-Platform Inference

Academic Frontier · Score 7.0 — 5B model with Dual-Space Sparse Attention for long-context efficiency. Notable architecture innovation.
Source: arxiv.org

Score 7 · Source: · Published 2026-04-27
