星际流动

Think-at-Hard: Selective Latent Iterations to Improve Reasoning Language Models

Academic Frontier · score 7.0 — Identifies latent overthinking in looped transformers and proposes selective iterations
Original source: arxiv.org

Rating 7 · Source: · Published 2026-04-28
