Score 5.5 · Source: cs.AI updates on arXiv.org · Published 2026-04-14
Rating rationale: moderately high: offers some incremental information and reference value
TARAC: Mitigating Hallucination in LVLMs via Temporal Attention Real-time Accumulative Connection
arXiv:2504.04099v2 Announce Type: replace-cross Abstract: Large Vision-Language Models (LVLMs) have demonstrated remarkable capabilities, yet they suffer from hallucinations that limit practical deployment. While various mitigation strategies exist, they often incur high computational overhead or require extensive retraining. In this paper, we address visual attention decay during generation, a key factor contributing to hallucinations. We propose Temporal Attention Real-time Accumulative…
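The abstract's core observation is that attention to visual tokens decays as generation proceeds, and the proposed fix accumulates attention over time steps. A minimal sketch of that idea (the exponential-accumulation rule, decay factor `alpha`, and renormalization here are illustrative assumptions, not the paper's exact method):

```python
import numpy as np

def accumulate_attention(step_attns, alpha=0.9):
    """Illustrative temporal accumulation of per-step visual attention.

    step_attns: list of 1-D arrays, attention mass over visual tokens
    at each generation step. Accumulating past maps keeps earlier
    visual evidence from vanishing as raw attention decays.
    NOTE: the EMA-style update is a hypothetical stand-in for TARAC's
    accumulative connection.
    """
    acc = np.zeros_like(step_attns[0])
    history = []
    for a in step_attns:
        acc = alpha * acc + a          # carry forward prior visual attention
        history.append(acc / acc.sum())  # renormalize to a distribution
    return history

# Simulate raw visual attention decaying geometrically over 20 steps
rng = np.random.default_rng(0)
steps = [rng.random(16) * (0.9 ** t) + 1e-8 for t in range(20)]
accumulated = accumulate_attention(steps)
```

In this toy setup, the raw per-step maps shrink toward zero, while each accumulated map remains a valid distribution that still reflects earlier steps, which is the kind of effect the abstract attributes to countering attention decay.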