Don't Retrieve, Navigate: Distilling Enterprise Knowledge into Navigable Agent Skills

Academic Frontier · score 6.0 — Novel approach converting enterprise docs into navigable agent skills vs. traditional RAG; practical for enterprise AI

Score: 6 · Source: cs.CL updates on arXiv.org · Published 2026-04-17


arXiv:2604.14572v1 · Announce Type: cross

Abstract: Retrieval-Augmented Generation (RAG) grounds LLM responses in external evidence but treats the model as a passive consumer of search results: it never sees how the corpus is organized or what it has not yet retrieved, which limits its ability to backtrack or to combine scattered evidence. We present Corpus2Skill, which distills a document corpus into a hierarchical skill directory offline and lets an LLM agent navigate that directory at serve time.
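The offline/online split described in the abstract can be sketched in miniature: an offline step produces a tree of summarized "skills," and at serve time an agent descends that tree instead of issuing flat retrieval queries. Everything below is an illustrative assumption, not the paper's actual design — `SkillNode`, its fields, and the keyword scorer (standing in for an LLM's branch choice) are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of Corpus2Skill's navigate-not-retrieve idea.
# SkillNode and the toy scorer are assumptions for illustration only.

@dataclass
class SkillNode:
    name: str
    summary: str                        # distilled description of what lies below
    content: str = ""                   # leaf evidence (empty for inner nodes)
    children: list["SkillNode"] = field(default_factory=list)

def score(query: str, text: str) -> int:
    """Toy word-overlap relevance; a real agent would ask an LLM to pick a branch."""
    return len(set(query.lower().split()) & set(text.lower().split()))

def navigate(root: SkillNode, query: str) -> SkillNode:
    """Greedily descend the skill directory, following the best-matching summary.

    Unlike flat retrieval, the agent always sees the directory's structure,
    so it knows which branches exist and which it has not yet explored.
    """
    node = root
    while node.children:
        node = max(node.children, key=lambda c: score(query, c.summary))
    return node

# A tiny hand-built skill directory (the offline distillation output).
directory = SkillNode("root", "enterprise knowledge", children=[
    SkillNode("hr", "hiring onboarding payroll policies", children=[
        SkillNode("payroll", "payroll schedule and tax forms",
                  content="Payroll runs on the 15th and last day of each month."),
    ]),
    SkillNode("it", "vpn laptops access requests",
              content="Request VPN access via the IT portal."),
])

leaf = navigate(directory, "when does payroll run")
print(leaf.name)
print(leaf.content)
```

Because the agent descends an explicit hierarchy, backtracking is natural: if a leaf turns out irrelevant, the agent can return to the parent and try a sibling branch it has already seen listed — something a passive RAG consumer cannot do.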