
llms.txt: A Standard Entry Point for LLMs to Understand Website Content
llms.txt is an open proposal by Jeremy Howard that provides a standardized, machine-readable entry point for websites to help large language models (LLMs) better understand website content during the inference phase. It differs from robots.txt by guiding LLMs to valuable information rather than restricting access, and from sitemap.xml by offering curated summaries and key links optimized for LLM context windows. The proposal includes a strict Markdown format specification, a Python toolchain for implementation, and has been adopted by projects like FastHTML, Supabase, and Vue.js.
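Under the proposal, the file lives at the site root (/llms.txt) and follows a fixed Markdown outline: one H1 with the site or project name, a blockquote summary, optional prose, then H2 sections whose bullet lists point to Markdown versions of key pages, with an "Optional" section marking links that can be dropped when the context window is tight. A minimal illustrative sketch follows; the project name and URLs are placeholders, not taken from the article:

```markdown
# Example Project

> One-paragraph summary of what this site offers and who it is for.

Key background an LLM should know before following the links below.

## Docs

- [Quick start](https://example.com/docs/quickstart.md): setup and first steps
- [API reference](https://example.com/docs/api.md): endpoints and parameters

## Optional

- [Changelog](https://example.com/changelog.md): release history; safe to skip when context is limited
```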
LLMS · 2026/2/4






