serp.fast

llms.txt

The llms.txt file is a proposed standard for publishing a Markdown-formatted summary of a site's most important content for consumption by AI systems. Modeled after robots.txt, it lives at the root of a domain (`/llms.txt`) and gives AI crawlers a curated, structured index of the site: useful pages, key documentation, and machine-friendly excerpts. Jeremy Howard proposed the format in 2024, and it gained traction through 2025 as AI systems sought cleaner content sources than full-page HTML scrapes.

The format is intentionally simple: a top-level title and short description, followed by sections of links with optional notes. Unlike robots.txt, llms.txt is additive rather than restrictive — it lists what the site wants AI systems to use, not what it forbids them from touching. A complementary `llms-full.txt` variant inlines the full text of priority pages, so AI systems need not fetch and parse each page individually.

Adoption is uneven but growing; Stripe, Anthropic, and many SaaS documentation sites publish llms.txt files. For AI builders, the practical question is whether to publish one. The benefit is cleaner inclusion in AI training and retrieval pipelines: your authoritative content is more likely to be cited and less likely to be misrepresented. The cost is small, since the file is just a Markdown index of pages you already publish. For any site that wants to be visible inside AI products, publishing llms.txt and llms-full.txt is the new minimum.
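A minimal example of the structure described above — a title, a blockquote description, and link sections. The project name and URLs here are hypothetical, used only to illustrate the format:

```markdown
# Example Project

> Example Project is a hypothetical SaaS product; this file is an illustration of the llms.txt format.

## Documentation

- [Quickstart](https://example.com/docs/quickstart): Install and first run
- [API reference](https://example.com/docs/api): Endpoints and authentication

## Optional

- [Changelog](https://example.com/changelog)
```

In the proposal, links follow the `[title](url): notes` pattern, and an `Optional` section marks content a consumer may skip when context is limited.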
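Because the format is plain Markdown with a fixed shape, consuming it is straightforward. The sketch below parses the structure described above (H1 title, blockquote description, H2 sections of links) into a dict; the function name is ours, and real files in the wild may deviate from this simple shape:

```python
import re

def parse_llms_txt(text: str) -> dict:
    """Parse the simple llms.txt shape: an H1 title, an optional
    blockquote description, then H2 sections of bulleted links.
    A minimal sketch, not a full implementation of the proposal."""
    doc = {"title": None, "description": None, "sections": {}}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("# ") and doc["title"] is None:
            doc["title"] = line[2:].strip()
        elif line.startswith("> ") and doc["description"] is None:
            doc["description"] = line[2:].strip()
        elif line.startswith("## "):
            current = line[3:].strip()
            doc["sections"][current] = []
        elif line.startswith("- ") and current is not None:
            # Links use the "[title](url): optional notes" pattern.
            m = re.match(r"- \[([^\]]+)\]\(([^)]+)\)(?::\s*(.*))?", line)
            if m:
                doc["sections"][current].append(
                    {"title": m.group(1), "url": m.group(2), "note": m.group(3) or ""}
                )
    return doc
```

This kind of parser is why the format exists: a retrieval pipeline can map a whole site's priority content from one small file instead of crawling and cleaning HTML.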