If you’ve heard about llms.txt in the last few months, you might be wondering whether it’s worth your time. The honest answer in 2026: yes, but it takes 30 minutes, not 30 hours. Here’s what it is, why it matters, and how to actually do it.
The 30-second explanation
llms.txt is a plain-text file you put at the root of your website (at yoursite.com/llms.txt) that tells AI assistants — ChatGPT, Perplexity, Gemini, Claude — what your site is about and which pages matter most.
It lives in the same place as robots.txt and shares its simplicity, but where robots.txt tells search-engine crawlers what to avoid, llms.txt tells AI agents what to read. The format is simple: a markdown-style document with a site title, a short description, and a structured list of links grouped by section.
Why it exists
When an AI assistant cites your site in a response, it has to figure out which of your pages is most relevant. The default behaviour is to crawl your sitemap, parse the HTML, and infer the structure. This works, but it’s slow, error-prone, and biased toward whatever the AI happened to find first.
llms.txt tells the AI exactly what your site contains, in the order you think matters. It’s the difference between giving someone a map of your business and asking them to walk through the building looking for the room they need.
Does it actually work?
This is the right question. The honest answer is: maybe, partially, and increasingly. The major LLM providers haven’t all officially committed to honoring llms.txt the way Google honors robots.txt. As of early 2026, adoption is mixed: some AI search engines explicitly use it, others might be using it implicitly, and some ignore it entirely.
What’s clear: providing a clean llms.txt can’t hurt your AI search visibility, costs almost nothing, and aligns with the broader pattern of AI agents preferring sites that are clean, well-structured, and easy to parse. It’s a low-risk hedge.
What goes in the file
The spec from llmstxt.org is straightforward:
- An H1 with your site name.
- A blockquote with a one-sentence description of what your site is.
- One or two paragraphs giving more context.
- H2 sections grouping your important pages, each with a list of links and one-line descriptions.
That’s it. No XML, no schemas, no validation tools. Just markdown.
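Putting those four pieces together, a minimal file might look like this (the site name, URLs, and descriptions below are invented for illustration):

```markdown
# Acme Web Design

> Acme Web Design builds and maintains websites for small businesses in the Portland area.

We specialize in WordPress builds, local SEO, and ongoing site care plans.

## Services

- [Web Design](https://acme.example/services/web-design): Custom WordPress sites for small businesses
- [Local SEO](https://acme.example/services/local-seo): Google Business Profile and local ranking work

## Guides

- [2026 SEO Playbook](https://acme.example/blog/seo-playbook): How AI search changes ranking and citations
```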
How to generate one
You have three options:
Manually. Open a text editor, write the file. For a site with under 30 pages, this takes 30 minutes. The advantage is you decide which pages to include and how to describe them. The disadvantage is that you have to update it by hand as your site grows.
Automatically. Use a tool that crawls your site and generates the file. We built a free LLMs.txt generator for exactly this — it pulls your sitemap, extracts page titles and descriptions, groups them by URL section, and outputs a clean file you can copy or download. Takes about 15 seconds per site.
WordPress plugin. A few plugins now generate llms.txt from your existing content. They work, but they tend to include too much — every page on the site rather than the strategic ones.
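If you want to see what the automatic approach involves, the core of it fits in a short script. This is a rough sketch, not our generator's actual code: it parses a standard sitemap, groups URLs by their first path segment, and emits the markdown format described above. The page descriptions are passed in by the caller, since fetching and summarizing each page is a separate step.

```python
from urllib.parse import urlparse
from xml.etree import ElementTree

# XML namespace used by standard sitemap.xml files.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def generate_llms_txt(site_name, description, sitemap_xml, titles):
    """Build llms.txt content from sitemap XML.

    `titles` maps each URL to a short human-written label;
    URLs without one fall back to the raw URL.
    """
    root = ElementTree.fromstring(sitemap_xml)
    urls = [el.text.strip() for el in root.findall(".//sm:loc", NS)]

    # Group URLs by first path segment: "/blog/post" -> "Blog".
    sections = {}
    for url in urls:
        path = urlparse(url).path.strip("/")
        section = path.split("/")[0].title() if path else "Home"
        sections.setdefault(section, []).append(url)

    lines = [f"# {site_name}", "", f"> {description}", ""]
    for section in sorted(sections):
        lines.append(f"## {section}")
        lines.append("")
        for url in sections[section]:
            lines.append(f"- [{titles.get(url, url)}]({url})")
        lines.append("")
    return "\n".join(lines)
```

Grouping by the first path segment is a crude heuristic — a real generator would also read page titles and meta descriptions — but it produces the right shape for a typical small-business site.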
Where to put it
Upload llms.txt to your website’s root directory, so it’s reachable at https://yoursite.com/llms.txt. For WordPress, that’s the public_html folder via your hosting file manager or FTP. The file is served as plain text by default — no extra configuration needed.
Optionally, also generate llms-full.txt at the same location. This is a longer version of the file that includes the full text of each page, useful for AI tools that prefer to ingest your whole site in a single request rather than crawling it page by page.
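The structure of llms-full.txt follows directly from the regular file: the same preamble, then one section per page containing its full text. A sketch, assuming you already have clean body text for each page (extracting that from your HTML is a separate step):

```python
def generate_llms_full_txt(site_name, description, pages):
    """Build llms-full.txt: the llms.txt preamble plus the full
    text of every page as its own markdown section.

    `pages` is a list of (title, url, body_text) tuples.
    """
    lines = [f"# {site_name}", "", f"> {description}", ""]
    for title, url, body in pages:
        lines.append(f"## {title}")
        lines.append("")
        lines.append(f"URL: {url}")
        lines.append("")
        lines.append(body.strip())
        lines.append("")
    return "\n".join(lines)
```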
Should small businesses bother?
Yes — but treat it as 30 minutes of work, not a strategic project. Generate it, upload it, and forget about it for six months. If AI-search citations become a meaningful traffic source for you (and our bet is they will, gradually), having had this in place for a year longer than competitors is worth more than the time spent.
For broader context on AI search optimization, our 2026 SEO playbook covers how the citation patterns differ from traditional search ranking, and our SEO service includes llms.txt implementation alongside everything else.