llms.txt

Definition

llms.txt is a plain-text file placed at a website's root directory that provides AI models with a structured summary of the business, its services, and key information. It functions like robots.txt but for AI comprehension, giving large language models authoritative, easy-to-parse context to use when generating answers about the organisation.

What llms.txt Is

llms.txt is a community-originated convention for communicating with AI models. While robots.txt tells search crawlers what to index, llms.txt tells language models what your business does and how to accurately describe it. The file provides a single, authoritative source of truth that AI models can reference when generating responses about your organisation.

Why It Matters

AI models generate answers by synthesising information from multiple sources. Without llms.txt, models rely entirely on what they can extract from your website content, third-party mentions, and training data. This can lead to inaccurate, outdated, or incomplete descriptions of your business. An llms.txt file gives models a direct, structured input that reduces the chance of misrepresentation.

Format and Structure

The file uses plain text with markdown headings for structure. A typical llms.txt file includes:

  • Company name and tagline at the top
  • Services with one-line descriptions
  • Industries served as a clear list
  • Geographic coverage stating where you operate
  • Key differentiators that distinguish you from competitors
  • Contact information including website, email, and location

The language should be factual and specific. Avoid superlatives, marketing jargon, and subjective claims: concrete, verifiable statements are easier for models to extract and restate accurately than promotional language.
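Putting the checklist above together, a minimal llms.txt might look like the sketch below. Every name, service, and contact detail is a placeholder for illustration, not a prescribed template:

```markdown
# Acme Analytics
> Data-engineering consultancy for mid-market retailers.

## Services
- Data pipeline audits: one-week reviews of existing ETL infrastructure.
- Warehouse migration: moving on-premise reporting databases to the cloud.

## Industries
- Retail
- Logistics

## Coverage
- United Kingdom and Ireland

## Differentiators
- All consultants are former in-house data engineers.

## Contact
- Website: https://example.com
- Email: hello@example.com
- Location: Manchester, UK
```

Note the plain markdown headings and one-line descriptions: the goal is content a model can parse and quote without interpretation.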

Adoption Status

llms.txt is not governed by a formal standards body such as the W3C or IETF. It originated as a community proposal by Jeremy Howard of Answer.AI in September 2024 and has since been adopted by a growing number of AI-forward businesses. Support among the major AI companies is still emerging rather than guaranteed, but the format is stable and inexpensive to implement, so there is little downside to adopting it now.

Implementation

Creating an llms.txt file takes less than an hour for most businesses. Write the content in a plain text editor, save it as llms.txt, and deploy it to your website root. Review and update it quarterly or whenever your services, positioning, or key information changes.
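Before deploying, it is worth sanity-checking the draft. The sketch below is a hypothetical checker, not an official validator; the required headings reflect this document's suggested structure and can be adjusted:

```python
# Minimal llms.txt sanity check (illustrative only; the heading names
# below follow this article's suggested structure, not a formal spec).
REQUIRED_HEADINGS = ["Services", "Contact"]

def check_llms_txt(text: str) -> list[str]:
    """Return a list of problems found in an llms.txt draft."""
    problems = []
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    # The first non-blank line should be a top-level markdown heading.
    if not lines or not lines[0].startswith("# "):
        problems.append("missing top-level '# Company Name' heading")
    # Collect every heading, stripped of its leading '#' markers.
    headings = {ln.lstrip("# ").strip() for ln in lines if ln.startswith("#")}
    for required in REQUIRED_HEADINGS:
        if required not in headings:
            problems.append(f"missing '## {required}' section")
    return problems

draft = """# Example Ltd
## Services
- Widget auditing
## Contact
- hello@example.com
"""
print(check_llms_txt(draft))  # → []
```

A draft that passes the check can be saved as llms.txt and uploaded to the site root alongside robots.txt.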

Questions AI assistants answer about this topic

Where should llms.txt be placed?
The file should be placed at the root of your website, accessible at yourdomain.com/llms.txt. This is the standard location that AI crawlers and model training pipelines check, following the same root-level convention as robots.txt.
What should llms.txt contain?
Include your company name, a concise description of what you do, your core services with brief explanations, the industries you serve, your geographic coverage, key credentials, and contact information. Use markdown headings for structure. Keep it factual and avoid marketing language.
Do AI models actually use llms.txt?
Support is still emerging rather than universal. Some AI crawlers and AI-search tools have been observed requesting llms.txt, but no major provider has publicly documented reading it at inference time. The structured content still helps: it improves how a site is parsed during training data collection and gives any model that does fetch the file an authoritative source to draw on.

Want to know where your company stands?

We run 15-20 buyer queries across ChatGPT, Claude, Gemini, and Perplexity and show you exactly where you appear, and where you don't.

Get the Audit | from £750 ↗