Generate a spec-compliant llms.txt file in seconds. Tell ChatGPT, Claude, Perplexity, and every AI crawler what your site contains — in clean, structured Markdown.
Fill in the fields below. Output maps directly to the open standard. Validation runs on generate.
> Your description here — aim for under 160 characters.

Fill in the form above with your site name, description, and key pages. We produce a properly formatted file in seconds.
Place the file at yoursite.com/llms.txt. See our platform guides for Vercel, Next.js, Nginx, and more.
Crawlers like GPTBot and ClaudeBot discover your file and build a more accurate understanding of your site.
llms.txt is a Markdown file placed at your domain root. It was proposed in 2024 as an open community standard — documented at llmstxt.org — to help AI language models understand site content without parsing HTML.
The required elements are an H1 with the site name on the first line and a one-line blockquote description immediately after it; H2 section headers and link lists are optional:

```markdown
# Your Site Name
> Brief description of your site.

## Essential Content
- [Page Title](https://site.com/page): Description
- [About](https://site.com/about): Who we are

## Optional
- [Blog](https://site.com/blog): Articles
```
| Element | Rule | Status |
|---|---|---|
| `# H1` | First line, site name | Required |
| `> blockquote` | Follows H1, one sentence | Required |
| `## H2` | Section groupings | Optional |
| `[text](url)` | Markdown links | Optional |
| File path | `/llms.txt` at root | Required |
| Content-Type | `text/plain` | Required |
llms.txt vs robots.txt: robots.txt restricts what crawlers can access. llms.txt invites AI agents to understand your content. The two are complementary — you should have both. You can reference your llms.txt from robots.txt with a comment: # LLM-Content: /llms.txt
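As a sketch, a robots.txt that admits the major AI crawlers and carries that pointer might look like the following. The `Allow` rules are illustrative policy choices, and the `LLM-Content` line is an informal comment convention, not part of the robots.txt standard:

```text
# robots.txt
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# LLM-Content: /llms.txt
```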
Copy-paste snippets for every major platform.
Place llms.txt in your /public folder. Vercel serves public files from the root automatically.
```bash
# Terminal
cp llms.txt ./public/llms.txt
git add . && git commit -m "feat: add llms.txt for AI discovery"
git push
# Verify: https://yoursite.com/llms.txt
```
Use a Route Handler in App Router to serve dynamically — useful if generating from a CMS.
```ts
// app/llms.txt/route.ts
export async function GET() {
  const content = `# Your Site
> One sentence about your site.

## Essential Content
- [Home](https://yoursite.com): Main page
- [Docs](https://yoursite.com/docs): Documentation
`;
  return new Response(content, {
    headers: { 'Content-Type': 'text/plain' },
  });
}
```
```nginx
# nginx.conf
server {
  listen 443 ssl;
  server_name example.com;

  location = /llms.txt {
    root /var/www/html;
    default_type text/plain;  # add_header cannot override Content-Type; default_type can
    add_header Cache-Control "public, max-age=86400";
  }
}
```
```toml
# netlify.toml
[[headers]]
  for = "/llms.txt"
  [headers.values]
    Content-Type = "text/plain; charset=UTF-8"
    Cache-Control = "public, max-age=86400"
```
Add to functions.php. Stores content in a WordPress option you can edit from the dashboard.
```php
// functions.php
add_action('init', function () {
    add_rewrite_rule('^llms\\.txt$', 'index.php?llms_txt=1', 'top');
});
// Flush permalinks once after adding the rule (Settings → Permalinks → Save)

add_filter('query_vars', fn($v) => [...$v, 'llms_txt']);

add_action('template_redirect', function () {
    if (get_query_var('llms_txt')) {
        header('Content-Type: text/plain; charset=UTF-8');
        echo get_option('llms_txt_content', '# My Site');
        exit;
    }
});
```
Shopify blocks arbitrary root files. Use a Cloudflare Worker to intercept and return your content.
```js
// Cloudflare Worker
const LLMS = `# My Store
> Brief description of your store.

## Essential Content
- [Home](https://mystore.com): Main page
`;

export default {
  async fetch(req) {
    if (new URL(req.url).pathname === '/llms.txt') {
      return new Response(LLMS, { headers: { 'Content-Type': 'text/plain' } });
    }
    return fetch(req);
  }
};
```
```bash
# Place llms.txt in the repo root
# GitHub Pages serves root files automatically
cp llms.txt ./llms.txt
git add llms.txt
git commit -m "add: llms.txt for AI agent discovery"
git push origin main
# Verify: https://username.github.io/llms.txt
```
Paste a URL to check whether a site has a valid llms.txt and see its compliance score.
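A compliance check along these lines can be sketched in Python. The rules below mirror the table above (H1 on the first line, blockquote immediately after, optional H2 sections); the function name `check_llms_txt` is illustrative and not this tool's actual API:

```python
import re

def check_llms_txt(text: str) -> list[str]:
    """Return a list of spec violations for an llms.txt body (empty = valid)."""
    problems = []
    lines = [l for l in text.splitlines() if l.strip()]
    if not lines or not re.match(r'^# \S', lines[0]):
        problems.append("first line must be an H1 with the site name")
    if len(lines) < 2 or not lines[1].startswith('> '):
        problems.append("the H1 must be followed by a one-line blockquote")
    for l in lines[2:]:
        # After the header, only H2 section headers and link lists are expected
        if l.startswith('#') and not l.startswith('## '):
            problems.append(f"unexpected heading level: {l!r}")
    return problems

sample = """# Your Site
> One sentence about your site.

## Essential Content
- [Home](https://yoursite.com): Main page
"""
print(check_llms_txt(sample))  # → []
```

A real validator would also verify the file's URL path and `text/plain` Content-Type, which can't be seen from the body alone.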
Early adopters of the standard. The format is gaining traction across AI companies, developer tools, and SaaS products.
Everything you need to know about the llms.txt standard and using this tool.
Set Cache-Control: public, max-age=86400 so crawlers re-fetch daily. Treat llms.txt like your About page — a living document, not a one-time setup.