A practical guide to making websites readable by LLMs and AI agents, based on real-world testing on evilmartians.com. Six techniques are ranked by impact vs. effort: creating an llms.txt file, serving .md routes for every page, adding <link rel="alternate"> tags and HTTP Link headers, using a visually hidden div hint for…
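The .md-route and discovery techniques above can be sketched as a small helper. This is a hypothetical illustration, not code from the article: `markdown_alternates` is an assumed name, and the convention of mapping a page path to a sibling `.md` path is one plausible way to implement the article's ".md routes for every page" idea.

```python
def markdown_alternates(path: str) -> tuple[str, str]:
    """Given a page path, build the HTTP Link header value and the
    <link rel="alternate"> tag pointing at its Markdown twin.
    (Hypothetical sketch; the ".md sibling" mapping is an assumption.)
    """
    md_path = path.rstrip("/") + ".md" if path != "/" else "/index.md"
    # HTTP Link header value, per RFC 8288 syntax
    header = f'<{md_path}>; rel="alternate"; type="text/markdown"'
    # Equivalent HTML <head> tag for crawlers that parse the page
    tag = f'<link rel="alternate" type="text/markdown" href="{md_path}">'
    return header, tag

header, tag = markdown_alternates("/blog/llms-txt")
```

A server would emit `header` as a `Link:` response header on the HTML page and render `tag` into its `<head>`, so an agent can find the Markdown version either way.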

19 min read time · From evilmartians.com
Table of contents

Priority levels at a glance
1. What is llms.txt and why does every site need one?
2. Why are .md routes the thing that actually matters?
3. How do LLMs discover your Markdown version? <link> tags and HTTP Link headers
4. What about when someone just pastes your URL into ChatGPT?
5. When should you use /llms-full.txt?
6. What is content negotiation for LLMs?
What AI SEO techniques don’t work?
OK, how do you know any of this is working?
What’s the rollout order?
FAQ
