AI visibility controls — which AI crawlers can index your site, what the robots.txt strategy is, and how your content surfaces in ChatGPT / Claude / Perplexity answers.
Where it is
Settings → GEO (/admin/settings?tab=geo).
GEO (Generative Engine Optimization) is about making your content discoverable and citable by AI platforms like ChatGPT, Claude, and Perplexity — similar to how SEO is for Google. This tab controls which crawlers can access your content and how the discovery files are generated.
Robots.txt strategy
The core control. Pick one of four strategies:
| Strategy | What it does |
|---|---|
| Maximum | All bots allowed, including training bots; best for maximum visibility. The default |
| Balanced | Search bots allowed (ChatGPT-User, Claude-SearchBot, PerplexityBot). Training bots blocked (GPTBot, ClaudeBot, Google-Extended). Your content powers AI answers but isn't used for model training |
| Restrictive | All AI bots blocked. Only traditional search engines allowed. Use if legally required |
| Custom | Define your own rules line by line. The tab exposes the generated robots.txt for review |
The generated robots.txt also disallows /admin/, /api/, and preview paths on every strategy.
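To make the strategies concrete, here is a minimal sketch of how a robots.txt could be assembled from the table above. The bot lists and disallowed paths come from this page; the function name and overall structure are illustrative, not the CMS's actual implementation.

```python
# Bot lists mirror the strategy table; names are the real crawler user-agents.
SEARCH_BOTS = ["ChatGPT-User", "Claude-SearchBot", "PerplexityBot"]
TRAINING_BOTS = ["GPTBot", "ClaudeBot", "Google-Extended"]
ALWAYS_DISALLOWED = ["/admin/", "/api/"]  # plus preview paths, per the docs

def build_robots_txt(strategy: str) -> str:
    lines: list[str] = []
    if strategy == "restrictive":
        # Block every AI bot explicitly; traditional search engines fall through.
        for bot in SEARCH_BOTS + TRAINING_BOTS:
            lines += [f"User-agent: {bot}", "Disallow: /", ""]
    elif strategy == "balanced":
        # Allow answer-time search bots, block model-training crawlers.
        for bot in TRAINING_BOTS:
            lines += [f"User-agent: {bot}", "Disallow: /", ""]
    # "maximum" adds no per-bot blocks at all.
    lines.append("User-agent: *")
    for path in ALWAYS_DISALLOWED:
        lines.append(f"Disallow: {path}")
    return "\n".join(lines) + "\n"

print(build_robots_txt("balanced"))
```

Note that under every strategy the trailing `User-agent: *` group still disallows the admin and API paths.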
Generated discovery files
Every build writes these files at the site root (they're what crawlers and AI platforms look for):
- `robots.txt` — crawler policy based on your strategy
- `sitemap.xml` — all indexable pages, with `<xhtml:link rel="alternate" hreflang>` for multi-locale sites
- `llms.txt` — AI-friendly index of your site structure + MCP endpoint pointer
- `llms-full.txt` — full markdown dump of every published document, for retrieval-augmented AI use
- `feed.xml` — RSS feed of your posts
- `ai-plugin.json` — MCP plugin manifest so AI platforms can discover your live content API
- Per-page `.md` files — every HTML page has a markdown sibling at the same URL + `.md` (e.g. `/blog/post.html` → `/blog/post.md`)
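The per-page markdown rule is simple enough to express directly. A sketch of the mapping, assuming the `.html` case shown in the example above (how extensionless routes are handled is my assumption, not documented here):

```python
# Map an HTML page URL to its markdown sibling: same URL, .md in place of .html.
def markdown_sibling(html_path: str) -> str:
    if html_path.endswith(".html"):
        return html_path[: -len(".html")] + ".md"
    # Extensionless or trailing-slash routes: an assumption; the docs
    # only show the .html case.
    return html_path.rstrip("/") + ".md"

print(markdown_sibling("/blog/post.html"))  # /blog/post.md
```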
All of this is automatic once the deploy runs. The GEO tab lets you configure what goes in each, not the build mechanics.
Country / region restrictions
Optional. Most sites leave this blank. Set here if your content is geographically restricted (legal, regional licensing). The value feeds into:
- `<meta name="geo.region">` in every page
- The `llms.txt` preamble
- Structured data (`areaServed`) in JSON-LD for businesses
AI citation tuning
GEO scoring (shown in the dashboard) measures how well your content is structured for AI citation. This tab surfaces the rules and lets you override some:
- Answer-first — lead paragraphs answer the H1 question directly
- Question headers — H2s match how people actually ask
- Statistics — include numbers, percentages, data points
- Citations — link to authoritative external sources
- Freshness — content updated within 90 days
- JSON-LD — structured data for articles, FAQs, HowTo
- Named author — E-E-A-T trust signals
- Depth — 800+ words for comprehensive coverage
The tab shows which rules your site currently passes, plus an Optimize All button that runs all of them against every published doc.
The mcp-plugin endpoint
When GEO is enabled, the CMS exposes a public MCP endpoint at /api/mcp/public. AI platforms that support MCP (Claude Desktop, Cursor, custom agents) can plug it in and get live, structured access to your content — not just the scraped HTML.
This is the canonical way to make your site AI-native. See MCP settings for the authenticated version with write access.
Related
- SEO — search-engine optimization (sister feature)
- Visibility dashboard — combined SEO + GEO score
- MCP server — full MCP protocol docs
- llms.txt spec — the standard this implements