SEO & AI Discovery
Automatic SEO, Schema.org markup, llms.txt, MCP server, and structured data for AI agents.
VoxelSite generates websites that are optimized for both search engines and AI agents.
Search engine optimization
Every page VoxelSite generates includes:
- Title tags — descriptive, keyword-relevant titles
- Meta descriptions — compelling summaries for search results
- Semantic HTML — proper heading hierarchy (one `<h1>` per page), landmarks, and semantic elements
- Open Graph tags — for social media sharing previews
- Alt text — on every image
- ARIA labels — where needed for accessibility
- WCAG AA contrast — color choices that meet accessibility standards
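As an illustration, the `<head>` of a generated page might look like the sketch below. The elements are standard HTML; the business name and values are hypothetical:

```html
<head>
  <title>Harbor Lane Bakery — Fresh Sourdough in Portland</title>
  <meta name="description" content="Family-run bakery offering sourdough bread, pastries, and custom cakes.">
  <meta property="og:title" content="Harbor Lane Bakery">
  <meta property="og:description" content="Fresh sourdough, baked daily.">
  <meta property="og:image" content="/assets/img/og-cover.jpg">
</head>
```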
Generated on publish
When you publish your site, VoxelSite automatically generates:
| File | Purpose |
|---|---|
| sitemap.xml | XML sitemap listing all pages for search engine crawlers |
| robots.txt | Crawler rules that welcome both search engines and AI agents |
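A permissive robots.txt of the kind described above might look like this (the exact rules VoxelSite emits may differ; the domain is a placeholder):

```txt
# Welcome search engines and AI crawlers alike
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```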
AI discovery
VoxelSite generates sites that are machine-readable by default — not just for search engines, but for AI agents.
llms.txt
Every published site includes an llms.txt file — a plain-text site map designed for AI language models. It tells AI agents what your business does, what pages exist, and what services you offer.
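The llms.txt convention is a plain-text/markdown summary aimed at language models. A minimal sketch, using a hypothetical business and page names:

```txt
# Harbor Lane Bakery

> Family-run bakery in Portland offering sourdough bread, pastries, and custom cakes.

## Pages
- [Menu](/menu.html): Daily breads and pastries
- [Contact](/contact.html): Location, phone, and opening hours
```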
MCP server
Every site includes an MCP (Model Context Protocol) endpoint at mcp.php. AI agents can query this endpoint to:
- Read your business information
- Browse your menu, services, or product listings
- Get contact details and opening hours
- Submit form data (e.g., make a booking)
This means AI assistants like ChatGPT, Claude, and others can interact with your site programmatically.
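MCP messages over HTTP are JSON-RPC 2.0 requests. The Python sketch below builds the payload an agent might send to invoke a booking tool; the tool name `make_booking` and its arguments are hypothetical, and the actual tools your mcp.php exposes may differ:

```python
import json
import urllib.request

# JSON-RPC 2.0 envelope used by MCP; "tools/call" invokes a named tool.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "make_booking",  # hypothetical tool name
        "arguments": {"date": "2025-07-01", "party_size": 2},
    },
}

request = urllib.request.Request(
    "https://example.com/mcp.php",  # replace with your site's endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# response = urllib.request.urlopen(request)  # uncomment to actually send
print(json.dumps(payload, indent=2))
```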
Schema.org structured data
VoxelSite generates JSON-LD structured data for:
- Organization/LocalBusiness — your business name, address, phone, hours
- WebPage — standard page metadata
- FAQ — frequently asked questions
- Breadcrumbs — navigation hierarchy
This data helps search engines display rich results and helps AI agents understand your business.
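For example, the LocalBusiness markup might be emitted as JSON-LD along these lines (the types and properties are standard Schema.org vocabulary; the values are illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Harbor Lane Bakery",
  "telephone": "+1-503-555-0142",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "12 Harbor Lane",
    "addressLocality": "Portland",
    "addressRegion": "OR"
  },
  "openingHours": "Mo-Sa 07:00-17:00"
}
```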
Structured data layer
Beyond Schema.org, VoxelSite maintains a data layer in assets/data/:
| File | Content |
|---|---|
| site.json | Core site info — name, description, features |
| memory.json | Business facts accumulated from conversations |
| design-intelligence.json | Design decisions and rationale |
| {feature}.json | Feature-specific data (menu, services, team, etc.) |
This structured data is the single source of truth for your business information — both the website and AI agents read from it.
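As a sketch, site.json might contain something like the following; the keys shown are illustrative assumptions, and the actual schema is defined by VoxelSite:

```json
{
  "name": "Harbor Lane Bakery",
  "description": "Family-run bakery in Portland",
  "features": ["menu", "contact", "booking"]
}
```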
Optimizing for AI
Use the Optimize for AI action to audit and enhance your site's AI readability. VoxelSite reviews:
- FAQ section completeness
- Structured data coverage
- llms.txt quality
- Local business details
- Meta description quality
Editing SEO & AI files
The Code Editor includes a dedicated SEO & AI section in the sidebar (between SITE FILES and SYSTEM PROMPTS). This gives you direct access to edit:
- robots.txt — control which crawlers can access your site and which paths are blocked
- llms.txt — customize the AI-readable description of your business, pages, and features
Both files live in your site's root directory and are served directly to search engines and AI agents.
Important: These files are regenerated every time you publish. If you manually edit robots.txt or llms.txt and then publish your site, VoxelSite will overwrite your edits with freshly generated versions. To preserve manual changes, make your edits after your final publish.
Related
- Download & Export — export your site with robots.txt and llms.txt included
- Code Editor — the built-in editor where SEO & AI files are editable
- Publishing — how publishing regenerates SEO files