A Skills Pack is a bundle of AI-specific files — llms.txt and agent.json — generated directly from your Skillaeo audit data and customized to your brand, product, and content. Deploying your Skills Pack is the single fastest way to improve your site's technical readiness for AI engines. This guide walks you through generating, customizing, and deploying each file, with platform-specific instructions for Next.js, WordPress, Shopify, and static sites.
What's in a Skills Pack
Your Skills Pack contains two files, each serving a distinct purpose in your AI visibility infrastructure:
| File | Format | Purpose | Who Reads It |
|---|---|---|---|
| llms.txt | Markdown | Human-readable summary of your brand, products, and key content for AI assistants | ChatGPT, Claude, Perplexity, Gemini, and other LLMs |
| agent.json | JSON | Machine-readable metadata describing your services, capabilities, and integration points | Autonomous AI agents, agentic search systems, programmatic AI tools |
Together, these files act as your website's "resume" for AI systems. llms.txt tells conversational AI assistants who you are and what matters most. agent.json tells autonomous AI agents what your service does and how to interact with it. They complement each other — llms.txt provides narrative context while agent.json provides structured, parseable data.
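To make the shape concrete, here is a minimal llms.txt sketch for a hypothetical brand — every name, URL, and description below is a placeholder, and your generated file will reflect your own audit data:

```markdown
# Example Brand

> Example Brand is a hypothetical SaaS product, used here only to illustrate the file's structure.

## Products

- [Widget Builder](https://example.com/products/widget-builder): Drag-and-drop builder for embeddable widgets

## Documentation

- [Getting Started](https://example.com/docs/getting-started): Installation and first-run walkthrough
```

The blockquote under the H1 is the identity statement AI systems read first; the H2 sections group your linked pages by category.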
Why Both Files Matter
You might wonder whether one file is enough. The short answer: AI systems are heterogeneous.
- Conversational AI assistants (ChatGPT, Claude, Perplexity) prefer `llms.txt` because they work with natural language, and Markdown is native to their processing pipelines.
- Agentic AI systems (autonomous browsing agents, product comparison tools, procurement bots) prefer `agent.json` because they need structured fields they can query programmatically.
- Hybrid systems (Google AI Overviews, enterprise AI platforms) check both.
Having both files covers the full spectrum of AI systems that might discover your site.
What Makes a Skills Pack Different from Generic Templates
A Skills Pack generated by Skillaeo is not a generic template. It's built from your actual audit data:
- Your brand name, description, and positioning are extracted from your site content and meta tags
- Your key pages are identified by the audit and linked in `llms.txt` with descriptive annotations
- Your product capabilities are structured in `agent.json` based on what the audit detects on your site
- Your content categories are organized by the sections and topics the audit finds across your pages
This means the files are ready to deploy immediately — you review them for accuracy, make any adjustments, and publish.
Step 1: Run an AEO Audit on Skillaeo
If you haven't already, start by running an audit at skillaeo.com/audit. The audit is the data source for your Skills Pack — it scans your site's content, structure, and metadata to build a comprehensive profile that the generator uses to create your files.
For the best results, audit your homepage or primary landing page. This gives the generator the widest possible view of your brand's positioning, product descriptions, and content organization.
If this is your first audit, see the Skillaeo quick-start guide for a walkthrough.
Step 2: Generate Your Skills Pack from the Report
After your audit completes, look for the Generate Skills Pack option in your report. This triggers the generator, which uses your audit data to produce both files.
The generation process takes a few seconds. When complete, you'll be presented with:
- A preview of your `llms.txt` content
- A preview of your `agent.json` content
- Download options for both files
- An inline editor for customizing each file before download
You don't need to download immediately — you can review and customize first.
Step 3: Review and Customize the Generated Files
The generator produces accurate files, but you know your brand better than any automated system. Review both files with these checks:
Reviewing Your llms.txt
- Brand description: Is the blockquote summary (the first paragraph after your brand name) an accurate, current description of what you do? This is the single most important piece of text in the file — AI systems treat it as your definitive identity statement.
- Linked pages: Are the most important pages included? The generator selects key pages from your audit data, but you may want to add or remove entries.
- Descriptions: Does each linked page have an accurate one-line description? Generic descriptions like "Our about page" are less useful than "Company history, founding team, and mission statement."
- Categories: Are your pages organized under meaningful H2 sections? Common categories include "About," "Products," "Documentation," "Blog," and "Resources."
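As an illustration of the description check above, compare a generic entry with a descriptive one inside a category section (URLs are placeholders):

```markdown
## About

- [About](https://example.com/about): Our about page

## About

- [About](https://example.com/about): Company history, founding team, and mission statement
```

The second form gives an AI system enough context to decide when the page is worth citing; the first tells it nothing it didn't already know from the link text.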
For the full specification and best practices, see the llms.txt complete guide.
Reviewing Your agent.json
- Service description: Does the `description` field accurately capture what your product/service does?
- Capabilities: Is the `capabilities` array complete? Add any core capabilities the generator may have missed.
- Features: Are the listed features accurate and current? Remove any deprecated features and add new ones.
- Contact information: Are email, support URL, and documentation links correct?
- Pricing model: If included, does it reflect your current pricing?
For the full specification and field-by-field guide, see the agent.json guide.
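For orientation, a skeletal agent.json might look like the following. The field names here are illustrative only — `name` and `url` are assumptions beyond the fields discussed above, all values are placeholders, and your generated file is the authoritative structure:

```json
{
  "name": "Example Brand",
  "url": "https://example.com",
  "description": "Hypothetical SaaS product, used only to illustrate the file's shape.",
  "capabilities": ["widget-building", "analytics"],
  "contact": {
    "email": "support@example.com",
    "documentation": "https://example.com/docs"
  }
}
```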
Step 4: Deploy llms.txt to Your Domain Root
Your llms.txt file must be accessible at https://yourdomain.com/llms.txt. This is where AI systems look for it — just like robots.txt lives at the domain root.
Upload the file to your web server's public root directory. The exact method depends on your hosting platform (see platform-specific guides below), but the end result is the same: a GET request to https://yourdomain.com/llms.txt should return your Markdown content with a text/plain or text/markdown content type.
Verification: After uploading, open https://yourdomain.com/llms.txt in your browser. You should see the raw Markdown content displayed. If you see a 404 error, the file isn't in the right location. If you see an HTML page (like your homepage), your web server is redirecting the request instead of serving the file directly.
Step 5: Deploy agent.json to Your Domain Root
Your agent.json file must be accessible at https://yourdomain.com/agent.json. The deployment process is identical to llms.txt — upload to your public root directory.
The file should be served with an application/json content type. Most web servers handle this automatically for .json files.
Verification: Open https://yourdomain.com/agent.json in your browser. You should see formatted JSON. You can also validate the JSON syntax at jsonlint.com to ensure there are no parsing errors that would prevent AI systems from reading it.
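If you prefer to validate locally rather than pasting into jsonlint.com, Python's standard library performs the same syntax check. This is a minimal sketch; it assumes you have downloaded the file as agent.json:

```python
import json

def validate_agent_json(text: str) -> tuple[bool, str]:
    """Return (is_valid, message) for a candidate agent.json payload."""
    try:
        data = json.loads(text)
    except json.JSONDecodeError as exc:
        return False, f"Invalid JSON: {exc}"
    if not isinstance(data, dict):
        return False, "Top-level value should be a JSON object"
    return True, "Valid JSON"

# Usage (assumes agent.json is in the current directory):
# print(validate_agent_json(open("agent.json", encoding="utf-8").read()))
```

A parse error here means AI agents will fail to read the file in exactly the same way.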
Step 6: Update Your robots.txt to Reference llms.txt
While AI systems will discover llms.txt at your domain root by convention, you can explicitly reference it in your robots.txt to ensure maximum discoverability. Add the following lines to your robots.txt:

```
# AI Instructions
Llms-txt: /llms.txt
```

Additionally, confirm that your robots.txt allows AI crawlers to access your site. Check for these user agents and ensure they are not blocked:
```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
```

If your robots.txt has a blanket Disallow: / for unknown user agents, AI crawlers may be blocked by default. The robots.txt and AI crawlers guide covers this in detail.
Step 7: Verify Deployment
After deploying both files and updating robots.txt, run through this verification checklist:
| Check | How to Verify | Expected Result |
|---|---|---|
| llms.txt accessible | Visit https://yourdomain.com/llms.txt | Raw Markdown content displayed |
| agent.json accessible | Visit https://yourdomain.com/agent.json | Valid JSON displayed |
| agent.json valid JSON | Paste contents into jsonlint.com | "Valid JSON" confirmation |
| robots.txt references llms.txt | Visit https://yourdomain.com/robots.txt | Contains Llms-txt: /llms.txt line |
| AI crawlers not blocked | Check robots.txt for GPTBot, ClaudeBot directives | No Disallow rules blocking AI agents |
| Re-audit shows improvement | Run a new Skillaeo audit | Technical readiness score increases |
The most definitive verification is re-running your Skillaeo audit. Your technical readiness score should increase significantly — deploying both files typically improves the overall AI Visibility Score by 10–20 points.
Platform-Specific Deployment Guides
Next.js
In a Next.js project, place both files in your public/ directory:
```
public/
├── llms.txt
├── agent.json
└── robots.txt
```

Next.js serves everything in public/ at the domain root automatically. After placing the files:
- Add `llms.txt` and `agent.json` to `public/`
- Update `public/robots.txt` to include the `Llms-txt: /llms.txt` directive
- Deploy your Next.js app as usual (Vercel, Netlify, or your hosting provider)
- Verify at https://yourdomain.com/llms.txt and https://yourdomain.com/agent.json
If you're generating robots.txt dynamically via a route handler, add the Llms-txt directive to your generation logic.
WordPress
WordPress requires file placement in your site's root directory on the server:
- Access your site via FTP, SFTP, or your hosting provider's file manager
- Navigate to your WordPress root directory (where
wp-config.phplives) - Upload
llms.txtandagent.jsonto this directory - For
robots.txt: If WordPress generates it dynamically, use a plugin like Yoast SEO or Rank Math to add theLlms-txt: /llms.txtdirective. If you have a staticrobots.txt, edit it directly.
Managed WordPress hosts (WP Engine, Kinsta, Flywheel): Use the file manager in your hosting dashboard or deploy via Git if supported.
Shopify
Shopify doesn't allow direct file uploads to the domain root for arbitrary file types. Use these workarounds:
For llms.txt:
- Create a new page in Shopify Admin → Pages
- Use the URL handle `llms.txt` (if Shopify allows it) or create a redirect
- Alternatively, use a Shopify app that supports custom file hosting at the root, or configure a CDN/proxy to serve the file
For agent.json:
- Upload the file as a static asset through Shopify's Content → Files section
- Create a redirect from `/agent.json` to the hosted file URL
- Alternatively, use a Cloudflare Worker or similar edge function to serve the file at the root path
For robots.txt:
Shopify generates robots.txt automatically. Use the robots.txt.liquid template in your theme to add custom directives:
```
Llms-txt: /llms.txt
```

Static Sites (HTML, Hugo, Jekyll, Eleventy)
For static site generators, place both files in your source directory that maps to the build output root:
| Generator | Place Files In | Build Output |
|---|---|---|
| Plain HTML | Root directory alongside index.html | Served directly |
| Hugo | static/ directory | Copied to build root |
| Jekyll | Root directory of your Jekyll project | Copied to _site/ root |
| Eleventy | Your configured input directory | Copied to output root |
| Gatsby | static/ directory | Copied to public/ root |
After building and deploying, verify both files are accessible at their expected URLs.
Frequently Asked Questions
How often should I regenerate my Skills Pack?
Regenerate your Skills Pack when your brand, product, or content changes significantly — for example, after a rebrand, product launch, new feature release, or major content restructuring. For most sites, reviewing and updating the files quarterly is sufficient. If your product evolves rapidly, monthly updates keep your AI representation current.
Can I edit the generated files manually after download?
Absolutely. The generated files are plain text (Markdown and JSON) that you can edit in any text editor. Skillaeo's generator provides a strong starting point, but you should customize descriptions, add missing pages or features, and ensure accuracy before deploying. The llms.txt complete guide and agent.json guide detail best practices for manual editing.
What happens if I deploy an llms.txt with errors?
A malformed llms.txt won't break your site, but it may confuse AI systems. Common issues include broken links (pointing to pages that don't exist), outdated descriptions, and missing required sections. AI systems that encounter a poorly formatted file may fall back to scraping your HTML directly, which defeats the purpose. Always validate links and descriptions before deploying.
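To catch broken links before deploying, you can extract every URL from the file and request each one. The extraction step is a small regex over Markdown link syntax — a simplification that ignores bare URLs and reference-style links:

```python
import re

# Matches [label](https://...) Markdown links; a deliberate simplification.
MD_LINK = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")

def extract_links(llms_txt: str) -> list[str]:
    """Return all absolute URLs used in Markdown links."""
    return [url for _, url in MD_LINK.findall(llms_txt)]

sample = ("- [Docs](https://example.com/docs): Setup guide\n"
          "- [Blog](https://example.com/blog): Articles")
extract_links(sample)
```

Feed each extracted URL to a HEAD or GET request and flag anything that doesn't return 200.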
Do I need both files, or is one enough?
Both files serve different audiences within the AI ecosystem. llms.txt targets conversational AI assistants (ChatGPT, Claude, Perplexity), while agent.json targets autonomous AI agents. Deploying both maximizes your coverage. If you must prioritize, start with llms.txt — it has broader adoption and is recognized by more AI systems today.
Will deploying these files affect my traditional SEO?
No. llms.txt and agent.json are separate files at your domain root that traditional search engine crawlers (Googlebot, Bingbot) ignore for ranking purposes. They don't interfere with your existing HTML, meta tags, Schema markup, or sitemap. In fact, the content you add to these files can complement your SEO by reinforcing your brand's topical authority signals.
Conclusion
Deploying your Skills Pack is the most impactful technical change you can make for AI visibility. In a single deployment session, you go from having zero structured communication with AI systems to providing both narrative context (llms.txt) and machine-readable metadata (agent.json) that covers the full spectrum of AI assistants and agents.
The process is straightforward: run your Skillaeo audit, generate your Skills Pack, review and customize the files, deploy to your domain root, update robots.txt, and verify. Then re-audit to confirm the score improvement. For most sites, this single action accounts for the largest jump in their AEO score and moves the technical readiness category from the low range to solid coverage.
Ready to generate your Skills Pack? Run your free AEO audit at skillaeo.com/audit and generate your customized llms.txt and agent.json in minutes.
