Vibe Coding with Replit -- May 9, 2026

How to Add a Sitemap and Robots.txt to My Replit Site

By Arjita Sethi · May 9, 2026 · 5 min read
Direct Answer

Add a sitemap.xml and robots.txt to your Replit Express app by asking Claude to create Express routes that serve them dynamically. The sitemap route generates an XML list of all your pages with their URLs and last-modified dates. The robots.txt route serves a plain text file that tells crawlers they can access all pages and where to find the sitemap. Both are essential for AEO -- AI engines use sitemaps to discover and crawl your content.

What a Sitemap Does

A sitemap is an XML file that lists every page on your site with its URL and, optionally, its last-modified date and update frequency. Search engines and AI crawlers use it to discover all your content -- particularly pages that are not linked from your homepage. Without a sitemap, orphan pages (pages with no inbound links) may never be crawled, and content that is never crawled can never be cited.
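For reference, a minimal sitemap with two entries looks like this (the domain and paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2026-05-09</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/my-first-post</loc>
    <lastmod>2026-05-09</lastmod>
  </url>
</urlset>
```

Each `<url>` element holds one page; `<lastmod>` tells crawlers when the page last changed.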

For a content-heavy site with many blog posts, the sitemap is critical for AEO. Every new blog post needs to appear in the sitemap so AI engines can find and index it.

The AEO value of a sitemap: AI engines crawl your sitemap to build their knowledge of what your site covers. A complete, accurate sitemap with all your blog posts is the fastest way to ensure your AEO content gets indexed and is available for citation.

How to Add Both to Your Replit App

Prompt Claude: "Add a dynamic sitemap.xml and robots.txt to my Replit Express app. The sitemap should include all static routes and all blog post routes with today's date as lastmod. The robots.txt should allow all crawlers to access all pages and include the sitemap URL. Both should be served as Express GET routes at /sitemap.xml and /robots.txt."
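The routes Claude generates from that prompt might look something like this sketch. `BASE_URL` and the route list are assumptions -- swap in your own domain and pages. The XML and robots.txt bodies are built by plain functions, with the Express wiring shown in comments so you can drop it into your existing `app`:

```typescript
// Placeholder values -- replace with your real domain and routes.
const BASE_URL = "https://example.com";
const STATIC_ROUTES = ["/", "/about", "/blog"];

function buildSitemapXml(paths: string[], lastmod: string): string {
  // One <url> entry per path, all stamped with the same lastmod date.
  const urls = paths
    .map(p => `  <url><loc>${BASE_URL}${p}</loc><lastmod>${lastmod}</lastmod></url>`)
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`;
}

function buildRobotsTxt(): string {
  // Allow all crawlers everywhere and point them at the sitemap.
  return `User-agent: *\nAllow: /\n\nSitemap: ${BASE_URL}/sitemap.xml`;
}

// In your Express app:
// app.get("/sitemap.xml", (_req, res) => {
//   const today = new Date().toISOString().slice(0, 10);
//   res.type("application/xml").send(buildSitemapXml(STATIC_ROUTES, today));
// });
// app.get("/robots.txt", (_req, res) => {
//   res.type("text/plain").send(buildRobotsTxt());
// });
```

Serving both from GET routes (rather than static files) means they always reflect the current state of your app.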

Keeping Your Sitemap Current

As you add new blog posts, the sitemap should update automatically. Ask Claude to generate the sitemap dynamically from your posts array rather than hardcoding each URL. This way, every time you add a post to Blog.tsx or your posts configuration, it automatically appears in the sitemap.
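A sketch of that dynamic generation, assuming your posts live in an array like the one Blog.tsx renders from. The `Post` shape, field names, and sample data here are assumptions -- match them to your own posts configuration:

```typescript
// Assumed post shape -- adjust field names to your posts config.
interface Post { slug: string; updated: string; }

const posts: Post[] = [
  { slug: "add-a-sitemap-and-robots-txt", updated: "2026-05-09" },
  { slug: "what-is-aeo", updated: "2026-05-01" },
];

function sitemapEntries(baseUrl: string, items: Post[]): string[] {
  // One <url> element per post; lastmod comes from the post itself,
  // so the sitemap stays accurate without hardcoding any URLs.
  return items.map(
    p => `<url><loc>${baseUrl}/blog/${p.slug}</loc><lastmod>${p.updated}</lastmod></url>`
  );
}
```

Because the sitemap route reads the same array your blog renders from, adding a post there updates the sitemap on the next request with no extra step.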

Frequently Asked Questions

How do I add a sitemap to my Replit site?
Ask Claude to create an Express GET route at /sitemap.xml that generates an XML list of all your pages dynamically. For blog sites, it should generate from your posts array so new posts appear automatically.
What is robots.txt and why do I need it?
Robots.txt tells search engines and AI crawlers which pages they can and cannot crawl. Include the sitemap URL in your robots.txt so crawlers know where to find your full content list.
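A robots.txt matching that answer -- allow every crawler everywhere and point to the sitemap -- is only a few lines (the domain is a placeholder):

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```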
How do I submit my sitemap to Google?
Go to Google Search Console, add your property (your site URL), and submit your sitemap URL under the Sitemaps section. Google will crawl it and index your pages.
Do AI engines use sitemaps?
Yes. AI engines that crawl the web for their knowledge base use sitemaps to discover content. A complete, accurate sitemap with all your AEO-optimized content is an important signal for ensuring your content gets indexed for citation.
Should I include every page in my sitemap?
Include all pages you want crawled: homepage, blog posts, main product pages, and key landing pages. Exclude admin pages, private pages, and any pages behind authentication.
Build With AI

Build an AEO-Ready Site

The full AEO implementation guide is inside the Build with AI Vault.

Access the Vault