
How to Include Your Sitemap in Robots.txt (and Why You Should)

Last updated: April 2026 · 5 min read

Table of Contents

  1. The correct syntax
  2. Is it required?
  3. Sitemap index vs individual sitemaps
  4. Where to put the sitemap line
  5. Check and validate
  6. Frequently Asked Questions

Your sitemap tells search engines which URLs exist on your site. Your robots.txt tells them what they can and can't crawl. Putting the sitemap URL inside robots.txt connects the two files: Googlebot finds your sitemap the moment it reads your robots.txt, which it fetches before crawling anything else on the host.

The Correct Way to Add a Sitemap to Robots.txt

Add the Sitemap directive at the bottom of your robots.txt file. It goes outside of any User-agent blocks — it's a global directive, not agent-specific:

User-agent: *
Disallow: /admin/
Disallow: /account/

Sitemap: https://yoursite.com/sitemap.xml

The Sitemap line requires a full absolute URL — not a relative path. Sitemap: /sitemap.xml won't work; it must be Sitemap: https://yoursite.com/sitemap.xml. You can include multiple Sitemap lines if you have more than one sitemap file:

Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/blog-sitemap.xml
Sitemap: https://yoursite.com/products-sitemap.xml
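If you generate or audit robots.txt files programmatically, the absolute-URL rule is easy to enforce. Here's a minimal sketch in Python (`extract_sitemaps` is a made-up helper for illustration, not a standard API):

```python
from urllib.parse import urlparse

def extract_sitemaps(robots_txt: str) -> list[str]:
    """Return the absolute sitemap URLs declared in a robots.txt body.

    Relative values (e.g. "Sitemap: /sitemap.xml") are skipped, since
    the directive requires a fully qualified URL.
    """
    sitemaps = []
    for line in robots_txt.splitlines():
        # Directive names are case-insensitive; the value follows the first colon.
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap":
            url = value.strip()
            if urlparse(url).scheme in ("http", "https"):
                sitemaps.append(url)
    return sitemaps
```

Feeding it the example file above returns all three sitemap URLs, while a relative `Sitemap: /sitemap.xml` line is dropped.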

Is the Sitemap Directive Required in Robots.txt?

No — but it's a strong best practice. Google can discover your sitemap three ways: through robots.txt, through Search Console submission, or by stumbling on the URL during crawling. Using all three gives you the best coverage.

The robots.txt method is particularly valuable for new sites. Googlebot crawls robots.txt as one of its first actions when it discovers a domain. If your sitemap is listed there, it learns about your URL structure immediately — before it's had time to crawl much of your content.

For established sites with active Search Console accounts, the difference is smaller. But there's no downside to including it, and it takes one line.


Sitemap Index vs. Individual Sitemaps — Which URL to Use

If your site has a sitemap index file (a sitemap that lists other sitemaps), point the robots.txt Sitemap directive at the index URL, not individual sitemaps. Google follows the index and discovers everything underneath it:

Sitemap: https://yoursite.com/sitemap_index.xml
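A sitemap index is itself just an XML file that lists other sitemaps. A minimal example, per the sitemaps.org schema (the URLs here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://yoursite.com/blog-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://yoursite.com/products-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```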

If you have separate sitemaps for different content types (blog, products, pages) without a sitemap index, list each one on its own Sitemap line. There's no hard limit on how many you can include.

One thing to verify: make sure the sitemap URL in your robots.txt is actually accessible. A 404 on the sitemap URL is worse than not listing it at all — it generates crawl errors and wastes Googlebot's time.
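That accessibility check can be automated with a small HTTP probe. This is a sketch (`sitemap_status` is a hypothetical helper; the `opener` parameter exists only so the function is testable without a network):

```python
import urllib.error
import urllib.request

def sitemap_status(url, opener=urllib.request.urlopen, timeout=10.0):
    """Return the HTTP status code for a sitemap URL using a HEAD request.

    Anything other than 200 means the Sitemap line in robots.txt points
    crawlers at a dead or misconfigured resource.
    """
    req = urllib.request.Request(url, method="HEAD")
    try:
        with opener(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 404, 410, 500, ... still tell us what the server answered.
        return e.code
```

Run it against every URL you list in robots.txt before deploying; a 404 here is exactly the error class the paragraph above warns about.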

Where in the Robots.txt File Should the Sitemap Go?

Convention is to put Sitemap directives at the very bottom of the file, after all User-agent blocks. Technically the spec doesn't require a specific position — the Sitemap directive is parsed independently — but bottom placement is the most widely adopted standard and least likely to cause confusion.

Don't put it inside a User-agent block. This won't break anything in most parsers, but it's incorrect usage and some strict parsers may misread it:

User-agent: *
Disallow: /admin/
Sitemap: https://yoursite.com/sitemap.xml   <- Wrong position

Sitemap: https://yoursite.com/sitemap.xml   <- Correct: after all blocks
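If you lint robots.txt files in CI, the bottom-placement convention can be checked mechanically. A rough sketch (`sitemaps_at_bottom` is a made-up name; it only looks at User-agent, Allow, and Disallow lines):

```python
def sitemaps_at_bottom(robots_txt: str) -> bool:
    """True if every Sitemap line appears after the last group directive."""
    last_group = -1      # index of the last User-agent/Allow/Disallow line
    first_sitemap = None  # index of the first Sitemap line
    for i, line in enumerate(robots_txt.splitlines()):
        key = line.partition(":")[0].strip().lower()
        if key in ("user-agent", "disallow", "allow"):
            last_group = i
        elif key == "sitemap" and first_sitemap is None:
            first_sitemap = i
    return first_sitemap is None or first_sitemap > last_group
```

The "wrong position" example above fails this check; the "correct" one passes.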

How to Verify Your Sitemap Is Being Found

After adding the Sitemap line, verify it in two places. First, load your robots.txt URL directly in a browser and confirm the line appears. Second, open Google Search Console — go to Sitemaps and check whether your sitemap is listed and returning a green status (no errors).

If you're seeing "couldn't fetch" or "has errors" on your sitemap in Search Console, fix those issues first. A broken sitemap URL in robots.txt creates errors every time Googlebot reads your robots.txt file.

You can also test your full robots.txt syntax — including the Sitemap lines — using any robots.txt validator before deploying changes.
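Python's standard library even ships a basic parser you can use for a quick local check: `urllib.robotparser` reads a robots.txt body and, since Python 3.8, exposes any Sitemap lines via `site_maps()`:

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/

Sitemap: https://yoursite.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Sitemap URLs the parser found (None if the file declares none).
print(rp.site_maps())  # -> ['https://yoursite.com/sitemap.xml']

# The same parser also answers crawl-permission questions.
print(rp.can_fetch("*", "https://yoursite.com/admin/page"))
```

If `site_maps()` comes back empty after a deploy, the Sitemap line was mis-typed or stripped somewhere in your pipeline.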


Frequently Asked Questions

Can I include a sitemap from a subdomain in my main robots.txt?

Yes. The Sitemap directive accepts any valid absolute URL, even one on a different host. If your sitemap lives at blog.yoursite.com/sitemap.xml, list the full URL, scheme included: Sitemap: https://blog.yoursite.com/sitemap.xml.

Does adding a sitemap to robots.txt automatically get it indexed?

It helps Googlebot discover it, but discovery and indexing are separate. Your sitemap tells Google which URLs exist; indexing depends on content quality, crawl budget, and other ranking factors.

What if my sitemap URL changes?

Update the robots.txt line immediately. Also update your Search Console sitemap submission. Old sitemap URLs that return 404 generate crawl errors.

Should I include image or video sitemaps in robots.txt too?

Yes, if you have them. List each one on a separate Sitemap line. Image and video sitemaps help Google discover rich media content.

Is the Sitemap directive supported by all search engines?

Google, Bing, and Yandex all support it. The directive comes from the sitemaps.org protocol, which the major search engines adopted jointly; it isn't part of the original robots.txt standard. Some smaller crawlers may ignore it, but submitting via each engine's search console covers you.
