
Best Robots.txt for WordPress — Setup Guide (2026)

Last updated: April 2026 · 7 min read · SEO Tools

WordPress generates a basic robots.txt automatically, but it is too minimal for most sites. Here is the recommended configuration, how to edit it with any setup, and the specific rules WooCommerce stores need.

The Recommended WordPress Robots.txt

This configuration works for the majority of WordPress sites — blogs, business sites, portfolios, and small e-commerce:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-includes/
Disallow: /cgi-bin/
Disallow: /trackback/
Disallow: /xmlrpc.php
Disallow: /?s=
Disallow: /search/
Disallow: /*?replytocom=
Disallow: /tag/*/page/
Disallow: /category/*/page/

Sitemap: https://yoursite.com/sitemap_index.xml
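Before deploying, you can sanity-check the literal-path rules with Python's standard-library parser. Two caveats: urllib.robotparser applies the first matching rule (Google uses the most specific match), so the Allow line is listed first in this sketch, and wildcard rules such as /*?replytocom= are left out because robotparser does not implement Google-style '*' matching.

```python
# Sanity-check the literal-path rules with Python's standard library.
# Allow is listed first because robotparser uses first-match order,
# not Google's most-specific-match order.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /search/
Disallow: /?s=
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://yoursite.com/wp-admin/"))                # False
print(rp.can_fetch("*", "https://yoursite.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("*", "https://yoursite.com/?s=test"))                  # False
print(rp.can_fetch("*", "https://yoursite.com/my-first-post/"))           # True
```

To test the wildcard rules the way Google actually evaluates them, use the robots.txt Tester in Google Search Console instead.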

What Each Rule Does

| Rule | Purpose | Why It Matters |
| --- | --- | --- |
| Disallow: /wp-admin/ | Blocks the WordPress admin dashboard | Crawlers should not index login screens and admin pages |
| Allow: /wp-admin/admin-ajax.php | Allows AJAX requests used by themes and plugins | Many front-end features break if this is blocked — forms, search, dynamic content |
| Disallow: /wp-includes/ | Blocks WordPress core files | Core PHP files are not useful for search engines to index |
| Disallow: /cgi-bin/ | Blocks the server scripts directory | Standard security practice — no user-facing content here |
| Disallow: /trackback/ | Blocks trackback URLs | Trackbacks are outdated and a spam vector — no reason to crawl them |
| Disallow: /xmlrpc.php | Blocks the XML-RPC endpoint | Used for pingbacks and remote access — also a common attack target |
| Disallow: /?s= and /search/ | Blocks internal search result pages | Internal search results are thin content — let Google index your real pages instead |
| Disallow: /*?replytocom= | Blocks comment reply URLs | Prevents duplicate content from threaded comment links |
| Disallow: /tag/*/page/ | Blocks paginated tag archives | Saves crawl budget — page 2, 3, 4 of tag archives add little value |
| Disallow: /category/*/page/ | Blocks paginated category archives | Same as tags — paginated archives waste crawl budget |
| Sitemap: ... | Points crawlers to your sitemap | Essential — tells crawlers exactly where to find all your content |

Three Ways to Edit Robots.txt in WordPress

Option 1: Yoast SEO

  1. Go to Yoast SEO → Tools → File editor
  2. If no robots.txt exists, click "Create robots.txt file"
  3. Paste your rules and click Save
  4. Visit yoursite.com/robots.txt to verify

Note: if the File editor tab is missing, your hosting provider may have disabled file editing. Use FTP instead.

Option 2: Rank Math

  1. Go to Rank Math → General Settings → Edit robots.txt
  2. Toggle the edit option on
  3. Paste your rules and save

Note: Rank Math serves robots.txt dynamically from the database, so no physical file is created in your WordPress root.

Option 3: FTP / File Manager

  1. Connect to your server via FTP (FileZilla, Cyberduck) or your host's File Manager
  2. Navigate to the root WordPress directory (where wp-config.php lives)
  3. Create or edit robots.txt — a plain text file, no special encoding
  4. Upload and verify at yoursite.com/robots.txt
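Step 3 deserves emphasis: some editors save UTF-8 with a byte-order mark (BOM), which can prevent crawlers from recognizing the first User-agent line. A minimal sketch of writing a clean plain-ASCII robots.txt before uploading (the rules shown are an abbreviated example):

```python
# Write robots.txt as plain ASCII with Unix line endings and no BOM —
# the safest encoding for crawlers. Rules abbreviated for illustration.
RULES = """User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/

Sitemap: https://yoursite.com/sitemap_index.xml
"""

with open("robots.txt", "w", encoding="ascii", newline="\n") as f:
    f.write(RULES)

# Verify: no UTF-8 BOM, and the file opens with the User-agent directive.
with open("robots.txt", "rb") as f:
    raw = f.read()
print(raw.startswith(b"\xef\xbb\xbf"))  # False — no BOM
print(raw.split(b"\n")[0])              # b'User-agent: *'
```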

WooCommerce Additions

If you run WooCommerce, add these rules to block pages that contain session-specific or private data:

# WooCommerce specific
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /wishlist/
Disallow: /*?add-to-cart=*
Disallow: /*?orderby=*
Disallow: /*?filter_*

These rules block cart pages, checkout flows, account dashboards, and product filter URLs from being crawled. Product pages and category pages remain fully accessible to search engines.
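The same robotparser sanity check works here for the literal-path rules (the wildcard filter rules cannot be tested this way, and the /product/ permalink below assumes WooCommerce's default product URL structure):

```python
# Check that checkout-flow pages are blocked while product pages stay
# crawlable. /product/blue-tee/ assumes WooCommerce's default permalinks.
from urllib.robotparser import RobotFileParser

woo_rules = """\
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
"""

rp = RobotFileParser()
rp.parse(woo_rules.splitlines())

print(rp.can_fetch("*", "https://yoursite.com/cart/"))              # False
print(rp.can_fetch("*", "https://yoursite.com/checkout/"))          # False
print(rp.can_fetch("*", "https://yoursite.com/product/blue-tee/"))  # True
```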

Common WordPress Robots.txt Mistakes

| Mistake | Why It Is Harmful | Fix |
| --- | --- | --- |
| Blocking /wp-content/ | Prevents Google from loading CSS, JS, and images — your pages render as broken | Remove the Disallow: /wp-content/ rule entirely |
| No Sitemap directive | Crawlers rely on the Sitemap line to discover content efficiently | Add Sitemap: https://yoursite.com/sitemap_index.xml |
| Blocking /feed/ | Prevents RSS syndication and content discovery | Remove unless you have a specific reason to block feeds |
| Blocking /wp-content/uploads/ | Your images disappear from Google Image Search | Never block uploads — your media lives here |
| Using Disallow to hide pages from search | Disallow prevents crawling but not indexing — pages can still appear in search results | Use a noindex meta tag (via Yoast or Rank Math) to remove pages from search results |
| Not testing after changes | A single typo can block your entire site | Always visit yoursite.com/robots.txt after editing and check Google Search Console |
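Several of these mistakes can be caught automatically before you deploy. A sketch of a pre-deploy check (the list of must-stay-crawlable paths is an assumption — adjust it to your theme and media URLs):

```python
# Pre-deploy check: flag any critical WordPress path that a proposed
# robots.txt would block. The must_allow list is illustrative.
from urllib.robotparser import RobotFileParser

def blocked_critical_paths(rules: str) -> list:
    """Return the critical URLs that these rules would wrongly block."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    must_allow = [
        "https://yoursite.com/",
        "https://yoursite.com/wp-content/themes/mytheme/style.css",
        "https://yoursite.com/wp-content/uploads/2026/04/photo.jpg",
        "https://yoursite.com/feed/",
    ]
    return [u for u in must_allow if not rp.can_fetch("*", u)]

bad = "User-agent: *\nDisallow: /wp-content/\n"
print(blocked_critical_paths(bad))   # flags the CSS and image URLs

good = "User-agent: *\nDisallow: /wp-admin/\n"
print(blocked_critical_paths(good))  # []
```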

SEO Tools for WordPress Sites

Generate the right robots.txt for your WordPress site — paste it in and go.

Open Robots.txt Generator