Guides
Step-by-step how-tos for creating, editing, and testing robots.txt files. Whether you're working with WordPress, Shopify, or a custom site, these guides walk you through the process from start to finish.
For a comprehensive overview, see The Complete Robots.txt Guide.
How to Add Your Sitemap to robots.txt
Add your sitemap URL to robots.txt so search engines can find it. Syntax, multiple sitemaps, and common mistakes to avoid.
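As a quick preview, a sitemap reference is a standalone line with an absolute URL, and you can list more than one (example.com is a placeholder):

```
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-news.xml
```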
How to Block AI Crawlers with robots.txt
Block AI training crawlers like GPTBot, ClaudeBot, and PerplexityBot using robots.txt. Complete list of AI user agents and copy-paste rules.
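For example, a group along these lines turns away the crawlers named above (`Disallow: /` blocks the entire site for the listed user agents):

```
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: PerplexityBot
Disallow: /
```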
How to Check Any Website's robots.txt File
How to find and check any website's robots.txt file. View directives, verify accessibility, and check for common configuration mistakes.
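The file always lives at the root of the host. As a minimal sketch using Python's standard library, this helper builds the robots.txt URL for any page on a site; you can then open that URL in a browser or fetch it with curl:

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url: str) -> str:
    """Return the robots.txt URL for the site hosting page_url.

    The file must sit at the root of the host; crawlers ignore
    robots.txt files placed in subdirectories.
    """
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("https://www.example.com/blog/post"))
# https://www.example.com/robots.txt
```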
How to Create a robots.txt File
Step-by-step guide to creating a robots.txt file. Learn the syntax, write directives for search engines, and deploy your file correctly.
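A minimal starter file might look like this (the /admin/ path and domain are placeholders for your own site):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```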
How to Edit robots.txt in Shopify
How to customize your Shopify store's robots.txt using the robots.txt.liquid template. Default rules, common customizations, and gotchas.
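Shopify has no directly editable robots.txt file; you customize the robots.txt.liquid theme template, which renders the defaults from Shopify's documented robots Liquid object. A sketch that keeps the defaults and appends one extra rule for all crawlers (the /internal-search/ path is a placeholder):

```liquid
{% for group in robots.default.groups %}
  {{- group.user_agent }}
  {% for rule in group.rules %}
    {{- rule }}
  {% endfor %}
  {%- if group.user_agent.value == '*' %}
    {{- 'Disallow: /internal-search/' }}
  {%- endif %}
  {% if group.sitemap != blank %}
    {{- group.sitemap }}
  {% endif %}
{% endfor %}
```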
How to Edit robots.txt in WordPress
Edit your WordPress robots.txt file using Yoast SEO, Rank Math, or direct file editing. Step-by-step instructions for each method.
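For reference, the virtual robots.txt WordPress serves by default is roughly the following; a physical file in the site root (or an SEO plugin's editor) overrides it:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```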
How to Fix "Blocked by robots.txt" Errors
Fix "blocked by robots.txt" errors in Google Search Console. Diagnose Disallow rules, update your robots.txt, and get your pages indexed.
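The usual culprit is a Disallow rule broader than intended; narrowing it unblocks the affected pages. An illustrative before-and-after (paths are placeholders):

```
# Too broad: blocks every URL starting with /blog
Disallow: /blog

# Narrower: blocks only the feed, leaves posts crawlable
Disallow: /blog/feed/
```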
How to Read and Understand robots.txt
Learn to read any robots.txt file. Understand User-agent, Disallow, Allow, Sitemap, and Crawl-delay directives with real-world examples.
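An annotated example covering those directives (paths and domain are placeholders; note that Google ignores Crawl-delay):

```
User-agent: *              # This group applies to all crawlers
Disallow: /private/        # Don't crawl anything under /private/
Allow: /private/press/     # ...except this subfolder
Crawl-delay: 10            # Wait 10s between requests (ignored by Google)

User-agent: Googlebot      # A separate group just for Googlebot
Disallow:                  # Empty value: nothing is blocked

Sitemap: https://www.example.com/sitemap.xml
```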
How to Test and Validate Your robots.txt
Four ways to test your robots.txt file: online validators, Google Search Console, command-line tools, and dedicated testing services.
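For the command-line route, Python's standard-library urllib.robotparser can check rules offline; the file contents and URLs below are illustrative. (This parser applies rules first-match, so the Allow line is placed before the broader Disallow.)

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt contents; Allow precedes Disallow because
# urllib.robotparser applies the first matching rule.
rules = """\
User-agent: *
Allow: /private/press/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://www.example.com/blog/"))           # True
print(rp.can_fetch("*", "https://www.example.com/private/notes"))   # False
print(rp.can_fetch("*", "https://www.example.com/private/press/"))  # True
```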