Robots.txt playbook
Master technical SEO with the Robots.txt Generator
The Robots.txt Generator gives you precise control over how search engines and AI crawlers crawl your site. Use it to launch clean builds, protect staging environments, and keep critical assets discoverable.
What does robots.txt control?
A robots.txt file sits at the root of your domain and tells crawlers which parts of your site they may request. Compliant crawlers fetch it before requesting any other URL, making it the first line of defense for staging folders, duplicate archives, and experimental content.
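For example, a site served from example.com would expose the file at https://example.com/robots.txt. A minimal sketch, assuming a hypothetical /staging/ folder you want to keep out of crawlers' reach:

    User-agent: *        # this group applies to every crawler
    Disallow: /staging/  # ask all bots to skip the staging folder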
Rules are grouped by User-agent. Within a group you can declare Disallow paths to block crawling, Allow directives to carve exceptions out of those blocks, and optional Crawl-delay values to slow bots down (not every crawler honors this directive). Sitemap directives, which apply to the whole file rather than a single group, point crawlers to your XML sitemaps for rapid discovery. The sketch below combines them.
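A minimal sketch putting these directives together; every path and the sitemap URL here are placeholders:

    # Rules for Google's crawler only
    User-agent: Googlebot
    Disallow: /archive/           # block the duplicate archive
    Allow: /archive/featured/     # override: keep this subfolder crawlable

    # Fallback rules for every other crawler
    User-agent: *
    Disallow: /staging/
    Crawl-delay: 10               # seconds between requests; ignored by some bots

    # Sitemap lines stand alone and apply file-wide
    Sitemap: https://example.com/sitemap.xml

Note that groups do not combine: a crawler obeys only the most specific User-agent group that matches it, so Googlebot here follows its own group and ignores the * rules entirely.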