robots.txt Generator
Generate a robots.txt file to control how search engines crawl your website. Add rules for specific bots, set allowed and disallowed paths, configure crawl delays, and list sitemaps.
robots.txt Preview
# robots.txt
# Generated by DevBolt robots.txt Generator
# https://devbolt.dev/tools/robots-generator

User-agent: *
Allow: /
About robots.txt
- robots.txt is a text file placed at the root of your website that tells search engine crawlers which pages they can or cannot access.
- User-agent specifies which bot the rules apply to. * means all bots.
- Disallow blocks a path from crawling.
- Allow overrides a Disallow for a more specific path.
- Crawl-delay sets seconds between requests (supported by Bing and Yandex; ignored by Google).
- Sitemap directives help crawlers discover your sitemap. Use full URLs.
- robots.txt is advisory: well-behaved bots follow it, but it does not enforce access control. Use authentication for truly private content.
- Everything runs in your browser — no data is sent over the network.
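To see how these directives are interpreted in practice, here is a minimal sketch using Python's standard-library robots.txt parser. The paths (/admin/, /admin/public/) are hypothetical examples, not part of this tool. Note that Python's parser applies rules in file order rather than by longest match, so the more specific Allow is listed before the broader Disallow here.

```python
# Sketch: checking hypothetical robots.txt rules with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /admin/public/
Crawl-delay: 10
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Blocked by "Disallow: /admin/".
print(rp.can_fetch("*", "/admin/secret.html"))   # False

# Permitted by the more specific "Allow: /admin/public/".
print(rp.can_fetch("*", "/admin/public/page"))   # True

# Crawl-delay is exposed per user-agent.
print(rp.crawl_delay("*"))                       # 10
```

Remember that this only reflects how one compliant client reads the file; a misbehaving bot can ignore the rules entirely, which is why robots.txt is not an access-control mechanism.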