How do I generate a robots.txt file online?
Select which bots to allow or block — Googlebot, Bingbot, AI crawlers (GPTBot, ClaudeBot, CCBot), and more. Add custom allow/disallow paths, specify sitemaps, and set crawl delays. Choose from presets like allow all, block all, or block AI bots. Copy or download the result. Everything runs in your browser.
Example input: allow all pages; disallow /admin and /api; sitemap at https://example.com/sitemap.xml

Example output:
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /api/
Sitemap: https://example.com/sitemap.xml
robots.txt Generator
Generate a robots.txt file to control how search engines crawl your website. Add rules for specific bots, set allowed/disallowed paths, crawl delays, and sitemaps.
robots.txt Preview
# robots.txt
# Generated by DevBolt robots.txt Generator
# https://devbolt.dev/tools/robots-generator
User-agent: *
Allow: /
About robots.txt
- robots.txt is a text file placed at the root of your website that tells search engine crawlers which pages they can or cannot access.
- `User-agent` specifies which bot the rules apply to. `*` means all bots.
- `Disallow` blocks a path from crawling.
- `Allow` overrides a disallow for a more specific path (see the example after this list).
- `Crawl-delay` sets seconds between requests (supported by Bing and Yandex; ignored by Google).
- `Sitemap` directives help crawlers discover your sitemap. Use full URLs.
- robots.txt is advisory — well-behaved bots follow it, but it does not enforce access control. Use authentication for truly private content.
- Everything runs in your browser — no data is sent over the network.
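As a quick reference, here is a minimal file that exercises each directive above; the domain and paths are placeholders, not output from the generator.

```
# Rules for every crawler
User-agent: *
Disallow: /drafts/
# A more specific Allow overrides the Disallow above
Allow: /drafts/public/
# Honored by Bing and Yandex; Google ignores it
Crawl-delay: 10

# Sitemap references must be full URLs and can appear anywhere in the file
Sitemap: https://example.com/sitemap.xml
```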
Tips & Best Practices
Block AI crawlers separately from search engine bots
GPTBot, CCBot, Google-Extended, and anthropic-ai are separate from Googlebot. You can allow search indexing while blocking AI training crawlers. Add specific Disallow rules for each AI bot you want to exclude.
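For instance, a file along these lines keeps normal search crawling open while opting the site out of the AI training crawlers named above; trim or extend the user-agent list to match your policy.

```
# Search engine crawlers: allow everything
User-agent: *
Allow: /

# AI training crawlers: block the whole site
User-agent: GPTBot
User-agent: CCBot
User-agent: Google-Extended
User-agent: anthropic-ai
User-agent: ClaudeBot
Disallow: /
```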
robots.txt is advisory, not enforceable
robots.txt is a gentleman's agreement — well-behaved crawlers respect it, but malicious scrapers ignore it completely. Don't rely on robots.txt for security. Use authentication, rate limiting, and IP blocking for actual access control.
Always include your sitemap URL in robots.txt
Add `Sitemap: https://yourdomain.com/sitemap.xml` at the bottom of robots.txt. This helps search engines discover all your pages faster, even if they're not well-linked internally. It's the single most impactful line in the file.
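For example, with a placeholder domain: the Sitemap directive sits outside any User-agent group and can be repeated if you have more than one sitemap.

```
User-agent: *
Allow: /

# Multiple Sitemap lines are fine; each must be an absolute URL
Sitemap: https://yourdomain.com/sitemap.xml
Sitemap: https://yourdomain.com/blog/sitemap.xml
```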
Don't expose sensitive paths by listing them in Disallow rules
Adding `Disallow: /admin` or `Disallow: /internal-api` to robots.txt tells every attacker exactly where your admin panel and internal APIs live. robots.txt is public. Secure sensitive paths with authentication, not crawl directives.
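A hypothetical sketch of the leak, with the safer alternative noted in comments:

```
# Anti-pattern: this file publicly maps out the sensitive areas of the site
User-agent: *
Disallow: /admin
Disallow: /internal-api

# Safer: leave such paths out of robots.txt, require authentication on them,
# and send an X-Robots-Tag: noindex response header if they must stay out of search.
```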
Frequently Asked Questions
How do I create a robots.txt file for my website?
Should I block AI crawlers in robots.txt?
Does robots.txt block pages from appearing in Google search results?
Related Generate Tools
HTTP Request Builder
Build HTTP requests visually and generate code in cURL, JavaScript, Python, Go, Rust, and PHP — lightweight Postman/ReqBin alternative
HTML Table Generator
Build HTML tables visually with an interactive editor — add rows, columns, header rows, captions, and styling. Export as plain HTML, inline CSS, or Tailwind classes
CSS Clip-path Generator
Create CSS clip-path shapes visually — circle, ellipse, inset, or polygon with draggable points. 13 shape presets, interactive preview, and production-ready CSS output
CSS Filter Generator
Build CSS filter effects visually — blur, brightness, contrast, grayscale, hue-rotate, invert, opacity, saturate, sepia, and drop-shadow with 12 presets and live preview