
Online Robots.txt Generator

Create a valid robots.txt file in seconds. Add user-agent groups, allow/disallow paths, crawl-delay, optional host, and sitemap URLs—then copy or download the result.

Category: SEO · URL: /tools/online-robots-txt-generator.html
Quick builder (optional)
Use the quick-builder controls to insert common lines; you can still edit the main rules input below.
Rules input
Supported directives: User-agent, Disallow, Allow, Crawl-delay, Host, Sitemap. Lines starting with # are treated as comments and ignored.
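
For example, a rules input like the one below is accepted as-is (the path and the example.com sitemap URL are placeholders):

  # Keep staging out of crawls
  User-agent: *
  Disallow: /staging/
  Sitemap: https://example.com/sitemap.xml
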
robots.txt output
Output is normalized and compact. Sitemap lines are deduplicated and placed at the bottom.
Privacy: runs locally in your browser. No uploads, no tracking scripts.

How to use

Use the input box to describe your rules, then generate your final robots.txt.

  1. Start each group with one or more User-agent: lines.
  2. Add rules like Disallow:, Allow:, and Crawl-delay: under that group.
  3. Add Sitemap: (and optional Host:) anywhere—this tool places them at the bottom and deduplicates sitemap lines.
  4. Click Generate, then copy or download the output; a sample result is shown below.
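
For example, a typical generated file with two user-agent groups and one sitemap looks like this, with the Sitemap line placed at the bottom (the paths and the example.com URL are placeholders):

  User-agent: *
  Disallow: /admin/
  Allow: /admin/help/
  Crawl-delay: 10

  User-agent: Googlebot
  Disallow:

  Sitemap: https://example.com/sitemap.xml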

FAQ

What does robots.txt do?

It tells crawlers which paths they may or may not fetch on your site; it doesn’t enforce security.
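
For example, the group below asks every crawler to skip a folder, but anyone with a direct link can still open those pages (the path is a placeholder):

  User-agent: *
  Disallow: /private/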

How do I allow everything?

Use User-agent: * and an empty Disallow: line.
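
The resulting allow-all file looks like this:

  User-agent: *
  Disallow: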

How do I block the whole site from crawling?

Use User-agent: * and Disallow: /.
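
The resulting block-all file, often used on staging or development sites, looks like this:

  User-agent: *
  Disallow: /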

Can I add multiple sitemap URLs?

Yes—add multiple Sitemap: lines; this tool deduplicates identical URLs.
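
For example (both URLs are placeholders):

  Sitemap: https://example.com/sitemap.xml
  Sitemap: https://example.com/sitemap-news.xml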

Do I need a separate group for Googlebot?

Only if you want different rules for it; otherwise the User-agent: * group applies to any crawler that does not have a more specific group of its own.
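
If Googlebot does need different rules, give it its own group and repeat any shared rules, since a crawler follows only its most specific matching group (the paths are placeholders):

  User-agent: Googlebot
  Disallow: /drafts/

  User-agent: *
  Disallow: /drafts/
  Disallow: /internal/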

Where should Sitemap lines go in robots.txt?

Anywhere is valid, but most sites place them at the bottom for readability.

Is the Host directive required?

No. Host: is not part of the official robots.txt standard; it was a Yandex-specific extension and is ignored by many crawlers.
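
If you add it anyway, it is a single line, and this tool places it near the bottom alongside the sitemap lines (the hostname is a placeholder):

  Host: example.com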