How to use
Tip: This tool is offline. It does not fetch your live robots.txt—paste it in.
- Paste your robots.txt into the box.
- Choose a Google crawler (default: Googlebot).
- Enter a full URL (recommended) or just a path like /products?ref=ad.
- Click Check Google Access to see ALLOWED/BLOCKED and the winning rule.
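For example, you might paste a small (hypothetical) file like this and then test whether /private/report.html is blocked for Googlebot:

```
User-agent: Googlebot
Disallow: /private/
Allow: /private/public.html
```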
FAQ
Does this tool fetch my live robots.txt from Google?
No. It runs offline and only analyzes the robots.txt text you paste in.
What user-agent should I use for Google?
Start with Googlebot. If you're troubleshooting images or ads, test Googlebot-Image or AdsBot-Google too.
How does it decide between Allow and Disallow?
It applies the longest-match rule: the matching pattern with the most characters wins. If an Allow and a Disallow match with equal length, Allow wins.
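As a rough sketch (assuming rules are already parsed into (directive, pattern) pairs, and ignoring wildcards for simplicity), the resolution logic looks like this; resolve is a hypothetical helper, not this tool's actual implementation:

```python
def resolve(path: str, rules: list[tuple[str, str]]) -> str:
    """Return "Allow" or "Disallow" for path, given (directive, pattern) rules.

    The longest matching pattern wins; on a tie, Allow beats Disallow.
    """
    best_directive, best_pattern = "Allow", ""  # no matching rule => allowed
    for directive, pattern in rules:
        if path.startswith(pattern):  # plain prefix match (no * or $ here)
            longer = len(pattern) > len(best_pattern)
            tie_allow = len(pattern) == len(best_pattern) and directive == "Allow"
            if longer or tie_allow:
                best_directive, best_pattern = directive, pattern
    return best_directive

# "/products/" (10 chars) beats "/products" (9 chars), so the URL is allowed:
print(resolve("/products/shoes", [("Disallow", "/products"), ("Allow", "/products/")]))  # Allow
```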
Do wildcards (*) and $ end anchors work?
Yes. * matches any string and $ anchors the end of the URL path/query being tested.
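One common way to implement that behavior (a sketch, not necessarily this tool's code) is to translate each pattern into a regular expression:

```python
import re

def pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt pattern into a compiled regex.

    * matches any string (including the empty string); a trailing $
    anchors the end of the tested path + query. Illustrative only.
    """
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    if regex.endswith(r"\$"):        # re.escape turned a trailing $ into \$
        regex = regex[:-2] + "$"     # restore it as a real end anchor
    return re.compile(regex)

pat = pattern_to_regex("/*.pdf$")
print(bool(pat.match("/docs/guide.pdf")))      # True
print(bool(pat.match("/docs/guide.pdf?x=1")))  # False: $ anchors the end
```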
If there is no matching User-agent group, what happens?
If neither the crawler's name nor the wildcard (*) group matches, there are no applicable rules for that crawler, and the URL is treated as allowed.
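Group selection can be sketched like this (assuming the file has been parsed into a mapping of user-agent token to rules; the names here are hypothetical):

```python
def select_rules(crawler: str, groups: dict[str, list]) -> list:
    """Pick the rule group for a crawler.

    The most specific matching token wins (e.g. "googlebot-image" over
    "googlebot"); otherwise fall back to the "*" group; if neither
    exists, return no rules, which means everything is allowed.
    """
    name = crawler.lower()
    matches = [t for t in groups if t != "*" and t.lower() in name]
    if matches:
        return groups[max(matches, key=len)]
    return groups.get("*", [])

groups = {"googlebot": ["G"], "googlebot-image": ["GI"], "*": ["ANY"]}
print(select_rules("Googlebot-Image", groups))  # ['GI']
print(select_rules("Bingbot", groups))          # ['ANY']
```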
Does robots.txt remove pages from Google (noindex)?
No. robots.txt controls crawling. For removal/indexing control, use meta robots, HTTP headers, or removals in Search Console.
Should I include query parameters in the test URL?
If your rules target parameters (or you want to be safe), include them. This checker matches against the path plus the query string.