How to use
Use this robots.txt checker to test a single URL/path against your rules.
- Paste your robots.txt content into the box.
- Enter the user-agent you want to simulate (for example, Googlebot or *).
- Enter a full URL (recommended) or a path like /products?ref=ad.
- Click Check to see ALLOWED/BLOCKED and the winning rule.
FAQ
Is this robots checker really online?
The page is served online, but the checking itself runs entirely in your browser: the tool never fetches or calls any URLs.
Can I paste a full URL instead of a path?
Yes. The tool extracts pathname + query (for example /page?x=1) and matches rules against that.
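
A minimal TypeScript sketch of that normalization, using the standard URL API (the function name is hypothetical, not the tool's actual code): absolute URLs are reduced to pathname + query, and bare paths pass through with a leading slash.

```ts
// Hypothetical sketch: reduce user input to the part that robots.txt
// rules are matched against (pathname + query string).
function toMatchTarget(input: string): string {
  try {
    const url = new URL(input);        // succeeds only for absolute URLs
    return url.pathname + url.search;  // e.g. "/page?x=1"
  } catch {
    // Not an absolute URL: treat it as a path, ensuring a leading slash.
    return input.startsWith("/") ? input : "/" + input;
  }
}

console.log(toMatchTarget("https://example.com/page?x=1")); // "/page?x=1"
console.log(toMatchTarget("/products?ref=ad"));             // "/products?ref=ad"
```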
How does it choose which User-agent group applies?
It picks the matching group with the most specific user-agent token (longest match). If tied, the first one in the file wins.
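
As a sketch of that selection rule (assumed types and names, not the tool's actual source): compare the simulated agent against each group's token, keep the longest match, and break ties by file order.

```ts
type Group = { agent: string };

// Hypothetical sketch: pick the group whose User-agent token is the most
// specific (longest) match for the simulated agent. "*" matches anything
// with the lowest specificity; on a tie, the earlier group in the file
// wins because the comparison below is strictly greater-than.
function pickGroup(groups: Group[], userAgent: string): Group | null {
  const ua = userAgent.toLowerCase();
  let best: Group | null = null;
  let bestLen = -1;
  for (const g of groups) { // groups are assumed to be in file order
    const token = g.agent.toLowerCase();
    if (token !== "*" && !ua.includes(token)) continue;
    const len = token === "*" ? 0 : token.length;
    if (len > bestLen) {
      best = g;
      bestLen = len;
    }
  }
  return best;
}

console.log(pickGroup(
  [{ agent: "*" }, { agent: "Googlebot" }, { agent: "Googlebot-Image" }],
  "Googlebot-Image/1.0",
));
// -> { agent: "Googlebot-Image" } (the longest matching token)
```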
If both Allow and Disallow match, which one wins?
The most specific pattern wins (longest path pattern). If specificity ties, Allow wins.
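
A sketch of that precedence logic under the same assumptions (hypothetical names, not the tool's source): scan all matching rules, keep the longest pattern, and let Allow win length ties.

```ts
type Rule = { type: "allow" | "disallow"; pattern: string };
type Matcher = (pattern: string, path: string) => boolean;

// Hypothetical sketch: among the rules whose pattern matches the path,
// the longest pattern wins; on equal length, Allow beats Disallow.
// No matching rule at all means the URL is allowed by default.
function decide(rules: Rule[], path: string, matches: Matcher): "ALLOWED" | "BLOCKED" {
  let winner: Rule | null = null;
  for (const r of rules) {
    if (!matches(r.pattern, path)) continue;
    const longer = winner === null || r.pattern.length > winner.pattern.length;
    const tieButAllow =
      winner !== null &&
      r.pattern.length === winner.pattern.length &&
      r.type === "allow";
    if (longer || tieButAllow) winner = r;
  }
  return winner === null || winner.type === "allow" ? "ALLOWED" : "BLOCKED";
}

// Usage with a simple prefix matcher (wildcard handling is sketched in
// the next answer and can be passed in the same way):
const startsWith: Matcher = (pattern, path) => path.startsWith(pattern);
console.log(decide(
  [{ type: "disallow", pattern: "/shop" }, { type: "allow", pattern: "/shop/sale" }],
  "/shop/sale/item",
  startsWith,
)); // -> "ALLOWED" (the Allow pattern is longer)
```

Passing the matcher in as a parameter keeps the precedence logic separate from pattern syntax, which is why the sketch stays so small.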
Does it support wildcards (*) and the $ end anchor?
Yes: * matches any sequence and a trailing $ anchors the end of the URL/path.
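
One way to implement that behavior is to compile each pattern to a regular expression, as in this sketch (hypothetical helper, not the tool's actual code): * becomes .*, a trailing $ becomes a regex end anchor, and all other characters are escaped literally.

```ts
// Hypothetical sketch: compile a robots.txt path pattern to a RegExp.
// Matching is anchored at the start of the path; "*" matches any
// sequence of characters; a trailing "$" anchors the end.
function patternToRegExp(pattern: string): RegExp {
  const anchored = pattern.endsWith("$");
  const body = (anchored ? pattern.slice(0, -1) : pattern)
    .split("*")
    .map((part) => part.replace(/[.+?^${}()|[\]\\]/g, "\\$&")) // escape regex chars
    .join(".*");
  return new RegExp("^" + body + (anchored ? "$" : ""));
}

console.log(patternToRegExp("/*.php$").test("/index.php"));      // true
console.log(patternToRegExp("/*.php$").test("/index.php?x=1"));  // false ($ anchors the end)
console.log(patternToRegExp("/private*").test("/private/docs")); // true
```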
What does an empty Disallow mean?
Disallow: with an empty value means nothing is blocked for that group (effectively allow all).
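
A tiny sketch of one way to get that behavior (hypothetical parser, assumed here rather than taken from the tool): an empty Disallow value is simply dropped while collecting rules, so it can never block anything.

```ts
// Hypothetical sketch: collect a group's Disallow patterns, skipping
// empty values so "Disallow:" with no value blocks nothing.
function parseDisallows(lines: string[]): string[] {
  const patterns: string[] = [];
  for (const line of lines) {
    const m = line.match(/^disallow\s*:\s*(.*)$/i);
    if (!m) continue;
    const value = m[1].trim();
    if (value === "") continue; // empty Disallow contributes no rule
    patterns.push(value);
  }
  return patterns;
}

console.log(parseDisallows(["Disallow:", "Disallow: /private"]));
// -> ["/private"] (the empty Disallow added nothing)
```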
Does robots.txt block indexing?
robots.txt rules control crawling, not indexing. A blocked URL can still appear in search results if it is discovered through links elsewhere.