FeuTex · Free tools · Runs in-browser · No bloat · Built by LiMiT

Robots Checker Google

Check whether Googlebot can crawl a specific URL based on your robots.txt. Paste your robots.txt content, choose a Google crawler, enter a URL (or path), and get an ALLOWED/BLOCKED verdict with the exact rule that matched.

Category: SEO · URL: /tools/robots-checker-google.html
Supports * wildcards and $ end anchors. Uses longest-match; ties go to Allow.
Privacy: runs locally in your browser. No uploads, no tracking scripts.

How to use

Tip: This tool works offline. It does not fetch your live robots.txt; paste the file's contents into the box.

  1. Paste your robots.txt into the box.
  2. Choose a Google crawler (default: Googlebot).
  3. Enter a full URL (recommended) or just a path like /products?ref=ad.
  4. Click Check Google Access to see ALLOWED/BLOCKED and the winning rule.
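
For example, pasting this made-up robots.txt and testing the path /products?ref=ad as Googlebot should come back BLOCKED:

    User-agent: Googlebot
    Allow: /products/
    Disallow: /*?ref=

The Disallow pattern /*?ref= matches the query string, while Allow: /products/ does not match /products?ref=ad at all; testing /products/widget instead would come back ALLOWED.
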
Keywords this page targets (natural cluster): robots checker google, google robots checker, googlebot robots.txt tester, test robots.txt for google, is googlebot blocked, robots.txt allow disallow checker, robots.txt wildcard tester, robots.txt $ end of line match, googlebot access checker, robots.txt url checker, verify google crawl permissions, check robots.txt for a page, robots.txt rule precedence allow vs disallow, googlebot-image robots.txt checker, adsbot-google robots.txt checker, mediapartners-google robots.txt tester, robots.txt matching tool, robots.txt group selection user-agent, why google not indexing robots blocked, robots.txt validator offline
Secondary intents covered: confirm whether a specific page is blocked for Googlebot; find which Allow/Disallow rule is actually winning; test wildcard (*) and end-anchor ($) patterns against a URL; check which user-agent group Googlebot will follow; troubleshoot Google indexing issues caused by robots.txt; verify that important sections (e.g., /blog/) are crawlable; audit robots.txt for common mistakes (empty groups, unknown directives); quickly copy a result summary to share with a dev/SEO teammate.

FAQ

Does this tool fetch my live robots.txt from Google?

No. It runs offline and only analyzes the robots.txt text you paste in.

What user-agent should I use for Google?

Start with Googlebot. If you're troubleshooting images or ads, test Googlebot-Image or AdsBot-Google too.
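
As a quick illustration, different Google crawlers can get different verdicts for the same URL because each follows its own User-agent group (a made-up example):

    User-agent: Googlebot
    Disallow:

    User-agent: Googlebot-Image
    Disallow: /assets/

Here /assets/logo.png is ALLOWED for Googlebot (its group has an empty Disallow) but BLOCKED for Googlebot-Image, so it is worth testing each crawler you care about.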

How does it decide between Allow and Disallow?

It applies the longest-match rule: whichever matching pattern is longest wins. If an Allow and a Disallow match at the same length, Allow wins.
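
For readers who want the decision rule spelled out, here is a minimal TypeScript sketch of the logic described on this page (an illustration only, not this tool's source; patternToRegExp and decide are hypothetical helper names, and real parsers also handle group selection and percent-encoding):

    // Hypothetical helpers illustrating "longest match wins; ties go to Allow".
    type Rule = { type: "allow" | "disallow"; pattern: string };

    function patternToRegExp(pattern: string): RegExp {
      // A trailing $ pins the match to the end of the tested path+query.
      const anchored = pattern.endsWith("$");
      const body = anchored ? pattern.slice(0, -1) : pattern;
      // Escape regex metacharacters, then expand the robots.txt * wildcard.
      const escaped = body
        .replace(/[.+?^${}()|[\]\\]/g, "\\$&")
        .replace(/\*/g, ".*");
      // Rules are prefix matches unless $ anchors the end.
      return new RegExp("^" + escaped + (anchored ? "$" : ""));
    }

    function decide(rules: Rule[], pathAndQuery: string): "ALLOWED" | "BLOCKED" {
      let winner: Rule | null = null;
      for (const rule of rules) {
        if (!patternToRegExp(rule.pattern).test(pathAndQuery)) continue;
        const longer =
          winner === null || rule.pattern.length > winner.pattern.length;
        const tieGoesToAllow =
          winner !== null &&
          rule.pattern.length === winner.pattern.length &&
          rule.type === "allow";
        if (longer || tieGoesToAllow) winner = rule;
      }
      // No matching rule in the selected group: crawling is allowed by default.
      return winner === null || winner.type === "allow" ? "ALLOWED" : "BLOCKED";
    }

    // The longer Allow pattern beats the shorter Disallow:
    decide(
      [
        { type: "disallow", pattern: "/blog/" },
        { type: "allow", pattern: "/blog/public/" },
      ],
      "/blog/public/post.html"
    ); // -> "ALLOWED"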

Do wildcards (*) and $ end anchors work?

Yes. * matches any string and $ anchors the end of the URL path/query being tested.
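
A common use of the end anchor is blocking a file type without catching URLs that merely contain the extension somewhere in the middle (an illustrative rule, not a recommendation):

    User-agent: Googlebot
    Disallow: /*.pdf$

Tested against /files/report.pdf the verdict is BLOCKED, while /files/report.pdf?download=1 is ALLOWED, because $ requires the tested path-plus-query to end in .pdf.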

If there is no matching User-agent group, what happens?

The URL is treated as allowed because there are no applicable rules for that crawler.
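
For instance, with a robots.txt that only addresses another crawler (a made-up example):

    User-agent: Bingbot
    Disallow: /

Checking any URL as Googlebot returns ALLOWED: there is no Googlebot group and no User-agent: * group, so no rules apply and crawling defaults to allowed.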

Does robots.txt remove pages from Google (noindex)?

No. robots.txt only controls crawling. To control indexing or remove a page, use a meta robots tag, the X-Robots-Tag HTTP header, or the Removals tool in Search Console.
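
For reference, the standard indexing controls look like this (generic snippets; adjust to your setup). Keep in mind that Google has to be able to crawl a page to see either signal, so a noindex will not be honored on a URL that robots.txt blocks:

    HTML:  <meta name="robots" content="noindex">
    HTTP:  X-Robots-Tag: noindex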

Should I include query parameters in the test URL?

If your rules target parameters (or you want to be safe), include them. This checker matches against the path plus the query string.
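
For example, under a parameter-targeting rule such as this one (illustrative only):

    User-agent: Googlebot
    Disallow: /*?sessionid=

/cart?sessionid=123 comes back BLOCKED while /cart comes back ALLOWED, which is exactly the difference that dropping the query string from your test would hide.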