Check if Googlebot can crawl a path
Paste your robots.txt, enter a user agent such as Googlebot, and test a path such as /tools/json-formatter.
The tester reports whether that path is allowed or blocked for the chosen user agent.
Allowed
The most specific matching rule is Allow.
Allow: /private/preview/
Disallow: /private/
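To make the verdict above concrete, here is a minimal sketch of longest-match resolution under the same rules. The function and variable names are illustrative, not the tool's actual code, and real matching also handles * and $ wildcards, which this sketch ignores.

```python
def is_allowed(path, rules):
    """Return True if the most specific (longest) matching rule allows path."""
    best_allow = max((len(p) for d, p in rules
                      if d.lower() == "allow" and path.startswith(p)), default=-1)
    best_block = max((len(p) for d, p in rules
                      if d.lower() == "disallow" and path.startswith(p)), default=-1)
    # No matching rule means the path is allowed; on a tie in length, Allow wins.
    return best_allow >= best_block

rules = [("Allow", "/private/preview/"), ("Disallow", "/private/")]
print(is_allowed("/private/preview/draft", rules))  # True: the Allow rule is longer
print(is_allowed("/private/notes", rules))          # False: only Disallow matches
```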
A robots.txt file tells crawlers which paths they may request. Testing a path against the rules can surface broad blocks, bot-specific rules, and exceptions before search crawlers encounter them.
This tester applies the most specific (longest) matching rule as a practical model. Real search engines add implementation details, such as * and $ wildcard handling, that go beyond this lightweight checker.
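For a quick cross-check outside the browser, Python's standard-library urllib.robotparser can run the same kind of test. Note that (at least in CPython) it resolves overlapping rules in file order rather than by specificity, so its verdict can differ from a longest-match model on rules like the example above; the rules below are illustrative.

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /private/preview/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# can_fetch(user_agent, url) accepts a bare path as the URL.
print(rp.can_fetch("Googlebot", "/tools/json-formatter"))  # True: no rule matches
print(rp.can_fetch("Googlebot", "/private/notes"))         # False: Disallow matches
```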
Compare Allow and Disallow rules to see which matching directive controls a URL path.
Robots.txt controls crawling, not indexing directly. A blocked URL can still appear in search if it is discovered elsewhere.
Search engines generally use the most specific matching rule. This tester follows that practical rule of thumb.
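One edge case worth knowing: when an Allow and a Disallow rule match a path with equal specificity, Google documents that the less restrictive rule (Allow) wins. A small self-contained illustration of that tie-break, with hypothetical paths:

```python
path = "/docs/page"
rules = [("Disallow", "/docs/"), ("Allow", "/docs/")]

# Both rules match with the same length, so specificity alone cannot decide.
best = {"allow": -1, "disallow": -1}
for directive, rule_path in rules:
    if path.startswith(rule_path):
        key = directive.lower()
        best[key] = max(best[key], len(rule_path))

print(best["allow"] >= best["disallow"])  # True: the tie resolves to Allow
```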
Yes, you can test other bots: change the user agent field and run the same path against another bot name.
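When comparing bots, keep in mind that a crawler obeys only the single most specific User-agent group that matches its name, falling back to the * group; rules from other groups are ignored. A rough sketch of that selection, with hypothetical group contents:

```python
def select_group(bot, groups):
    """Pick the rule group whose User-agent token best matches the bot name."""
    bot = bot.lower()
    # "googlebot-image" matches the tokens "googlebot-image" and "googlebot";
    # the longest matching token wins, and "*" is only a fallback.
    matches = [ua for ua in groups if ua != "*" and bot.startswith(ua.lower())]
    if matches:
        return groups[max(matches, key=len)]
    return groups.get("*", [])

groups = {
    "googlebot": [("Disallow", "/tools/")],
    "*": [("Disallow", "/private/")],
}
print(select_group("Googlebot", groups))  # [('Disallow', '/tools/')]
print(select_group("Bingbot", groups))    # [('Disallow', '/private/')]
```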