Tests specific URL + user-agent combinations against robots.txt rules.
Test whether a specific URL is allowed or blocked for a given user-agent according to a robots.txt file.
The Robots.txt Tester lets you paste any robots.txt content and test whether specific URLs are allowed or disallowed for a given user-agent (e.g. Googlebot, Bingbot, or *). The tool parses the robots.txt directives — including Allow, Disallow, wildcard patterns, and Crawl-delay — and evaluates whether the target URL matches any rule for the specified agent, following the Google robots.txt specification for precedence and pattern matching. This is an essential tool for SEO audits, diagnosing crawl blocks, verifying sitemap inclusion, and ensuring that important pages are not accidentally blocked from search engine spiders.
Tests specific URL + user-agent combinations against robots.txt rules instantly.
Shows which specific rule matched and why the URL is allowed or blocked.
Handles Allow/Disallow precedence and wildcard (* and $) patterns correctly per RFC 9309.
No server query needed; works with any robots.txt content.
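The longest-match precedence described above can be sketched in a few lines of Python. This is a simplified illustration, not the tool's actual implementation; the helper names are hypothetical, and a full parser would also handle URL-encoding, empty Disallow lines, and user-agent grouping:

```python
import re

def pattern_to_regex(pattern):
    # Translate a robots.txt path pattern into a regex:
    # '*' matches any sequence of characters, '$' anchors the end.
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.compile(regex)

def is_allowed(rules, path):
    """rules: list of (directive, pattern), e.g. ("Disallow", "/admin/").
    Per RFC 9309, the longest matching pattern wins; on a tie, Allow wins."""
    best = None  # (pattern_length, is_allow)
    for directive, pattern in rules:
        if not pattern:
            continue  # an empty Disallow blocks nothing
        if pattern_to_regex(pattern).match(path):
            candidate = (len(pattern), directive == "Allow")
            if best is None or candidate > best:
                best = candidate
    if best is None:
        return True  # no rule matched: crawling is allowed by default
    return best[1]
```

For instance, `is_allowed([("Disallow", "/admin/"), ("Allow", "/")], "/blog/post-1")` returns True because only `Allow: /` matches, while `/admin/settings` is blocked because the 7-character Disallow pattern outranks the 1-character Allow.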
Input:
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
Sitemap: https://example.com/sitemap.xml

Test URL: /blog/post-1
User-agent: Googlebot

Output:
Result: ALLOWED ✓
Matched rule: Allow: / (line 4)
Reason: No specific Disallow matches /blog/post-1; explicit Allow grants access.
Input:
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: *
Allow: /

Test URL: /article/example
User-agent: GPTBot

Output:
Result: BLOCKED ✗
Matched rule: Disallow: / (line 2)
Reason: GPTBot has its own group with Disallow: /, which blocks all paths.
Input:
User-agent: *
Disallow: /*.pdf$
Allow: /

Test URL: /files/report.pdf
User-agent: *

Output:
Result: BLOCKED ✗
Matched rule: Disallow: /*.pdf$ (line 2)
Reason: URL ends in .pdf and matches the wildcard pattern.
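The GPTBot example above turns on group selection: a crawler with its own user-agent group ignores the * group entirely. A minimal sketch of that selection step (function name and substring-matching heuristic are illustrative assumptions; RFC 9309 matches on the crawler's product token, case-insensitively):

```python
def select_group(groups, agent):
    """groups: dict mapping user-agent token -> list of rules.
    Pick the group whose token matches the crawler (longest match wins),
    falling back to the '*' group when no specific token matches."""
    agent = agent.lower()
    best = None
    for token in groups:
        if token == "*":
            continue
        # Case-insensitive substring match against the crawler name.
        if token.lower() in agent and (best is None or len(token) > len(best)):
            best = token
    if best is not None:
        return groups[best]
    return groups.get("*", [])
```

With the groups from the example, `select_group(groups, "GPTBot")` returns the GPTBot rules (so `Disallow: /` applies), while a crawler like Googlebot, matching no specific token, falls back to the * group's `Allow: /`.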