Free Robots.txt Tester
Paste your robots.txt, choose a user-agent, enter a path or full URL, and instantly see whether the crawler is allowed or blocked.
How the result is decided
- The tester finds the User-agent group that most specifically matches the chosen crawler (falling back to the * group if there is no closer match).
- It compares Allow and Disallow rules against the tested path.
- The longest matching rule wins.
- If match lengths are equal, Allow takes precedence.
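The precedence rules above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation; the helper names (`rule_matches`, `is_allowed`) and the rule-list format are assumptions made for the example:

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Test a robots.txt pattern against a path.
    '*' matches any sequence of characters; '$' anchors the end of the path."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    # Robots patterns always match from the start of the path.
    return re.match(regex, path) is not None

def is_allowed(rules, path: str) -> bool:
    """rules: (directive, pattern) pairs from the matched User-agent group,
    e.g. [("disallow", "/private/"), ("allow", "/private/help")].
    Longest matching pattern wins; on a length tie, Allow beats Disallow;
    if nothing matches, the path is allowed by default."""
    best_len = -1
    allowed = True  # default when no rule matches
    for directive, pattern in rules:
        if pattern and rule_matches(pattern, path):
            if len(pattern) > best_len or (len(pattern) == best_len and directive == "allow"):
                best_len = len(pattern)
                allowed = (directive == "allow")
    return allowed

# "/private/help/page": Allow (13 chars) outscores Disallow (9 chars) -> allowed
print(is_allowed([("disallow", "/private/"), ("allow", "/private/help")],
                 "/private/help/page"))
```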
Best practices
Use robots.txt to guide crawlers, not to protect sensitive data. Private URLs should still be secured at the server or application layer. Keep in mind that robots.txt controls crawling, not indexing: a blocked URL can still appear in search results if other pages link to it.
FAQ
What does a robots.txt tester do?
A robots.txt tester checks whether a given crawler is allowed or blocked from crawling a URL path based on your robots.txt rules.
Does this tool upload my robots.txt?
No. The tool runs entirely in your browser: your robots.txt is never uploaded or stored on a server, and no signup is required.
Can I test a full URL instead of only a path?
Yes. You can paste a full URL and the tool will normalize it to the path used for robots rule matching.
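This normalization step can be sketched with Python's standard library. The function name `url_to_robots_path` is a hypothetical label for the example; the assumption shown is that the query string is kept, since robots rules can match against it:

```python
from urllib.parse import urlsplit

def url_to_robots_path(url: str) -> str:
    """Reduce a full URL to the path (plus query) that robots rules match against."""
    parts = urlsplit(url)
    path = parts.path or "/"  # a bare domain normalizes to the root path
    if parts.query:
        path += "?" + parts.query
    return path

print(url_to_robots_path("https://example.com/blog/post?id=7"))  # /blog/post?id=7
print(url_to_robots_path("https://example.com"))                 # /
```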
What happens if no rule matches?
If no Allow or Disallow rule matches the tested path in the selected crawler group, the result is treated as allowed by default.
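As a cross-check, Python's standard-library robots.txt parser shows the same default-allow behavior (note it applies rules in file order rather than by longest match, so only the unmatched-path case is directly comparable):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("Googlebot", "/private/data"))  # False: a Disallow rule matches
print(rp.can_fetch("Googlebot", "/public/page"))   # True: no rule matches -> allowed
```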