Robots.txt Testing and Analysis Tool

Analyze, edit, and test robots.txt files to ensure search engines access the right pages and to optimize crawlability.

Configuration and Analysis
Enter the website URL to download the robots.txt file (e.g. https://example.com)
URL to test against the robots.txt rules

Features

Automatic Download and Analysis
Fetches robots.txt from the website and analyzes it in detail
Test Access Rights
Simulate crawlers to check whether specific URLs are blocked
Syntax Validation
Detect syntax errors and warnings in robots.txt
Detailed Stats
Show comprehensive statistics about the directives found

Quick Guide

1 Enter website URL or robots.txt content
2 Select User-Agent and test URL (optional)
3 Click 'Analyze' to run the test
4 View results and optimization suggestions
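The access test in step 3 can be sketched with Python's standard `urllib.robotparser` module. The robots.txt content, user-agent, and URLs below are illustrative examples, not values built into the tool:

```python
# Minimal sketch of a robots.txt access test using the standard library.
from urllib.robotparser import RobotFileParser

# Example robots.txt content (could also be fetched from a live site).
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /public/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Simulate Googlebot requesting two URLs.
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page"))  # True
```

`can_fetch()` returns `True` when the given user-agent is allowed to crawl the URL under the parsed rules, which is the same check the "Test Access Rights" feature performs.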

Supported Bots

Googlebot
Bingbot
Yahoo! Slurp
DuckDuckBot
Facebook Bot
Twitter Bot

Use Cases

Block Folder
Prevent bots from accessing sensitive folders
Disallow: /admin/
Allow Access
Explicitly allow bots to access specific folders
Allow: /public/
Declare Sitemap
Point bots to the website's sitemap (the Sitemap directive must be an absolute URL)
Sitemap: https://example.com/sitemap.xml
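Combining the use cases above, a minimal robots.txt might look like this (the domain and paths are illustrative):

```
User-agent: *
Disallow: /admin/
Allow: /public/

Sitemap: https://example.com/sitemap.xml
```

Directives apply per user-agent group; `User-agent: *` covers all crawlers, while the Sitemap line is independent of any group.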