Validate your robots.txt file for syntax errors, test if URLs are blocked for specific user-agents, and identify common issues that could affect your SEO.
Test if a specific URL path would be blocked for a user-agent based on the robots.txt rules above.
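The same kind of check can be sketched with Python's standard-library robots.txt parser. The example below is illustrative only; the robots.txt content, user-agent names, and URLs are assumptions, not output from this tool.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content used purely for illustration.
robots_txt = """\
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch() returns True when the user-agent may crawl the given URL.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False: matches its own group
print(parser.can_fetch("Googlebot", "https://example.com/admin/"))        # True: the Googlebot group overrides *
print(parser.can_fetch("SomeOtherBot", "https://example.com/admin/"))     # False: falls back to the * group
```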
A malformed robots.txt file can have serious consequences for your website's visibility in search engines. Even small syntax errors can cause crawlers to misinterpret your rules or ignore them entirely.
Every robots.txt file must have at least one User-agent directive. Rules without a preceding User-agent line are invalid.
Directives must follow the format "Directive: value", with a colon (and typically a space) separating the name from its value. Misspelled directive names are treated as unknown directives and ignored by crawlers.
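For reference, a well-formed group places correctly spelled directives under a User-agent line. The snippet below is only an illustration; the paths are placeholders.

```
# Each group starts with a User-agent line; the rules that follow apply to it.
User-agent: *
Disallow: /admin/
Allow: /admin/public/
```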
Using "Disallow: /" for all user-agents blocks search engines from your entire site. Make sure this is intentional.
Sitemap URLs must be absolute URLs (starting with http:// or https://). Relative paths won't work.
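For example (the domain and sitemap path below are placeholders):

```
# Correct: an absolute URL
Sitemap: https://www.example.com/sitemap.xml

# Incorrect: a relative path, which crawlers will ignore
# Sitemap: /sitemap.xml
```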
An empty Disallow: directive (with nothing after the colon) means "disallow nothing," effectively allowing access to all URLs. This is often used intentionally to grant full access to specific user-agents while blocking others.
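A common pattern pairs an empty Disallow for one crawler with a blanket block for everyone else. The user-agent name below is just an example.

```
# Block all crawlers from the entire site...
User-agent: *
Disallow: /

# ...except this one, which may crawl everything (empty Disallow = no restrictions)
User-agent: Googlebot
Disallow:
```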