Indexing Win #5: Submitted URL Blocked by robots.txt (How to Fix It)
Most robots.txt blocks are accidental. Here’s how to diagnose them and safely unblock the right URLs.
“Submitted URL blocked by robots.txt” means you told Google about a URL (usually via your sitemap), but a rule in robots.txt prevents Googlebot from crawling it. The two signals conflict: the sitemap says “index this page” while robots.txt says “don’t fetch it.”
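To see the conflict concretely, here’s a minimal sketch using Python’s built-in urllib.robotparser; the robots.txt rules and the /blog/ path are hypothetical stand-ins for your own site.

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks an entire section.
rules = """\
User-agent: *
Disallow: /blog/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# This URL is submitted in the sitemap, but the rule above blocks it,
# which is exactly the state the error describes.
print(rp.can_fetch("Googlebot", "https://example.com/blog/my-post"))  # False
```

Note that urllib.robotparser does simple prefix matching and doesn’t implement Google’s * and $ wildcard extensions, so treat it as a first-pass check rather than a definitive verdict.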
Fast diagnosis
- Check robots.txt rules (Disallow patterns) and which user-agent they target.
- Confirm the URL is intended to be indexable (some paths, such as admin or checkout pages, should stay blocked).
- Make sure your sitemap isn’t listing blocked URLs; the audit sketch after this list automates that check.
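The same idea scales to a quick audit: fetch the live robots.txt and sitemap, then flag every submitted URL that Googlebot can’t crawl. A sketch, assuming a single sitemap.xml at the site root (example.com is a placeholder):

```python
from urllib import robotparser, request
from xml.etree import ElementTree

SITE = "https://example.com"  # placeholder: your own origin

# Load the live robots.txt rules.
rp = robotparser.RobotFileParser(f"{SITE}/robots.txt")
rp.read()

# Pull every <loc> entry out of the sitemap.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with request.urlopen(f"{SITE}/sitemap.xml") as resp:
    tree = ElementTree.parse(resp)

# Flag submitted URLs that the rules block.
for loc in tree.findall(".//sm:loc", ns):
    url = loc.text.strip()
    if not rp.can_fetch("Googlebot", url):
        print("submitted but blocked:", url)
```

This sketch doesn’t descend into sitemap index files; if your sitemap.xml is an index, run it once per child sitemap.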
Fix checklist
- Remove blocked URLs from your sitemap if they shouldn’t be crawled.
- Edit robots.txt to allow crawling if the pages should be indexed (see the validation sketch after this list).
- Validate with Search Console’s robots.txt report (or an equivalent robots testing tool).
- Re-submit your sitemap in Search Console after the cleanup.
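When the fix is a robots.txt edit, sanity-check the new rules before deploying them. Below is a sketch (the paths are hypothetical) that keeps a drafts folder blocked while opening the rest of the section, then verifies both outcomes:

```python
from urllib import robotparser

# Proposed replacement rules: keep /blog/drafts/ blocked, open the rest.
# The paths are hypothetical; substitute your own sections.
fixed = """\
User-agent: *
Disallow: /blog/drafts/
Allow: /blog/
"""

rp = robotparser.RobotFileParser()
rp.parse(fixed.splitlines())

for url, expected in [
    ("https://example.com/blog/my-post", True),      # should be crawlable
    ("https://example.com/blog/drafts/wip", False),  # should stay blocked
]:
    allowed = rp.can_fetch("Googlebot", url)
    status = "OK" if allowed == expected else "CHECK RULES"
    print(f"{url}: {'crawlable' if allowed else 'blocked'} [{status}]")
```

One caveat: Python’s parser applies rules in file order, while Google picks the most specific (longest) matching rule, so list more specific rules first to keep both interpretations in agreement.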
Don’t forget internal links
Even after unblocking, Google prioritizes crawling based on importance signals. Strong internal links help the right pages get discovered and crawled sooner.
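As a quick spot check, you can count how many links on a hub page actually point into the newly unblocked section, using only the standard library. A sketch; the hub and section URLs are placeholders:

```python
from html.parser import HTMLParser
from urllib import request
from urllib.parse import urljoin

HUB = "https://example.com/"           # placeholder: a strong hub page
SECTION = "https://example.com/blog/"  # placeholder: the unblocked section

class LinkCollector(HTMLParser):
    """Collects resolved href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(HUB, value))

with request.urlopen(HUB) as resp:
    html = resp.read().decode("utf-8", errors="replace")

collector = LinkCollector()
collector.feed(html)
hits = [link for link in collector.links if link.startswith(SECTION)]
print(f"{len(hits)} links from {HUB} into {SECTION}")
```

If the count is zero on your most-linked pages, that’s the gap to close first.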
Ship internal links at scale
Linkbot helps you deploy relevant internal links so newly unblocked sections get crawled and indexed faster.