What Crawlability Means In Plain English
What it means
Crawlability is simply whether Google's bots can visit and read your website. If Google can't crawl your pages, they won't be indexed — and if they're not indexed, they won't appear in search results.
How Google finds your pages
Google sends automated crawlers (Googlebot) to visit websites and follow links from page to page. The crawler reads each page's content and sends it back to Google's systems to be processed and added to the search index.
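The "follow links from page to page" step comes down to extracting every link on a page and queueing it for a later visit. A minimal sketch of that discovery step, using only Python's standard library (the sample HTML is made up for illustration):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags -- how a crawler discovers new pages."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page content; a real crawler would fetch this over HTTP.
page = '<html><body><a href="/about">About</a> <a href="/contact">Contact</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about', '/contact']
```

A real crawler repeats this in a loop: fetch a page, extract its links, add any unseen ones to the queue. This is also why pages with no internal links pointing at them ("orphaned" pages) often go undiscovered.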
What blocks crawlability
- Robots.txt errors: A misconfigured robots.txt file can accidentally block Google from your entire site.
- Noindex tags: A meta noindex tag tells Google not to index a page — sometimes these end up on pages that should be indexed.
- Broken internal links: Links leading to 404 errors can't be followed.
- Slow server response: If your server responds too slowly, crawlers may time out and crawl fewer of your pages.
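The robots.txt case is the easiest to demonstrate: a single `Disallow: /` rule under `User-agent: *` blocks every crawler from every page. A quick way to verify this yourself, using Python's standard library (the domain here is a hypothetical example):

```python
from urllib.robotparser import RobotFileParser

# A single misplaced "Disallow: /" blocks all crawlers from the whole site.
blocking_rules = """
User-agent: *
Disallow: /
""".strip()

rp = RobotFileParser()
rp.parse(blocking_rules.splitlines())

# Googlebot is refused access to any page on the site.
print(rp.can_fetch("Googlebot", "https://example.com/pricing"))  # False
```

Running the same check against your live robots.txt (via `rp.set_url(...)` and `rp.read()`) is a quick sanity test for any page you expect to rank.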
How we check it
Our technical audit includes a full crawl identifying any blocked, orphaned, or error pages. We then prioritise fixes by commercial importance.