Using Fetch As Googlebot to isolate site crawling errors

Google has a diagnostic tool called Fetch As Googlebot, located in the Diagnostics panel of Webmaster Tools. It shows you a page exactly as Googlebot sees it.

Like most sites, mine live or die by Google search results. Several days ago, I noticed a dramatic drop in page views, which usually means Google has dropped the site from its search results. A Google search for key terms confirmed the site was nowhere to be found. A quick look in Google’s Webmaster Tools showed an increasing number of crawl errors with a 403 status code, meaning Googlebot was being denied access to the pages it tried to crawl. Oddly, the same pages displayed normally when visited in a browser.

To isolate errors like these, plug the URL of the page in question into Fetch As Googlebot and it will report back any errors it encounters.
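You can also approximate Googlebot’s request yourself to reproduce the problem outside of Webmaster Tools. Below is a minimal sketch using Python’s standard library; the URL is a placeholder for a page on your own site, and the User-Agent string is Googlebot’s published identifier.

    import urllib.error
    import urllib.request

    # Placeholder URL: substitute a page from your own site.
    url = "http://www.example.com/some-page.html"

    # Googlebot's published User-Agent string.
    headers = {
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)",
    }

    request = urllib.request.Request(url, headers=headers)
    try:
        with urllib.request.urlopen(request) as response:
            print("Status:", response.getcode())
    except urllib.error.HTTPError as e:
        # A 403 here, when a normal browser gets a 200, mirrors
        # the crawl errors reported in Webmaster Tools.
        print("HTTP error:", e.code)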

In my case, I was able to trace the problem to an overzealous spam tool used to reject bad bots. Deactivating that component solved the problem immediately. Once the correction is made, you can use Fetch As Googlebot to resubmit the site for crawling: submit the site’s home page and click Fetch to have Googlebot re-crawl your site.
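For illustration only (the post doesn’t name the tool, so this is a hypothetical reconstruction), here is how such a misconfiguration typically happens: a bot filter that blacklists user agents by substring will reject Googlebot right along with the spam bots.

    # Hypothetical sketch of an overzealous bot filter. Matching the
    # substring "bot" catches spam bots, but it also catches Googlebot.
    BLOCKED_PATTERNS = ["bot", "crawler", "spider"]

    def is_blocked(user_agent: str) -> bool:
        ua = user_agent.lower()
        return any(pattern in ua for pattern in BLOCKED_PATTERNS)

    googlebot = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    print(is_blocked(googlebot))  # True -- so the server answers with a 403

A safer filter would exempt known search-engine crawlers (ideally verified via reverse DNS lookup, as Google recommends) before applying any blanket bot-blocking rules.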