Google announces Webmaster Tools update

Google has introduced a new feature to Webmaster Tools that shows webmasters exactly which resources on their websites are blocked from crawling.

Why has this update come about?

When Googlebot crawls a webpage, it tries to render the page as a user would see it, which requires fetching the resources the page references (examples of these resources include linked images, JavaScript and CSS).

If Googlebot cannot read a resource on a page, it is usually because the website's robots.txt file has disallowed crawling of that resource. As a result, Google cannot index the page properly, which can have a negative effect on how the page is ranked and rendered (presented to users).
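As an illustration (the directory paths here are hypothetical, not taken from any real site), a robots.txt file like the following would prevent Googlebot from fetching a site's scripts and stylesheets, and the commented-out alternative shows one way to unblock them:

```text
# Hypothetical robots.txt — these rules block all crawlers,
# including Googlebot, from the script and stylesheet directories:
User-agent: *
Disallow: /js/
Disallow: /css/

# To unblock, delete the Disallow lines above, or replace them
# with explicit Allow rules, e.g.:
# User-agent: *
# Allow: /js/
# Allow: /css/
```

Removing such rules lets Googlebot fetch the JavaScript and CSS it needs to render the page as a user would see it.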

Google has recently warned webmasters that sites which are not optimised for mobile users will not rank well in mobile search results. When Googlebot cannot read resources such as JavaScript and CSS because crawling has been disallowed, it cannot tell whether or not that website is optimised for mobile, and therefore cannot rank it accurately.

The new Blocked Resources Report

Google now produces a Blocked Resources Report for webmasters, which states exactly which resources are blocked. It also provides detailed guidance on how to diagnose these problems and, finally, unblock the resources.

The report shows only the resources that could be under the webmaster's control, as those are the only ones which can be addressed.

On its Webmaster Central Blog, Google advised: "Because it can be time-consuming (usually not for technical reasons!) to update all robots.txt files, we recommend starting with the resources that make the most important visual difference when blocked."

For more news and blogs, follow us on Twitter. To learn more about SEO, attend one of our free training seminars.