Google has introduced a new feature in its Webmaster Tools that lets you tell Google about any URLs or pages you have added to your site, or any refreshed or changed pages that need a fresh look.
Fetch as Googlebot essentially lets you ask Google to send Googlebot (its crawling program) to retrieve a new or updated URL and consider it for indexing. This is especially useful if you do not want to wait the usual time it takes for the engine to find the links via more natural discovery.
We see it being quite useful when webmasters make the mistake of publishing incorrect information, or perhaps something they really didn't want the world to see: requesting that Googlebot fetch the URL in question can help update the cached copy of the page more quickly than waiting for a natural re-crawl.
To use this feature, log into your Webmaster Tools account and head over to the Diagnostics section. Click on Fetch As Googlebot and enter the URL you want fetched. If everything goes to plan and the URL is successfully fetched, a new "Submit to index" link will appear next to the fetched URL; click it and off you go.
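As a rough local approximation of what the tool does on Google's side, you can request a page yourself while sending Googlebot's publicly documented user-agent string, to preview roughly what the crawler would receive. This is only a sketch using Python's standard library; the `fetch_as_googlebot` helper name is our own, and the real Fetch as Googlebot runs from Google's servers, not your machine.

```python
import urllib.request

# Googlebot's publicly documented desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_as_googlebot(url):
    """Fetch a URL while identifying as Googlebot (illustrative helper).

    Returns the HTTP status code and the raw response body, which lets
    you spot-check whether the server serves the same content to
    crawlers as it does to browsers.
    """
    request = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(request) as response:
        return response.getcode(), response.read()

# Usage (requires network access):
# status, body = fetch_as_googlebot("http://example.com/new-page.html")
```

Note that this only tells you what your server returns to a Googlebot-like request; it does not trigger any crawling or indexing on Google's side.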
It is important to realise that using this feature doesn't guarantee quicker indexing, or indeed any indexing at all. It is, however, great that we can now quickly identify key pages that might not be in the index, or pages that are out of date, and actually act on them.
For further detail about SEO Bristol and Web Design Bristol, please visit our website.