Submit Individual URLs To Google For Indexing via Fetch As Googlebot

Google has announced a new way, Fetch As Googlebot, to submit your new and updated URLs for indexing.

So, how does it work? Once you successfully fetch a URL as Googlebot, you are given an option to submit that URL for indexing. If you do, Googlebot crawls the URL, usually within about a day. Once the URL is crawled, it is then considered for inclusion in Google's index. There are limits, however: you may submit up to 50 individual URLs per week, and only 10 per month if you choose to submit a URL together with all of its linked pages. Additionally, for images and videos, Google recommends that you continue using Sitemaps instead.
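For the image and video Sitemaps mentioned above, Google's Sitemaps documentation describes a simple HTTP "ping" endpoint for telling Google that a Sitemap has been updated. Here is a minimal Python sketch that builds such a ping URL; the domain and Sitemap filename are hypothetical, and actually sending the request (e.g. with `urllib.request.urlopen`) is left to the reader:

```python
from urllib.parse import urlencode

def google_sitemap_ping_url(sitemap_url):
    """Build the ping URL that notifies Google of a new or updated Sitemap.

    The endpoint is documented by Google; urlencode() takes care of
    percent-encoding the Sitemap URL so it survives as a query parameter.
    """
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

# Hypothetical image Sitemap on an example domain:
ping = google_sitemap_ping_url("http://www.example.com/sitemap-images.xml")
print(ping)
# → http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap-images.xml
```

Pinging a Sitemap scales to thousands of URLs at once, which is why it remains the recommended route for media content rather than the 50-per-week individual submissions.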

Also note that while you might assume that following this procedure will get your URL indexed, Google offers a warning: "we don't guarantee that every URL submitted in this way will be indexed; we'll still use our regular processes—the same ones we use on URLs discovered in any other way—to evaluate whether a URL belongs in our index." Here, the regular processes refer to discovering pages through links, RSS feeds, XML Sitemaps, submission forms, and so on. In short, Fetch as Googlebot is simply one more way for webmasters to request that Google crawl specific web pages.

How do you submit a URL?

To submit a URL, go to Diagnostics in Webmaster Tools, which will take you to Fetch As Googlebot, and fetch the URL you wish to submit. When the URL is fetched successfully, you'll see a new "Submit to index" link appear next to the fetched URL.

Clicking the "Submit to index" link opens a dialog box that lets you choose between submitting only the single URL, or the URL and all its linked pages.

In tandem with this new update, Google has also updated its "Add your URL to Google" form, which is now renamed the "Crawl URL" form. While it has the same URL-submission quota as Fetch as Googlebot, its distinguishing feature is that you do not need to verify ownership of the site, allowing you to submit any URLs you wish to have crawled and indexed.

So when do you use it? Google's advice is: "don't feel obligated to use this tool for every change or update on your site. But if you've got a URL whose crawling or indexing you want to speed up, consider submitting it using the Crawl URL form or the updated Fetch as Googlebot feature in Webmaster Tools."

Check out Page Traffic Buzz for more articles by Navneet Kaushal

By Navneet Kaushal

Nav is the founder and CEO of Page Traffic, a premier search engine company known for its assured SEO service, web design and development, copywriting and full-time SEO professionals. Navneet has wide experience in natural search engine optimization, internet marketing and PPC campaigns. He is a prolific writer, and his articles can be found in the "Best Articles" section of many websites and article banks. As a search engine analyst, he has over 9 years of experience, which he brings to bear here.