How To Use Fetch As Google In GSC To Submit An Updated Page To Google’s Index [Tutorial]

Glenn Gabe

google, seo, tools

[Image: Fetch as Google and Submit to Index in GSC]

{Updated on April 18, 2016 to cover the latest changes in Google Search Console (GSC).}

Any marketer focused on SEO will tell you that it’s sometimes frustrating to wait for Googlebot to recrawl an updated page.  The reason is simple.  Until Googlebot recrawls the page, the old content will show up in the search results.  For example, imagine someone added content that shouldn’t be on a page, and that new content was already indexed by Google.  Since it’s an important page, you don’t want to take the entire page down.  In a situation like this, you would typically update the page, resubmit your xml sitemap, and hope Googlebot stops by soon.  As you can guess, that doesn’t make anyone involved with the update very happy.

For some companies, Googlebot is visiting their website multiple times per day.  But for others, it could take much longer to get recrawled.  So, how can you make sure that a recently updated page gets into Google’s index as quickly as possible?  Well, Google has you covered.  There’s a tool called Fetch as Google that can be accessed within Google Search Console (GSC) that you can use for this purpose.  Let’s explore Fetch as Google in greater detail below.

Fetch as Google and Submit to Index
If you aren’t using Google Search Console (GSC), you should be.  It’s an incredible resource offered by Google that enables webmasters to receive data directly from Google about their verified websites.  Google Search Console also includes a number of valuable tools for diagnosing website issues.  One of the tools is called Fetch as Google.

The primary purpose of Fetch as Google is to submit a url and test how Google crawls and renders the page.  This can help you diagnose issues with the url at hand.  For example, is Googlebot seeing the right content?  Is the correct header response code being returned?  You can also use Fetch and Render to see how Googlebot actually renders the content (the way a typical browser would).  This is extremely important, especially for understanding how Google handles your content on mobile devices.
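
If you want a quick sanity check outside of GSC, you can approximate a crawler-style fetch yourself.  The snippet below is only a rough sketch (the url and user-agent string are placeholders, and it doesn’t replace Fetch as Google itself), but it’s a fast way to confirm the header response code and body your server returns to a Googlebot-like request:

    import requests

    # Placeholder url and user-agent string; Fetch as Google remains the authoritative test.
    url = "http://www.example.com/updated-page/"
    headers = {
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    }

    response = requests.get(url, headers=headers, allow_redirects=False)

    print(response.status_code)                   # expect 200 for a normal, indexable page
    print(response.headers.get("X-Robots-Tag"))   # watch for an accidental noindex header
    print(len(response.text))                     # an unusually small body can hint at blocking or cloaking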

But those aren’t the only uses for Fetch as Google.  Google also provides functionality for submitting that url to its index, right from the tool itself.  You can submit up to 500 urls per month via Fetch as Google, which should be sufficient for most websites.  This can be a great solution for times when you’ve updated a webpage and want that page refreshed in Google’s index as quickly as possible.  In addition, Google provides an option for submitting the url and its direct links to the index.  This enables you to have the page at hand submitted to the index, along with the other pages that url links to.  You can only do this up to 10 times per month, so make sure you really need it before you use it!

Let’s go through the process of using Fetch as Google to submit a recently updated page to Google’s index.  I’ll walk you step by step through the process below.

How to Use Fetch as Google to Submit a Recently Updated Page to Google’s Index

1. Access Google Search Console and Find “Fetch as Google”
You need a verified website in Google Search Console in order to use Fetch as Google.  Sign into Google Search Console, select the website you want to work on, and expand “Crawl” in the left-side navigation.  Then click the link for “Fetch as Google”.

[Screenshot: Fetch as Google in the Crawl section of GSC]

2. Enter the URL to Fetch
You will see a text field that begins with your domain name.  This is where you add the url of the page you want submitted to Google’s index.  Note that you only enter the path after your domain name (for example, for http://www.example.com/blog/updated-post/ you would enter blog/updated-post/).  Enter the url and leave the default fetch type as “Desktop”, which will use Google’s standard web crawler (versus one of its mobile crawlers).  Then click “Fetch”.

[Screenshot: Submitting a URL via Fetch as Google in GSC]

3.  Submit to Index
Once you click Fetch, Google will fetch the page and display the results below.  At this point, you can view the status of the fetch and click through that status to learn more.  But you’ll notice another option next to the status field that says, “Submit to index”.  Clicking that link brings up a dialog box asking if you want just the url submitted, or the url and its direct links.  Select the option you want and then click “Go”.  Note: you will also have to complete the captcha confirming you are human.  Google added that in late 2015 based on automated abuse it was seeing from some webmasters.

A Successful Fetch:
[Screenshot: A successful fetch in GSC]

The Submit to Index Dialog Box:
[Screenshot: The Submit to Index dialog box in GSC]

4. Submit to Index Complete
Once you click “Go”, Google will present a message that your url has been submitted to the index.

[Screenshot: Successful Submit to Index via GSC]

That’s it!  You just successfully submitted an updated page to Google’s index.

Note: this doesn’t mean the page will instantly be updated in the index.  It can take a little time, but I’ve seen it happen pretty quickly (sometimes in just a few hours).  The update might not happen as quickly for every website, but again, it should be quicker than waiting for Googlebot to recrawl your site.  I would bank on a day or two before you see the new page in Google’s cache (and the updated content reflected in the search results).
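
One easy way to check whether the refresh has gone live is to look at Google’s cached copy of the page.  For example (using example.com as a placeholder):

    cache:www.example.com/blog/updated-post/
    site:www.example.com/blog/updated-post/

The cache: query shows the version (and crawl date) Google currently has stored, while the site: query simply confirms the url is in the index at all.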

Expedite Updates Using Fetch as Google
Let’s face it, nobody likes waiting. And that’s especially the case when you have updated content that you want indexed by Google!  If you have a page that’s been recently updated, then I recommend using Fetch as Google to make sure the page gets updated as quickly as possible.  It’s easy to use, fast, and can also be used to submit all linked urls from the page at hand.  Go ahead, try it out today.

GG


32 thoughts on “How To Use Fetch As Google In GSC To Submit An Updated Page To Google’s Index [Tutorial]”

  1. Great post, as this is a method I have used to update content/sites in Google’s index a number of times. That being said, I learned the hard way that there is one instance where this cannot help you.  A couple months back I was migrating my website to a new domain and initially I had a number of pages disallowed via robots.txt file. When I finally got everything situated on the new domain I removed the disallow and tried to fetch as Googlebot.  Unfortunately it appears that Google uses the last version of your robots.txt file it crawled for the “Fetch as Googlebot” feature, so if the pages were disallowed during the last crawl it will not be able to fetch the page.  In my case I had to wait until the next crawl to run it, but by then Google was already seeing the new page.  Not a big deal for minor changes, but it sucks when you are trying to get migrated content indexed.

    • Thanks for your comment Mike, and that’s a great point.  I’m going to look into the robots.txt issue.  I’m wondering if you can use Fetch as Googlebot on robots.txt.  I’ll post an update if I find out anything on that front.  Thanks again.
      GG

  2. Hi, thanks for your step-by-step instructions. I have actually used Fetch earlier and found that it did not really update. However, I tried it again today. Also, I found that in my Webmaster Tools it shows just as ‘Fetch’ under ‘Health’ and not ‘Diagnostics.’ Has the interface changed since you wrote this post?

    • Hi Anu. I’m glad you found my post helpful. You’re right, the interface was updated after I wrote this post. You can now find “Fetch as Google” under “Health” versus under “Diagnostics”.

      GG

  3. Nice post! I have a doubt: in the final step there is a message that says “URL submitted to index”, and for me that message has been there for more than 10 days. How can I tell whether my site has been indexed or not?
    Waiting for your reply ASAP…
    Thank you

      • Yes, it has not been indexed. It is an old website. Is there any other way to get indexed by Google soon, other than “Fetch as Google” in Webmaster Tools? I have also submitted a sitemap.

  4. Good instructions! But how long after it shows “Success” will it take Google to update my site? Because after 24 hours, I googled my site and it still shows my old description and page title.

  5. Googlebot blocked my website’s post URLs… And when I open my site from Google’s cache, it shows my oldest post, not updated even after a week.

    Can anyone help me? :(

    • Zeeshan, remove the allow directive. It’s not necessary. Also, make sure you are using the right format for your xml sitemap. I haven’t checked your specific file out, but make sure you are pointing to a valid xml sitemap file. How does your sitemap reporting look in Google Webmaster Tools?

      Regarding the blocked URLs, you are only blocking one directory, so those are probably the files being blocked. If your posts are not in that directory, then they shouldn’t be blocked. (A cleaned-up example robots.txt is sketched below.)

      You can test this out in Google Webmaster Tools. When you test your posts, what results are you getting?
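
      Something along these lines, where the directory name and sitemap location are placeholders rather than your actual setup:

          User-agent: *
          Disallow: /private-directory/

          Sitemap: http://www.example.com/sitemap.xml

      There’s no need for an Allow directive here; anything that isn’t disallowed is crawlable by default.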

  6. What does the “Fetches remaining” number mean? It’s also showing around 500 for me in my Webmaster Tools.

    • You can submit up to 500 urls per week, or 10 per week for URL + all pages linked from that URL. That’s the total you are seeing at the top (your remaining fetches). I hope that helps.

  7. Hi Glenn, I know this post is a couple years old now but I have a question that I haven’t been able to find an answer to elsewhere about using the Fetch tool. Do you recommend using it just for pages that have been changed, or would you also recommend it to speed up Google finding pages that have a 301 redirect in place? The “status” when a page with a 301 is submitted shows “Redirected” but it does also have the “submit to index” option.

    • Hi Valerie. I wouldn’t worry about doing that for 301 redirects. Just let Google find the 301 during the normal crawling process. Is there a specific reason you would want Google to find the 301 faster than usual?

      • Well, just trying to recover from Panda as quickly as possible! The changes I have to do are very slow going – but seem to be working (thankfully!) I submit updated pages using the Fetch tool, but the “bad” pages that I am adding a 301 redirect to are only very slowly spidered (for some it’s been over 6 months since Googlebot last stopped by). We’re talking a substantial number of pages that have the 301 that I assume won’t help the site recover more until Google’s re-evaluated them (and probably spidered multiple times).

        • Valerie, definitely a topic that should be covered elsewhere, but I wouldn’t 301 low quality pages. I would noindex or 404 them. Why are you 301 redirecting the low quality pages? Again, definitely for another post! :)
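
            For what it’s worth, the noindex option is a single tag in the <head> of each low quality page (shown here generically):

                <meta name="robots" content="noindex">

            Google will drop those pages from its index the next time it recrawls them.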

  8. Webmaster Tools is showing me mobile errors in the Mobile Usability report. When I check the live versions, it says the page is 100% user friendly.

    I don’t think robots.txt is blocking anything, because when I “Fetch as Google” using the smartphone option, it loads completely.

    What gives? Why doesn’t the Usability report match the live tools?

    • I’ve seen this a few times. If the mobile friendly tool is saying you’re ok, then it’s probably ok. But it’s hard for me to tell without seeing the actual urls. Are you using responsive design, dynamic serving, or separate mobile urls? I’ve seen the problem more with separate mobile urls.

  9. Is there any way to remove an item from Google’s index that has been submitted via this process?

    • Hi Ken. Why would you want to remove a url that you just tried to have indexed?

      If you updated the url (content-wise), you can use this process to ensure Google has the latest version indexed. If you want a specific url removed for some reason, then you can 410 the url. Google will drop the url from its index after crawling it again. And if you really need it removed quickly, then use the url removal tool in GSC. That’s under “Google Index -> Remove URLs”. I hope that helps.
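
      If you go the 410 route and you’re on Apache, it can be a one-line rule in .htaccess (placeholder path below):

          Redirect gone /old-page/

      That tells Googlebot the page is gone for good, and Google will drop the url from its index after recrawling it.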

  10. Suppose you want Google to recrawl your whole site, because it seems to be taking days or weeks for new pages to be picked up and for old, removed pages to drop out of Google’s search results. Can the sitemap be submitted via this process (Crawl > Fetch as Google > Submit to Index > URL and all linked pages), or will this mess everything up?

    • Great question. You could use an html sitemap that links to each page and then submit that via Fetch as Google (and crawl all linked urls); a bare-bones example is sketched below. I would also make sure your xml sitemaps are updated and resubmit them in GSC.

      Regarding crawling all links in an xml sitemap, I’m not sure if that would work well in GSC. It theoretically should work, but I haven’t used it that way. You can definitely try it and see how it works. I hope that helps.
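
      If you do try the html sitemap route, it really can be a bare page of links, something like this (placeholder urls):

          <!DOCTYPE html>
          <html>
            <head><title>HTML Sitemap</title></head>
            <body>
              <ul>
                <li><a href="http://www.example.com/page-one/">Page One</a></li>
                <li><a href="http://www.example.com/page-two/">Page Two</a></li>
                <li><a href="http://www.example.com/page-three/">Page Three</a></li>
              </ul>
            </body>
          </html>

      Submit that page via Fetch as Google and choose the “URL and its direct links” option so the linked pages get crawled along with it.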
