The Internet Marketing Driver


Archives for April 2012

Penguin 1.0 Initial Findings – Unnatural Inbound Links Heavily Targeted, Other Webspam Tactics Await Penalty?

April 27, 2012 By Glenn Gabe 34 Comments

Penguin Update 1.0

The past few days have been fascinating for SEOs.  Google’s latest algorithm update, now officially named Penguin, has been rolled out.  The update was originally called the Over Optimization Penalty, then the Webspam Algorithm Update, and now Penguin.  As you can imagine, there have been screams from webmasters far and wide about the update (both from webmasters who should have gotten hit by Penguin, and from some who believe they were wrongly penalized).  False positives are absolutely going to occur with Penguin, and Google knows this.  More about this later in the post.

I’ve already started helping some companies that have been hit by Penguin analyze their websites and prepare for a Post-Penguin world.  I’ve also been monitoring the various webmaster forums to see examples of websites getting hit, to see what they were doing wrong.  Based on my research and analysis so far, I wanted to write a post explaining what I’m seeing and document the common thread across sites that are being penalized.  Note, we are still very early in the game, and Google will undoubtedly be rolling out updates to Penguin over time.  Therefore, this is what I’m seeing now.  Since it’s a fluid situation, I will try and write follow-up posts about future Penguin releases.

Penguin and Exact Match Domains:
Before I get deeper into this post, I wanted to mention my first post about Penguin, which I published a few days ago.  I wrote about the potential impact of the Over Optimization Penalty on Exact Match Domains.  If you have been hit by Penguin, and you are using exact match domains, definitely check out that post.  There are several risks you might want to review.

Inbound Links – The Common Thread During Research and Analysis
Almost every penalized site that I’ve reviewed had issues with inbound links.  Specifically, their link profiles were littered with unnatural, spammy links.  And not all of those links were paid text links, as some people would expect.  I saw a range of issues that could get a site pecked by Penguin.  Sorry, that’s my first Penguin joke.  :)  Below, I’m going to cover several inbound link issues that I’ve seen during my analysis.  That said, I first wanted to mention other spammy tactics and Penguin 1.0.

What About Over Optimization?
I’ve been searching for a site that got hit for spammy title tags, keyword stuffing, doorway pages, etc., but I’m not seeing that as a driving force right now with Penguin.  And believe me, I’ve come across a lot of sites violating Google’s Quality Guidelines over the years…  It doesn’t seem like those factors are getting caught right now, with the key phrase being “right now”.  My hope is that Google will roll out updates to Penguin that also catch those violations.  So, if you are a company that’s keyword stuffing, employing doorway pages, overly optimizing your title tags, etc., now is the time to change…  It wouldn’t shock me to see rolling updates to Penguin that include penalties targeting those violations as well.

Below, I’ll list some of the inbound link issues I’ve seen on websites hit by Penguin.  Again, we are early on, and things can change.  But for now, this is what I’m seeing:

1. Paid Text Links Using Exact Anchor Text
As you can imagine, this one is a clear violation of Google’s guidelines.  During my analysis, it was easy to pick up exact match, paid text links on sites that were rampant with sponsored links.  Many of the sites I analyzed had these types of links.  Similar to what I said in my first post on Penguin, if you want to check your own inbound links, read my post about finding paid text links using Open Site Explorer.

Checking inbound links via Open Site Explorer:
Using Open Site Explorer to check inbound links

2. Comment Spam
When analyzing websites hit by Penguin, I also saw a lot of comment spam.  The most common form was leaving comment signatures with exact match anchor text.  For example, instead of using your name (like you’re supposed to), commenters were using the exact match anchor text for keywords they wanted to rank for.  For Google, this is pretty darn easy to pick up.

3. Guest Posts on Questionable Sites
I’ve also seen many guest posts on questionable sites that included exact match anchor text.  Note, I obviously don’t think all guest posts are bad.  Actually, I think they can be ultra-powerful on the right websites and blogs.  But, the guest posts I’m referring to were on sites set up simply to generate income from those guest posts (based on the links they would drive).  And the posts themselves weren’t strong… They were typically thin with a focus on the anchor text, and not the story.

4. Article Marketing Sites
Similar to the last bullet, I saw a lot of syndicated articles using exact match anchor text leading back to sites that got hammered by Penguin.  So yes, Penguins seem “cold” to article marketing tactics. Sorry, that’s my second Penguin joke. :)  Again, these articles were relatively thin, used several instances of exact match anchor text leading back to the site, etc.

Inbound Link Profiles Heavily Weighted by These Tactics Got Hit
One of the most important findings involved the weighting of inbound links for each site.  For the sites I analyzed, a majority of the inbound links used the tactics listed above.  Actually, for some sites, I couldn’t find any natural links… most were unnatural.  As you can imagine, this is not a strong signal to Google that you’re a typical webmaster looking to gain traffic by earning it.  You look like you’re gaming the system to gain rankings.  And that’s when Penguin steps in and hammers you.  A natural link profile will contain many types of links, including URLs, brand names, image links, etc.  It won’t contain 99% exact match anchor text from article sites, comment spam, etc.
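If you want to gauge this weighting on your own profile, you can export your inbound links (from a tool like Open Site Explorer) and tally the anchor text. Here’s a rough sketch in Python; the “natural” heuristic (brand terms and raw URLs) is my own simplification, not a Google formula:

```python
from collections import Counter

def anchor_distribution(anchors, brand_terms):
    """Tally anchor text from a list of inbound-link anchors and estimate
    what share looks 'natural' (brand mentions or raw URL anchors)."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    # Treat brand-name anchors and raw URLs as "natural" for this rough check
    natural = sum(n for anchor, n in counts.items()
                  if anchor.startswith("http") or any(b in anchor for b in brand_terms))
    return counts, (natural / total) if total else 0.0
```

A profile where the same exact match phrase dominates the counts, with almost no brand or URL anchors, is exactly the pattern I kept seeing on penalized sites.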

Dangerous Sites and Not Just Spammy
There’s another point I wanted to make before ending this post.  While analyzing inbound links across penalized sites, I found several linking websites that were flat out dangerous, and not just spammy.  I found sites flagged for malware, sites launching numerous popups as I hit a page, etc.  So, since we know that Google doesn’t like sending users to dangerous websites, and that it doesn’t like spam, these dangerous sites could have been the kiss of death for downstream destinations.  If you have unnatural links on dangerous sites, then the Penguin outcome probably wasn’t pretty for you.

An example of a website flagged for malware:
Firefox flagging a website for malware

Next Steps for Penguin
Remember, this is just the beginning stages of Penguin and the situation can change quickly.  I’m going to keep analyzing websites that have been penalized, monitoring webmaster forums, and watching Google’s response closely.

Here are some closing points about Penguin that are important to understand:

  • Google said you cannot file a reinclusion request if you’ve been hit by Penguin.  Since the update is algorithmic, and not manual, reinclusion requests will not help you.
  • Google set up a Penguin form to fill out if you believe that you’ve been wrongly hit by the algo update.  If you think you’re a false positive, then fill out the form today.
  • Google also included a link to a form where you can report spam that you think should have been caught by Penguin, but hasn’t.  You can click the button labeled “Report Webspam” on that page.  This is obviously a little subjective, but I’m glad Google is looking to catch more webspam in future releases of Penguin.
  • It’s worth noting that both Panda and Penguin were rolled out within a few days of one another.  Yesterday, Matt Cutts explained that the latest version of the Panda Update rolled out on 4/19, and then Penguin rolled out on 4/24.  You should check your reporting to make sure you know the date you were hit.  You don’t want to go down the wrong path when making changes to your website…  i.e. mistaking Panda penalties for Penguin penalties. <- and by the way, how ridiculous does that sound? :)
  • If you’ve been hit by Penguin, take a hard look at your site, your inbound link profile, etc., and plan to make changes.  I know this is going to be a painful time for you and your website, but don’t just sit there. Analyze and take action.

Until the Next Version of Penguin rolls out…
That’s what I have for now.  I hope this post helped shed some light on the latest Google algorithm update.  If you have any questions, or need assistance, don’t hesitate to contact me.  Good luck.

GG

 

Filed Under: algorithm-updates, google, seo

Google’s Over-Optimization Penalty and Exact Match Domains

April 25, 2012 By Glenn Gabe 17 Comments

The Webspam Algorithm Update and Exact Match Domains

Yesterday, we found out that the latest Google algorithm update was underway (dubbed the Over Optimization Penalty).  Google posted about the update on the Webmaster Central Blog.  By the way, Google is officially calling the update the Webspam Algorithm Update and not the Over Optimization Update (although I like the sound of the latter).  There has been a lot of speculation about what the update would look like, which spammy tactics would get hit, etc.  I’ve been keeping a close eye on the various webmaster forums, and there are already many reports of sites getting nuked.  As Google explained in its blog post yesterday, it was targeting webspam tactics that were being used to game the system.  That could mean spammy inbound links, keyword stuffing, doorway pages, etc.

The goal of the latest algorithm update is to level the playing field rankings-wise.  Google realizes that there are many sites that have great content, but simply can’t compete against other sites that have been overly optimized.  And when I say “overly optimized”, I’m referring to using spammy tactics to game the system.  As I’ve always said, those tactics might work in the short-term, but the long term impact could be devastating.  And those sites are seeing the negative effect now.

What Does Over Optimization Mean?
There’s a lot of speculation about what over optimization actually is, and what can get you penalized by this latest algo update.  For example, keyword stuffing, too many inbound links with rich anchor text, overly optimized title tags, footers filled with optimized content, etc.  Basically, any spammy tactic that companies have used to game the system…

To be clear, these aren’t tactics that a typical webmaster would use.  Based on the screams from webmasters that have been hit during recent testing, and now as the algo gets rolled out, spammy inbound links seem to be causing a lot of problems.  That said, I’m sure we are going to see many examples of different tactics getting penalized too.  By the way, if you are interested in checking your own inbound links, then check out my recent post about finding spammy links using Open Site Explorer.  It will only take you a few minutes…

The Exact Match Domain Threshold
One tactic that I think hasn’t received as much attention during this update is the use of exact match domains.  Mike Wilton mentioned this in his post about Over Optimization, but most people have been focusing on inbound links, keyword stuffing, on-page optimization, etc.  But anyone in SEO will tell you that exact match domains have been a tactic that has been abused over the years.  It involves someone registering a domain name that exactly matches the keyword they want to rank for.  Unfortunately, the engines heavily weight keyword-rich domains in the SERPs.  As you can imagine, that flaw has led to an abuse of the system.

For example, imagine you sold widgets in Princeton, NJ.  You might register www.princetonwidgets.com or www.widgetsprinceton.com.  You get the picture, and my guess is that you have seen many exact match domains rank well as you search Google.  On the one hand, if you legitimately have an exact match domain, and you use that domain as your core website, then I get it (and that’s fine to me).  Also, if you happen to have a brand name or company name that matches a highly searched keyword, I get that too.

But the abuse has come from business owners (and heavily local business owners) who simply want to dominate a certain category by using exact match domains.  And that’s where I think it crosses the line.  In addition, some companies use their core domain for their website, but register a bunch of exact match domains that simply link to their core domain.  As you can see, that’s not a “normal” way to set up websites for a company (or how to build inbound links for the core domain).

Will Exact Match Domains Get Hammered by the Webspam Algorithm Update?
It’s hard to say to what degree, but I know some will get penalized (actually, I see some are getting penalized right now).  I obviously don’t think all exact match domains will get nuked, since that’s way too extreme, and would include the legitimate use of exact match domains (as I covered earlier).  So, there might be a threshold that Google uses while determining which exact match domains to penalize.

An Example of Webmasters Reporting a Drop in Rankings as Algorithm Update Rolls Out: 

Webmasters report drop in rankings based on Google algorithm update

Below, I’ll list some thoughts about what that threshold could look like.  And I hope this goes without saying, but we’ll all find out over time how extreme this algorithm update was.  We are only on day 1. :)

1. Number of Domains Per Company
As I mentioned earlier, there are some companies that have registered an exact match domain for their business website.  If that’s the sole use of the domain, and you are adding high-quality, valuable content, then there’s no reason for that domain to get hammered.  But, if a company registered 5 different exact match domains in addition to having its company website, then you can start to see how this would violate Google’s guidelines.  The company is simply trying to game Google and rank across multiple sites for target keywords.  This type of setup is at great risk right now (in my opinion).

And by the way, if you think Google doesn’t know that you own all the domains, think again.  It has multiple ways to understand this, including your own Google Analytics account.  :)

2. Cross Linking of Domains Using Rich Anchor Text
Are all the exact match domains linking to either each other, or to another domain you own?  Again, that could easily be perceived as spamming by Google.  Buying a bunch of exact match domains only to cross link them using rich anchor text could definitely get you in trouble.  I’ve come across this a thousand times while analyzing inbound links for companies.  You clearly see several company-owned domains all linking to one another with the exact anchor text that they want to rank for.

3. Doorway Pages
Similar to what I explained above, some companies employ exact match domains (and pages within those domains) solely to help another site rank.  In addition, some companies use EMDs to funnel traffic to a core company website.  Again, typical webmasters aren’t going to do this… They will build high-quality domains with the goal of impressing prospective customers, educating them, and landing new business.  They will build links naturally and not try to buy their way to the top of the rankings.  Google has a clear view of doorway pages, as stated in its Webmaster Guidelines.

4. Thin Content and Panda
With the latest algorithm update, there is a chance that a website could fall victim to a double whammy penalty.  For example, getting hit by both Panda and Over Optimization.  All you have to do is combine what I’ve listed above and you can see how a website might have both thin content and be in use solely to help another site rank well.  If that’s the case, then it could get hit by the Panda Update (which targets low quality content), and by OOP (which targets webspam).  Good luck recovering from that perfect storm…

Summary – We’re only in the beginning stages of OOP
As crazy as this sounds, this is an extremely exciting time for anyone involved in SEO.  You get to watch a major algorithm update get pushed out, analyze the sites that get penalized, view collateral damage, try and better understand what Google’s objective was with the update, etc.  In this post, I’ve tried to outline how the latest update could impact exact match domains.  Unfortunately, nobody will know the exact impact for weeks (or longer).  I plan to write more about the Webspam Algorithm Update in future posts, so keep an eye on my RSS feed.

And, if you have been hit by the latest update, feel free to reach out to me.  Although many spammy websites will get penalized, there is always collateral damage.

GG

 

Filed Under: algorithm-updates, google, seo

How to Manage Negative Keywords in Microsoft adCenter Desktop Using the New Negatives Tab [Tutorial]

April 14, 2012 By Glenn Gabe 1 Comment

The New Negatives Tab in adCenter Desktop

Microsoft adCenter has been rolling out new features and updates at a rapid pace recently.  Several of the changes aim to make your life a lot easier, and bring AdWords-like functionality to adCenter.  That’s great news, because when you’re managing a lot of campaigns, ad groups, and keywords, you don’t want to have to deal with compatibility issues when attempting to sync accounts across both AdWords and adCenter.

One of the latest adCenter changes impacts how negative keywords are handled within adCenter Desktop, a desktop application that enables you to refine your accounts locally and then sync with adCenter when needed.  It’s the counterpart to AdWords Editor, Google’s equivalent tool.  Both are essential tools for Search Engine Marketers.  In this post, I’m going to explain what the change is, and walk you step by step through how to manage negatives in the new version of adCenter Desktop.

A Change to How Negatives are Managed in adCenter Desktop
Negative keywords are extremely important in Paid Search.  They enable you to stop your ads from showing when certain words are present in a user’s query.  In adCenter, negatives have gotten a serious facelift recently.  First, you can now add exact match keywords, which wasn’t possible until recently.   AdWords included this functionality for a long time, so it was great to see adCenter implement exact match negatives.

In addition, campaign-level and ad group-level negatives now work together, versus ad group negatives overriding campaign-level negatives as they did before.  Most marketers didn’t even know this was happening in the past, and it absolutely was affecting the performance of their campaigns.  Now that negatives work differently (and better), adCenter Desktop has also been updated to better match AdWords Editor.  This brings a familiar way to manage negatives across AdWords Editor and adCenter Desktop.  The only problem is that the location for managing negatives has moved!

Therefore, I’m going to walk you through the new Negatives Tab in adCenter Desktop.  Once you find it, know how it works, and see how to manage negative keywords in the new version of adCenter Desktop, I think you’ll dig it.  Let’s jump in.

The New Negatives Tab in adCenter Desktop
1. Make Sure You Are Using the Latest Version of adCenter Desktop
First, if you haven’t downloaded adCenter Desktop yet, you can find it here.  If you already have it installed, then you will be prompted to upgrade to the latest version (which has the new Negatives Tab).  If you already upgraded to the latest version, then you’re good to go.

2. Locate the New Negatives Tab
After launching adCenter Desktop, you can locate the new Negatives Tab by accessing a campaign and/or ad group in your account.  Let’s check out an ad group for this example.  Once you click an ad group, make sure the Keywords Tab is active by clicking the tab.  You will then see the Negatives Tab in the upper left hand corner of the Campaign Manager.

The Keywords Tab in adCenter Desktop

3. Manage Negative Keywords
By default, the Positives Tab will be displayed.  Positives are the keywords you want to run, while Negatives are words that you’ll add to stop your ads from showing when a user’s query contains those words.  Go ahead and click the Negatives Tab.

Find the Negatives Tab in adCenter Desktop

4. Add Campaign-Level or Ad Group-Level Keywords
If you don’t have any negatives included in the ad group at hand, the screen will be blank.  At this point, you can either add a single negative keyword or add negatives in bulk.  If you want to add a single negative keyword, click the dropdown arrow next to the “Create Negative Keyword” button.  adCenter Desktop will display two options: one for creating a campaign-level negative keyword and one for creating an ad group-level negative keyword.

Add a Negative Keyword in adCenter Desktop

5. Enter a Negative Keyword and Select Match Type
Choose one of the options and then enter the keyword in adCenter Desktop.  At the bottom of the screen, you can enter or refine the keyword and then choose a match type.  You can select either “phrase” or “exact” match for your negative keywords.

Select a Match Type for a Negative Keyword in adCenter Desktop

6. Add in Bulk
If you want to add several negatives at one time, you can click the dropdown arrow next to the “Multiple Changes” button.  You will see options for adding or deleting multiple campaign-level or ad group-level negatives.

Adding Multiple Negative Keywords in adCenter Desktop

7. Enter Negatives in Bulk
Choose one of the options and then make sure the right campaign and ad group are selected.  If you selected this option from an ad group, then that ad group will be selected by default.  Enter your negative keywords by adding one per line.  To add exact match negatives at this point, enclose each keyword in brackets.  For example, [sample keyword] would add an exact match negative.  If you enter a keyword without brackets, it will be added as a phrase match negative. Then click the “Import” button to add your negative keywords.  After importing your keywords, you can change the match type, if needed.  Again, you can select either “phrase” or “exact” match for your negative keywords.

How to Add Multiple Negative Keywords in adCenter Desktop

You’re done!  Now you can sync your account and the new negatives will be uploaded to your campaigns and ad groups.
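If you build your bulk list outside the tool first, the bracket convention from step 7 is easy to pre-process. Here’s a small Python sketch (my own helper, not part of adCenter Desktop) that splits a pasted list into keywords and match types:

```python
def parse_negatives(text):
    """Parse one-keyword-per-line input using the adCenter bulk convention:
    [brackets] = exact match negative, no brackets = phrase match negative."""
    parsed = []
    for line in text.splitlines():
        kw = line.strip()
        if not kw:
            continue  # skip blank lines
        if kw.startswith("[") and kw.endswith("]"):
            parsed.append((kw[1:-1].strip(), "exact"))
        else:
            parsed.append((kw, "phrase"))
    return parsed
```

Running your list through a check like this before importing can catch stray brackets or blank lines, so every keyword lands with the match type you intended.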

Summary – Use the New Negatives Tab in adCenter Desktop
As I explained earlier, negatives are critically important in Paid Search.  I highly recommend you develop a plan for using negatives at both the campaign and ad group level. This will enable you to hone your targeting by weeding out untargeted queries.  I love the latest adCenter changes, including the new Negatives Tab in adCenter Desktop.  I recommend downloading the latest version of adCenter Desktop, and then get familiar with how to manage negatives in the new version.

Good luck, and remember, negatives are not negative. :)

GG

Related Post:
If you are managing campaigns in Microsoft adCenter, you might be interested in my post about using Param1 and Param2:
How To Use Param1 and Param2 in AdCenter to Customize Your Paid Search Ads

 

Filed Under: adcenter, bing-ads, sem, tools Tagged With: adcenter

How To Target by Zip Code in Google AdWords

April 6, 2012 By Glenn Gabe Leave a Comment

Location Targeting by Zip Code in Google AdWords

Google AdWords is a powerful platform for reaching targeted users on the web.  Google has done an incredible job with building a robust system for launching targeted ads on both the Search Network and the Display Network.  This enables you to reach prospective customers while they are searching on Google.com, or when they are visiting websites across the web.

Functionality-wise, Google keeps adding new features and options to its platform, which enables advertisers to precisely target the people they want to reach.  One of these features is location targeting.  Using location targeting, marketers can create specific campaigns targeting people in a certain location.  For example, you can target at the country level, the state level, the city level, you can use radius targeting, etc.  It’s a must-have piece of functionality for any local business.  And with 20% of all queries on Google related to location, you can imagine how important location targeting can be for reaching a targeted audience.

Zip Code Targeting in AdWords
Well, now there’s another level of location targeting you can add to your campaigns, which was released yesterday.  Marketers can now target their campaigns at the zip code level, which provides even more precise location targeting.  There are 30,000 zip codes available in the AdWords system, and each campaign can target up to 1,000 of them.  Below, I’m going to cover how this works within the AdWords interface (UI).

Similar to other location targeting methods, you will access zip code targeting via the Settings Tab within your campaign.  Then you can begin to enter zip codes, and then add or exclude them for the campaign at hand.

Here’s how to add zip code targeting to your campaigns:

1. Access the Settings Tab for a Campaign
Campaign Settings Tab in AdWords

2. Add Location Targeting
Under the Locations and Languages section, click “Edit” under “Locations”.  This brings up a text field where you can start to enter locations to target.
Location Targeting Settings in AdWords

3. Enter Zip Codes
In the text field, you can begin to enter zip codes.  As you begin typing zip codes, the list will auto-populate with zip codes that match what you are entering.  You will then see options for “Add”, “Exclude” or “Nearby” next to specific zip codes.  If you click “Nearby”, then a map will appear showing the location that matches the zip code selected.  Excluding a zip code does just that.  Users in that zip code will not see your ads.
Zip Code Targeting in Google AdWords

4. Add Zip Codes and Save Your Work
As mentioned earlier, you can add up to 1,000 zip codes per campaign.  Once you have the zip codes selected and entered, make sure you click “Save” to add the zip codes to your campaign’s location targeting.
Save Location Targeting Settings in AdWords

You’re done!  Your ads will now target users in the zip codes you selected.
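Since each campaign caps out at 1,000 zip codes, a larger target list has to be split across multiple campaigns. The batching itself is simple bookkeeping; here’s an illustrative Python sketch (not an AdWords API call):

```python
def batch_zip_codes(zip_codes, per_campaign=1000):
    """Split a zip code list into campaign-sized batches.
    AdWords allows up to 1,000 zip code targets per campaign."""
    unique = sorted(set(zip_codes))  # dedupe so no zip is targeted twice
    return [unique[i:i + per_campaign] for i in range(0, len(unique), per_campaign)]
```

For example, a list of 2,500 zip codes would need three campaigns: two full batches of 1,000 and one batch of 500.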

Summary – Targeting Zip Codes in AdWords
If you are a local business, or you want to target users in a specific location, then I think you’ll find zip code targeting extremely valuable.  There are times that you want to get extremely granular with the locations you target, and zip codes can help you achieve this.  You can add and exclude specific zip codes to create an advanced level of targeting for your campaigns.  If you are using location targeting, I highly recommend you take a look at the new zip code targeting option.  It’s available now in your AdWords campaign.

And by the way, I’m already using it. :)

GG

 

Filed Under: adwords, local, sem

How To Use Fetch As Google In GSC To Submit An Updated Page To Google’s Index [Tutorial]

April 5, 2012 By Glenn Gabe 32 Comments

Fetch as Google and Submit To Index in GSC

{Updated on April 18, 2016 to cover the latest changes in Google Search Console (GSC).}

Any marketer focused on SEO will tell you that it’s sometimes frustrating to wait for Googlebot to recrawl an updated page.  The reason is simple.  Until Googlebot recrawls the page, the old content will show up in the search results.  For example, imagine someone added content that shouldn’t be on a page, and that new content was already indexed by Google.  Since it’s an important page, you don’t want to take the entire page down.  In a situation like this, you would typically update the page, resubmit your xml sitemap, and hope Googlebot stops by soon.  As you can guess, that doesn’t make anyone involved with the update very happy.

For some companies, Googlebot is visiting their website multiple times per day.  But for others, it could take much longer to get recrawled.  So, how can you make sure that a recently updated page gets into Google’s index as quickly as possible?  Well, Google has you covered.  There’s a tool called Fetch as Google that can be accessed within Google Search Console (GSC) that you can use for this purpose.  Let’s explore Fetch as Google in greater detail below.

Fetch as Google and Submit to Index
If you aren’t using Google Search Console (GSC), you should be.  It’s an incredible resource offered by Google that enables webmasters to receive data directly from Google about their verified websites.  Google Search Console also includes a number of valuable tools for diagnosing website issues.  One of the tools is called Fetch as Google.

The primary purpose of Fetch as Google is to submit a URL and test how Google crawls and renders the page.  This can help you diagnose issues with the URL at hand.  For example, is Googlebot not seeing the right content?  Is the wrong header response code being returned?  You can also use fetch and render to see how Googlebot actually renders the content (the way a typical browser would).  This is extremely important, especially for understanding how your content is handled on mobile devices.

But, those aren’t the only uses for Fetch as Google.  Google also has functionality for submitting that URL to its index, right from the tool itself.  You can submit up to 500 URLs per month via Fetch as Google, which should be sufficient for most websites.  This can be a great solution for times when you’ve updated a webpage and want that page refreshed in Google’s index as quickly as possible.  In addition, Google provides an option for submitting the URL and its direct links to the index.  This enables you to have the page at hand submitted to the index, along with the other pages it links to.  You can only do this up to 10 times per month, so make sure you really need it before you use it!
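Because the two quotas are so different in size (500 single-URL submissions versus 10 URL-plus-links submissions per month), it can help to track them separately if you submit pages regularly. A tiny sketch of that bookkeeping, purely my own illustration (GSC itself shows your remaining quota in the dialog):

```python
class FetchQuota:
    """Track the two monthly Fetch as Google submission quotas:
    500 single-URL submissions and 10 URL-plus-direct-links submissions."""

    def __init__(self, url_limit=500, crawl_limit=10):
        self.url_limit = url_limit
        self.crawl_limit = crawl_limit
        self.urls_used = 0
        self.crawls_used = 0

    def submit(self, with_direct_links=False):
        """Record a submission; return False if the relevant quota is spent."""
        if with_direct_links:
            if self.crawls_used >= self.crawl_limit:
                return False
            self.crawls_used += 1
        else:
            if self.urls_used >= self.url_limit:
                return False
            self.urls_used += 1
        return True
```

The asymmetry is the point: save the URL-plus-links option for cases like a refreshed hub page whose linked pages also changed.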

Let’s go through the process of using Fetch as Google to submit a recently updated page to Google’s index.  I’ll walk you step by step through the process below.

How to Use Fetch as Google to Submit a Recently Updated Page to Google’s Index

1. Access Google Search Console and Find “Fetch as Google”
You need a verified website in Google Search Console in order to use Fetch as Google.  Sign into Google Search Console, select the website you want to work on, expand the left side navigation link for “Crawl”.  Then click the link for “Fetch as Google”.

Fetch as Google in the Crawl Section of GSC

2. Enter the URL to Fetch
You will see a text field that begins with your domain name.  This is where you want to add the url of the page you want submitted to Google’s index.  Enter the url and leave the default option for Google type as “Desktop”, which will use Google’s standard web crawler (versus one of its mobile crawlers).  Then click “Fetch”.

Submitting a URL via Fetch as Google in GSC

3.  Submit to Index
Once you click Fetch, Google will fetch the page and provide the results below.  At this point, you can view the status of the fetch and click through that status to learn more.  But, you’ll notice another option next to the status field that says, “Submit to index”.  Clicking that link brings up a dialog box asking if you want just the url submitted or the url and its direct links.  Select the option you want and then click “Go”. Note, you will also have to click the captcha confirming you are human. Google added that in late 2015 based on automated abuse it was seeing from some webmasters.

A Successful Fetch:
Successful Fetch in GSC

The Submit to Index Dialog Box:
Submit To Index Dialog Box in GSC

4. Submit to Index Complete:
Once you click “Go”, Google will present a message that your url has been submitted to the index.

Successful Submit to Index via GSC

That’s it!  You just successfully added an updated page to Google’s index.
Note, this doesn’t mean the page will automatically be updated in the index.  It can take a little time for the refresh to happen, although I’ve often seen it happen pretty quickly (sometimes in just a few hours).  The update might not happen as quickly for every website, but again, it should be quicker than waiting for Googlebot to recrawl your site on its own.  To be safe, I would bank on a day or two before you see the new page in Google’s cache (and the updated content reflected in the search results).
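If you want to verify on your end that Googlebot actually came back for the page, one option is to check your server access logs for Googlebot requests to that url.  Here’s a quick sketch of that idea (my own illustration, not a Search Console feature — the log format assumed is the standard combined format, and the sample lines and paths are made up):

```python
import re

# Sketch: find Googlebot requests for a specific url path in a
# combined-format access log.  Path and url are assumptions.
TARGET_PATH = "/updated-page/"

# Combined log format: IP - - [timestamp] "METHOD path HTTP/x" status ...
LINE_RE = re.compile(r'\[(?P<ts>[^\]]+)\] "(?:GET|HEAD) (?P<path>\S+)')

def googlebot_hits(lines, target_path):
    """Return timestamps of Googlebot requests for target_path."""
    hits = []
    for line in lines:
        # Cheap filter on the user-agent string first.
        if "Googlebot" not in line:
            continue
        m = LINE_RE.search(line)
        if m and m.group("path") == target_path:
            hits.append(m.group("ts"))
    return hits

# Two sample log lines (the second request is not from Googlebot):
sample = [
    '66.249.66.1 - - [27/Apr/2016:10:15:32 +0000] "GET /updated-page/ HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [27/Apr/2016:10:16:01 +0000] "GET /updated-page/ HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0"',
]
print(googlebot_hits(sample, TARGET_PATH))
# → ['27/Apr/2016:10:15:32 +0000']
```

In a real setup you’d read the lines from your access log file instead of a sample list, and keep in mind that the “Googlebot” user-agent string can be spoofed, so treat this as a sanity check rather than proof.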

Expedite Updates Using Fetch as Google
Let’s face it, nobody likes waiting. And that’s especially the case when you have updated content that you want indexed by Google!  If you have a page that’s been recently updated, then I recommend using Fetch as Google to make sure the page gets updated as quickly as possible.  It’s easy to use, fast, and can also be used to submit all linked urls from the page at hand.  Go ahead, try it out today.

GG



Filed Under: google, seo, tools


Copyright © 2023 G-Squared Interactive LLC. All Rights Reserved. | Privacy Policy