The Internet Marketing Driver


7 More Penguin Findings: An Update From the Over Optimization Front Lines

May 11, 2012 By Glenn Gabe 13 Comments

More Penguin 1.0 Findings

After Penguin first hit on April 24th, I started performing a lot of analysis on websites that were affected.  I ended up quickly publishing my findings in two blog posts here on the Internet Marketing Driver.  The first explained how exact match domains could be susceptible to penalty based on how those domains were being used.  A few days later, I wrote a second post that included initial findings based on analyzing a number of websites hit by Penguin.  In that post, I explained how the initial rollout of Penguin seemed extremely inbound link-heavy.  I simply wasn’t seeing other webspam tactics getting penalized.  Every website that I analyzed that had gotten nuked ended up having serious inbound link issues.

Based on writing those two initial posts, I’ve had numerous businesses reach out to me that have been hit by Penguin.  They range from businesses running one single website to owners of hundreds of websites.  It’s been absolutely fascinating to hear what’s happened to various websites (and networks), and then be able to analyze those sites.  In total, I’ve analyzed approximately 60-70 websites since Penguin hit.  As a result, I have a lot of data.  My goal with this post today is to share some of my findings, explain what I’m seeing, and shed some light on the situation.

Real People, Real Problems
The first thing I wanted to mention before getting into my Penguin findings has little to do with SEO.  Since April 24th, I’ve had the opportunity to speak with a lot of business owners that have gotten hit by Penguin.  I’ve been amazed at how open everyone has been with me.  Some people emailed me, while others simply called me directly.  In almost all cases, you can feel their despair through emails or in their voices.

I just want to emphasize that no matter what you think about webmasters using grey hat or black hat tactics, it’s important to know that there are still people on the other side of those websites.  Real people, who now have real problems.  Yes, many broke the rules.  I get it.  But it’s still hard to hear some of the stories…   Some won’t be able to pay their medical bills now, while others are going to find it hard to pay their mortgages.  I think that gets lost when speaking about Penguin, and it shouldn’t.  That’s why I titled this article, “An Update from the Front Lines”.  It’s like digital combat.  And I’ve spoken directly with the wounded.

With that out of the way, let’s dig into my findings:

1. High Threshold of Exact Match Anchor Text Got Hammered
As I mentioned in my last post, Penguin 1.0 was extremely inbound link-heavy.  I haven’t seen other webspam tactics get hit like spammy inbound links have.  Based on my analysis, websites with a high percentage of exact match anchor text were hammered.  For example, a site with 80%+ of its inbound links using exact match anchor text got smoked.  Performing an inbound link analysis on many of the sites I reviewed revealed unnatural links, including links on low quality sites, article marketing sites, etc.  I have yet to come across a website with a truly diversified link profile that got hammered by Penguin.

High Percentage of Exact Match Anchor Text
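To make the percentage check above concrete, here’s a rough sketch of the kind of quick calculation involved.  The function name and the anchor list are my own hypothetical example; in practice you’d export the anchors from a backlink tool.

```python
# Rough sketch: estimate what share of a link profile uses exact match
# anchor text. The anchor list below is invented sample data.

def exact_match_share(anchors, keyword):
    """Return the fraction of anchors that exactly match the keyword."""
    normalized = [a.strip().lower() for a in anchors]
    matches = sum(1 for a in normalized if a == keyword.lower())
    return matches / len(normalized) if normalized else 0.0

anchors = [
    "cheap blue widgets", "cheap blue widgets", "cheap blue widgets",
    "cheap blue widgets", "www.example.com", "Example Co.",
]
print(f"{exact_match_share(anchors, 'Cheap Blue Widgets'):.0%} exact match")
```

A real profile has thousands of links, but the idea is the same: if that percentage is anywhere near the 80%+ range I mentioned, you have a problem.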

2. Other Spam Tactics Not Hit Yet
As part of my analysis, I came across several sites that were keyword stuffing, had over-optimized title tags, were using doorway pages, etc., but didn’t get hit.  As I said in my previous post, I believe future releases of Penguin could hit those tactics.  I don’t believe sites using those methods are safe.  It is probably just a matter of time.  Beware.

3. Very Low False Positive Rate
I have seen a very low false positive rate.  Actually, I haven’t come across one site that was a clear false positive.  Also, Danny Sullivan interviewed Matt Cutts about Penguin this week, and Matt explained that Google is happy with the results of Penguin.  He said the false positive rate is very low.  I have to agree with his assessment.  I’ve analyzed many websites and almost every one of them had serious inbound link issues.

By the way, it also sounds like Penguin will be rolled out periodically (like Panda).  If that’s the case, then you won’t notice any changes until the next version of Penguin rolls out (no matter what you change in the meantime).

4. Panda + Penguin = Confusion for Marketers
This one is really confusing webmasters.  Panda rolled out on 4/19, Google rolled out a Panda refresh on 4/27, and Penguin rolled out in between.  Many marketers don’t even know how to determine which update they were hit by.  And the last thing you want to do is take action thinking you were hit by Penguin, when you were actually hit by Panda (or vice versa).  I’ve had several companies contact me saying they were hit by Penguin, when in reality, they were hit by Panda.  My advice is to make sure you know which update hit you, and then form a plan of attack.  If you are confused about this, contact a professional SEO.

Panda or Penguin Update
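One way to start untangling which update hit you is simply matching the date of your traffic drop against the confirmed rollout dates.  A minimal sketch of that step (the dates below are the ones Google confirmed; the two-day matching window is my own rough assumption):

```python
from datetime import date

# Confirmed rollout dates for this period. The matching window is a
# rough assumption, not an official figure.
UPDATES = {
    date(2012, 4, 19): "Panda 3.5",
    date(2012, 4, 24): "Penguin 1.0",
    date(2012, 4, 27): "Panda 3.6 refresh",
}

def likely_updates(drop_date, window_days=2):
    """Return confirmed updates within window_days of a traffic drop."""
    return [name for day, name in UPDATES.items()
            if abs((drop_date - day).days) <= window_days]

print(likely_updates(date(2012, 4, 25)))
```

Note that a drop in late April can match more than one update, which is exactly why checking your reporting for the precise date matters so much.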

5. Private Networks
I’ve had several owners of private networks contact me about Penguin.  To clarify, I’m calling a network of websites owned and operated by one entity a private network.  The sites all leverage each other for links, and as you can guess, exact match anchor text links are heavily used.  Many of you reading this post would probably assume that all of the sites that were part of private networks would have gotten nuked equally.  But that’s not the case.  I analyzed dozens of websites that were part of private networks, and what I found is going to surprise you.  Only some of the network sites were hit, while others remained untouched.  Then there were some that dropped a few spots in the SERPs, but only marginally.

This was fascinating for me to analyze.  I would drill into a site that got hammered, and find all of the unnatural links.  Then I would analyze a site within the same network, with the same types of links, and it was untouched.  Then another site that was part of the same network, with the same links, only lost marginal rankings.  Why?  Did Google really miss those sites?  That’s hard to believe.  They all linked to one another using exact match anchor text.

I did start seeing a trend with certain categories of websites.  For example, categories A and B were getting hit, while category C was untouched (across websites).  This led me to believe that Google might be targeting certain categories with Penguin 1.0.  I can’t say for sure if that’s the case, but I saw this several times during my analysis of multiple private networks.

Private Link Network

6. Google Isn’t Done With Public Link Networks
If private networks consist of websites owned and operated by one company, then public networks are large networks of websites where many different companies participate.  It’s more of a typical link network where sites unrelated to one another all link to each other.  As you can imagine, Google hated this tactic, and hammered many with Penguin 1.0.  But I noticed something really strange.  One of the largest public link networks I came across was untouched.  Literally, not one site that I checked seemed to have gotten hit by Penguin.  Again, how in the world did Google miss a network this large?  It just didn’t make sense.

Since these sites were all unrelated, the category issue I mentioned earlier couldn’t be what saved them.  So why would Google let such a large network of websites survive?  Good question, and it seems like it’s just a matter of time before that network gets hit.  I think this just proves that Penguin wasn’t perfect.  So, although I didn’t see any true false positives during my analysis, I did see some false negatives.  I expect future releases of Penguin to address those issues.

7. Penalty vs. Devalued Links
I mentioned earlier, when referring to private link networks, that I saw some sites get nuked while others were only marginally hurt.  But I also saw some standalone websites with unnatural link profiles drop in rankings without getting annihilated.  So, why would some sites get hammered while others simply dropped slightly in rankings?

This probably means that Google simply devalued the links pointing to the sites that were marginally hit, versus applying a serious penalty to the site.  If a site fell off the Google map, it was penalized.  If it dropped a few spots in search, or jumped to page 2, it’s possible that some of the website’s inbound links were devalued.  It’s worth noting, and you should analyze your rankings to determine if you were hit hard by Penguin or if you are experiencing devalued links.
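A crude way to sketch that triage in code.  To be clear, the thresholds here are entirely my own assumptions for illustration, not anything Google has published:

```python
def classify_drop(old_rank, new_rank):
    """Crude triage of a keyword's ranking change (thresholds assumed).
    new_rank of None means the site no longer ranks at all."""
    if new_rank is None or new_rank > 50:
        return "likely penalized"          # fell off the Google map
    if new_rank - old_rank >= 5:
        return "links possibly devalued"   # slid to page 2 or so
    return "normal fluctuation"

print(classify_drop(3, None))   # site vanished from the SERPs
print(classify_drop(3, 14))     # dropped to page 2
print(classify_drop(3, 6))      # minor movement
```

Run a check like this across your core keywords; if most of them land in the first bucket, you’re dealing with a penalty, not devalued links.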

Back to the Front Lines. More Updates Soon
That’s what I have for now.  It’s definitely ugly out there.  My recommendation is to analyze your current situation the best you can.  Determine what you were hit by, Panda or Penguin, and then form a plan of attack.  During my analysis, I found that some sites have a clear problem, while others require deeper analysis.  Good luck.

GG


Filed Under: algorithm-updates, google, seo

Penguin 1.0 Initial Findings – Unnatural Inbound Links Heavily Targeted, Other Webspam Tactics Await Penalty?

April 27, 2012 By Glenn Gabe 34 Comments

Penguin Update 1.0

The past few days have been fascinating for SEOs.  Google’s latest algorithm update, now officially named Penguin, has been rolled out.  The update was originally called the Over Optimization Penalty, then the Webspam Algorithm Update, and now Penguin.  As you can imagine, there have been screams from webmasters far and wide about the update (from both webmasters who should have gotten hit by Penguin, and some who believe they were wrongly penalized).  False positives are absolutely going to occur with Penguin, and Google knows this.  More about this later in the post.

I’ve already started helping some companies that have been hit by Penguin analyze their websites and prepare for a Post-Penguin world.  I’ve also been monitoring the various webmaster forums to see examples of websites getting hit, to see what they were doing wrong.  Based on my research and analysis so far, I wanted to write a post explaining what I’m seeing and document the common thread across sites that are being penalized.  Note, we are still very early in the game, and Google will undoubtedly be rolling out updates to Penguin over time.  Therefore, this is what I’m seeing now.  Since it’s a fluid situation, I will try and write follow-up posts about future Penguin releases.

Penguin and Exact Match Domains:
Before I get deeper into this post, I wanted to mention my first post about Penguin, which I published a few days ago.  I wrote about the potential impact of the Over Optimization Penalty on Exact Match Domains.  If you have been hit by Penguin, and you are using exact match domains, definitely check out that post.  There are several risks you might want to review.

Inbound Links – The Common Thread During Research and Analysis
Almost every penalized site that I’ve reviewed had issues with inbound links.  Specifically, their link profiles were littered with unnatural, spammy links.  And not all of those links were paid text links like some people would expect.  I saw a range of issues that could get a site pecked by Penguin.  Sorry, that’s my first Penguin joke.  :)  Below, I’m going to cover several inbound link issues that I’ve seen during my analysis.  That said, I first wanted to mention other spammy tactics and Penguin 1.0.

What About Over Optimization?
I’ve been searching for a site that got hit for spammy title tags, keyword stuffing, doorway pages, etc., but I’m not seeing that as a driving force right now with Penguin.  And believe me, I’ve come across a lot of sites violating Google’s Quality Guidelines over the years…  It doesn’t seem like those factors are getting caught right now, with the key phrase being “right now”.  My hope is that Google will roll out updates to Penguin that also catch those violations.  So, if you are a company that’s keyword stuffing, employing doorway pages, overly optimizing your title tags, etc., now is the time to change…  It wouldn’t shock me to see rolling updates to Penguin that include penalties targeting those violations as well.

Below, I’ll list some of the inbound link issues I’ve seen on websites hit by Penguin.  Again, we are early on, and things can change.  But for now, this is what I’m seeing:

1. Paid Text Links Using Exact Anchor Text
As you can imagine, this one is a clear violation of Google’s guidelines.  During my analysis, it was easy to pick up exact match, paid text links on sites that were rampant with sponsored links.  Many of the sites I analyzed had these types of links.  Similar to what I said in my first post on Penguin, if you want to check your own inbound links, read my post about finding paid text links using Open Site Explorer.

Checking inbound links via Open Site Explorer:
Using Open Site Explorer to check inbound links
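Most backlink tools, Open Site Explorer included, let you export your inbound links to CSV.  Here’s a hedged sketch of a first pass over such an export, flagging links whose anchor text exactly matches a money keyword.  The column names (“URL”, “Anchor Text”) are my assumptions about the export format; adjust them to match your tool:

```python
import csv

def flag_exact_match_links(csv_path, money_keywords):
    """Return source URLs whose anchor text exactly matches a money
    keyword. money_keywords should be a set of lowercase phrases."""
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["Anchor Text"].strip().lower() in money_keywords:
                flagged.append(row["URL"])
    return flagged

# Usage (hypothetical file and keyword):
# flag_exact_match_links("backlinks.csv", {"cheap blue widgets"})
```

Flagged links aren’t automatically paid links, of course, but scanning the flagged sources by hand is a fast way to find the sponsored-link pages.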

2. Comment Spam
When analyzing websites hit by Penguin, I also saw a lot of comment spam.  The most common form was comment signatures using exact match anchor text.  For example, instead of using your name (like you’re supposed to), people commenting were using the exact match anchor text for keywords they wanted to rank for.  For Google, this is pretty darn easy to pick up.

3. Guest Posts on Questionable Sites
I’ve also seen many guest posts on questionable sites that included exact match anchor text.  Note, I obviously don’t think all guest posts are bad.  Actually, I think they can be ultra-powerful on the right websites and blogs.  But, the guest posts I’m referring to were on sites set up simply to generate income from those guest posts (based on the links they would drive).  And the posts themselves weren’t strong… They were typically thin with a focus on the anchor text, and not the story.

4. Article Marketing Sites
Similar to the previous point, I saw a lot of syndicated articles using exact match anchor text leading back to sites that got hammered by Penguin.  So yes, Penguins seem “cold” to article marketing tactics.  Sorry, that’s my second Penguin joke.  :)  Again, these articles were relatively thin, used several instances of exact match anchor text leading back to the site, etc.

Inbound Link Profiles Heavily Weighted by These Tactics Got Hit
One of the most important findings involved the weighting of inbound links for each site.  For the sites I analyzed, a majority of the inbound links came from the tactics listed above.  Actually, for some sites, I couldn’t find any natural links… most were unnatural.  As you can imagine, this is not a strong signal to Google that you’re a typical webmaster looking to gain traffic by earning it.  You look like you’re gaming the system to gain rankings.  And that’s when Penguin steps in, and hammers you.  A natural link profile will contain many types of links, including URLs, brand names, image links, etc.  It won’t contain 99% exact match anchor text from article sites, comment spam, etc.
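To make that diversity idea concrete, here’s a toy breakdown of anchor types.  The classification rules are simplified assumptions and the data is invented, but the output is the kind of distribution I look at:

```python
from collections import Counter

def anchor_breakdown(anchors, brand, money_keywords):
    """Toy classification of anchors into URL, brand, exact match, other.
    The rules here are deliberately simplistic."""
    counts = Counter()
    for anchor in anchors:
        a = anchor.strip().lower()
        if a.startswith(("http://", "https://", "www.")):
            counts["url"] += 1
        elif brand.lower() in a:
            counts["brand"] += 1
        elif a in money_keywords:
            counts["exact match"] += 1
        else:
            counts["other"] += 1
    return counts

anchors = ["www.example.com", "Example Co", "cheap widgets",
           "cheap widgets", "click here"]
print(anchor_breakdown(anchors, "Example Co", {"cheap widgets"}))
```

A healthy profile is spread across all four buckets; a profile where nearly everything lands in “exact match” is the pattern I kept seeing on penalized sites.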

Dangerous Sites and Not Just Spammy
There’s another point I wanted to make before ending this post.  While analyzing inbound links across penalized sites, I found several linking websites that were flat out dangerous, and not just spammy.  I found sites flagged for malware, sites firing numerous popups as I hit a page, etc.  Since we know that Google doesn’t like sending users to dangerous websites, and that it doesn’t like spam, these dangerous sites could have been the kiss of death for downstream destinations.  If you have unnatural links on dangerous sites, then the Penguin outcome probably wasn’t pretty for you.

An example of a website flagged for malware:
Firefox flagging a website for malware

Next Steps for Penguin
Remember, this is just the beginning stages of Penguin and the situation can change quickly.  I’m going to keep analyzing websites that have been penalized, monitoring webmaster forums, and watching Google’s response closely.

Here are some closing points about Penguin that are important to understand:

  • Google said you cannot file a reinclusion request if you’ve been hit by Penguin.  Since the update is algorithmic, and not manual, reinclusion requests will not help you.
  • Google set up a Penguin form to fill out if you believe that you’ve been wrongly hit by the algo update.  If you think you’re a false positive, then fill out the form today.
  • Google also included a link to a form where you can report spam that you think should have been caught by Penguin, but hasn’t.  You can click the button labeled “Report Webspam” on that page.  This is obviously a little subjective, but I’m glad Google is looking to catch more webspam in future releases of Penguin.
  • It’s worth noting that both Panda and Penguin were rolled out within a few days of one another.  Yesterday, Matt Cutts explained that the latest version of the Panda Update rolled out on 4/19, and then Penguin rolled out on 4/24.  You should check your reporting to make sure you know the date you were hit.  You don’t want to go down the wrong path when making changes to your website…  i.e., mistaking Panda penalties for Penguin penalties. <- and by the way, how ridiculous does that sound? :)
  • If you’ve been hit by Penguin, take a hard look at your site, your inbound link profile, etc., and plan to make changes.  I know this is going to be a painful time for you and your website, but don’t just sit there. Analyze and take action.

Until the Next Version of Penguin rolls out…
That’s what I have for now.  I hope this post helped shed some light on the latest Google algorithm update.  If you have any questions, or need assistance, don’t hesitate to contact me.  Good luck.

GG


Filed Under: algorithm-updates, google, seo

Google’s Over-Optimization Penalty and Exact Match Domains

April 25, 2012 By Glenn Gabe 17 Comments

The Webspam Algorithm Update and Exact Match Domains

Yesterday, we found out that the latest Google algorithm update was underway (dubbed the Over Optimization Penalty).  Google posted about the update on the Webmaster Central Blog.  By the way, Google is officially calling the update the Webspam Algorithm Update and not the Over Optimization Update (although I like the sound of the latter).  There has been a lot of speculation about what the update would look like, which spammy tactics would get hit, etc.  I’ve been keeping a close eye on the various webmaster forums, and there are already many reports of sites getting nuked.  As Google explained in its blog post yesterday, it was targeting webspam tactics that were being used to game the system.  That could mean spammy inbound links, keyword stuffing, doorway pages, etc.

The goal of the latest algorithm update is to level the playing field rankings-wise.  Google realizes that there are many sites that have great content, but simply can’t compete against other sites that have been overly optimized.  And when I say “overly optimized”, I’m referring to using spammy tactics to game the system.  As I’ve always said, those tactics might work in the short-term, but the long term impact could be devastating.  And those sites are seeing the negative effect now.

What Does Over Optimization Mean?
There’s a lot of speculation about what over optimization actually is, and what can get you penalized by this latest algo update.  For example: keyword stuffing, too many inbound links with rich anchor text, overly optimized title tags, footers filled with optimized content, etc.  Basically, any spammy tactic that companies have used to game the system…

To be clear, these aren’t tactics that a typical webmaster would use.  Based on the screams from webmasters that have been hit during recent testing, and now as the algo gets rolled out, spammy inbound links seem to be causing a lot of problems.  That said, I’m sure we are going to see many examples of different tactics getting penalized too.  By the way, if you are interested in checking your own inbound links, then check out my recent post about finding spammy links using Open Site Explorer.  It will only take you a few minutes…

The Exact Match Domain Threshold
One tactic that I think hasn’t received as much attention during this update is the use of exact match domains.  Mike Wilton mentioned this in his post about Over Optimization, but most people have been focusing on inbound links, keyword stuffing, on-page optimization, etc.  But anyone in SEO will tell you that exact match domains have been a tactic that has been abused over the years.  It involves someone registering a domain name that exactly matches the keyword they want to rank for.  Unfortunately, the engines heavily weight keyword-rich domains in the SERPs.  As you can imagine, that flaw has led to an abuse of the system.

For example, imagine you sold widgets in Princeton, NJ.  You might register www.princetonwidgets.com or www.widgetsprinceton.com.  You get the picture, and my guess is that you have seen many exact match domains rank well as you search Google.  On the one hand, if you legitimately have an exact match domain, and you use that domain as your core website, then I get it (and that’s fine to me).  Also, if you happen to have a brand name or company name that matches a highly searched keyword, I get that too.

But the abuse has come from business owners (and especially local business owners) who simply want to dominate a certain category by using exact match domains.  And that’s where I think it crosses the line.  In addition, some companies use their core domain for their website, but register a bunch of exact match domains that simply link to their core domain.  As you can see, that’s not a “normal” way to set up websites for a company (or to build inbound links for the core domain).

Will Exact Match Domains Get Hammered by the Webspam Algorithm Update?
It’s hard to say to what degree, but I know some will get penalized (actually, I can see some getting penalized right now).  I obviously don’t think all exact match domains will get nuked, since that’s way too extreme, and would include the legitimate use of exact match domains (as I covered earlier).  So, there might be a threshold that Google uses while determining which exact match domains to penalize.

An Example of Webmasters Reporting a Drop in Rankings as Algorithm Update Rolls Out: 

Webmasters report drop in rankings based on Google algorithm update

Below, I’ll list some thoughts about what that threshold could look like.  And I hope this goes without saying, but we’ll all find out over time how extreme this algorithm update was.  We are only on day 1. :)

1. Number of Domains Per Company
As I mentioned earlier, there are some companies that have registered an exact match domain for their business website.  If that’s the sole use of the domain, and you are adding high quality, valuable content, then there’s no reason for that domain to get hammered.  But, if a company registered 5 different exact match domains, in addition to having its company website, then you can start to see how this would violate Google’s guidelines.  The company is simply trying to game Google and rank across multiple sites for target keywords.  This type of setup is at great risk right now (in my opinion).

And by the way, if you think Google doesn’t know that you own all the domains, think again.  It has multiple ways to understand this, including your own Google Analytics account.  :)

2. Cross Linking of Domains Using Rich Anchor Text
Are all the exact match domains linking to either each other, or to another domain you own?  Again, that could easily be perceived as spamming by Google.  Buying a bunch of exact match domains only to cross link them using rich anchor text could definitely get you in trouble.  I’ve come across this a thousand times while analyzing inbound links for companies.  You clearly see several company-owned domains all linking to one another with the exact anchor text that they want to rank for.
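Here’s a toy sketch of what that cross-linking pattern looks like in data.  The domains and links are invented; in practice the link data would come from a crawl or a backlink export:

```python
def cross_link_density(owned, links):
    """Share of possible owned-to-owned domain pairs that actually link.
    links is an iterable of (source_domain, target_domain) tuples."""
    pairs = {(src, dst) for src, dst in links
             if src in owned and dst in owned and src != dst}
    possible = len(owned) * (len(owned) - 1)
    return len(pairs) / possible if possible else 0.0

owned = {"princetonwidgets.com", "widgetsprinceton.com", "example.com"}
links = [
    ("princetonwidgets.com", "example.com"),
    ("widgetsprinceton.com", "example.com"),
    ("princetonwidgets.com", "widgetsprinceton.com"),
]
print(f"{cross_link_density(owned, links):.0%} of possible pairs linked")
```

Natural, unrelated websites almost never form a dense mesh like this; when a large share of the possible pairs are linked (and with rich anchor text to boot), it looks exactly like the footprint described above.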

3. Doorway Pages
Similar to what I explained above, some companies employ exact match domains (and pages within those domains) solely to help another site rank.  In addition, some companies use EMDs to funnel traffic to a core company website.  Again, typical webmasters aren’t going to do this… They will build high quality domains with the goal of impressing prospective customers, educating them, and landing new business.  They will build links naturally and not try to buy their way to the top of the rankings.  Google has a clear view of doorway pages, as stated in its Webmaster Guidelines.

4. Thin Content and Panda
With the latest algorithm update, there is a chance that a website could fall victim to a double whammy penalty.  For example, getting hit by both Panda and the Over Optimization penalty.  All you have to do is combine what I’ve listed above and you can see how a website might both have thin content and exist solely to help another site rank well.  If that’s the case, then it could get hit by the Panda Update (which targets low quality content) and by OOP (which targets webspam).  Good luck recovering from that perfect storm…

Summary – We’re only in the beginning stages of OOP
As crazy as this sounds, this is an extremely exciting time for anyone involved in SEO.  You get to watch a major algorithm update get pushed out, analyze the sites that get penalized, view collateral damage, try and better understand what Google’s objective was with the update, etc.  In this post, I’ve tried to outline how the latest update could impact exact match domains.  Unfortunately, nobody will know the exact impact for weeks (or longer).  I plan to write more about the Webspam Algorithm Update in future posts, so keep an eye on my RSS feed.

And, if you have been hit by the latest update, feel free to reach out to me.  Although many spammy websites will get penalized, there is always collateral damage.

GG


Filed Under: algorithm-updates, google, seo

How To Use Fetch As Google In GSC To Submit An Updated Page To Google’s Index [Tutorial]

April 5, 2012 By Glenn Gabe 32 Comments

Fetch as Google and Submit To Index in GSC

{Updated on April 18, 2016 to cover the latest changes in Google Search Console (GSC).}

Any marketer focused on SEO will tell you that it’s sometimes frustrating to wait for Googlebot to recrawl an updated page.  The reason is simple: until Googlebot recrawls the page, the old content will show up in the search results.  For example, imagine someone added content that shouldn’t be on a page, and that new content was already indexed by Google.  Since it’s an important page, you don’t want to take the entire page down.  In a situation like this, you would typically update the page, resubmit your XML sitemap, and hope Googlebot stops by soon.  As you can guess, that doesn’t make anyone involved with the update very happy.
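As an aside on the sitemap route: at the time of writing, Google also accepted a sitemap “ping” over HTTP to nudge a recrawl.  A minimal sketch that builds the ping URL (actually sending the request is left as a comment, and the sitemap URL is hypothetical):

```python
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url):
    """Build the Google sitemap ping URL (endpoint as documented at the time)."""
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(sitemap_ping_url("http://www.example.com/sitemap.xml"))
# To actually send the ping:
# import urllib.request
# urllib.request.urlopen(sitemap_ping_url("http://www.example.com/sitemap.xml"))
```

That helps Google find the updated sitemap sooner, but it still doesn’t guarantee a fast recrawl of one specific page, which is where Fetch as Google comes in.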

For some companies, Googlebot is visiting their website multiple times per day.  But for others, it could take much longer to get recrawled.  So, how can you make sure that a recently updated page gets into Google’s index as quickly as possible?  Well, Google has you covered.  There’s a tool called Fetch as Google that can be accessed within Google Search Console (GSC) that you can use for this purpose.  Let’s explore Fetch as Google in greater detail below.

Fetch as Google and Submit to Index
If you aren’t using Google Search Console (GSC), you should be.  It’s an incredible resource offered by Google that enables webmasters to receive data directly from Google about their verified websites.  Google Search Console also includes a number of valuable tools for diagnosing website issues.  One of the tools is called Fetch as Google.

The primary purpose of Fetch as Google is to submit a URL and test how Google crawls and renders the page.  This can help you diagnose issues with the URL at hand.  For example, is Googlebot not seeing the right content, is the wrong header response code being returned, etc.?  You can also use fetch and render to see how Googlebot is actually rendering the content (the way a typical browser would).  This is extremely important, especially for understanding how Google handles your content on mobile devices.

But those aren’t the only uses for Fetch as Google.  Google also provides functionality for submitting that URL to its index, right from the tool itself.  You can submit up to 500 URLs per month via Fetch as Google, which should be sufficient for most websites.  This can be a great solution for times when you’ve updated a webpage and want that page refreshed in Google’s index as quickly as possible.  In addition, Google provides an option for submitting the URL and its direct links to the index.  This enables you to have the page at hand submitted to the index, along with other pages that are linked to from that URL.  You can do this up to 10 times per month, so make sure you need it if you use it!

Let’s go through the process of using Fetch as Google to submit a recently updated page to Google’s index.  I’ll walk you step by step through the process below.

How to Use Fetch as Google to Submit a Recently Updated Page to Google’s Index

1. Access Google Search Console and Find “Fetch as Google”
You need a verified website in Google Search Console in order to use Fetch as Google.  Sign into Google Search Console, select the website you want to work on, and expand the left side navigation link for “Crawl”.  Then click the link for “Fetch as Google”.

Fetch as Google in the Crawl Section of GSC

2. Enter the URL to Fetch
You will see a text field that begins with your domain name.  This is where you add the URL of the page you want submitted to Google’s index.  Enter the URL and leave the default option for Google type as “Desktop”, which will use Google’s standard web crawler (versus one of its mobile crawlers).  Then click “Fetch”.

Submitting a URL via Fetch as Google in GSC

3.  Submit to Index
Once you click Fetch, Google will fetch the page and display the results below.  At this point, you can view the status of the fetch and click through that status to learn more.  But you’ll notice another option next to the status field that says “Submit to index”.  Clicking that link brings up a dialog box asking if you want just the URL submitted, or the URL and its direct links.  Select the option you want and then click “Go”.  Note, you will also have to complete the captcha confirming you are human.  Google added that in late 2015 based on automated abuse it was seeing from some webmasters.

A Successful Fetch:
Successful Fetch in GSC

The Submit to Index Dialog Box:
Submit To Index Dialog Box in GSC

4. Submit to Index Complete:
Once you click “Go”, Google will present a message that your URL has been submitted to the index.

Successful Submit to Index via GSC

That’s it!  You just successfully added an updated page to Google’s index.
Note, this doesn’t mean the page will automatically be updated in the index.  It can take a little time, but I’ve seen it happen pretty quickly (sometimes in just a few hours).  The update might not happen as quickly for every website, but again, it should be quicker than waiting for Googlebot to recrawl your site.  I would bank on a day or two before you see the new page in Google’s cache (and the updated content reflected in the search results).

Expedite Updates Using Fetch as Google
Let’s face it, nobody likes waiting.  And that’s especially the case when you have updated content that you want indexed by Google!  If you have a page that’s been recently updated, then I recommend using Fetch as Google to make sure it gets refreshed in the index as quickly as possible.  It’s easy to use, fast, and can also be used to submit all linked URLs from the page at hand.  Go ahead, try it out today.

GG

 


Filed Under: google, seo, tools

A Guide to Using Social Extensions in Google AdWords | What They Are, How To Set Them Up, and How To Analyze Performance

March 30, 2012 By Glenn Gabe Leave a Comment

Social Extensions in Google AdWords

Google+ is in full force now. It continues to grow, and its impact can be felt in both organic and paid search. This is readily apparent as +1 buttons have spread across the web, similar to what happened with Facebook Like buttons. One of the ways Google is enabling businesses to benefit from +1’s is via Social Extensions in AdWords. If you’re not familiar with Social Extensions, don’t worry. This post will introduce Google’s latest ad extension, explain why you should care about it, and show you how to add Social Extensions to your AdWords campaigns. And by the way, yes, you should care about Social Extensions. Read on.

Ad Extensions in Google AdWords
In Paid Search, the more relevant information you can attach to your SEM ads, the better. Attaching relevant and valuable information to your ads can be the difference between earning a click and simply registering an impression. And that can impact Quality Score, CPCs, and ROI.

Google has done an incredible job rolling out various ad extensions that provide valuable information for people searching for products or services. For example, paid search marketers can implement ad sitelinks, product extensions, call extensions, location extensions, and social extensions. I won’t cover each of these extensions in detail in this post, but it’s important to understand that they “attach” information to existing ads. That additional information might be phone numbers, addresses, reviews, sitelinks, +1 annotations, etc.

Google Introduces Social Extensions
After Google+ rolled out, Google launched a new ad extension called Social Extensions. Social Extensions enable you to connect your Google+ Page to your AdWords campaigns. This enables you to share +1’s from your G+ page with your ads, and vice versa. Sharing +1’s gives marketers a greater chance of having +1 annotations show up in their ads (and it obviously impacts the count that shows up, as well).

+1 annotations stand out, as they display the number of +1’s, including social connections that have cast a vote for the business on Google+. Here’s a screenshot of Social Extensions in action (for Dell). Note: Social Extensions can show up in both Search and on the Display Network.

An example of Social Extensions in Action in Google Search:
Social Extensions in Action - Dell

How Social Extensions Can Help You
As we have all seen with Facebook Ads, social annotations bring relevance to advertisements. If your social connections show up within the ad itself (essentially giving their approval of a business), the ad can make a bigger impact. That is, as long as you take those recommendations seriously. Since annotations can show up in both Search Ads and Display Network Ads, this can have a far-reaching impact for businesses. The Display Network consists of any website running Google Ads, including Google properties like Gmail, YouTube, and Google Maps, and it reaches approximately 80% of web users.

The fact of the matter is that social annotations work. They take up more screen real estate, include visuals of your connections, and show the number of +1’s the business has received. This can all be very powerful for advertisers, as it can increase the chances of a click-through. Google has stated that +1 annotations yielded higher click-through rates during its testing.

Here’s an interesting quote from Vic Gundotra from Google about +1 annotations (from a New York Times Article about Google+):
“We are seeing 5 to 10 percent click-through-rate uplift on any ad that has a social annotation on our own Web sites,” Mr. Gundotra said. “We have been in this business for a long time, and there are very few things that give you a 5 to 10 percent increase on ad engagement.”

How To Analyze Social Extensions in AdWords
Let’s say you implemented Social Extensions and are wondering how they are impacting the performance of your ads. You’ll be happy to know that Google addressed this by providing a +1 segment in AdWords. You can access it by logging into AdWords, accessing a specific campaign or ad group, clicking the “Segment” button, and selecting the “+1 Annotations” segment.

+1 Annotations Segment in Google AdWords

You will then be presented with data showing how the +1 annotation impacts your performance. There are two types of +1 annotations you can analyze. The first is labeled “personal” and shows how many people saw the annotation when users from their circles were included. For example, if I were in a user’s circles, it could say, “Glenn and 22 other people +1’d this.” The second type is “basic”, and it simply shows how many people saw the annotation without personal recommendations. For example, a basic annotation would say, “250 people +1’d this.”

Using the +1 Annotations segment, you can compare the performance of ads that didn’t show an annotation to ads that did. In addition, you can compare both personal and basic annotations to see how they perform. And if you have conversion tracking set up, you can see the impact on conversion.
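If you export the segmented data, the comparison itself is simple arithmetic. Here’s a minimal sketch of computing CTR per annotation type and the relative uplift versus ads with no annotation. Note: the impression and click numbers below are hypothetical examples, not real AdWords data, and the segment labels are just illustrative.

```python
# Sketch: comparing CTR across +1 annotation segments.
# All numbers below are hypothetical, for illustration only.
segments = {
    "no annotation": {"impressions": 50000, "clicks": 1000},
    "basic":         {"impressions": 20000, "clicks": 460},
    "personal":      {"impressions": 5000,  "clicks": 130},
}

def ctr(row):
    """Click-through rate as a percentage."""
    return 100.0 * row["clicks"] / row["impressions"]

# Use the unannotated ads as the baseline for uplift.
baseline = ctr(segments["no annotation"])
for name, row in segments.items():
    uplift = (ctr(row) - baseline) / baseline * 100
    print(f"{name}: CTR {ctr(row):.2f}% (uplift vs. no annotation: {uplift:+.1f}%)")
```

With conversion tracking in place, the same calculation can be repeated with conversions in place of clicks to see the impact on conversion rate.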

How To Add Social Extensions in AdWords
If you’ve gotten this far in my post, I’m sure you’re wondering how to add Social Extensions in AdWords! I’ve got you covered. The first thing you need to do is connect your Google+ Page to your website, and vice versa. Then you need to add the Social Extension in AdWords. Follow the steps below to add a Social Extension to your campaigns.

1. Connect Your Website and Your Google+ Page
You can connect your Page and your site in a few different ways. In order to verify ownership, Google looks for a rel=”publisher” link on your homepage pointing to your Google+ Page. You can also add the Google+ Badge to your homepage. You can learn more about each method here.

Connecting Your AdWords Account with a Google+ Page
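For reference, the publisher markup is a single link tag in the head of your homepage. Here’s a minimal sketch, where YOUR_PAGE_ID is a placeholder for your own Google+ Page’s ID:

```html
<!-- Placed in the <head> of your homepage;
     replace YOUR_PAGE_ID with your Google+ Page's ID -->
<link rel="publisher" href="https://plus.google.com/YOUR_PAGE_ID/" />
```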

2. Link Your G+ profile Back to Your Homepage
Next, Google wants to see a link from your Google+ Page’s profile back to your homepage. You can easily add this by editing your profile and adding links in the right sidebar. After clicking the “Edit profile” button, click the links section in the right sidebar to edit it. Then click “Add custom link” and enter the label and URL for your homepage.

Linking a Google+ Page Back to a Website

3. Add Social Extension in AdWords
In order to complete the process, you need to add the extension to your AdWords campaign. Access the campaign in question and click the “Ad Extensions” tab. Use the dropdown to select “Social Extensions”. Click “New Extension” and you’ll be presented with a text field where you need to enter the URL of your Google+ Page. You can get the URL by accessing your page in Google+ and copying it from the address bar. Make sure you are copying the URL of your Page and not your personal profile (a common mistake). Click “Save” to complete the process. Google will approve the extension if everything is in place.

Adding Social Extensions in AdWords

That’s it! Once your Social Extension has been approved, your ads will be eligible to have +1 annotations show up.

Summary – Don’t Miss Out On Social Extensions
When running AdWords campaigns, it’s important to take advantage of all the powerful tools that Google provides for increasing performance. Ad Extensions in general can help advertisers increase click-through, build credibility, and land more business. And Social Extensions in particular can bring a social relevance to your ads that’s hard to match. I recommend taking the necessary steps to implement Social Extensions, and then track how they work for your business. You just might find that +1’s, and the relevance they bring to your ads, boost sales and ROI. And that’s the name of the game in SEM.

GG

Filed Under: adwords, google, google-plus, sem, social-advertising, social-media

