The Internet Marketing Driver

Archives for May 2012

The Pandeguin Penalty: What to do if your website has been hit by both Penguin and Panda

May 23, 2012 By Glenn Gabe 7 Comments

Pandeguin - Getting Hit by Both Penguin and Panda algorithm updates

If you’ve been following my blog recently, then you know how much Penguin analysis I’ve been performing.  Since the algorithm update first hit on 4/24, I’ve been working hard at analyzing websites impacted by Penguin.  You can read my previous posts to learn more, including my most recent post detailing my findings based on analyzing over 60 websites hit by Penguin.  Well, now I’ve analyzed over 80 websites hit by the update, and I decided to write a new post covering a tough subject.  Unfortunately, I’ve had several companies reach out to me that have run into the perfect storm of algorithm updates, so I thought it would be helpful to cover it today. More on what the perfect storm is soon.

Penguin or Panda?
Many of the companies contacting me about the update automatically believe they were hit by Penguin.  That’s not shocking, considering how much Penguin coverage there has been since 4/24.  But what many companies aren’t aware of is that Google rolled out a Panda update on 4/19, and then a Panda refresh on 4/27.  Panda is another algorithm update, initially launched in early 2011.  I’ll cover Panda in greater detail soon.  In addition, there seems to have been another, unofficial update around 5/11 that affected sites previously hit by Panda.  Google denied a Penguin or Panda update when asked about it, but I can tell you that there was some type of update.

The dates that Penguin and Panda rolled out in April 2012

So, with the algorithm sandwich in full effect, I’m seeing a lot of confusion with webmasters not understanding which update actually hit their websites.  That led to my latest Search Engine Journal post about how to determine which algorithm update hit your website.  Based on the popularity of that post, you can tell how big of a problem it was…  And knowing which update hit your site is extremely important, since Penguin and Panda target different problems.  I’ll explain more about each update a little later in this post.
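If you want to sanity check this yourself, here is a rough sketch of the type of date comparison I’m describing.  It assumes a hypothetical CSV export of daily Google organic visits (the filename and column names are placeholders, not anything Google or GA provides by that name), and it simply compares average visits in the days before and after each rollout date so you can see which drop lines up with which update.

```python
# A rough sketch (hypothetical filenames and columns) for lining a traffic drop
# up with the April 2012 rollout dates. It expects a CSV of daily Google
# organic visits with columns: date (YYYY-MM-DD), visits.
import csv
from datetime import date, timedelta

UPDATE_DATES = {
    "Panda update (4/19)": date(2012, 4, 19),
    "Penguin (4/24)": date(2012, 4, 24),
    "Panda refresh (4/27)": date(2012, 4, 27),
}

def load_daily_visits(path):
    """Read the export into a {date: visits} dictionary."""
    visits = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            visits[date.fromisoformat(row["date"])] = int(row["visits"])
    return visits

def pct_change(visits, pivot, window=5):
    """Average visits in the `window` days before vs. after a rollout date."""
    before = [visits.get(pivot - timedelta(days=i), 0) for i in range(1, window + 1)]
    after = [visits.get(pivot + timedelta(days=i), 0) for i in range(1, window + 1)]
    avg_before = sum(before) / window
    avg_after = sum(after) / window
    return (avg_after - avg_before) / avg_before * 100 if avg_before else 0.0

if __name__ == "__main__":
    daily = load_daily_visits("google_organic_daily.csv")  # hypothetical export
    for label, rollout in UPDATE_DATES.items():
        print(f"{label}: {pct_change(daily, rollout):+.1f}% change in daily organic visits")
```

The date that shows the sharpest negative swing is usually your strongest clue, although overlapping rollouts (like the 4/24 to 4/27 window) still require a manual look at the daily trend.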

Introducing Pandeguin – Fear the Beast
During my analysis, I’ve unfortunately come across several situations where websites were not only hit by Panda or Penguin, but instead, they were hit by both Panda and Penguin.  Yes, this is the worst possible scenario for a website, based on the recent algo updates.  These sites presumably had low quality, thin content, in addition to having horrible link profiles.  Needless to say, it’s critically important to know that you’ve been hit by both in order to rectify the situation.

Pandeguin Trending: The Pandeguin Penalty in Action

After I explained the Pandeguin situation to webmasters hit by both Panda and Penguin, a long period of silence typically followed.  Then came the question, “OK, now what should I do?”  That’s a great question, and the answer depends on the site in question.  Penguin was more acute, while Panda is deeper and broader.  That said, there are some top-level recommendations I would advocate for webmasters hit by Pandeguin.  Before I cover those bullets, let’s take a step back and quickly review Panda and Penguin 1.0.

What is Panda? A Primer
Panda was first released in February of 2011 and targets low quality content.  It’s a rolling update, which means it is rolled out periodically (typically once per month).  This means that if you were hit by Panda and made changes to rectify the situation, you wouldn’t know if those changes worked until the next update gets rolled out.

Many sites have been affected by Panda, and there wasn’t a hard rule explaining why specific sites were getting hit.  That led to a lot of confusion.  For example, there were sites hit with duplicate content, thin content, affiliate content, scraped content, etc.  Webmasters were forced to take an extensive look at their sites and content and make hard decisions.  For example, gutting content, moving it to another domain or subdomain, etc.  Should they 301 the URLs or 404 them?  The confusion led to Google releasing the famous 23 questions that webmasters should ask themselves about the quality of their content.  Although helpful, Google still didn’t clearly explain what was causing a site to get caught in the Panda filter.

Google's 23 Panda Questions

What is Penguin? A Primer
In March of 2012, Google began hinting that a major update targeting “over optimization” would be rolling out soon.  Nobody knew when it would roll out, what the update would target, etc.  We just knew that Google was going to target webspam.  When the update first rolled out, many called it the “Over Optimization Penalty”, which then turned into the “Webspam Algorithm Update”, and then was officially called “Penguin” by Google.

After performing heavy analysis once Penguin rolled out, it became extremely apparent that the update was very inbound link-heavy.  Although there are many forms of webspam, unnatural inbound links were absolutely hammered.  After analyzing 80+ websites, I can tell you that inbound links are the core problem being targeted by Penguin 1.0.  Now, I fully expect future versions of Penguin to target additional types of webspam, so inbound links are just the start (in my opinion).  Like Panda, Penguin will be rolled out periodically.  You won’t know if the changes you implement actually work until Penguins come knocking again.

As you can see, the two algorithm updates are very different.  As a webmaster, you don’t want to fix low quality content when you’re hit by Penguin and you don’t want to fix inbound links if you’ve been hit by Panda.  But what about if you were hit by both updates?  As I said earlier, I’ve had several companies reach out to me that were hit by both.  Needless to say, they have a lot of work to do.  But where do they start?  Let’s take a look at some top-level recommendations for sites hit by Pandeguin.

Top-level recommendations for companies hit by both Panda and Penguin:

1. Start with Penguin, it’s more focused at this point:
As I explained earlier, Penguin 1.0 was more focused on inbound links.  If you were hit by Penguin 1.0, chances are you had a poor link profile filled with unnatural links.  I would begin here, and start to analyze and then prune links.  Perform an inbound link analysis and organize your links by quality.  Then target the ones you want to nuke, and execute.  Panda is a deeper algorithm update at this point and requires much more analysis and work.  Start with Penguin and move quickly.
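To give you an idea of what “organize your links by quality” can look like in practice, here is a minimal sketch.  It assumes a hypothetical CSV export of inbound links from whatever link tool you use (the filename and column names are placeholders), and it simply groups links by linking domain so you can review the largest sources first.  The actual “nuke” decisions stay manual.

```python
# A minimal sketch, assuming a hypothetical inbound link export with a
# source_url column. It counts links per linking root domain so the biggest
# sources of links can be reviewed and tagged first.
import csv
from collections import Counter
from urllib.parse import urlparse

def links_by_domain(path):
    """Count inbound links per linking root domain (expects full URLs)."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row["source_url"]).netloc.lower()
            counts[domain] += 1
    return counts

if __name__ == "__main__":
    for domain, count in links_by_domain("inbound_links.csv").most_common(25):
        print(f"{count:5d}  {domain}")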

2. Move to Panda, it’s a deeper update:
Since there are a number of problems that could have caused Panda to hit your site, you really should have a professional SEO analyze your website.  I’ve written previously about SEO audits here on my blog, and I’m a firm believer that audits are the most powerful deliverable in all of SEO.  You need to determine the risks on your website from a Panda standpoint.  Is there duplicate content?  Is it just thin content?  Does your site look too affiliate-heavy for Google?  Are you scraping content?  Once you fully understand your current state, you can start to form a plan of attack.  Panda changes could be more complex, depending on what you need to refine.  It’s not as simple as pruning links (if you can).  You might need to develop an entirely new strategy for your website or business.
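A full audit goes far beyond anything you can script, but here is a tiny sketch of one slice of it.  It assumes a hypothetical crawl export with url, title, and word_count columns, and it flags thin pages and duplicated titles for manual review.  The word count threshold is purely illustrative, not a number from Google.

```python
# A minimal sketch covering one small slice of a content audit, assuming a
# hypothetical crawl export with columns: url, title, word_count. It flags
# thin pages and duplicate titles -- two common Panda-style risks.
import csv
from collections import defaultdict

THIN_WORD_COUNT = 250  # illustrative threshold, not a Google number

def audit(path):
    thin, titles = [], defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["word_count"]) < THIN_WORD_COUNT:
                thin.append(row["url"])
            titles[row["title"].strip().lower()].append(row["url"])
    dupes = {t: urls for t, urls in titles.items() if len(urls) > 1}
    return thin, dupes

if __name__ == "__main__":
    thin, dupes = audit("site_crawl.csv")  # hypothetical crawl export
    print(f"{len(thin)} thin pages, {len(dupes)} duplicated titles flagged for review")
```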

3. Execute, Wait, and Adjust
As I mentioned earlier, both updates will be rolled out periodically.  This means you need to wait until they roll out again to know if your changes succeeded in lifting the penalty.  This also means you need to move quickly.  If each update is rolled out monthly, then you need to analyze the problem, map out changes, and execute those changes before the next update.  If not, you can miss your window of opportunity.  If you miss the window, you might blow an entire month.  If your business relies on Google traffic to survive, that can be extremely costly.  Once the updates roll through, you can determine what worked and what didn’t.  Then you need to adjust quickly.  I wish Panda and Penguin were live all the time, but they aren’t at this point.

4. Get Search Analytics In Order
As you can imagine, to accurately analyze the situation you need your Search Analytics in order.  That includes your analytics package, whether it’s Google Analytics, Omniture, WebTrends, etc.  In addition, you should have Google Webmaster Tools and Bing Webmaster Tools set up.  If you are using Google Analytics, you can create advanced segments for various categories of organic keywords.  That will enable you to quickly analyze core sets of data.  In addition, you should use annotations to document changes in GA.  You also might want to set up custom reports, based on your own organic search situation.

In Google Webmaster Tools, you should be tracking a number of items, including the Search Queries report (which will show you the number of impressions, clicks, average position, etc. for queries that returned your site in Google search results).  You can also see the percentage of change for core metrics in this report.  You should also be exporting your data from Google Webmaster Tools, since the data only goes back 90 days.
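Here is a minimal sketch of how you might archive those exports so you keep more than 90 days of history.  It assumes you download the Search Queries CSV periodically; the filenames and column names below are illustrative, so map them to whatever your export actually contains.

```python
# A minimal sketch for archiving Search Queries exports over time, since the
# data in Google Webmaster Tools only goes back 90 days. Column names are
# assumptions -- adjust them to match your actual CSV export.
import csv
import os
from datetime import date

MASTER = "search_queries_history.csv"
FIELDS = ["snapshot_date", "query", "impressions", "clicks", "avg_position"]

def archive_export(export_path, snapshot=None):
    """Append one downloaded export to a running history file."""
    snapshot = snapshot or date.today().isoformat()
    new_file = not os.path.exists(MASTER)
    with open(export_path, newline="") as src, open(MASTER, "a", newline="") as dst:
        writer = csv.DictWriter(dst, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        for row in csv.DictReader(src):
            writer.writerow({
                "snapshot_date": snapshot,
                "query": row["query"],
                "impressions": row["impressions"],
                "clicks": row["clicks"],
                "avg_position": row["avg_position"],
            })

if __name__ == "__main__":
    archive_export("top_queries_export.csv")  # hypothetical downloaded export
```

Run something like this each time you download the report and you will build a keyword history that outlives the 90-day window, which makes before-and-after comparisons around update dates much easier.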

Summary – Overcoming Pandeguin
Based on speaking with many webmasters since Penguin hit, I know how frustrating it can be when you’ve been impacted by an algorithm update.  But some business owners have a bigger problem to deal with, namely Pandeguin.  If you’ve been hit by both algorithm updates, then follow the recommendations I provided in this post to begin building back search traffic from Google.  It might be a long road back, but you need to start somewhere.  Good luck.

GG

Filed Under: algorithm-updates, google, seo

7 More Penguin Findings: An Update From the Over Optimization Front Lines

May 11, 2012 By Glenn Gabe 13 Comments

More Penguin 1.0 Findings

After Penguin first hit on April 24th, I started performing a lot of analysis on websites that were affected.  I ended up quickly publishing my findings in two blog posts here on the Internet Marketing Driver.  The first explained how exact match domains could be susceptible to penalty based on how those domains were being used.  A few days later, I wrote a second post that included initial findings based on analyzing a number of websites hit by Penguin.  In that post, I explained how the initial rollout of Penguin seemed extremely inbound link-heavy.  I simply wasn’t seeing other webspam tactics getting penalized.  Every website that I analyzed that had gotten nuked ended up having serious inbound link issues.

Based on writing those two initial posts, I’ve had numerous businesses reach out to me that have been hit by Penguin.  They range from businesses running a single website to owners of hundreds of websites.  It’s been absolutely fascinating to hear what’s happened to various websites (and networks), and then be able to analyze those sites.  In total, I’ve analyzed approximately 60-70 websites since Penguin hit.  As a result, I have a lot of data.  My goal with this post today is to share some of my findings, explain what I’m seeing, and shed some light on the situation.

Real People, Real Problems
The first thing I wanted to mention before getting into my Penguin findings has little to do with SEO.  Since April 24th, I’ve had the opportunity to speak with a lot of business owners that have gotten hit by Penguin.  I’ve been amazed at how open everyone has been with me.  Some people emailed me, while others simply called me directly.  In almost all cases, you could feel their despair through their emails or in their voices.

I just want to emphasize that no matter what you think about webmasters using grey hat or black hat tactics, it’s important to know that there are still people on the other side of those websites.  Real people, that now have real problems.  Yes, many broke the rules.  I get it.  But it’s still hard to hear some of the stories…   Some won’t be able to pay their medical bills now, while others are going to find it hard to pay their mortgages.  I think that gets lost when speaking about Penguin, and it shouldn’t.  That’s why I titled this article, “An Update from the Front Lines”.  It’s like digital combat.  And I’ve spoken directly with the wounded.

With that out of the way, let’s dig into my findings:

1. High Threshold of Exact Match Anchor Text Got Hammered
As I mentioned in my last post, Penguin 1.0 was extremely inbound link-heavy.  I haven’t seen other webspam tactics get hit like spammy inbound links.  Based on my analysis, websites with a high percentage of exact match anchor text were hammered.  For example, a site with 80%+ of its inbound links using exact match anchor text got smoked.  Performing an inbound link analysis on many of the sites I reviewed revealed unnatural links.  That included links on low quality sites, article marketing sites, etc.  I have yet to come across a website with a truly diversified link profile that got hammered by Penguin.

High Percentage of Exact Match Anchor Text
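For reference, here is a rough sketch of how you could calculate that exact match percentage yourself.  It assumes a hypothetical CSV export of inbound links with an anchor_text column, and the list of target phrases (your money keywords) is something you supply.

```python
# A minimal sketch for checking your anchor text distribution, assuming a
# hypothetical inbound link export with an anchor_text column. The target
# phrases are your own money keywords, not anything pulled from Google.
import csv

TARGET_PHRASES = {"blue widgets", "cheap blue widgets"}  # illustrative examples

def exact_match_share(path):
    """Percentage of inbound links whose anchor text exactly matches a target phrase."""
    total = exact = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            if row["anchor_text"].strip().lower() in TARGET_PHRASES:
                exact += 1
    return (exact / total * 100) if total else 0.0

if __name__ == "__main__":
    share = exact_match_share("inbound_links.csv")  # hypothetical export
    print(f"Exact match anchor text: {share:.1f}% of inbound links")
```

If a calculation like this puts you anywhere near the 80%+ range mentioned above, your link profile is far from natural and should be a priority.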

2. Other Spam Tactics Not Hit Yet
As part of my analysis, I came across several sites that were keyword stuffing, had over-optimized title tags, using doorway pages, etc., but didn’t get hit.  As I said in my previous post, I believe future releases of Penguin could hit those tactics.  I don’t believe sites using those methods are safe.  It is probably just a matter of time.  Beware.

3. Very Low False Positive Rate
I have seen a very low false positive rate.  Actually, I haven’t come across one site that was a clear false positive.  Also, Danny Sullivan interviewed Matt Cutts about Penguin this week, and Matt explained that Google is happy with the results of Penguin.  He said the false positive rate is very low.  I have to agree with his assessment.  I’ve analyzed many websites and almost every one of them had serious inbound link issues.

By the way, it also sounds like Penguin will be rolled out periodically (like Panda).  If that’s the case, then you won’t notice any changes until the next version of Penguin rolls out (no matter what you change in the meantime).

4. Panda + Penguin = Confusion for Marketers
This one is really confusing webmasters.  Panda rolled out on 4/19, and then Google rolled out a Panda refresh on 4/27.  In between, Google rolled out Penguin.  As you can imagine, this is extremely confusing for marketers.  Many don’t know how to even determine what they were hit by.  And the last thing you want to do is to take action thinking you were hit by Penguin, when you were actually hit by Panda (or vice versa).  I’ve had several companies contact me saying they were hit by Penguin, when in reality, they were hit by Panda.  My advice is to make sure you know which update hit you, and then form a plan of attack.  If you are confused about this, contact a professional SEO.

Panda or Penguin Update

5. Private Networks
I’ve had several owners of private networks contact me about Penguin.  To clarify, I’m calling a network of websites owned and operated by one entity a private network.  The sites all leverage each other for links, and as you can guess, exact match anchor text links are heavily used.  Many of you reading this post would probably assume that all of the sites that were part of private networks would have gotten nuked equally.  But that’s not the case.  I analyzed dozens of websites that were part of private networks, and what I found is going to surprise you.  Only some of the network sites were hit, while others remained untouched.  Then there were some that dropped a few spots in the SERPs, but only marginally.

This was fascinating for me to analyze.  I would drill into a site that got hammered, and find all of the unnatural links.  Then I would analyze a site within the same network, with the same types of links, and it was untouched.  Then another site that was part of the same network, with the same links, only lost marginal rankings.  Why?  Did Google really miss those sites?  That’s hard to believe.  They all linked to one another using exact match anchor text.

I did start seeing a trend with certain categories of websites.  For example, categories A and B were getting hit, while category C was untouched (across websites).  This led me to believe that Google might be targeting certain categories with Penguin 1.0.  I can’t say for sure if that’s the case, but I saw this several times during my analysis of multiple private networks.

Private Link Network

6. Google Isn’t Done With Public Link Networks
If private networks consist of websites owned and operated by one company, then public networks are large networks of websites where many different companies participate.  It’s more of a typical link network where sites unrelated to one another all link to each other.  As you can imagine, Google hated this tactic, and hammered many with Penguin 1.0.  But I noticed something really strange.  One of the largest public link networks I came across was untouched.  Literally, not one site that I checked seemed to have gotten hit by Penguin.  Again, how in the world did Google miss a network this large?  It just didn’t make sense.

Since these sites were all unrelated, the category issue I mentioned earlier couldn’t be what saved them.  So why would Google let such a large network of websites survive?  Good question, and it seems like it’s just a matter of time before that network gets hit.  I think this just proves that Penguin wasn’t perfect.  So, although I didn’t see any true false positives during my analysis, I did see some false negatives.  I expect future releases of Penguin to address those issues.

7. Penalty vs. Devalued Links
I mentioned earlier, when referring to private link networks, that I saw some sites get nuked while others were only marginally hurt.  But I also saw some standalone websites with unnatural link profiles drop in rankings without getting annihilated.  So, why would some sites get hammered while others simply drop slightly in rankings?

This probably means that Google simply devalued the links pointing to the sites that were marginally hit, versus applying a serious penalty to the site.  If a site fell off the Google map, it was penalized.  If it dropped a few spots in search, or jumped to page 2, it’s possible that some of the website’s inbound links were devalued.  It’s worth analyzing your rankings to determine whether you were hit hard by Penguin or whether you’re simply seeing devalued links.
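Here is a minimal sketch of that type of ranking analysis.  It assumes a hypothetical CSV of tracked keywords with rank_before and rank_after columns; the thresholds are just illustrative cutoffs for “off the map” versus “slipped a few spots”, not anything Google has confirmed.

```python
# A minimal sketch for separating "hammered" keywords from "slipped a few
# spots" keywords. Assumes a hypothetical CSV of tracked keywords with
# columns: keyword, rank_before, rank_after (rank_after blank if no longer ranking).
import csv

def classify(path):
    hammered, slipped, stable = [], [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            before = int(row["rank_before"])
            after = int(row["rank_after"]) if row["rank_after"] else 999
            drop = after - before
            if after >= 50:            # effectively off the Google map
                hammered.append(row["keyword"])
            elif drop >= 3:            # slid a few spots or onto page 2
                slipped.append(row["keyword"])
            else:
                stable.append(row["keyword"])
    return hammered, slipped, stable

if __name__ == "__main__":
    hammered, slipped, stable = classify("keyword_rankings.csv")  # hypothetical export
    print(f"Likely penalized: {len(hammered)}, possible devalued links: {len(slipped)}, stable: {len(stable)}")
```

If most of your keywords land in the first bucket, you’re probably dealing with a true Penguin hit; if they mostly land in the second, devalued links are the more likely explanation.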

Back to the Front Lines. More Updates Soon
That’s what I have for now.  It’s definitely ugly out there.  My recommendation is to analyze your current situation the best you can.  Determine what you were hit by, Panda or Penguin, and then form a plan of attack.  During my analysis, I found there are some sites with a clear problem, while others involve a deeper analysis.  Good luck.

GG

Filed Under: algorithm-updates, google, seo
