Archives for November 2015

Google’s Unconfirmed Algorithm Updates in 2015 and Their Connection to Panda and Phantom (Including the 11/19/15 Update)

November 23, 2015 By Glenn Gabe 11 Comments

[Image: 2015 unconfirmed Google algorithm updates]

2015 has been an incredibly interesting year from a Google algorithm update standpoint. Although there weren't as many confirmed updates as in previous years, it was still a relatively volatile year algorithm-wise. We had the mobile-friendly algorithm released in April 2015, Phantom 2 confirmed in early May, and then Panda 4.2 in July.

Although those are the three updates Google confirmed, there were absolutely other updates that caused significant movement in the SERPs this year. And in my opinion, several of those updates seemed directly connected to "content quality," just like Panda and Phantom.

Note, Google can roll out hundreds of updates per year, but many of those are minor and don't cause major changes in rankings or traffic. In fact, John Mueller explained that Google rolled out over one thousand changes in 2014. It's important to understand that the updates I'm covering in this post would be categorized as significant, based on major overnight gains or losses of Google organic traffic.

The Importance of Identifying Unconfirmed Updates For Webmasters
For business owners and webmasters, it’s incredibly important to understand the algorithm updates that have impacted a specific domain. I mentioned that in my previous Search Engine Land column about performing a Search History Analysis. If you don’t know what hit you, then you cannot have a solid strategy for attacking those problems and designing the right remediation plan.

But if you do have a good understanding of which algorithm updates have caused increases or decreases in traffic, then you can analyze the damage through the lens of the update that impacted the site. And that’s a much stronger way to go for companies looking to recover their Google organic search traffic.

I’ll first go through the various unconfirmed updates that have rolled out in 2015 and then explain more about their apparent ties to “content quality”. Again, they definitely seemed connected to Panda and The Quality Update (AKA Phantom 2) based on analyzing many sites that were impacted throughout the year. And if you’re wondering about the November 19 update, I cover that at the end. Based on what I’m seeing, it looks very similar to the others I mention in this post. Let’s begin.

February 5th 2015
It wasn't long before we saw a significant update in 2015. Specifically, there was a lot of movement on 2/5/15. Based on the large Panda dataset I have access to, as well as new companies reaching out to me about a drop in traffic, it was clear there was some connection with "content quality". Also, many of the sites that were impacted had previously been impacted by Panda. There were many claims throughout the industry of drops or gains in traffic starting on 2/4 and 2/5.

[Screenshots: the Feb 5, 2015 Google algorithm update; the Feb 5 update and Phantom 2 in May 2015; the Feb 5, Phantom 2, and September 2015 updates]

Again, it’s important to note that a number of the companies that experienced a large drop on February 5, 2015 had been previously impacted by Panda updates. I saw it time and time again. And although some claimed it was entirely focused on ecommerce sites, that’s definitely not the case. I know many sites outside of ecommerce that were hit too. It’s also important to note that Barry Schwartz reached out to Google to see if it was Panda or Penguin-related and was told that wasn’t the case.

So definitely take a hard look at February 5, 2015 if you have experienced a drop in Google organic traffic this year. It very well could have been the 2/5 update, which again, seemed to be heavily focused on content quality.

Side Note: RankBrain Introduced
It's worth noting that RankBrain supposedly started rolling out in early 2015. It doesn't sound like it had fully rolled out by February, so I'm not sure if it had any impact on the 2/5 update. That said, any algorithm that helps Google handle the long tail of search (which is massive) is important to document. Personally, I think the 2/5/15 update had more to do with content quality factors being baked into Google's core ranking algorithm. More about this soon.

Phantom 2 – Early May 2015
I’m including this confirmed major algorithm update in the list since it was originally unconfirmed. Google never intended to release any information about Phantom, but ended up doing so based on the buzz it generated. If you are unfamiliar with Phantom 2, I picked up what looked to be a major algorithm update in late April and early May of 2015 that impacted many sites across the web. And I’m referring to major swings in rankings and traffic. I named the update “Phantom” based on its mysterious nature.

Based on how large the update was, I ended up being interviewed by CNBC soon after writing my post, and the CNBC article went live the next day. Google finally revealed that there was an update, and it was a change to its core ranking algorithm with how it assessed “quality”. And if you pay attention to SEO at all, then you know that’s a really big deal. By the way, the update has been called “The Quality Update”, but I’m sticking with Phantom. :)

[Screenshots: a Phantom 2 hit; a Phantom 2 drop in May 2015; a massive Phantom 2 hit in May 2015]

You can read more about Phantom in my post containing my findings, but it was a huge update that seemed to target many factors that Panda also targeted. And that got me thinking that Phantom could actually be the beginning of Panda being incorporated into Google’s core ranking algorithm. It’s hard to say if that was the case, or if that could have started during the 2/5 update, but both looked to heavily target content quality issues.

July 2015 – A Note About Panda 4.2
The summer brought a confirmed update in Panda 4.2 on July 18, 2015. And it wasn’t pretty, since it was an extended rollout based on “technical problems” Google was having with the algorithm update. Since 7/18/15, many companies have been eagerly checking their reporting to see if P4.2 has impacted their Google organic traffic. Unfortunately, many previous Panda victims (especially those impacted by Panda 4.1 in September 2014 and the 10/24/14 update) are still waiting for recovery. And that’s after making significant changes to their site, content, advertising setup, etc.

[Screenshots: a Panda 4.2 recovery in July 2015; a Panda 4.2 hit in July 2015]

There was a heavy focus on Panda 4.2 in the industry as we led into the fall of 2015. And that’s when the algo Richter scale in my office started to jump again. Strap yourselves in, the fall of 2015 was a bumpy ride.

September 2 and 16 Updates
In early September, I saw significant movement across sites that had been impacted by Panda and Phantom in the past (notice the trend here?). The first date was 9/2/15, and many sites either surged or dropped on that day. Again, these were sites that had content quality problems and had dealt with Panda and/or Phantom situations before.

And it wasn’t long until the next update hit the scene. Exactly two weeks later on 9/16, I saw another update roll through. More sites that had dealt with content quality problems saw movement (either up or down). I wrote a post detailing my findings after analyzing a number of sites impacted by the 9/2 and 9/16 updates. Based on my analysis, it was clear that “content quality” was the focus.

So, was it Panda 4.2 continuing its rollout or more core ranking algorithm adjustments like Phantom? Or was it a mix of Panda 4.2 and core ranking algo adjustments? Only Google knows.

[Screenshots: the September 16 Google update; a drop during the September 16 update; a September 16 update connection; a September 16 surge after a Phantom 2 drop]

And We’re Not Done Yet! – November 19, 2015
As I was finalizing this post, it seems that yet another unconfirmed update has been released. There were many claims of movement starting on 11/19/15 from webmasters globally. I am also seeing movement based on the data I have access to, in addition to new companies reaching out to me about the update. So it does look like there was an update pushed last Thursday.

[Screenshots: a November 19, 2015 update connection; a November 19 update increase; a November 19, 2015 update drop]

Also, Google's John Mueller was asked on Twitter if there was an update, and he replied with the response below. I've heard that type of response quite a bit over the years when a core ranking algo update has rolled out. :)

[Screenshot: John Mueller's tweet about the November 19, 2015 Google update]

When analyzing the sites seeing movement, I found that many of them had previously been impacted by Panda and/or Phantom. Once again, the update looks "content quality" related to me and does not look to be connected to links. That said, it's still early. If this was another unconfirmed update focused on "content quality," then it joins the others mentioned in this post: the February update, Phantom in May, the September updates, and now a late-November update.

{Update: I have now heavily analyzed the November 19, 2015 update and it definitely had a strong connection to Phantom 2 from May 2015. Many sites that were impacted during Phantom 2 in May were also impacted on November 19 during Phantom 3. And a number of companies working to rectify problems saw recovery and partial recovery. You can learn more about Google’s Phantom 3 Update in my post covering a range of findings.}

———————
The rapid release of these updates got me thinking about their connection to Panda and what Google has planned for our cute, black and white friend. I have a theory about this and I’ll cover that next.

Are The Rapid-fire “Quality Updates” The Deconstruction of Panda?
We know that Google intends to bake Panda completely into its core ranking algorithm at some point (and Penguin as well). I mentioned earlier that all of these unconfirmed updates could be Google deconstructing Panda, taking those pieces, and baking them into the core ranking algorithm. That could have started with the 2/5 update, continued with Phantom 2 in early May, picked up again with the bi-weekly updates in the fall of 2015, and continued with the 11/19 update. The summer could simply have been a break in the process while Panda 4.2 rolled out.

Note, that is speculation, but the data I've analyzed supports the theory. There were a number of algorithm updates focused on "content quality," and we know Google wants to bake Panda into its real-time algorithm. It's hard to say for sure, but it's important to keep in mind.

Dealing With Unconfirmed Quality Updates
As you can guess, this has been extremely frustrating for many business owners dealing with traffic loss from Panda, Phantom, or unconfirmed algorithm updates. Based on helping a number of companies with this situation, I’ve provided a bulleted list below with some recommendations. It is not a comprehensive list of what to do, but can definitely help you get moving in the right direction.

Recommendations for dealing with Google’s various “content quality” updates:

  • Perform a search history analysis, which can help you document the various swings in organic search traffic over time. Then line those dips and surges up with both confirmed and unconfirmed algorithm updates (see the sketch after this list for one way to visualize that).
  • Understand the content and queries that took a hit. Run a Panda report (which can be used for other algo updates as well) to find the core pages that dropped after a specific date.
  • For quality updates (like Panda, Phantom, and the other updates listed in this post), focus on hunting down low content quality. And as I’ve said before many times, “low quality content” can mean several things. For example, thin content, scraped content, duplicate content, technical problems causing quality issues, advertising problems causing engagement issues, and more.
  • Rectify user engagement issues quickly. Once you find problems causing user engagement issues, fix them as soon as you can. Engagement barriers drive user frustration, which can send those users running back to the SERPs. And low dwell time sends horrible signals back to the mothership. Beware.
  • Avoid user deception at all costs (especially from an advertising standpoint.) Don’t blend ads into your content, don’t surround key elements of the page with ads that can be mistakenly clicked, and don’t provide ad links that look like navigation, but drive users to third party sites. Hell hath no fury like a user scorned. By the way, ad deception is in Google’s Quality Rating Guidelines. You should download the pdf to learn more.
  • Understand how Google sees your site. Use fetch and render to truly understand how Googlebot views your content. I can’t tell you how many times I’ve audited a site and found serious render issues. Fetch and Render is your friend. Use it.
  • Noindex what needs to be removed and focus Google’s attention on your highest quality content. Make hard decisions. Nuke what needs to be nuked. Be objective and aggressive when needed.
  • Rewrite and revamp content that needs a boost. Not all low quality content needs to be noindexed or nuked. If you feel it’s a strong topic, but the content isn’t the best it can be, then work on enhancing its quality. For example, brainstorm ways to enhance the content data-wise, visually, and based on what users are looking for. You don’t have to nuke it if you can boost it.
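
For that first bullet on lining up dips and surges with update dates, here is a minimal sketch of one way to visualize it. The CSV file name and its date/sessions columns are hypothetical (swap in your own analytics export), and the Phantom 2 date is approximate since that update rolled out over several days. It simply plots daily Google organic traffic and draws a marker at each of the 2015 dates covered in this post:

```python
# Minimal sketch (assumptions: a hypothetical "organic-traffic.csv" export with
# "date" and "sessions" columns). Plots the daily trend and marks the 2015
# confirmed and unconfirmed update dates discussed in this post.
import pandas as pd
import matplotlib.pyplot as plt

UPDATE_DATES = {
    "2015-02-05": "Feb 5 update",
    "2015-05-01": "Phantom 2 (early May, approximate)",
    "2015-07-18": "Panda 4.2",
    "2015-09-02": "Sep 2 update",
    "2015-09-16": "Sep 16 update",
    "2015-11-19": "Nov 19 update",
}

df = pd.read_csv("organic-traffic.csv", parse_dates=["date"]).sort_values("date")

fig, ax = plt.subplots(figsize=(12, 5))
ax.plot(df["date"], df["sessions"], label="Google organic sessions")

# Draw a dashed vertical line and a label for each update date.
for date_str, label in UPDATE_DATES.items():
    ts = pd.Timestamp(date_str)
    ax.axvline(ts, linestyle="--", alpha=0.5)
    ax.annotate(label, xy=(ts, ax.get_ylim()[1]), rotation=90,
                va="top", ha="right", fontsize=8)

ax.set_xlabel("Date")
ax.set_ylabel("Sessions")
ax.set_title("2015 Google organic trend vs. algorithm update dates")
ax.legend()
plt.tight_layout()
plt.show()
```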


Summary – Know What Hit You, Respond Accordingly
As you can see, 2015 has been a volatile year from an algorithm update standpoint, yet only a few updates were actually confirmed by Google. In this post, I covered additional important updates that rolled out throughout the year, starting back in February, and that could have impacted your Google organic traffic. I recommend reviewing your 2015 Google organic trending and identifying any swings in traffic around those important dates. Then form a strong plan of attack for fixing any problems from a content quality standpoint.

You have to know what hit you in order to take the appropriate actions. And that’s sometimes hard to do when Google doesn’t confirm specific updates. I hope this post was helpful, at least from that standpoint. Good luck.

GG

 

Filed Under: algorithm-updates, google, seo

How To Check The X-Robots-Tag For Noindex Directives (Google Tools, Chrome Extensions, and Third-party Crawlers)

November 8, 2015 By Glenn Gabe Leave a Comment

[Image: the X-Robots-Tag in the header response]

Updated: April 2022
The post now contains the most current tools I use for checking the x-robots-tag for noindex directives. The list includes tools directly from Google, Chrome extensions, and third-party crawling tools.

——

I have previously written about the power (and danger) of the meta robots tag. It’s one line of code that can keep lower quality pages from being indexed, while also telling the engines to not follow any links on the page (i.e. don’t pass any link signals through to the destination page).

That's helpful when needed, but the meta robots tag can also destroy your SEO if used improperly. For example, you might mistakenly add the meta robots tag with noindex to pages that should be indexed. If that happens, and if it's widespread, your pages can start dropping from Google's index. And when that happens, you can lose rankings for those pages and the subsequent traffic. In a worst-case scenario, your organic search traffic can plummet in almost Panda-like fashion. In other words, it can drop off a cliff.

And before you laugh off that scenario, I can tell you that I've seen it happen to companies a number of times during my career. It could be human error, CMS problems, reverting to an older version of the site, etc. That's why it's extremely important to check for the presence of the meta robots tag to ensure the right directives are being used.

But here's the rub: that's not the only way to issue noindex and nofollow directives. In addition to the meta robots tag, you can also use the x-robots-tag in the header response. With this approach, you don't need a meta tag added to each url; instead, you supply the directives via the server's header response.
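
As a quick illustration (not something from the original post), here's a minimal sketch of how a server could attach that header, using a hypothetical Flask route. The same idea applies to Apache, nginx, or any CMS that lets you set response headers:

```python
# Minimal sketch (assumption: a small Flask app) showing noindex/nofollow being
# delivered via the header response instead of a meta tag in the HTML.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/internal-search/")
def internal_search():
    resp = make_response("<html><body>Internal search results...</body></html>")
    # The directive lives in the HTTP header, not in the page's HTML.
    resp.headers["X-Robots-Tag"] = "noindex, nofollow"
    return resp

if __name__ == "__main__":
    app.run()
```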

Here are two examples of the x-robots-tag in action:
[Screenshot: examples of the x-robots-tag in header responses]

Again, those directives are not contained in the html code. They are in the header response, which is invisible to the naked eye. You need to specifically check the header response to see if the x-robots-tag is present, and which directives it contains.
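
If you want to spot-check a single url from a script, a few lines of Python can surface the header. Here's a minimal sketch using the requests library (the url is a placeholder):

```python
# Minimal sketch: print the X-Robots-Tag header (if any) returned for a url.
import requests

url = "https://www.example.com/some-page/"  # placeholder url
resp = requests.get(url, allow_redirects=True, timeout=10)

# Header lookups in requests are case-insensitive.
x_robots = resp.headers.get("X-Robots-Tag")

if x_robots:
    print(f"{url} -> X-Robots-Tag: {x_robots}")
else:
    print(f"{url} -> no X-Robots-Tag header present")
```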

As you can guess, this can easily slip through the cracks unless you are specifically looking for it. Imagine checking a site for the meta robots tag, thinking all is ok because you can't see it, while the x-robots-tag is serving "noindex, nofollow" on every url. Not good, to say the least.

How To Check The X-Robots-Tag in the Header Response

Based on what I explained above, I decided to write this post to explain several different ways to check for the x-robots-tag. By adding this to your checklist, you can ensure that important directives are correct and that you are noindexing and nofollowing the right pages on your site (and not important ones that drive a lot of traffic from Google and/or Bing). The list below contains tools directly from Google, Chrome extensions, and third-party crawling tools for checking urls in bulk. Let’s jump in.

1. Tools Directly From Google

Google’s URL Inspection Tool
There’s nothing better than going straight to the source. With Google’s URL Inspection Tool, you can check specific urls to see if they are indexable. And as you can guess, the tool will specify if noindex is being delivered via the x-robots-tag (via the header response).

[Screenshot: using Google's URL Inspection Tool to check the x-robots-tag]

URL Inspection API
You can also use Google’s URL Inspection API to test urls in bulk. Once you run urls through the API, you can see if they are being noindexed via the x-robots-tag. You can check my tutorial for creating a multi-site indexing monitoring system to learn more about how to use the API.

[Screenshot: using the URL Inspection API to check the x-robots-tag]
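
For reference, here's a rough sketch of what an API call could look like using the google-api-python-client library and a service account with access to the property. The key file, property url, and exact response field names here are assumptions; see Google's API documentation and my tutorial for the full setup:

```python
# Rough sketch: inspect a url via the Search Console URL Inspection API and
# print how indexing is being evaluated. The service-account key file and the
# property/inspection urls are placeholders, and the response fields shown
# should be verified against the current API documentation.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "inspectionUrl": "https://www.example.com/some-page/",
    "siteUrl": "https://www.example.com/",  # or "sc-domain:example.com"
}
response = service.urlInspection().index().inspect(body=body).execute()

index_status = response["inspectionResult"]["indexStatusResult"]
# indexingState can indicate that indexing is blocked by the http header
# (i.e., the x-robots-tag) rather than by a meta tag or robots.txt.
print("Verdict:         ", index_status.get("verdict"))
print("Indexing state:  ", index_status.get("indexingState"))
print("Robots.txt state:", index_status.get("robotsTxtState"))
```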

Mobile-friendly Test
Google’s mobile-friendly test can also show you if a page is being noindexed via the x-robots-tag. And since the tool can check any public url, you don’t need to have the site verified in GSC.

[Screenshot: using the mobile-friendly test to check the x-robots-tag]

2. Chrome Extensions

Web Developer Plugin
The web developer plugin is one of my favorite plugins for checking a number of important items, and it’s available for both Firefox and Chrome. By simply clicking the plugin in your browser, then “Information”, and then selecting “Response Headers”, you can view the http header values for the url at hand. And if the x-robots-tag is being used, you will see the values listed.

[Screenshot: checking the x-robots-tag using the Web Developer plugin]

Detailed SEO chrome extension
The Detailed SEO extension provides a boatload of SEO information based on the page you are analyzing. And yes, it includes the x-robots-tag. One quick click and you can view if the page is being noindexed via the x-robots-tag. I highly recommend this plugin overall, and for checking the x-robots-tag. I think you’ll dig it.

[Screenshot: using the Detailed SEO Chrome extension to check the x-robots-tag]

Robots Exclusion Checker
This is another one of my favorite chrome extensions. The Robots Exclusion Checker will check the status of the robots.txt file, meta robots tag, x-robots-tag, and canonical url tag. I use this plugin often and it works extremely well for checking the x-robots-tag.

[Screenshot: using the Robots Exclusion Checker to check the x-robots-tag]

3. Crawling Tools
Now that I’ve covered Google’s tools and some Chrome extensions for checking the x-robots-tag, let’s check out some robust third-party crawling tools. For example, if you want to crawl many urls in bulk (like 10K, 100K, or 1M+ pages) to check for the presence of the x-robots-tag, then the following tools can be extremely helpful.
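
And if you just need a lightweight check on a modest list of urls before firing up a full crawler, a short script can do the job as well. Here's a minimal sketch (the input file and output format are assumptions):

```python
# Minimal sketch: check a list of urls for the X-Robots-Tag header and write a CSV.
# Assumes a hypothetical "urls.txt" file with one url per line.
import csv
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("x-robots-report.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["url", "status_code", "x_robots_tag"])
    for url in urls:
        try:
            resp = requests.get(url, allow_redirects=True, timeout=10)
            writer.writerow([url, resp.status_code,
                             resp.headers.get("X-Robots-Tag", "")])
        except requests.RequestException as exc:
            writer.writerow([url, "error", str(exc)])
```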

DeepCrawl
If you want a robust, enterprise-level crawling engine, then DeepCrawl is for you. Note, I've been such a big proponent of DeepCrawl that I was on the customer advisory board for years. So yes, I'm a fan. :)

After crawling a site, you can easily check the “Noindex Pages” report to view all pages that are noindexed via the meta robots tag, the x-robots-tag header response, or by using noindex in robots.txt. You can export the list and then filter in Excel to isolate pages noindexed via the x-robots-tag.

[Screenshot: checking the x-robots-tag using DeepCrawl]

Screaming Frog
I’ve also been a big fan of Screaming Frog for a long time. It’s an essential tool in my SEO arsenal and I often use Screaming Frog in combination with DeepCrawl. For example, I might crawl a large-scale site using DeepCrawl and then isolate certain areas for surgical crawls using Screaming Frog.

Once you crawl a site using Screaming Frog, you can simply click the Directives tab and then look for the x-robots column. If any pages are using the x-robots-tag, then you will see which directives are being used per url.

[Screenshot: checking the x-robots-tag via Screaming Frog]

Sitebulb
Sitebulb is another excellent crawling tool and it also provides information based on the x-robots-tag. You can find that information in the Indexability section and then the Noindex reporting. I use Sitebulb often while analyzing websites.

[Screenshot: checking the x-robots-tag via Sitebulb]

JetOctopus
I’ve also started using JetOctopus for crawling websites, which is another excellent enterprise-level crawling tool. And as you can guess, it reports on the x-robots-tag as well. Not as many people in the industry know about JetOctopus, but it’s a solid crawling tool that I’m using more and more.

[Screenshot: checking the x-robots-tag via JetOctopus]

Summary – There’s more than one way to noindex a page…
OK, now there’s no excuse for missing the x-robots-tag during an SEO audit. :) If you notice certain pages are not being indexed, yet the meta robots tag isn’t present in the html code, then you should definitely check for the presence of the x-robots-tag. You just might find important pages being noindexed via the header response. And again, it could be a hidden problem that’s causing serious SEO issues.

Moving forward, I recommend checking out the various tools, Chrome extensions, and crawlers I listed in this post. All can help you surface important directives that can be impacting your SEO efforts.

GG

Filed Under: google, seo, tools
