The Internet Marketing Driver


Favi-gone: 5 Reasons Why Your Favicon Disappeared From The Google Search Results [Case Studies]

February 22, 2022 By Glenn Gabe

favicons in Google Search

They say “a favicon is worth a thousand words”. OK… they really don’t say that, but favicons can definitely be important from a Search perspective. In 2019, Google started displaying favicons in the mobile search results as part of a mobile redesign, and it turns out those little graphics in the SERPs can help on several levels. For example, a favicon can help reinforce your brand, attract eyeballs in a crowded SERP, and improve click-through rate. So you definitely want to make sure your favicon game is strong.

Favicons in the Google search results.

Google published guidelines for defining a favicon in order to make sure they can be properly displayed in the SERPs. If you don’t adhere to those guidelines, Google can choose to ignore your favicon and provide a generic one for you. And there’s nothing more “meh” than the generic globe favicon Google provides. Let’s just say you won’t stand out in the SERPs with their generic favicon showing…

Generic favicon in Google search results.
Generic globe favicon in the Google search results.

In addition, you can end up with a blank favicon, which is super-awkward. The space for the favicon is reserved, but nothing shows up. It’s just a blank white space where a favicon should appear. So sad… and I’ll explain more about that later in the post.

Blank favicon in the Google search results.

Here is another example of a blank favicon (and not just the generic globe favicon):

Missing favicon in the Google search results.

Favicon Assistance: When site owners reach out about favicon problems.
Every now and then, site owners reach out in frustration when their favicons go missing from the search results. When that happens, it can be a very confusing situation for them… Over the past few weeks, I helped a few more site owners troubleshoot favicon problems in the search results, and based on what I found, I figured I would write a post explaining the top reasons I’ve seen for favicon problems in Google Search.

The problems are relatively easy to fix and changes can be picked up by Google pretty quickly for most sites. For example, one of the latest fixes I helped with was picked up in just a few hours and the SERPs were updated in less than a day (with the new favicon).

Favicons Disappearing and Questions About Quality:
When favicons go missing, some site owners immediately jump to thinking that Google somehow doesn’t trust their site anymore, or that quality problems are causing Google to stop displaying their favicons (like how rich snippets can be impacted by broad core updates). That’s not the case. A favicon going missing from the SERPs has nothing to do with site quality. Instead, it has everything to do with technical problems with the favicon, or with violating Google’s guidelines for providing favicons.

So if your favicon goes missing, it’s not that Google has suddenly reevaluated your site quality-wise. It’s probably due to technical issues or other guideline violations (which I’ll cover below). 

Where did your favicon go? Troubleshooting common favicon problems in Google Search.
Below, I’ll cover several common problems I have seen while helping site owners troubleshoot favicons that disappear from the search results (or favicons that just aren’t displayed properly by Google).

1. Wrong dimensions, no favicon for you…
This is the most common issue I have seen. Google has explained in detail that favicons must be a multiple of 48×48 pixels. So, make sure your favicon is at least 48×48, or a multiple of 48×48 (for example, 96×96, 144×144, etc.). Don’t use a favicon that’s smaller than 48×48.

Favicon dimension guidelines from Google.
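To make this concrete, here is a minimal sketch of a guideline-compliant favicon reference in the homepage head. The file name, path, and 96×96 size are placeholders for illustration, not a recommendation for your specific site:

    <!-- A square favicon whose dimensions are a multiple of 48x48 (96x96 here). -->
    <!-- The file name and path are placeholders; just make sure Google can crawl the file. -->
    <link rel="icon" href="/favicon-96x96.png" sizes="96x96" type="image/png">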

For example, a site used the following image as its favicon (blurred to avoid calling out the site). It was 50×50 and not a multiple of 48×48 pixels. Google just used the generic globe favicon. Again, meh in the SERPs.

Favicon with the wrong dimensions.

Also, the aspect ratio is important. If it’s not a square, it’s not going to work well. I’ve seen favicons that looked out of whack from an aspect ratio standpoint, or they just didn’t show up in the SERPs. For example, a site used the favicon below, which didn’t have a square ratio. Google forced it to fit the required aspect ratio (and it looked totally warped in the SERPs). Beware.

Favicon with the wrong aspect ratio.
Example of favicon with the wrong aspect ratio in the Google search results.

A note about favicon formats: you have plenty of options.
Your favicon doesn’t have to be in the .ico format. It can be in any supported format, such as JPG, GIF, PNG, or SVG. I’ll cover more about SVG favicons later in the post.

Favicon image formats.

2. Robots.txt blocking the favicon:
Google’s documentation states that you should allow crawling of both your favicon and your homepage in order for the favicon to be used in Search. If your homepage is blocked by robots.txt, you clearly have bigger issues to worry about than the favicon. :) But the favicon’s location can still cause problems from a robots.txt perspective. For example, some directives in robots.txt can be “greedy” and block more than you think.

I recommend using the robots.txt Tester in Google Search Console to make sure your favicon and homepage can be crawled. It’s a quick test and can save you some frustration. For example, here is a site with a missing favicon and it’s blocking access to the favicon. It’s a bigger brand by the way, so yes, larger companies can make this mistake too.

Favicons and robots.txt problems.
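As a hypothetical illustration of a “greedy” directive (the paths are made up for this example), a rule meant to block a directory can also block a favicon stored inside it:

    # Hypothetical robots.txt: this directive blocks everything under /assets/,
    # including a favicon stored there.
    User-agent: *
    Disallow: /assets/

    # One possible fix: explicitly allow the favicon file (or move it to a
    # location that isn't blocked). Google honors the more specific rule here.
    Allow: /assets/favicon-96x96.png

Again, the robots.txt Tester will tell you definitively whether your favicon URL is blocked.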

And here’s an interesting side note. Google has a specific crawler for favicons called Google Favicon. You can check the Googlebot documentation for the user-agent string. Google will use this crawler to check your favicon when you request indexing of your homepage via Google Search Console (GSC). And the crawler will ignore robots.txt directives when someone requests a recrawl of the homepage based on a favicon change.

For example, this is directly from the favicon documentation:
“The Google Favicon crawler ignores robots.txt rules when the crawl was requested by a user.”

And here is the crawler user-agent:

Google Favicon user-agent (crawler).

But again, that’s just for Google Favicon to check the new favicon. You still should enable crawling of your homepage and your favicon if you want it to be used in the search results.

3. Duplicate favicon references and one didn’t meet Google’s favicon guidelines:
This is similar to the first issue I covered, but it involves duplicate favicon references in the homepage code (where one reference didn’t meet the guidelines). I’ve seen situations where one or more of the favicon references point to files that don’t meet the requirements, and Google just displayed the generic globe favicon in the SERPs instead. So double-check all of the references to your favicon from your homepage and make sure each one is OK.

For example, this site’s favicon wasn’t showing up correctly. It turned out the homepage had multiple rel="icon" references and one didn’t meet Google’s guidelines. Fixing that by having just one rel="icon" reference pointing at the proper file enabled the site to regain its favicon in the SERPs:

Multiple favicon references causing problems.
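To illustrate the pattern (with made-up file names), the problem and the fix look something like this:

    <!-- Before: two favicon references, where the second file (32x32) -->
    <!-- doesn't meet Google's size guidelines. -->
    <link rel="icon" href="/favicon-96x96.png" sizes="96x96">
    <link rel="icon" href="/old-favicon-32x32.png" sizes="32x32">

    <!-- After: a single reference pointing at a compliant file. -->
    <link rel="icon" href="/favicon-96x96.png" sizes="96x96">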

4. Uh, empty favicon code…
Yes, this seems obvious, but I’ve unfortunately seen it in action. If you literally leave the file out of the favicon code, then you will obviously have favicon problems in Search. :) So if you are experiencing favicon problems, definitely double-check your code. I also recommend using Google’s various testing tools to check both the static HTML and the rendered HTML to make sure your code is correct.

Empty favicon code.
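In other words, the broken pattern can be as simple as this (hypothetical snippet):

    <!-- Broken: the href is empty, so there is no favicon file at all. -->
    <link rel="icon" href="">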

5. Your platform or CMS is botched favicon-wise.
Sites that run on a specific platform or CMS may not be able to easily set or customize their favicon. In situations where you don’t have much control, you are relying solely on the platform or CMS to get it right. And as you can guess, that doesn’t always work out well.

And yes, that means all sites using that platform could have favicon problems. I surfaced this problem recently for a smaller e-commerce platform. Google isn’t just replacing the favicon with the generic globe, it’s literally leaving the favicon blank! In my opinion, that’s even worse than receiving the generic favicon…

CMS platform causing favicon problems.

And when performing a query that brings up many sites using the platform, you can see the widespread problem. Yep, that’s all of the sites on the platform with missing favicons (not even the generic favicon). And look at the second listing in the SERP… the aspect ratio is messed up for the favicon. So we have a mix of blank favicons and one warped one. Not good.

All sites running the same platform having favicon problems in the Google search results.

Bonus 1: Don’t push the limits with your favicon.
In its documentation, Google has explained that it won’t show any favicon that it deems inappropriate (like pornography or hate symbols). If that’s the case, Google will simply provide the default, generic favicon. Just keep this in mind when crafting a favicon… I’m sure this won’t impact most sites, but it can clearly cause issues with your favicon displaying properly in the SERPs.

Here is what Google explains in their favicon documentation:

Google guidelines for favicons and inappropriate images.

Bonus 2: Create an adaptive favicon that works well in dark mode.
People love dark mode, and that includes using Google Search in dark mode. But I find many site owners don’t test how their favicon displays there.

If your favicon looks less than optimal in dark mode, you can always create an adaptive favicon that looks great in both light and dark mode. For example, you can create an SVG favicon that uses media queries to adapt to the current environment (light mode versus dark mode).

Adam Argyle wrote a post explaining how to create an adaptive favicon on web.dev where he walks you through the process of creating an SVG that can change based on light versus dark mode. I haven’t tested it out yet, but it’s an interesting technique that seems to work well in the demo. I might try doing that in the near future.

Adaptive favicons
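I haven’t verified this exact markup in Google’s SERPs myself, but the general technique from that post looks like this: an SVG favicon with an embedded prefers-color-scheme media query. The circle and colors below are just placeholders for your own artwork:

    <!-- adaptive-favicon.svg: a placeholder shape that flips color in dark mode. -->
    <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 96 96">
      <style>
        circle { fill: #111111; }
        @media (prefers-color-scheme: dark) {
          circle { fill: #eeeeee; }
        }
      </style>
      <circle cx="48" cy="48" r="44"/>
    </svg>

And then reference it from your homepage like any other favicon:

    <link rel="icon" href="/adaptive-favicon.svg" type="image/svg+xml">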

Summary: Put your best favicon, I mean foot, forward in Search with one that actually shows up.
I hope this post helped you understand some of the most common favicon problems I’ve seen while helping site owners who reached out to me. With favicons being displayed prominently in the mobile search results, you don’t want a less-than-optimal favicon staring users in the face. And you also don’t want the “meh” generic favicon that Google can provide, or worse, a blank favicon. A few minutes of digging into the situation can usually surface the core favicon problem. And once it’s fixed, you can finally have a favicon that works for you instead of against you. Good luck.

GG

Filed Under: google, seo

Google’s Broad Core Updates And The Difference Between Relevancy Adjustments, Intent Shifts, And Overall Site Quality Problems

January 31, 2022 By Glenn Gabe

Google's broad core updates and the difference between relevancy adjustments, intent shifts, and overall site quality problems.

Based on helping companies with major algorithm updates, I often have site owners reaching out about big drops in traffic (whether that’s from broad core updates, other major algorithm updates like the Product Reviews Update, technical problems, or other disturbances in the SEO force). When those drops happen, site owners are often extremely confused about why their site dropped in rankings and traffic. It’s not long into those conversations that I bring up the differences between relevancy adjustments, intent shifts, and overall site quality problems. It’s extremely important for site owners to understand the differences between those three scenarios, and to understand which one (or which combination) is impacting their site.

I think many people jump to quality problems as the main culprit when they drop during a broad core update, and that can definitely be the case, but relevancy adjustments and intent shifts can also cause big drops in rankings and clicks from Google. The main difference is that site owners don’t have much control over relevancy adjustments and intent shifts. And for relevancy adjustments, it might not be a big deal at all for the business, since the quality of traffic from those lost queries can often be very low. In other words, those users weren’t really finding what they needed, they weren’t converting, and it makes sense for Google to show other sites with more relevant content in the search results.

When speaking with clients about the differences between relevancy adjustments, intent shifts, and overall site quality issues, I sometimes joke that I should just write a blog post covering the topic (since I’m explaining it so often and it would be great to point them to an article about it). Well, here it is! :)

In this post, I’m going to explain more about each of the three scenarios (relevancy adjustments, intent shifts, and overall site quality problems). My hope is that it can help site owners understand more about the drops they are seeing based on Google’s broad core updates. And as I said earlier, you could also see a combination of reasons why a site drops. That’s why understanding which queries and landing pages dropped is incredibly important. That’s a good segue to Delta Reports.

But first, here’s a quick table of contents in case you want to jump around the post:

  • The Importance of Running Delta Reports.
  • Relevancy Adjustments and Broad Core Updates.
  • Intent Shifts: A Slightly Different Way For A Site To Lose Search Visibility.
  • Overall Site Quality Problems.
  • The Good News: Site Owners Can Improve Site Quality.
  • Key Points and Recommendations for Site Owners.

First, The Importance of Running Delta Reports for Broad Core Update Drops:
When a broad core update rolls out and you see a drop, it’s always important to analyze which queries and landing pages dropped. That can tell you a lot about why that drop happened. That’s why I recommend running what I call a Delta Report, which shows you the drop across queries and landing pages when comparing the timeframe after the broad core update to a timeframe before. You can read my blog post to learn how you can automate running a delta report (along with filtering that data) via the Search Console API and Analytics Edge.
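If you don’t use Analytics Edge, here is a minimal sketch of the same idea in Python using the Search Console API. It assumes you already have OAuth credentials set up; the property URL and date ranges are placeholders:

    # Minimal Delta Report sketch via the Search Console API.
    # Assumes OAuth credentials (creds) are already set up; the property URL
    # and date ranges below are placeholders.
    from googleapiclient.discovery import build

    def clicks_by_query(service, site, start, end):
        # Return {query: clicks} for the given date range.
        response = service.searchanalytics().query(
            siteUrl=site,
            body={
                "startDate": start,
                "endDate": end,
                "dimensions": ["query"],
                "rowLimit": 25000,
            },
        ).execute()
        return {row["keys"][0]: row["clicks"] for row in response.get("rows", [])}

    def delta_report(creds, site, before, after):
        # Compare clicks per query before vs. after an update (negative = drop).
        service = build("searchconsole", "v1", credentials=creds)
        pre = clicks_by_query(service, site, *before)
        post = clicks_by_query(service, site, *after)
        deltas = {q: post.get(q, 0) - c for q, c in pre.items()}
        return sorted(deltas.items(), key=lambda item: item[1])  # biggest drops first

    # Example: two-week windows before and after a broad core update rollout.
    # report = delta_report(creds, "https://www.example.com/",
    #                       before=("2021-11-03", "2021-11-16"),
    #                       after=("2021-11-17", "2021-11-30"))

You could run the same report with ["query", "page"] as the dimensions to see landing pages as well.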

Once you dig into the data, you might come across several different scenarios. I’ll cover each of those below, starting with relevancy adjustments.

Here are sample screenshots from running a Delta Report in Excel using Analytics Edge:

Google Delta Reports
Google Delta Report Results

Relevancy Adjustments and Broad Core Updates:
So, what is a relevancy adjustment? If you check the queries that dropped based on a broad core update and notice queries your site really shouldn’t have ranked for, then that could be due to a relevancy adjustment pushed by Google. In other words, content that Google once believed was relevant for a query isn’t considered relevant anymore. This can happen in several ways, and I’ll cover two different types of relevancy adjustments below.

The “You’re On Borrowed Time” Scenario:
First, Google might push an adjustment and a site might not rank anymore for content that it had no right ranking for in the first place. Maybe the content can’t meet user expectations anymore for the query, or maybe it’s not even on-topic at all. The content might be fine quality-wise, but it just doesn’t help users searching for that query.

And yes, this can happen… Google can get it wrong, and it can figure that out over time. Then a relevancy adjustment gets pushed, and boom, the pages drop heavily in the search results for those queries.

Here is an example from a site hammered by the June 2021 broad core update: a specific query that dropped based on a relevancy adjustment. There were many like this for the site…

Relevancy adjustment example for site on borrowed time.

And here is another query (from a different site) that dropped heavily based on a relevancy adjustment with the November 2021 broad core update:

Site dropping in rank for a query it had no right ranking for.


The “Stale and Dated” Scenario:
Another example of a relevancy adjustment is when content was once relevant to a topic but isn’t anymore. I see this often when news publishers have an article about an entity (like a celebrity, politician, company name, etc.) and the content is a few years old and just not relevant anymore. When that relevancy adjustment gets pushed by Google, those articles can drop heavily in the SERPs. It’s not that the content is bad or low-quality, it’s just not relevant anymore. I have covered this many times in my posts about broad core updates.

Here are examples of queries that fit this scenario: high-quality articles that just aren’t relevant to a head term (like one for a celebrity) anymore. The “Stale and Dated” scenario can happen often across publishers that write about celebrities, politicians, actors, companies, etc.

Below are two examples of stale and dated content dropping during the November 2021 core update and the June 2021 core update.

Stale content dropping in rank due to a relevancy adjustment.
Another older piece of content dropping in rank based on a relevancy adjustment.


Intent Shifts: A Slightly Different Way For A Site To Lose Search Visibility
What is an intent shift? An intent shift is when Google decides to show different types of content for the queries you once ranked for (based on what it believes users want to see). In other words, Google is trying to determine the intent of the user, which can radically change the search results for those queries. Yes, your content might still be high quality, but Google believes another type of content should rank. And when that happens, your rankings and traffic could plummet (causing huge problems for your business). We actually just saw this happen with the December Product Reviews Update, and I covered intent shifting in a section of that article. More about this soon.

For example, maybe you once ranked for the query “widget name” with content covering all of the best deals for the widget, but now Google returns content covering how to use the widget instead, based on what it believes users want to see. It’s not that your content was bad, it just doesn’t cover what Google wants to return for users now. And from a content strategy standpoint, maybe you don’t have content covering how to use the widget, so now your site doesn’t rank at all. By the way, Google is making those changes based on the vast amounts of data it sees on a regular basis for those queries. Unfortunately, you can’t control an intent shift.

Like relevancy adjustments, intent shifts can cause big drops for certain sites. Circling back to the December Product Reviews Update, I saw intent shifting happening during the rollout (pretty heavily as the update rolled out over three weeks). For some product queries, Google was either returning review sites or e-commerce retailers (where you could buy the product). So, review sites ranking in the top ten suddenly dropped and retailer websites took their positions.

It’s not that the reviews content was bad. Google simply believed users wanted to see places to buy the product for those queries rather than reviews and deals. Again, that shift is completely out of the control of site owners. That said, you can make sure you have content that can rank for other types of search intent, but you can’t force Google to reverse the shift.

Here is an example of a query that was impacted heavily when intent shifts occurred during the December Product Reviews Update (and caused serious volatility for the site jumping into, and out of, the top 10):

Intent shifting during the December Product Reviews Update (2021)

Here is an example of a site impacted by the December 2020 broad core update, where an intent shift occurred for some queries (Google started surfacing different types of content in the top ten). The site dropped to the bottom of page one, or even page two, for the query.

Intent shift causing a site to drop in rankings during the December 2020 broad core update.

And here is an example of a query where a big intent shift rocked a site (outside of a broad core update). This caused big problems for the site traffic-wise, but eventually Google reversed some of this. They are in much better shape now, but that initial intent shift was substantial for the site.

Intent shift outside of a broad core update.


Overall Site Quality Problems:
Beyond relevancy adjustments and intent shifting, overall site quality problems can be an important reason why sites drop heavily during broad core updates. I have covered this many times in my posts about broad core updates, so I won’t go too deep here. You can read those other posts to learn more about the various quality problems that can cause serious issues for a site when broad core updates roll out. That said, I will cover several key points below.

But first, here are two examples of drops for sites impacted by broad core updates (and both sites had serious overall quality problems). The first was heavily impacted by the May 2020 broad core update and the second by the June 2021 broad core update. As you can see below, when major quality problems creep into a site, rankings can drop heavily when broad core updates roll out. Site owners obviously want to avoid this at all costs…

Site dropping due to major site quality problems during a broad core update.
Site dropping heavily during the June 2021 broad core update due to quality problems.

And as I explained earlier, there could be a mix of reasons why a site dropped. For example, here is a site that dropped heavily with the December 2020 broad core update that had site quality problems, but also saw relevancy adjustments that caused drops. That combination was not good for this site (although the relevancy adjustments did make sense in my opinion).

Site dropping during a broad core update due to a mix of relevancy adjustments and site quality problems.

The Good News: Site Owners Can Improve Site Quality.
Here is an example of what can happen when a site significantly improves quality over time (after being negatively impacted by a broad core update). Rankings and traffic can surge when a subsequent broad core update rolls out. The following screenshot is from a site that worked hard to improve quality overall (content quality, user experience, “quality indexing”, toning down its aggressive advertising situation, and more). It surged during the June 2021 broad core update:

Site surging during a broad core update based on improving overall site quality over time.

“Quality” is more than just content:
If you think you were impacted based on site quality problems, it’s important to understand that “quality” can mean many things. It’s not just content quality. Google’s John Mueller has explained this before, and it’s what I have seen while helping many companies deal with broad core update drops. Google is evaluating the site overall, and over an extended period of time. And quality problems can span several key areas, including content quality; user experience; aggressive, disruptive, or deceptive advertising; aggressive affiliate setups; technical problems that cause quality issues; and more.

Google’s John Mueller explained this during a Search Central hangout. Here is my tweet about that along with a link to the video:

Impacted by a broad core update? Via @johnmu: With core updates, Google doesn't focus on individual issues, but rather the relevance of the site overall (including content quality, UX issues, ads on the page, how things are presented, sources, & more): https://t.co/WFbHH6mD7s pic.twitter.com/lT3RSexdn6

— Glenn Gabe (@glenngabe) October 10, 2021

And here is the video clip (at 21:04 in the video):

So, if a site is impacted heavily by a broad core update, and it’s from serious site-level quality problems, then Google will want to see significant improvement in quality over the long-term. This is exactly what Google has explained many times before and it’s what I have seen while helping companies with broad core updates.

Here is John explaining more about this. First, here is one of my tweets covering the topic with a link to the video:

Significant Changes + Long-Term: Via @johnmu: Implement big changes on your site to improve quality? It can take significant time for Google's quality algorithms to re-evaluate the site (recalculate signals). Could be months, like 6+ months or even longer: https://t.co/S22INIMR7S pic.twitter.com/vTMbZDtOww

— Glenn Gabe (@glenngabe) March 20, 2018

And here is the video below (at 10:52 in the video):

In addition, it’s important to know that if you were impacted due to site quality problems, you typically cannot recover from a significant core update drop until another broad core update rolls out. You might be doing all the right things, improving quality significantly, etc., but you aren’t surging back like you thought you would. That’s because there are site-level quality algorithms that refresh during broad core updates (and that refresh hasn’t happened yet). My recommendation is to surface all potential problems impacting a site quality-wise and address as many as you can. I call this the “kitchen sink” approach to remediation, and it has worked well over time for my clients.

Once again, here is John explaining more about recovery from broad core updates. First, here is my tweet about this:

Hit by a broad core update? -> Via @johnmu If the impact is so strong from the update, then it could be hard to reach your pre-update levels until another update. You can't just tweak things & might need to reconsider overall what you're doing w/ the site: https://t.co/6jgs4W9x6u pic.twitter.com/5T4s6x5Dcu

— Glenn Gabe (@glenngabe) November 5, 2020

And here is the video (at 36:45 in the video):

Again, I don’t want to go too deep here about quality, since I’ve written many blog posts that cover quality problems and broad core updates. The core point, pun intended, is that quality problems can cause major issues during broad core updates, and sites need to address those problems over time. And this is much different than when a relevancy adjustment or intent shift causes drops during broad core updates. You can’t really control those… but you can control site quality.

Analyzing drops from broad core updates: Key points and recommendations for site owners.
I’ll end this post with some key points for site owners that saw a drop during broad core updates. Following the recommendations below can help get you moving in the right direction (and help you understand the core reason you were negatively impacted).

  • If you’ve been negatively impacted by a broad core update, start by running a delta report to gain visibility into the queries and landing pages that saw the biggest drops.
  • When reviewing those queries, determine if those are relevancy adjustments, intent shifts in the SERPs, or if there might be site-level quality problems causing the drop. And remember, you could be seeing a combination of reasons for the drop. For example, some relevancy adjustments, but some big quality problems too.
  • If there were relevancy adjustments, then that’s not necessarily a bad thing (as I covered earlier). There are also times you can create content that is more relevant to those queries (or boost the content that wasn’t meeting user expectations). So, you might be able to regain some of the lost rankings and clicks with content that’s more relevant to the query.  
  • If there was an intent shift, determine whether you have content that matches the types of content Google is now surfacing for those queries. And if you want to be proactive, make sure your content strategy covers various user intents from the start… Then that content can potentially rank if an intent shift impacts queries you currently rank for.
  • If you determine that the queries dropping are important queries for your site, and you have content that’s relevant to those queries, then it could be site-level quality problems causing issues. If that’s the case, then work hard to improve site quality over the long-term. Understand that you probably will not see recovery until another broad core update rolls out (and it could take several broad core updates to see improvements). You can read my posts about broad core updates for more information about quality problems.
  • From a quality standpoint, it’s important to understand that Google is evaluating the site overall (and it’s beyond just content quality). Objectively analyze your site through the lens of broad core updates, surface all potential quality problems, and form a plan of attack for addressing as many as you can. Again, I call this the “kitchen sink” approach to remediation.

Summary: Determining the reason for a broad core update drop can put you on the right path.
I hope this post was helpful for understanding the differences between relevancy adjustments, intent shifts, and overall site quality problems (as they relate to broad core updates). If you have seen a drop during a broad core update, I recommend following the steps I included in this post and determine why that drop happened. For example, did Google push a relevancy adjustment, was there “intent shifting” going on in the SERPs, or does your site have serious overall quality problems? Once you gain clarity, you can determine a solid plan of attack. And remember, start by running a delta report… It can get you on the right path. Good luck.

GG


Filed Under: algorithm-updates, google, seo

Google’s December 2021 Product Reviews Update – Analysis and Findings Based On An Extended And Volatile Holiday Rollout

December 21, 2021 By Glenn Gabe

Google's December 2021 Product Reviews Updates (PRU)

Update: May 2022
I just published a post covering a number of findings from the March 2022 Product Reviews Update (the latest PRU to roll out). My analysis covers linking to multiple sellers, the use of video content, loopholes in the PRU, intent shifts, dueling machine learning algorithms, and more.

—–

The past month has really been something for site owners and SEOs… Google pushed a broad core update on November 17, 2021, right before Black Friday (which shocked many people, including me). And following the rollout of the November broad core update, Google pushed yet another major algorithm update that can impact sites during the holiday shopping season. On December 1, 2021, Google rolled out the December 2021 Product Reviews Update, the second official PRU following the initial update in April 2021.

I’ll cover more about the rollout below, but Google explained it can take up to three weeks to complete. And they were right… we just heard from Google that the update has completed (exactly three weeks in). During the extended rollout, I have been heavily analyzing many sites that have been impacted and this post covers my findings. I’ll continue to update this post as I surface more findings or if we learn more about the update from Google itself.

This post contains a lot of information about the December Product Reviews Update, so I have provided a table of contents for those looking to jump around the post:

  • Extended Rollout
  • Periodic Refresh Still Necessary
  • Google’s New Guidelines
  • Surges, Drops, and Overall Volatility
  • Recovery From The April PRU
  • Double Hits
  • Dueling Machine Learning Systems
  • Will The Product Reviews Update Expand To Other Languages?
  • Rich Snippets Impact
  • Large Publishers With Reviews Content
  • Google’s New Guidance Not Being Applied Yet
  • Links Aren’t Everything
  • Intent Shifting
  • Reviews Content That’s Working Well
  • The Wirecutter Standard

Note, I also published a companion Web Story covering key findings for site owners and SEOs. Both this post and the web story work well together if you are trying to learn more about the December product reviews update.

A Note About The April Product Reviews Update (PRU):
If you are interested in learning more about the Product Reviews Update overall, definitely check out my post about the April 2021 Product Reviews Update. I cover a number of important findings based on analyzing many sites that were impacted. I also published a companion Web Story covering top tips for site owners and affiliate marketers based on my analysis of the April PRU.

In this post, I’ll cover what I’m seeing based on the December PRU, including surges, drops, recoveries, how core updates and the Product Reviews update work together (or not), rich snippets impact, what I’m calling “The Wirecutter Standard”, and more.

First, as usual with my posts about major algorithm updates, a quick disclaimer:
I do not work for Google. I do not have access to Google’s algorithms, including the machine learning system fueling the Product Reviews Update. I did not dress up like Will Ferrell in Elf, make believe I know Santa Claus, and sing endless Christmas carols in front of Google’s Danny Sullivan’s house looking to gain access to his laptop. I might have tried, but I never executed on my plan. ;)  I am simply providing what I’m seeing across many sites impacted, including review sites I’m helping across verticals, that were impacted by the April and December Product Reviews Updates.

Rollout and Refresh:
As I mentioned earlier, Google explained it could take up to three weeks for the December Product Reviews Update to fully roll out. That’s a relatively long rollout, especially compared to broad core updates which typically roll out in less than two weeks. The update began rolling out on December 1 and impacted English-language pages globally:

Our December 2021 product reviews update is now rolling out for English-language pages. It will take about three weeks to complete. We have also extended our advice for product review creators: https://t.co/N4rjJWoaqE

— Google Search Central (@googlesearchc) December 1, 2021

And again, it took exactly three weeks to fully roll out. We heard from Google’s Alan Kent on Tuesday, December 21 that the update completed rolling out:

The Google product review update is fully rolled out. Thank you!

— Alan Kent (@akent99) December 21, 2021

It’s important to note that I have seen several adjustments being made by Google during the rollout. I’ve tweeted several examples of the volatility on Twitter (and you can see a few of those adjustments below). Although a number of sites impacted have leveled off, notice how search visibility reverses course for some sites, surges more, drops more, etc. It has been wild to monitor:

Ranking volatility during the December Product Reviews Update
More ranking volatility during the December Product Reviews Update
Ranking volatility during the December Product Reviews Update

Also, here is my tweet thread about the December Product Reviews Update in case you want to view more of my tweets about the rollout and what I’ve been seeing:

Another weekend & more volatility with the Dec Product Reviews Update (which can take 3 weeks to fully roll out). Many sites impacted are moving in the same direction or leveled off, but there are def. some seeing a lot of volatility during the rollout. Google has been busy :) pic.twitter.com/9YXDdtuSww

— Glenn Gabe (@glenngabe) December 13, 2021

The PRU Still Requires A Periodic Refresh:
Also, and this is important, Google still has to push a periodic refresh of the Product Reviews Update. So this is like medieval Panda where Google pushed out Panda updates every few months (before incorporating Panda into its core ranking algorithm).

So, if you’ve been heavily impacted by the December Product Reviews Update, you’ll need another refresh to see significant improvement. It’s also important to understand that Google’s John Mueller explained that he could see the Product Reviews algorithm get baked into Google’s core ranking algorithm at some point.

On that note, I’m sure Google wants to do that based on what I’m seeing with sites impacted (and how those sites are also being impacted by other major algorithm updates which sometimes go against the PRU). It doesn’t make a lot of sense and I’ll cover more about that soon in the section about dueling machine learning systems. Here is John Mueller explaining that he could see the PRU getting baked into Google’s core ranking algorithm:

And as a quick reminder, Google is evaluating sites more broadly with the Product Reviews Update (similar to how broad core updates work). For example, site or section-level evaluation. That’s why some sites can see a lot of movement with the PRU. You can read my post about the April Product Reviews Update for more information about that.

Two New Guidelines For Sites Publishing Reviews:
As a quick reminder, the Product Reviews Update is looking to reward high-quality, insightful reviews versus lower-quality and thinner reviews. In its original post about the April PRU, Google provided a number of questions site owners can ask themselves about their own reviews content.

For example: do your reviews express expert knowledge about products, show what the product is like physically, and provide quantitative measurements? You can read my post about the April Product Reviews Update for more information about that.

Google's questions for sites publishing reviews content.

With the December Product Reviews Update, Google published another blog post with two additional guidelines for sites publishing reviews content. It’s important to understand these guidelines were NOT taken into account with the December PRU, but might be in future Product Reviews updates.

Here are the two new guidelines for site owners from Google’s new blog post:

  • Provide evidence such as visuals, audio, or other links of your own experience with the product, to support your expertise and reinforce the authenticity of your review.
  • Include links to multiple sellers to give the reader the option to purchase from their merchant of choice.
Two new guidelines from Google for sites publishing reviews content.

So, if you publish reviews, this is a big heads-up about potential new factors that will be taken into account during subsequent Product Reviews Updates. And the “multiple seller” bullet has been controversial based on a certain large e-commerce retailer’s terms of service (TOS), cough, Amazon. It should be interesting to see if Google changes their view of this over time, or not. And how Amazon reacts to the change… Time will tell.

More information from Ian Howells about this below:

https://twitter.com/ianhowells/status/1466391965868728332

Massive Visibility Changes For Some Review Sites (and larger publishers that provide reviews):
When the April Product Reviews Update rolled out, I said it was core update-like for many sites. There were huge swings in search visibility for many sites publishing reviews (whether those were affiliate sites, larger publishers with reviews content, etc.). Also, I surfaced sites that saw a lot of volatility that didn’t even focus on reviews content (you can read the section about “collateral improvement” in my post about the April PRU for more information about that).

Well, with the December Product Reviews Update, there has also been a ton of movement. Some sites surged through the roof, while others got hammered. It was also super-interesting to see some sites recover from the April Product Reviews Update, while others got hit for a second time. I’ll cover more about that soon.

First, here are some examples of big surges and drops during the December Product Reviews Update. I’ll start with some surges:

Surge during the December Product Reviews Update
Surge during the December Product Reviews Update
Surge during the December Product Reviews Update
Surge during the December Product Reviews Update

Now here are some big drops based on the December PRU:

Drop during the December Product Reviews Update
Drop during the December Product Reviews Update
Drop during the December Product Reviews Update

Recovery from the April Product Reviews Update:
As I mentioned above, some sites recovered after getting hammered by the first Product Reviews Update in April. And for some of those sites, the recovery made sense. They worked hard to improve their content, user experience, and more, over time.

For example, here is a site that got pummeled by the April Product Reviews Update that surged during the December PRU after implementing a number of key changes. Those changes spanned several categories, including content quality and detail, user experience, expertise, ad situation, and more. Note, this company contacted me for help after the April PRU, but they implemented a number of changes even before I started helping them. So this surge is based on what they implemented on their own (after reading Google’s posts, my post about the April PRU, reviewing sites that surged, objectively analyzing their own content, etc.)

Recovery from the April Product Reviews Update

It’s worth noting that although they surged, they aren’t back to where they were prior to the April Product Reviews Update. This was a great first step in recovery, but they can do more in my opinion. It should be interesting to see how the site does based on future PRUs after implementing more changes.

Partial recovery from the April Product Reviews Update

Another site that recovered from a big April PRU drop also has improved overall. When checking the content now that’s ranking (and surging), it contains strong reviews content in my opinion. It’s balanced, thorough, provides great visuals, a nice breakdown of key features, and more. After reviewing the site, I could definitely see why it surged back. I’ll cover more about reviews content later in this post.

Another recovery from the April Product Reviews Update
Partial recovery from the April Product Reviews Update

A Quick Note About Double Hits During The April and December Product Reviews Updates:
While analyzing the December PRU, there were some sites that got hammered in December that also got hammered in April. Those sites are in grave condition right now… Although sites can see big changes during broad core updates, the two Product Reviews Updates were separated by seven months. That’s a long time to wait to see if you recover… I’ll cover more about dueling machine learning systems soon.

When checking the sites that saw double hits, it often made a lot of sense. The content was thin, wasn’t in-depth, just provided a quick paragraph about a product (sometimes right from the manufacturer), and then often just linked off to Amazon. And some of these sites mixed thin content with a terrible user experience with aggressive, disruptive, and/or deceptive ads. So the situation was not good…

Here are some examples of sites with double hits (April and December PRUs):

Double Product Reviews Update hit.
Double Product Reviews Update hit.

Interesting Example: Site dropped after surging in April. A mistake or a change in the algorithm?
When reviewing sites that reversed course during the December PRU, I noticed an interesting example. The site surged with the April PRU, more with the June 2021 core update, even more with the November 2021 core update, only to get hit very hard by the December PRU. So it’s been an interesting ride for the site in Google Land. I’ll explain more about Google’s core updates in the next section.

But the interesting part from a Product Reviews Update standpoint was that the content that dropped doesn’t provide specific reviews of products. Instead, it provides general information about the product category, what to look for, etc., and then just provides a quick list of products you can buy. That doesn’t really help users understand more about the products (since, again, they aren’t being reviewed). Here is the trending for the site over time. This is clearly not a good holiday season for this site…

But again, they surged with the April PRU, then more with the June 2021 core update, and then even more with the November 2021 broad core update, only to drop heavily with the December 2021 Product Reviews Update.

Google Product Reviews Update Change

Dueling Machine Learning Systems – Core Updates vs. Product Reviews Update
While analyzing the impact from the December Product Reviews Update, it was interesting to also see the impact from the November broad core update (and the recent broad core updates in June and July of 2021). The November update rolled out on November 17, 2021 and impacted a number of sites in the reviews space. But when the December PRU rolled out, some of those sites reversed course… For example, some sites that dropped with the November broad core update surged with the December PRU (and vice versa). And beyond just the November broad core update, there were many examples of sites surging in June or July and then dropping heavily with the December Product Reviews Update. It’s clear the updates were at odds with one another…

Here are some examples:

Dueling machine learning systems (core updates and product reviews updates).
Different changes with the Product Reviews Update versus broad core updates.
Surge during a broad core update and drop during the December Product Reviews Update.

And like I said earlier, there were some sites impacted by the November 2021 core update that reversed course! Talk about dueling machine learning algos… I’m sure those sites were on a roller coaster ride over the past few weeks.

Dropping with the November core update and surging with the December Product Reviews Update.
Dropping with the November core update and recovering with the December Product Reviews Update.

I think this emphasizes a very important point about major algorithm updates like broad core updates and the Product Reviews Update. When Google is using different machine learning systems, those systems can be at odds. If a site gets hammered by a broad core update, yet surges with another major algorithm update like the Product Reviews Update, which one is right?

As Bing has explained in the past about its core ranking algorithm, it sends thousands of signals to the machine learning system, and the system determines the weighting (and ultimately where sites will rank). So even Bing’s search engineers don’t know how heavily certain signals are weighted. That’s why it’s impossible to figure out the one or two things causing problems for a site… And it’s also why I have used a “kitchen sink” approach to remediation with my clients. Don’t try to focus on one or two problems… surface as many as you can and fix them all (or as many as you can).

Here is Bing’s Fabrice Canel explaining more about Bing’s machine learning approach to rankings:

How much does a certain factor matter for SEO? Via Bing's @facan We simply don't know. Bing is heavily using machine learning. We don't set the weighting. It's about sending thousands of features to the ML system & the system figures it out: (at 35:02) https://t.co/EiTktEFqx7 pic.twitter.com/HTzu9wkA5m

— Glenn Gabe (@glenngabe) November 9, 2020

And here is Google’s Alan Kent mentioning Google’s machine learning system used for the Product Reviews Update:

(Alternative response… let me get back to you after a future rollout once the machine learning model has worked out the answer… or not)

— Alan Kent (@akent99) December 1, 2021

In addition, John Mueller was asked in a Search Central Hangout if the Product Reviews Update is using machine learning to train the algorithm, if human reviewers were used, etc. John explained that for updates like the PRU, it’s not something where human reviewers would be in the loop. And he explained that Google uses a lot of machine learning overall, so yes, it’s probably being used to some extent (reinforcing what Alan explained above). Again, this is important to understand.

You can watch the video below to hear John’s comments:

This is also why I believe John Mueller’s comments about baking the Product Reviews Update into Google’s core ranking algorithm makes complete sense. I’m not sure you can have multiple major algorithm updates that cause the same site to surge one minute, then drop the next. Those algorithms need to work together in my opinion. If not, they will surely drive site owners insane.

When Will The Product Reviews Update Expand To Other Languages (Beyond English)?
The current Product Reviews algorithm impacts English language content globally and does not impact other languages at this point. Google’s John Mueller has been asked when the PRU will expand to other languages and he addressed this in a recent Search Central Hangout.

John explained that they have nothing to pre-announce about expansion, but Google’s goal is to roll out algorithms like this to other languages. He also explained that for some updates, the team at Google works fast to roll things out to other languages. And for others, it’s a slower process. John also explained that there are sometimes policy and legal reasons that can slow things down.

You can watch the video below at 46:17 to hear John’s response:

Sites Gaining or Losing Rich Snippets – Yep, it’s clear.
It’s also important to understand that rich snippets can be impacted by broad core updates and other major algorithm updates (like the PRU). You can read my blog post about that topic to learn more. This can happen when Google reevaluates site quality overall. For example, it’s not uncommon for sites to gain or lose review snippets, FAQ snippets, etc. when a broad core update rolls out (or other major algorithm updates like the Product Reviews Update).

With the December 2021 Product Reviews Update, I saw more of that happening. It’s important for site owners to understand this since rich snippets can absolutely impact click through rate from the SERPs. The additional real estate and visual treatment of rich snippets can attract eyeballs as people scan the SERPs. You don’t want to lose rich snippets if you have them.

Here is an example of a site that surged during the December 2021 Product Reviews Update and regained FAQ snippets (after losing them during the April Product Reviews Update).

Rich snippets back with the December Product Reviews Update.

But, as you can see below, it was short-lived. The site has continued to do well rankings-wise, but Google stopped showing FAQ snippets for the site. Sometimes Google isn’t showing FAQ snippets at all in those SERPs, and sometimes other sites have them while this site doesn’t. It’s been volatile from a SERP features standpoint, that’s for sure.

Rich snippets temporarily back with the December Product Reviews Update.

Below, you can see FAQ snippet tracking for the domain (visibility-wise). It spikes with the rollout, then drops as Semrush begins picking up the loss of snippets:

Semrush showing a temporary surge in FAQ snippets with the December Product Reviews Update.

But as I explained on Twitter during the rollout, I saw other strange things happening with rich snippets (and especially FAQ snippets). For example, I saw some sites losing FAQ snippets completely, I saw some *SERPs* losing them completely, and then I saw FAQ snippets replaced by jumplinks (when a page included a table of contents with named anchors for each product reviewed). So, just a heads-up to check your rich snippets if you’ve been impacted by the December Product Reviews Update.

Here is an example of jumplinks showing the top products being reviewed in the post (which I saw replace FAQ snippets during the rollout). It seems Google is testing the best SERP treatment.

Jumplinks for product reviews pages in the search results.

Large publishers with reviews content – The PRU will find you.
The Product Reviews Update is supposed to be focused on reviews content, so what happens when large publishers contain reviews content, in addition to a ton of other types of content? Well, the PRU can find that content and those sites can see volatility there. I’ve seen this many times while analyzing both the April and December Product Reviews Updates.

So, it’s easy to think that the PRU didn’t impact your site, but it might have impacted reviews content significantly (whether that’s in specific sections or mixed throughout the site). I just wanted to bring this up in case larger publishers had questions about their reviews content blended into their overall content on the site (since those sites can have hundreds of thousands of pages indexed, or more).

For example, here are examples of specific reviews content within large publishers being impacted. These are sites with a ton of different types of content, but they do also have reviews.

Reviews content on large publishers impacted by the Product Reviews Update.
Reviews content on large publishers impacted by the Product Reviews Update.
Reviews content on large publishers negatively impacted by the Product Reviews Update.
Reviews content on large publishers negatively impacted by the Product Reviews Update.

Google’s new guidance for review sites NOT applied yet. And that’s pretty clear.
In its latest blog post about the December Product Reviews Update, Google provided more guidance for site owners (beyond the guidance in the original blog post). It’s important to understand that the new guidance is NOT being applied yet from a ranking perspective. I bring that up since many site owners were concerned about the second piece of guidance, which covered linking to multiple stores so people have more choice when deciding to purchase the products being reviewed.

As I mentioned earlier, affiliate marketers could end up violating Amazon’s terms of service (TOS) by linking to other retailers from their product reviews, so that bullet caught the attention of many marketers. So, this is a big heads-up to any site providing reviews that just links to one retailer (like Amazon). I’m not sure how this will work down the line, or if Google will change its stance on the topic. Again, it’s not being used now as a factor for the Product Reviews Update, but it sure looks like it might be with the next update.

For example, you’ll often see review sites linking to just Amazon:

Linking to just Amazon from affiliate sites.

And to confirm this isn’t really being used yet, I noticed a number of review sites that surged that still only link to one retailer (often Amazon). So again, this is more of a heads-up for affiliate marketers:

Site surging that just links to Amazon via affiliate links.

The PRU Proves Links Aren’t Everything (Just like the April PRU proved):
I covered this in my post about the April Product Reviews Update, and site owners that are overly focused on links should definitely be aware of this. There are many review sites that surged that had much weaker link profiles than competitors.

For example, some sites surging through the roof have under 10K links total, with some having fewer than 5K links total. And when comparing links from authoritative sites (which is important from an authority standpoint), the comparisons weren’t even close. Sure, the surging sites had some powerful links, but nowhere near as many as some competitors that dropped.

So yes, links matter. But no, they aren’t the end-all (and especially based on the Product Reviews updates). For example, here are two examples of sites surging during the December Product Reviews Update. Their link profiles are much weaker than many competitors.

Yes, that’s a site with 3.3K links surging during the December Product Reviews Update (and 1.7K after what Majestic calls “noise reduction”):

Site with low link count surging during the December Product Reviews Update.

And here is a site with just 11.8K links surging too (5.1K after “noise reduction”). And it’s competing with sites that have hundreds of thousands (or even millions) of links:

Site with low link count surging during the December Product Reviews Update.

Intent Shift(ing) – Review Sites Appear, and Disappear, For Head Terms
One important finding that others have seen as well was Google changing the SERP for head terms that don’t contain “best”, “top”, or “reviews” in the query. For example, searching for just “humidifiers” versus “best humidifiers”. During the rollout, I’ve seen a ton of volatility with this. Sometimes the SERPs contain review sites and sometimes they don’t… When they don’t, retailers rank since Google believes the intent is to buy versus find reviews. But again, this has been changing a lot during the rollout. And some sites are jumping all over the place as Google changes the SERPs.

So if you’ve been impacted by the December Product Reviews Update, definitely dig into the queries that dropped or surged. And see if you are part of the reviews dance going on in the search results.

Here is a good example. For a specific head term, this product reviews site suddenly showed up on page one, dropped from page one, showed up again, dropped again, etc. during the rollout. When this happens across head terms, it can sure cause a lot of volatility:

Head term impact for product review sites.

Also, Lily Ray has covered similar things on Twitter (with regard to the companies/products being reviewed jumping up the SERPs versus the sites reviewing them). Here is one of her tweets covering that:

https://twitter.com/lilyraynyc/status/1467948310094823425

Product Reviews Winners: A quick review of what’s working well with reviews content:
My original post about the April Product Reviews Update covers a lot of information about high quality reviews content. So you should definitely check out that post for more information detailing the various aspects of insightful and valuable reviews. That said, I did want to revisit review content to explain more about what I’m seeing that works well, what’s helpful for users, and what Google seems to be rewarding.

Note, not every site surging provides incredible reviews content. I mentioned that in my post about the April PRU, but I would always strive to provide the highest quality content you can (and follow the guidelines that Google has provided). It’s your safest bet for steering clear of a drop during a subsequent Product Reviews Update.

Here are some additional insights based on analyzing reviews that are being rewarded:

The Power of Organization:
Comparison charts are super-helpful for reviews content. I noticed more and more of this on sites benefiting from the December Product Reviews Update. When comparing multiple products, organizing all of the details in one chart can help users get the lay of the land.

Charts in product reviews.

Strong Visuals (Photos, Videos, and Gifs):
When looking for help with choosing the right product, killer visuals can clearly help users. If you provide reviews content, I would think way beyond a stock manufacturer photo and provide original photos of the product you are reviewing (if possible). In addition, providing video of the product so readers can get a strong feel for the item can really help. And providing quick gifs of how things work can also be beneficial.

Here is a series of visuals for a specific product review:

Photos in product review articles.

Here is Wirecutter showing off several bike racks:

Photos in product review articles.

Actual experience using the product. Prove you really tested the product and understand the pros and cons:
When reviewing a lot of product reviews content (see what I did there?), it was easy to see when the person reviewing a product truly used it, tested it, etc. Those reviews went way beyond the superficial, and I really dug that. By the way, one of Google’s latest guidelines explains that reviews should provide this level of detail. I agree, and it’s clear that Google is trying to surface those types of reviews in the search results. You don’t have to look any further than Wirecutter to see examples of this type of review. More about “The Wirecutter Standard” soon.

Animation and video in product review articles.

And as I covered in my post about the April Product Reviews Update, make sure experts or enthusiasts are writing the articles (author expertise). After analyzing many reviews, you could clearly tell when the author had a lot of experience in the topic they were covering. And no, just having an expert write the content doesn’t mean it will be super high quality. But, making sure someone has expertise in the subject matter they are writing about can clearly help drive insightful, helpful, and high quality content. In other words, don’t just have anyone writing your reviews content.

Explain your review process and why people should trust you:
When people are searching for reviews, providing your review process clearly for readers is a smart idea (and can help break down barriers). Explain how you review products, the process you take, the metrics you are using to rate products, how you chose the top products per category, and more.

If you want a good example of how this looks, you should check out Wirecutter. They provide that information in each review. Safewise also provides a section covering how they review products, but it’s at the end of the review. In my opinion, I would put that higher up the page to help users understand the process before reading the reviews. That said, they at least provide that information for readers.

Here is a section from a Safewise review of wireless cameras:

Explaining how reviews were conducted in product review articles.

And here is a section from a Wirecutter review about why you should trust them:

Explaining why readers should trust a product reviews site.

The Wirecutter Standard: Try to be the Wirecutter for your niche. You can’t go wrong.
I mentioned Wirecutter earlier, just provided a bunch of examples from the site, and also covered it heavily in my post about the April Product Reviews Update, and all for good reason. Wirecutter provides some of the most thorough, high quality, insightful, and helpful reviews on the web. And it’s no coincidence they have done very well over time rankings-wise.

When helping companies focused on reviews, I find myself often saying, “be the Wirecutter for your niche”. I know that’s not an easy thing to do, since Wirecutter has a killer team of writers, people testing products, editors, etc., but it’s what many sites should strive for in my opinion. I recommend spending some time on their site, seeing how they break down reviews, the level of detail provided, and try to emulate that for your own niche.

The Wirecutter standard for providing product reviews.

Summary: Understand The April PRU, Learn About The December Update, and Get Ready For The Next
Like the April Product Reviews Update, the December PRU was significant for many sites focused on reviews. It was core update-like for many, yielding surges, drops, and recoveries across verticals. If you have been impacted by the December PRU, I recommend reading Google’s blog posts about the updates and my post about the April Product Reviews Update, reviewing this post about the December update again, and then objectively analyzing your own site.

I would surface gaps in your reviews content and fill those gaps, improve the user experience, watch the aggressive ad situation, and work to become the Wirecutter of your niche. Don’t improve a little… make BIG changes. Remember, there are dueling machine learning systems out there looking to surface the highest quality content possible. Control what you can control. That’s the best path forward in my opinion.

GG

Back to table of contents

Filed Under: algorithm-updates, ecommerce, google, seo

The Link Authority Gap – How To Compare The Most Authoritative Links Between Websites Using Majestic Solo Links, Semrush Backlink Gap, and ahrefs Link Intersect

November 11, 2021 By Glenn Gabe Leave a Comment

Link Authority Gap

While helping companies that have been heavily impacted by Google’s broad core updates, the topic of “authority” comes up often. And that’s especially the case for companies that focus on “Your Money or Your Life” (YMYL) topics. For example, sites that focus on health, medical, financial, etc. all fall under YMYL.

If you’ve read my posts about broad core updates, then you know that Google is taking many factors into account when evaluating sites (and over a long period of time). It’s one of the reasons I recommend taking the “kitchen sink” approach to remediation where you surface all potential issues impacting a site quality-wise, and work to fix them all (or at least as many as you can). For example, improving content quality, user experience, technical SEO problems causing quality problems, the site’s advertising setup, affiliate setup, and more.

And for sites that cover YMYL topics, it’s important to understand that they are held to a higher standard. Google wants to surface the most authoritative sites for sensitive queries (for topics that can impact the health, happiness, financial status, etc. of a user). And that’s one area where authority comes into play.

“Authority” can be a nebulous topic, so I’ll just focus on one aspect for this post: links. Danny Sullivan (pre-Google) actually wrote a good post about the subject where he interviewed Paul Haahr, distinguished engineer at Google. In that post, Danny explained how a mixture of factors is used to measure “authority”, with PageRank being the “original authority metric” (links from across the web). So links are not the only factor contributing to authority (there’s a bucket of signals Google is using), but they are important. By the way, Paul is one of the smartest guys at Google from a ranking perspective. I’ve always said that when Paul talks, SEOs should listen. :)

PageRank, the original authority metric.
Google calculating authority.

In addition, Google has explained that PageRank, or links from across the web, is one of the most well-known factors when evaluating authority. That’s from a whitepaper Google published about fighting disinformation across the web.

Google, PageRank, and E-A-T

Google also explained in the whitepaper that when it detects a YMYL query, then it gives more weight in its ranking system to factors like authoritativeness, expertise, and trust. So those links and mentions are even more important for YMYL queries/topics.

Google, PageRank, E-A-T, and YMYL

Going even further, Google’s Gary Illyes explained at Pubcon that E-A-T is largely based on links and mentions from authoritative sites. Here is a tweet from Marie Haynes after Gary explained this:

I asked Gary about E-A-T. He said it's largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that's good.

He recommended reading the sections in the QRG on E-A-T as it outlines things well.@methode #Pubcon

— Marie Haynes (@Marie_Haynes) February 21, 2018

But there’s more… In 2020, Bill Slawski covered a super-interesting patent that discusses expertise and how Google might be categorizing sites to determine which set of sites should even be considered for ranking (for certain types of queries). By the way, that sure sounded a lot like what happened with the Medic Update in August of 2018 (when many health and medical sites saw extreme volatility). As part of that patent, E-A-T was mentioned with a focus on “authority” (which we know includes links and mentions from authoritative sites).

As you can see, links are important and can provide a strong signal to Google about the authoritativeness of a page and website. And as Gary Illyes once explained, that includes mentions on well-known sites (and not just links).

A Link Authority Gap Analysis: Comparing The Most Powerful Links Between Sites
Based on what I explained above, it can be important to understand where powerful links and mentions are coming from (both to your site and to your competitors). And that’s especially the case if you focus on YMYL topics. Therefore, I find it’s always interesting to compare link profiles, understand the differences in links from authoritative sites, and identify gaps in your own link profile in order to have a solid understanding of your current state.

The tough part for site owners is understanding that there’s no quick fix for a situation where your competitors have a clearly stronger link profile from an authority standpoint. You can’t just go out the next day and gain links from CNN, the CDC, The New York Times, and other authoritative sites. Note, as I explained earlier, rankings (and “authority”) are not just about links, so this isn’t a binary rank-or-not-rank situation. But again, the right links can be a strong signal to Google about the authoritativeness of a piece of content, or a site overall.

For example, it’s not easy to earn links from the following domains…

Majestic Solo Links

Also, from a link earning perspective, you typically need to build a strong content strategy, social strategy, and possibly digital PR strategy to help earn links naturally over time. And that takes time… it’s not something you can do quickly (if done naturally). And if you find a large gap from a link authority standpoint, it’s definitely not a reason to spam every news publisher in the world to try and gain links. That typically won’t work well and could backfire big-time.

Peeling back a link profile to reveal “link authority gaps”:
In this post, I’m going to explain how to quickly surface the most authoritative links pointing to a competitor when those same sites don’t link to your own site. In other words, finding the authority gap between sites. The post isn’t about doing a deep dive into a link profile, which can also be beneficial. Instead, it’s focused on quickly identifying important gaps from an authority standpoint.

Again, this is a nuanced topic and just because one site has links from X domain doesn’t mean that’s why it is ranking above you. But, the link authority gap can give you a pretty good view of why a site may be seen as more authoritative than another. In addition, you can learn a lot about the type of content earning those links and possibly identify content gaps on your own site.

Below, I’ll cover how to run a quick link comparison using three tools:

  • Majestic Solo Links
  • Semrush Backlink Gap
  • ahrefs Link Intersect tool
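By the way, the core “gap” calculation behind all three tools is simple enough to reproduce yourself if you want to sanity-check the results or combine exports from multiple tools. Here’s a minimal Python sketch, assuming you’ve exported a referring-domain list for each site as a CSV. Note, the file names and the “Domain” column are hypothetical, so adjust them to match whatever your tool actually exports:

```python
import csv

def referring_domains(path, domain_col="Domain"):
    """Load the set of referring domains from an exported CSV.
    The column name is an assumption, so adjust it to match your tool's export."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[domain_col].strip().lower() for row in csv.DictReader(f)}

# Hypothetical export files from your backlink tool of choice:
theirs = referring_domains("competitor_referring_domains.csv")
ours = referring_domains("our_referring_domains.csv")

# The "link authority gap": domains linking to the competitor, but not to us.
gap = sorted(theirs - ours)
print(f"{len(gap)} referring domains link to the competitor but not to us:")
for domain in gap[:25]:  # preview the first 25
    print(" -", domain)
```

That won’t give you Trust Flow, Authority Score, or Domain Rating (those are each tool’s proprietary metrics), but it will show you the raw gap in referring domains.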

Solo Links in Majestic:
I love Solo Links and it has definitely flown under the radar for a really long time. It compares the top 500 referring domains for two different websites (ordered by Majestic’s Trust Flow metric). I have found it to be a strong way to surface the most powerful domains linking to one site, and not another. And no, Google doesn’t use Trust Flow or any third-party metrics from the various tools. It’s just Majestic’s internal metric for estimating quality based on being closely linked to “trusted seed sites”.

Just enter the two domains you want to compare and click the magnifying glass. You’ll see a Venn diagram representing the overlap, and the gaps, between both sites. Below the Venn diagram, you’ll see the referring domains listed, the number of links, and you’ll have the ability to quickly view the links. For example, if you click the link count, you will see the actual links along with supporting information (like anchor text, whether it’s nofollow, when it was first seen, etc.). It’s not perfect (no tool is), but it can sure get you moving in the right direction when trying to determine the link authority gap.

For example, here’s a gap analysis for a powerful site in the health niche compared to a smaller, less authoritative site. You can clearly see the differences in the Venn diagram and then in the referring domains list.

Solo links for health sites venn diagram
Solo links for health sites, referring domains

A quick note to Majestic:
Please add the ability to export all of the links in bulk and not just the domains. It would be great to simply click “export” and get all of that data in one shot. It’s a bit cumbersome now to get that data out of Solo Links since you can only export links for each domain and not all of the domains listed at one time…

Exporting in Majestic Solo Links
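Until Majestic adds that, a rough workaround is to download the per-domain exports into one folder and combine them yourself. Here’s a quick sketch, assuming each export is a CSV with the same headers (the folder and file names are hypothetical):

```python
import csv
import glob

# Hypothetical folder of per-domain link exports downloaded from Solo Links.
paths = sorted(glob.glob("solo_links_exports/*.csv"))

with open("all_solo_links_combined.csv", "w", newline="", encoding="utf-8") as out:
    writer = None
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            reader = csv.DictReader(f)
            if writer is None:
                # Use the first file's header row for the combined output.
                writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
                writer.writeheader()
            writer.writerows(reader)

print(f"Combined {len(paths)} exports into all_solo_links_combined.csv")
```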

Backlink Gap in Semrush:
Similar to Solo Links, Semrush’s Backlink Gap tool enables you to enter up to five domains to start comparing link profiles. It doesn’t just focus on the top links, but the data is presented based on Semrush’s Authority Score (AS) metric. And just to reiterate, Google does NOT use Authority Score or any other third-party metric when determining rankings… It’s just a Semrush-specific metric.

After entering at least two domains, you have options for viewing the links pointing to the competition, but not your site, or vice versa. Again, it’s not perfect, but can help you start to understand powerful links leading to your competitors that you don’t have.

Backlink Gap in Semrush

Note, the default view is to list the referring domains (without showing the links). To view the actual links leading to your competitor’s site, you can click the arrow to the right of the referring domain in the chart. It’s not the most intuitive way to find the links, but they are there. :)

Viewing links in Semrush Backlink Gap

Link Intersect in ahrefs:
Another tool that has flown somewhat under the radar is ahrefs’ Link Intersect tool. It enables you to enter up to four competitor domains and view gaps in your own link profile (ordered by Domain Rating, a metric created by ahrefs). And once again, Google does not use third-party metrics like Domain Rating, Authority Score, Domain Authority, etc.

Like the other tools I mentioned, the link intersect tool can shed some light on authoritative links pointing to the competition that you don’t have. For example, what are those sites linking to content-wise, do you have content like that on your own site, what can you build that’s even better than that, etc.?

To use the tool, simply enter a competitor’s domain (or several) and then your domain in the “but doesn’t link to” field. The tool will find links leading to the competition, but not to your site. The list of referring domains will be ordered by ahrefs’ Domain Rating. To view the actual links per referring domain, just click the number in the column for each competitor. You can see information about each link, including the anchor text, the url that’s linking to the site, whether the link is nofollow, if the link was found in the content, etc.

ahrefs Link Intersect Tool
Referring Domains in ahrefs Link Intersect Tool
Exporting links in ahrefs Link Intersect Tool

You can also export the list but, just like Majestic, you can’t export all of the links in one shot; you can only export the referring domains… So you’ll have to export links per referring domain, which can be tedious. Just like my note to Majestic, I have the same request for ahrefs! Please add the ability to export all links from the Link Intersect tool in one shot. That would be a great addition.

Exporting domains in ahrefs Link Intersect Tool
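And while we wait for that export button, the intersect logic itself is easy to reproduce from full referring-domain exports (one per competitor, plus your own). Here’s a rough sketch with hypothetical file names and a “Domain” column as an assumption:

```python
import csv
from functools import reduce

def domains(path, col="Domain"):
    """Referring domains from an exported CSV; the column name is an assumption."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[col].strip().lower() for row in csv.DictReader(f)}

# Hypothetical full referring-domain exports:
competitors = [domains(p) for p in
               ("competitor_a.csv", "competitor_b.csv", "competitor_c.csv")]
ours = domains("our_site.csv")

# Link Intersect logic: domains linking to ALL of the competitors, but not to us.
intersect = reduce(set.intersection, competitors) - ours
for domain in sorted(intersect):
    print(domain)
```

Domains linking to every competitor you checked, but not to you, are often the most interesting ones to dig into content-wise.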

Tips and recommendations when evaluating the link authority gap:

  • “Authority” is not binary: Links (PageRank) are just one factor. Sure, an important factor, but just one. Google has explained it uses a bucket of signals to determine authority. While you are analyzing the link authority gap, just understand that links alone might not be the reason a competitor is outranking you. It’s great to see the gap, and it can be important for forming a long-term strategy to improve authority-wise, but it’s not a rank-or-not-rank situation.
  • Understand the Search competition versus just the competition: Make sure you are comparing against your actual competitors in Search (the sites you are competing with in the search results) and not just the sites you think are your top competition (outside of Search). Your goal here is to understand the link authority gap between your site and the sites outranking you for target queries.
  • Mine the gaps, create a plan: Identify gaps in your link profile from authoritative sites, understand the content that gained those links, determine if you have content covering those topics, and figure out ways to produce killer content that can earn links like that (or even better). Don’t just try to match what your competitors are doing. Try to outdo them.
  • Have patience… building authority is a long-term process: Understand that earning powerful links from authoritative sites is a long-term process. You cannot change the situation overnight… That might be frustrating, but it’s true. Understand what the competition is doing to earn those links, figure out gaps on your site, and build a killer content strategy to earn amazing links over the long-term. Once you start earning those links, you’ll appreciate that it takes time, since it can build a natural layer of protection for you (until others start going through the same process, put the work in, etc.)

Summary – Understanding the link authority gap can be enlightening, but you must execute.
By using the process I documented in this post, you can quickly start to understand the most authoritative sites linking to your competition that don’t link to your site. And that “link authority gap” can be important. But don’t stop once you uncover the gap. You need to execute based on your analysis in order to make an impact. So… what are you waiting for?? The data awaits.   

GG

Filed Under: google, seo, tools

How to identify ranking gaps in Google’s People Also Ask (PAA) SERP feature using Semrush

October 19, 2021 By Glenn Gabe Leave a Comment

When performing a competitive analysis, it’s smart to run a keyword gap analysis to determine the queries that competitors rank for that your site doesn’t rank for. It can often yield low-hanging fruit that your content team can execute on. As part of that process, it’s also smart to analyze Google’s People Also Ask (PAA) SERP feature for queries your site already ranks for, or doesn’t rank for (to determine what those additional queries are and which sites are ranking for them). I find this step is often skipped for some reason, and it can also yield powerful findings that you can start to execute on quickly.

In this post, I’m going to walk you through the process of identifying ranking gaps in People Also Ask (PAA) using Semrush, which provides outstanding functionality for mining PAA data.

What we’re going to accomplish:
For this tutorial, I’m going to filter queries leading to a site by top ten rankings and then layer a secondary filter to surface queries where a People Also Ask module appears in the SERP, but the site doesn’t rank in the default PAA listings. In other words, you rank in the top ten, but you don’t have content ranking in PAA for those queries for some reason. I’ve found that can yield very interesting findings that sites can execute on pretty quickly.

For example, in the screenshot below, a site ranks in the top ten for 3,359 queries where it does not rank in the default People Also Ask (PAA) module:

Viewing Google ranking gaps in people also ask via semrush.

Step-by-step instructions for identifying PAA gaps via Semrush:

1. First, fire up Semrush and enter the domain name you want to analyze.

Enter domain name in semrush.

2. Access the Organic Research reporting.
Click Organic Research in the left-side navigation, which will bring us to a powerful set of features for analyzing the search performance of the domain, subdomain, directory, or url you enter.

Viewing Organic Research reporting in Semrush.

3. View all rankings for the domain via the Positions tab.
Click the Positions tab, which will yield all queries that a site ranks for in the top 100 listings (based on Semrush data).

Viewing the positions tab in Organic Research in Semrush.

4. Filter by top ten results:
Next, we are going to filter the results by queries where the site ranks in the top ten (so these are queries where the site already ranks very well, but might not have content that ranks in People Also Ask). I’ll cover the second part of this step next, but start by clicking the Positions filter and selecting Top 10.

Filter by top ten rankings in Semrush.

5. Layer a secondary filter for PAA:
To complete this step, we want queries ranking in the top ten, but where the site doesn’t rank in People Also Ask (PAA), which can provide a great opportunity to fill gaps content-wise. To view these queries, click “SERP features”, then “Domain doesn’t rank in” and then select “People Also Ask”.

Add a filter for when a site doesn't rank in people also ask via Semrush.

6. Analyze the data:
Now that you have two filters selected, you will be presented with all of the queries where the site ranks in the top ten but doesn’t rank in the default PAA module. When you scan the list, definitely spot-check the live search results to see which questions are listed in People Also Ask, which sites are ranking there, the content ranking for those queries, etc. Again, you can identify content gaps, format gaps (like video), and more. It’s a quick way to help your content team identify opportunities (and some gaps you find might lead you to face-palm). :)

Final report showing ranking gaps in People Also Ask via Semrush.

7. Export the data:
You can always export the results, in Excel or CSV format, if you want to work with the data in other programs.

Export ranking data in Semrush.
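And once you have the export, a little scripting can help you prioritize the gaps before handing them to your content team. Here’s a rough pandas sketch, assuming the export contains “Keyword”, “Position”, and “Search Volume” columns (double-check your export’s actual headers, since they can change):

```python
import pandas as pd

# Hypothetical Semrush export of top-ten queries with no PAA ranking.
df = pd.read_csv("paa_gap_export.csv")

# Re-apply the top-ten filter, just in case extra rows slipped into the export.
df = df[df["Position"] <= 10]

# Prioritize the biggest opportunities: highest-volume queries first.
priority = df.sort_values("Search Volume", ascending=False)
print(priority[["Keyword", "Position", "Search Volume"]].head(20))
```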

And there you have it. A quick and easy way to identify ranking gaps in PAA via Semrush. It only takes a few minutes to run, and you’ll have a boatload of PAA data to go through (where your site ranks, but not in PAA). By the way, if you’re looking for other posts I’ve written about Semrush tools, then check out The Link Authority Gap which shows you how to compare the most authoritative links between websites.

PAA Gaps – Final tips and recommendations:
Before I end the tutorial, here are some final notes and recommendations based on helping clients go through this process over time. Semrush is like a Swiss Army Knife for SEO research, so make sure you are getting the most out of the tool.

  • Live graphs – Remember that the graphs in Semrush are live graphs, so they change based on the filters you select. Therefore, you can see trending over time for ranking (or not ranking) in PAA when you already rank well in the organic search results. It’s a cool way to visually see your progress.
  • Advanced filtering – Use advanced filtering in Semrush to fine-tune your analysis. For example, combine filters like search volume, keywords, directory, subdomain, urls, etc. You can filter the data multiple ways in Semrush (and combine those filters for advanced reporting). Play around there… you might find some great combinations that surface important data.
  • PAA by country – Run this analysis by country! Just change the country you are analyzing in the reporting, and voila, you have a fresh set of queries where your site doesn’t rank in PAA.
  • By device – Be sure to check both mobile and desktop data. Similar to country, just select desktop versus mobile in the filters to see each resulting dataset. You might find some differences there.
  • Spot check the results – Make sure you are spot-checking the actual SERPs. PAA can obviously change (and Semrush isn’t always perfect), so make sure you really aren’t ranking in PAA for those queries. Then form a plan of attack once you identify the gaps.
  • PAA formats – Keep an eye on the format of content ranking in People Also Ask. As I mentioned earlier, video could be ranking there as well. Understand the types of content Google is ranking based on query and choose the right formats for the job.
  • View historical rankings – You can easily change the dates via Semrush! For example, you can look back in history and run this analysis for previous months. Have you improved, declined, or remained stable? How has Google changed with regard to PAA for those queries? (See the sketch after this list for one quick way to compare two months of exports.)
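On that last tip, here’s a rough way to diff two months of the same exported report to see which PAA gaps you’ve closed (and which new ones appeared). The file names are hypothetical, and I’m assuming each export has a “Keyword” column:

```python
import pandas as pd

# Hypothetical exports of the same PAA-gap report run for two different months.
before = set(pd.read_csv("paa_gaps_2021_09.csv")["Keyword"].str.lower())
after = set(pd.read_csv("paa_gaps_2021_10.csv")["Keyword"].str.lower())

closed = before - after  # gaps that no longer show up in the report
new = after - before     # fresh gaps that appeared this month

print(f"{len(closed)} PAA gaps closed, {len(new)} new gaps appeared.")
```

Note, a query can drop out of the report for other reasons too (like falling out of the top ten entirely), so spot-check the SERPs as I mentioned above.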

Summary: Identifying PAA Gaps Via Semrush can be powerful.
It’s hard to overlook People Also Ask when analyzing the SERPs, and the feature often contains important questions that people are searching for based on the original query. By using the process I detailed in this tutorial, you can surface and export queries where your site already ranks in the top ten search results, but doesn’t rank in PAA. In my opinion, it’s a great way to identify low-hanging fruit that your content team can dig into quickly. You never know, you might find some quick wins… or many of them. Have fun.

GG

Filed Under: google, seo, tools
