What happens to crawling and Google search rankings when 67% of a site’s indexed urls are pagination? [SEO Case Study]

October 7, 2021 By Glenn Gabe

There has been a lot of confusion and debate over the years about how best to handle pagination from an SEO perspective. It doesn’t help that Google has changed its approach on this front, which has led to even more confusion about the best path forward. For example, Google announced in March of 2019 that it had stopped supporting rel next/prev for consolidating indexing properties from across a paginated set (and it admitted that had been the case for years – just nobody knew).

That shocked many in the SEO community and left site owners and SEOs wondering about the best way to handle pagination moving forward (if changes were needed at all!). For example, should pagination be indexable, should site owners use “noindex, follow” instead, or should they just canonicalize pagination to page one in the series? Since pagination is present on so many different types of sites, and can lead to a large number of additional pages on a site, site owners and SEOs wanted to address the situation as best they could.

But how important is that choice? Will choosing the wrong path cause serious problems SEO-wise? And what if a majority of your indexed pages are pagination? These are all good questions, and I hope this case study will provide at least a few answers based on a client I’m helping whose site contains a lot of pagination (comprising 67% of indexed pages). Yes, 67%.

So, did this cause big problems SEO-wise? Is Google spending so much time crawling pagination that it’s missing fresher, more important content? And are rankings being impacted (GASP)?? Let’s begin.

Google’s stance on handling pagination over the years:
Since I’ve helped many large-scale sites with a lot of pagination over time, I’ve been able to see (and experience) the evolution of how Google handles that pagination. That experience led me to write a blog post covering how to set up pagination for SEO, which contains all of the latest updates and announcements from Google. And there have been several updates to cover…

Back in 2012, Google’s Maile Ohye published an outstanding video covering SEO best practices when providing pagination. In that video, Maile explained the various ways you can set up pagination based on content type, including articles split into multiple pages and category pagination (like for e-commerce retailers). In that video she also explained how to use rel next/prev to consolidate indexing properties from across a paginated set. The blueprint that Maile mapped out in that video became the foundation for setting up pagination SEO-wise, and I often referenced that video in my audits, posts, and presentations.

Here is a screenshot from Maile Ohye’s video from 2012 explaining more about rel next/prev for pagination. Note, rel next/prev is not supported anymore for consolidating indexing properties across pagination. More about that next:

Former Googler Maile Ohye's video covering pagination SEO best practices.

Google Nukes rel next/prev (and nobody noticed):
As I mentioned earlier, in 2019 Google dropped a bomb on SEOs and explained that it doesn’t support using rel next/prev anymore to consolidate indexing properties from across a paginated set. And to add insult to injury, they also explained this had been the case for years! It’s worth noting that it’s still good to use rel next/prev for accessibility purposes, but it will have no effect SEO-wise.

As you can guess, SEOs went ballistic (although I think it’s fair to say we’re happy that they told us!) Google figured this out after checking some of its systems and noticing rel next/prev wasn’t being used for indexing and ranking purposes. Whoops.

Google's John Mueller explaining rel next/prev isn't supported anymore.

As someone who has helped many companies set up rel next/prev for pagination, that last point got me thinking… If rel next/prev hadn’t been used by Google for years, and the companies I’ve been helping didn’t even notice that Google stopped supporting it (rankings were unaffected), then maybe Google was pretty darn good at handling pagination all along.

In other words, maybe we were getting all bent out of shape for no reason. Again, the clients I helped that had a lot of pagination didn’t see any major movement or drops based on Google removing support for rel next/prev. And that also matches what Google’s John Mueller has been explaining for a while. Heck, he even explained that in the tweet thread from 2019. For example, he said “most seem to be doing pagination in reasonable ways that work…” I’ll cover more about John’s recommendations next.

Google's John Mueller explaining that most sites are doing pagination in reasonable ways.

Google’s John Mueller Peppered With Questions About Pagination:
Ever since Google stopped using rel next/prev to consolidate indexing properties, Google’s John Mueller has been asked many questions about the best way to set up pagination. And John has provided some great advice in his Search Central Hangout videos.

John explained that Google has a lot of experience dealing with pagination (as long as it can identify pagination easily). And since it has a lot of experience dealing with pagination, it really shouldn’t have a big impact on a site SEO-wise. It can just “work”.

Also, John often explains the differences between splitting article content into multiple pages and having category pages that contain pagination (listing pages that lead to other urls). Those are two very different scenarios and can be handled differently from a pagination standpoint, if needed.

He has also explained that for category pagination (like an e-commerce category page), you can have all of the pages indexable (which is preferable), you can use “noindex, follow”, or you can even canonicalize to page one in the series. It really depends on how well your content is crosslinked, how important the pagination is for discovery, for passing signals, etc. You can read my post about pagination to see the latest updates from Google about this.

Here is one of John’s latest videos about setting up pagination covering most of what I just explained (at 13:16 in the video):

It’s also worth noting that Google just published some outstanding best practices for e-commerce SEO (where they cover how to handle pagination as well). In that document, they explain that you should provide self-referencing canonical tags for each page in the pagination (and avoid canonicalizing to page one in the series). That means all pagination should be indexable.

ecommerce best practices for pagination from Google
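To make that setup concrete, here is a minimal sketch of what the markup could look like for each page in a paginated set. The URL structure, function name, and /page/N/ pattern are hypothetical and purely illustrative (not from Google’s document); adapt them to your own site:

```python
def pagination_head_tags(base_url: str, page: int, last_page: int) -> list[str]:
    """Build head tags for one page in a paginated category.

    Mirrors the setup described above: every page gets a
    self-referencing canonical (no canonicalizing to page one),
    and rel next/prev is kept only as an accessibility hint,
    since Google no longer uses it for indexing.
    """
    # Page 1 lives at the base URL; deeper pages at /page/N/ (hypothetical).
    url = base_url if page == 1 else f"{base_url}page/{page}/"
    tags = [f'<link rel="canonical" href="{url}">']
    if page > 1:
        prev_url = base_url if page == 2 else f"{base_url}page/{page - 1}/"
        tags.append(f'<link rel="prev" href="{prev_url}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base_url}page/{page + 1}/">')
    return tags

# Example: tags for page 3 of a 10-page category.
for tag in pagination_head_tags("https://example.com/widgets/", 3, 10):
    print(tag)
```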

Warning: Nuance Ahead! Two important points for SEOs and site owners about handling pagination:
As with most things in SEO, there is nuance involved with handling pagination. First, the size of your site is important since you don’t want too much pagination to hamper crawl budget. As John explained, if you don’t have hundreds of thousands of pages (or more), then you shouldn’t really have to worry about pagination impacting crawl budget. But if you do have that many pages, then it’s important to make sure Google is focusing crawling on your most important pages. And that might lead to handling pagination differently across a site. More on crawl budget soon in the case study below.

The other thing I wanted to point out is that I’m a firm believer you shouldn’t have excessive pagination per sequence. For example, I would avoid providing thousands of pages of pagination per paginated set (like one category with thousands of pages of pagination). I think it’s fine to have many paginated sets (if needed), but I would only provide a reasonable number of pages per set. That number can vary based on your site and content, but I wouldn’t overload your pagination with thousands of pages per sequence. I would do what’s best for users and search engines.

For example, here’s a site with close to 3,700 pages of pagination in one category…

Excessive pagination per set.

The Case Study: Background information, pagination setup, and indexing levels.
The site I’m covering in this case study has a crawl footprint of about 200K pages (between indexed and excluded urls). I can’t go into too much detail about what the client focuses on, but you can think of them as a site that provides a wealth of information about categories, organizations, products, and reviews of those products. There is also a blog on the site containing in-depth content focused on their niche.

From a pagination standpoint, there is quite a bit of it across the site. That’s primarily on the category, organization, and review pages. Since there are many products and reviews that aren’t crosslinked well on the site (based on how the business operates), the pagination is important for discovery (for Googlebot and for users). For that reason, we definitely wanted to make sure the pagination was indexable and that Google could follow the links on the paginated urls to the product pages, the review pages, etc.

I’ve been helping this company for a long time and helped them craft their pagination strategy as far back as 2012. They have been using the approach that Maile Ohye described in the video I covered earlier: each page in the pagination contains a self-referencing canonical tag, contained rel next/prev (now deprecated), and provides strong text navigation to the paginated series at the bottom of each page. Remember, this was the recommended setup for a long time. The site has had this setup for years… and you can learn more about it in my post about setting up pagination.

Paginated urls using self-referencing canonical tags.

Indexing Levels: Holy pagination, Batman!
Based on how much pagination is on the site, it currently makes up a large percentage of indexed urls. To be specific, pagination makes up 67% of total indexed pages. Yes, more than two-thirds of indexed urls are pagination! The horror!! :)

Number of paginated urls indexed.
Total indexing in GSC's Coverage reporting.
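As an aside, measuring this share on your own site is straightforward if you can export a list of indexed urls. Here is a rough sketch, assuming a hypothetical indexed_urls.csv export and a /page/N or ?page=N pagination pattern (adjust both to your own site). The same filter applied to a performance export also gives you the share of clicks going to pagination, which I cover later in this post:

```python
import pandas as pd

# Hypothetical export with one "url" column (e.g. stitched together from
# GSC's Coverage reporting or a crawl of indexed pages).
indexed = pd.read_csv("indexed_urls.csv")

# Adjust this pattern to your own pagination scheme.
pattern = r"(?:[?&]page=\d+|/page/\d+)"
is_paginated = indexed["url"].str.contains(pattern, regex=True)

print(f"{is_paginated.sum():,} of {len(indexed):,} indexed urls are "
      f"pagination ({is_paginated.mean():.1%})")
```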

These are not necessarily pages that we want to rank in the SERPs (other than the first page in the set), but we definitely want Google to discover all of the destination pages from the pagination. And remember, rel next/prev was supposed to take care of that for us by consolidating indexing properties from across the paginated set and (usually) surfacing the first page from the set in the SERPs. That’s not supported anymore as I covered earlier. More on rankings soon.

Crawl Budget and Crawl Stats
With that many paginated pages on the site being indexed, what do crawl stats look like? Is Googlebot churning through many paginated pages and missing important, fresher pages?

First, crawl budget is really only something that larger-scale sites need to worry about. For example, Google states that sites with 1M+ unique urls, or medium-sized sites with 10K+ unique urls with “rapidly changing content”, need to worry about crawl budget. This site has 18.6K urls indexed and a total crawl footprint of under 200K urls. So, it’s not a small site, but it’s not huge either.

In addition, it’s important to understand that all pages are not crawled with the same frequency (based on importance). Google’s John Mueller has explained this point many times over the years. So, although there might be a lot of pagination on the site, that doesn’t mean Google will continually crawl all of that pagination instead of your fresher (or more important) content.

Here is John explaining this (at 12:42 in the video):

The site’s new content does get crawled relatively quickly. In addition, checking the crawl stats reporting, you can see fresher content being crawled recently, and pagination does not overwhelm the reporting. So despite 67% of the site’s indexed pages being pagination, it’s fine from a crawl budget and crawl stats perspective (in my opinion). That makes sense based on what Google has explained over the years about crawl budget.

Performance over time. How does trending look?
OK, this is where the rubber meets the road. Is having that much pagination indexed impacting rankings or organic search performance at all?

In a word, nope.

The site’s performance has been extremely stable over the years (and through a number of broad core updates). They have seen strong growth over the long term as well (especially since 2012, when I first started helping them).

Here is the past 16 months of trending from GSC:

Stable trending from Google organic search despite many paginated urls indexed.

Here is search visibility trending over the past two years:

Stable visibility trending despite having many paginated urls indexed.

And here is search visibility trending since 2012:

Strong visibility growth over time despite many paginated urls indexed.

How about pagination ranking in the search results (beyond page one)? Are paginated pages being surfaced in the SERPs and driving traffic?
No, not really. Pagination accounts for a very small percentage of traffic from Google Search. For example, over the past three months, there have been 1.62M clicks from Google web search, and pagination accounted for just five thousand of those clicks.

Pagination accounts for just 0.3% of total clicks from Google Search over the past 3 months:

Clicks from Google Search to paginated urls.

I think the most important thing to remember for this site is that the various “product” pages aren’t being linked to sufficiently from other areas of the site (based on the company’s business model and how the site needs to be set up). So, the pagination is important for making sure Googlebot can get to many of those destination urls. And that’s why it’s important to make sure the pagination is indexable, that it uses self-referencing canonicals, etc.

Final tips and recommendations for sites with a lot of pagination:
So there you have it: a site with 67% of its indexed urls being pagination, and it’s chugging along just fine in the SERPs. I’m not saying this approach is what every site should use, but just like Google’s John Mueller has explained many times, Google has a lot of experience handling pagination. It often just works… To end this post, I’ve provided some final tips and recommendations for sites dealing with a lot of pagination.

  • Pagination setup: Setup-wise, you have several methods at your disposal for handling pagination. The path you choose depends on the type of content you are dealing with and your internal linking structure. See my recommendations above about content split across multiple pages versus listing pages (like categories), and how well those destination pages are linked across the site.
  • Indexable vs. Non-indexable: It’s important to understand that if you noindex pagination, then links on those pages can be dropped by Google over time. So if you want to make sure Google is finding those destination urls via the pagination (and passing signals), I would keep the pagination indexable (a quick way to verify your setup is shown in the sketch after this list). As this case study showed, it can work very well.
  • Canonicalizing Pagination: If you are canonicalizing all pagination to the first page in the series, it’s important to understand that rel canonical is just a hint for Google. It can still choose to index certain pages if it believes that’s the right thing to do. I have covered this in several blog posts over the years. Don’t assume canonicalized pages are actually being canonicalized. This is one reason I’m not a huge fan of canonicalizing pagination to the root page in the set.
  • Crawl Stats and Log Files: Don’t just check indexing levels. I would check the crawl stats reporting in GSC as well (and log files if you can get them). As I covered earlier in this post, not all pages are crawled with the same frequency. Google can crawl certain urls on your site more frequently based on crawl demand. That means pagination might not be crawled as much as your homepage, important category pages, product pages, etc. Don’t assume that high indexing levels for pagination means there’s a problem. It could be totally fine.
  • Tracking Performance: Make sure you track all of this over time and determine if pagination is causing issues SEO-wise (which I doubt for most cases). Again, Google can handle pagination very well and has a lot of experience doing that (as long as it can identify the pagination easily). Using pagination when it’s necessary is fine (and can be good). I would worry more about the number of paginated pages per set than the total number of paginated pages that are indexed on the site. Like I said earlier, make sure your pagination makes sense for both users and for search engines.
  • Google’s Recommendations: Finally, listen to Google’s John Mueller and read Google’s documentation. Google has a lot of experience handling pagination across sites, and it can often just “work”. I recommend implementing a pagination solution based on your own situation, analyzing that setup over time, and making sure it’s working for you.
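To tie a few of these tips together, here is the spot-check sketch mentioned above. It’s a rough, hypothetical example (the urls are placeholders) that confirms what each paginated url actually declares: a self-referencing canonical and no accidental noindex:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical paginated urls to spot-check; swap in your own.
urls = [f"https://example.com/widgets/page/{n}/" for n in range(2, 6)]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})

    canonical_href = canonical.get("href") if canonical else None
    robots_content = robots.get("content", "").lower() if robots else ""

    # Flag pages canonicalized away from themselves or accidentally noindexed.
    print(url)
    print("  self-referencing canonical:", canonical_href == url)
    print("  noindex present:", "noindex" in robots_content)
```

Remember, this only shows what the pages declare. Whether Google honors a canonical is another story, so pair a check like this with the URL Inspection tool and the Coverage reporting in GSC.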

Summary – Yes, your site can be fine SEO-wise with a lot of pagination indexed.
If you are dealing with a lot of pagination across your site, I hope this case study was helpful. I know there’s been a lot of confusion over the years about pagination and SEO, especially since Google nuked rel next/prev in 2019. As I explained in this post, Google has a long history of handling pagination, and it typically will not cause many problems across a site rankings-wise (as long as it’s set up properly). The approach you take really comes down to your own site, the type of content, and the internal linking structure. And if you end up having a lot of pagination indexed, then so be it. As this case study showed, it can work out just fine for you.

GG


The short and long-term ranking impact of removing long and fluff e-commerce category descriptions [Case Study]

August 30, 2021 By Glenn Gabe

ecommerce category description case study

Imagine the following scenario… Excited about buying a new house, you decide to browse for a new kitchen table. So you fire up Google and start searching. You end up clicking through a search listing to view modern kitchen tables and land on the category page of an e-commerce site. But before you start to view all of the great tables they have, you are presented with multiple lengthy paragraphs of content.

As you scroll and scan the description (if you even choose to do that), you notice that the copy is long and doesn’t really help you at all. It’s basically fluff content. For example, maybe it contains the history of modern kitchen tables or maybe it tells you all of the reasons you must have a modern kitchen table (while mentioning different types of modern kitchen tables four or five times). Frustrated, you quickly scroll to get to the actual products.

All of us have come across this situation (and on different types of e-commerce sites). But does it have to be this way? Is it necessary for e-commerce sites to provide long and excessive descriptions that most people won’t even read? And can you rank well without those longer fluff descriptions? That’s a question many SEOs and site owners must consider while crafting e-commerce category pages, and it’s the topic I’m covering in this case study.

If possible, I recommend reading this post from beginning to end to understand the history of the situation, the multiphase approach the site owners employed, the short and long-term impact, and more. But if you are short on time (pun intended), then here’s a table of contents so you can jump around the case study.

Case Study Table of Contents:

  • The e-Commerce Description Conundrum.
  • Case Study Details.
  • When long category descriptions roamed the site.
  • The Desktopian Era: Before mobile-first indexing.
  • Phase One: Refining the descriptions.
  • Phase Two: Hiding shorter descriptions.
  • Boom: The March 2019 Broad Core Update.
  • Phase Three: Biting the bullet with concise descriptions.
  • Final tips and recommendations for e-commerce retailers.

The e-Commerce Description Conundrum:
While helping e-commerce companies deal with major algorithm updates, and analyzing some sites with long and excessive descriptions, it’s clear that some site owners believe those long descriptions help those pages rank well (or are a key part of helping those pages rank well). Therefore, my feedback about the long and questionable descriptions is something they might make note of, discuss with their teams, and possibly even brainstorm solutions for, but never act on. The fear of losing rankings by removing long and fluff descriptions is enough to keep those descriptions in place (sometimes for a very long time).

I get it, change in SEO (especially when the competition is consistently using a certain approach) is tough to accept and implement. But frustrating users and potentially sending mixed signals to Google about the core intent of a page could also hinder your efforts. To make things even riskier, you don’t just have short-term ranking changes to worry about, you also have broad core updates, and those only roll out three to four times per year.

To make matters more complex, it’s also important to understand that short-term testing will not always tell you what a broad core update will bring. I’ve covered that several times in my posts about broad core updates, and Google has also explained that testing just a subset of pages is not enough for their algorithms to determine that site quality has improved enough (if you are working on improving quality overall). So, just tinkering with a few category pages will not reveal what a broad core update could roll in…

Here is Google’s John Mueller explaining this: (at 11:41 in the video)

The Case Study:
Below, I’m going to cover a case study of an e-commerce site I helped over time that struggled during several major algorithm updates years ago and finally decided to address their lengthy and excessive descriptions on category pages. And in their place, they added much shorter descriptions that better fit their products, users, and niche overall.

This change was a long time in the making, since I initially brought this up to them several years ago (2015). Being cautious, they took a multi-phase approach (over years) moving from long and fluff category descriptions (that weren’t really valuable for users) to shorter, tighter descriptions that fit their product line and users much better. So, I have data from the short-term changes they implemented, and data over the long-term.

My hope is that this case study can get your wheels spinning about the best possible e-commerce category pages you can provide to prospective customers. And to be clear, that might include longer descriptions, or it might not. It really depends on what your research and testing yields. That’s a good segue to an important disclaimer.

Disclaimer: I’m not saying every site should nuke or replace their long descriptions, and I’ll cover the various nuances in this post. But I do think every site owner should review their category page content, understand what real users want to see, and provide the best experience possible, all while obviously trying to rank well in the search results. This case is a good example of a site owner taking a multi-phase approach to refining category descriptions over several years and those changes paying off. Read on.

Jumping Back in History: When long category descriptions roamed the site.
Again, the company reached out to me for help after seeing its fair share of volatility during major algorithm updates. They initially reached out in 2015 and I heavily analyzed the site (with the goal of using a “kitchen sink” approach to remediation). That’s where my goal is to surface every potential issue that could be impacting the site from a quality standpoint, with the site owners then working hard to implement as many of those recommendations as possible.

One glaring finding was about their e-commerce category pages. The pages had lengthy (and excessive) descriptions at the top of the page before the actual products were presented. The descriptions were multiple paragraphs long and didn’t really help users.

Also, the descriptions weren’t displayed in full by default. The first paragraph was shown with a “read more” link that would trigger the full content block. I found that strange since the content was supposed to help users (but most of it was being hidden on-load). In addition, this was before mobile-first indexing, so hidden content was not given full weight from a ranking perspective. I’ll cover more about that next.

Here is a mockup of what the category descriptions looked like in 2015:
One paragraph on-load with a “read-more” link revealing several paragraphs:

Long e-commerce category descriptions hidden behind "read more" link.

Long description displayed once the “read more” button is clicked:

Expanding long e-commerce category descriptions via a "read more" link.

The Desktopian Era: Before mobile-first indexing.
Before I go any further, it’s important to remember that this was before mobile-first indexing was a thing. So, any content hidden on-load was not given full weight from a ranking perspective. I explained to my client that the hidden content in each description was (probably) not given full weight, along with other key points about what Google had been explaining about e-commerce category pages that contained lengthy, fluff descriptions.

For example, John Mueller has explained many times that e-commerce sites don’t need to add lengthy descriptions in order to rank well. Instead, he said it is important to add some textual content so Google can have context about what the page is about, but that writing a book about the category can actually confuse Google’s algorithms.

In other words, if you provide a lot of informational content about a category, along with product listings, Google’s algorithms can start to wonder if the page is informational or transactional. i.e. Are you selling products or are you providing the history of the category? John has also mentioned the danger of keyword stuffing when you add really long descriptions to e-commerce category pages.

Here are two clips from Google’s John Mueller about e-commerce category pages:
Watch the entire clip. There are some great nuggets of information from John: (at 29:25 in the video)

In this second video, John explains how site owners should focus on providing informative content and place it where users will actually see it. He also covers alt text, captions, and headings: (at 7:18 in the video)

So, the implementation of lengthy (and mostly hidden) category descriptions in 2015 could have been a UX barrier (frustrating users) without much ranking benefit (since the descriptions were mostly hidden). And if Google did take that content into account, it could have been confusing Google about the core point of the page. These were all important things to consider before implementing changes.
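For auditing purposes, a rough way to quantify this is to measure how much of a page’s text is hidden on-load. Here is a loose sketch using my own heuristic; the selectors are hypothetical, since every “read more” implementation is different, so adapt them to the template you are auditing:

```python
from bs4 import BeautifulSoup

# Load the rendered HTML of a category page (saved locally for this sketch).
with open("category_page.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

# All words on the page (get_text() includes hidden text as well).
total_words = len(soup.get_text(" ", strip=True).split())

# Words inside blocks hidden on-load. These selectors are hypothetical:
# many "read more" implementations use inline display:none or a wrapper
# class, so swap in the ones your template actually uses.
hidden_words = sum(
    len(block.get_text(" ", strip=True).split())
    for block in soup.select(
        '[style*="display:none"], [style*="display: none"], .read-more-content'
    )
)

share = hidden_words / max(total_words, 1)
print(f"~{hidden_words} of {total_words} words ({share:.0%}) hidden on-load")
```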

Phase One: e-commerce category descriptions refined.
After surfacing many of the e-commerce category descriptions that were long and not extremely valuable, the site owners decided to take action and implement some changes. But, they wanted to test the waters by using a multi-phase approach to changing the descriptions.

They first cut down the descriptions and made them all readable on-load (without needing to click a “read more” button). The new descriptions were about 30% shorter and supported the category better than the longer, fluff descriptions that were replaced.

Shorter e-commerce category descriptions visible on-load.

Their hope was that this change, along with many others they were implementing across the site, would help with Google’s evaluation of quality over the long-term. So even a short-term drop would be ok if the site performed better over the long-term. Now it was time to see how Google would react.

The short-term impact: All clear on the ranking front.
Nothing really changed immediately following the first description change, which was good. The pages still ranked where they were, so it seemed removing all of that extra content didn’t impact the short-term rankings for the pages (or site). Phase one was complete.

Here are some ranking graphs from that timeframe for head terms leading to category urls. Note the site moved to https in 2016 so you will see the http urls switch to https in the graphs below (green to blue). Rankings for the category pages held steady through the description changes and the move to https:

Ranking impact for a page with shorter e-commerce category descriptions.
Ranking impact for another key page with shorter e-commerce category descriptions.
Ranking impact for a third category page with shorter e-commerce descriptions.

Phase Two: Hide the shorter descriptions.
The next phase was implemented when the site was redesigned the following year. The site owners decided to hide the shorter descriptions behind a “read more” link. And to clarify, the entire description was hidden behind the link (on both desktop and mobile). As you can probably guess, I wasn’t a big fan of this approach. If you have descriptions that are helpful for users, then I think you should provide them on the page by default (especially on desktop). If not, then refine that content so it can be displayed on-load. To clarify, it’s fine on mobile to collapse some of the content behind a UI element, if needed, but the entire description shouldn’t be hidden in my opinion.

Also, the site had not been switched to mobile-first indexing yet, so the hidden content would not have been given full weight from a ranking perspective. My recommendation was to implement changes there and provide more of that description on-load (and to possibly cut down those descriptions even more). The site owners totally understood and said they would figure something out to improve the situation.

Shorter descriptions fully hidden on-load behind a “read more” button:

Fully hiding a shorter e-commerce category description behind a "read more" link.

Shorter descriptions displayed after the “read more” button was clicked:

Revealing the shorter e-commerce category description after clicking a "read more" link.

Note, rankings did not change much when the descriptions were hidden. So, it seems that change didn’t have much impact rankings-wise.

Here are rankings for some head terms leading to those category pages when phase two was implemented:

Ranking impact of hiding an e-commerce category description behind a "read more" link.
Ranking impact for another category page after hiding a description behind a "read more" link.
Ranking impact for a third e-commerce category page after hiding a description behind a "read more" link.

So far I’ve covered the short-term impact of the description changes, but Google pushes broad core updates several times per year (and they can clearly have a huge impact on sites across the web). Well, a broad core update rolled out in March of 2019 and it was a big one for the site. I’ll cover that next.

March 2019 Broad Core update: Boom, the site surged, and so did many of those pages.
When the March 2019 broad core update rolled out, the site surged overall (and so did many of those pages). It was amazing to see. Now, broad core updates can often impact a site overall (and increase rankings across many pages). That’s due to Google’s site-level quality signals. And as I’ve covered in my posts about broad core updates, Google is evaluating many factors over time and is reevaluating site quality and relevance with broad core updates. It was just interesting to see the site surge, and those category pages jump in rankings.

And removing 30% of the description content from the original implementation clearly didn’t hurt those pages long-term. Actually, the description content was better for users, with more relevant information. Also, by the time the March 2019 core update rolled out, the site had been moved to mobile-first indexing, so the descriptions hidden behind the “read more” button were given full weight.

Here are some of the increases for category pages during the March 2019 broad core update (head terms that lead to those category pages increasing in rankings based on the update):

Ranking changes for e-commerce category pages based on the March 2019 broad core update.

Phase Three: Biting the bullet with concise descriptions.
In the final phase of category description changes, the site owners decided to go even further and cut down the descriptions to just a few helpful sentences and provide all of that content on-load (without forcing the user to reveal the description). And in my opinion, it was a much better fit for users, the niche, the categories, etc. You can see below they provided just a short paragraph on each page without hiding the description at all.

Providing concise e-commerce category descriptions that are visible by default.

This change was implemented in late 2019 and the site has done extremely well since then. And those category pages are also doing extremely well. It was also interesting to see the site’s search visibility surge with the July 2021 core update. So, clearly those new, shorter category descriptions aren’t hurting the site or rankings for those pages.

Many of the category pages rank on page one, with a number of them in the top five listings, and several key category pages ranking number one in the search results. Again, it’s great to see improved category pages with shorter but more valuable descriptions working well. They didn’t need to write a book about the category… they just needed enough content to meet or exceed user expectations (and to give Google enough signals about what the pages were about).

Here is search visibility trending when filtering by category pages on the site. There was a big surge during the December 2020 core update and then more during the July 2021 core update:

The search visibility impact for e-commerce category pages during the December 2020 and July 2021 broad core updates.

Final tips and recommendations for e-commerce retailers:
Now that I’ve covered the case study, I wanted to end with some final tips and recommendations for e-commerce retailers. Note, there’s not a “one size fits all” approach when it comes to e-commerce category descriptions. I would use a logical process for implementing changes across your category pages if you currently feel stuck with long and fluff descriptions:

  • User Studies: First, run a user study. Ask real users what they want and how they feel about your current descriptions. I’ve covered user studies through the lens of broad core updates many times in my blog posts. Hearing from objective users can be enlightening. Note, you might need to run multiple studies based on your initial results. Again, you never know what you’re going to see and hear while testing your site.
  • Short-term testing: Run some short-term testing of category description changes (just keep in mind this will only show the immediate impact rankings-wise and not what impact during broad core updates will look like). As I mentioned earlier, you can’t run a test on a subset of pages to fully understand the impact during broad core updates. But you can gauge the short-term impact of refining category descriptions.
  • Taking an (educated) leap of faith: Do what’s best for your users and your site long-term based on the user study, your understanding of your niche, the short-term testing, etc. And if that means cutting down those long fluff descriptions and replacing them with shorter, but more valuable content, then so be it. As this case study showed, those pages (and the site overall) can do well without excessive descriptions… Google has explained this over and over, but I know it’s hard to pull the trigger on something like that. Just remember that Google’s John Mueller did explain that having some relevant text on the page about the category is smart (so Google can have some context about the page). Don’t just add images and call it a day…
  • The danger of blindly following the competition: Don’t blindly follow what the competition is doing. I can’t tell you how many site owners have reached out to me over the years after getting hit hard by a broad core update and explaining that they were following one of their top competitors (and it ends up they both got hit during the same core update). As Google’s John Mueller has said many times, some sites might be ranking well despite the bad or risky things they are doing.

Summary – Working towards better e-commerce category descriptions.
I hope you found this case study interesting. I know there’s a long history of debate over e-commerce category descriptions, despite what Google has explained many times. Again, I’m not saying all sites should nuke their long category descriptions. It’s more nuanced than that. But I would start to understand what your users truly want to find on those pages, run user testing, implement short-term changes, and then gauge how that’s working for the site. And if all signs point to cutting down those long descriptions, and replacing them with shorter, but more valuable descriptions, then that might be the best move. And that’s exactly what worked for the site I covered in this post (over both the short and long-term).

GG


Google Broad Core Updates and Image Search: Can core updates impact Image Search rankings in addition to Web Search and Discover?

August 3, 2021 By Glenn Gabe

Google Broad Core Algorithm Updates and Image Search

Update: Google’s Danny Sullivan confirmed that Image Search can be impacted by broad core updates. That matches with what I’m seeing in the data, so it’s great to have that confirmation from Google.

——————–

I was asked an interesting question on Twitter about broad core updates by Kenichi Suzuki. He asked if Google’s broad core updates could impact Image Search (in addition to Web Search and Discover). Although I’ve helped many sites with Image Search problems (especially after url migrations), I haven’t focused too much on Image Search from a broad core update standpoint. Instead, I’m usually helping companies that have seen significant drops in Web Search traffic. Also, I don’t remember Google’s John Mueller, Gary Illyes, or Danny Sullivan ever covering this topic before, so I decided to dig in.

Tweet about Google core updates and image search.

Since I have a lot of broad core update data, I fired up GSC and quickly checked some sites that surged or dropped during the last several broad core updates (dating back to May 2020). After quickly checking several sites, I didn’t see much movement in Image Search when those sites were impacted by broad core updates (in Web Search). That’s when I tweeted this:

Reply to tweet about Google core updates and image search.

I replied too soon. :) As I checked more sites, including ones impacted by the recent June and July broad core updates, I absolutely started to see several of them showing impact in Image Search when they were heavily impacted by broad core updates. So down the rabbit hole I went…

But before I provide examples of what I am seeing with Image Search during broad core updates, I think it’s important to cover how Google ranks images in Image Search, as well as how images can rank in Web Search. I know there’s a lot of confusion about that.

Ranking in Image Search: It’s about the image and landing page combination.
When Google ranks images in Image Search, it’s not just about the images alone. That tends to surprise some site owners who see drops or surges in Image Search. Google ranks images in Image Search based on the landing page and image combination. Google has explained this many times, and I covered it in my post about Image Search last year.

Here is Google’s John Mueller on Image Search rankings and the importance of the image and landing page combination (at 14:41 in the video):

Google wants to make sure users can find out more about the content behind the image ranking, so it’s using the landing page/image combination for rankings. In my opinion, that could be why Image Search can be impacted by broad core updates. For example, if a page drops in Web Search, that scoring could potentially impact the calculation for Image Search rankings (based on the landing page content). Again, that’s my opinion and not something confirmed by Google. It just makes sense to me…

Images Ranking in Web Search
In addition to Image Search rankings, images can rank in Web Search as well (and register impressions and clicks that show up in the performance reporting for Web Search in GSC). I covered that in a post about why you might see high rankings and low impressions (since images ranking in image packs, knowledge panels, and featured snippets can often rank highly, but receive very few clicks).

For example, when an image ranks in a SERP feature in the Web Search results, it yields an impression (and click if someone clicks through). If a site is impacted by a broad core update, then those listings in Web Search containing those images could drop (or the images could drop out of certain SERP features). If that happens, those images would not receive the impressions/clicks they once did, so that can yield a drop for Web Search in Google Search Console. And remember, the default GSC performance reporting is for Web Search. To view Image Search data, you need to switch the “Search type” to Image Search.

For example, the images in the knowledge panel below will register an impression in WEB SEARCH and not Image Search. If those images drop out during a broad core update, then that would impact impressions and clicks in Web Search:

Images ranking in Web Search in a Knowledge Panel.

Therefore, it’s important to understand that drops or surges in Web Search could include images that rank in SERP features like image packs, knowledge panels, and featured snippets. Again, you can read my previous post about that topic for more information. But that’s Web Search and not Image Search. Below, I’ll cover the impact to Image Search from broad core updates.

Image Search Impact During Broad Core Updates: Data and Examples
Based on Kenichi’s question, and the mixed bag of trending I saw during my initial quick analysis, I decided to take a much closer look. And it was sort of a rabbit hole… I ended up checking many sites impacted by broad core updates to see the impact on Image Search.

Based on my deeper analysis, I absolutely saw impact in Image Search for a number of sites that were heavily impacted by broad core updates. When isolating Image Search, you could see the same trend that Web Search showed (whether that was surging or dropping during a specific broad core update). Not all sites lined up, though, which can make sense… I’ll cover more about that soon.
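If you want to run that comparison yourself, the Search Console API exposes the same toggle as the UI’s “Search type” selector via its type field. Here is a minimal sketch pulling daily clicks for Web Search and Image Search so the two trends can be lined up. The property URL, date range, and token file are placeholders; it assumes you already have OAuth credentials saved for a verified property:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Placeholder credentials file and property; swap in your own.
creds = Credentials.from_authorized_user_file("gsc_token.json")
service = build("searchconsole", "v1", credentials=creds)

def clicks_by_date(search_type: str) -> dict[str, float]:
    """Daily clicks for one search type ("web" or "image")."""
    body = {
        "startDate": "2021-05-01",
        "endDate": "2021-08-01",
        "dimensions": ["date"],
        "type": search_type,
    }
    response = service.searchanalytics().query(
        siteUrl="https://example.com/", body=body).execute()
    return {row["keys"][0]: row["clicks"] for row in response.get("rows", [])}

# Lining the two trends up makes it easy to see whether Image Search
# moved alongside Web Search during a core update.
web = clicks_by_date("web")
image = clicks_by_date("image")
for day in sorted(web):
    print(day, web[day], image.get(day, 0))
```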

Here are some examples of sites impacted by a broad core update in both Web Search and Image Search:

Example A: Web Search drop

Drop in Web Search during a Google broad core update.

Example A: Image Search drop

Drop in Image Search during a Google broad core update.

Example B: Web Search surge

Surge in Web Search during a Google broad core update.

Example B: Image Search surge

Surge in Image Search during a Google broad core update.

Example C: Web Search drop

Significant drop in Web Search during a Google broad core update.

Example C: Image Search drop

Significant drop in Image Search during a Google broad core update.

Example D: Web Search surge

Large surge in Web Search rankings and traffic during a Google broad core update.

Example D: Image Search surge

Large surge in Image Search traffic from a Google broad core update.

And like I explained earlier, not all sites saw the same level of impact in Image Search after seeing heavy volatility in Web Search during a broad core update. I believe this is due to the landing page and image combination I covered earlier. When sites are impacted by broad core updates, certain areas of a site could absolutely be impacted more than others. For example, maybe a blog is impacted differently from an e-commerce section, a forum, or company pages.

Here are examples of sites that were NOT impacted heavily in Image Search by a broad core update after seeing significant impact in Web Search:

Example A: Web Search surge

Example of a site surging in Web Search during a Google broad core update, but not surging in Image Search.

Example A: Image Search mostly unaffected

Image Search stable when Web Search surges during a Google broad core update.

Example B: Web Search drop

Example of a site dropping in Web Search during a Google core update, but not dropping in Image Search.

Example B: Image Search mostly unaffected

Image Search largely unaffected when Web Search drops during a Google broad core update.

Another way to visualize the impact to Image Search rankings during broad core updates is at the query level. Here are examples of average position dropping or surging for specific queries in Image Search when a broad core update rolled out:

A surge in rankings in Image Search when a broad core update rolls out.
Drop in Image Search rankings when a Google broad core update rolls out.

So, the data shows that broad core updates can impact Image Search. I’ll try and ping Google’s John Mueller and Danny Sullivan to see if they can provide more information about broad core updates and Image Search. I’ll update this post with any information they provide.

Update: Google’s Danny Sullivan confirmed that Image Search can be impacted by broad core updates.
I pinged Danny Sullivan on Twitter with a link to this post and asked if he could provide more information (and maybe confirmation) that Google’s broad core updates could impact Image Search. Danny explained that broad core updates can involve Image Search results because they involve Google’s core ranking systems that involve all types of content. You can see his tweet below. This definitely matches what I am seeing in the data, so it’s great to have confirmation from Google about this.

Google's Danny Sullivan confirms that Image Search rankings can be impacted during broad core algorithm updates.

Summary: Check Image Search trending if your site has been impacted by a broad core update.
I hope this post was helpful for learning more about how Image Search can be impacted by broad core updates. The data sure shows that broad core updates can impact Image Search rankings, and Google’s Danny Sullivan has since confirmed that’s the case. I have a lot of data showing this happening across sites heavily impacted by broad core updates.

And again, it does make sense based on Google using an image and landing page combination when determining rankings for Image Search. If a page drops in Web Search, that scoring could potentially impact the calculation for Image Search rankings. But, like I explained earlier, not all sites impacted by broad core updates will see a drop in Image Search rankings. Different areas of a site can be impacted in different ways. In closing, if you’ve been impacted by a broad core update, then definitely make sure to analyze your Image Search rankings as well. You might be surprised with what you find.

GG


Google’s July 2021 Broad Core Update – Rapid-fire Insights From The Summer of Two Core Algorithm Updates

July 13, 2021 By Glenn Gabe

{Updated: July 23, 2021 with information from Google about broad core updates being unrelated to Core Web Vitals and the Page Experience Signal.}

{Updated: August 6, 2021 with information from Google’s John Mueller that the June and July broad core updates were separate and unique.}

{Update July 2022: I just published my post covering the May 2022 broad core update containing five micro-case studies of drops and surges.}

—————-

Welcome to Google Land.

Please keep hands, arms, and legs inside the vehicle at all times. :) Yes, the spectacular summer of Google algorithm updates continues. This has been one of the most volatile few months I can remember from an algorithm update standpoint. First, Google rolled out the Product Reviews Update in April, which was followed by the June broad core update (when Google also explained a second core update would be rolled out in July). Then the much-anticipated Page Experience Update rolled out in mid-June. After that, Google rolled out two spam updates on 6/23 and 6/28, followed by the July 2021 broad core update on 7/1.

Google has been busy, that’s for sure. For the most part, it’s felt like this:

In this post, I’m going to cover findings based on the July 2021 broad core update, and its sibling core update that rolled out in June. There are many sites that saw impact during both broad core algorithm updates, so it makes sense to document what I’m seeing across both. And that includes some reversals (which Google explained could happen with the July update). Remember, Google explained that some improvements weren’t ready for the June core update and that’s why there would be a second core update in July. That’s unusual… and Google didn’t explain more about that (for obvious reasons).

Rapid-fire insights from a double core update:
In this post, I’ll provide a list of rapid-fire insights based on what I’m seeing across sites impacted. First, here’s a quick table of contents in case you want to jump around the post.

Table of contents:

  • Rollout of July and June core updates.
  • Time to see impact.
  • Reversals from June.
  • The June and July core updates were separate and unique.
  • Tremors.
  • Health and medical sites.
  • Finance sites.
  • Sites with the same, or near-duplicate, content.
  • Category adjustment based on Product Reviews Update flaw.
  • Reversals from previous broad core updates.
  • Product Review sites.
  • Spam updates.
  • Rich snippets impact.
  • A note about Core Web Vitals.
  • The “Kitchen Sink” approach to remediation.
  • Site-level quality signals.
  • Final tips and recommendations.

Rollout of the June and July broad core updates:
The June update started rolling out on 6/2 and finished on 6/12. That was pretty quick for a broad core update, which can sometimes take up to 14 days to fully roll out. The July broad core update started rolling out on 7/1 and completed on Monday, 7/12. So, the July update took 12 days to fully roll out, which is closer to what we normally see with broad core updates.

And I have to mention the dates of other Google algorithm updates rolling out since early June. Site owners should know these dates to make sure they are seeing movement from the broad core updates and not the others. Google started to roll out the much-anticipated Page Experience Update on 6/15, but that will roll out slowly and it won’t fully complete until the end of August. Also, Google explained the Page Experience Signal will be a lightweight factor (more of a tiebreaker). So, if you are seeing significant movement in June and July, it’s probably not from the Page Experience Update.

Here is a slide from my SMX Advanced presentation about the power of the signal (along with a tweet of mine based on a video from Google’s Martin Splitt):

Google's @g33konaut about the Page Experience Signal in a video w/@lorenbaker:

*Don't panic.
*It's a tiebreaker.
*For some it will be quite substantial (?!?!), for others, not so much.
*Don't oversell the update to stakeholders… that can backfire.https://t.co/2LN3Dj7mEG pic.twitter.com/9pUFvEGFa9

— Glenn Gabe (@glenngabe) May 24, 2021

There were also two spam updates that rolled out on 6/23 and 6/28, and I have seen several sites impacted by those updates. I’ll cover more about that later in this post, especially for sites that were impacted by the spam updates and then also saw movement during the double core updates in June and July.

Time to see impact from the broad core updates: Slower in June, quicker impact in July.
When a broad core update rolls out, we typically see movement pretty quickly (within 24-36 hours after the rollout begins). The June broad core update was different… we didn’t see movement until a few days into the rollout for some reason. The first movement I saw was 6/5 into 6/6. I tweeted about that once I saw a lot of volatility across sites.

Heads-up. The first major signs of visibility changes are being seen today. Some very big changes for the sites impacted. The update started rolling out on 6/2, but I just started seeing these visibility changes 6/5 into 6/6. Looks more core update-like now… pic.twitter.com/k7D8lfkAqi

— Glenn Gabe (@glenngabe) June 6, 2021

Here is GSC trending for a client of mine that started seeing a surge on 6/6:

And on the flip side, we saw volatility very quickly with the July broad core update (within 24 hours of the rollout). That’s similar to what we have seen with other broad core updates. I also tweeted when I started seeing the early movement.

And here we go. Saw a lot of movement starting yesterday based on the July broad core update. Some sites are seeing big changes very quickly vs the June update. First, some examples of surges during the update so far. And one saw a nice bump in June and more in July. pic.twitter.com/IoJhnXiqy1

— Glenn Gabe (@glenngabe) July 3, 2021

Also, as I’ve covered in other posts about broad core updates, it’s important to analyze the queries and landing pages dropping based on a broad core update. There could be several reasons for the drop, including relevancy adjustments, intent shifts, and overall site quality problems (or a mix of reasons). You can read my post about this topic for more information.

June reversals in July. We were warned about this…
When the June 2021 broad core update rolled out, Google explained that some sites could see reversals with the July 2021 broad core update. Needless to say, I was interested in seeing how that would go. I also tweeted that nobody should declare victory with the June core update until July fully rolls out. And after the July broad core update started rolling out, I saw some sites reverse course. There weren’t a ton of reversals based on the sites in my dataset, but there were quite a few.

Here are some examples of visibility changes for sites that saw reversals (either surging, then dropping, or dropping, then surging):

Google: The June and July Core Updates Were Separate and Unique.
Based on some sites seeing impact during one update, and not the other, Google’s John Mueller was asked a question about that during a Search Central Hangout video. John explained that the June and July core updates were separate and unique updates. And just because they are both “core updates” doesn’t mean they affect the same core parts of the ranking system. That’s why some sites can see impact during one of the core updates, but not the other. Here is John explaining this (at 30:25 in the video):

Tremors: Not a surprise, just like with other major algorithm updates.
After major algorithm updates roll out, it’s not unusual for Google to make smaller tweaks to the algorithm based on what they are seeing in the SERPs. You can think of them as minor adjustments to fine-tune the results. Google’s John Mueller explained this publicly back in medieval Panda times. Well, we definitely saw tremors during the July broad core update (especially near the end of the rollout). Here are two examples of sites seeing changes starting around 7/9.

Is there a doctor in the house? A reminder that Google’s algorithms can be more critical of health/medical content:
With many broad core updates, there can be a lot of volatility across sites categorized as Your Money or Your Life (YMYL). And within that category, health and medical can often see a lot of movement. Google is on record explaining that its algorithms are more critical with health and medical content (for obvious reasons).

Here is a tweet I shared based on a video from Google’s John Mueller explaining more about health and medical. And here is a direct link to the video if you want to view it now (starting at 20:25 in the video):

Run a health/medical e-commerce site? Via @johnmu: Our algorithms are more critical for health/medical topics, so def. keep E-A-T in mind. Make sure the site represents a very high standard. i.e. High-quality content created by actual medical professionals https://t.co/aiMrdN9Hl7 pic.twitter.com/Nuz3K7Pi6o

— Glenn Gabe (@glenngabe) March 27, 2021

Well, I saw some very big swings across health and medical sites with the June and July core updates. There were some sites with huge changes in search visibility… Note, it's impossible to know exactly what Google changed since there are so many factors being evaluated. In my opinion, you will rarely be able to isolate specific changes to Google's algorithms (for broad core updates). It's more important to keep improving your site overall, and I'll cover the "kitchen sink" approach to remediation later in this post.

Here are some examples of big visibility changes for health/medical sites:

Take the money and run. Also a lot of volatility across some financial sites.
While we’re on the topic of YMYL content, I also saw a lot of big swings in search visibility across sites focused on finance. Since those sites are considered YMYL, Google’s algorithms hold them to a higher standard. And some of the sites saw a lot of movement with the summer set of core updates.

For example:

Sites with the same, or near-duplicate, content: Reference, lyrics, directories.
I have covered sites that contain the same, or very similar, content before. For example, reference sites, lyrics sites, directories, etc. Google has explained that if you provide the same content as many other sites, and don't provide a serious value-add, then it can be hard for Google's algorithms to decide which sites should rank highly. I have seen this many times over the years and have helped a number of sites in verticals like these. It's easy for sites that don't provide a value-add to surge and drop during each broad core update (as Google struggles to determine which ones should rank higher).

With the June and July core updates, I saw a lot of movement across sites that fit into these categories. Again, it’s extremely important to provide a serious value-add (especially when the content is the same, or near-duplicate). If you don’t provide a value-add, then you could surge during one update, and drop heavily during the next. And that leads to roller coaster trending (which can drive site owners insane).

Here are some examples of sites that fit into this category that saw a lot of movement:

Large-scale reference sites:

Lyrics sites seeing a lot of movement:

Interesting: Category adjustment based on a Product Reviews Update flaw?
Continuing down the path of trends I saw during the June and July core updates, there were a lot of sites within one specific vertical that did not fare well with these updates. I have done a lot of work in that space over the years and it was clear that Google implemented big changes on that front. I'm not saying all sites in the niche dropped heavily, but a number of sites did (including some of the top players).

This could have started when Google took a hard look at the vertical after the Product Reviews Update (PRU). With that update, some sites that don't provide reviews content surged for queries that fell outside of their focus. I can't go into too much detail about this, but Google's algorithms incorrectly caused those sites to surge during the PRU. So, I knew it wouldn't be long before Google implemented changes there algorithmically. And those changes could be seen during the June and July broad core updates. I covered more about the collateral improvement I saw in my post about the Product Reviews update.

Here is some trending for those sites during the update(s):

Reversals from previous broad core updates: The gray area of Google’s algorithms.
In my previous posts about broad core updates, I explained a very important point for site owners about recovery. Sites that are heavily impacted by broad core updates typically cannot see recovery until another broad core update rolls out. There have been a few exceptions, including what I saw with the Product Reviews update, but overall, sites typically need to wait for another broad core update to see recovery (and that's only if the site has significantly improved in quality over the long-term). You can read my previous posts about core updates to learn more about that.

Here is a slide from my SMX Advanced presentation with information about recovery from Google’s broad core updates (including information from Google’s John Mueller):

Well, during this update, there were many, many examples of sites that reversed course (big-time) from previous core updates. Sites that surged during a previous core update only to get hammered during this one are often sitting in the gray area of Google's algorithms: they creep out of the gray area during one update and surge, then fall back into the danger zone during the next update and drop. I've explained many times before that the gray area of Google's algorithms is a maddening place for site owners to live.

Here is an example of a site seeing major volatility during a number of broad core updates:

Product Review Sites: Enough is enough
What a few months it’s been for product review sites… We had the Product Reviews Update in April, which was core update-like for many review sites. And then the June and July broad core updates rolled out and some of those sites saw even more impact. And some even reversed course with the June/July broad core updates. It seems the double core updates this summer had even more impact on a number of those sites.

For example, here are some product review sites during the June/July core updates:

And then there were review sites that were unaffected by the Product Reviews Update but saw big gains or drops during the June and July core updates. Here's a client of mine, a large-scale reviews site impacted by several previous core updates, that worked extremely hard to improve the site overall and surged during the June broad core update. This was great to see based on how much work they completed on the site:

Spam Updates + Broad Core Updates. A one-two (or more) punch for some sites.
I mentioned earlier that Google released two spam updates in June (the first on 6/23 and then the second update five days later on 6/28). Most sites didn’t see any impact from those spam updates, but it was big for the ones that did.

And beyond the spam updates, some of those sites saw additional movement with the June and July broad core updates. Some reversed course, some surged, some dropped, etc. It was definitely a crazy few weeks for those sites (getting caught in the spam updates and broad core updates).

For example, here are some sites seeing movement with the spam updates and the June/July broad core updates:

Rich snippets impact: Speak of the devil
My last blog post covered how rich snippets can be impacted by broad core updates, so it was fitting to see that happen during the June/July core updates. If you are interested in learning more about that (and how that can happen), check out my blog post.

Here's a great example of a site that received rich snippets back after losing them during a previous broad core update (review snippets in this case). This is based on Google's site-level quality signals, which Googlers have explained before and which matches what I've seen in the field many times. I'll cover more about site-level quality signals soon.

And here are how-to snippets returning for a site during the June broad core update that had them removed during a previous core update:

A Note About Core Web Vitals: Correlation vs. Causation
Based on the June/July core update impact, some people speculated that Core Web Vitals were factored in during the summer core updates. For example, many sites that improved during the updates had strong Core Web Vitals scores, and some speculated that those scores were part of the reason they surged. I was asked about this on Twitter as well.

First, I don’t believe Core Web Vitals scores were taken into account during the June and July broad core updates. And I was glad to see Google’s John Mueller confirm that to be the case. You can see the video below for John’s response. But, that doesn’t mean an improved user experience didn’t help those sites. For years I have been saying that Google’s broad core updates take many factors into account, including the user experience. I’ve covered how aggressive and disruptive advertising could impact sites, how user experience barriers could have an impact, how deception could be problematic, and more. So, it makes complete sense that sites improving those areas could see gains during subsequent broad core updates (while also seeing Core Web Vital scores improve).

For example, here’s a slide from my SMX presentation in 2019 where I covered how a negative user experience could contribute to a broad core update hit:

And regarding sites that improved during these core updates, here are the Core Web Vitals scores for two sites that surged during the June core update, and their scores aren't great. They're OK, and have improved, but they aren't strong across the board. And keep in mind that the site owners worked on improving their sites overall, including content quality, user experience, the ad situation, technical SEO issues causing quality problems, and more.

So, I think this is a correlation versus causation situation for sites with strong CWV scores that surged during the June or July core updates. The scores themselves wouldn’t be factored in with broad core updates, but the effect of improving Core Web Vitals could absolutely impact a site during broad core updates. Anyway, just my two cents on the subject. :)
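On that note, if you want to keep an eye on your own field data while improving the user experience, here's a minimal sketch that pulls the 75th-percentile Core Web Vitals for an origin from the Chrome UX Report (CrUX) API. The API key and origin below are placeholders, and it assumes the Python `requests` package; it's just an illustration, not part of the case study data:

```python
import requests

# Placeholders -- substitute your own CrUX API key and origin.
API_KEY = "YOUR_API_KEY"
ORIGIN = "https://www.example.com"

# The CrUX API returns aggregated field data for an origin (or a single url).
endpoint = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"
response = requests.post(endpoint, json={"origin": ORIGIN})
response.raise_for_status()
metrics = response.json()["record"]["metrics"]

# The Core Web Vitals at the time of these updates: LCP, FID, and CLS.
for metric in ("largest_contentful_paint", "first_input_delay", "cumulative_layout_shift"):
    if metric in metrics:
        print(metric, "p75 =", metrics[metric]["percentiles"]["p75"])
```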

—Update About Core Web Vitals: July 23, 2021—
Google's John Mueller confirmed in a Search Central Hangout video that broad core updates are unrelated to Core Web Vitals and the Page Experience Signal. John reiterated that broad core updates are about Google evaluating quality and relevance across a site, whereas the Page Experience Update is about performance. He also explained that if you saw a stark drop or surge, then it's probably more related to the core updates and not the Page Experience Signal (which includes Core Web Vitals). You can watch the video below (starting at 28:01):


The “Kitchen Sink” approach to remediation: A strong way forward for site owners.
I have covered what I call the “kitchen sink” approach to remediation many times before in my posts and presentations about broad core updates. Google is evaluating many factors over an extended period of time with broad core updates, so it’s impossible to isolate one or two things that need improvement. I’ve often said there’s never one smoking gun with sites negatively impacted by a broad core update. Instead, there’s typically a battery of them.

That’s why it’s important to thoroughly analyze a site through the lens of broad core updates, surface all potential problems, and fix as many as you can (or all of them in a best-case scenario). Google is on record that they want to see significant improvement in quality over time. And that’s exactly what I’ve seen in the field while helping sites that have been negatively impacted by broad core updates. Surface all potential problems, fix as much as you can, improve the site greatly, etc. That’s your best bet for seeing recovery during subsequent broad core updates.

And no, it's not easy. There's usually a lot of work that has to be done, and some of those decisions are extremely hard to make. Beyond that, you have the execution of those changes, which can be challenging. But again, significant improvement is what Google wants to see. And it can pay off big-time during subsequent broad core updates.

Site-level quality algorithms and site-level impact:
It’s also important to understand that Google has site-level quality algorithms that can have a big impact on rankings across a site. These algorithms can lift rankings overall for a site and not just on a url-by-url basis. I have covered this many times over the years and have documented Google explaining this over time. This is also why it’s important to significantly improve a site overall and not just a few urls. Site owners can miss the forest for the trees if they focus too narrowly (at least for broad core updates). So, think broadly about improving the site, pun intended. :)

Here is just one tweet of many from me about the subject. This links to a video from Google’s John Mueller explaining more about site-level quality signals:

"When evaluating relevance, Google tries to look at the bigger picture of the site." And "Some things Google evaluates are focused on the domain level".

John has explained many times that G evaluates certain things (like quality) at the site level. I have tweeted many times… https://t.co/FqgSoBf5Yl

— Glenn Gabe (@glenngabe) January 26, 2021

Here are some examples of sites I have helped that took a “kitchen sink” approach to remediation that surged during the June or July broad core updates. They didn’t fix one or two things. They fixed many things that had a huge impact on the site overall. Remediation like this can take months (or longer), but can have a big impact on a site during broad core updates.

It’s also important to understand that Image Search can be impacted during broad core updates as well. Here is one of the sites I helped that used a “kitchen sink” approach to remediation that also surged in Image Search (in addition to Web Search). For more information about that, you can read my post about how broad core updates can impact Image Search:

Next steps for site owners impacted by the June and July core updates:
If your site has been heavily impacted by the June or July broad core updates, I have some closing bullets that might be helpful.

  • Recovery: First, understand you probably won’t see recovery without another broad core update rolling out. Don’t implement a few changes and expect to see a surge in rankings. That’s not how it works.
  • Improvement: Understand that Google wants to see significant improvement in quality over the long-term in order to see recovery. Sure, some sites surge without doing anything during broad core updates, but those sites can stay in the gray area of Google’s algorithms and possibly drop again during subsequent broad core updates. I would work to clearly get out of the gray area.
  • Remediation: Take a “kitchen sink” approach to remediation (as covered earlier). There’s usually not one smoking gun on sites heavily impacted by broad core updates. Instead, there’s typically a battery of them. Surface all potential problems and fix them all (or as many as you can).
  • User Studies: Run a user study through the lens of Google’s broad core updates. I usually cover this after every broad core update rolls out, since it’s super-powerful. I know it sounds like a pain in the neck to run a user study… but the feedback is pure gold for improving a site, the user experience, the content, and SEO overall.
  • Drive Forward: Keep driving forward. Don’t wait to publish new content or implement improvements to the site. The quicker you can improve the site, the content, the user experience, etc., the better. Again, that’s what Google wants to see. Unfortunately, there are many sites that see continuous drops during broad core updates until they significantly improve things. I have documented some of those situations in my case studies about broad core updates. The sooner you can improve, the sooner you can recover during subsequent broad core updates.
  • User Experience: Improve the user experience on your site. Don't bombard users with ads or annoy them with unnecessary popups, notifications, and more. I mentioned Core Web Vitals and how improving those scores can have a secondary effect during broad core updates. In my posts and presentations about broad core updates, I've often said, "hell hath no fury like a user scorned". I've seen this over and over when analyzing sites that were negatively impacted during broad core updates and I believe this will only continue. Create happy users. Good things can happen.
  • Value-add: Make sure to provide a value-add (especially if you provide content that can easily be found elsewhere on the web). If you don't provide a value-add, then Google's algorithms can have a hard time understanding which site to rank highly (or at all). You can end up surging and dropping during broad core updates (yo-yo trending over time). Invest in providing more value to users than your competitors. Prove to Google you should rank over those sites.

Summary: The Spectacular Summer of Google Algorithm Updates Continues…
I hope my rapid-fire list of insights based on the June and July 2021 broad core updates was helpful. One thing is for sure: it's been a crazy few months from an algorithm update standpoint, and we're not done yet. The Page Experience Update is still rolling out and should fully roll out by the end of August. And from a broad core update standpoint, I think the soonest we would see another is in the fall. It's hard to say if Google will roll one out then, but site owners should start working on improvements sooner rather than later. Remember, you will typically need another broad core update to roll out to see recovery (as covered earlier). And that's only if you significantly improve quality over time. Good luck.

GG

Filed Under: algorithm-updates, google, seo

Yes, Rich Snippets can be impacted by Google’s broad core updates (and other major algorithm updates). Here’s what it looks like and why you should care.

June 23, 2021 By Glenn Gabe Leave a Comment

If you’ve seen a drop (or gain) of rich snippets during a broad core update, then overall site quality could be the reason. In this post, I’ll cover how this can happen, why Google’s evaluation of site quality matters, and I’ll provide examples of rich snippets volatility across sites (including review, how-to, and FAQ snippets).

With the June 2021 broad core update, a number of sites reported the loss or gain of rich snippets. For example, a site losing or gaining review, FAQ, or how-to snippets. I also shared a great example of that happening with the June core update and I'll cover more about that shortly. It's important for site owners to understand that this can absolutely happen, and we've seen it over the years during Google's broad core updates (and other major algorithm updates).

Here is the screenshot I shared where a client gained how-to snippets during the June broad core update after losing them during a previous broad core update. This site has worked hard to improve over time (using the “kitchen sink” approach to remediation I have covered in previous posts about broad core updates). I also cover more about this in my post about the July 2021 broad core update.

Sites can lose/gain rich snippets during these updates because there’s a site-level quality threshold involved. If Google’s quality algorithms deem a site high quality enough, you can gain rich snippets (or regain them if you lost them previously). And on the flip side, if Google’s quality algorithms aren’t sure about the quality of the site, then you can lose them. Google has explained this many times and I have shared a number of examples of this happening in my posts about broad core updates.

Google is evaluating site quality over time and has site-level quality algorithms that can have a big impact on rankings (and whether sites can receive rich snippets in the SERPs). Actually, John Mueller just reiterated this in a recent Search Central Hangout video. He explained that there are some signals that Google can't reliably collect on a per-page basis. Google has to have a better understanding of the site overall for those signals. And quality falls into that category. Again, this isn't a new statement, but it's incredibly important to understand.

Here’s the video from John (starting on 40:00):

By the way, this applies to Google Discover as well. Sites can see big swings in Discover visibility with broad core updates and other major algorithm updates (like the recent Product Reviews Update). That’s why it’s incredibly important for site owners to focus on improving quality overall for a site. Don’t focus on just a few urls… improve the site overall.

For example, here is a significant drop in Discover after a broad core update. The site flatlined in Discover as the update rolled out. I covered this in a previous SMX presentation about broad core updates:

And here is a site surging in Discover visibility during the Product Reviews Update after working hard over time to improve site quality. The Product Reviews Update was a significant update impacting reviews content (and was core update-like for many sites that were impacted):

In this post, I’ll cover more about this topic, including what Google has explained over the years about the quality threshold for rich snippets, examples of what I have seen and shared over time, and then why this is important for site owners and SEOs to understand. 

A Quick Note About Favicons Disappearing:
I've helped a number of site owners whose favicons went missing from the Google search results, and they thought it could be due to overall quality problems. In other words, Google's reevaluation of site quality or Google losing trust in a site. That's not the case. If your favicon disappears from the mobile SERPs, it's usually due to technical problems or the favicon violating Google's guidelines. You can read my post covering favicon problems to learn more about that situation.
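If you suspect a technical problem, a quick first check is whether the favicon file itself actually resolves and returns an image. Here's a minimal sketch (the homepage URL is a placeholder; if your pages declare a link rel="icon" tag, check that URL instead of the common /favicon.ico default):

```python
import requests
from urllib.parse import urljoin

# Placeholder homepage -- substitute your own site. If your pages declare a
# <link rel="icon"> tag, check that URL instead of the /favicon.ico default.
homepage = "https://www.example.com/"
favicon_url = urljoin(homepage, "/favicon.ico")

# A 200 status with an image Content-Type suggests the file is reachable.
resp = requests.get(favicon_url, timeout=10)
print(favicon_url, resp.status_code, resp.headers.get("Content-Type"))
```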

What Google Has Explained About Site Quality Thresholds and Rich Snippets Impact:
First, let’s cover what Google has explained over the years about rich snippets, quality thresholds, and major algorithm updates. It’s not uncommon for Google’s John Mueller to be asked questions about sites gaining or losing rich snippets after broad core updates (or other significant algorithm updates). It can be jarring for site owners to see those precious snippets disappear from the SERPs, or surge, during algorithm updates.

Below, I’ll provide several comments from John, and Google’s Gary Illyes, about rich snippets impact. Again, it’s very important to understand. Note, I have shared most of these clips over the years on Twitter or in my blog posts about major algorithm updates.

First, here is Google’s John Mueller explaining that if you’re seeing rich snippets drop after a broad core update (the May 2020 core update in this example), it could be site quality that’s the issue. If everything is correct with the technical setup for receiving rich snippets (structured data-wise), then it could be that Google just doesn’t think the site is high enough quality to receive them. The segment starts at 44:01 in the video:

John also explained in that video that if you suddenly lost rich snippets, and you're unsure if it's a quality issue, you can run a site query as a test. If the technical setup is correct, Google will often show the rich snippets when you run a site query (even if you aren't receiving them in the actual search results). This isn't foolproof, but it's a good test to run. To run a site query, you can search for site:yourdomain.com (and obviously add your own domain name there). The segment starts at 46:24 in the video:
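To make "technical setup" concrete: review snippets are driven by schema.org structured data, typically emitted as JSON-LD. Here's a hypothetical sketch of generating that markup in Python (the product name and rating values are made up for illustration; a real site would populate them from its own data and should validate the output with Google's Rich Results Test):

```python
import json

# Hypothetical product data -- a real site would pull this from its database.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.4,
        "reviewCount": 89,
    },
}

# Emit the JSON-LD block to embed in the page markup.
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```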

Also, in 2019, Google’s Gary Illyes explained to Dan Shure (and others) that a site losing all review snippets can be a sign of overall quality issues. Gary explained that Google has site-level quality signals and those can impact the gain/loss of rich snippets. He also explained how Panda was a site-level quality signal and that could have caused the loss of rich snippets as well (if a site’s Panda score fell below a certain threshold).

Here is the podcast from Dan Shure where he covers what Gary explained (at 3:45 in the podcast). You can click the image to view the episode on Google Podcasts:

And here is a blog post from Barry Schwartz covering John Mueller’s tweet about why rich snippets might not be showing up when the structured data setup is correct. If the setup is technically correct, and doesn’t violate Google’s structured data policies, then it can be a “general quality issue” with the site:

https://twitter.com/JohnMu/status/895992952262017025

More Examples of What Gaining or Losing Rich Snippets Looks Like:
Now that we’ve covered what Google has explained about site-level quality evaluation and how that can impact rich snippets, I wanted to provide several more examples of this happening in the wild. Again, if Google’s quality algorithms think highly of your site, you can gain rich snippets, or maintain them. But if you’re on the flip side, and Google’s quality algorithms aren’t so sure about your site from a quality perspective, then you can unfortunately lose those rich snippets.

First, here is a site that has dealt with its share of drops from previous algorithm updates and worked hard to improve over time. The site finally received FAQ snippets when the May 2020 broad core update rolled out.

Here is a site losing rich snippets during the May 2020 broad core update (how-to snippets in this case).

And here is how a visibility tool can pick up drops or surges in rich snippets over time. By using a tool like Semrush, you can filter by different types of search features to view trending over time. This site lost review snippets with the January 2020 broad core update.

Here is another example of Semrush picking up a surge in review snippets. This time it was during the June 2021 broad core update. This site received review snippets after working to improve quality overall (for months leading up to the update). The graph below is filtered by SERP Feature, and then Reviews:

Using visibility tools can help you view these drops or gains in rich snippets over time (beyond the 16 months of data that GSC provides). It’s helpful when auditing sites that you didn’t have access to when the initial drop or surge occurred.

Why Overall Site Quality Matters: Rich Snippets + Stronger Rankings Overall
Broad evaluation of quality and site-level impact can have a big effect on visibility in the SERPs and in the Discover feed. If Google re-evaluates your site and it passes a certain quality threshold, your entire site can benefit. That’s why some sites can see very strong impact during broad core updates or other major algorithm updates (like the Product Reviews Update). It’s not about a specific url seeing improvement, it’s about the site receiving higher scoring overall from a quality standpoint (which can help many urls rank higher across a site and not just a few).

And from a rich snippets standpoint, if your listings in the SERPs suddenly receive review, how-to, or FAQ snippets, then you can gain a nice advantage from a click-through rate perspective. If you combine that enhanced SERP treatment with stronger rankings overall (which can happen when Google's site-quality evaluation of a site improves), the result can be very powerful for sites benefiting during broad core updates, or other major algorithm updates.

Summary: Rich Snippets Impact During Major Algorithm Updates
I know there can be a lot of confusion about why sites are seeing the gain or loss of rich snippets with major algorithm updates (like the June 2021 core update). For sites losing rich snippets, Google’s quality algorithms just might not be convinced about the overall quality for those sites. And that can yield a loss of rich snippets, or even a big drop in Discover visibility. That’s why it’s extremely important to focus on site quality overall (especially for sites negatively impacted by Google’s broad core updates).

And as I’ve explained in my posts about broad core updates, there’s no quick fix for this. Google will need to see a significant improvement in quality over the long-term in order for a site to regain rich snippets (or to gain more visibility in Discover). Therefore, don’t get so bogged down in improving just a few urls on your site that you forget about Google’s broad evaluation of site quality. Those site-level signals can have a big impact on visibility in the SERPs, and in Discover. Good luck.

GG

Filed Under: algorithm-updates, google, seo
