The Internet Marketing Driver


How to compare hourly sessions in Google Analytics 4 to track the impact from major Google algorithm updates (like broad core updates)

March 15, 2023 By Glenn Gabe

Hourly tracking in Google Analytics 4

I was just asked on Twitter if there is an easy way to compare Google organic traffic hourly in Google Analytics 4, like you can in Universal Analytics. That’s a great question, and it’s a super useful report to have as major algorithm updates roll out. If your site was heavily impacted by a major update (like a broad core update or a Product Reviews Update), you can typically start to see the separation over time as the update rolls out.

So I fired up GA4 and created a quick exploration report for analyzing hourly traffic. Here is a short tutorial for creating the report:

1. Fire up GA4 and click the “Explore” tab in the left-side menu.

Explore tab in Google Analytics 4

2. Click the “Free Form” reporting option.

Free form exploration reporting in Google Analytics 4

3. Click the plus sign next to “Segments” to add a new session segment. Then create a segment for Google Organic by adding a new condition, selecting “Session source / medium” and then adding a filter for “google / organic”.

Creating a segment for Google Organic in Google Analytics 4
Selecting session source and medium and then filtering by Google Organic when creating a new segment in GA4

4. Add that segment to your reporting by dragging it to the “Segment Comparisons” section of the report.

Adding a segment to the reporting in Google Analytics 4

5. Set “Granularity” to Hour.

Selecting Hour as the granularity for the reporting in Google Analytics 4

6. Add a new metric and select “Sessions”. And then drag “Sessions” to “Values”.

Adding sessions as a metric in Google Analytics 4

7. Change the visualization to line chart by clicking the line chart icon.

Changing the visualization of the reporting to line graph in Google Analytics 4

8. For the timeframe, select “Compare” and choose a day, then choose the day to compare against. Note: GA4 isn’t letting me choose today (which is a common way to see how the current day compares to a previous day), so you’ll have to compare the previous day to another day. Sorry, I didn’t create GA4.

Comparing timeframes in Google Analytics 4

9. Name your report and enjoy comparing hourly sessions.
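
If you’d rather pull the same comparison programmatically, here’s a minimal sketch using the official Python client for the GA4 Data API (google-analytics-data). The property ID and dates are placeholders, the exact-match filter on “google / organic” mirrors the segment from step 3, and the two date ranges give you the same before/after comparison as step 8.

# A minimal sketch, assuming the google-analytics-data Python client and
# Application Default Credentials with access to the GA4 property.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Filter, FilterExpression, Metric, RunReportRequest,
)

PROPERTY_ID = "123456789"  # placeholder GA4 property ID

client = BetaAnalyticsDataClient()
request = RunReportRequest(
    property=f"properties/{PROPERTY_ID}",
    dimensions=[Dimension(name="hour")],
    metrics=[Metric(name="sessions")],
    # Two date ranges = the "Compare" timeframe from step 8.
    date_ranges=[
        DateRange(start_date="2023-03-14", end_date="2023-03-14"),
        DateRange(start_date="2023-03-07", end_date="2023-03-07"),
    ],
    # Limit to Google organic, mirroring the segment from step 3.
    dimension_filter=FilterExpression(
        filter=Filter(
            field_name="sessionSourceMedium",
            string_filter=Filter.StringFilter(value="google / organic"),
        )
    ),
)

# Each row includes the hour plus a dateRange dimension when two ranges are requested.
for row in client.run_report(request).rows:
    print([d.value for d in row.dimension_values], row.metric_values[0].value)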

I hope you found this helpful, especially since the March 2023 broad core update is currently rolling out. Have fun. :)

GG

Filed Under: algorithm-updates, google, google-analytics, seo, tools, web-analytics

Google’s September 2022 Broad Core Product Reviews Update (BCPRU) – The complexity and confusion when major algorithm updates overlap

October 19, 2022 By Glenn Gabe

Google September 2022 broad core product reviews update

Well, SEOs and site owners had a heck of an end to the summer of 2022. It all started with the Helpful Content Update (HCU), which rolled out on August 25, 2022. The rollout of Google’s new site-wide signal took a little more than two weeks to complete (and it only seemed to impact the most egregious sites). I covered that heavily on Twitter while analyzing the update.

But we didn’t have much time to rest after the HCU rollout completed, since Google then rolled out the September 2022 broad core update. That was surprising, especially since I spoke with Google about the HCU and Product Reviews Update prior to the HCU rollout, and didn’t hear anything about a broad core update launch that would follow!

Today we released the September 2022 core update. We'll update our ranking release history page when the rollout is complete: https://t.co/sQ5COfdNcb

— Google Search Central (@googlesearchc) September 12, 2022

So now we had a broad core update rolling out that can make the earth shake, right after a new site-wide ranking signal (the HCU) had rolled out. What else could happen?

In a move that surprised many (due to timing), Google then rolled out the September 2022 Product Reviews Update (PRU). And PRUs can be core update-like for sites that contain product reviews content. Although we knew the next PRU was coming, the timing of the rollout was extremely surprising since Google overlapped the broad core update and the Product Reviews Update.

Today we released the September 2022 product reviews update for English-language product reviews. We'll update our ranking release history page when the rollout is complete: https://t.co/sQ5COfdNcb

— Google Search Central (@googlesearchc) September 20, 2022

That’s right, it’s another algo sandwich from Google, which can bring a ton of confusion for site owners and SEOs. To make matters even more confusing, both the broad core update and the PRU completed rolling out on the same exact day (September 26, 2022). Needless to say, many are confused about which update impacted their sites.

Ranking updates page for the September 2022 broad core update and Product Reviews Update

And if you’re an SEO history buff, then you might know that algo sandwiches are not new for Google, just rare (pun intended). Google has rolled out overlapping updates, or updates very close to one another, several times in the past. The one that really sticks out to me was the Panda, Penguin, Panda algorithm sandwich from April of 2012 (where all three updates were rolled out within a 10-day period). It was a triple decker sandwich packed with thin content, spammy links, and topped with Google’s famous hot sauce. As you can guess, many were confused about what hit them at the time… which led to several of my posts about Pandeguin (the combination of Panda and Penguin).

Here’s a graph from my post about Pandeguin showing a site impacted by all three updates:

Pandeguin impact

Covering The Confusion Based On Overlapping Major Algorithm Updates:
In this post, I’m going to cover the two overlapping major algorithm updates that rolled out in September 2022, the confusion that set in for some site owners and SEOs, interesting things I saw during the combined rollout, my recommendations for sites impacted during the algo sandwich, and more.

So strap yourself in, grab your favorite sandwich sauce, and maybe a pickle for good luck. It’s going to be a bumpy ride in Gabe’s Deli for this post. Let’s jump in.

First, here’s a quick table of contents for those that want to jump around the post:

  • Visualizing the confusion. Was it the broad core update or the Product Reviews Update?
  • Welcome To Google Land – The Overlapping of Major Algorithm Updates.
  • Don’t assume it’s the PRU, it could be the broad core update.
  • Look at this trending (if you dare).
  • A final September PRU tremor correcting some issues with the July PRU.
  • More Negative Impact for Sites Hit By Helpful Content Update (HCU).
  • Google on how the PRU evaluates sites. Site-level or url-level?
  • Structured Data helping Google understand if your content contains reviews. Really?
  • Google adds examples of product reviews to its documentation.
  • Final tips and recommendations for site owners.

Visualizing the confusion. Was it the broad core update or the Product Reviews Update?
First, let’s get the dates right. The broad core update started rolling out on September 12, 2022. And many sites saw impact very quickly with the broad core update. I started documenting movement about one day into the rollout, which is quick.

For example, here are several sites seeing immediate impact:

Early impact from the September 2022 broad core update
More early impact from the September 2022 broad core update
Another example of early impact from the September 2022 broad core update
Final example of early impact from the September 2022 broad core update

Then the Product Reviews Update started rolling out on September 20, 2022 in the middle of the broad core update rollout. It was eight days into the rollout of the broad core update. And this is where things get interesting, and confusing. Many sites saw a ton of movement starting right at that point. Some were product reviews sites, but some were not. Not even close actually…

Here are some examples of product review sites seeing movement when the September PRU rolled out. This would be clear PRU impact in my opinion (without confusion based on the broad core update):

Example of product review site impacted by September 2022 Product Reviews Update
Another example of product review site impacted by September 2022 Product Reviews Update
Third example of product review site impacted by September 2022 Product Reviews Update
Final example of product review site impacted by September 2022 Product Reviews Update

But not all sites impacted were as clear as those… There were many sites that don’t contain reviews that were impacted heavily starting on 9/20 (right when the PRU started rolling out). And that led to massive confusion about which update actually impacted sites seeing a lot of movement. In other words, was it the broad core update or the Product Reviews Update impacting the site? Only Google knows… or do they? I’ll cover more about that soon.

Here are some examples of sites that don’t contain reviews or affiliate content at all that were heavily impacted starting on 9/20. These were reference sites, news sites (without affiliate content), e-commerce sites, recipe sites (without affiliate content), and more.

Non-review site impacted on the same date the Product Reviews Update rolled out.
Another non-review site impacted on the same date the Product Reviews Update rolled out.
Third example of a non-review site impacted on the same date the Product Reviews Update rolled out.
Final example of a non-review site impacted on the same date the Product Reviews Update rolled out.

Welcome To Google Land – The Overlapping of Major Algorithm Updates
I have said “Welcome to Google Land” many times over the years, and for good reason. It can sometimes feel like you are on a roller coaster, run by AI, in a land of confusion, with no games or stuffed animals at the amusement park to make you feel better.

Google usually tries to keep major algorithm updates separate so site owners can better understand which update actually impacted their sites. Google’s Danny Sullivan has explained that in the past (and even right before this algo sandwich rolled out!)

Just saw this. We’ve worked very hard to keep updates separated from each other, or as little overlap as possible, to help creators understand more. So no, not coincidence we are due a core update but said let’s wait on that until the helpful content update has rolled out…

— Danny Sullivan (@dannysullivan) September 16, 2022

But, and like I said earlier, there are times that major updates do overlap. This was a great example of that… When the September PRU rolled out during the September broad core update, Google provided some advice to site owners. Google’s advice was basically that if you have product reviews, then it’s probably the PRU impacting your site. If you don’t, then it’s probably not.

And as you can guess, the word “probably” definitely caused some concern. And it wasn’t long before we saw sites that didn’t have product reviews that were impacted heavily when the PRU rolled out.

Here are tweets from Google about the overlapping rollout:

For awareness, the September 2022 core update has not fully completed but it's mostly done. We expect it will be fully complete within a week and will share on our updates page when it is done.

— Google Search Central (@googlesearchc) September 20, 2022

If you see a change and wonder if it's related to the core update or the product reviews update:
– If you produce product reviews, then it's probably related to that.
– If not, then it might be related to the core update.

— Google Search Central (@googlesearchc) September 20, 2022

Don’t assume it’s the PRU, it could be the broad core update:
It’s also worth noting that since the core update was still rolling out, site owners could not simply assume it was the PRU impacting their sites on 9/20. Based on what I saw across many sites, I do believe we saw an uptick from the broad core update right on 9/20 when the PRU rolled out. That might have been coincidental, or it might not. Either way, many sites without product reviews were heavily impacted on that date.

For example, here is visibility trending from a recipe site without any product reviews or affiliate links that was impacted on 9/20. Some users are leaving a quick review of the recipe in the comments, but that’s not really what the PRU should be targeting. So either the PRU is flawed there or it was the broad core update impacting the site. Personally, I believe it was the core update impacting the site, but I can’t say for sure. The site owner is super-confused. They were impacted by a previous broad core update by the way, which is interesting.

Recipe site without reviews or affiliate links impacted on 9/20 when the Sep 2022 product reviews update rolled out

Here is a quote from the site owner about the situation:

“When the September core update started rolling out my site wasn’t affected at all. But the day the PRU started, it took a relatively big hit (it dropped by about 20%). The site doesn’t have product reviews or affiliate content at all. Unfortunately, the overlapping updates can send you down a rabbit hole trying to fix things that aren’t actually a problem… since you don’t have a clear idea of which update caused the drop.

“For example, if it was the Product Reviews Update, then you would think I should be focused on improving reviews content (but I don’t have any reviews)… Or is it the comments that sometimes have quick reviews of recipes from users? And if it was the core update causing the drop, should I focus on improving the site overall? It’s just very confusing when all I’m trying to do is publish great recipes for my users…”

And if you think that’s a tough situation, then look at this trending (if you dare):
The site contains mostly informational content, but does include some reviews (and affiliate links). Google is clearly having major issues understanding the type of site and whether it should rank well. The site has experienced massive swings in rankings during multiple major algorithm updates (including reversals outside of those updates). Below, you can see impact from a broad core update, Product Reviews Updates, and then random swings outside of major updates.

Insane trending for a site impacted by several major algorithm updates.

Which then led to others sharing their trending on Twitter, which showed similar ups and downs.

Note, I’ll come back to this case shortly… since the next section ties in nicely. Yep, it’s like Inception for SEO. :)

A final September PRU tremor correcting some issues with the July PRU:
So, based on what I explained with the 9/20 impact, could there be a flaw with the September PRU? It’s possible… and I was pretty vocal about some problems I saw with the July PRU. At the time, I said that we could see Google correct those flaws via a tremor or with the next PRU.

Here is my tweet from July explaining we might see a correction:

And here are some exs of surges based on the July PRU. The last screenshot is a super-interesting one that I'm digging into heavily. It's a massive drop, but I'm not sure that's correct… Wouldn't shock me to see a change there as the update continues to roll out (via a tremor). pic.twitter.com/4isWpVZQS3

— Glenn Gabe (@glenngabe) July 31, 2022

Thankfully, a change was rolled out! At the very end of the rollout of the September PRU (on 9/25), a number of sites surged back from the dead that were impacted by the July PRU. I tweeted several examples when this happened.

Site surging back from the dead during a late PRU tremor
Another site surging back from the dead during a late PRU tremor
A third site surging back from the dead during a late PRU tremor

Time will tell if some issues with the September PRU get corrected and those sites bounce back like these did on 9/25.

And remember that site from earlier with insane trending? Well, here we go again. That site surged back again on 10/15. Yep, after the broad core update and PRU completed. And after the site dropped heavily during the late PRU tremor on 9/25. Again, Google is having a very hard time understanding where this site fits in, the type of content it contains, etc.

More insane trending for a site impacted by multiple major algorithm updates.

More Negative Impact for Sites Hit By Helpful Content Update (HCU)
After the Helpful Content Update rolled out, there was some confusion about whether the HCU could contribute to broad core updates (as another signal). Google’s Danny Sullivan explained that the updates are separate, but if you were impacted by the HCU and you also have broad core update issues, then the combined effect might not be optimal for you… Basically, the effect can be compounded.

Here is a tweet from Danny about this:

Not that directly. Point is our ranking systems use a variety of signals overall, as we said: https://t.co/G6g7hvE7P2

Helpful content is weighted, so sites on the edge might not see issues. But if they also have core update issues, the combo might be more significant. pic.twitter.com/tQZcsZQHkp

— Danny Sullivan (@dannysullivan) September 12, 2022

And here are some sites that dropped with the Helpful Content Update and then saw more of a drop with the September broad core update:

Site impacted by the Helpful Content Update that saw more of a drop with the Sep 2022 broad core update
Another site impacted by the Helpful Content Update that saw more of a drop with the Sep 2022 broad core update
Third site impacted by the Helpful Content Update that saw more of a drop with the Sep 2022 broad core update

Google on how the PRU evaluates sites. Site-level or url-level?
Based on the varying levels of impact by the Product Reviews Update, and how it impacted some sites that contain a mix of informational content and reviews, Google received some questions about how the PRU evaluates content. For example, is it evaluating at the url-level or site-level? And, how does Google determine that a piece of content is a review in the first place (especially if the site has other types of content)?

Google first responded quickly about the latter question and recommended that sites might want to add structured data to clearly signal to Google that the content was a review. Then when questioned about that point, Google took some more time internally before responding.

Then Danny Sullivan responded and first explained more about how the PRU evaluates websites. He explained that if your site contains a lot of product reviews, then the PRU can be like a site-wide evaluation. i.e. All content can be evaluated… But, if your site contains a mix of content, and reviews don’t make up a large portion of the content, then the PRU evaluates more on the url-level.

Here are Danny’s tweets about that:

If you don't have a lot of product reviews (a really substantial not-single-digit-percentage part of your entire site is made up of them), a site-wide evaluation is not likely to happen…

— Danny Sullivan (@dannysullivan) October 7, 2022

This made complete sense to me based on analyzing many sites impacted by Product Reviews Updates. For sites where a majority of the content is comprised of product reviews, the site could be heavily impacted by the PRU (and any content on the site could be negatively impacted).

For example, here is a site that focuses on product reviews that was impacted heavily by the April 2021 PRU. Notice the massive drop when the update rolls out:

Extreme drop for a product reviews site during the April 2021 PRU

And for sites that contain a mix of content, like news sites that also publish some product reviews, then it wouldn’t be like a site-wide evaluation. In other words, not all content would be evaluated by the PRU. For example, here is a news site that contains review content in a specific section. The site overall took a hit when the PRU rolled out, but the section with reviews REALLY took a hit.

First, here is the drop overall for the site:

News site also containing reviews dropping with the Sep 2022 PRU

And here is the drop for the reviews section:

News site with reviews section dropping with the Sep 2022 PRU

This does reinforce the idea that adding all reviews to a section could help Google identify where reviews are on the site. John Mueller explained that the PRU works more broadly on a site or section-level depending on how the site is structured content-wise.

These kinds of changes tend to be more on broader parts of sites, or the sites overall.

— ⛰ johnmu is not a cat ⛰ (@JohnMu) April 9, 2021

And that dovetails nicely into the next section of my post about structured data and Google identifying reviews…

Structured Data helping Google understand if your content contains reviews. Really?
For sites that contain a mix of content, Danny Sullivan explained that Google can use structured data to help it understand if a piece of content is a review. But it’s not required, and it’s just one of several signals Google uses to understand if content is a review.

As for structured data, it might help us better identify if something is a product review, but we do not solely depend on it.

— Danny Sullivan (@dannysullivan) October 7, 2022

And this from earlier in the conversation:

Structured data isn't used for ranking. Alan was saying it's possible it's used to help identify types of content, just as we use many signals to understand content. That's not the same as ranking, nor would we solely depend on it as not everyone uses it.

— Danny Sullivan (@dannysullivan) October 3, 2022

I’ve been vocal that this is pretty ridiculous to me. I mean, Google with all of its unbelievable natural language processing power needs site owners to feed it structured data to understand if it’s a review? That’s crazy. Also, many site owners don’t even know what structured data is and how to use it. That said, if your site contains a mix of content (including reviews), then I would add structured data to signal to Google which pieces of content are indeed reviews.
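
If you do add it, the markup itself is lightweight. Below is a minimal sketch of the kind of schema.org Review JSON-LD you might embed on a review page, built here with Python’s json module; the product, author, and rating values are hypothetical placeholders, and Google’s structured data documentation is the authority on which properties are required or recommended.

# A minimal sketch of schema.org Review markup in JSON-LD form.
# All names, ratings, and values below are hypothetical placeholders.
import json

review_jsonld = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {"@type": "Product", "name": "Example Cordless Drill"},
    "author": {"@type": "Person", "name": "Jane Reviewer"},
    "reviewRating": {"@type": "Rating", "ratingValue": "4", "bestRating": "5"},
    "reviewBody": "Tested over two weeks across three projects...",
}

# Embed the output in a <script type="application/ld+json"> tag on the review page.
print(json.dumps(review_jsonld, indent=2))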

And if you think that structured data statement from Google caused more confusion, you would be right. I had several site owners reach out to me claiming they were hit by the PRU because of structured data errors, the wrong structured data used, etc. Unfortunately, even though I don’t believe that’s the case, I can’t say it’s NOT the case with 100% certainty.

Google adds examples of product reviews to its documentation.
With each Product Reviews Update, the algorithm continues to evolve. One question I get often is about the types of content that can be impacted by the PRU. For example, is it supposed to target just sites with product reviews, or can sites with other types of reviews content be impacted? In the past, I’ve seen sites with user-generated content (UGC) reviews get impacted, and I’m not sure Google is really targeting that type of review with the PRU. That said, I’ve seen less and less of that with each Product Reviews Update. Google does seem to be focusing just on product reviews with the latest iterations of the PRU.

Also, Google recently refined one of the help documents about the Product Reviews Update. Specifically, Google added three examples of the types of reviews content that can be created by site owners (and I’m assuming Google is saying these are the types of review content that can be evaluated by the PRU – at least for now). And those examples don’t contain UGC reviews.

It’s just worth noting for anyone creating reviews content (or any site owner that has UGC reviews).

Google adds examples of product review pages in their documentation for site owners and SEOs.

Final tips and recommendations for site owners impacted during the September 2022 algorithm sandwich:
If you were impacted during the latest algorithm sandwich, then the tips below could help you get moving in the right direction. I hope the following recommendations help cut through some of the confusion:

  • If you were impacted starting on 9/12, and before 9/20, then you were impacted by the broad core update. You can read my posts about broad core updates to learn more about them, how to identify why you were impacted, and learn the best path forward from a remediation standpoint.
  • If you were impacted starting on 9/20 when the PRU rolled out, and you have a mix of content on your site beyond reviews, do not assume it was the PRU. It could have been the broad core update. I covered this earlier in the post, and I have seen many sites impacted on 9/20 that had no reviews or affiliate content at all.
  • If you were impacted starting on 9/20, and you do have a lot of product reviews content, then you should work to improve your content based on Google’s best practices. You can read my previous posts about Product Reviews Updates to learn more about them.
  • If you have a mix of content on your site, including reviews, then I recommend adding structured data to identify the reviews content (since Google explained it uses structured data as one signal for identifying reviews). I still think it’s crazy that site owners need to do this… but I would probably do it to make sure Google can understand what is reviews content and what’s not. Remember, we’re dealing with machine learning systems and mistakes can definitely be made by Google on this front. Just check the reversals in my post above to see examples of that.
  • If impacted by the September broad core update, run a delta report to understand what dropped, and why. For example (if it was the core update), was it a relevancy adjustment, an intent shift, or an overall site quality problem? Then form a remediation plan based on what you find (see the sketch after this list for a starting point).
  • If impacted by a major algorithm update, don’t just compare specific urls in the SERPs. With site-level quality algorithms at play, the overall quality evaluation could be dragging rankings down, and for some urls, the content on the page might have little to do with why that url specifically dropped. Improve overall… that’s what Google wants to see.
  • For broad core update remediation, using a “kitchen sink” approach is your best path in my opinion. That’s where you surface all potential quality problems and work hard to fix as many as you can. Look to improve overall. That’s all you can do in the age of machine learning-based major algorithm updates.
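
As a starting point for the delta report mentioned above, here’s a minimal sketch that compares two Search Console performance exports by page (one for a window before the update, one after). The filenames and the “Page” / “Clicks” column names are assumptions based on a standard CSV export, so adjust them to match your own files.

# A minimal sketch of a before/after delta report built from two Search Console
# performance exports by page. Filenames and column names are assumptions.
import pandas as pd

before = pd.read_csv("pages_before_update.csv")  # e.g. a window before 9/12
after = pd.read_csv("pages_after_update.csv")    # e.g. 9/12 through 9/26

delta = before.merge(after, on="Page", how="outer", suffixes=("_before", "_after")).fillna(0)
delta["clicks_delta"] = delta["Clicks_after"] - delta["Clicks_before"]

# Biggest losers first: these are the pages (and their query spaces) to dig into.
columns = ["Page", "Clicks_before", "Clicks_after", "clicks_delta"]
print(delta.sort_values("clicks_delta")[columns].head(25))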

Summary – Overlapping major algorithm updates can cause confusion for site owners and SEOs. Try your best to cut through that confusion…
I hope this post helped you better understand the major algorithm updates that rolled out in September of 2022. Unfortunately, two of those major updates overlapped for a week. And when that happens, it can cause massive confusion for site owners and SEOs. By reviewing the points in my post, and understanding the dates you were impacted, you can form a plan of attack remediation-wise.

And like I explained in my post, don’t be surprised if we see some corrections and adjustments based on the latest updates. Again, “Welcome To Google Land”. Good luck.

GG


Filed Under: algorithm-updates, google, seo

Google’s Helpful Content Update Introduces A New Site-wide Ranking Signal Targeting “Search engine-first Content”, and It’s Always Running

August 18, 2022 By Glenn Gabe

I was able to speak with Google about the Helpful Content Update, which introduces a new site-wide ranking signal that’s targeting “unhelpful content” (which is content primarily created to rank in search engines versus help users). Keep your arms and legs in the vehicle at all times, the next few weeks in Google Land are going to be interesting.

Updated August 25, 2022
The rollout has begun! Google posted in its ranking updates documentation that the Helpful Content Update started rolling out on 8/25/22.

Updated August 24, 2022
Included information about how the Helpful Content Update will impact multilingual sites.

Updated: August 22, 2022
I included new information about noindexing content and how subdomains could be treated when the Helpful Content Update rolls out.

—————————–

Big news in the SEO world. Google is releasing a new, site-wide ranking signal that targets “unhelpful content”. I was able to speak with Google’s Danny Sullivan about the ranking signal and will provide that information, including my thoughts about the update, below. Note, Google also announced that a new Product Reviews Update will be rolling out in the coming weeks. I’ll cover more about that separately (soon).

Here’s a quick table of contents if you want to jump around to different sections:

  • The Helpful Content Update (HCU) – An introduction.
  • Site-wide ranking signal that’s always running.
  • The signal is weighted.
  • How to recover from the Helpful Content Update.
  • More bullets and guidance from Google.
  • Machine learning and MUM.
  • AI-generated content.
  • Dueling machine learning systems.
  • Types of content targeted by the Helpful Content Update.
  • Multilingual sites and English searches only.

The Helpful Content Update (HCU) – An Introduction
First, the name of the update is the “Helpful Content Update”, which I’ll refer to as the HCU for short, and it will start rolling out next week. It will initially focus on English content globally (like the Product Reviews Update). Like broad core updates and the Product Reviews Update, the Helpful Content Update can take up to two weeks to fully roll out. Naming-wise, I was personally hoping for a black and white animal like Orca, Zebra, or Dalmatian, to follow in Panda and Penguin’s footsteps, but that’s clearly not the direction Google wanted to go. Also, Google told Search Engine Land that the update will only impact Search for now (and not Discover or other surfaces).

Update: The Helpful Content Update started rolling out on August 25, 2022. Google posted in its ranking updates documentation explaining the HCU started rolling out on 8/25/22 and that it can take up to two weeks to fully roll out. If it’s like broad core updates and the Product Reviews Update, we should start seeing impact within a few days (like within the first 48 hours).

The new, site-wide ranking signal targets content that’s created for search engines first and not users. Google is looking to tackle what it considers “unhelpful content” versus content that’s created to truly help users. Their post talks about “search engine-first” content creation versus “people-first”. For example, pages that simply aggregate content from other sources without providing some unique value or insights.

Google said it wants to “reward content where visitors feel they’ve had a satisfying experience, while content that doesn’t meet a visitor’s expectations won’t perform as well”. In one of its two posts about the update, Google provided an example of someone searching for movie information and finding a page ranking well that just aggregated reviews from other sites without providing unique perspectives. With the new site-wide signal at play, you should find more content with unique information.

Google explained that this new signal is joining the Product Reviews Update in a broader effort to reduce low quality content from the search results. And if this reminds you of medieval Panda after reading about the new signal, you’re definitely not alone. It’s one of the first things I thought of after hearing about it. But it’s different… read on.

A site-wide ranking signal that’s always running:
Yes, you heard that correctly. It’s a site-wide signal. Over the years, I have heavily covered that Google has site-level quality algorithms that can impact an entire site (versus the notion that every url fends for itself rankings-wise). Panda and Penguin of the past were site-level algorithms. And there are other site-level algorithms currently at play with broad core updates.

Here are just a few tweets of mine with links to videos from Google’s John Mueller about this (there were many tweets to choose from):

Google on the importance of overall site quality -> Via @johnmu: For some things, we look at site quality *overall*. So, if you have significant portions that are low quality, then that can drag down your original, higher quality content too: https://t.co/MHrZN1P56Q pic.twitter.com/BrkGTMdCIX

— Glenn Gabe (@glenngabe) January 2, 2022

Site-level: Have one url that's high quality, but the site overall isn't? Via @johnmu There are some signals we can't reliably collect on a page-per-page basis. We need to have a better understanding of the *site overall*. Quality falls into that category: https://t.co/tkTRxmFupG pic.twitter.com/YnNG3ip5jl

— Glenn Gabe (@glenngabe) June 12, 2021

During our call, Google’s Danny Sullivan explained to me that the new ranking signal is a classifier. If your site is deemed to have a lot of what Google considers “unhelpful content”, then the site will be classified that way (and that can negatively impact your rankings at a site-level).

In Google’s post about the signal it explains, “Our systems automatically identify content that seems to have little value, low-added value or is otherwise not particularly helpful to those doing searches.”

Also, and this is super important to understand, Google explains that “ANY CONTENT – not just unhelpful content – on sites determined to have relatively high amounts of search engine-first content is less likely to perform well in Search (assuming there is other content on the web that’s better to display)”. So this site-wide ranking signal can impact rankings across your entire site, including higher quality content. That’s why  Google recommends removing unhelpful content in order to get your site reclassified (which can lead to recovery).

Update: Information about subdomains:
Google’s Danny Sullivan was asked about how subdomains would be treated and he explained that Google “tends to see subdomains apart from root domains, but it can also depend on many factors.” In other words, they tend to see subdomains as separate. It’s important to know that Google has algorithms that work on the hostname-level and the Helpful Content Update could be doing the same. For example, medieval Panda worked on a hostname-level. Google’s Gary Illyes explained that in the past. We’ll know much more once the update rolls out, but this is good to know.

Here is Danny’s tweet about subdomains:

We tend to see subdomains apart from root domains but it can also depend on many factors.

— Danny Sullivan (@dannysullivan) August 18, 2022

The Signal Is Weighted: Impact Could Vary
I asked if the new site-wide ranking signal was binary (yes/no) or if there were grades of scoring. Google explained that the signal is weighted (and they explained that in their blog post as well). They explained that “sites with a lot of unhelpful content will find the signal stronger for them”. So, if you have a lot of unhelpful content on the site, like content created for search engines over humans, then you could see a stronger effect from the Helpful Content Update (HCU).

Recovery from the Helpful Content Update. How long does it take?
If your site is impacted by the new ranking signal, then you will need to work hard to improve your content over time. Google explained you cannot recover quickly. Instead, it can take months for the classification to change (as Google sees less “unhelpful content” on your site).

Google explained, “Sites identified by this update may find the signal applied to them over a period of months. Our classifier for this update runs continuously, allowing it to monitor newly-launched sites and existing ones. As it determines that the unhelpful content has not returned in the long-term, the classification will no longer apply.”

So, like I explain to clients about broad core updates, do not push short-term changes, only to revert back to old ways. Keep the right changes in place over the long-term. Don’t cherry pick changes… tackle “unhelpful content” aggressively. That’s how you can see recovery from the Helpful Content Update.

Update: Noindexing versus 404ing content:
Google’s John Mueller was asked on Twitter if noindexing “unhelpful content” is ok or if the urls need to be truly removed (404 or 410). John explained that noindexing content is fine (since the urls are removed from the index). But, and I agree, users can still find that content… and if the content is lower quality and “unhelpful”, then why even keep it? So you can noindex, but in my opinion, 404ing it is a better approach. You can also boost that content and improve it. But that could be a lot of work if there are many “search engine-first” pages on the site.

Here is John’s tweet about noindexing content:

noindex is fine. Consider if all we see are good signals for your site, that's a good sign. That said, as a user I'd feel kinda weird, you land on a good page, and the rest is bad? Why would you do that? Short-term noindex is a good way to start, but usually it's not a few pages.

— ⛰ johnmu of switzerland ⛰ (@JohnMu) August 18, 2022
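
On a practical note, if you’re pruning “unhelpful content” it’s worth verifying how those urls actually respond now (a 404/410 versus a page that still resolves with a noindex). Here’s a minimal sketch using the requests library; the url list is a hypothetical placeholder, and the noindex check is a simple string match rather than a full HTML parse.

# A minimal sketch that checks whether pruned urls return 404/410 or still
# resolve with a noindex signal. The url list is a hypothetical placeholder.
import requests

urls = [
    "https://www.example.com/old-thin-post-1/",
    "https://www.example.com/old-thin-post-2/",
]

for url in urls:
    resp = requests.get(url, timeout=10)
    noindex = (
        "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
        or "noindex" in resp.text.lower()
    )
    print(url, resp.status_code, "noindex" if noindex else "no noindex detected")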

Also, the site-wide ranking signal will be continuously running (similar to how the Page Experience Signal works). It will not require a periodic refresh. Instead, your site’s classification can change over time as Google detects more, or less, “unhelpful content” on your site (which again, is content created for search engines over people). It’s important to understand that sites won’t recover overnight, or even in a few weeks, but you can see recovery after several months (if you have removed enough unhelpful content).

Also, we have no idea how powerful the signal is. I asked Danny Sullivan about this during our call and he said if a site is impacted by the Helpful Content Update, then that impact should be visible (meaning the site should see an impact to rankings). Personally, I get the feeling the signal will be much more powerful than the Page Experience Signal (which is a lightweight ranking factor at best). I’ll be covering the impact heavily once the rollout is complete.

MY OPINION: Is this a foreshadowing of how more site-level signals will be handled in the future?
OK, so we know that Google could always decouple algorithms from broad core updates and run them separately. I have covered that in several posts about broad core updates. Well, this is just my opinion, but imagine if Google started decoupling algorithms from broad core updates and running them continually like this new signal.

That could be a big shift in how those algorithms impact sites (and throughout the year versus just during broad core updates). To clarify, Google did not say anything about this when I spoke with them (so it’s totally my point of view), but hearing about this new ranking signal targeting low quality content, and how it works, had me thinking about broad core updates and how they work. I guess time will tell if that starts happening.

More bullet points for site owners, more guidance from Google:
From medieval Panda to broad core updates to the Product Reviews Update, Google has provided a number of bullets to help site owners evaluate their own content (and sites overall). And in my opinion, those bullets are extremely important for site owners and SEOs to objectively review. Heck, I’ve even explained how to use those bullets to craft user studies through the lens of broad core updates.

Well, we now have a new set of bullets based on the new site-wide ranking signal. And, Google said the bullets shouldn’t surprise anyone.  

First, Google explained that if you answer “yes” to the following questions, then you’re probably on the right track with using a people-first approach to content creation:

  • Do you have an existing or intended audience for your business or site that would find the content useful if they came directly to you? 
  • Does your content clearly demonstrate first-hand expertise and a depth of knowledge (for example, expertise that comes from having actually used a product or service, or visiting a place)?
  • Does your site have a primary purpose or focus?
  • After reading your content, will someone leave feeling they’ve learned enough about a topic to help achieve their goal?
  • Will someone reading your content leave feeling like they’ve had a satisfying experience?
  • Are you keeping in mind our guidance for core updates and for product reviews?

And when addressing avoiding creating unhelpful content, Google provided additional bullets and explained that answering “yes” to some, or all, of the questions is a warning sign that you should reevaluate how you are creating content:

  • Is the content primarily to attract people from search engines, rather than made for humans? 
  • Are you producing lots of content on different topics in hopes that some of it might perform well in search results?
  • Are you using extensive automation to produce content on many topics?
  • Are you mainly summarizing what others have to say without adding much value? 
  • Are you writing about things simply because they seem trendy and not because you’d write about them otherwise for your existing audience?
  • Does your content leave readers feeling like they need to search again to get better information from other sources?
  • Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t).
  • Did you decide to enter some niche topic area without any real expertise, but instead mainly because you thought you’d get search traffic?
  • Does your content promise to answer a question that actually has no answer, such as suggesting there’s a release date for a product, movie, or TV show when one isn’t confirmed?

My take on the new bullets:
Google is clearly explaining that they want to reward content creators that focus heavily on a subject matter, have deep expertise in an area, and that can demonstrate first-hand expertise and a depth of knowledge. I have covered “staying in your lane” many times when discussing broad core updates, and I would reiterate that based on the bullets above. Don’t try to cover too many different topics if you can’t provide insightful information for each of them. In other words, stay in your lane.

I recommend focusing on what you truly have expertise in. Don’t follow search volume alone; provide top-notch, high quality content that can really help users. Don’t use automation to pump out a lot of lower quality content just to target queries with search volume. Don’t just summarize what others are saying. Provide unique insights. Don’t artificially increase word count thinking Google looks to reward that. They don’t, and it can frustrate users. Don’t overpromise and underdeliver. As Google put it, don’t promise an answer to a question that has no answer… It’s worth noting that Google has a News and Discover policy that literally says the same thing. Just an interesting side note.

Machine Learning and MUM: Is the new signal using MUM?
I asked Danny if machine learning, and more specifically if MUM (Multitask Unified Model), was being used for the new ranking signal. Danny explained that the classifier is entirely automated using a machine learning approach, but MUM is not being used.

So like I thought would be the case, Google is using machine learning to understand when a site has a lot of unhelpful content. We know that the Product Reviews Update also uses a machine learning model (and I believe Google’s broad core updates are using machine learning as well). So this wasn’t shocking, but it’s good to get confirmation.

AI-generated content:
I asked Google if the new ranking signal would be targeting content created via AI models like GPT-3 and others, or if the signal would just be targeting more egregious examples of content produced for search engines versus users. For example, there are definitely sites pumping out a lot of AI-driven content (and some of it isn’t great… to say the least). Some of that is ranking well right now.

Google explained, “we are looking to surface content that will be viewed as helping and adding value to the topics searched. Creators who are specifically creating content for search engines first, which include AI-based content, may be impacted.”

So if you are heavily using AI to create content, you should be aware that it can be targeted by the new ranking signal (if it’s deemed to be created for search engines first versus truly helping users). There’s clearly nuance there, so it should be interesting to see what gets impacted by the HCU.

Dueling Algorithms Can Still Be At Play:
With this latest release, broad core updates, Product Reviews Updates, and the new site-wide ranking signal will be at play now. Broad core updates and the Product Reviews Update require a periodic refresh, while the new ranking signal will be continually running.

I have covered what I call “dueling machine learning systems” in my posts about broad core updates and the Product Reviews Update. That’s when a site might surge with one update and drop with another (which can be awkward and confusing for site owners). In other words, is the site high quality, or not?

Here’s an example from my post about the March Product Reviews Update where a site is surging with one type of update and dropping with another:

I asked Danny if that could potentially happen with the new ranking signal, or if it would be closely aligned with what happens with broad core updates and the PRU. He explained that they are all separate systems, so it’s possible, but it should be rare since the algorithms are all trying to reward high quality content or reduce low quality content.

I’ll be tracking this closely as the new ranking signal rolls out. I hope it aligns well with broad core updates and the PRU. Again, it’s a confusing and awkward situation when you see dueling algorithms like that.

Types of content targeted and niche areas that will see more impact.
With the new site-wide ranking signal, Google explained that there are certain types of content that seem to be impacted more than others based on their testing (at least for now). That includes online educational materials, arts and entertainment, shopping, and tech-related content. I asked more about that, and Danny explained this is what Google saw after testing the signal versus what they targeted specifically. In other words, there just might be more low-quality “unhelpful content” that’s being impacted in those niche areas than other areas. It’s not that Google targeted those niche areas… it’s just those areas are seeing more impact.

It’s also important to reiterate that Google will be continually improving and refining that signal over time. So things can absolutely change on that front. And remember, the site-wide signal is always running. So there won’t be periodic refreshes like we see with broad core updates or Product Reviews Updates.

How will the Helpful Content Update impact multilingual sites?
There was a great question from Juan González Villa on Twitter about how the Helpful Content Update would impact sites with multilingual content. For example, we know the update only impacts English content globally, so what about sites that contain content in multiple languages? If the new classifier detects a site has a lot of “search engine-first” content, will the update impact the rankings of content in other languages?

Hi @dannysullivan @JohnMu 👋

Is the Helpful Content Update likely to have site-wide effect on multilingual sites?

Example, let's say https://t.co/WsFr0Uzp2A is impacted as the update rolls out for English content. Would this also have any effect on its Spanish pages? Thanks!

— Juan González Villa (@seostratega) August 22, 2022

We haven’t heard from Google yet about this, but I believe the answer lies right in their blog post about the update. Specifically, Google explains that English searches globally will be impacted. The key word is “searches”, since it seems like Google will use the classifier for English queries (and not queries in other languages yet). That would make a lot of sense, and it was one of the theories I explained on Twitter when I responded to Juan. Here is that part from the post:

So if you have content in other languages on a site that also has English content, then you have a head start for making sure those pages in other languages are truly high quality and insightful. For now, it sounds like the Helpful Content Update will only target English queries (and your site will not be impacted by the ranking signal when Google detects queries in other languages). But like the Product Reviews Update, Google wants to expand to other languages. Again, you have a head start.

Summary: A new site-wide ranking signal is rolling out targeting low quality, “search engine-first” content.
I hope this post shed some light on Google’s new Helpful Content Update, powered by a site-wide ranking signal targeting “search engine-first” content. The new signal is a classifier that’s always running, and it aims to reduce the amount of low-quality content in the SERPs. If you are impacted by the new site-wide signal, then work to remove unhelpful content from your site. Then focus on using a “people-first” approach to content creation (by providing helpful and insightful content for users). Over time (which Google explained can be months), your site’s classification can change, and you can recover. But you will not recover in days, or even weeks.

I’ll be tracking the rollout of the new ranking signal closely, so definitely follow me on Twitter for the latest updates. Like I explained earlier, it’s going to be an interesting next few weeks in Google Land. Good luck.

GG


Filed Under: algorithm-updates, google, seo

Analysis of Google’s March 2022 Product Reviews Update (PRU) – Findings and observations from the affiliate front lines

May 2, 2022 By Glenn Gabe

Google's March 2022 Product Reviews Update (PRU)

Almost four months since the last Product Reviews Update (PRU) rolled out, Google released the third in the PRU series on March 23, 2022. PRUs can cause a lot of volatility for sites with reviews content, and the first two were core update-like for some. With each PRU, Google is looking to continue its evolution with surfacing the highest quality and most insightful reviews content in the search results. And that means thinner, lower-quality posts should drop in rankings as more thorough content rises in the search results. More about that soon.

In this post, I’ll cover several important observations and findings based on the March 2022 Product Reviews Update. I am not going to cover the PRU overall, since I have done that heavily in my first two posts about the April 2021 and December 2021 Product Review updates. Instead, I’ll cover some interesting findings based on analyzing sites impacted by the March PRU (both surges and drops). That includes the types of content potentially helping sites win during the PRU, some lower-quality reviews content slipping through the cracks, more about dueling machine learning algorithms (broad core updates and PRU), the importance of review testing labs, the power of links (or not), and more. I’ll also revisit what I call the Wirecutter Standard with an interesting example of a site employing that strategy that missed the latest PRU cutoff.

Here’s a quick table of contents for those that want to jump around:

  • Periodic refresh still necessary.
  • Linking to multiple sellers.
  • Multimedia (especially video) helping sites, even when not original?
  • Content slipping through the cracks. A potential loophole.
  • Interesting Case: Employing the Wirecutter Approach and missing the PRU cutoff.
  • Watch for intent shifts. It could be Google, and not your content.
  • Dueling machine learning algorithms (again), and surfing the gray area.
  • Ignore user feedback at your own peril.
  • Testing Labs: Follow the leader and how review testing labs will continue to expand.
  • First-hand testing by reviewers. Is it necessary?
  • The Power of Links: inconsistent findings (again).
  • Key takeaways for site owners and affiliate marketers.

Reminder: PRUs Still Require A Periodic Refresh:
Regarding seeing changes over time, the PRU still requires a periodic refresh (as you can see via the massive swings in visibility during each rollout). So, Google still needs to “push a button” and roll out the update. So far, that’s been separated by a number of months (eight months in between the April and December PRUs and then almost four months in between the December and March PRUs). Just keep this in mind while working on remediation. You will need another PRU to roll out to see significant improvement (if you have been negatively impacted by a previous Product Reviews Update). I’ll cover more about dueling machine learning algorithms and the future of the PRU later in this post.

For example, I asked Google’s Danny Sullivan about the type of rollout when the first PRU launched in April of 2021:

At the moment, there's a periodic refresh. Unlike with core updates, we might not always post when a refresh happens given the more limited nature of content involved here. So overall, sites should consider the advice & keep working to it (true of core updates as well!).

— Danny Sullivan (@dannysullivan) April 9, 2021

Linking to multiple sellers: Not included in the algorithm yet, but showing up more and more.
With the December Product Reviews update, Google explained that sites should consider providing links to more than one retailer to purchase products. That surprised many affiliate marketers since Amazon is the dominant e-commerce retailer benefiting from affiliate links (and it’s actually against Amazon’s TOS to link to other retailers when using data via its API.)  

Google explained it was just a recommendation and not being used algorithmically in the PRU (yet), but that definitely was a shot across the bow of Amazon. Well, the March PRU rolled out and I didn’t see any mention of that factor being enforced. So, I pinged Google’s Alan Kent to learn more. Alan explained that Google was still not enforcing that aspect at the moment.

Hi Glenn. The update is an improvement of current algorithms. There is no special support for multiple sellers in this update.

— Alan Kent (@akent99) March 23, 2022

That’s good to know, but my recommendation is to link to more than one seller, if possible (to future-proof your site), but it’s not a requirement as of now. While analyzing the March PRU, I noticed many more affiliate marketers are indeed linking to multiple sellers, when possible. In the past, I saw many reviews linking to just Amazon. That has definitely changed based on the sites I’ve been analyzing and I’m sure Amazon is watching closely. That type of change could dilute their affiliate revenue a bit (as affiliate sites start linking to other retailers from their reviews content). We’ll see how this plays out…

For example, a site linking to two sellers from reviews content:

Reviews sites linking to multiple sellers

Here is another review linking to multiple sellers (four in this example):

A review site linking to multiple sellers to buy products
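
If you want to see how many sellers your own reviews link out to, a quick look at the outbound links on a review page gives you a rough count. Here’s a minimal sketch using requests and BeautifulSoup; the review url is a hypothetical placeholder, and counting distinct external domains is only a rough proxy for distinct retailers.

# A minimal sketch that counts distinct external domains linked from a review
# page (a rough proxy for how many sellers the review links to).
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

review_url = "https://www.example.com/best-cordless-drills/"  # placeholder
own_domain = urlparse(review_url).netloc

soup = BeautifulSoup(requests.get(review_url, timeout=10).text, "html.parser")
external_domains = {
    urlparse(a["href"]).netloc
    for a in soup.find_all("a", href=True)
    if urlparse(a["href"]).netloc and urlparse(a["href"]).netloc != own_domain
}

print(f"{len(external_domains)} distinct external domains linked:")
for domain in sorted(external_domains):
    print(" -", domain)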

Video: A picture is worth a thousand words. And video can be worth ten thousand.
As part of Google’s best practices, they explained to “provide evidence such as visuals, audio, or other links of your own experience with the product, to support your expertise and reinforce the authenticity of your review.” And in my post about the April Product Reviews Update, I explained how original images, video, and gifs could help readers get a much better feel for a product.

Google's best practices for reviews sites regarding video and images.

Well, I’ve noticed an interesting trend while analyzing sites impacted by the PRU: I’m seeing much more video embedded in the articles. I think that’s great, but the devil is in the details. And this could be a weird loophole.

For example, if you produced an original video based on reviewing a product, that’s outstanding. But what if you didn’t shoot a video and simply embedded a video of the product from another creator, manufacturer, etc.? I’m seeing that technique used often while analyzing reviews and I think that could be a short-lived benefit.

If you are an affiliate marketer using video in your review articles, I would take a hard look at those videos and determine if they are truly helpful and if they reinforce your first-hand use of those products. Also, and this is just my opinion, but original video is more powerful than leveraging someone else’s video. After all, any site can embed the same exact video in their own review articles.

I know high quality video is not easy to produce, but it can really set your reviews apart from the competition. And if Google can figure out what’s truly original and insightful from a video standpoint, then having your own videos could only help (as long as they are high quality, insightful, and valuable for readers).

For example, here is an original video embedded in a review:

Review site with original video content.


A PRU loophole? Low-quality lists of products ranking well for some queries.
With previous Product Reviews Updates, I noticed some loopholes. There were some sites ranking with a very basic format (no review content actually). Although that specific loophole seemed to be closed leading up to the March PRU, I came across other examples of sites ranking with thin or low-quality reviews content. Actually, they weren’t really reviews. Instead, there was basically just a list of “best products” with minimal content, and those pages are ranking well for various review queries.

I can’t imagine this will stay as-is. I’m sure Google will pick up on this, refine the Product Reviews algorithm and handle accordingly. Whether that requires another Product Reviews Update, or if it happens before then, I expect those pages to sink in the rankings over time. If I were running those sites, I would definitely look to improve the pages that are ranking well now. They are far from the Wirecutter Standard, which is what I recommended trying to achieve in my previous posts about the PRUs. That’s a good segue to an interesting case I’ve been working on, which I’ll cover next..

First, here are two examples of urls surging with the March PRU that contain low-quality review content. Actually, it’s not even review content, it’s more like a list of visuals and links. Notice how they surge out of nowhere during the March PRU.

Loophole with Google's Product Reviews Update.
Page suring with Google's Product Reviews Update with low quality content.

Interesting Case: Employing a Wirecutter Approach but missing the cutoff:
Just like with broad core updates, sites should look to improve their reviews content significantly, and over the long-term. Google is using machine learning to evaluate sites and content over time, so quick and small changes will not suffice. Taking Google’s best practices to heart and implementing big changes across your reviews content is the way to get out of the gray area. That’s why I have recommended taking a Wirecutter approach to producing reviews content. You can read more about that in my previous posts, but publishing killer content, based on extensive first-hand testing and use, supported by original visuals and multimedia content, is a very strong approach to employ.

But… it’s not easy. It takes an enormous amount of time, energy, resources, money, etc.

Well, I’m helping a client that got hammered by the April PRU and then saw a partial recovery with the December PRU, that took my comments about employing a Wirecutter approach to heart. After analyzing the site, the content, user experience, etc., we spoke a lot about the Wirecutter Standard, and the site owner was all in. Over the past few months, they have mapped out their testing process, targeted certain categories for using a Wirecutter approach, and have already published a number of review articles based on that process.

And those are killer pieces of content.

Although produced by a small team, the new content is outstanding, provides a wealth of insightful and helpful information about the products being reviewed, provides their own rating system based on the areas being reviewed, they have original photos, gifs, and video that support the content, and more.

But for the March 2022, they missed the “cutoff”. The content was published right before the March Product Reviews Update rolled out. Therefore, those new killer articles weren’t going to help much when the March PRU rolled out.

On that note, Google is on record that recent changes aren’t reflected in major algorithm updates. Google needs to see significant changes over time, and over the long-term. The site just didn’t have the time…

Via @johnmu: Major impact from an algo update wouldn't be from *recent* changes. For larger sites, it can take Google's algorithms a longer time to adjust to site changes. It could take several months to recrawl, reindex, & reprocess the site changes: https://t.co/p0VbFtfOO7 pic.twitter.com/Nrpiety72k

— Glenn Gabe (@glenngabe) May 8, 2018

And here is Google’s John Mueller explaining a similar situation in a recent hangout video (when asked if a recent change could have led to a drop from a broad core update). John explained that the information used for broad core updates is collected over the long-term. And the same applies to an update like the Product Reviews Update (which is using machine learning when evaluating content and sites):

The timing was unfortunate for my client, but we are super-excited to see the next PRU roll out. I’ll post more information about how that goes after the next Product Reviews Update. If my client keeps on publishing Wirecutter-like content, then I would imagine they will see nice gains. We’ll see.  

Testing Labs: Follow the leader and how review testing labs will continue to expand.
Regarding “testing labs”, I’ve already covered Wirecutter heavily in my other posts about the PRU, but it’s worth mentioning that Good Housekeeping and Verywell also have their own testing labs. You can check out more information about those efforts by following the links below, but if you are producing reviews content, then I highly recommend trying to emulate what those companies are doing.

I know it’s not easy to do, but it can help future-proof your reviews content. The more you can map out a detailed review process, the more you will organically cover what Google’s algorithms are looking for. For example mapping out a ratings scale, providing pros and cons, actually testing out products (first-hand experience), producing visuals that support the testing (photos, videos, gifs, etc.), so on and so forth.

Wirecutter: https://www.nytimes.com/wirecutter/blog/anatomy-of-a-guide/
Good Housekeeping Institute: https://www.goodhousekeeping.com/institute/about-the-institute/a19748212/good-housekeeping-institute-product-reviews/
Verywell Testing Lab: https://www.verywellfit.com/commerce-guidelines-and-mission-4158702

The Good Housekeeping Institute:

Good Housekeeping Institute

Wirecutter: The New York Times

Wirecutter Reviews by the New York Times

The Verywell Testing Lab:

The Verywell Testing Lab


Do you need to test each product you are reviewing? Is first-hand use and experience required?
Over the past several months, I’ve received questions from site owners about the importance of first-hand testing of products and how necessary that is moving forward (since some products are not easy to test or consume). For example, when the December 2021 Product Reviews Update rolled out, Google explained that “users have told us that they trust reviews with evidence of products actually being tested…” And they included a new best practice for site owners explaining just that.

Google's best practice about first-hand use and testing for product reviews.

But for some products or services, it’s not easy (or even possible sometimes) to actually test a product, consume a product, or use a service in order to gain first-hand knowledge of how they work. Given those challenges, what does Google say about the situation? Well, Google’s Alan Kent has provided more information via Twitter and I wanted to include that information below.

Alan explains that it’s not always necessary to test or consume a product in order to write a high quality review. But he does warn that site owners and affiliate marketers should not just spin a description from a manufacturer as the core review content.

He said don’t expect a big boost if you simply say you tested it yourself and basically paraphrase the manufacturer description. And in another tweet, Alan explained to think about how you can add to the current body of knowledge for a given product (while avoiding simply providing the specs for a product that’s supplied by the manufacturer).

Here are Alan’s tweets. The first was in response to a question about supplements (and if the people reviewing the supplements were required to have tried the actual products). Alan says no.

You can certainly create a useful review without eating the product. E.g. people know too much sugar is not good for you. But dont expect big boosts if the review only adds a few sentences saying "I tested it myself too" with the rest paraphrasing the original product description

— Alan Kent (@akent99) April 13, 2022

And the second tweet from Alan was in response to an observation that some sites are claiming to have a product testing lab, but a number of reviews don’t explain that the products were actually tested. Alan explained that contributing new information to the body of knowledge about a product would be smart, but just repeating specs from the manufacturer website doesn’t really add any value.

Another way to think about it is does the review contribute new information to the body of knowledge about the product? I could test a car tire using a machine instead of on my own car. But just repeating the specs from the tire website with different words adds nothing.

— Alan Kent (@akent99) May 5, 2022

My take on first-hand testing:
If you are going to thoroughly review a product, it’s a wise idea to actually test and use that product. Doing so can give you a much stronger understanding of how the actual product works, which can yield a much stronger review. It can also yield original photos and video of you testing the product, which can be extremely helpful for readers.

But for products or services you can’t easily test out yourself, then provide as much unique information as you can without simply spinning information that can be found elsewhere. Like Alan explained, see what you can add to the current body of knowledge for a product. Add as much value as you can for the reader.


A quick note about intent shifts. It’s not you, it’s Google.
In my post about the December Product Reviews Update, I mentioned that there were some intent shifts going on where e-commerce retailers started ranking for reviews content, and review sites dropped to page two or beyond. And on the flip side, sometimes when e-commerce retailers were ranking well, then an intent shift happened and reviews content started to rank higher (pushing the e-commerce retailers lower).

This was typically happening with head terms (so queries lacking “best”, “reviews” or “compare”). Well, we saw that again with the March PRU. The reason I bring this up is because sometimes it’s not your content that’s the problem. It could just be an intent shift, which you have no control over. I covered that in my post about the difference between relevancy adjustments, intent shifts, and overall site quality problems.

So, if you see a drop during the PRU, definitely run a delta report and determine the root cause of the drop. And if it’s an intent shift, you might not need to radically improve your content (if it’s already high quality, insightful, valuable, etc.)

Here is an example of an intent shift happening with the December Product Reviews Update and then reversing with the March PRU. The site had no control over this…

Intent shifts during Google's Product Reviews Update

Google’s dueling machine learning algorithms are… still dueling: And this needs to be addressed (IMO).
In my post about the December Product Reviews Update, I mentioned dueling machine learning algorithms and how that’s a problem for Google. That’s where sites either surged or dropped during broad core updates, and then saw the opposite movement with a Product Reviews Update.

Well, I saw more of that with the March Product Reviews Update. Sites that were impacted in June, July or November with broad core updates saw the opposite movement with the March PRU.

With that happening, Google is sending serious mixed signals to site owners. For example, is the site’s content high quality, or not? Only Google’s machine learning systems know. Muahahaha. :)

Dueling machine learning algorithms with Google's broad core updates and Product Reviews Updates

It’s also a good time to reiterate that Google is using machine learning with both broad core updates and the Product Reviews Update, so it’s not like they are using 10, 20, or even 100 factors. Google could be sending many more signals to the machine learning system and then letting the system determine weighting (and ultimately rankings).

Again, welcome to SEO. Bing has explained more about that in the past. Here is Fabrice Canel on how Bing is using machine learning with its core ranking algorithm. They send “thousands of signals to the machine learning system and it determines the weighting”. This is ultra-important to understand. I linked to the video from my tweet below.

How much does a certain factor matter for SEO? Via Bing's @facan We simply don't know. Bing is heavily using machine learning. We don't set the weighting. It's about sending thousands of features to the ML system & the system figures it out: (at 35:02) https://t.co/EiTktEFqx7 pic.twitter.com/HTzu9wkA5m

— Glenn Gabe (@glenngabe) November 9, 2020

Also, I do believe the Product Reviews Update will be incorporated into Google’s core ranking algorithm at some point (and that will be a good thing). In my opinion, you can’t have a major algorithm update focused on quality impact a site one way and then another algorithm update focused on quality reviews impact the site in the opposite way. That’s maddening for site owners and makes no sense. But before that happens, Google needs to expand the PRU to other languages beyond English. That hasn’t happened yet, so I believe that will happen first and then maybe the PRU gets baked into Google’s core ranking algorithm. Again, we’ll see.

Google's Product Reviews Update and expanding to other languages.

Keep your eyes peeled. Ignore user feedback at your own peril.
I’ve covered the power of user studies before (especially with regard to Google’s broad core updates). It can be incredibly powerful to hear directly from objective users as they browse your site, consume your content, etc. But sometimes you can gain some of that feedback without even running a user study.

For example, I was analyzing one site that was negatively impacted during the March PRU that had user comments on each review page. Well, the comments can be telling… I found several comments on articles hammering the quality of reviews or questioning the expertise of the authors.

For example, “the reviewer clearly doesn’t know what they are talking about”, “how about updating the article”, and more.

Here is an example of what that looked like. The image has been slightly edited to protect the innocent. :)

User comments as feedback for review site owners.

That is incredible feedback for the site owner and they should take it to heart. Most users will not spend the time to post a comment like that, so it must be really bad if they are leaving those comments. And I’m not saying Google is using those comments directly when evaluating reviews (although it could absolutely be one of the many signals that are being sent to the machine learning system). But if I was the site owner, I would take that feedback to heart and figure out what needs to be updated. Then move as quickly as possible to improve the content. And maybe running a full-blown user study would be a smart next step.

Links. Still not the end-all for the Product Reviews Update.
In my posts about the April and December 2021 Product Reviews Updates, I explained how links were not the end all. For example, some sites surging had weaker link profiles overall and some sites dropping had stronger link profiles. Basically, there was not a clear connection between the strength of the link profile and how the site performed with the Product Reviews Update. Again, that could be the impact of a machine learning system that takes many signals into account and determines weighting.

So has that changed with the March PRU?

Not really. I’m still not seeing a major connection between link profile strength and how review sites are performing during PRUs. Sure, some powerful sites are surging, but is it because of their link profile? There are plenty of examples of the opposite… For example, sites with much weaker link profiles surging as well. Anyway, it’s just worth noting since it’s clear that Google’s machine learning-based PRU algorithm is using many signals (and many of those signals seem more focused on the quality of content).

Here are two examples of sites surging during the March PRU with weaker link profiles:

Site with weak link profile surging during the March 2022 Product Reviews Update.
Site with weaker link profile surging during the March 2022 Product Reviews Update.

And here are two sites dropping during the March PRU with stronger link profiles:

Site with strong link profile dropping during the March 2022 Product Reviews Update.
Site with strong link profile dropping during the March 2022 Product Reviews Update.

Key takeaways and tips for affiliate marketers and site owners:

  • Internalize Google’s best practices: Read Google’s best practices and take them to heart. Internalize them and then form a plan of attack for improving your reviews content.
  • Run a user study: User studies are absolute gold for SEO. Leverage Google’s best practices for product reviews and craft tasks and questions. Then use a strong platform for user testing (like usertesting.com). Gain feedback, watch video, and listen to users. The results can be enlightening.
  • Strive to be the Wirecutter of your niche: As I mentioned in my previous posts about the Product Reviews Update, work to become the Wirecutter or Good Housekeeping Institute for your niche. Yes, it’s challenging to do that, but it can pay huge dividends down the line.
  • Give readers multiple buying options: Link to multiple sellers for purchasing products (beyond just Amazon). It’s a best practice from Google… even if they say it’s not being enforced (yet). It’s a smart way to future-proof your reviews content (and protect from subsequent negative PRU impact).
  • Invest in visuals: Provide original photography, video, and gifs supporting your reviews content. It’s a great way to provide users with a killer view of the products you are covering while also showing users how you actually tested the products. Google has explained it’s looking for these things (it’s a best practice), and it can set you apart from the crowd. You can also repurpose that multimedia content for use on social media (like YouTube, Tiktok, Instagram, etc.) It’s a win-win.

Summary: The PRU continues to evolve.
Google’s March 2022 Product Reviews Update was another powerful update for affiliate marketers. It was the third in the series, and we can expect more as the PRU continues to evolve. Like broad core updates, Product Reviews Updates roll out just a few times per year. Therefore, if you have been negatively impacted by the latest PRU, then I highly recommend forming a strong plan of attack. The more you can significantly improve your reviews content, and over the long-term, the better position you can be in when the next PRU rolls out. Good luck.

GG

Back to the top>>

Filed Under: algorithm-updates, google, seo

How NewsGuard’s nutritional labels can help publishers avoid manual actions for medical content violations (Google News and Discover)

April 15, 2022 By Glenn Gabe Leave a Comment

In July of 2021, Google issued a number of warnings for sites publishing medical content that went against its guidelines (for Google News and Discover). The potential for a manual action was clear and some publishers scrambled to figure out what to do.

I mentioned this on Twitter in September:

Google adds information to help docs about displaying Discover Manual Actions in GSC

I've seen several examples of Discover policy violation warnings since early July. Will manual actions follow soon? Time will tell. :) https://t.co/huFckYCTr8 via @tldrMarketing pic.twitter.com/kCKImHjnhC

— Glenn Gabe (@glenngabe) September 2, 2021

And six months from the warnings, manual actions arrived for sites that hadn’t cleaned up the problem. Here is my tweet from January when Google issued the manual actions:

Heads-up. Don't ignore Discover & Google News policy warnings in GSC. It might take 6 months or a year, but a manual action could follow. Had multiple publishers reach out this weekend about manual actions for Discover/Google News. E.g. misleading content, medical content, etc. pic.twitter.com/JutaP82HQL

— Glenn Gabe (@glenngabe) January 30, 2022

To clarify, these were manual actions for Google News and Discover, and not Search. And for the publishers receiving manual actions for medical content, the medical policy for News and Discover states that Google “doesn’t allow publishing medical content that contradicts or runs contrary to scientific or medical consensus and evidence-based best practices.”

And the manual actions in Google Search Console explained the following:

“Your site appears to violate our medical content policy and contains content primarily aimed at providing medical advice, diagnosis, or treatment for commercial purposes. Nor do we allow content from any site that contradicts or runs contrary to scientific or medical consensus and evidence-based best practices.”

So, if you are publishing medical content, and receive a manual action for violating that policy, News and Discover visibility can be negatively impacted. Again, Search should not be impacted by the manual action, but Google News and Discover visibility could decline.

For example, here is the Discover performance for one of the flagged articles for a publisher that received a manual action:

Google Discover performance for a page impacted by a manual action for medical content.

When digging into the articles being flagged by Google, it was super-interesting to see the connection between NewsGuard ratings and the organizations that were covered heavily in the articles. Below, I’ll cover more about NewsGuard and how it could be helpful for sites publishing health and medical content.

Interesting cases and the connection between flagged content and NewsGuard ratings:
In 2018, I wrote a post covering NewsGuard, which I called a proxy for Google’s quality raters. NewsGuard has a team of analysts (trained journalists) that review websites based on nine journalistic criteria, including credibility, transparency, and trust. They originally started by focusing on news organizations, but they have expanded to health and medical as well. For example, there is now a HealthGuard service that, “helps patients, healthcare workers, and anyone involved in the medical field identify trustworthy sources of health information — and avoid dangerous misinformation.”

Once a site is reviewed, NewsGuard produces a “nutritional label” rating the site, which can also appear in the search results if you are using its Chrome plugin. In addition, NewsGuard has relationships with a number of organizations (in several capacities). For example, Bing, Facebook, the American Federation of Teachers (AFT), the World Health Organization (WHO), and others have partnered with NewsGuard to fight disinformation. You can read more about their various partnerships on the site.

Although NewsGuard does have partnerships with several organizations for helping fight misinformation and disinformation, I want to be clear that Google does not use NewsGuard data in its algorithms. But like I explained in my first post, those ratings sometimes line up with how the sites perform in organic search (since Google is also trying to algorithmically surface the highest quality and most authoritative content on the web).

It’s important to understand that Google is on record explaining that its algorithms can be more critical when it comes to health and medical content. Here is a Twitter thread of mine that expands on that point. Again, this is super-important to understand for anyone delving into health and medical content.

Run a health/medical e-commerce site? Via @johnmu: Our algorithms are more critical for health/medical topics, so def. keep E-A-T in mind. Make sure the site represents a very high standard. i.e. High-quality content created by actual medical professionals https://t.co/aiMrdN9Hl7 pic.twitter.com/Nuz3K7Pi6o

— Glenn Gabe (@glenngabe) March 27, 2021

For example, here is a health site with a horrible nutritional label from NewsGuard. The site has gotten hammered during broad core updates over time. Again, it’s not because of NewsGuard… it’s just interesting how they line up:

Health and medical site that dropped over time during Google's broad core udpates.

Cross-referencing organizations via NewsGuard based on manual actions for medical content:
For organizations receiving manual actions for medical content (News and Discover), I was interested in cross-referencing NewsGuard to see what the nutritional labels looked like for the organizations being covered (and promoted) in those flagged articles.

And to clarify, it’s not about simply mentioning sketchy organizations that would get content flagged. It’s more about the core of the article being about those organizations (including promoting their views). That’s exactly what the articles were doing that were flagged.

So what did the nutritional labels look like for those organizations being covered? They weren’t good. Not good at all… Here are two examples based on content getting flagged.

Here’s the first site’s label:

NewsGuard nutritional label with extremely poor ratings for a health and medical site.

And here’s the second site’s label:

NewsGuard nutritional label with poor ratings for a health and medical site.

And here is what one of the sites look like in the search results (when you are using the NewsGuard Chrome extension):

NewsGuard rating in the search results for a site with poor ratings.

When you hover over the NewsGuard icon (the red shield), you can view an overlay with more details. And that overlay contains a link to the full nutritional label on the NewsGuard website.

NewsGuard overlay with more information from a site's nutritional label.

When you visit the nutritional label on NewsGuard’s website, you can find all of the details about why the site received those ratings (and by category). And that includes all of the sources that were cited and referenced in their findings. For example, you can view CNN’s nutritional label here (just to get a feel for what one looks like, review the ratings by category, the sources section at the end, etc.)

Note, the site I mentioned that received the manual action is a large-scale publisher with millions of pages indexed, so most of the content would not fall into this category (covering organizations and views that go against Google’s guidelines). But, they do have some… and they were flagged by Google.

When discussing this situation with the site’s leadership, I explained that having some checks in place would be smart for understanding the risks involved with publishing certain pieces of content. And in my opinion, NewsGuard could be one of those checks.

Utilizing NewsGuard as a check during the publishing process:
So, if you are a site publishing health and medical content, then I would definitely put some checks in place to ensure you don’t receive a manual action for medical content. One solid approach could be adding checks using the NewsGuard plugin (which links to the nutritional labels). If you see red all over the label, you might want to be more cautious (or at least dig in further to learn more about that organization’s views).

For example, if the publisher I’m covering in this post that received the manual action checked NewsGuard before publishing that content, then they probably wouldn’t have published it at all (as long as they understood Google’s policies around medical content for News and Discover). Again, it’s a large-scale publisher with millions of pages indexed. A NewsGuard check could have raised red flags during the editing process.

Note, NewsGuard obviously doesn’t have labels for every site on the web, but understanding the ratings based on the organizations that have been reviewed is a good idea. Again, it was interesting to see the connection between some manual actions for medical content and the sketchy nutritional labels for those organizations being promoted in those articles. Like I explained in my original post about NewsGuard, it’s like a proxy for Google’s quality raters. So in my opinion, it’s smart to check those nutritional labels before publishing.

GG

Filed Under: algorithm-updates, google, manual-actions, seo

  • 1
  • 2
  • 3
  • …
  • 15
  • Next Page »

Connect with Glenn Gabe today!

Latest Blog Posts

  • How to compare hourly sessions in Google Analytics 4 to track the impact from major Google algorithm updates (like broad core updates)
  • It’s all in the (site) name: 9 tips for troubleshooting why your site name isn’t showing up properly in the Google search results
  • Google Explore – The sneaky mobile content feed that’s displacing rankings in mobile search and could be eating clicks and impressions
  • Bing Chat in the Edge Sidebar – An AI companion that can summarize articles, provide additional information, and even generate new content as you browse the web
  • The Google “Code Red” That Triggered Thousands of “Code Reds” at Publishers: Bard, Bing Chat, And The Potential Impact of AI in the Search Results
  • Continuous Scroll And The GSC Void: Did The Launch Of Continuous Scroll In Google’s Desktop Search Results Impact Impressions And Clicks? [Study]
  • How to analyze the impact of continuous scroll in Google’s desktop search results using Analytics Edge and the GSC API
  • Percent Human: A list of tools for detecting lower-quality AI content
  • True Destination – Demystifying the confusing, but often accurate, true destination url for redirects in Google Search Console’s coverage reporting
  • Google’s September 2022 Broad Core Product Reviews Update (BCPRU) – The complexity and confusion when major algorithm updates overlap

Web Stories

  • Google’s December 2021 Product Reviews Update – Key Findings
  • Google’s April 2021 Product Reviews Update – Key Points For Site Owners and Affiliate Marketers
  • Google’s New Page Experience Signal
  • Google’s Disqus Indexing Bug
  • Learn more about Web Stories developed by Glenn Gabe

Archives

  • March 2023
  • February 2023
  • January 2023
  • December 2022
  • November 2022
  • October 2022
  • September 2022
  • August 2022
  • July 2022
  • June 2022
  • May 2022
  • April 2022
  • March 2022
  • February 2022
  • January 2022
  • December 2021
  • November 2021
  • October 2021
  • August 2021
  • July 2021
  • June 2021
  • April 2021
  • March 2021
  • February 2021
  • January 2021
  • December 2020
  • November 2020
  • October 2020
  • September 2020
  • August 2020
  • July 2020
  • June 2020
  • May 2020
  • April 2020
  • March 2020
  • February 2020
  • January 2020
  • December 2019
  • November 2019
  • October 2019
  • September 2019
  • August 2019
  • July 2019
  • June 2019
  • May 2019
  • April 2019
  • March 2019
  • February 2019
  • January 2019
  • December 2018
  • November 2018
  • October 2018
  • September 2018
  • August 2018
  • July 2018
  • June 2018
  • May 2018
  • April 2018
  • March 2018
  • February 2018
  • January 2018
  • December 2017
  • November 2017
  • October 2017
  • September 2017
  • August 2017
  • July 2017
  • June 2017
  • May 2017
  • April 2017
  • March 2017
  • February 2017
  • January 2017
  • December 2016
  • November 2016
  • October 2016
  • August 2016
  • July 2016
  • June 2016
  • May 2016
  • April 2016
  • March 2016
  • February 2016
  • January 2016
  • December 2015
  • November 2015
  • October 2015
  • September 2015
  • August 2015
  • July 2015
  • June 2015
  • May 2015
  • April 2015
  • March 2015
  • February 2015
  • January 2015
  • December 2014
  • November 2014
  • October 2014
  • September 2014
  • August 2014
  • July 2014
  • June 2014
  • May 2014
  • April 2014
  • March 2014
  • February 2014
  • January 2014
  • December 2013
  • November 2013
  • October 2013
  • September 2013
  • August 2013
  • July 2013
  • June 2013
  • May 2013
  • April 2013
  • March 2013
  • February 2013
  • January 2013
  • December 2012
  • November 2012
  • October 2012
  • September 2012
  • August 2012
  • July 2012
  • June 2012
  • May 2012
  • April 2012
  • March 2012
  • GSQi Home
  • About Glenn Gabe
  • SEO Services
  • Blog
  • Contact GSQi
Copyright © 2023 G-Squared Interactive LLC. All Rights Reserved. | Privacy Policy
This website uses cookies to improve your experience. Are you ok with the site using cookies? You can opt-out at a later time if you wish. Cookie settings ACCEPT
Privacy & Cookies Policy

Privacy Overview

This website uses cookies to improve your experience while you navigate through the website. Out of these cookies, the cookies that are categorized as necessary are stored on your browser as they are essential for the working of basic functionalities of the website. We also use third-party cookies that help us analyze and understand how you use this website. These cookies will be stored in your browser only with your consent. You also have the option to opt-out of these cookies. But opting out of some of these cookies may have an effect on your browsing experience. You can read our privacy policy for more information.
Cookie Consent