In part one of this series, I covered an overview of the March 7 and April 16 updates, what I think could be going on with relevance and quality, what Google has explained about the update, and more. Now in part two, I’m going to provide examples of specific drops and surges I analyzed based on the April 16 update, including some reversals.
I’m not going to name any of the sites, since my intention is not to out any company impacted by these updates. Instead, I will explain how much they surged or dropped and provide interesting observations about why the drops or surges could have occurred (which, in my opinion, includes both relevancy and quality). And then I’ll end with what I believe sites that were impacted can do now.
Remember, you don’t want to just sit there after seeing a major drop in traffic due to one of these updates. I explained more about Google’s comments regarding that in part one of the series and I’ll include more about this later in the blog post.
First, another quick disclaimer:
I do not work for Google, I don’t have access to its core ranking algorithm, I don’t have a hidden camera in the search engineers’ cafeteria, and I haven’t infiltrated Google headquarters like Tom Cruise in Mission: Impossible. Note, I have been to Google HQ in both New York and Mountain View, but I didn’t drop from the ceiling with the help of Barry Schwartz. :)
I’m just explaining what I’ve seen after analyzing many sites impacted by algorithm updates over time, and how that compares with what I’m seeing with the latest Google updates. I have access to a lot of data across sites, categories, and countries, while also having a number of sites reach out to me for help after seeing movement from these updates. Again, nobody knows exactly what Google is refining, other than the search engineers themselves.
With that out of the way, let’s begin. In no specific order, let’s run through various examples of impact from the April 16, 2018 Google algorithm update (surges, drops, and some reversals). I think you’ll see why I keep hammering relevance AND quality after reading through the examples.
Examples of impact:
UX, Keyword Stuffing, Doorway Pages, and more
An ecommerce site that’s a key player in its niche saw a search visibility drop of 34% during the April update. Digging into the drop, there were several things that stood out. First, the main content is being pushed down the page based on UI modules at the top that don’t contain primary content (so users have to scroll to get to the main content). There was also a keyword stuffing component near the top (before the main content). Each page’s core keywords were repeated unnaturally and that was clearly for SEO purposes.
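To illustrate the idea, here’s a minimal Python sketch of the kind of repetition check you could run on copy that sits above the main content. The snippet and the 4-letter cutoff are hypothetical choices for illustration — a real audit should parse the rendered HTML and account for stop words:

```python
import re
from collections import Counter

def keyword_density(text, top_n=5):
    """Rough repetition check: count terms of 4+ letters and report
    the most frequent ones with their share of all counted terms."""
    words = re.findall(r"[a-z]{4,}", text.lower())
    counts = Counter(words)
    total = sum(counts.values()) or 1
    return [(w, c, round(c / total, 3)) for w, c in counts.most_common(top_n)]

# Hypothetical snippet from above the main content, stuffed with the core keyword.
snippet = "Blue widgets. Cheap blue widgets. Best blue widgets. Buy blue widgets today."
for word, count, share in keyword_density(snippet, top_n=2):
    print(word, count, share)
```

A share that dwarfs every other term for the page’s core keyword is the kind of unnatural repetition described above.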
Content-wise, the product listings were actually spot-on. There was no problem with the products being provided per query at all. I would say the products were extremely relevant to a number of the queries I was checking.
And then I opened a door, or several…
Yep, the site was employing doorway-like pages. Slight variations of the queries based on facets yielded indexable pages optimized for all of those query variations. And this was happening across the entire website.
The site was basically trying to rank for every possible keyword variation they could (probably based on a combination of keyword research, queries leading to the site, and actual on-site searches). I’ve seen this before many times over the years with a number of ecommerce sites.
To add insult to injury, the dreaded “stuffed category description at the bottom of the page” was present on all of the variations. Many ecommerce sites still employ this tactic, when John Mueller has clearly stated not to do this… If you have unique and original content that can help users, why in the world would you stuff it at the bottom of the page? And if you read the content, does it even sound natural? If not, refine it to be helpful or just nuke it.
Here is a video of John Mueller explaining this (at 56:30 in the video):
That’s the quality situation based on a top-level look, but there was a relevance sighting as well! And a serious one.
Exporting the queries where the site dropped by at least five positions revealed the relevance situation. Many of the queries I was checking could not be answered well by the page that ranked (the site couldn’t meet or exceed user expectations). Checking the pages that leapfrogged this site revealed they could answer the question much better.
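That export-and-compare step can be sketched in a few lines of Python. The query names and positions below are made up for illustration — real data would come from Search Console performance exports before and after the update date:

```python
# Hypothetical "before" and "after" average positions per query,
# e.g. pulled from two Google Search Console performance exports.
before = {"celebrity name": 3.0, "generic term": 2.0, "core product": 4.0}
after = {"celebrity name": 21.0, "generic term": 6.0, "core product": 4.5}

def queries_dropping(before, after, min_drop=5.0):
    """Return queries whose average position worsened by at least min_drop
    (a higher position number means a lower ranking), biggest drop first."""
    drops = []
    for query, pos_before in before.items():
        pos_after = after.get(query)
        if pos_after is not None and pos_after - pos_before >= min_drop:
            drops.append((query, pos_after - pos_before))
    return sorted(drops, key=lambda x: -x[1])

print(queries_dropping(before, after))
```

Reviewing the resulting list query by query — and comparing your landing page against the pages that leapfrogged you — is where the relevance picture emerges.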
So, Google was originally ranking the site for queries that the site had no right ranking for. I saw some drops of 5-10 positions, while others dropped off a cliff.
As a quick example, the site was ranking for a celebrity name, when that celebrity had a line of products related to the site content. That’s clearly NOT what most people would be looking for when searching just for the person’s name. Most are looking for information about the celebrity, not for the product they created. The site dropped for that query heavily. The top 10 is now filled with sites that provide in-depth celebrity information.
As another quick example, the site ranked for a generic keyword, when the page that was ranking was about a very specific aspect of that topic. Again, most people would not be looking for that niche article. The site dropped several pages after ranking at the top of page one for the query. This was clearly a relevance adjustment.
Improving quality over the long-term, enhancing technical SEO, and relevance NOT a big factor
A large-scale site in the food niche, operating in over 35 countries, saw nice gains during the April 16 update. And that’s after seeing some drops in certain countries during the March 7 update. This is a site I’ve been helping for a number of months already and it was great to see them surge.
The site has worked hard on increasing quality overall since early 2018. That included surfacing and nuking low-quality and thin content, properly handling an infinite spaces situation, improving the user experience, and more. In total, the site has removed millions of low-quality pages from Google’s index, while improving content quality, user experience, etc. And again, from a technical SEO standpoint, there were many little gremlins running around the site. And when you’re dealing with a large-scale site (30M+ pages indexed), little gremlins can turn into big problems.
The site surged on April 16 in a number of countries. You can see some of the surges below:
Surge of 34% since April 16 update:
Surge of 42% since April 16:
Surge of 165% since April 16 update (but based on lower traffic than the other countries):
In my opinion, this surge had much more to do with quality than relevance. The site fixed so many quality problems since early 2018 that it’s hard to ignore the impact that could have had. Also, checking the competition, the sites being leapfrogged also had relevant content… It’s hard to say that my client’s content was significantly more relevant than the others. If relevance had anything to do with this surge, then it was a minor factor (in my opinion).
And don’t underestimate the power of reeling in an infinite spaces problem. This specific situation was impacting over 10M URLs on the site and causing serious index bloat (with low-quality or thin content). My client was all over that situation across countries once I surfaced the issue. That was a huge win on several levels (quality, crawlability, quality indexation, performance, etc.).
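As a rough illustration of how an infinite spaces problem surfaces in crawl data, here’s a sketch that groups a crawl sample by path once query strings are stripped. A path spawning far more parameterized variations than it has real content is a candidate. The URLs are hypothetical:

```python
from urllib.parse import urlparse
from collections import Counter

# Hypothetical crawl sample: one path spawning endless parameter combinations.
crawled = [
    "https://example.com/recipes?sort=new&page=1",
    "https://example.com/recipes?sort=new&page=2",
    "https://example.com/recipes?sort=top&page=1",
    "https://example.com/recipes?filter=vegan&sort=top",
    "https://example.com/about",
]

# Count distinct parameterized variations per path. On a real site you would
# feed in millions of crawled URLs and sort by variation count.
variations = Counter(urlparse(u).path for u in crawled if urlparse(u).query)
for path, count in variations.most_common():
    print(path, count)
```

Once the offending parameter combinations are identified, they can be handled via canonicalization, noindex, or robots.txt as appropriate for the situation.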
Large-scale site, B2C Publisher
A key player in its niche saw a reversal from the March 7, 2018 update. It saw an initial drop of 20% on March 7, 2018 and then a surge of 31% on April 16, 2018. It now has greater search visibility than it did prior to 3/7.
This specific situation looked much more about relevancy than quality. For the queries that dropped, and then recovered, the content was indeed relevant. And the site overall is one of the most relevant in its niche. When digging into the queries where the site dropped in positions (and didn’t recover), it was easy to see that the content was clearly not as relevant.
For example, there were queries where Google has to determine intent (disambiguation), and those queries used to lead to pages on the site where only a small percentage of users would find the content helpful. Many of those rankings didn’t return on April 16.
It’s also worth noting that a number of pages that do rank well now, but initially dropped on March 7, are category pages for a topic. Those category pages link to relevant and helpful articles about that topic. For some reason, they got caught in the crosshairs during the first update in March. Either the site is sitting in the gray area… or Google made some adjustment that pulled the site into the white. Just a side note.
Large-scale B2C publisher (YMYL)
The next site I’ll cover is also a behemoth in its niche. It dropped initially during the March update, and then more during the April 16 update (for a total loss of 21% in search visibility). Like several of the others I’m covering, there definitely seemed to be a mix of quality and relevance at play.
First, the UX is tough on the site, with ads squeezing the main content. You almost get claustrophobic while browsing the site. There were a number of thin and lower-quality posts ranking for keywords as well.
And many of the topics would fall under “Your Money or Your Life” (YMYL). You can read more about YMYL topics in the Quality Rater Guidelines (QRG), but Google holds those sites to a higher standard. That’s because they cover sensitive topics like health, finance, legal, and others that can impact the overall happiness of a user. For a number of sensitive queries, the site dropped off page one, and sometimes much lower. And the sites moving up were some of the most powerful sites in health.
YMYL information from Google’s Quality Rater Guidelines (QRG):
From a quality perspective, there were thinner posts, aggressive advertising, autoplay video, a weird app install popup, and more. And a large portion of the blog contained lower-quality posts. Those blog posts definitely did not match what you would expect to see on a site targeting YMYL topics. Note, the blog contained over 30K posts. Remember, Google takes every page indexed into account when evaluating quality. So adding many low-quality posts on the blog is not helping matters…
Here is a video of John explaining this (at 10:06 in the video):
From a relevancy standpoint, the site was definitely ranking for some queries it had no right ranking for. In my opinion, only a small percentage of users would be happy with the content based on the query. Instead, most users would be searching for content that’s directly related to the query. And similar to what I covered earlier, the site ranked for some entity names (companies, organizations, and people), when it had no right doing so. Again, users would want thorough information about those entities versus articles about a very specific topic that involved those people or companies.
Same (or similar) content, huge risk, razor-thin differences
The next site I’ll cover is in the entertainment industry and it saw a 34% drop in search visibility during the 4/16 update. There were a number of sites that got hammered during the April 16 update that simply contained the same content as other sites (or very similar information). For situations like this, the difference between your site ranking well, or another, is razor thin. This is why Google has said many times that you need to provide some type of value-add. Don’t simply slap the same content on your pages and expect strong traffic to remain indefinitely.
After checking several of these sites after the April update, there wasn’t a massive difference between the ones surging and dropping quality-wise or relevance-wise. Sure, there were definitely some quality issues and aggressive advertising issues on several sites, but it wasn’t as clear as other examples.
It’s worth noting that John Mueller recently explained that great functionality alone will not suffice from a quality perspective. You still need high-quality content present on your pages and just slapping a cool piece of functionality on the site will not help quality-wise. Well, that fit for this situation, since the content is the same across competitors (and pulled from the same core set of data sources).
Here’s a video of John explaining this (at 25:35 in the video):
E-A-T galore, clean, well-organized, expert authors
The next site I’ll cover is a major health site (YMYL) that surged 37% during the April 16 update. This site has enormous amounts of E-A-T (expertise, authoritativeness, and trust). For example, top-notch content written by experts in their niche (with advanced degrees and multiple professional certifications). In addition, the content is reviewed by other experts in the niche, which adds another layer of trust to the equation.
There are also videos from experts in the niche and those videos are prominently displayed on each page (near the top of the main content after a solid introduction to the topic).
From a user experience and advertising perspective, the site does a good job at balancing monetization and user experience. The ads are clearly labeled and do not inhibit the user experience at all. I didn’t see ad deception at all and the site didn’t employ popups, interstitials, autoplay video, or anything like that, which can be extremely frustrating for users.
The site surged for many competitive keywords. You can see a screenshot below with some of the surges:
Low-quality user experience – UX nightmare mixed with aggressive and deceptive ads
Next, I’ll cover another entertainment site that took a 55% hit in total based on both the March and April updates.
The site provides a tough user experience (and aggressive advertising setup).
On desktop, there was a giant video at the top of the page with unrelated content pushing the main content down the page (so users had to scroll to see the primary content). To add insult to injury, the video with unrelated content also had pre-roll ads. So, users might think the video contained information that was directly related to the main content, and they might sit through a thirty-second ad, only to find out the video had nothing to do with the main content.
How would you feel if that happened??
In addition, there were affiliate links right under the video. So, the site had unrelated videos first (with pre-roll ads), and then affiliate links closely tied to the video with no disclosure of the relationship between sites. That definitely falls into the deception category to me. And then woven into the main content were other videos, with some not even playing/working. And when you tried to play some of the videos, you were whisked off the site to third-party advertiser sites. I can only imagine there were many frustrated people feeling tricked by elements like that.
On mobile, you are smacked with a popup as soon as you hit the page, then the affiliate links mentioned earlier, double ads that were overlapping (which I hope was a technical glitch), and more. This was a UX nightmare mixed with aggressive and deceptive ads. And the site took a huge hit.
Double ads being displayed (either by accident or for monetization purposes):
Aggressive conversion tactics, doorway pages, false authority signals
The next site saw a 48% drop in search visibility based on both the March 7 and April 16 updates.
This was another site with an initial drop on March 7, and then a bigger drop on April 16. Based on analyzing the site and the drop, I believe quality problems were a much bigger factor than relevancy issues.
First, many queries led to pages that were extremely aggressive conversion-wise. There was a giant lead-gen form above the main content requesting user information. The user had to scroll to see the main content on all of the pages. And it was even worse on mobile.
Then there was a doorway page situation (based on city and state). The content was identical, but many pages were optimized for city/state combinations. And I mean MANY of them. This included zip code over-optimization as well (an overload of zip codes present on the page).
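A quick way to surface doorway-style duplication like this is to compare page bodies pairwise for near-identical text. The pages, copy, and threshold below are hypothetical, and on a large site you’d compare shingles or hashes rather than running full-text ratios across every pair:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical page bodies: near-identical copy with only the city swapped.
pages = {
    "/plumber-austin-tx": "Need a plumber? Our team serves Austin with 24/7 service.",
    "/plumber-dallas-tx": "Need a plumber? Our team serves Dallas with 24/7 service.",
    "/pricing": "Transparent pricing for all plumbing jobs, parts and labor.",
}

def doorway_pairs(pages, threshold=0.8):
    """Flag page pairs whose text similarity exceeds the threshold --
    a rough signal of doorway-style duplication, not proof by itself."""
    flagged = []
    for (a, text_a), (b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio >= threshold:
            flagged.append((a, b, round(ratio, 2)))
    return flagged

print(doorway_pairs(pages))
```

Pairs that differ only by a swapped city, state, or zip code are exactly the pattern described above.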
The site also included the old-school approach of linking to Wikipedia from many pages. There was no reason to link to Wikipedia on these pages other than to send some false signal of authority. John Mueller has covered this several times in his webmaster hangout videos.
Here is a video of John explaining this (at 36:52 in the video):
Relevance-wise, there were definitely some adjustments made (and they were the right ones). For example, the site used to rank for a top-level search for an online tool. The page dropped, which makes sense, since the site employed a very specific use-case for the tool. Overall, I would imagine most users would not be happy with the content. That’s just a quick example, but it matches the other relevancy adjustments made on the site.
The same content, previous algorithm update hits, many entity queries dropped, and relevance
The next site saw a search visibility loss of 32% between both the March and April updates. The site dropped during the March update and then dropped more during the April update. The site is in a niche where the differences between sites are razor thin (as covered earlier). Most contain very similar information, which is always risky from an SEO standpoint. Google’s John Mueller has explained many times that if you’re in this situation, then you better provide a serious value-add. Don’t be marginally better than your competition. Blow them out of the water.
From a history standpoint, the site has lived in the gray area of quality for some time. I see surges and drops during medieval Panda updates, with a big surge during Panda 4.1. Then they got smoked by the February 7, 2017 update, which was a huge quality update (one of the biggest we’ve seen actually).
When checking the queries that dropped, many were entity-based and not directly about what the site provides. Therefore, this fell into the relevancy bucket for sure. That’s a tough situation for any site, since they probably shouldn’t have been ranking for the queries to begin with. Now the site has dropped to page two or beyond for many of the keywords.
For the site’s core queries, I did still see them ranking well for a number of important keywords. It’s also worth noting that many of the SERPs for the entity-based queries only contained seven listings. So, the site is working in a tough niche… You are shooting for one of the top seven spots versus ten. So even if you drop a few spots, you could easily be on page two. And we all know what’s on page 2… :)
Misc. other surges and drops:
Due to the size of this post, I’ll provide some additional examples below, but in CliffsNotes form. I have analyzed many sites impacted by these updates and could write a book about the volatility… Here are some additional examples of impact from the 4/16 update:
News site, massive surge
This wasn’t a news publisher, but it can definitely be categorized in the news niche. The site clearly provides what most people would be looking for based on the queries that surged. The functionality on the site is strong, including a number of easy-to-use features that are extremely valuable for visitors. The site has a very clean interface and no ads at all (it’s a quasi-free site with a subscription-based model for pro accounts). On top of everything I explained, the site also has a very strong link profile, with nearly seven million inbound links, many from trusted sources.
Relevance adjustments + Poor UX + Aggressive Ads
Well, that’s not the best combination in the world… You might say it’s a killer recipe from a Google algorithm update standpoint (literally). A site that got hammered during the 4/16 update was ranking for queries when it didn’t have the best content addressing user needs. In addition, there were non-relevant video clips playing above the main content. They weren’t ads, but the videos were only tangentially related to the main content. I covered a similar situation earlier.
In addition, there were many ads woven into the main content (which is a huge deception problem). Several of the ads didn’t contain ad labels, which could end up driving users downstream to advertiser sites when they had no idea that would be happening. And the situation was even worse on mobile.
Food site, UX issues, blocked main content, quality reviews not indexable
A major player in its niche, this site got hammered by the April 16 update. When checking core pages that dropped during the update, I noticed parts of the main content were actually blocked (not visible on-load and hard for the user to access). There were also a number of UX barriers on the site that inhibited users from traversing content. It’s never good to do that…
In addition, the site has a boatload of quality reviews, but those reviews are blocked from indexing. Google has explained that reviews form part of the content on the page, and therefore, can contribute to the quality of the page (and site). If you have well-moderated reviews with strong content, then don’t block the reviews. They can help your site quality-wise. But make sure they are heavily moderated… just like they can help you, they can easily hurt you too.
Powerful site, weaker main content across many pages
Another site, this time in the automotive niche, got hit hard as well. Digging into the drop, the queries, and landing pages revealed main content lacking enough solid information. A number of the top pages were thinner and weaker than competitors. And this is a very competitive niche.
In competitive categories like this, there’s often a fine line between sites surging and dropping. Some of the sites improving had better content and possibly a stronger user experience, but they weren’t head and shoulders above the rest. Just an important side note that SEOs and site owners need to be aware of.
A site that tanked during the 4/16 update contained deceptive ads all over the place. And worse, they were often ads that looked like download buttons with “download now” or “view pdf” calls to action. The ads almost tricked me several times, and I’m neck deep in this stuff!
The Quality Rater Guidelines (QRG) provide a lot of information about deception and how sites employing deceptive ads should be rated low-quality. In addition, the situation was worse on mobile. Ads were shifted up and above the main content so users had to scroll to get to that content. After heavily reviewing the site, the aggressive and deceptive advertising issue seemed to be the biggest problem (in my opinion), although there were other content-related issues as well (just not as big of a problem as the deception situation).
I’ll stop the examples here, since this post is getting too long… After reading through the examples above, I hope you can see why I believe that both relevance and quality played a factor with the March and April updates.
Next, I’ll cover what site owners can do now to either improve their sites, or to bolster their current rankings (if they surged and are doing well after the March and April updates). On that note, just because you surged doesn’t mean you will always stay there. Actually, a number of the sites that dropped during these updates have done very well in the past SEO-wise… Just an important side note.
Google’s advice: There’s no “fix”, but keep building great content.
I covered this in part one of the series, but it’s worth mentioning again. Many have asked me what I believe site owners should do if they experienced a drop during the March or April updates. They explain that Danny Sullivan said to “do nothing”. Well, he actually didn’t say that. Danny just said there’s no “fix”, as in fixing a meta tag and suddenly you’re back. Danny did say to remain focused on building great content. And if you do, then it might be your site that jumps up during the next update.
Just like with any major core ranking update, Google takes many factors into account. There’s never one smoking gun, there’s usually a battery of them. Therefore, I would not sit and wait it out…
Actually, John Mueller has said this a number of times. If you’ve been negatively impacted by a major algorithm update, don’t wait to see how things go. Make changes to improve your site. That could be changes to content, technical SEO, aggressive or disruptive advertising, user experience (UX), and more. That’s what recovery is all about. It’s never about finding one smoking gun that caused your site to tank (which rarely happens). It’s about fixing all quality and engagement issues you can find, fixing them as quickly as possible, and then driving forward.
Here is a video of John explaining this (at 30:14 in the video):
Should you roll back recent changes?? No, it wasn’t your last release that caused problems…
It’s also worth noting that Google’s John Mueller said in a recent webmaster hangout video that recent site changes would not be the cause of impact from a major core ranking update. And that’s especially the case with large-scale sites. It can take a lot of time for Google to recrawl, reindex, and then reprocess those changes. So, your release from last week probably did not cause your site to tank. It was probably the months of bad things going on that caused your site to drop. That’s an important point.
Here is a video of John explaining this (at 47:47 in the video):
Moving Forward: What site owners can do now:
If you’ve seen a drop due to the March or April updates, then I recommend addressing both quality and relevance. From a quality standpoint, my recommendations are similar to what I’ve recommended for a long time. You need to significantly improve quality over the long-term. Again, significant and long-term are the key phrases here.
You will probably NOT recover in a few weeks, or even in a month or two. Instead, you need to surface and then fix any potential quality problem across your site. Remember, quality can mean many things, including low-quality content, thin content, user experience barriers, aggressive and disruptive advertising, technical SEO problems that can cause quality problems, and more.
Also, it’s important to make the right changes and keep them in place for the long-term. Don’t reverse course three weeks into making changes because you’re not seeing improvement. Remember, Google’s John Mueller has explained several times that Google wants to see significant improvement over the long-term. It can easily take months to see positive movement during subsequent updates, and I’ve seen it take six months or longer. If you revert important changes, you will have no idea if they would help you long-term.
You’ll be in a maddening cycle of making changes, reverting those changes, making more changes, reverting those changes, etc. DON’T DO THAT. Make the right changes from the start and stick with them. Then drive forward by creating high-quality content, naturally building links, and always maintaining the best technical SEO setup you can.
Here’s a quick bulleted list of things you can start on today:
- Crawl Analysis and Audit – Perform a thorough crawl analysis of your site and surface ALL potential quality problems. Root out low-quality content and thin content, technical SEO problems that could be causing quality problems, user experience barriers, aggressive and disruptive advertising, deceptive tactics that can cause user frustration, and more. I have rarely seen collateral damage when it comes to major algorithm updates. So, find the areas of your site to improve, and improve them as quickly as you can.
- Analyzing Queries and Content That Dropped – Check the queries and content that dropped in rankings and objectively review the situation. If you were a user who knew nothing about your site, would the content meet or exceed user expectations? Does the competition have better content? Is it written by people with more experience? Does it contain better supporting content? Is it more focused on the core topic than your content? Based on what you learn by doing this, you can absolutely make changes to your content (to improve the situation).
- User Testing – Perform real user testing. Don’t simply invite your mother, spouse, children, or best friends from college to review your site. Instead, invest in having a number of people objectively traverse your site with a goal in mind. Listen to their feedback, their concerns, what they liked, what they disliked, and what they absolutely hated. You can learn a lot by listening to real users trying to accomplish a task on your site. Then make changes based on their feedback. Don’t just share the results with your team and over-analyze them. Move and make changes.
- The QRG – Read the Quality Rater Guidelines (QRG) several times, and have everyone on your team read it too. I’ve said this a thousand times over the past few years… The QRG contains 160 pages of SEO gold. In the QRG, Google clearly explains what its raters should deem high-quality versus low. You can read about low quality content, aggressive advertising, Your Money or Your Life (YMYL) topics, E-A-T (expertise, authoritativeness, and trust), and much more. I recommend downloading the QRG, sharing it with your team, and having 30-minute sessions at lunch where you go over each section. I can’t tell you how much overlap I’m seeing between what’s contained in the QRG and what I’m seeing in the field.
- GSC’s Index Coverage Report – Don’t forget to heavily analyze Google’s new index coverage reporting. It’s like index status from the old GSC, but on steroids. It contains actionable data based on Google crawling your site. It can help you understand which pages Google is indexing, and which pages it’s not indexing. And keep a close eye on the “Excluded” reporting. That’s where you can often find serious problems. It contains pages that Google has crawled, but decided NOT to index for some reason. For example, you can find canonical problems, robots.txt issues, soft 404s, duplicate content, URLs submitted in XML sitemaps with problems, and much more.
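As a small illustration of the crawl-analysis step above, here’s a sketch that flags indexable pages below a word-count floor in a crawl export. The CSV, its column names, and the 200-word floor are all hypothetical — treat the output as a starting point for a thin-content review, not a verdict on its own:

```python
import csv
import io

# Hypothetical crawl export (e.g. from a site crawler) with URL,
# word count, and indexability per page.
crawl_csv = """url,word_count,indexable
/guide/long-form-article,1850,true
/tag/widgets-page-57,42,true
/search?q=blue,12,true
/about,600,true
"""

def thin_indexable_pages(csv_text, min_words=200):
    """Surface indexable pages below a word-count floor so a human
    can review them for thin or low-quality content."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["url"] for row in reader
            if row["indexable"] == "true" and int(row["word_count"]) < min_words]

print(thin_indexable_pages(crawl_csv))
```

Word count is a blunt instrument — a short page can still be high-quality — but it quickly narrows millions of URLs down to a reviewable list.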
Summary: The March and April Updates Were Big. Relevance AND Quality Stood Out
The March 7 and April 16, 2018 updates were significant core ranking updates from Google. Many sites were impacted on those dates, across categories and countries. Based on my analysis, you could see a number of factors at play, including both quality and relevance. If you have been impacted by these updates, I would not just sit and wait. As Google’s John Mueller has explained several times, you should always be looking to improve your site. Aim to be 10X better than your competition and not just slightly better. That’s the strongest way to make sure you don’t have to worry about updates like this in the future. And if you missed part one in this series, I recommend you read that post as well. Good luck.