Google’s December 2020 Broad Core Algorithm Update: Analysis, Observations, Tremors and Reversals, and More Key Points for Site Owners [Part 1 of 2]

Glenn Gabe

algorithm-updates, google, seo

{Update July 2022: I just published my post about the May 2022 broad core update. In my post, I cover five micro-case studies of drops and surges.}

Well, we waited nearly seven months, but it finally arrived (and right in the middle of the holiday shopping season). Google rolled out another broad core update on December 3, 2020, and as we expected, it was a huge update. The previous broad core update was May 4, 2020, so sites went nearly seven months between updates. I'll explain soon why the time in between updates is important.

A Big Update Warrants A Two-Part Series:
I was going to write one post about the update, including the case studies, but as I was writing it I thought a two-part series would fit much better. In this first post I’ll cover a number of important topics regarding the December update and about broad core updates overall. For example, I’ll cover the rollout, timing, and early movement. I’ll also cover tremors, which yielded some reversals for certain websites. Then I’ll move into (more) important points for site owners to understand about broad core updates, including information from my SMX Virtual presentation which focused on broad core updates.

Then in part two of the series, I’ll cover three case studies that underscore the complexity and nuance of Google’s broad core updates. That’s in addition to the four I already covered in my post about the May 2020 broad core update. Each case is unique and provides an interesting view of how certain sites are impacted by these updates and how site owners responded. I’ll also end each post with tips and recommendations for site owners that have been impacted by the December update (or any broad core update for that matter).

Here is a quick table of contents if you want to jump around to a specific part of this post. I recommend reading it from start to finish to understand all of the context about broad core updates, but I understand that everyone is short on time.

The December Broad Core Update: Table of Contents

Rollout and Timing:
Danny Sullivan announced the broad core update on December 3, 2020 and linked to Google’s own blog post about core updates for reference. Soon after, Danny announced the update began rolling out at 1PM ET on 12/3 and that the update could take a week or two to fully roll out.

And then we finally heard from Danny again that the rollout was complete on 12/16/20. So that’s about two weeks to fully roll out, which made sense.

Google Goldilocks & The Holiday Shopping Season – Not Too Early, Not Too Late, Just Right For Most Site Owners
The timing of this update was a bit controversial. Some thought that with the pandemic surging and the holiday shopping season upon us, Google should not roll out such a big update. I get that, but there were also many site owners that had been waiting since the May broad core update to see recovery. Heck, there were some that had been hit by the January core update that didn't recover in May that were waiting for this update. So although I see both sides, I'm in the camp that the timing was fair.

In my SMX Virtual presentation about broad core updates from 12/8 (which was crazy timing since it was literally right after the December update rolled out), I explained that Thanksgiving, Black Friday, and Cyber Monday had passed already. So, sites that might be negatively impacted by the December core update could have benefited from those big shopping days even if they were going to drop. And for those waiting for recovery, they missed those important shopping days since they were down, but could potentially recover for the remaining holiday shopping season. Needless to say, it was tricky for Google, but I believe the timing was fair.

John Mueller On The Timing and Passage-Based Ranking
It’s also worth noting that Google’s John Mueller explained in a Search Central Hangout (at 5:43 in the video) that although he wasn’t involved in the decision about the timing, he thought it seemed fair. He also answered a question about passage-based ranking and if it rolled out with the December update. John explained that a change like passage-based ranking would typically not be bundled with something like a broad core update. He also didn’t believe it rolled out yet, although Google had explained it could roll out before the end of the year. So stay tuned about passage-based ranking. Here is my slide from SMX about this:

The December 2020 Core Update: Interesting Observations From The Front Lines
Google is looking at many factors with broad core updates, and over an extended period of time. They have explained this many times and I have shared this often on Twitter, in presentations, in my blog posts about core updates, etc. For example, Google’s Paul Haahr, a lead ranking engineer at Google, explained at the webmaster conference in Mt. View that they complete an extraordinary amount of evaluation in between broad core updates. Actually, they do so much evaluation in between that it can be a bottleneck to rolling them out more often. And check the last bullet point about decoupling algorithms and running them separately. Yep, welcome to SEO. :)

Google is trying to understand a site overall, and across many factors. This is why it's very hard to identify specific things that changed with broad core updates. And it's also why it's hard to know exactly what's going on with a certain website unless you work on it, understand the full history of the site, problems it had over time, improvements that have been implemented, etc.

In my post about the May core update, I covered how Google’s Gary Illyes explained that Google’s core ranking algorithm is made up of “millions of baby algorithms working together to output a score”. That’s super important to understand and it’s why you cannot focus on just one or two things when analyzing and improving your site. It’s much broader than that, pun intended. My slide from SMX Virtual underscored this point:

This is also why I provided four case studies in my post about the May 2020 core update, which underscored the complexity and nuance involved with broad core updates. On that note, part two of this series covers three more cases based on the December update.

I just wanted to bring up the complexity of broad core updates so you don’t focus too narrowly when reviewing your site or other sites that were impacted.

Before we move forward, a quick disclaimer:
I do not work for Google. I do not have access to Google's core ranking algorithm. I did not dress up as a FedEx employee and try to force my way into Gary Illyes' home to commandeer his laptop holding all of Google's secrets. Broad core updates are very complex and it's hard to write a post that covers all of the things I'm seeing, the complexity of the updates, and the nuance involved. With that out of the way, let's get started.

Quick movement, very quick, almost too quick:
Once a broad core update rolls out, it’s typically a few days before we see a lot of movement across sites. But with the December update, we saw movement very quickly (within 24 hours). A number of people (myself included) were sharing hourly trending from Google Analytics of sites beginning to see impact from the update. For example:

Many sites that were impacted saw a ton of movement (either surging or dropping) right after the update on 12/4 and 12/5. Then it calmed down a bit. It was almost too calm after that first wave of volatility. But then 12/9 came around. I’ll cover that volatility in the next section.

Here are two examples of surges and drops after the update rolled out. Some were dramatic and I’ll be covering more about this in the case studies in part two of the series. Both examples below are very interesting cases by the way.

Tremors and Reversals:
Again, I kept thinking to myself that this was quick… almost too quick, which also reminded me of an important point that John Mueller confirmed years ago about major algorithm updates. After medieval Panda updates rolled out, I would sometimes see strange fluctuations after the rollout. For example, distinct and large changes in rankings several days after the rollout (sometimes even reversing course).

When I asked John about this, he explained that Google can implement smaller tweaks after major algorithm updates roll out based on what they were seeing. That made complete sense and I called them “Panda tremors” at the time. I’m sure the same approach applies with broad core updates and I believe we actually saw that with the December 2020 update. Starting on 12/9, we saw a lot of additional volatility, with many sites seeing more movement in the same direction. For example:

And here is John explaining more about what I call “tremors”:

But, we also saw some sites reverse course. And some completely reverse course. It was wild to see. Once I shared what I was seeing, I had a number of companies reach out to me explaining that was happening with their own sites! And several were sending screenshots from Google Analytics and Google Search Console (GSC) showing the reversals.

As you can imagine, this was incredibly disappointing for those site owners (and tough for them to experience). Imagine thinking you were surging with the update, only to drop back down to where you were before the update (and some dropped even further!). Here are some screenshots of the reversals from GSC and Google Analytics:

And here is what search visibility looks like for one of those sites. Insane:

As another example, there were definitely tremors during the July 2021 broad core update. I provide more information about that in my post about the July core update.

Also, it’s always important to check the queries and landing pages dropping during a broad core update (if you’ve been negatively impacted). That can help you understand if it was a relevancy adjustment, intent shift, or if it’s due to overall site quality problems. And it could be a mix of reasons… You can read my post about the differences to learn more.
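To make that concrete, here's a rough sketch (not a Google tool, just my own illustration with made-up data and an arbitrary threshold) of how you could compare query-level clicks from two GSC exports to see whether a drop is concentrated in a handful of queries or spread site-wide:

```python
# Hypothetical sketch: compare pre-update vs post-update Search Console
# query exports. A drop concentrated in a few queries suggests a
# relevancy/intent adjustment; a broad drop suggests site-wide quality
# issues. All data and thresholds below are invented for illustration.

def classify_drop(pre, post, top_share=0.8):
    """pre/post: dicts mapping query -> clicks."""
    losses = {q: pre[q] - post.get(q, 0) for q in pre if pre[q] > post.get(q, 0)}
    total_loss = sum(losses.values())
    if total_loss == 0:
        return "no drop", []
    # Rank queries by clicks lost, biggest first.
    ranked = sorted(losses.items(), key=lambda kv: kv[1], reverse=True)
    # How many queries account for top_share of the total loss?
    running, head = 0, []
    for q, loss in ranked:
        head.append(q)
        running += loss
        if running / total_loss >= top_share:
            break
    if len(head) <= max(1, len(losses) // 5):
        return "concentrated (check relevancy/intent)", head
    return "broad (check overall site quality)", head

pre = {"blue widgets": 900, "widget reviews": 650, "buy widgets": 50, "widget care": 40}
post = {"blue widgets": 120, "widget reviews": 600, "buy widgets": 45, "widget care": 38}
label, head = classify_drop(pre, post)
```

Here, nearly the entire loss sits on one query, so the sketch flags it as a likely relevancy adjustment. The 80% threshold is arbitrary; adjust it for your own data, and always eyeball the actual queries and landing pages too.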

Major Affiliate Impact with Reversals:
It’s worth noting that a number of affiliate sites saw massive volatility with the December update, and even more than usual (in my opinion). And some were impacted by the reversals I just covered. And then within affiliate marketing, those focused on health and medical saw a ton of volatility.

On that note, I mentioned an important point in my SMX Virtual slides about affiliate marketing and health/medical sites. I explained that YMYL (Your Money or Your Life) content is held to a higher standard. And when you mix commerce or affiliate marketing with health and medical, there’s a fine line between educating and selling. I saw a number of sites in this situation get absolutely smoked by the December 2020 core update. Beware.

Some Alternative Medicine Sites See Improvement, But Still Down Big-time From Previous Levels:
There were some alternative medicine sites that saw very nice increases with the December core update. If you remember, many of these sites have seen significant drops in the past, especially starting with the Medic Update in August of 2018. That was interesting to see, but you must zoom out to see how far they have come back (or not). Although some did see nice increases during this update, they are still way below where they were previously. Just an interesting side note.

For example, here’s a nice bump in visibility during the December update, but it pales in comparison to where the site once was.

A home remedy sub-niche as a microcosm of alt medicine volatility:
I also surfaced a great example that demonstrates the insane volatility a specific niche can see with broad core updates. Within alternative medicine, there is a home remedy sub-niche which many sites focus on. This space saw an incredible amount of volatility, with many sites dropping off a cliff. As with many home remedies, there are some claims being made that aren’t backed by science and can be dangerous to follow. Remember, YMYL content is held to a higher standard.

Here are some examples of the volatility in that niche:

Some Major e-Commerce Players Surge and Drop:
Remember I mentioned the timing of this update earlier? Well, some major e-commerce sites did not fare well during the update. The only good thing for them is that at least it wasn’t right before Thanksgiving week with Black Friday and Cyber Monday approaching. Although they dropped, other e-commerce sites waiting to recover were able to gain visibility during the holiday season. Again, I think this was fair. Some sites were waiting for nearly seven months to recover.

Here are two big players in e-commerce with very different outcomes based on the December update:

Online Games, Lyrics, and Coupons With Major Volatility (and a warning to sites with the same, or very similar content to other sites):
I wanted to mention the online games niche for a minute (and other categories like it). It’s a tough area since many of the sites contain the same or very similar content. I also covered this situation in my SMX Virtual presentation.

Google’s John Mueller has explained that if you provide the same content, or very similar content, to many other sites on the web, then it’s hard for Google’s algorithms to determine which site should rank. And that can lead to a lot of volatility over time. It’s super important to differentiate your site as much as you can. It’s worth noting that I saw more heavy movement for sites like this during the July 2021 core update. You can read more about that in my post covering the update.

I have provided my tweet below where I linked to a video from John Mueller explaining this. I also included the video below (which starts at 39:32).

YouTube video

Well, the online games niche saw some crazy movement with the December update. If your site is in this situation (having the same or very similar content to many other sites), then I highly recommend trying to differentiate your site as much as possible, provide some type of value-add for users, etc. If not, you can end up seeing similar visibility trending to the examples below. Unfortunately, there will be razor-thin margins between sites (scoring-wise and rankings-wise). Beware.
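If you want a quick gut check on how similar your content is to competing pages, here's a simple sketch using Jaccard similarity over word shingles. To be clear, this is a generic near-duplicate heuristic with invented example text, not anything Google has confirmed using:

```python
# Illustrative only: estimate text overlap between two pages using
# Jaccard similarity over word trigrams (shingles). High overlap across
# many pages in a niche is the situation described above, where
# algorithms struggle to pick which near-identical site should rank.

def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b, n=3):
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "play free online puzzle games no download required for kids and adults"
page_b = "play free online puzzle games no download required fun for the whole family"
score = jaccard(page_a, page_b)  # closer to 1.0 means near-duplicate content
```

In practice you would run this (or a proper tool) against full page text across your niche. If your pages score high against many competitors, that's a signal to differentiate and add unique value.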

News Publisher Volatility:
There are always many moving parts with large-scale news sites: millions of pages indexed (tens of millions when you take each site's crawlable footprint into account), advertising setups that can be aggressive and disruptive, and the constant balancing act between information, UX, and monetization. I've helped many news publishers over the years and there are typically many things I surface during those audits to improve.

Here is some volatility from the space (and this is before the Page Experience Signal rolls out, which is May 2021). I bring that up since news sites often have many issues based on the factors and sub-factors involved with the Page Experience Signal (so it should be interesting to see how news publishers are impacted once it rolls out). I also covered the sub-signals in a Web Story earlier this year if you want to learn more about those.

More Key Points About Broad Core Updates & My SMX Virtual Presentation:
In my post about the May 2020 broad core update, I included several important points for site owners to understand about Google’s broad core updates. Those points seemed to resonate with site owners, since the topic is extremely nuanced and site owners can end up confused about how they roll out, what Google is evaluating, and when you can see recovery.

I included those in my SMX Virtual presentation as well (presented on 12/8), but I also added more items to the list. I’ll include those additional points below. I definitely recommend reading my post about the May broad core update so you can understand all of the key points about these updates.

First, the points I covered in my May post include:

  • There’s Never One Smoking Gun – When reviewing a site that’s been heavily impacted by a broad core update, there’s never one smoking gun, there’s typically a battery of them.
  • Relevancy Adjustments – Google can implement relevancy adjustments during broad core updates, which aren’t necessarily a bad thing for your site. Dig into the drop and objectively figure out if there were major relevancy adjustments (like your site ranking for queries it had no right ranking for), or if there are deeper issues at play.
  • “Millions of baby algorithms…” – Google’s Gary Illyes explained at Pubcon in 2019 that Google’s core ranking algorithm is comprised of “millions of baby algorithms working together to output a score”. I love that quote and it’s why you can never pinpoint one issue that’s riddling a site from a broad core update standpoint.
  • Recovery – You typically cannot recover until another broad core update rolls out. This is super important to understand. If your site has been heavily impacted by a broad core update, you will need to significantly improve the site over the long-term. That’s what Google is looking for and you (typically) can only see that reflected during subsequent broad core updates.
  • Recent Changes Not Reflected – Recent changes will not be reflected in broad core updates. Google’s John Mueller has explained this in the past, and even again in a recent Search Central Hangout. For example, changes you implement 2-3 weeks before a broad core update will typically not be reflected in that update. Google is evaluating many factors over a long period of time with broad core updates. It’s not about that recent tweak or change you made.

And now based on my SMX Virtual presentation, I’m including some additional important points that site owners should understand about broad core updates. Again, these are foundational points that you should understand before tackling remediation:

Aggressive Ads and “Hell hath no fury like a user scorned”:
I continue to see terrible user experiences across many sites heavily impacted by broad core updates. Aggressive, disruptive, and deceptive ads yield a terrible UX for many people. Don’t do this. You can pay a heavy price. Google has even mentioned aggressive ads in its own blog post about broad core updates. Always remember, and respect, your users.

Below is a slide from my SMX presentation where I cover a common pitfall I see with sites impacted heavily by broad core updates. This was about one specific site, but I see this often.

The Gray Area of Google’s Algorithms (and Yo-Yo Trending):
For sites surfing the gray area of Google’s algorithms, it’s easy to continue to surge and drop during subsequent broad core updates. For example, dropping in January, surging in May, only to drop again in December (or vice versa). That’s why it’s super important to significantly improve your site overall over the long-term. You want to clearly get out of the gray area so you can limit volatility down the line. There are many examples of sites that either haven’t improved significantly, or that injected more problems into their sites that continue to see this type of movement.

And on the flip side, there were definitely sites that did nothing to improve and surged. But in my opinion, unless they get out of the gray area, they could very well see drops again. I’ve covered the gray area heavily in the past and it’s a maddening place to live for site owners. I will also cover a very special case study in part two of this series that fell into this category.

Beware Technical SEO Problems Causing Quality Problems:
I picked up several examples of this happening with the December broad core update, and it can be sinister, since it can sit below the surface without site owners realizing it's happening. I actually covered this in my SMX Virtual presentation as well, and I received several questions in the Q&A asking me to clarify what I was referring to.

To clarify, I’m NOT talking about basic technical SEO problems. I’m referring to technical SEO problems that cause quality issues. Google is on record explaining that every page indexed is taken into account when evaluating quality. So, if you have pockets of pages that get published due to technical SEO problems, and those pages are low quality and/or thin, and they get indexed, then that’s what I’m referring to.

For example: stripping noindex from thousands of low-quality pages by accident, major canonical problems leading Google to index many additional pages that shouldn't be indexed, mistakenly exposing parameters or session IDs that cause Google to find many additional low-quality URLs, and so on. This is why it's incredibly important to thoroughly analyze your site through the lens of broad core updates. If you miss those underlying problems, you can spin your wheels having no idea what's going on.
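As a rough illustration of catching these accidents, here's a minimal indexability check (stdlib only, with a made-up example page) that flags pages that are indexable but canonicalized elsewhere, or indexable with no canonical at all. A real audit would crawl a sample of URLs and run checks like these at scale:

```python
# Hedged sketch of a spot-check for technical-SEO-caused quality issues:
# parse a page's HTML and flag indexability surprises. The URL and HTML
# below are invented examples, not from any real site.
from html.parser import HTMLParser

class IndexabilityCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def audit(url, html):
    p = IndexabilityCheck()
    p.feed(html)
    issues = []
    if not p.noindex and p.canonical and p.canonical != url:
        issues.append("indexable but canonicalized elsewhere: " + p.canonical)
    if not p.noindex and p.canonical is None:
        issues.append("indexable with no canonical tag")
    return issues

# Example: a clean URL whose canonical accidentally points at a session-ID URL.
html = '<html><head><link rel="canonical" href="https://example.com/a?sessionid=1"></head></html>'
problems = audit("https://example.com/a", html)
```

Run something like this across a representative sample of URLs and compare the results over time. A sudden jump in "indexable" pages that used to be noindexed or canonicalized is exactly the kind of below-the-surface problem described above.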

It’s also worth noting that Google’s John Mueller presented at SMX Virtual and gave some tips for 2021. In those tips, he explained that in 2021 and beyond, sites that are “technically better” have an advantage. Sometimes that’s a small advantage, but it can be bigger depending on the niche. He said it’s good to get that advantage. Remember, content is king, but strive for strong technical SEO. Here’s a tweet I shared when covering John’s presentation from SMX Virtual:

Google Discover and Top Stories Visibility Can Be Impacted:
I’ve covered this before and it’s important to understand that both Discover and Top Stories visibility can be impacted by broad core updates. So if Google’s Discover feed or Top Stories feature are important for your business, then you should take a hard look at your visibility after a broad core update rolls out.

Here are some examples of Discover traffic impacted after a broad core update. Notice the distinct upticks in clicks and impressions right after the rollout:

And here is Top Stories visibility for a site after the December update. Note, they never were in Top Stories before at all. They ranked pretty well in Google News, but have struggled to break into Top Stories. This was a great sign for the company, which has been working hard to improve (after getting hit in 2019 by a broad core update, recovering during a subsequent update, and continuing to improve since then). It’s not a lot of visibility, but again, they were never in Top Stories before (ever):

Content is King: Extremely Relevant Content Can Rank Despite Other Problems
Google is always looking to surface the highest quality and most relevant content for each query. Google has reiterated that site owners should focus on building the best content based on what users are searching for. Don’t overlook this point.

Great content can still win despite the site having other major issues. This is why you can see some sites with many issues still ranking and not being impacted heavily by core updates. There are many factors being evaluated and if a site produces the most relevant content for users, Google can still end up ranking that site highly across queries. You can also see this with sites that have a ton of authority. They can rank despite having other issues.

This is also why it’s important to not blindly follow other sites that seem to be doing well. You might follow them right off a cliff. Your site is different than theirs. You might not get away with what they get away with. And if you don’t, you could end up dropping heavily when a broad core update rolls out.

The A in E-A-T and the Power of the Right Links:
When analyzing specific drops and gains in a vertical, it’s sometimes apparent that certain sites have an enormous amount of authority (and other less authoritative sites have trouble competing against them). For example, Google has explained that the A in E-A-T is heavily influenced by PageRank (or links from around the web). They explained this in a whitepaper on fighting disinformation that was published in 2019 (screenshots are below).

Also, Google’s Gary Illyes once explained at Pubcon that E-A-T was largely based on links and mentions from well-known sites. So when Google is trying to understand how authoritative a site is, then having the right links matter. It’s not about quantity, it’s about quality. And that’s why you can’t easily fix this situation if you currently don’t have a lot of authority…

For example, you can’t just go out and get links and mentions from powerful sites across the web (like CNN, The New York Times, or even smaller sites within a niche that have a lot of authority). You have to earn that naturally over time by doing the right things content-wise, promotion-wise, etc.

It’s worth noting I saw this in action with the December broad core update. I compared a site that had been hit hard with the sites that now rank in the top spots and it was very clear there was an authority difference there. One site had less than 1K links total and not many from extremely authoritative sites, while the others had millions of links, including many from some of the most authoritative sites on the web.

Below is a screenshot from Majestic’s Solo Links where you can compare the top links from each site. It’s never about one thing… but it can be hard to compete against sites with massive amounts of authority.

Below you can see some of the top domains that are linking to another site that is ranking well across many of the top queries, but not linking to the site that dropped. Again, it’s never about one thing, but authority matters:

The Usual Suspects:
I have covered this many times in my posts about core updates, but it’s worth mentioning here again. If you are looking for ways to improve your site, then it’s important to keep a look out for what I call “The Usual Suspects”. It’s a great movie, but it’s not so great for core updates. :)

Google is on record that it wants to see significant improvement in quality over the long-term to see gains during subsequent core updates. That’s why it’s important to surface all potential issues and address as many as you can. This ties with the “Kitchen Sink” approach to remediation, which I have covered in many of my posts about broad core updates.

For example, I would:

  • Hunt down all low-quality or thin content on the site and address that.
  • Rectify any user experience barriers on the site.
  • Make sure you don’t have an aggressive, disruptive, or deceptive advertising experience (I mentioned this earlier).
  • Review the site from an E-A-T perspective (expertise, authoritativeness, and trust). The site may lack E-A-T, which is a nuanced topic that confuses many site owners. Also, E-A-T is weighted more heavily for YMYL queries, so it’s super important to understand this if you focus on a YMYL topic.
  • Hunt down technical SEO problems that cause quality problems. I covered this earlier.
  • And again, understand there could be relevance adjustments, which might be correct. For example, maybe your site was ranking for queries it had no right ranking for. If Google pushes a relevancy adjustment impacting those, then that’s fine. There’s nothing for you to do there.

Are Broad Core Updates Related To BERT?
There has been some confusion in the industry about whether broad core updates were related to BERT. If you’re not familiar with BERT, it’s an AI natural language processing (NLP) algorithm that helps Google understand queries and content better. Google announced the rollout in October of 2019 and called it one of the biggest leaps forward in the history of Search. It is now used for nearly 100% of English queries conducted on Google.

Regarding broad core updates and how they are related to BERT, Google’s Danny Sullivan replied to Barry Schwartz on Twitter that the two are unrelated. So to be clear, broad core updates have nothing to do with BERT directly.

https://twitter.com/dannysullivan/status/1341411255647662081

This makes complete sense. Google is evaluating many factors with broad core updates, and over an extended period of time. Remember, there’s never one smoking gun. There’s typically a battery of them. And I have covered a lot in this post related to that statement! So yes, BERT is important, but it’s not related to broad core updates.

Machine Learning and “It Depends”:
And there’s one more topic I wanted to cover before wrapping up part one of this series. Both Google and Bing have explained recently that they are using machine learning in Search in various capacities. That’s where they identify the signals, identify the desired outcomes, and then let machine learning figure out the weighting of those signals. Yes, they let machine learning determine the weighting of those signals.

That’s extremely important to understand and it’s why you might hear Google and Bing representatives say “it depends” when answering a question about how important something is. They literally don’t know the weighting, and the weighting can actually change over time. You can listen to Bing’s Fabrice Canel explain this in the 302 of a Kind podcast with Marcus Tandler and Izzi Smith (at 35:02 in the video).
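Here's a toy illustration of that idea. The three "signals" and target scores below are completely invented; the point is simply that the engineer chooses the signals and the desired outcome, and gradient descent finds the weighting:

```python
# Toy model of "let machine learning set the weights". Nothing here
# reflects real Google/Bing signals or weights -- the features
# (relevance, authority, page_speed) and targets are made up.

# Each row: (relevance, authority, page_speed) for a fictional page.
pages = [(0.9, 0.2, 0.5), (0.4, 0.9, 0.2), (0.7, 0.7, 0.2), (0.1, 0.3, 0.9)]
targets = [0.61, 0.51, 0.60, 0.32]  # desired outcome per page

weights = [0.0, 0.0, 0.0]
lr = 0.1
for _ in range(5000):  # simple gradient descent on squared error
    for feats, target in zip(pages, targets):
        pred = sum(w * f for w, f in zip(weights, feats))
        err = pred - target
        weights = [w - lr * err * f for w, f in zip(weights, feats)]
# `weights` now holds learned importances for each signal.
```

Retrain on different pages or different targets and the learned weights change, which is exactly why "it depends" is often the honest answer to "how important is X?".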

This is also why it’s important to simply improve your site overall. Don’t focus on one or two things. Significantly improve your site over the long-term. That’s what both Google and Bing want to see.

Coming Soon: Part 2 of the Series with 3 (More) Case Studies That Emphasize The Complexity of Broad Core Updates
As I explained earlier in the post, part two of this series covers three interesting case studies based on the December 2020 broad core update. My hope is that between part one and two, and my post about the May core update, site owners can have a strong understanding of what Google is doing with these core updates, as well as how certain sites tackled being impacted by previous updates.

Moving forward: Tips and recommendations for site owners impacted by broad core updates
If you have been impacted by the December broad core update, or a previous core update, then definitely read the following bullets containing some tips and recommendations. I know it can be frustrating to see a big drop in visibility and I hope you find these bullets helpful.

  • Improve your site overall, don’t cherry-pick changes. Google wants to see significant improvement in quality over the long-term.
  • Remember there are “millions of baby algorithms” working together. Don’t miss the forest for the trees. Objectively surface all potential issues that could be impacting the site.
  • Use a “kitchen sink” approach to remediation. Fix it all (or as much as you can).
  • Conduct a user-study through the lens of broad core updates. I can’t emphasize enough how powerful this can be. Read my case study and form a plan of attack. Hearing from real users can help you identify issues you could easily miss. You might be too close to your own site.
  • Read and internalize information about E-A-T. I mentioned in my SMX Virtual presentation that both Lily Ray and Marie Haynes have published some excellent information about E-A-T. I recommend going through those articles and presentations.
  • Read the Quality Rater Guidelines (QRG). Then read it again. It contains 175 pages of SEO Gold. It’s a document published by Google that explains what it considers high versus low quality, what raters should look at while evaluating sites, and much more. If you haven’t read it, I think you will find it enlightening. :)
  • If you are impacted by a broad core update, you will typically need to wait for another broad core update to see recovery. So don’t roll out changes for a few weeks and then roll them back (after not seeing recovery). That’s not how it works. Google is on record explaining you (typically) will not see recovery until another broad core update rolls out, and only if you have significantly improved the site.
  • And most importantly… DON’T GIVE UP. You can absolutely recover from these updates. It just takes a lot of work and time.

Part 2 Coming Soon With Three Case Studies:
And remember, part two of this series contains three case studies based on sites impacted by the December broad core update. Each case provides a unique view of how broad core updates can impact a site, how site owners responded, how those situations ended up, and more. You can subscribe to my RSS feed or follow me on Twitter to be notified about the next post.

GG
