
M-Dot Versus D-Top: How To Hack DeepCrawl’s Test Site Feature To Compare Your Mobile And Desktop Sites SEO-wise

January 23, 2018 By Glenn Gabe

Comparing separate mobile urls to desktop using DeepCrawl's test site feature.

Last month I wrote a post covering a number of real-world mobile problems I surfaced on sites using separate mobile urls (like m-dot subdomains). With Google moving to a mobile-first index, it’s extremely important to make sure your mobile urls contain the equivalent content, structured data, canonical tags, hreflang tags, etc. as your desktop urls. Since Google will use the mobile urls and content for ranking purposes once the switch to m-first happens, you can end up with ranking issues and a drop in Google organic search traffic if your mobile urls are inadequate. And that’s never a good thing for sites looking to grow.

Now, if a site is responsive or using dynamic serving, it should be fine. But for sites using separate mobile urls (like an m-dot subdomain), there could be many gremlins lurking around. My post contains actual problems I have uncovered while helping companies prepare for Google’s mobile-first index. From botched switchboard tags to broken canonicals to double canonicals to rel alternate being delivered via the header response, companies could run into a number of problems that are tough to detect with the naked eye.

As part of my final recommendations in that post, I explained that it’s important to crawl your site as both Googlebot desktop and Googlebot for Smartphones to make sure the mobile crawl doesn’t surface problems that aren’t present in the desktop crawl. It’s a great way to check your setup from both a desktop and mobile perspective. But I didn’t explain one tool that can really help those sites using separate mobile urls, and I’m going to cover that in this post. If you are using separate mobile urls, I think you’re going to dig it. Read on.
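
As a quick aside, you can spot-check individual urls yourself before running full crawls. Below is a minimal Python sketch (my own illustration, not part of DeepCrawl) that fetches a url as both Googlebot desktop and Googlebot for Smartphones and prints the status code, title, and canonical tag for each. The user-agent strings and example url are placeholders you should swap for your own.

```python
import requests
from bs4 import BeautifulSoup

# Example user-agent strings for Googlebot desktop and Googlebot for Smartphones.
USER_AGENTS = {
    "desktop": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "mobile": ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
               "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile "
               "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"),
}

def spot_check(url):
    """Fetch a url with each user-agent and print status, title, and canonical."""
    for label, ua in USER_AGENTS.items():
        resp = requests.get(url, headers={"User-Agent": ua}, timeout=30)
        soup = BeautifulSoup(resp.text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else "(missing)"
        canonical = soup.find("link", rel="canonical")
        canonical_href = canonical.get("href") if canonical else "(missing)"
        print(f"{label}: {resp.status_code} | title: {title} | canonical: {canonical_href}")

spot_check("https://www.example.com/some-page/")  # hypothetical url for illustration
```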

Comparing Desktop to Mobile Crawls (with the help of DeepCrawl)
You can definitely crawl your desktop urls and mobile urls and dig in manually. That’s a smart thing to do. But comparing larger-scale sites using this method can get tough. Wouldn’t it be great if a crawling tool could help by comparing the two crawls for you? Then you could surface problems faster and spend your time digging into those problems versus scanning thousands of urls.

Enter DeepCrawl’s test site feature, which you can hack to compare against your mobile subdomain.

Hacking DeepCrawl’s Test Site Feature To Compare Mobile Versus Desktop URLs
If you’ve read my previous posts about audits, migrations, algorithm updates and more, then you have come across many mentions of DeepCrawl. It’s one of my favorite SEO tools and has been for a long time. Actually, I loved it so much that I’m on the customer advisory board and have been a CAB member since it was launched a few years ago.

In DeepCrawl, there’s an option for crawling a test site. It’s been part of DeepCrawl for a long time and it’s a great way for SEOs to crawl and analyze a site while it’s in staging. Not only will it crawl a test site, but it will compare the test site to the production site. And if you’re following along with what I mentioned above about testing m-dots, then you might see where this is headed. :)

DeepCrawl's Test Site Feature

Although your m-dot is not a test site, you can hack DeepCrawl into thinking it’s a test site. Then it will crawl the mobile subdomain and compare to the previous crawl (which can be your desktop site). This technique was covered by Jon Myers in his presentation about preparing for Google’s mobile-first index. You should check out his presentation on SlideShare if you haven’t yet. And if you’re running a site that employs m-dots, then a lightbulb probably appeared above your head. Below, I’m going to explain how to set this up via DeepCrawl. Then I’ll cover some of the problems you can surface once the crawls have completed.

How to set up DeepCrawl to compare desktop and m-dot urls:
In order for DeepCrawl to compare separate mobile urls to desktop urls, you need an initial crawl of your desktop pages and then a second crawl of your “test site”, which will be your m-dot subdomain. Follow the steps below to set this up.

1. Set up a new crawl, enter your desktop domain and give the crawl a unique name.

Setting up a new project in DeepCrawl.

2. Check the box for “Website” crawl and do not check the box for “Crawl all subdomains”.

Selecting a crawl source in DeepCrawl.

3. Set the maximum number of urls to crawl. I recommend crawling as many as you can from the site (unless it’s a massive site). You want to make sure you are comparing apples to apples, so it would be optimal to get a significant sample from the site. For many sites, this won’t be a problem. If you have a large-scale site (greater than a few hundred thousand pages indexed), then you can craft strategies for crawling site sections, certain directories, etc.

Setting crawl limits in DeepCrawl.

4. Next, click “Advanced Settings” which is marked with a gear icon. Again, make sure “Crawl subdomains” is not checked. We want to focus on the desktop site.

Uncheck crawl subdomains in the advanced settings in DeepCrawl.

5. Under “Link Restrictions”, uncheck “Crawl URLs in media mobile alternate tags” and “Crawl URLs in AMPHTML alternate tags”. Again, we want the first crawl to be focused on our desktop urls.

Don't crawl mobile alternate tags or AMP tags in DeepCrawl.

6. Under “Spider Settings”, select Googlebot as the user-agent for the first crawl. In the second crawl, we’ll use Googlebot for Smartphones.

Selecting Googlebot as the user-agent in DeepCrawl.

7. Then click “Save and Start Crawl” to begin the crawl of your desktop urls.

Saving and starting your crawl in DeepCrawl.

8. {Optional} While the crawl is running, you can relax and sip a frozen drink while reading my tweets about SEO. :)

Dogs relaxing on a beach.

Crawl Two: Setting Up A Crawl Of Your Separate Mobile URLs (m-dot)
Once the first crawl completes, you’ll be ready to launch the second crawl (which will crawl your m-dot subdomain and enable DeepCrawl to compare the two crawls). Follow the instructions below to set this up:

1. From the main projects screen in DeepCrawl, click the edit icon (a small pencil icon). This will enable you to edit the settings so you can correctly crawl your mobile subdomain.

Editing the crawl settings in DeepCrawl.

2. Jump to the “Settings” screen, which is the fourth screen when editing your crawl settings. First, make sure “Compare to last crawl” is selected. The first crawl you launch won’t have a previous crawl to compare against, and that’s normal. But the second crawl you launch should be compared to the first (basically comparing m-dot to desktop). Note, you can always select a specific crawl to compare to in case you launched additional crawls of your m-dot and wanted to compare to a previous crawl of desktop or m-dot.

Comparing to last crawl in DeepCrawl.

3. Click “Advanced Settings” (fourth screen) again and jump down to “Spider Settings”. Select Googlebot for Smartphones as the user-agent for this crawl.

Selecting Googlebot for Smartphones as the user-agent in DeepCrawl.

4. Next, jump down to Test Settings and select “Test Site Domain and Custom DNS”. Under “Test site domain”, you should enter your m-dot subdomain on the first line. And since this isn’t a real test server (remember, we’re hacking this feature to compare the m-dot to desktop), you don’t need to enter anything in the Basic auth username or Basic auth password fields. Also, make sure “Use the test site with the crawl” is checked.

Hacking the test site feature in DeepCrawl by adding an m-dot.

5. That’s all you need to change in order for DeepCrawl to crawl your m-dot and then compare to the previous desktop crawl. Click “Save and Start Crawl” to begin the crawl of your m-dot.

Saving and starting a crawl in DeepCrawl.

6. {Optional}: Finish your frozen drink from earlier and read more tweets from me about SEO on Twitter. :)

More relaxation at the beach while DeepCrawl runs.

Examples of problems you can surface when comparing m-dot to desktop:
Once the mobile crawl finishes, you’re ready to dig into the data and surface gaps between your desktop and mobile sites. The dashboard view will show you the number of top issues surfaced overall while also showing a trending line for each problem compared to the previous crawl.

DeepCrawl's dashboard surfaces top problems from the crawl.

When employing separate mobile urls, you need to watch out for major gaps and differences between your desktop site and mobile site. That includes content differences, canonical problems, internal linking gaps, crawl errors, hreflang problems, structured data issues, and more. You should review all of the major categories in DeepCrawl to make sure you have a solid view of how the crawls compare. Below, I’ll cover a few examples of what you can find.
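
And if you want to spot-check the switchboard pairing itself on a handful of urls (the desktop page pointing to its m-dot alternate, and the m-dot page canonicaling back to desktop), here is a rough Python sketch. It’s an illustration of the idea only, not how DeepCrawl works internally, and the example urls are placeholders.

```python
import requests
from bs4 import BeautifulSoup

HEADERS = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}

def check_switchboard(desktop_url, mobile_url):
    """Check the rel=alternate / rel=canonical pairing between a desktop url and its m-dot url."""
    d_soup = BeautifulSoup(requests.get(desktop_url, headers=HEADERS, timeout=30).text, "html.parser")
    m_soup = BeautifulSoup(requests.get(mobile_url, headers=HEADERS, timeout=30).text, "html.parser")

    alternate = d_soup.find("link", rel="alternate", media=True)  # switchboard tag on the desktop page
    canonical = m_soup.find("link", rel="canonical")              # canonical tag on the m-dot page

    alt_href = alternate.get("href") if alternate else "(missing)"
    canon_href = canonical.get("href") if canonical else "(missing)"

    print(f"desktop -> alternate: {alt_href} (expected: {mobile_url})")
    print(f"mobile  -> canonical: {canon_href} (expected: {desktop_url})")

# Hypothetical url pair for illustration.
check_switchboard("https://www.example.com/products/widget/",
                  "https://m.example.com/products/widget/")
```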

Max Fetch Time (Good timing based on Google’s latest announcement, pun intended.)
In the screenshot below, you can see a major difference in the number of urls that took a long time to load between desktop and mobile. There were over 7K urls that had performance problems, including some urls taking up to 15 seconds to load.

Google just announced that mobile page speed will be a ranking factor as of July 2018, so this handy report can help you uncover mobile urls that might have performance problems.

Performance problems spike on the m-dot subdomain.
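
If you want a rough, manual read on fetch times for a handful of mobile urls, a sketch like the one below can help. The urls, user-agent string, and 7-second threshold are made-up values for illustration; a crawler will give you much more reliable numbers at scale.

```python
import requests

MOBILE_UA = {"User-Agent": "Googlebot-Mobile-Check/1.0"}  # illustrative user-agent string
SLOW_THRESHOLD = 7.0  # seconds; arbitrary threshold for this example

mobile_urls = [
    "https://m.example.com/",
    "https://m.example.com/category/widgets/",
]

for url in mobile_urls:
    resp = requests.get(url, headers=MOBILE_UA, timeout=30)
    # response.elapsed measures time until the response headers arrive; a rough proxy for fetch time.
    seconds = resp.elapsed.total_seconds()
    flag = "SLOW" if seconds > SLOW_THRESHOLD else "ok"
    print(f"{flag}\t{seconds:.2f}s\t{url}")
```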

Thin content and empty pages
You might find more thin content or even empty pages in the mobile crawl when comparing to desktop. With Google’s mobile-first index, mobile pages will be used for rankings (once your site is switched to the m-first index). Therefore, you don’t want a situation where the desktop urls contain high-quality content, but some of the mobile urls are thin or even blank.

Empty pages on the mobile subdomain.
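
As a rough way to flag thin or empty mobile pages outside of a crawler, you could count the visible text on each url, roughly like the sketch below. The url list and 200-word threshold are assumptions for illustration only.

```python
import requests
from bs4 import BeautifulSoup

THIN_WORDS = 200  # arbitrary threshold for this example

def word_count(url):
    """Return a rough visible word count for a page (scripts and styles stripped)."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

for url in ["https://m.example.com/article-1/", "https://m.example.com/article-2/"]:
    words = word_count(url)
    if words < THIN_WORDS:
        print(f"THIN ({words} words): {url}")
```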

404s (Page Not Found)
You might find a spike in 404s on the mobile site, which can obviously cause big problems SEO-wise (if the pages should resolve with 200s and contain content matching the desktop alternatives). In this example, there’s definitely an increase in 404s on the m-dot, and the company will need to rectify the problems soon (as the mobile-first index approaches). Combine this report with the broken links report and you can hunt down which pages contain links to the 404s.

404s spike on the mobile subdomain.
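
If you want to spot-check this yourself, one rough approach is to map desktop urls onto the m-dot host and check the response codes, as in the sketch below. The host-swapping assumes your m-dot mirrors the desktop paths, which is an assumption you should adjust to your own url mapping.

```python
import requests
from urllib.parse import urlparse, urlunparse

def to_mdot(desktop_url, mobile_host="m.example.com"):
    """Swap the desktop host for the m-dot host (assumes paths match between the two sites)."""
    parts = urlparse(desktop_url)
    return urlunparse(parts._replace(netloc=mobile_host))

desktop_urls = ["https://www.example.com/widgets/", "https://www.example.com/about/"]

for d_url in desktop_urls:
    m_url = to_mdot(d_url)
    status = requests.head(m_url, allow_redirects=True, timeout=30).status_code
    if status != 200:
        print(f"{status}\t{m_url}\t(desktop: {d_url})")
```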

Pages Without hreflang
For sites providing content in multiple languages, hreflang is a great solution. It enables you to tell Google which urls provide the content in which languages. Then Google can match up the correct page with the correct user based on language. When using hreflang, many site owners provide a “cluster” of alternate urls in the head of the document. You can also supply hreflang via sitemaps and via the header response, but many provide the tags directly in the code. If those tags are missing on your mobile urls, then hreflang will not work as expected. In the screenshot below, you can see a spike in pages without hreflang. There’s clearly a problem with the site publishing hreflang tags on the m-dot.

Hreflang problems on the mobile subdomain.
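
To compare hreflang clusters on a specific page yourself, you can pull the hreflang link tags out of the head on both versions and diff them, roughly like the sketch below. It’s an illustration only: it covers hreflang delivered in the HTML (not via sitemaps or the header response), and the urls are placeholders.

```python
import requests
from bs4 import BeautifulSoup

def hreflang_map(url):
    """Return {language code: alternate url} from the page's hreflang link tags."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    return {
        link["hreflang"]: link.get("href")
        for link in soup.find_all("link", rel="alternate", hreflang=True)
    }

desktop = hreflang_map("https://www.example.com/widgets/")
mobile = hreflang_map("https://m.example.com/widgets/")

missing_on_mobile = set(desktop) - set(mobile)
if missing_on_mobile:
    print("hreflang entries present on desktop but missing on the m-dot:", sorted(missing_on_mobile))
```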

403s Spike on Mobile (Forbidden)
Well, that’s not good. You can clearly see a spike in 403s on the mobile site for some reason. You would want to dig in and find out why that’s happening. That’s a sinister issue that can definitely cause problems rankings-wise. I’ve seen some pages return 403s that load perfectly (so the urls are returning 403s, but they look like 200s).

403 errors on the mobile subdomain.

Missing Titles
When using separate mobile urls, it’s easy for some site owners to pay less attention to the m-dot, for example by not properly optimizing each mobile page. In the screenshot below, you can see a spike in missing title tags (where the title tags are completely missing from the page or empty). Again, when your site gets switched to the mobile-first index, the mobile urls and content will be used for ranking purposes. You don’t want many urls missing title tags when that’s the case.

Title tags missing on the mobile urls.

Broken Images
When comparing m-dot to desktop, you might also find problems with the images on your site. Again, many site owners neglect their m-dots as they focus on desktop. Below, you can see a spike in broken images on the mobile subdomain. Broken images can obviously impact image search, while also hurting the user experience. It’s hard to have confidence in a site when the page looks broken. And it’s hard for images to rank in image search if they return 404s.

Broken images on the mobile subdomain.
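
And here’s a rough way to spot broken images on a given mobile url yourself: collect the page’s img src values, resolve them, and check the status codes, as in the sketch below. The url is a placeholder, and a crawler will do this far more efficiently at scale.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def broken_images(page_url):
    """Return image urls on the page that don't resolve with a 200."""
    soup = BeautifulSoup(requests.get(page_url, timeout=30).text, "html.parser")
    broken = []
    for img in soup.find_all("img", src=True):
        img_url = urljoin(page_url, img["src"])
        if requests.head(img_url, allow_redirects=True, timeout=30).status_code != 200:
            broken.append(img_url)
    return broken

for url in ["https://m.example.com/gallery/"]:
    for img_url in broken_images(url):
        print(f"broken image on {url}: {img_url}")
```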

I think you get the picture. As you can see, DeepCrawl can surface many issues when comparing m-dots to desktop sites (and many of those issues could be invisible to the naked eye). When you crawl at scale, you have a better shot at picking up those issues, which can help you fix important problems before your site is switched to Google’s mobile-first index.

Summary: Hacking is not always a bad thing (when DeepCrawl is involved)
The process I documented in this post covers a smart way to hack the “test site” feature in DeepCrawl to compare separate mobile urls to desktop. With Google’s mobile-first index approaching, and Google’s announcement that mobile page speed will become a ranking factor, it’s important to compare your mobile and desktop sites now (and not after the fact). If you compare them now, you can nip problems in the bud (which can help the site in the short-term and after Google’s changes go live). So hack away. :)

GG

 

Filed Under: google, mobile, seo, tools

A Holiday Hornets’ Nest – Analysis and Findings From The December 2017 Google Algorithm Updates

January 11, 2018 By Glenn Gabe

Summary: From Maccabees to celebrities to doorways to affiliates, Google took it all on in December of 2017. There were several dates leading up to the holidays with significant movement from an algorithm update standpoint, including December 5, December 12, December 15, December 18, and December 26 (with possible connections between some of those updates). This post covers what I saw while digging into the December volatility.

December 2017 Google Algorithm Updates

If you follow me on Twitter, then you already know how crazy December was from an algorithm update standpoint. I was posting many screenshots throughout the month based on what I was seeing across sites. We know Google can push between 1,000 and 1,500 updates per year, but we also know many are small changes that few notice. And on the flip side, we also know there are larger algorithmic changes that cause the earth to shake.

Well, we saw multiple updates in December that did just that. It reminded me of September 2017, when we also saw a number of updates in a short amount of time. That’s when I wrote a post titled, “Enter the Hornets’ Nest” covering those updates. You can think of December’s volatility as a “Holiday Hornets’ Nest” since the updates seemed to increase as we got closer to Christmas Day (and we even saw manual actions being handed out on Christmas Day!) More about that soon.

For those of you unfamiliar with December updates from previous years, Google has explained in the past that they try to limit updates as the holidays approach. But that statement hasn’t held up over the past few years. Last year, we saw significant updates leading up to the holidays (with the last being December 15, 2016). I wrote about those updates on my blog after seeing serious volatility last year. By the way, keep December 15 in the back of your mind. We’ll be revisiting that date shortly, but for 2017 instead. Just an interesting side note.

First, let me cover the dates I saw significant movement and then we’ll start to dig further into the specific updates.

And here is a handy table of contents that links to specific sections of the post:

  • List of dates in December with significant volatility
  • December 5 – It begins
  • December 12 – The Maccabees Update
  • December 15 – The Celebrity Update
  • December 18 – Relevancy, and possibly a continuation of the 12/15 changes
  • December 26 – Merry Christmas! Manual actions and more algo changes
  • Recommendations for those impacted

A quick note about the process I used when analyzing the updates:
I have access to a large dataset of sites that have dealt with quality problems in the past. That’s across countries and categories as well. Between the sites I already have access to (and have assisted), new companies reaching out to me after seeing movement during the recent updates (up or down), and then digging into specific verticals during my analysis, I ended up with a list of 153 different websites that were impacted by the December updates.

For the sites I have helped, or have access to, I know the problems they are dealing with, what they have been doing to rectify those issues, etc. In addition, when it made sense, I decided to crawl a number of the sites that were impacted to get a better feel for what was going on from a quality and technical SEO standpoint. As many of you know, sometimes a crawl can reveal a number of problems that are harder to identify manually. More about that soon.

Dates In December With Significant Volatility (With Examples Of Movement)

December 5, 2017
Drop during the December 5, 2017 google algorithm update.

Increase during the December 5, 2017 Google algorithm update.

Drop during the December 5, 2017 Google algorithm update.

December 12, 2017
(Barry named this the Maccabees update.)

Drop during the Google Maccabees Update on 12/12/17.

Big drop during the December 12, 2017 Google algorithm update.

Big drop during the December 12, 2017 Google algorithm update.

Increase during the December 12, 2017 update.

Connection between 12/12 update and previous quality updates.

December 15, 2017
I saw many official celebrity sites drop on this date, among other hits and surges. You can read more in my post covering the celebrity update.
Here is the search visibility drop for Channing Tatum’s official website:

Official site for Channing Tatum dropping after 12/15 update.

And here is the drop from #1 to #11 for his own name:

Channing Tatum ranking #11 after the 12/15 update.

December 18, 2017 (could be tied to 12/15)

Drop during the December 18, 2017 update.

Drop during the December 18, 2017 update.

Drop during December 18, 2017 update.

Increase during the 12/18 Google algorithm update.

December 26, 2017
Yes, one day after Christmas. Ho ho ho. Note, there were manual actions dished out on Christmas Day, but I saw algorithmic changes as well, showing on the 26th. There was also more volatility on 1/4, and you can see that in one of the screenshots below. So Google apparently wasn’t done yet… the volatility continued into early January 2018, and I saw movement on a number of sites on 1/4/18.

Drop on 12/26/17.

Increase on 12/26/17 with connections to previous updates.

Drop on 12/26 and more of a drop on 1/5/18.

Important Note: Google confirmed that they pushed multiple changes in a short amount of time in December. Actually, Danny Sullivan said, “several minor improvements”. Therefore, it’s really hard to pin down what each update targeted (since they undoubtedly overlapped in December). Also, some sites seemed to be impacted during multiple dates in December, and those sites often had a number of problems. I’ll cover what I’m seeing below, but the core point is to understand the problems overall (versus trying to connect one date with one set of problems). Also, Danny said they were “minor changes”. Based on the hits and surges I analyzed, they didn’t look minor. :)

Danny Sullivan confirms December changes in December of 2017.

More details about each update:

December 5, 2017 – Over-optimization, doorway-like pages.
Once I started digging into the data, I could clearly see movement on 12/5/17. And although many sites impacted by the 12/12 date were targeting many keyword permutations, I saw that a number of those sites were actually impacted starting on 12/5.

The reason I bring this up is because Barry Schwartz dug into a number of sites that claimed to have been hit by the Maccabees update on 12/12 and he saw a number of those sites were targeting many keyword permutations. He was right and many employed doorway-like pages. But based on the data I analyzed, I saw a number of sites employing that tactic hit on 12/5/17 as well.

Some were large-scale sites targeting many locations or keyword variations. I saw impact on a range of different sites, including large-scale directories, ecommerce retailers, affiliate sites, etc. So as part of that update, Google could have made an adjustment to how it algorithmically handles over-optimization, keyword stuffing, doorway pages, etc.

It was clear that the sites were targeting many different keyword variations on specific pages (again, doorway page-like). The pages were often over-optimized as well, with many keyword variations targeted in their title tags. For example, I found close to 800 different pages on one site with nearly the same title tag (all targeting slight variations in keywords). And many of those pages were thin and low-quality. When checking the site overall, I saw thousands of those urls indexed.

Here’s a quick visual of how those pages were laid out. Again, there were close to 800 of these on the site… And notice the blank area where main content should be. This is just one example, but I saw many during my travels while analyzing the 12/5 and 12/12 updates.

Sites targeting many keyword variations impacted on 12/5/17.
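
If you want to surface that kind of pattern on your own site, one rough approach is to group the title tags from a crawl export and flag large clusters of near-identical titles. The sketch below is a simple illustration: the CSV file name, columns, and thresholds are assumptions, and the grouping is intentionally crude.

```python
import csv
import re
from collections import defaultdict

def cluster_key(title):
    """Crude grouping key: lowercase, collapse whitespace, keep the first 40 characters."""
    t = re.sub(r"\s+", " ", title.lower()).strip()
    return t[:40]

# Assumes a crawl export with 'url' and 'title' columns (hypothetical file name and format).
clusters = defaultdict(list)
with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        clusters[cluster_key(row["title"])].append(row["url"])

# Flag clusters where many urls share a near-identical title (threshold is arbitrary).
for key, urls in sorted(clusters.items(), key=lambda kv: len(kv[1]), reverse=True):
    if len(urls) >= 50:
        print(f"{len(urls)} urls share a near-identical title starting with: {key}")
```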

December 12, 2017 – Maccabees Update. A mixed bag of over-optimization and quality problems.
The movement on December 12 was substantial and gained the attention of many site owners and SEOs. Barry Schwartz covered the update heavily on Search Engine Roundtable and he also dug into a number of sites that were hit on that day. As mentioned earlier, Barry saw a range of problems, but a number of sites were targeting many keyword variations (doorway-like). That’s very similar to what I saw on 12/5 as well.

While analyzing sites that truly dropped or surged during 12/12 (and not 12/5), you could see a range of problems on the sites that were impacted. Based on the data I have access to, some of those sites fell into the keyword permutations bucket (over-optimized and doorway page-like, just like Barry saw), while others exhibited many of the usual suspects from a quality standpoint. I also saw heavy affiliate impact: a number of affiliate sites prioritizing monetization over user experience were negatively impacted on 12/12.

And from a quality update standpoint, the term “quality” can mean several things from low-quality content to user experience barriers to aggressive and disruptive advertising, and even technical problems that cause quality problems. I have covered these issues many times in the past while writing about major algorithm updates.

So to me, 12/12 looked like Google pushed a few changes, including what many would call a “quality update”. Again, I’ve written about many of these updates over the years (including August 2017, May 2017, February 2017, and more). You can read those posts to learn more about what I was seeing then.

For example, check out the percentage of low-quality or thin pages I surfaced on this site which was hit on 12/12:

Percentage of low-quality and thin pages.

And here is an example of the layout for one affiliate site that was hit hard. The user experience was horrible. It was clear that monetization was the priority:

Heavy affiliate sites were impacted.

I also saw sites covering what the Quality Rater Guidelines describe as “Your Money or Your Life” topics (YMYL), yet they lacked what’s called E-A-T, or expertise, authoritativeness, and trustworthiness. Sometimes there were no authors listed, and when authors were listed, they had missing photos, weak bios, and exhibited no expertise in the subject matter (covering health, finance, and more). I believe some were fake. I even saw disclaimers stating that the site contained “general information” that was “not written by a professional” in that niche.

Here’s a quick screenshot from the QRG about E-A-T:

EAT in the Quality Rater Guidelines (QRG)

So if you were impacted on 12/12, I recommend reading my posts about quality updates and running through the Quality Rater Guidelines (QRG) several times. I surfaced all types of major quality problems on sites impacted by the 12/12 update (beyond the keyword variations problem).

Here are a few bullets directly from my analysis (there are many, this is just a quick list):

  • Surfaced over one thousand thin or near-empty pages.
  • Contains aggressive ads all over the site, including giant ads above the main content pushing the MC down the page.
  • Contains deceptive ads, not labeled, that look like main content, and drive users downstream to advertiser sites. Totally deceptive.
  • YMYL, but no E-A-T. Also contains aggressive and disruptive ads. Horrible user experience.
  • Low-quality affiliate site. Prioritizing monetization over UX. No surprise this was hit.
  • UX barriers galore, including a botched mobile UX. Hard to read the main content and traverse the site. I’m sure many are leaving the site quickly, especially mobile users.
  • And more problems that I simply don’t have enough room to include… 

Therefore, you need to objectively analyze your site through the lens of quality, and then make significant improvements over the long-term. John Mueller has explained this several times over the past few years when being asked about major algorithm updates. Don’t put a band-aid on the situation and expect to recover quickly. That simply won’t happen.

Here is John explaining this at 39:33 in the video:


And if you are employing many pages targeting many different keyword permutations, then you should revisit that approach. I would only publish pages when you have enough quality content to meet or exceed user expectations. And I would avoid publishing countless pages containing the same (or close to the same) content, but optimized for many different keyword variations. I could clearly see many sites that were negatively impacted employing that tactic. As Barry Schwartz documented in his post, and based on what I saw as well, there were many sites impacted that contained over-optimized pages.

December 15, 2017 – The Celebrity Update: Relevancy, user-happiness.
While analyzing both the 12/5 and 12/12 updates, I started to notice movement on 12/15 as well.  And that included serious impact to many official celebrity websites (with most of those sites dropping in rankings for the celebrity’s name).

Prior to 12/15/17, many official websites ranked number one for the celebrity name. And then boom, 12/15 arrived, and many of those sites dropped. And some dropped to the bottom of page one, while others dropped off page one. I saw this across many official celebrity sites and wrote a post covering my findings. I also heard back from Danny Sullivan on Twitter that he passed the information along to others at Google after reading my post (I’m assuming to the Search Quality team, Search engineers, etc.)

Tweet from Danny Sullivan about the 12/15 update.

For example, Tom Cruise, Charlie Sheen, John Lennon, Marilyn Monroe, and more all dropped on 12/15. Here’s an example of John Lennon’s official site ranking #1 before 12/15 and then dropping after 12/15. And check out the search visibility drop for his official site.

John Lennon's official site before 12/15.

Drop in search visibility for John Lennon's official site.

John Lennon's official site after 12/15.

Again, you can read my post about the 12/15/17 update to learn more about the algorithm update and to view more examples.

But why would these sites get hammered? Well, if you check many official celebrity websites, you’ll notice major quality problems. The sites are typically thin, don’t contain updated information, are sometimes just single-page websites, have serious performance problems, and more.

Therefore, it seems Google began treating those official celebrity websites just like any other site starting on 12/15. Which also means… Google wasn’t treating them the same way prior to 12/15. I think that’s the most important point about the update on 12/15. Don’t get too caught up in “celebrity X dropped in rankings”, but you should get caught up in “Google’s core ranking algorithm was treating some official celebrity sites differently than others prior to X date”. That’s the real news with the 12/15 update.

But not all celebrities were hit. I provided an example in my post about Katy Perry’s site, which retained the number one spot for her name. And when you check out her site, you’ll see a much stronger site than most official celebrity sites. There’s more content, it’s well-organized, it contains updated information, and so on. Dare I say that Katy “Roared” her way through the update? :)

Katy Perry's official site still ranking number one.

John Mueller’s Celebrity Response:
I was able to ask John Mueller a question about the update during the last webmaster hangout video. You can view John’s response below. He basically explained what I wrote above, which was that users might be looking for more, or different information, versus what those official celebrity websites offer. That might be information from Wikipedia, movie information from IMDB, celebrity news, or information from other sites. But again, it’s interesting that the official celebrity sites all ranked number one prior to 12/15.

December 18, 2017 – More relevancy adjustments, Tremor from 12/15?
While analyzing the 12/15 update, it seemed that Google wasn’t done yet. There’s nothing that screams “holiday spirit” like more algorithm updates as we approach Christmas Day. Yes, I saw yet another date with significant movement, which was 12/18/17.

While checking out the drops and surges on that day, I noticed they seemed closely tied to relevancy. For example, Google making sure that sites ranking prominently for queries should actually rank for them, and then adjusting the ones that were “overly-prominent” – basically, sites that shouldn’t have been ranking well for those queries dropped. Sounds very Panda-like, right?

Now, since 12/18 is so close to 12/15, it’s totally possible the impact was from the same update. It’s hard to say. And if you think about it, the celebrity update was about relevancy as well. For example, as John Mueller explained, making sure the right sites are surfaced based on what Google believes users want to see. So again, 12/15 and 12/18 could very well have been the same update (or connected). Remember, Google can push smaller tweaks after larger updates are released. I called those smaller updates “tremors” and John Mueller confirmed that with me back in the “medieval Panda” days.

Algorithm tremors.

December 26, 2017 – Manual actions AND algorithmic movement. And Giphy begins its long road back.
Remember I said Google used to avoid pushing updates close to the holidays? Well, you can’t get much closer than the day after Christmas! Also, there were reports of manual actions being handed out on Christmas Day and you could see the damage pretty clearly, like the screenshot below. Notice the big drop and then the quick recovery? That site was hit on 12/25 with a manual action for unnatural links and it only took three to five days to have the manual action revoked. And the site regained many of the keywords quickly (but not all). This was documented by the site owner.

December 25, 2017 manual action revoked.

And at the same time, I saw sites surging or dropping algorithmically starting on 12/26. Therefore, not only will Google push changes close to the holidays, they are pushing changes very close to the actual holiday (and even dishing out manual actions on Christmas Day!)

One site that caught my attention that was positively impacted on 12/26/17 was Giphy, which got hammered back in October. I shared the massive drop on Twitter when it happened. Google basically started deindexing their /search/ pages in October, which were ranking well and driving a lot of traffic.

Here was the massive drop for Giphy in early October 2017:

Giphy drops in early October 2017.

On 12/26, Giphy began the long road back to recovery. Note, they didn’t jump back to where they were… but they did surge. And the /search/ pages aren’t driving that surge. Instead, it looks like they are building out explore pages, while their gif pages saw an increase too.

Giphy surges on 12/26/17.

Here is a snapshot of rankings changes during the 12/26/17 update for Giphy:

Giphy rankings increase on 12/26/17.

Others saw movement as well on 12/26 (both up and down). And the second site below had been impacted by previous quality updates (February and May of 2017):

Big drop on 12/26/17.

Increase during the 12/26/17 update with connections to previous updates.

Important Side Note: Losing Featured Snippets and Rich Snippets – A sign of a quality update.
I’ve mentioned before that both rich snippets and featured snippets have a quality component. Therefore, sites can lose or gain them when Google refreshes its quality algorithms. Well, we saw more of that during the December updates.

For example, check out the site below, which was negatively impacted by one of the December updates. They had many rich snippets until the update, and then boom, they were removed. Nearly 18% of queries yielded rich snippets prior to the algorithm update and that dropped to just .27%. That’s insane. Video results also drastically dropped, but I haven’t dug into that yet for the site in question. But that’s an interesting side note.

Site loses rich snippets after Google algorithm update.

My recommendations for moving forward:
Moving forward, I have some recommendations for site owners that were impacted (and for those that believe they are in the gray area and susceptible to being hit by future updates). First and foremost, always look to hunt down quality problems, UX barriers, aggressive and disruptive advertising, and technical SEO problems. If you stay on top of that, and nip those problems in the bud, you have a much better chance at avoiding a major hit. And that includes surfacing over-optimization and doorway-like pages which were hit on 12/5 and 12/12. Make sure you are not employing any tactics that can cause serious damage. Google clearly made an adjustment from an over-optimization and doorway page standpoint, and many sites were left reeling.

Because if you don’t stay on top of quality problems, and those issues remain on a site for an extended period of time, then you can wake up to a scary situation. I’ve received calls from many companies over the years that fell into the latter category. It’s not pretty and I recommend avoiding that at all costs.

Here’s a quick list of items you can start on now. It doesn’t contain every item you should tackle, but it’s a heck of a starting point:

  • Crawl your site today, and then on a regular basis. Hunt down quality problems, technical SEO problems, and more. Fix them ASAP.
  • Analyze your site through the lens of quality. Make sure you are providing the best possible content and user experience for the topic at hand. Objectively evaluate content-quality from the standpoint of a typical user searching for queries on Google that lead to those pages.
  • Read the Quality Rater Guidelines (QRG) several times, review your site objectively, surface potential problems, and rectify those problems quickly. As I’ve said before, I’m seeing many connections between the QRG and what I’m seeing in the field while analyzing sites impacted by major algorithm updates.
  • Ensure your site works well across devices, and make sure you are ready for Google’s mobile-first index. Google is actively moving sites to its mobile-first index now and will continue to do so throughout 2018 (and beyond).
  • Make sure your ads don’t send users screaming from your site. And don’t deceive your users with ads woven into the content or ads that look like main content. Don’t put affiliate dollars ahead of user experience. That probably won’t work out well for you.
  • Remove UX barriers that inhibit people from accomplishing tasks on your site. Google doesn’t want to send people to a site that’s frustrating to use.
  • Perform user testing. You never know what real people think about traversing your site until you actually hear from them. You might be shocked what you find.

Summary – The Hornets’ Nest Is The New Norm
I don’t see the frequency of updates slowing down any time soon. As we’ve seen in both September and December of 2017, Google can, and will, push multiple updates in a short amount of time. And that’s without giving much information about the specific updates. I recommend reading my closing tips, which can help weed out quality problems. And I would do this even if you haven’t been hit by an algorithm update. Some proactive work now can help maintain the long-term health of your site.

Moving forward, I’ll continue sharing more of what I’m seeing volatility-wise here on my blog, on Twitter, Facebook, etc. And if history has shown us anything, I’ll be posting about the next wave of updates pretty soon.

Welcome to Google Land. Please keep your arms, legs, and websites inside the vehicle at all times. :)

GG

 

Filed Under: algorithm-updates, google, manual-actions, seo
