The Internet Marketing Driver


The May 17, 2017 Google Algorithm Update – Frequency of Quality Updates, Surfing The Gray Area, and Reversals

May 26, 2017 By Glenn Gabe 30 Comments


We’ve seen our fair share of major core ranking updates this year, with an update in early January, then the February 7 update, then Fred on March 7, and then more movement in late April and early May. And just a few weeks after the last update, we witnessed another big core update that rolled out on May 17, 2017.

Barry Schwartz was the first to report the update, and it wasn’t long before the impact was clear. I’ve seen substantial volatility from this update and it does look like yet another quality update. More on that soon.

Since 5/17, I’ve been digging into the update and analyzing sites that saw impact (both positive and negative). From a data standpoint, I have access to a number of sites that saw movement on 5/17, and I’ve also had a number of companies reach out to me after seeing drops or gains starting on the 17th. So there was plenty of data to dig into.

Below, I’ll describe what I’ve uncovered during my research, share some interesting cases, and offer some observations about how Google is rolling out quality updates. I’ll also cover what you can do now if you have been impacted (which shouldn’t be a surprise if you’ve read my previous posts about major core updates focused on quality).

Examples of Impact
Just like with other major core ranking updates focused on quality, I’ve seen a range of impact. For example, some sites dropped or surged 20-30%, while others saw much more movement. And several sites dropped heavily in early May, only to surge back on 5/17. For example, one site dropped by 83% on 5/7 and then fully recovered on 5/17. It’s been fascinating to analyze cases like that (there were several).

First, here are some examples of movement based on the update:

[Screenshots of movement starting May 17, 2017: an increase, a drop following a February drop, two surges, and another drop.]

You get the picture. When the update rolled out, there was significant movement across many sites globally.

Remember, real people are visiting your site. Respect them.
After digging into many drops, I saw the usual suspects when it comes to “quality updates”. For example, aggressive advertising, UX barriers, thin content mixed with UX barriers, frustrating user interface problems, deceptive ads, low quality content, and more.

If you’ve read ANY of my posts about previous major core updates focused on quality, then what I just listed should be very familiar to you. As Google collects fresh data and refreshes its algorithms, sites drop or gain. And if you’ve been pushing the limits from a user experience standpoint, good luck.

Since Phantom 2 in May of 2015, I’ve been saying, “hell hath no fury like a user scorned”. You should get a poster of that saying and hang it in your office. Actually, hang several of them. I’m not kidding.

For example, check out this page on a site that was negatively impacted by the 5/17 update:

Example of aggressive ads.

And on the flip side, there are sites that surged during the update that have been working hard to enhance the user experience, cut down on aggressive advertising, and boost content quality. And some have been working on this for a long time… so it’s great to see them increase during the May 17 update.

Remember, Google’s John Mueller said you need to significantly improve quality overall (and for the long-term). I’ve seen this first hand while helping companies impacted by these updates.

Side Note: Eighth Graders Know What’s Frustrating, Shouldn’t We All?
Every year I present to eighth graders at a local school about SEO, digital marketing, Google, etc. And every year I ask them if they’ve encountered sites where it’s hard to find the content (due to ads, popups, and other aggressive monetization tactics). Well, this year they all raised their hands and groaned in agreement at the same time.

THOSE ARE YOUR VISITORS.

And Google is trying to make them (and others like them) happy. So if eighth graders know what’s acceptable usability-wise and what’s not, shouldn’t we all? That includes c-level executives, monetization teams, marketing teams, and content teams.

So read the Quality Rater Guidelines, share the PDF with everyone on your team, have meetings to cover the guidelines, and internalize what Google deems low versus high quality. I’ve seen many connections between what’s contained in the QRG and what I’m seeing in the field. Beware.

Here’s a screenshot from the Rater Guidelines about distracting ads. Keep in mind, this is directly from Google:
Quality Rater Guidelines 2017

Below, I’ll cover a few interesting examples from my analysis of the update. I can’t cover everything I’ve seen or this post would be fifty pages long. So I’ve surfaced some of the more intriguing examples.

Major Reversals – Fine Tweaks or Collateral Damage?
I can’t cover this update without mentioning a very interesting case. There’s a site that was algorithmically smoked on 5/7 (around the 5/4 update) that lost 83% of its Google organic traffic overnight. Yes, 83%. It was a horrible hit, and one of the worst I have seen. After the initial hit, the company reached out to me to let me know what was going on, explained what they thought it could be, etc.

After reading about updates like this, they decided to change how sponsored links were handled across the site and refined how advertising was displayed in general (although it doesn’t appear that their ads were overly aggressive from a UX standpoint). Then they waited… and on 5/17, almost all of their rankings and traffic returned. Here is a screenshot of the drop and the reversal.

Reversal during the 5/17/17 update.

Now, I’ve covered how long it takes to typically recover from a major core ranking update focused on quality, and that’s usually not just a few weeks. It typically takes months. In addition, I’ve seen reversals before based on Google tweaking an algorithm to ensure we are seeing the best possible results in the SERPs. And those tweaks can sometimes yield complete reversals.

Like these reversals from past algorithm updates:

[Screenshots: two reversals during past algorithm updates.]

So, did the changes the company implemented after getting smoked cause the recovery during the 5/17 update or did Google simply adjust the algorithm? The only way to test this out would be to reverse their changes and see how it goes. But I definitely don’t recommend doing that… since the changes they implemented were the right ones. Anyway, just an interesting case.

Surfing the Gray Area of Quality
There was another interesting example I’ve been analyzing. It’s a site that got hammered by the June 2016 update and then surged during the February 2017 update. The site definitely did some work from an aggressive monetization standpoint, but probably just enough to creep out of the gray area. I call this “surfing the gray area of quality” because at any point, the wave could crash and you can fall back into the black.

Surfing the gray area of quality.

During the May 17 update, the site lost close to 20% of its Google organic traffic. It’s in a very competitive niche and dropped in rankings for a number of important queries. So, this was clearly an adjustment after the surge in February. I’m working with them now to identify all potential quality problems, so it will be interesting to see how they progress during future quality updates.

Drop of 20% during May 17 update.

In-depth content paying off.
There’s another site I’ve helped extensively over the years that has been working hard on creating killer content on their blog. It’s an ecommerce site in a competitive niche and they decided to invest heavily in content development (including video).

During the May 2017 update, they jumped 21% overall, and the blog content really surged (jumping 37%). When checking blog versus non-blog content, you can really see the difference in the surge. So it’s interesting to see their hard work pay off from a content development standpoint.

Blog surge during May 17 2017 update.

A Splattering of Ectoplasm and a Sprinkle of Panda
When analyzing sites that were impacted, you could see relevancy adjustments show up again (just like with previous quality updates). I’ve mentioned this before in previous posts about major core ranking updates focused on quality. For example, sites dropping in rankings when their content couldn’t really live up to user expectations.

And remember, that’s exactly how Google has explained Panda ever since it became part of Google’s core ranking algorithm. Here’s a video of Gary Illyes explaining this. So, was Panda prepped again and rolled out at the same time other quality algorithms were refreshed? That’s totally possible, but hard to pin down. All I can say is that I saw relevancy adjustments many times during my travels while analyzing the 5/17 update.

Frequency of Quality Updates – Google’s quasi human is progressing…
Ever since the fall of 2016 when we saw a number of updates in a short period of time, I’ve been hinting that Google might be increasing the frequency of “quality updates”. And 2017 has followed that pattern. Here are the core ranking updates focused on quality that I have seen since the beginning of 2017:

  • January 5, 2017
  • February 7, 2017
  • March 7, 2017 (Fred)
  • April 26, 2017
  • May 4, 2017
  • May 17, 2017

Notice anything interesting? We’ve gone from an update every few months to once per month (and sometimes multiple times per month). That’s not surprising, since Google’s ultimate goal is to NOT roll out updates like this all at once. Instead, I’m sure they would love to have these updates continually rolling. Yes, like Panda of the present, which is very different from old-school Panda (which rolled out on one day and caused mass hysteria).

I’m sure I’ll be covering this more in future posts, but it’s worth noting that Google is upping its game from a frequency standpoint. On the one hand, that’s great for sites that have been negatively impacted. And on the other hand, that means sites can be impacted more often (and not just every few months).

So heads-up… it’s not unusual to see impact like what’s shown below. That’s when a site sees impact during a number of quality updates (since it’s in the gray area quality-wise). That’s why it’s so important to get out of the gray area.

Connection between quality updates.

What You Can Do Now:
I’ve already covered this extensively while writing about quality updates, but I’ll list it again below. If you’ve been negatively impacted by a major core ranking update focused on quality, then:

  • Analyze your site objectively from both a content quality and user experience standpoint. Run a crawl, or several, to get a solid look at your entire site. Then analyze that crawl through a quality lens.
  • Improve low quality content, nuke thin content, or noindex pages that shouldn’t be in Google’s index. If they aren’t indexed, they can’t hurt you. Focus on “quality indexation”.
  • Tone down aggressive advertising. If you annoy your users, Google can pick that up. And if Google sees this in aggregate, you can get smoked. Beware.
  • Fix all major user experience problems on the site. Go through your site like a user would. If there are any issues that inhibit users achieving a goal, fix them. And fast.
  • Hunt down technical problems that can be causing either content or UX problems. For example, major fetch and render problems, tech glitches that can cause thin content, performance problems, and more. Technical SEO is extremely important to nail down. So don’t overlook what’s under the hood.
  • Continually work to add high quality content to your site. Make sure what you’re publishing is unique, relevant, and will satisfy user needs. If you can’t do that, don’t publish it. And boost the quality of content already on your site (if you feel some of it is lacking). Check the queries leading to your content and make sure it can meet or exceed user expectations. If it can’t, enhance it.
  • Understand that you probably won’t recover quickly. John Mueller has said many times that Google needs to see significant improvement in quality overall, and over the long term. So keep fighting your way back. You can recover, but it’s not a sprint. It’s a marathon.
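To make the first two bullets concrete, here’s a minimal sketch of flagging thin, indexable pages from a crawl export. The URLs, field layout, and the 250-word cutoff are all illustrative assumptions (not from this post); crawlers export similar fields you can analyze through a quality lens.

```python
# Hypothetical crawl-export rows: (url, word_count, indexable).
# URLs, fields, and the threshold below are illustrative assumptions.
crawl = [
    ("/blog/guide-to-seo", 1850, True),
    ("/tag/misc", 42, True),
    ("/search?q=widgets", 15, True),
    ("/blog/case-study", 1200, True),
]

THIN_THRESHOLD = 250  # illustrative cutoff for "thin" content

# Indexable pages with very little content are candidates to
# improve, consolidate, or noindex ("quality indexation").
thin_pages = [url for url, words, indexable in crawl
              if indexable and words < THIN_THRESHOLD]

print(thin_pages)  # ['/tag/misc', '/search?q=widgets']
```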

Summary – Will the frequency of quality updates continue to increase?
Based on the (new) frequency of updates, I’m wondering if we’ll see another quality update soon. Google seems to be pushing these updates monthly now (and we’ve even seen multiple quality updates per month in some cases). Again, that’s good for those that are impacted, but could mean a lot of volatility for other sites. Stay tuned. It could be a hot summer algo-wise.

GG

 

Filed Under: algorithm-updates, google, seo

Beyond the UI – How to filter Google Search Console (GSC) data using regular expressions in Google Analytics (GA)

May 15, 2017 By Glenn Gabe Leave a Comment

Filtering GSC data in GA.

Update: April 2021
Google has finally rolled out filtering via regular expressions (regex) directly in Google Search Console! That’s great news and it’s something the SEO community has been requesting for a long time. You can access the filters in the Performance reporting directly in GSC. You can still do this in Google Analytics like this post explains when you connect Search Console to GA, but you can now also use regex right in GSC.

Regex filtering in Google Search Console (GSC).

———————-
How To Filter GSC Data In Google Analytics Using Regular Expressions (regex):
Yes, it’s possible to filter GSC data using regex. Like many of you, I’ve been asking for regex support in GSC for a long time. But as we know, the Search Analytics reporting unfortunately only provides basic filters for queries and landing pages. In addition, you can’t export more than one thousand rows in the GSC UI, which is limiting. And… you can’t export just the filtered data (which is a pain in the neck). It’s maddening.

But I’m here to show you a different way to slice and dice GSC data. And it’s been right under your nose the entire time. In this post, I’ll explain how to use regex to filter GSC data by using Google Analytics. And if you follow my Search Engine Land column, then you know I just wrote a post about filtering crawl data using advanced filters. So, you can consider this the sequel to that post. They say the sequel isn’t as good as the original, so let’s see if I can change that. Let’s rock and roll.

Requirement – Connecting GSC and GA
First, in order to do this, you need to connect Google Search Console to Google Analytics. To be honest, you should have done this already, but if you haven’t yet, better late than never. :)

I won’t cover how to connect the two, since there is solid documentation about how to do that. Once you do, you’ll begin seeing historical GSC data in GA soon after. Then you can start using regex in GA like I’m about to show you below. Note, you will still only be able to see the past ninety days of data. I’m hoping that changes at some point in the future, but for now we’re limited to ninety days.

Connecting GSC and GA.

Using regex in GA – it’s available and ready to go.
Once you connect GSC to GA, you will see the reporting under the Acquisition bucket of reports. There is a section labeled Search Console with a “NEW” tag. Within that set of reports, you will see landing pages, countries, devices, and queries. We’ll focus on queries and landing pages reports for this post.

GSC reporting in Google Analytics.

Let’s start by accessing the queries report. Once you click the link in the left navigation, you’ll see your queries listed in the report. Make sure to adjust the date to show you the past ninety days.

GSC Queries report in Google Analytics.

Now we’re ready to slice and dice data via regex. In order to filter the report using a simple text search, you can easily enter some text in the filter box. But don’t stop with a simple text filter. That’s the same type of lame filtering you get in GSC. Let’s take this a step further and use regex. Click the “advanced” link next to the search box to access advanced search filters.

Advanced filtering in Google Analytics.

After clicking the “advanced” link, make sure to keep “Include” and “Search Query” for the first two options, but you should change the next dropdown that defaults to “Containing” to “Matching RegExp”. That will enable you to enter a regular expression versus a simple text search.

Using regular expressions in Google Analytics.

Simple example: Filter queries using pipes (OR)
One of the easiest, yet still powerful, regular expressions uses the pipe character. A pipe (|) represents “or” in a regex. So if you were trying to filter the list of queries by three different text searches, you could enter them like this:

queryA|queryB|queryC

And when you do, GA will list only the queries in the report that match any of those conditions.

Using pipe characters in regex.
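GA’s “Matching RegExp” option behaves like an unanchored regex match, so a pipe-separated pattern keeps any query containing at least one of the terms. A quick Python sketch of the same logic (the queries and terms are made up for illustration):

```python
import re

# Sample queries, illustrative only.
queries = [
    "best running shoes",
    "trail running tips",
    "buy hiking boots",
    "marathon training plan",
]

# Pipe-separated terms keep any query containing at least one of them.
pattern = re.compile(r"shoes|boots")

matched = [q for q in queries if pattern.search(q)]
print(matched)  # ['best running shoes', 'buy hiking boots']
```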

And as you can guess, you don’t need to limit the queries to simple text. You can use any regular expression for each entry in your statement. For example, let’s filter by any query that starts with query A or ends with query B.

Using regex to filter queries that start with or end with keywords.

You can go crazy here, and your own regular expressions will be based on your own analysis. The key point is that you can slice and dice GSC data in GA to your heart’s delight.

Layer on average position.
The other benefit you get when using GA to filter GSC data is the ability to layer on dimensions and/or metrics. Under your GA filters, you’ll see an option for “Add a dimension or metric”. When you click that dropdown, you can add other GSC metrics like Clicks, Impressions, CTR, and Average Position. So you can create more advanced filters quickly and easily.

Below, I’ll add a metric to a pipe-based regex to show us any query that matches the text we enter AND has an average position of less than 5.

Layering average position on regex filtering.
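Conceptually, the layered filter is just a regex match combined with a numeric comparison. A sketch under assumed sample data (the queries, positions, and clicks are invented for illustration):

```python
import re

# Illustrative rows: (query, avg_position, clicks).
rows = [
    ("seo audit checklist", 3.2, 120),
    ("technical seo guide", 8.5, 40),
    ("seo tools", 4.1, 300),
    ("content marketing", 2.0, 90),
]

pattern = re.compile(r"seo")

# Mirror GA: include rows matching the regex AND with avg position < 5.
filtered = [(q, pos) for q, pos, _ in rows
            if pattern.search(q) and pos < 5]
print(filtered)  # [('seo audit checklist', 3.2), ('seo tools', 4.1)]
```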

Layer on clicks and/or impressions.
And you can do the same with clicks or impressions. This will enable you to filter specific queries that also have a minimum or maximum number of impressions or clicks. For example, how about queries that contain certain keywords plus have at least 500 clicks over the past ninety days:

Adding filters for clicks and impressions in GA.

Add exclude vs. include to create advanced filters.
Previously, we kept “Include” as the first option for including the filters we were applying. But you can switch that to “Exclude” to hone your reporting. The exclude option when filtering by regex is powerful and enables you to drill down by excluding queries or landing pages that match your regex. I also mentioned this technique in my Search Engine Land post when explaining how to filter crawl data using DeepCrawl.

Excluding regex to hone reporting.

You can also filter landing pages using regex!
You can use a similar approach for filtering landing page data. If you want to find out if certain page types are gaining impressions and clicks, then you can use regex to surface them. For example, imagine you wanted to find all urls that have a certain parameter.

Filtering landing pages using regex in GA.

Or how about any landing page url containing certain keywords that have more than 3000 impressions?

Filtering landing pages by query and impressions.
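The landing page examples above boil down to a URL regex plus an impressions threshold. A sketch with invented landing pages and numbers, using a regex for URLs that carry a query-string parameter:

```python
import re

# Illustrative rows: (landing_page, impressions).
pages = [
    ("/products?color=red", 5200),
    ("/products", 12000),
    ("/blog/post?utm_source=news", 800),
    ("/category?page=2", 4100),
]

# Regex for URLs carrying a query-string parameter.
param_pattern = re.compile(r"\?.+=")

# Keep parameterized URLs with more than 3000 impressions.
filtered = [(url, imp) for url, imp in pages
            if param_pattern.search(url) and imp > 3000]
print(filtered)  # [('/products?color=red', 5200), ('/category?page=2', 4100)]
```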

You get the picture! The possibilities are endless.

Hunting featured snippets
Let’s go hunting for featured snippets by filtering the report by certain queries and then entering an average position of less than two. Featured snippets show a position of #1 in GSC since they hold the top spot, which is covered in Google’s help documentation about clicks and impressions. But entering “Less than 2” rather than exactly one helps you capture more queries that might be yielding a featured snippet, since the query will not always rank number one. Here I’m filtering by “how to” or “what is” queries with an average position of less than two.

Filtering for featured snippets in GA.
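The snippet-hunting heuristic above can be sketched the same way (the rows are illustrative, not real data):

```python
import re

# Illustrative rows: (query, avg_position).
rows = [
    ("how to fix crawl errors", 1.0),
    ("what is a canonical tag", 1.4),
    ("seo services", 3.0),
    ("how to write meta descriptions", 6.2),
]

# "how to" or "what is" queries with average position under 2 --
# the heuristic described above for likely featured snippets.
pattern = re.compile(r"how to|what is")

candidates = [q for q, pos in rows if pattern.search(q) and pos < 2]
print(candidates)  # ['how to fix crawl errors', 'what is a canonical tag']
```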

Export it!
As I mentioned earlier, the GSC UI limits your exports to just one thousand rows. That’s extremely limiting for many sites. In GA, you can export five thousand rows at a time by using the dropdown at the bottom of the report. In addition, you can export in stages by clicking the next page in the pagination and then exporting another five thousand rows. It’s not optimal, but better than an absolute limit of one thousand rows.

Exporting reports using advanced filters in GA.
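Stitching the staged exports back together is straightforward with a short script. A sketch using two tiny in-memory “files” standing in for the 5,000-row CSV exports:

```python
import csv
import io

# Two staged exports would be combined like this; here each "file"
# holds just a couple of illustrative rows instead of 5,000.
export_1 = "query,clicks\nseo audit,120\nseo tools,300\n"
export_2 = "query,clicks\nsite migration,90\n"

rows = []
for blob in (export_1, export_2):
    reader = csv.DictReader(io.StringIO(blob))
    rows.extend(reader)

print(len(rows))  # 3 combined rows
```

With real files, you’d pass open file handles to `csv.DictReader` instead of `io.StringIO` blobs.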

Summary – Regex is available for GSC data (with the help of GA)
If you’re like me, then you’ve wanted regex support in GSC for a long time. But by using GA, you can essentially get that regex support, but just not in GSC proper. If you need to drill into your data, slice and dice query and landing page data, etc., then I recommend using the approach I listed above. I think you’ll dig it. Now brush up on your regex skills and dig in. :)

GG

Filed Under: google, google-analytics, seo, web-analytics


Copyright © 2023 G-Squared Interactive LLC. All Rights Reserved.