The Internet Marketing Driver

Archives for August 2018

Meet Newsguard, A Team Of Quality Raters For News Publishers – And Another Way To Check Site Trust, Credibility, and Transparency

August 30, 2018 By Glenn Gabe

Newsguard for news publishers.

Update: April 2022
I just published a post explaining how sites can use NewsGuard’s nutritional labels to avoid manual actions for violating Google’s medical policy (for News and Discover). This is based on helping sites that received manual actions in January of 2022.

—–

Partly due to the August 1 Google algorithm update, site owners and SEOs have been intensely focused on building and demonstrating E-A-T (expertise, authoritativeness, and trust). The concept of E-A-T has been documented in Google's quality rater guidelines (QRG) for years, but the recent update did seem to turn up the volume from an E-A-T perspective (especially for sites focused on health and medical topics). That's why Barry Schwartz decided to name it the Medic Update.

Keep in mind that many different types of sites were impacted by the August 1 update and Google even confirmed they didn’t target sites that could be categorized as Your Money or Your Life (YMYL). Also, it’s important to know that you can’t build E-A-T quickly. I’ve mentioned that in my previous posts about major algorithm updates. You can, however, build strong E-A-T over time if you are doing the right things (building great content written by experts in your niche, using Social to reach a targeted audience, naturally building links from well-known sources, etc.)

Regarding the Quality Rater Guidelines (QRG), Google’s quality raters are asked to evaluate its algorithms, and E-A-T comes up many times in the 160-page guide. Note, raters cannot directly impact the rankings of any specific site, but their feedback is sent to the engineers, who then refine their algorithms. Therefore, they sure can impact rankings indirectly.

Based on what I explained above, site owners tend to have a few important questions:

  • Are my site and content trusted?
  • Are my writers perceived as experts in a niche?
  • Which specific factors on my site would cause users (and Google) to not trust the site?
  • Is there any way to see ratings for my site?

When helping companies that have been significantly impacted by a major Google algorithm update, it's important to review the site overall, identify all problems that could be causing issues, and form a remediation plan for fixing those problems as quickly as possible. And helping companies determine perceived E-A-T is important. For example: do you have expert authors? Are you deceiving users? Does your content exhibit a high level of authoritativeness?

The topic is extremely nuanced, since each niche category is different. For example, a medical site is different from a financial site, which is different from a dating website. And those sites are much different than coupon sites or lyrics websites. Again, there are many different types of sites on the web. And to address the last question from above about actually seeing ratings, there’s no way to see what raters thought of your site while evaluating Google’s algorithms.

That’s unless you’re a news publisher… Did I get your attention? :)

Well, you can’t see what Google’s quality raters think, but you can see view what Newsguard’s team of analysts think. And that can be a good proxy for understanding issues your site has regarding trust, credibility, and transparency.

Newsguard – Publicly evaluating trust and transparency of news publishers
Based on what I explained above, it’s great when you can get objective third parties to review a site (almost like your own version of Google’s quality raters). You can do that on your own, but there’s a good amount of work to do in order to run effective user studies. That’s why it’s great when you can find public sources of information from other services that can yield strong feedback about your site. That’s exactly what Newsguard is doing (for news publishers).

On its site, Newsguard explains that it aims to fight fake news, misinformation, and disinformation. I’ll explain more about Newsguard in the next section, but think of it as a way for the average person to quickly identify organizations that are adhering to important journalistic standards, and not participating in misleading the public.

Newsguard has a Chrome plugin that you can install that provides a quick way to see ratings for news publishers across Search and Social Media. You can also see ratings when visiting the sites in question. As soon as I read an article about Newsguard, I had to check it out. And after using it, I found it fascinating to receive objective feedback about the trustworthiness of various news organizations. It had me wishing there was a Newsguard for every category of website.

How It Works – Criteria and Nutrition Label
Newsguard's analysts, who are trained journalists, objectively rate news publishers based on nine criteria broken down into two major categories: credibility and transparency. Each criterion is weighted and worth a certain number of points. If a site receives 60 points or higher, it gets a green label (which is displayed in the browser, in the Search results, and across Social Media sites). If it falls below 60, it gets a red label. There are also other labels for user-generated content, satire and humor sites, etc.

For example, here’s what the plugin looks like in the browser window:

Newsguard in Chrome.

And here’s what it looks like in the Search results:

Newsguard in the search results.

And here’s what it looks like across Social Media sites:

Newsguard on Facebook.

And you don’t just receive basic feedback (like the score). You can see a full “nutrition label” to learn more about why the site received that score, what the analysts listed as reasons for the scoring, etc. Newsguard explains they are completely transparent about why a site received a certain score. They also reach out to the sites being reviewed for more information and will publish that too. And if a score needs to be adjusted based on new information, Newsguard will document that information as well. You can read more about their corrections policy on the site.

For example, here’s an example of a nutrition label for CNN:

Newsguard nutrition label for CNN.

And you can read the full rating with information from analysts:

Newsguard full ratings for CNN.

9 Factors Evaluated By Analysts
You can read more about the criteria on the Newsguard website, but they look at a number of factors that are important for any news publisher (and it’s important to note that there’s overlap with certain parts of Google’s quality rater guidelines).

For example, from a credibility standpoint, does the site:

  • Publish false content.
  • Gather and present information responsibly.
  • Correct or clarify errors.
  • Handle the difference between news and opinion responsibly.
  • Avoid deceptive headlines.

And from a transparency standpoint, does the site:

  • Disclose ownership and financing.
  • Clearly label advertising.
  • Reveal who is in charge, including conflicts of interest.
  • Provide information about content creators.

Again, each factor is scored and weighted, with 60 points being the difference between a green label and red label. You can see the weighting below:

Newsguard criteria and points system.
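For those curious about the mechanics, here is a minimal sketch of how a weighted, threshold-based rating system like this works. The criterion names mirror the nine factors above, and the 60-point green/red cutoff comes from the description above, but the point values are illustrative placeholders rather than Newsguard's published weighting.

```python
# Minimal sketch of a weighted, threshold-based rating system like Newsguard's.
# The criterion names mirror the nine factors above; the point values are
# illustrative placeholders, not Newsguard's published weighting.

CRITERIA_POINTS = {
    "does_not_publish_false_content": 22.0,
    "gathers_and_presents_information_responsibly": 18.0,
    "corrects_or_clarifies_errors": 12.5,
    "handles_news_vs_opinion_responsibly": 12.5,
    "avoids_deceptive_headlines": 10.0,
    "discloses_ownership_and_financing": 7.5,
    "clearly_labels_advertising": 7.5,
    "reveals_whos_in_charge_and_conflicts": 5.0,
    "provides_info_about_content_creators": 5.0,
}

GREEN_THRESHOLD = 60.0  # 60 points or higher earns a green label; below 60 is red

def score_site(criteria_met: dict) -> tuple:
    """Sum the points for each criterion the site satisfies and assign a label."""
    total = sum(pts for name, pts in CRITERIA_POINTS.items() if criteria_met.get(name))
    label = "green" if total >= GREEN_THRESHOLD else "red"
    return total, label

# Example: a site failing the two most heavily weighted credibility criteria
# lands exactly at the 60-point threshold with these placeholder weights.
example = {name: True for name in CRITERIA_POINTS}
example["does_not_publish_false_content"] = False
example["gathers_and_presents_information_responsibly"] = False
print(score_site(example))  # (60.0, 'green')
```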

And for those of you reading this post who are involved in SEO, your QRG antennae might have gone up several times while reading those bullet points. For example, they mention deceptive advertising, bios for content creators, deceptive headlines, and more. Again, wouldn't it be great to have this objective feedback as a starting point for every site on the web?

Examples of Site Ratings:
Below, I’ll provide a few examples so you can quickly see two sites on opposite ends of the spectrum. One receives very strong ratings, while the other sits below the 60-point threshold.

The New York Times (view the page on Newsguard.com):

Newsguard ratings for The New York Times.

A news publisher with 6 major categories of content, including YMYL topics, that received a red badge (less than 60 points):
Newsguard nutrition label for news publisher with less than 60 points.

Not listed? Submit your site.
As you can guess, the major limitation here is that this is a very manual process for Newsguard. That’s why I came across a number of news publishers that were in the process of being reviewed, or not even in the system yet. I don’t know how long it takes analysts to review each site, and then how long it takes for editors to review those ratings, and then for the co-CEOs to make a final review, but the process doesn’t seem quick. I’m not saying it should be quick, but I can see them getting swamped with requests very quickly. You can read more about the review process on the site.

So if you don’t see a rating for your news site, then feel free to submit it to Newsguard. I submitted a few dozen during my testing. It should be interesting to see how long it takes for those submissions to be reviewed. I’ll share more on Twitter when I see some of those submissions finally receive scores.

Here’s an example of a site in the process of being reviewed:

Site not reviewed yet.

A Note About Google’s Major Core Algorithm Updates – How do sites with failing grades fare during major updates?
While testing Newsguard across the web to check news publishers, I kept wondering how good of a proxy it was for Google's quality rater guidelines (and how that would manifest itself during major algorithm updates).

For example, would failing sites also see negative movement during broad core updates? Let’s face it, publishing false information, providing deceptive advertising, using deceptive headlines, and more certainly are things that could cause algorithmic problems. But remember, Newsguard is only tackling a piece of the quality rater guidelines (and clearly not looking at everything Google is looking at). There are 160 pages in the guidelines and Newsguard focuses heavily on trust, credibility, and transparency.

But I was still interested in seeing the correlation. I’m not saying a failing Newsguard grade would mean you are going to tank during a major Google update, but it sure can provide some important feedback from Newsguard analysts about how your site is perceived. I plan to dig in more to see the connection between low ratings and algorithmic hits, but for now I just checked some news publishers with failing scores. I provided two examples below.

This news site received a failing grade and had previously been hammered by multiple major algorithm updates over the past two years. But they did work hard recently on fixing many issues on the site and ended up surging during the August 1, 2018 update. Note that the ratings were completed before that surge happened. I'm recommending that the site contact Newsguard to provide additional information about their recent changes (and to see if they can get a fresh review).

A false positive from Newsguard.

And here’s a site that got hammered by the May 2017 update and has never come back. I haven’t analyzed this site heavily, but you can quickly see many serious issues when visiting the site. It’s pretty clear there are quality issues, UX barriers, and aggressive advertising problems throughout the site. A failing rating is justified here in my opinion, and they did get hammered by a major algorithm update.

Newsguard ratings for site hit by major algorithm update.

Summary – Free quality rater feedback from Newsguard
With many marketers now heavily interested in Google’s quality rater guidelines and E-A-T, I found it interesting to learn more about Newsguard, the criteria they use to rate sites, and to see how various news publishers fared with their scoring system.

At a minimum, news publishers should review their scores and look to improve, where possible. Again, it’s like seeing what Google’s quality raters think of a site without having access to the real quality raters. I believe having a public resource like this can be extremely valuable for site owners, journalists, and SEOs. I just wish there was a Newsguard for all sites. Hey, maybe there will be some day.

GG

Filed Under: google, seo, tools

Analysis and Findings From The August 1, 2018 Google Algorithm Update – A Massive Core Ranking Update

August 9, 2018 By Glenn Gabe

The August 1, 2018 Google Algorithm Update (The Medic Update)

{Update July 2022: I just published my post about the May 2022 broad core update. In my post, I cover five micro-case studies covering drops and surges based on the update.}

—–

On August 1, 2018 Google rolled out one of the biggest algorithm updates I’ve ever seen. It was huge and many sites saw significant volatility across categories and countries. The 8/1 update followed similar broad algorithm updates in March and April of 2018, which also caused significant volatility across the web. I wrote a 2-part series covering those spring updates, which you should definitely read. Google’s Danny Sullivan confirmed this was a broad core ranking update just like those updates.

Many sites across the web have been impacted by the August 1 update and Google estimated that it would take over a week to fully roll out. That seems to be correct, as a number of sites saw much more impact Monday into Tuesday of this week. Also, Google confirmed yesterday that the update has fully rolled out, so it did take about a week.

I’ve been heavily analyzing the update since it rolled out and will cover a number of important topics in this post. Specifically, I’ll cover confirmed surges, including some of what those companies have completed after being negatively impacted in the past. I’ll cover fresh hits and what I’m seeing across those sites that were negatively impacted. And I’ll also cover important information from Googlers about this type of algorithm update (some of which has not been covered yet since the update rolled out).

So strap your battle helmets on. Let’s take a look at Battle Ground Google, The August 2018 Edition.

Significant AND Long-term + Relevance AND Quality
I help a lot of companies that have been negatively impacted by major algorithm updates. They are typically large-scale sites, but I do help some small to medium sized sites as well. The sites I’m helping have typically lost a significant amount of search visibility, rankings, and traffic during major updates, and sometimes have dropped over several updates (making the situation even worse).

As I’ve said in the past many times, broad core ranking updates take many factors into account. There’s never one smoking gun. Instead, there’s typically a battery of smoking guns. Actually, I tweeted that soon after this update rolled out based on what I was seeing across sites that were impacted. And it was awesome to see John Mueller retweet that! That makes sense, since John has communicated similar things a number of times during webmaster hangout videos.

John Mueller Retweet August 1, 2018 Algorithm Update

Google is evaluating relevance and quality at the site-level. And "quality" can mean a number of things. For example: content quality, user experience (UX), aggressive, disruptive, and deceptive advertising, technical SEO problems that cause quality problems, and more. Therefore, it's important to fully understand all of the problems with your site and then form a serious plan of attack for remediation. Remember, Google is on record explaining they want to see significant improvement in quality over the long-term.

You will not recover by putting band-aids on the situation. You will not recover in a few weeks, or a month or two. You must implement significant changes to improve your site and make sure they stick around for the long-term. Don’t roll back changes after a few weeks or a month or two. That may not be long enough for Google… Note, I’ll cover surges soon and what my clients worked on to improve their sites (and how long that took).

Relevance, Which *Includes* Quality Overall – From John Mueller
After the Brackets Update on March 7, John explained that the update had a lot to do with relevance, and the industry went wild with that comment. But after analyzing many sites and helping a number of sites recover during the March and April updates, I explained that it seemed quality was a part of the updates as well. It was easy to see that after analyzing many sites impacted in both March and April. I wrote about this in my 2-part series about the updates in case you want to read more about that.

Well, it was awesome to hear John explain that relevance INCLUDES quality overall! And that includes how you present your website, like sites that provide ads above the fold. Yep, he said ADS. :) He explained this in another webmaster hangout video. I cheered from my office, literally. Here’s the video:

That made complete sense based on what I was seeing across sites, so it was great to hear him clarify his original statement about relevance.

A note about Your Money or Your Life (YMYL) and E-A-T
Many people have noted the number of YMYL sites impacted during this update, and specifically, health and medical sites. Actually, Barry Schwartz heard from so many health site owners that were negatively impacted that he called it the Medic Update.

I agree that many health and medical sites were impacted during this update, but they weren’t alone! I have a spreadsheet of over 210 sites impacted by the August 1 update and many are NOT YMYL sites. For example, there are entertainment sites, coupon sites, lyrics sites, e-commerce sites, sites containing online games, and more. So yes, it does look like Google made adjustments that would impact YMYL sites, but the update didn’t isolate those sites.

For example, here is a lyrics site that was impacted and a coupon site that was impacted:
A large-scale lyrics site that dropped heavily during the August 1 update:

A lyrics website that dropped during the 8/1/18 Google algorithm update.

A large-scale coupon site that surged during this update:

A coupon site that surged during the 8/1/18 Google algorithm update.

But YMYL sites are extremely important to Google. They are held to a higher standard, since they can “impact the future happiness, health, financial stability, or safety of users.” Therefore, it makes complete sense that Google would always look to enhance its core ranking algorithm with regard to E-A-T (expertise, authoritativeness, and trust), site reputation, author reputation, etc. It does look like Google refined how this works in the latest core ranking update on 8/1, but again, there were many sites outside of YMYL that were impacted. It’s important to know that.

With that out of the way, let’s dig into some surges.

Surges – The Long And Tough Road Back
The first client I’m going to cover is in the entertainment space. They have been impacted by a number of updates over the years and were destroyed by the June 2016 update (which was a huge algorithm update).

When they contacted me, they explained that they had tried to fix a number of things, but they thought they could be missing some important items. They were simply not recovering during subsequent algorithm updates. So, I began a long-term engagement to help them root out all quality problems and form a strong remediation plan.

My audit and crawl analysis uncovered many problems across the site. In total, I sent 84 pages of findings in Word during the entire engagement covering everything from low-quality content to aggressive and deceptive advertising to affiliate problems to technical SEO problems to performance issues and more. There were many, many things wrong with the site from a quality standpoint. It’s always interesting to go back and see how many findings were sent when analyzing a large-scale site with many quality problems. And it’s important for other sites to understand that too. There’s rarely a single issue causing massive drops during core ranking updates.

My client's team worked hard to implement those changes, including having the always-tough conversation with the monetization team. When you're down, the knee-jerk reaction is to increase advertising to make up for the loss of traffic. But I was telling them to drastically tone down the advertising, which was over-the-top from a disruptive and deceptive standpoint. The conversations were tough, but they did tone it down greatly.

They implemented a boatload of changes over four to five months, but remember, Google needs to process these changes over the long-term. So they kept driving forward while hoping they would see positive movement during subsequent algorithm updates.  Remember what I said about “long-term” and “significant” from earlier?

The engagement ended in April and many items had been implemented at that point. I told them to keep driving forward, keep publishing killer content, keep enhancing the user experience, and to NOT fall back to old ways from an aggressive advertising standpoint.

Then August 1 arrived, and BOOM, they began to surge. It was awesome to see. Here is their search visibility trending from SEMrush.

An entertainment website surging during the core ranking update on August 1, 2018.

According to Google Analytics (which I cannot share a screenshot of due to an NDA), they are up over 48% since August 1 (and specific areas of the site are surging more than that). That accounts for more than 822K additional sessions from Google organic alone in just the past week. Clicks and impressions in GSC are off the charts. My client explained that this is the most traffic they have seen in about two years.

Another Surge – YMYL Health Site
The second client surge I’m going to cover is a large-scale health site that has seen a significant drop in rankings and traffic over the past 18 months. They approached me after seeing multiple dips during major algorithm updates and wanted to surface more problems from a quality standpoint.

Similar to what I explained above, I dug in heavily and sent 93 pages of findings during the first engagement and then 51 more pages during a second engagement (for a total of 144 pages of findings). Yes, there were many problems with the site from a quality perspective. And the site contains so many different areas of content that I was surfacing more and more problems with every twist and turn. Again, it's important to know the amount of work the company did to improve the site. Their team was great and moved very quickly to make changes. And they continue to make changes to this day, but many of the significant changes were completed months ago.

Similar to my example from earlier, they worked on a number of areas, including nuking low quality or thin content, fixing a very aggressive and deceptive advertising situation, improving the UX, fixing technical SEO problems across the site, and more.

And then on August 1, they began to surge across many key areas of the site. Here is their search visibility trending from SEMrush:

A YMYL health site surging during the August 1, 2018 core ranking update.

As of today, various sections of the site have increased anywhere from 20% to 60% since August 1. Note, they still have a long way to go to get back to where they once were, but this was a significant move in the right direction. It was great to see.

And it’s also worth noting that some site sections decreased! So, on a large-scale site with many different sections, you can definitely see ups and downs during major algorithm updates by directory. That might be obvious to site owners of large-scale sites, but I’m not sure everyone knows that.
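If you want to see this on your own site, here is a minimal sketch of how you might segment clicks by top-level directory before versus after an update date. It assumes a hypothetical CSV export of daily page-level data with url, date, and clicks columns; the file and column names are placeholders, not a specific tool's format.

```python
# Minimal sketch: compare clicks by top-level directory before vs. after an
# update date. Assumes a hypothetical CSV export of daily page-level data
# with columns url, date, and clicks (placeholder file and column names).
from urllib.parse import urlparse
import pandas as pd

UPDATE_DATE = pd.Timestamp("2018-08-01")

def top_dir(url: str) -> str:
    """Group URLs by their first path segment; root-level pages fall under '/'."""
    path = urlparse(url).path.strip("/")
    return "/" + path.split("/")[0] if path else "/"

df = pd.read_csv("gsc_pages_by_day.csv", parse_dates=["date"])  # hypothetical export
df["directory"] = df["url"].apply(top_dir)
df["period"] = df["date"].apply(lambda d: "after" if d >= UPDATE_DATE else "before")

# Sum clicks per directory for each period, then compute the percent change.
pivot = df.pivot_table(index="directory", columns="period", values="clicks", aggfunc="sum").fillna(0)
pivot["pct_change"] = (pivot["after"] - pivot["before"]) / pivot["before"].replace(0, 1) * 100
print(pivot.sort_values("pct_change"))
```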

Those are just two examples of large-scale sites seeing improvement during the August 1 update, but the moral of the story is that the surging sites completed a boatload of work to improve their respective sites. They didn't find one smoking gun. Instead, they found a battery of smoking guns and worked their hardest to root out those issues and improve their sites overall. In addition, they kept those changes in place and kept driving forward with improving the site. They didn't revert back to old ways, which is hard to do, especially from an advertising standpoint.

Fresh Hits And Digging Deep(er) With Sites Negatively Impacted
Once the update started rolling out, it was amazing to see how many sites were being negatively impacted. And some weren’t just being hit, they were obliterated. That includes some of the largest sites and brands across the web.

Here are some examples of serious negative impact. I have many examples of this happening, but I’ll just provide a few:

Massive drop during the 8/1 broad ranking update.

Another huge drop during the 8/1 Google algorithm update.

Yet another massive drop during the August 1, 2018 algorithm update.

When sites drop, it’s sometimes easy to take a quick look at them and try to figure out what’s wrong. But when you do that, it’s easy to miss serious problems that lie below the surface. Often, the only way to really see what’s going on is to dig deep, crawl the site, analyze the various areas across a site, etc. Remember, Google is on record explaining they take every page indexed into account when evaluating site quality. So, if you just look at five pages, you could be missing most of the SEO iceberg.

SEO problems deep in a site.

Therefore, in addition to reviewing top pages from sites that dropped during this update, I also crawled a subset of each site (10K pages per site). It’s not perfect, but that at least enabled me to see a bigger picture than just checking a few pages. And I’m glad I did. After crawling those sites, you could clearly see a number of problems across content quality, technical SEO, UX, aggressive advertising, and more.
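As a rough illustration of that triage step, here is a minimal sketch of how a crawl export could be scanned for thin pages and error responses. The CSV file name, column names, and the 200-word threshold are all assumptions for the example, not rules from Google or from any specific crawler.

```python
# Minimal sketch: triage a crawl export for thin pages and error responses.
# The CSV file name, column names, and the 200-word threshold are assumptions
# for the example, not rules from Google or from any specific crawler.
import csv

THIN_WORD_COUNT = 200  # judgment call for flagging potentially thin main content

thin_pages, error_pages = [], []
with open("site_crawl_10k.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["status_code"] != "200":
            error_pages.append((row["url"], row["status_code"]))
        elif row.get("indexable", "true").lower() == "true" and int(row["word_count"]) < THIN_WORD_COUNT:
            thin_pages.append((row["url"], int(row["word_count"])))

print(f"{len(thin_pages)} potentially thin indexable pages in the crawl sample")
print(f"{len(error_pages)} pages returning non-200 status codes")
```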

And from a YMYL standpoint, you could clearly see sites ranking that probably shouldn’t have when it comes to important topics that can impact the “future happiness, health, financial stability, or safety of users”. Remember, the update DID NOT just target YMYL sites. Many different types of sites were impacted, but Google clearly was adding something new with regard to E-A-T, adjusting the threshold for YMYL sites, etc. Hard to say exactly what they did, but many were impacted.

Let’s go through some examples below. I’ll provide a few key findings per domain based on crawling and analyzing the site. Again, I didn’t crawl the entire site… I crawled a subset of each site (10K urls) to get a sample of what’s going on there. I can only imagine what I would find if I spent more time…

YMYL – Health Site
YMYL site dropping during the 8/1 update.
I surfaced hundreds of thin pages (650+) from just the crawl of 10K urls, so there are probably thousands of pages like this if you take the entire site into account. Many of those thin pages contained a boatload of ads and very little main content (sometimes just a line or two of content, or just an image).

There were many thin user profiles indexed across the site. Some thin pages were filled with low-quality supplemental content, with very little main content.

The site also heavily uses popups, which can be extremely disruptive and annoying. I’ve covered disruptive and aggressive advertising many times while writing about major algorithm updates.

There were also a number of technical SEO problems including canonical issues.

And remember, this was only based on crawling 10K urls…

Reference Site
A large-scale reference site dropping during the August 1 Google update.
I surfaced over 6K near-empty or empty pages. It looks like a technical problem is causing content quality problems. These pages are indexed now. And the pages are filled with ads. So you have pages with empty main content that are still displaying many ads. That's not a great combination at all.

I then surfaced more content quality problems including pages containing just a few words of content with a number of large ads on the page. Again, not a great combo.

There were also annoying UX barriers that broke up the content. This was not done in an elegant way at all, and I found it extremely disruptive. I found myself scrolling to try and get to the rest of the main content.

From a SERP perspective, Google is returning a wider mix of results, and not just reference information that can be found on this site. Therefore, the site is battling quality problems, but it’s also dealing with what Google deems to be relevant based on query. So it seems Google was determining that users weren’t happy with the previous results (or that they were simply looking for something else).

Lyrics Site
A large-scale lyrics site dropping during the 8/1 Google update.

I surfaced thousands of ultra-thin pages across the site, including blank pages containing ads. I also surfaced a number of pages returning a 503 header response code (presumably by accident). This seems like a technical glitch, as the 503 is still in place on those pages now.

There’s a forum with many user profiles that are very low quality (and often contain very little main content).

There were also many empty artist pages that contained a number of ads. So there wasn’t any main content, yet many ads on the page. I saw this a number of times while analyzing the update. Remember, Google takes every page indexed into account when evaluating quality. Make sure your best content is indexed and nuke low quality or thin content. And if you can boost low quality content, that’s a great path as well.

Low quality or thin content and aggressive and disruptive advertising seemed to be the killer combination for this site. And to add insult to injury, the site still hasn’t moved to https.

YMYL – Health
YMYL site dropping during the August 1, 2018 Google update.

I surfaced close to a thousand pages of low quality or thin content out of just a 10K url crawl. There was a serious lack of E-A-T for YMYL pages. The site’s authors do not have the expertise to write about sensitive medical topics.

In addition, there were many pages that contained just an autoplay video surrounded with ads and no other main content. Popups were used extensively across the site, also in combination with autoplay video ads. This was very annoying and disruptive. Again, not a great combination.

Then there were pages that should contain video, but the page was basically blank with ads surrounding the empty main content area. Sounds familiar, right? Sometimes there was just a placeholder for video and ads were all over the page.

I also saw a major relevance component at play here. Some rankings that dropped didn’t meet user expectations based on query. So, it seemed like the right move from Google’s standpoint to downrank those urls. For example, the site was tangentially covering a topic, but ranking well for the topic head term. I can’t imagine many users were thrilled with the results.

Recipes
Recipe site dropping during the 8/1 update.

I surfaced over 3K thin or low-quality pages out of a crawl of just 10K. And many had autoplay video ads (although without sound). Even though they have no sound, they are still incredibly annoying.

In addition, popups are used as well! So, the combination of autoplay video ads and popups on the same page had me banging my head against my monitor. And on mobile, there are giant ads pushing the content way down the page, so the entire viewport was filled with an ad. Note, I’ve often called Google’s Above The Fold (ATF) algorithm the beast that nobody talks about. Well, having ads that push main content out of the viewport is not a great idea, and it’s horrible for users.

In addition to many thin pages, I also surfaced empty pages that were indexed. And those empty pages contained ads. Again, sound familiar? There were also many pages with affiliate links weaved into the content. And there’s no disclosure on the page about which links were affiliate and which ones weren’t. I found that pretty deceptive from a user standpoint.

A side note about E-A-T: the site is well-known in its niche and has top writers with strong E-A-T. Therefore, it doesn't seem like E-A-T was a core problem here. I believe it had more to do with thin or low-quality content mixed with poor user experience problems.

YMYL – Health
Another YMYL Health site dropping during the August 1 update.

There were many UX barriers and technical problems across the site. For example, there was a module in the middle of many pages that just hung while hourglassing. Hard to tell if that was main content, supplemental content, or videos, but clearly not good to have on many pages.

There were MANY weaved ads throughout the content. Actually, it was every paragraph or two… I found that extremely disruptive. It’s ok to do that every now and then, but many weaved into a single article is horrible for users.

There were many Amazon affiliate links weaved into the content as well. It seems like the site was focusing more on monetization than the user. And that's always a dangerous situation. I've had many affiliate marketers reach out to me after getting smoked, and they had the same setup in place.

There were other user interface problems, especially on mobile. For example, some paragraphs did not align well with images. So there were only a few characters for each line on mobile. It was literally impossible to read. Imagine trying to read a paragraph 3 characters at a time…

Also, and this is clearly important on a YMYL site, there was NO author information. That's possibly a worst-case scenario for a YMYL site writing about medical topics. So not only does the site fail to exhibit much E-A-T overall, the authors don't demonstrate strong E-A-T either.

Many smoking guns, not just one
As you can see, even a relatively quick analysis enabled me to see a number of problems across sites that were negatively impacted. That’s why it’s important to thoroughly analyze a site and surface all potential problems. I’m sure if I really dug into those sites, I would surface many issues that could be contributing to their drop. These broad updates are called “broad” for a reason. :)

Follow The “Leader”, Flatliners, And The Danger In Doing Nothing.
When helping companies that have been negatively impacted by major algorithm updates, I often hear similar things. For example, some explain that they are just following what stronger players in the niche are doing and hoping they will surge like they have in the past. But unfortunately, they are following them blindly. For example, they have no idea if those are the right things to do, if those companies are surging in spite of having those things in place, etc.

The danger, of course, is that you can follow them right off an algorithmic cliff. Here's a quick example. One of my newer clients always references a certain site whose lead they follow, implementing the same type of functionality, layout, content, etc. Once I started helping them, I explained the danger in doing that. And guess what? That site has done very well over the past year or so, but just got hit pretty hard by the August 1 update. Note, it's NOT a YMYL site.

Following a site off an algorithmic cliff.

And here's a site that refused to change after getting smoked by the February 7 algorithm update. Not only have they NOT increased at all, they have actually dropped more. Not good.

A site that never fixed SEO problems continues to drop.

The QRG Is Important, But Combine It With Traditional SEO Audits
Since 2014, I’ve been explaining the connection between Google’s quality rater guidelines and major algorithm updates. Yes, that’s when medieval Panda roamed the web. I think I’ve mentioned the power of the QRG in almost every blog post about major algorithm updates since then (including Panda, quality updates, the March and April 2018 updates, and more).

The overlap between the QRG and what I see in the field is uncanny. That’s why I highly recommend reading the QRG to see what Google deems high versus low quality, to understand how Google treats YMYL sites, to understand the importance of E-A-T, to understand the impact of aggressive, disruptive, and deceptive ads, and much more.

But it’s not the only thing you should do. I recommend combining the QRG with what you would typically do in a standard audit. Don’t ignore technical SEO, thin content, performance problems, and other things like that. Think about the site holistically and root out all potential problems. That’s the most powerful way to proceed in my opinion.

Google's Danny Sullivan brought up the QRG when asked about the 8/1 update last week. So yes, definitely read the rater guidelines, internalize them, share them with your team, etc., but don't forget about all of the other things involved with a site from an SEO standpoint (and how your users interact with your site).

Here’s Danny’s tweet:
Google's Danny Sullivan mentioning the Quality Rater Guidelines after the 8/1 update.

Update November 14, 2018: Google’s John Mueller Confirms BBB Ratings Aren’t Used Algorithmically
On that note, I’ve been pretty vocal recently that I don’t believe BBB ratings were being used algorithmically by Google. There are many people that have contacted me worried about BBB scores when the BBB isn’t even in their country! The idea of Google using BBB scores or ratings algorithmically just didn’t make sense to me. First, the BBB is only in three countries (and really one primary country – the United States). Since Google’s core ranking algorithm is global, it doesn’t make sense that Google would use a limited source like that. Second, there’s a paid element to the BBB. You don’t have to pay, but you can pay to become accredited. That ranges from $480 to $1,155 per year depending on the size of your company. That also throws a wrench into Google using BBB ratings. And third, it’s just not scalable. Remember, we’re talking about global algorithms and NOT just algorithms focused on the United States (or a limited set of countries).

So, I asked Google’s John Mueller in a webmaster hangout if Google was using BBB ratings algorithmically. He was pretty clear with his response. No, they aren’t using BBB ratings algorithmically.  He said there are various issues with sources of information for companies and Google can’t blindly rely on some third-party ratings. You can watch the video below at 15:32 to hear his full response.

Now, that doesn't mean they aren't looking at site reputation overall. There's a good chance they are… But they wouldn't rely on one source or use a score from one site algorithmically. Actually, I posted an update on 10/23/18 in my post about the September 2018 update where I explained how Google might be dialing up, or down, the power of "trusted" or "untrusted" links. That's much more likely and way more scalable. You should read that section (and the LinkedIn article I published) to learn more about that.

Next Steps For Site Owners:
If you’ve been negatively impacted by the August 1 algorithm update, then don’t sit and wait it out. John Mueller has explained in the past that if you’ve been impacted by one of these core updates, then take a step back and review your site overall from a quality perspective. And then try to significantly improve quality over the long-term. Don’t sit and do nothing. That probably won’t end well for you.

Therefore, here are some closing bullets (some might sound familiar if you’ve read my previous post about major algorithm updates):

  • Perform a thorough crawl analysis and audit of your site. Surface all potential quality problems and form a plan of attack for fixing them (and improving the site overall).
  • Read the Quality Rater Guidelines (QRG) and have everyone on your team that’s involved with your site read them too. Then review your site through the lens of the QRG (objectively). Then identify all potential problems and fix them.
  • Conduct user studies to understand how real people view your site. I’m not talking about having your mother, spouse, or close friends go through your site. I’m talking about objective third parties that can evaluate your site based on specific actions you want them to take. Identify all problems and barriers and remove them.
  • Analyze your Google Search Console (GSC) reporting, including the new index coverage reporting (which is killer). Find all technical SEO problems (combined with your crawl analysis and audit), and then fix them as quickly as possible. See the sketch after this list for one way to pull GSC performance data around the update date.
  • Continue to publish killer content. Don’t sit and wait while you are fixing problems… Keep pumping out high quality content that users want to read (based on understanding your niche). The more high-quality content you can produce, the more you will exceed user expectations, which will often yield more sharing, which can result in more branded searches for your site and content, while also enabling you to build more (natural) links.
  • Keep building E-A-T. Note, you cannot quickly build E-A-T… you will need to do this over time. Remember, Google’s Gary Illyes explained when evaluating E-A-T, Google primarily looks at links and mentions from well-known sources. You cannot quickly build those links and mentions, but you can over the long-term by working hard and doing the right things. And make sure to showcase your expertise on your site and in your linked profiles (like Social). Jennifer Slegg just covered Google’s new creator reputation from the latest quality rater guidelines released in July. I recommend reading her post to get a solid understanding of what Google is doing on that front.
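For the GSC bullet above, here is a minimal sketch of pulling daily clicks and impressions around the update date via the Search Console API (webmasters v3) using google-api-python-client. It assumes you already have OAuth credentials in a `creds` object, and the property URL is a placeholder.

```python
# Minimal sketch: pull daily clicks and impressions around the update date from
# the Search Console API (webmasters v3) via google-api-python-client. Assumes
# OAuth credentials already exist in `creds`; the property URL is a placeholder.
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"  # placeholder property

def daily_performance(creds, start_date: str, end_date: str):
    service = build("webmasters", "v3", credentials=creds)
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["date"],
        "rowLimit": 1000,
    }
    response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
    # Each row contains keys (the date), clicks, impressions, ctr, and position.
    return [(r["keys"][0], r["clicks"], r["impressions"]) for r in response.get("rows", [])]

# Example: roughly two weeks on either side of August 1, 2018.
# rows = daily_performance(creds, "2018-07-18", "2018-08-15")
```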

Summary – Google’s Mid-summer Classic Was Huge
The core algorithm update that rolled out on August 1, 2018 was massive and many sites across the web were impacted. Although there were a lot of health sites impacted, many others in non-YMYL categories were affected as well. If you have been negatively impacted by the 8/1 update, then it’s important to objectively analyze your site to find ways to improve. And remember, there’s never one smoking gun. There’s usually a battery of them. So go find them now. :)

GG

Filed Under: algorithm-updates, google, seo
