How NewsGuard’s nutritional labels can help publishers avoid manual actions for medical content violations (Google News and Discover)

April 15, 2022 By Glenn Gabe

In July of 2021, Google issued a number of warnings for sites publishing medical content that went against its guidelines (for Google News and Discover). The potential for a manual action was clear and some publishers scrambled to figure out what to do.

I mentioned this on Twitter in September:

Google adds information to help docs about displaying Discover Manual Actions in GSC

I've seen several examples of Discover policy violation warnings since early July. Will manual actions follow soon? Time will tell. :) https://t.co/huFckYCTr8 via @tldrMarketing pic.twitter.com/kCKImHjnhC

— Glenn Gabe (@glenngabe) September 2, 2021

And six months after the warnings, manual actions arrived for sites that hadn’t cleaned up the problem. Here is my tweet from January, when Google issued the manual actions:

Heads-up. Don't ignore Discover & Google News policy warnings in GSC. It might take 6 months or a year, but a manual action could follow. Had multiple publishers reach out this weekend about manual actions for Discover/Google News. E.g. misleading content, medical content, etc. pic.twitter.com/JutaP82HQL

— Glenn Gabe (@glenngabe) January 30, 2022

To clarify, these were manual actions for Google News and Discover, and not Search. And for the publishers receiving manual actions for medical content, the medical policy for News and Discover states that Google “doesn’t allow publishing medical content that contradicts or runs contrary to scientific or medical consensus and evidence-based best practices.”

And the manual actions in Google Search Console explained the following:

“Your site appears to violate our medical content policy and contains content primarily aimed at providing medical advice, diagnosis, or treatment for commercial purposes. Nor do we allow content from any site that contradicts or runs contrary to scientific or medical consensus and evidence-based best practices.”

So, if you are publishing medical content, and receive a manual action for violating that policy, News and Discover visibility can be negatively impacted. Again, Search should not be impacted by the manual action, but Google News and Discover visibility could decline.

For example, here is the Discover performance for one of the flagged articles for a publisher that received a manual action:

Google Discover performance for a page impacted by a manual action for medical content.

When digging into the articles being flagged by Google, it was super-interesting to see the connection between NewsGuard ratings and the organizations that were covered heavily in the articles. Below, I’ll cover more about NewsGuard and how it could be helpful for sites publishing health and medical content.

Interesting cases and the connection between flagged content and NewsGuard ratings:
In 2018, I wrote a post covering NewsGuard, which I called a proxy for Google’s quality raters. NewsGuard has a team of analysts (trained journalists) that review websites based on nine journalistic criteria, including credibility, transparency, and trust. They originally started by focusing on news organizations, but they have expanded to health and medical as well. For example, there is now a HealthGuard service that, “helps patients, healthcare workers, and anyone involved in the medical field identify trustworthy sources of health information — and avoid dangerous misinformation.”

Once a site is reviewed, NewsGuard produces a “nutritional label” rating the site, which can also appear in the search results if you are using its Chrome plugin. In addition, NewsGuard has relationships with a number of organizations (in several capacities). For example, Bing, Facebook, the American Federation of Teachers (AFT), the World Health Organization (WHO), and others have partnered with NewsGuard to fight disinformation. You can read more about their various partnerships on the site.

Although NewsGuard does have partnerships with several organizations for helping fight misinformation and disinformation, I want to be clear that Google does not use NewsGuard data in its algorithms. But like I explained in my first post, those ratings sometimes line up with how the sites perform in organic search (since Google is also trying to algorithmically surface the highest quality and most authoritative content on the web).

It’s important to understand that Google is on record explaining that its algorithms can be more critical when it comes to health and medical content. Here is a Twitter thread of mine that expands on that point. Again, this is super-important to understand for anyone delving into health and medical content.

Run a health/medical e-commerce site? Via @johnmu: Our algorithms are more critical for health/medical topics, so def. keep E-A-T in mind. Make sure the site represents a very high standard. i.e. High-quality content created by actual medical professionals https://t.co/aiMrdN9Hl7 pic.twitter.com/Nuz3K7Pi6o

— Glenn Gabe (@glenngabe) March 27, 2021

For example, here is a health site with a horrible nutritional label from NewsGuard. The site has gotten hammered during broad core updates over time. Again, it’s not because of NewsGuard… it’s just interesting how they line up:

Health and medical site that dropped over time during Google's broad core updates.

Cross-referencing organizations via NewsGuard based on manual actions for medical content:
For organizations receiving manual actions for medical content (News and Discover), I was interested in cross-referencing NewsGuard to see what the nutritional labels looked like for the organizations being covered (and promoted) in those flagged articles.

And to clarify, it’s not about simply mentioning sketchy organizations that would get content flagged. It’s more about the core of the article being about those organizations (including promoting their views). That’s exactly what the flagged articles were doing.

So what did the nutritional labels look like for those organizations being covered? They weren’t good. Not good at all… Here are two examples based on content getting flagged.

Here’s the first site’s label:

NewsGuard nutritional label with extremely poor ratings for a health and medical site.

And here’s the second site’s label:

NewsGuard nutritional label with poor ratings for a health and medical site.

And here is what one of the sites looks like in the search results (when you are using the NewsGuard Chrome extension):

NewsGuard rating in the search results for a site with poor ratings.

When you hover over the NewsGuard icon (the red shield), you can view an overlay with more details. And that overlay contains a link to the full nutritional label on the NewsGuard website.

NewsGuard overlay with more information from a site's nutritional label.

When you visit the nutritional label on NewsGuard’s website, you can find all of the details about why the site received those ratings (and by category). And that includes all of the sources that were cited and referenced in their findings. For example, you can view CNN’s nutritional label here (just to get a feel for what one looks like, review the ratings by category, the sources section at the end, etc.)

Note, the site I mentioned that received the manual action is a large-scale publisher with millions of pages indexed, so most of its content would not fall into this category (covering organizations and views that go against Google’s guidelines). But they do have some… and those articles were flagged by Google.

When discussing this situation with the site’s leadership, I explained that having some checks in place would be smart for understanding the risks involved with publishing certain pieces of content. And in my opinion, NewsGuard could be one of those checks.

Utilizing NewsGuard as a check during the publishing process:
So, if you are a site publishing health and medical content, then I would definitely put some checks in place to ensure you don’t receive a manual action for medical content. One solid approach could be adding checks using the NewsGuard plugin (which links to the nutritional labels). If you see red all over the label, you might want to be more cautious (or at least dig in further to learn more about that organization’s views).
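
To make that concrete, here is a minimal sketch of what such a pre-publish check could look like in Python. It assumes your editorial team maintains its own ratings.csv (for example, scores recorded manually from NewsGuard’s nutritional labels) and a draft.html file for the article being reviewed. The file formats, file names, and the 60-point threshold (NewsGuard’s published cutoff between green and red ratings) are illustrative assumptions, not an official NewsGuard integration.

```python
# Hypothetical pre-publish check: flag domains covered in a draft article
# against a ratings file the editorial team maintains (e.g. scores recorded
# manually from NewsGuard's nutritional labels). The CSV format, file names,
# and 60-point threshold are assumptions for illustration.
import csv
import re

def load_ratings(path="ratings.csv"):
    """Load {domain: score} from a two-column CSV: domain,score."""
    with open(path, newline="") as f:
        return {row["domain"]: float(row["score"]) for row in csv.DictReader(f)}

def flag_risky_domains(draft_text, ratings, threshold=60.0):
    """Return domains linked in the draft that fall below the threshold."""
    domains = set(re.findall(r"https?://(?:www\.)?([\w-]+(?:\.[\w-]+)+)", draft_text))
    return {d: ratings[d] for d in domains if d in ratings and ratings[d] < threshold}

if __name__ == "__main__":
    ratings = load_ratings()
    with open("draft.html") as f:
        flagged = flag_risky_domains(f.read(), ratings)
    for domain, score in sorted(flagged.items(), key=lambda x: x[1]):
        print(f"REVIEW BEFORE PUBLISHING: {domain} (score: {score})")
```

The point isn’t the code itself. It’s baking a rating lookup into the editorial workflow so a red label forces a human review before the article goes live.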

For example, if the publisher I’m covering in this post (the one that received the manual action) had checked NewsGuard before publishing that content, they probably wouldn’t have published it at all (as long as they understood Google’s policies around medical content for News and Discover). Again, it’s a large-scale publisher with millions of pages indexed. A NewsGuard check could have raised red flags during the editing process.

Note, NewsGuard obviously doesn’t have labels for every site on the web, but understanding the ratings based on the organizations that have been reviewed is a good idea. Again, it was interesting to see the connection between some manual actions for medical content and the sketchy nutritional labels for those organizations being promoted in those articles. Like I explained in my original post about NewsGuard, it’s like a proxy for Google’s quality raters. So in my opinion, it’s smart to check those nutritional labels before publishing.

GG

Filed Under: algorithm-updates, google, manual-actions, seo

Google’s Core Algorithm Updates and Copied Content: The Domino Effect of Negative Impact

September 9, 2019 By Glenn Gabe

There’s never a dull moment in Google Land. And that’s especially the case for health and medical sites over the past year starting with the Medic Update in August of 2018. Since then, we have seen several major core updates, which seem to include a new method of evaluating quality for health and medical sites. Google’s John Mueller explained more about the changes in a webmaster hangout video in May.

That algorithmic change has led to some insane volatility for various sites in the health and medical niche. Some are surging and dropping with every update, which is a clear sign that Google is heavily tinkering with those algorithms (turning the dial up and down to try and find the right balance). I’ve been pretty vocal that I believe Google hasn’t figured this out yet… and I believe we’ll see even more volatility in the health space with the next core update (which we are due for).

I’ve had the opportunity to help a number of health and medical sites over the years, and even more since last August when the Medic Update landed. It’s been fascinating to see the impact, surface problems across those sites, watch some sites surge, others drop more, and some ride the “Google roller coaster” (surging and dropping with each core update). Like I said earlier, there’s never a dull moment in Google Land.

A Multi-Core Victim with an interesting content problem:
I recently started helping another health and medical site that has gotten hammered over several core updates. Their traffic dropped 54% during the June core update, and they are down 67% since the March core update. After the latest drop in June, the site owners finally decided to have someone come in and heavily analyze the site through the lens of Google’s core updates to see what’s going on.

Here are the drops from the March and June core updates:

Drop in clicks and impressions during two core updates.

During my first wave of analysis, I surfaced a huge problem that I’m sure is contributing to the site’s drop during core updates. Sure, there’s never one smoking gun with Google’s core updates… there is typically a battery of problems. For example, I’m still heavily analyzing the site and surfacing more issues as I write this post. But the issue I found was big enough that I actually stopped analyzing the site to craft a separate deliverable just about that topic. That’s what I’m going to cover in this post.

The Find – The Domino Effect of Negative Impact From Copied Content
One of the first things I like to do when helping a company that was impacted by a core update is to run a traffic drop report (what I previously called a Panda Report). The report enables me to see the content that dropped the most after a major core update rolls through. It can often reveal glaring problems across once-popular content on a site.
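
As a rough illustration, here is a minimal sketch of that kind of report, assuming two CSV exports from Google Search Console’s performance report (the Pages table) covering equal-length windows before and after the update. The file names are hypothetical, and the “Top pages” and “Clicks” column names match GSC’s standard export; adjust them to whatever your export actually contains.

```python
# A minimal traffic drop report: compare GSC page-level clicks across
# equal-length windows before and after a core update (hypothetical files).
import pandas as pd

before = pd.read_csv("pages_before_update.csv").set_index("Top pages")
after = pd.read_csv("pages_after_update.csv").set_index("Top pages")

# Only compare pages that actually had clicks before the update.
before = before[before["Clicks"] > 0]

report = before[["Clicks"]].join(
    after[["Clicks"]], lsuffix="_before", rsuffix="_after", how="left"
)
report["Clicks_after"] = report["Clicks_after"].fillna(0)
report["Drop"] = report["Clicks_before"] - report["Clicks_after"]
report["Drop %"] = (report["Drop"] / report["Clicks_before"] * 100).round(1)

# The pages that lost the most clicks are often where the glaring problems live.
print(report.sort_values("Drop", ascending=False).head(50))
```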

It wasn’t long before I noticed a disturbing trend. There were many articles that dropped that were copied from third-party health sites and blogs. And to make matters worse (and what should be obvious to SEOs reading this post), the copied pages were fully indexable. In other words, they weren’t being noindexed and they weren’t being canonicalized to the original articles. By the way, it’s not like those tactics make copying content better… but it at least decreases the chance of those pages outranking the original. 

Here is an example of one copied page that was ranking for over 1K queries in the top 10 results BEFORE the March core update pummeled the site (and the copied content):

A copied page ranking for 1K+ queries in the top 10 results
BEFORE the March core update.

Important reminder: YMYL sites held to a higher standard
Remember, this is a site focused on “your money or your life” (YMYL) topics in health and medical. And many sites in the health/medical niche have gotten hammered since the Medic update, seeing a big drop in rankings and traffic during several core updates in 2019. It’s a volatile space, that’s for sure.

Based on that impact, many health and medical sites have been working hard to publish killer content, showcase their expertise, hire medical experts to write and review their content, build medical review boards, and more. So, having content that’s copied from other sites in their niche is clearly not a good thing. And some of those sites are the biggest players in the health and medical category.

In total, there were hundreds of articles that fell into this category. And to clarify, there was original content on the site beyond the copied content. It’s not like the entire site was filled with copied content. But again, there were hundreds of copied articles that were freely indexable (and some were ranking well at certain points in time).

When checking the trending of those copied articles over time, many had dropped heavily in rankings and organic search traffic. Trending for many of those specific pieces of content looked like this:

A copied piece of content dropping during the June core update.

The Domino Effect: Wait, were those sites hit too?
When checking the content that was copied, I noticed some of those articles weren’t high quality. So I decided to check the sites where the content was being copied from. And lo and behold, several of those sites had been hit by major core updates as well! Therefore, we had a classic SEO double whammy: the site I was analyzing was copying content from other sites in its niche, and several of those source sites were themselves getting hit hard by core updates.

Here is search visibility trending for two of those sites.

A site where content was copied from dropping during major algorithm updates.
Another site where content was copied from dropping during the Medic update.

Copied Content and Major Google Algorithm Updates (Including Medieval Panda)
I’ve seen copied content, and the incorrect handling of syndicated content, cause massive problems over the years. I wrote about it (check the Q&A) when medieval Panda roamed the web, and I’ve seen it cause problems with major core updates as well. It makes sense. If Google believes you are copying another site’s content, and then ranking for that content, then Google’s quality algorithms can have a big problem with that situation. And in a YMYL world, it can be a very serious issue.

In case you are wondering, Google has explained the problems with copying content a number of times, including publishing a section in its Quality Rater Guidelines (QRG). Google also has a Pirate algorithm for extreme cases, which I have analyzed heavily.

For example, John Mueller has explained that if Google believes a majority of your content is copied, then it can lose trust in your site. In extreme cases, the webspam team might even get involved. Again, I have seen this play out many times over the years (algorithmically and via manual actions). It’s not pretty.

Here is a video of John explaining this (at 26:03 in the video):

And here is a section from the QRG about copied main content:

DMCA threats, but no official takedowns.
After surfacing this situation, I wondered how many DMCA takedowns the site received. It turns out that not many were filed. I found out that there were a few threats of DMCA takedowns, but none of the sites ever acted on those threats. I believe the site did remove those articles when contacted by third parties whose content had been copied, but there were no official DMCA takedowns filed.

No DMCA takedowns filed.

By the way, that should have been a sign that something wasn’t right… When you are being threatened with DMCA takedowns, that’s never a good sign.

Moving forward: How to resolve the issue
In order to rectify the situation, there are multiple paths the site can take. I’ll cover them below:

1) Nuke it: 404, 410
First, I’m typically aggressive with core update remediation. So, you can probably guess what my recommendation was. I would nuke the copied content via 404s or 410s. Let’s face it… it’s not your content to begin with! Remember, Google takes every page indexed into account when evaluating quality, so removing low-quality content is always a good thing (including copied content).

2) Seek permission and then canonicalize:
Next, if a site really wants to provide that third-party content for some reason, then the site owners can seek permission to use that content and then use rel canonical pointing back to the original page (basically canonicalizing the urls on your site to the original urls). In other words, request authorization to consume the syndicated content, and then properly canonicalize each url to the original.

3) Seek permission and noindex:
You could also seek permission for publishing the content on your site and then noindex that content. Remember, if it’s not indexed, it can’t hurt the site quality-wise. And to be clear, this is AFTER receiving approval for publishing the content from the original author and site. Don’t just go and start copying content from anywhere thinking it’s fine. It’s definitely not fine.

4) Remove and 301 redirect to extremely relevant content (if possible)
You could also 301 redirect the old pages you are removing to extremely relevant content on your site. But keep in mind that Google can simply treat the old pages as soft 404s if you redirect to non-relevant content. You can read my case study about that, and my recent post about Google ignoring rel canonical when the content wasn’t the same or extremely relevant. Those are important points to consider.
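
Whichever option you choose, it’s worth verifying that the fix actually shipped. Here is a minimal verification sketch, assuming a hypothetical urls.txt listing the copied urls. For each url it reports the HTTP status (404/410 for nuked pages, 301 for redirects) and, for pages still live, whether a canonical tag or a noindex directive is present. It uses the third-party requests and beautifulsoup4 packages.

```python
# Verify how each copied URL was handled: nuked (404/410), redirected (301),
# canonicalized, or noindexed. urls.txt is a hypothetical one-URL-per-line file.
import requests
from bs4 import BeautifulSoup

def audit_url(url):
    resp = requests.get(url, allow_redirects=False, timeout=10)
    result = {"url": url, "status": resp.status_code}
    if resp.status_code == 200:
        # Page is still live: check for canonical and noindex in the head.
        soup = BeautifulSoup(resp.text, "html.parser")
        canonical = soup.find("link", rel="canonical")
        robots = soup.find("meta", attrs={"name": "robots"})
        result["canonical"] = canonical["href"] if canonical else None
        result["noindex"] = bool(robots and "noindex" in robots.get("content", "").lower())
    elif resp.status_code in (301, 302):
        result["redirects_to"] = resp.headers.get("Location")
    return result

with open("urls.txt") as f:
    for line in f:
        print(audit_url(line.strip()))
```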

Summary: The moral of the (SEO) story:
Don’t copy another site’s content. Don’t try to rank based on content that’s not yours. It’s bad karma and as the saying goes… “karma will take care of it”. In this situation, some of those articles were copied from sites that also got hammered by Google’s core updates. So those pieces of copied content could have contributed to the quality problems that this site is dealing with now. Again, it’s a great example of an SEO double whammy.

Again, I’m definitely not saying this is the only problem the site is dealing with. I’m surfacing a number of issues across categories, but having hundreds of copied articles on the site is definitely not helping the situation.

Instead of taking the easy path and copying content, you should always be striving to create 10X content that blows your audience away, naturally attracts links, sparks social sharing, and drives searches for your brand plus the topic. If you do that, then you won’t have to worry about the domino effect of negative impact from copied content – or about bad karma. Instead, start building good karma… that’s how you win.

GG

Filed Under: algorithm-updates, google, manual-actions, seo

A Holiday Hornets’ Nest – Analysis and Findings From The December 2017 Google Algorithm Updates

January 11, 2018 By Glenn Gabe

Summary: From Maccabees to celebrities to doorways to affiliates, Google took it all on in December of 2017. There were several dates leading up to the holidays with significant movement from an algorithm update standpoint, including December 5, December 12, December 15, December 18, and December 26 (with possible connections between some of those updates). This post covers what I saw while digging into the December volatility.

December 2017 Google Algorithm Updates

If you follow me on Twitter, then you already know how crazy December was from an algorithm update standpoint. I was posting many screenshots throughout the month based on what I was seeing across sites. We know Google can push between 1,000 and 1,500 updates per year, but many are small changes that few notice. And on the flip side, there are larger algorithmic changes that cause the earth to shake.

Well, we saw multiple updates in December that did just that. It reminded me of September 2017, when we also saw a number of updates in a short amount of time. That’s when I wrote a post titled “Enter the Hornets’ Nest” covering those updates. You can think of December’s volatility as a “Holiday Hornets’ Nest,” since the updates seemed to increase as we got closer to Christmas Day (and we even saw manual actions being handed out on Christmas Day!). More about that soon.

For those of you unfamiliar with December updates from previous years, Google has explained in the past that they try to limit updates as the holidays approach. But that statement hasn’t held up over the past few years. Last year, we saw significant updates leading up to the holidays (with the last being December 15, 2016). I wrote about those updates on my blog after seeing serious volatility last year. By the way, keep December 15 in the back of your mind. We’ll be revisiting that date shortly, but for 2017 instead. Just an interesting side note.

First, let me cover the dates I saw significant movement and then we’ll start to dig further into the specific updates.

And here is a handy table of contents that links to specific sections of the post:

  • List of dates in December with significant volatility
  • December 5 – it begins
  • December 12 – The Maccabees Update
  • December 15 – Celebrity Update
  • December 18 – Relevancy, and possibly a continuation of the 12/15 changes
  • December 26 – Merry Christmas! Manual actions and more algo changes
  • Recommendations for those impacted

A quick note about the process I used when analyzing the updates:
I have access to a large dataset of sites that have dealt with quality problems in the past, across countries and categories. Between the sites I already have access to (and have assisted), new companies reaching out to me after seeing movement during the recent updates (up or down), and digging into specific verticals during my analysis, I ended up with a list of 153 different websites that were impacted by the December updates.

For the sites I have helped, or have access to, I know the problems they are dealing with, what they have been doing to rectify those issues, etc. In addition, when it made sense, I decided to crawl a number of the sites that were impacted to get a better feel for what was going on from a quality and technical SEO standpoint. As many of you know, sometimes a crawl can reveal a number of problems that are harder to identify manually. More about that soon.

Dates In December With Significant Volatility (With Examples Of Movement)

December 5, 2017
Drop during the December 5, 2017 google algorithm update.

Increase during the December 5, 2017 Google algorithm update.

Drop during the December 5, 2017 Google algorithm update.

December 12, 2017
(Barry named this the Maccabees update.)

Drop during the Google Maccabees Update on 12/12/17.

Big drop during the December 12, 2017 Google algorithm update.

Big drop during the December 12, 2017 Google algorithm update.

Increase during the December 12, 2017 update.

Connection between 12/12 update and previous quality updates.

December 15, 2017
I saw many official celebrity sites drop on this date, among other hits and surges. You can read more in my post covering the celebrity update.
Here is the search visibility drop for Channing Tatum’s official website:

Official site for Channing Tatum dropping after 12/15 update.

And here is the drop from #1 to #11 for his own name:

Channing Tatum ranking #11 after the 12/15 update.

December 18, 2017 (could be tied to 12/15)

Drop during the December 18, 2017 update.

Drop during the December 18, 2017 update.

Drop during December 18, 2017 update.

Increase during the 12/18 Google algorithm update.

December 26, 2017
Yes, one day after Christmas. Ho ho ho. Note, there were also manual actions dished out on Christmas Day, but I saw algorithmic changes as well, showing on the 26th. There was also more volatility on 1/4/18, and you can see that in one of the screenshots below. So Google apparently wasn’t done yet… the volatility continued into early January 2018, with movement on a number of sites on 1/4/18.

Drop on 12/26/17.

Increase on 12/26/17 with connections to previous updates.

Drop on 12/26 and more of a drop on 1/5/18.

Important Note: Google confirmed that they pushed multiple changes in a short amount of time in December. Actually, Danny Sullivan said, “several minor improvements”. Therefore, it’s really hard to pin down what each update targeted (since they undoubtedly overlapped in December). Also, some sites seemed to be impacted during multiple dates in December, and those sites often had a number of problems. I’ll cover what I’m seeing below, but the core point is to understand the problems overall (versus trying to connect one date with one set of problems). Also, Danny said they were “minor changes”. Based on the hits and surges I analyzed, they didn’t look minor. :)

Danny Sullivan confirms December changes in December of 2017.

More details about each update:

December 5, 2017 – Over-optimization, doorway-like pages.
Once I started digging into the data, I could clearly see movement on 12/5/17. And although many sites impacted on 12/12 were targeting many keyword permutations, I saw that a number of those sites were actually first impacted on 12/5.

The reason I bring this up is that Barry Schwartz dug into a number of sites that claimed to have been hit by the Maccabees update on 12/12, and he saw that many of those sites were targeting many keyword permutations. He was right, and many employed doorway-like pages. But based on the data I analyzed, sites employing that tactic were hit on 12/5/17 as well.

Some were large-scale sites targeting many locations or keyword variations. I saw impact on a range of different sites, including large-scale directories, ecommerce retailers, affiliate sites, etc. So as part of that update, Google could have made an adjustment to how it algorithmically handles over-optimization, keyword stuffing, doorway pages, etc.

It was clear that the sites were targeting many different keyword variations on specific pages (again, doorway page-like). The pages were often over-optimized as well, with many keyword variations targeted in their title tags. For example, I found close to 800 different pages on one site with nearly the same title tag (all targeting slight variations in keywords). And many of those pages were thin and low-quality. When checking the site overall, I saw thousands of those urls indexed.

Here’s a quick visual of how those pages were laid out. Again, there were close to 800 of these on the site… And notice the blank area where main content should be. This is just one example, but I saw many during my travels while analyzing the 12/5 and 12/12 updates.

Sites targeting many keyword variations impacted on 12/5/17.
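
For reference, surfacing that title tag pattern is straightforward if you have a crawl export. Here is a small sketch assuming a crawl.csv with “Address” and “Title 1” columns (Screaming Frog’s default naming); the digit normalization and the 50-page threshold are illustrative assumptions.

```python
# Group crawled pages by normalized title to surface doorway-like templates.
# Assumes a crawl export (crawl.csv) with "Address" and "Title 1" columns.
import csv
import re
from collections import defaultdict

groups = defaultdict(list)
with open("crawl.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Lowercase and blank out digits so numeric variations
        # (zip codes, model numbers, years) collapse into one bucket.
        key = re.sub(r"\d+", "#", row["Title 1"].lower()).strip()
        groups[key].append(row["Address"])

# Report title patterns shared by many urls (possible doorway-like pages).
for title, urls in sorted(groups.items(), key=lambda x: -len(x[1])):
    if len(urls) >= 50:
        print(f"{len(urls)} pages share the title pattern: {title!r}")
```

Word-level variations (city names, product names) need fuzzier matching, but even crude grouping like this would surface a cluster of nearly 800 pages sharing one title template.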

December 12, 2017 – Maccabees Update. A mixed bag of over-optimization and quality problems.
The movement on December 12 was substantial and gained the attention of many site owners and SEOs. Barry Schwartz covered the update heavily on Search Engine Roundtable and he also dug into a number of sites that were hit on that day. As mentioned earlier, Barry saw a range of problems, but a number of sites were targeting many keyword variations (doorway-like). That’s very similar to what I saw on 12/5 as well.

While analyzing sites that truly dropped or surged during 12/12 (and not 12/5), you could see a range of problems on the sites that were impacted. Based on the data I have access to, some sites impacted fell into the keyword permutations bucket (over-optimized and doorway page-like, just like Barry saw), while others had many of the usual suspects from a quality standpoint. I also saw heavy affiliate impact, with a number of affiliate sites prioritizing monetization over user experience negatively impacted on 12/12.

And from a quality update standpoint, the term “quality” can mean several things from low-quality content to user experience barriers to aggressive and disruptive advertising, and even technical problems that cause quality problems. I have covered these issues many times in the past while writing about major algorithm updates.

So to me, 12/12 looked like Google pushed a few changes, including what many would call a “quality update”. Again, I’ve written about many of these updates over the years (including August 2017, May 2017, February 2017, and more). You can read those posts to learn more about what I was seeing then.

For example, check out the percentage of low-quality or thin pages I surfaced on this site which was hit on 12/12:

Percentage of low-quality and thin pages.

And here is an example of the layout for one affiliate site that was hit hard. The user experience was horrible. It was clear that monetization was the priority:

Heavy affiliate sites were impacted.

I also saw sites covering what the Quality Rater Guidelines explains as “Your Money or Your Life” topics (YMYL), yet they lacked what’s called E-A-T, or expertise, authoritativeness, and trustworthiness. Sometimes there were no authors listed, and when there were, they had missing photos and weak bios, and exhibited no expertise in the subject matter (covering health, finance, and more). I believe some were fake. I even saw disclaimers that the site contained “general information” that was “not written by a professional” in that niche.

Here’s a quick screenshot from the QRG about E-A-T:

EAT in the Quality Rater Guidelines (QRG)

So if you were impacted on 12/12, I recommend reading my posts about quality updates and running through the Quality Rater Guidelines (QRG) several times. I surfaced all types of major quality problems on sites impacted by the 12/12 update (beyond the keyword variations problem).

Here are a few bullets directly from my analysis (there are many, this is just a quick list):

  • Surfaced over one thousand thin or near-empty pages.
  • Contains aggressive ads all over the site, including giant ads above the main content pushing the MC down the page.
  • Contains deceptive ads, not labeled, that look like main content, and drive users downstream to advertiser sites. Totally deceptive.
  • YMYL, but no E-A-T. Also contains aggressive and disruptive ads. Horrible user experience.
  • Low-quality affiliate site. Prioritizing monetization over UX. No surprise this was hit.
  • UX barriers galore, including a botched mobile UX. Hard to read the main content and traverse the site. I’m sure many are leaving the site quickly, especially mobile users.
  • And more problems that I simply don’t have enough room to include… 

Therefore, you need to objectively analyze your site through the lens of quality, and then make significant improvements over the long-term. John Mueller has explained this several times over the past few years when being asked about major algorithm updates. Don’t put a band-aid on the situation and expect to recover quickly. That simply won’t happen.

Here is John explaining this at 39:33 in the video:


And if you are employing many pages targeting many different keyword permutations, then you should revisit that approach. I would only publish pages when you have enough quality content to meet or exceed user expectations. And I would avoid publishing countless pages containing the same (or close to the same) content, but optimized for many different keyword variations. I could clearly see many sites that were negatively impacted employing that tactic. As Barry Schwartz documented in his post, and based on what I saw as well, there were many sites impacted that contained over-optimized pages.

December 15, 2017 – The Celebrity Update: Relevancy, user-happiness.
While analyzing both the 12/5 and 12/12 updates, I started to notice movement on 12/15 as well. And that included serious impact on many official celebrity websites (with most of those sites dropping in rankings for the celebrity’s name).

Prior to 12/15/17, many official websites ranked number one for the celebrity name. And then boom, 12/15 arrived, and many of those sites dropped. And some dropped to the bottom of page one, while others dropped off page one. I saw this across many official celebrity sites and wrote a post covering my findings. I also heard back from Danny Sullivan on Twitter that he passed the information along to others at Google after reading my post (I’m assuming to the Search Quality team, Search engineers, etc.)

Tweet from Danny Sullivan about the 12/15 update.

For example, Tom Cruise, Charlie Sheen, John Lennon, Marilyn Monroe, and more all dropped on 12/15. Here’s an example of John Lennon’s official site ranking #1 before 12/15 and then dropping after 12/15. And check out the search visibility drop for his official site.
John Lennon's official site before 12/15.

Drop in search visibility for John Lennon's official site.

John Lennon's official site after 12/15.

Again, you can read my post about the 12/15/17 update to learn more about the algorithm update and to view more examples.

But why would these sites get hammered? Well, if you check many official celebrity websites, you’ll notice major quality problems. The sites are typically thin, don’t contain updated information, are sometimes just single-page websites, have serious performance problems, and more.

Therefore, it seems Google began treating those official celebrity websites just like any other site starting on 12/15. Which also means… Google wasn’t treating them the same way prior to 12/15. I think that’s the most important point about the update on 12/15. Don’t get too caught up in “celebrity X dropped in rankings”, but you should get caught up in “Google’s core ranking algorithm was treating some official celebrity sites differently than others prior to X date”. That’s the real news with the 12/15 update.

But not all celebrities were hit. I provided an example in my post about Katy Perry’s site, which retained the number one spot for her name. And when you check out her site, you’ll see a much stronger site than most official celebrity sites. There’s more content, it’s well-organized, it contains updated information, and so on. Dare I say that Katy “Roared” her way through the update? :)

Katy Perry's official site still ranking number one.

John Mueller’s Celebrity Response:
I was able to ask John Mueller a question about the update during the last webmaster hangout video. You can view John’s response below. He basically explained what I wrote above, which was that users might be looking for more, or different information, versus what those official celebrity websites offer. That might be information from Wikipedia, movie information from IMDB, celebrity news, or information from other sites. But again, it’s interesting that the official celebrity sites all ranked number one prior to 12/15.

December 18, 2017 – More relevancy adjustments, Tremor from 12/15?
While analyzing the 12/15 update, it seemed that Google wasn’t done yet. There’s nothing that screams “holiday spirit” like more algorithm updates as we approach Christmas Day. Yes, I saw yet another date with significant movement, which was 12/18/17.

While checking out the drops and surges on that day, they seemed closely tied to relevancy. For example, Google seemed to be making sure that sites ranking prominently for queries should actually rank for them, and then adjusting the ones that were “overly-prominent”. Basically, sites that shouldn’t have been ranking well for those queries dropped. Sounds very Panda-like, right?

Now, since 12/18 is so close to 12/15, it’s totally possible the impact was from the same update. It’s hard to say. And if you think about it, the celebrity update was about relevancy as well. For example, as John Mueller explained, making sure the right sites are surfaced based on what Google believes users want to see. So again, 12/15 and 12/18 could very well have been the same update (or connected). Remember, Google can push smaller tweaks after larger updates are released. I called those smaller updates “tremors” and John Mueller confirmed that with me back in the “medieval Panda” days.

Algorithm tremors.

December 26, 2017 – Manual actions AND algorithmic movement. And Giphy begins its long road back.
Remember I said Google used to avoid pushing updates close to the holidays? Well, you can’t get much closer than the day after Christmas! Also, there were reports of manual actions being handed out on Christmas Day and you could see the damage pretty clearly, like the screenshot below. Notice the big drop and then the quick recovery? That site was hit on 12/25 with a manual action for unnatural links and it only took three to five days to have the manual action revoked. And the site regained many of the keywords quickly (but not all). This was documented by the site owner.

December 25, 2017 manual action revoked.

And at the same time, I saw sites surging or dropping algorithmically starting on 12/26. Therefore, not only will Google push changes close to the holidays, they are pushing changes very close to the actual holiday (and even dishing out manual actions on Christmas Day!)

One site that caught my attention that was positively impacted on 12/26/17 was Giphy, which got hammered back in October. I shared the massive drop on Twitter at the time. Google basically started deindexing their /search/ pages in October, which were ranking well and driving a lot of traffic.

Here was the massive drop for Giphy in early October 2017:

Giphy drops in early October 2017.

On 12/26, Giphy began the long road back to recovery. Note, they didn’t jump back to where they were… but they did surge. And the /search/ pages aren’t driving that surge. Instead, it looks like they are building out explore pages, while their gif pages saw an increase too.

Giphy surges on 12/26/17.

Here is a snapshot of ranking changes during the 12/26/17 update for Giphy:

Giphy rankings increase on 12/26/17.

Others saw movement as well on 12/26 (both up and down). And the second site below had been impacted by previous quality updates (February and May of 2017):

Big drop on 12/26/17.

Increase during the 12/26/17 update with connections to previous updates.

Important Side Note: Losing Featured Snippets and Rich Snippets – A sign of a quality update.
I’ve mentioned before that both rich snippets and featured snippets have a quality component. Therefore, sites can lose or gain them when Google refreshes its quality algorithms. Well, we saw more of that during the December updates.

For example, check out the site below, which was negatively impacted by one of the December updates. They had many rich snippets until the update, and then boom, they were removed. Nearly 18% of queries yielded rich snippets prior to the algorithm update, and that dropped to just .27%. That’s insane. Video results also drastically dropped, but I haven’t dug into that yet for the site in question. Either way, it’s an interesting side note.

Site loses rich snippets after Google algorithm update.

My recommendations for moving forward:
Moving forward, I have some recommendations for site owners that were impacted (and for those that believe they are in the gray area and susceptible to being hit by future updates). First and foremost, always look to hunt down quality problems, UX barriers, aggressive and disruptive advertising, and technical SEO problems. If you stay on top of that, and nip those problems in the bud, you have a much better chance at avoiding a major hit. And that includes surfacing over-optimization and doorway-like pages which were hit on 12/5 and 12/12. Make sure you are not employing any tactics that can cause serious damage. Google clearly made an adjustment from an over-optimization and doorway page standpoint, and many sites were left reeling.

Because if you don’t stay on top of quality problems, and those issues remain on a site for an extended period of time, then you can wake up to a scary situation. I’ve received calls from many companies over the years that fell into the latter category. It’s not pretty and I recommend avoiding that at all costs.

Here’s a quick list of items you can start on now. It doesn’t contain every item you should tackle, but it’s a heck of a starting point:

  • Crawl your site today, and then on a regular basis. Hunt down quality problems, technical SEO problems, and more. Fix them ASAP.
  • Analyze your site through the lens of quality. Make sure you are providing the best possible content and user experience for the topic at hand. Objectively evaluate content-quality from the standpoint of a typical user searching for queries on Google that lead to those pages.
  • Read the Quality Rater Guidelines (QRG) several times, review your site objectively, surface potential problems, and rectify those problems quickly. As I’ve said before, I’m seeing many connections between the QRG and what I’m seeing in the field while analyzing sites impacted by major algorithm updates.
  • Ensure your site works well across devices, and make sure you are ready for Google’s mobile-first index. Google is actively moving sites to its mobile-first index now and will continue to do so throughout 2018 (and beyond).
  • Make sure your ads don’t send users screaming from your site. And don’t deceive your users with weaved ads or ads that look like main content. Don’t hold affiliate dollars over user experience. That probably won’t work out well for you.
  • Remove UX barriers that inhibit people from accomplishing tasks on your site. Google doesn’t want to send people to a site that’s frustrating to use.
  • Perform user testing. You never know what real people think about traversing your site until you actually hear from them. You might be shocked what you find.

Summary – The Hornets’ Nest Is The New Norm
I don’t see the frequency of updates slowing down any time soon. As we’ve seen in both September and December of 2017, Google can, and will, push multiple updates in a short amount of time. And that’s without giving much information about the specific updates. I recommend reading my closing tips, which can help weed out quality problems. And I would do this even if you haven’t been hit by an algorithm update. Some proactive work now can help maintain the long-term health of your site.

Moving forward, I’ll continue sharing more of what I’m seeing volatility-wise here on my blog, on Twitter, Facebook, etc. And if history has shown us anything, I’ll be posting about the next wave of updates pretty soon.

Welcome to Google Land. Please keep your arms, legs, and websites inside the vehicle at all times. :)

GG


Filed Under: algorithm-updates, google, manual-actions, seo

Panda, Penguin, and Manual Actions – Questions, Tips, and Recommendations From My SES Atlanta Session

July 14, 2014 By Glenn Gabe

SES Atlanta Panda

{Important Update About Penguin: Read John Mueller’s latest comments about the Penguin algorithm.}

I just returned from SES Atlanta, where I presented “How To Avoid and Recover From Panda, Penguin, and Manual Actions”. The conference was outstanding, and included a killer keynote by Duane Forrester and sessions packed with valuable information about SEO and SEM. By the way, I entered my hotel room in Atlanta and immediately saw a magazine on the desk. The photo above is the cover of that magazine! Yes, a Panda was on the cover. You can’t make this stuff up. :)

During (and after) my presentation about algorithm updates and penalties, I received a number of outstanding questions from audience members. And later in the day, I led a roundtable that focused on Panda and Penguin. There were also some great conversations during the roundtable with business owners and marketers across industries. It’s always interesting to hear top marketer concerns about major algorithm updates like Panda and Penguin (especially Panda 4.0, which had just rolled out in late May). We had a lively conversation for sure.

On the flight home, I started thinking about the various questions I was asked, which areas were the most confusing for marketers, and the tips and recommendations I was sharing. And based on that list, I couldn’t help but think a Q&A-style blog post could be very helpful for others dealing with Panda, Penguin, and manual actions. So, I decided to write this post covering a number of those questions. I can’t cover everything that I spoke about at SES Atlanta (or this post would be huge), but I can definitely provide some important tips and recommendations based on questions I received during the conference. Let’s jump in.

Algorithm Updates and Manual Actions – Q&A From SES Atlanta

Question: I’ve been hit by Panda 4.0. What should I do with “thin content” or “low-quality” content I find on my website? Is it better to nuke the content (404 or 410), noindex it, or redirect that content to other pages on my site?

Glenn: I hear this question often from Panda victims, and I know it’s a confusing topic. My recommendation is to remove thin and low-quality content you find on your site. That means 404 or 410 the content or noindex the content via the meta robots tag. When you have a content quality problem on your site, you need to remove that content from Google’s index. In my experience with helping companies recover from Panda, this has been the best path to take.

That said, if you find content that’s thin, but you feel you can enhance that content, go for it. If you believe the content could ultimately hold information that people are searching for, then beef it up. Just make sure you do a thorough job of developing the additional content. Don’t replace thin content with slightly thin content. Create killer content instead. If you can’t, then reference my first point about nuking the content.

Also, it’s important to ensure you are removing the right content… I’ve seen companies nuke content that was actually fine thinking it was low-quality for some reason. That’s why it’s often helpful to have an objective third party analyze the situation. Business owners and marketers are often too close to their own websites and content to objectively rate it.

Panda Decision Matrix


Question: How come I haven’t seen a Panda recovery yet even though I quickly made changes? I was expecting to recover during the next Panda update once the changes were implemented.

Glenn: This is another common question from Panda victims. It’s important to understand that completing the changes alone isn’t enough. Google first needs to recrawl the site and the changes you implemented.  Then it needs to better understand user engagement based on the changes. I’ve explained many times in my blog posts about Panda that the algorithm is heavily focused on user engagement. So just making changes on your site doesn’t provide Google enough information.

Panda recovery can take time. Just read my case study about 6 months with Panda. That was an extreme situation in my opinion, but it’s a great example of how long it can take to recover.

Second, Panda roughly rolls out once per month. You need an update to occur before you can see changes. But that’s not a hard rule. John Mueller from Google clarified the “Panda Tremors” I have been seeing since Panda 4.0, and explained that there isn’t a fixed frequency for algorithm updates like Panda. Instead, Google can continue to tweak the algo to ensure it yields the desired results. Translation: you might see turbulence after a Panda hit (and you may see increases or decreases as the tremors continue).

Panda Tremors John Mueller

And third, you might see smaller recoveries over time during subsequent updates (versus a full recovery in one shot). I’ve had several clients increase with subsequent Panda updates, but it took 4-5 updates for them to fully recover. So keep in mind that you might not see full recovery in one shot.


Question:  We know we have an unnatural links problem, and that we were hit by Penguin, but should we tackle the links problem or just build new links to balance out our link profile?

Glenn: I’ve seen many companies that were hit by Penguin avoid tackling the root problem, and instead, just try and build new links to balance out their link profile. In my opinion, that’s the wrong way to go. I always recommend aggressively handling the unnatural links situation, since that’s the most direct path to Penguin recovery.

And to clarify, you should still be pumping out killer content, using Social to get the word out, etc. I always tell clients impacted by Penguin or Panda to act like they aren’t impacted at all. Keep driving forward with new content, sharing via social media, connecting with users, etc. Fresh links and shares will be a natural side effect, and can help the situation for sure. And then the content they are building while under the Penguin filter could end up ranking well down the line. It’s hard to act like you’re not hit, but that’s exactly what you need to do. You need to be mentally tough.

Address Unnatural Links for Penguin


Question: Is it ok to remove content from Google’s index? Will that send strange signals to the engines?

Glenn: Nuke it. It’s totally fine to do so, and I’ll go even further and say it could be a great thing to do. I mentioned this several times in my Panda 4.0 findings, but the right indexation is more important than high indexation. In other words, make sure Google has your best content indexed, and not thin, duplicate, or other low-quality content.

I had one client drop their indexation by 83% after being impacted by Phantom and Panda, and they are now doing extremely well Google organic-wise. I love the screenshot below. It goes against what many marketers would think. Lower indexation = more Google traffic. That’s awesome.

Indexation and Panda


Question: We consume a lot of syndicated content. What’s the best way to handle attribution?

Glenn: I saw a number of sites get smoked during Panda 4.0 that were consuming a lot of syndicated content and not handling it properly SEO-wise. The best way to handle attribution for syndicated content is to use the cross-domain canonical url tag pointing to the original article. If you can’t do that (or don’t want to), then you can keep the content out of Google’s index by noindexing it via the meta robots tag.

It’s not your content, so you shouldn’t be taking credit for it.  That said, if set up correctly, it’s fine to have syndicated content on your site for users to read. But the proper attribution is important or it can look like you are copying or scraping content. I know that won’t go over well for ad teams looking to rank in organic search (to gain more pageviews), but again, it’s not your content to begin with.

Syndication and Panda


Question: Why hasn’t there been a Penguin update since October of 2013? What’s going on? And will there ever be another update?

Glenn: It’s been a long time since the last Penguin update (October 4, 2013). Like many others heavily involved with Penguin work, I’m surprised it has taken so long for another update.

Penguin 2.1 on October 4, 2013

Matt Cutts recently explained at SMX Advanced that they have been heavily working on Panda 4.0, so Penguin has taken a back seat. But he also said that an engineer came up to him recently and said, “it’s probably time for a Penguin update”. That situation is both positive and scary at the same time.

On the one hand, at least someone is thinking about Penguin on the webspam team! But on the flip side, they clearly haven’t been focusing on Penguin for some time (while many Penguin victims sit waiting for an update). On that note, there are many webmasters who have rectified their unnatural link problems, disavowed domains, urls, etc., and are eagerly awaiting a Penguin update. It’s not exactly fair that Google has been making those business owners wait so long for Penguin to roll out again.

Now, there’s always a possibility that there is a problem with the Penguin algorithm. Let’s face it, there’s no reason it should take so long in between updates. I’m wondering if they are testing Penguin and simply not happy with the results. If that’s the case, then I could see why they would hold off on unleashing a new update (since it could wreak havoc on the web). But that’s just speculation.

In my opinion, it’s not cool to let Penguin victims that have worked hard to fix their link problems sit in Penguin limbo. So either Google is seriously punishing them for the long term, they have put the algo on the back burner while focusing on other algos like Panda, or Penguin is not up to par right now. Remember, if Google isn’t happy with the results, then they don’t have to push it out. And if that’s the case, Penguin victims could sit in limbo for a long time (even longer than the 9 months they have waited so far). Not good, to say the least.


Important Penguin Update: Google’s John Mueller provided more information about the Penguin algorithm on today’s Webmaster Central Office Hours Hangout.

John was asked if Penguin would be released again or if it was being retired (and if it was being “retired”, whether Google would at least run it one more time to free those webmasters that had cleaned up their link profiles). John explained that Penguin was not being retired. Let me say that again: he said Penguin is not being retired. John explained that it can sometimes take longer than expected to prepare the algorithm and update the necessary data. He also explained that if Google were to retire an algorithm, then they would “remove it completely” (essentially removing any effect from the algorithm that was in place).

So we have good news on several fronts. Penguin is still alive and well. And if Google did retire the algo, then the effect from Penguin would be removed. Let’s hope another Penguin update rolls out soon.

You can view the video below (starting at 5:16) or you can watch on YouTube -> https://www.youtube.com/watch?v=8r3IIPCHt0E&t=5m16s


Question: We’ve been hit by both Panda and Penguin. We don’t have a lot of resources to help with recovery, so which one do we tackle first?

Glenn: I’ve helped a number of companies with Pandeguin problems over the years, and it’s definitely a frustrating situation for business owners. When companies don’t have resources to tackle both situations at the same time, then I’ve always been a big fan of tackling the most acute situation first, which is Penguin.

Pandeguin Hit

Panda is a beast, and has many tentacles. And Penguin is all about unnatural links (based on my analysis of over 400 sites hit by Penguin since April 24, 2012). That’s why I recommend focusing on Penguin first (if you can’t knock out both situations at once). Aggressively tackle unnatural links: remove as many spammy links as you can, and then disavow the remaining ones you can’t get to manually. Then set up a process for monitoring your link profile over time (to ensure new unnatural links don’t pop up).
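
As a side note, if you end up with a long list of domains to disavow, generating the file programmatically reduces formatting errors. Here is a minimal sketch; spammy_domains.txt is a hypothetical input you’d build during the link audit, while the domain: prefix and # comment syntax are the format Google’s disavow tool accepts.

```python
# Generate a disavow file from a vetted list of spammy domains.
# spammy_domains.txt (one domain per line) is a hypothetical input built
# during the link audit; "domain:" and "#" are Google's disavow file syntax.
from datetime import date

with open("spammy_domains.txt") as f:
    domains = sorted({line.strip().lower() for line in f if line.strip()})

with open("disavow.txt", "w") as out:
    out.write(f"# Disavow file generated {date.today()} after manual link audit.\n")
    out.write("# Removal requests were sent first; these domains never responded.\n")
    for domain in domains:
        out.write(f"domain:{domain}\n")

print(f"Wrote {len(domains)} domains to disavow.txt")
```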

After that, you can tackle the Panda problem. I would begin with a comprehensive Panda audit, identify the potential problems causing the hit, and aggressively attack the situation (the bamboo). Move quickly and get out of the grey area of Panda (it’s a maddening place to live).
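No script can find the bamboo for you, but a quick triage pass can help prioritize an audit. Here’s a minimal sketch that flags potentially thin pages by visible word count. The urls.txt input file and the 300-word threshold are assumptions for illustration; treat the output as a starting point for manual review, not a Panda detector.

```python
# Sketch: triage potentially thin pages by visible word count.
# The urls.txt input (one URL per line) and the 300-word cutoff are
# illustrative assumptions; tune them for your own site.
import requests
from bs4 import BeautifulSoup

THIN_THRESHOLD = 300

def visible_word_count(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # drop code and styling, keep visible text
    return len(soup.get_text(separator=" ").split())

with open("urls.txt") as f:
    for url in (line.strip() for line in f if line.strip()):
        count = visible_word_count(url)
        if count < THIN_THRESHOLD:
            print(f"THIN ({count} words): {url}")
```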

 

Question: Is linkbuilding dead? Should I even focus on building links anymore and how do I go about doing that naturally?

Glenn: Links are not dead! The right links are more important than ever. I know there’s a lot of fear and confusion about linkbuilding since Google waged war on unnatural links, but to me, that makes high-quality links even more powerful.

Duane Forrester recently wrote a post on the Bing Webmaster Blog where he explained that if you know where a link is coming from before you gain it, you are already going down the wrong path. That was a bold statement, but I tend to agree with him.

Duane Forrester Quote About Linkbuilding

I had several conversations about this topic at SES Atlanta. To me, if you build killer content that helps your target audience, addresses pain points, and teaches users how to accomplish something, there’s a good chance you’ll build links. And it’s not the quantity of links that matters… it’s the quality. I’d rather see a client build one solid link from a site in their niche than 1,000 junky links. The junky links are Penguin food, while the solid link is gold.

 

Question: I was hit by Panda, but my core competitors have the same problems we do. We followed what they were implementing, and we got hit. Why didn’t they get hit? And moving forward, should we follow others who are doing well SEO-wise?

Glenn: I can’t tell you how many times companies contact me and start showing me competitors that are doing risky things SEO-wise, yet are still doing well in Google. They explain that they tried to reproduce what those competitors were doing, and they ended up getting hit by Panda. That situation reinforces what I’ve told clients for a long time: competitive analyses can be extremely beneficial for gathering the right intelligence about your competitors, but don’t blindly follow what they are doing. That’s a dangerous road to travel.

Instead, companies should map out a strong SEO strategy based on their own research, expertise, target audience, etc. Make sure you are doing the right things SEO-wise for long-term success. The companies you copy could easily be headed toward SEO disaster, and you’ll be following right along.

For example, one of my clients always brought up a specific company that was pushing the limits SEO-wise (using dark grey hat tactics). Well, that company finally got hit during a Panda update in early 2014 and lost a substantial amount of traffic. I sent screenshots to my client, which reinforced my philosophy. My client was lucky they didn’t follow that company’s tactics… they would have jumped right off the SEO cliff with it. The screenshot below shows a typical surge in Google traffic before a crash.

Surge in Traffic Before Algo Hit

 

Question: We’ve been working hard on a manual action for unnatural links, but right before we filed a reconsideration request, the manual action expired. What should we do?

Glenn: I’ve seen this happen with several clients I was helping with manual actions, and it’s a weird situation for sure. You are working on fixing the problems behind a manual action, and right before you file a reconsideration request, the manual action disappears from Google Webmaster Tools. When that happens, is the site OK? Do you still need to file a reconsideration request with Google, should you wait, or should you keep working on the cleanup?

It’s important to know that manual actions do expire. Google has confirmed this is the case, although the length of each manual action varies, and Marie Haynes has written an article about expiring manual actions if you want more detail. But those manual actions can return if you haven’t tackled the problem thoroughly… so don’t think you’re in the clear just yet.

Expiring Manual Actions

 

That said, if you have tackled the problem thoroughly, you are probably OK. For example, I was helping a company with a manual action for unnatural links, and we had completed the process of removing and disavowing almost all of their unnatural links. We had already written the reconsideration request and were simply waiting on a few webmasters who were supposed to take down more links before we filed with Google.

As we were waiting (just a few extra days), the manual action disappeared from Google Webmaster Tools. Since we had completed a full link cleanup, we simply drove forward with other initiatives. That was months ago, and the site is doing great SEO-wise (surging over the past few months).

Just make sure you thoroughly tackle the problem at hand. You don’t want a special surprise in your manual action viewer one day… the return of the penalty. Avoid that by thoroughly fixing the problems that caused it in the first place.

 

Summary – Clarifying Panda and Penguin Confusion
As you can see, there were some outstanding and complex questions asked at SES Atlanta. It confirms what I see every day: business owners and webmasters are extremely confused by algorithm updates like Panda and Penguin and by how to tackle penalties. And when you combine algo updates with manual actions, you have the perfect storm of SEO confusion.

I hope the Q&A above helped answer some questions you might have about Panda, Penguin, and manual actions. Again, there were several more questions asked than I could fit into this post! Maybe I’ll tackle those in a follow-up. So stay tuned, subscribe to my feed, and keep an eye on my Search Engine Watch column.

And be prepared: I felt a slight chill in the air this past weekend, and the next Penguin update could (and should) be arriving soon. Only Google knows, but as I said above, there are many webmasters eagerly awaiting another rollout. Let’s hope it arrives sooner rather than later.

GG

 
