The Internet Marketing Driver


Archives for December 2013

Panda Report – How To Find Low Quality Content By Comparing Top Landing Pages From Google Organic

December 18, 2013 By Glenn Gabe 13 Comments

Top Landing Pages Report in Google Analytics

Note, this tutorial works in conjunction with my Search Engine Watch column, which explains how to analyze the top landing pages from Google Organic prior to, and then after, Panda arrives.  With the amount of confusion circling Panda, I wanted to cover a report webmasters can run today that can help guide them down the right path while on their hunt for low-quality content.

My Search Engine Watch column covers an overview of the situation, why you would want to run the top landing pages report (with comparison), and how to analyze the data.  And my tutorial below covers how to actually create the report.  The posts together comprise a two-headed monster that can help those hit by Panda get on the right track.   In addition, my Search Engine Watch column covers a bonus report from Google Webmaster Tools that can help business owners gather more information about content that was impacted by the mighty Panda.

Why This Report is Important for Panda Victims
The report I’m going to help you create today is important, since it contains the pages that Google was ranking well and driving traffic to prior to a Panda attack.  And that’s where Google was receiving a lot of intelligence about content quality and user engagement.  By analyzing these pages, you can often find glaring Panda-related problems.  For example, thin content, duplicate content, technical problems causing content issues, low-quality affiliate content, hacked content, etc.  It’s a great way to get on the right path, and quickly.

There are several ways to run the report in Google Analytics, and I’ll explain one of those methods below.  And remember, this should not be the only report you run… A rounded analysis can help you identify a range of problems from a content quality standpoint.  In other words, pages not receiving a lot of traffic could also be causing Panda-related problems.  But for now, let’s analyze the top landing pages from Google Organic prior to a Panda hit (which were sending Google the most data before Panda arrived).

And remember to visit my Search Engine Watch column after running this report to learn more about why this data is important, how to use it, red flags you can identify, and next steps for websites that were impacted.  Let’s get started.

How To Run a Top Landing Pages Report in Google Analytics (with date comparison): 

  • First, log into Google Analytics and click the “All Traffic” tab under “Acquisition”.  Then click “Google / Organic” to isolate that traffic source.
    Accessing Google Organic Traffic in Google Analytics
  • Next, set your timeframe to the date after Panda arrived and extend that for a decent amount of time (at least a few weeks if you have the data).  If time allows, I like to set the report to 4-6 weeks after Panda hit.  If this is right after an algorithm update, then use whatever data you have (but make sure it’s at least one week).  I’m using a date range after the Phantom update hit (which was May 8th).
    Setting a Timeframe in Google Analytics
  • Your next move is to change the primary dimension to “Landing Page” to view all landing pages from Google organic search traffic.  Click the “Other” link next to “Primary Dimension” and select “Acquisition”, and then “Landing Page”.  Now you will see all landing pages from Google organic during that time period.
    Primary Dimension to Landing Page in Google Analytics
  • Now let’s use some built-in magic from Google Analytics.  In the timeframe calendar, you can click a checkbox for “Compare to” and leave “Previous period” selected.  Once you click “Apply”, you are going to see all of the metrics for each landing page, but with a comparison of the two timeframes.  And you’ll even have a nice trending graph up top to visualize the Panda horror.
    Comparing Timeframes in Google Analytics
  • As you start going down the list of urls, pay particular attention to the “% Change” column.  Warning, profanity may ensue.  When you start seeing pages that lost 30%, 40%, 50% or more traffic when comparing timeframes, then it would be wise to check out those urls in greater detail.  Again, if Google was sending a lot of traffic to those urls, then it had plenty of user engagement data from those visits.  You might just find that those urls are seriously problematic from a content quality standpoint.
    Viewing The Percent Change in Traffic in Google Analytics

 

Bonus 1: Export to Excel for Deeper Analysis

  • It’s ok to stay within Google Analytics to analyze the data, but you would be better off exporting this data to Excel for deeper analysis.  If you scroll to the top of the Google Analytics interface, you will see the “Export” button.  Click that button and then choose “Excel (XLSX)”.  Once the export is complete, it should open in Excel.  Navigate to the “Dataset” worksheet to see your landing page data (which is typically the second worksheet).
    Exporting A Report In Google Analytics
  • At this point, you should clean up your spreadsheet by deleting columns that aren’t critical for this report.  Also, you definitely want to space out each column so you can see the data clearly (and the data headers).
    Clean Up Google Analytics Export in Excel
  • You’ll notice that each url has two rows, one for the current timeframe, and one for the previous timeframe.  This enables you to see all of the data for each url during both timeframes (the comparison).
    Two Rows For Each URL Based on Timeframe
  • That’s nice, but wouldn’t it be great to create a new column that showed the percentage decrease or increase for visits (like we saw in Google Analytics)?  Maybe even with highlighting to show steep decreases in traffic?  Let’s do it.  Create a new column to the right of “Visits” and before “% New Visits”.  I would title this column “% Change” or something similar.
    Creating a New Column for Percent Change in Excel
  • Next, let’s create a formula that provides the percentage change based on the two rows of data for each url.  Find the “Visits” column and the first landing page url (which will have two rows).  Remember, there’s one row for each timeframe.  If your visits data is in column C, then the post-Panda data is in row 2, and the pre-Panda data is in row 3 (see screenshot below).  You can enter the following formula in the first cell of the new “% Change” column: =(C3-C2)/C3.  Again, C3 is the traffic level from the previous timeframe, C2 is the traffic level from the current timeframe (after the Panda hit), and you are dividing by the previous traffic level to come up with the percentage change.  For example, if a url dropped from 5,450 visits to 640 visits, then your percentage drop would be 88%.  And yes, you would definitely want to investigate that url further!
    Creating a Formula to Calculate Percent Change in Excel
  • Don’t worry about the floating decimal point.  We’ll tackle that soon.  Now we need to copy that formula to the rest of the column (but by twos).  Remember, we have two records for each url, so you’ll need to highlight both cells before double clicking the bottom right corner of the second cell to copy the formula to all rows.  Once you do, Excel automatically copies the two rows to the rest of the cells in that column.  Now you should have percentage drops (or increases) for all the urls you exported.  Note, you can also highlight the two cells, copy them, and then highlight the rest of that column, and click paste.  That will copy the formula to the right cells in the column as well.
    Copying a Formula to All Rows in Excel
  • Now, you will see a long decimal value in our new column labeled “% Change”.  That’s an easy fix, since we want to see the actual percentage instead.  Highlight the column, right click the column, and choose “Format Cells”.  Then choose “Percentage” and click “OK”.  That’s it.  You now have a column containing all top landing pages from Google organic, with the percentage drop after the Panda hit.
    Formatting Cells in Excel
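If you prefer to script this step instead of working in Excel, the same percentage-change calculation can be sketched in Python with pandas.  The column names and visit counts below are made up for illustration; a real export would follow the two-rows-per-url layout described above.

```python
import pandas as pd

# Tiny stand-in for exported landing page data (values are invented).
# Each url has one row per timeframe, mirroring the Excel layout above.
df = pd.DataFrame({
    "landing_page": ["/page-a", "/page-a", "/page-b", "/page-b"],
    "period": ["previous", "current", "previous", "current"],
    "visits": [5450, 640, 1200, 1100],
})

# Pivot so each landing page has both timeframes side by side in one row.
pivot = df.pivot(index="landing_page", columns="period", values="visits")

# Same math as the Excel formula =(C3-C2)/C3: (previous - current) / previous.
pivot["pct_change"] = (pivot["previous"] - pivot["current"]) / pivot["previous"]

# Sort so the steepest drops appear first.
print(pivot.sort_values("pct_change", ascending=False).round(3))
```

Running this on the sample data shows /page-a with a drop of roughly 88%, matching the worked example above.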

 

Bonus 2: Highlight Cells With A Steep Drop in Red

  • If you want the data to “pop” a little more, then you can use conditional formatting to highlight cells that exceed a certain percentage drop in traffic.  That can easily help you and your team quickly identify problematic landing pages.
  • To do that, highlight the new column we created (titled “% Change”), and click the “Conditional Formatting” button in your Home tab in Excel (located in the “Styles” group).  Then select, “Highlight Cells Rules”, and then select, “Greater Than”.  When the dialog box comes up, enter a minimum percentage that you want highlighted.  And don’t forget to add the % symbol!  Choose the color you want to highlight your data with and click “OK”.  Voila, your problematic urls are highlighted for you.  Nice.
    Applying Conditional Formatting in Excel
    Applying Conditional Formatting by Percentage in Excel
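The same thresholding idea can be sketched in a few lines of plain Python.  The urls and percentages below are hypothetical stand-ins for the “% Change” column we just built:

```python
# Hypothetical (url, percent change) pairs from the "% Change" column.
changes = [("/page-a", 0.88), ("/page-b", 0.08), ("/page-c", 0.45)]

# Flag any url whose traffic dropped more than the chosen threshold.
threshold = 0.30
flagged = [url for url, pct in changes if pct > threshold]
print(flagged)  # ['/page-a', '/page-c']
```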

 

Summary – Analyzing Panda Data
If you made it through this tutorial, then you should have a killer spreadsheet containing a boatload of important data.  Again, this report will contain the percentage increase or decrease for top landing pages from Google Organic (prior to, and then after, a Panda hit).  This is where Google gathered the most intelligence based on user engagement.  It’s a great place to start your analysis.

Now it’s time to head over to my Search Engine Watch column to take a deeper look at the report, what you should look for, and how to get on the right track with Panda recovery.  Between the tutorial and my Search Engine Watch column, I hope to clear up at least some of the confusion about “content quality” surrounding Panda updates.  Good luck.

GG

 

 

Filed Under: algorithm-updates, google, google-analytics, seo, web-analytics

Google’s Pirate Algorithm and DMCA Takedowns | Exploring the Impact Threshold

December 9, 2013 By Glenn Gabe 3 Comments

Google Pirate Algorithm

In August of 2012, Google announced an update to its search algorithm that targeted websites receiving a high number of DMCA takedown requests.  The update was unofficially called “The Pirate Update”, based on the concept of pirating someone else’s content like music, movies, articles, etc.  With the update, Google explained that “sites receiving a lot of removal notices may appear lower in our results.”   For most websites, this wasn’t a big deal.  But for others, this was more than just a proverbial “shot across the bow”.  It was a full-blown cannon shot right through the hull of a ship.

I do a lot of algorithm update work, including Panda, Penguin, and Phantom work, so it’s not unusual for website owners to contact me about major drops in traffic that look algorithmic.  And I’ve had several companies contact me since August 2012 that believed the Pirate update could be the cause of their drop.   Regarding dates, the update first rolled out in August of 2012, and the impact could be seen almost immediately.  I’ll cover more about how I know that soon.

My goal with this post is to introduce the Pirate update, explain how you can analyze DMCA takedown requests (via data Google provides), and explore the threshold of removal requests that could get a site algorithmically impacted (or what I’m calling “The Impact Threshold”).

So without further ado, it’s time to sail into dangerous waters.

 

DMCA Takedowns
So, what’s a DMCA takedown?  It’s essentially a notice sent to an online service provider explaining that infringing material resides on its network, and that the infringing url(s) or website should be taken down.  As you can imagine, Google receives many of these takedown requests on a regular basis, and it provides a simple process for filing takedowns.  Actually, Google provides a complete transparency report where it lists a slew of data regarding removal requests, copyright owners, domains specified in DMCA notices, etc.  I’ll explain more about that data next.

Google Transparency Report

For the purposes of this post (focused on the Pirate update), DMCA takedowns are sent to Google when someone or some entity believes urls on your website contain their copyrighted material.  And of course, those urls are probably ranking for target queries.  So, companies can go through the process of filing a copyright complaint, Google will investigate the issue, and take action if warranted (which means Google will remove the url(s) from the search results).  In addition, every request is documented, so Google can start to tally up the number of DMCA notices that target your domain.  And that’s extremely important when it comes to the Pirate algorithm.

And jumping back to Google’s original post about the Pirate Update, Google says, “Sites with high numbers of removal notices may appear lower in our results.” So every time a new takedown notice comes in, you have one more strike against you.  Now, we don’t know how many strikes a site needs to receive before the Pirate algorithm kicks in, and I try to shed some light on that later in this post.

Google Transparency Report – Requests to Remove Content
I mentioned earlier that Google provides a Transparency Report, where it lists requests to remove content from its services (from governments, and due to copyright).  The section of the Transparency Report focused on copyright requests provides a wealth of data regarding takedown notices, domains being specified in those takedowns, top copyright owners, etc.  You can see on the site that over 5M urls were requested to be removed from Google’s search results just last week, and 24M in the past month!  Yes, it’s a big problem (and a huge undertaking by Google).

Copyright Removal Requests

Being a data nut, I was like a kid in a candy store when I started going through this data.  This is the “smoking gun”, so to speak, when analyzing sites that could have been hit by Pirate.  By clicking the “Domain Specified” link in the left navigation, you can scroll through a list of the domains being targeted via DMCA takedown notices.  You can see the number of copyright owners that have filed notices, the number of reporting organizations (which work on behalf of copyright owners), and the number of urls submitted (that allegedly contain copyrighted material).  You can filter this data by week, month, year, or “all available”.  And more importantly, you can download the data as a .csv file.  This is where it gets interesting.

Domains Listed in Google Transparency Report

Working with the .csv file
First, and most importantly, the file holding domains contains 14M records. So if you try to simply open the file in Excel, you won’t get very far. Each worksheet in Excel can only contain roughly 1M rows (1,048,576 to be exact), so you have far too much data to run a simple import.  To get around this issue, I imported the file into Access, so I could work with the data in various ways.  Note, Access is a database program that enables you to import larger sets of data, and then query that data based on various criteria.  It’s a robust desktop database program from Microsoft that comes with certain versions of Microsoft Office.  So, you might already have Access installed and not even know it.
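If you don’t have Access handy, one alternative is to stream the file row by row with Python’s csv module, aggregating as you go so the full 14M records never have to sit in memory at once.  The column names and values below are assumptions for illustration, not the actual layout of Google’s export:

```python
import csv
import io

# Stand-in for the multi-million-row domains file (column names assumed).
sample = io.StringIO(
    "domain,urls_removed\n"
    "example-pirate.com,5000\n"
    "another-site.com,12\n"
    "example-pirate.com,3000\n"
)

# Stream one row at a time instead of loading everything into memory,
# tallying the total urls requested for removal per domain.
totals = {}
for row in csv.DictReader(sample):
    totals[row["domain"]] = totals.get(row["domain"], 0) + int(row["urls_removed"])

# Domains with the most takedown activity first.
print(sorted(totals.items(), key=lambda kv: kv[1], reverse=True))
```

With a real file, you would replace the StringIO object with `open("domains.csv")` and the same loop would scale to millions of rows.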

Using Microsoft Access to Analyze DMCA Takedown Requests

My goal was to analyze the domains getting hit by the Pirate algorithm, and then also try to identify the threshold Google is using when choosing to target a domain.  For example, how many requests needed to be filed, how many urls needed to be targeted, and what’s the “url to total indexed ratio”?  More about that last metric soon.

Tracking The Pirate Update via Data
Now that I had Pirate data, it was time to start analyzing that data.   I began to take a look at the top domains in the list, and cross-reference their organic search trending via SEMRush.  I wanted to make sure I could spot the impact from the Pirate algorithm for these specific domains.   That turned out to be easier than I thought. Check out the trending below for several of the websites that topped the list:

Website Impacted by Pirate Update - Example 1

Website Impacted by Pirate Update - Example 2

Website Impacted by Pirate Update - Example 3

Website Impacted by Pirate Update - Example 4

And I saw many more just like this…

Searching For The Impact Threshold – The Connection Between DMCA Takedowns and Algo Hits
After heavily reviewing the organic search trending for sites on the list, I wanted to see if there was a threshold for getting algorithmically impacted.  For example, did there have to be a certain number of complaints before Google impacted a site algorithmically?  Or was that too rudimentary?  Were there other factors involved that triggered the algo hit?  These are all good questions and I try to answer several of them below.

In addition to straight removal notices, it’s hard to overlook a specific metric Google is providing in the transparency report for DMCA takedowns.  It’s listed on the site as “percentage of pages requested to be removed based on total indexed pages”.  Now that metric makes sense! (theoretically anyway).  Understanding the total package could yield better decisions algorithmically than just the pure number of takedown requests.

For example, if the percentage is 1% or less for certain sites, they might be treated differently than a site with 5%, 10% (or even higher).  Note, I saw some sites with greater than 50%!  Based on my research, I saw a strong correlation with sites showing 5% or greater and what looked to be Pirate algorithm hits (i.e. 5% of the total urls on the site were requested to be removed via DMCA takedown requests).  And for the domains that dropped sharply after Pirate was first introduced, the percentage was often higher.  For example, I saw percentages of “<50%” often, and even a few “>50%”.

Website With High Percentage Of Removal Requests Based On Total Indexed Pages

I know this sounds obvious, but if half of your indexed urls have been requested to be taken down, you’ve probably got a serious Pirate problem. :)  And it should be no surprise that you’ve been hit by the Pirate update.
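As a quick sanity check on your own domain, the ratio logic is simple to sketch.  The counts below are hypothetical, and the 5% threshold is just the correlation I observed, not a confirmed cutoff in Google’s algorithm:

```python
# Hypothetical counts for a single domain.
urls_requested = 4200   # urls named in DMCA takedown requests
urls_indexed = 60000    # total indexed pages for the domain

# "Percentage of pages requested to be removed based on total indexed pages".
ratio = urls_requested / urls_indexed
print(f"{ratio:.1%}")  # prints 7.0%

# ~5% or greater correlated with apparent Pirate hits in my research.
if ratio >= 0.05:
    print("High removal-to-indexed ratio -- investigate further.")
```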

DMCA Takedowns and Google – What To Do If You Have Been Contacted
If a DMCA takedown request has been filed with Google about infringing url(s) on your site, you should receive a message in Google Webmaster Tools explaining the situation, along with links to the infringing content.  At that point, you can file a counter notice, remove the content, or choose to ignore the problem (which I don’t recommend).  If you do remove the content, then you can fill out the “content removed notification form”.   Once you complete the process of removing urls and notifying Google, then you will need to wait to see how your site rebounds.  Note, Google provides links to the forms I mentioned above in their messages via Webmaster Tools.

Example of a DMCA notice in Google Webmaster Tools:
In case you were wondering what a DMCA takedown request from Google looks like, here’s a link to a webmaster forum thread that shows a GWT DMCA message.

Example of DMCA Notice in Google Webmaster Tools

Also, and this is more related to the algorithmic hit you can take, I recommend visiting the transparency report and analyzing the data.  You can search by domain by accessing the search field in the copyright section of the transparency report.  You can also download and import the data into Access to identify the status of your domain (as mentioned earlier).

For example, you can figure out how many requests have been filed and review the % column to see how Google understands your entire domain based on alleged copyright violations.  If you see a large number of urls, and a high(er) percentage of infringing urls based on total indexation, then it could help you determine the cause of the latest algorithm hit that impacted your site.  Or if you’re lucky, you could thwart the next attack by being aggressive with copyright cleanup.

Summary – Walking The Plank With The Pirate Update
I hope this post explained more about the Pirate update, how it can impact a website, how you can research your domain via Google’s Transparency Report, and what to do if you have received a DMCA message from Google.  My recommendation to webmasters is to (obviously) avoid breaking copyright laws, take swift action if you are contacted by Google with DMCA notices (remove content or file a counter notice), and to research your domain to better understand the big picture (% of urls requested to be removed based on total indexation).

If not, you could very well be walking the plank into search oblivion.  And let’s face it, nobody wants to sleep in Davy Jones’ locker.  :)

GG

 

Filed Under: algorithm-updates, google, seo

Copyright © 2023 G-Squared Interactive LLC. All Rights Reserved. | Privacy Policy