The Internet Marketing Driver


Archives for November 2012

How Google Analytics *Really* Handles Referring Traffic Sources [Experiment] – Why Clicks and Visits Might Not Match Up

November 27, 2012 By Glenn Gabe 14 Comments

Google Analytics Referrals

Let me walk you through a common scenario in web marketing.  You have a website, and some people visit your site by clicking through links on other websites.  In your web analytics reporting, those visits are categorized as referring visits.  In Google Analytics specifically, those visits show up in your “Referrals” report under “Traffic Sources”.  And when visitors click on an outbound link on your site (a link to another website), your site shows up as a referring source in that website’s referrals report.

That’s pretty straightforward, but what I’m about to cover isn’t. I find many marketers and webmasters don’t understand how Google Analytics handles that referring traffic during future visits to their websites. For example, suppose someone clicks through to your site from sampledomain.com, leaves your site, and then returns the next day.

Do you know how that visit will be categorized in Google Analytics?  There’s a good chance you don’t, and I’m going to cover the topic in detail in this post.

Understanding Referring Visits is Important When Revenue and Cost Are Involved
I believe one of the reasons this topic isn’t well understood is that it often doesn’t directly impact revenue or cost for many webmasters. Sure, you definitely want to know how many people are coming from each referring site, but for many webmasters the exact number doesn’t affect revenue or payments to other webmasters.

But for websites that need to track the monetary value of inbound visits and outbound clicks, accurately determining referring visits is extremely important. For example, imagine you were charging certain partners for traffic you were sending from your site to theirs, or vice versa. The fact of the matter is that your referring sources report could show different numbers than you expect, and those numbers could be much different from the outbound clicks you see. And depending on your own situation, the numbers could be way off…

The Core Disconnect – How Google Analytics Calculates Visits from Referring Sources (or any campaign, search visit, etc.)
Here’s the core disconnect. When someone clicks through to your site via a referring source, the __utmz cookie is updated with traffic source information. That cookie will not be overwritten unless another referring source or campaign takes its place. Direct Traffic will not overwrite this value. Let me say that again. Direct Traffic will not overwrite the __utmz cookie value. That means the referring source stored in __utmz will still be credited when those visitors return to the site.

Google Analytics __utmz Cookie
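If you want to see exactly what Google Analytics stored for a visit, you can inspect the __utmz cookie yourself. Here is a minimal sketch in TypeScript (run it in the browser console or a tracking script) that parses the __utmz value and pulls out the stored source and medium. The field layout it assumes (domain hash, timestamp, session count, campaign number, then pipe-delimited fields) is the common ga.js format, so verify it against your own cookie before relying on it.

    // Minimal sketch: read the classic ga.js __utmz cookie and pull out the stored
    // source/medium fields. Assumes the typical
    // "__utmz=<hash>.<timestamp>.<sessions>.<campaign>.key=value|key=value" layout.
    function getUtmzFields(cookieString: string): Record<string, string> | null {
      const match = cookieString.match(/(?:^|;\s*)__utmz=([^;]+)/);
      if (!match) return null; // no __utmz cookie set yet

      // Drop the four leading numeric segments; join the rest back together because
      // source values (like "sampledomain.com") can contain dots themselves.
      const campaignData = decodeURIComponent(match[1]).split(".").slice(4).join(".");

      const fields: Record<string, string> = {};
      for (const pair of campaignData.split("|")) {
        const [key, value] = pair.split("=");
        if (key && value !== undefined) fields[key] = value;
      }
      return fields;
    }

    // Hypothetical cookie value for illustration:
    const fields = getUtmzFields(
      "__utmz=173272373.1351111111.2.1.utmcsr=sampledomain.com|utmccn=(referral)|utmcmd=referral"
    );
    console.log(fields?.utmcsr, fields?.utmcmd); // "sampledomain.com" "referral"

On a live page you would call getUtmzFields(document.cookie). Notice that utmcsr (the source) keeps pointing at the referring site until another source or campaign overwrites it.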


What This Means To You

I know what you’re thinking. This guy is telling me about __utmz cookies?? What the heck does that mean to me? OK, stick with me for a second. Let’s say you have a partnership set up where another website pays you for traffic. Maybe you’re both in the same niche and want to leverage each other’s traffic for more exposure. You check your stats for the previous month and notice that you sent 500 visits to partner A. Cool, so you contact them to check how the partnership is going and to make sure they are seeing the same number of visits. They come back and say they’ve seen 700 visits from your site and thank you for the traffic. The check will be cut soon.

Google Analytics Clicks and Visits Could Be Off

But that 200-visit discrepancy is bothering you. Why is there such a big difference between your partner’s reporting and the numbers you are seeing? And let’s assume you have a solid setup for tracking clicks out of your website. For example, maybe you are running outbound clicks to partners through a redirect that captures a number of important metrics. The redirect then sends the visitor off to the correct URL on the partner site. Basically, you know you are capturing all outbound clicks to the partner website.

This is where the native handling of referring sources in Google Analytics comes into play. Sure, you are tracking clicks off your site, but your partner’s analytics package is capturing those clicks plus any return visits that are direct visits. So, if someone clicks through to your partner’s site, that’s one visit. If they leave that site and return directly (by typing the URL into their browser or via a bookmark), the visit will show up as a visit from the original referring source (your website). That’s now two visits. And if they do it again, that’s three visits. That continues until another referring source or campaign overwrites the __utmz cookie. In this example, there were three visits to your one outbound click!

An Example of How Google Analytics Handles Referrals

Based on this simple example, you can easily see how, over a month’s time, some people would click through to your partner’s site and then revisit it directly (possibly several times). That would lead to more than one visit per user and could inflate the visit count attributed to your website.
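To make the math concrete, here is a tiny TypeScript sketch that models the rule: a direct visit keeps whatever source is already stored, and only another referral, campaign, or search visit overwrites it. This is a simplified model for illustration, not Google Analytics’ actual code.

    type Entry =
      | { type: "referral" | "organic" | "campaign"; source: string }
      | { type: "direct" };

    // Each visit is credited to the last non-direct source seen so far.
    function attributeVisits(entries: Entry[]): string[] {
      let storedSource = "(direct)";
      return entries.map((entry) => {
        if (entry.type !== "direct") storedSource = entry.source;
        return storedSource;
      });
    }

    // One referral click followed by two direct return visits, then a search visit:
    console.log(
      attributeVisits([
        { type: "referral", source: "yoursite.com" },
        { type: "direct" },
        { type: "direct" },
        { type: "organic", source: "google" },
      ])
    );
    // ["yoursite.com", "yoursite.com", "yoursite.com", "google"]
    // Three visits credited to one outbound click.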

Still confused?  Let me clear this up via an experiment below.

Experiment – Calculating Referring Visits in Google Analytics
In the following simple example, I set up a webpage on a second domain that links to a landing page I created on my website just for this experiment. I didn’t want to skew the reporting by using an existing page on my site that gets a lot of visits. Then, using several computers with clean browsers, I first visited the referring page that links to my new landing page and clicked through. The referring source should show up as the domain name of the referring site. That would be visit #1.

Next, I would leave the new landing page on my site and revisit my website later by typing the exact URL into my browser (what most people would think is a Direct Traffic visit). In theory, the referring site should still show up as the traffic source, even though I’m entering the site as “Direct Traffic”. Remember, the __utmz cookie will only be overwritten by another referring source or campaign.

Last, I would search for a keyword that my site ranks for, and then click through to the site. Since this visit was from a search engine, the __utmz cookie would be updated with this new value, and my reporting would show Google as the traffic source (along with the keyword I entered). Let’s find out the results of the experiment below.

The Results
1. First Visit

First, I visited the second domain and clicked through to my website.  Here is the first referring visit showing up in my analytics package:
Referring Sites Experiment - First Visit

2. Second Visit (Directly Visiting the Site)
Next, I left the site and returned via Direct Traffic.  Google Analytics shows the referring site as the source for this traffic, even though I entered via “Direct Traffic”. Also notice it accurately categorizes me as a “return visitor”:
Google Analytics Referral Experiment - Second Visit

3. Third Visit (Again Directly Visiting the Site, but the Next Day)
Just to underscore my point, I left and revisited the site the next day (again via Direct Traffic).  Google Analytics again shows the visit is from the initial referring source:
Google Analytics Referral Experiment - Third Visit

4. Fourth Visit, This Time From Search
Finally, I searched for my name on Google and visited my website. Now Google Analytics shows the keyword that led to the site (from the traffic source “Google”). Remember, the __utmz cookie will only be updated when another referring source is identified (versus Direct Traffic).
Google Analytics Referral Experiment - Search Visit

 

So there you have it. Proof that your visit count by source may not be what you think it is. Now, if you’re reading this post and are either generating revenue from referring visits or paying partners based on visits, then you might be frantically running to Google Analytics to rerun your reports. Yes, this could impact things quite a bit. I’ll leave it up to you how you handle the situation. :)

What You Can Do – The Importance of Clarity
If you do have a partnership where you are either generating revenue by driving traffic or paying for traffic from other websites, then each party needs to clearly understand the arrangement. Each website involved needs to be clear on the definitions of “traffic”, “clicks”, “visits”, etc. For example, think about AdWords for a second. You pay Google for clicks on ads, but you don’t pay Google for direct visits back to your site (even though those visits will show up as campaign visits). And by the way, most partners will not give you access to their reporting anyway… Therefore, you will only see the clicks out of your own site.

If you are tracking outbound clicks, you can use event tracking in Google Analytics to track those clicks, including the pages or links where those clicks are originating.  If you don’t want to use event tracking, then you can run outbound clicks through a 302 redirect and capture the information you need to accurately track clicks.  If you are receiving traffic, then you can make sure the referring links contain querystring parameters so you can understand which partner the traffic is coming from (and that it’s not a standard referral from the site).  There are other ways to handle this, and those are just a few ideas.
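For example, with the classic asynchronous snippet (ga.js) that was standard at the time, outbound clicks can be recorded with event tracking along the lines of the TypeScript sketch below. The category, action, and label names are placeholders I chose for illustration; adjust them to match your own reporting. In practice you may also want a short delay before navigation so the event has time to fire.

    // Sketch: record outbound clicks as Google Analytics events (classic ga.js async syntax).
    // _gaq is the global queue created by the standard asynchronous tracking snippet.
    declare const _gaq: { push: (command: unknown[]) => void };

    function trackOutboundClick(link: HTMLAnchorElement): void {
      // "Outbound Links" / "Click" are placeholder names; the label records the clicked URL.
      _gaq.push(["_trackEvent", "Outbound Links", "Click", link.href]);
    }

    // Attach the handler to every link that points away from the current site:
    document.querySelectorAll<HTMLAnchorElement>("a[href^='http']").forEach((link) => {
      if (link.hostname !== window.location.hostname) {
        link.addEventListener("click", () => trackOutboundClick(link));
      }
    });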

Summary – Understanding Visits in Google Analytics
I hope you found this post explaining how Google Analytics handles referring visits helpful. I know this topic can be confusing, and experiments always help clear up some of the confusion. So now you know why visits might be higher or lower than you think, and how the __utmz cookie controls what shows up in your reporting. I bet you’ll never look at referring sources the same way again.

And let’s hope you’re not on the short end of the stick. :)

Happy Reporting.

GG

 

Filed Under: google, google-analytics, web-analytics

Hunting False Negatives – How To Avoid False Negatives When Checking Redirects After a Website Redesign or Migration [Screaming Frog Tutorial]

November 14, 2012 By Glenn Gabe 1 Comment

How To Check Redirects Using Screaming Frog

Every webmaster has to deal with a website redesign or migration at some point, and redesigns and migrations often mean that your URL structure will be impacted. From an SEO perspective, when URLs need to change, it’s critically important that you have a solid 301 redirection plan in place. If you don’t, you can pay dearly SEO-wise.

I wrote a post for my Search Engine Journal column last spring titled “How to Avoid SEO Disaster During a Website Redesign”, and implementing a 301 redirection plan was one of the most important topics I covered. I find many webmasters and marketers don’t understand how SEO power is built URL by URL. As your URLs build up inbound links and search equity, it’s important that those URLs maintain those links and equity. If you change those URLs, you must notify the search engines where the old content moved to, and that’s where 301 redirects come into play.

So, when you change URLs, you run the risk of losing all of the links pointing to the older URLs, and the search power those URLs contained. That’s unless you 301 redirect the old URLs to the new ones. A 301 redirect safely passes PageRank from an old URL to a new one (essentially maintaining its search equity).

Unfortunately, I’ve seen many companies either not set up a redirection plan at all, or botch the plan.  That’s when they end up with a catastrophic SEO problem.  Rankings drop quickly, traffic drops off a cliff, sales drop, and nobody is happy at the company (especially the CMO, CFO, and CEO).

Traffic Drop After Website Redesign

Meet the False Negative Redirect Problem, A Silent Killer During Redesigns or Migrations:
Needless to say, properly setting up your redirects is one of the most important things you can do when redesigning or migrating your website. That said, even if you address redirects and launch the new site, how do you know that the redirects are in fact working? Sure, you could manually check some of those URLs, but that’s not scalable. In addition, just because an older URL 301 redirects to a new URL doesn’t mean it redirects to the correct URL. If you don’t follow through and check the destination URL (where the redirect is pointing), then you really don’t know if everything is set up properly.

This is what I like to call the False Negative Redirect Problem. For SEOs, a false negative occurs when your test incorrectly shows that the redirects are working properly (they don’t test positive for errors), when in fact the destination URLs might not be resolving properly. Basically, your test shows that the redirects are OK when they really aren’t. Incorrectly assuming that 301 redirects are working properly by only checking the header response code for the old URL can trick webmasters into believing the redesign or migration has gone well SEO-wise, when in reality the destination URLs could be returning 404s or throwing application errors. It’s a silent killer of SEO.

False Negatives can be a Silent SEO Killer

How To Avoid the Silent SEO Killer When Implementing Redirects
The false negative problem I mentioned above is especially dangerous when changing domain names (where you will often implement one directive in .htaccess or ISAPI_Rewrite that takes any request for a URL at one domain and redirects it to the same URL at another domain). Just because a URL 301s doesn’t mean the correct URL resolves. Think about it: that one directive will 301 every request… but you need to check the destination URL to truly know if the redirects are working the way you need them to. Unfortunately, many SEOs only check that the old URLs 301; they don’t check the destination URLs. Again, that can be a silent killer of SEO.

Screaming Frog Hops to the Rescue
I mentioned “scalable” solutions earlier. Well, Screaming Frog provides a scalable solution for checking redirects during a migration or website redesign. Note, Screaming Frog is a paid solution, but it’s well worth the $157 annual fee. Using Screaming Frog, you can import a list of old URLs from your analytics package or CMS and have it crawl those URLs and provide reporting. Running a two-step process for checking redirects and destination URLs can help you understand whether your redirects are truly working. For example, you might find redirects that lead to 404s, application errors, etc. Once you find those errors, you can quickly fix them to retain search equity.

Below, I’m going to walk you through the process of exporting your top landing pages from Google Analytics and checking them via Screaming Frog to ensure both that the redirects are working and that the destination URLs are resolving correctly. Let’s get started.

What You’ll Need and What We’ll Be Doing

  • First, we are going to export our top landing pages from Google Analytics.
  • Second, we’ll use the CONCATENATE function in Excel to build complete URLs.
  • Next, we’ll add the URLs to a text file that we can import into Screaming Frog.
  • Then we’ll fire up Screaming Frog and import the text file for crawling.
  • Screaming Frog will crawl and test those URLs and provide reporting on what it finds.
  • Then we can export the destination URLs we find so we can make sure they resolve correctly.  Remember, just because the old URLs 301 redirect doesn’t mean the destination URLs resolve properly.  We are hunting for false negatives.
  • Last, and most importantly, you can fix any problematic redirects to ensure you maintain search equity.


How To Use Screaming Frog to Hunt Down False Negatives:

  1. Export Top Landing Pages from Google Analytics
    Access your Google Analytics reporting and click the “Content” tab, “Site Content”, and then “Landing Pages”. Click the dropdown for “Show rows” at the bottom of the report and select the number of rows you want to view.
    Export top landing pages from Google Analytics

    Tip: If you have more than 500 pages, you can edit the URL in Google Analytics to display more than 500 URLs. After first selecting a row count from the dropdown, find the parameter named table.rowCount= in the URL. Simply change the number after the equals sign to 1000, 5000, 10000, or whatever number you need to capture all of the rows. When you export your report, all of the rows will be included.

  2. Export the Report from Google Analytics
    Click the Export button at the top of the report and choose “CSV”. The file will be exported and should open in Excel once it downloads.
    Exporting a report from Google Analytics
  3. Use Excel’s CONCATENATE Function to Build a Complete URL
    When the URLs are exported from Google Analytics, they will not include the protocol or domain name (the beginning of the URL, such as http://www.yourdomain.com). Therefore, you need to add this to your URLs before you use them in Screaming Frog. Excel has a powerful function called CONCATENATE, which lets you combine text and cell contents to form a new text string. We’ll use this function to combine the protocol and domain name with the URL that Google Analytics exported.

    Create a new column next to the “Landing Page” column in Excel. Click the cell next to the first landing page URL and enter the following: =CONCATENATE("http://www.yourdomain.com", A8). Note, change “yourdomain.com” to your actual domain name. Also, A8 is the cell that contains the first URL exported from Google Analytics (in my spreadsheet). If your spreadsheet is different, make sure to change A8 to whichever cell contains the first URL in your sheet. The resulting text should be the complete URL (combining the protocol, domain name, and URL exported from Google Analytics). Then you can simply copy and paste the contents of that cell (which contains the formula) to the rest of the cells in that column. The formula will automatically adjust to use the right landing page URL for each row. Now you have a list of complete URLs that you can import into Screaming Frog.

    Using the CONCATENATE function in Excel to build URLs

  4. Copy All URLs to a Text File
    Since all we need are the URLs for Screaming Frog, select the entire new column you just created (with the complete URLs) and copy those URLs. Then open a text file and paste the URLs into the file. You can use Notepad, TextPad, or whatever text editor you work with. Save the file.

    Copy the URL list to a text file

  5. Fire Up Screaming Frog
    After launching Screaming Frog, change the mode to “List” so we can upload a list of URLs. Under the “Mode” menu at the top of the application, click “List”, which enables you to use a text file of URLs to crawl. Then click “Select File” and choose the text file we just created. Click “Start” and Screaming Frog will begin to crawl those URLs.

    Using List Mode to Crawl URLs

  6. Review Header Response Codes From the Crawl
    At this point, you will see a list of the URLs crawled, the status codes, and the status messages. Remember, all of the URLs should be 301 redirecting to new URLs, so you should see a lot of 301s and “moved permanently” messages. If you see 404s at this point, those URLs didn’t redirect properly. Yes, you just found some bad URLs, and you should address those 404s quickly. But that’s not a false negative. It’s good to catch low-hanging fruit, but we’re after more sinister problems.

    Viewing 301 redirects after a Screaming Frog crawl

  7. Find the Destination URLs for Your Redirects
    Now, just because you see 301 redirects showing up in the main reporting doesn’t mean the destination URLs resolve correctly. If you click the “Response Codes” tab, you’ll see the redirect URI (where the 301 actually sends the crawler). THOSE ARE THE URLS YOU NEED TO CHECK. Click the “Export” button at the top of the screen to export the “Response Codes” report. This will include all of the destination URLs.
    Finding Destination URLs via the Response Codes Tab
  8. Copy All Destination URLs to a Text File
    In Excel, copy the destination URLs and add them to a text file (similar to what we did earlier). Make sure you save the new file. We are now going to crawl the destination URLs just like we crawled the original ones. This process will close the loop for us and ensure the destination URLs resolve correctly. This is where we could find false negatives.

    Exporting all destination URLs to Excel from Screaming Frog

  9. Import Your New Text File and Crawl the Destination URLs
    Go back through the process of selecting “List” mode in Screaming Frog and then import the new text file we just created (the file that contains the destination URLs). Click “Start” to crawl the URLs, and then check the reporting.

    Using List Mode to Crawl URLs

  10. Analyze the Report and Find False Negatives
    You should see a lot of 200 codes (which is good), but you might find some 404s, application errors, etc. Those are your false negatives. At this point, you can address the errors and ensure your old URLs in fact redirect to the proper destination URLs. Disaster avoided.  :)

    Finding and Fixing False Negatives Using Screaming Frog


Screaming Frog and Actionable Data: Beat False Negatives
Going through the process I listed above will ensure you accurately check redirects and destination URLs during a website redesign or migration. The resulting reports can identify bad redirects, 404s, application errors, etc. And those errors could destroy your search power if the problems are widespread. I highly recommend performing this analysis several times during the redesign or migration to make sure every problem is caught.
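If you also want a quick scripted spot-check between crawls, here is a minimal TypeScript sketch (assuming Node 18+ with the built-in fetch; the sample URLs are placeholders) that flags old URLs that don’t return a 301 and destination URLs that don’t resolve with a 200:

    // Sketch: verify each old URL returns a 301 AND its destination resolves with a 200.
    // Replace the sample URLs with the list exported from Google Analytics.
    const oldUrls = [
      "http://www.yourdomain.com/old-page-1",
      "http://www.yourdomain.com/old-page-2",
    ];

    async function checkRedirect(oldUrl: string): Promise<void> {
      // Fetch without following the redirect so we can inspect the 301 itself.
      const first = await fetch(oldUrl, { redirect: "manual" });
      if (first.status !== 301) {
        console.log(`NOT A 301: ${oldUrl} returned ${first.status}`);
        return;
      }

      const destination = first.headers.get("location");
      if (!destination) {
        console.log(`301 WITHOUT A LOCATION HEADER: ${oldUrl}`);
        return;
      }

      // Now check the destination URL itself; this is where false negatives hide.
      const second = await fetch(new URL(destination, oldUrl));
      if (second.status !== 200) {
        console.log(`FALSE NEGATIVE: ${oldUrl} -> ${destination} returned ${second.status}`);
      }
    }

    (async () => {
      for (const url of oldUrls) {
        await checkRedirect(url);
      }
    })();

This doesn’t replace a full Screaming Frog crawl, but it’s handy for re-checking a handful of important URLs right after launch.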

Make sure you don’t lose any URLs, which can result in lost search equity. And lost search equity translates to lower rankings, less targeted traffic, and lower sales. Don’t let that happen. Perform the analysis, quickly fix any problems you encounter, and retain your search power. Redesigns and migrations don’t have to result in disaster. You just need to look out for the silent SEO killer. :)

GG

 

Filed Under: google, seo, tools, web-analytics
