How To Use Scroll Depth Tracking, Adjusted Bounce Rate, and Average Time On Page As A Proxy For User Engagement and Content Quality

November 28, 2018 By Glenn Gabe

How to use scroll depth tracking to understand user engagement.

I was helping a company a few months ago that got hit hard by recent algorithm updates. While digging into the audit, I surfaced a number of problems, including content quality issues, technical SEO problems, user experience issues, and more. From a content quality perspective, the site had an interesting situation.

Some of the content was clearly lower-quality and needed to be dealt with. But they also had a tricky issue to figure out. They target a younger audience, and a lot of the content was long-form. And I mean really long-form… Some of the articles were over three thousand words in length. I asked my client if they had done any type of user testing in the past to determine if their target audience enjoyed their long-form content or if they wanted shorter and tighter articles. It turns out they had never run user testing and didn’t really know their audience’s preference for content length. Like many site owners, they were pretty much guessing that this was the right approach.

The site owner said, “I wish there was a way to determine how far people were getting into the content…” That’s when I responded quickly with “you can do that!” Google Analytics is a powerful tool, but many site owners just use the standard setup. If you leverage Google Tag Manager, you can set up some pretty interesting things that can be extremely helpful for understanding user engagement.

That’s when I recommended a three-pronged approach for identifying user engagement, content consumption, and more. I told my client we could triangulate the data to help identify potential problems content-wise. And the best part is that it doesn’t take long to set up and uses one of the most ubiquitous tools on the market: Google Analytics (with the help of Google Tag Manager).

Note, there’s nothing better than running actual user testing. That’s where you can watch users interacting with your site, receive direct feedback about what they liked or didn’t like, and more. It’s extremely powerful for truly understanding user happiness. But that shouldn’t stop you from leveraging other ways to understand user engagement. What I’ll explain below can be set up today and can provide some useful data about how people are engaging with your content (or not engaging).

Triangulating The Data
The three analytics methods I recommend using to help identify problematic content are Scroll Depth Tracking, Adjusted Bounce Rate (ABR), and Average Time On Page. I’ll go through each of them below so you can get a feel for how the three can work together to surface potential issues.

Note, there isn’t one metric (or even three) that can 100% tell you if some piece of content is problematic. It’s part art and part science. There are times you can easily surface thin or low-quality content (like thousands of pages across a site that were mistakenly published containing one or two lines of text). But then you have other times where full articles need to be boosted since they are out of date or just not relevant anymore.

Therefore, don’t fully rely on one method to do this… Also, it’s not about word count, it’s about value to the user. Google’s John Mueller has explained this several times over the past few years. Here’s a post from Barry Schwartz covering John’s comments where he explains that it’s about value versus word count. Here’s the tweet that Barry is referring to:

I agree with you & Mihai :). Word count is not indicative of quality. Some pages have a lot of words that say nothing. Some pages have very few words that are very important & relevant to queries. You know your content best (hopefully) and can decide whether it needs the details.

— John (@JohnMu) July 24, 2018

And once you identify potential issues content-wise and dig in, you can figure out the best path forward. That might be to enhance or boost that content, you might decide it should be noindexed, or you might even remove the content (404).

Scroll Depth Tracking
In October of 2017, Google Tag Manager rolled out native scroll depth tracking. And the analytics world rejoiced.

Celebrate Scroll Depth Tracking!

Using scroll depth tracking, you can track how far down each page users are going. And you have control over the trigger thresholds percentage-wise. For example, you could track whether users make it 10, 25, 50, 75, and then 100 percent down the page. And then you could easily see those metrics in your Google Analytics reporting. Pretty awesome, right?

Setting up scroll depth tracking in Google Tag Manager

For my client, this alone was amazing to see. Again, they wanted to make sure their core audience was reading each long-form article. If they saw that a good percentage of users stopped 25% down the page, then that wouldn’t be optimal… And if that was the case, then my client could adjust their strategy and potentially break those articles up and craft shorter articles moving forward.

I won’t cover the step-by-step instructions for setting up scroll depth tracking since it’s been covered by many people already. Here’s a great post from Simo Ahava on how to set up scroll depth tracking via Google Tag Manager. It doesn’t take long and you can start collecting data today.

Here is a screenshot of the tag in Google Tag Manager when I set this up. Just remember to set Non-interaction hit to true so scrolling events don’t impact bounce rate. We’ll use Adjusted Bounce Rate (ABR) to address that instead:

Using Google Tag Manager to set up scroll depth tracking.
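
While the GTM setup itself is point-and-click, it can help to picture what’s happening under the hood. Here’s a rough, hand-rolled sketch of scroll depth events (this is not the native GTM trigger, just an illustration). The thresholds, event names, and the assumption that the analytics.js ga() queue is loaded on the page are all placeholders:

```typescript
// A minimal, hand-rolled sketch of scroll depth tracking (assumes analytics.js is loaded).
// GTM's native trigger handles this for you; this just illustrates the idea, including
// the nonInteraction flag so scroll events don't impact bounce rate.
declare const ga: (...args: unknown[]) => void;

const thresholds = [10, 25, 50, 75, 100]; // percent of the page (illustrative values)
const fired = new Set<number>();

window.addEventListener(
  'scroll',
  () => {
    const scrolled = window.scrollY + window.innerHeight;
    const percent = (scrolled / document.documentElement.scrollHeight) * 100;

    for (const t of thresholds) {
      if (percent >= t && !fired.has(t)) {
        fired.add(t);
        // Non-interaction hit: scrolling alone shouldn't remove the bounce.
        ga('send', 'event', 'Scroll Depth', `${t}%`, document.location.pathname, {
          nonInteraction: true,
        });
      }
    }
  },
  { passive: true }
);
```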

And here are two examples of scroll depth tracking in action. The first is a post where many readers are engaged and a good percentage are making their way down the page. Note, the values are events and not sessions or users. That’s important to understand. Also, the screenshots below are from two different sites and each site owner has chosen different scroll depth thresholds:

Engaged users via scroll depth tracking.

And on the flipside, here’s a piece of content where many aren’t making their way down the page. It’s a lower quality page that isn’t seeing much engagement at all. There’s clearly much less traffic as well.

Scroll depth tracking showing unengaged users.

Adjusted Bounce Rate (ABR)
Ah, an oldie but goodie. In 2014 I wrote an article about how to set up Adjusted Bounce Rate (ABR) via Google Tag Manager. You can check out that post to learn more about the setup, but ABR is a great way to get a stronger feel for actual bounce rate. My post explains the problems with standard bounce rate, which doesn’t take time on page into account. So standard bounce rate is skewed. ABR, on the other hand, does take time on page into account and you can set whatever threshold you like based on your own content.

For example, if you write longer-form content, then you might want to set the threshold higher (maybe a minute or longer). But if you write shorter articles, then you might want to set the ABR threshold lower (like 30 seconds or less). Once the time threshold is met, an event is sent to Google Analytics causing the session to NOT show up as a bounce (even if the person only visits one page).
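
For reference, the classic non-GTM way to do this is a small timer that fires a GA event once the threshold passes. This is only a sketch, and the 30-second threshold and event names are placeholders you’d adjust for your own content (it also assumes the standard analytics.js ga() queue is on the page):

```typescript
// Adjusted Bounce Rate sketch (assumes analytics.js is loaded).
// Once the visitor has been on the page longer than the threshold, fire an
// event so the single-page session no longer counts as a bounce.
declare const ga: (...args: unknown[]) => void;

const ABR_THRESHOLD_MS = 30 * 1000; // 30 seconds; tune this to your content length

setTimeout(() => {
  // Deliberately an interaction hit (no nonInteraction flag), since the whole
  // point is to keep the session from registering as a bounce.
  ga('send', 'event', 'Adjusted Bounce Rate', `${ABR_THRESHOLD_MS / 1000} seconds or more`);
}, ABR_THRESHOLD_MS);
```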

It’s not uncommon to see Bounce Rate in Google Analytics drop off a cliff once you implement ABR. And that makes complete sense. If someone visits your article and stays on the page for six minutes, then that shouldn’t really count as a bounce (even if they leave the page without visiting any other pages). The person was definitely engaged. Here’s what that drop looked like for a client that implemented adjusted bounce rate this summer:

Bounce rate drops when adjusted bounce rate is implemented.

Here is an example from two of my posts about major algorithm updates where users are highly engaged. The adjusted bounce rate is just 12% for one and 13% for the other. Many people visiting these pages spend a lot of time reading them. So even if that’s the only page they read, it shouldn’t be counted as a bounce.

Adjusted Bounce Rate example

So, now we have two of the three metrics set up for helping gauge user happiness. Next, I’ll cover the third, Average Time On Page (a standard metric in Google Analytics). When combining all three, you can better understand whether visitors stay on a page past a certain time threshold, how far they scroll down the page, and how long they stay on that page overall.

Average Time On Page
I find there’s a lot of confusion about time metrics in Google Analytics. For example, Average Time On Page might show one value, while Average Session Duration is shorter than that. How can that be? Well, Mike Sullivan from Analytics Edge wrote a post about this a while ago. I recommend reading that article to get a feel for how the metrics work. In a nutshell, Average Time On Page excludes bounces (one-page visits). That’s because Google Analytics needs a page hop to calculate how long somebody remained on a page. Remember, we set up scroll depth tracking as a non-interaction hit, so it won’t impact bounce rate or time metrics.

Therefore, Average Time On Page will tell you how long users are staying on a piece of content when they visit another page on the site. Sure, it excludes bounces (so it’s not perfect), but it’s still smart to understand time on page for users who click through to another page on the site.
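
If it helps to see why that page hop matters, here’s a toy sketch of the calculation (the data structures are made up for illustration and this is not Google’s actual implementation). Time on page is the gap between one pageview and the next within a session, so a one-page visit has nothing to measure against and simply drops out of the average:

```typescript
// Toy illustration of why bounces don't contribute to Average Time On Page.
// Each session is an ordered list of pageview hits; time on page for a hit is
// the gap to the next hit in that session. The exit page of every session
// (and therefore every one-page session) has no "next hit" to diff against.
interface Pageview {
  path: string;
  timestamp: number; // ms since epoch
}

function averageTimeOnPage(sessions: Pageview[][], path: string): number | null {
  const durations: number[] = [];

  for (const session of sessions) {
    for (let i = 0; i < session.length - 1; i++) {
      if (session[i].path === path) {
        durations.push(session[i + 1].timestamp - session[i].timestamp);
      }
    }
  }

  if (durations.length === 0) return null; // only bounces/exits: no measurable time
  return durations.reduce((sum, d) => sum + d, 0) / durations.length / 1000; // seconds
}
```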

As an example, here’s my post about the August 1, 2018 algorithm update. The Average Time On Page is 12:48 for the responsive page and 17:52 for the AMP version. In web time, that’s an eternity.

Average time on page high for engaged posts.

And on the flip side, here’s a page from a different site that needs some help. It hasn’t been updated in a while and users are identifying that pretty darn quickly. Even for people that click through to another page, the Average Time On Page is just 0:55. That’s a major red flag for the site owner.

Low avg time on page.

Some Final Tips
Now that I’ve covered three methods for better understanding user happiness and engagement, I wanted to share some final tips. Once you are collecting data, you can slice and dice the information in Google Analytics in several ways.

  • First, review all three metrics for each piece of content you are analyzing. If you find a high adjusted bounce rate, low scroll depth, and low average time on page, then there’s clearly an issue (see the sketch after this list for one way to flag those pages). Dig in to find out what’s going on. Is the content old, is there a relevancy problem based on the query, etc.?
  • You might find content where scroll depth looks strong (people are scrolling all the way down the page), but adjusted bounce rate is high. That could mean people are quickly visiting the page, scrolling down to scan what’s there, and then leaving before your ABR time threshold is met. That could signal a big relevancy problem.
  • You can use segments to isolate organic search traffic to see how users from Google organic are engaging with your content. Then you can compare that to other traffic sources if you want.
  • You can also segment mobile users and view that data against desktop. There may be some interesting findings there from a mobile perspective.
  • Heck, you could even create very specific segments to understand how each one is engaging with your content. For example, you could create a segment of female visitors ages 18-34 and compare that to male users. Segments in Google Analytics can be extremely powerful. I recommend reviewing that topic in detail (beyond just what I’m covering today with scroll depth tracking, ABR, etc.).
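
To make that first tip a bit more concrete, here’s a rough sketch of how you might flag pages once you’ve exported the three metrics. The field names and thresholds are entirely made up for illustration; calibrate them against your own site and data:

```typescript
// Hypothetical export row combining the three metrics for a single URL.
interface ContentMetrics {
  url: string;
  adjustedBounceRate: number;    // 0-1, from the ABR setup
  scrolled75PercentRate: number; // share of pageviews reaching the 75% scroll threshold
  avgTimeOnPageSec: number;      // Average Time On Page, in seconds
}

// Flag pages where all three signals point the same (bad) way.
// The thresholds are placeholders; adjust them to fit your content.
function flagProblemContent(rows: ContentMetrics[]): ContentMetrics[] {
  return rows.filter(
    (r) =>
      r.adjustedBounceRate > 0.7 &&
      r.scrolled75PercentRate < 0.2 &&
      r.avgTimeOnPageSec < 60
  );
}
```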

Summary – Using Analytics As A Proxy For User Engagement, User Happiness, and Content Quality
I always recommend conducting user studies to truly find out what your target audience thinks about your content. You can watch how they engage with your content while also receiving direct feedback about what they liked or didn’t like as they browse your site. But you can also use analytics as a proxy for engagement and user happiness (which can help you identify content quality problems or relevancy issues).

By combining the three methods listed above, you can better understand how users are engaging with your content. You’ll know if they are staying for a certain amount of time, how far they are scrolling down each page, and then you’ll see average time on page (excluding bounces). It’s not perfect, but it’s better than guessing.

And once you collect the data, you may very well choose to refine your content strategy. And the beautiful part is that you can start collecting data today. So go ahead and set up scroll depth tracking and adjusted bounce rate. Then combine that with average time on page so you can use the three-pronged approach I covered in this post. Leverage the power of Google Analytics and Google Tag Manager (GTM). You never know what you’re going to find.

GG

Filed Under: google, google-analytics, seo, tools, web-analytics
