The Internet Marketing Driver

How to compare hourly sessions in Google Analytics 4 to track the impact from major Google algorithm updates (like broad core updates)

March 15, 2023 By Glenn Gabe

Hourly tracking in Google Analytics 4

I was just asked on Twitter if there was an easy way to compare Google organic traffic hourly like you can in Universal Analytics. That’s a great question, and it’s a super useful report to have as major algorithm updates roll out. If your site was heavily impacted by a major update (like a broad core update, a Product Reviews Update, etc.), you can typically start to see the separation over time as the update rolls out.

So I fired up GA4 and created a quick exploration report for analyzing hourly traffic. Here is a short tutorial for creating the report:

1. Fire up GA4 and click the “Explore” tab in the left-side menu.

Explore tab in Google Analytics 4

2. Click the “Free Form” reporting option.

Free form exploration reporting in Google Analytics 4

3. Click the plus sign next to “Segments” to add a new session segment. Then create a segment for Google Organic by adding a new condition, selecting “Session source / medium” and then adding a filter for “google / organic”.

Creating a segment for Google Organic in Google Analytics 4
Selecting session source and medium and then filtering by Google Organic when creating a new segment in GA4

4. Add that segment to your reporting by dragging it to the “Segment Comparisons” section of the report.

Adding a segment to the reporting in Google Analytics 4

5. Set “Granularity” to Hour.

Selecting Hour as the granularity for the reporting in Google Analytics 4

6. Add a new metric and select “Sessions”. And then drag “Sessions” to “Values”.

Adding sessions as a metric in Google Analytics 4

7. Change the visualization to line chart by clicking the line chart icon.

Changing the visualization of the reporting to line graph in Google Analytics 4

8. For the timeframe, select “Compare” and choose a day, then choose the day to compare against. Note, GA4 isn’t letting me choose today (which is a common way to see how the current day compares to a previous day). So, you’ll have to just compare the previous day to another day. Sorry, I didn’t create GA4.

Comparing timeframes in Google Analytics 4

9. Name your report and enjoy comparing hourly sessions.
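
Side note: if you’d rather script this than click through the Explore UI, here is a rough sketch using the GA4 Data API in Python. This isn’t part of the exploration report above, and the property ID, dates, and credentials setup are just placeholders you would swap for your own:

```python
# A rough sketch (not part of the GA4 UI steps above): pull hourly
# Google / organic sessions for two single days via the GA4 Data API.
# Assumes google-analytics-data is installed and credentials are set via
# GOOGLE_APPLICATION_CREDENTIALS. The property ID and dates are placeholders.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Filter, FilterExpression, Metric, RunReportRequest,
)

PROPERTY_ID = "123456789"  # hypothetical GA4 property ID

def hourly_organic_sessions(day: str) -> dict:
    """Return {"YYYYMMDDHH": sessions} for google / organic traffic on one day."""
    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property=f"properties/{PROPERTY_ID}",
        dimensions=[Dimension(name="dateHour")],
        metrics=[Metric(name="sessions")],
        date_ranges=[DateRange(start_date=day, end_date=day)],
        dimension_filter=FilterExpression(
            filter=Filter(
                field_name="sessionSourceMedium",
                string_filter=Filter.StringFilter(value="google / organic"),
            )
        ),
    )
    response = client.run_report(request)
    return {row.dimension_values[0].value: int(row.metric_values[0].value)
            for row in response.rows}

# Compare two days hour by hour (e.g. a day during the rollout vs. the week before).
day_a = hourly_organic_sessions("2023-03-15")
day_b = hourly_organic_sessions("2023-03-08")
for hour in range(24):
    key_a, key_b = f"20230315{hour:02d}", f"20230308{hour:02d}"
    print(f"{hour:02d}:00  {day_a.get(key_a, 0):>6}  {day_b.get(key_b, 0):>6}")
```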

I hope you found this helpful, especially since the March 2023 broad core update is currently rolling out. Have fun. :)

GG

Filed Under: algorithm-updates, google, google-analytics, seo, tools, web-analytics

Continuous Scroll And The GSC Void: Did The Launch Of Continuous Scroll In Google’s Desktop Search Results Impact Impressions And Clicks? [Study]

January 25, 2023 By Glenn Gabe

Google continuous scroll study based on GSC data.

Google rolled out continuous scroll in the desktop search results for English queries in the United States on December 5, 2022. Continuous scroll enables users to seamlessly continue to page two and beyond without having to click a “next” button at the bottom of the results. This followed Google rolling out continuous scroll in the mobile results in October of 2021 (again, U.S.-only for English queries).

Here is Google’s announcement about the rollout to desktop in early December (with a gif of continuous scroll in action):

Starting today, we’re bringing continuous scrolling to desktop in English in the U.S. so you can continue to see more search results easily. When you reach the bottom of a search results page, you'll now be able to see up to six pages of results. pic.twitter.com/xIuVP24FFm

— Google (@Google) December 5, 2022

After the rollout to desktop, many wondered how continuous scroll would impact the visibility of rankings that were beyond page one. For example, for sites with urls ranking on page two, and maybe even the top of page two, the ability for users to easily scroll to additional pages of search results should lead to more impressions, clicks, and conversions. That’s the idea anyway, and something I set out to analyze.

Before continuous scroll rolled out in the SERPs, ranking on page two and beyond meant your listings probably wouldn’t be seen much. Sure, some people would venture to page two and beyond, but most would stick on page one (and just refine their search if they couldn’t find what they needed after scanning the results). But with continuous scroll, users can easily move to the second page of results without having to click a button. The new results just show up as you approach the bottom of the initial set of results.

Analyzing Continuous Scroll in the U.S. Desktop Search Results:
Right after the rollout of continuous scroll on desktop, I published a post explaining how to analyze the change in impressions, clicks, and click through rate based on users being able to seamlessly view more listings in the search results. My tutorial explains how to use the GSC API and Analytics Edge in Excel to bulk export data from GSC, filter by desktop only from the United States, compare timeframes, and then filter by page two and three results. The resulting worksheets quickly provide the changes across metrics when comparing the timeframe before, and after, continuous scroll rolled out.

After publishing that post, I eagerly waited for more data to build so I could dig into the reporting across sites. And that’s exactly what I did for a number of sites. I knew the sites would have a ton of data to analyze, and since they span several verticals, it should be easier to see differences based on continuous scroll rolling out in the desktop SERPs in the United States.

Below, I’ll cover the methodology I used, the data I analyzed, some interesting (and scary) findings about GSC data, and the impact, or lack thereof, of continuous scroll rolling out in the desktop search results. Let’s jump in.

Methodology:
First, I selected twelve different sites that have a steady and significant amount of traffic from Google organic. Some of the properties are large-scale sites driving a lot of clicks from Google organic, while others are niche sites driving less traffic (but still a good amount of clicks). I made sure the sites were across verticals, and that those verticals weren’t heavily impacted by the holidays (as much as I could). Also, I made sure to focus on just the desktop results from the United States, since continuous scroll hasn’t rolled out internationally yet.

Then I used the process I mapped out in my tutorial for analyzing the change in metrics based on exporting data from GSC in bulk, filtering by desktop only from the U.S., comparing timeframes, and then filtering by page two and three results. You can check my tutorial for how to accomplish this using the GSC API, Excel, and Analytics Edge.

When analyzing the data, I made sure to review queries where the average position was about the same before and after continuous scroll rolled out. For example, I wouldn’t review a query that was ranking 24 before the rollout and then 16 after. That’s a big difference and could obviously impact the data. I looked for queries where the site ranked at about the same position so I could better analyze whether continuous scroll was having an impact on visibility and engagement.
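
If you end up scripting this check rather than eyeballing it in a spreadsheet, here is a tiny sketch of that position-stability rule in pandas. The column names are hypothetical and assume you have already merged the two GSC exports into one row per query:

```python
# A tiny sketch of the position-stability rule described above. Assumes the two
# GSC exports have already been merged into one row per query; the column names
# (position_before, position_after) are hypothetical.
import pandas as pd

def stable_position_queries(df: pd.DataFrame, max_shift: float = 2.0) -> pd.DataFrame:
    """Keep queries whose average position barely moved between timeframes, so
    changes in impressions/clicks can't simply be explained by ranking shifts."""
    shift = (df["position_after"] - df["position_before"]).abs()
    return df[shift <= max_shift]
```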

Regarding Google organic search traffic for the twelve sites, I have provided the number of clicks over the past three months for each of the sites I analyzed (just so you have a feel for how much traffic they were driving from Google recently).

The sites ranged from 89M clicks to 1.1M clicks over the past three months and spanned a number of verticals:

  • Site 1: 89M clicks
  • Site 2: 31M clicks
  • Site 3: 25.3M clicks
  • Site 4: 8.6M clicks
  • Site 5: 7.4M clicks
  • Site 6: 4.9M clicks
  • Site 7: 3.7M clicks
  • Site 8: 3.2M clicks
  • Site 9: 3.0M clicks
  • Site 10: 2.9M clicks
  • Site 11: 2.5M clicks
  • Site 12: 1.1M clicks

The GSC Void: Dark, Murky, and Inconclusive
First, the reality of filtered GSC data hit hard after exporting data based on query. Barry Schwartz covered this in July of 2022 after Patrick Stox wrote a post explaining what he was seeing across sites with GSC filtering data. It was eye-opening to see how much data was filtered for some sites…

When I checked at the time across sites, I also saw massive gaps in data when exporting based on query. For example, the total at the top of the Performance report in GSC can be much greater than what you see after exporting the data by query (and then totaling the clicks and impressions). And I mean WAY OFF. For some sites I analyzed at the time, I was only seeing 20% of the total after exporting the data. Yes, that means 80% of the data was filtered.

Yep, the doc was updated on Friday by Google. I've seen some sites with over 80% of queries being filtered. You can check this by using the GSC API. Beware, you might be shocked with what you see… https://t.co/7SQr7hVG6M

— Glenn Gabe (@glenngabe) July 11, 2022

The reason is that Google filters queries based on privacy concerns. In its documentation, which it refined after Patrick’s study, Google explains that it filters “anonymized queries” to protect the privacy of users. And for some sites, it can be a ton of data. It’s worth noting that exporting by page will yield the full results (or close), but exporting by query highlights what I call the GSC void.
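
If you want to gauge the size of the void for your own properties, here is a rough sketch that hits the GSC API directly in Python. This is separate from my Analytics Edge tutorial, and the site URL and credentials file are just placeholders:

```python
# A rough sketch for sizing the "GSC void" on your own property: compare total
# clicks (no dimensions) to the sum of clicks when exporting by query.
# Assumes google-api-python-client is installed; SITE_URL and the service
# account file are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"  # hypothetical GSC property
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # hypothetical credentials file with GSC access
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

def total_clicks(start, end):
    """Total clicks for the timeframe (includes anonymized queries)."""
    body = {"startDate": start, "endDate": end}
    resp = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
    return sum(row["clicks"] for row in resp.get("rows", []))

def clicks_by_query(start, end):
    """Sum of clicks when exporting by query (anonymized queries are filtered out)."""
    clicks, start_row = 0, 0
    while True:
        body = {
            "startDate": start, "endDate": end,
            "dimensions": ["query"],
            "rowLimit": 25000, "startRow": start_row,
        }
        resp = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
        rows = resp.get("rows", [])
        if not rows:
            break
        clicks += sum(row["clicks"] for row in rows)
        start_row += len(rows)
    return clicks

start, end = "2022-12-05", "2022-12-09"
visible = clicks_by_query(start, end) / total_clicks(start, end)
print(f"Visible when exporting by query: {visible:.0%} (the rest is the GSC void)")
```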

Back to continuous scroll data… For the twelve sites I analyzed, some GSC properties only provided 20-30% of the total data reported in the Performance reporting due to filtering. You read that correctly. That means 70-80% was filtered for those sites.

On the flip side, I’ve seen as high as 84% of the data showing (so 16% filtered), but that was the most I could find based on reviewing a number of properties in GSC. Don’t get me wrong, that’s much better than 24%!

When I saw the amount of filtering, I knew I had my work cut out for me with trying to analyze the data. I was hoping there was enough to see changes based on continuous scroll rolling out… One thing was clear, the GSC void was dark and murky.

Dark and Murky: Analyzing the impact to impressions, clicks, and click through rate.
First, “Dark and Murky” isn’t the name of a trendy new drink you can order poolside at a resort. It’s just the first thing I said after going through the data across the sites I analyzed. When I got past the first page of results, the numbers across most of the sites plummeted. They dropped so much that it’s nearly impossible to draw any conclusions about how continuous scroll is impacting clicks and click through rate from the desktop SERPs.

And from an impressions standpoint, I couldn’t see a consistent trend with the increase in impressions. For some queries, impressions did increase. For others, they dropped. And again, clicks and click through rate were very hard to analyze due to the insanely low numbers beyond page one.

For example, as soon as I checked the spreadsheet for page two results across several of the sites, the number of clicks was inconsequential. That shows you how much data is being filtered, by the way… On page one, some queries are yielding tens of thousands of clicks, or more. Then page two drops to almost nothing? So yes, the GSC void is real and it can severely hamper your analysis.

Here are some screenshots from the spreadsheets for page two of the search results. Get ready to be underwhelmed from a clicks standpoint. :)

But, I mentioned one site that only had 16% of its data filtered (which was the best I came across out of the twelve). For that site, you would think I would have enough data to make some conclusions… but not really. Clicks were very low once I analyzed page two and beyond. I could see that impressions increased for a number of queries, but clicks didn’t. And since clicks were so low beyond page one, the difference in click through rate was pointless to review.

Here is a screenshot from the site that was only 16% filtered:

For example, for one query the number of impressions jumped by 3,110, but clicks only increased by 18. Average position went from 21.3 to 19.0, which is close, but that could have meant a jump from page three of the results to page two. Clearly this isn’t enough data to draw any conclusions. The impressions increase is one thing, but the clicks were so low that it didn’t mean much. And to be honest, who really cares about an increase in impressions if clicks don’t follow? For most site owners, this isn’t really a branding exercise. They want the clicks and subsequent conversions! :)

Here’s another site where there was a nice increase in impressions for some queries, and definitely an increase in clicks for some of the queries. That said, some of the increases were due to the site ranking much stronger in the latest timeframe. There was an increase in impressions for some queries when the site ranked about the same position, but there’s just not enough click data to draw any serious conclusions…

Key takeaways based on analyzing Continuous Scroll in the desktop SERPs:

  • Based on my analysis across sites, there is NOT much data on page two and beyond to analyze… That’s even the case for large-scale sites with a ton of Google organic traffic. That’s based on GSC filtering anonymized queries.
  • I could see an increase in impressions for some queries, but it was hard to draw any conclusions since there were many queries that dropped when comparing timeframes as well.
  • Clicks and click through rate were even harder to analyze. There weren’t many clicks to report overall beyond page one, which made it very tough to draw any conclusions.
  • From a GSC data standpoint, I had severely limited data based on GSC filtering. This has been reported before, and this study underscored how much filtering is going on. For example, some of the exports were only yielding 20-30% of the data reported in GSC in the Performance reporting (when analyzing by query).
  • I do recommend going through this process for your own sites using the tutorial I published (if for no other reason than to see the severe filtering going on with GSC data when exporting by query). Note, you should be able to see the full data if exporting by page, but there are many queries that lead to specific pages (which can muddy the waters analysis-wise).

Summary: The GSC Void Limits Analysis of Continuous Scroll in the SERPs
After continuous scroll rolled out in the desktop search results, I was extremely excited to analyze the impact to impressions and clicks based on users scrolling to page two and beyond. Unfortunately, GSC data filtering hampered my efforts big-time. Some sites were only returning 20-30% of the total data based on GSC’s filtering of anonymized queries.

I’ll be sure to update this post if I come across stronger findings based on analyzing continuous scroll across sites. In the meantime, I do recommend going through this process for your own sites. You never know, GSC might not be filtering as much of your data… Good luck.

GG

Filed Under: google, seo, tools

How to analyze the impact of continuous scroll in Google’s desktop search results using Analytics Edge and the GSC API

December 12, 2022 By Glenn Gabe

Analyzing the impact of continuous scroll in Google's desktop search results.

Google rolled out continuous scroll in the desktop search results in the U.S. on December 5, 2022, which follows a rollout in the mobile search results in October of 2021. It’s basically like infinite scroll for the search results. When you approach the bottom of page one, the second page of results seamlessly loads, enabling users to easily continue their journey to find answers.

Starting today, we’re bringing continuous scrolling to desktop in English in the U.S. so you can continue to see more search results easily. When you reach the bottom of a search results page, you'll now be able to see up to six pages of results. pic.twitter.com/xIuVP24FFm

— Google (@Google) December 5, 2022

For site owners and SEOs, this means hidden treasures ranking on page two and beyond in the search results could see higher visibility (as users load additional pages in the SERPs without having to click the next button at the bottom of each page). I said “could” because that’s in theory and would need to be proven via data. It wasn’t long before I started hearing questions about how to best track the addition of continuous scroll in the desktop search results, and how that would impact clicks, impressions, and click through rate. That’s when I fired up Analytics Edge in Excel to come up with a solution that could help.

Automating A Solution By Combining The GSC API And Analytics Edge In Excel
If you’ve been following me on Twitter and reading my blog for a while, then you have probably seen some of my tutorials for using Analytics Edge to automate the exporting of data from GSC (and then automatically work with that data via macros). Analytics Edge is an amazing solution created by Mike Sullivan and I often call it a Swiss Army Knife for working with various APIs.

In this tutorial, I’ll explain how to bulk export data from GSC, compare that data to a previous timeframe, filter by position in the search results, and create separate worksheets by Google search result page. When you’re done, you will have separate worksheets for page two, page three, etc., and you’ll be able to see the difference in clicks, impressions, and click through rate based on Google rolling out continuous scroll in the desktop search results in the United States.

Let’s jump into the tutorial. I’m sure you are eager to see the data for your own properties!

Tutorial: How to use Analytics Edge to analyze the impact of continuous scroll in the desktop search results.

1. Set up Analytics Edge in Excel:
I have covered this several times in previous tutorials. Please reference those blog posts to learn how to download and install Analytics Edge. For example, my post about creating Delta Reports explains how to set up Analytics Edge. Also, there is a free trial available for Analytics Edge, and the cost is super economical (it’s just $99 for the year for the core add-in and $50 per year for the Google Search Console add-on). Note, Analytics Edge is up to version 10.9 now (the image below shows a previous version).

Install Analytics Edge in Excel

2. Export all GSC data for the timeframe AFTER Google rolled out continuous scroll in the desktop results:
Analytics Edge enables you to build a macro with several tasks that work together to accomplish your goal. The first step in our Analytics Edge macro is to export all GSC query data for desktop searches for the timeframe after continuous scroll rolled out in the desktop search results in the U.S. Click the Analytics Edge tab in Excel and click “Google Search”, and then “Search Analytics”.

Using the Search Analytics API in Analytics Edge in Excel

3. Choose your settings for exporting data via the GSC API:
When the dialog box opens, select the account and then GSC property you want to export data from.

Select a GSC property in Analytics Edge

4. Choose dimensions and metrics to export:
Then click the Fields tab and click the query dimension in the left side pane labeled “Available Dimensions and Metrics”. Then click the “Add” button to add that dimension to your export. Notice that the selected metrics include clicks, impressions, ctr, and position. Keep all of those as-is.

Select fields to export using Analytics Edge

5. Set a filter for Desktop devices only in the United States:
Next, we don’t want to muddy our data with mobile traffic and non-U.S. traffic, since we are trying to analyze the impact of continuous scroll rolling out in the DESKTOP results in the U.S. only. So, click the “Filters” tab and click the dropdown for “Devices” and select “DESKTOP”. Then for “Country”, select “United States”. Then keep all other settings as-is for this tab.

Select devices as a filter in Analytics Edge to focus on desktop-only

6. Select dates to compare:
Next, we want to analyze the difference in clicks, impressions, and ctr for the timeframe after Google rolled out continuous scroll in the desktop search results to the timeframe before. The rollout began on 12/5, so select “Start” and choose a start date of 12/5. For the end date, I would choose a date with full data (and not partial data). I used 12/9 as the end date.

Make sure you select the “Compare to” checkbox and then enter dates to compare the data with. For the start date, select specific dates that line up for day of the week and number of days. If this isn’t the same number of days, or if it’s a different set of days of the week, your data could be off. I selected 11/28 through 12/2.

Select dates to compare in Analytics Edge in Excel

7. Choose a sort order:
You can tell Analytics Edge to sort the results by a specific metric. For our purposes, you can choose clicks or impressions in descending order (which means it will be highest to lowest amount of clicks or impressions). Just select one metric for this tutorial (I chose clicks). Note, you can easily change the sorting once the data has been exported in Excel. Click OK to export the data.

Choose a sort order in Analytics Edge in Excel

8. Set the table name:
Analytics Edge will export the data and hold it in memory. You will see a partial set of data in a worksheet highlighted in green. Before we write the full data to a worksheet, we want to store that data in a virtual table that we can reference later via Analytics Edge (so we can filter the data later on). To add the data to a table, click the “Analytics Edge” tab in Excel and then select “Table Name”. In the dialog box, set the table name to whatever you want. I named it “allpages”. Then click “OK”.

Set a table name to store exported data in memory in Analytics Edge
Assign the table name in Analytics Edge

9. Write the full data to a worksheet (just to have all of the data documented):
Although we are looking to isolate queries where the site ranks on page two and three in the desktop search results, we are going to export all of your query data (just to have a worksheet you can reference if needed). You will notice Analytics Edge is showing you a subset of the data highlighted in green. The full data is in memory. To write that data to a worksheet, click the File menu in Analytics Edge and select “Write to Worksheet”. Name the worksheet something like “Queries All Data” and click “OK”.

Write data to a worksheet in Analytics Edge in Excel

10. Filter the data for just page two results:
OK, so now we have a worksheet containing all of our query data compared to a previous timeframe. Next, we are going to filter the data to only pull results with a position of 11 through 20 (roughly page two results in Google) and write that to a new worksheet. Sure, some pages contain more than 10 results, but overall this should work for us. Click the “Analytics Edge” menu and click “Table”, then “Filter”. In the dialog box, we are going to filter by the column containing position for the time period after continuous scroll rolled out in the desktop results in the U.S.

Select the column in the dropdown box, choose “Greater than” in the criteria field, and enter 10. Then add another rule using the same field, but this time select “Less than” and enter 21. That gives us results with a position of 11-20. And to make sure we are comparing apples to apples, let’s make sure the site ranked in a similar position in the previous timeframe. So add one more filter rule using the field with the previous position and select “Greater than” 10. We are doing this to make sure the position didn’t radically change (and move from page one to two).

Filter data to isolate page two results from Google in Analytics Edge

11. Write to worksheet:
Now that we’ve filtered the results for just page two data, we need to write that data to a new worksheet (so we can analyze the data in Excel). Click the File menu in Analytics Edge and select “Write to Worksheet” like we did before. Name the worksheet something like “Page Two” and click OK. The new worksheet should appear with your data filtered for positions 11-20.

Write data to a worksheet in Analytics Edge in Excel

12. Set the table name again before filtering:
In step 8 we set a table name holding all of our exported data and I said we would need that again. Well, now that we’ve exported the second page of results, we also want to isolate the third page of results. So, we’ll need to reference that virtual table again before filtering for positions 21-30. To do that, click the Table menu again and select “Table Name”. In the dialog box, select the radio button for “Switch to a previously named table” and select the “allpages” table we set earlier. If you named it something different, then choose that name. Then click OK.

Switching to a previously named table in order to filter data in Analytics Edge

13. Filter the results for third page rankings:
Just like we filtered the results for page two rankings, we’ll do that now for page three. To do that, click the Table menu in Analytics Edge and then select “Filter”. In the dialog box, select the column for position for the most recent timeframe and select “Greater than” and set the value as 20. Then add a second rule and choose that column again, but this time select “Less than” as the criteria and enter 31. That will limit the queries to positions 21 through 30 (roughly page three in the Google search results). Then to make sure we are comparing apples to apples, add one more rule to make sure the previous position was also greater than 20 (so the query wasn’t ranking on page one or two in the earlier timeframe). So select the column for position for the previous timeframe, select “Greater than” as the criteria, and enter 20. Then click OK.

Filtering by position in Analytics Edge in Excel

14. Write to worksheet to complete the macro:
Now that we are filtering by page three rankings, we need to finalize that step by writing the data to a new worksheet (so we can analyze the data separately). Click the File menu in Analytics Edge and select “Write to Worksheet”. Name the worksheet something like “Page Three” and click OK. The new worksheet will be created with page three data.

Writing the final data to a worksheet in Analytics Edge

Congratulations! You just created a system for analyzing the change in impressions, clicks, and click-through rate based on continuous scroll launching in the desktop search results in the United States! Now it’s time to dig into the data to identify surges or drops across various metrics. Next, I’ll provide some final tips for working with the data so you can begin to identify the change based on continuous scroll rolling out on desktop.
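
And if you prefer scripting the whole thing instead of (or alongside) Analytics Edge, here is a rough Python sketch of the same workflow. To be clear, this is not Analytics Edge itself, just an approximation under a few assumptions: the site URL and credentials file are placeholders, and pandas plus the google-api-python-client package are installed:

```python
# A rough script-based approximation of the macro above (this is not Analytics
# Edge itself). It exports desktop/US query data for the two timeframes via the
# GSC API, merges on query, and slices out page-two and page-three rankings.
# SITE_URL and the credentials file are placeholders.
import pandas as pd
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"  # hypothetical GSC property
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # hypothetical credentials file with GSC access
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

def export_queries(start: str, end: str) -> pd.DataFrame:
    """Export query-level data for desktop searches from the United States."""
    body = {
        "startDate": start, "endDate": end,
        "dimensions": ["query"],
        "dimensionFilterGroups": [{"filters": [
            {"dimension": "device", "expression": "DESKTOP"},
            {"dimension": "country", "expression": "usa"},
        ]}],
        "rowLimit": 25000,
    }
    resp = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
    return pd.DataFrame(
        {"query": r["keys"][0], "clicks": r["clicks"], "impressions": r["impressions"],
         "ctr": r["ctr"], "position": r["position"]}
        for r in resp.get("rows", [])
    )

after = export_queries("2022-12-05", "2022-12-09")   # after continuous scroll rolled out
before = export_queries("2022-11-28", "2022-12-02")  # comparable days before the rollout
merged = after.merge(before, on="query", suffixes=("_after", "_before"))

# Page two: current position 11-20, and previous position also beyond page one.
page_two = merged[(merged.position_after > 10) & (merged.position_after < 21)
                  & (merged.position_before > 10)]
# Page three: current position 21-30, and previous position also beyond page two.
page_three = merged[(merged.position_after > 20) & (merged.position_after < 31)
                    & (merged.position_before > 20)]

page_two.to_csv("page_two.csv", index=False)
page_three.to_csv("page_three.csv", index=False)
```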

Next steps and final tips for analyzing the data:

  • I recommend formatting the CTR columns to percentages using Excel’s functionality. It will make it much easier to scan and determine the percentage change for each query. Also, once you run this for a specific property in GSC, the columns will retain their formatting. So if you rerun the query, the CTR columns should stay as percentages, which is great.
  • I would also format the clicks and impressions columns to be “Number”, with no decimal points, and add a comma for thousands. Again, this is just to help you easily scan the data.
  • And last, format the position columns to Number with one decimal place. So 11.9125 would become 11.9.
  • Analysis-wise, look for larger changes in impressions and click through rate when scanning the data. That could mean that continuous scroll is having an impact for those queries. But, make sure position is comparable when checking the previous timeframe. For example, if you see a huge increase in impressions, make sure the position didn’t cause the change versus continuous scroll. If a site ranked on the bottom of page one versus top of page two, that could yield a big difference in impressions.
  • I would also filter each worksheet so you can slice and dice the data. For example, you could easily sort the data by impressions in descending order (largest to smallest), you could do that by clicks, or CTR change. Playing with the data can help you surface interesting findings quicker. In order to filter, click the Data menu in Excel and click “Filter”, which is a funnel icon.
  • You can also use color coding in Excel to highlight surges and drops in green and red. This is especially helpful if you are sending the data to a client or someone else in your company that isn’t as familiar with GSC data.
  • And once you create a template, it can easily be used for other properties in GSC. Just save a new spreadsheet for each property you want to analyze. And again, the formatting for each column should remain (so you don’t have to reformat the worksheet each time you export the data).

Summary – Determining the impact of continuous scroll on desktop via Analytics Edge and the GSC API.
With the addition of continuous scroll in the desktop search results in the U.S., users can easily make their way from page one to two (and beyond) without having to click to the next page of results. And that can definitely impact impressions, clicks, and CTR for your listings that are ranking beyond page one. Using the approach I explained in this tutorial, you can use GSC data to analyze the impact. If you have any questions while going through this tutorial, feel free to ping me on Twitter. I think you’ll dig using Analytics Edge for this task! It’s just another powerful way to use one of my favorite SEO tools.

GG

Filed Under: google, seo, tools, web-analytics

Percent Human: A list of tools for detecting lower-quality AI content

November 9, 2022 By Glenn Gabe

AI content

Updated on 2/1/23: OpenAI’s AI content detection tool (AI Text Classifier) was added to the list. GPTZeroX was also included (an upgrade to GPTZero).
Updated on 1/10/23: GPTZero was added to the list of AI content detection tools (created by a Princeton University senior).
Updated on 12/29/22: Content at Scale’s AI content detection tool was added.
Updated on 12/14/22: Writer’s AI content detection tool has been updated to detect GPT-3, GPT 3.5, and ChatGPT.
Updated on 12/13/22: Originality.ai was added to the list of AI content detection tools.

———-

As I’ve been sharing examples of sites getting pummeled by the Helpful Content Update (HCU) or the October Spam Update, I’ve also been sharing screenshots from tools that detect AI content (since some sites getting hit are using AI to pump out a lot of lower-quality content – among other things they were doing that could get them in trouble). And based on those screenshots, many people have been asking me which tools I’m using.

So, instead of answering that question a million times (seriously, it might be a million), I figured I would write a quick post listing the top tools I have come across. Then I can just quickly point people to this post versus answering the question over and over.

And note, I’m not saying these tools are foolproof. I have just found them to be pretty darn good at detecting lower-quality AI content. And that’s what we should be trying to detect by the way (not all AI content… but just low-quality AI content that could potentially get a site in trouble SEO-wise).

For example, here is high-quality human content run through a tool:

Detecting human content

And here is an example of lower-quality AI content run through a tool:

Detecting lower quality AI content

Again, it’s not foolproof, but it can give you a quick feel for whether AI was used to generate the content. Below, I’ll cover my favorite AI content detectors I’ve come across so far. I’ll also keep adding to this list, so feel free to ping me on Twitter if you have a tool that’s great at detecting lower-quality AI content!

Here is a list of tools covered in this post for detecting AI content:

  1. Writer’s AI content detector tool.
  2. Huggingface GPT-2 Output Detector Demo.
  3. Giant Language Model Test Room (GLTR).
  4. Originality.ai (AI content and plagiarism detection)
  5. Content at Scale’s AI content detection tool.
  6. GPTZeroX
  7. OpenAI’s AI Text Classifier

1. Writer’s AI content detector tool:
The first tool I’ll cover is from a company that has an AI writing platform (sort of ironic, but does make sense). Also, it seems like the platform is more for assisting writers from what I can see. You can check out their site for more information about the platform. Well, they also have a nifty AI content detector that works very well. You have probably seen my screenshots from the tool several times on Twitter and LinkedIn. :)

Update: 12/14/22 – While I was testing content created via GPT 3.5 and ChatGPT, I noticed that Writer’s detection tool was accurately detecting the content as created by AI. That was a change, since the tool was originally focused on GPT-2, so I quickly reached out to Writer’s CEO for more information. And I was correct! Writer’s AI content detection tool has been updated to detect GPT 3, GPT 3.5, and ChatGPT. So it’s now the second tool on the list that can achieve that.

AI content is progressing, but so are the tools. Below are some examples of using Writer’s AI content detection tool.

Here is Writer’s tool detecting higher-quality human content:

Writer's AI content detection tool measuring high quality human content

And here is Writer’s tool detecting content created via GPT-3.5 (using davinci-003, which is the latest model as of 12/14/22):

Writer's AI content detection tool accurately detecting content created via GPT-3.5 (using davinci-003)

2. Huggingface GPT-2 Output Detector Demo:
If you’re not familiar with Huggingface, it’s one of the top communities and platforms for machine learning. You can check out their site for more information about what they do. Well, they also have a helpful AI content detector tool. Just paste some text and see what it returns. I have found it to be pretty good for detecting lower-quality AI content. 

For example, here is Huggingface’s tool detecting higher quality human content:

Huggingface's AI content detection tool measuring high quality human content

And here is Huggingface’s tool detecting lower-quality AI content:

Huggingface's AI content detection tool measuring lower-quality AI content

3. Giant Language Model Test Room (GLTR.io)
The third tool I’ll cover was actually down recently, but I had heard good things about it from several people (when it was working). It turns out there was a server issue and the tool was hanging. Well, GLTR is back online now and I’ve been testing it to see how well it detects AI content.

The tool was developed by Hendrik Strobelt, Sebastian Gehrmann, and Alexander Rush from the MIT-IBM Watson AI Lab and Harvard NLP. It’s definitely not as intuitive as the first two tools I covered, but once you get the hang of it, it can definitely be helpful.

How it works:
You can paste text into the tool and view a visual representation of the analysis, along with several histograms providing statistics about the text. I think most people will focus on the visual representation to get a feel for how likely each word was to be predicted based on the word to its left. And that can help you identify if a text was written by AI or by a human. Again, nothing is foolproof, but it can be helpful (and I’ve found the tool does work well). To learn more about GLTR and how it works, you can read the detailed introduction on the site.

For example, if a word is highlighted in green, it’s in the top 10 most likely predicted words based on the word to its left. Yellow highlighting indicates it’s in the top 100 predictions, red in the top 1,000, and the rest would be highlighted in purple (even less likely to be predicted).

The fraction of red and purple words (unlikely predictions) increases when the text was written by a human. If you see a lot of green and yellow highlighting, then it can indicate the text contains many predicted words based on the language model (signaling the text could have been written by AI).
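
To make the idea concrete, here is a simplified sketch of that ranking approach using GPT-2 via the transformers library. To be clear, this is not GLTR’s actual code, just an illustration of bucketing each word by how highly the model predicted it:

```python
# A simplified sketch of the idea behind GLTR (not the tool's actual code): for
# each word, check where it ranks in GPT-2's predictions given the text to its
# left, then bucket it the way the GLTR UI colors words. Assumes the
# transformers and torch packages are installed.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def rank_buckets(text: str):
    ids = tokenizer.encode(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(ids).logits  # shape: [1, seq_len, vocab_size]
    buckets = []
    for i in range(1, ids.shape[1]):
        scores = logits[0, i - 1]    # predictions made from the tokens before token i
        actual = int(ids[0, i])      # the token that actually appears next
        rank = int((scores > scores[actual]).sum()) + 1
        token = tokenizer.decode([actual])
        if rank <= 10:
            buckets.append((token, "green (top 10)"))
        elif rank <= 100:
            buckets.append((token, "yellow (top 100)"))
        elif rank <= 1000:
            buckets.append((token, "red (top 1,000)"))
        else:
            buckets.append((token, "purple (rarer than top 1,000)"))
    return buckets

for token, bucket in rank_buckets("Broad core updates can significantly impact Google rankings."):
    print(f"{token!r:>15}  {bucket}")
```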

Here are two examples. The first shows AI content (many words highlighted in green and yellow). This text was generated via GPT-2.

Giant Language Model Test Room (GLTR) analysis of AI generated content.

And here is an example from one of my articles about broad core updates. Notice there are many words highlighted in red, and several purple words as well (signaling this is human-written text).

Giant Language Model Test Room (GLTR) analysis of human-written content.

4. Originality.ai (for detecting GPT 3, GPT 3.5, and ChatGPT)
I was able to test Originality.ai recently and I’ve been extremely impressed with their platform. The CEO emailed me and explained they were one of the few tools to be able to detect GPT-3, GPT 3.5 and ChatGPT (as of December 13, 2022). Needless to say, I was excited to jump in and test out its AI content detection tool. Also, it’s worth noting that the tool can detect plagiarism as well (which is an added benefit). They have also released a Chrome extension and they have an API for handling requests in bulk. I’ll cover more about the Chrome extension below.

So, I fired up OpenAI and selected text-davinci-003 (the latest model as of 12/13/22) and started generating essays, short articles, how-tos, and more. I also used ChatGPT to generate a number of examples I could test.

And when testing those examples in Originality.ai’s detection tool, it picked up the work as AI every time. Again, I was extremely impressed with the solution.

For example, here was a short essay based on GPT 3.5:

Originality.ai AI content detection tool

And here was a how-to containing several paragraphs and then a bulleted list of steps. I also checked for plagiarism:

Originality.ai AI and plagiarism detection tool

It’s not a free tool, so you will need to sign up and pay for credits. That said, it’s been a solid solution based on my testing. Note, they are providing a coupon code (BeOriginal) that gets you 50% off your first 2000 credits. One credit scans 100 words according to the site.

Originality.ai Chrome Extension:
I mentioned earlier that Originality.ai has both a Chrome extension and an API. The Chrome extension enables you to highlight text on a page in Chrome and quickly check to see if it was written by AI. You must log in and use the credits you have purchased, so it’s not free. It works very well based on my testing so far.

For example, here is an article created via Automated Insights. By highlighting the article text, right clicking, and selecting Originality.ai in the menu, you can check to see if the content was created by AI.

AI content created via Automated Insights
Originality.ai Chrome extension detecting AI content

5. Content at Scale
Next up is an AI content detector tool from Content at Scale. Like Writer, they provide a platform for AI content generation that uses an interesting approach. You can read more about the platform on their site. But, like Writer, they also have an AI content detection tool. You can include up to 2,500 characters and the tool will analyze the text and determine if it’s AI content or human content. And like Originality.ai and Writer, it can detect GPT-3, 3.5, and ChatGPT.

For example, here is the tool detecting AI content generated by ChatGPT (a short essay):

Content at Scale's AI content detector detecting AI-generated content.

And here is the tool detecting content from one of Barry’s blog posts as human:

Content at Scale's AI content detector detecting human-generated content.

6. GPTZeroX
Next up is a new AI content detection tool created by a Princeton University student! And it’s causing quite the buzz. I’ve read a number of articles across major publications about Edward Tian and his tool called GPTZero, which works to detect if content was written by ChatGPT.

{Update: 2/1} GPTZeroX was just released and can highlight which parts of the text being tested are AI-generated. It’s more granular with its detection, which was a top feature request that Edward Tian heard from educators.

Beyond that, Edward explains that “GPTZeroX also supports larger text inputs, multiple .txt, word, and pdf file uploading, and lightning-fast processing speeds.” There is also an API now that can handle high-volume requests.

Here is GPTZeroX detecting a part of text as AI-generated:

GPTZeroX from Edward Tian

With GPTZero, Edward’s approach is interesting, since it uses “perplexity” and “burstiness” in writing to detect if a human or AI wrote the content. “Perplexity” aims to measure the complexity of the content being tested, or what Edward explains as the “randomness of text”. And “burstiness” aims to measure the uniformity of the sentences being tested. For example, Edward explains that “human written language exhibits non-common items appearing in random clusters.” Humans tend to write with more burstiness, while AI tends to be more consistent and uniform.
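
GPTZero’s exact scoring isn’t public, so take the following as a rough illustration only: a sketch that approximates “perplexity” per sentence with GPT-2 and treats “burstiness” as how much that perplexity varies from sentence to sentence:

```python
# GPTZero's exact scoring isn't public, so this is only a rough illustration of
# the two concepts: "perplexity" per sentence approximated with GPT-2, and
# "burstiness" as how much that perplexity varies across sentences.
# Assumes the transformers and torch packages are installed.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def perplexity(sentence: str) -> float:
    ids = tokenizer.encode(sentence, return_tensors="pt")
    with torch.no_grad():
        loss = model(ids, labels=ids).loss  # mean cross-entropy per token
    return float(torch.exp(loss))

def burstiness(sentences: list) -> float:
    """Variation in sentence-level perplexity; human writing tends to vary more."""
    ppls = torch.tensor([perplexity(s) for s in sentences])
    return float(ppls.std())

sentences = [
    "AI-generated text is often very uniform from sentence to sentence.",
    "Humans, though? We ramble, digress, and occasionally surprise the model.",
]
print([round(perplexity(s), 1) for s in sentences], round(burstiness(sentences), 1))
```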

I’ve been testing the tool over the past few days, and it has worked well (and has been pretty accurate). The site has definitely had some growing pains since launching (I’m sure Edward didn’t expect the tool to become so popular so quickly), but site performance has improved greatly recently. Also, the homepage now explains he is creating a “tailored solution for educators”. I’m eager to hear more about that, but for now, you can add GPTZero as yet another tool in your AI detection arsenal. I think you’ll like it.

For example, here is the tool measuring “perplexity” and “burstiness” of content (based on an essay written by ChatGPT):

And here is the final result accurately detecting AI content written by ChatGPT:

7. OpenAI’s AI Text Classifier
Well, this was an interesting development! OpenAI, the creator of ChatGPT, just released its own AI content detection tool. And as you would guess, it can detect when a piece of content was written by ChatGPT (like several other tools in my post). Based on my testing, it works well (when taking direct output from ChatGPT and testing it). Like other AI content detection tools, it’s not foolproof, but does seem to catch a number of examples of AI content that I tested.

Note, it requires a minimum of 1,000 characters of input and provides one of five responses:

  • Very unlikely to be AI-generated.
  • Unlikely to be AI-generated.
  • Unclear if it is AI written.
  • Possibly AI-generated.
  • Likely AI-generated.

Here is a quick example based on an essay I created via ChatGPT. As you can see, it’s accurately being detected as AI content:

OpenAI's content detector called AI Text Classifier accurately detecting AI content.

And here is an example of OpenAI’s tool accurately detecting a blog post of mine as human:

OpenAI's content detector called AI Text Classifier accurately detecting human content.

Summary: Although not foolproof, tools can be helpful for detecting AI content.
Again, I’ve received a ton of questions about which tools I’ve been using to detect lower-quality AI content, so I decided to write this quick post versus answering that question over and over. I hope you find these tools helpful in your own projects. And again, if you know of other tools that I should try out, feel free to ping me on Twitter!

GG

Filed Under: google, seo, tools

True Destination – Demystifying the confusing, but often accurate, true destination url for redirects in Google Search Console’s coverage reporting

November 3, 2022 By Glenn Gabe

If you are confused when Google reports redirects as other categories, like “blocked by robots.txt”, “soft 404s”, “noindexed”, “404s”, and others, it could be that Google is silently following the redirect and reporting the status of the true destination url instead. My post covers the situation in detail and provides examples of this happening in the wild.

While heavily analyzing websites from an SEO standpoint, you will undoubtedly find yourself deep in Google Search Console (GSC) reporting. GSC contains a boatload of data directly from Google and can help site owners and SEOs surface key insights. That said, it’s important to understand the nuances involved with GSC reporting, and how Google determines the information it provides in those reports. Having a clear understanding of what the data is showing is important when taking action to improve SEO.

And there’s no better example of GSC data confusion than the dreaded true destination url for redirects in GSC’s index coverage reporting (and URL inspection tool). I have received so many questions about this from clients that I decided to write this post so I can just point people here versus explaining it again and again.

So, join me on a GSC adventure where we uncover the secrets of the true destination url. Some of you might already know this, but I know some do not. And for those that don’t, this will all make sense very soon. You might not be happy with how this is working, but at least you’ll understand why urls are categorized in certain ways in GSC (and via the URL inspection tool).

What is the dreaded true destination url situation in GSC for redirects?
When viewing the indexing status in GSC of urls that are being redirected, Google reports on the true destination url (even if that url is outside of your own site). For example, if you redirect a url to another url, and that destination url is not indexable for some reason, GSC will silently follow the redirect and report on the final destination’s status. And that can be super confusing for site owners and SEOs that don’t know this is happening.

Yes, that means you can see urls showing up as “blocked by robots.txt”, “noindexed”, “soft 404”, “404”, and more (when the url you are inspecting is actually redirecting). As you can imagine, many site owners are left confused when they see “blocked by robots.txt” when they know 100% that a url is redirecting.
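
One practical tip: before assuming GSC is wrong, you can follow the redirect chain yourself and check the final destination. Here is a quick sketch using Python’s requests library (it’s obviously not Google’s crawler, and the url at the bottom is just a placeholder):

```python
# A quick way to see what Google is likely reporting on: follow the redirect
# chain yourself and check the final destination's status, robots.txt rules,
# and noindex directives. This is just a sketch; the url at the bottom is hypothetical.
import re
import requests
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def inspect_redirect(url: str) -> None:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    for hop in resp.history:
        print(f"{hop.status_code} {hop.url} -> {hop.headers.get('Location')}")
    print(f"Final destination: {resp.url} ({resp.status_code})")

    # Is the final destination disallowed by robots.txt?
    parsed = urlparse(resp.url)
    rp = RobotFileParser(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    rp.read()
    print("Blocked by robots.txt:", not rp.can_fetch("Googlebot", resp.url))

    # Does the final destination carry a noindex directive? (crude check)
    noindex = ("noindex" in resp.headers.get("X-Robots-Tag", "").lower()
               or bool(re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', resp.text, re.I)))
    print("Noindex detected:", noindex)

inspect_redirect("https://www.example.com/old-page")
```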

Google’s John Mueller has been asked about this many times, and he has replied with what I explained above (and does admit it can be a bit confusing). Also, Barry wrote a post covering how this happens with the URL inspection tool based on John’s comments. Even though this has been documented, I find it’s still a very confusing situation for many site owners and SEOs (which is why I’m writing this post).

Here is a tweet of mine with a link to John explaining how Google silently follows redirects (and how that shows up in GSC):

Right, that's why I said "reminder". :) John has explained this before in webmaster hangout videos. For example, here is one from 2019 where he explains how Google silently follows redirects for the URL inspection tool (and it's what shows up in Coverage): https://t.co/XG0aGNPOSW

— Glenn Gabe (@glenngabe) January 5, 2021

Now that you know this is happening, you might be wondering what this actually looks like in GSC. I’ll cover that next with examples of this happening in the wild.

Examples of Google silently following redirects and reporting the true destination url status in GSC:
Below, I’ll provide examples with screenshots of Google reporting on the true destination urls versus the redirect. Again, this is when the final destination urls are not indexable for some reason.

Blocked by robots.txt:
The url is redirected outside the site to a url that is blocked by robots.txt. Google reports the redirecting url as being “blocked by robots.txt” since the final destination is actually disallowed.

A twist on blocked by robots.txt:
This url redirects first to a tracking url, which is blocked by robots.txt. The final destination is not blocked, but Google can’t crawl that tracking url to follow the rest of the chain since it’s disallowed. It just knows the first hop in the chain is blocked and reports that in GSC. Below, you can see the second step shows the url is actually blocked by robots.txt (and that’s what is reported in GSC).

Soft 404:
The url redirects to a page that’s a soft 404 (a product is unavailable). Google reports that the redirecting url is a soft 404 (since the true destination url is being seen as a soft 404).

Here is the page the url redirects to (with the product “currently unavailable”). Hence the soft 404:

Noindexed:
Yep, you guessed it. The url redirects to a page that’s noindexed. Google reports the url that is redirecting as noindexed in the coverage reporting:

Crawled, not indexed:
At first glance, you might think the redirect is being reported as “Crawled, not indexed”. Not true! It’s the final destination url that’s not being indexed by Google. Google is reporting “Crawled, not indexed” for the true destination url.

The final destination url is indeed not indexed:

404:
How can Google see a redirect as a 404? It doesn’t. It’s the true destination url that 404s and that’s what is reported in GSC.

404 with domain name change:
This is just a variation on the 404 situation to explain how this works when changing domain names. The url on the old domain redirects to a url on the new domain name, but the url was never migrated (it 404s). So Google reports that the redirecting url is a 404.

Sorry, more confusion with redirects:
When a url redirects to a page that resolves with a 200 header response code, and is indexed, the URL inspection tool reports accurately about the redirect (and says that initial url is a redirect and not indexed), but Google shows the canonical as the true destination url (where the redirect leads to). Talk about confusing, especially based on everything I explained above with the other examples where the redirecting urls are being reported as something different than a redirect…

A possible solution in GSC to clear up the confusion:
So, how can this be more intuitive? I think if GSC actually provided a message that it’s reporting on the true destination url, it could clear up the confusion for site owners and SEOs. Below, I have mocked up what this can look like in GSC. If Daniel Waisberg is reading (and I hope you are), then please add this!

Summary: Clearing up the confusion with redirects and destination url reporting.
I hope this post helped you understand how Google is silently following redirects and reporting on the true destination urls in GSC. I know it’s a confusing topic for many site owners and SEOs and I’m sure it has led to many head-scratching moments. Just keep in mind that as of now, GSC is reporting on the true destination urls when a url redirects. So don’t be surprised when you notice redirects in other categories in GSC’s coverage reporting (or when using the url inspection tool). And who knows, maybe the GSC product team will implement that message I mocked up above…

GG

Filed Under: google, seo, tools
