The Internet Marketing Driver


How to extend a multi-site indexing monitoring system to compare Google-selected and user-selected canonical urls (via the URL Inspection API and Analytics Edge)

March 16, 2022 By Glenn Gabe

Last month I published an article on Search Engine Land explaining how to use the new URL inspection API to build a multi-site indexing monitoring system. By using Analytics Edge in Excel with the new URL Inspection API from Google, you can check the indexing status for the most important urls across multiple sites on a regular basis (and all by just clicking a button in Excel). It’s a great approach and can help you nip indexing problems in the bud. Remember, if your pages aren’t indexed, they clearly can’t rank. So monitoring indexing is super important for site owners and SEOs.

After I published the article, it was great to see people in the industry test out this approach, and I’ve heard from quite a few that they use it on a regular basis. That’s outstanding, but I think systems like what I originally built can always be enhanced… As I was using the system to check indexing levels across various client sites, I came up with a simple, but powerful, idea for extending the system. And it relates to canonicalization.

First, it’s important to understand that rel canonical is just a hint for Google. I’ve covered that before in case studies, other blog posts, and heavily on Twitter over the years. Google can definitely ignore what site owners include as the canonical url and choose a different url (based on a number of factors). And when Google selects a different url as the canonical, you definitely want to know about that. That’s because the url being canonicalized will not be indexed (and won’t rank in the search results). This can be fine, or not fine, depending on the situation. But you definitely want to dig in to see why Google is choosing a different canonical than the one you selected.

Luckily, the URL Inspection API returns both the user-selected canonical and the Google-selected canonical when inspecting urls. So, via some Analytics Edge magic, we can compare the two columns returned by the API as the urls are being processed, and flag that in our worksheets. It’s just another level of insight that can help you address indexing problems across the sites you are monitoring.
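If you prefer scripting to Excel, the same comparison is easy to sketch in Python with Google’s API client. The property URL, the url list, and the service-account key file below are placeholders, and the snippet assumes the account has been granted access to the GSC property. It’s a minimal sketch of the idea rather than a replacement for the Analytics Edge setup.

```python
# Minimal sketch (not the Analytics Edge macro): compare the Google-selected
# and user-selected canonicals via the URL Inspection API in Python.
# SITE_URL, URLS, and the key file are placeholders; the service account
# must be added as a user on the GSC property for this to work.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"           # hypothetical GSC property
URLS = ["https://www.example.com/page-a/"]      # the urls you monitor

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

for url in URLS:
    body = {"inspectionUrl": url, "siteUrl": SITE_URL}
    result = service.urlInspection().index().inspect(body=body).execute()
    status = result.get("inspectionResult", {}).get("indexStatusResult", {})
    google_canonical = status.get("googleCanonical")
    user_canonical = status.get("userCanonical")
    different = "No" if google_canonical == user_canonical else "Yes"
    print(url, status.get("coverageState"), "Different Canonical:", different)
```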

What we are going to achieve: Comparing canonicals via the URL Inspection API.
As I explained above, we are going to add another step in the indexing monitoring system to compare the user-selected canonical with the Google-selected canonical. And we are going to dynamically create a new column in each worksheet that lets us know if there is a difference between the two.

And as a quick reminder, we will be doing this across all sites that are included in our indexing monitoring system (which can span as many GSC properties as you want). If you followed my original tutorial, then you can easily add this additional step in your system to check canonicalization across your top urls. And if you didn’t already set up an indexing monitoring system, then I would do that first and then come back to add this step.

With that out of the way, let’s enhance our system!

How to extend an indexing monitoring system by comparing Google-selected and user-selected canonicals:

1. Set up the foundational indexing monitoring system:
First, follow my original tutorial for setting up the indexing monitoring system. Once you have that up and running, we are going to add an additional step for comparing the user-selected and Google-selected canonical urls. And then we’ll dynamically create a new column in each worksheet called “Different Canonical” that flags if they are different.

2. Add a step to the macro in Analytics Edge:
In order to add another step to our macro in Analytics Edge, you simply run the macro to the point where the new instruction will be added and then add the new functionality. You can accomplish that via the “Step” button in the task pane. First, open your spreadsheet, click the Analytics Edge tab, and open the task pane (which holds your macros).

3. “Step” to your desired location in the macro:
Click the instruction in the task pane BEFORE where you want to add the new function. Since we are going to compare data after the API returns results, we will add our new function after the “Index Inspection” step in our macro. So click “Index Inspection” in the task pane and then click the step button (which is located next to the run button). After the macro executes to that point, you can add additional functionality to the macro. For our purposes, we are going to add a Formula function that will compare columns after the API returns results for each url.

Note, this will only run the macro that’s showing in the task pane. It will not refresh ALL macros in the spreadsheet. So if you are monitoring several sites, and each site has its own macro, then those will need to be updated separately. I’ll cover how to do that later in the tutorial.

4. Add a new formula for comparing canonicals:
Once the macro runs to the point we indicated in the previous step, Analytics Edge will stop running the macro. And then you can add the new function for comparing the Google-selected and user-selected canonical urls. To do that, click the Analytics Edge tab, and then click the Column dropdown, and select “Formula” from the dropdown list.

5. Add the conditional statement in the formula dialog box:
In the formula window, enter a name for the new column you want to add based on the formula we will create. You can use “Different Canonical” for this tutorial. Next, select where the column should be added in our worksheet. I want to put the new column right after the userCanonical column in the worksheet (which makes the most sense in my opinion). And finally, we are going to add a conditional statement which checks to see if the Google-selected canonical equals the user-selected canonical. If it does, we’ll add “No” to the “Different Canonical” column, and if it’s different we’ll add “Yes”. Here is the formula that accomplishes this task. Simply copy and paste it into the “Enter Formula” text box:

=if([indexStatusResult/googleCanonical]=[indexStatusResult/userCanonical],"No","Yes")

Then click OK to apply the formula to the data that the API returned in the previous step. And then click the step button in the Analytics Edge task pane to execute the final step in our macro, which is to write the results to a worksheet.

6. Check Your Results!
You can check the worksheet with the results to see the data. You should have a new column named “Different Canonical” that contains a “Yes” or “No” based on whether the Google-selected canonical is different from the user-selected canonical.

7. Copy and paste the new formula to each macro in your spreadsheet.
Congratulations, you just extended your multi-site indexing monitoring system to check for canonical differences! Now apply the same formula to all of the worksheets you created in your spreadsheet (if you are checking more than one website or GSC property). The great news is that Analytics Edge has copy and paste functionality for macros (and for specific steps in your macros).

Just highlight the new formula you created in the task pane, click the copy button, select the macro you want to copy the formula to, click the step before where you want to add the formula, and then click paste in the task pane. Boom, you just copied the formula to another macro.

8. Check indexing and canonicalization all in one shot.
And that’s it. Your monitoring system will now check the indexing status of each url, while also detecting if the Google-selected canonical is different than the user-selected canonical. And as a reminder, all you have to do is click “Refresh All” in Analytics Edge to run all macros (which will check all of the GSC properties you are monitoring).

Important Reminder: The system is only as good (and accurate) as Google’s URL inspection system…
One thing I wanted to point out is that the indexing monitoring system is only as good as the data from Google’s URL inspection tool. And unfortunately, I’ve seen that data be off at times during my testing. For example, it might say a url is indexed when it’s not (or vice versa). So just keep in mind that the system isn’t foolproof… it can be inaccurate sometimes.

Summary – Continuing to improve the indexing monitoring system.
With this latest addition to the multi-site indexing monitoring system, we can now automatically check whether the Google-selected canonical is different than the user-selected canonical (which is a situation you definitely would want to dig into for urls not being indexed). Moving forward, I’ll continue to look for ways to improve the indexing monitoring system. If you decide to follow my set of tutorials for setting this up, definitely let me know if you have any questions or if you run into any issues. You can ping me on Twitter as you set up the system.

GG

Filed Under: google, seo, tools

The Link Authority Gap – How To Compare The Most Authoritative Links Between Websites Using Majestic Solo Links, Semrush Backlink Gap, and ahrefs Link Intersect

November 11, 2021 By Glenn Gabe

Link Authority Gap

While helping companies that have been heavily impacted by Google’s broad core updates, the topic of “authority” comes up often. And that’s especially the case for companies that focus on “Your Money or Your Life” (YMYL) topics. For example, sites that focus on health, medical, financial, etc. all fall under YMYL.

If you’ve read my posts about broad core updates, then you know that Google is taking many factors into account when evaluating sites (and over a long period of time). It’s one of the reasons I recommend taking the “kitchen sink” approach to remediation where you surface all potential issues impacting a site quality-wise, and work to fix them all (or at least as many as you can). For example, improving content quality, user experience, technical SEO problems causing quality problems, the site’s advertising setup, affiliate setup, and more.

And for sites that cover YMYL topics, it’s important to understand that they are held to a higher standard. Google wants to surface the most authoritative sites for sensitive queries (for topics that can impact the health, happiness, financial status, etc. of a user). And that’s one area where authority comes into play.

“Authority” can be a nebulous topic, so I’ll just focus on one aspect for this post – links. Danny Sullivan (pre-Google) actually wrote a good post about the subject where he interviewed Paul Haahr, distinguished engineer at Google. In that post, Danny explained how a mixture of factors is used to measure “authority”, with PageRank being the “original authority metric” (links from across the web). So links are not the only signal in the bucket Google is using to evaluate authority, but they are an important one. By the way, Paul is one of the smartest guys at Google from a ranking perspective. I’ve always said that when Paul talks, SEOs should listen. :)

PageRank, the original authority metric.
Google calculating authority.

In addition, Google has explained that PageRank, or links from across the web, is one of the most well-known factors when evaluating authority. That’s from a whitepaper Google published about fighting disinformation across the web.

Google, PageRank, and E-A-T

Google also explained in the whitepaper that when it detects a YMYL query, then it gives more weight in its ranking system to factors like authoritativeness, expertise, and trust. So those links and mentions are even more important for YMYL queries/topics.

Google, PageRank, E-A-T, and YMYL

Going even further, Google’s Gary Illyes explained at Pubcon in 2018 that E-A-T is largely based on links and mentions from authoritative sites. Here is a tweet from Marie Haynes after Gary explained this at Pubcon:

I asked Gary about E-A-T. He said it's largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that's good.

He recommended reading the sections in the QRG on E-A-T as it outlines things well.@methode #Pubcon

— Marie Haynes (@Marie_Haynes) February 21, 2018

But there’s more…. In 2020 Bill Slawski covered a super-interesting patent that discusses expertise and how Google might be categorizing sites to determine which set of sites should even be considered for ranking (for certain types of queries). By the way, that sure sounded a lot like what happened with the Medic Update in August of 2018 (when many health and medical sites saw extreme volatility). As part of that patent, E-A-T was mentioned with a focus on “authority” (which we know includes links and mentions from authoritative sites).

As you can see, links are important and can provide a strong signal to Google about the authoritativeness of a page and website. And as Gary Illyes once explained, that includes mentions on well-known sites (and not just links).

A Link Authority Gap Analysis: Comparing The Most Powerful Links Between Sites
Based on what I explained above, it can be important to understand where powerful links and mentions are coming from (both to your site and to your competitors). And that’s especially the case if you focus on YMYL topics. Therefore, I find it’s always interesting to compare link profiles, understand the differences in links from authoritative sites, and identify gaps in your own link profile in order to have a solid understanding of your current state.

The tough part for site owners is understanding that there’s no quick fix to a situation where your competitors have a clearly stronger link profile from an authority standpoint. You can’t just go out the next day and gain links from CNN, the CDC, The New York Times, and other authoritative sites. Note, as I explained earlier, rankings (and “authority”) are not just about links, so this isn’t a binary rank-or-not-rank situation. But again, the right links can be a strong signal to Google about the authoritativeness of a piece of content, or a site overall.

For example, it’s not easy to earn links from the following domains…

Majestic Solo Links

Also, from a link earning perspective, you typically need to build a strong content strategy, social strategy, and possibly digital PR strategy to help earn links naturally over time. And that takes time… it’s not something you can do quickly (if done naturally). And if you find a large gap from a link authority standpoint, it’s definitely not a reason to spam every news publisher in the world to try and gain links. That typically won’t work well and could backfire big-time.

Peeling back a link profile to reveal “link authority gaps”:
In this post, I’m going to explain how to quickly surface the most authoritative links pointing to a competitor when those same sites don’t link to your own site. In other words, finding the authority gap between sites. The post isn’t about doing a deep dive into a link profile, which can also be beneficial. Instead, it’s focused on quickly identifying important gaps from an authority standpoint.

Again, this is a nuanced topic and just because one site has links from X domain doesn’t mean that’s why it is ranking above you. But, the link authority gap can give you a pretty good view of why a site may be seen as more authoritative than another. In addition, you can learn a lot about the type of content earning those links and possibly identify content gaps on your own site.

Below, I’ll cover how to run a quick link comparison using three tools:

  • Majestic Solo Links
  • Semrush Backlink Gap
  • ahrefs Link Intersect tool

Solo Links in Majestic:
I love Solo Links and it has definitely flown under the radar for a really long time. It compares the top 500 referring domains for two different websites (ordered by Majestic’s Trust Flow metric). I have found it to be a strong way to surface the most powerful domains linking to one site, and not another. And no, Google doesn’t use Trust Flow or any third-party metrics from the various tools. It’s just Majestic’s internal metric for estimating quality based on being closely linked to “trusted seed sites”.

Just enter the two domains you want to compare and click the magnifying glass. You’ll see a Venn diagram representing the overlap, and the gaps, between both sites. Below the Venn diagram, you’ll see the referring domains listed along with the number of links, and you have the ability to quickly view the links. For example, if you click the link count, you will see the actual links along with supporting information (like anchor text, if it’s nofollow, when it was first seen, etc.) It’s not perfect (no tool is), but it can sure get you moving in the right direction when trying to determine the link authority gap.
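And if you want to slice the exported data outside of Majestic, the core of the comparison is just a set difference between two lists of referring domains. Here’s a hedged sketch in Python; the CSV file names and the “Domain” / “Trust Flow” column names are assumptions, so adjust them to match whatever your exports actually contain.

```python
# Hedged sketch: compute a domain-level "link authority gap" from two
# referring-domain CSV exports. Column names below are assumptions.
import csv

def load_domains(path, domain_col="Domain", metric_col="Trust Flow"):
    """Return {referring_domain: metric} from a CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        return {
            row[domain_col].lower(): float(row.get(metric_col) or 0)
            for row in csv.DictReader(f)
        }

competitor = load_domains("competitor-referring-domains.csv")  # hypothetical file
your_site = load_domains("your-site-referring-domains.csv")    # hypothetical file

# Domains linking to the competitor but not to you, strongest first.
gap = {domain: metric for domain, metric in competitor.items() if domain not in your_site}
for domain, metric in sorted(gap.items(), key=lambda item: item[1], reverse=True)[:50]:
    print(f"{metric:>6.1f}  {domain}")
```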

For example, here’s a gap analysis for a powerful site in the health niche compared to a smaller, less authoritative site. You can clearly see the differences in the Venn diagram and then in the referring domains list.

Solo links for health sites venn diagram
Solo links for health sites, referring domains

A quick note to Majestic:
Please add the ability to export all of the links in bulk and not just the domains. It would be great to simply click “export” and get all of that data in one shot. It’s a bit cumbersome now to get that data out of Solo Links since you can only export links for each domain and not all of the domains listed at one time…

Exporting in Majestic Solo Links

Backlink Gap in Semrush:
Similar to Solo Links, Semrush’s Backlink Gap tool enables you to enter up to five domains to start comparing link profiles. It doesn’t just focus on the top links, but the data is presented based on Semrush’s Authority Rank metric (AR). And just to reiterate, Google does NOT use AR or any other third-party metric when determining rankings… It’s just a Semrush-specific metric.

After entering at least two domains, you have options for viewing the links pointing to the competition, but not your site, or vice versa. Again, it’s not perfect, but can help you start to understand powerful links leading to your competitors that you don’t have.

Backlink Gap in Semrush

Note, the default view is to list the referring domains (without showing the links). To view the actual links leading to your competitor’s site, you can click the arrow to the right of the referring domain in the chart. It’s not the most intuitive way to find the links, but they are there. :)

Viewing links in Semrush Backlink Gap

Link Intersect in ahrefs:
Another tool that has flown somewhat under the radar is the ahrefs Link Intersect tool. It enables you to enter up to four competitor domains and view gaps in your own link profile (ordered by Domain Rank, a metric created by ahrefs). And once again, Google does not use any third-party metrics like Domain Rank, Authority Rank, Domain Authority, etc.

Like the other tools I mentioned, the link intersect tool can shed some light on authoritative links pointing to the competition that you don’t have. For example, what are those sites linking to content-wise, do you have content like that on your own site, what can you build that’s even better than that, etc.?

To use the tool, simply enter a competitor’s domain (or several) and then your domain in the “but doesn’t link to” field. The tool will find links leading to the competition, but not to your site. The list of referring domains will be ordered by ahrefs Domain Rank. In order to view the actual links per referring domain, just click the number in the column for each competitor. You can see information about that link, including the anchor text, the url that’s linking to the site, whether the link is nofollow, if the link was found in the content, etc.

ahrefs Link Intersect Tool
Referring Domains in ahrefs Link Intersect Tool
Exporting links in ahrefs Link Intersect Tool

You can also export the list, but just like Majestic, you can’t export all of the links in one shot; you can only export the referring domains that way. So you’ll have to export links per referring domain, which can be tedious. And just like my note to Majestic, I have the same request for ahrefs: please add the ability to export all links from the Link Intersect tool in one shot! That would be a great addition.

Exporting domains in ahrefs Link Intersect Tool

Tips and recommendations when evaluating the link authority gap:

  • “Authority” is not binary: Links (PageRank) are just one factor. Sure, an important factor, but just one. Google has explained it uses a bucket of signals to determine authority. While you are analyzing the link authority gap, just understand that the links alone might not be the reason a competitor is outranking you. It’s great to see the gap, and it can be important for forming a long-term strategy to improve authority-wise, but it’s not a rank-or-not-rank situation.
  • Understand the Search competition versus just the competition: Make sure you are comparing to your actual competitors in Search (the sites you are competing with in the search results) and not just the sites you think are the top competition (outside of Search). Your goal here is to understand the link authority gap between your site and the sites outranking you for target queries.
  • Mine the gaps, create a plan: Identify gaps in your link profile from authoritative sites, understand the content that gained those links, determine if you have content covering those topics, and figure out ways to produce killer content that can earn links like that (or even better). Don’t just try to match what your competitors are doing. Try to outdo them.
  • Have patience… building authority is a long-term process: Understand that earning powerful links from authoritative sites is a long-term process. You cannot change the situation overnight… That might be frustrating, but it’s true. Understand what the competition is doing to earn those links, figure out gaps on your site, and build a killer content strategy to earn amazing links over the long-term. Once you start earning those links, you’ll appreciate that it takes time, since it can build a natural layer of protection for you (until others start going through the same process, put the work in, etc.)

Summary – Understanding the link authority gap can be enlightening, but you must execute.
By using the process I documented in this post, you can quickly start to understand the most authoritative sites linking to your competition that don’t link to your site. And that “link authority gap” can be important. But don’t stop once you uncover the gap. You need to execute based on your analysis in order to make an impact. So… what are you waiting for?? The data awaits.   

GG

Filed Under: google, seo, tools

How to identify ranking gaps in Google’s People Also Ask (PAA) SERP feature using Semrush

October 19, 2021 By Glenn Gabe

When performing a competitive analysis, it’s smart to run a keyword gap analysis to determine the queries that competitors rank for that your site doesn’t rank for. It can often yield low-hanging fruit that your content team can execute on. As part of that process, it’s also smart to analyze Google’s People Also Ask (PAA) SERP feature for queries your site already ranks for, or doesn’t rank for (to determine what those additional queries are and which sites are ranking for them). I find this step is skipped often for some reason and it can also yield powerful findings that you can start to execute on quickly.

In this post, I’m going to walk you through the process of identifying ranking gaps in People Also Ask (PAA) using Semrush, which provides outstanding functionality for mining PAA data.

What we’re going to accomplish:
For this tutorial, I’m going to filter queries leading to a site by top ten rankings and then layer a secondary filter for surfacing queries where a People Also Ask module also ranks (but the site doesn’t rank in the default PAA listings). In other words, you rank in the top ten, but you don’t have content ranking in PAA for those queries for some reason. I’ve found that can yield very interesting findings that sites can execute on pretty quickly.

For example, in the screenshot below, a site ranks in the top ten for 3,359 queries where it does not also rank in the default People Also Ask (PAA) module:

Viewing Google ranking gaps in people also ask via semrush.

Step-by-step instructions for identifying PAA gaps via Semrush:

1. First, fire up Semrush and enter the domain name you want to analyze.

Enter domain name in semrush.

2. Access the Organic Research reporting.
Click Organic Research in the left-side navigation, which will bring us to a powerful set of features for analyzing the search performance of the domain, subdomain, directory, or url you enter.

Viewing Organic Research reporting in Semrush.

3. View all rankings for the domain via the Positions tab.
Click the Positions tab, which will yield all queries that a site ranks for in the top 100 listings (based on Semrush data).

Viewing the positions tab in Organic Research in Semrush.

4. Filter by top ten results:
Next, we are going to filter the results by queries where the site ranks in the top ten (so these are queries where the site ranks very well already, but might not have content that ranks in People Also Ask). I’ll cover the second part of this step next, but start by filtering for queries ranking in the top ten by clicking the Positions filter and then selecting Top 10.

Filter by top ten rankings in Semrush.

5. Layer a secondary filter for PAA:
To complete this step, we want queries ranking in the top ten, but where the site doesn’t rank in People Also Ask (PAA), which can provide a great opportunity to fill gaps content-wise. To view these queries, click “SERP features”, then “Domain doesn’t rank in” and then select “People Also Ask”.

Add a filter for when a site doesn't rank in people also ask via Semrush.

6. Analyze the data:
Now that you have two filters selected, you will be presented with all of the queries where the site ranks in the top ten but doesn’t rank in the default PAA module. When you scan the list, definitely spot-check the live search results to see which questions are listed in People Also Ask, which sites are ranking there, the content ranking for those queries, etc. Again, you can identify content gaps, format gaps (like video), and more. It’s a quick way to help your content team identify opportunities (and some gaps you find might lead you to face-palm). :)

Final report showing ranking gaps in People Also Ask via Semrush.

7. Export the data:
You can always export the results if you want to use that data in other programs like Excel. You can export the data in Excel or CSV format.

Export ranking data in Semrush.
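Once you have the export, you can also prioritize the list with a few lines of pandas. This is a hedged sketch; the file name and the “Keyword”, “Position”, and “Search Volume” column names are assumptions based on a typical Semrush keyword export, so check them against your actual file.

```python
# Hedged sketch: prioritize the exported PAA-gap keywords by search volume.
# The file name and column names are assumptions; check your export headers.
import pandas as pd

df = pd.read_csv("semrush-paa-gap-export.csv")  # hypothetical export file

# Keep queries already ranking in the top ten, then sort by search volume so
# your content team can start with the biggest opportunities.
top_ten = df[df["Position"] <= 10].sort_values("Search Volume", ascending=False)
print(top_ten[["Keyword", "Position", "Search Volume"]].head(25).to_string(index=False))
```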

And there you have it. A quick and easy way to identify ranking gaps in PAA via Semrush. It only takes a few minutes to run, and you’ll have a boatload of PAA data to go through (where your site ranks, but not in PAA). By the way, if you’re looking for other posts I’ve written about Semrush tools, then check out The Link Authority Gap which shows you how to compare the most authoritative links between websites.

PAA Gaps – Final tips and recommendations:
Before I end the tutorial, here are some final notes and recommendations based on helping clients go through this process over time. Semrush is like a Swiss Army Knife for SEO research, so make sure you are getting the most out of the tool.

  • Live graphs – Remember that the graphs in Semrush are live graphs, so they change based on the filters you select. Therefore, you can see trending over time for ranking (or not ranking) in PAA when you already rank well in the organic search results. It’s a cool way to visually see your progress.
  • Advanced filtering – Use advanced filtering in Semrush to fine-tune your analysis. For example, combine filters like search volume, keywords, directory, subdomain, urls, etc. You can filter the data multiple ways in Semrush (and combine those filters for advanced reporting). Play around there… you might find some great combinations that surface important data.
  • PAA by country – Run this analysis by country! Just change the country you are analyzing in the reporting, and voila, you have a fresh set of queries where your site doesn’t rank in PAA.
  • By device – Be sure to check both mobile and desktop data. Similar to country, just select desktop versus mobile in the filters to see each resulting dataset. You might find some differences there.
  • Spot check the results – Make sure you are spot-checking the actual SERPs. PAA can obviously change (and Semrush isn’t always perfect), so make sure you really aren’t ranking in PAA for those queries. Then form a plan of attack once you identify the gaps.
  • PAA formats – Keep an eye on the format of content ranking in People Also Ask. As I mentioned earlier, video could be ranking there as well. Understand the types of content Google is ranking based on query and choose the right formats for the job.
  • View historical rankings – You can easily change the dates via Semrush! For example, you can look back in history and run this analysis for previous months. Have you improved, declined, or remained stable? How has Google changed with regard to PAA for those queries?

Summary: Identifying PAA Gaps Via Semrush can be powerful.
It’s hard to overlook People Also Ask when analyzing the SERPs, and the feature often contains important questions that people are searching for based on the original query. By using the process I detailed in this tutorial, you can surface and export queries where your site already ranks in the top ten search results, but doesn’t rank in PAA. In my opinion, it’s a great way to identify low-hanging fruit that your content team can dig into quickly. You never know, you might find some quick wins… or many of them. Have fun.

GG

Filed Under: google, seo, tools

Smart Delta Reports – How To Automate Exporting, Filtering, and Comparing Google Search Data Across Timeframes Via The Search Console API and Analytics Edge

April 12, 2021 By Glenn Gabe

How to automate a delta report via Analytics Edge in Excel.

In 2013, I wrote a post explaining how to create what I called a Panda Report, which enabled you to identify landing pages seeing the biggest drop during a major algorithm update. The post explained how to do this based on Google Analytics data, but you can definitely do the same thing with GSC data (and with queries in addition to landing pages).

Well, it’s 2021, the process I use has been enhanced, and I wanted to publish a new post explaining how to automate the process using Analytics Edge. First, since medieval Panda is now missing in the SEO trees, the report needed a new name. For the past several years, I’ve simply called it a Delta Report. That fits much better since you can use this approach to identify the change in impressions, clicks, or rankings based on any event impacting Search (like a broad core update, a site migration, website redesign, or any other situation causing volatility). By identifying the landing pages and/or queries seeing the biggest drop, you can often find glaring issues. It’s a great way to start digging into the data after a big drop or surge in rankings and traffic.

And beyond the name change, there are some great ways to bulk export data via the Search Console API now. Since the GSC UI limits exports to just one thousand rows of data per report, using the Search Console API is critically important for exporting all of your data. For example, I often use the API to mass-export landing pages and queries from GSC (going well beyond the one thousand row limit).
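For context, here is roughly what that bulk export looks like if you call the Search Console API directly instead of going through Analytics Edge. It’s a minimal sketch that assumes you already have an authorized API client and uses a hypothetical property URL; the API returns up to 25,000 rows per request, so you page through the data with startRow.

```python
# Minimal sketch: page through the Search Console API to export all landing
# pages (well beyond the 1,000-row UI limit). Assumes `service` is an
# authorized Search Console API client and that you have access to the property.
def export_all_pages(service, site_url, start_date, end_date):
    rows, start_row = [], 0
    while True:
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["page"],
            "rowLimit": 25000,       # API maximum per request
            "startRow": start_row,   # paginate until no rows come back
        }
        response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
        batch = response.get("rows", [])
        if not batch:
            break
        rows.extend(batch)
        start_row += len(batch)
    return rows

# Example call with a hypothetical property:
# pages = export_all_pages(service, "https://www.example.com/", "2021-03-01", "2021-03-28")
```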

A Delta Report is a Panda Report on Steroids – The Power of Analytics Edge and APIs
I have covered Analytics Edge a number of times before in blog posts about exporting data. It’s an add-on for Microsoft Excel that quickly and efficiently enables you to bulk export data via the Search Console API, Google Analytics API, and more. But that’s not all you can do with Analytics Edge. Beyond just the export, you can create macros that filter and organize the data to create advanced reports (so you can export, compare, filter, etc. and all in one shot).

For our purposes, that means exporting all queries or landing pages, comparing the data based on timeframe, filtering based on your site structure, important query types, etc., and then writing those exports to their own worksheets for further analysis. And this is all done in one shot based on the macro you create (build once, use often).
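Analytics Edge handles the comparison and filtering for you, but conceptually the “delta” part is just a merge of two exports with a difference column, plus a filter on the url path. Here’s a hedged pandas sketch of that idea, assuming you’ve already pulled the current and previous periods into two lists of API rows (for example, with the export function sketched above).

```python
# Hedged sketch of the "delta" logic: merge two periods of page-level data,
# compute the change in clicks, and filter by page type.
import pandas as pd

def to_frame(rows):
    """Flatten Search Analytics API rows ({'keys': [page], 'clicks': ...}) into a DataFrame."""
    return pd.DataFrame(
        [{"page": r["keys"][0], "clicks": r["clicks"], "impressions": r["impressions"]}
         for r in rows]
    )

def delta_report(current_rows, previous_rows, path_filter=None):
    """Merge two periods on page, compute click deltas, optionally filter by url pattern."""
    current = to_frame(current_rows)
    previous = to_frame(previous_rows)
    delta = current.merge(previous, on="page", how="outer",
                          suffixes=("_current", "_previous")).fillna(0)
    delta["clicks_delta"] = delta["clicks_current"] - delta["clicks_previous"]
    if path_filter:
        delta = delta[delta["page"].str.contains(path_filter, regex=False)]
    return delta.sort_values("clicks_delta")  # biggest drops first

# Example call (hypothetical): rows pulled for two periods with the export sketch above.
# print(delta_report(current_rows, previous_rows, path_filter="/category/").head(25))
```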

Excited? Below, I’ll walk you through the process of creating the macro via Analytics Edge. And while you go through this tutorial, you’ll probably think of 100 other things you can use Analytics Edge for while working with data. It’s basically a Swiss Army Knife for working with APIs.

How to export GSC data via the Search Console API, compare data across timeframes, filter the data by page or query type, and create separate worksheets, all in one shot:
First, there are some requirements. Although Analytics Edge isn’t free, it’s extremely cost-effective. The core add-in costs $99 per year and the GSC connector costs $50 per year. Also, the good news is that there is a 30-day free trial so you can walk through this tutorial and use the process for 30 days to see how it works for you.

So, for $150 per year, you can use Analytics Edge to your heart’s delight. If you are helping larger sites where the API is necessary to export all of your data, then Analytics Edge is a great way to go, and it definitely won’t break the bank.

How To Create A Delta Report: Step-by-step instructions

1. Download and Install Analytics Edge (Core Add-in):
You can download and run the Analytics Edge installer to quickly install the add-in. After installing the add-in, click the license button and accept the Terms of Use. Once you do, a 30-day free trial will start for the core add-in.

2. Install the GSC Connector from within Excel:
Now that the Core Add-in is installed, you need to add the GSC connector so you can work with the Search Console API. It’s very easy to install the connectors available in Analytics Edge. Simply click the License button in the Analytics Edge menu and then click the dropdown to add a new connector. Select Google Search and then click Add. Then click Install. Once the connector is installed, a 30-day trial will begin for that connector.

3. Connect Your Google Account:
In order to export data from your GSC properties, you first need to connect your Google Account associated with those properties. Once you set this up, you will not need to do this over and over. And you can add multiple Google Accounts if you have access to GSC properties across various accounts. Click the Google Search dropdown in the Analytics Edge menu and select Accounts. Click Add Account and walk through the process of quickly connecting your Google account. Once you do, you’ll be able to use the API to connect to any property you have access to.

4. Create Your Macro – Exporting All Landing Pages or Queries and Comparing Data Across Timeframes:
Let’s start creating our macro by exporting all landing pages and comparing data across timeframes. In a real-world situation, you would compare the timeframe after a major event (like a broad core update) to the previous timeframe to see the changes per landing page or query. But for this tutorial, we’ll keep it simple. Let’s just pull all landing pages for the last 28 days and compare to the previous timeframe. Once you get the hang of this, you can customize the report for each situation you encounter. Note, if you ever lose the Macro window, just click “Task Pane” in the Analytics Edge menu in Excel. It will show back up on the right side of the spreadsheet. To get started, click Analytics Edge in the main menu, then Google Search, and then Search Analytics. Name your macro DeltaReport (or whatever you want).

5. Choose The Account and GSC property:
Select the Google account you want to use and then select the GSC property in the site list.

6. Select Fields To Export:
In the available dimensions and metrics list, select page and click “Add” to export all landing pages from Google organic Web Search. Leave the selected metrics as-is (with clicks, impressions, ctr, and position all selected).

7. Leave Filters Tab As-Is, But Review The Settings:
For this tutorial, we’ll leave the filters tab as-is, but note the options you have here while exporting data from GSC. You can filter by page, query, country, device type, search type, and search appearance. You’ll notice the default search type is Web Search. That’s what we want for this specific report, so keep the default settings.

8. Select A Date Range – Comparing Data Across Timeframes Made Easy:
Depending on your situation, select the appropriate timeframe for exporting data. For this tutorial, let’s pull the last 28 days of data and compare to the previous timeframe. Click the Dates tab and simply use the dropdown to select Last 28 Days. To compare timeframes, make sure to select the checkbox for “Compare to” and select a timeframe to compare with. To keep things simple, we’ll just select “Previous period”.

9. Sort By Clicks Or Impressions:
Under the Sort/Count tab, use the dropdown to select a metric to sort the data by. I typically choose clicks in descending order. Make sure to click the button labeled “Descending” to apply the sort preference.

10. Run The Query!
Click OK in the bottom right corner of the wizard to run the query. Depending on how much data needs to be exported, it can take a few seconds (or longer). Once the query has completed, you will see the results highlighted in green. Note, this does NOT show the full results from the export. Analytics Edge just shows a sample of the results and is waiting for more input from you (either to write the full data to a worksheet now or to use the built-in functions to create more advanced reports).

11. Set The Table Name:
Since we’ll be filtering the data we just exported multiple times (by page type), we need to set the table name so we can come back to the full data in future steps. To do this, click the Table dropdown in the Analytics Edge menu and select Table Name. Set the table name to something like “allpages” and click OK. Again, we’ll need this in the future.

12. Write To Worksheet:
Let’s complete the export by writing the data to a new worksheet. In order to do this, click the File dropdown in the Analytics Edge menu and then select Write to Worksheet. Name your worksheet something like “Landing Pages All Data” and then click OK. Analytics Edge will create a new worksheet containing the full export from GSC. Just click the new worksheet to view all of the data. You’ll notice all of the landing pages were exported with columns showing the difference in clicks, impressions, CTR, and position based on comparing the last 28 days to the previous timeframe. Awesome, right? But we’re not done yet. Our macro will be smarter than that. :)

13. Start Filtering Your Data:
Our goal is to create separate worksheets by page type so you can easily analyze each one separately. To keep things simple, let’s say we wanted to break out category pages (/category/), product pages (/products/), and blog posts (/blog/). Let’s start with category pages. Click the Table dropdown in the Analytics Edge menu and select “Filter”. This menu will enable you to filter the data by any column in the active table. The active table now is “allpages”, which we set up in step 11. Once you click “Filter” in the menu, you can set the filter rules. The column should say “A page”, which will enable you to filter by the column in our active table titled “page”. For Criteria, you have several helpful options, including regex. Yes, you can use regex here if needed, which is awesome. To filter by category pages which contain /category/ in the url, select “Contains” and then enter /category/ in the Value text box. Then click the Add button. Note, you can combine rules here if you want to create more complex filtering options. Click OK to filter the active table. You will see the results again highlighted in green. We’ll write the filtered data to a worksheet in the next step.
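If you do lean on the regex option, it helps to sanity-check the pattern before dropping it into the filter. Here’s a quick hedged sketch (the urls are hypothetical) showing a single pattern that would match both category and product urls in one rule:

```python
# Hedged sketch: test a regex locally before using it as a filter value.
# This single pattern would match both category and product urls.
import re

pattern = re.compile(r"/(category|products)/")
for url in [
    "https://www.example.com/category/widgets/",
    "https://www.example.com/products/widget-1/",
    "https://www.example.com/blog/some-post/",
]:
    print(url, "->", "match" if pattern.search(url) else "no match")
```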

14. Write The Filtered Data To A Worksheet:
Just like we did before, let’s write the filtered category data to a new worksheet. Click the File dropdown in the Analytics Edge menu and select “Write to Worksheet”. Then name the new worksheet “Landing Pages Category” and click OK. A new worksheet will be created with all of the category page data. At this point, you should have two worksheets, one containing all landing page data and another just containing category page data.

15. Switch The Table Back In Order To Filter Again:
Now we want to filter the full data again for product pages, which contain /product/ in the url. In order to do that, we need to switch the active table back to “allpages”, which contains all of our exported data and not just the filtered category data. If we don’t switch the table name again, then Analytics Edge will use the current active table, which is the category page data. In order to switch the table, click the Table dropdown in the Analytics Edge menu and click Table Name. Click the second radio button to switch to a previously named table and select “allpages”. Now that becomes the active table and we can filter it again.

16. Filter Product Pages:
We’ll use the same approach that we did when filtering the category pages, but this time, we’ll filter by urls containing /products/. After switching the table name, select Table again in the Analytics Edge menu and then Filter. Now enter /products/ in the value field for page. Then click OK. The data will now be filtered by any url with /products/ in it.

17. Write Filtered Data To Worksheet:
Next, we need to write the product filtered data to a new worksheet. Click the File dropdown in the Analytics Edge menu and select “Write to Worksheet”. Name the worksheet “Landing Pages Products” and click OK. You will now have a new worksheet with the filtered data. And now you should have three worksheets containing landing page data (full data, category pages, and product pages).

18. Rinse and Repeat For Blog URLs:
Use the same approach to export all data filtered by blog urls (containing /blog/ in the url). First, switch the table name back to “allpages” (see step 15 for how to do this). Then filter the data by any url containing /blog/, and then write to a new worksheet called “Landing Pages Blog”. When you’re done, you should have four worksheets in total, with three that contain filtered data (one for category urls, one for product urls, and one for blog urls). And Analytics Edge already took care of comparing data across timeframes and provided difference columns in the worksheets.

Congratulations! You have exported all of your landing pages from GSC, compared data across timeframes, filtered by page type, and then created specific worksheets containing the filtered data. Oh, and now you have a macro using Analytics Edge that you can reuse whenever you want to accomplish a similar task in the future. Just reopen the spreadsheet, save a new file, edit the macro to change the settings like GSC property, click “Refresh All” in the upper left corner of the Analytics Edge menu, and boom, you’re good to go. Time is valuable and this can save you a lot of it in the future…

Beyond The Delta Report: More Functionality = More Advanced Reporting
As I mentioned earlier, Analytics Edge comes with a ton of functionality built-in. You can create advanced reporting by using the various functions available in Analytics Edge when working with data exported from GSC, Google Analytics, and more. So, if you’re feeling ambitious, here are some other things you can try using Analytics Edge and the GSC Connector:

  • Run the same type of report, but for queries instead of landing pages. Then you can analyze drops or surges by query type instead of page type.
  • You can segment by search type to analyze drops and surges for Image Search, Video Search, or the News tab.
  • You can segment by search feature (like AMP, how-to, FAQs, Q&A, reviews, recipes, etc.) Note, you can follow my tutorial for exporting data by search appearance to learn more about that process.
  • And make sure to review all of the functions available in the Analytics Edge menu within the Multiple, Table, and Column dropdowns. For example, you can filter, group, pivot, sort, append, combine, compare, convert, split, and more. Again, Analytics Edge is like a Swiss Army Knife for APIs.

Summary – Automated Delta Reports are Panda Reports on Steroids
It’s always smart to analyze the top landing pages and/or queries when a site sees a big drop or surge in rankings and traffic from Google (due to an algorithm update, site migration, website redesign, etc.) Since the Performance reporting in the GSC UI limits exports to 1K rows, using a tool like Analytics Edge can help you quickly and efficiently export all of your data via the Search Console API.

In addition, Analytics Edge comes with a number of functions for filtering and working with your data to create advanced reports (including comparing data by timeframe). By following this tutorial, you can create a template for quickly exporting data, comparing data across timeframes, filtering by page or query type, and then writing the results to separate worksheets for further analysis. Once you get the hang of Analytics Edge, the sky’s the limit. I think you’ll dig it.

GG

Filed Under: google, seo, tools, web-analytics

How To Use GSC’s Crawl Stats Reporting To Analyze and Troubleshoot Site Moves (Domain Name Changes and URL Migrations)

March 2, 2021 By Glenn Gabe

For site migrations, I’ve always said that Murphy’s Law is real. “Anything that can go wrong, will go wrong.” You can prepare like crazy, think you have everything nailed down, only to see a migration go sideways once it launches.

That’s also why I believe that when something does go wrong (and it will), it’s super important to address those problems quickly and efficiently. If you can nip migration problems in the bud, you can avoid those problems becoming major issues that impact SEO. That’s why it’s important to prepare as much as you can, have all of the necessary intelligence in front of you while the migration goes live, and then move quickly to attack any problems that arise.

By the way, if you think you’re immune to site migration problems, then listen to the episode of Google’s Search Off The Record podcast where John Mueller, Gary Illyes, Cherry Prommawin, and Martin Splitt talk about the migration of the webmaster central site to the new search central site. It turns out they ran into several problems just like any other site owner could and had to move quickly to rectify those issues. So, if it can happen to Google, it sure can happen to you. :)

Adding Google’s Crawl Stats Reporting To Your Site Migration Checklist:
There are plenty of checklists and tools out there to help with site migrations. For example, Google’s testing tools in GSC, third-party crawlers like Screaming Frog, DeepCrawl, and Sitebulb, site monitoring tools, log file analysis tools, and more.

And on the topic of log files, they provide the quickest way to understand how Google is crawling your site post-migration. You don’t need to wait for data to populate in a tool, you don’t have to guess how Google is treating urls, redirects, etc., and there are several log file analysis tools hungry to consume your logs.

But there’s a catch… trying to get log files is like attempting to complete a mission as Tom Cruise in one of his great Mission Impossible movies. If gaining log files was a scene in Mission Impossible, I could hear Tom now:

“Wait, so we have to scuba dive under a bridge heavily guarded by troops, climb a 200 story building in our underwear (in 20 degree weather), use elaborate yoga moves to dodge a scattered laser security system, steal the ancient lamp of Mueller which is protected by special forces, hack into a computer system protected by six layers of ciphers, download the log files, and then parachute off the building back into the water, only to scuba dive back under the bridge to safety? No problem… hold my coffee.”

OK, it’s not that bad, but any SEO that has attempted to get log files from a client knows how frustrating that situation can be. They are huge files, seemingly not owned by one group or person in a company, and you can even find some companies not keeping logs for more than a few days (if that). So, it’s no easy feat to get a hold of them.

What’s an SEO to do?

Meet The New Crawl Stats Report in GSC: A (Pretty) Good Proxy For Log Files
In November of 2020 Google launched the new Crawl Stats reporting in GSC. The reporting is outstanding, and it was a huge improvement over the previous version. The new reporting provides a boatload of data based on Google crawling your site. I won’t go through all of the reports and data in this post, but you can check out the documentation to learn more about each of the report sections.

I’m going to cover what Google considers “site moves with url changes”, which covers domain name changes and url migrations. I’ll focus on domain name changes, but you can absolutely use the new Crawl Stats reporting to troubleshoot url migrations as well.

For domain name changes, you can view crawl stats reporting for both the domain you are moving to and the domain you are moving from. So, using the crawl stats reporting can supplement your current migration checks and enable you to see how Google is handling the migration at the source (the old domain).

And for url migrations, you can also surface problems that Google is experiencing post-migration. It’s not as clear as a domain name change, since you can’t isolate the crawl stats reporting by domain, but it can still help you surface issues based on bulk-changing urls.

Note: There is a delay in the Crawl Stats reporting.
The Crawl Stats reporting lags by a few days, so log files are still important if you want to see a real-time view of how Google is handling a site migration. The reporting updates daily, but lags by 3-4 days from what I have seen. Below, you can see the report was last updated on 2/26/21, but today is 3/2/21.

How to identify problems with domain name changes and url migrations using the Crawl Stats reporting in GSC:
As mentioned above, for domain name changes, you can analyze the Crawl Stats reporting for the domain name you are moving from, and the domain name you are moving to. Below, I’ll cover some of the ways you can use the reporting to surface potential issues.

How To Find The New Crawl Stats Report in Google Search Console (GSC):
First, I know there’s some confusion about where the new Crawl Stats report is located. You will not find the report in the left-side navigation in GSC. Instead, you first need to click “Settings”, find the Crawling section of the page which contains top-level crawl stats, and then click “Open Report” to view the full Crawl Stats reporting.

Now that you’ve found the Crawl Stats reporting, here are some of the things you can find when analyzing and troubleshooting a site migration.

404s and Broken Redirects:
The Crawl Stats reporting for the domain you are moving from will list urls that Google is crawling that end up as 404s. All urls during a domain name change should map to their equivalent url on the new domain (via 301 redirects). By analyzing the source domain name that’s part of the migration, you can view urls that Googlebot is coming across that end up as 404s. And that can help you find gaps in your 301 redirection plan.
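Alongside the Crawl Stats data, you can also spot-check the redirect map yourself with a short script. This is a hedged sketch that assumes you have a CSV of old-to-new url pairs (the file name and the “old_url” / “new_url” column names are placeholders); it simply verifies that each old url 301 redirects to the expected destination and that the destination returns a 200.

```python
# Hedged sketch: spot-check a 301 redirect map after a domain name change.
# Assumes a CSV with "old_url" and "new_url" columns (names are placeholders).
import csv
import requests

with open("redirect-map.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        old_url, expected = row["old_url"], row["new_url"]
        resp = requests.get(old_url, allow_redirects=True, timeout=10)
        first_hop = resp.history[0].status_code if resp.history else None
        ok = first_hop == 301 and resp.url == expected and resp.status_code == 200
        print("OK " if ok else "FIX", old_url, "->", resp.url,
              f"(first hop: {first_hop}, final: {resp.status_code})")
```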

For example, you can see the reporting for a site that went through a domain name change below. 4% of the crawl requests were ending up as 404s, even though most of those urls should be redirecting to urls that return a 200 header response code on the new domain.

If you click into that report, you can see a sample of the top 1,000 urls with that issue and you can inspect the urls as well:

And here is what it should look like. 100% of the requests are 301 redirecting to the equivalent urls on the new domain:

Important (and often confusing) note: It’s worth noting that GSC reports on the destination url, so a 404 showing up for the old domain name could actually be showing you a redirect to the new domain name, but that new url 404s. In other words, the 404 is actually on the new domain, but shows up in the reporting for the old domain name. That’s extremely important to understand overall with GSC, and it can cause confusion while analyzing the reporting. I tweeted about this in January with regard to the Coverage reporting:

Reminder: GSC reports on destination urls in the Coverage reporting. So if you see urls that are categorized as blocked by robots.txt or noindexed, but they aren't, they could be redirecting to urls that are. And that's what is reported. Can send you off on a wild goose chase: pic.twitter.com/QYSWUcVTc1

— Glenn Gabe (@glenngabe) January 5, 2021

Image Search: Googlebot Image
If image search is important for your business, you will definitely want to review the “By Googlebot type” section of the reporting. You will see a listing for “Image”. You can click into that reporting to see the urls Googlebot Image is crawling. If you see 404s, 5XX, etc., then make sure you jump on those issues quickly. You should see plenty of 301s if you redirected images properly during the migration (which you should). I covered that in my Mythbusting video with Google’s Martin Splitt about site migrations. The video can be seen later in this post.

As you can see below, Googlebot Image is coming across 404s as well. This is from the site that went through a domain name change.

This is what you should see. Notice how the Googlebot Image requests all properly 301 redirect to the images on the new domain:

Robots.txt issues:
In the host issues section, you can see if Googlebot is having problems accessing the robots.txt file for the domain(s) involved. If Google cannot successfully fetch the robots.txt file (meaning it doesn’t return a 200 or a 403/404/410), then it will not crawl the site at that time. Google will check back later to see if it can fetch the robots.txt file. If it can, then crawling will resume. You can read more details about how this is handled on Google’s support page (or in the screenshot below). Note, you can 404 a robots.txt file and that’s absolutely fine. This is about Google having problems fetching the file (i.e. Google seeing a 429 or 5XX).
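If you want to keep an eye on this yourself during a migration, a quick status check of the robots.txt files on both domains only takes a few lines. The hostnames below are placeholders; this is just a hedged sketch of the check.

```python
# Hedged sketch: check how robots.txt responds on both domains involved in the move.
# A 200 or a 403/404/410 is fine for crawling; 429 and 5XX responses are the
# ones that can temporarily stop Googlebot from crawling the site.
import requests

for host in ["https://www.old-domain.com", "https://www.new-domain.com"]:  # placeholders
    resp = requests.get(f"{host}/robots.txt", timeout=10)
    needs_attention = resp.status_code == 429 or resp.status_code >= 500
    print(host, resp.status_code, "investigate" if needs_attention else "ok")
```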

And here is what it can look like in GSC’s Crawl Stats reporting. Although this falls under an “acceptable fail rate”, I would sure check why the robots.txt fetch is failing at all:

Other host issues: DNS Resolution and Server Connectivity:
Along the same lines, you can see if there are other host-level issues going on. The host reporting also contains DNS resolution errors and server connectivity problems. You obviously want to make sure Google can successfully recognize your hostname and that it can connect to your site.

Performance Problems:
The reporting also will show pages that are timing out for some reason, so keep an eye on that report. You will find that in the “By response” section.

Subdomain Issues:
Hopefully you picked up all subdomains that were in use before you pulled the trigger on the domain name change. But if you didn’t, you can see Crawl Stats reporting per subdomain that Google is crawling. The catch is that you need a domain property set up in GSC for the domain you are moving from (unless you had those subdomains verified and set up already in GSC).

If you did, you could view the crawl stats reporting for those subdomains separately. Domain properties make this easier since all subdomains being crawled by Google (the top 20 over the past 90 days) will be shown in the Hosts report in the Crawl Stats reporting.

Below, you can see that the crawl stats reporting shows 17 different subdomains with crawl requests over the past 90 days.

Note, I always recommend having a domain property set up. It’s amazing how many companies have not done this yet… If you haven’t, I would do that today. It doesn’t take long to set up and it covers all protocols and subdomains.

Crawl Stats For Site Migrations: Final Tips And Recommendations
The Crawl Stats reporting can help site owners and SEOs get closer to log file analysis when gaining access to those logs might be tough. Although there’s a lag in the data populating (3-4 days), the Crawl Stats reporting can sure help surface problems during domain name changes and url migrations. And the quicker you can nip those problems in the bud, the less chance they become bigger issues SEO-wise.

Here are some final tips and recommendations:

  • Set up domain properties for each of the domains involved in the migration (if changing domain names). This will give you access to all subdomains in the Crawl Stats reporting.
  • Once data starts populating in the Crawl Stats reporting post-migration, dig into the domain you are moving from. You might see a number of issues there based on what I explained earlier. For example, 404s, performance issues, robots.txt problems, and more.
  • Nail the redirection plan. If you see gaps or problems with your 301 redirects, move quickly to rectify them and nip those problems in the bud.
  • Check for host-level problems (like robots.txt fetch issues, DNS resolution issues, and server connectivity problems). Your redirection plan doesn’t matter if Google can’t successfully connect to your site. A quick way to spot-check these is sketched after this list.
  • Look for pages that are timing out. This would show up in the “By response” section of the reporting. If you see that, dig into those problems to see why the pages are timing out. Again, move quickly to address performance issues.
  • Don’t forget your images! Make sure to 301 redirect your images, then check the section labeled “By Googlebot type” and click into the “Image” reporting to see how Googlebot Image is crawling your content.
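And for the host-level bullet above, here is a minimal Python sketch (the hostnames are hypothetical placeholders, and it assumes the requests library) that spot-checks DNS resolution and basic server connectivity for the domains involved in the migration:

import socket
import requests

# Hypothetical hostnames: the domain you moved from and the domain you moved to.
HOSTNAMES = ["old-domain.com", "www.new-domain.com"]

for host in HOSTNAMES:
    # DNS resolution check
    try:
        ip = socket.gethostbyname(host)
        print(f"DNS OK       {host} -> {ip}")
    except socket.gaierror:
        print(f"DNS FAILED   {host} did not resolve")
        continue

    # Basic server connectivity check
    try:
        response = requests.head(f"https://{host}/", timeout=10, allow_redirects=True)
        print(f"CONNECT OK   {host} returned {response.status_code}")
    except requests.exceptions.RequestException as exc:
        print(f"CONNECT FAIL {host}: {exc.__class__.__name__}")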

More About Site Migrations: Mythbusting Video
If you are interested in site migrations, then you should check out the Mythbusting video I shot with Google’s Martin Splitt. In the video, we cover a number of important topics including domain name changes, url migrations, redirecting images, when a site should revert a migration, site merges, the Change of Address Tool in GSC, and more.

Summary – GSC’s Crawl Stats as a proxy for log files.
After reading this post, I hope you see the power in adding Google’s Crawl Stats reporting to your site migration checklist. The reporting provides a boatload of great information based on Google crawling your site post-migration. I’ve found it extremely helpful while helping companies monitor and troubleshoot domain name changes, url migrations, and more. And remember, Murphy’s Law is real for site migrations. Things will go wrong… which is ok. The important part is how quickly you handle and rectify those problems.

GG


