The Internet Marketing Driver


How To Bulk Export GSC Performance Data For A Specific List Of URLs Using The Google Search Console API, Analytics Edge, and Excel

September 15, 2023 By Glenn Gabe Leave a Comment

Bulk Export Data via the GSC API for a specific list of urls

As I’ve been analyzing the impact from the August broad core update (I’ll have more to share on that soon…), I’ve been digging into drops and surges across sites. For larger-scale sites, I often come across pockets of content that I want to take a closer look at from a quality standpoint. And as part of that analysis, I often want to cross-reference GSC data to better understand if Google is surfacing that content in the SERPs, how many clicks that content is receiving, how that looks across mobile and desktop, and more.

For example, maybe there is a large group of articles you want to explore in greater detail, a number of product pages you want to analyze, or a group of reviews from across a site. And maybe there are hundreds, or even thousands, of those urls that you want to pull GSC data for. For situations like that, you don’t want to simply export ALL data for a GSC property and dig through it. You just want GSC data for the group of urls you are analyzing.

I have already written several articles about bulk exporting GSC data using Analytics Edge, including how to automate delta reports, but I never covered how to export data for a specific set of urls. Those tutorials only covered exporting data at the property level. That’s why I came up with a solution for exporting data just for the urls I want to check. And beyond that, the system I created can export data across Google surfaces like images, video, news, Discover, and more. This process can be very helpful after major algorithm updates roll out, like broad core updates, helpful content updates, and reviews updates.

What we are going to achieve via the GSC API and Analytics Edge:
First, we’ll create a list of urls that we want to export GSC data for. That list can contain as many urls as you want (dozens, hundreds, or even thousands). Then we’ll use Analytics Edge to bulk export performance data via the GSC API and then use the worksheet of urls as a lookup table. In other words, we’ll match the exported data with the worksheet we create containing urls we want to check, and then export the data that matches those pages.

And later in this post, I’ll quickly explain how you can expand this system by stringing multiple macros together to check several Google surfaces in one shot (news, Discover, search, images, etc.).

Step by step: How to bulk export GSC data for a specific set of URLs:

1. Gather your list of urls:
First, create a worksheet of urls you want to export GSC data for. This can be based on an audit you are conducting, a recent crawl you completed, an xml sitemap you have, etc. Basically, you are looking to view clicks, impressions, click-through rate, and position for a set of urls. Name the worksheet “Pages” and name the column “page”.

Pages worksheet in Excel
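If you ever want to script this step rather than build the worksheet by hand, here is a minimal Python sketch that loads the same list from a CSV file. This is not part of Analytics Edge; the file layout (a single “page” column) simply mirrors the Pages worksheet described above.

```python
import csv

def load_pages(path):
    """Read the 'page' column from a CSV file into a set,
    mirroring the Pages worksheet used in this tutorial."""
    with open(path, newline="", encoding="utf-8") as f:
        return {
            row["page"].strip()
            for row in csv.DictReader(f)
            if row.get("page", "").strip()
        }
```

Using a set (rather than a list) makes the later matching step fast, since membership checks are constant time.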

2. Export landing pages via the GSC API for a property:
The next step is to export all landing page data for the GSC property you want to analyze. Note that you won’t filter data at this stage, so just export all of the data via the API. We’ll filter it in the next step based on the worksheet you created. You can view my other tutorial for exporting landing page data via Analytics Edge, or you can follow the abbreviated steps below. It’s very easy to do.

3. Export landing page data via the GSC API and Analytics Edge:
When exporting the data, select “page” as the dimension and make sure the selected metrics include clicks, impressions, ctr, and position.

Export landing pages via the GSC API

4. Export Web Search data:
In the filters tab, select “web” under Type (for Web Search data). I’ll explain more about exporting other types of data later in the tutorial.

Exporting Web Search data via the GSC API
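For reference, if you scripted this export against the Search Console API directly (for example, with Python and google-api-python-client), the request body passed to the Search Analytics query method would look roughly like the sketch below. The field names follow the API’s Search Analytics reference; clicks, impressions, ctr, and position are returned automatically for every row, so there is no metric selection in the request itself.

```python
def build_query(start_date, end_date, search_type="web",
                row_limit=25000, start_row=0):
    """Build a request body for the Search Analytics 'query' method.
    'type' accepts values such as 'web', 'image', 'video', and 'news';
    rowLimit/startRow are used to page through large exports."""
    return {
        "startDate": start_date,    # YYYY-MM-DD
        "endDate": end_date,
        "dimensions": ["page"],     # one row per landing page
        "type": search_type,        # "web" = Web Search data
        "rowLimit": row_limit,      # the API caps each request at 25,000 rows
        "startRow": start_row,      # increment by rowLimit to fetch the next page
    }
```

To pull everything for a property, you would call the API in a loop, bumping startRow by rowLimit until a request returns no rows.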

5. Select a timeframe:
For the dates tab, select the timeframe you want to check. You can choose a preselected timeframe or set a custom timeframe. For this tutorial, select “Last 3 Months”.

Set a timeframe for the export
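If you are scripting the date range instead, “Last 3 Months” can be approximated as 90 days back from today. This is a sketch of that assumption; Analytics Edge computes its preset ranges internally, so the exact boundaries may differ slightly.

```python
from datetime import date, timedelta

def last_3_months(today=None):
    """Return a (start, end) pair roughly matching a 'Last 3 Months'
    preset, as the YYYY-MM-DD strings the GSC API expects."""
    today = today or date.today()
    start = today - timedelta(days=90)
    return start.isoformat(), today.isoformat()
```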

6. Sort by clicks or impressions:
In the Sort/Count tab, use the dropdown to select “Clicks” and then click the button for “Descending”. This will sort the exported data by pages with the most clicks first. You can also sort by impressions if you are just looking to see if the pages ranked in the SERPs. Either way, you can easily sort the export via Analytics Edge.

Sort the exported data
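In a scripted version, this sort is a one-liner over the API’s row objects. Each row from the Search Analytics response carries its page URL in a “keys” list alongside the metrics, so sorting descending by a metric looks like this sketch:

```python
def sort_rows(rows, by="clicks"):
    """Sort Search Analytics rows descending by a metric
    ('clicks' or 'impressions'), most first."""
    return sorted(rows, key=lambda row: row.get(by, 0), reverse=True)
```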

7. Export the data:
Click “OK” at the bottom of the Analytics Edge modal window to export the data via the GSC API.

8. It’s lookup table time! Filter based on the Pages worksheet:
Before we write this data to a worksheet, we want to filter the export to only provide data for the urls we listed in our Pages worksheet. Remember, we don’t want all the data, we just want to analyze the data for the pages we included in that worksheet.

9. Use the Match function to check the Pages worksheet:
By using the Match function, we can use the Pages worksheet as a lookup table and only write the data for those urls to a new worksheet. Click the dropdown in the Analytics Edge menu labeled “Multiple” and click “Match”. Then use the “Match with” dropdown to select “Worksheet Pages” and then keep the selected column as “A page”. And make sure the radio button for “Handling Matches” is set to “Keep matching rows”. Click “OK” to execute the match.

Using the Match function in Analytics Edge
Match settings in Analytics Edge
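Under the hood, Analytics Edge’s Match function is essentially a set-membership filter. A scripted equivalent, assuming rows shaped like the API’s response (page URL as the first element of each row’s “keys” list), is just a few lines:

```python
def match_pages(rows, pages):
    """Keep only rows whose landing page appears in the Pages set —
    the scripted equivalent of Match with 'Keep matching rows'."""
    return [row for row in rows if row["keys"][0] in pages]
```

Because pages is a set, this stays fast even when the export contains hundreds of thousands of rows and the lookup list contains thousands of urls.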

10. Write to worksheet:
The final step is to write the filtered data to a worksheet. Click the dropdown labeled “File” in the Analytics Edge menu and click “Write to Worksheet”. In the “Worksheet” field, enter a name for the worksheet that will contain the filtered data. You can name it “Search Data” for this tutorial. Then click “OK”.

The final exported data via the GSC API based on a lookup table of URLs
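The scripted counterpart of “Write to Worksheet” is simply flattening each matched row into columns and writing a CSV. A sketch, with the column order matching the metrics used in this tutorial:

```python
import csv

def write_rows(rows, path):
    """Flatten filtered API rows into a CSV file — the scripted
    equivalent of writing the 'Search Data' worksheet."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["page", "clicks", "impressions", "ctr", "position"])
        for row in rows:
            writer.writerow([row["keys"][0], row["clicks"],
                             row["impressions"], row["ctr"], row["position"]])
```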

Congratulations, you have successfully exported GSC data for a specific list of urls! If you are ever looking for a quick way to view GSC data for a set group of urls (no matter the size), you now have a template for accomplishing that task. Again, Analytics Edge is like a Swiss Army Knife for working with APIs.

Bonus: It’s Analytics Edge, Be Creative, Go Nuts:
In previous tutorials, I explained how to string multiple macros together to automate even more actions. Well, for this tutorial you could easily create new macros for exporting more data from across Google surfaces like Discover, images, the news tab in Search, Google News, etc. Once you set up multiple macros, you will have a system ready for exporting data by Google surface for a specific set of urls (and by clicking a single button in Excel).

Exporting data across Google surfaces
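In API terms, “one macro per surface” boils down to varying the request’s type field. A sketch of generating one request body per surface is below; the surface names follow the API’s documented type values, but availability of the page dimension can vary slightly by surface, so treat the list as an assumption to verify against the API reference.

```python
# Assumed surface names from the Search Analytics API's "type" field.
SURFACES = ["web", "image", "video", "news", "discover", "googleNews"]

def queries_for_surfaces(start_date, end_date):
    """Build one Search Analytics request body per Google surface —
    the scripted equivalent of stringing one macro per surface together."""
    return {
        surface: {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["page"],
            "type": surface,
        }
        for surface in SURFACES
    }
```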

Summary: Exporting data for a set of urls via Analytics Edge
There are times you want to quickly understand how certain urls are performing across a site (like after major algorithm updates) without having to sift through all of the data from that site. By using the approach I mapped out in this tutorial, you can leverage Analytics Edge and the GSC API to do just that (and across Google surfaces). It shouldn’t take long to set up, and you’ll always have that template for future projects. I think you’ll dig it. :)

GG



Copyright © 2023 G-Squared Interactive LLC. All Rights Reserved. | Privacy Policy