Export Search Console Page List via Screaming Frog

Last updated on March 16th, 2022 at 01:56 am


Using Screaming Frog to discover, crawl, analyze, and export your Google Search Console URLs will help you uncover numerous types of indexing and technical issues on your website. Skip to the FAQs at the bottom of the page for specific examples.

The Configuration Settings

  1. Launch Screaming Frog
  2. Set default configuration
    • Go to menu: File > Configuration > Clear default Configuration
  3. Update crawl Configuration
    • Go to menu: Configuration > Spider
    • On the “Crawl” tab, uncheck all of the “Resource Links”, “Page Links”, “Crawl Behavior”, and “XML Sitemaps” section checkboxes, then click “OK”
  4. Go to Menu: Mode > select ‘List’
  5. Go to Menu: Configuration > API Access > select ‘Google Search Console’
    • Connect to appropriate user account
    • Select appropriate property
    • Click the Date Range tab and select date range
    • Skip the Dimension Filter tab
    • Click the General tab and check the ‘Crawl New URLs Discovered In Google Search Console’ check box then click ‘OK’
    • Find more information on connecting Screaming Frog to the built-in APIs at the Screaming Frog User Guide.
[Screenshot: Screaming Frog Google Search Console API settings]
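Under the hood, the GSC API connection boils down to a Search Analytics query for the "page" dimension over your chosen date range. The sketch below is an illustration of that kind of request body, not Screaming Frog's actual internals; the dates shown are placeholders.

```python
# Illustrative sketch (an assumption, not Screaming Frog's real code) of the
# Search Analytics API request the GSC connection effectively makes: one row
# per URL over the selected date range.

def build_page_query(start_date: str, end_date: str, row_limit: int = 25000) -> dict:
    """Build a Search Analytics API request body that lists pages.

    25,000 is the API's maximum rows per request; larger result sets
    are paged with startRow.
    """
    return {
        "startDate": start_date,   # ISO dates, e.g. "2022-01-01"
        "endDate": end_date,
        "dimensions": ["page"],    # one row per URL
        "rowLimit": row_limit,
        "startRow": 0,
    }

body = build_page_query("2022-01-01", "2022-03-16")
```

This body would be sent to the API's searchanalytics.query endpoint for your verified property; Screaming Frog handles the OAuth connection and paging for you.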

Running the Crawl

  1. Click ‘Upload’ and select ‘Enter Manually…’
  2. Enter just the homepage URL of the web property, click ‘Next’, then click ‘OK’
  3. After the homepage is crawled, wait… the pages present in Search Console will start importing into Screaming Frog, and you can then export them in whichever format you choose.
[Screenshot: typing or pasting the web property homepage into the URL field]

Frequently Asked Questions

Can I include just specific sections of my website?

Yes, you can use the include configuration to export only the sections of the website you want from GSC. You may also use the exclude configuration to exclude specific sections of your website. Go to Menu: Configuration > Include, then enter the directory to include followed by .* (a dot and an asterisk, which in a regular expression matches anything that follows).
Example of how to include pages from a blog only:
[Screenshot: Screaming Frog include configuration setting]
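To see how an include pattern like this behaves, here is a minimal sketch using Python's regex engine; example.com and the sample URLs are placeholders, not from the article.

```python
# The include setting takes a regular expression; a directory followed by ".*"
# matches everything under that directory. example.com is a placeholder domain.
import re

include_pattern = re.compile(r"https://example\.com/blog/.*")

urls = [
    "https://example.com/blog/seo-tips",   # matches: inside /blog/
    "https://example.com/about",           # does not match: outside /blog/
]
kept = [u for u in urls if include_pattern.match(u)]
# kept contains only the /blog/ URL
```

Only URLs matching the pattern are crawled and pulled from GSC; everything else is skipped.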

What types of issues can I detect by crawling URLs discovered in Google Search Console?

You will be able to detect status and error code issues, problems with noindex directives, canonical URL discrepancies, and many other technical SEO items that may be causing indexing or user experience problems on your website.
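The checks described above can be sketched as a simple classifier over exported crawl rows. The field names below are illustrative assumptions, not Screaming Frog's exact export column headers.

```python
# A minimal sketch of the issue checks the crawl surfaces, applied to one
# exported row at a time. Field names ("status_code", "meta_robots",
# "canonical") are hypothetical, chosen for illustration.

def flag_issues(row: dict) -> list:
    """Return a list of human-readable issue labels for one crawled URL."""
    issues = []
    if row.get("status_code") != 200:                      # error / redirect codes
        issues.append(f"status {row.get('status_code')}")
    if "noindex" in row.get("meta_robots", "").lower():    # noindex directive
        issues.append("noindex directive")
    canonical = row.get("canonical", "")
    if canonical and canonical != row.get("url"):          # canonical mismatch
        issues.append("canonical points elsewhere")
    return issues

flag_issues({"url": "https://example.com/a", "status_code": 404,
             "meta_robots": "", "canonical": ""})
# -> ["status 404"]
```

A URL that GSC knows about but that returns 404, carries noindex, or canonicalizes elsewhere is exactly the kind of indexing conflict this workflow is meant to expose.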

Why should I use Screaming Frog to export URLs from GSC instead of exporting directly from Google Search Console?

The GSC interface caps the number of URLs you can export directly (1,000 rows per report). I am not sure if the API has a limit, but I have not hit it yet.
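For context, the Search Analytics API returns at most 25,000 rows per request, but you can page through larger URL lists with the startRow parameter, which is presumably how a connected crawler gets past the interface's export cap. A sketch of that paging loop, with the actual API call stubbed out:

```python
# Paging through Search Analytics results. The real API call is stubbed out
# as query_fn; this only shows the startRow loop, not real network code.

ROW_LIMIT = 25000  # the API's maximum rows per request

def fetch_all_pages(query_fn):
    """query_fn(start_row) -> list of rows; a short batch means we're done."""
    rows, start = [], 0
    while True:
        batch = query_fn(start)
        rows.extend(batch)
        if len(batch) < ROW_LIMIT:   # last page reached
            break
        start += ROW_LIMIT           # advance startRow to the next page
    return rows
```

Each iteration requests the next 25,000-row slice until a short batch signals the end of the result set.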

Do I have to connect to the API to get this data?

Yes. As long as you have verified access to the web property you want to crawl, you can do this.

Can I do this same thing with the Google Analytics API in Screaming Frog?

Yep, same process but connect to the GA API instead.


By Eric Mandell

Eric Mandell has been an SEO professional since 1999 and is currently a Director of SEO at Catalyst Performance Marketing Services in Boston.


  1. Dear Eric,

    This is a really valuable resource and has made my day.

    Your solution offers a really great approach to detecting errors and issues. Any idea from your experience about how often one should perform the crawl?

    Thank you very much for sharing – great work

    1. Hi Michael, thank you for the compliment. I think a standard site should probably do this monthly, or if you do regular technical audits, make this part of them. You can also learn a lot from running the Page Inventory Crawl with the GSC API connected, so you can review orphan pages and indexable pages that are not receiving any impressions. That will help you find SEO opportunities for specific pages.
