Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. However, looking at Screaming Frog generally, there is so much information that I was wondering, for those who use it, what are the top tasks you use it for? I mean, what are your "go to" things you like to check that perhaps aren't covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought about, that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow...here's the problem, and here's where to fix it
-
Allie,
That's a great example use-case. After my audits, clients are like "you found thousands of internal redirects and 404s - where are they?"
I'm like - hold on, I have a spreadsheet of that!
-
I love Screaming Frog! One use case I've had recently is using it to find internal 404 errors prior to and immediately after a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows the offending URL and the URL referring to it, which makes it easier to update the bad link.
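To make that export easier to hand off, here's a rough Python sketch of how I'd group the report - each broken URL with all of its referring pages underneath. It assumes the CSV has "Source" and "Destination" columns (double-check the header names in your version), and the file name is just a placeholder:

```python
import csv
from collections import defaultdict

# Group every broken Destination URL by the Source pages linking to it,
# so each fix becomes one tidy block of work.
broken = defaultdict(set)

with open("client_error_4xx_inlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        broken[row["Destination"]].add(row["Source"])

# Worst offenders first: the 404s referenced from the most pages.
for destination, sources in sorted(broken.items(), key=lambda kv: -len(kv[1])):
    print(f"{destination}  ({len(sources)} referring pages)")
    for source in sorted(sources):
        print(f"    linked from: {source}")
```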
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some uses include:
- Title / meta duplication & finding parameters on ecomm stores
- Title length & meta description length
- Removing meta keywords fields
- Finding errant pages (anything but a 200, 301, 302, or 404 status code)
- Large sitemap export (most tools do "up to 500 pages." Useless.)
- Bulk export of external links (what ARE we linking to??)
- Quickly opening a page in the Wayback Machine or Google cache
- Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding the AJAX escaped fragment URL, identifying pages with two titles, two canonicals, two H1 tags, etc. Even seeing www and non-www versions live, links to pages that shouldn't be linked to, and http vs. https.
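If it helps, here's a rough Python sketch of that kind of duplicate-tag spot check for a single page, outside of Screaming Frog. The URL is a placeholder and the analytics check is just a naive substring match - adjust it for whatever tag you actually use:

```python
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/some-page"  # placeholder URL
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Count tags that should only appear once per page.
titles = soup.find_all("title")
h1s = soup.find_all("h1")
canonicals = soup.find_all("link", rel="canonical")

print(f"{len(titles)} <title> tag(s), {len(h1s)} <h1> tag(s), {len(canonicals)} canonical tag(s)")
if max(len(titles), len(h1s), len(canonicals)) > 1:
    print("Multiple instances of a tag that should be unique - worth a closer look")

# Naive analytics presence check - swap in the snippet strings you expect to see.
if "google-analytics.com" not in html and "googletagmanager.com" not in html:
    print("No obvious analytics snippet found")
```

SF obviously does this at scale across the whole crawl; something like this is just handy for re-checking a single page after a fix.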
Very cool tool - useful for pretty much everything! haha
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (with the user-agent set to Googlebot) to check for all the standard stuff, and then go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page title / H1 topical focus, relevance, and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count) - see the quick spot-check sketch just after this list for these last few items
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
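For the response time / bloated HTML / thin content checks, this is the kind of quick spot-check script I mean - a rough Python sketch with placeholder URLs and arbitrary example thresholds, not a replacement for the full SF crawl:

```python
import time
import requests
from bs4 import BeautifulSoup

# Placeholder URLs - swap in a handful of representative pages.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

for url in URLS:
    start = time.monotonic()
    resp = requests.get(url, timeout=15)
    elapsed = time.monotonic() - start

    html_kb = len(resp.content) / 1024
    # Crude word count of the visible text, as a thin-content signal.
    words = len(BeautifulSoup(resp.text, "html.parser").get_text(" ").split())

    flags = []
    if elapsed > 1.5:        # arbitrary example threshold, in seconds
        flags.append("slow response")
    if html_kb > 300:        # arbitrary example threshold, in KB
        flags.append("bloated HTML")
    if words < 250:          # arbitrary example threshold, in words
        flags.append("thin content")

    note = f" - {', '.join(flags)}" if flags else ""
    print(f"{url}: {resp.status_code}, {elapsed:.2f}s, {html_kb:.0f} KB, {words} words{note}")
```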
-
That crawl path report is pretty cool, and it led me to the redirect chain report, which turned up a few issues I need to resolve - multiple redirects on some old links. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking whether all your pages contain your analytics tag and for checking the size of the images on the site (things Moz can't do).
It's also extremely useful when you're changing the URL structure, to check whether all the redirects are properly implemented.
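For that kind of migration check, here's a rough sketch of the idea in Python. It assumes you keep an old-to-new mapping in a CSV with "old_url" and "new_url" columns - that file format is just an assumption for the example:

```python
import csv
import requests

# For each old URL, confirm it ends up at the expected new URL via a single 301.
with open("redirect_map.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        old, expected = row["old_url"], row["new_url"]
        resp = requests.get(old, allow_redirects=True, timeout=10)
        hops = resp.history  # intermediate redirect responses, one per hop

        if resp.url.rstrip("/") != expected.rstrip("/"):
            print(f"WRONG TARGET: {old} -> {resp.url} (expected {expected})")
        elif len(hops) > 1:
            print(f"REDIRECT CHAIN ({len(hops)} hops): {old} -> {resp.url}")
        elif hops and hops[0].status_code != 301:
            print(f"NOT A 301 ({hops[0].status_code}): {old} -> {resp.url}")
```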
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just click on the URL and select "crawl path report", which generates an XLS showing the page where the problem originates.
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you're testing a site in a pre-production environment. The same goes for the option to use regex to filter some of the URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
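To illustrate the regex filtering idea, here's a toy Python sketch of the kind of exclude logic I mean. The patterns are only examples, not recommendations for any particular site - in Screaming Frog itself you'd drop the patterns into the exclude configuration rather than running a script:

```python
import re

# Example exclude patterns - placeholders, similar in spirit to what you might
# exclude on a large ecommerce site (sorting parameters, session IDs, print pages).
EXCLUDE_PATTERNS = [
    r".*\?.*sort=.*",
    r".*sessionid=.*",
    r".*/print/.*",
]

def should_crawl(url: str) -> bool:
    """Return False if the URL matches any exclude pattern."""
    return not any(re.match(pattern, url) for pattern in EXCLUDE_PATTERNS)

for url in [
    "https://www.example.com/category/shoes",
    "https://www.example.com/category/shoes?sort=price",
    "https://www.example.com/article/123/print/",
]:
    print(url, "->", "crawl" if should_crawl(url) else "skip")
```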
rgds,
Dirk