Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. However, looking at Screaming Frog generally, there is so much information that I was wondering, for those who use it, what your top tasks are. I mean, what are your "go to" things you like to check that perhaps aren't covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought about, that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow: here's the problem, and here's where to fix it.
-
Allie,
That's a great example use case. After my audits, clients are like, "You found thousands of internal redirects and 404s - where are they?"
And I'm like, "Hold on, I have a spreadsheet of that!"
-
I love Screaming Frog! One recent use case: finding internal 404 errors prior to and immediately after a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows the offending URL alongside the URL referring to it, which makes it easier to update the bad link.
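To prioritize fixes from that report, the export can be post-processed with a short script. This is a minimal sketch assuming the CSV has "Source" and "Destination" columns; check the header row of your own export, since column names can vary between Screaming Frog versions.

```python
import csv
from collections import defaultdict

# Group the 4xx inlinks export by broken destination URL, so the most
# heavily linked-to broken pages can be fixed first. The column names
# "Source" and "Destination" are assumptions -- verify against the
# header row of your own export.
def summarize_broken_links(path):
    sources_by_dest = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            sources_by_dest[row["Destination"]].append(row["Source"])
    # Worst offenders (most inlinks) first
    return sorted(sources_by_dest.items(), key=lambda kv: -len(kv[1]))
```

Sorting by inlink count means the broken URL that is linked from the most places surfaces at the top of the fix list.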
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some of my go-to uses include:
-
Title / meta duplication & finding parameters on ecomm stores
-
Title length & meta desc length
-
Removing meta keywords fields
-
Finding errant pages (anything with a status code other than 200, 301, 302, or 404)
-
Large sitemap export (most tools do "up to 500 pages." Useless.)
-
Bulk export of external links (what ARE we linking to??)
-
Quickly opening a page in Wayback Machine or Google cache
-
Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding the AJAX escaped fragment URL; identifying pages with two titles, two canonicals, two H1 tags, etc.; even seeing www and non-www versions live, links to pages that shouldn't be linked to, and http vs. https.
Very cool tool - useful for pretty much everything! haha
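The title-duplication check mentioned above lends itself to scripting against a Screaming Frog "Internal: HTML" CSV export. A rough sketch, assuming "Address" and "Title 1" as the column names (verify against your own export's header row):

```python
import csv
from collections import defaultdict

# Find page titles shared by more than one URL in an internal crawl
# export. "Address" and "Title 1" are assumed column names -- check
# the header row of your own CSV before running this.
def duplicate_titles(path):
    pages_by_title = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = row.get("Title 1", "").strip()
            if title:
                pages_by_title[title].append(row["Address"])
    # Keep only titles that appear on two or more URLs
    return {t: urls for t, urls in pages_by_title.items() if len(urls) > 1}
```

The same grouping pattern works for spotting duplicated meta descriptions or H1s; only the column name changes.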
-
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (using Googlebot as the crawler) to check for all the standard stuff and go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
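A few of the checks in that list (thin content, slow response times, errant status codes) can be scripted against rows from the internal crawl export. This is only a sketch: the column names and the thresholds are assumptions to tune for your own audits, not Screaming Frog defaults.

```python
# Flag crawl-export rows that trip simple audit thresholds: thin
# content, slow response, or a non-200 status. Column names
# ("Address", "Word Count", "Response Time", "Status Code") and the
# default thresholds are assumptions -- adjust to your own export.
def flag_pages(rows, min_words=300, max_response_secs=1.0):
    flagged = []
    for row in rows:
        issues = []
        if int(row.get("Word Count", 0) or 0) < min_words:
            issues.append("thin content")
        if float(row.get("Response Time", 0) or 0) > max_response_secs:
            issues.append("slow response")
        if row.get("Status Code") != "200":
            issues.append("status " + str(row.get("Status Code")))
        if issues:
            flagged.append((row["Address"], issues))
    return flagged
```

Feed it the rows from a `csv.DictReader` over the export and you get a per-URL list of which checks failed.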
-
That crawl path report is pretty cool, and it led me to the redirect chain report. I have a few issues to resolve there, with multiple redirects on some old links. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking whether all your pages contain your analytics tag and for checking the size of the images on the site (things Moz can't do).
It's also extremely useful when you're changing the URL structure, to check whether all the redirects are properly implemented.
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just click on the URL and select "Crawl Path Report", which generates an XLS showing the page where the problem originates.
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you are testing a site in a pre-production environment. The same goes for the option to use regex to filter some of the URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
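Since that URL filter takes regex, it can pay to dry-run candidate patterns against a handful of known URLs before kicking off a big crawl. A small sketch; the patterns below are hypothetical examples, not values from this thread, and note that the crawler's exclude config may expect patterns matching the full URL (e.g. `.*\?sort=.*`).

```python
import re

# Dry-run candidate exclude patterns against sample URLs before
# pasting them into the crawler's exclude configuration. The patterns
# and URLs here are hypothetical examples.
def excluded(url, patterns):
    return any(re.search(p, url) for p in patterns)

patterns = [r"\?sort=", r"/tag/", r"sessionid="]
urls = [
    "https://example.com/products?sort=price",  # should be excluded
    "https://example.com/blog/post-1",          # should be crawled
]
```

Catching an over-broad pattern here is much cheaper than discovering half the site was skipped after an overnight crawl.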
rgds,
Dirk