Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. Looking at it more generally, though, there is so much information that I was wondering, for those of you who use it: what are your top key tasks? What are your "go to" checks, particularly ones not covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought of that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow: here's the problem, and here's where to fix it.
-
Allie,
That's a great example use case. After my audits, clients are like, "You found thousands of internal redirects and 404s - where are they?"
And I'm like, "Hold on, I have a spreadsheet for that!"
-
I love Screaming Frog! One use case I've leaned on recently is finding internal 404 errors before and immediately after a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows each offending URL along with the URL referring to it, which makes it much easier to update the bad links.
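If you're working through that export in bulk, here's a minimal sketch of how you might group it (the column names "Source" and "Destination" and the sample URLs are assumptions; check the headers in your own export):

```python
import csv
import io
from collections import defaultdict

# Sample rows in the shape of a Client Error (4xx) Inlinks export;
# real exports have more columns, but Source/Destination are what matter here.
sample_csv = """Source,Destination,Status Code
https://example.com/blog/,https://example.com/old-page,404
https://example.com/about/,https://example.com/old-page,404
https://example.com/blog/,https://example.com/missing.pdf,404
"""

# Group every broken URL with the pages that still link to it,
# so each fix becomes a single pass over one referring page.
broken = defaultdict(list)
for row in csv.DictReader(io.StringIO(sample_csv)):
    broken[row["Destination"]].append(row["Source"])

for dest, sources in broken.items():
    print(f"{dest} is linked from {len(sources)} page(s):")
    for src in sources:
        print(f"  {src}")
```

Swap `sample_csv` for `open("client_error_4xx_inlinks.csv")` to run it against a real export.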
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some highlights include:
- Title / meta duplication & finding parameters on ecomm stores
- Title length & meta desc length
- Removing meta keywords fields
- Finding errant pages (anything but a 200, 301, 302, or 404 status code)
- Large sitemap export (most tools do "up to 500 pages." Useless.)
- Bulk export of external links (what ARE we linking to??)
- Quickly opening a page in the Wayback Machine or Google's cache
- Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding AJAX escaped-fragment URLs; identifying pages with two titles, two canonicals, two H1 tags, etc.; even seeing www and non-www versions live, finding links to pages that shouldn't be linked, and spotting http vs. https.
Very cool tool - useful for pretty much everything! haha
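The www/non-www and http/https checks at the end of that list boil down to grouping crawled URLs by scheme and host. A rough sketch of that idea (all URLs here are invented for illustration):

```python
from urllib.parse import urlsplit
from collections import defaultdict

# Hypothetical crawl output: the same page reachable under several hosts/schemes,
# which is exactly the duplicate-content situation described above.
urls = [
    "http://example.com/pricing",
    "https://example.com/pricing",
    "https://www.example.com/pricing",
    "https://www.example.com/contact",
]

# Group by path so scheme/host variants of one page cluster together.
variants = defaultdict(set)
for u in urls:
    parts = urlsplit(u)
    variants[parts.path].add((parts.scheme, parts.netloc))

for path, combos in variants.items():
    if len(combos) > 1:
        print(f"{path} is live under {len(combos)} scheme/host combinations: {sorted(combos)}")
```

Any path with more than one scheme/host combination is a candidate for a redirect or canonical fix.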
-
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (using Googlebot as the crawler) to check for all the standard stuff and go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
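That last item, singleton tags appearing more than once, is also easy to spot-check outside SF. A minimal standard-library sketch (the HTML snippet is a made-up example of a broken page):

```python
from html.parser import HTMLParser
from collections import Counter

class SingletonTagCounter(HTMLParser):
    """Count tags that should appear at most once per page."""
    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "h1":
            self.counts["h1"] += 1
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.counts["canonical"] += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.counts["meta robots"] += 1

# Deliberately broken example: two canonicals and two H1s.
html = """<html><head>
<link rel="canonical" href="https://example.com/a">
<link rel="canonical" href="https://example.com/b">
<meta name="robots" content="noindex">
</head><body><h1>One</h1><h1>Two</h1></body></html>"""

parser = SingletonTagCounter()
parser.feed(html)
for tag, n in parser.counts.items():
    if n > 1:
        print(f"warning: {n} instances of {tag}")
```

Feed it fetched page source and anything flagged with a count above one is worth a closer look.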
-
That crawl path report is pretty cool, and it led me to the redirect chains report, which surfaced a few multiple-redirect chains on some old links that I need to resolve. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking whether all your pages contain your analytics tag and for checking the size of the images on the site (things Moz can't do).
It's also extremely useful when you're changing the URL structure, to check that all the redirects are properly implemented.
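One way to verify that kind of migration is to crawl the old URL list, export the redirects, and diff them against the intended mapping. A sketch of the diff step, with invented URLs and an assumed (Address, Status Code, Redirect URI) row shape:

```python
# Intended old -> new mapping from the migration plan (hypothetical URLs).
expected = {
    "https://example.com/old-shoes": "https://example.com/shoes",
    "https://example.com/old-hats": "https://example.com/hats",
}

# Rows as they might appear in a crawl's redirect export.
crawled = [
    ("https://example.com/old-shoes", 301, "https://example.com/shoes"),
    ("https://example.com/old-hats", 301, "https://example.com/hats-sale"),
]

# Flag every crawled redirect whose target disagrees with the plan.
mismatches = [
    (addr, target, expected[addr])
    for addr, status, target in crawled
    if addr in expected and target != expected[addr]
]
for addr, got, want in mismatches:
    print(f"{addr} redirects to {got}, expected {want}")
```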
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just click on the URL and select "crawl path report," which generates a spreadsheet showing the page where the problem originates.
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you are testing a site in a pre-production environment. The same goes for the ability to use regex to exclude some of the URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
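That exclude mechanism is just regex matched against the full URL, and the same kind of filter is easy to reproduce when post-processing a crawl. A small sketch (patterns and URLs are illustrative, not from any real config):

```python
import re

# Patterns in the spirit of an exclude config: skip sorted/tag listing URLs
# that would otherwise bloat a crawl of a large site.
exclude_patterns = [
    re.compile(r"\?sort="),
    re.compile(r"/tag/"),
]

urls = [
    "https://example.com/products",
    "https://example.com/products?sort=price",
    "https://example.com/tag/widgets",
]

# Keep only URLs that match none of the exclude patterns.
kept = [u for u in urls if not any(p.search(u) for p in exclude_patterns)]
print(kept)
```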
rgds,
Dirk