Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. Looking at it more generally, though, there is so much information that I was wondering, for those who use it: what are the top tasks you rely on it for? What are your "go to" things to check that perhaps are not covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought about, that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow... here's the problem, and here's where to fix it.
-
Allie,
That's a great example use-case. After my audits, clients are like "you found thousands of internal redirects and 404s - where are they?"
I'm like - hold on I have a spreadsheet of that!
-
I love Screaming Frog! One use case I've had recently is using it to find internal 404 errors prior to and immediately after a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows the offending URL and the URL referring to it, which makes it easier to update the bad link.
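If that export gets big, it can help to group the broken URLs by the pages linking to them before handing the list to developers. A minimal Python sketch; the "Source"/"Destination" column names are assumptions here, so check the header row of your own export before using it:

```python
import csv
import io
from collections import defaultdict

# Group each broken destination URL with the pages that link to it.
# Column names ("Source", "Destination") are assumed - verify them
# against the header row of your actual Screaming Frog export.
def group_broken_links(csv_text):
    broken = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        broken[row["Destination"]].append(row["Source"])
    return dict(broken)

sample = (
    "Source,Destination\n"
    "https://example.com/a,https://example.com/dead\n"
    "https://example.com/b,https://example.com/dead\n"
)
print(group_broken_links(sample))
```

Handing developers one row per broken URL with all its referrers beats a raw link-by-link dump.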
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some highlights include:
- Title / meta duplication & finding parameters on ecomm stores
- Title length & meta description length
- Removing meta keywords fields
- Finding errant pages (anything but a 200, 301, 302, or 404 status code)
- Large sitemap export (most tools stop at "up to 500 pages." Useless.)
- Bulk export of external links (what ARE we linking to??)
- Quickly opening a page in the Wayback Machine or Google cache
- Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding the AJAX escaped-fragment URL, identifying pages with two titles, two canonicals, two H1 tags, etc. Even spotting www and non-www versions both live, links to pages that shouldn't be linked to, and http vs. https.
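Those multiple-instance checks all show up in SF's tabs, but if you ever need to spot-check a single page outside the tool, here's a stdlib-only sketch (illustrative, not a replacement for a full crawl):

```python
from html.parser import HTMLParser

# Count tags that should appear at most once per page: <title>, <h1>,
# and <link rel="canonical">. Screaming Frog surfaces these natively;
# this is just a quick one-page sanity check.
class SingletonTagCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = {"title": 0, "h1": 0, "canonical": 0}

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self.counts[tag] += 1
        elif tag == "link" and dict(attrs).get("rel") == "canonical":
            self.counts["canonical"] += 1

def duplicate_tags(html):
    parser = SingletonTagCounter()
    parser.feed(html)
    return [tag for tag, n in parser.counts.items() if n > 1]

html = """<html><head><title>A</title><title>B</title>
<link rel="canonical" href="/x"></head>
<body><h1>One</h1><h1>Two</h1></body></html>"""
print(duplicate_tags(html))
```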
Very cool tool - useful for pretty much everything! haha
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (using Googlebot as the crawler) to check for all the standard stuff and go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
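For the meta robots vs. canonical conflicts in that checklist, once you've exported the crawl data you can flag contradictions with a few lines of scripting. A minimal sketch over hypothetical post-processed crawl rows (the field names are made up for illustration, not SF's own):

```python
# Flag pages whose meta robots directive contradicts the canonical hint,
# e.g. "noindex" on a page that canonicalises to a different URL -
# mixed signals that search engines may resolve unpredictably.
# The dict keys below are assumptions for this sketch.
def robots_canonical_conflicts(pages):
    conflicts = []
    for page in pages:
        robots = (page.get("meta_robots") or "").lower()
        canonical = page.get("canonical")
        if "noindex" in robots and canonical and canonical != page["url"]:
            conflicts.append(page["url"])
    return conflicts

pages = [
    {"url": "https://example.com/a", "meta_robots": "noindex,follow",
     "canonical": "https://example.com/b"},
    {"url": "https://example.com/b", "meta_robots": "index,follow",
     "canonical": "https://example.com/b"},
]
print(robots_canonical_conflicts(pages))
```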
-
That crawl path report is pretty cool, and it led me to the redirect chain report, which surfaced a few issues to resolve - multiple chained redirects on some old links. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking whether all your pages contain your analytics tag and for checking the size of the images on the site (things Moz can't do).
It's also extremely useful when you're changing the URL structure, to check whether all the redirects are properly implemented.
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just click on the URL and select "Crawl Path Report", which generates an xls showing the page where the problem originates.
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you're testing a site in a pre-production environment. The same goes for the ability to use regex to filter some of the URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
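On the regex filtering: SF's exclude configuration takes plain regex patterns, one per line. A small sketch of the kind of patterns you might sanity-check before pasting them in - the patterns themselves are illustrative examples, not recommendations for any particular site:

```python
import re

# Example exclude patterns of the sort you might feed to a crawler's
# regex exclude config. These specific patterns are illustrative only.
EXCLUDES = [
    r".*\?.*sort=.*",    # faceted-sort parameter pages
    r".*/cart/.*",       # basket URLs
    r".*sessionid=.*",   # session-ID duplicates
]

def is_excluded(url):
    """Return True if the URL matches any exclude pattern in full."""
    return any(re.fullmatch(p, url) for p in EXCLUDES)

print(is_excluded("https://example.com/shoes?color=red&sort=price"))
print(is_excluded("https://example.com/shoes"))
```

Testing patterns against a handful of real URLs first saves you from discovering mid-crawl that an over-broad regex excluded half the site.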
rgds,
Dirk