Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. However, looking at Screaming Frog generally, there is so much information that I was wondering, for those who use it: what are the top tasks you use it for? What are your "go to" things you like to check that perhaps aren't covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought about, that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow... here's the problem, and here's where to fix it!
-
Allie,
That's a great example use-case. After my audits, clients are like "you found thousands of internal redirects and 404s - where are they?"
I'm like - hold on, I have a spreadsheet of that!
-
I love Screaming Frog! One use case I've used recently is using it to find internal 404 errors prior-to and immediately-after a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows the offending URL and the URL referring to it, which makes it easier to update the bad link.
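Once you have that export, it's just a CSV, so you can quickly group the broken URLs by how many pages link to them and prioritize fixes. A minimal sketch in Python - note the column names ("Source", "Destination", "Status Code") are assumptions and may vary between Screaming Frog versions:

```python
import csv
import io
from collections import defaultdict

# Sample rows shaped like the "Client Error (4xx) Inlinks" bulk export.
# Column names here are assumptions; check your own export's header row.
SAMPLE_EXPORT = """Source,Destination,Status Code
https://example.com/blog/,https://example.com/old-page,404
https://example.com/about/,https://example.com/old-page,404
https://example.com/blog/,https://example.com/missing.pdf,404
"""

def group_broken_links(csv_text):
    """Map each broken destination URL to the pages that link to it."""
    broken = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        broken[row["Destination"]].append(row["Source"])
    return dict(broken)

links = group_broken_links(SAMPLE_EXPORT)
# /old-page is linked from two pages, so fixing that one link target
# clears the most inlinks at once.
print(len(links["https://example.com/old-page"]))  # 2
```

For a real export you'd read the file with `open(...)` instead of the inline sample.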
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some of my uses include:
-
Title / meta duplication & finding parameters on ecomm stores
-
Title length & meta desc length
-
Removing meta keywords fields
-
Finding errant pages (anything but 200, 301, 302, or 404 status code)
-
Large sitemap export (most tools do "up to 500 pages." Useless.)
-
Bulk export of external links (what ARE we linking to??)
-
Quickly opening a page in Wayback Machine or Google cache
-
Finding pages without Analytics, as was mentioned.
I use Screaming Frog for tons of other things: finding the AJAX escaped-fragment URL, identifying pages with two titles, two canonicals, two H1 tags, etc. Even seeing www and non-www versions live, links to pages that shouldn't be linked, and http vs. https.
Very cool tool - useful for pretty much everything! haha
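For the duplicate-tag checks mentioned above (two titles, two canonicals, two H1s), you can sanity-check a single page outside the crawler too. A rough sketch using Python's stdlib `html.parser` - this is just an illustration, not how Screaming Frog itself works:

```python
from html.parser import HTMLParser

class DuplicateTagChecker(HTMLParser):
    """Counts tags that should appear at most once per page."""
    def __init__(self):
        super().__init__()
        self.counts = {"h1": 0, "title": 0, "canonical": 0}

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "title"):
            self.counts[tag] += 1
        elif tag == "link" and dict(attrs).get("rel") == "canonical":
            self.counts["canonical"] += 1

# A deliberately broken page: duplicate <title> and <h1>
page = """<html><head><title>A</title><title>B</title>
<link rel="canonical" href="https://example.com/"></head>
<body><h1>One</h1><h1>Two</h1></body></html>"""

checker = DuplicateTagChecker()
checker.feed(page)
duplicates = {t: n for t, n in checker.counts.items() if n > 1}
print(duplicates)  # {'h1': 2, 'title': 2}
```

In practice you'd feed it the HTML fetched for each URL in your crawl list.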
-
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (using the Googlebot user-agent) to check for all the standard stuff and go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
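On the http/https internal-linking point in the list above: once you have the internal link URLs from a crawl export, spotting paths linked under both schemes is a simple grouping exercise. A hedged sketch (assumes absolute URLs; the example links are made up):

```python
from urllib.parse import urlsplit

def mixed_scheme_paths(internal_links):
    """Return paths linked under both http and https -- a common
    source of duplicate-content conflicts."""
    seen = {}  # (host, path) -> set of schemes seen in links
    for url in internal_links:
        parts = urlsplit(url)
        seen.setdefault((parts.netloc, parts.path), set()).add(parts.scheme)
    return [path for (host, path), schemes in seen.items() if len(schemes) > 1]

links = [
    "http://example.com/pricing",   # linked over http...
    "https://example.com/pricing",  # ...and https: duplicate-content risk
    "https://example.com/contact",
]
print(mixed_scheme_paths(links))  # ['/pricing']
```

Any path this flags is a candidate for fixing the internal links (or confirming the canonical/redirect handles it).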
-
That crawl path report is pretty cool, and it led me to the redirect chain report. I have a few issues to resolve there, with multiple redirects on some old links. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking that all your pages contain your analytics tag and for checking the size of the images on the site (things Moz can't do).
It's also extremely useful when you're changing the URL structure, to check that all the redirects are properly implemented.
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature: just click on the URL and select "Crawl Path Report", which generates an XLS showing the page where the problem originates.
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you're testing a site in a pre-production environment. The same goes for the ability to use regex to filter some of the URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
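On that regex filtering: Screaming Frog takes these patterns in its exclude configuration, but it can help to prototype them locally against a list of known URLs first. A sketch with made-up example patterns (adjust for your own site):

```python
import re

# Hypothetical exclude patterns, similar in spirit to what you'd put in
# Screaming Frog's exclude config -- these specific patterns are examples.
EXCLUDES = [re.compile(p) for p in (
    r"\?sort=",             # faceted-navigation parameters
    r"/cart/",              # transactional pages
    r"\.(?:jpg|png|gif)$",  # image assets
)]

def should_crawl(url):
    """True if the URL matches none of the exclude patterns."""
    return not any(p.search(url) for p in EXCLUDES)

urls = [
    "https://shop.example.com/shoes?sort=price",
    "https://shop.example.com/shoes",
    "https://shop.example.com/cart/checkout",
]
print([u for u in urls if should_crawl(u)])
# ['https://shop.example.com/shoes']
```

Once the patterns behave as expected on sample URLs, paste them into the spider's exclude settings.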
rgds,
Dirk