Screaming Frog - What are your "go to" tasks you use it for?
-
So, I have just purchased Screaming Frog because I have some specific tasks that need completing. However, looking at Screaming Frog generally, there is so much information that I was wondering, for those of you who use it: what are the top key tasks you use it for? I mean, what are your "go to" things you like to check that perhaps are not covered by the Moz crawl reports?
Just looking for things I perhaps hadn't thought about, that this might be useful for.
-
Ha ha, I know! It's like giving the developers a little present all wrapped up with a bow...here's the problem, and here's where to fix it
-
Allie,
That's a great example use case. After my audits, clients are like, "You found thousands of internal redirects and 404s - where are they?"
And I'm like, "Hold on, I have a spreadsheet of those!"
-
I love Screaming Frog! One use case I've turned to recently is finding internal 404 errors before and immediately after a major site redesign.
After running a crawl, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks and download the report. It shows both the offending URL and the URL referring to it, which makes it easy to update the bad link.
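If you want to slice that export further, the CSV can be grouped so each broken URL lists every page linking to it. Here's a minimal sketch using only the Python standard library; the column names ("Source", "Destination") and the sample rows are assumptions, so check them against the header of your actual export:

```python
import csv
import io
from collections import defaultdict

# A few sample rows in the shape of the export (column names are an
# assumption -- verify against the header of your own CSV).
sample = io.StringIO(
    "Source,Destination,Status Code\n"
    "https://example.com/blog/,https://example.com/old-page,404\n"
    "https://example.com/about/,https://example.com/old-page,404\n"
    "https://example.com/blog/,https://example.com/missing.pdf,404\n"
)

# Map each broken target URL to the pages that link to it.
broken = defaultdict(list)
for row in csv.DictReader(sample):
    broken[row["Destination"]].append(row["Source"])

# Worst offenders first: the broken URLs with the most internal links.
for url, sources in sorted(broken.items(), key=lambda kv: -len(kv[1])):
    print(f"{url} ({len(sources)} inlinks)")
    for src in sources:
        print(f"  linked from: {src}")
```

For a real export you'd replace the `io.StringIO` sample with `open("client_error_4xx_inlinks.csv", newline="")`, but the grouping logic is the same.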
I also have this page bookmarked, and it's my go-to guide:
-
It's one of the best tools, so I feel like I use it "for everything." But some of my uses include:
- Title / meta duplication & finding parameters on ecomm stores
- Title length & meta description length
- Removing meta keywords fields
- Finding errant pages (anything with a status code other than 200, 301, 302, or 404)
- Large sitemap exports (most tools do "up to 500 pages." Useless.)
- Bulk export of external links (what ARE we linking to??)
- Quickly opening a page in the Wayback Machine or Google cache
- Finding pages without Analytics, as was mentioned
I use Screaming Frog for tons of other things: finding the AJAX escaped-fragment URL, identifying pages with two titles, two canonicals, two H1 tags, etc. Even seeing whether the www and non-www versions are both live, finding links to pages that shouldn't be linked to, and checking http vs. https.
Very cool tool - useful for pretty much everything! haha
-
That's awesome. Thanks. Will take a look at all those things this week.
-
I use SF religiously for all the audit work I do. I run a sample crawl (with the user-agent set to Googlebot) to check for all the standard stuff and then go further.
My standard evaluation with SF includes:
- Redirect / dead end internal linking
- Redirect / dead end "external" links that point to site assets housed on CDN servers.
- URL hierarchical structure
- Internal linking to both http and https that can reinforce duplicate content conflicts
- Page Title/H1 topical focus relevance and quality
- Confusion from improperly "nofollowing" important pages (meta robots)
- Conflicts between meta robots and canonical tags
- Slow page response times
- Bloated HTML or image file sizes
- Thin content issues (word count)
- Multiple instances of tags that should only have one instance (H1 headline tags, meta robots tags, canonical tags)
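The "multiple instances" checks in that list boil down to counting tags that should appear at most once per page. As a minimal sketch of the idea, using only the Python standard library and run against a hypothetical inline page rather than a live crawl:

```python
from html.parser import HTMLParser

class TagCounter(HTMLParser):
    """Count tags that should appear at most once per page."""
    def __init__(self):
        super().__init__()
        self.counts = {"h1": 0, "canonical": 0, "meta_robots": 0}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "h1":
            self.counts["h1"] += 1
        elif tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.counts["canonical"] += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.counts["meta_robots"] += 1

# Hypothetical page with two canonicals and two H1s -- both are
# exactly the kind of conflict the audit list above is flagging.
html = """<html><head>
<link rel="canonical" href="https://example.com/a">
<link rel="canonical" href="https://example.com/b">
<meta name="robots" content="noindex">
</head><body><h1>One</h1><h1>Two</h1></body></html>"""

parser = TagCounter()
parser.feed(html)
for tag, n in parser.counts.items():
    if n > 1:
        print(f"warning: {n} x {tag}")
```

Screaming Frog does this across a whole crawl, of course; the sketch just shows what the per-page check amounts to.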
-
That crawl path report is pretty cool, and it led me to the redirect chain report, which turned up a few issues to resolve: multiple chained redirects on some old links. Fantastic stuff.
-
I am a big fan of Screaming Frog myself. Apart from the real basic stuff (checking H1s, titles, etc.), it's also useful for checking whether all your pages contain your analytics tag and for checking the size of the images on the site (things Moz can't do).
It's also extremely useful when you're changing the URL structure, to check whether all the redirects are properly implemented.
Sometimes you get loops in your site, especially if you use relative rather than absolute links. Screaming Frog has an extremely helpful feature for this: just click on the URL and select "Crawl Path Report", which generates an XLS showing the page where the problem originates.
It's also very convenient that you can configure the spider to ignore robots.txt / nofollow / noindex when you are testing a site in a pre-production environment. The same goes for the ability to use regex to include or exclude certain URLs while crawling (especially useful for big sites if they aren't using canonicals or noindex where they should).
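For the regex filtering mentioned above, the patterns are ordinary regular expressions, so you can sanity-check one (against hypothetical URLs) before pasting it into the exclude configuration:

```python
import re

# Exclude faceted/parameterized URLs, e.g. anything carrying a
# ?sort=, ?color= or ?sessionid= query parameter.
exclude = re.compile(r".*\?(sort|color|sessionid)=.*")

urls = [
    "https://example.com/shoes/",
    "https://example.com/shoes/?sort=price",
    "https://example.com/shoes/?color=red",
]

# Keep only the URLs the crawler would still visit.
kept = [u for u in urls if not exclude.match(u)]
print(kept)
```

Only the clean category URL survives the filter, which is exactly what you want on a big ecommerce site where faceted URLs explode the crawl.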
rgds,
Dirk
3. On various other websites (wordpress, blogspot, web2.0 sites) that link back to the video/article on our site. I know it's usually best practice to put it on the same page as the video, but I'm wondering from an <acronym title="Search Engine Optimization">SEO</acronym> point of view if I'm wasting a 500 word transcription by posting it on the same page as a 500 article that covers the same topic and uses the same keywords, and I wonder if it would be better to use the transcription elsewhere. Do you have any thoughts on which of the above methods would be best? Thanks so much for reading and any advice you may have.0