Alternatives to SEOmoz's Crawl Diagnostics
-
I really like SEOmoz's Crawl Diagnostics reports: they go through a site's pages and find all sorts of valuable information. I wanted to know whether there are any other services that compete with this specific feature, so I can test the accuracy of their crawl diagnostics.
Thanks
-
Google Webmaster Tools is handy to use in conjunction with the other tools listed.
Also, SEO PowerSuite's SEO SpyGlass tool is pretty decent.
-
I agree with Irving as well. Dr. Pete wrote a great comparison of the two: http://www.seomoz.org/blog/crawler-faceoff-xenu-vs-screaming-frog
In brief, Xenu is free. It is old but it works. If you were around before Windows and don't mind using older tech, then Xenu is great.
Screaming Frog costs about 99 euros, but it is much more user-friendly. They have an excellent support staff as well. I use the tool almost daily.
-
Xenu's Link Sleuth is very helpful for finding link issues.
-
I like Screaming Frog
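For anyone curious what these crawlers do under the hood, here is a minimal sketch of a broken-link checker in Python, assuming the requests and beautifulsoup4 packages are installed; the start URL is a placeholder, and real tools like Xenu and Screaming Frog layer far more reporting on top of this basic loop:

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder site
MAX_PAGES = 200


def crawl(start_url, max_pages=MAX_PAGES):
    """Breadth-first crawl of one host, reporting errored or broken URLs."""
    host = urlparse(start_url).netloc
    queue, seen, broken = deque([start_url]), {start_url}, []

    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            broken.append((url, str(exc)))
            continue
        if resp.status_code >= 400:
            broken.append((url, resp.status_code))
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        # Queue internal links found on this page.
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return broken


if __name__ == "__main__":
    for url, problem in crawl(START_URL):
        print(problem, url)
```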
Related Questions
-
Site's meta description is not being shown in Google Search results. Instead our privacy policy is getting indexed.
We re-launched our new site and put the redirects in place. Our site is https://www.fico.com/en. When I search for "fico" in Google, I see privacy policy text being shown as the meta description instead of our actual meta description. I have edited the meta description and requested that Google re-index our site. I'm not sure what to do next. Thanks for your advice.
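One quick first diagnostic, sketched here with the URL from the question and assuming the requests and beautifulsoup4 packages: confirm what the served HTML actually declares, since Google often substitutes its own snippet when the meta description is missing, duplicated across pages, or judged less relevant than other prominent on-page text:

```python
import requests
from bs4 import BeautifulSoup

URL = "https://www.fico.com/en"  # URL from the question

resp = requests.get(URL, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# 1. Is a meta description actually present in the served HTML?
tag = soup.find("meta", attrs={"name": "description"})
print("meta description:", tag["content"].strip() if tag and tag.get("content") else "MISSING")

# 2. Does privacy/cookie boilerplate appear early in the body text?
#    Google sometimes lifts its snippet from prominent on-page text.
body_text = soup.get_text(" ", strip=True)[:500]
for phrase in ("privacy policy", "cookies"):
    if phrase in body_text.lower():
        print(f"'{phrase}' appears in the first 500 characters of body text")
```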
Technical SEO | | gosheen0 -
What's the best way to test an Angular JS-heavy page for SEO?
Hi Moz community, Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes links, product descriptions, and things like breadcrumbs rendered in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works. I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed: https://sitebulb.com/resources/guides/javascript-seo-resources/ However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl the pages using Screaming Frog, but that is generally regarded as showing what a crawler can crawl, not what Googlebot will actually be able to crawl and index. Any thoughts on this? Is this concern valid? Thanks!
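One way to run the raw-HTML side of that test, as a hedged sketch: fetch the QA page with a plain HTTP client that executes no JavaScript and check whether the links and content you care about are present at all; with client-side-only rendering and no prerendering they typically will not be. The URL, session cookie, and expected strings below are placeholders, and the requests and beautifulsoup4 packages are assumed:

```python
import requests
from bs4 import BeautifulSoup

QA_URL = "https://qa.example.com/product/widget-123"               # placeholder QA page
COOKIES = {"session": "PASTE_QA_LOGIN_COOKIE_HERE"}                # placeholder auth cookie
EXPECTED_TEXT = ["Product description snippet", "Home > Widgets"]  # placeholder content

# What a non-rendering crawler receives: the raw server response, no JS executed.
resp = requests.get(QA_URL, cookies=COOKIES, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

links = [a.get("href") for a in soup.find_all("a", href=True)]
print(f"{len(links)} <a href> links present before JavaScript runs")

for text in EXPECTED_TEXT:
    status = "present" if text in resp.text else "MISSING from raw HTML"
    print(f"{text!r}: {status}")

# To see the post-JS DOM, render the page in a headless browser and diff it
# against this raw response; the gap is what depends on client-side rendering.
```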
Technical SEO | | znotes0 -
Will multiple internal links with the same anchor text hurt a site's ranking?
Hello, I just watched this video from the Google Webmasters channel on YouTube: http://www.youtube.com/watch?v=6ybpXU0ckKQ My question: if a site is built on subdomains, will linking the different subdomains with exact-match anchor text hurt the site's ranking? Thanks
Technical SEO | | arnoldwender0 -
What to do about removing pages for the 'offseason' (i.e., the same URL will be brought back in 6-7 months)?
I manage a site for an event that runs annually, and now that the event has concluded we would like to remove some of the pages (schedule, event info, TV schedule, etc.) that won't be relevant again until next year's event. That said, if we simply remove those pages from the web, I'm afraid that we'll lose out on valuable backlinks that already exist, and when those pages return they will have the same URLs as before. Is there a best course of action here? Should I redirect the removed pages to the homepage for the time being using a 302? Is there any risk there if the 'temporary' period is ~7 months? Thanks in advance.
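If you do go the temporary-redirect route, here is a small sketch (placeholder URLs, assuming the requests package) to verify that each off-season URL really returns a 302 pointing at the page you chose, rather than a 301 or a 404:

```python
import requests

# Placeholder off-season URLs and their intended temporary destination.
OFFSEASON_PAGES = {
    "https://www.example-event.com/schedule": "https://www.example-event.com/",
    "https://www.example-event.com/tv-schedule": "https://www.example-event.com/",
}

for url, expected_target in OFFSEASON_PAGES.items():
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 302 and location.rstrip("/") == expected_target.rstrip("/")
    print(f"{url} -> {resp.status_code} {location} {'OK' if ok else 'CHECK'}")
```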
Technical SEO | | KTY550 -
My blog post for a specific keyword is in the 'omitted results'. Why might this be, and how to overcome it?
My website homepage: http://kulraj.org Here is the page I am working to rank for: **http://kulraj.org/2014/07/15/hedonic-treadmill/** When I search specifically for 'kulraj hedonic treadmill' just to test it, the first result is this: kulraj.org/tag/hedonic-treadmill. It shows the shortened version of the article that sits within the tag page. [I'm new to SEO and Moz, please keep that in mind.] Moz has told me I have duplicate content regarding my main blog page and tag page, which is true; the content is duplicated. However, the actual blog post itself is not displayed anywhere else on the website, or anywhere else on the web. Moz confirms this and reports no duplicate content warning. My questions, therefore, are: 1. How do I actually go about installing a rel canonical tag within a standard WordPress dashboard (I'm using the Genesis Framework)? I'm having great difficulty finding instructions on this anywhere on the web, and I clearly need to fix the issue with the blog page and tag page. 2. Why would my blog post be omitted, and are there any suggestions I could implement to bring it into the main search results? Other things I've noticed: 1. If I type in this URL: kulraj.org/hedonic-treadmill, it automatically redirects to http://kulraj.org/2014/07/15/hedonic-treadmill/ 2. Inside Google Webmaster Tools it says: no new messages or recent critical issues. 3. Regarding the above, when I click 'Labs > Author stats' within Webmaster Tools, it shows nil stats, so something there is not quite right either, even though Google+ authorship is confirmed.
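A hedged sketch (URLs taken from the question; assumes the requests and beautifulsoup4 packages) that reports whether each page currently exposes a rel=canonical tag and where it points, which is a quick way to confirm the fix once it is in place:

```python
import requests
from bs4 import BeautifulSoup

# URLs taken from the question.
PAGES = [
    "http://kulraj.org/2014/07/15/hedonic-treadmill/",
    "http://kulraj.org/tag/hedonic-treadmill",
]

for url in PAGES:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    target = canonical.get("href") if canonical else "none found"
    print(f"{url}\n  rel=canonical -> {target}")
```

On WordPress, these tags are usually emitted by an SEO plugin or by the theme via the wp_head hook rather than edited by hand, so checking the rendered HTML first shows whether anything actually needs changing.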
Technical SEO | | Kulraj0 -
Are Collapsible DIVs SEO-Friendly?
When I have a long article about a single topic with sub-topics, I can make it more user-friendly by limiting the text and hiding the rest, showing just the next headlines via expandable/collapsible divs. My doubt is whether Google is really able to read onclick text links (with JavaScript), or whether it could be seen as hidden text. I think I read in the SEOmoz Users Guide that JavaScript-"manipulated" content will not be crawled. So from SEOmoz's point of view, should I instead make use of old-school named anchors and a side navigation to jump to the sub-topics? (I had a similar question in my post before, but I did not use the right terms to describe what I really wanted. Also, my text is not so long (<1000 words) that I should use pagination with rel="next" and rel="prev" attributes.) THANKS for every answer 🙂
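The crawlability question largely comes down to whether the collapsed text is present in the HTML the server sends (and merely hidden until the click) or is only injected into the DOM by JavaScript when clicked. A small sketch, with a placeholder URL and sample sentence and assuming the requests package, to check which case applies:

```python
import requests

URL = "https://www.example.com/long-article"               # placeholder article URL
SAMPLE_SENTENCE = "text from inside a collapsed section"   # placeholder snippet

# Fetch the page the way a non-rendering crawler would: no JavaScript executed.
raw_html = requests.get(URL, timeout=10).text

if SAMPLE_SENTENCE in raw_html:
    print("Collapsed text is in the initial HTML - it is only hidden visually.")
else:
    print("Collapsed text is NOT in the initial HTML - it is injected on click, "
          "so a non-rendering crawler never sees it.")
```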
Technical SEO | | inlinear0 -
What is the best approach to specifying a page's language?
I have read about a number of different tags that can accomplish this so it is very confusing. For example, should I be using: OR
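A hedged sketch (placeholder URL, assuming the requests and beautifulsoup4 packages) that reports the three language signals pages commonly rely on: the lang attribute on the html element, the Content-Language response header, and any hreflang link tags:

```python
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"  # placeholder page

resp = requests.get(URL, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# 1. lang attribute on the root <html> element.
html_tag = soup.find("html")
print("html lang attribute:", html_tag.get("lang") if html_tag else "no <html> tag parsed")

# 2. Content-Language HTTP response header.
print("Content-Language header:", resp.headers.get("Content-Language", "not set"))

# 3. hreflang annotations for language/region alternates.
for link in soup.find_all("link", rel="alternate", hreflang=True):
    print("hreflang:", link["hreflang"], "->", link.get("href"))
```

Declaring the language on the html element is the standard HTML mechanism; hreflang annotations only come into play once there are language or regional alternates of the same page.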
Technical SEO | | BlueLinkERP0 -
Should we use Google's crawl delay setting?
We've been noticing a huge uptick in Google's spidering lately, and along with it a notable worsening of render times. Yesterday, for example, Google spidered our site at a rate of 30:1 (Google spider vs. organic traffic). In other words, for every organic page request, Google hits the site 30 times. Our render times have lengthened to an average of 2 seconds (and up to 2.5 seconds). Before this renewed interest Google has taken in us, we were seeing closer to one-second average render times, and often half of that. A year ago, the ratio of spider to organic was between 6:1 and 10:1. Is requesting a crawl delay from Googlebot a viable option? Our goal would be only to reduce Googlebot traffic, and hopefully improve render times and organic traffic. Thanks, Trisha
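For what it's worth, a short sketch (placeholder domain, Python standard library only) showing how a Crawl-delay directive in robots.txt would be read. Google's robots.txt documentation lists Crawl-delay as a rule Googlebot ignores, so the directive mainly affects other crawlers that honor it, such as Bingbot; Googlebot's rate is adjusted through the Webmaster Tools crawl-rate setting instead.

```python
from urllib import robotparser

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder domain

# Example robots.txt stanza this would correspond to:
#   User-agent: *
#   Crawl-delay: 10
rp = robotparser.RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()

for agent in ("Googlebot", "bingbot", "*"):
    print(agent, "crawl delay:", rp.crawl_delay(agent))
```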
Technical SEO | | lzhao0