Which tools are better: SEOmoz's or Bruce Clay's?
-
I've ALWAYS wanted to hear some discussion on this, so please give me your honest opinion so I can make the right decision.
-
I've used both the Bruce Clay and SEOmoz toolsets. The one thing I really liked about Bruce Clay's tools is that I can punch in a URL and get a full analysis report within minutes; SEOmoz takes a week, which is a major drawback in my opinion. On the other hand, SEOmoz has a great Q&A forum, so that's a major plus.
-
You are most welcome, Aspirant!
In my case, I have the parameters disregarded because the same page would otherwise be listed 15 times, once for each sort order. Examples of the parameters involved are ?order=asc and ?type=authorname.
In that situation, once Google has evaluated the main page, there is no reason for it to look any further. The other pages offer identical content sorted differently, so I instruct Google to disregard them.
If you wish to tell Google to ignore parameter pages, take the following steps: log into Google WMT, select your domain, then go to Site Configuration > Settings > Parameter Handling. Choose your parameter from the list (in your case, ?from won't be on the list, so you would add it), then change the action to Ignore.
Without seeing your pages I am not sure whether the above is the best approach for you, but if you want to use it, that's the process. The process for Bing is very similar to Google's.
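As an alternative (or an addition) to the webmaster tools settings, a rel="canonical" tag on each sorted variant tells every search engine, not just Google and Bing, which version to index. A minimal sketch, using example.com and my sort parameters as placeholders:

```html
<!-- Placed in the <head> of each sorted variant, e.g.
     /articles/?order=asc or /articles/?type=authorname (placeholder URLs).
     It tells crawlers that the unparameterized page is the one to index. -->
<link rel="canonical" href="http://www.example.com/articles/" />
```

The same tag would also cover your ?from pages, provided they really do repeat the main page's content rather than paginate through different articles.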
-
Thank you for your reply, Journeyman! Right now I have an article directory on one of my sites, and SEOmoz crawl errors are reporting the following:
www.sitename.com/articles/?from=0
www.sitename.com/articles/?from=10
www.sitename.com/articles/?from=100
www.sitename.com/articles/?from=105
...and so on, up to www.sitename.com/articles/?from=225
The crawl errors are for duplicate page title and duplicate page content. You said, "In Google WMT and Bing I have provided instructions to disregard the parameter pages and only index the main page."
Do I need to do the same thing you're doing with the parameters in WMT? If so, do you have a link to a Google resource that shows me how and explains parameters a little further?
-
Specifically in regard to crawl errors: I try to use SEO tools from a multitude of different sources to see whether the errors (if any) they return are comparable. If I see the same errors across multiple tools, then I drill down and fix the problem at that point.
-
I used some of Bruce Clay's a while ago; there weren't as many tools as you get here, though he might have more now. I found that I preferred the SEOmoz tools anyway. You still get some great and valuable info from Bruce Clay Inc., but overall I'd say SEOmoz is best for tools and data. Just my opinion.
-
How often are the tools the deciding factor in what you do?
-
My feelings are the same as Richard's.
I only use the SEOmoz tools, but I would like to learn about other tools that overcome some of their limitations.
For example, I don't like the Moz crawl reports because they can't be customized. I have category pages that accept sort-order parameters. In Google WMT and Bing I have provided instructions to disregard the parameter pages and index only the main page; I need a crawl tool that can either pull those settings or allow me to set them in the tool itself.
A good crawl report is one that provides actionable data. Seeing the same 3,000 error records every week is annoying, and it makes it more difficult to drill down to the few pages that actually require attention. It would be nice to share the crawl report with a client without having to say "just disregard those 3,000 error messages, as they are not important".
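In the meantime, the exported crawl CSV can be post-processed before sharing it with a client. A rough Python sketch, assuming a crawl.csv export with a "URL" column and a hand-maintained list of parameters already handled in WMT (the column name and the parameter list are both assumptions):

```python
import csv
from urllib.parse import urlparse, parse_qs

# Parameters already flagged as "ignore" in Google WMT / Bing (assumed list).
IGNORED_PARAMS = {"order", "type", "from"}

def is_ignorable(url):
    """True if the URL differs from its base page only by ignored parameters."""
    params = parse_qs(urlparse(url).query)
    return bool(params) and set(params) <= IGNORED_PARAMS

# Copy the crawl export, keeping only rows a client actually needs to see.
with open("crawl.csv", newline="") as src, \
        open("crawl_filtered.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if not is_ignorable(row["URL"]):
            writer.writerow(row)
```

It isn't a substitute for the tool honoring the settings itself, but it keeps those 3,000 known records out of the client-facing report.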
-
I am giving a thumbs up to this post, as I am also interested in the findings. I have only used SEOmoz, and plan to keep using it, as their tools are very good and getting better.
The biggest issue I have with SEOmoz is that the old tools haven't been incorporated into the new campaigns. I am sure that is being worked on.
Related Questions
-
New Mozscape index released! Learn just what's been going on.
Hi everyone, I'm very, very pleased to announce that the latest Mozscape index has been released five days ahead of schedule, and it's back up to the quality you expect from Moz! It's no secret that we've struggled this year with our index. I can't overstate how much we appreciate everyone sticking with us—we know how hard this has been. The good news—well, apart from the early release—is that we've worked out the issue, and we expect to finally be back on track moving forward. Director of Engineering Martin York has shared a post to the Moz blog explaining just what's been going so wrong, and outlining our plan for the future. Thank you, from the bottom of our hearts, for being such an incredible community. As for the folks who are seeing Domain Authority and Page Authority drops since the index release, please see Rand's recent post for more.
Moz Pro | MattRoney
-
What's my best strategy for Duplicate Content if only www pages are indexed?
The Moz crawl report for my site shows duplicate content for both the www and non-www pages on the site. (Only the www pages are indexed by Google, however.) Do I still need to use a 301 redirect, even if the non-www pages are not indexed? Is rel=canonical less preferable, as usual?
Facts:
- The site is built using ASP.NET.
- The homepage has multiple versions, which use meta refresh tags to point to default.asp.
- Most links already point to www.
Current strategy:
- Set the preferred domain to www in Google's Webmaster Tools.
- Set the WordPress blog (which sits in a /blog subdirectory) with rel="canonical" pointing to the www version.
- Ask the programmer to add 301 redirects from the non-www pages to the www pages.
- Ask the programmer to use 301 redirects instead of meta refresh tags, pointing all homepage versions to www.site.org.
Does this strategy make the most sense? (Especially considering the non-indexed but existent non-www pages.) Thanks!!
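For reference, on an ASP.NET/IIS site the non-www to www 301 is typically a web.config rewrite rule rather than application code. A minimal sketch, assuming the IIS URL Rewrite module is installed and using site.org from the question as the domain:

```xml
<!-- In web.config, inside <system.webServer>; requires the IIS URL Rewrite module. -->
<rewrite>
  <rules>
    <rule name="Redirect non-www to www" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^site\.org$" />
      </conditions>
      <!-- redirectType="Permanent" issues the 301 -->
      <action type="Redirect" url="http://www.site.org/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

A second rule of the same shape could 301 the alternate homepage versions to default.asp, replacing the meta refresh tags.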
Moz Pro | kimmiedawn
-
Why does SEOMoz only crawl 1 page of my site?
My site is: www.thetravelingdutchman.com. It has quite a few pages, but for some reason SEOMoz only crawls one. Please advise. Thanks, Jasper
Moz Pro | Japking
-
Archiving Campaigns in SEOmoz
First off, I love the campaign archive feature. Very useful for my purposes. My question is: Is there a limit to how many campaigns I can archive? Thanks in advance!
Moz Pro | CollinJarman
-
SEOMoz only crawling 5 pages of my website
Hello, I've added a new website to my SEOmoz campaign tool, but it only crawls 5 pages of the site. I know the site has way more pages than this, and it also has a blog; Google shows at least 1,000 results indexed. Am I doing something wrong? Could it be that the site is preventing a proper crawl? Thanks, Bill
Moz Pro | wparlaman
-
SEOMoz Campaign Tool
I've noticed that the SEOmoz On-Page Analysis tool is still looking at an old URL. About two months ago I updated all of our category page URLs; previously they were stuffed with keywords and strange characters, and were really long. The on-page tool, though, is still referencing the old URLs for keywords, and I'm wondering why; I figure it's been long enough for it to recognize the new URLs. Is the pairing of a keyword and a URL saved and just graded on a weekly basis to produce the report? I had expected to see the new URLs by now, which are also represented in the sitemap. Around that same time I also added our Tell-a-Friend page and review pages to our robots.txt file so they would not be crawled, but I still see those pages come up in the errors report. Should this update as well?
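For reference, the robots.txt entries for that would look something like the lines below (the paths are hypothetical, since the actual URLs aren't shown in the question):

```
# Hypothetical paths for the Tell-a-Friend and review pages
User-agent: *
Disallow: /tellafriend/
Disallow: /reviews/
```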
Moz Pro | dgmiles
-
Any tools for scraping blogroll URLs from sites?
This question is entirely in the whitehat realm... Let's say you've encountered a great blog with a strong blogroll of 40 sites. That blogroll is interesting to you for any number of reasons, from link-building targets to simply subscribing in your feed reader. Right now it's tedious to extract the URLs from the site; there are some "save all links" tools, but they are messy. Are there any good tools that will (a) grab the blogroll (only) of any site into a list of URLs (yeah, OK, it might not be perfect, since some sites call it "sites I like", etc.), or (b) do the same but export it as OPML so you can subscribe? Thanks! Scott
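In the absence of a dedicated tool, a few lines of Python with requests and BeautifulSoup will pull the links out of a blogroll container. A rough sketch, assuming the blogroll sits in an element whose class contains "blogroll" (a common WordPress convention, though every theme differs):

```python
import requests
from bs4 import BeautifulSoup

def scrape_blogroll(page_url):
    """Return the href of every link inside elements whose class mentions 'blogroll'."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    urls = []
    # WordPress themes commonly mark blogrolls with class="blogroll" or "xoxo blogroll".
    for container in soup.find_all(class_=lambda c: c and "blogroll" in c):
        for link in container.find_all("a", href=True):
            urls.append(link["href"])
    return urls

if __name__ == "__main__":
    for url in scrape_blogroll("http://example.com/"):  # placeholder URL
        print(url)
```

Turning that list into OPML is a short step further, since OPML is just XML with one outline element per feed.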
Moz Pro | scottclark
-
What are the names of the SEOmoz and Open Site Explorer robots?!
I would like to exclude the SEOmoz and Open Site Explorer bots in robots.txt so that they don't index my sites… what are their names?
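For what it's worth, Moz's crawlers identify themselves as rogerbot (the campaign/site crawler) and dotbot (the crawler that feeds the Open Site Explorer link index), so excluding both would look like this:

```
# Block the SEOmoz campaign crawler
User-agent: rogerbot
Disallow: /

# Block the crawler behind the Open Site Explorer link index
User-agent: dotbot
Disallow: /
```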
Moz Pro | cezarylech