Which tools are better: SEOmoz's tools or Bruce Clay's tools?
-
I've ALWAYS wanted to hear some discussion on this. Please give me your honest opinions so I can make the right decision.
-
I've used both the Bruce Clay and SEOmoz toolsets. The one thing I really liked about Bruce Clay's tools is that I can punch in a URL and get a full analysis report within minutes; SEOmoz takes a week, which is a major drawback in my opinion. On the other hand, SEOmoz has a great Q&A forum, so that's a major plus.
-
You are most welcome, Aspirant!
In my case, the parameters are disregarded because the same page is listed 15 times, once for each sort order. Examples of the parameters involved are ?order=asc and ?type=authorname.
In that case, once Google has evaluated the main page, there is no reason for it to look any further. The other pages offer identical content, just sorted differently, so I instruct Google to disregard those pages.
If you wish to tell Google to ignore parameter pages, take the following steps: log into Google WMT, select your domain, then go to Site Configuration > Settings > Parameter Handling. Choose your parameter from the list (in your case, ?from is not on the list, so you would add it), then change the Action to Ignore.
Without seeing your pages I am not sure whether the above is the best approach for you, but if you want to use the process, here is the information. The process for Bing is very similar to Google's.
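To illustrate what "ignore this parameter" means in practice (this is just a sketch of the idea, not anything WMT itself runs, and the parameter names are assumed from the examples above), you can think of it as normalizing URLs so that all the sorted variants collapse to one canonical URL:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters assumed to carry no distinct content (hypothetical list
# based on the examples in this thread).
IGNORED_PARAMS = {"order", "type", "from"}

def canonicalize(url):
    """Strip ignored query parameters so sorted/paginated variants
    collapse to one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("http://www.example.com/articles/?from=10"))
# -> http://www.example.com/articles/
```

Once Google treats the parameter this way, all 15 sort-order variants count as the one main page.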
-
Thank you for your reply, Journeyman! Right now I have an article directory on one of my sites. SEOmoz crawl errors are reporting the following:
www.sitename.com/articles/?from=0
www.sitename.com/articles/?from=10
www.sitename.com/articles/?from=100
www.sitename.com/articles/?from=105
etc., up to www.sitename.com/articles/?from=225
The crawl errors are for duplicate page titles and duplicate page content. You said, "In Google WMT and Bing I have provided instructions to disregard the parameter pages and only index the main page."
Do I need to do the same thing you're doing with the parameters in WMT? If so, do you have a link to Google documentation that shows me how and explains parameters a little further?
-
Specifically in regard to crawl errors: I try to use SEO tools from a multitude of different sources to see whether the errors (if any) they return are comparable. If I see the same errors across multiple tools, then I usually drill down and fix the problem at that point.
-
I used some of Bruce Clay's tools a while ago; there weren't as many tools as you get here, though he might have more now. I found that I preferred the SEOmoz tools anyway. I still think you get some great and valuable info from Bruce Clay Inc., but overall I'd say SEOmoz is best for tools and data. Just my opinion.
-
How often are the tools the deciding factor in what you do?
-
My feelings are the same as Richard's.
I only use the SEOmoz tools, but I would like to learn about other tools that overcome some of their limitations.
For example, I don't like the SEOmoz crawl reports; they offer no customization. I have category pages that accept sort-order parameters. In Google WMT and Bing I have provided instructions to disregard the parameter pages and index only the main page. I need a crawl tool that can either pull these settings or allow me to set them in the tool.
A good crawl report is one that provides actionable data. Seeing the same 3,000 error records every week is annoying, and it makes it harder to drill down to the few pages that actually require attention. It would be nice to share the crawl report with a client without having to say, "just disregard those 3,000 error messages; they are not important."
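Until a crawl tool supports this natively, the filtering described above can be sketched in a few lines. This assumes a hypothetical CSV export with a `url` column and a hand-maintained list of parameters already handled in WMT; it is a workaround sketch, not a feature of any particular tool:

```python
import csv
import io
from urllib.parse import urlparse, parse_qsl

# Parameters already declared as "ignore" in Google/Bing webmaster
# tools (assumed list for this example).
HANDLED_PARAMS = {"order", "from"}

def is_actionable(url):
    """Return False for URLs whose only 'problem' is a query parameter
    that search engines were already told to ignore."""
    params = {k for k, _ in parse_qsl(urlparse(url).query)}
    return not params or not params.issubset(HANDLED_PARAMS)

# Stand-in for a real crawl export (hypothetical format).
crawl_export = io.StringIO(
    "url,issue\n"
    "http://www.example.com/articles/?from=10,duplicate title\n"
    "http://www.example.com/contact,missing title\n"
)
actionable = [row for row in csv.DictReader(crawl_export) if is_actionable(row["url"])]
print(len(actionable))  # -> 1: only the genuinely broken page remains
```

Filtering this way before sending the report to a client removes the "please ignore those 3,000 rows" conversation.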
-
I am giving a thumbs up to this post, as I am also interested in the findings. I have only used, and plan to keep using, SEOmoz, as their tools are very good and getting better.
The biggest issue I have with SEOmoz is that the old tools have not been incorporated into the new campaigns. I am sure that is being worked on.