Getting SEOMoz reports to ignore certain parameters
-
I want the SEOMoz reports to ignore duplicate content caused by link-specific parameters being added to URLs (the same page is reachable from different pages, with tracking parameters identifying the source page appended to the URL). I can get Google and Bing Webmaster Tools to ignore parameters I specify. I need to get the SEOMoz tools to do it as well!
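While waiting on a tool-level setting, one common workaround (assuming you can edit the page templates; the URL below is a placeholder, not from the question) is to declare the parameter-free URL as canonical, which search engines honor and which crawl tools generally use to collapse duplicates:

```html
<!-- In the <head> of every parameterized variant of a page,
     e.g. http://www.example.com/page?src=sidebar -->
<link rel="canonical" href="http://www.example.com/page" />
```

This fixes the duplication at the source rather than only hiding it from one report.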
-
Hey There,
Thanks for writing in! This question is better suited for our help team at SEOmoz. I am going to create a ticket with your email address and this issue so we can take a closer look at it over there. If you have any other tool-related questions or problems with billing, etc., please email us at help@seomoz.org. We will gladly help you with any of those issues over there.
If you have any SEO-related questions, please don't hesitate to ask on the Q&A community site!
Have a good one!
Nick
Related Questions
-
Duplicate page titles in SEOMoz
My on-page reports are showing a good number of duplicate title tags, but they are all caused by a URL tracking parameter that tells us which link the visitor clicked on. For example, http://www.example.com/example-product.htm?ref=navside and http://www.example.com/example-product.htm are the same page, but are treated as two different URLs in SEOMoz. This is inflating the number of duplicate page titles in my reports. This has not been a problem with Google, but SEOMoz is treating it this way and it's confusing my data. Is there a way to specify this as a URL parameter in the Moz software? Or does anybody have another suggestion? Should I specify this in GWT and BWT?
Moz Pro | | InetAll0 -
Difference in data between http://pro.seomoz.org/tools/keyword-difficulty and http://lsapi.seomoz.com/linkscape/url-metrics/
Hi, has anyone else experienced any difference in data between http://lsapi.seomoz.com/linkscape/url-metrics/ and http://pro.seomoz.org/tools/keyword-difficulty? Please look at the attached image. For "http://www.webmd.com/diet/guide/choosing-weight-loss-program" and "http://www.freedieting.com/", page authority and domain authority match exactly. But for "http://www.fitnessmagazine.com/weight-loss/plans/" the data does not match. The data from "http://lsapi.seomoz.com/linkscape/url-metrics/" was retrieved barely 60 seconds after the data from "http://pro.seomoz.org/tools/keyword-difficulty". We used our custom app to retrieve data from "http://lsapi.seomoz.com/linkscape/url-metrics/". The columns were matched against the specs given in "http://apiwiki.seomoz.org/w/page/13991153/URL-Metrics-API". We are retrieving the following columns: 1) ut (Title), 2) ueid (External Links), 3) uid (Links), 4) umrp (mozRank), 5) upa (Page Authority), 6) pda (Domain Authority). Any help will be greatly appreciated. zvFif.jpg
Moz Pro | | claytons0 -
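For context on the question above: the URL-metrics endpoint selects columns through a single `Cols` bitmask built by OR-ing one flag per column. The flag values below are illustrative placeholders, not the documented ones; look up the real values on the URL-Metrics-API wiki page linked in the question. A minimal sketch:

```python
# Building a Cols bitmask for the Linkscape URL-metrics API.
# NOTE: the flag values here are PLACEHOLDERS for illustration only --
# check the API wiki for the actual documented bit values.
COLS = {
    "ut": 1,              # Title (placeholder value)
    "ueid": 32,           # External links (placeholder value)
    "uid": 2048,          # Links (placeholder value)
    "umrp": 16384,        # mozRank (placeholder value)
    "upa": 34359738368,   # Page Authority (placeholder value)
    "pda": 68719476736,   # Domain Authority (placeholder value)
}

def cols_bitmask(names):
    """OR together the flags for the requested columns."""
    mask = 0
    for name in names:
        mask |= COLS[name]
    return mask

# The six columns the question retrieves:
mask = cols_bitmask(["ut", "ueid", "uid", "umrp", "upa", "pda"])
print(mask)
```

If the app's bitmask and the Pro tool's request differ even slightly, they can fetch different metric sets, which is worth ruling out before blaming index freshness.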
SEOmoz crawler bug?
I just noticed that a few of my campaigns show only 1 page crawled. Can someone tell me what this is? Out of 5 campaigns, 2 have only one page crawled, one of which is an online shop with over 2,000 products 🙂
Moz Pro | | mosaicpro0 -
How to remove duplicate content due to URL parameters from SEOMoz Crawl Diagnostics
Hello all, I'm currently getting back over 8,000 crawl errors for duplicate content pages. It's a Joomla site with VirtueMart, and 95% of the errors are for parameters in the URL that the customer can use to filter products. Google is handling them fine under the Webmaster Tools parameter settings, but it's pretty hard to find the other duplicate content issues in SEOMoz with all of these in the way. All of the problem parameters start with ?product_type_ Should I try to use robots.txt to stop them from being crawled, and if so, what would be the best way to include them in robots.txt? Any help greatly appreciated.
Moz Pro | | dfeg0 -
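On the robots.txt part of the question above: assuming the goal is only to keep Moz's crawler (rogerbot) away from the filter URLs while leaving Google's parameter handling as-is, a wildcard rule like the following is one option (verify rogerbot's wildcard support in Moz's documentation before relying on it):

```
# Keep SEOmoz's crawler out of the filter-parameter URLs only
User-agent: rogerbot
Disallow: /*?product_type_
```

A rel=canonical from the filtered URLs to the base product page is generally the safer long-term fix, since blocked URLs can still be linked to and indexed.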
About Duplicate Content found by SEOMOZ... that is not duplicate
Hi folks, I am hunting for duplicate content using SEOmoz's great tool for that 🙂 I have some pages that are flagged as duplicates, but I can't say why. They are video pages. The content is minimal, so I guess it might be because all the navigation is the same, but for instance http://www.nuxeo.com/en/resource-center/Videos/Nuxeo-World-2010/Nuxeo-World-2010-Presentation-Thierry-Delprat-CTO and http://www.nuxeo.com/en/resource-center/Videos/Nuxeo-World-2010/Nuxeo-World-2010-Presentation-Cheryl-McKinnon-CMO are flagged as duplicates. Any idea? Is it hurting? Cheers,
Moz Pro | | nuxeo0 -
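The questioner's own guess is likely right: when a crawler compares whole-page HTML, thin pages whose shared template (navigation, header, footer) dwarfs the unique content can look nearly identical. A rough illustration of the idea using character-level similarity (the comparison method and threshold are assumptions for demonstration, not Moz's actual algorithm, and the markup is hypothetical):

```python
from difflib import SequenceMatcher

# Two thin pages: a large shared template plus a tiny unique snippet each.
# (Hypothetical markup -- not the actual nuxeo.com pages.)
template = "<nav>" + "menu item " * 200 + "</nav>"
page_a = template + "<h1>Talk by Thierry Delprat</h1>"
page_b = template + "<h1>Talk by Cheryl McKinnon</h1>"

# Ratio of matching characters; 1.0 means identical documents.
similarity = SequenceMatcher(None, page_a, page_b).ratio()

# With so little unique content, the pages come out nearly identical,
# which is how a duplicate-content check can flag distinct pages.
print(f"{similarity:.3f}")
```

Adding unique on-page text (a transcript or summary per video) lowers the ratio and usually clears this kind of false positive.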
How often do SEOmoz reports get refreshed?
Is there a way I can refresh the reports manually instead of waiting for it to pull the updated data?
Moz Pro | | RBA0 -
How do I get my crawl report?
I received a message with a link saying that my crawl report is complete. I went to the link, but when I click on the icon that has the report name and the completed check mark, nothing happens. I looked around and can't find the results. I need to bid on this job, so it would be helpful to know where to get it. Thanks for all you do. Wickey
Moz Pro | | Wickey0 -
How to track a website in several languages with SEOmoz
Hi, We have a customer with a website in EN, FR and ES. They used Joomfish, so each language is in a subdirectory: sitename/en, sitename/fr, sitename/es. They want their website to rank well for all those languages and countries: English, French, Spanish, German and Italian. It is a website for a specific affiliation, which is why there are no barriers. What do I need to do to use SEOmoz the best way? For the moment I created one campaign following Google US, Google Germany and Google France. To go deeper, would I need to create different campaigns in my account? Also, will your robot be able to recognize the different subdirectories and languages? And to improve the SEO of this website, wouldn't it be better to have 3 domain names, one for each country? Thanks a lot in advance for your answer, Anne
Moz Pro | | ahernoux1
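On the subdirectory question above: language subdirectories are a workable setup, and hreflang annotations help search engines serve the right variant to each audience. A minimal sketch, assuming the Joomfish URL layout described in the question ("sitename" and "page" are placeholders):

```html
<!-- In the <head> of each language version of a page,
     every version lists all of its alternates, itself included -->
<link rel="alternate" hreflang="en" href="http://sitename/en/page" />
<link rel="alternate" hreflang="fr" href="http://sitename/fr/page" />
<link rel="alternate" hreflang="es" href="http://sitename/es/page" />
```

This avoids needing three separate domains, which would split link equity across sites.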