Duplicate Page Content for sorted archives?
-
Experienced backend dev here, but an SEO newbie.
When SEOmoz crawls my site, I get notified of Duplicate Page Content (DPC) errors on some sorted list/archive pages (those with ?sort=X appended to the URL).
The pages all have rel=canonical to the archive home.
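For reference, the head of one of the sorted pages looks roughly like this (placeholder URLs, not my real site):

```html
<!-- Hypothetical sorted page: https://www.example.com/archive?sort=date -->
<head>
  <title>Archive (sorted by date)</title>
  <!-- Canonical points back at the unsorted archive home -->
  <link rel="canonical" href="https://www.example.com/archive" />
</head>
```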
Some of the pages are shorter (have only one or two entries).
Is there a way to resolve this error? Should I add rel=nofollow to the sorting menu, or use a non-link navigation method to switch between sorted views?
No issues with duplicate content are showing up in Google Webmaster Tools.
Thanks for your help!
-
Cool, I'll dig them out and fire them over. Thanks. Marcus
-
If you've got a couple of examples, I'd be happy to take a look and/or ping the Product team. Just DM me or email at [peter at seomoz dot org].
-
Hey Peter
I have had this problem a couple of times where the crawler is not picking up on the canonical tag. Any ideas what could cause that?
Cheers
Marcus
-
Actually, except in isolated cases, I believe we do parse rel=canonical. I know there are some weird exceptions (such as conflicting canonical tags).
Agreed, though, that the canonical itself should be fine in this case. You could also keep the sorting out of the crawl entirely by putting it in a pull-down, for example (so no crawlable link to the sorted URL is created at all). Another option is to block the sort= parameter in Google Webmaster Tools. Generally, though, I prefer canonical over GWT for this.
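For example, here is a minimal sketch of the pull-down idea, assuming a hypothetical /archive page and sort= parameter. The sort options are plain option values rather than crawlable links, so the ?sort= URLs are never exposed as links for a crawler to follow:

```html
<!-- Hypothetical sort control: no <a href="?sort=..."> links for the crawler to discover -->
<select onchange="if (this.value) window.location = '/archive?sort=' + this.value;">
  <option value="">Sort by...</option>
  <option value="date">Date</option>
  <option value="title">Title</option>
</select>
```

Keeping the rel=canonical on the sorted pages as well still covers any ?sort= URLs that get discovered some other way.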
-
Thank you, Marcus, that helped a great deal!
They both show the correct URL, so I'll disregard those errors.
-
Annoyingly, the SEOmoz crawler does not yet take rel=canonical into account, so it creates these misleading reports.
You can always check whether both pages are indexed in Google, to make sure only the correct one is being indexed:
info:www.url.com/page.php?sort=x
info:www.url.com/page.php
Both should show just the main URL without the ?sort=x name/value pair; if so, Google is correctly attributing the value of the sorted page to the archive home.
Hope that helps!
Marcus
Related Questions
-
Duplicate Content
I am trying to get a handle on how to fix and control a large amount of duplicate content I keep getting in my Moz reports. The main area where this comes up is duplicate page content and duplicate title tags ... thousands of them. I partially understand the source of the problem. My site mixes free content with content that requires a login. I think if I were to change my crawl settings to eliminate the login and index the paid content, it would lower the quantity of duplicate pages and help me identify the true duplicate pages, because a large number of duplicates occur at the site login. Unfortunately, it's not simple in my case, because last year I encountered a problem when migrating my archives into a new CMS. The app in the CMS that migrated the data caused a large amount of data truncation, which means that I am piecing together my archives of approximately 5,000 articles. It also means that much of the piecing-together process requires me to keep the former app that manages the articles, to find where certain articles were truncated and to copy the text that followed the truncation and complete the articles. So far, I have restored about half of the archives, which is time-consuming, tedious work. My question is whether anyone knows a more efficient way of identifying and editing duplicate pages and title tags?
Technical SEO | Prop650
-
Duplicate content and canonicalization confusion
Hello, http://bit.ly/1b48Lmp and http://bit.ly/1BuJkUR have the same content, and each page's canonical refers to the page itself. Yet they both rank in search engines. Is it because they have been targeted to different geographical locations? If so, the content is still the same. Please help me clear up this confusion. Regards
Technical SEO | IM_Learner0
-
Duplicate content question...
I have a high duplicate content issue on my website. However, I'm not sure how to handle or fix this issue. I have 2 different URLs landing on the same page content: http://www.myfitstation.com/tag/vegan/ and http://www.myfitstation.com/tag/raw-food/. In this situation, I cannot redirect one URL to the other, since in the future I will probably be adding additional posts to either the "vegan" tag or the "raw food" tag. What is the solution in this case? Thank you
Technical SEO | myfitstation0
-
Duplicate Content for Multiple Instances of the Same Product?
Hi again! We're set to launch a new inventory-based site for a chain of car dealers with various locations across the Midwest. Here's our issue: the different branches have overlap in the products that they sell, and each branch is adamant that its inventory comes up uniquely in site search. We don't want the site to get penalized for duplicate content; however, we don't want to implement rel=canonical because each product should carry the same weight in search. We've talked about having a basic URL for these product descriptions, with each instance of the inventory canonicalized to this main product, but it doesn't really make sense for the site structure to do this. Do you have any tips on how to ensure that these products (same description, new product from manufacturer) won't be penalized as duplicate content?
Technical SEO | newwhy0
-
How do I deal with Duplicate content?
Hi, I'm trying SEOmoz and it's saying that I've got loads of duplicate content. We provide phone numbers for cities all over the world, so we have pages like this: https://www.keshercommunications.com/Romaniavoipnumbers.html, https://www.keshercommunications.com/Icelandvoipnumbers.html, etc. One for every country. The question is, how do I create pages for each one without it showing up as duplicate content? Each page is generated by the server, but it's impossible to write unique text for each one. Also, the competition seem to have done the same, but Google is listing all their pages when you search for 'DID numbers'. Look for DIDWW or MyDivert.
Technical SEO | DanFromUK0
-
Avoiding duplicate content on product pages?
Hi, I'm creating a bunch of product pages for courses for a university and I'm concerned about duplicate content penalties. While the page names are different and some of the text is different, much of the text is the same between pairs of pages, i.e. a BA and an MA in a particular subject (say 'hairdressing') will have the same subject descriptions, school introduction paragraph, industry overview paragraph, etc. 1. Is this a problem? In a site with 100 pages, if sets of 2 pages have about 50% identical content... 2. If it is a problem, is there anything I can do, other than rewrite the text? 3. From a search perspective, would both pages show up in search results for searches related to 'hairdressing courses', 'study hairdressing', etc.? Thanks!
Technical SEO | AISFM0
-
Fixing Duplicate Pages Titles/Content
I have a DNN site for which I created friendly URLs; however, the creation of the friendly URLs then created duplicate page content and titles. I was able to fix all but two URLs with rel="canonical" links. But the two that are giving me the most issues are pointing to my homepage. When I added the rel="canonical" link, the page then became non-indexable. And for whatever reason, I can't add a 301 redirect to the homepage, because it then gives me a "can't display webpage" error message. I am new to SEO and to DNN, so any help would be greatly appreciated.
Technical SEO | VeronicaCFowler0
-
Local Search | Website Issue with Duplicate Content (97 pages)
Hi SEOmoz community. I have a unique situation where I'm evaluating a website that is trying to optimize better for local search, targeting 97 surrounding towns in its geographical location. What is unique about this situation is that the site ranks on the 1st and 2nd pages of the SERPs for its targeted keywords, has duplicate content on 97 pages of the site, and the search engines are still ranking it. I ran the website's URL through SEOmoz's Crawl Test Tool and it verified that the site has duplicate content on 97 pages and too many links (97) per page. Summary: the website has 97 duplicate pages representing each town, with each individual page listing and repeating all of the 97 surrounding towns, and each town is a link to a duplicate page. Question: I know that eventually the site will not get indexed by the search engines, and I'm not sure of the best way to resolve this problem. Any advice?
Technical SEO | ToddSEOBoston0