Why are my pages getting duplicate content errors?
-
Studying the Duplicate Page Content report reveals that all (or many) of my pages are getting flagged as having duplicate content because the crawler thinks there are two versions of the same page:
http://www.mapsalive.com/Features/audio.aspx
http://www.mapsalive.com/Features/Audio.aspx
The only difference is the capitalization. We don't have two versions of the page so I don't understand what I'm missing or how to correct this. Anyone have any thoughts for what to look for?
-
Dr. Pete doesn't cover case (though it's mentioned in the comments), but just about everything else you might want to know about duplicate content is covered at http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world, including ways to remedy it. It sounds like you've got a plan here, but I'm also adding it for the benefit of others looking at this thread.
-
I think this is one of the most overlooked duplicate content issues. Not sure why it isn't talked about more. I've often used upper and lowercase inconsistently, e.g., mysite.com/Las-Vegas/ and mysite.com/las-vegas/, not knowing it made any difference.
I guess a .htaccess rewrite to all lowercase is in order. Thanks SEOMoz. You guys rock.
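For anyone setting this up on Apache, here's a minimal sketch of a site-wide lowercase redirect, assuming mod_rewrite is enabled. One caveat: the RewriteMap directive is only valid in the main server or virtual-host config, not in .htaccess, so the map has to be defined there even if the rule itself lives in .htaccess.

```apache
# Server or vhost config only: define a lowercasing map (not valid in .htaccess)
RewriteMap lc int:tolower

# Vhost or .htaccess (once the map above exists):
RewriteEngine On
# If the requested path contains any uppercase letter...
RewriteCond %{REQUEST_URI} [A-Z]
# ...301-redirect to the lowercased version of the same path
RewriteRule (.*) ${lc:$1} [R=301,L]
```

With this in place, a request for /Las-Vegas/ gets a permanent redirect to /las-vegas/ automatically, with no per-page rules to maintain.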
-
Glad to be of help, Janice.
From a readability perspective, I'd also suggest going with all lowercase.
-
Well, it is a Windows server and my understanding is that it is case-insensitive, but I'll verify this with our hosting provider. Nevertheless, would it be preferable to set up the rewrite from the mixed case names to all lowercase names or vice versa? Or perhaps it doesn't matter.
Thanks for your help with this - lots to learn and work through with these tools.
-
If the server allows upper case and lower case then from a technical perspective they could both be different files. Like having www.domain.com and domain.com point to the same home page - they may be the same, but technically they could be two different places.
The solution should be automatic: it shouldn't require setting up a new rewrite every time a page is created.
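To illustrate why the crawler counts these separately: by the URI standard, the scheme and host of a URL are case-insensitive, but the path is not, so two paths differing only in case are technically distinct resources. A small sketch (Python here purely for illustration, not part of any Moz tooling) of collapsing case variants to one canonical form:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url: str) -> str:
    """Collapse case variants of a URL to one canonical, all-lowercase form.

    Scheme and host are case-insensitive by spec; lowercasing the path is a
    site policy choice, valid only when the server treats paths
    case-insensitively (as IIS typically does).
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    return urlunsplit((scheme.lower(), netloc.lower(), path.lower(), query, fragment))

# Both case variants from the question map to the same canonical URL:
print(normalize_url("http://www.mapsalive.com/Features/Audio.aspx"))
# -> http://www.mapsalive.com/features/audio.aspx
```

A rewrite rule at the server level is doing exactly this normalization, just on every incoming request instead of in application code.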
-
I understand your answer and about setting up rewrites, but what I really want to know is why there are two pages listed (one uppercase, one lowercase) when there is only one physical page on the site. All links within the site point to the page using the uppercase name.
I don't want to have to add a rewrite for the lowercase name every time I add a page to the site - this doesn't seem right which is why I'm wondering if there is something else wrong.
-
Janice,
The proper solution is to set the site up at the server level to automatically rewrite URLs into one consistent pattern (typically all lowercase), and to make sure all internal links pointing to other pages on the site use that preferred capitalization. Canonical tags can help alleviate the problem, but relying on them alone isn't a best practice. So speak with the site administrator or programmer to get the rewrite functionality implemented.
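Since Janice's site runs on IIS (those are .aspx pages), a sketch of how this is commonly done there: a web.config rule using the URL Rewrite module's ToLower function, assuming that module is installed. The rule name is arbitrary.

```xml
<!-- web.config: 301-redirect any URL containing uppercase letters to its lowercase form -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="LowercaseRedirect" stopProcessing="true">
        <!-- ignoreCase="false" so the pattern actually matches uppercase letters -->
        <match url=".*[A-Z].*" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{R:0}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

This meets the "automatic" requirement above: new pages need no extra configuration, since any mixed-case request is redirected on the fly.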
Related Questions
-
Duplicate content issue
Hi, A client of ours has one URL for the moment (https://aalst.mobilepoint.be/) and wants to create a second one with exactly the same content (https://deinze.mobilepoint.be/). Will that mean Google punishes the second one because of duplicate content? What are the recommendations?
-
Handling of Duplicate Content
I just recently signed up and joined the moz.com system. During the initial report for our web site it shows we have lots of duplicate content. The web site is real estate based and we are loading IDX listings from other brokerages into our site. Even though these listings look alike, they are not. Each has its own photos, description and address. So why do they appear as duplicates? I would assume that they are all too closely related. Lots for Sale primarily, and it looks like lazy agents have 4 or 5 lots and input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content. You are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our web site it shows 40 of them are duplicates.
-
Duplicate Content issue in Magento: the product page is available through 3 URLs! How can we solve this?
Right now the product page "gedroogde goji bessen" (Dutch for: dried goji berries) is available through 3 URLs:
http://www.sportvoeding.net/gedroogde-goji-bessen => by clicking on the product slider on the homepage
http://www.sportvoeding.net/superfood/gedroogde-goji-bessen => by first going to sportvoeding.net/superfood (main category) and then clicking on "gedroogde Goji bessen"
http://www.sportvoeding.net/superfood/goji-bessen/gedroogde-goji-bessen => by going directly to the subcategory "Goji Bessen" through the menu and clicking on "gedroogde Goji Bessen"
We want to have the following product URL: http://www.sportvoeding.net/superfood/goji-bessen/gedroogde-goji-bessen
Does someone know a good extension for this issue?
-
Duplicate Page Title
Hi, I just got back my first crawl report and there were plenty of errors. I know this has been asked before, but I am a newbie here so bear with me. I captured the video. Any ideas on how to address the issue? ktXKDxRttK
-
Avoiding Cannibalism and Duplication with content
Hi, For the example I will use a computers e-commerce store... I'm working on creating guides for the store:
How to choose a laptop
How to choose a desktop
I believe that each guide will be great on its own and that it answers a specific question (meaning that someone looking for a laptop will search specifically for laptop info, and the same goes for desktop). This is why I didn't create a "How to choose a computer" guide. I also want each guide to have all the information and not to start sending the user to secondary pages in order to fill in missing info. However, even though several details differ between laptops and desktops, like the importance of weight, screen size etc., a lot of the checklist items (like deciding on how much memory is needed, graphics card, core etc.) are the same. Please advise on how to pursue it. Should I just write two guides and make sure that the same duplicated content ideas are simply written in a different way?
-
Duplicate Page Content for sorted archives?
Experienced backend dev, but SEO newbie here 🙂 When SEOmoz crawls my site, I get notified of DPC errors on some list/archive sorted pages (appending ?sort=X to the url). The pages all have rel=canonical to the archive home. Some of the pages are shorter (have only one or two entries). Is there a way to resolve this error? Perhaps add rel=nofollow to the sorting menu? Or perhaps find a method that utilizes a non-link navigation method to sort / switch sorted pages? No issues with duplicate content are showing up on google webmaster tools. Thanks for your help!
-
How do I get rid of duplicate content
I have a site that is new but I managed to get it to page one. Now when I scan it on SEOmoz I see that I have duplicate content. Ex: www.mysite.com, www.mysite.com/index and www.mysite.com/. How do I fix this without jeopardizing my SERP rankings? Any tips?
-
Complex duplicate content question
We run a network of three local web sites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is that the search engines only index the businesses in the directory that are actually located in the place each site is focused on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com and businesses in Prestbury only get indexed on prestbury.com, but all businesses have a listing page on each site. What would be the most effective way to do this? I have been using rel canonical but Google does not always seem to honour this. Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name and using robots.txt be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge Would changing this have any SEO benefit? Thanks Martin