Why are my pages getting duplicate content errors?
-
Studying the Duplicate Page Content report reveals that all (or many) of my pages are getting flagged as having duplicate content because the crawler thinks there are two versions of the same page:
http://www.mapsalive.com/Features/audio.aspx
http://www.mapsalive.com/Features/Audio.aspx
The only difference is the capitalization. We don't have two versions of the page, so I don't understand what I'm missing or how to correct this. Does anyone have thoughts on what to look for?
-
Dr. Pete doesn't cover case (though it's mentioned in the comments), but just about everything else you might want to know about duplicate content is covered at http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world, including ways to remedy it. It sounds like you've got a plan here, but I'm adding it for the benefit of others reading this thread.
-
I think this is one of the most overlooked duplicate content issues, and I'm not sure why it isn't talked about more. I've quite often used upper and lower case interchangeably, e.g., mysite.com/Las-Vegas/ and mysite.com/las-vegas/, not knowing it made any difference.
I guess a .htaccess rewrite to all lowercase is in order. Thanks SEOMoz. You guys rock.
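For Apache hosts, a common sketch of that lowercase rewrite uses mod_rewrite's `RewriteMap` with the built-in `int:tolower` function. Note that `RewriteMap` must be declared in the server or virtual-host config, not in .htaccess itself, so this goes in the vhost (the map name `lowercase` is arbitrary):

```apache
# Apache virtual host config (RewriteMap cannot be declared in .htaccess)
RewriteEngine On
RewriteMap lowercase int:tolower

# 301-redirect any request whose path contains an upper-case letter
# to its all-lowercase equivalent
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ ${lowercase:$1} [R=301,L]
```

One caveat: if the site serves static files with upper-case letters in their names, exclude those paths from the rule first, or the redirect will break them.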
-
Glad to be of help, Janice.
From a readability perspective, I'd also suggest going with all lower case.
-
Well, it is a Windows server, and my understanding is that it's case-insensitive, but I'll verify this with our hosting provider. Either way, would it be preferable to set up the rewrite from the mixed-case names to all-lowercase names, or vice versa? Or perhaps it doesn't matter.
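Since this is a Windows server, the Apache approach doesn't apply; on IIS the equivalent (assuming the URL Rewrite module is installed) is a web.config rule using the built-in `ToLower` function. This is a sketch, not something tested against this particular site:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301-redirect any URL containing an upper-case letter
             to its all-lowercase form -->
        <rule name="LowercaseUrls" stopProcessing="true">
          <match url="[A-Z]" ignoreCase="false" />
          <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

As with the Apache version, paths to static assets with upper-case filenames would need to be excluded before this goes live.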
Thanks for your help with this - lots to learn and work through with these tools.
-
If the server distinguishes upper case from lower case, then from a technical perspective the two URLs could point to two different files. It's like www.domain.com and domain.com both serving the home page: they may show the same content, but technically they could be two different places.
The solution should not require adding a rewrite by hand every time a new page is created; it should be automatic.
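This also explains why the crawler reports two pages: by the URL standard, paths are case-sensitive, so a crawler treats the two casings as distinct URLs unless something normalizes them. A minimal sketch of that normalization (hypothetical helper, not part of any Moz tool):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_path_case(url: str) -> str:
    """Lowercase the scheme, host, and path of a URL.

    Query strings and fragments are left alone, since their
    values can legitimately be case-sensitive.
    """
    parts = urlsplit(url)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path.lower(), parts.query, parts.fragment))

print(normalize_path_case("http://www.mapsalive.com/Features/Audio.aspx"))
# → http://www.mapsalive.com/features/audio.aspx
```

If a single external link (or internal typo) uses the other casing, the crawler discovers both variants, which is why the reports show two pages even though only one physical file exists.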
-
I understand your answer and about setting up rewrites, but what I really want to know is why there are two pages listed (one uppercase, one lowercase) when there is only one physical page on the site. All links within the site point to the page using the uppercase name.
I don't want to have to add a rewrite for the lowercase name every time I add a page to the site - this doesn't seem right which is why I'm wondering if there is something else wrong.
-
Janice,
The proper solution is to have the site set up at the server level to automatically rewrite URLs into one consistent pattern (typically all lower case), and to make sure all internal links pointing to other pages on the site use that preferred capitalization. Canonical tags can help alleviate the problem, but they aren't a best-practice solution on their own. So speak with the site administrator or programmer to get the rewrite functionality implemented.
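For anyone who does want the canonical-tag stopgap mentioned above, a minimal sketch (using the URL from this thread as an illustrative example) looks like:

```html
<!-- In the <head> of the page, served under either casing -->
<link rel="canonical" href="http://www.mapsalive.com/features/audio.aspx" />
```

This tells search engines which casing to consolidate ranking signals to, but it doesn't stop crawlers from fetching both variants, which is why the server-level 301 redirect remains the preferred fix.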