Why are my pages getting duplicate content errors?
-
Studying the Duplicate Page Content report reveals that all (or many) of my pages are getting flagged as having duplicate content because the crawler thinks there are two versions of the same page:
http://www.mapsalive.com/Features/audio.aspx
http://www.mapsalive.com/Features/Audio.aspx
The only difference is the capitalization. We don't have two versions of the page so I don't understand what I'm missing or how to correct this. Anyone have any thoughts for what to look for?
-
Dr. Pete's post doesn't cover case (though it's mentioned in the comments), but just about everything else you might want to know about duplicate content is covered at http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world, including ways to remedy it. It sounds like you've got a plan here, but I'm adding it for the benefit of others looking at this thread.
-
I think this is one of the most overlooked duplicate content issues - not sure why it isn't talked about more. I've often used upper and lower case interchangeably, e.g. mysite.com/Las-Vegas/ and mysite.com/las-vegas/, not knowing it made any difference.
I guess a .htaccess rewrite to all lowercase is in order. Thanks SEOMoz. You guys rock.
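For anyone else wanting to try this, here's a rough sketch of what the lowercase rewrite can look like on Apache. Note that `RewriteMap` can't be defined in .htaccess itself - it has to go in the main server config or a virtual host - and `lc` is just a map name I picked for this example. Test on a staging copy first, since a mistake here can loop every request:

```apache
# httpd.conf / virtual host: define a map that lowercases its input
RewriteMap lc int:tolower

# .htaccess (or the same vhost):
RewriteEngine On
# If the requested path contains any uppercase letter...
RewriteCond %{REQUEST_URI} [A-Z]
# ...301-redirect to the all-lowercase version
RewriteRule ^(.*)$ ${lc:$1} [R=301,L]
```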
-
Glad to be of help, Janice.
From a readability perspective, I'd also suggest going with all lowercase.
-
Well, it is a Windows server and my understanding is that it is case-insensitive, but I'll verify this with our hosting provider. Nevertheless, would it be preferable to set up the rewrite from the mixed case names to all lowercase names or vice versa? Or perhaps it doesn't matter.
Thanks for your help with this - lots to learn and work through with these tools.
-
If the server allows both upper and lower case, then from a technical perspective they could be two different files. It's like www.domain.com and domain.com pointing to the same home page - they may serve the same content, but technically they could be two different places.
The solution should be set up to not require having to do a rewrite every time a new page is created. It should be automatic.
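To illustrate the www/non-www example with a sketch (domain.com is obviously a placeholder for your own domain, and the direction can be flipped if you prefer the www version), the usual automatic fix on Apache looks something like:

```apache
RewriteEngine On
# Send www.domain.com traffic to domain.com so only one hostname
# ever gets indexed and the redirect applies to every page automatically
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]
```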
-
I understand your answer and about setting up rewrites, but what I really want to know is why there are two pages listed (one uppercase, one lowercase) when there is only one physical page on the site. All links within the site point to the page using the uppercase name.
I don't want to have to add a rewrite for the lowercase name every time I add a page to the site - this doesn't seem right which is why I'm wondering if there is something else wrong.
-
Janice,
The proper solution would be to have the site set up at the server level to automatically rewrite URLs into one consistent pattern (typically all lowercase), and to make sure all internal links to other pages on the site use that preferred capitalization. Canonical tags can help alleviate the problem, but on their own they're a band-aid rather than a best-practice solution. So speak with the site administrator or programmer to get the rewrite functionality implemented.
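Since Janice mentioned this is a Windows server, here's a rough sketch of what that rule might look like in web.config using the IIS URL Rewrite module (the module has to be installed separately, and this is a starting point to verify with your host, not a drop-in fix):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- 301 any URL containing an uppercase letter to its lowercase form -->
      <rule name="Lowercase URLs" stopProcessing="true">
        <match url="[A-Z]" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

And while the server-level rewrite is the real fix, each page can also carry a canonical tag pointing at the preferred lowercase URL as a belt-and-braces measure, e.g. `<link rel="canonical" href="http://www.mapsalive.com/features/audio.aspx" />`.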