Why are my pages getting duplicate content errors?
-
Studying the Duplicate Page Content report reveals that all (or many) of my pages are getting flagged as having duplicate content because the crawler thinks there are two versions of the same page:
http://www.mapsalive.com/Features/audio.aspx
http://www.mapsalive.com/Features/Audio.aspx
The only difference is the capitalization. We don't have two versions of the page, so I don't understand what I'm missing or how to correct this. Does anyone have any thoughts on what to look for?
-
Dr. Pete doesn't cover case (though it's mentioned in the comments), but just about everything else you might want to know about duplicate content is covered at http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world, including ways to remedy it. It sounds like you've got a plan here, but I'm also adding it for the benefit of others reading this thread.
-
I think this is one of the most overlooked duplicate content issues - I'm not sure why it isn't talked about more. I've often used uppercase and lowercase intermittently, e.g. mysite.com/Las-Vegas/ and mysite.com/las-vegas/, not knowing it made any difference.
I guess a .htaccess rewrite to all lowercase is in order. Thanks SEOMoz. You guys rock.
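For reference, a minimal sketch of that kind of lowercase rewrite on Apache. The map name (lc) is just an example, and the RewriteMap line has to live in the main server config or virtual host, since it isn't allowed inside .htaccess itself:

    # In httpd.conf or the virtual host (RewriteMap is not allowed in .htaccess):
    RewriteMap lc int:tolower

    # In .htaccess (or the same virtual host):
    RewriteEngine On
    # If the requested path contains any uppercase letter...
    RewriteCond %{REQUEST_URI} [A-Z]
    # ...301-redirect to the lowercased version of the same path.
    RewriteRule (.*) ${lc:%{REQUEST_URI}} [R=301,L]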
-
Glad to be of help, Janice.
From a readability perspective, I'd also suggest going with all lowercase.
-
Well, it is a Windows server and my understanding is that it is case-insensitive, but I'll verify this with our hosting provider. Nevertheless, would it be preferable to set up the rewrite from the mixed-case names to all-lowercase names, or vice versa? Or perhaps it doesn't matter.
Thanks for your help with this - lots to learn and work through with these tools.
-
If the server allows both uppercase and lowercase, then from a technical perspective they could be two different files. It's like having www.domain.com and domain.com point to the same home page - they may serve the same content, but technically they could be two different places.
The solution should be set up so that it doesn't require adding a rewrite every time a new page is created - it should be automatic.
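To make the www/non-www comparison concrete, here is roughly what that kind of automatic, server-level redirect looks like on an Apache server (example.com is a placeholder; on the Windows server discussed here the equivalent would be an IIS rewrite rule):

    RewriteEngine On
    # Send any request for the bare domain to the www version (or the reverse, if preferred),
    # so only one hostname ever gets crawled and indexed.
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]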
-
I understand your answer about setting up rewrites, but what I really want to know is why two pages are listed (one uppercase, one lowercase) when there is only one physical page on the site. All links within the site point to the page using the uppercase name.
I don't want to have to add a rewrite for the lowercase name every time I add a page to the site - that doesn't seem right, which is why I'm wondering if there is something else wrong.
-
Janice,
Because the server is case-insensitive, any link that happens to use a different capitalization - whether somewhere within the site or from another site - resolves to a working page, so the crawler records it as a separate URL with identical content. The proper solution is to have the site set up at the server level to automatically rewrite (301 redirect) URLs to one consistent pattern (typically all lowercase), and to make sure all links within the site pointing to other pages use that preferred capitalization. Canonical tags can help alleviate the problem, but as a best practice they shouldn't be the only solution. So speak with the site administrator or programmer to get the rewrite functionality implemented.
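Since this is a Windows server, the usual way to get that automatic behavior is a rule in web.config using the IIS URL Rewrite module. A rough sketch - the rule name and pattern here are illustrative, so verify against your own setup before deploying:

    <!-- web.config fragment; requires the IIS URL Rewrite module -->
    <system.webServer>
      <rewrite>
        <rules>
          <rule name="Lowercase URLs" stopProcessing="true">
            <!-- Match any URL path that contains an uppercase letter -->
            <match url=".*[A-Z].*" ignoreCase="false" />
            <!-- 301-redirect to the lowercased version of the same path -->
            <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>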
Related Questions
-
Duplicate content on job sites
Hi, I have a question regarding job boards. Many job advertisers will upload the same job description to multiple websites, e.g. Monster, Gumtree, etc. This would therefore be viewed as duplicate content. What is the best way to handle this if we want to ensure our particular site ranks well? Thanks in advance for the help. H
Technical SEO | HiteshP0
-
Indexing pages content that is not needed
Hi All, I have a site that has articles and a side block that shows interesting articles in a column. When we google for a keyword, I can see the page, but the meta description is picked from the "interesting articles" side block and not from the actual article on the page. How can I stop just that block from being indexed? Thanks
Technical SEO | jomin740
-
Removed .html - Now Get Duplicate Content
Hi there, I run a WordPress website and have removed the .html from my links. Moz has done a crawl and now a bunch of duplicates are coming up. Is there anything I need to do, perhaps in my .htaccess, to help it along? Google appears to still be indexing the .html versions of my links.
Technical SEO | MrPenguin0
-
Duplicate title/content errors for blog archives
Hi all, I would love some help. I'm fairly new to SEO and SEOMoz; I've looked through the forums and have just managed to confuse myself. I have a customer with a lot of duplicate page title/content errors in SEOMoz. It's an Umbraco CMS, and a lot of the errors appear to be blog archives and pagination, i.e.:
http://example.com/blog
http://example.com/blog/
http://example.com/blog/?page=1
http://example.com/blog/?page=2
and then also:
http://example.com/blog/2011/08
http://example.com/blog/2011/08?page=1
http://example.com/blog/2011/08?page=2
http://example.com/blog/2011/08?page=3 (empty page)
http://example.com/blog/2011/08?page=4 (empty page)
This continues for different years, months, and blog entries and creates hundreds of errors. What's the best way to handle this for the SEOMoz report and the search engines? Should I rel=canonical the /blog page? I think this would probably affect the SEO of all the blog entries. Use robots.txt? Sitemaps? URL parameters in the search engines? Appreciate any assistance/recommendations. Thanks in advance, Ian
Technical SEO | iragless0
-
Container Page/Content Page Duplicate Content
My client has a container page on their website - they are using SiteFinity, so it is called a "group page" - in which individual pages appear and can be scrolled through. When links are followed, they first lead to the group page URL, where the first content page is shown. However, when navigating through the content pages, the URL changes. When navigating back to the first content page, the URL is that of the content page, but it appears to indexers as a duplicate of the group page, that is, the URL that appeared when first linking to the group page. The client updates this regularly, so I need to find a solution that will allow them to add more pages, with the new one always becoming the top page, without requiring extra coding. For instance, I had considered integrating rel=next and rel=prev, but they aren't going to keep that up to date.
Technical SEO | SpokeHQ1
-
Duplicate content with "no results found" search result pages
We have a motorcycle classifieds section that lets users search for motorcycles for sale using various drop-down menus to pick year, make, type, model, trim, etc. These searches create URLs such as:
www.example.com/classifieds/search.php?vehicle_manufacturer=Triumph&vehicle_category=On-Off Road&vehicle_model=Tiger&vehicle_trim=800 XC ABS
We understand that all of these URL varieties are considered unique URLs by Google. The issue is that we are getting duplicate content errors on the pages that have no results, as they have no content to distinguish themselves from each other. URLs like:
www.example.com/classifieds/search.php?vehicle_manufacturer=Triumph&vehicle_category=Sportbike
and
www.example.com/classifieds/search.php?vehicle_manufacturer=Honda&vehicle_category=Streetbike
will each return a results page that says "0 results found". I'm wondering how we can distinguish these "unique" pages better? Some thoughts:
- make sure the <title> reflects what was searched
- add a heading that may say "0 results found for Triumph On-Off Road Tiger 800 XC ABS"
Can anyone please help out and lend some ideas in solving this? Thank you.
Technical SEO | seoninjaz
-
Errors - 7300 - Duplicate Page Content..Help me..
Hi, I just received the crawl report with 7,300 duplicate page content errors. The site is built using PHP. The list of errors looks like this:
http://xxxxx.com/channels/
http://xxxxx.com/channels/?page=1
http://xxxxxx.com/channels/?page=2
I am not good at coding and am using a ready-made script for this website. Could anyone guide me on how to fix this issue? Thanks.
Technical SEO | vilambara0
-
Duplicate content error from url generated
We are getting a duplicate content error, with "online form/" being returned numerous times. Upon inspecting the code, we found we are calling an input form via jQuery, which is initially called by something like this: Opens Form. Why would this cause the URL to be amended and crawled?
Technical SEO | pauledwards0