Caps in URL creating duplicate content
-
I'm getting a bunch of duplicate content errors where the crawl says
www.url.com/abc has duplicate at www.url.com/ABC
The content is in Magento and the URL settings are lowercase, and I can't figure out why it thinks there is duplicate content. These are pages with a decent number of inbound links.
-
I checked, and it is a Magento feature to rewrite caps to lowercase.
I added this to .htaccess anyway:
<code>
# RewriteMap is only allowed in the server config (httpd.conf or a vhost),
# not in .htaccess, so this first directive has to be defined there.
RewriteMap lc int:tolower
# 301-redirect any URI containing an uppercase letter to its lowercase form
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
</code>
One last question before I take this to a Magento forum: how can I look at a page with a caps URL and a lowercase URL and tell whether they are really two different pages or resolve to the same address?
When you change random letters to caps on our site it sends you to the right page, but my browser still shows the mixed-caps URL instead of replacing it with an all-lowercase URL. Is that really a different page, or is the browser just not changing the caps in the display when it is really getting the lowercase page?
-
Hi John,
I checked the URL you sent me. You do have duplicate pages:
http://www.madebysurvivors.com/destiny
http://www.madebysurvivors.com/DESTINY
Both work and return the same page.
I also tried clicking other links on your site and then changing a few letters to uppercase, something like this:
http://www.madebysurvivors.com/LEArn-human-trafficking-slavery
and it returns the same page.
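If you want to verify this yourself, one quick check (a sketch using curl; any of your URLs will do) is to fetch just the response headers for both case variants and compare the status codes:
<code>
# Fetch only the headers for each case variant
curl -I http://www.madebysurvivors.com/destiny
curl -I http://www.madebysurvivors.com/DESTINY
</code>
If both come back 200 OK, the server really is serving the same content at two addresses - a true duplicate. If the caps version returns a 301 pointing at the lowercase URL, there is no duplicate page, and the browser is simply showing the address you typed.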
From what I can tell, it's a Magento feature that makes this possible. I would go into the settings and disable the option that forces Magento to use lowercase.
Then test it and make sure that you DO get a 404 page when you change the letter case on any of your links.
I'm not familiar with Magento, so I'm not sure if it has that option, but many CMS and ecommerce platforms have a field where you can specify the URL for a page; I would change that field to all lowercase.
Test it again. If it works, there is one more step you have to do if you want to keep the juice from the pages that had the uppercase URLs:
you need to recreate those pages, making sure the URL address is the same as it was before (in all CAPS), and then 301 redirect each one to the new lowercase page.
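If you'd rather handle those redirects at the web server than inside the CMS, a mod_rewrite rule along these lines would do it (a minimal sketch for the /DESTINY page above; it assumes Apache with mod_rewrite enabled, placed above Magento's existing rewrite rules):
<code>
RewriteEngine On
# RewriteRule patterns are case-sensitive by default, so this matches
# only the all-caps path and 301s it to the lowercase page
RewriteRule ^DESTINY$ /destiny [R=301,L]
</code>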
Hope this helps and makes sense.
-
This is intended functionality in Magento. It's supposed to help the user experience, as a user can navigate to a page even if they aren't sure of the casing of the words.
Of course, that's bad for SEO. You'll need to introduce canonicalization. Here's a free extension by Yoast:
http://www.magentocommerce.com/magento-connect/canonical-url-for-magento.html
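For reference, canonicalization means each case variant carries a tag like this in its <head>, telling search engines which URL is the real one (the exact markup the extension generates may differ; the URL is just the example from this thread):
<code>
<link rel="canonical" href="http://www.madebysurvivors.com/destiny" />
</code>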
Cheers.
Update: seeing your response, your solution of putting in redirects wouldn't be practical. You'd have to cover every combination of caps/non-caps, and, well, that's more work than you should want :). As for why this happens: the URL is lowercased when Magento checks whether anything in the database matches it. Again, this is intended functionality.
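To illustrate that lookup (a sketch only: core_url_rewrite is Magento 1's rewrite table, and the lowercasing may equally happen in PHP before the query), MySQL's default case-insensitive collations compare strings without regard to case, so both of these match the same row:
<code>
-- With a case-insensitive collation such as utf8_general_ci, these
-- two lookups return the same row, which is why /destiny and
-- /DESTINY resolve to the same page
SELECT request_path, target_path FROM core_url_rewrite WHERE request_path = 'destiny';
SELECT request_path, target_path FROM core_url_rewrite WHERE request_path = 'DESTINY';
</code>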
-
Looks like I do need some more help.
I get a redirect loop if I enter a redirect from
http://www.madebysurvivors.com/DESTINY
to
http://www.madebysurvivors.com/destiny
but I checked, and there is no redirect the other way in our database or .htaccess.
If I leave the redirect off, I get duplicate content - but in the CMS part of Magento there is only one entry for this page.
-
I actually moved all the content over from a Drupal install, so I don't have that many URLs with the problem. It looks like the faster way to do this is to just redirect the caps to lowercase, as that's what we use elsewhere.
I dug into the underlying database and can't find any duplicate entries for these pages or odd redirects, so I have no idea of the cause.
For some of the pages I think you are right that Magento is moving caps down to lowercase, but there are a few others where it is going lower to caps - though those were caps on the Drupal site.
Anyway - good to know Google sees them differently, so I'll put in redirects. It's only about 20 pages.
-
Hello John,
If you can provide us with a URL, we might be able to dig in and see what is going on; without it, it's almost impossible to tell. Also, it doesn't matter whether you have a decent number of inbound links - duplicate content only refers to pages with similar content. I'm not familiar with the Magento platform, so this is just a guess: when you created (or imported) pages or categories in Magento, were they originally lowercase? If not, it's possible that Magento added them all in CAPS and is now forcing them to lowercase, which would give you duplicates. But once again, this is just a guess, and without a URL to your site I doubt anyone will be able to help you further.
-
www.url.com/abc and www.url.com/ABC are two completely different pages according to Google.
I would redirect any and all pages with capitals to the corresponding lowercase URLs.
Don't worry about the link juice, as it will pass over via the redirect. It will also be much better than having two identical pages competing with each other (according to Google).
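For the redirect loop you hit earlier: a rule entered inside Magento likely goes through the same case-insensitive lookup that causes the duplicates, so it matches the lowercase URL too and /destiny redirects to itself. Doing it in .htaccess avoids that, because you can require an uppercase letter in the request before the rule fires (a sketch for the /DESTINY example; assumes Apache with mod_rewrite, placed above Magento's existing rules):
<code>
RewriteEngine On
# Only fire when the requested path contains an uppercase letter, so the
# lowercase target can never re-trigger the rule and loop
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^destiny$ /destiny [R=301,NC,L]
</code>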
Greg