Duplicate content across multiple URLs dilemma
-
We have a website with lots of categories, and some subcategories have identical content on them.
So, is it enough to just add different text to those problematic subcategories, or do we need to use a "canonical" tag pointing to the main category?
The same dilemma applies to our search system and duplicate content. For example, the "/category/sports" URL has similar-to-identical content as the "/search/sports" and "/search/sports-fitness/" URLs.
Ranking well is important for all of our different categories and subcategories, and also for individual search keywords. So the question is: how do we make these pages somehow unique/different so that all of them can rank well?
Would love to hear advice on how this can be solved using different methods, and how each would affect our rankings. When do we actually need to use the "canonical" tag, and when is a 301 redirect better? Thanks!
-
Duplication between categories and subcategories is a common issue on large ecommerce websites. Different views of the same listing page (i.e. grid, list, etc.) can also be a factor.
Associate the subcategories with the main top-level categories by using the canonical tag. It is also possible to block the /search/ results using robots.txt; internal search results pages aren't the most beneficial pages for users to land on, and that matters for ranking.
Canonical tags are exceptionally effective for ecommerce structures.
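For instance, a subcategory page can point its canonical at the parent category, and robots.txt can keep crawlers out of /search/ entirely. These sketches use the URL paths from the question on a placeholder domain:

```html
<!-- In the <head> of the /search/sports-fitness/ or subcategory page,
     pointing to the main category as the preferred version -->
<link rel="canonical" href="http://www.example.com/category/sports" />
```

```
# robots.txt at the site root: keep crawlers out of internal search results
User-agent: *
Disallow: /search/
```

Note that robots.txt only prevents crawling; if /search/ URLs are already indexed, letting Google crawl them once more with a noindex tag in place is the more reliable way to get them removed before blocking the path.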
Related Questions
-
I added an SSL certificate this morning and now I noticed duplicate content
Ok, so I'm a newbie, therefore I make mistakes! Lots of them. I added an SSL certificate this morning because it was free and I read it can help my rankings. Now I just checked the site in Screaming Frog and saw two duplicate content pages due to the https. So I'm panicking! What's the easiest way to fix this? Can I undo an SSL certificate? I guess, what's the easiest fix that will also be best for ranking? Thank you!! Rena
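Rather than undoing the certificate, the usual fix is to pick one protocol and 301-redirect the other to it, so only the https pages get indexed. A minimal sketch, assuming an Apache server with mod_rewrite enabled:

```
# .htaccess: 301-redirect every http request to its https equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

With this in place, the http duplicates consolidate onto the https versions over time instead of competing with them.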
Intermediate & Advanced SEO | palila
Duplicate page content errors for Web App Login
Hi there, I have 6 duplicate content errors, but they are for the web app login from our website. I have put a noindex in the sitemap to stop Google from indexing them, to see if that would work, but it didn't. As far as I can see, these links are not even on the website www.skemaz.net, but are links beyond the website, on the web app itself, e.g.:
Intermediate & Advanced SEO | Skemazer
http://login.skemaz.net
http://login.skemaz.net/LogIn?ReturnUrl=%2Fchangepassword
http://login.skemaz.net/Login
http://login.skemaz.net/LogIn?ReturnUrl=%2FHome
Any suggestions would be greatly appreciated. Kind regards, Sarah
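A noindex hint in the sitemap won't deindex anything; the directive has to come from the pages themselves. One option, assuming login.skemaz.net runs on Apache with mod_headers available and can carry its own .htaccess, is to send a noindex header with every response on that subdomain:

```
# .htaccess on login.skemaz.net: tell crawlers not to index any login page
Header set X-Robots-Tag "noindex, nofollow"
```

This covers all the login URL variants (including the ?ReturnUrl= versions) without having to edit each page.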
Best Way to Create SEO Content for Multiple, International Websites
I have a client that has multiple websites serving different countries. For instance, they have a .com website for the US (abccompany.com), a .co.uk website for the UK (abccompany.co.uk), a .de website for Germany (abccompany.de), and so on. They have websites for the Netherlands, France, and even China. These all act as separate websites: they have their own addresses, their own content (some duplicated but translated), their own pricing, their own Domain Authority, backlinks, etc. Right now, I write content for the US site. The goal is to write content for long and medium tail keywords. However, the UK site is interested in having me write content for them as well. The issue I'm having is, how can I differentiate the content? And what is the best way to target content for each country? Does it make sense to write separate content for each website to target results in that country? The .com site will still show up fairly high in UK web results. Does it make sense to just duplicate the content, but in a different language or for the specific audience in that country? I guess the biggest question I'm asking is: what is the best way of creating content for multiple countries' search results? I don't want the different websites to compete with each other, nor do I want to spend extra time trying to rank content for multiple sites when I could just focus on ranking one for all countries. Any help is appreciated!
Intermediate & Advanced SEO | cody109
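One piece of this puzzle is hreflang: it tells Google which country/language version of a page to show, so the .com and .co.uk versions stop competing even when their English content is similar. A sketch using the domains from the question with a hypothetical path, where every version of the page lists all the alternates plus itself:

```html
<link rel="alternate" hreflang="en-us" href="http://abccompany.com/widgets/" />
<link rel="alternate" hreflang="en-gb" href="http://abccompany.co.uk/widgets/" />
<link rel="alternate" hreflang="de-de" href="http://abccompany.de/widgets/" />
<link rel="alternate" hreflang="x-default" href="http://abccompany.com/widgets/" />
```

With the annotations in place, near-duplicate English content across the US and UK sites is treated as localized alternates rather than ordinary duplicates.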
CMS Pages - Multiple URLS (/)
Hi guys, this type of question has been asked a few times before, but I couldn't find quite what I need, so apologies if it's a tad repetitive. I use Magento and have several pages using its CMS. However, it produces 2 URLs for each page, differing by a simple trailing /. For example: website.com/hire
Intermediate & Advanced SEO | ATP
website.com/hire/ I know Google treats these as 2 separate pages. Which would be the better solution? 1. Write a URL rewrite for every CMS page:
RewriteRule ^hire$ http://www.website.com/hire/ [R=301,L] (Is this right?) 2. Write a general rewrite rule to always add the /; no idea where to begin with this. 3. Add a canonical tag to the page, which I think is possible in Magento by adding this to the Custom Design Layout XML option in the page CMS: <action method="addLinkRel"><rel>canonical</rel><href>http://www.website.com/hire/</href></action> This would make the /hire/ page self-referencing and the /hire page reference the /hire/ page, I think. Which one of these solutions is the best? Any pointers with the coding would be grand.
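For option 2, a general rule might look like the sketch below, assuming Apache with mod_rewrite; the conditions skip real files (so assets like /logo.png aren't redirected) and URLs that already end in a slash:

```
RewriteEngine On
# Only rewrite requests that are not actual files on disk
RewriteCond %{REQUEST_FILENAME} !-f
# ...and that do not already end in a trailing slash
RewriteCond %{REQUEST_URI} !/$
# 301 to the same URL with a trailing slash appended
RewriteRule ^(.*)$ http://www.website.com/$1/ [R=301,L]
```

A site-wide 301 like this plus the self-referencing canonical (option 3) together are the usual belt-and-braces approach.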
Pages with Duplicate Page Content (with and without www)
How can we resolve pages with duplicate page content, i.e. with and without www?
Intermediate & Advanced SEO | directiq
Thanks in advance.
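The standard fix is a site-wide 301 from one hostname to the other, so only one version gets crawled and indexed. A minimal .htaccess sketch, assuming Apache and a placeholder domain:

```
RewriteEngine On
# Redirect bare example.com to www.example.com, keeping the path
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Pair this with a self-referencing canonical on each page so the preferred hostname is explicit to search engines.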
How to remove duplicate content, which is still indexed, but not linked to anymore?
Dear community, a bug in the tool we use to create search-engine-friendly URLs (sh404sef) changed our whole URL structure overnight, and we only noticed after Google had already indexed the pages. Now we have a massive duplicate content issue, causing a harsh drop in rankings. Webmaster Tools shows over 1,000 duplicate title tags, so I don't think Google understands what is going on. <code>Right URL: abc.com/price/sharp-ah-l13-12000-btu.html Wrong URL: abc.com/item/sharp-l-series-ahl13-12000-btu.html (created by mistake)</code> After that, we: Changed all URLs back to the "Right URLs". Set up a 301 redirect for all "Wrong URLs" a few days later. Now, a massive number of pages is still in the index twice. As we no longer link internally to the "Wrong URLs", I am not sure if Google will re-crawl them very soon. What can we do to solve this issue and tell Google that all the "Wrong URLs" now redirect to the "Right URLs"? Best, David
Intermediate & Advanced SEO | rmvw
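Since the redirects are already in place, the remaining task is getting Google to re-crawl the "Wrong URLs" so it actually sees the 301s; a common tactic is to list those old URLs in a temporary XML sitemap and submit it in Webmaster Tools. For reference, a single redirect in Apache syntax, using the example pair from the question:

```
# One-off 301 from the mistaken URL to the correct one
Redirect 301 /item/sharp-l-series-ahl13-12000-btu.html /price/sharp-ah-l13-12000-btu.html
```

Once Google has followed the redirects, the duplicate titles in Webmaster Tools should clear out on their own.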
Is this duplicate content?
My client has several articles and pages that have 2 different URLs. For example: /bc-blazes-construction-trail is the same article as /article.cfm?intDocID=22572. I was not sure if this counts as duplicate content, or if I should be putting "/article.cfm" into the robots.txt file. If anyone could help me out, that would be awesome! Thanks 🙂
Intermediate & Advanced SEO | ATMOSMarketing56
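That is duplicate content in Google's eyes, but blocking /article.cfm in robots.txt only stops crawling; it doesn't consolidate link signals onto the friendly URL. A canonical on the dynamic version is the more usual fix. A sketch, with a placeholder domain:

```html
<!-- In the <head> of /article.cfm?intDocID=22572 -->
<link rel="canonical" href="http://www.example.com/bc-blazes-construction-trail" />
```

This way the query-string version can still be crawled, but its ranking signals flow to the friendly URL.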
Duplicate content even with 301 redirects
I know this isn't a developer forum, but I figure someone will know the answer to this. My site is http://www.stadriemblems.com and I have a 301 redirect in my .htaccess file to redirect all non-www to www, and it works great. But SEOmoz seems to think this doesn't apply to my blog, which is located at http://www.stadriemblems.com/blog. It doesn't seem to make sense that I'd need to place code in the .htaccess file of every subfolder. If I do, what code can I use? The weirdest part is that the redirecting works just fine; it's just SEOmoz's crawler that doesn't seem to be with the program here. Does this happen to you?
Intermediate & Advanced SEO | UnderRugSwept