Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Duplicate content on subdomains.
-
Hi Mozzers,
I have a site, www.xyz.com, and also geo-targeted sub-domains: www.uk.xyz.com, www.india.xyz.com, and so on. All the sub-domains have the same content as the main domain, www.xyz.com.
So, I want to know how I can avoid content duplication.
Many Thanks!
-
It would probably be better (and more likely to get you responses) if you started a new question - this one is three years old. Generally, I think it depends on your scope. If you need some kind of separation (corporate, legal, technical), then separate domains or sub-domains may make sense. They're also easier to target, in some ways. However, you're right that authority may be diluted and you'll need more marketing effort against each one.
If resources are limited and you don't need each country to be a fully separate entity, then you'll probably have fewer headaches with sub-folders. I'm speaking in broad generalities, though - this is a big decision that depends a lot on the details.
-
Dear all,
I have bought 30 geo top-level domains. This is for an ecommerce project that has not launched yet (and isn't indexed by Google).
I am now at a point where I can change/consolidate all domains as sub domains or sub folders or keep things as they are.
I just worry that link building would be scattered and unfocused, and that it might be better to concentrate the effort on one domain.
What are your views on this?
Many thanks.
-
Yeah - I'm really afraid that stacking all those sub-domains is going to cause you long-term issues with your link-building, and that some of those sub-domains could fragment. If the country needs to be in a sub-domain, then I think the hybrid approach (with "/shop" as a sub-folder) may cause you less trouble.
I will warn, though, that any change like this carries some risk. You'll have to put proper 301-redirects in place.
I might try the hreflang tags first, though, and see if they help the current problem (it may take a few weeks). Changing too many aspects of the on-page SEO at once could cause you a lot of grief.
-
The shop. pages are simply new pages added so that products can be sold more easily. I think I might move the shop.uk.xyz.com pages to uk.xyz.com/shop/product, i.e. into a sub-folder. Do you think this will help pass link juice to those pages after the change, and would it be easy for me to include them in the sitemap as well?
-
If you have separate GWT profiles, then I think the XML sitemap may have to be under the sub-domain - Google has to be able to access it from a sub-domain URL. It doesn't have to be in the root of the sub-domain.
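To make that concrete, here is a minimal sketch (the uk.xyz.com host and sitemap filename are just illustrative, matching the examples in this thread): each sub-domain serves its own robots.txt, which can point Google at that sub-domain's sitemap:

```txt
# http://uk.xyz.com/robots.txt
User-agent: *
Allow: /

# The sitemap URL is on the same host as the pages it lists
Sitemap: http://uk.xyz.com/sitemap.xml
```

The sitemap itself would then only list URLs on uk.xyz.com; repeat the same pattern for each sub-domain.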
I'm not clear on what the "shop." pages are, but stacking sub-domains like that sounds like it's getting pretty messy. Why the separation?
-
I have already created separate profiles for the subdomains, but my only worry is where to place the sitemap on the server, e.g. in the root directory of the root domain or in the root directory of the sub-domain.
Coming to (2): the pages I want to include in the sitemap are my product pages. So I want to know whether shop.uk.xyz.com pages can be included in the sitemap for uk.xyz.com, and also whether they count as internal pages of uk.xyz.com.
-
It is probably best to create separate profiles in Google Webmaster Tools, because then you can target the sub-domains to the countries in question. At that point, you could also set up separate sitemaps. It'll give you a cleaner view of how each sub-domain is indexed and ranking.
I'm not sure I understand (2) - why wouldn't you include those pages in the sitemap?
-
Thank you for your inputs. It has really helped me understand the situation.
I will try to implement this and let you know how it goes. I also had a few more questions on this:
1. Do I require a separate sitemap and robots file for each of the sub-domains, and where should I place them on the server?
2. In the sub-domain there are pages like shop.uk.xyz.com/product1. Can I include those in the sitemaps, as they are the pages I really want to rank for?
-
There's no perfect answer. Canonical tags would keep the sub-domains from ranking, in many cases. The cross-TLD stuff is weird, though - Google can, in some cases, ignore the canonical if they think that one sub-domain is more appropriate for the country/ccTLD the searcher is using.
Sub-domains can be tricky in and of themselves, unfortunately, because they sometimes fragment and don't pass link "juice" fully to the root domain. I generally still think sub-folders are better for cases like this, but obviously that would be a big change (and potentially risky).
You could try the rel="alternate" hreflang tags. They're similar to canonical (a bit weaker), but basically are designed to handle the same content in different languages and regions:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
They're basically designed for exactly this problem. You can set the root domain to "en-US", the UK sub-domain to "en-GB", etc. I've heard generally good things, and they're low-risk, but you have to try it and see. They can be a little tricky to implement properly.
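As a hedged sketch, the annotations might look like this in the `<head>` of each version of a page (the /product-x path is hypothetical; "en-gb" is the correct region code for the UK):

```html
<!-- Repeated, identically, on every version of the page -->
<link rel="alternate" hreflang="en-us" href="http://www.xyz.com/product-x" />
<link rel="alternate" hreflang="en-gb" href="http://www.uk.xyz.com/product-x" />
<link rel="alternate" hreflang="en-in" href="http://www.india.xyz.com/product-x" />
```

Each page in the set carries the full set of tags, including a self-referencing one.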
-
No, 301s and canonicals are completely different.
A 301 redirects a page, while a canonical sets the preferred version of a page. For example:
301 - you have an old version of the page that looks like this: www.example.com/p?id=153, and you want it to look like www.example.com/red-apples. You would use a 301 from the old page (www.example.com/p?id=153) to the new page (www.example.com/red-apples).
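In Apache terms, that 301 could be sketched like this in an .htaccess file (assuming Apache with mod_rewrite, and assuming the old URL is www.example.com/p?id=153; other servers have their own syntax):

```apache
RewriteEngine On
# Match the old URL's query string (id=153) ...
RewriteCond %{QUERY_STRING} ^id=153$
# ... and permanently redirect /p to the new friendly URL,
# with a trailing "?" to drop the old query string
RewriteRule ^p$ http://www.example.com/red-apples? [R=301,L]
```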
Canonical - let's go back to the red apples example. Let's say you have an ecommerce site with different ways to search for products: one way is to search by fruit, the other by color. So you'll have two versions of the end result - for example, www.example.com/fruit/red-apples and www.example.com/red/red-apples. Since both of those pages show the same information, you don't want the engines to think it's duplicate content, so you can add a rel=canonical link element on both pages pointing to the preferred version of the two (i.e. you might want the canonical to be www.example.com/red-apples). That's all it does: it tells the engines your preferred version of pages that may be the same.
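In markup, that's a single link element in the `<head>` of each duplicate page (using the red-apples URLs from the example above):

```html
<!-- On both /fruit/red-apples and /red/red-apples -->
<link rel="canonical" href="http://www.example.com/red-apples" />
```

The preferred URL itself can also carry a self-referencing canonical.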
Back to your original post: you really don't need to "noindex", but I thought you were having a duplicate content issue and that would solve it. (Generally, Google won't penalize you for this sort of duplicate content.)
Here is what I would do.
If you don't have Google Webmaster Tools already set up, then do so. Verify each version of your subdomain (i.e. india.xyz.com, uk.xyz.com, etc. - let me know if you need help), and then set your geographic target for each of them manually. (You'll have to set this up manually because you have a gTLD and not a ccTLD.)
How to set your Geo Target manually.
Go to a particular version of your site in WMT (i.e. india.xyz.com) and click on "Configuration", then "Settings". Under "Settings", the first section says "Geographical target". Check the box and then use the drop-down to select "India".
Repeat this for all of your subdomains for each specific country.
This will let Google know that you are trying to target users in a specific country.
If you have the money to invest in it, I would also try to have those subdomains hosted on a server in each particular country (a strong signal for Google).
Hope it helps.
-
Thanks Darin!
I have a few doubts on this:
1. Is rel canonical like a 301 redirect? My concern is: if my user goes to www.uk.xyz.com/productx, will he be redirected to www.xyz.com/product?
2. My sub-domain pages are ranking in the country-specific search engines. For example, www.uk.xyz.com is ranking for keywords in google.co.uk. So if I noindex them, I will lose my search engine presence in the country-specific search engines.
PS: the content on the pages is all the same apart from the product currency.
-
I disagree. I said "noindex", not "nofollow". Link juice will be passed, but the pages won't show up in the SERPs. I do agree with you, though, that the strategy as a whole, if there is in fact exact/duplicate content, seems to be a waste. Unless these pages are in another language, I don't see the point of this subdomain strategy.
-
Canonical will help to remove duplicate issues and also to consolidate your link value. I don't see any issue with cross-domain implementation.
If you add "noindex" to any of these pages, you won't get any link credit.
-
Short answer: set a canonical URL on the pages pointing to the root-domain version, and noindex the subdomain pages.
What this does is avoid the duplicate content problem. Generally, those subdomain pages won't rank anyway because the same information is on the "main" site. You can still build links to those subdomain pages and build a strong internal link structure to help the "main" site's rankings.
The only negative is that the pages on your subdomain won't rank. That's not necessarily a bad thing, but just know they won't - and if the pages are truly duplicate content, they won't rank anyway.