Duplicate content on subdomains.
-
Hi Mozzers,
I have a site, www.xyz.com, and also geo-targeted subdomains: www.uk.xyz.com, www.india.xyz.com, and so on. All of the subdomains have the same content as the main domain, www.xyz.com.
So, I want to know how I can avoid the content duplication.
Many Thanks!
-
It would probably be better (and more likely to get you responses) if you started a new question - this one is three years old. Generally, I think it depends on your scope. If you need some kind of separation (corporate, legal, technical), then separate domains or sub-domains may make sense. They're also easier to target, in some ways. However, you're right that authority may be diluted and you'll need more marketing effort against each one.
If resources are limited and you don't need each country to be a fully separate entity, then you'll probably have fewer headaches with sub-folders. I'm speaking in broad generalities, though - this is a big decision that depends a lot on the details.
-
Dear all,
I have bought 30 geo top-level domains. This is for an ecommerce project that has not launched yet (and isn't indexed by Google).
I am now at a point where I can change/consolidate all domains as sub domains or sub folders or keep things as they are.
I just worry that link building would be scattered and not focused and that it might be better to concentrate the efforts on one domain.
What are your views on this?
Many thanks.
-
Yeah - I'm really afraid that stacking all those sub-domains is going to cause you long-term issues with your link-building, and that some of those sub-domains could fragment. If the country needs to be in a sub-domain, then I think the hybrid approach (with "/shop" as a sub-folder) may cause you less trouble.
I will warn, though, that any change like this carries some risk. You'll have to put proper 301-redirects in place.
I might try the hreflang tags first, though, and see if that helps the current problem (it may take a few weeks). Changing too many aspects of the on-page SEO at once could cause you a lot of grief.
-
The shop. pages are simply new pages that were added so products can be sold more easily. I'm thinking I might move the shop.uk.xyz.com pages to uk.xyz.com/shop/product, i.e. into a sub-folder. Do you think this will help pass link juice to those pages after the change, and would it be easy for me to include them in the sitemap as well?
-
If you have separate GWT profiles, then I think the XML sitemap may have to be under the sub-domain - Google has to be able to access it from a sub-domain URL. It doesn't have to be in the root of the sub-domain.
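As a concrete sketch of what such a per-subdomain sitemap could contain, here is one way to generate it; the uk.xyz.com product URLs are only the hypothetical examples from this thread:

```python
# Minimal per-subdomain sitemap generator - a sketch using the
# hypothetical uk.xyz.com URLs discussed in this thread.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string listing the given absolute URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://uk.xyz.com/shop/product1",
    "http://uk.xyz.com/shop/product2",
])
print(sitemap)
```

The resulting file would then be uploaded somewhere under the subdomain itself (so Google can fetch it from a subdomain URL) and submitted in that subdomain's GWT profile.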
I'm not clear on what the "shop." pages are, but stacking sub-domains like that sounds like it's getting pretty messy. Why the separation?
-
I have already created separate profiles for the subdomains, but my only worry is where to place the sitemap on the server, e.g. in the root directory of the root domain or in the root directory of the subdomain.
Coming to (2), the pages I want to include in the sitemap are my product pages. So I want to know if shop.uk.xyz.com can be included in the sitemap for uk.xyz.com, and whether that counts as an internal page of uk.xyz.com.
-
It is probably best to create separate profiles in Google Webmaster Tools, because then you can target the sub-domains to the countries in question. At that point, you could also set up separate sitemaps. It'll give you a cleaner view of how each sub-domain is indexed and ranking.
I'm not sure I understand (2) - why wouldn't you include those pages in the sitemap?
-
Thank you for your inputs. It has really helped me understand the situation.
I will try to implement this and let you know how I get on. I also had a few more questions on this:
1. Do I require a separate sitemap and robots file for all of the subdomains, and where should I place them on the server?
2. The subdomain has pages like shop.uk.xyz.com/product1. Can I include those in the sitemaps, as those are the pages I really want to rank for?
-
There's no perfect answer. Canonical tags would keep the sub-domains from ranking, in many cases. The cross-TLD stuff is weird, though - Google can, in some cases, ignore the canonical if they think that one sub-domain is more appropriate for the country/ccTLD the searcher is using.
Sub-domains can be tricky in and of themselves, unfortunately, because they sometimes fragment and don't pass link "juice" fully to the root domain. I generally still think sub-folders are better for cases like this, but obviously that would be a big change (and potentially risky).
You could try the rel="alternate" hreflang tags. They're similar to canonical (a bit weaker), but basically are designed to handle the same content in different languages and regions:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
They're basically designed for exactly this problem. You can set the root domain to "en-US", the UK sub-domain to "en-GB", etc. I've heard generally good things, and they're low-risk, but you have to try it and see. They can be a little tricky to implement properly.
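These are plain link elements in the head of each regional version of a page, with every version listing all the alternates. A minimal sketch, using the hypothetical domains from this thread (note that the registered language-region code for the UK is "en-GB"):

```html
<!-- In the <head> of every regional version of this page -->
<link rel="alternate" hreflang="en-US" href="http://www.xyz.com/productx" />
<link rel="alternate" hreflang="en-GB" href="http://www.uk.xyz.com/productx" />
<link rel="alternate" hreflang="en-IN" href="http://www.india.xyz.com/productx" />
```

The same set of tags goes on all three pages, so each version points at every other (and at itself).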
-
No, 301s and canonicals are completely different.
A 301 will redirect a page, while a canonical sets the preferred version of a page. For example:
301 - you have an old version of a page that looks like this: www.example.com/p?=153, and you want it to look like www.example.com/red-apples. You would use a 301 from the old page (www.example.com/p?=153) to the new page (www.example.com/red-apples).
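As a sketch, assuming an Apache server with mod_rewrite available (the example URLs are the ones from the post above), that redirect could be set up like this:

```apache
# .htaccess - 301 the old query-string URL to the new clean URL
RewriteEngine On
# Match /p with the query string "=153" ...
RewriteCond %{QUERY_STRING} ^=153$
# ... and permanently redirect it to /red-apples, dropping the query string
RewriteRule ^p$ /red-apples? [R=301,L]
```

The trailing `?` on the target strips the old query string from the redirected URL.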
Canonical - let's go back to the red apples example. Let's say you have an ecommerce site with different ways to search for products: one way is to search by fruit, the other by color. So what you'll have is two versions of the end result - for example, www.example.com/fruit/red-apples and www.example.com/red/red-apples. Since both of those pages show the same information, you don't want the engines to think it's duplicate content, so you can add a rel=canonical link element on both pages pointing to the preferred version of the two (i.e. you might want the canonical to be www.example.com/red-apples). That's all it does: it tells the engines your preferred version of pages that may be the same.
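As a sketch, the tag on both of those category-path pages would look like this (example URLs from the post above):

```html
<!-- In the <head> of both /fruit/red-apples and /red/red-apples -->
<link rel="canonical" href="http://www.example.com/red-apples" />
```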
Back to your original post: you really don't need to "noindex", but I thought you were having a duplicate content issue, and that would solve it. (Generally, Google won't penalize you for this sort of duplicate content.)
Here is what I would do.
If you don't have Google Webmaster Tools already set up, then do so. Verify each version of your subdomain (i.e. india.xyz.com, uk.xyz.com, etc. - let me know if you need help) and then set the geo target for each of them manually. (You'll have to set this up manually because you have a gTLD and not a ccTLD.)
How to set your Geo Target manually.
Go to a particular version of your site in WMT (i.e. india.xyz.com) and click on "Configuration", then "Settings". Under "Settings", the first section says "Geographical Target". Check the box and then use the drop-down to select "India".
Repeat this for all of your subdomains for each specific country.
This will let Google know that you are trying to target users in a specific country.
If you have the money to invest in it, I would also try to have those subdomains hosted on a server in each particular country (a strong signal for Google).
Hope it helps.
-
Thanks Darin!
I have a few doubts about this:
1. Is rel canonical like a 301 redirect? My concern is: if a user goes to www.uk.xyz.com/productx, will he be redirected to www.xyz.com/product?
2. My subdomain pages are ranking in the country-specific search engines. For example, www.uk.xyz.com is ranking for keywords on google.co.uk. So if I noindex them, I will lose my search engine presence in the country-specific search engines.
PS: the content on the pages is all the same, apart from the product currency.
-
I disagree. I said "noindex", not "nofollow". Link juice will be passed, but the pages won't show up in the SERPs. I do agree with you, though, that the strategy as a whole, if there is in fact exact/duplicate content, seems to be a waste. Unless these pages are in another language, I don't see the point of this subdomain strategy.
-
Canonical tags will help remove duplicate content issues and also consolidate your link value. I don't see any issue with cross-domain implementation.
If you add "noindex" to any of these pages, you won't get any link credit.
-
Short answer: set a canonical URL on the pages pointing to the root domain version, and noindex the subdomain pages.
What this does is avoid the duplicate content problem. Generally, those subdomain pages won't rank anyway, because the same information is on the "main" site. You can still build links to those subdomain pages and build a strong internal link structure to help the "main" site's rankings.
The only negative is that the pages on your subdomains won't rank. That's not necessarily a bad thing, but just know that they won't. Then again, if the pages are truly duplicate content, they won't rank anyway.
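A minimal sketch of that combination, using the hypothetical URLs from this thread - the canonical points at the root-domain version, and the robots meta tag keeps the subdomain copy out of the index while still letting its links be followed:

```html
<!-- In the <head> of http://www.uk.xyz.com/productx -->
<link rel="canonical" href="http://www.xyz.com/productx" />
<meta name="robots" content="noindex, follow" />
```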