Duplicate Content on Multinational Sites?
-
Hi SEOmozers
I've been trying to find a solution to this all morning but can't, so I'm just going to spell it out and hope someone can help me!
Pretty simple: my client has one site, www.domain.com, which is UK-hosted and targets the UK market. They want to launch www.domain.us, US-hosted and targeting the US market.
They don't want to set up a simple redirect because:
a) the .com is UK-hosted
b) there are a number of regional spelling changes that need to be made
However, most of the content on domain.com applies to the US market and they want to copy it onto the new website. Are there ways to get around any duplicate content issues that will arise here? Or is the only answer to simply create completely unique content for the new site?
Any help much appreciated!
Thanks
-
Hi Coolpink,
From what I understand of your question, your client's potential scenario is this:
- .com for UK
- .us for USA
- both sites with almost identical content.
If I were you, I would follow these best practices:
- In Google Webmaster Tools, I'd specify that domain.com should geotarget the UK only. Even though .com domains are meant to be global, if you tell Google to geotarget your site to a specific country, it should follow that directive even though the domain is a generic one;
- Again in Google Webmaster Tools, I'd specify that the .us domain should geotarget the USA only. Be aware that .us is the country-code top-level domain (ccTLD) of the United States (as .co.uk is the ccTLD of the UK), so Google should automatically geotarget domains with that ending to the USA.
- I don't know the nature of your client's site, but if it is an eCommerce site, there are certainly local signals that you can and should use: currencies (pounds and dollars), addresses, phone numbers.
- You write that the US and UK sites can't simply be merged, partly because of the regional spelling changes needed. That's a correct intuition, also in terms of international SEO. So, when creating the new .us site, pay attention to this issue and remember to translate into American English any content that was written in British English (e.g. analyse > analyze). These regional differences help Google a great deal in understanding the target of the site.
- A good way to reinforce the fact that the .com site is meant just for the UK market is to add the rel="alternate" hreflang="x" annotation to that site, like this: <link rel="alternate" hreflang="en-us" href="http://www.domain.us/" /> (please read the note at the end).
- BING > This page on Bing's Webmaster Center Blog ("How to Tell Your Website's Country and Language") explains quite well the best practices to follow in order to rank in the regional versions of Bing. Of the options Bing suggests, the metadata embedded in the document is the most important: it's the one to add to the .us site (target: USA).
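For reference, that document-level metadata is a standard content-language meta tag. A minimal sketch for the hypothetical .us site (the domain name is a placeholder from this thread):

```html
<!-- In the <head> of each page on www.domain.us -->
<meta http-equiv="content-language" content="en-us" />
```

The equivalent on the .com site would use "en-gb" to signal UK English.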
Note well: rel="alternate" hreflang="x" is a page-level annotation, not a domain-level one.
That means the .com home page will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/" /> (as seen above)
while page "A" will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/page-equivalent-to-A" />
and so on.
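One thing worth adding: hreflang annotations are meant to be reciprocal, so each page should reference both itself and its equivalent on the other domain. A sketch for the two hypothetical home pages (domain names are placeholders from this thread):

```html
<!-- On http://www.domain.com/ (UK site) -->
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/" />

<!-- On http://www.domain.us/ (US site) -->
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/" />
```

If the annotations only point one way, search engines may ignore them.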
-
As of 3 years ago, Google wasn't filtering ccTLD sites for duplicate content, and I haven't found anything indicating that has changed. Rand also had a good Whiteboard Friday on this very subject.
-
My experience was that well-researched copy tailored to the local market performed much better for non-brand terms. I don't use the .us TLD, but I host all my sites in Norway and Google has had no problems with my country TLDs such as .co.uk, .cn, .kr, etc.
-
Thanks for your reply Knut.
So you would advise against using the same copy?
Also, just to clarify, the .com is going to be the UK site, and they are planning on purchasing .us for the US site. Is this acceptable practice?
-
Even if the .com site is hosted in the UK, Google will figure out that .co.uk is for the UK and .com is for US customers. I manage two such sites, www.ekornes.com/us and www.ekornes.co.uk, and when the content was nearly duplicated we ranked well on brand-specific terms in both countries, but not on non-brand or brand-associated terms. The first thing you want to do is make the meta tags unique, then follow up with unique content. You'll find that if you do your keyword research well, creating unique content and tags becomes a lot easier, as consumers in different countries use different words and lingo to find your product.