Duplicate Content on Multinational Sites?
-
Hi SEOmozers
Tried finding a solution to this all morning but can't, so just going to spell it out and hope someone can help me!
Pretty simple: my client has one site, www.domain.com, which is UK-hosted and targets the UK market. They want to launch www.domain.us, US-hosted and targeting the US market.
They don't want to set up a simple redirect because
a) the .com is UK-hosted
b) there are a number of regional spelling changes that need to be made
However, most of the content on domain.com applies to the US market and they want to copy it onto the new website. Are there ways to get around any duplicate content issues that will arise here? Or is the only answer to simply create completely unique content for the new site?
Any help much appreciated!
Thanks
-
Hi Coolpink,
From what I understand from your question, the situation for your client would be this:
- .com for UK
- .us for USA
- both sites with almost identical content.
If I were you, I would follow these best practices:
- In Google Webmaster Tools, I'd specify that domain.com should geotarget the UK only. Even though .com domains are meant to be global, if you tell Google to geotarget your site to a specific country, it should follow that directive even though the domain is a generic TLD;
- Again in Google Webmaster Tools, I'd specify that the .us domain should geotarget the USA only. Note that .us is the country-code TLD (ccTLD) of the United States (just as .co.uk is the ccTLD of the UK), so Google should automatically geotarget domains with that ending to the USA.
- I don't know the nature of your client's site, but if it is an eCommerce site, there are definitely local signals that you can (and should) use: currencies (pounds and dollars), addresses, and phone numbers.
- You write that the US and UK sites can't simply be merged, partly because of the regional spelling changes needed. That's a correct intuition, also in terms of international SEO. So, when creating the new .us site, pay attention to this issue and remember to convert content written in British English into American English (e.g. analyse > analyze, colour > color). These regional differences help Google a lot in understanding the target of the site.
- A good way to reinforce the fact that the .com site is meant just for the UK market is to add the rel="alternate" hreflang="x" annotation to that site, like this: <link rel="alternate" hreflang="en-us" href="http://www.domain.us" /> (please see the note at the end);
- BING > This post on the Bing Webmaster Center Blog ("How to Tell Your Website's Country and Language") explains quite well the best practices to follow in order to rank in the regional versions of Bing. The "metadata embedded in the document" option is the most important of the ones Bing suggests, and it's the one to add on the .us site (target: USA); see the example just below.
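For reference, the "metadata embedded in the document" approach is a language/country declaration placed in the page itself; it looks something along these lines (a sketch from memory, so double-check the exact form Bing recommends in that post):
<meta http-equiv="content-language" content="en-us" />
or, equivalently, as an attribute on the html element:
<html lang="en-us">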
Note well: rel="alternate" hreflang="x" is a page-level annotation, not a domain-level one.
That means that the home page will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us" /> (as seen above)
and that page "A" will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/page-equivalent-to-A" />
and so on.
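One extra point to keep in mind: Google expects hreflang annotations to be reciprocal, so the .us pages should also reference their .com equivalents (and each page can reference itself). A minimal sketch, assuming the pages map one-to-one (the paths below are just placeholders):
On http://www.domain.com/page-a :
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/page-a" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/page-equivalent-to-A" />
On http://www.domain.us/page-equivalent-to-A :
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/page-a" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/page-equivalent-to-A" />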
-
As of three years ago, Google wasn't filtering ccTLD sites for duplicate content, and I haven't found anything indicating that has changed. Also, Rand did a good Whiteboard Friday on this very subject.
-
My experience was that well-researched copy tailored to the local market performed much better for non-brand terms. I don't use the .us TLD, but I host all my sites in Norway and Google has not had any problems with my country TLDs such as .co.uk, .cn, .kr, etc.
-
Thanks for your reply Knut.
So you would advise against using the same copy?
Also, just to clarify, the .com is going to be the UK site, and they are planning on purchasing .us for the US site. Is this acceptable practice?
-
Even if the .com site is hosted in the UK, Google will figure out that .co.uk is for the UK and .com is for US customers. I manage two such sites, www.ekornes.com/us and www.ekornes.co.uk, and when the content was nearly duplicate we ranked well on brand-specific terms in both countries, but not well on non-brand or brand-associated terms. The first thing you want to do is make the meta tags unique, and then follow up with unique content. You'll find that if you do your keyword research well, creating unique content and tags becomes a lot easier, as consumers use different words and lingo in different countries to find your product.
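To make the "unique meta tags" point concrete (the product and wording below are invented, purely for illustration), the UK and US versions of the same page might carry region-specific titles and descriptions like:
UK page (www.domain.com):
<title>Personalised Gifts | Free UK Delivery</title>
<meta name="description" content="Order personalised gifts with free delivery across the UK. All prices in pounds sterling." />
US page (www.domain.us):
<title>Personalized Gifts | Free US Shipping</title>
<meta name="description" content="Order personalized gifts with free shipping across the US. All prices in US dollars." />
Notice how the spelling, the delivery wording, and the currency all shift with the market.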