Duplicate Content on Multinational Sites?
-
Hi SEOmozers
I've been trying to find a solution to this all morning but can't, so I'm just going to spell it out and hope someone can help me!
Pretty simple: my client has one site, www.domain.com, which is UK-hosted and targets the UK market. They want to launch www.domain.us, US-hosted and targeting the US market.
They don't want to set up a simple redirect because
a) the .com is UK-hosted
b) there are a number of regional spelling changes that need to be made
However, most of the content on domain.com applies to the US market and they want to copy it onto the new website. Are there ways to get around any duplicate content issues that will arise here? Or is the only answer to simply create completely unique content for the new site?
Any help much appreciated!
Thanks
-
Hi Coolpink,
from what I've understood from your question, the picture for your client is this:
- .com for UK
- .us for USA
- both sites with almost identical content.
If I were you, I would follow these best practices:
- In Google Webmaster Tools, I'd specify that domain.com must geotarget the UK only. Even though .com domain names are meant to be global, if you tell Google to geotarget your site to a specific country, it should follow your directive even though the domain is a generic one;
- Again in Google Webmaster Tools, I'd specify that the .us domain must geotarget the USA only. Note that .us is the country code top-level domain (ccTLD) of the United States (just as .co.uk is the ccTLD of the UK), so Google should already geotarget domains with that extension to the USA automatically.
- I don't know the nature of your client's site, but if it is an eCommerce site, there are certainly local signals you can (and should) use: currencies (pounds and dollars), addresses, phone numbers.
- You write that the US and UK versions can't simply be merged, partly because of the regional spelling changes. That's a correct intuition, also in terms of international SEO. So, when creating the new .us site, pay attention to this and remember to convert content written in British English into American English (e.g. analyse > analyze). These regional differences help Google a lot in understanding the target of the site.
- A good way to reinforce that the .com site is meant only for the UK market is to add the rel="alternate" hreflang="x" annotation to it, like this: <link rel="alternate" hreflang="en-us" href="http://www.domain.us/" /> (please see the note at the end).
- BING > This post on Bing's Webmaster Center Blog ("How to Tell Your Website's Country and Language") explains quite well the best practices to follow in order to rank in the regional versions of Bing. The "metadata embedded in the document" option is the most important of the ones Bing suggests, and it's the one to add to the .us site (target: USA).
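For reference, that document-level metadata is just a meta tag in the <head> of each page. A minimal sketch for the .us pages might look like the line below; this is from memory, so check the Bing post above for the exact form they recommend:
<meta http-equiv="content-language" content="en-US" />
The .com pages would carry "en-GB" instead, so each site declares its own language-region pair.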
Note well: rel="alternate" hreflang="x" is a page-level annotation, not a domain-level one.
That means the home page of the .com site will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/" /> (as seen above)
and its page "A" will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/page-equivalent-to-A" />
and so on.
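To make the page-level idea concrete, here's a minimal sketch of what the two home pages could carry. The URLs are placeholders for your client's real ones, and the reciprocal en-gb annotations pointing back to the .com version are my addition, since Google's hreflang guidance recommends that each version reference the other (and itself):
On www.domain.com (UK home page):
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/" />
On www.domain.us (US home page):
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/" />
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/" />
Repeat the same pattern on each equivalent pair of pages (page "A" on the .com pointing to its .us equivalent, and vice versa).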
-
As of 3 years ago, Google wasn't filtering ccTLD sites for duplicate content, and I haven't found anything indicating that has changed. Also, Rand did a good Whiteboard Friday on this very subject.
-
My experience has been that well-researched copy tailored to the local market performed much better for non-brand terms. I don't use the .us TLD, but I host all my sites in Norway and Google has not had any problems with my country TLDs such as .co.uk, .cn, .kr, etc.
-
Thanks for your reply Knut.
So you would advise against using the same copy?
Also, just to clarify, the .com is going to be the UK site, and they are planning on purchasing .us for the US site. Is this acceptable practice?
-
Even if the .com site is hosted in the UK, Google will figure out that .co.uk is for the UK and .com is for US customers. I manage two such sites, www.ekornes.com/us and www.ekornes.co.uk, and when the content was nearly duplicate we ranked well on brand terms in both countries, but not well on non-brand or brand-associated terms. The first thing you want to do is make the meta tags unique, and then follow up with unique content. You'll find that if you do your keyword research well, creating unique content and tags becomes a lot easier, as consumers use different words and lingo in different countries to find your product.