Sites in multiple countries using the same content
-
Hey Moz,
I am looking to target international audiences, but I may have duplicate content. For example, I have article 123 on each domain listed below. Will each copy rank separately (in the US, the UK and Canada) because of the domain?
The idea is to rank well in several different countries, but should I never have an article duplicated? Should we start from the ground up, creating articles per country? Some articles may apply to both! I guess this whole duplicate content thing is quite confusing to me.
I understand that I can submit to Google Webmaster Tools, set the geographic location, and add the rel="alternate" hreflang tag - but will that allow all of them to rank separately?
Please help and thanks so much!
Cole
-
Are you sure eyepaq?
** Yes. I have the same format implemented across several projects, big and small. All is perfect. I have a few cases where some domains are helping each other out - so when a new country is deployed, it gets a small boost in that geo location due to the others. The approach has also been confirmed by several trend analyses from Google, in the Google forum, in at least one Google Hangout, and across the web in different articles.
If I had 5 domains, say .uk, .fr, .de, .ie and .es, and pasted the same 1000 words on each, I would assume it would be duplicate content and wouldn't have equal rankings across all 5 domains - but I may be wrong?
** It won't be duplicate content if you have the .de content in German and the .uk content in English. It will carry the same message, but it is not duplicate content.
Of course you won't have the same rankings, since the competition differs between Germany and the UK, for example, and the signals - mainly links - are counted differently for each country. One link from x.de will count towards the .de domain in a different way than y.co.uk linking to your .uk domain.

I don't think Cole is talking about recreating the same article in different languages, because then I would understand the use of the hreflang tag. I think he means the exact same article on separate domains - could be wrong here as well.

*** If I understand correctly, he is mainly concerned about English content on different geo-targeted English-language domains (.co.uk, .com, .ca, .co.nz, .com.au, let's say), and for that - if it's the same content - he needs hreflang set for those and he is safe. Google will then rank the .co.uk domain and its content in the UK, and not the Canadian domain. He will also be safe from any "duplicate content issues" - although even without hreflang there won't be any.
-

@Colelusby - Is a subdomain for each location on one domain out of the question? So:
uk.example.com, fr.example.com, etc. You can then tell Webmaster Tools that the uk subdomain targets the UK, the fr subdomain targets France, and so on.
-
Yes, that's it

The use of hreflang has a lot of benefits and overall is very straightforward - Google will understand how the structure is set up and you are safe.
Cheers.
-
Is that it?
The same article will rank in two different geographic locations, and duplicate content won't hurt me?
I feel like that's too easy. Maybe I'm overthinking it.
Thanks!
-
Hi,
In this case the use of hreflang is needed:
https://support.google.com/webmasters/answer/189077?hl=en
In summary, each version will have rel="alternate" hreflang set - with hreflang="en-ca" for Canada, for example, hreflang="en-us" for the US, and so on (the first part is the language, the second the geographic region). So even if the language is the same, each version targets a particular region, as in some cases you might have small differences between the UK, AU or CA versions.
When you have a domain like example.ch (in German), the hreflang will be hreflang="de-ch".
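For example, the annotations could look like this in the <head> of each version - a minimal sketch, with placeholder domains and paths:

```html
<!-- The same block goes on every version, each page listing all versions
     (including itself). Domains and paths here are placeholders. -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/article-123/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/article-123/" />
<link rel="alternate" hreflang="en-ca" href="http://www.example.ca/article-123/" />
<!-- Optional catch-all for visitors who match none of the versions above -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/article-123/" />
```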
Hope it helps.
Related Questions
-
Same site serving multiple countries and duplicated content
Hello! Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) about this topic, as there are a few specific variants each time: I have a site serving content (and products) to different countries, built using subfolders (1 subfolder per country). Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc. The first problem was fairly easy to solve:
Avoid duplicated content issues across the board, considering that both the ecommerce part of the site and the blog are replicated in each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do it. But then comes the second problem: how to deal with duplicated content when it's written in the same language, e.g. /us/, /gb/, /au/ and so on?
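A minimal sketch of what those region-level hreflang annotations would look like for the same-language case, assuming a hypothetical article path:

```html
<!-- A sketch only: each same-language page lists all of its regional twins,
     so Google can serve the right subfolder per country rather than treating
     them as duplicates. The article path is hypothetical. -->
<link rel="alternate" hreflang="en-us" href="https://site.com/us/some-article/" />
<link rel="alternate" hreflang="en-gb" href="https://site.com/gb/some-article/" />
<link rel="alternate" hreflang="en-au" href="https://site.com/au/some-article/" />
```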
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. The structure needs to be maintained (it's not possible to consolidate the same language within a single subfolder, for example),
2. Articles can't be canonicalized from one subfolder to another, as that would mess up our internal tracking tools,
3. The amount of content being published prevents us from producing bespoke content for each region of the world with the same spoken language.

Given those constraints, I can't see a way to sort this out, and it seems I'm cursed to live with those duplicated content red flags right up my nose.
Am I right or can you think about anything to sort that out? Many thanks,
Ghill
-
Splitting One Site Into Two Sites Best Practices Needed
Okay, working with a large site that, for business reasons beyond organic search, wants to split an existing site in two. So, the old domain name stays, and a new one is born with some of the content from the old site, along with some new content of its own. The general idea, for more than just search reasons, is that it makes both the old site and the new site more purely about their respective subject matter. The existing content on the old site that is becoming part of the new site will be 301'd to the new site's domain. So, the old site will have a lot of 301s and links to the new site. No links coming back from the new site to the old site are anticipated at this time.

Would like any and all insights into any potential pitfalls and best practices for this to come off as well as it can under the circumstances. For instance:

1. Should all those links from the old site to the new site be nofollowed, kind of like a non-editorial link to an affiliate or advertiser?
2. Is there weirdness for Google in 301ing to a new domain from some, but not all, content of the old site?
3. Would you individually submit requests to remove from the index the hundreds and hundreds of old site pages moving to the new site, or just figure that the 301 will eventually take care of that?
4. Is there substantial organic search risk of any kind to the old site, beyond the obvious of just not having those pages to produce any more?
5. Anything else? Any ideas about how long the new site can expect to wander the wilderness of no organic search traffic?

The old site has a 45 domain authority. Thanks!
-
Wrong country sites being shown in Google
Hi, I am having some issues with the country targeting of our sites. Just to give a brief background of our setup and web domains: we use Magento and have 7 connected ecommerce sites on that Magento installation.

1. www.tidy-books.co.uk (UK) - main site
2. www.tidy-books.com (US) - variations in copy, but basically a duplicate of the UK site
3. www.tidy-books.it (Italy) - fully translated by a native speaker - its own country-based social media and content regularly updated/created
4. www.tidy-books.fr (France) - fully translated by a native speaker - its own country-based social media and content regularly updated/created
5. www.tidy-books.de (Germany) - fully translated by a native speaker - its own country-based social media and content regularly updated/created
6. www.tidy-books.com.au (Australia) - duplicate of the UK site
7. www.tidy-books.eu (rest of Europe) - duplicate of the UK site

I've added the country and language hreflang tags to all sites. We use cross-domain canonical URLs, and I've set the correct country where appropriate in the international targeting section of Google Webmaster Tools.

Even so, we are getting a number of issues which are driving me crazy trying to work out why. The major one, for example: if you search google.it with an Italian IP for our brand name, Tidy Books, the .com site is shown first, then .co.uk, then all the other sites, followed on page 3 by the correct site, www.tidy-books.it. The Italian site is the most extreme example, but the French and German sites still appear below the .com site. This surely shouldn't be the case? The same problem happens with the .co.uk and .com sites: when searching google.co.uk for our keywords, the .com often comes up before the .co.uk. So it seems we have sites competing against each other, which again can't be right or good.

The next problem lies in the errors we are getting in Google Webmaster Tools on all sites: having no return tags in the international targeting section. Any advice or help would be very much appreciated. I've added some screenshots to help illustrate and am happy to provide extra details. Thanks
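For reference, the "no return tags" error usually means the annotations are not reciprocal: every version has to list every other version, itself included. A hedged sketch of the block each of the seven sites would carry (language codes and exact URLs here are assumptions):

```html
<!-- If tidy-books.it lists tidy-books.co.uk but .co.uk does not list .it back,
     Webmaster Tools reports "no return tags". Codes and URLs are assumptions. -->
<link rel="alternate" hreflang="en-gb" href="http://www.tidy-books.co.uk/" />
<link rel="alternate" hreflang="en-us" href="http://www.tidy-books.com/" />
<link rel="alternate" hreflang="it-it" href="http://www.tidy-books.it/" />
<link rel="alternate" hreflang="fr-fr" href="http://www.tidy-books.fr/" />
<link rel="alternate" hreflang="de-de" href="http://www.tidy-books.de/" />
<link rel="alternate" hreflang="en-au" href="http://www.tidy-books.com.au/" />
<link rel="alternate" hreflang="en" href="http://www.tidy-books.eu/" />
```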
-
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot, so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl it and confirm that it was indeed removed (as opposed to just recrawling the site and not finding the content anywhere).

This really made lots of sense to me and also struck a personal chord. Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the below steps:
1. We cut the pages
2. We set up permanent 301 redirects for all of them immediately.
3. And at the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).

When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way. I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are, still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:

1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?

2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?

Thank you in advance for your help,
Eric
-
E-commerce site, one product multiple categories best practice
Hi there, We have an e-commerce shopping site with over 8000 products and over 100 categories. Some subcategories belong to multiple categories - for example, a Christmas tree can be under "Gardening > Plants > Trees" and under "Gifts > Holidays > Christmas > Trees". The product itself (example: Scandinavian Xmas Tree) can naturally belong to both these categories as well. Naturally these two (or more) categories have different breadcrumbs, different navigation bars, etc. From an SEO point of view, to avoid duplicate content issues, I see the following options:

1. Use the same URL and change the content of the page (breadcrumbs and menus) based on the referral path. Kind of cloaking.
2. Use the same URL and display only one "main" version of breadcrumbs and menus. Possibly add the other "not main" categories as links on the category / product page.
3. Use a different URL based on where we came from and do nothing (this will create essentially the same content on different URLs except breadcrumbs and menus - there's a possibility to change the category text and page title as well).
4. Use a different URL based on where we came from, with different menus and breadcrumbs, and use rel=canonical pointing to the "main" category / product pages.

This is a very interesting issue and I would love to hear what you guys think, as we are finalizing plans for a new website and would like to get the most out of it. Thank you all!
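For what it's worth, option 4 boils down to a single line in the <head> of each duplicate URL - a minimal sketch with hypothetical paths:

```html
<!-- Placed on the duplicate path, e.g. (hypothetical)
     /gifts/holidays/christmas/trees/scandinavian-xmas-tree -->
<link rel="canonical" href="https://www.example.com/gardening/plants/trees/scandinavian-xmas-tree" />
```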
-
News sites & Duplicate content
Hi SEOMoz, I would like to know, in your opinion and according to 'industry' best practice, how do you get around duplicate content on a news site if all news sites buy their "news" from a central place in the world? Let me give you some more insight into what I am talking about.

My client has a website that focuses purely on news - local news in one of the African countries, to be specific. Now, what we noticed over the past few months is that the site is not ranking to its full potential. We investigated: we checked our keyword research, our site structure, interlinking, site speed, code-to-HTML ratio - you name it, we checked it. What we did pick up when looking at duplicate content is that the site is flagged by Google as duplicated. BUT so are most news sites, because they all get their content from the same place. News gets sold by big companies in the US (no, I'm not from the US, so I can't say specifically where it is from), and they usually have disclaimers with these content pieces saying that you can't change the headline or story significantly. So we do have quite a few journalists who rewrite the news stories; they try to keep them as close to the original as possible, but they still change them to fit our targeted audience - which is where my second point comes in.

Even though the content has been duplicated, our site is more relevant to what our users are searching for than the bigger news-related websites in the world, because we do hyper-local everything: news, jobs, property, etc. All we need to do is get off this duplicate content issue. In general we rewrite content completely to be unique if a site has duplication problems, but on a media site I'm a little bit lost, because I haven't had something like this before. Would like to hear some thoughts on this. Thanks,
Chris Captivate
-
Multiple Keyword Research Questions, Help
Hello, I've been trying for several days to understand how keyword research works for a multi-purpose website. I've read guides, articles, even some chapters from the book "The Art of SEO" by O'Reilly, and still no luck. It seems I can't wrap my head around keyword research.

Let's say I have a social gaming community website and I'm trying to rank it first on some low-competition keywords plus some long-tail keywords. The website has functions like leaderboards, profiles, events, competitions, etc., so it's not actually a news-related website, but it will have a blog. The website being in the games niche would imply that I should target words that contain the word "games", but that word generates millions of searches globally, so ranking first is nearly impossible if the website is brand new. This made me pursue generic keywords formed from 2-3 words, like "fresh games", "new games", "mmorpg games", "fps games", etc., which still generate, let's say, 30,000 searches globally each. Due to the different areas of the website, like latest game events and latest game competitions, I'm also confused about whether I should pursue site-specific keywords like "latest games events", "fresh games events", "latest games competitions", "upcoming games competitions" - but these too generate 30,000 global searches each. So: should I use generic keywords, or keywords that include site features?

Let's say I decide to pursue generic "games" keywords. Due to the high competition on a keyword, I go a layer deeper, and for the keyword "fresh games" I obtain keywords like "fresh games 2011", "top fresh games 2011", "upcoming fresh games", thus building a list of 30 keywords that contain "fresh games". If I do this for the rest of the keywords - "new games", "mmorpg games", "fps games", etc. - I end up with a list of 10,000 keywords or more, since each keyword generates further keywords. Is this the correct approach? Generating 10,000 keywords sounds like a lot, and I'm getting the feeling that this is not how it's supposed to be done - where would I even insert 10,000 keywords?

So how do I know which keywords to pick and aim for in order to try to get the no. 1 ranking, and why those? How many keywords should I use, and where should I put them, given that it's not a news website, so writing a lot of articles isn't an option? Should I focus on two-word keywords with around 10,000-30,000 searches, or on two-word keywords plus long-tail keywords with less traffic, like 100-5,000?

Is there a guide for the Keyword Analysis Tool? If I enter "fresh new games" I get a 39% keyword difficulty - is that hard to rank for? I don't know what all the colors mean, since some of them have higher numbers than others found at the top, and how can I beat a website that has rank 10?

So hopefully, with your help and by some miracle, I will finally be able to build a keyword list. Thank you!
-
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
We're building a new website platform and are using Ajax as the method for allowing users to select from filters. We want to dynamically insert elements into the URL as the filters are selected, so that search engines will index multiple combinations of filters. We're struggling to see how this is possible using the Symfony framework. We've used www.gizmodo.com as an example of how to achieve SEO- and user-friendly URLs, but this is only an example of achieving this for static content. We would prefer to go down a route that didn't involve hashbangs if possible. Does anyone have any experience using hashbangs, and how did they affect your site? Any advice on the above would be gratefully received.
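The usual hashbang-free route is the HTML5 History API: fetch the filtered results with Ajax, then rewrite the address bar with history.pushState so every filter combination keeps a real, crawlable URL. A minimal, framework-agnostic sketch (the endpoint, markup and IDs are hypothetical, not Symfony-specific):

```html
<!-- Filters are plain links, so crawlers and no-JS users still get real URLs -->
<a class="filter" href="/products/colour-red/">Red</a>
<div id="results"></div>
<script>
  document.querySelectorAll('a.filter').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      var url = link.getAttribute('href');
      // Fetch just the results fragment via Ajax (the ?fragment=1 flag is hypothetical)
      fetch(url + '?fragment=1')
        .then(function (response) { return response.text(); })
        .then(function (html) {
          document.getElementById('results').innerHTML = html;
          // Rewrite the address bar without a hashbang
          history.pushState({ url: url }, '', url);
        });
    });
  });
  // Restore state on Back/Forward (simplified here to a full reload)
  window.addEventListener('popstate', function () { location.reload(); });
</script>
```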