Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Sites in multiple countries using same content question
-
Hey Moz,
I am looking to target international audiences, but I may have duplicate content. For example, I have article 123 on each domain listed below. Will each copy rank separately (in the US, UK, and Canada) because of the domain?
The idea is to rank well in several different countries. But should I never duplicate an article? Should we start from the ground up, creating articles per country? Some articles may apply to more than one! I guess this whole duplicate content thing is quite confusing to me.
I understand that I can set geographic targeting in GWT and add the rel="alternate" hreflang tag, but will that allow all of them to rank separately?
Please help and thanks so much!
Cole
-
Just asking.
-
Are you sure eyepaq?
** Yes. I have the same format implemented across several projects, big and small, and all is working perfectly. In a few cases the domains even help each other out - when a new country is deployed, it gets a small boost in that geo location thanks to the others. The approach has also been confirmed in several Google Webmaster forum threads, at least one Google hangout, and various articles across the web.
If I had 5 domains so say .uk .fr .de .ie and .es and pasted the same 1000 words on each I would assume it would be duplicate content and wouldn't have equal rankings across all 5 domains, but I may be wrong?
** It won't be duplicate if you have the .de content in German and the .uk content in English. It carries the same message, but it is not duplicate content.
Of course you won't have the same rankings, since the competition differs between Germany and the UK, for example, and the signals - mainly links - are counted differently for each country. One link from x.de will count towards the .de domain in a different way than y.co.uk linking to your .uk domain.
I don't think Cole is talking about recreating the same article in different languages - in that case I would understand the use of the hreflang tag - but rather about the exact same article on separate domains. I could be wrong here as well.
*** If I understand correctly, he is mainly concerned about English content on different English-language geo domains (.co.uk, .com, .ca, .co.nz, .com.au, let's say). For that, if it's the same content, he needs hreflang set for those and he is safe. Google will then rank the .co.uk domain and content in the UK rather than the Canadian domain. He will also be safe from any "duplicate content issues" - although even without hreflang there won't be any.
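To make that concrete, a minimal sketch (with placeholder domains) of the hreflang annotations each English regional version of the article would carry in its head section - every version lists the full set, including a reference to itself:

```html
<!-- Placeholder domains; the same block goes on every regional version of the page -->
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/article-123" />
<link rel="alternate" hreflang="en-us" href="https://example.com/article-123" />
<link rel="alternate" hreflang="en-ca" href="https://example.ca/article-123" />
<link rel="alternate" hreflang="en-au" href="https://example.com.au/article-123" />
```

Note the annotations must be reciprocal: if the .co.uk page lists the .ca page as an alternate, the .ca page has to list the .co.uk page back, or Google may ignore the tags.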
@Colelusby - Is a sub-domain for each location on one domain out of the question? So
uk.example.com, fr.example.com, etc. You can then tell WMT that the uk sub-domain targets the UK, fr targets France, and so on.
-
Yes, that's it
The use of hreflang has a lot of benefits and overall is very straightforward - Google will understand how the structure is set up, and you are safe.
Cheers.
-
Is that it?
The same article will rank in two different geographic locations and duplicate content won't hurt me?
I feel like that's too easy. Maybe I'm overthinking it.
Thanks!
-
Hi,
In this case the use of hreflang is needed:
https://support.google.com/webmasters/answer/189077?hl=en
In summary, each version will have a rel="alternate" hreflang tag set, with hreflang="en-ca" for Canada, hreflang="en-us" for the US, and so on (the first part is the language, the second the geo location). So even if the language is the same, each version targets a particular region, as in some cases you might have small differences between UK vs AU or CA, etc.
When you have a domain like example.ch, the hreflang will be hreflang="de-ch".
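A small sketch of how you might generate that tag set programmatically (the helper and domains are hypothetical, just to illustrate the language-REGION pattern and the self-referencing, reciprocal set each page needs):

```python
def hreflang_tags(versions):
    """Build the reciprocal hreflang link tags every regional version
    of a page must include (self-reference included).

    versions: dict mapping hreflang code ("language-REGION") to URL.
    """
    return [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in sorted(versions.items())
    ]

# Hypothetical regional versions of article 123
versions = {
    "en-us": "https://example.com/article-123",
    "en-ca": "https://example.ca/article-123",
    "de-ch": "https://example.ch/article-123",
}
for tag in hreflang_tags(versions):
    print(tag)
```

The same list is emitted on all three pages, which keeps the annotations reciprocal.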
Hope it helps.