Duplicate content on multi-language / regional websites
-
Hi guys,
I know this question has been asked a lot, but I wanted to double-check, since I just read a comment by Gianluca Fiorelli on this topic (https://moz.com/community/q/can-we-publish-duplicate-content-on-multi-regional-website-blogs) that made me doubt my research.
The case:
A Dutch website (.nl) wants a .be version for conversion reasons. They want to duplicate the Dutch website, since Dutch is spoken in large parts of both countries.
They are willing to implement the following changes:
- Hreflang tags (a sketch of the markup follows this list)
- Possibly a local phone number
- Possibly a local translation of the menu
- A language meta tag (for Bing)
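To be concrete, here is a minimal sketch of the markup I have in mind for one Dutch page (example.nl and example.be are placeholder domains):

```html
<!-- In the <head> of https://www.example.nl/product-x (hypothetical URL) -->
<!-- hreflang: each regional variant references the other, plus a self-reference -->
<link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/product-x" />
<link rel="alternate" hreflang="nl-be" href="https://www.example.be/product-x" />
<!-- Optional fallback for visitors who match neither region -->
<link rel="alternate" hreflang="x-default" href="https://www.example.nl/product-x" />

<!-- Language meta tag, since Bing reads this rather than hreflang -->
<meta http-equiv="content-language" content="nl-nl" />
```

The .be counterpart would carry the mirrored set (with hreflang="nl-be" pointing at itself); as far as I know, hreflang annotations are ignored unless both pages link back to each other.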
Optionally, they are willing to take the following steps:
- Crosslinking every page through a language flag or similar navigation in the header
- Investing in gaining local .be backlinks
- Changing the server location of both websites so they match their country (not necessary in my opinion, since the ccTLD should make this irrelevant)
The content on the websites will be at least 95% duplicated. They would like to rank with the .be version in Belgium and with the .nl version in The Netherlands. Are these steps enough to make sure the .be shows up for queries from Belgium and the .nl for queries from The Netherlands?
Or would this cause a duplicate content issue, resulting in one version being filtered out? If that's the case, we would have to use the canonical tag and we couldn't rank the .be version of the website.
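For clarity, that fallback would be a cross-domain canonical on every .be page pointing at its .nl counterpart; a minimal sketch with placeholder URLs:

```html
<!-- In the <head> of https://www.example.be/product-x: concedes indexing to the .nl version -->
<link rel="canonical" href="https://www.example.nl/product-x" />
```

As I understand it, this and hreflang shouldn't be combined across the two domains, since the canonical tells Google the .be URLs are not the versions to index, which defeats the purpose of ranking the .be site.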
Note: this company is looking for a quick conversion-rate win. They won't invest in rewriting every page and/or blog post; the less effort they have to put in, the better (I know that's blasphemy when talking about SEO). Gaining local backlinks, for example, would involve considerable costs.
I would love to hear from you guys.
Best regards,
Bob van Biezen
-
Thanks for the valuable advice! I will put it to good use.
-
Bob,
It depends on the category and type of product. I remember a Dutch site selling shutters that just put the NL content on a BE domain. The problem was that in Belgium we don't use that word when looking for this type of product, so Google wasn't showing the site for the terms people actually search (it did rank position 1 for "shutters" in Belgium, but probably with zero traffic).
You don't have to rewrite the content for Google, but it would probably be a good idea to let a Flemish person check it. If it's just a small word here and there, it's no problem; if it affects your main keywords, it's an issue.
To reply to your other question: when searching in BE I quite often get NL results if Google doesn't find a good BE result or the NL site is simply better. You could just put the content on the .be domain and see if it brings results (even without the cross-linking, although I think that would be a useful feature). Belgian backlinks will always help, but they take time and effort. Take a trial-and-error approach; there is no risk, and if it doesn't work you can always improve later on.
Dirk
-
Thanks for your comment, Dirk!
Rewriting the content would be the best-case scenario. Do you think it's an absolute must to rewrite those words (say, because Google would otherwise filter out the .be domain if it's an exact copy), or would it be an extra that makes the website convert even better and adds an extra trust signal for Google?
It would probably be a pain in the ass for this webshop to check all their product descriptions for words that might need changing. They would probably not launch the .be website if it took them a week or two to go through all the pages.
-
Thanks for both of your opinions! Since this client is looking for the quickest fix possible, what is your opinion on the optional points?
- Crosslinking every page through a language flag or similar navigation in the header (see the sketch below)
- Investing in gaining local .be backlinks
Do you think they are necessary, or do they add enough extra value to justify the extra costs (especially for the extra backlinks)?
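For the crosslinking, I picture something like this in the header of every page, linking each URL to its equivalent on the other ccTLD (placeholder domains and paths):

```html
<!-- Header language/region switcher, rendered on every page of both sites -->
<nav class="language-switcher">
  <a href="https://www.example.nl/product-x" hreflang="nl-nl">Nederland</a>
  <a href="https://www.example.be/product-x" hreflang="nl-be">België</a>
</nav>
```

Besides helping visitors who land on the wrong version, this would give Google a crawlable path between the two sites.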
-
I agree with Jordan on this; it shouldn't cause trouble.
Just make sure that you at least adapt the wording on the site. We might both speak Dutch, but not all words have the same meaning, and we don't use the same words to describe the same things. As an example: in Belgium we like "konfituur", you prefer "jam". It's pretty useless to try to rank a page optimised for "jam" in Belgium, as nobody will search for it.
Dirk
-
Google has stated that duplicate content on international sites is generally not an issue, as long as the content is intended for different users in different countries. With the steps you have outlined, I believe you should be fine.
Hope this helps some.
-
Related Questions
-
Same site serving multiple countries and duplicate content
Hello! Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) on this topic, as there are a few specific variants each time. I have a site serving content (and products) to different countries, built using subfolders (one subfolder per country). Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc. The first problem was fairly easy to solve:
Avoid duplicate content issues across the board, considering that both the e-commerce part of the site and the blog are replicated for each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do. But then comes the second problem: how to deal with duplicate content written in the same language, e.g. /us/, /gb/, /au/ and so on?
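To make it concrete, the hreflang markup for the same-language case would look something like this for one article (placeholder path), though I'm not sure it's enough on its own:

```html
<!-- In the <head> of https://site.com/us/some-article; /gb/ and /au/ carry the mirrored set -->
<link rel="alternate" hreflang="en-us" href="https://site.com/us/some-article" />
<link rel="alternate" hreflang="en-gb" href="https://site.com/gb/some-article" />
<link rel="alternate" hreflang="en-au" href="https://site.com/au/some-article" />
<link rel="alternate" hreflang="x-default" href="https://site.com/us/some-article" />
```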
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. Need for this structure to be maintained (it's not possible to consolidate the same language within a single subfolder, for example),
2. Articles from one subfolder to another can't be canonicalized, as it would mess up our internal tracking tools,
3. The amount of content being published prevents us from getting bespoke content for each region of the world with the same spoken language. Given those constraints, I can't see a way to solve this, and it seems I'm cursed to live with those duplicate content red flags right up my nose.
Am I right, or can you think of anything to sort this out? Many thanks,
GhillC
-
Directory with duplicate content: what to do?
Moz keeps finding loads of pages with duplicate content on my website. The problem is that they are directory pages for different locations. E.g., if we were a clothes shop we would be listing our locations: www.sitename.com/locations/london www.sitename.com/locations/rome www.sitename.com/locations/germany The content on these pages is all the same, except for an embedded Google Map that shows the place's location. The problem is that Google thinks all these pages are duplicate content. Should I set a canonical link on every single page saying that www.sitename.com/locations/london is the main page? I don't know if I can use canonical links, because the page content isn't identical due to the embedded map. Help would be appreciated. Thanks.
-
Duplicate content issue with pages that have navigation
We have a large consumer website with several sections whose listings are paginated across several pages. How would I prevent these pages from getting duplicate content errors, and how would I best handle SEO for them? For example, we have about 500 events with 20 events showing per page. What is the best way to prevent all the subsequent pagination pages from getting duplicate content and duplicate title errors?
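One commonly suggested pattern (sketched here with a hypothetical URL scheme, not a guaranteed fix) is to give each paginated page a numbered title and declare the sequence with rel prev/next links:

```html
<!-- In the <head> of /events?page=2 (hypothetical URLs) -->
<title>Upcoming Events - Page 2 of 25</title>
<link rel="prev" href="https://www.example.com/events?page=1" />
<link rel="next" href="https://www.example.com/events?page=3" />
```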
-
How can a website have multiple pages of duplicate content and still rank?
Can you have a website with multiple pages of the exact same copy (being different locations of a franchise business) and still be able to rank for each individual franchise? Is that possible?
-
Website.com/blog/post vs website.com/post
I have clients with entire WordPress sites and clients with just a WordPress blog bolted onto the back of a website. The clients with entire WordPress sites seem to be ranking better. Do you think the URL structure could have anything to do with it? Does having that extra /blog folder decrease SEO effectiveness at all? Setting up a few new blogs now...
-
What is the best way to allow content to be syndicated on other sites without risking duplicate content filters?
Cookstr appears to be syndicating content to shape.com and mensfitness.com: a) they integrate their data into partner sites with an attribution link back to their site, skinned with the partner's look; b) they link the image back to the image hosted on Cookstr; c) the syndicated page does not have microformats or as much data as their own page, so their own page is stronger for SEO. Is this the best strategy, or is there something better they could be doing to safely allow others to use their content? We don't want to share the content if we're going to get hit by a duplicate content filter or have another site outrank us with our own data. Thanks for your help in advance! Their original content page: http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta Their syndicated content pages: http://www.shape.com/healthy-eating/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
http://www.mensfitness.com/nutrition/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
-
Do you bother cleaning duplicate content from Google's index?
Hi, I'm in the process of instructing developers to stop producing duplicate content. However, a lot of duplicate content is already in Google's index, and I'm wondering if I should bother getting it removed. I'd appreciate it if you could let me know what you'd do. For example, one 'type' of page is being crawled thousands of times, but it only has 7 instances in the index, which don't rank for anything. For this example I'm thinking of just stopping Google from accessing that page 'type'. Do you think this is right? Do you normally meta noindex,follow the page, wait for the pages to be removed from Google's index, and then stop the duplicate content from being crawled? Or do you just stop the pages from being crawled and let Google sort out its own index in its own time?
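To be specific, the tag I mean is the standard robots meta tag:

```html
<!-- On each duplicate page: the page stays crawlable so Google can see the tag, but drops out of the index -->
<meta name="robots" content="noindex, follow" />
```

My understanding is that the page must remain crawlable (i.e. not blocked in robots.txt) until it is deindexed, since Google has to fetch the page to see the tag at all. Thanks, FashionLux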
-
Affiliate Site Duplicate Content Question
Hi guys, I have been unable to find a definitive answer to this on various forums; your views will be very valuable. I am building a few Amazon affiliate sites and will be pulling in product data from Amazon via a WordPress plugin. The plugin pulls in titles, descriptions, images, prices, etc. However, this presents a duplicate content issue, so I cannot publish the product pages with the Amazon descriptions as-is. Due to the large number of products, it is not feasible to rewrite all descriptions, but I plan to rewrite descriptions and titles for 50% of the products and publish them with the "index, follow" attribute. However, for the other 50%, what would be the best way to handle them? Should I publish them as "noindex, follow"? Or is there another solution? Many thanks for your time.