Duplicate content on multi-language / regional websites
-
Hi Guys,
I know this question has been asked a lot, but I wanted to double-check, since I just read a comment by Gianluca Fiorelli (https://moz.com/community/q/can-we-publish-duplicate-content-on-multi-regional-website-blogs) on this topic that made me doubt my research.
The case:
A Dutch website (.nl) wants a .be version for conversion reasons. They want to duplicate the Dutch website, since Dutch is spoken in large parts of both countries.
They are willing to implement the following changes:
- Hreflang tags
- Possibly a local phone number
- Possibly a local translation of the menu
- Language meta tag (for Bing)
Optionally, they are willing to take the following steps:
- Cross-linking every page through a language flag or similar navigation in the header.
- Investing in gaining local .be backlinks.
- Changing the server location for both websites so they match their country (not necessary in my opinion, since the ccTLD should make this irrelevant).
The content on the website will be at least 95% duplicated. They would like to rank with the .be version in Belgium and the .nl version in the Netherlands. Are these steps enough to make sure the .be version gets shown for queries from Belgium and the .nl version for queries from the Netherlands?
Or would this cause a duplicate content issue, resulting in one version being filtered out? If that's the case, we would have to use the canonical tag, and then we can't rank the .be version of the website.
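A minimal sketch of how that hreflang setup could look in practice, assuming the hypothetical domains www.example.nl and www.example.be: each domain keeps a self-referencing canonical and carries reciprocal hreflang annotations, so both versions stay indexable (a cross-domain canonical pointing .be at .nl would indeed keep the .be version from ranking).

```python
# Hypothetical ccTLD pair; real domains/paths would differ.
VERSIONS = {
    "nl-nl": "https://www.example.nl",
    "nl-be": "https://www.example.be",
}

def head_tags(path: str, own_locale: str) -> list[str]:
    """Build the <link> tags for one locale's copy of a page."""
    # Self-referencing canonical: each domain canonicalises to itself.
    tags = [f'<link rel="canonical" href="{VERSIONS[own_locale]}{path}" />']
    # hreflang must be reciprocal: every version lists every version,
    # including itself, or Google may ignore the annotations.
    for locale, host in VERSIONS.items():
        tags.append(
            f'<link rel="alternate" hreflang="{locale}" href="{host}{path}" />'
        )
    return tags

for tag in head_tags("/contact", "nl-be"):
    print(tag)
```

The same function runs once per page per domain, which is why the cross-annotation is cheap to automate even for a site that is 95% duplicated.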
Note: this company is looking for a quick conversion rate win. They won't invest in rewriting every page and/or blog post. The less effort they have to put into this, the better (I know that's heresy when talking about SEO). Gaining local backlinks, for example, would bring a lot of costs with it.
I would love to hear from you guys.
Best regards,
Bob van Biezen
-
Thanks, valuable advice! I will put it to good use.
-
Bob,
It depends on the category and type of product. I remember a Dutch site selling shutters that just put the NL content on a BE domain. The problem was that in Belgium we don't use that word when looking for this type of product, so Google wasn't showing the site (they did rank no. 1 for shutters in Belgium, but probably with 0 traffic).
You don't have to rewrite the content for Google, but it would probably be a good idea to have a Flemish person check it. If it's just a small word here and there, it's no problem; if it concerns your main keywords, then it's an issue.
To answer your other question: when searching in BE I quite often get NL results if Google doesn't find a good BE result, or if the NL site is simply better. You could just put the content on the .be domain and see if it brings results (even without the cross-linking, although I think that would be a useful feature). Belgian backlinks will always help, but they take time and effort. Take a trial-and-error approach; there is no risk, and if it doesn't work you can always improve later on.
Dirk
-
Thanks for your comment Dirk!
Rewriting the content would be the best-case scenario. Do you think it's an absolute must to rewrite those words (say, because Google would otherwise filter out the .be domain if it's an exact copy), or would it be an extra that makes the website convert better and adds an extra trust signal for Google?
It would probably be a pain for this webshop to check all their product descriptions for any words that might need changing. They would probably not launch the .be website if it took them a week or two to go through all the pages.
-
Thanks for both of your opinions! Since this client is looking for the quickest fix possible, what is your opinion on the optional points:
- Cross-linking every page through a language flag or similar navigation in the header.
- Investing in gaining local .be backlinks.
Do you think they are necessary, or do they add enough extra value to justify the extra costs (especially for the backlinks)?
-
I agree with Jordan on this - it shouldn't cause trouble.
Just make sure that you at least adapt the wording on the site. We might both speak Dutch, but not all words have the same meaning, and we don't use the same words to describe the same things. As an example: in Belgium we like "konfituur", while you prefer "jam". It's pretty useless to put a page optimised for "jam" in front of Belgians, as nobody will search for it.
Dirk
-
Google has stated that duplicate content across international sites is generally not an issue, as long as the content is intended for different users in different countries. With the steps you outlined, I believe you should be fine.
Hope this helps some.