How to avoid getting penalized for having the same website in 2 languages
-
Hi,
I have a price comparison website in English on a .com domain. Now, as we are expanding, we want to localize our website and target different markets in their local languages. The first market we are targeting is France. For that purpose we have:
- a different domain name in French
- a .fr domain
The website, however, will have the exact same content, mostly translated into French. My question is: what is the best way to avoid getting penalized by Google for duplicate content?
Thanks,
-
-
If you are targeting the .com at English-speaking countries, why start targeting other sites at specific countries rather than languages? If you targeted the French language (in which case you should not use .fr), you'd reach more people than by targeting France alone.
Based on your situation, I would have one site with translated content, rather than country-specific content. This would allow you to use the strength of one domain while giving your users the right content in their language.
Using hreflang tags between translated content is how you alert the search engines that the content is the same, just translated.
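As an illustration (using a hypothetical example.com with a /fr/ subfolder; swap in your own URLs), every page carries the full set of annotations in its <head>, including a self-reference and an x-default fallback:

```html
<!-- On https://www.example.com/ (English version) -->
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />

<!-- On https://www.example.com/fr/ (French version) — the exact same set -->
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Note that the set of tags is identical on both versions: each page references itself and every alternate, which is what makes the annotations reciprocal and valid.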
-
OK, I don't think anyone mentioned dynamic translation via the Google Cloud Translation API? It's definitely not going to be perfect, but if it isn't content you paid a lawyer to make exact, you could use MaxMind GeoIP to determine each visitor's location, then pick a subdomain and output language based on that.
It would make the entire process quite a bit simpler and still cater to your potential leads. Whatever you end up doing, I suggest using more icons, as they are a universal language.
-
Hi Kate,
Thanks for your answer. I don't think there is a single answer to this question. Yes, we are targeting countries, but as long as a country's language is English we would not start a new website. Also, we are a price comparison website, so our content is mostly the products. Any website we start will always have about 90% of the same content. We could change a bit of the written content, but not much more than that.
Does that give you more clarification?
Thanks,
Priyam
-
Hi!
First, it sounds like you are targeting countries, not languages. Can you confirm that? Meaning, someday you might want to target Canada, which will need French-language content in the Canadian dialect of French.
If you are targeting countries, this is the right setup. The key is to treat each site as its own site: don't just make a copy and translate it into generic French. If you want to target languages rather than countries, then I suggest using one domain with a subfolder per language. In that case, you would use hreflang tags to show the search engines that the content is the same, just translated.
It all depends on what you want to do in the future. It sounds more like you want to do language translation, not geo-targeting. But again, I'll need you to confirm that to give the right answer.
-
Hi there,
What you need to do is implement hreflang tags and, just to make sure Google understands it, set the geo-targeting in each of the Search Console properties.
Remember always to have both tags on both sites (as you are only working with 2 sites). Hreflang tags MUST be self-referential and point to the other versions of the site. Here are some useful resources, especially the ones from Aleyda Solis:
- Hreflang generator - Aleyda Solis
- International SEO - Moz Learning Center
- The Guide to International Website Expansion: Hreflang, ccTLDs, & More! - Moz Blog
- The International SEO Checklist - Moz Blog
- Using the correct hreflang tag - Moz Blog
- Tool for checking hreflang annotations - Moz Blog
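If editing the page templates on both sites is impractical, the same annotations can instead be declared in an XML sitemap. A minimal sketch, assuming hypothetical example.com and example.fr homepages; each URL entry must again carry the full reciprocal set:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/" />
    <xhtml:link rel="alternate" hreflang="fr" href="https://www.example.fr/" />
  </url>
  <url>
    <loc>https://www.example.fr/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/" />
    <xhtml:link rel="alternate" hreflang="fr" href="https://www.example.fr/" />
  </url>
</urlset>
```

Use one method or the other (head tags or sitemap), not both, so the annotations can't fall out of sync.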
Hope it helps.
Best luck.
GR