How to avoid getting penalized for having the same website in 2 languages
-
Hi,
I have a price comparison website in English on a .com domain. Now that we are expanding, we want to localize the website and target different markets in their local languages. The first market we are targeting is France. For that purpose we have:
- a different domain name in French
- a .fr domain
The website, however, will have exactly the same content, mostly translated into French. My question is: what is the best way to avoid getting penalized by Google for duplicate content?
Thanks,
-
If you are targeting the .com at English-speaking countries, then why target the additional sites at specific countries rather than at languages? If you targeted the French language (in which case you should not use .fr), you'd reach more people than by targeting France alone.
Based on your situation, I would have one site with translated content, rather than country-specific content. This would allow you to use the strength of one domain while giving your users the right content in their language.
Using hreflang tags between translated content is how you alert the search engines that the content is the same, just translated.
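For illustration, here is a minimal sketch of what those annotations could look like for your setup; the example.com and example.fr URLs are hypothetical stand-ins for your two domains. The identical block goes in the <head> of both the English page and the French page, so each annotation is self-referential and also points to the other version:

```python
def hreflang_tags(translations, x_default):
    """Build the <link rel="alternate" hreflang="..."> block for one page.

    `translations` maps language codes to the URL of every translated
    version of the page, including the page itself, since hreflang
    annotations must be self-referential and list all alternates.
    """
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in translations.items()
    ]
    # x-default tells search engines which version to serve when no listed language matches.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
    return "\n".join(tags)


# Hypothetical URLs: the same product page on the English .com site and the
# French .fr site. Paste the printed block into the <head> of BOTH pages.
en_url = "https://www.example.com/product/123"
fr_url = "https://www.example.fr/produit/123"
print(hreflang_tags({"en": en_url, "fr": fr_url}, x_default=en_url))
```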
-
OK, I don't think anyone has mentioned dynamic translation with the Google Cloud Translation API? It's definitely not going to be perfect, but if the copy isn't something you paid a lawyer to make exact, you could use MaxMind GeoIP to determine each visitor's location and then pick the subdomain and output language based on that.
It would make the entire process quite a bit simpler while still catering to your potential leads. No matter what you end up doing, I suggest using more icons, as they are the universal language.
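If you wanted to prototype that idea, a rough sketch might look like the following (assuming the geoip2 and google-cloud-translate Python packages, a downloaded GeoLite2 country database, and Google Cloud credentials; the database path and the IP address below are placeholders):

```python
import geoip2.database
from google.cloud import translate_v2 as translate

# Map visitor countries to output languages; anything unmapped falls back to English.
COUNTRY_TO_LANG = {"FR": "fr", "BE": "fr", "DE": "de"}


def localize(text, visitor_ip):
    # Look up the visitor's country from their IP with MaxMind GeoIP2.
    with geoip2.database.Reader("/path/to/GeoLite2-Country.mmdb") as reader:
        country = reader.country(visitor_ip).country.iso_code

    target = COUNTRY_TO_LANG.get(country, "en")
    if target == "en":
        return text

    # Machine-translate the copy on the fly with the Cloud Translation API.
    client = translate.Client()
    result = client.translate(text, target_language=target)
    return result["translatedText"]


print(localize("Compare prices on thousands of products.", "203.0.113.7"))
```

As noted, the machine translation won't be perfect, so it suits general product copy better than legal or carefully worded text.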
-
Hi Kate,
Thanks for your answer. I think there is no single answer to this question. Yes, we are targeting countries, but as long as a country's language is English, we would not start a new website. Also, we are a price comparison website, so our content is mainly the products. Any website we start will always share about 90% of its content. We could change a bit of the written content, maybe, but not more than that.
Does that clarify things?
Thanks,
Priyam
-
Hi!
First, it sounds like you are targeting countries, not languages. Can you confirm that? Meaning, someday you might want to target Canada, which will need French-language content in the Canadian dialect.
If you are targeting countries, this is the right setup. The key is to treat each site like its own site. Don't just make a copy and translate it into generic French. If you want to target languages, not countries, then I suggest using one domain with a subfolder per language. In that case, you would use hreflang tags to show the search engines that the content is the same, just translated.
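As a minimal sketch of that subfolder-per-language setup (with hypothetical URLs), region-qualified codes such as fr-fr and fr-ca would also let you serve the Canadian French dialect mentioned above; every page listed carries the same block in its <head>:

```python
# Hypothetical single-domain layout with one subfolder per language/region.
ALTERNATES = {
    "en":    "https://www.example.com/product/123",
    "fr-fr": "https://www.example.com/fr-fr/produit/123",
    "fr-ca": "https://www.example.com/fr-ca/produit/123",
}

# Build the hreflang block that goes in the <head> of each of these pages.
head_block = "\n".join(
    f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
    for lang, url in ALTERNATES.items()
)
print(head_block)
```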
It all depends on what you want to do in the future. It sounds more like you want to do language translation, not geo-targeting. But again, I'll need you to confirm that to give the right answer.
-
Hi there,
What you need to do is implement hreflang tags and, just to make sure Google understands it, set geo-targeting in each of the Search Console properties.
Remember to always have both tags on both sites (as you are only working with 2 sites). Hreflang tags MUST be self-referential and point to the other versions of the page. Here are some useful resources, especially the ones from Aleyda Solis:
- Hreflang generator - Aleyda Solis
- International SEO - Moz Learning Center
- The Guide to International Website Expansion: Hreflang, ccTLDs, & More! - Moz Blog
- The International SEO Checklist - Moz Blog
- Using the correct hreflang tag - Moz Blog
- Tool for checking hreflang annotations - Moz Blog
Hope it helps.
Best of luck.
GR