Best way to deal with multiple languages
-
Hey guys,
I've been trying to read up on this and have found that answers vary greatly, so I figured I'd seek your expertise.
When dealing with the URL structure of a site that is translated into multiple languages, is it better, SEO-wise, to structure the site with language subdirectories (domain.com/en, domain.com/it, etc.)
or to simply add URL parameters (domain.com/?lang=en, domain.com/?lang=it)?
In the first case, I'm afraid Google might see my content as duplicate even though it's in a different language.
-
I'd concur with this approach; however, Google Webmaster Tools only lets you geo-target, not language-target.
You might be better off implementing rel="alternate" hreflang="x" annotations via your sitemaps to help Google understand which content is intended for which audience. See http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865
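To make that concrete, here's a rough sketch of what those sitemap annotations end up looking like. It uses Python's standard library purely to generate the XML, and the domain.com URLs and en/it language codes are placeholders taken from the question rather than anything site-specific:

    import xml.etree.ElementTree as ET

    SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    XHTML_NS = "http://www.w3.org/1999/xhtml"
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("xhtml", XHTML_NS)

    # Placeholder URLs; swap in the real localized pages.
    alternates = {
        "en": "http://domain.com/en/",
        "it": "http://domain.com/it/",
    }

    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for lang, url in alternates.items():
        url_el = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url_el, "{%s}loc" % SITEMAP_NS).text = url
        # Every <url> entry lists all language versions, including itself.
        for alt_lang, alt_url in alternates.items():
            ET.SubElement(url_el, "{%s}link" % XHTML_NS, {
                "rel": "alternate",
                "hreflang": alt_lang,
                "href": alt_url,
            })

    print(ET.tostring(urlset, encoding="unicode"))

Note that each version's entry has to reference all the other versions as well as itself; Google ignores hreflang annotations that aren't reciprocal.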
I hope this helps,
Hannah
-
Careful with this
Content in different languages shouldn't be viewed as duplicate; however, I have seen sites run into problems when they have, say, US English and UK English content that is very similar.
-
I always use the /es (subdirectory) approach, and you can use Google Webmaster Tools to geo-target the different subdirectories.
-
It's a fact that different languages are not considered duplicate content.