What's the best way to handle multiple website languages in terms of the meta tags we should use and the pages we include in our sitemap?
-
Hey everyone,
Has anyone here worked with SEO + website translations?
- When should we use a canonical or an alternate (hreflang) tag if we want users to find our page in the language they used on Google?
- Should we include all pages, in all the different locales, in the sitemap?
Looking forward to hearing from you!
Thanks!
-
Allan, the Google resource is the foundation.
However, always remember that:
-
Google suggests always using rel="canonical", even if it's only self-referential.
-
The href in each hreflang annotation must always point to a canonical URL. If it doesn't, Google will treat it as an error and ignore the hreflang.
-
If you're targeting two different languages (e.g., English and Spanish), hreflang is not strictly necessary; however, using it is one more signal you give Google about how you want your URLs to target a specific audience based on language or language/country.
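To make those rules concrete, here's a minimal sketch of the <head> markup on each language version (the example.com URLs are placeholders, not Allan's actual site):

    <!-- On https://example.com/en/page (English version) -->
    <link rel="canonical" href="https://example.com/en/page" />
    <link rel="alternate" hreflang="en" href="https://example.com/en/page" />
    <link rel="alternate" hreflang="es" href="https://example.com/es/page" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/en/page" />

    <!-- On https://example.com/es/page (Spanish version) -->
    <link rel="canonical" href="https://example.com/es/page" />
    <link rel="alternate" hreflang="en" href="https://example.com/en/page" />
    <link rel="alternate" hreflang="es" href="https://example.com/es/page" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/en/page" />

Note that each version is canonical to itself, every hreflang href points at a canonical URL, and the annotations are reciprocal (each page lists itself and all of its alternates). The x-default line is optional and names the fallback version for users whose language doesn't match any alternate.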
-
Thanks, Thomas!
I'll definitely have a look and let you know if we have any other questions.
-
This is my go-to resource. It really helps plot it all out visually.
https://hreflang.org/use-hreflang-canonical-together/
As for sitemaps, refer to this: https://support.google.com/webmasters/answer/2620865?hl=en
If you aren't 100% sure after those articles, I'll try to answer any specific questions point by point.
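Since the original question also asked about sitemaps: per that second Google article, you can carry the hreflang annotations in the sitemap instead of the <head>, with every locale's canonical URL getting its own <url> entry. A minimal sketch with placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://example.com/en/page</loc>
        <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/page" />
        <xhtml:link rel="alternate" hreflang="es" href="https://example.com/es/page" />
      </url>
      <url>
        <loc>https://example.com/es/page</loc>
        <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/page" />
        <xhtml:link rel="alternate" hreflang="es" href="https://example.com/es/page" />
      </url>
    </urlset>

So yes: every page, in every locale, gets its own entry, and each entry repeats the full set of alternates.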
-
Related Questions
-
Is it best to 301 redirect or use a canonical URL when consolidating two pages?
I have built several pages (A and B) with a high quantity of content. Page A is aged and gets lots of organic traffic, ranks for lots of valuable keywords, and has only internal links pointing to it. Page B is newer (6 months) and gets little traffic and ranks for no keywords, but has terrific content and many high-value external links. As pages A and B are related to a similar theme, I was going to merge the content from page B onto page A, but I don't know which would be the best approach for handling the links going to page B. For the purpose of keeping as much link equity as possible, is it best to use a 301 redirect from B to A or a canonical URL from B to A?
Intermediate & Advanced SEO | Cutopia
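For anyone weighing the two mechanics in that question, a minimal sketch (assuming an Apache .htaccess; the paths are placeholders):

    # Option 1: permanent redirect - visitors, bots, and link equity move to page A
    Redirect 301 /page-b /page-a

Versus the canonical alternative, placed in the <head> of page B, which leaves B accessible but signals A as the preferred URL:

    <link rel="canonical" href="https://example.com/page-a" />

When the old page's content is being merged away entirely, the 301 is the usual choice: a redirect is a directive, while a canonical is only a hint.
-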
What's the best way to A/B test a new version of your website that has a different URL structure?
Hi Mozzers, hope you're doing well. We have a website that has been up and running for a decent tenure, with millions of pages indexed in search engines. We're planning to go live with a new version of it, i.e. a new experience for our users, with some changes in site architecture, which include a change in URL structure for existing URLs and the introduction of some new URLs as well.

Now, my question is: what's the best way to do an A/B test with the new version? We can't launch it for only part of our users (say, make it live for 50% of users while the remaining 50% see only the old/existing site) because the URL structure has changed, and bots will get confused if they start landing on different versions. Would this work if I reduce the crawl rate to zero during the A/B period? How would this impact us from an SEO perspective? How would those old-to-new 301 URL redirects affect our users?

Have you ever faced/handled this kind of scenario? If yes, please share how you handled it, along with the impact. If this is something new to you, I'd love to know your recommendations before taking the final call on this.

Note: we're taking care of all existing URLs, properly 301 redirecting them to their newer versions, but there are some new URLs which are supported only on the newer version (the architectural changes I mentioned above); these URLs aren't backward compatible, so we can't redirect them to a valid URL on the old version.
Intermediate & Advanced SEO | _nitman
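If it helps to visualize the "properly 301 redirecting" step mentioned in that note, here's a sketch in Apache mod_rewrite; the old and new URL patterns are invented purely for illustration:

    # Hypothetical: old /products.php?id=123 becomes new /product/123
    RewriteEngine On
    RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
    RewriteRule ^products\.php$ /product/%1? [R=301,L]

(The trailing ? drops the old query string from the target URL.)
-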
Do you suggest I use the Yoast or the Google XML sitemap for my blog?
I just shut off the All-In-One SEO Pack plugin for WordPress and turned on the Yoast plugin. It's great! So much helpful, SEO-boosting info! In watching a video on how to configure the plugin, it mentions that I should update the sitemap using the Yoast sitemap. I'm afraid to do this, because I'm pretty technologically behind... I see I have the Google XML Sitemaps (by Arne Brachhold) plugin turned on (and have had it for many years). Should I leave this one on? Or would you recommend going through the steps to use the Yoast plugin sitemap? If so, what are the benefits of the Yoast plugin over the Google XML one? Thanks!
Intermediate & Advanced SEO | DavidC.
-
Wrong pages ranking for key terms
Hi, I have a website that was rebuilt and redesigned earlier this year, and it's struggling to rank. The problem is that the wrong pages are ranking for the key terms. For example, there is a page for 'Loft Conversions Essex', but the page that's ranking is actually the FAQ page (which doesn't mention the word 'Essex' at all). I have been through all of the usual items, and none of them seem to apply:
- The landing pages have been properly optimised (not overly so), while the pages that rank only contain the terms within the menu (the link that goes to the actual landing page)
- We thought it may be a redirect issue, since the site was a bit of a mess before the rebuild, so we removed all of the redirects and resubmitted the htaccess file, but that hasn't helped
- Internal anchor text is relevant
- There aren't a huge number of external links to the old site pages, and many of these pages didn't exist at all, so I don't think that's an issue
- Most of the pages were built at the same time, so there's no real reason why one would have more authority than another
- There are no canonicals interfering with these pages

I can't really canonical these, since we do want the pages to rank; it's just that they're all ranking for the wrong thing (so the SERPs are a lot lower than they should be). Most of these pages are pretty new, as I said, so while we have tried smaller content changes, I don't think a full refresh will really help. To make it even weirder, the pages that rank for each term change regularly, but it's never the right page. Help!

EDIT: Thanks for the responses everyone!
Intermediate & Advanced SEO | innermedia1
-
What are the effects of having multiple redirects for pages under the same domain?
Dear Mozers,

First of all, let me wish you all a very happy, prosperous, healthy, joyous & successful New Year!

I'm trying to analyze one of our websites, Web Hosting UK Com Ltd., and during this process I've had this question running through my mind. The project has been live since 2003, and since then there have been changes made to the website (obviously). New pages have been added, and some existing pages have even been overwritten with changes to their URL structures too.

Now, coming back to the question: if a particular URL structure from when the site debuted has been changed, say, three times since (for example), with a 301 redirect to every outdated structure, WOULD it impact the site's performance SEO-wise? And let's say there are hundreds of such redirections under the same domain; don't you think that after a period of time we should remove the past pages/URLs from the server? That would certainly increase 404 (page not found) errors, but those can be taken care of. How sensible is it to keep redirecting bots from one URL to the other when they only visit a site for a short stipulated time?

To make it simple, let me explain it with a real-life scenario. Say I was staying at place A, then switched to a different location in another county, say B, then to C, and so on, and finally got settled at place G. Each time I move, I leave a note of the next destination so that any courier/mail can be delivered to my current whereabouts. In such a case, there's little chance the courier would travel to all the destinations to deliver the package. Similarly, when a bot visits a domain and finds multiple redirects, don't you think it'd lose efficiency in crawling the site?

Of course, IMO the redirects are important, BUT they should be there (in htaccess) for only a period of, say, 3-6 months. Once the search engine bots know about the latest pages, the past pages/redirects should be removed. What are your opinions about this?
Intermediate & Advanced SEO | eukmark
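To put the courier analogy into .htaccess terms: the usual remedy is not to delete old redirects but to flatten the chain so every legacy URL points straight at the final destination. A sketch with placeholder paths:

    # Chained: bots hop A -> B -> C -> G, one redirect at a time
    #   Redirect 301 /page-a /page-b
    #   Redirect 301 /page-b /page-c
    #   Redirect 301 /page-c /page-g

    # Flattened: every legacy URL resolves in a single hop
    Redirect 301 /page-a /page-g
    Redirect 301 /page-b /page-g
    Redirect 301 /page-c /page-g
-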
Indexing a new website with several million pages
Hello everyone,

I am currently working for a huge classified website that will be released in France in September 2013. The website will have up to 10 million pages. I know the indexing of a website of such size should be done step by step and not all at once, to avoid a long sandbox risk and to have more control over it. Do you guys have any recommendations or good practices for such a task? Maybe some personal experience you might have had?

The website will cover about 300 jobs:
- in all regions (= 300 × 22 pages)
- in all departments (= 300 × 101 pages)
- in all cities (= 300 × 37,000 pages)

Do you think it would be wiser to index a couple of jobs at a time (for instance, 10 jobs every week) or to index by page level (for example, first the jobs by region, then the jobs by department, etc.)?

More generally speaking, how would you proceed in order to avoid penalties from Google and to index the whole site as fast as possible?

One more detail: we'll rely on a (big?) press follow-up and on a link-building effort that has yet to be determined.

Thanks for your help!

Best regards,
Raphael
Intermediate & Advanced SEO | Pureshore
-
What is the best approach to a keyword that has multiple abbreviations?
I have a site for which the primary keyword has multiple abbreviations. The site is for the computer game "Football Manager"; each iteration is often referred to as FM2012, FM12 or Football Manager 2012, and the first two can also be written with or without a space in between. While this is only 3 keywords to target, it means that every key phrase, such as "FM2012 Tactics", must also be targeted in 3 ways. Is there a recommended approach to make sure that all 3 are targeted? At present I use the full title "Football Manager" in the title and try to use the shorter abbreviations in the page; I also make sure the title tags always have an alternative, e.g. "FM2012 Tactics". Two specific questions, as well as general tips:
- Does the <abbr> HTML tag help very much?
- Are results likely to differ much for searches for "FM 2012" and "FM2012", i.e. without the space?
Intermediate & Advanced SEO | freezedriedmedia
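For reference, the <abbr> element asked about above looks like this; whether it carries any ranking weight is debatable, but it does make the expansion explicit for users and assistive technology:

    <p>The new <abbr title="Football Manager 2012">FM2012</abbr> tactics guide is out.</p>
-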
XML sitemap advice for a website with over 100,000 articles
Hi, I have read numerous articles that support submitting multiple XML sitemaps for websites that have thousands of articles... in our case, we have over 100,000. So, I was thinking I should submit one sitemap for each news category. My question is: how many page levels should each sitemap instruct the spiders to go? Would it not be enough to just submit the top-level URL for each category and then let the spiders follow the rest of the links organically? So, if I have 12 categories, will the total number of URLs be 12??? If this is true, how do you suggest handling our home page, where the latest articles are displayed regardless of their category... i.e. the spiders will find links to a given article both on the home page and in the category it belongs to. We are using canonical tags. Thanks, Jarrett
Intermediate & Advanced SEO | jarrett.mackay
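A minimal sketch of the one-sitemap-per-category setup described above, tied together with a sitemap index file (the file names are placeholders). Note that sitemaps are flat lists of URLs, so there is no notion of "page levels": any URL you want crawled is listed explicitly, and anything omitted can still be discovered organically through links. Each child sitemap can hold up to 50,000 URLs, so a large category may need more than one file.

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://example.com/sitemaps/news-sports.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://example.com/sitemaps/news-politics.xml</loc>
      </sitemap>
      <!-- ...one child sitemap per category... -->
    </sitemapindex>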