Hreflang: which implementation is correct?
-
Hello, I have a question about the best way to implement a multi-language site. Our client's website will have English and Spanish versions of the same content: English targets the US and the UK, and Spanish targets only Spain. Which hreflang nomenclature is correct, and which values would be best?

**Option 1:**

**Option 2:**

Is either of the two options correct? Which would you recommend? One more question: if a German user located in Spain does a search on Google Spain, which implementation would rank better (assuming my setup above is correct): /es-de/ or just /de/? Regards and thanks.
-
I would go with none of the above.
Your second option has "en" on the Spanish line by the way.
I would use this structure (and hreflang):
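As a rough illustration of that kind of annotation (example.com and the folder paths are placeholders, not the actual site), the set of hreflang tags for this setup might look like:

```html
<!-- Placed in the <head> of every language version of the page.
     Each page must list all alternates, including itself. -->
<link rel="alternate" hreflang="en-us" href="http://example.com/en/" />
<link rel="alternate" hreflang="en-gb" href="http://example.com/en/" />
<link rel="alternate" hreflang="es-es" href="http://example.com/es/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```

Note that the UK value is en-gb (the ISO 3166 country code is GB), not en-uk, and that x-default tells Google which URL to show users who match none of the listed locales.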
Also, once you're done, test your markup with Flang. And don't forget to annotate your inner pages as well, not just the home page.
-
You'll want the shortest, most technically feasible URL structure based on whatever website platform you're using. Subfolders like http://example.com/es/ and http://example.com/en/ are ideal, and there is a lot of discussion of subdomains vs. subfolders on Moz. Keep in mind you also have an http://example.com landing page, so you'll either need to redirect users or let visitors select a language on that home page. Google has thorough documentation on how to use the hreflang attribute in each case: https://support.google.com/webmasters/answer/189077?hl=en
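That same Google documentation also allows hreflang to be declared in an XML sitemap instead of in each page's head, which can be easier to maintain on some platforms. A minimal sketch, assuming placeholder example.com URLs with /en/ and /es/ subfolders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- Every <url> entry repeats the full set of alternates, itself included -->
  <url>
    <loc>http://example.com/en/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://example.com/en/"/>
    <xhtml:link rel="alternate" hreflang="es" href="http://example.com/es/"/>
  </url>
  <url>
    <loc>http://example.com/es/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://example.com/en/"/>
    <xhtml:link rel="alternate" hreflang="es" href="http://example.com/es/"/>
  </url>
</urlset>
```

The sitemap method keeps the annotations in one file, which helps when the site has many inner pages to pair up.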
I leave my default language at the base URL and put additional languages in subfolders. Statistically, I tend to rank higher for keywords in my default language than for the translated keywords in additional languages. You might want to target the market with the most traffic or conversions (whichever metric you prioritize) with the default URL and then add the other languages as subfolders, preferably using a plain language code (/en/, /es/) rather than a hyphenated or underscored locale (/en-us/, /en_us/). Note that in the hreflang values themselves, the UK is en-gb, not en-uk. Shorter URLs are more for your visitors than a particular ranking factor, but they are preferable.
Related Questions
-
If I use content from DomainA on DomainB, but spread it out, how do I implement the canonical tag?
Hey community, I have a question regarding canonical tags.
Intermediate & Advanced SEO | ElliPirelli
I took the content of one of my domains (nameA.com/ContentA) and copied it to another domain, split across several pages: nameB.com/ContentA1
nameB.com/ContentA2
nameB.com/ContentA3
and so on. So I divided the content from domainA across several pages of domainB. The reason is that my client wants to build a new business on domainB and wants to use the exact same content from domainA, because he can't afford another copywriter at the moment (and he doesn't want to rewrite it himself). The problem: domainA is ranking for this content, and he wants to keep those rankings until domainB ranks similarly (for the same keywords, of course). So my question is: can I put a canonical tag on domainA?
My thoughts are: not a single page of domainB is 100% duplicate content, as it's always only partially the same. Can I just choose one of those pages on domainB as the target of the canonical tag? Or do I need to create a "view-all" page on domainB, with all the content put together so it's a 100% duplicate of domainA, and then point a canonical tag at that "view-all" page? If I do so, do I also need to put canonicals on every single page of domainB, linking to this "view-all" page?
IMPORTANT: Would the other pages of domainB then still be ranked/listed in the SERPs, or only the "view-all" page? I would really appreciate your help, as I have been searching for answers to this specific problem for more than a week... Thank you! Best regards
-
Do you suggest I use the Yoast or the Google XML sitemap for my blog?
I just shut off the All-In-One SEO Pack plugin for WordPress and turned on the Yoast plugin. It's great! So much helpful, SEO-boosting info! In watching a video on how to configure the plugin, it mentions that I should update the sitemap using the Yoast sitemap. I'm afraid to do this, because I'm pretty technologically behind... I see I have the Google XML Sitemaps plugin (by Arne Brachhold) turned on (and have had it for many years). Should I leave this one on, or would you recommend going through the steps to use the Yoast sitemap? If so, what are the benefits of the Yoast plugin over the Google XML one? Thanks!
Intermediate & Advanced SEO | DavidC.
-
BingPreview/1.0b User Agent Adding a Trailing Slash to All URLs
The BingPreview crawler, which I think exists to take snapshots of mobile-friendly pages, crawled my pages last night for the first time. However, it is adding a trailing slash to the end of each of my dynamic URLs. As a result, my program serves the wrong page, because it does not expect a trailing slash at the end of the URLs. It was 160 pages this time, but I have thousands of pages it could do this to. I could try a mod_rewrite rule, but that seems like it should be unnecessary. All the other crawlers are crawling the proper URLs, and none of my hyperlinks have the slash on the end. I have written to Bing to tell them about the problem. Is anyone else having this issue? Any other suggestions for what to do? The user agent is: Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A465 Safari/9537.53 BingPreview/1.0b
Intermediate & Advanced SEO | friendoffood
-
We're currently not using schema markup on our website. How important is it? And are websites across the globe using it?
Schema markup looks like an important way to structure your website and ensure the crawl bots get all the details. I've been reading a lot of articles around the web, and most of them say that schemas are important, yet very few websites are using them. Why so? Is the schema.org vocabulary here to stay, or am I wasting my time?
Intermediate & Advanced SEO | Shreyans92
-
Which search engines still use Meta Keywords?
I know Google doesn't use the meta keywords tag, but I was wondering if there are smaller search engines that still do. Is it worth adding meta keywords for them?
Intermediate & Advanced SEO | jhinchcliffe
-
How many times should a keyword be used in the body text?
We employ an outside agency to write content for our website, as we do not have the ability in-house to write unique, good-quality content. They have just sent an article of around 300 words. I told them the keyword phrases to use, but when I got the document there was only one instance of each keyword phrase in it. There seems to be a conflict here between posts I have read and general SEO advice as to how many times a keyword should be present (SEOmoz indicates 4 times, for instance); our outside agency says it doesn't matter. And if I have a page optimised for 2 keywords, this starts making things tricky and probably looks keyword-stuffed to the reader. Assuming the keywords are present once in the meta tags, H1, meta description, and alt text, what do people think is best practice, taking into account the recent Panda updates? Thoughts appreciated. Thanks, Craig
Intermediate & Advanced SEO | Towelsrus
-
Meta Keywords: Should we use them or not?
I am working through our site and see that meta keywords are being used heavily and unnecessarily. Each of our info pages has 2 or 3 keyword phrases built into it. Should we just duplicate those keyword phrases into the meta keywords field, add further keywords beyond them, or not use the field at all? Thoughts and opinions appreciated.
Intermediate & Advanced SEO | Towelsrus
-
Could you use a robots.txt file to disallow a duplicate-content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple of spots in the site navigation. The site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Would you consider this a workable/acceptable solution?
Intermediate & Advanced SEO | gregelwell