Seeking guidance on setting up hreflang en-gb for an international English website and en-us for a North American site
-
Our website is configured like so:
MyCompany.com Websites
- /en-gb - International English
- /fr-fr
- /zh-hans
- /m/en-us - North American site - completely different structure
The first three sites share a Drupal instance, while the North American site uses a different PHP framework and has its own unique look and structure.
Currently none of the websites have hreflang tags, which means that when searching in the US, the en-gb results sometimes creep in. I want to turn on hreflang tags for the international English website (en-gb), but my fear is that Google may not return the en-gb results to English-speaking users who are not in the UK. We want these results to appear for anyone outside the US who speaks English.
Just a note, Canada is not included in this since they'll be added to the North American site soon and will have their own hreflang tags.
-
Can I have multiple defaults for a single URL when not using x-default?
Could you explain better?
Hreflang is an alternate annotation... so, for instance, on the en-US home page you must indicate the alternative home pages for en (meant for global use of the en-GB, British version of your site), fr-FR (or fr only), en-CA, and zh-hans, and vice versa on all the pages.
If you mean whether the x-default can be different on a URL-by-URL basis - for instance, if you want to set up a British product page as the default for all users not targeted with a specific geotargeted version of the same product page - in theory that is possible, because hreflang is URL specific and not domain wide.
That said, you should always state that the British version is meant for all English-speaking users apart from the geotargeted ones (hreflang="en").
The x-default will tell Google to show the British version URL to all users from countries and languages not specifically geotargeted (e.g. Spanish-speaking users from Spain).
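To make that concrete, here is a sketch of what the annotation block might look like, assuming the folder structure described in the question (the full URLs are illustrative). The same set is repeated identically in the head of every alternate version, and each page's set includes a self-reference; en-ca points at the North American site here only because of the stated plan to fold Canada into it.

```html
<!-- Placed in the <head> of EVERY alternate version of the home page. -->
<!-- URLs are illustrative; adapt paths to the real site structure. -->
<link rel="alternate" hreflang="en-us" href="https://www.website.com/m/en-us/" />
<link rel="alternate" hreflang="en-ca" href="https://www.website.com/m/en-us/" />
<link rel="alternate" hreflang="en" href="https://www.website.com/en-gb/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.website.com/fr-fr/" />
<link rel="alternate" hreflang="zh-Hans" href="https://www.website.com/zh-hans/" />
<link rel="alternate" hreflang="x-default" href="https://www.website.com/en-gb/" />
</link>
```

Note the combination: hreflang="en" catches English speakers in any country not matched by en-us or en-ca, and x-default catches everyone else (non-English, non-targeted).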
-
Thank you for the detailed response, Danial. Our US website only does business in the Americas (Canada, US, Latin America), which is the reason that setting it as the international site does not make sense to us.
Taking the feedback in, it seems that I could get by with the following. Please correct me as needed.
Americas Website: www.website.com/en-us
- hreflang US: en-us
- hreflang Canada: en-ca

International English Website: www.website.com/en-gb
- hreflang: en

French Website: www.website.com/fr-fr
- hreflang: fr-fr or fr

Chinese Website: www.website.com/zh-hans
- hreflang: zh-hans
-
If you want en-gb to be your standard English-language subfolder of the site, I would just use the first part (language) of the hreflang: en. The gb part is about country targeting, which is actually an optional element. That way US traffic will go to en-us, and the rest will go to your en-gb subfolder.
I would say, though, that maybe US English is more of an "international standard" than British English.
Additionally, if the home page of your domain is a country selector (see http://www.emirates.com/index.aspx for an example), then it's a good idea to use x-default. So if you want ALL visitors to first go to your home page, then have them select the country and language which is most appropriate for them, you can use the x-default annotation. This lets Google and Yandex know that regardless of country or language, all visitors should go to that home page first. There's more about x-default here: http://googlewebmastercentral.blogspot.ae/2013/04/x-default-hreflang-for-international-pages.html
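A sketch of that country-selector pattern, assuming a hypothetical selector page at the domain root (URLs are illustrative):

```html
<!-- The root country selector is the fallback for any unmatched language/country. -->
<link rel="alternate" hreflang="x-default" href="https://www.website.com/" />
<link rel="alternate" hreflang="en-us" href="https://www.website.com/m/en-us/" />
<link rel="alternate" hreflang="en" href="https://www.website.com/en-gb/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.website.com/fr-fr/" />
<link rel="alternate" hreflang="zh-Hans" href="https://www.website.com/zh-hans/" />
```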
Hope that helps
-
Thanks Tom. I had already stumbled upon two of these articles in my research of hreflang tags, and they are filled with very helpful information.
After breezing through the articles rather quickly, it seems that if I add the en-us hreflang tags AND the en-gb tags, the following is likely to occur:
- US searches: EN-US results should get priority, followed by EN-GB.
- UK searches: EN-GB results should get priority, followed by EN-US.
- For searches outside of both the US and UK, Google will revert to showing the most relevant result without bias to the hreflang tag.
My conclusion is that maybe I should not set hreflang for EN-GB, since we do not necessarily want the results to favor the UK - we want these results to populate searches everywhere except the US.
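For reference, the language-only alternative suggested above would be a minimal set like this (URLs illustrative). As I understand it, hreflang="en" would not bias results toward the UK - it targets English speakers in any country not matched by a more specific tag:

```html
<!-- en-us wins for US searchers; en covers English speakers everywhere else. -->
<link rel="alternate" hreflang="en-us" href="https://www.website.com/m/en-us/" />
<link rel="alternate" hreflang="en" href="https://www.website.com/en-gb/" />
```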
Thoughts and feedback would be greatly appreciated.
-
Hi there
I'd highly recommend going through Aleyda Solis' international SEO posts here on the Moz blog. They can teach you how to prepare for international SEO, how to approach site structure, and how to generate relevant code and hreflang tags.
Here is her international SEO checklist
Here is her Hreflang blog post and generator tool
And 40 tools to help advance your international SEO
They're great reading and nothing that I'd be able to add to, so I hope this helps!