Hreflang for Canadian web visitors (when their browsers are set to en-us)
-
We're in the process of implementing hreflang markup for Canadian & US versions of a website.
We've found that about half of our Canadian traffic comes from browsers set to en-us (instead of en-ca, as we'd expect). Should we be concerned that Canadians with en-us browser settings will be shown the US versions of the website, since the hreflang markup would label the US version of each page as 'en-us'?
Our immediate thought is that since they're likely to be searching from Google.ca and will also have Canadian IP addresses, this won't be an issue. Does anyone have any other thoughts here?
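For illustration, a minimal sketch of the annotations in question, using hypothetical example.com URLs. Each regional page carries the full set of alternates (including itself), and the pages must reference each other reciprocally:

```html
<!-- Hypothetical URLs for illustration; placed in the <head> of BOTH pages -->
<link rel="alternate" hreflang="en-us" href="https://example.com/page/" />
<link rel="alternate" hreflang="en-ca" href="https://example.com/ca/page/" />
<!-- Optional fallback for searchers who match neither pair -->
<link rel="alternate" hreflang="x-default" href="https://example.com/page/" />
```

Worth noting: hreflang influences which URL Google surfaces in search results rather than redirecting anyone based on browser headers, and in practice the search locale (e.g. Google.ca) and the searcher's location weigh heavily. So an en-us browser setting alone shouldn't lock Canadians out of the en-ca page, though that's exactly the behaviour worth spot-checking after launch.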
-
I don't have hard evidence, but from my personal perspective: my browser is set to nl-be (Dutch, Belgium). When I'm in the Netherlands (nl-nl), I am automatically redirected to google.nl and all the results I get are from the Netherlands (even for international sites where Belgian Dutch versions exist). Browser language will have an impact, but in my opinion proximity will be more important.
Dirk
Related Questions
-
Implications of extending browser caching for Google?
I have been asked to leverage browser caching on a few scripts in our code:
http://www.googletagmanager.com/gtm.js?id=GTM-KBQ7B5 (16 minutes 22 seconds)
http://www.google.com/jsapi (1 hour)
https://www.google-analytics.com/plugins/ua/linkid.js (1 hour)
https://www.google-analytics.com/analytics.js (2 hours)
https://www.youtube.com/iframe_api (expiration not specified)
https://ssl.google-analytics.com/ga.js (2 hours)
The time beside each link is the cache expiration set by the script owners. I'm being asked to extend it to 24 hours, and part of this task is making sure that's a good idea; it would not be in our best interest to do something that disrupts the collection of data. Some of what I'm reading recommends keeping a local copy, which would mean missing updates from GA/GTM, or creating a cron job to download any updates on a daily basis. Another concern: would caching these cause a delay or disruption in collecting data? That's an unknown to me, though it may not be to you. There is also the concern that Google recommends not caching these beyond their own settings. Any help on this is much appreciated. Do you see any issues, risks, or benefits to doing this from your perspective?
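If the local-copy route were taken, the daily cron-job refresh mentioned above could look something like this. This is a hedged sketch: the destination path and the 03:15 schedule are illustrative assumptions, and analytics.js stands in for whichever script is being self-hosted.

```shell
# Illustrative crontab entry: re-download Google's analytics.js once a day
# into the locally served path, so the self-hosted copy never drifts far
# behind Google's version. Path and schedule are placeholders.
15 3 * * * curl -sSf https://www.google-analytics.com/analytics.js -o /var/www/static/js/analytics.js
```

The trade-off stands as described in the question: Google sets short cache lifetimes precisely so fixes propagate quickly, so self-hosting shifts update responsibility onto you.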
Intermediate & Advanced SEO | chrisvogel
-
Hreflang tag on links to alternate language site
Hey everyone! In the interest of trying to be brief, here's the situation in my favorite form of communication, bullet points!
- Client has two sites; one is in English and one is in Japanese
- Each site is a separate URL, no sub-domains or sub-pages
- Each main page on the English version of the site has a link to the homepage of the Japanese site
- Site has decent rankings overall, with room for improvement from page 2 to page 1
- No hreflang tags currently used in links to the Japanese version from the English version
Given that the site isn't really suffering for most rankings, would this be helpful to implement on the English version? Ideally, I'd like each link to be updated to point at the corresponding subject matter on the Japanese site, but in the interim it seems like identifying to Google that the page on the other side is in a different language might help both the user and maybe those page-two rankings creep a little higher to page one. Thanks for reading; I appreciate your time.
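One point of terminology worth flagging: hreflang annotations don't go on the `<a>` links themselves; they are `rel="alternate"` declarations, typically in the page `<head>` (or in a sitemap or HTTP header). A hedged sketch with hypothetical domains standing in for the client's:

```html
<!-- In the <head> of an English page; the matching Japanese page must carry
     the same pair pointing back, or the annotation is ignored (hreflang
     must be reciprocal). -->
<link rel="alternate" hreflang="en" href="https://example.com/about/" />
<link rel="alternate" hreflang="ja" href="https://example.jp/about/" />
```

This would pair each English URL with its Japanese counterpart once the corresponding-page mapping described above exists; until then, pairing only the homepages is the usual interim step.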
Intermediate & Advanced SEO | Etna
-
Can anyone see any issues with the canonical tags on this web site?
The main domain is http://www.eumom.ie/ and these are some of the core pages:
http://www.eumom.ie/pregnancy/
http://www.eumom.ie/getting-pregnant/
Any help from the Moz community is much appreciated!
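For anyone auditing along: the thing to check in the `<head>` of each core page is the canonical tag, which for a page that is its own preferred version should be self-referencing, e.g.:

```html
<!-- What one would expect to find on http://www.eumom.ie/pregnancy/ -->
<link rel="canonical" href="http://www.eumom.ie/pregnancy/" />
```

Common issues to look for are canonicals pointing at the homepage from every page, mismatched www/non-www or trailing-slash variants, and pages missing the tag entirely.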
Intermediate & Advanced SEO | IcanAgency
-
How to resolve duplicate content issues when using geo-targeted subfolders to separate US and CAN
A client of mine is about to launch into the USA market (they currently operate only in Canada) and they are trying to find the best way to geo-target. We recommended they go with the geo-targeted subfolder approach (___.com and ___.com/ca). I'm looking for ways to avoid getting these pages flagged for duplicate content. Your help is greatly appreciated. Thanks!
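Under the subfolder approach, the usual way to head off duplicate-content flags is reciprocal hreflang between the two regional variants — it tells Google the pages are intentional alternates rather than accidental duplicates. Since the real domain is withheld above, here is a sketch with hypothetical URLs using the sitemap form of hreflang (the `xhtml:link` elements), which keeps the annotations in one place:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/pricing/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://example.com/pricing/"/>
    <xhtml:link rel="alternate" hreflang="en-ca" href="https://example.com/ca/pricing/"/>
  </url>
  <!-- A matching <url> entry for the /ca/ page carries the same two alternates -->
</urlset>
```

The same pairs could equally be emitted as `<link rel="alternate" hreflang="…">` tags in each page's `<head>`; either mechanism works, but the set of alternates must be identical on both sides.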
Intermediate & Advanced SEO | jyoung222
-
Overly-Dynamic URLs & Changing URL Structure w Web Redesign
I have a client that has multiple apartment complexes in different states and metro areas. They get good traffic and pretty good conversions, but the site needs a lot of updating, including the architecture, to implement SEO standards. Right now they rank for "<brand_name> apartments" everywhere, but not for "<city_name> apartments". Their current architecture displays URLs like:
http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=overview
http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=floorplans&floorPlanID=121
I know it is said to never change the URL structure, but what about this site? I see this URL structure being bad for SEO, bad for users, and it basically forces us to keep the current architecture. They don't have many links built to their community pages, so will creating a new URL structure and doing 301 redirects to the new URLs drastically drop rankings? Or is this something we should bite the bullet on now for the sake of future rankings, traffic, and a better architecture?
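If the migration went ahead, the 301 mapping for URLs like those above could be sketched in Apache mod_rewrite roughly as follows. This is a hedged illustration: the clean URL path is invented, and a real migration would need one rule (or a RewriteMap) per old URL.

```apache
# Illustrative .htaccess sketch: 301 one query-string community URL to a
# hypothetical clean equivalent. The ID and target path are placeholders.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^mainLevelCurrent=communities&communityID=28&secLevelCurrent=overview$
RewriteRule ^index\.php$ /communities/city-name-apartments/? [R=301,L]
```

The trailing `?` on the substitution strips the old query string so it doesn't get appended to the new URL. With few external links to the community pages, a clean one-to-one redirect map generally limits the ranking risk.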
Intermediate & Advanced SEO | JaredDetroit
-
Setting up Google Analytics for domains with 301 redirects
I have a client with a Google Analytics account that is a mess:
domaina.com
domainb.com 302s to domaina.com
domainc.com 302s to domaina.com
domaind.com 302s to domaina.com
I thought the client was doing 301s on all these domains to the primary domain. I have logged into their analytics account and found data is being tracked on the other domains, i.e. domainb.com, domainc.com, domaind.com. How is it possible that Google Analytics is tracking data on these domains when no analytics code has been created and the URLs are redirecting to domaina.com? Also, since there are no sites on these domains, should I enable domain verification through a CNAME on the DNS for Webmaster Tools? And how can I best set up a way to track traffic coming from, say, domainb.com? What's the best step-by-step guide for setting this up?
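On the last question — attributing traffic that arrives via the redirected domains — one hedged approach is to point each redirect at a campaign-tagged URL, so the source domain surfaces in the primary property's campaign reports. The utm values below are illustrative, not prescribed:

```
# Sketch: campaign-tag the redirect targets so the originating domain is
# visible in GA on domaina.com (values are placeholders)
domainb.com/*  ->  301  ->  https://domaina.com/?utm_source=domainb&utm_medium=redirect
domainc.com/*  ->  301  ->  https://domaina.com/?utm_source=domainc&utm_medium=redirect
```

One caveat: this changes the landing URL, so the tagged variants should be kept out of canonicals and internal links to avoid creating indexable duplicates.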
Intermediate & Advanced SEO | JohnW-UK
-
How important is it to set "priority" and "frequency" in sitemaps?
Has anyone ever done any testing on setting "priority" and "frequency" in their sitemaps? What was the result? Does specifying priority or frequency help quite a bit?
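For reference, the two optional fields in question (`<priority>` and `<changefreq>` in the sitemap protocol), shown in a hypothetical sitemap entry. The protocol defines both as hints only, and it is widely reported that Google gives them little to no weight:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/archive/2012/</loc>
    <changefreq>yearly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```

Priority is relative within your own site only (default 0.5), so setting every URL to 1.0 conveys nothing.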
Intermediate & Advanced SEO | nicole.healthline