Canonical Question: Root Domain Geo-Redirects to Subfolder
-
Howdy,
Working on a large eComm site that 302s you based on your location. With that in mind, should I canonicalize the final pages?
domain.com => 302 => domain.com/us/, domain.com/fr/, etc. (Should these all have a canonical pointing to the root domain.com?)
-
Thanks for the tips man!
-
To be honest, I don't think it will make a difference whether it goes to the /us/ version or to the root.
If you prefer, you could keep the US version on the root and only redirect non-US visitors to a country version.
Dirk
-
My only concern is domain.com/us/ showing up on Google instead of domain.com.
Is there anything I can do to keep the SERP juice going to domain.com instead of the subfolder?
-
As far as I understand there is no content on domain.com itself, so that last question doesn't really apply.
If you want the default version to be the US version, you should set the x-default hreflang annotation to point to domain.com/us/.
Don't forget that hreflang needs to be placed on every page of your site - you can check if the implementation is correct here: http://flang.dejanseo.com.au/
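As a rough sketch (using the placeholder domain and country folders from the question), the annotations on domain.com/us/ could look like this:

```html
<!-- In the <head> of domain.com/us/, mirrored on every other country version -->
<link rel="alternate" hreflang="en-us" href="http://domain.com/us/" />
<link rel="alternate" hreflang="fr-fr" href="http://domain.com/fr/" />
<!-- x-default marks the version to show when no hreflang annotation matches -->
<link rel="alternate" hreflang="x-default" href="http://domain.com/us/" />
```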
Dirk
-
Dirk,
Great thoughts! We're currently talking through our long-term international strategy. We're running about 20 local sites across three brands. Some are on subfolders, some on subdomains, and some on ccTLDs... so this is pretty tough right now.
We luckily caught an issue with Googlebot not being able to access the international versions and corrected it, so I think we're safe on that front.
Most of the regions are cross-accessible (Europe/APAC/North America), but you can't get from Asia to Europe from the site if you need to. So that's on our radar!
-
So in this case we don't need to point a canonical from the subfolder to the root, but I need something like...
So then... will domain.com/us/ start ranking on google.com, or will domain.com rank there?
-
Be careful when redirecting based on IP - you have to make sure that Googlebot (which accesses your site from a Californian IP) can reach the non-US versions. If you have a link on each page to switch to another country's version (and those pages are accessible without being redirected), you should be OK.
An alternative to IP-based redirection is to use your main domain for a country-select page and store the selection in a cookie, so you can redirect to the chosen version on subsequent visits. Check volvocars.com as an example. The advantage of this method is that you give control to the user (I personally find it quite annoying to be redirected to the local version when I'm abroad and want to visit my "home" version).
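For what it's worth, a minimal sketch of that cookie-based flow (Flask, with hypothetical route names, cookie name, and country folders) could look like this:

```python
# Sketch of the cookie-based country selection Dirk describes:
# never force-redirect on the first visit, remember the user's explicit
# choice, and leave every country version reachable for Googlebot.
from flask import Flask, make_response, redirect, request

app = Flask(__name__)

SUPPORTED = {"us", "fr", "de"}  # hypothetical country folders


@app.route("/")
def root():
    # Only redirect visitors who already chose a country on a prior visit.
    chosen = request.cookies.get("country")
    if chosen in SUPPORTED:
        return redirect(f"/{chosen}/", code=302)
    # First-time visitors (and Googlebot) get a plain country-select page
    # with ordinary links to every version - no forced IP-based redirect.
    return (
        '<p>Choose your site:</p>'
        '<a href="/choose/us">United States</a> | '
        '<a href="/choose/fr">France</a> | '
        '<a href="/choose/de">Germany</a>'
    )


@app.route("/choose/<code>")
def choose(code):
    if code not in SUPPORTED:
        return redirect("/", code=302)
    # Remember the explicit choice, then send the visitor to that version.
    resp = make_response(redirect(f"/{code}/", code=302))
    resp.set_cookie("country", code, max_age=60 * 60 * 24 * 365)
    return resp
```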
rgds,
Dirk
-
If you are just changing the content a bit based on location, I think canonicalizing them all back to the root page is OK.
If this is redirecting based on translations, you should look into using the hreflang tag. It tells the search engines that there are alternate versions of the page in different languages.
Here are some resources for you.
- https://support.google.com/webmasters/answer/189077?hl=en
- https://moz.com/blog/hreflang-behaviour-insights
Once that is in place, each page can canonicalize to itself.
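To illustrate, using the placeholder folders from the question, the head of domain.com/fr/ would then combine a self-referencing canonical with its hreflang alternates, roughly like this:

```html
<link rel="canonical" href="http://domain.com/fr/" />
<link rel="alternate" hreflang="fr-fr" href="http://domain.com/fr/" />
<link rel="alternate" hreflang="en-us" href="http://domain.com/us/" />
<link rel="alternate" hreflang="x-default" href="http://domain.com/us/" />
```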