International Site Geolocation Redirection (best way to redirect and allow Google bots to index sites)
-
I have a client with an international website. The website currently uses IP detection to redirect visitors to the subdomain for their country. So far they have only launched the Australian site and are not yet open to the rest of the world: https://au.domain.com/
Google is not indexing the Australian website or its pages; I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages, so only the US 'coming soon' page is being properly indexed. So, what is the best way to set up a geolocation redirect without creating a splash page for selecting a location? User friendliness is most important (so we don't want cookies, etc.).
I have seen the great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me exactly the best method for redirection, except at about 10:20 where it tells me that what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none spell out the best method, and some cover slightly different scenarios...
I need US visitors to see the US coming soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects and .htaccess redirects, but unfortunately my technical knowledge of how these affect Google's bots doesn't get me very far.
Appreciate your answers.
Cheers, Lincoln
-
Both should happen.
You should design your coming soon page in a way that allows visitors to reach the AU version in the meantime. Perhaps even add a newsletter sign-up form...
If you are already getting links, then Google is already aware of your site. It was probably not indexing the AU version because you were forcing it to go to the US version, which is an "under construction" page.
-
Actually, one last minor touch!
In this case, the US site is a 'coming soon' page. Is it necessary at this point to put the 'Choose Location' link on it (so that Googlebot follows the link to find the AU site, since it will initially be redirected to the US site), or will Googlebot find the AU site naturally through our other SEO efforts (and be able to index it because it followed links that won't redirect it to the US site)?
-
Amazing. Exactly what I wanted to hear. After all the other posts I've read, I think you've summed it up perfectly; most of them really didn't answer the question technically or as specifically.
I agree with the setup proposed above and will see if my client's dev team can implement it for him.
Thanks, Lincoln
-
Subfolders ARE the best approach
As to your options, I would go with A. But if a visitor goes directly to, say, the Canadian version (domain.com/ca), I wouldn't redirect them, even if they previously accessed domain.com, were redirected to the AU version on that first visit, and a cookie/session variable was created to store that.
Let me put this differently.
An AU visitor accesses domain.com; you redirect them to domain.com/au (and you don't create any cookie, as the user didn't actually select a specific location).
The visitor comes back, is redirected to the AU version again, but chooses to change the country to CA; they are then redirected to domain.com/ca (a cookie/session variable is now created, because the user actively chose another option).
The visitor accesses domain.com again (they have the cookie/session variable) and is redirected to the CA version, regardless of being in Australia.
The visitor comes back, but this time types domain.com/au instead of the naked domain. They have the cookie, but I wouldn't redirect them... I figure they typed the AU URL because they WANTED the AU version.
That's what I would do. However, you can choose to redirect them to the CA version anyway, since they have a cookie/session variable stored. That's up to you.
As for the 302: what I meant is that every redirect you issue here should return a 302 status code, not a 301, because 301s can be cached by the browser and the user will then be "forced" into the redirect. Example: they are on the AU page and choose to go to CA; if you issue a 301 (instead of a 302), then the next time they access the AU version the BROWSER itself redirects them to the CA version.
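To make that concrete, here's a rough sketch of the whole subfolder flow (I'm assuming a Node/Express stack purely for illustration; `countryFromIp()` stands in for whatever GeoIP lookup your dev team already has, and the cookie and route names are made up):

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Hypothetical helper: map the visitor's IP to a country folder ("au", "ca", ...).
function countryFromIp(ip: string | undefined): string {
  return "au"; // placeholder for a real GeoIP lookup
}

// Naked domain: auto-redirect, but an explicit earlier choice (cookie) wins.
app.get("/", (req, res) => {
  const folder = req.cookies.location ?? countryFromIp(req.ip);
  res.redirect(302, `/${folder}/`); // 302 so the browser doesn't cache it like a 301
});

// Footer country switcher: the only place the preference cookie gets created.
app.get("/choose/:country", (req, res) => {
  res.cookie("location", req.params.country, { maxAge: 365 * 24 * 3600 * 1000 });
  res.redirect(302, `/${req.params.country}/`);
});

// A direct request to /au/ or /ca/ is served as-is: no redirect, even when a
// preference cookie exists, because the visitor asked for that version.
app.get("/:country/", (req, res) => {
  res.send(`Store for ${req.params.country.toUpperCase()}`);
});

app.listen(3000);
```

The key points are that only the naked domain ever auto-redirects, the cookie is only created on an explicit choice, and every redirect is a 302.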
Hope that clears it up.
-
Hi Frederico,
Understood regarding the subdomains. I've always thought subfolders to be a cleaner and more professional approach, especially when it comes to SEO.
What would I ask for, from a technical standpoint, for each of the following two options? Appreciate you clarifying for me:
Option 1 (best option) - subfolders
An Australian visitor visits domain.com and is redirected to domain.com/au, and the website remembers that this person is Australian. The same Australian visitor then visits the site from the US and domain.com/au is shown.
As with Logitech, they have the option to select a different country via a link at the bottom of the page, and if they do so, the website remembers it for next time.
Option 2 - subdomains
Idea A: An Australian visitor visits domain.com and is redirected to au.domain.com on their first visit. domain.com remembers this preference and redirects them there every time thereafter.
As with Logitech, the user has the option to select a different location, which would update domain.com to remember that location next time.
Idea B: An Australian visitor visits domain.com, and the first time they visit they are prompted to select a location. The site remembers the preference and redirects them there every time thereafter.
As with Logitech, the user has the option to select a different location, which would redirect them through domain.com to update their location preference.
I'm not sure I follow you on the 302 redirect. You mean once the fix has been coded?
Thanks Frederico!
-
Glad I was of help.
I do have some technical knowledge of redirects; however, as we are dealing with subdomains here, I'm not sure cookies will work. Remember that x.domain.com is a different host than y.domain.com, so a cookie created by x.domain.com (unless it is explicitly set with Domain=.domain.com) isn't sent to y.domain.com.
I've checked a couple of sites that do this kind of redirection and could hardly find an example using cookies across subdomains; I found lots of them using subfolders (domain.com/au/, etc.), since the cookie is valid for all subfolders.
How about forgetting a "global" cookie and just using one for the particular subdomain (if you still want to go the subdomain route)? Here's how it would work:
domain.com redirects to the "best match" (US visitors go to the US version, AU visitors go to the AU version, everyone else goes to whatever version you consider the default).
Then, on each subdomain, you implement the lightbox pop-up (the least intrusive one you can come up with) and save the response, so if the user goes straight to au.domain.com the next day they won't be prompted again to change location. BUT if they access domain.com (as a US visitor), they will be redirected to the US version and get the lightbox again.
You end up with "basically" the same results; however, it could be a little annoying for some users. At least I know I would be annoyed if that happened to me.
Give it a day and consider whether subfolders aren't better in your case; that would solve all of these problems, and implementation would be as easy as 1, 2, 3 (I can help you with that approach). You wouldn't even need cookies, just session variables (although cookies would let you remember the user's choice for any time frame you want).
Oops, forgot to mention: use 302 redirects.
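In case it helps your devs, here's a minimal sketch of the per-subdomain prompt idea (Node/Express is just an assumption on my part; the GeoIP helper, cookie name and routes are all made up for illustration). The naked-domain redirect itself would look much like the earlier subfolder sketch, only pointing at subdomains:

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express(); // imagine this instance serving au.domain.com
app.use(cookieParser());

// Hypothetical GeoIP helper returning the visitor's "best match" subdomain.
function bestMatchSubdomain(ip: string | undefined): string {
  return "us"; // placeholder: "au" for Australian IPs, "us" otherwise, etc.
}

// The AU homepage: never redirect; just decide whether to show the lightbox.
// The cookie is scoped to au.domain.com only, which is the limitation above.
app.get("/", (req, res) => {
  const showLightbox =
    bestMatchSubdomain(req.ip) !== "au" && !req.cookies.locationPromptAnswered;
  res.send(`<h1>AU store</h1>${showLightbox ? "<!-- lightbox markup here -->" : ""}`);
});

// Called when the visitor closes the lightbox or clicks "stay here".
app.get("/dismiss-location-prompt", (req, res) => {
  res.cookie("locationPromptAnswered", "1", { maxAge: 365 * 24 * 3600 * 1000 });
  res.redirect(302, "/"); // 302, as mentioned above
});

app.listen(3000);
```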
-
Frederico, this is exactly the advice I was looking for. Just the simple statement that bots should be treated like any other user and not 'forced' is obvious logic that I'd overlooked.
I believe the best scenario for my situation is then the same as Logitech's:
- User enters and is redirected by location when they visit domain.com
- When redirected to us.domain.com or au.domain.com, they then have the option to change location via a link in the footer, and thanks to cookies they are not forced into a location again
Now to have the developers figure out how to code this. Could you shed light on the technical terminology for exactly what style of redirection this is? IP-based redirection with cookies, plus a choose-location link that updates the cookie?
Cheers, Linc.
-
A few Webmaster videos ago, Google's Matt Cutts pointed out that Googlebot should be treated exactly the same as a regular person visiting your site, which you are currently doing.
However, you are currently FORCING users to stay on "their" corresponding location; instead, you should "suggest" a location but not force it.
Example: a user accesses the naked domain, domain.com. You check their IP and redirect them to the appropriate location. In this case you must use some kind of "we already redirected them" mechanism so you don't keep forcing the user onto a specific country subdomain; you can use either sessions or cookies. Once you redirect, you create a cookie or session variable saving the option. You now have the visitor in the location you want, and you should offer an easy way to switch location (check live examples, like logitech.com), for example a drop-down menu in the footer. Now, IF a user accesses a location directly, say au.domain.com, you shouldn't do the automatic redirection; instead, you could bring up a lightbox pop-up suggesting the user go to their "best match" location.
Using the above method allows Google to access any page without being forced to a specific location, and, from my point of view, it is also the easier and friendlier approach for users. If I type au.domain.com (while in the US) I probably want to see the AU version; if not, the page will suggest that I switch (and, based on my response, whether closing the window or clicking "stay here", the site should remember it and avoid asking again).
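A minimal sketch of what that could look like on the naked domain's server (assuming a Node/Express stack purely for illustration; `countryFromIp()` stands in for whatever GeoIP lookup you end up using, and the cookie and route names are made up):

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express(); // imagine this instance serving the naked domain, domain.com
app.use(cookieParser());

// Hypothetical GeoIP helper: map an IP to a country subdomain ("au", "us", ...).
function countryFromIp(ip: string | undefined): string {
  return "us"; // placeholder
}

// Only the naked domain auto-redirects. Country subdomains never force a
// redirect; at most they show the "looks like you're somewhere else" lightbox.
app.get("/", (req, res) => {
  // A choice made via the footer switcher always beats the IP guess. You could
  // also drop a "we already redirected you" cookie here to skip repeat lookups.
  const sub = req.cookies.preferredLocation ?? countryFromIp(req.ip);
  res.redirect(302, `https://${sub}.domain.com/`); // 302, never 301
});

// The footer drop-down on every country site links back here; this is the
// only place the long-lived preference cookie gets created.
app.get("/set-location", (req, res) => {
  const sub = String(req.query.to ?? "us");
  res.cookie("preferredLocation", sub, { maxAge: 365 * 24 * 3600 * 1000 });
  res.redirect(302, `https://${sub}.domain.com/`);
});

app.listen(3000);
```

Because the preference cookie lives on the naked domain, which is the only place the redirect decision is made, you sidestep the cross-subdomain cookie question entirely.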
Hope that helps!