International Site Geolocation Redirection (best way to redirect and allow Google bots to index sites)
-
I have a client that has an international website. The website currently has IP detection and redirects you to the subdomain for your country. They have currently only launched the Australian website and are not yet open to the rest of the world: https://au.domain.com/
Google is not indexing the Australian website or pages; instead, I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages, so only the US 'coming soon' page is being properly indexed. I would like to know the best way to implement a geolocation redirect without creating a splash page for selecting a location. User friendliness is most important (so we don't want cookies, etc.).
I have seen this great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me the best method of redirection, except at about 10:20, where it tells me that what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none tell me the best method, and some use slightly different examples...
I need US visitors to see the US coming soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects, and .htaccess redirects, but unfortunately my technical knowledge of how these affect Google's bots doesn't really help.
Appreciate your answers.
Cheers, Lincoln
-
Both should happen.
You should design your coming soon page in a way that allows visitors to visit the AU version meanwhile. Perhaps even adding a newsletter sign up form...
If you are already getting links, then Google is already aware of your site. They were probably not indexing the AU version because you were forcing them to go to the US version, which is an "under construction" page.
-
Actually, one last minor touch!
In this case, the US site is a 'coming soon' page. Is it necessary at this point to put the 'Choose Location' link in it (so that Google Bots follow the link to find the AU site, because they'll be initially redirected to the US site), or will the Google Bots find the AU site naturally through our other SEO efforts (and be able to index it because they've followed links that won't redirect them to the US site)?
-
Amazing. Exactly what I wanted to hear. After all the other posts I've read, I think you've summed it up perfectly; a lot of the other posts didn't answer it as technically or as specifically.
I agree with the setup proposed above and will see if my client's dev team can implement it for him.
Thanks, Lincoln
-
Subfolders ARE the best approach
As to your options, I would go with A. But if a visitor goes to, say, the Canadian version (domain.com/ca), I wouldn't redirect him/her, even if he previously accessed domain.com and was redirected to the AU version the first time, with a cookie/session var created to store that.
Let me put this differently.
AU visitor accesses domain.com, you redirect to domain.com/au (and you don't create any cookie, as the user actually didn't select a specific location).
Visitor accesses again and is redirected to the AU version, but chooses to change the country to CA; he/she is then redirected to domain.com/ca (a cookie/session var is now created, as the user actually chose another option).
Visitor accesses domain.com again (he has the cookie/session var) and is redirected to the CA version, regardless of the fact that he is in Australia.
Visitor accesses again, but this time he types domain.com/au instead of the naked domain. He has the cookie, but I wouldn't redirect him... as I figure he typed the AU version because he WANTED the AU version.
That's what I would do. However, you can choose to redirect him anyway to the CA version, as he has a cookie/session var stored. That's up to you.
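Sketching that decision logic in Python (the function and parameter names here are my own, purely for illustration, not any real framework's API):

```python
def choose_redirect(requested_path, geo_country, preferred_country=None):
    """Decide which country version to serve, per the rules above.

    requested_path    -- the path the visitor asked for, e.g. "/" or "/au"
    geo_country       -- country guessed from the visitor's IP, e.g. "au"
    preferred_country -- country stored in a cookie/session var, or None
                         (only set when the user explicitly picked one)
    """
    if requested_path != "/":
        # The visitor asked for a specific country version directly:
        # honor it and do NOT redirect, even if a preference cookie exists.
        return requested_path
    if preferred_country:
        # Naked domain plus an explicit saved choice: use the choice,
        # regardless of where the visitor is located right now.
        return "/" + preferred_country
    # Naked domain, no saved choice: send them to the geo "best match"
    # (and don't set a cookie, since the user didn't choose anything).
    return "/" + geo_country
```

So the only time geolocation wins is on the naked domain with no stored choice; a typed country URL always wins over both.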
Then, on the 302: what I meant is that every redirect you make in this case should return a 302 status code, not a 301, as 301s can be cached by the browser, after which the user will be "forced" through the redirect. Example: he is on the AU page and chooses to go to CA; if you issue a 301 (instead of a 302), then the next time he accesses the AU version he is redirected BY THE BROWSER to the CA version.
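A minimal sketch of that point about status codes (the response shape here is a simplified stand-in, not any particular server framework's API):

```python
def geo_redirect_response(target_path):
    """Build a country redirect as a 302 (temporary), never a 301.

    Browsers may cache a 301 (permanent) redirect, so a user who once
    switched from /au to /ca could be sent to /ca by the browser itself
    on every later visit, without the server ever being asked again.
    A 302 is re-checked each time, so the server stays in control.
    """
    return {
        "status": 302,  # temporary: the browser won't cache it
        "headers": {"Location": target_path},
    }
```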
Hope that clears it up.
-
Hi Frederico,
Understood regarding the subdomains. I've always thought subfolders to be a cleaner and more professional approach, especially when it comes to SEO.
What would I ask for, from a technical standpoint, for the following two options? Appreciate you clarifying for me:
Option 1 (best option) - subfolders
An Australian visitor visits domain.com and is redirected to domain.com/au - the website remembers that this person is Australian. The same Australian visitor then visits the site from the US and the domain.com/au site shows.
The same as Logitech, they have an option to select a different country via a link at the bottom of the page, and if they do so, the website remembers for next time.
Option 2 - subdomains
Idea A: An Australian visitor visits domain.com and is redirected to au.domain.com the first time. domain.com remembers this preference the first time and redirects them there every time thereafter.
The same as Logitech, the user has an option to select a different location, which would update domain.com to remember a different location next time.
**Idea B:** An Australian visitor visits domain.com - the first time they visit, they are prompted to select a location. This remembers the preference, and every time thereafter they would be redirected there.
The same as Logitech, the user has an option to select a different location, which would redirect them to domain.com to update their location preference.
Not sure I follow you on the 302 redirect. You mean once the fix has been coded?
Thanks Frederico!
-
Glad I was of help.
I do have some technical knowledge of redirects; however, as we are dealing with subdomains here, I'm not sure cookies will work. Remember that x.domain.com is a different domain than y.domain.com, thus making a cookie created by x.domain.com useless on y.domain.com.
I've checked a couple of sites that do this kind of redirection and can hardly find an example of it using cookies; I find lots of them using subfolders (domain.com/au/, etc.), as the cookie is valid for all subfolders.
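For reference, the browser's cookie scoping rule works roughly like this (a simplified sketch of RFC 6265 domain matching; the function name is mine): a cookie is host-only by default, exactly as described above, although a cookie set with an explicit Domain attribute does also cover subdomains, if the application sets one. Subfolders sidestep the whole question, since everything lives on one host.

```python
def cookie_applies(set_host, domain_attr, request_host):
    """Simplified RFC 6265 domain matching.

    set_host     -- host that set the cookie, e.g. "au.domain.com"
    domain_attr  -- the cookie's Domain attribute, or None
                    (None = host-only cookie, the default)
    request_host -- host of the current request
    """
    if domain_attr is None:
        # Host-only cookie: exact host match required, so a cookie set
        # by au.domain.com is indeed invisible to us.domain.com.
        return set_host == request_host
    # With Domain=domain.com, the cookie covers the registered domain
    # and all of its subdomains.
    d = domain_attr.lstrip(".")
    return request_host == d or request_host.endswith("." + d)
```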
How about forgetting about a "global" cookie and just using one for the particular subdomain (if you still want to go the subdomain route)? Here's how it would work:
domain.com -> redirects to the "best match" (US visitors go to the US version, AU visitors go to the AU version, everyone else goes to whatever version you consider the "default").
Then, in the subdomain, you implement the lightbox pop-up (the least intrusive one you can come up with) and save their response, so if the user accesses au.domain.com the next day they won't be prompted again to change the location. BUT if they access domain.com (as a US visitor), he/she will be redirected to the US version and get the lightbox again.
You end up with "basically" the same results; however, it could be a little annoying for some users. At least I know I would be annoyed if that happened to me.
Give it a day and think about whether subfolders aren't better in your case; that should solve all the problems, and implementation will be as easy as 1, 2, 3 (I am capable of helping you with that approach). You won't be using cookies but session variables (although cookies will allow you to remember the user's choice for any time frame you want).
Oops, forgot to mention: use 302 redirects.
-
Frederico, this is exactly the advice I was looking for. Just the simple statement that bots should be treated as users and not 'forced' is obvious logic I've overlooked.
I believe that then the best scenario for my situation is the same as Logitech:
- User enters and is redirected by location when they visit domain.com
- When redirected to us.domain.com or au.domain.com, they then have the option to change location using a link in the footer, and thanks to cookies they are not forced to change location again
Now to have the developers figure out how to code this. I wonder if you might shed light on the technical terminology of exactly what style of redirection this is? IP redirection w/ cookies, plus choose location page that updates cookies?
Cheers, Linc.
-
A few Webmaster videos ago, Google's Matt Cutts pointed out that Googlebot should be treated exactly the same as a regular person visiting your site, which you are currently doing.
However, you are now FORCING users to stay in "their" corresponding location; instead, you should "suggest" it, but not force it.
Example: a user accesses the naked domain, domain.com. You check his/her IP and redirect to the appropriate location. In this case you must use some kind of "we already redirected him/her" method to avoid forcing the user to a specific country subdomain; you can use either sessions or cookies. Once you redirect, you create a cookie or session variable saving the option. You now have the visitor in the location you want, and you should offer an easy way to switch locations (check live examples, like logitech.com), for example a drop-down menu in the footer. Now, IF a user accesses a location directly, say au.domain.com, you shouldn't do the automatic redirection; instead, you could bring up a lightbox pop-up suggesting that the user go to their "best match" location.
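A rough sketch of that "suggest, don't force" flow in Python (the function shape, action names, and hostnames are all my own, purely illustrative):

```python
def handle_request(host, has_redirect_flag, geo_country):
    """Decide what to do with a request, per the flow described above.

    host              -- hostname requested, e.g. "domain.com"
    has_redirect_flag -- session/cookie flag: we already redirected once
    geo_country       -- country guessed from the IP, e.g. "au"
    Returns an (action, detail) pair.
    """
    if host == "domain.com":
        if has_redirect_flag:
            # Already redirected once this session: leave the user alone.
            return ("serve", "domain.com")
        # First hit on the naked domain: one 302 to the best match, and
        # the caller sets the flag so we never force the user again.
        return ("redirect_302", geo_country + ".domain.com")
    # Direct hit on a country subdomain (a user who typed it, or a bot
    # following a link): always serve it. At most, show a lightbox
    # suggesting the geo best match; never an automatic redirect.
    suggestion = geo_country + ".domain.com"
    if host != suggestion:
        return ("serve_with_lightbox", suggestion)
    return ("serve", host)
```

Note that a bot carries no cookies and requests country URLs directly, so under this logic it always falls into the "serve" branches and can crawl every version.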
Using the above method allows Google to access any page without being forced to a specific location. Plus, from my point of view, it is the easiest and friendliest way for users too. If I type au.domain.com (while in the US), I probably want to see the AU version; if not, the page will suggest I switch, and based on my response (closing the window or clicking "stay here") the site should remember it and avoid re-asking.
Hope that helps!