International Site Geolocation Redirection (best way to redirect and allow Google bots to index sites)
-
I have a client with an international website. The website currently has IP detection and redirects you to the subdomain for your country. They have so far only launched the Australian website and are not yet open to the rest of the world: https://au.domain.com/
Google is not indexing the Australian website or its pages; instead, I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages. Therefore only the US 'coming soon' page is being properly indexed. So, I would like to know the best way to implement a geolocation redirect without creating a splash page for selecting a location. User-friendliness is most important (so we don't want cookies, etc.).
I have seen this great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me exactly the best method for redirection, except at about 10:20 where it tells me that what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none tell me the best method, and some use slightly different examples...
I need US visitors to see the US coming soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects and .htaccess redirects, but unfortunately my technical knowledge of how these affect Google's bots is limited.
Appreciate your answers.
Cheers, Lincoln
-
Both should happen.
You should design your coming soon page in a way that allows visitors to visit the AU version in the meantime. Perhaps even add a newsletter sign-up form...
If you are already getting links, then Google is already aware of your site. They were probably not indexing the AU version because you were forcing them to go to the US version, which is an "under construction" page.
-
Actually, one last minor touch!
In this case, the US site is a 'coming soon' page. Is it necessary at this point to put the 'Choose Location' link on it (so that Google's bots follow the link to find the AU site, since they'll initially be redirected to the US site), or will the bots find the AU site naturally through our other SEO efforts (and be able to index it because they've followed links that won't redirect them to the US site)?
-
Amazing. Exactly what I wanted to hear. After all the other posts I've read, I think you've summed it up perfectly; a lot of the other posts didn't answer it as technically or as specifically.
I agree with the setup proposed above and will see if my client's dev team can implement it for him.
Thanks, Lincoln
-
Subfolders ARE the best approach
As to your options, I would go with A. But if a visitor goes directly to, say, the Canadian version (domain.com/ca), I wouldn't redirect him/her, even if he previously accessed domain.com, was redirected to the AU version on that first visit, and had a cookie/session variable created to store that.
Let me put this differently.
An AU visitor accesses domain.com; you redirect to domain.com/au (and don't create any cookie, as the user didn't actually select a specific location).
The visitor accesses again and is redirected to the AU version, but chooses to change the country to CA; he/she is then redirected to domain.com/ca (a cookie/session variable is now created, as the user actively chose another option).
The visitor accesses domain.com again (he has the cookie/session var) and is redirected to the CA version, even though he is in Australia.
The visitor accesses again, but this time he types domain.com/au instead of the naked domain. He has the cookie, but I wouldn't redirect him... as I figure he typed the AU version because he WANTED the AU version.
That's what I would do. However, you can choose to redirect him to the CA version anyway, since he has a cookie/session var stored. That's up to you.
Then, on the 302: what I meant is that every redirection you make in this case should return a 302 status code, not a 301, as 301s can be cached by the browser, and then the user will be "forced" into the redirection. Example: he is on the AU page and chooses to go to CA; if you create a 301 (instead of a 302), then the next time he accesses the AU version he is redirected BY THE BROWSER to the CA version.
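To make that concrete, here's a minimal Express/TypeScript sketch of the flow described above, assuming the subfolder setup and geoip-lite as the GeoIP library (the route names and cookie name are illustrative choices, not a prescribed implementation):

```typescript
import express from "express";
import cookieParser from "cookie-parser";
import geoip from "geoip-lite"; // GeoIP library chosen for illustration

const app = express();
app.use(cookieParser());

// Naked domain: an explicit user choice (stored in a cookie) wins;
// otherwise fall back to a GeoIP guess. No cookie is written here.
app.get("/", (req, res) => {
  const chosen = req.cookies["locale"]; // set only by an explicit user choice
  if (chosen) return res.redirect(302, `/${chosen}/`);
  const geo = geoip.lookup(req.ip ?? "");
  const guess = geo?.country === "AU" ? "au" : "us";
  res.redirect(302, `/${guess}/`); // always 302, never 301: browsers cache 301s
});

// Footer "change country" link: the ONLY place the preference cookie is written.
app.get("/choose/:locale", (req, res) => {
  res.cookie("locale", req.params.locale, { maxAge: 365 * 24 * 3600 * 1000, path: "/" });
  res.redirect(302, `/${req.params.locale}/`);
});

// Direct requests to /au/, /ca/, etc. are never redirected: typing the URL
// signals that the visitor (or Googlebot) wanted that version.
app.get("/:locale/", (req, res) => {
  res.send(`Content for ${req.params.locale}`);
});

app.listen(3000);
```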
Hope that clears it up.
-
Hi Frederico,
Understood regarding the subdomains. I've always thought subfolders to be a cleaner and more professional approach, especially when it comes to SEO.
What would I ask for, from a technical standpoint, for the following two options? Appreciate you clarifying for me:
Option 1 (best option) - subfolders
An Australian visitor visits domain.com and is redirected to domain.com/au; the website remembers that this person is Australian. The same Australian visitor then visits the site from the US, and the domain.com/au site shows.
As with Logitech, they have an option to select a different country via a link at the bottom of the page, and if they do so, the website remembers it for next time.
Option 2 - subdomains
Idea A: An Australian visitor visits domain.com and is redirected to au.domain.com the first time. domain.com remembers this preference and redirects them there every time thereafter.
As with Logitech, the user has an option to select a different location, which would update what domain.com remembers for next time.
Idea B: An Australian visitor visits domain.com; the first time they visit, they are prompted to select a location. This remembers the preference, and every time thereafter they are redirected there.
As with Logitech, the user has an option to select a different location, which would redirect them to domain.com to update their location preference.
I'm not sure I follow you on the 302 redirect. You mean once the fix has been coded?
Thanks Frederico!
-
Glad I was of help.
I do have some technical knowledge of redirections; however, as we are dealing with subdomains here, I'm not sure cookies will work. Remember that x.domain.com is a different domain from y.domain.com, making a cookie created by x.domain.com useless on y.domain.com (by default, at least: only a cookie explicitly scoped to the parent domain is shared across subdomains, as sketched below).
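For what it's worth, here is a minimal Express/TypeScript sketch of that distinction (the route and cookie names are hypothetical): a host-only cookie stays on the subdomain that set it, while one explicitly scoped to the parent domain is sent to every subdomain.

```typescript
// Illustrative sketch only: host-only vs. parent-scoped cookies in Express.
import express from "express";

const app = express();

app.get("/remember/:locale", (req, res) => {
  // Host-only (the default): visible only on the exact host that set it,
  // e.g. a cookie set on au.domain.com is NOT sent to ca.domain.com.
  res.cookie("locale_local", req.params.locale);

  // Parent-scoped: visible on domain.com and on every subdomain of it.
  res.cookie("locale_global", req.params.locale, { domain: ".domain.com" });

  res.sendStatus(204);
});

app.listen(3000);
```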
I've checked a couple of sites that do this kind of redirection and could hardly find an example using cookies; I found lots using subfolders (domain.com/au/, etc.), since a cookie is valid for all subfolders.
How about forgetting about a "global" cookie and just using one for the particular subdomain (if you still want to go the subdomain route)? Here's how it would work:
domain.com -> redirects to the "best match" (US visitors go to the US version, AU visitors go to the AU version, others go to whatever version you consider the "default").
Then, on the subdomain, you implement the lightbox pop-up (the least intrusive one you can come up with) and save the response, so if the user accesses au.domain.com the next day they won't be prompted again to change location; BUT if they access domain.com (as a US visitor), he/she will be redirected to the US version and get the lightbox again.
You end up with "basically" the same results; however, it could be a little annoying for some users. At least I know I would be annoyed if that happened to me.
Give it a day and think about whether subfolders aren't better in your case; that should solve all the problems, and implementation will be as easy as 1, 2, 3 (I am capable of helping you with that approach). You won't be using cookies but session variables (although cookies would allow you to remember the user's choice for any time-frame you want).
Oops, forgot to mention: 302 redirects.
-
Frederico, this is exactly the advice I was looking for. Just the simple statement that bots should be treated like any other user and not 'forced' is obvious logic I've overlooked.
I believe the best scenario for my situation is then the same as Logitech's:
- A user visits domain.com and is redirected based on their location
- When redirected to us.domain.com or au.domain.com, they then have the option to change location using a link in the footer, and thanks to cookies they are not forced back to the detected location again
Now to have the developers figure out how to code this. I wonder if you might shed light on the technical terminology for exactly what style of redirection this is. IP redirection with cookies, plus a choose-location link that updates the cookies?
Cheers, Linc.
-
A few Webmaster videos ago, Google's Matt Cutts pointed out that Googlebot should be treated exactly the same as a regular person visiting your site, which you are currently doing.
However, you are currently FORCING users to stay in "their" corresponding location; instead, you should "suggest" it, but not force it.
Example: a user accesses the naked domain, domain.com. You check his/her IP and redirect to the appropriate location. In this case you must use some kind of "we already redirected him/her" mechanism to avoid forcing the user onto a specific country subdomain; you can use either sessions or cookies. Once you redirect, you create a cookie or session variable saving the choice. You now have the visitor in the location you want, and you should offer an easy way to switch locations (check live examples, like logitech.com), for example a drop-down menu in the footer. Now, IF a user accesses a location directly, say au.domain.com, you shouldn't do the automatic redirection; instead, you could bring up a lightbox pop-up suggesting the user go to their "best match" location.
Using the above method allows Google to access any page without being forced to a specific location; plus, from my point of view, it is the easiest and friendliest way for users too. If I type au.domain.com (while in the US), I probably want to see the AU version; if not, the page will suggest I switch, and based on my response (closing the window or clicking "stay here") the site should remember it and avoid re-asking.
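As a rough Express/TypeScript sketch of this "suggest, don't force" pattern (geoip-lite is assumed as the GeoIP library; renderPage and the cookie name are hypothetical placeholders, not a prescribed implementation):

```typescript
import express from "express";
import cookieParser from "cookie-parser";
import geoip from "geoip-lite"; // GeoIP library assumed for illustration

const app = express();
app.use(cookieParser());

// Best-guess locale from the visitor's IP; default to the US version.
const bestMatch = (ip: string): string =>
  geoip.lookup(ip)?.country === "AU" ? "au" : "us";

// Hypothetical renderer: a real site would use a template engine.
const renderPage = (locale: string, suggest: string | null): string =>
  suggest
    ? `<p>${locale.toUpperCase()} content. Looking for the ${suggest.toUpperCase()} site? ` +
      `<a href="https://${suggest}.domain.com/">Switch</a> or stay here.</p>`
    : `<p>${locale.toUpperCase()} content.</p>`;

app.get("/", (req, res) => {
  // Only the naked domain redirects, and only with a 302.
  if (req.hostname === "domain.com") {
    return res.redirect(302, `https://${bestMatch(req.ip ?? "")}.domain.com/`);
  }
  // On a country subdomain, never redirect automatically. If the IP suggests
  // a different location and the visitor hasn't dismissed the prompt before,
  // render the lightbox suggestion instead.
  const current = req.hostname.split(".")[0];
  const suggested = bestMatch(req.ip ?? "");
  const showPrompt = suggested !== current && !req.cookies["stay_here"];
  res.send(renderPage(current, showPrompt ? suggested : null));
});

app.listen(3000);
```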
Hope that helps!