International Site Geolocation Redirection (best way to redirect and allow Google bots to index sites)
-
I have a client that has an international website. The website currently has IP detection and redirects you to the subdomain for your country. They have currently only launched the Australian website and are not yet open to the rest of the world: https://au.domain.com/
Google is not indexing the Australian website or its pages; instead, I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages. Therefore only the US 'coming soon' page is being properly indexed. So, I would like to know the best way to implement a geolocation redirect without creating a splash page for selecting a location. User-friendliness is most important (so we don't want cookies, etc.).
I have seen this great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me exactly the best method for redirection, except at about 10:20 where it tells me that what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none tell me the best method, and some give slightly different examples...
I need US visitors to see the US coming-soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects and .htaccess redirects, but unfortunately I don't know enough about how these affect Google's bots to choose between them.
Appreciate your answers.
Cheers, Lincoln
-
Both should happen.
You should design your coming-soon page in a way that allows visitors to reach the AU version in the meantime. Perhaps even add a newsletter sign-up form...
If you are already getting links, then Google is already aware of your site. They were probably not indexing the AU version because you were forcing them to go to the US version, which is an "under construction" page.
-
Actually, one last minor touch!
In this case, the US site is a 'coming soon' page. Is it necessary at this point to put the 'Choose Location' link in it (so that Google Bots follow the link to find the AU site, because they'll be initially redirected to the US site), or will the Google Bots find the AU site naturally through our other SEO efforts (and be able to index it because they've followed links that won't redirect them to the US site)?
-
Amazing. Exactly what I wanted to hear. After all the other posts I've read, I think you've summed it up perfectly; a lot of the other posts didn't answer it as technically or as specifically.
I agree with the setup proposed above and will see if my client's dev team can implement it for him.
Thanks, Lincoln
-
Subfolders ARE the best approach
As to your options, I would go with A. But if a visitor goes directly to, say, the Canadian version (domain.com/ca), I wouldn't redirect him/her, even if he previously accessed domain.com and was redirected to the AU version the first time, with a cookie/session var created to store that.
Let me put this differently.
1. An AU visitor accesses domain.com; you redirect to domain.com/au (and you don't create any cookie, as the user didn't actually select a specific location).
2. The visitor accesses again and is redirected to the AU version, but chooses to change the country to CA; he/she is then redirected to domain.com/ca (a cookie/session var is now created, as the user actually chose another option).
3. The visitor accesses domain.com again (he has the cookie/session var) and is redirected to the CA version, even though he is in Australia.
4. The visitor accesses again, but this time he types domain.com/au instead of the naked domain. He has the cookie, but I wouldn't redirect him... as I figure he typed the AU version because he WANTED the AU version.
That's what I would do. However, you can choose to redirect him to the CA version anyway, since he has a cookie/session var stored. That's up to you.
As for the 302: what I meant is that every redirect you make in this case should return a 302 status code, not a 301, because 301s can be cached by the browser, and then the user will be "forced" into the redirect. EX: he is on the AU page and chooses to go to CA; if you create a 301 (instead of a 302), the next time he accesses the AU version he is redirected BY THE BROWSER to the CA version.
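If it helps your dev team, here's a minimal sketch of that subfolder flow in Express/TypeScript. The geoToCountry() lookup, the route names, and the cookie name are my own placeholders for illustration, not anything your stack requires:

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Hypothetical IP-to-country lookup (in practice, a GeoIP database/service).
function geoToCountry(ip: string): string {
  return "au"; // stubbed for illustration
}

// Naked domain: send the visitor to their best match with a 302 (never a 301,
// so the browser can't cache the redirect). An explicit prior choice wins.
app.get("/", (req, res) => {
  const country = req.cookies.preferredCountry ?? geoToCountry(req.ip ?? "");
  res.redirect(302, `/${country}/`);
});

// The ONLY place a cookie is created: when the user explicitly picks a country.
app.get("/choose/:country", (req, res) => {
  res.cookie("preferredCountry", req.params.country, {
    maxAge: 30 * 24 * 60 * 60 * 1000, // remember the choice for 30 days
  });
  res.redirect(302, `/${req.params.country}/`);
});

// Direct visits to /au/, /ca/, etc. are never redirected: typing the URL
// means the visitor (or Googlebot) wanted that exact version.
app.get("/:country/", (req, res) => {
  res.send(`Content for ${req.params.country}`);
});

app.listen(3000);
```

Since the naked domain is the only URL that ever redirects, Googlebot (or any visitor) can crawl /au/, /ca/, etc. directly without being forced anywhere.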
Hope that clears it up.
-
Hi Frederico,
Understood regarding the subdomains. I've always thought subfolders to be a cleaner and more professional approach, especially when it comes to SEO.
What would I ask for, from a technical standpoint, for each of the following two options? Appreciate you clarifying for me:
Option 1 (best option) - subfolders
An Australian visitor visits domain.com and is redirected to domain.com/au - the website remembers that this person is Australian. The same Australian visitor then visits the site from the US and the domain.com/au site shows.
As with Logitech, they have an option to select a different country via a link at the bottom of the page, and if they do so, the website remembers it for next time.
Option 2 - subdomains
Idea A: An Australian visitor visits domain.com and is redirected to au.domain.com the first time. domain.com remembers this preference and redirects them there every time thereafter.
As with Logitech, the user has an option to select a different location, which would update domain.com to remember the new location next time.
**Idea B:** An Australian visitor visits domain.com - the first time they visit, they are prompted to select a location. The site remembers the preference and redirects them there every time thereafter.
As with Logitech, the user has an option to select a different location, which would redirect them to domain.com to update their location preference.
Not sure I follow you on the 302 redirects. You mean once the fix has been coded?
Thanks Frederico!
-
Glad I was of help.
I do have some technical knowledge of redirections; however, as we are dealing with subdomains here, I'm not sure cookies will work. Remember that x.domain.com is a different host than y.domain.com, so a cookie created by x.domain.com is, by default, useless on y.domain.com (a cookie can be scoped to the parent domain, but that takes extra care).
I've checked a couple of sites that do this kind of redirection and could hardly find an example using cookies; I found lots of them using subfolders (domain.com/au/, etc.), as the cookie is valid for all subfolders.
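For what it's worth, here's how that scoping difference looks in practice, in a hypothetical Express handler (the cookie names and route are just for illustration):

```typescript
import express from "express";

const app = express();

app.get("/set-location/:country", (req, res) => {
  // Host-only cookie: set on au.domain.com, it is NOT sent to us.domain.com.
  res.cookie("location", req.params.country);

  // Parent-domain cookie: shared across ALL subdomains (au., us., ...).
  // This is the "extra care" a subdomain setup needs.
  res.cookie("sharedLocation", req.params.country, { domain: ".domain.com" });

  // With subfolders (domain.com/au/, /ca/, ...) there is only one host,
  // so a plain cookie already covers every country section.
  res.send("Location saved");
});

app.listen(3000);
```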
How about forgetting about a "global" cookie and just using one for the particular subdomain (if you still want to go the subdomain route)? Here's how it would work:
domain.com -> redirects to the "best match" (US visitors go to the US version, AU visitors go to the AU version, everyone else goes to whatever version you consider the "default").
Then, on the subdomain, you implement the lightbox pop-up (the least intrusive one you can come up with) and save the response, so if the user accesses au.domain.com the next day they won't be prompted again to change location. BUT if a US visitor accesses domain.com, he/she will be redirected to the US version and get the lightbox again.
You end up with "basically" the same results; however, it could be a little annoying for some users. At least I know I would be annoyed if that happened to me.
Give it a day and think about whether subfolders aren't better in your case; that should solve all the problems, and implementation will be as easy as 1, 2, 3 (I am capable of helping you with that approach). You won't be using cookies but session variables (although cookies will allow you to remember the user's choice for any time-frame you want).
Oops, forgot to mention: use 302 redirects.
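And in case it's useful, a minimal sketch of that subfolder + session-variable approach (using express-session; the secret and names are placeholders, and the 302s are used throughout):

```typescript
import express from "express";
import session from "express-session";

// Let TypeScript know about our custom session field.
declare module "express-session" {
  interface SessionData {
    country?: string;
  }
}

const app = express();
app.use(
  session({ secret: "placeholder-secret", resave: false, saveUninitialized: false })
);

// Naked domain: 302 to the best-match subfolder; a stored choice wins.
app.get("/", (req, res) => {
  const country = req.session.country ?? "au"; // GeoIP lookup stubbed as "au"
  res.redirect(302, `/${country}/`);
});

// Explicit switch: remembered only for the lifetime of the session
// (a cookie would let you keep the choice for any time-frame you want).
app.get("/switch/:country", (req, res) => {
  req.session.country = req.params.country;
  res.redirect(302, `/${req.params.country}/`);
});

app.listen(3000);
```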
-
Frederico, this is exactly the advice I was looking for. Just the simple statement that bots should be treated like a user and not 'forced' is obvious logic I'd overlooked.
I believe that then the best scenario for my situation is the same as Logitech:
- User enters and is redirected by location when they visit domain.com
- When redirected to us.domain.com or au.domain.com, they then have the option to change location using a link in the footer, and thanks to cookies they are not forced to change location again
Now to have the developers figure out how to code this. I wonder if you might shed light on the technical terminology for exactly what style of redirection this is? IP redirection with cookies, plus a choose-location page that updates the cookies?
Cheers, Linc.
-
A few Webmaster videos ago, Google's Matt Cutts pointed out that Googlebot should be treated exactly the same as a regular person visiting your site, which you are currently doing.
However, you are currently FORCING users to stay on "their" corresponding location; instead, you should "suggest" it but not force it.
Example: a user accesses the naked domain, domain.com. You check his/her IP and redirect to the appropriate location. In this case, you must use some kind of "we already redirected him/her" flag to avoid forcing the user onto a specific country subdomain; you can use either sessions or cookies. Once you redirect, you create a cookie or session variable saving the option. You now have the visitor in the location you want, and you should offer an easy way to switch location (check live examples, like logitech.com), for example a drop-down menu in the footer. Now, IF a user accesses a location directly, say au.domain.com, you shouldn't do the automatic redirection; instead, you could bring up a lightbox pop-up suggesting the user go to their "best match" location.
Using the above method allows Google to access any page without being forced to a specific location; plus, from my point of view, it is the easier and friendlier way for users too. If I type au.domain.com (while in the US), I probably want to see the AU version; if not, the page will suggest I switch (and based on my response, closing the window or clicking "stay here", the site should remember it and avoid re-asking).
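To make that concrete, here's a rough sketch of the logic in Express/TypeScript. The geoLookup() helper, the hostnames, and the cookie names are assumptions for illustration, not a definitive implementation:

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Hypothetical IP-to-country lookup.
function geoLookup(ip: string): string {
  return "us"; // stubbed for illustration
}

app.use((req, res, next) => {
  const host = req.hostname; // e.g. "domain.com" or "au.domain.com"

  // Only the naked domain redirects, and only with a 302.
  if (host === "domain.com") {
    const target = req.cookies.location ?? geoLookup(req.ip ?? "");
    // Save the choice so every later visit to domain.com behaves the same.
    res.cookie("location", target, { domain: ".domain.com" });
    return res.redirect(302, `https://${target}.domain.com${req.url}`);
  }

  // Direct visits to a country subdomain are NEVER redirected. If the IP
  // suggests a different country, render a lightbox suggestion instead,
  // and remember a "stay here" answer so the visitor isn't re-asked.
  const current = host.split(".")[0];
  res.locals.suggestSwitch =
    geoLookup(req.ip ?? "") !== current && req.cookies.stayHere !== current;
  next();
});

app.get("/", (req, res) => {
  res.send(res.locals.suggestSwitch ? "Page + location lightbox" : "Page");
});

app.listen(3000);
```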
Hope that helps!