International Site Geolocation Redirection (best way to redirect and allow Google bots to index sites)
-
I have a client that has an international website. The website currently has IP detection and redirects you to the subdomain for your country. They have currently only launched the Australian website and are not yet open to the rest of the world: https://au.domain.com/
Google is not indexing the Australian website or its pages; instead, I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages, so only the US 'coming soon' page is being properly indexed. So I would like to know: what is the best way to implement geolocation redirection without creating a splash page for selecting a location? User friendliness is most important (so we don't want cookies, etc.).
I have seen this great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me the best method for redirection, except at about 10:20 where it tells me that what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none tell me the best method, and some give slightly different examples.
I need US visitors to see the US coming soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects and .htaccess redirects, but unfortunately my limited technical knowledge of how these affect Google's bots doesn't really help.
Appreciate your answers.
Cheers, Lincoln
-
Both should happen.
You should design your coming soon page in a way that allows visitors to visit the AU version meanwhile. Perhaps even adding a newsletter sign up form...
If you are already getting links, then Google is already aware of your site. They were probably not indexing the AU version because you were forcing them to go to the US version, which is an "under construction" page.
-
Actually, one last minor touch!
In this case, the US site is a 'coming soon' page. Is it necessary at this point to put the 'Choose Location' link in it (so that Google Bots follow the link to find the AU site, because they'll be initially redirected to the US site), or will the Google Bots find the AU site naturally through our other SEO efforts (and be able to index it because they've followed links that won't redirect them to the US site)?
-
Amazing. Exactly what I wanted to hear. After all the other posts I've read, I think you've summed it up perfectly, as a lot of the other posts didn't answer it as technically or as specifically.
I agree with the setup proposed above and will see if my client's dev team can implement it for him.
Thanks, Lincoln
-
Subfolders ARE the best approach
As for your options, I would go with A. But if a visitor goes directly to, say, the Canadian version (domain.com/ca), I wouldn't redirect him/her, even if he previously accessed domain.com and was redirected to the AU version on that first visit, with a cookie/session var created to store that.
Let me put this differently.
AU visitor accesses domain.com, you redirect to domain.com/au (and you don't create any cookie, as the user actually didn't select a specific location).
Visitor accesses again, is redirected to the AU version, but chooses to change the country to CA; he/she is then redirected to domain.com/ca (a cookie/session var is now created, as the user actively chose another option).
Visitor accesses domain.com again (he has the cookie/session var); he is redirected to the CA version even though he is in Australia.
Visitor accesses again, but this time he types domain.com/au instead of the naked domain. He has the cookie, but I wouldn't redirect him... as I figure he typed the AU version because he WANTED the AU version.
That's what I would do. However, you could choose to redirect him to the CA version anyway, since he has a cookie/session var stored. That's up to you.
Then, on the 302: what I meant is that every redirection you make in this case should return a 302 status code, not a 301, as 301s can be cached by the browser, and then the user will be "forced" into the redirection. Example: he is on the AU page and chooses to go to CA; if you create a 301 (instead of a 302), then the next time he accesses the AU version he is redirected BY THE BROWSER to the CA version.
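The decision table above could be sketched roughly like this (a hypothetical Python helper, not anyone's actual implementation; subfolder URLs and country codes are illustrative, and every redirect is a 302):

```python
def resolve(path, ip_country, cookie_country=None):
    """Hypothetical decision table for the rules above (subfolder layout).

    - Naked domain, no cookie: geo-redirect, but set no cookie, since the
      visitor didn't actively choose anything.
    - Naked domain, cookie present: honour the stored choice.
    - Explicit /au/, /ca/, etc.: never redirect, even with a cookie --
      typing that URL means the visitor wanted that version.

    Every redirect returns 302, never 301, because browsers cache 301s
    and would then force the redirect on every later visit.
    """
    if path != "/":
        return None                          # explicit version requested
    target = cookie_country or ip_country    # stored choice wins over GeoIP
    return (302, f"/{target}/")
```

The caller would set the cookie only when the visitor picks a country from the footer menu, never on the automatic geo-redirect.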
Hope that clears it up.
-
Hi Frederico,
Understood regarding the subdomains. I've always thought subfolders to be a cleaner and more professional approach, especially when it comes to SEO.
What would I ask for, from a technical standpoint, for the following two options? Appreciate you clarifying this for me:
Option 1 (best option) - subfolders
An Australian visitor visits domain.com and is redirected to domain.com/au - the website remembers that this person is Australian. The same Australian visitor then visits the site from the US and the domain.com/au site shows.
The same as Logitech, they have an option to select a different country via a link at the bottom of the page, and if they do so, the website remembers for next time.
Option 2 - subdomains
Idea A: An Australian visitor visits domain.com and is redirected to au.domain.com the first time. domain.com remembers this preference the first time and redirects them there every time thereafter.
The same as Logitech, the user has an option to select a different location, which would update domain.com to remember a different location next time.
**Idea B:** An Australian visitor visits domain.com - the first time they visit, they are prompted to select a location. This remembers the preference, and every visit thereafter redirects them there.
The same as Logitech, the user has an option to select a different location, which would redirect them to domain.com to update their location preference.
Not sure I follow you on 302 redirect. You mean once the fix has been coded?
Thanks Frederico!
-
Glad I was of help.
I do have some technical knowledge of redirections; however, as we are dealing with subdomains here, I'm not sure cookies will work. Remember that x.domain.com is a different host than y.domain.com, so a cookie created by x.domain.com is, by default, useless on y.domain.com (unless it is explicitly set with a Domain attribute scoped to the parent domain).
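Worth noting: the default host-only scope of a cookie can be widened with the Domain attribute, which makes it valid on every subdomain. A hypothetical illustration using Python's standard library, with domain.com standing in for the real site:

```python
from http.cookies import SimpleCookie

def preference_cookie(country, share_across_subdomains=False):
    """Build a Set-Cookie value for the visitor's country preference.

    By default the cookie is only sent back to the exact host that set
    it; adding Domain=.domain.com makes it valid on au.domain.com,
    us.domain.com, etc. (In a subfolder setup, a plain host-only cookie
    already covers every /au/, /ca/ path.)
    """
    c = SimpleCookie()
    c["country"] = country
    if share_across_subdomains:
        c["country"]["domain"] = ".domain.com"  # hypothetical parent domain
    return c["country"].OutputString()
```

This is only a sketch of the header value; the real site would emit it via whatever server stack the dev team uses.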
I've checked a couple of sites that do this kind of redirection, and I could hardly find an example of it using cookies; I found lots of them using subfolders (domain.com/au/, etc.), as the cookie is valid for all subfolders.
How about forgetting about a "global" cookie and just using one for the particular subdomain (if you still want to go the subdomain route)? Here's how it would work:
domain.com -> redirects to the "best match" (US go to us version, AU go to au version, others go to whatever version you consider the "default").
Then, in the subdomain, you implement the lightbox pop-up (the least intrusive one you can come up with) and save their response, so if the user accesses au.domain.com the next day they won't be prompted again to change location. BUT if a US visitor accesses domain.com, he/she will be redirected to the US version and get the lightbox again.
You end up "basically" with the same results, however, it could be a little annoying for some users, at least I know I would be annoyed if that happened to me.
Give it a day and think about whether subfolders aren't better in your case; that should solve all the problems, and implementation will be as easy as 1, 2, 3 (I am capable of helping you with that approach). You won't be using cookies but session variables (although cookies would allow you to remember the user's choice for any time frame you want).
Oops, I forgot to mention: use 302 redirects.
-
Frederico, this is exactly the advice I was looking for. Just the simple statement that bots should be treated as a user and not 'forced' is obvious logic I've overlooked.
I believe that then the best scenario for my situation is the same as Logitech:
- User enters and is redirected by location when they visit domain.com
- When redirected to us.domain.com or au.domain.com, they then have the option to change location using a link in the footer, and via cookies they are not forced to change location again
Now to have the developers figure out how to code this. I wonder if you might shed light on the technical terminology of exactly what style of redirection this is? IP redirection w/ cookies, plus choose location page that updates cookies?
Cheers, Linc.
-
A few Webmaster videos ago, Google's Matt Cutts pointed out that Googlebot should be treated exactly as if it were a regular person visiting your site, which you are currently doing.
However, you are now FORCING users to stay in "their" corresponding location; instead, you should "suggest" a location but not force it.
Example: a user accesses the naked domain, domain.com. You check his/her IP and redirect to the appropriate location. In this case you must use some kind of "we already redirected him/her" marker to avoid forcing the user onto a specific country subdomain forever; you can use either sessions or cookies. Once you redirect, you create a cookie or session variable saving the option. You now have the visitor in the location you want, and you should offer an easy way to switch location (check live examples, like logitech.com), such as a drop-down menu in the footer. Now, IF a user accesses a location directly, say au.domain.com, you shouldn't do the automatic redirection; instead, you could bring up a lightbox pop-up suggesting that the user go to their "best match" location.
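The flow just described might be sketched like this (hypothetical Python; the GeoIP lookup is assumed to happen upstream, and hostnames and country codes are illustrative, not the poster's code):

```python
# Supported country subdomains; anything else falls back to a default.
KNOWN = {"au", "us", "ca"}

def handle_request(host, ip_country, already_redirected=False):
    """Return a (status, location) tuple for a redirect, or None to serve
    the requested page (optionally with a 'best match' lightbox).

    already_redirected -- the session/cookie flag saying we have sent this
    visitor to a location before, so we never force them again.
    """
    if host == "domain.com" and not already_redirected:
        target = ip_country if ip_country in KNOWN else "us"
        return (302, f"https://{target}.domain.com/")
    # Direct visits to au.domain.com etc. are never auto-redirected;
    # at most the page shows a lightbox suggesting the visitor's region.
    return None
```

Googlebot then crawls au.domain.com like any visitor typing that hostname: it gets the page, not a forced redirect.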
Using the above method allows Google to access any page without being forced to a specific location; plus, from my point of view, it is the easiest and friendliest way for users too. If I type au.domain.com (while in the US), I probably want to see the AU version; if not, the page will suggest that I switch, and based on my response (closing the window or clicking "stay here"), the site should remember it and avoid re-asking.
Hope that helps!