Geotargeting a folder in GWT & IP targeting
-
I am currently managing a .com that targets Canada, and we will soon be launching a .com/us/ folder that will target the US. Once we launch the /us/ folder, we want to display the /us/ content to any US IP. My concern is that Google will then only index the /us/ content, since its crawlers come from US IPs.
So, if I set up .com and .com/us/ as two different sites in GWT and geotarget each to the country it is targeting, will this take care of the problem and ensure that Google indexes the .com for Canada and the /us/ for the US?
Is there any alternative method (that does not involve using the .ca domain)? I am concerned that Google would not be able to see the .com content if we are redirecting all US traffic to .com/us/.
Any examples of this online anywhere?
-
Geotargeting is not uncommon and the search engines are pretty good at handling sites that use it, though it depends on your implementation.
We use geotargeting on our site to send users to /country/ folders based on their IP, with a few conditions.
-
Users are only redirected when they come to the homepage, either directly or from search.
If a user from France is coming to the main dot com we'll send them to the /fr/ site automatically.
If the user tries to get to the homepage via a search engine we redirect them to the appropriate language site. This is because the main dot com often shows up above or near the /country/ sites in search, especially for brand terms.
If a user has landed on a subpage from a search (e.g. site/cat/page/), they won't be redirected, as we assume the search engines have done a decent enough job of matching them to a page they want.
Users/bots can override the routing.
Users can internally navigate to other languages through the language menu.
If we send somebody in France to the French site but they only speak English, they will navigate back to the English version; instantly forcing them back to the /fr/ site will just frustrate them.
Link to the other language sites on your pages and check the referrer header; if the referrer is internal, do not redirect. A rough sketch of this routing logic is below.
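To illustrate the above, here is a minimal sketch of that kind of routing as server-side middleware. It assumes an Express app and a hypothetical countryForIp() lookup (swap in your own GeoIP provider); example.com, /us/ and the bot pattern are placeholders, not our actual production code.

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Hypothetical GeoIP lookup -- replace with your provider (MaxMind, a CDN header, etc.).
function countryForIp(ip: string): string | null {
  return null; // placeholder
}

const BOTS = /googlebot|bingbot|slurp|duckduckbot|baiduspider|yandex/i;

app.use((req: Request, res: Response, next: NextFunction) => {
  // 1. Only the homepage is ever redirected; deep pages are left alone.
  if (req.path !== "/") return next();

  // 2. Never redirect crawlers, so every version stays crawlable.
  if (BOTS.test(req.get("user-agent") || "")) return next();

  // 3. Never redirect internal navigation (the user clicked the language menu).
  if ((req.get("referer") || "").includes("example.com")) return next();

  // 4. Otherwise route by country with a temporary (302) redirect.
  if (countryForIp(req.ip ?? "") === "US") return res.redirect(302, "/us/");

  return next();
});

app.listen(3000);
```

A 302 rather than a 301 is used in the sketch so the geo redirect stays temporary and the homepage isn't signalled as having permanently moved.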
All language folders are specified in WMT and it definitely works.
So Google comes in to the dot com, finds the link to /us/ in the header, crawls down that link, and gets both versions of the site into its index.
Your biggest challenge here is getting the /us/ ranking above the dot com in the USA, which will require some creative link building.
Make sense?
-
Hi Bobby,
What eBay used to do, when you reached www.ebay.com from Québec in Canada, was display a lightbox saying something like: "Hey, we have a French Canadian version of our website, would you like to use it? Click here", and then you would be redirected to www.cafr.ebay.ca.
So, my suggestion is: instead of using a 301 redirect to send the user to the appropriate geotargeted section of the website, let them choose by showing them a friendly JavaScript lightbox (a rough sketch is below).
As web spiders won't be shown the JavaScript lightbox, they will be able to crawl and index both versions without a problem.
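A minimal client-side sketch of that kind of lightbox follows. It assumes the visitor's country code is rendered into the page by the server (here as a data-country attribute on the body tag); the markup, the geo-lightbox class, the dismissal key and the /us/ URL are placeholders for illustration, not eBay's actual implementation.

```typescript
// Sketch of a "would you like the local version?" prompt instead of a forced redirect.
// Crawlers that don't execute this script simply see the normal page and its links.

const DISMISS_KEY = "geo-suggestion-dismissed";

function suggestLocalVersion(country: string | null): void {
  // Only prompt visitors from a country that has its own version,
  // and only if they haven't already dismissed the suggestion.
  if (country !== "US" || localStorage.getItem(DISMISS_KEY)) return;

  const box = document.createElement("div");
  box.className = "geo-lightbox";
  box.innerHTML =
    "<p>Hey, we have a US version of our website, would you like to use it?</p>" +
    '<a href="/us/">Yes, take me there</a> <button type="button">No thanks</button>';

  // Remember the dismissal so the visitor isn't nagged on every page view.
  box.querySelector("button")?.addEventListener("click", () => {
    localStorage.setItem(DISMISS_KEY, "1");
    box.remove();
  });

  document.body.appendChild(box);
}

// The country code could be injected server-side, e.g. <body data-country="US">.
suggestLocalVersion(document.body.dataset.country ?? null);
```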
Best regards,
Guillaume Voyer.