Question regarding geo-targeting in Google Webmaster Tools.
-
I understand that it's possible to target both domains/subdomains and subfolders to different geographical regions in GWT.
However, I was wondering about the effect of targeting the domain to a single country, say the UK, while targeting subfolders to other regions (say the US and France).
e.g.
www.domain.com -> UK
www.domain.com/us -> US
www.domain.com/fr -> France, etc.
Would it be better to leave the main domain without a geographical target but set geo-targeting for the subfolders? Or would it be best to set geo-targeting for both the domain and the subfolders?
-
Hi David,
Thanks for your response. That makes perfect sense.
I assumed that to be the case but thought it was worth checking before making any changes.
I suppose that adding appropriate hreflang="x" mark-up, combined with the geo-targeting of the root domain and subfolders, should be enough to inform search engines of our intended geographical targets.
Strangely there wasn't a lot of information out there about this specific question - so thanks again.
Yusuf
-
Hi there, your suggested setup is perfectly fine. You're able to (and allowed to) target the root domain to a specific country while targeting subfolders to others. The geo-targeting on the main domain's URLs will be overridden wherever you specify a different target for a subfolder, for example:
if you target www.domain.com -> UK, then
- www.domain.com/this -> UK
- www.domain.com/that -> UK
and if you target www.domain.com/us -> US, then
- www.domain.com/us/this -> US
- www.domain.com/us/that -> US
Does that make sense?
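To combine the geo-targeting above with hreflang annotations (as mentioned elsewhere in the thread), each page version references its regional alternates. A minimal sketch using the example URLs from the question - the language-region codes and the optional x-default line are illustrative choices, not from the thread:

```html
<!-- Placed on every regional version of the page, each listing all alternates -->
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.com/us/" />
<link rel="alternate" hreflang="fr-fr" href="http://www.domain.com/fr/" />
<!-- Optional fallback for users in any other region -->
<link rel="alternate" hreflang="x-default" href="http://www.domain.com/" />
```

Each version must list every alternate (including itself), and the annotations must be reciprocal across all versions for search engines to honor them.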
Related Questions
-
Wrong target content in the SERP regarding language
Hi guys! I'm currently dealing with an SEO issue and need some advice about it. My problem is that Google doesn't show the right pages in the SERPs with regard to language. I translated some content into Italian, German, French, etc. When someone uses the project's brand name to find it on Google, and that person is French, German, or something else, Google shows the English version in the results. I would of course like Google to show the German version to a German user in the SERP. I have already set up my hreflang tags properly. Any tips to fix it? Thanks a lot in advance! And I hope everybody had a merry Christmas!
Intermediate & Advanced SEO | SEOBubble
-
Fetch as Google - Redirected
Hi, I have swapped from HTTP to HTTPS and put a redirect in place from HTTP to HTTPS. I also redirect www.xyz.co.uk/index.html to www.xyz.co.uk. When I fetch as Google, it shows up as "redirect!" Does this mean that I have too many 301s looping? Do I need the redirect from index.html to the root domain if I have a rel canonical in place for index.html? .htaccess (Linux):
RewriteCond %{HTTP_HOST} ^xyz.co.uk
RewriteRule (.*) https://www.xyz.co.uk/$1 [R=301,L]
RewriteRule ^$ index.html [R=301,L]
Intermediate & Advanced SEO | Cocoonfxmedia
-
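One likely cause of the "redirect!" result above: the rule sending the bare root to index.html, combined with the index.html-to-root redirect described in the question, forms a 301 loop. A possible fix - a sketch assuming Apache mod_rewrite, using the example domain from the question - redirects in one hop and only ever points index.html at the root, never the reverse:

```apache
RewriteEngine On

# Send any non-HTTPS or non-www request to the canonical https://www host in a single 301
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.xyz.co.uk/$1 [R=301,L]

# Redirect /index.html to the root (not the other way around),
# so the root URL stays the canonical landing page and no loop is possible
RewriteRule ^index\.html$ https://www.xyz.co.uk/ [R=301,L]
```

With this in place, the rel canonical on index.html becomes redundant but harmless; the key is that no two rules redirect at each other.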
Local Listing Question
We will be starting local SEO efforts on a medical practice that has 4 locations and 15 doctors at each location (so 60 listings total). I will submit each doctor and each location to InfoGroup, LocalEze, Acxiom & Factual. Also, I will only submit each location (not the doctors) to Google. The problem I'm seeing is that each listing would have the same exact phone number - it all goes to one main routing center. What kind of problems could come of this? Do we need separate phone numbers for each of the four locations (at the very least)?
Intermediate & Advanced SEO | JohnWeb12
-
Whole site blocked by robots in webmaster tools
My URL is: www.wheretobuybeauty.com.au. This new site has been re-crawled over the last 2 weeks, and in Webmaster Tools the index status displays the following: 50,000 pages indexed, 69,000 blocked by robots. The search query 'site:wheretobuybeauty.com.au' returns 55,000 pages. However, all pages in the site do appear to be blocked, and over the 2 weeks Google search traffic to the site declined from significant to zero (proving this is in fact the case). This is a Linux PHP site and has the following: 55,000 URLs in sitemap.xml, submitted successfully to Webmaster Tools; a robots.txt file existed but did not have any entries to allow or disallow URLs (today I have removed the robots.txt file completely); URL redirection within the Linux .htaccess file (there are many rows within this complex set of redirections, and the developer has double-checked the file and found it valid). I have read everything that Google and other sources have on this topic and it does not help. I have also checked the Webmaster Tools crawl errors, crawl stats, and malware reports, and there is no problem there related to this issue. Could this be a duplicate content issue? This is a price comparison site where approximately half the products have duplicate product descriptions - duplicated because they are obtained from the suppliers through an XML data file, and the suppliers have the same descriptions on their own sites. Help!!
Intermediate & Advanced SEO | rrogers
-
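Since the robots.txt described above contained no directives at all, one low-risk step is to serve an explicit allow-everything file rather than removing it entirely (a sketch; the sitemap URL assumes the sitemap.xml mentioned in the question lives at the site root):

```
User-agent: *
Disallow:

Sitemap: http://www.wheretobuybeauty.com.au/sitemap.xml
```

An empty Disallow line permits crawling of everything. If pages still report as blocked after this, it is worth checking whether they carry a noindex robots meta tag or an X-Robots-Tag response header, which can suppress indexing independently of robots.txt.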
E-Commerce site - How do I geo-target towns/cities/states if there aren't any store locations?
Site = e-commerce
Products = clothing (no apparel that can be location-specific, like sports gear where you can do location-specific team gear (NBA, NFL, etc.))
Problems = a. no store front; b. I don't want to do any sitewides (footers, sidebars, etc.) because of the Penguin update
Question = How do you geo-target these category pages and product pages?
Ideas = a. reviews with clients' locations; b. blog posts with clients' images wearing the apparel, with location descriptions and keywords, that also link back to that category or product page (images geo-targeted, with tags and descriptions); c. ?
Thanks in advance!
Intermediate & Advanced SEO | Cyclone
-
Something weird in my Google Webmaster Tools crawl errors...
Hey, I recently (this past May) redesigned my e-commerce site from .asp to .php. I am trying to fix all the old pages with 301 redirects that didn't make it in the switch, but I keep getting weird pages coming up in GWT. I have about 400 pages under crawl errors that look like this: "emailus.php?id=MD908070". I delete them and they come back. My site is http://www.moondoggieinc.com. The id #'s are product #'s for products that are no longer on the site, but the site is .php now. They also do not show a sitemap they are linked in, or any other page that they are linked from. Are these hurting me, and how do I get rid of them? Thanks! KristyO
Intermediate & Advanced SEO | KristyO
-
Who is beating you on Google (after Penguin)?
Hi,
After about a month of Penguin and one update, I am starting to notice an annoying pattern as to who is beating me in the rankings on Google. I was wondering if anybody else has noticed this. The sites who are beating me - almost without exception - fall into these two categories:
1) Super sites that have little or nothing to do with the service I am offering. It is not their homepages that are beating me; in almost all cases they are simply pages hidden in their forums where somebody in passing mentioned something relating to what I do.
2) Nobodies. Sites that have absolutely no links back to them, and look like they were made by a 5-year-old.
Has anybody else noticed this? I am just wondering if what I see only applies to my sites or if this is a pattern across the web. Does this mean that for small sites to rank, it is now all about on-page SEO? If it is all about on-page, well that is great - much easier than link building. But I want to make sure others see the same thing before dedicating a lot of time to overhauling my sites and creating new content. Thanks!
Intermediate & Advanced SEO | rayvensoft
-
Questions regarding Google's "improved url handling parameters"
Google recently posted about improved handling of URL parameters: http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html I have a couple of questions: Is it better to canonicalize URLs or use parameter handling? And will Google inform us if it finds a parameter issue, or should we prepare a list of parameters that should be addressed?
Intermediate & Advanced SEO | nicole.healthline
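For reference, the canonicalization alternative asked about above is a link element placed on each parameterized page, pointing at the preferred URL (the URLs here are hypothetical examples, not from the thread):

```html
<!-- On a parameterized page such as http://www.example.com/shoes?sort=price&sessionid=123 -->
<link rel="canonical" href="http://www.example.com/shoes" />
```

The practical difference: rel="canonical" is a per-page signal that any search engine can read, while parameter handling is a tool setting that applies across the whole site but only to the one search engine's crawler.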