International Sites - Sitemaps, Robots & Geolocating in WMT
-
Hi Guys,
I have a site that has now been launched in the US, having originally been UK-only. To accommodate this, the website has been set up using directories for each country. Example: www.domain.com/en-gb/ for the UK and www.domain.com/en-us/ for the US.
As the site was originally set up for the UK, the sitemap, robots.txt file & Webmaster Tools account were added to the main domain. Example: www.domain.com/sitemap.xml, www.domain.com/robots.txt and a WMT account verified for www.domain.com.
The question is: does this now need changing to make it specific to each country? For example, would the sitemap and robots.txt for the UK move under www.domain.com/en-gb/, and would the US have its own separate sitemap and robots.txt under www.domain.com/en-us/?
Also, in order to geolocate this in WMT, would this need to be done for each directory version instead of for the main domain? Currently the WMT account for the UK site is verified at www.domain.com - would this need reverifying at domain.com/en-gb?
Any help would be appreciated! Thanks!
-
Thanks for the insights. Very helpful. What about the robots.txt, though? Should it stay under http://www.example.com, where crawlers can find the file?
-
Yes, that's how I did it.
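For what it's worth, robots.txt is only read from the root of a host (not from subdirectories), so one root file can cover both countries and point crawlers at each sitemap. A minimal sketch, assuming placeholder example.com URLs and an /en-gb/ and /en-us/ layout like the one described above:

```
# http://www.example.com/robots.txt - the only location crawlers check on this host
User-agent: *
Disallow:

# Sitemap directives take absolute URLs, so one root robots.txt
# can reference every country-specific sitemap
Sitemap: http://www.example.com/en-gb/sitemap.xml
Sitemap: http://www.example.com/en-us/sitemap.xml
```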
-
Thanks for the response, much appreciated.
Would this mean that the sitemap featured at www.domain.com would need to reference each directory version (UK, US, etc.), leaving the en-gb & en-us sitemaps to reference just their own URLs?
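To make the question concrete: a root-level sitemap index referencing the per-country sitemaps might look something like the sketch below (example.com and the file names are placeholders, not the actual site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Served at http://www.example.com/sitemap.xml -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each country folder keeps its own sitemap listing only its own URLs -->
  <sitemap>
    <loc>http://www.example.com/en-gb/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/en-us/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```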
-
Hi, here's what Google officially has to say:
Webmaster Tools data and reporting work best on a site level. For example, if your site http://www.example.com has separate sections for different countries, we recommend adding each of those subsites or subfolders as a separate site. For example, if you have a travel site with specific subfolders covering Ireland, France, and Spain, you could add the following sites to your Webmaster Tools account:
- http://www.example.com
- http://www.example.com/france
- http://www.example.com/ireland
- http://www.example.com/spain
Following this, I have solved exactly the same problem you have for a client, and it works perfectly.
- _"The question is does this [sitemap] now need changing to make it specific for each country?"_ What I have done is add the sitemaps for each subdirectory as well as for the main domain. It works (tracking and all that) and is easy to do.
- _"Also in order to geolocate this in WMT would this need to be done for each directory version instead of the main domain?"_ Yes, that is the way to go.
- _"Currently the WMT account for the UK site is verified at www.domain.com, would this need reverifying at domain.com/en-gb?"_ Yes, it does, and so does every other subdirectory you add down the line. That allows precise international targeting per subdirectory via the hreflang and country-targeting settings, while the main GWT site for the root domain can be left set for global traffic (see the hreflang sketch below).
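To illustrate the hreflang side of that last point, here is a minimal sketch of the reciprocal annotations each page would carry; example.com, the en-gb/en-us paths and the x-default entry are placeholders and assumptions, not the poster's actual setup:

```html
<!-- Placed in the <head> of BOTH http://www.example.com/en-gb/ and http://www.example.com/en-us/ -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/en-gb/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/en-us/" />
<!-- Optional: which version users outside the targeted countries should get -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```

Every page in the group needs to list all of its alternates, including itself; if one side omits the tags, WMT reports "no return tags" errors.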
Related Questions
-
Multi National Company that Doesn't Want to Implement International SEO
I have an interesting situation: a client wants to merge two ccTLDs into one. They currently have .fi (for Finland) and .com (for the USA), and they want to merge both sites onto the .com. The original plan was to use subfolders for each country paired with hreflang. However, the team now wants to merge both sites with NO subfolders differentiating between Finland and the US. My understanding of international SEO is that this runs directly counter to best practice, but are there specific reasons against doing it this way? I'm struggling to find specific reasons I can cite to the client to argue why we should at least use a subfolder structure or some sort of international SEO strategy.
-
How to Configure Robots.txt File
How do I correctly configure the robots.txt file for a website? I need a proper process to follow, because a lot of my website's URLs are being excluded by Google due to issues in the robots.txt file.
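As a rough illustration of why URLs get excluded: robots.txt works by prefix-matching Disallow rules, so a minimal, permissive file and a couple of common over-blocking patterns look like this (the paths are just examples, not the actual site):

```
# Permissive: every crawler may fetch everything
User-agent: *
Disallow:

# Over-blocking patterns that commonly exclude URLs by accident:
# Disallow: /           <- blocks the entire site
# Disallow: /product    <- blocks every URL whose path starts with /product
```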
-
MultiRegional site indexing problems
Hello there! I have a multiregional site and am dealing with some indexing problems: Google has only indexed our USA site. We have:
- set up hreflang tags
- set up specific subdirectories: https://www.website.com/ (the en-us site and our main site), https://www.website.com/en-gb, https://www.website.com/en-ca, https://www.website.com/fr-ca, https://www.website.com/fr-fr, https://www.website.com/es-es, ...
- set up automatic geo-IP redirects (301 redirects)
- created a sitemap index and a different sitemap for each regional site
- created a Google Webmaster Tools property for each country targeted
- created translations for each different language, and added some canonicals to the US site where its English content is reused
The problem is that Google is not indexing our regional sites. I think the problem is that Google is using a US bot when spidering the site, so it will always be redirected to the US version by a 301 redirect. I have used Fetch as Google with some of our regional folders and asked for "Indexing requested for URL and linked pages", but am still waiting. Any ideas? Should I change the 301s to 302s? I really don't know what to do. Thank you so much!
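For reference, since there is already a sitemap per regional folder, hreflang relationships can also be declared inside those sitemaps rather than in the page markup; an abbreviated sketch using the folder URLs from the question (only a few locales shown, and the x-default entry is an assumption, not part of the original setup):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- One <url> entry per page; every regional alternate is listed, including the page itself -->
  <url>
    <loc>https://www.website.com/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.website.com/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.website.com/en-gb" />
    <xhtml:link rel="alternate" hreflang="fr-fr" href="https://www.website.com/fr-fr" />
    <xhtml:link rel="alternate" hreflang="x-default" href="https://www.website.com/" />
  </url>
</urlset>
```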
-
International SEO errors
Hello, I'm in a muddle here. A website has a .co.uk and a .com version, targeting the UK market and the USA market respectively. The content for the UK version has been localised for the UK audience (e.g. spellings), but otherwise the content is the same on both sites. There are errors on the .co.uk version in Webmaster Tools: International Targeting | Language > 'en' - no return tags (URLs for your site and alternate URLs in 'en' that do not have return tags).
Q1) What does this mean? I can see that both the .com and .co.uk versions have only this in place:
Q2) Should they actually have ... respectively?
Q3) Do they also need rel=canonical from the .co.uk to the .com?
Any help would be appreciated.
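For context on what "return tags" means: every page in an hreflang pair has to reference the other page and itself. A minimal sketch with placeholder domains (example.com / example.co.uk stand in for the real sites mentioned in the question):

```html
<!-- The same two tags go in the <head> of BOTH http://www.example.com/ and
     http://www.example.co.uk/, so that each page "returns" the reference made by the other -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
```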
-
Best International Rank Checker?
Does anyone have a recommendation for the best international ranking tool? I'm looking to gather accurate ranking trends and need a service that will return rankings in as many localized countries as possible.
-
Shall I automatically redirect international visitors from www.domain.com to e.g. www.domain.com/es? What is best SEO practice?
We have chosen the one-domain approach, with our international site having different language versions in subdirectories of the main domain:
www.domain.com/es
www.domain.com/it
etc.
What is, SEO-wise, the best practice for implementing the international index page? I see the following options:
1) Entering www.domain.com displays, without redirection, the index page in the language of the user (e.g. based on IP or browser) at www.domain.com. Example: www.booking.com
2) Entering www.domain.com always shows the English index page. Additionally, one may display a message in the header, if the IP is from another country, with a link to the other language version. Example: www.apple.com
3) Entering www.domain.com always redirects automatically to the country-specific subdirectory based on IP. Example: www.samsung.com
Any thoughts/suggestions on what may be the best solution from an SEO perspective? For a user, I believe options 1) and 3) are preferable.
-
Multilingual site - separate domain or all under the same umbrella
This has been asked before with no clear winner. I am trying to sum up the pros and cons of a multilingual site sharing the same domain for all languages versus breaking it into separate domains. As an example, let's assume we are talking about a French property portal with an English version as well, and assume most of the current incoming links and traffic are from France.
A) www.french-name.fr/fr/pageX for the French version, www.english-name.com/en/pageX for the English version
B) www.french-name.fr/fr/ for the French version (as is), www.french-name.fr/en for the English version
The client currently follows approach A but is thinking of moving towards B. We see the following pros and cons for B:
- takes advantage of the french-name.fr domain strength and incoming links
- scalable: can add more languages without registering a domain and building search engine position for each one individually
- potential issues with duplicate content, as we are not able to geotarget differently in Google Webmaster Tools
- potential dilution of each page's strength, as we will now have many more pages under the same domain (double the pages, basically) - is this a valid concern?
- usability/marketing concerns, as the name of the site is not in English (but then people looking for a house in France would at least not find it completely alien)
What are your thoughts on this? Thanks in advance.
-
What is the best SEO site structure for multi country targeting?
Hi there, We are an online retailer with four (and soon to be five) distinct geographic target markets (we have physical operations in both the UK and New Zealand). We currently target these markets like this:
- United Kingdom (www.natureshop.co.uk)
- New Zealand (www.natureshop.co.nz)
- Australia (www.natureshop.com/au) - using a Google Webmaster Tools geo-targeted folder
- United States (www.natureshop.com) - using a Google Webmaster Tools geo-targeted domain
- Germany (www.natureshop.de) - in German and yet to be launched as a full site
We have various issues we want to address. The key one is this: our www.natureshop.co.uk website was adversely affected by the Panda update on April 12. We had some external SEO firms work on this site for us, and unfortunately the links they gained for us were very low quality, from sometimes spammy sites, and also keyword-packed with very little anchor text variation. Our other websites (the .co.nz and .com) moved up after the updates, so I can only assume our external SEO consultants were responsible for this. I have since managed to get them to remove around 70% of these links and we have brought all SEO efforts back in house again. I have also worked to improve the quality of our content on this site, and I have 404'ed the six worst-affected pages (the ones that had far too many single-phrase anchor text links coming into them). We have, however, not budged much in our rankings (we have made some small gains, but not a lot). Our other weaknesses are not-the-fastest page load times and some "thin" content. We are on the cusp (around 4 weeks away) of deploying a brand new platform using ASP.NET MVP with N2, and this looks like it will address our page load speed issues. We have also been working hard on our content building, and I believe we will address that as well with this release. Sorry for the long build-up, but I felt some background was needed to get to my questions. My questions are:
- Do you think we are best to proceed with trying to get our www.natureshop.co.uk website out of the Panda trap, or should we consider deploying a new version of the site on www.natureshop.com/uk/ (geo-targeted to the UK)?
- If we do this, should we do the same for New Zealand and Germany and redirect the existing domains to the new geo-targeted folders?
- If we do this, should we redirect the natureshop.co.uk pages to the new www.natureshop.com/uk/ pages, or will this simply pass on the Panda "penalty"?
- Will this model build stronger authority on the .com domain that benefits all of the geo-targeted subfolders, or does it not work this way?
- Finally, can we deploy the same pages and content on the different geo-targeted subfolders (with some subtle regional variations of spelling and language), or will this result in a duplicate content penalty?
Thank you very much in advance to all of you, and I apologise for the length and complexity of the question. Kind regards,
Conrad Cranfield
Founder: Nature Shop Ltd