How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi, and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for multiple domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help, René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change the configuration.
Did I understand that Google link correctly: if we have verified ownership in GWT for all the involved domains, cross-site submission in robots.txt is okay? I guess Google will consider it okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
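As a rough sketch of what that recommendation allows, a single robots.txt served identically on every host could point at sitemaps hosted on just one of the domains, since the protocol accepts cross-host references once all the involved hosts are verified in GWT (the sitemapFi.xml name below is only a guess, following the naming pattern in the question):

```
# robots.txt served identically on www.mysite.net, www.mysite.se, www.mysite.fi
User-agent: *
Disallow:

# Cross-host sitemap references are accepted once every
# involved host is verified with the search engine.
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
Sitemap: http://www.mysite.net/sitemapFi.xml
```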
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in the robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each top-level domain (e.g. .dk, .com) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a "web in web": a folder for each site on the server, with .htaccess redirecting to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing three different sites, even though it isn't.
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope that made sense.
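A minimal .htaccess sketch of that folder-per-site idea could look like the following, assuming mod_rewrite is enabled; the folder names /com/ and /dk/ are made up for illustration, and the rewrites are internal, so visitors keep seeing the clean URLs:

```apache
RewriteEngine On

# Send each hostname into its own folder, where that site's
# robots.txt and sitemap live alongside its pages.
RewriteCond %{HTTP_HOST} ^www\.yoursite\.dk$ [NC]
RewriteCond %{REQUEST_URI} !^/dk/
RewriteRule ^(.*)$ /dk/$1 [L]

RewriteCond %{HTTP_HOST} ^www\.yoursite\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/com/
RewriteRule ^(.*)$ /com/$1 [L]
```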
-
Thanks for your response, René!
The thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or anyone else doesn't think we are making a cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit sitemaps for each domain.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set it up in .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser does, so it has to obey the server; if the server tells it to go somewhere, it will.
The robots.txt can't, by itself, do what you want; the server can. But in my opinion, using the Bing and Google webmaster tools should do the trick.
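If you only want each domain to hand out its own robots.txt without splitting the docroot into folders, a mod_rewrite sketch along these lines could work (the robots-se.txt and robots-fi.txt file names are hypothetical; requests on www.mysite.net fall through to the default robots.txt):

```apache
RewriteEngine On

# Serve a language-specific robots.txt depending on the requested host.
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ robots-se.txt [L]

RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ robots-fi.txt [L]
```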