How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is used by multiple domains? If I, for instance, put the rows
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help, René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change its configuration.
Did I understand that Google link correctly, in that if we have verified ownership in GWT for all the involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer, right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in the robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from the TLD (top-level domain, e.g. .dk, .com, etc.) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be to have a web within a web (a folder for each site on the server) and then have the .htaccess redirect to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in the .htaccess. I hope it made sense.
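As a rough sketch of that host-based routing idea (assuming Apache with mod_rewrite enabled; the per-domain file names robots-net.txt, robots-se.txt and robots-fi.txt are made up for illustration), the .htaccess could look something like this:
RewriteEngine On
# Hand each requesting host its own robots.txt instead of the shared one
RewriteCond %{HTTP_HOST} ^www\.mysite\.net$ [NC]
RewriteRule ^robots\.txt$ robots-net.txt [L]
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ robots-se.txt [L]
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ robots-fi.txt [L]
The same RewriteCond/RewriteRule pairs could be repeated for the sitemap files, so each domain only ever hands a crawler its own robots.txt and sitemap.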
-
Thanks for your response, René!
Thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making some cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools. There you have each domain registered and can submit a sitemap for each domain.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set it up in the .htaccess to test for the entrance domain and make sure to redirect to the right file. Any bot enters a site just like a browser, so it has to obey the server; if the server tells it to go somewhere, it will.
The robots.txt can't, by itself, do what you want. The server can, however. But in my opinion, using Bing and Google webmaster tools should do the trick.
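To make "redirect to the right file" concrete, here is a minimal sketch of what the file served on the Swedish domain could contain (the file name robots-se.txt is hypothetical, and it assumes sitemapSe.xml lists only www.mysite.se URLs; since all the domains point to the same physical location, the sitemap is reachable on the .se host as well):
# robots-se.txt, returned for requests arriving on www.mysite.se
User-agent: *
Disallow:
Sitemap: http://www.mysite.se/sitemapSe.xml
With one such file per domain, each host advertises only its own sitemap, so the cross-submission question from the original post shouldn't arise.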