How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we replace the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for multiple domains? If I for instance put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change configurations.
Did I understand that Google link correctly: if we have verified ownership in GWT for all the involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer, right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
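To make that concrete: under Google's cross-submission guidance, a single shared robots.txt could list each localized sitemap on its own host, assuming every domain is verified in GWT. A minimal sketch (the sitemap filenames here are hypothetical, following the naming used in the question):

```
# Shared robots.txt served identically on every domain.
# Each sitemap lives on the host whose URLs it contains;
# cross-host listing is fine once all domains are verified in GWT.
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.se/sitemapSe.xml
Sitemap: http://www.mysite.fi/sitemapFi.xml
```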
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
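For context, each of those <url> entries sits inside a complete sitemap file. A minimal sitemap for one domain would follow the sitemaps.org schema, something like this (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://yoursite.com/somepage.html</loc>
  </url>
</urlset>
```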
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each TLD (top-level domain, e.g. .dk, .com) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with .htaccess redirecting to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope that made sense.
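A rough sketch of that folder-per-domain idea, assuming Apache with mod_rewrite and hypothetical folder names net/, se/ and fi/ (untested, adapt to your setup):

```apache
RewriteEngine On

# Internally route each domain to its own folder.
# Each folder holds its own robots.txt and sitemap.
RewriteCond %{REQUEST_URI} !^/(net|se|fi)/
RewriteCond %{HTTP_HOST} ^www\.mysite\.net$ [NC]
RewriteRule ^(.*)$ /net/$1 [L]

RewriteCond %{REQUEST_URI} !^/(net|se|fi)/
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^(.*)$ /se/$1 [L]

RewriteCond %{REQUEST_URI} !^/(net|se|fi)/
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^(.*)$ /fi/$1 [L]
```

The first RewriteCond on each rule stops requests that already point into a site folder from being rewritten again.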
-
Thanks for your response, René!
The thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or anyone else doesn't think we are making a cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools. There you have each domain registered and can submit a sitemap for each domain individually.
If you want to make sure your sitemaps are not crawled by a bot for the wrong language, I would set up .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser does, so it has to obey the server; if the server tells it to go somewhere, it will.
robots.txt can't, by itself, do what you want. The server can, however. But in my opinion, using the Bing and Google webmaster tools should do the trick.
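One way to sketch that server-side selection in .htaccess, assuming Apache with mod_rewrite and hypothetical per-domain files robots-net.txt, robots-se.txt and robots-fi.txt (untested):

```apache
RewriteEngine On

# Serve a domain-specific robots.txt depending on the requested host.
# Each robots-XX.txt then lists only the sitemap for its own domain.
RewriteCond %{HTTP_HOST} ^www\.mysite\.net$ [NC]
RewriteRule ^robots\.txt$ /robots-net.txt [L]

RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]

RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]
```

Because the rewrite is internal (no redirect flag), each crawler still believes it fetched /robots.txt from the domain it requested.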