How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain was requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for multiple domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
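For clarity, the complete file would then look something like the sketch below (the User-agent block is only an assumption to make the example whole; the Sitemap rows are the ones we actually plan to add):

# robots.txt - served unchanged on every domain, since all hosts share the same web root
User-agent: *
Disallow:

Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml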
-
Thanks for your help, René!
-
yup

-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change the configuration.
Did I understand that Google link correctly: if we have verified ownership in GWT for all the involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even if something else might work just as well, just to be on the safe side.

-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.

Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
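For completeness, a full minimal sitemap file for the .dk example would look roughly like this (the wrapper elements are just the standard sitemaps.org schema; the URL is the hypothetical example above):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://yoursite.dk/somepage.html</loc>
  </url>
</urlset>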
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each TLD (top-level domain, e.g. .dk, .com) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with .htaccess routing each domain to the right folder. In that folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
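Something along these lines (a sketch only — I'm assuming Apache with mod_rewrite enabled and per-site folders named /net/, /se/ and /fi/ in the web root; adjust the names to your own setup):

RewriteEngine On

# Requests arriving on the .se host are served from the /se/ folder
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]

# Same idea for .fi ...
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]

# ... and .net
RewriteCond %{HTTP_HOST} ^www\.mysite\.net$ [NC]
RewriteCond %{REQUEST_URI} !^/net/
RewriteRule ^(.*)$ /net/$1 [L]

Each folder then carries its own robots.txt and sitemap, so every host only ever exposes its own files.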
I'm no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope it made sense.

-
Thanks for your response, René!
The thing is, we already submit the sitemaps in Google Analytics, but the SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making a cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit a sitemap for each one.
If you want to make sure your sitemaps are not crawled by a bot for the wrong language, I would set up .htaccess to test for the entrance domain and redirect to the right file. A bot enters a site just like a browser does, so it has to obey the server; if the server tells it to go somewhere, it will.
robots.txt can't do what you want by itself; the server can, however. But in my opinion, using Bing and Google Webmaster Tools should do the trick.
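To illustrate the "test for the entrance domain" idea, a rough sketch (assuming Apache with mod_rewrite, one shared web root, and the sitemapNet.xml / sitemapSe.xml file names from the question):

RewriteEngine On

# A crawler asking www.mysite.se for /sitemap.xml gets the Swedish sitemap
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^sitemap\.xml$ /sitemapSe.xml [L]

# The .net host gets the .net sitemap
RewriteCond %{HTTP_HOST} ^www\.mysite\.net$ [NC]
RewriteRule ^sitemap\.xml$ /sitemapNet.xml [L]

The same trick could serve a host-specific robots.txt if you ever need the directives themselves to differ per domain.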