How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi, and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is used by multiple domains? If I, for instance, put the rows
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
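To make the question concrete, the shared file would look something like this (a sketch using the hostnames from the question; the empty Disallow line is just a permissive placeholder):

```text
# robots.txt served identically for www.mysite.net, www.mysite.se, www.mysite.fi
User-agent: *
Disallow:

# The sitemaps protocol allows multiple Sitemap lines;
# each must be a full, absolute URL
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
```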
-
Thanks for your help, René!
-
yup
-
Yes, I meant GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change configurations.
Did I understand that Google link correctly: if we have verified ownership in GWT for all the involved domains, is cross-site submission in robots.txt okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even when something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from the TLD (top-level domain, e.g. .dk, .com, etc.) to the correct pages, then the sitemaps shouldn't be a problem.
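For reference, each of those per-domain sitemaps would be a complete XML document along these lines (hypothetical URL, standard sitemap namespace):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; all URLs stay on this sitemap's own domain -->
  <url>
    <loc>http://yoursite.dk/somepage.html</loc>
  </url>
</urlset>
```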
The best solution would be to have a web within a web: a folder for each site on the server, with .htaccess redirecting to the right folder. In each folder you'd have a robots.txt and a sitemap for that specific site. That way all your problems would be gone in a jiffy. It would be just like managing 3 different sites, even though it isn't.
I'm no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope that made sense.
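As a sketch of that folder-per-site idea (hypothetical folder names, and it assumes Apache with mod_rewrite enabled), the .htaccess in the web root might look something like this:

```apache
RewriteEngine On

# Don't re-rewrite requests already inside a per-site folder
RewriteCond %{REQUEST_URI} !^/(net|se|fi)/
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^(.*)$ /se/$1 [L]

RewriteCond %{REQUEST_URI} !^/(net|se|fi)/
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^(.*)$ /fi/$1 [L]

# Everything else (www.mysite.net) falls through to the /net folder
RewriteCond %{REQUEST_URI} !^/(net|se|fi)/
RewriteRule ^(.*)$ /net/$1 [L]
```

Each folder then carries its own robots.txt and sitemap, so every domain answers with files that only reference its own URLs.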
-
Thanks for your response, René!
The thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making a cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit sitemaps for each domain separately.
If you want to make sure your sitemaps are not crawled by a bot for the wrong language, I would set up the .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser, so it has to obey the server; if the server tells it to go somewhere, it will.
robots.txt can't, by itself, do what you want; the server can. But in my opinion, using the Bing and Google webmaster tools should do the trick.
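One way to do that entrance-domain test without splitting the site into folders is an internal rewrite keyed on the Host header. This is a hypothetical sketch, assuming Apache and that per-domain files such as robots-se.txt and robots-fi.txt already exist in the web root:

```apache
RewriteEngine On

# Serve a per-domain robots.txt based on the requested hostname
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]

RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]
# www.mysite.net falls through to the default /robots.txt
```

The same pattern works for the sitemap files themselves, so each domain's robots.txt only ever points at a sitemap on its own hostname.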