How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is used by multiple domains? If I, for instance, put the rows
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help, René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change configurations.
Did I understand that Google link correctly: if we have verified ownership in GWT for all the involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer, right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even when something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in the robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each top-level domain (TLD, e.g. .dk, .com) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with the .htaccess redirecting to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy; it will be just like managing three different sites, even though it isn't.
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in the .htaccess. I hope it made sense.
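A minimal sketch of that folder-per-site routing in .htaccess might look like the following (the folder names /se/ and /fi/ and the exact rules are assumptions, not a tested config; the extra REQUEST_URI condition stops the rewrite from looping):

```apache
RewriteEngine On

# Requests arriving on the .se domain get served from the /se/ folder
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]

# Requests arriving on the .fi domain get served from the /fi/ folder
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]
```

Each folder would then carry its own robots.txt and sitemap, as described above.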
-
Thanks for your response, René!
Thing is, we already submit the sitemaps in Google Analytics, but the SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making a cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
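For reference, the setup that avoids any cross-submission question entirely is for each domain to serve a robots.txt that lists only its own sitemap. A sketch of what www.mysite.se would return (only the sitemap file name is carried over from the question; the rest is an assumed permissive default):

```text
# robots.txt as served on www.mysite.se
User-agent: *
Disallow:

Sitemap: http://www.mysite.se/sitemapSe.xml
```

An empty Disallow line allows all crawling; only the Sitemap line differs per domain.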
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit sitemaps for each domain separately.
If you want to make sure your sitemaps are not crawled by a bot for the wrong language, I would set up the .htaccess to test for the entrance domain and redirect to the right file. A bot enters a site just like a browser, so it has to obey the server; if the server tells it to go somewhere, it will.
The robots.txt can't, by itself, do what you want; the server can. But in my opinion, using the Bing and Google webmaster tools should do the trick.
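One way to sketch that "test the entrance domain" idea in .htaccess is an internal rewrite that hands each domain its own robots file (the robots-se.txt and robots-fi.txt file names are assumptions for illustration):

```apache
RewriteEngine On

# Swedish domain gets the Swedish robots file
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]

# Finnish domain gets the Finnish robots file
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]
```

Because this is an internal rewrite (no redirect flag), every bot still requests /robots.txt as usual and simply receives the per-domain version.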