How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we replace the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for multiple domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some kind of cross-submission error?
-
Thanks for your help René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change configurations.
Did I understand that Google link correctly, in that if we have verified ownership in GWT for all the involved domains, cross-site submission in robots.txt is okay? I guess Google will consider it okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
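For reference, a complete minimal sitemap file for one of the domains could look like this (a sketch following the sitemaps.org schema; the domain and page are placeholders, not from the thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page on this specific domain -->
  <url>
    <loc>http://www.mysite.se/somepage.html</loc>
  </url>
</urlset>
```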
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each TLD (top-level domain, e.g. .dk, .com) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with .htaccess redirecting to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
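The folder-per-site idea could be sketched roughly like this in a .htaccess using mod_rewrite (a hypothetical example; the folder names and domains are assumptions based on the question, and you'd repeat the block per domain):

```apache
RewriteEngine On

# Serve requests for www.mysite.se from the /se/ folder internally
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]

# Serve requests for www.mysite.fi from the /fi/ folder internally
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]
```

Each folder would then hold its own robots.txt and sitemap, so every domain advertises only its own URLs.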
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope that made sense.
-
Thanks for your response René!
Thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making some cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit sitemaps to them for each domain.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set up the .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser, so it has to obey the server; if the server tells it to go somewhere, it will.
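Testing the entrance domain in .htaccess might look something like this (a hypothetical mod_rewrite sketch; the per-language robots file names are made up for illustration):

```apache
RewriteEngine On

# Hand the Swedish robots.txt (with its own Sitemap: line) to www.mysite.se
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]

# Hand the Finnish robots.txt to www.mysite.fi
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]

# Any other host (e.g. www.mysite.net) falls through to the default robots.txt
```

Each per-domain robots file would then reference only that domain's sitemap, which sidesteps the cross-submission question entirely.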
robots.txt can't, by itself, do what you want. The server can, however. But in my opinion, using the Bing and Google webmaster tools should do the trick.