How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we replace the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for multiple domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change the configuration.
Did I understand that Google link correctly, in that if we have verified ownership in GWT for all involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer, right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each top-level domain (e.g. .dk, .com) to the correct pages, then the sitemaps shouldn't be a problem.
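For reference, each of those <url><loc> fragments lives inside a full sitemap file. A minimal complete one would look roughly like this (the URL and date are just placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap per the sitemaps.org 0.9 schema; one <url> entry per page -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://yoursite.com/somepage.html</loc>
    <lastmod>2011-01-01</lastmod>
  </url>
</urlset>
```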
The best solution would be a web within a web: a folder for each site on the server, with .htaccess redirecting each domain to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope that made sense.
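As a rough sketch of that folder-per-site idea, assuming Apache with mod_rewrite enabled and made-up folder names (`/net/`, `/se/`, `/fi/`), the .htaccess in the web root could look something like this:

```apache
# Route each domain to its own folder, which contains its own
# robots.txt and sitemap. These are internal rewrites, not visible redirects.
RewriteEngine On

# Swedish site
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]

# Finnish site
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]

# Everything else (www.mysite.net) falls through to the default folder
RewriteCond %{REQUEST_URI} !^/(se|fi|net)/
RewriteRule ^(.*)$ /net/$1 [L]
```

This is untested against any specific setup; the exact conditions depend on how the server is configured.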
-
Thanks for your response René!
Thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making a cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots)
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools. There you have each domain registered and can submit sitemaps for each of them.
If you want to make sure your sitemaps are not crawled by a bot for the wrong language, I would set it up in .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser does, so it needs to obey the server; if the server tells it to go somewhere, it will.
robots.txt can't, by itself, do what you want. The server can, however. But in my opinion, using the Bing and Google webmaster tools should do the trick.
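If you stay with a single shared document root instead of per-site folders, the "test for the entrance domain" approach could look roughly like this in .htaccess (assuming Apache mod_rewrite; sitemapNet.xml and sitemapSe.xml are from the question above, sitemapFi.xml is a made-up name):

```apache
RewriteEngine On

# Serve a language-specific sitemap depending on which domain was requested
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^sitemap\.xml$ /sitemapSe.xml [L]

RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^sitemap\.xml$ /sitemapFi.xml [L]

# Everything else (www.mysite.net) gets the default sitemap
RewriteRule ^sitemap\.xml$ /sitemapNet.xml [L]
```

Each robots.txt served per domain would then just point at /sitemap.xml on that same host, which also sidesteps the cross-submission question.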