How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we replace the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for multiple domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help, René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change configurations.
Did I understand that Google link correctly, in that if we have verified ownership in GWT for all involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
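To put it concretely: once every domain is verified in GWT, the shared robots.txt could simply list one Sitemap line per domain, something like this (just a sketch reusing the file names from your question; sitemapFi.xml is my guess for the Finnish one, following the same pattern):
# the rest of your robots.txt stays whatever it is today
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
Sitemap: http://www.mysite.net/sitemapFi.xml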
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each top-level domain (e.g. .dk, .com) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be to have a web within a web: a folder for each site on the server, with .htaccess redirecting to the right folder. In each folder you keep a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy, and it will be just like managing 3 different sites, even though it isn't (there's a rough .htaccess sketch below).
I'm no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope that makes sense.
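For what it's worth, the routing part could look roughly like this in the root .htaccess (just a sketch, assuming Apache with mod_rewrite enabled and made-up /se/ and /fi/ folders, with the .net content staying in the root):
RewriteEngine On

# requests arriving on the .se domain get served from the /se/ folder
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]

# same idea for the .fi domain and the /fi/ folder
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]
Each folder then carries its own robots.txt and sitemap, so every domain hands the crawler the right files and the cross-submission question never comes up.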
-
Thanks for your response, René!
Thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making some cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit sitemaps for each one.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set up the .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser does, so it has to obey the server; if the server tells it to go somewhere, it will.
robots.txt can't, by itself, do what you want. The server can, however. But in my opinion, using the Bing and Google webmaster tools should do the trick. A rough sketch of the .htaccess approach is below.
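If you only want robots.txt (and the sitemap it points to) to differ per domain, without splitting the whole site into folders, a couple of rewrite rules are enough. A rough sketch, assuming Apache with mod_rewrite and made-up file names like robotsSe.txt and robotsFi.txt:
RewriteEngine On

# hand the Swedish robots.txt to anything asking for /robots.txt on the .se domain
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robotsSe.txt [L]

# same for the Finnish domain
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robotsFi.txt [L]
The bot asks for /robots.txt like any browser would, and the server quietly serves the file that belongs to that domain.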