How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when that robots.txt is shared by multiple domains? If I, for instance, put the rows
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
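Spelled out in full, the robots.txt I have in mind would look something like this (the User-agent lines are just the usual allow-everything boilerplate):

```text
# robots.txt served at http://www.mysite.net/robots.txt
# (and, since all domains share one root, also at
#  http://www.mysite.se/robots.txt etc.)
User-agent: *
Disallow:

# Both sitemaps live on the .net host, even though sitemapSe.xml
# lists URLs on the .se host
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
```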
-
Thanks for your help, René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change configuration.
Did I understand that Google link correctly, in that if we have verified ownership in GWT for all involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from the TLD (top-level domain, e.g. .dk, .com) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with .htaccess redirecting to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope it made sense.
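A rough sketch of that folder-per-site setup in .htaccess (assuming Apache with mod_rewrite enabled; the folder names /se/ and /fi/ are made up for illustration):

```apache
RewriteEngine On

# Route each domain to its own folder, each holding its own
# robots.txt and sitemap. The REQUEST_URI condition stops the
# rule from rewriting already-rewritten paths in a loop.
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]

RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]

# Requests for www.mysite.net fall through to the document root.
```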
-
Thanks for your response, René!
Thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making some cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit sitemaps to them for each domain separately.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set up .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser does, so it has to obey the server; if the server tells it to go somewhere, it will.
robots.txt can't, by itself, do what you want. The server can, however. But in my opinion, using the Bing and Google webmaster tools should do the trick.
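That host test could look something like this in .htaccess (assuming Apache with mod_rewrite; the per-language file names robots-se.txt and robots-fi.txt are hypothetical):

```apache
RewriteEngine On

# Serve a different robots file depending on which host was requested.
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]

RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]

# Any other host falls through to the default /robots.txt,
# which can carry the .net sitemap entries.
```

Since the rewrite is internal, the crawler still sees the file at /robots.txt on whichever domain it asked for, which is what the robots standard requires.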