One robots.txt file for multiple sites?
-
I have two sites hosted with Bluehost and was told to put a robots.txt file in the root folder and just use the one robots.txt for both sites. Is this right? It seems wrong to me, since I want to block certain things on just one of the sites.
Thanks for the help,
Rena
-
Hi Rena. Your instinct is right: if the two sites are separate domains that you want to handle differently, you should place a separate robots.txt file in each domain's root so that they're accessible at xyz.com/robots.txt and abc.com/robots.txt.
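As a minimal sketch (the paths here are hypothetical placeholders; substitute whatever you actually want to block on each site), the two files might look like this:

```
# xyz.com/robots.txt - blocks one section on this site only
User-agent: *
Disallow: /private/

# abc.com/robots.txt - allows everything
User-agent: *
Disallow:
```

Cheers!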
-
Hi Rena,
You technically can do that, but it's not recommended, for exactly the reason you state above: more often than not, two sites aren't going to have the same set of disallow rules.
You should also be using each robots.txt file to point search engines to that site's XML sitemap, and if you're sharing one robots file, you can't specify two different sitemaps on two different domains.
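As an illustration (the sitemap URL is a placeholder), each domain's own file would carry a line like this:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

With separate files, site A's robots.txt points at site A's sitemap and site B's points at site B's, which a shared file can't do cleanly.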
-
Each individual website (and each subdomain, if you add any) needs its own robots.txt file. You can copy the same contents and reuse them on every site if they happen to match, but each site needs its own robots.txt file at its root.
Related Questions
-
Blocking in Robots.txt and the re-indexing - DA effects?
I have two good, high-DA sites that target the US (.com) and the UK (.co.uk). The .com ranks well but is dormant from a commercial aspect; the .co.uk is the commercial focus and gets great traffic. The issue is that the .com ranks for our brand in the UK, and I want the .co.uk to rank for brand in the UK. I can't 301 the .com, as it will be used again in the near future. I want to block the .com in robots.txt, with a view to unblocking it again when I need it. I don't think the DA would be affected, as the links stay and the site stays live (just not indexed), so when I unblock it, it should be fine. HOWEVER, my query is that things like the organic CTR data Google records, and other factors, won't contribute to its value while it's blocked. Has anyone ever blocked and unblocked a site, and what were the effects?
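For reference, the blanket block being described is a single rule; note that it stops compliant crawlers from fetching the site, but it doesn't by itself instantly remove pages that are already indexed:

```
# robots.txt on the dormant .com while it's parked
User-agent: *
Disallow: /
```

All answers greatly received - cheers, GB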
Technical SEO | Bush_JSM
-
Site Migration from One Dev. and Server to Another Dev. and Server
Hi Mozzers! I've got a client that is in the early stages of moving the development of their site to another company and, therefore, to a new server. The site is very large and the migration will take place over 18 months. In the beginning, smaller chunks of the site will be moved, and as that process gets dialed in, larger portions will migrate. It was brought to our attention today that they (on either side of development) have not yet worked out the logistics of keeping the domain and URL structure consistent throughout the migration. The initial proposal was that they publish newly migrated pages to a subdomain, which we obviously want to steer away from. I'm now on a mission to find a solution that will make everyone happy: client, old dev, new dev, and us (as the SEO partner). Does anyone have experience in managing SEO through a migration such as this?
Technical SEO | LoganRay
-
Would merging a site with strong DA with one that has weak DA be a smart move?
I am working on a project for a client that has two ecommerce sites, each with several thousand products. Site A has a strong DA, ranks well on Google for thousands of competitive keywords, and generates high traffic and conversions. Site B has a poor DA, ranks poorly, and gets much less traffic. We are considering merging the 5,000+ product pages from site B into site A. How can we evaluate whether this would be a wise move with the least risk to site A?
Technical SEO | richdan
-
Multiple Sites Duplicate Content Best Practice
Hi there, I have one client (atlantawidgets.com) who has a main site, but also has duplicate sites on different URLs targeting specific geographic areas, e.g. widgetmakersinmarietta.com. 1) Would it be best to go ahead and create a static home page on these additional sites and leave the rest of each site noindexed? 2) Or should I go in and allow more pages to be indexed and change the content? If so, how many: 3, 5, 8? I don't have tons of time at this point. 3) If I change content within the duplicate sites, what percentage do I need to change? Does switching the order of the sentences in the content count, or does it need to be 100% fresh? Thanks everyone.
Technical SEO | greenhornet77
-
How can you get the right site links for your site?
Hello all, I have been trying to get Google to list relevant sitelinks for my site when you type in our brand name, Loco2, or when Loco2 comes up in a search result. Different things come up when you search for Loco2 versus Loco 2. We would like the sitelinks to look the way they do when you search Loco 2; however, our brand name is Loco2, NOT Loco 2. Does anyone know why Google is doing this and whether we can influence the results? We have done as much as possible via Google Webmaster Tools, in terms of specifying the links we DO NOT want Google to list for Loco2. However, when you search "Loco2", the results only show simple sitelinks. Ideally what we want is:
- Loco2 to be recognised as the brand, NOT Loco 2
- The same (substantial, identical) results for Loco2 as for Loco 2 (think o2 and o 2)
- The sitelinks to reflect the main pages of our site (Times & Tickets, Engine Room forum, etc.)
Many thanks in advance! Anila
Technical SEO | anilababla
-
Multiple URLs
I'm trying to check the URLs of this site, http://www.ofo.com.au, and I see that their old site has 301 redirected to it... but the sites http://ofo.com.au and http://outdoorfurnitureoutlet.com.au are both still up, and I can't see any 301 redirects from them. Is it a problem, even though a site: search for them returns no results?
Technical SEO | UnaRealidad
-
BEST Wordpress Robots.txt Sitemap Practice??
Alright, my question comes directly from this article by SEOmoz: http://www.seomoz.org/learn-seo/robotstxt. Yes, I have submitted the sitemap to Google's and Bing's webmaster tools, and I want to add the location of our site's sitemaps. Does that mean I erase everything in the robots.txt right now and replace it with this?

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/none-standard-location/sitemap.xml
```

??? I ask because WordPress comes with some default disallows like wp-admin, trackback, and plugins. I have also read other questions, but was wondering if this is the correct way to add a sitemap to a WordPress robots.txt:
http://www.seomoz.org/q/robots-txt-question-2
http://www.seomoz.org/q/quick-robots-txt-check
http://www.seomoz.org/q/xml-sitemap-instruction-in-robots-txt-worth-doing
I am using Multisite with the Yoast plugin, so I have more than one sitemap.xml to submit. Do I erase everything in robots.txt and replace it with what SEOmoz recommended? Hmm, that sounds not right. Right now I have:

```
User-agent: *
Disallow:
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-login.php
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: /comments
```

ERASE EVERYTHING??? and change it to:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap_index.xml
Sitemap: http://www.example.com/sub/sitemap_index.xml
```

?????????
Technical SEO | joony2008
-
How should I structure a site with multiple addresses to optimize for local search??
Here's the setup: we have a website, www.laptopmd.com, and we're ranking quite well in our geographic target area. The site is chock-full of local keywords and has the address properly marked up (HTML5 and schema.org compliant), near the top of the page, etc. It's all working quite well, but we're looking to expand to two more locations, and we're terrified that adding more addresses and playing with our current setup will wreak havoc with our local search results, which, quite frankly, we currently rock. My questions: 1) When it comes time to do sub-pages for the new locations, should we strip the location information from the main site and put up local pages for each location in subfolders? 1a) Should we use subdomains instead of subfolders, to keep Google from becoming confused? 2) Should we consider simply starting identically branded pages for the individual locations and hope that exact-match, location-based URLs will make up for the duplicate-content hit and overcome the difficulty of building a brand from multiple pages? I've tried to look for examples of businesses that have tried to do what we're doing, but all the advice has been about organic search, which I already have the answer to. I haven't been able to find a good example of a small business with multiple locations AND good rankings for each location. Should this serve as a warning to me?
Technical SEO | LMDNYC