Robots.txt disallow subdomain
-
Hi all,
I have a development subdomain whose contents get copied to the live domain. Because I don't want this dev domain to get crawled, I'd like to implement a robots.txt for this subdomain only. The problem is that I don't want this robots.txt to end up disallowing the live domain. Is there a way to create a robots.txt for the development subdomain only?
Thanks in advance!
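For reference, the dev-only file being discussed is the standard block-everything robots.txt; the whole question is how to keep this file from reaching the live domain:

```
User-agent: *
Disallow: /
```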
-
I would suggest you talk to the developers, as Theo suggests, about excluding visitors from your test site.
-
The copying is a manual process, and I don't want any risks for the live environment. An HttpHandler for robots.txt could be a solution, and I'm going to discuss this with one of our developers. Other suggestions are still welcome, of course!
-
Do you FTP-copy one domain to the other? If this is a manual process, then keeping the test domain's robots.txt off the live domain is as simple as excluding that one file from the copy.
If you automate the copy and want the code to behave differently based on the base URL, you could create an HttpHandler for robots.txt that delivers a different version depending on the Host header of the HTTP request.
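The host-based handler described above can be sketched as follows. The thread is about ASP.NET (an HttpHandler), but the idea is language-agnostic; this is a minimal Python/WSGI sketch, and the development hostname is an assumed placeholder, not the poster's actual domain.

```python
DEV_HOST = "dev.example.com"  # hypothetical development hostname

BLOCK_ALL = "User-agent: *\nDisallow: /\n"   # served to the dev subdomain
ALLOW_ALL = "User-agent: *\nDisallow:\n"     # served to the live domain

def robots_for_host(host):
    """Pick the robots.txt body based on the requesting host."""
    # Strip an optional port and compare case-insensitively.
    hostname = host.split(":")[0].lower()
    return BLOCK_ALL if hostname == DEV_HOST else ALLOW_ALL

def app(environ, start_response):
    """WSGI app: intercept /robots.txt, 404 everything else."""
    if environ.get("PATH_INFO") == "/robots.txt":
        body = robots_for_host(environ.get("HTTP_HOST", ""))
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [body.encode("utf-8")]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not Found"]
```

Because the decision is made per request, the same code can be copied to both environments without risk of the dev robots.txt leaking to the live site.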
-
You could use environment variables (for example in your env.ini or config.ini file) that are set to DEVELOPMENT, STAGING, or LIVE depending on the environment the code finds itself in.
With the exact same code, your website would then either restrict access by IP address (in the development environment) or allow all IP addresses (in the live environment). With this setup you can also vary other settings per environment, such as the level of detail shown in your error reporting, connecting to a testing database rather than the live one, etc.
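A minimal Python sketch of this environment-switch idea; the variable name `APP_ENV`, the whitelisted IP, and the helper names are all illustrative assumptions, not anything prescribed in the thread.

```python
import os

ALLOWED_DEV_IPS = {"203.0.113.10"}  # example office IP (assumption)

def current_env():
    """Read the environment name, defaulting to LIVE (the safe fallback)."""
    return os.environ.get("APP_ENV", "LIVE")

def is_request_allowed(env_name, client_ip):
    """On DEVELOPMENT only whitelisted IPs get in; LIVE allows everyone."""
    if env_name == "DEVELOPMENT":
        return client_ip in ALLOWED_DEV_IPS
    return True

def error_detail_level(env_name):
    """Verbose errors outside production, as the reply describes."""
    return "verbose" if env_name in ("DEVELOPMENT", "STAGING") else "minimal"
```

The key property is that identical code ships everywhere; only the environment variable differs per server, so a copy from dev to live cannot carry the wrong behavior with it.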
[This was supposed to be a reply, but I accidentally clicked the wrong button. Hitting 'Delete reply' results in an error.]
-
Thanks for your quick reply, Theo. Unfortunately, this .htpasswd file will also get copied to the live environment, so our live websites would become password protected. Could there be another solution for this?
-
I'm sure there is, but I'm guessing you don't want any human visitors going to your development subdomain and viewing what is being done there either? I'd suggest you either limit access by IP address (thereby effectively blocking out Google in one move) and/or implement an .htpasswd solution where developers log in with their credentials to your development area (which blocks out Google as well).
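Theo's combined approach (IP whitelist or a valid login) can be sketched with Apache 2.4 directives; the IP address and the .htpasswd path below are placeholder assumptions:

```apacheconf
# In the dev subdomain's vhost or .htaccess (Apache 2.4 syntax).
AuthType Basic
AuthName "Development area"
AuthUserFile /path/to/.htpasswd

# Grant access to a whitelisted office IP OR any valid login:
<RequireAny>
    Require ip 203.0.113.10
    Require valid-user
</RequireAny>
```

Note that, like robots.txt, this configuration and the .htpasswd file would need to be kept out of the copy to the live environment, or gated behind the environment-variable approach described earlier in the thread.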