Robots.txt disallow subdomain
-
Hi all,
I have a development subdomain, which gets copied to the live domain. Because I don't want this dev domain to get crawled, I'd like to implement a robots.txt for this domain only. The problem is that I don't want this robots.txt to disallow the live domain. Is there a way to create a robots.txt for this development subdomain only?
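For reference, the standard disallow-all robots.txt you would want served on the dev subdomain only (and nowhere else) is just two lines:

```text
User-agent: *
Disallow: /
```

Any file more permissive than this (or no file at all) leaves the subdomain open to crawling.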
Thanks in advance!
-
I would suggest you talk to your developers, as Theo suggests, about excluding visitors from your test site.
-
The copying is a manual process and I don't want any risks for the live environment. An HttpHandler for robots.txt could be a solution and I'm going to discuss this with one of our developers. Other suggestions are still welcome, of course!
-
Do you FTP copy one domain to the other? If it's a manual process, keeping the dev site's robots.txt off the live domain is as simple as excluding that one file from the copy.
If the copy is automated and you want the code to behave differently based on the base URL, you could create an HttpHandler for robots.txt that serves a different version depending on the request URL host in the HTTP request header.
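The suggestion above is an ASP.NET HttpHandler, but the host-switch idea is framework-agnostic. A minimal sketch of the logic (the hostnames are hypothetical; plug this into whatever handles the `/robots.txt` route in your stack):

```python
# Serve a different robots.txt body depending on the Host header of the
# incoming request, so one deployed codebase can block crawlers on the dev
# subdomain while leaving the live domain open.

DEV_HOSTS = {"dev.example.com"}  # hypothetical dev subdomain(s)

def robots_txt_for(host: str) -> str:
    """Return the robots.txt body appropriate for the requesting host."""
    # Normalize: lowercase and strip any :port suffix before comparing.
    hostname = host.lower().split(":")[0]
    if hostname in DEV_HOSTS:
        return "User-agent: *\nDisallow: /\n"  # block everything on dev
    return "User-agent: *\nDisallow:\n"        # allow everything on live
```

Because the decision is made per request, the same file can be copied to both environments without risk to the live domain.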
-
You could use environment variables (for example in your env.ini or config.ini file) that are set to DEVELOPMENT, STAGING, or LIVE based on the environment the code finds itself in.
With the exact same code, your website would either be limiting IP addresses (on the development environment) or allow all IP addresses (in the live environment). With this setup you can also set different variables per environment such as the level of detail that is shown in your error reporting, connect to a testing database rather than a live one, etc.
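A minimal sketch of that pattern (the variable name `APP_ENV` and the per-environment settings below are assumptions for illustration, not from this thread). The deployed code is identical everywhere; only the environment variable differs per server:

```python
import os

# Per-environment settings: restrict access by IP on dev/staging, show
# verbose errors outside live, and point at different databases.
SETTINGS = {
    "DEVELOPMENT": {"restrict_ips": True,  "verbose_errors": True,  "db": "test_db"},
    "STAGING":     {"restrict_ips": True,  "verbose_errors": True,  "db": "staging_db"},
    "LIVE":        {"restrict_ips": False, "verbose_errors": False, "db": "live_db"},
}

def current_settings() -> dict:
    """Pick the settings block for the environment this code runs in."""
    env = os.environ.get("APP_ENV", "LIVE")  # default to the safest profile
    return SETTINGS[env]
```

Defaulting to LIVE when the variable is unset is a deliberate choice here: a misconfigured server fails toward the restrictive-errors, open-access profile rather than exposing debug output.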
[This was supposed to be a reply, but I accidentally clicked the wrong button. Hitting 'Delete reply' results in an error.]
-
Thanks for your quick reply, Theo. Unfortunately, this .htpasswd file will also get copied to the live environment, so our live website would end up password-protected. Could there be any other solution for this?
-
I'm sure there is, but I'm guessing you also don't want human visitors browsing your development subdomain and seeing what is being done there? I'd suggest you either limit access by IP address (thereby effectively blocking out Google in one move) and/or implement a .htpasswd solution where developers log in with their credentials to your development area (which blocks out Google as well).
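On Apache, both approaches are a few lines of .htaccess on the dev vhost only (the file path and IP range below are placeholders):

```text
# Require a login (blocks crawlers and casual visitors alike)
AuthType Basic
AuthName "Development area"
AuthUserFile /path/to/.htpasswd
Require valid-user

# Or instead: allow only your office IP range (Apache 2.4 syntax)
# Require ip 203.0.113.0/24
```

Keeping this in the dev vhost configuration rather than in the copied webroot also sidesteps the copy-to-live problem raised above.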